## Precision-Recall Curve for Classification Model Analysis

Accuracy gives an intuition of model performance: the ratio of correct predictions to the total number of samples. But what about imbalanced data? Imagine we need to build a model to predict the click-through rate for display ads. Click-through rates tend to be […]
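A toy illustration of the problem the teaser describes (the data here is hypothetical): on an imbalanced set, a model that always predicts the majority class gets high accuracy but zero recall on the rare positive class.

```python
# 95% negatives, 5% positives -- e.g. most ad impressions get no click.
y_true = [0] * 95 + [1] * 5
y_pred = [0] * 100            # a "model" that never predicts a click

accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
recall = tp / (tp + fn)       # fraction of real clicks the model found

print(accuracy)  # 0.95 -- looks great
print(recall)    # 0.0  -- useless on the minority class
```

This is why precision and recall, and the precision-recall curve, are the more honest lens for imbalanced problems.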

## Image Data Augmentation: How and Why?

If we have a limited amount of data, we can diversify it using data augmentation: instead of collecting new data, we transform what is already there to increase the sample size along with the diversity. We will consider unstructured data, i.e. image data, for the augmentation process.

Editorial Team
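A minimal sketch of the idea with NumPy, treating a 2-D array as a stand-in for an image (real pipelines use richer transforms such as rotations, crops, and colour jitter):

```python
import numpy as np

def augment(image):
    """Return the original image plus a few label-preserving transforms."""
    return [
        image,             # original
        np.fliplr(image),  # horizontal flip
        np.flipud(image),  # vertical flip
        np.rot90(image),   # 90-degree rotation
    ]

img = np.arange(9).reshape(3, 3)  # tiny stand-in for a grayscale image
samples = augment(img)
print(len(samples))               # 4 samples from 1 original image
```

Each transform preserves the content (and hence the label) while changing the pixel layout, which is exactly the diversity the teaser describes.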

## Quick Pandas Functions and Attributes

Pandas is one of the most powerful Python libraries for data science and analysis. It has a large number of functions, methods, and attributes, which are comparatively easy in syntax and flexible in nature. So a data scientist, or anyone who wants insights from a huge set of data, prefers it and lets their […]
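A quick taste of the kind of attributes and methods the article covers, on a hypothetical toy dataset (the pandas API calls themselves are real):

```python
import pandas as pd

df = pd.DataFrame({
    "city": ["Pune", "Delhi", "Pune"],
    "sales": [250, 300, 150],
})

print(df.shape)                            # attribute: (rows, columns)
print(df.dtypes)                           # attribute: column data types
print(df["sales"].mean())                  # method: column average
print(df.groupby("city")["sales"].sum())   # quick insight: sales per city
```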

## Dimensionality Reduction (PCA)

Variation in a dataset is actually the information in the dataset, and this is what PCA uses. In simple terms, PCA, or Principal Component Analysis, is a process that emphasises variation in a dataset and generates strong patterns out of it. We can cover the whole concept in three points, as follows […]
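A minimal PCA sketch with NumPy, using synthetic correlated data (scikit-learn's `PCA` wraps these same steps):

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1)) @ np.array([[2.0, 0.5]])  # correlated 2-D data
X = X + rng.normal(scale=0.1, size=(200, 2))            # small noise

Xc = X - X.mean(axis=0)            # 1. centre the data
cov = np.cov(Xc, rowvar=False)     # 2. covariance matrix
vals, vecs = np.linalg.eigh(cov)   # 3. eigen-decomposition (ascending order)
pc1 = vecs[:, -1]                  # direction of maximum variance
projected = Xc @ pc1               # 4. project onto the top component

print(projected.shape)             # (200,) -- 2-D data reduced to 1-D
```

The top eigenvector captures most of the variance, so the 1-D projection keeps most of the "information" in the sense the teaser describes.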

## Mean, Variance, Standard Deviation, Standard Score, Covariance & Data Projection

Variance – It is the measure of the average squared difference from the mean. To calculate it we follow the steps below:

1. Calculate the average of the numbers.
2. For each number, subtract the mean and square the result.
3. Calculate the average of those squared differences; that is the variance.
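The steps above can be sketched directly in Python (population variance, on a small made-up list of numbers):

```python
numbers = [2, 4, 4, 4, 5, 5, 7, 9]

mean = sum(numbers) / len(numbers)                   # step 1: the average
squared_diffs = [(x - mean) ** 2 for x in numbers]   # step 2: squared differences
variance = sum(squared_diffs) / len(numbers)         # step 3: their average

std_dev = variance ** 0.5   # standard deviation is the square root of variance
print(mean, variance, std_dev)  # 5.0 4.0 2.0
```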

## The Eigenvector Represents the Greatest Variance in PCA

In Principal Component Analysis we project our data points onto a vector in the direction of maximum variance, to decrease the number of components. We take the direction of the eigenvector of the covariance matrix as the direction of maximum variance. In this article we look into the proof of why […]
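The core of the proof can be sketched in one Lagrange-multiplier step. With covariance matrix $\Sigma$, we look for a unit vector $w$ maximising the variance of the projection $Xw$:

```latex
\max_{w} \; w^{\top} \Sigma w \quad \text{subject to } w^{\top} w = 1
```

```latex
\mathcal{L}(w, \lambda) = w^{\top} \Sigma w - \lambda\,(w^{\top} w - 1),
\qquad
\frac{\partial \mathcal{L}}{\partial w} = 2\Sigma w - 2\lambda w = 0
\;\Rightarrow\; \Sigma w = \lambda w .
```

So any stationary $w$ is an eigenvector of $\Sigma$, and the variance it achieves is $w^{\top}\Sigma w = \lambda$; hence the eigenvector with the largest eigenvalue gives the direction of greatest variance.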

## Eigenvectors and Eigenvalues

An eigenvector is a direction in a coordinate space, defined by a matrix, which does not change its direction under the matrix transformation. An eigenvalue is a scalar which, multiplied with the eigenvector, gives the same result as multiplying the eigenvector by the matrix.
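The definition $A v = \lambda v$ can be checked numerically on a small example matrix:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eig(A)   # eigenvalues and eigenvectors of A
v = vecs[:, 0]                  # one eigenvector (a direction)
lam = vals[0]                   # its eigenvalue (a scalar)

# Multiplying by the matrix only scales the eigenvector; it never rotates it:
print(np.allclose(A @ v, lam * v))  # True
```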