
Aspiring Data Scientist | Enthusiastic ML practitioner | Fellow at IIT Kanpur | Drama Lover

Tools for the detectives who investigate data and extract information and trends from it.

Green represents new beginnings and growth. It also signifies renewal and abundance.

Data visualization plays a very important role in data mining. Data scientists spend much of their time exploring data through visualization. To accelerate this process, we need well-organized documentation of all the plots.

Even plenty of resources cannot be transformed into valuable goods without planning and architecture. Therefore, I hope this article provides you with a good architecture of all the plots and their documentation.


  1. Introduction
  2. Know your Data
  3. Distribution Plots
    a. Dist-Plot
    b. Joint Plot
    c. Pair Plot
    d. Rug Plot
  4. Categorical Plots
    a. Bar Plot
    b. Count Plot
    c. Box Plot
    d. Violin Plot
  5. Advanced Plots
    a. Strip…

Experience at IIT Kanpur, one of the most prestigious colleges in India.


Brief Introduction to my Background

I am a final-year undergraduate at the Indian Institute of Technology, Kanpur, in the Department of Mechanical Engineering, with a minor from the Department of Industrial Engineering and Management.

You may find it interesting how, coming from a core engineering field, I landed a job as a Data Scientist.

In the campus placement season (Dec 2020), I got placed as a Data Scientist at HiLabs. HiLabs has a healthcare-focused AI solution that automatically detects data errors without human intervention. It is a combination of Big Data, AI, and medical ontologies.

How did I land there?

The story behind how I landed as a Data Scientist…

These five obstacles may occur when you train a linear regression model on your data set.

Let's go from yellow, the color of danger, to yellow, the color of sunshine and happiness. (Photo by Casey Thiebeau on Unsplash)

Linear regression is one of the simplest machine learning algorithms. Its interpretability and easy-to-train nature make it a natural first step in machine learning. Being relatively uncomplicated, linear regression also serves as a foundation for understanding more complex algorithms.

To learn what linear regression is, how we train it, how we obtain the best-fit line, how we interpret it, and how we assess the accuracy of the fit, you may visit the following article.

After understanding the basic intuition of Linear regression, certain concepts make it more fascinating and more fun. These also provide a deep…
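For a quick hands-on recap before digging into the obstacles, here is a minimal sketch of training a linear regression and assessing its fit with scikit-learn. The data is synthetic (generated from y = 2x + 1 plus noise, purely for illustration):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data: y = 2x + 1 with a little Gaussian noise
rng = np.random.default_rng(42)
X = rng.uniform(0, 10, size=(100, 1))
y = 2 * X[:, 0] + 1 + rng.normal(0, 0.1, size=100)

model = LinearRegression().fit(X, y)  # obtain the best-fit line
r2 = model.score(X, y)                # R^2, one way to assess the fit
```

The fitted `model.coef_` and `model.intercept_` recover the slope and intercept, which is what makes the algorithm so interpretable.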

Relative Order Test for Testing the Existence of a Trend in a Time Series

Time passes faster for your face than for your feet (assuming you’re standing up). Einstein’s theory of relativity dictates that the closer you are to the center of the Earth, the slower time goes — and this has been measured. At the top of Mount Everest, a year would be about 15 microseconds shorter than at sea level. (Photo by Nathan Dumlao on Unsplash)

A time series comprises four major components: a trend, a seasonal component, a cyclic component, and a stochastic/random component.

You can have a recap of all the basics of a time series from my following article.

We extract all these components and analyze them to get information from a time series. There are lots of standard methods to extract the components from a time series.

But all of these components may or may not be present in a time series altogether. Therefore, before estimating these components, we need to first check for their existence. …
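As a sketch of one common formulation of such an existence check (my own minimal implementation, not necessarily the exact variant in the article): count the "discordant" pairs where an earlier value exceeds a later one, and compare against the no-trend expectation of n(n−1)/4.

```python
import numpy as np

def relative_order_test(x):
    """Count discordant pairs (x_i > x_j for i < j) and compare
    against the expected count under no trend, n*(n-1)/4."""
    x = np.asarray(x)
    n = len(x)
    q = sum(x[i] > x[j] for i in range(n) for j in range(i + 1, n))
    expected = n * (n - 1) / 4          # expected discordant count if no trend
    tau = 1 - 4 * q / (n * (n - 1))     # Kendall-tau-style statistic in [-1, 1]
    return q, expected, tau

# A strictly rising series has no discordant pairs, so tau is +1
q, exp_q, tau = relative_order_test([1, 2, 3, 5, 8, 13])
```

A `tau` near +1 suggests a rising trend, near −1 a falling trend, and near 0 no trend; a formal test would compare `q` against its sampling distribution.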

Dimensionality reduction (DR) is one of the most critical steps of a predictive modeling problem. The world is generating a large amount of high-dimensional data. Hence it is crucial to optimize the dimensional space of the data.

Pink is a light red hue and is typically associated with love and romance. People associate the color with qualities that are often thought of as feminine, such as softness, kindness, nurturance, and compassion (Photo by Isi Parente on Unsplash)

What is Dimensionality Reduction (DR)?

Suppose you want to solve a predictive modeling problem, and for that, you start to collect data. You never know exactly which features you want and how much data is needed. Hence, you go for the upper limit and collect all possible features and observations.

Consequently, you realize that you have collected a large amount of data, and these extra features are inflating both the noise and the computation time.

  1. Noise: There may be some features which the model finds irrelevant. Hence, they just add noise to the model.
  2. Time: The time I am talking about is computational time. For…
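One of the standard DR techniques for tackling both problems is PCA. As a minimal sketch (with synthetic data constructed so that most of the variance lives in only 2 of the 10 collected dimensions):

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# 200 samples in 10 dimensions, but the signal lives in 2 directions
base = rng.normal(size=(200, 2))
X = base @ rng.normal(size=(2, 10)) + 0.01 * rng.normal(size=(200, 10))

pca = PCA(n_components=2).fit(X)
X_reduced = pca.transform(X)                    # 10 features -> 2 components
explained = pca.explained_variance_ratio_.sum()  # variance kept after reduction
```

Dropping the remaining 8 dimensions discards mostly noise while shrinking the data the model has to process.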

Time series (TS) analysis is considered one of the lesser-known skills in the data science space. This article is a self-starter on the concepts in TS, with a lot more to come.

Photo by Curtis MacNewton on Unsplash

From the moment we realized that data contains trends and that we can extract knowledge from it, we started collecting it. In some instances, we try to find trends in data whose time span is not long enough; hence we do not find any trend with respect to time.

But now, after decades of data collection, we can find at least some patterns with respect to time, and analyzing them is called time series analysis.

What is a Time Series?

A series of observations recorded sequentially over time, i.e., a collection of observations recorded along with their timestamps, is called a time series.
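In pandas, this definition maps directly onto a `Series` with a datetime index. A minimal sketch (with made-up daily observations):

```python
import pandas as pd

# Observations recorded along with their timestamps
dates = pd.date_range("2021-01-01", periods=5, freq="D")
ts = pd.Series([10, 12, 11, 15, 14], index=dates)

daily_change = ts.diff()  # change between consecutive timestamps
```

Because the index carries the timestamps, operations such as resampling, shifting, and differencing become one-liners.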


This is an introduction to the young and fast-growing field of data mining (also known as knowledge discovery from data, or KDD for short). It focuses on fundamental data mining concepts and techniques for discovering interesting patterns from data in various applications.

Source: Pixabay

The world we see today has automated data collection tools, database systems, the World Wide Web, and a computerized society. This has resulted in explosive growth in data, from terabytes to petabytes.

We are drowning in the ocean of data but starving for knowledge.

A huge velocity, volume, and variety of data are what our new age has provided us. Cheaper technology, mobile computing, social networking, and cloud computing have evoked this data storm.

These are the reasons why conventional methods fall short, and we need novel methods like data mining to process the new era of…

A complete theoretical guide to probability and concepts required for data science and machine learning.

Photo by Javier Allegue Barros on Unsplash

The first question that comes to my mind is: why is probability even necessary for learning machine learning and data science? After some web searching, I came to some important conclusions about why probability is vital.

Why Probability?

Probability appears time and again in predictive settings. Observing where it appears will help us understand why it is indispensable.

  1. Classification Problem: A classification problem requires us to predict the probability that the input example belongs to a particular class. Whether it is an image classification or object detection, we predict the probability of the input belonging to each class.
  2. Models based on Probability…
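Point 1 above can be sketched in a few lines with scikit-learn: a classifier's `predict_proba` returns, for each input example, the predicted probability of each class. The dataset here is synthetic, for illustration only.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# A synthetic binary-classification dataset
X, y = make_classification(n_samples=200, n_features=4, random_state=0)
clf = LogisticRegression().fit(X, y)

proba = clf.predict_proba(X[:3])  # per-class probabilities for 3 examples
```

Each row of `proba` is a probability distribution over the classes, so the entries in a row sum to 1; picking the most probable class recovers the hard prediction.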

A complete guide to exploratory data analysis

Kaprekar's constant, 6174, arises when we take a 4-digit integer (with at least two distinct digits), form the largest and smallest numbers from its digits, and subtract the smaller from the larger. Repeating this process of forming and subtracting, we always arrive at 6174. (Photo by Morgan Housel on Unsplash)
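The caption's claim is easy to verify in a few lines of Python (a quick sketch of the Kaprekar routine; note it assumes the starting number has at least two distinct digits, since repdigits like 1111 collapse to 0):

```python
def kaprekar_step(n):
    """One step of the Kaprekar routine on a 4-digit number
    (zero-padded if needed)."""
    digits = f"{n:04d}"
    big = int("".join(sorted(digits, reverse=True)))   # largest arrangement
    small = int("".join(sorted(digits)))               # smallest arrangement
    return big - small

def steps_to_6174(n):
    """Count how many steps the routine takes to reach 6174."""
    count = 0
    while n != 6174:
        n = kaprekar_step(n)
        count += 1
    return count
```

For example, 3524 → 3087 → 8352 → 6174 in three steps, and 6174 itself is a fixed point of the routine.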

Most data analysis problems start with understanding the data. It is the most crucial and complicated step. This step also affects the further decisions that we make in a predictive modeling problem, one of which is what algorithm we are going to choose for a problem.

In this article, we will see a complete, thorough guide for such a problem.


  1. Reading Data
  2. Variable Identification
  3. Univariate analysis
  4. Bivariate analysis
  5. Missing values- types and analysis
  6. Outlier treatment
  7. Variable Transformation

Reading data and Variable Identification

Reading the data means getting the answers to the following questions:

  • What is the shape of my data?
  • How many features does…
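These first questions can be answered in a couple of pandas calls. A minimal sketch (the small DataFrame here is made up, standing in for whatever `pd.read_csv(...)` would load):

```python
import numpy as np
import pandas as pd

# A small stand-in DataFrame; in practice this comes from pd.read_csv(...)
df = pd.DataFrame({
    "age": [25, 32, 47, np.nan],
    "city": ["Kanpur", "Delhi", "Mumbai", "Kanpur"],
    "income": [30000, 45000, 52000, 41000],
})

shape = df.shape           # (rows, columns): the shape of the data
n_features = df.shape[1]   # how many features
dtypes = df.dtypes         # variable identification: numeric vs categorical
```

`df.dtypes` already hints at variable identification: object columns are usually categorical, numeric columns continuous or discrete.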

These ways will take your deep learning application to the next level of accuracy.

Photo by Yusuf Evli on Unsplash

The new era of machine learning and artificial intelligence is the deep learning era. It not only offers remarkable accuracy but also has a huge hunger for data. Employing neural nets, functions of ever-greater complexity can be fitted to given data points.

But there are a few very precise things that make the experience with neural networks more effective and insightful.

Xavier Initialization

Let us assume that we have trained a huge neural network. For simplicity, assume the bias term is zero and the activation function is the identity.
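A minimal sketch of Xavier initialization in NumPy (one common variant, scaling by the fan-in; other variants also use the fan-out):

```python
import numpy as np

def xavier_init(n_in, n_out, seed=0):
    """Xavier/Glorot initialization: draw weights with variance 1/n_in
    so activations keep roughly constant variance from layer to layer."""
    rng = np.random.default_rng(seed)
    return rng.normal(0.0, np.sqrt(1.0 / n_in), size=(n_in, n_out))

W = xavier_init(512, 256)  # weights for a 512 -> 256 layer
```

With the identity activation assumed above, the output of a layer has variance `n_in * Var(w) * Var(x) = Var(x)`, which is exactly why the `1/n_in` scaling keeps signals from exploding or vanishing with depth.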
