Writes about Data Science, Deep Learning, and Programming (https://pr2tik1.github.io)

Computer Vision, Deep Learning

An introduction to CNNs with an easily accessible dataset for beginners in deep learning.


Introduction 🍵

The MNIST dataset is the most overused dataset for getting started with image classification. Comprising handwritten digits in 10 classes and introduced by Yann LeCun in 1998, it comes up over and over again in scientific papers, blog posts, and so on. It contains 28×28 (sometimes padded to 32×32) grayscale images of handwritten digits, each labeled with an integer between 0 and 9.

The reason MNIST is so popular has to do with its size, which allows deep learning practitioners to quickly check, train, and publish their algorithms. MNIST also comes with certain variations and limitations.

The main objective is to come up with a fresh dataset to understand Image Classification using CNNs. …
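
To make the idea concrete, here is a minimal sketch of a small CNN on 28×28 grayscale digits, assuming PyTorch and torchvision are available; the article's own dataset and architecture may differ.

```python
# Minimal sketch: a small CNN for 28x28 grayscale images in 10 classes,
# assuming PyTorch and torchvision. Illustrative only.
import torch
import torch.nn as nn
from torchvision import datasets, transforms

transform = transforms.ToTensor()
train_set = datasets.MNIST(root="data", train=True, download=True, transform=transform)
loader = torch.utils.data.DataLoader(train_set, batch_size=64, shuffle=True)

model = nn.Sequential(
    nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 28x28 -> 14x14
    nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 14x14 -> 7x7
    nn.Flatten(),
    nn.Linear(32 * 7 * 7, 10),  # logits for the 10 digit classes
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for images, labels in loader:   # one batch is enough to sanity-check the setup
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
    break                       # remove to train for a full epoch
```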


Analyzing developer and programming trends from survey data using Python.

Photo by Sincerely Media on Unsplash

Programmers fight a battle to keep their skills relevant. Each year brings new methodologies, frameworks, and languages to learn. Within the context of a changing industry, it’s important to find out which skills, tools, and trends are worthy of time.

At the time of writing this post, the world is suffering from the COVID-19 pandemic. The pandemic has had a considerable impact on everyone’s life. Many have managed to keep working using various remote tools and technologies, but the jobs of many others have been affected by this epidemic. Yet people are carrying on with their responsibilities and are now aware of the consequences. …
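
For a flavour of the kind of analysis involved, here is a minimal pandas sketch; the file name and column name are hypothetical placeholders, not the survey's actual schema.

```python
# Minimal sketch of exploring a developer survey with pandas.
# "survey_results.csv" and "LanguagesUsed" are hypothetical placeholders.
import pandas as pd

df = pd.read_csv("survey_results.csv")

# Most commonly used languages among respondents,
# assuming a semicolon-separated multi-choice column.
languages = (
    df["LanguagesUsed"]
    .dropna()
    .str.split(";")
    .explode()
    .value_counts()
    .head(10)
)
print(languages)
```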


Deep Learning

Understanding the concepts behind neural networks.

Photo by Josh Hild on Unsplash

Introduction ☕

Our brain is one of the wonders of the world that we are still trying to understand. We have been curious about its functioning and complexity for a long time. At present we are able to understand it to some extent, but there is a long way to go! On the quest to understand more about the brain and its wonderful capabilities, we were inspired to develop something that we today call Artificial Intelligence.

“Our intelligence is what makes us human, and A.I. is an extension of that quality.” — Yann LeCun, Professor, New York University

We will start from the basics and cover all the parts needed to understand neural networks in depth. This part of the series contains a brief introduction to common terms that you may have come across, like deep learning, machine learning, artificial neural networks, etc. …


Inside AI

Exploring the underlying concepts in the effectiveness of Neural Networks.

Photo by Brooke Lark on Unsplash

Where did it all start?

It all started with the idea of understanding how the brain actually works. Back in the 1940s, McCulloch and Pitts introduced the artificial neuron[1], and in the 1950s Frank Rosenblatt introduced the first perceptron[2]. Neural networks have been with us since the 1940s, but the field faced ups and downs due to a lack of practical implementations. The recent growth in the practice of deep learning techniques involving a variety of neural network architectures is down to two major advances: first, computational power (high-performance CPUs and GPUs), and second, the amount of data available.
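
As a quick illustration of the idea behind Rosenblatt's perceptron, here is a minimal NumPy sketch: a weighted sum followed by a threshold activation, trained with the classic perceptron update rule on toy data. It is purely illustrative, not a reconstruction of the original model.

```python
# Minimal sketch of a single Rosenblatt-style perceptron on toy data (logical AND).
import numpy as np

def step(z):
    # Threshold (step) activation: fires 1 when the weighted sum is non-negative.
    return np.where(z >= 0, 1, 0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])

w = np.zeros(2)
b = 0.0
lr = 0.1

for _ in range(10):                      # a few passes over the data
    for xi, target in zip(X, y):
        pred = step(np.dot(w, xi) + b)
        w += lr * (target - pred) * xi   # perceptron learning rule
        b += lr * (target - pred)

print(step(X @ w + b))                   # -> [0 0 0 1]
```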

Geoff Hinton and two of his graduate students showed how one could take a very large dataset called ImageNet, with 10,000 categories and 10 million images, and reduce the classification error by 20 percent using deep learning. This happened back in 2012 at the NIPS meeting, which was remarkable, as Terrence Sejnowski …


Explore new ways of showing your GitHub resume as a developer or open-source contributor.

Photo by Ash Edmonds on Unsplash

Create a README markdown file that becomes your new open-source resume/CV!

I came across a very interesting feature that GitHub launched: the README of your profile. In this README, you can use markdown and let your imagination run wild. Since you can do many things with markdown, the sky is the limit!

Recently, GitHub introduced a new profile feature. The user can now put their details in a README file inside a special repository.


Inside AI

An introduction to one of the Survival Analysis techniques.

Photo by Kaleidico on Unsplash

What is Survival Analysis?

It is a set of statistical methods for analyzing data in which the outcome variable of interest is the time until an event occurs. The event could be death, disease incidence, customer churn, recovery, etc.

It is used to estimate the lifespan of a particular population under study.

It is also called ‘time-to-event’ analysis, as the goal is to estimate the time it takes for an individual or a group of individuals to experience an event of interest. Survival analysis is used to compare groups when time is an important factor. Other tests, like simple linear regression, can compare groups, but those methods do not factor in time. …
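
As one concrete example (not necessarily the technique the post focuses on), here is a minimal sketch of a Kaplan-Meier survival curve, assuming the lifelines library and made-up toy data.

```python
# Minimal sketch of a time-to-event estimate with a Kaplan-Meier estimator.
# Assumes the `lifelines` library; the data below is toy data for illustration.
from lifelines import KaplanMeierFitter

# Observed durations (e.g. months until churn) and whether the event was
# actually observed (1) or the subject was censored (0).
durations = [5, 6, 6, 2.5, 4, 4, 7, 8, 10, 12]
event_observed = [1, 0, 1, 1, 1, 0, 1, 0, 1, 1]

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=event_observed)

print(kmf.survival_function_)      # estimated S(t) = P(T > t) at each observed time
print(kmf.median_survival_time_)   # time at which the survival curve crosses 0.5
```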


In-Depth Analysis

An overview of data processing, data cleaning, and exploratory data analysis steps in a data science workflow.

Photo by Mika Baumeister on Unsplash

Data Science, Machine Learning, and Deep Learning are the recent hot topics of research and development. They are not new, but they have gained high importance and attention due to advancements in tools, technology, and the computing power of systems. The main part, however, is the data. Most of the time goes into data wrangling, feature extraction, data cleansing, etc. You may have heard:

About 80 percent of the time goes into finding, cleaning, and reorganizing huge amounts of data, and the remaining 20 percent into the actual data analysis or other purposes.

Hence, the time spent with data is important: the more time invested here, the better the results. To know the data better and to reach insightful analysis, we can divide the case study into the following…
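
To make those first steps tangible, here is a minimal pandas sketch of the initial inspection and cleaning passes; the CSV file name is a hypothetical placeholder.

```python
# Minimal sketch of early data-processing / EDA steps with pandas.
# "raw_data.csv" is a hypothetical placeholder file.
import pandas as pd

df = pd.read_csv("raw_data.csv")

# 1. Inspect structure, types, and summary statistics
print(df.shape)
df.info()
print(df.describe(include="all"))

# 2. Basic cleaning: drop duplicates, look at missing values
df = df.drop_duplicates()
missing = df.isna().mean().sort_values(ascending=False)
print(missing.head(10))                              # columns with the most missing data
df = df.dropna(thresh=int(0.5 * len(df)), axis=1)    # drop columns with >50% missing values

# 3. Quick exploratory summary of numeric relationships
print(df.select_dtypes("number").corr())
```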
