From my experience, I have realized that evaluating a model requires more focus and effort than building the model itself. The more complex your model is, the more rigorous its evaluation metrics should be. Models should be monitored regularly to capture drift in data, features, concepts and model performance.

There should be a robust model evaluation framework that tests the assumptions (if any) in the objective function or model and also evaluates the input and output. A typical model evaluation framework can be seen below.

I remember an interview I gave a few years back, where I was explaining a neural net classifier to the interviewer. He strongly believed only in logistic regression, which made it a tough conversation for me. One of his questions was ‘how well calibrated are these models?’, for which I didn’t have an answer at the time. Accuracy is not the only factor; calibration is equally important in conveying the confidence of the system. Calibration becomes critical in models used in medical image analysis and autonomous driving.

Calibration is a procedure in statistical classification for determining class membership probabilities, which assess the uncertainty of…
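The idea behind a reliability (calibration) curve can be sketched in a few lines: bin the predicted probabilities and compare the mean prediction in each bin with the observed fraction of positives. The toy predictions below are illustrative, not from a real model.

```python
import numpy as np

def calibration_curve(y_true, y_prob, n_bins=5):
    """Mean predicted probability vs. observed positive rate per bin."""
    y_true, y_prob = np.asarray(y_true), np.asarray(y_prob)
    bins = np.linspace(0.0, 1.0, n_bins + 1)
    ids = np.digitize(y_prob, bins[1:-1])  # bin index for each prediction
    mean_pred, frac_pos = [], []
    for b in range(n_bins):
        mask = ids == b
        if mask.any():
            mean_pred.append(y_prob[mask].mean())
            frac_pos.append(y_true[mask].mean())
    return np.array(mean_pred), np.array(frac_pos)

# Toy example: predictions of 0.1 come true ~20% of the time,
# predictions of 0.9 come true ~80% of the time.
y_prob = np.array([0.1, 0.1, 0.9, 0.9, 0.9, 0.9, 0.9, 0.1, 0.1, 0.1])
y_true = np.array([0,   0,   1,   1,   1,   1,   0,   0,   0,   1])
mean_pred, frac_pos = calibration_curve(y_true, y_prob, n_bins=5)
print(mean_pred, frac_pos)
```

A well-calibrated model would show the two returned arrays lying close to each other (the diagonal of a reliability diagram).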

Word embeddings are representations of words in a numeric format that a computer can understand. The simplest example would be (Yes, No) represented as (1, 0). But when we are dealing with large texts and corpora, this may not be an efficient way to represent words and sentences. For large corpora, the co-occurrences of words and their probabilities play a major role.

Let’s explore some techniques of word representations…

In one-hot encoding, each word in a sentence is represented by a vector.

For example, consider the sentence ‘I love dogs.’ …
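A minimal one-hot sketch of that sentence; the vocabulary order here is simply the order the words appear in, an assumption made for illustration.

```python
# Build a vocabulary from the example sentence and one-hot encode each word.
sentence = "I love dogs"
vocab = sentence.split()                      # ['I', 'love', 'dogs']
index = {w: i for i, w in enumerate(vocab)}   # word -> position in the vector

def one_hot(word):
    vec = [0] * len(vocab)
    vec[index[word]] = 1
    return vec

print(one_hot("love"))  # [0, 1, 0]
```

Each word gets a vector as long as the vocabulary, with a single 1 at its own position, which is exactly why this representation becomes impractical for large corpora.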

In “The Big Bang Theory” TV show (S12:E16), Leonard created a whole network of celebrities likely to play Dungeons and Dragons with Wil Wheaton. He uses graph theory, a branch of mathematics, to map Wil Wheaton’s social network using his Twitter and other online media networks.

Graphs are mathematical structures used to study complex relational data. They are widely used in social network models, recommendation systems, Google Maps, genomics, etc. With the big data era, graph databases such as Neo4j and advanced algorithms such as GNNs and GraphEDM, under the field of study called Geometric Deep Learning, Graph…
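At its simplest, a social graph like the one above can be stored as an adjacency list, mapping each person to the people they are connected to. The tiny network below (borrowing the show’s characters) is invented purely for illustration.

```python
# Adjacency-list representation of a toy social graph.
graph = {
    "Wil":     ["Leonard", "Sheldon"],
    "Leonard": ["Wil", "Sheldon", "Penny"],
    "Sheldon": ["Wil", "Leonard"],
    "Penny":   ["Leonard"],
}

def degree(g, node):
    """Number of direct connections a node has."""
    return len(g[node])

print(degree(graph, "Leonard"))  # 3
```

Measures such as node degree are the starting point for the centrality and community analyses that graph databases and GNNs build on.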

Topic modelling is a statistical technique used to extract topics from a given collection of documents. LDA is one of the most prominent and widely used topic models.

Let us start with its definition as per the research paper and then move on to each component in detail, along with comparisons to previous papers.

From the name we can infer that it has something to do with latent variables (derived variables in a dataset) and the Dirichlet distribution (https://www.statisticshowto.com/dirichlet-distribution/).

*Latent Dirichlet allocation (LDA) is a generative probabilistic model of a corpus. The basic idea is that documents are represented…*
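To make the Dirichlet part concrete: in LDA, each document’s topic mixture is a draw from a Dirichlet distribution, so the proportions are non-negative and sum to 1. A one-line sample with NumPy shows this; the concentration parameters below are chosen arbitrarily for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = [0.5, 0.5, 0.5]        # concentration parameters for 3 hypothetical topics
theta = rng.dirichlet(alpha)   # one document's topic proportions

print(theta, theta.sum())      # proportions sum to 1 (up to floating point)
```

A small alpha (below 1) tends to produce sparse mixtures, i.e. documents dominated by one or two topics, which matches the intuition behind LDA.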

With the launch of ELECTRA, we are most likely set for another revolution in NLP and NLU tasks, and we are all looking forward to it. The Google AI blog of March 2020 has elaborate details regarding ELECTRA. However, I thought I would give a comparative study of the model architectures of both in this blog post.

It is inevitable to start with **the transformers**, which initiated the transition from RNNs and LSTMs for sequence modeling and transduction problems such as translation and language modeling. Transformers eliminated the sequential nature and introduced parallelism by relying entirely on **attention**…
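The attention mentioned above boils down to scaled dot-product attention, which can be sketched in a few lines of NumPy; shapes and values here are illustrative.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                       # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)        # softmax over keys
    return weights @ V                                    # weighted sum of values

# One query attending over two key/value pairs.
Q = np.array([[1.0, 0.0]])
K = np.array([[1.0, 0.0], [0.0, 1.0]])
V = np.array([[1.0, 2.0], [3.0, 4.0]])
print(attention(Q, K, V))
```

Because every query attends to every key in one matrix multiplication, the whole sequence is processed in parallel, which is exactly what frees transformers from the sequential bottleneck of RNNs.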

As the name suggests, Regularized Greedy Forest is a forest, greedy and regularized :-). Let’s delve into the details.

The decision tree is one of the most commonly used techniques for classification and regression problems. Decision trees make no assumptions about the underlying variables, and hence are among the simplest algorithms for handling both linear and non-linear classification. They have a root node, internal nodes and terminal nodes, or leaves. Decision trees make decisions based on entropy or Gini impurity at the nodes, and we calculate their accuracy based on the misclassification rate.

Although decision trees are easy to interpret…
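The Gini impurity mentioned above is simply 1 − Σ p_k², computed over the class proportions at a node; the labels below are illustrative.

```python
from collections import Counter

def gini(labels):
    """Gini impurity of a node: 1 - sum of squared class proportions."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

print(gini(["a", "a", "b", "b"]))  # 0.5  (maximally impure two-class node)
print(gini(["a", "a", "a", "a"]))  # 0.0  (pure node)
```

A split is chosen to reduce this impurity as much as possible, which is what the greedy step in tree building (and in Regularized Greedy Forest) is doing at each node.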

Reinforcement Learning (RL) is a machine learning technique that has come into focus after supervised and unsupervised learning. Deep Reinforcement Learning studies RL using neural nets. In this article I will try to explain the mathematical concepts of RL in a very simplified manner.

What is RL and how is it different from supervised and unsupervised machine learning problems?

Supervised learning — we have **structured data with a labelled target variable** and we **predict the target** for a similar unlabeled structured dataset. Classification and regression are examples.

Unsupervised learning — we have an **unlabeled and unstructured dataset** and we **learn…**
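In contrast to both, RL learns from rewards. Its workhorse is the Q-learning update, Q(s,a) ← Q(s,a) + α[r + γ max_a′ Q(s′,a′) − Q(s,a)]; the two-state environment below is invented purely to show one update step.

```python
# Q-learning update on a made-up two-state, two-action environment.
alpha, gamma = 0.5, 0.9  # learning rate and discount factor (chosen here)

Q = {("s0", "go"): 0.0, ("s0", "stay"): 0.0,
     ("s1", "go"): 0.0, ("s1", "stay"): 0.0}

def update(Q, s, a, r, s_next):
    """One Q-learning step toward the reward plus discounted best next value."""
    best_next = max(Q[(s_next, a2)] for a2 in ("go", "stay"))
    Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])

update(Q, "s0", "go", 1.0, "s1")   # reward of 1 for moving from s0 to s1
print(Q[("s0", "go")])             # 0.5
```

Unlike supervised learning there is no labelled target here: the agent improves its value estimates purely from the rewards it experiences.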

With quantum computing booming across the internet and tech giants investing heavily in quantum AI, we are eager to know more about it. I am sure in this quest we have all come across quantum mechanics, complex numbers, Hilbert spaces, qubits, etc., and here we will discuss these in more detail.

Quantum computing leverages the quantum mechanical phenomena of superposition and entanglement to create states that enable complex computations efficiently.

In the sections below I have detailed the mathematical elements; a knowledge of linear algebra would be helpful in understanding quantum computing.

Let’s start with the postulates of quantum mechanics in the first section, and the latter sections…

Let me start by giving an idea of where logistic regression falls in the big picture of machine learning.

For centuries, we have used analytics as a powerful tool for analyzing and determining the strength of the relationship between certain variables of interest and other variables in a business. In the past few years, analytics has developed by leaps and bounds to identify speech and images as well. Powerful software along with advanced mathematics has paved the way for this development.

A robust analytics framework includes three distinct steps:

I. **Descriptive Analytics**- Aggregates and summarizes the data for meaningful insights.

II…
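The logistic regression this post builds toward rests on the sigmoid function, which squashes a linear combination of inputs into a probability between 0 and 1. The weight and intercept below are made up for illustration, not fitted to data.

```python
import math

def sigmoid(z):
    """Logistic function: maps any real number to (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

w, b = 2.0, -1.0                 # illustrative coefficient and intercept
x = 0.5
p = sigmoid(w * x + b)           # predicted probability of the positive class
print(p)                         # 0.5  (since w*x + b = 0)
```

The decision boundary sits exactly where w·x + b = 0, i.e. where the predicted probability crosses 0.5.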