The Beard Sage

Post Category → Machine Learning

Inspecting word2vec Matrices

posted in Machine Learning on March 23, 2020 by TheBeard 0 Comments

A detailed look inside the weight matrices of the word2vec model. Both the CBOW and SkipGram models are discussed.

Continue reading →
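
The post above covers the details; as a rough companion sketch (mine, not the post's code), here is the standard two-matrix layout of word2vec, with the vocabulary size, embedding dimension, and variable names chosen as illustrative assumptions.

    import numpy as np

    # Illustrative sizes (assumptions, not values from the post).
    V, d = 10_000, 300                           # vocabulary size, embedding dimension

    rng = np.random.default_rng(0)
    W_in = rng.normal(scale=0.01, size=(V, d))   # input/embedding matrix: row i = vector for word i
    W_out = rng.normal(scale=0.01, size=(d, V))  # output/context matrix: column j scores word j

    # SkipGram: the centre word's vector scores every candidate context word.
    centre_id = 42
    h = W_in[centre_id]                          # hidden layer is just a row lookup
    scores = h @ W_out                           # one score per vocabulary word

    # CBOW: the context vectors are averaged before scoring the centre word.
    context_ids = [7, 99, 1234, 5678]
    scores_cbow = W_in[context_ids].mean(axis=0) @ W_out

    # A softmax over the scores gives the predicted distribution in both cases.
    probs = np.exp(scores - scores.max())
    probs /= probs.sum()
    print(probs.shape)                           # (10000,)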

N-gram Language Models

posted in Machine Learning on March 20, 2020 by TheBeard 0 Comments

The concept of language models as a sequence of words is explored via probabilistic interpretations.

Continue reading →
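
As a small companion to the post above, the sketch below estimates an unsmoothed bigram model by counting and scores a sentence under the Markov assumption; the toy corpus and the lack of smoothing are assumptions made for brevity, not choices taken from the post.

    from collections import Counter

    # Toy corpus (an assumption for illustration); <s> and </s> mark sentence boundaries.
    corpus = [
        "<s> i am sam </s>",
        "<s> sam i am </s>",
        "<s> i do not like green eggs and ham </s>",
    ]

    tokens = [t for line in corpus for t in line.split()]
    unigrams = Counter(tokens)
    bigrams = Counter(zip(tokens, tokens[1:]))

    def p_bigram(w, prev):
        # Maximum-likelihood estimate: P(w | prev) = count(prev, w) / count(prev).
        return bigrams[(prev, w)] / unigrams[prev]

    # Bigram (Markov) approximation: P(w1..wn) ~ product of P(wi | w_{i-1}).
    sentence = "<s> i am sam </s>".split()
    p = 1.0
    for prev, w in zip(sentence, sentence[1:]):
        p *= p_bigram(w, prev)
    print(p)   # 1/9 for this toy corpus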

Maximum Likelihood Estimation – finding the best Parametric Model

posted in Machine Learning on March 12, 2020 by TheBeard 0 Comments

An intuitive explanation of what likelihood is and how maximizing it gives the best parametric model that fits the data. A few examples are also worked out in detail.

Continue reading →
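
The post above gives the intuition and the derivations; as a quick numeric check of the same idea (the coin-flip data set below is my own assumption), maximizing the Bernoulli likelihood over a grid lands on the observed frequency of heads, matching the closed-form MLE heads / n.

    import numpy as np

    # Assumed toy data: 7 heads observed in 10 Bernoulli(theta) flips.
    heads, n = 7, 10

    thetas = np.linspace(0.001, 0.999, 999)
    # Log-likelihood of the data under each candidate theta.
    log_lik = heads * np.log(thetas) + (n - heads) * np.log(1 - thetas)

    print(thetas[np.argmax(log_lik)])   # ~0.7, i.e. heads / n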

Gradient Descent for a Single Artificial Neuron

posted in Machine Learning on March 9, 2020 by TheBeard 0 Comments

This article explores the derivation of the gradient descent algorithm for a single artificial neuron with the logistic activation function. A few properties of the logistic function are also discussed.

Continue reading →
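
A minimal sketch of the kind of update rule the post above derives, assuming a single logistic neuron trained with cross-entropy loss; the toy AND-style data, learning rate, and the choice of loss are my assumptions and may differ from the post's own setup.

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # Assumed toy data: two inputs, AND-like binary target.
    X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
    y = np.array([0., 0., 0., 1.])

    w, b, lr = np.zeros(2), 0.0, 0.5
    for _ in range(2000):
        y_hat = sigmoid(X @ w + b)
        # With cross-entropy loss, the error signal at the pre-activation
        # simplifies to (y_hat - y), a convenient property of the logistic function.
        grad_z = y_hat - y
        w -= lr * X.T @ grad_z / len(y)
        b -= lr * grad_z.mean()

    print(np.round(sigmoid(X @ w + b), 2))   # predictions move toward the 0/0/0/1 targets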

Primer on Lambda Calculus

posted in Machine Learning on March 6, 2020 by TheBeard 0 Comments

The basic concepts of Lambda Calculus are discussed, including functions, applications, the scope of free and bound variables, and the order of evaluation. A few examples are worked out.

Continue reading →
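
Python's lambda expressions make a rough, concrete stand-in for the abstractions and applications the post above introduces; the correspondence drawn here (and the note about Python's applicative-order evaluation) is my framing, not material taken from the post.

    # Abstraction \x. x + 1 and its application to 2, i.e. (\x. x + 1) 2 -> 3.
    succ = lambda x: x + 1
    print(succ(2))

    # Currying: \x. \y. x takes its arguments one at a time.
    const = lambda x: lambda y: x
    print(const("kept")("ignored"))

    # Bound vs free variables: x is bound by the lambda, y is free and
    # must be supplied by the enclosing scope.
    y = 10
    add_y = lambda x: x + y
    print(add_y(5))   # 15

    # Python evaluates arguments before applying the function (applicative
    # order), one of the evaluation strategies the post touches on.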

Hidden Markov Models Training – The Forward-Backward Algorithm

posted in Machine Learning on March 5, 2020 by TheBeard 0 Comments

This article discusses the problem of learning the HMM parameters given an observation sequence and the set of possible states. The concept of backward probability is defined and an iterative algorithm for learning is presented. Derivations and diagrams are sketched out.

Continue reading →
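
The full re-estimation loop is in the post above; as a small companion (with toy parameters I made up), the sketch below computes only the backward probabilities beta_t(i) = P(o_{t+1}, ..., o_T | state i at time t), the quantity the summary mentions.

    import numpy as np

    # Assumed toy HMM: 2 hidden states, 3 observation symbols.
    A = np.array([[0.7, 0.3],        # A[i, j] = P(next state j | current state i)
                  [0.4, 0.6]])
    B = np.array([[0.5, 0.4, 0.1],   # B[i, k] = P(observing symbol k | state i)
                  [0.1, 0.3, 0.6]])
    obs = [0, 2, 1]                  # an observation sequence

    T, N = len(obs), A.shape[0]
    beta = np.zeros((T, N))
    beta[T - 1] = 1.0                # base case: beta_T(i) = 1

    # Recursion: beta_t(i) = sum_j A[i, j] * B[j, o_{t+1}] * beta_{t+1}(j)
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

    print(beta)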

Hidden Markov Models Decoding – The Viterbi Algorithm

posted in Machine Learning on March 2, 2020 by TheBeard 0 Comments

This article discusses the problem of decoding – finding the most probable sequence of states that produced a sequence of observations. A dynamic programming approach is presented. Derivations and diagrams are sketched out and time complexity is analyzed.

Continue reading →
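
A minimal sketch of the dynamic-programming decoder described above; the toy initial, transition, and emission probabilities are assumptions chosen only to make the recursion and the backtrace concrete.

    import numpy as np

    # Assumed toy HMM parameters.
    pi = np.array([0.6, 0.4])                  # initial state probabilities
    A = np.array([[0.7, 0.3], [0.4, 0.6]])     # transition probabilities
    B = np.array([[0.5, 0.4, 0.1],             # emission probabilities
                  [0.1, 0.3, 0.6]])
    obs = [0, 1, 2]

    T, N = len(obs), len(pi)
    delta = np.zeros((T, N))                   # delta[t, j] = best path probability ending in state j at t
    psi = np.zeros((T, N), dtype=int)          # backpointers

    delta[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        scores = delta[t - 1, :, None] * A     # scores[i, j] = delta[t-1, i] * A[i, j]
        psi[t] = scores.argmax(axis=0)
        delta[t] = scores.max(axis=0) * B[:, obs[t]]

    # Backtrace the most probable state sequence; total work is O(N^2 T).
    states = [int(delta[-1].argmax())]
    for t in range(T - 1, 0, -1):
        states.append(int(psi[t, states[-1]]))
    states.reverse()
    print(states, delta[-1].max())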

Short Primer on Probability

posted in Machine Learning on February 26, 2020 by TheBeard 2 Comments

A primer on probability explaining the concepts of random variables, joint probabilities, marginalization, conditional probabilities, Bayes Rule, probabilistic inference and conditional independence. Examples and formulae included.

Continue reading →
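
As a tiny worked companion to the primer above, the sketch below runs Bayes' Rule and marginalization on a made-up diagnostic-test example; every number is an assumption chosen only to show the mechanics.

    # Assumed numbers for a classic diagnostic-test example.
    p_disease = 0.01                 # prior P(D)
    p_pos_given_disease = 0.95       # likelihood P(+ | D)
    p_pos_given_healthy = 0.05       # false-positive rate P(+ | not D)

    # Marginalization: P(+) = P(+ | D) P(D) + P(+ | not D) P(not D)
    p_pos = p_pos_given_disease * p_disease + p_pos_given_healthy * (1 - p_disease)

    # Bayes' Rule: P(D | +) = P(+ | D) P(D) / P(+)
    p_disease_given_pos = p_pos_given_disease * p_disease / p_pos
    print(round(p_disease_given_pos, 3))   # ~0.161: the posterior is far below the test's accuracy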

Hidden Markov Models Likelihood Computation – The Forward Algorithm

posted in Machine Learning on February 12, 2020 by TheBeard 0 Comments

This article computes the probability of an observation sequence being generated by a given Hidden Markov Model. The brute-force method is discussed, followed by a dynamic programming optimization. Derivations and diagrams are sketched out and time complexity is analyzed.

Continue reading →
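
A minimal sketch of the forward recursion described above, with toy parameters assumed for illustration: alpha_t(j) accumulates the probability of the prefix o_1..o_t ending in state j, which is what turns the brute-force sum over all state paths into an O(N^2 T) computation.

    import numpy as np

    # Assumed toy HMM parameters.
    pi = np.array([0.6, 0.4])                  # initial state probabilities
    A = np.array([[0.7, 0.3], [0.4, 0.6]])     # transition probabilities
    B = np.array([[0.5, 0.4, 0.1],             # emission probabilities
                  [0.1, 0.3, 0.6]])
    obs = [0, 1, 2]

    alpha = pi * B[:, obs[0]]                  # initialization: alpha_1(j) = pi_j * b_j(o_1)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]          # recursion: sum over previous states, then emit

    print(alpha.sum())                         # P(observation sequence | model)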

d-Separation through examples

posted in Machine Learning on February 9, 2020 by TheBeard 0 Comments

This article presents a procedure for checking whether two nodes in a Bayesian Network are conditionally independent of each other, using the idea of active triples along the paths connecting the nodes.

Continue reading →
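
A small sketch of the triple-level check the procedure above builds on: whether a single triple along a path is active depends only on its shape and on what is observed. The function and its inputs are my own simplification for illustration, not code from the post.

    def triple_active(kind, middle_observed, descendant_observed=False):
        """Return True if the triple X - Y - Z passes dependence given the evidence.

        kind: 'chain'    X -> Y -> Z
              'fork'     X <- Y -> Z
              'collider' X -> Y <- Z
        """
        if kind in ("chain", "fork"):
            # Chains and forks are blocked exactly when the middle node is observed.
            return not middle_observed
        if kind == "collider":
            # A collider is active only if Y, or one of its descendants, is observed.
            return middle_observed or descendant_observed
        raise ValueError(kind)

    # Two nodes are d-separated (conditionally independent) when every path
    # between them contains at least one blocked triple.
    print(triple_active("chain", middle_observed=True))      # False: observing Y blocks the chain
    print(triple_active("collider", middle_observed=False))  # False: an unobserved collider blocks the path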