Call : (+91) 968636 4243
Mail :

Machine Learning Foundations

( Duration: 3 Days )

This Machine Learning Foundations training course focuses on the mathematics and algorithms used in Data Science. You will learn core skills and explore machine learning algorithms along with their practical application and limitations. With this knowledge, you will build the intuition necessary to solve complex machine learning problems.

By attending Machine Learning Foundations workshop, delegates will learn:

  • Core machine learning mathematics and statistics
  • Supervised Learning vs. Unsupervised Learning
  • Classification Algorithms including Support Vector Machines, Discriminant Analysis, Naïve Bayes, and Nearest Neighbor
  • Regression Algorithms including Linear and Logistic Regression, Generalized Linear Modeling, Support Vector Regression, Decision Trees, and k-Nearest Neighbors (KNN)
  • Clustering Algorithms including k-Means, Fuzzy clustering, and Gaussian Mixture
  • Neural network and sequence-model architectures, including Hidden Markov Models (HMM), Recurrent Neural Networks (RNN), and Long Short-Term Memory (LSTM)
  • Dimensionality Reduction, Singular Value Decomposition (SVD), and Principal Component Analysis (PCA)
  • How to choose an algorithm for a given problem
  • How to choose parameters and activation functions
  • Ensemble methods

To attend this Machine Learning Foundations course, delegates should have:

  • Strong foundational mathematics skills in Linear Algebra and Probability
  • Basic Python skills
  • Basic Linux skills
  • Familiarity with basic shell commands such as ls, cd, cp, and su

The Machine Learning Foundations class is ideal for:

  • Experienced Data Scientists, Data Analysts, Developers, Administrators, Architects, and Managers interested in a deeper exploration of common algorithms and best practices in machine learning.



Core Machine Learning Mathematics Review

  • Statistics Overview and Review
  • Mean, Median, Variance, and Standard Deviation
  • Normal/Gaussian Distribution
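
The descriptive statistics reviewed above can be computed directly with Python's standard library; a minimal sketch (the sample data is illustrative only):

```python
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]  # illustrative sample

mean = statistics.mean(data)           # arithmetic mean: 5.0
median = statistics.median(data)       # middle value of the sorted data: 4.5
variance = statistics.pvariance(data)  # population variance: 4.0
std_dev = statistics.pstdev(data)      # population standard deviation: 2.0
```

Note the distinction between `pvariance`/`pstdev` (population, divide by n) and `variance`/`stdev` (sample, divide by n - 1), a point that matters when estimating from a sample.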

Probability Review

  • Probability Theory
  • Discrete Probability Distributions
  • Continuous Probability Distributions
  • Measure-Theoretic Probability Theory
  • The Central Limit Theorem and the Normal Distribution
  • Probability Density Function
  • Probability in Machine Learning
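
As a taste of the probability density function topic, here is a minimal sketch of the Gaussian PDF written from its textbook formula (`normal_pdf` is a hypothetical helper name, not a library function):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution N(mu, sigma^2) at x."""
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2))

# The density is symmetric about the mean and peaks there;
# for the standard normal the peak value is 1 / sqrt(2*pi) ~ 0.3989
peak = normal_pdf(0.0)
```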

Supervised Learning

  • Supervised Learning Explained
  • Classification vs. Regression
  • Examples of Supervised Learning
  • Key supervised algorithms

Unsupervised Learning

  • Unsupervised Learning Explained
  • Clustering
  • Examples of Unsupervised Learning
  • Key unsupervised algorithms

Regression Algorithms

  • Linear Regression
  • Logistic Regression
  • Support Vector Regression
  • Decision Trees
  • Random Forests
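
To illustrate the simplest of these, linear regression, here is a minimal sketch of ordinary least squares for one feature using the closed-form solution (`fit_simple_linear` is a hypothetical helper, not a library API; production code would use a library such as scikit-learn):

```python
def fit_simple_linear(xs, ys):
    """Ordinary least squares for y = a + b*x with a single feature."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x); intercept from the means
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    b = cov / var
    a = mean_y - b * mean_x
    return a, b

# Points lying exactly on y = 1 + 2x recover intercept 1 and slope 2
a, b = fit_simple_linear([0, 1, 2, 3], [1, 3, 5, 7])
```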

Classification Algorithms

  • Bayes Theorem and the Naïve Bayes classifier
  • Support Vector Machines
  • Discriminant Analysis
  • k-Nearest Neighbor (KNN)
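
k-Nearest Neighbor is simple enough to sketch in a few lines: classify a query point by majority vote among the k closest training points. This is a minimal illustration with Euclidean distance (`knn_predict` and the toy data are hypothetical, not from any library):

```python
from collections import Counter

def knn_predict(train, query, k=3):
    """Majority vote among the k training points nearest to `query`.
    `train` is a list of ((features...), label) pairs."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5
    neighbors = sorted(train, key=lambda item: dist(item[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Two toy clusters of labeled points
train = [((0, 0), "red"), ((1, 0), "red"), ((0, 1), "red"),
         ((5, 5), "blue"), ((6, 5), "blue"), ((5, 6), "blue")]
```

Choosing k trades off noise sensitivity (small k) against over-smoothing (large k), one of the parameter-selection questions the course addresses.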

Clustering Algorithms

  • k-Means Clustering
  • Fuzzy Clustering
  • Gaussian Mixture Models
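
The core loop of k-means (Lloyd's algorithm) alternates between assigning points to their nearest centroid and recomputing each centroid as its cluster mean. A minimal sketch, using naive initialisation from the first k points (real implementations prefer k-means++):

```python
def kmeans(points, k, iters=20):
    """Plain Lloyd's algorithm on tuples of numbers."""
    centroids = [tuple(p) for p in points[:k]]  # naive init: first k points
    for _ in range(iters):
        # Assignment step: each point joins its nearest centroid's cluster
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: sum((a - b) ** 2
                                            for a, b in zip(p, centroids[i])))
            clusters[nearest].append(p)
        # Update step: move each centroid to the mean of its cluster
        for i, members in enumerate(clusters):
            if members:  # keep the old centroid if a cluster empties out
                centroids[i] = tuple(sum(dim) / len(members)
                                     for dim in zip(*members))
    return centroids
```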

Neural Networks and Sequence Models

  • Neural Network Basics
  • Hidden Markov Models (HMM)
  • Recurrent Neural Networks (RNN)
  • Long Short-Term Memory Networks (LSTM)

Ensemble Methods

  • Ensemble Theory and Methods
  • Ensemble Classifiers
  • Bucket of Models
  • Boosting
  • Stacking
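
The simplest ensemble idea, hard majority voting over several base classifiers, can be sketched in a few lines (`majority_vote` and the three threshold rules are illustrative assumptions, not a library API):

```python
from collections import Counter

def majority_vote(classifiers, x):
    """Hard-voting ensemble: each base classifier predicts, majority wins."""
    votes = Counter(clf(x) for clf in classifiers)
    return votes.most_common(1)[0][0]

# Three weak threshold rules that disagree individually but vote together
classifiers = [
    lambda x: "positive" if x > 0 else "negative",
    lambda x: "positive" if x > -1 else "negative",
    lambda x: "positive" if x > 1 else "negative",
]
```

Boosting and stacking go further by weighting or learning how to combine the base models rather than counting equal votes.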

Encarta Labs Advantage

  • One-stop corporate training solution provider, offering over 6,000 courses on a wide variety of subjects
  • All courses are delivered by industry veterans
  • Go from newcomer to production-ready in a matter of days
  • Trained more than 50,000 corporate executives across the globe
  • All our training is conducted in workshop mode, with a strong focus on hands-on sessions

View our other course offerings by visiting

Contact us for delivering this course as a public/open-house workshop/online training for a group of 10+ candidates.