Call: (+91) 968636 4243
Mail: info@EncartaLabs.com
EncartaLabs

# Machine Learning Foundations

(Duration: 3 Days)

This Machine Learning Foundations training course focuses on the mathematics and algorithms used in Data Science. You will learn core skills and explore machine learning algorithms along with their practical application and limitations. With this knowledge, you will build the intuition necessary to solve complex machine learning problems.

By attending Machine Learning Foundations workshop, delegates will learn:

• Core machine learning mathematics and statistics
• Supervised Learning vs. Unsupervised Learning
• Classification Algorithms including Support Vector Machines, Discriminant Analysis, Naïve Bayes, and Nearest Neighbor
• Regression Algorithms including Linear and Logistic Regression, Generalized Linear Modeling, Support Vector Regression, Decision Trees, and k-Nearest Neighbors (KNN)
• Clustering Algorithms including k-Means, Fuzzy clustering, and Gaussian Mixture
• Neural Networks and sequence models, including Hidden Markov Models (HMM), Recurrent Neural Networks (RNN), and Long Short-Term Memory (LSTM)
• Dimensionality Reduction, Singular Value Decomposition (SVD), and Principal Component Analysis (PCA)
• How to choose an algorithm for a given problem
• How to choose parameters and activation functions
• Ensemble methods

To benefit from this course, delegates should have:

• Strong foundational mathematics skills in Linear Algebra and Probability
• Basic Python skills
• Basic Linux skills
• Familiarity with command-line commands such as ls, cd, cp, and su

The Machine Learning Foundations class is ideal for:

• Experienced Data Scientists, Data Analysts, Developers, Administrators, Architects, and Managers interested in a deeper exploration of common algorithms and best practices in machine learning.

## 1. Core Machine Learning Mathematics Review

• Statistics Overview and Review
• Mean, Median, Variance, and Standard Deviation
• Normal/Gaussian Distribution
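
As a taste of this module, the descriptive statistics listed above can be computed with Python's standard library alone (a minimal sketch with made-up data; the course may use other tooling):

```python
import statistics

data = [2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0]

mean = statistics.mean(data)           # arithmetic mean
median = statistics.median(data)       # middle value of the sorted data
variance = statistics.pvariance(data)  # population variance
std_dev = statistics.pstdev(data)      # population standard deviation

print(mean, median, variance, std_dev)  # 5.0 4.5 4.0 2.0
```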

## 2. Probability Review

• Probability Theory
• Discrete Probability Distributions
• Continuous Probability Distributions
• Measure-Theoretic Probability Theory
• Central Limit and Normal Distribution
• Probability Density Function
• Probability in Machine Learning
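
For example, the probability density function of the normal distribution discussed in this module can be written out directly from its formula (a standard-library sketch, not course material):

```python
import math

def normal_pdf(x, mu=0.0, sigma=1.0):
    """Density of the normal distribution N(mu, sigma^2) at x."""
    coeff = 1.0 / (sigma * math.sqrt(2.0 * math.pi))
    return coeff * math.exp(-0.5 * ((x - mu) / sigma) ** 2)

# The density peaks at the mean and is symmetric around it.
print(normal_pdf(0.0))  # ~0.3989, the peak of the standard normal
```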

## 3. Supervised Learning

• Supervised Learning Explained
• Classification vs. Regression
• Examples of Supervised Learning
• Key supervised algorithms
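
The core idea of supervised learning — predicting labels for new inputs from labeled examples — can be illustrated with a one-line nearest-neighbor classifier (a toy sketch with invented data, not a course implementation):

```python
def nearest_neighbor_predict(train, query):
    """Predict the label of the closest training point (1-NN).

    train: list of (feature, label) pairs with numeric 1-D features.
    """
    closest = min(train, key=lambda pair: abs(pair[0] - query))
    return closest[1]

# Labeled examples: small values are "low", large values are "high".
train = [(1.0, "low"), (2.0, "low"), (8.0, "high"), (9.0, "high")]
print(nearest_neighbor_predict(train, 1.5))  # low
print(nearest_neighbor_predict(train, 8.5))  # high
```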

## 4. Unsupervised Learning

• Unsupervised Learning Explained
• Clustering
• Examples of Unsupervised Learning
• Key unsupervised algorithms
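
In contrast to supervised learning, unsupervised methods find structure without any labels. A deliberately simple illustration (hypothetical data, not course code) splits 1-D data into two groups at its largest gap:

```python
def two_cluster_split(values):
    """Split 1-D data into two groups at the largest gap.

    No labels are used: the grouping is discovered from the data alone.
    """
    vals = sorted(values)
    gaps = [vals[i + 1] - vals[i] for i in range(len(vals) - 1)]
    cut = gaps.index(max(gaps)) + 1
    return vals[:cut], vals[cut:]

left, right = two_cluster_split([9.1, 1.0, 8.8, 1.2, 0.9, 9.0])
print(left, right)  # [0.9, 1.0, 1.2] [8.8, 9.0, 9.1]
```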

## 5. Regression Algorithms

• Linear Regression
• Logistic Regression
• Support Vector Regression
• Decision Trees
• Random Forests
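
As a preview of linear regression, the ordinary least squares fit for a line has a closed form that fits in a few lines of plain Python (a sketch with synthetic points, assuming 1-D features):

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b on 1-D data."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var              # slope: covariance over variance
    b = mean_y - a * mean_x    # intercept passes through the means
    return a, b

# Points lying exactly on y = 2x + 1 recover slope 2 and intercept 1.
a, b = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
print(a, b)  # 2.0 1.0
```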

## 6. Classification Algorithms

• Bayes Theorem and the Naïve Bayes classifier
• Support Vector Machines
• Discriminant Analysis
• k-Nearest Neighbor (KNN)
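
Bayes' theorem, the foundation of the Naïve Bayes classifier covered here, is easy to demonstrate numerically. The classic example below uses invented diagnostic-test numbers to show why a positive result for a rare condition can still be unlikely:

```python
def posterior(prior, sensitivity, false_positive_rate):
    """Bayes' theorem: P(condition | positive test)."""
    # Total probability of a positive test, over both hypotheses.
    p_pos = sensitivity * prior + false_positive_rate * (1.0 - prior)
    return sensitivity * prior / p_pos

# Rare condition (1%), sensitive test (99%), 5% false positives:
# a positive result is still far from certain.
p = posterior(prior=0.01, sensitivity=0.99, false_positive_rate=0.05)
print(round(p, 3))  # 0.167
```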

## 7. Clustering Algorithms

• k-Means Clustering
• Fuzzy Clustering
• Gaussian Mixture Models
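
The k-Means algorithm in this module alternates between assigning points to their nearest center and moving each center to its cluster's mean. A minimal 1-D sketch of that loop (Lloyd's algorithm, with made-up data and fixed initial centers):

```python
def k_means_1d(values, centers, iterations=10):
    """Lloyd's algorithm for k-Means on 1-D data."""
    for _ in range(iterations):
        # Assignment step: each point joins its nearest center.
        clusters = [[] for _ in centers]
        for v in values:
            idx = min(range(len(centers)), key=lambda i: abs(v - centers[i]))
            clusters[idx].append(v)
        # Update step: move each center to its cluster's mean.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

data = [1.0, 1.2, 0.8, 8.0, 8.2, 7.8]
print(k_means_1d(data, centers=[0.0, 10.0]))  # converges near [1.0, 8.0]
```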

## 8. Neural Networks

• Neural Network Basics
• Hidden Markov Models (HMM)
• Recurrent Neural Networks (RNN)
• Long Short-Term Memory Networks (LSTM)
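
The building block of every network in this module is a single neuron: a weighted sum of inputs passed through an activation function. A minimal forward-pass sketch with a sigmoid activation (illustrative weights only):

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: weighted sum + sigmoid activation."""
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid squashes z into (0, 1)

# With zero weights and zero bias the neuron is indifferent: output 0.5.
print(neuron([1.0, 2.0], weights=[0.0, 0.0], bias=0.0))  # 0.5
```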

## 9. Ensemble Methods

• Ensemble Theory and Methods
• Ensemble Classifiers
• Bucket of Models
• Boosting
• Stacking
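
The simplest ensemble idea covered here — combining several models by majority vote — fits in a few lines (a toy sketch with invented predictions):

```python
from collections import Counter

def majority_vote(predictions):
    """Combine class predictions from several models by majority vote."""
    return Counter(predictions).most_common(1)[0][0]

# Three simple 'models' disagree; the ensemble follows the majority.
votes = ["spam", "spam", "ham"]
print(majority_vote(votes))  # spam
```

Bagging, boosting, and stacking build on this idea with weighted or learned combinations rather than a plain vote.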