Foundations of Machine Learning Algorithms: Pen-Paper Calculations

" Foundations of Machine Learning (Algorithms): Pen-paper calculations " course is a non-coding course, which is a MUST for ALL persons desirous of learning Machine Learning from mathematical and algorithmic point of view. We will focus more on the theoretical aspects of the algorithms, parameters and hand-calculations will be done on dummy data step-by-step. In some cases, to automate the calclations, we will be using MS Excel.

Course Contents:

The following algorithms are tentatively planned to be discussed, and detailed tutorials/examples will be worked out in class.

  • General Introduction:
    • Parametric and Non-parametric Machine Learning Algorithms
    • Supervised, Unsupervised and Semi-Supervised Learning
    • The Bias-Variance Trade-off
    • Overfitting and Underfitting
  • Essential Mathematics for Machine Learning – Part A: Basic Probability

    • Basic Definitions
    • Events & odds of an event
    • Bayes Theorem & applications (see the worked sketch after this list)
    • Probability Distribution Functions
  • Essential Mathematics for Machine Learning – Part B: Basic Statistics
    • Mean, Mode, Median
    • Standard Deviation, Variance
    • Correlation and Correlation-coefficient
    • Standard Statistical Distributions

  • Essential Mathematics for Machine Learning – Part C: Linear Algebra

    1. Matrix Multiplication
    2. Operations and Properties
      1. Identity Matrix and Diagonal Matrices
      2. Transpose, Inverse, Trace, Norms and Determinant of Matrices
      3. Symmetric & Orthogonal Matrices
      4. Linear Independence and Rank
      5. Eigenvalues and Eigenvectors of Symmetric Matrices
    3. Matrix Calculus
      1. Gradients and Hessians of Quadratic and Linear Functions
      2. Least Squares
      3. Gradients of the Determinant
      4. Eigenvalues as Optimization
  • Linear Algorithms:
    • Gradient Descent.
    • Linear Regression.
    • Logistic Regression.
    • Linear Discriminant Analysis.
  • Non-Linear Algorithms:
    • Classification and Regression Trees.
    • Naive Bayes.
    • K-Nearest Neighbors.
    • Learning Vector Quantization.
    • Support Vector Machines.
  • Ensemble Methods:
    • Bagged Decision Trees and Random Forest.
    • Boosting and AdaBoost.
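To give a flavour of the pen-paper working done in class, here is a minimal Bayes-theorem sketch on made-up numbers (the prevalence, detection rate and false-alarm rate below are illustrative assumptions, not course data). Suppose a condition D affects 1% of a population, a test detects it with probability 0.9, and it raises a false alarm with probability 0.05. Then the probability of the condition given a positive test is

\[
P(D \mid +) = \frac{P(+ \mid D)\,P(D)}{P(+ \mid D)\,P(D) + P(+ \mid \neg D)\,P(\neg D)}
            = \frac{0.9 \times 0.01}{0.9 \times 0.01 + 0.05 \times 0.99}
            = \frac{0.009}{0.0585} \approx 0.154 .
\]

Despite the positive test, the posterior probability is only about 15%, exactly the kind of "numbers in, numbers out" working the course walks through on the board.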

Course Highlights:

  • Clear algorithm explanations that help you understand the principles that underlie each technique.
  • Step-by-step algorithm workouts on the blackboard that show you exactly how each model learns.
  • Real worked examples, so you can see exactly the numbers in and the numbers out; there is nowhere for the details to hide.

Training Schedule: 

Saturday–Sunday Batch
6 days: 17, 24 September; 1, 7, 8, 14 October 2017
Saturday Time: 9:30 AM – 1:00 PM
Sunday Time: 8:00 AM – 11:00 AM

After completing this course, you may also like to register for the Machine Learning in MATLAB, Machine Learning in Python, and/or Deep Learning with Python courses.

All information regarding fees, discounts and the registration form is available here.

 
