
Monday, July 19, 2021

Supervised Learning - Linear Regression in Python

Colleagues, the Supervised Learning - Linear Regression in Python program will equip you to apply Least Squares regression and its assumptions to real-world data, and then improve on that algorithm with Penalized Regression and non-parametric Kernel methods. Learn to apply algorithms in the following areas: Least Squares, Penalized Least Squares, non-parametric methods, and model selection and fit, on real-world applications including insurance and healthcare. The curriculum addresses the modules below (short illustrative Python sketches for several of them follow the outline):

1) Introduction to Supervised Linear Regression

2) Introduction to Machine Learning and Supervised Regression - Discuss the overall AI ecosystem and how Machine Learning (ML) is part of that ecosystem. Understand the three different types of algorithms that make up ML. Gain intuition for why functions and optimization are important in ML.

3) Machine Learning - Understand assumptions and survey the statistical concepts important to understanding linear algorithms. Design of experiments. Conducting experiments. Understand the difference between linear and non-linear functions.

4) Least Squares Regression - Ordinary Regression - Develop the simple linear regression algorithm. Understand the basic linear regression assumptions. Learn to identify when assumption violations occur. Understand how to evaluate model output.

5) Least Squares Regression - Multiple Regression - Extend the Least Squares algorithm to multiple dimensions. Explore data to understand variable importance. Prepare data for multiple regression. Optimize between bias and variance.

6) Penalized Regression - L1/L2 Optimization - Understand the motivation behind penalized regression. Optimize L1 Regression (Lasso) parameters. Optimize L2 Regression (Ridge) parameters. Combine the L1/L2 penalties (Elastic Net). Understand the difference and trade-offs between subset selection and shrinkage. Optimize hyper-parameters with cross-validation.

7) Kernel Methods - Support Vector Machines - Understand the theory and motivation behind kernel methods. Derive a basic kernel and use the kernel trick. Build a support vector classifier. Extend to regression with a support vector machine. Optimize parameters with cross-validation and grid search.

8) Kernel Methods - Gaussian Process Regression - Understand multivariate distributions and non-parametric regression. Use Bayesian probability with joint probabilities. Develop the theory behind Gaussian Process Regression. Optimize kernels and hyper-parameters.

9) Summary and Real World Applications - Review Supervised Linear Regression topics. Perform linear regression on real-world data.
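To make the ordinary regression module (4) concrete, here is a minimal sketch of simple least squares regression in Python using scikit-learn; the synthetic data and names are illustrative assumptions, not course materials.

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic data with a known linear signal plus noise (illustrative only)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 1))
y = 3.0 * X[:, 0] + 1.5 + rng.normal(scale=0.5, size=200)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Fit ordinary least squares and evaluate model output on held-out data
model = LinearRegression().fit(X_train, y_train)
print("slope:", model.coef_[0], "intercept:", model.intercept_)
print("test R^2:", r2_score(y_test, model.predict(X_test)))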
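For the L1/L2 penalized regression module (6), a hedged sketch of tuning Lasso, Ridge, and Elastic Net penalties with cross-validation in scikit-learn; the alpha grids, l1_ratio values, and synthetic dataset are placeholders, not values from the course.

import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV, RidgeCV, ElasticNetCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic regression problem with a few informative features (illustrative only)
X, y = make_regression(n_samples=300, n_features=20, n_informative=5,
                       noise=10.0, random_state=0)

estimators = [
    ("Lasso (L1)", LassoCV(cv=5)),
    ("Ridge (L2)", RidgeCV(alphas=np.logspace(-3, 3, 13))),
    ("Elastic Net (L1+L2)", ElasticNetCV(cv=5, l1_ratio=[0.2, 0.5, 0.8])),
]

for name, estimator in estimators:
    # Standardize features so the penalty treats all coefficients comparably
    pipe = make_pipeline(StandardScaler(), estimator).fit(X, y)
    print(name, "R^2 on training data:", round(pipe.score(X, y), 3))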
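For the support vector machine module (7), a short sketch of kernel regression with an RBF-kernel SVR, tuning C and gamma by cross-validated grid search; the parameter grid is an assumed example.

import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Noisy non-linear target that a plain linear model would underfit (illustrative only)
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(150, 1)), axis=0)
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=150)

# Cross-validated grid search over the RBF kernel's C and gamma
grid = GridSearchCV(SVR(kernel="rbf"),
                    param_grid={"C": [0.1, 1, 10], "gamma": ["scale", 0.1, 1.0]},
                    cv=5)
grid.fit(X, y)
print("best parameters:", grid.best_params_)
print("best cross-validated R^2:", round(grid.best_score_, 3))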
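And for the Gaussian Process Regression module (8), a sketch of non-parametric regression with an RBF kernel whose hyper-parameters are fit by maximizing the marginal likelihood; the kernel choice and noise level are assumptions for illustration.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Noisy non-linear target (illustrative only)
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(60, 1)), axis=0)
y = np.sin(X[:, 0]) + rng.normal(scale=0.1, size=60)

# RBF kernel plus a noise term; hyper-parameters are optimized during fitting
kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gpr = GaussianProcessRegressor(kernel=kernel, random_state=0).fit(X, y)

mean, std = gpr.predict(X, return_std=True)  # posterior mean and uncertainty
print("learned kernel:", gpr.kernel_)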

Enroll today (individuals & teams welcome): https://tinyurl.com/uvjs63tw 


Much career success, Lawrence E. Wilson - Artificial Intelligence Academy


