This intermediate-level course introduces the mathematical foundations needed to derive Principal Component Analysis (PCA), a fundamental dimensionality reduction technique. We'll cover basic statistics of data sets, such as mean values and variances; compute distances and angles between vectors using inner products; and derive orthogonal projections of data onto lower-dimensional subspaces. Using these tools, we'll then derive PCA as a method that minimizes the average squared reconstruction error between data points and their reconstructions.
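As a rough sketch of where the course ends up, the PCA procedure described above can be written in a few lines of NumPy: center the data, eigendecompose the covariance matrix, and orthogonally project onto the top-k eigenvector subspace. The function name `pca_reconstruct` and the synthetic data are illustrative, not from the course materials.

```python
import numpy as np

def pca_reconstruct(X, k):
    """Project data onto the top-k principal subspace and reconstruct.

    X: (n_samples, n_features) data matrix; k: target dimensionality.
    """
    mean = X.mean(axis=0)                  # dataset mean (basic statistics)
    Xc = X - mean                          # center the data
    cov = Xc.T @ Xc / len(X)               # covariance matrix
    _, eigvecs = np.linalg.eigh(cov)       # eigenvectors, ascending eigenvalues
    B = eigvecs[:, -k:]                    # orthonormal basis of top-k subspace
    Z = Xc @ B                             # coordinates of the orthogonal projection
    return Z @ B.T + mean                  # reconstruction in the original space

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
# average squared reconstruction error -- the quantity PCA minimizes
err = np.mean(np.sum((X - pca_reconstruct(X, 2)) ** 2, axis=1))
```

Keeping all components (k equal to the number of features) reproduces the data exactly, and the error grows as k shrinks, which is exactly the trade-off the course formalizes.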
Offered by:
About this Course
What you'll learn
Implement mathematical concepts using real-world data
Derive PCA from a projection perspective
Understand how orthogonal projections work
Master PCA
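To give a flavor of the projection material listed above: the orthogonal projection of a vector b onto the subspace spanned by the columns of a basis matrix B is given by the standard formula pi(b) = B (B^T B)^{-1} B^T b. The helper below is a minimal sketch of that formula, not code from the course.

```python
import numpy as np

def project(b, B):
    """Orthogonal projection of vector b onto the column space of B.

    Implements pi(b) = B (B^T B)^{-1} B^T b for a basis matrix B
    with linearly independent columns.
    """
    P = B @ np.linalg.inv(B.T @ B) @ B.T   # projection matrix
    return P @ b

b = np.array([6.0, 0.0, 0.0])
B = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])                 # basis of a 2D subspace of R^3
p = project(b, B)
```

The defining property of the projection is that the residual b - p is orthogonal to every basis vector of the subspace, i.e. B^T (b - p) = 0.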
Skills you'll gain
- Dimensionality Reduction
- Python Programming
- Linear Algebra
Syllabus - What you will learn from this course
Statistics of Datasets
Inner Products
Orthogonal Projections
Principal Component Analysis
Reviews
- 5 stars: 51.12%
- 4 stars: 22.58%
- 3 stars: 12.74%
- 2 stars: 6.65%
- 1 star: 6.89%
Top reviews from MATHEMATICS FOR MACHINE LEARNING: PCA
Definitely the most challenging of the courses making up this specialization. Finishing it with full scores is proportionally far more satisfying! Well done, Marc!
Very challenging at times, but a very good course nonetheless. Would recommend it to anyone who has a solid foundation in Linear Algebra (Course 1) and Multivariate Calculus (Course 2).
This is one hell of an inspiring course that demystified the difficult concepts and math behind PCA. Excellent instructors who impart this knowledge with easy-to-understand illustrations.
Overall the course was great. The only thing was that there was a lot I didn't understand from the videos. The recommended textbook resource was a great help.
About the Mathematics for Machine Learning Specialization

Frequently Asked Questions
When will I have access to the lectures and assignments?
What will I get if I subscribe to this Specialization?
Is financial aid available?
What level of programming is required to do this course?
How difficult is this course in comparison to the other two of this specialization?
Have more questions? Visit the Learner Help Center.