This intermediate-level course introduces the mathematical foundations needed to derive Principal Component Analysis (PCA), a fundamental dimensionality reduction technique. We'll cover some basic statistics of data sets, such as mean values and variances; we'll compute distances and angles between vectors using inner products; and we'll derive orthogonal projections of data onto lower-dimensional subspaces. Using all these tools, we'll then derive PCA as a method that minimizes the average squared reconstruction error between data points and their reconstructions.
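To make the objective concrete, here is a minimal NumPy sketch of that idea (not the course's assignment code): PCA via the eigendecomposition of the data covariance matrix, followed by an orthogonal projection onto the top-k principal subspace and a measurement of the average squared reconstruction error. The function name `pca_reconstruct` is illustrative, not from the course.

```python
import numpy as np

def pca_reconstruct(X, k):
    """Project data onto the top-k principal subspace and reconstruct it."""
    mean = X.mean(axis=0)
    Xc = X - mean                      # center the data
    cov = Xc.T @ Xc / len(X)           # data covariance matrix
    # eigh handles symmetric matrices; eigenvalues come back in ascending order
    eigvals, eigvecs = np.linalg.eigh(cov)
    B = eigvecs[:, -k:]                # orthonormal basis of the top-k subspace
    codes = Xc @ B                     # coordinates of each point in the subspace
    X_rec = codes @ B.T + mean         # orthogonal projection back to data space
    # average squared reconstruction error -- the quantity PCA minimizes
    error = np.mean(np.sum((X - X_rec) ** 2, axis=1))
    return X_rec, error

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
_, err2 = pca_reconstruct(X, 2)        # some information is lost
_, err5 = pca_reconstruct(X, 5)        # full dimensionality: error near zero
```

Keeping all 5 components reconstructs the data exactly (up to floating-point error), while fewer components trade reconstruction accuracy for a lower-dimensional representation.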
About this Course
Skills you'll gain
- 5 stars: 51.02%
- 4 stars: 22.60%
- 3 stars: 12.83%
- 2 stars: 6.69%
- 1 star: 6.83%
Top reviews of MATHEMATICS FOR MACHINE LEARNING: PCA
Challenging, but doable. Has some bugs in coding assignments, but clearing them out makes you understand things better. Get ready to spend extra time understanding the concepts.
Course content is interesting and well planned. It could be improved by making it simpler for students, as it was more technical than the other 2 courses of the Specialization.
Programming assignment for week 1 wastes too much time due to lack of instructions.
The notebook also does not work... (maybe locally, but I have other things to do).
Great capstone for the three-class Mathematics for Machine Learning series. Assignments were way harder, and programming debugging skills had to be adequate in order to finish the class.
About the Mathematics for Machine Learning Specialization
Frequently Asked Questions
When will I have access to the lectures and assignments?
What will I get if I subscribe to this Specialization?
Is financial aid available?
What level of programming is required to do this course?
How difficult is this course in comparison to the other two of this specialization?
Have more questions? Visit the Learner Help Center.