
Forecasting and dimension reduction
Objectives
• Invite the student to use useful mathematical methods for data prediction and analysis.
• Familiarize the student with the mathematical language used in the fundamental methods of Data Science and Machine Learning.
• Formalize some supervised and unsupervised learning algorithms.
• Make the student aware of the usage restrictions and the practical benefits that mathematical formalization implies.
Syllabus
Block one: Prediction
1. Linear regressions (a code sketch follows this item)
a. Examples
b. Advantages and disadvantages
c. Convex optimization
d. Stochastic noise
e. Stochastic attributes of the algorithm
f. Algebraic solution: matrix inversion
g. Geometric interpretation
h. Analytical solution: Gradient Descent
i. Stochastic solution: Stochastic Gradient Descent
j. Learning capacity
k. Logistic regressions
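As an illustration of items f and h above, here is a minimal sketch in Python with NumPy (hypothetical data and variable names, not taken from the course notes) of the algebraic solution via the normal equations and the analytical solution via gradient descent:

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(100, 3))                  # design matrix: 100 samples, 3 features
    w_true = np.array([1.0, -2.0, 0.5])
    y = X @ w_true + 0.1 * rng.normal(size=100)    # targets with stochastic noise

    # Algebraic solution: solve the normal equations (X^T X) w = X^T y
    w_alg = np.linalg.solve(X.T @ X, X.T @ y)

    # Analytical solution: gradient descent on the convex least-squares loss
    w = np.zeros(3)
    step = 0.05
    for _ in range(2000):
        grad = 2 * X.T @ (X @ w - y) / len(y)      # gradient of the mean squared error
        w -= step * grad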
2. Generalizations
a. Polynomial regressions (sketched in code below)
b. Splines
c. A look at kernels
d. A look at neural networks
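For instance (a hypothetical sketch, not from the notes), a polynomial regression is just a linear regression on expanded features:

    import numpy as np

    rng = np.random.default_rng(1)
    x = np.linspace(-1, 1, 50)
    y = np.sin(np.pi * x) + 0.1 * rng.normal(size=50)

    # Degree-3 polynomial features turn the problem back into a linear regression
    Phi = np.vander(x, N=4, increasing=True)       # columns: 1, x, x^2, x^3
    w = np.linalg.lstsq(Phi, y, rcond=None)[0]     # least-squares coefficients
    y_hat = Phi @ w                                # fitted values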
3. The curse of dimensionality
a. K-nearest neighbors
b. Some concrete calculations (one is sketched below)
c. Solutions: regularizers or dimension reduction
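A concrete calculation of the kind referred to in item b (a hypothetical illustration): in high dimension, distances from a query point to random points concentrate, so the "nearest" neighbor is barely nearer than the farthest one:

    import numpy as np

    rng = np.random.default_rng(2)
    for d in (2, 10, 100, 1000):
        P = rng.uniform(size=(500, d))     # 500 random points in the unit cube [0, 1]^d
        q = rng.uniform(size=d)            # a random query point
        dist = np.linalg.norm(P - q, axis=1)
        # the ratio of nearest to farthest distance tends to 1 as d grows
        print(d, round(dist.min() / dist.max(), 3))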
4. Linear regressions with regularizers (a ridge sketch follows this item)
a. The Tikhonov regularizer as a stabilizer
b. Ridge linear regression
i. Strongly convex optimization
ii. Algebraic solution
iii. Analytical solution
iv. Stochastic solution
v. Determination of the lambda parameter
vi. K-fold cross validation
c. Lasso linear regression
d. Elastic net linear regression
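A minimal sketch of ridge regression with the lambda parameter chosen by K-fold cross validation (pure NumPy; the data and the parameter grid are hypothetical):

    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.normal(size=(90, 5))
    y = X @ rng.normal(size=5) + 0.2 * rng.normal(size=90)

    def ridge(X, y, lam):
        # algebraic solution of the strongly convex problem: (X^T X + lam I) w = X^T y
        return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

    folds = np.array_split(np.arange(len(y)), 5)        # K = 5 folds
    for lam in (0.01, 0.1, 1.0, 10.0):
        errs = []
        for val in folds:
            train = np.setdiff1d(np.arange(len(y)), val)
            w = ridge(X[train], y[train], lam)
            errs.append(np.mean((X[val] @ w - y[val]) ** 2))
        print(lam, np.mean(errs))                       # keep the lambda with lowest error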
Block two: Numerical linear algebra
1. Matrix algebra
a. Basic concepts
b. Relationships with linear regressions
c. Tensor products (example below)
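For example, the tensor (Kronecker) product of two matrices, computed with NumPy (illustrative matrices only):

    import numpy as np

    A = np.array([[1, 2], [3, 4]])
    B = np.eye(2)
    K = np.kron(A, B)    # Kronecker (tensor) product: a 4 x 4 block matrix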
2. Decomposition of matrices (two factorizations are sketched after this item)
a. Motivation: curve interpolation
b. Gaussian decomposition
c. Singular value decomposition
d. Stochastic singular value decomposition
e. Non-negative matrix factorization
f. Cholesky decomposition
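Two of these factorizations sketched with NumPy (random illustrative matrices, not course data):

    import numpy as np

    rng = np.random.default_rng(4)
    A = rng.normal(size=(6, 4))

    # singular value decomposition: A = U diag(s) V^T
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    assert np.allclose(U * s @ Vt, A)

    # Cholesky decomposition of the symmetric positive-definite matrix A^T A = L L^T
    L = np.linalg.cholesky(A.T @ A)
    assert np.allclose(L @ L.T, A.T @ A)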
3. Linear dimension reduction (a PCA sketch follows this item)
a. PCA: Euclidean interpretation
b. PCA: stochastic interpretation
c. Cut-off
d. Robust PCA
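A minimal sketch of PCA under its Euclidean reading, computed through the singular value decomposition of the centered data (hypothetical data):

    import numpy as np

    rng = np.random.default_rng(5)
    X = rng.normal(size=(200, 10))

    Xc = X - X.mean(axis=0)                  # center the data
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

    k = 2                                    # cut-off: keep the top-k principal components
    Z = Xc @ Vt[:k].T                        # reduced coordinates (scores)
    explained = (s[:k] ** 2).sum() / (s ** 2).sum()   # fraction of variance retained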
Download the notes here:
Visit our repository: