
Stochastic Calculus and Neural Networks
Goals
- Study the theoretical and technical foundations of stochastic processes needed to understand the ideas and scope of stochastic calculus, with an emphasis on financial theory.
- Introduce the student to deep learning methods through the training algorithms tied to stochastic approximation: stochastic gradient descent.
- Study some of the main hypotheses in finance related to Brownian motion, as well as their interaction with Itô's formal definition of the stochastic integral.
Course syllabus I
1. Martingales
- Basic notions of probability spaces and random variables
- Conditional expectation and filtrations
- Basic definitions of martingales
- First examples: the discrete case
- Doob's inequality and its financial interpretation
- Law of large numbers for martingales
- Relationship with Markov chains and ergodic theorems
- Pólya urns (see the simulation sketch after this list)
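As a small illustration of the martingale examples above, here is a minimal sketch of a Pólya urn simulation; the fraction of red balls is a classical martingale. The function name polya_urn and all parameter values are illustrative assumptions, not part of the official course material.

```python
import random

def polya_urn(steps, red=1, blue=1, seed=0):
    """Simulate a Polya urn: draw a ball uniformly at random and
    return it together with one extra ball of the same color.
    Returns the trajectory of the fraction of red balls, which is
    a martingale with respect to the natural filtration."""
    rng = random.Random(seed)
    fractions = []
    for _ in range(steps):
        if rng.random() < red / (red + blue):
            red += 1      # drew red: add another red ball
        else:
            blue += 1     # drew blue: add another blue ball
        fractions.append(red / (red + blue))
    return fractions

if __name__ == "__main__":
    path = polya_urn(10_000)
    # The fraction converges almost surely (martingale convergence theorem);
    # over repeated runs with red = blue = 1, the limit is uniformly distributed.
    print(f"final fraction of red balls: {path[-1]:.4f}")
```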
2. Neural networks and stochastic approximation
- Classical perceptron algorithm
- Stochastic gradient descent algorithm for the perceptron (see the sketch after this list)
- Neural networks in general
- Stochastic gradient descent in general
- Relationship with martingales and stochastic approximation algorithms
- General stochastic approximation
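As a companion to the perceptron items above, the following is a minimal sketch of a perceptron trained by stochastic gradient descent on toy data; the function name perceptron_sgd, the synthetic data, and the hyperparameters are illustrative assumptions.

```python
import numpy as np

def perceptron_sgd(X, y, lr=0.1, epochs=50, seed=0):
    """Train a perceptron with stochastic gradient descent.
    X: (n_samples, n_features) array; y: labels in {-1, +1}.
    Examples are visited in random order and the weights are
    updated only on misclassified points (the classical rule)."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        for i in rng.permutation(len(X)):
            if y[i] * (X[i] @ w + b) <= 0:   # misclassified example
                w += lr * y[i] * X[i]
                b += lr * y[i]
    return w, b

if __name__ == "__main__":
    # Toy linearly separable data: label is the sign of a shifted first coordinate.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(200, 2))
    y = np.sign(X[:, 0] + 0.1)
    w, b = perceptron_sgd(X, y)
    acc = np.mean(np.sign(X @ w + b) == y)
    print(f"training accuracy: {acc:.2f}")
```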
3. Brownian motion
- Formal definition
- Gaussian processes and random walks (see the sketch after this list)
- Markov property
- Wiener integral and its relation to the stochastic integral
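To illustrate the connection between random walks and Brownian motion listed above, here is a minimal sketch that approximates a Brownian path by a rescaled simple random walk, in the spirit of Donsker's invariance principle; the function name and parameters are illustrative.

```python
import numpy as np

def scaled_random_walk(n_steps, T=1.0, seed=0):
    """Approximate a standard Brownian motion on [0, T] by a scaled
    simple random walk: partial sums of +/-1 steps, each scaled by
    sqrt(T / n_steps), converge in distribution to Brownian motion."""
    rng = np.random.default_rng(seed)
    steps = rng.choice([-1.0, 1.0], size=n_steps)
    times = np.linspace(0.0, T, n_steps + 1)
    path = np.concatenate(([0.0], np.cumsum(steps))) * np.sqrt(T / n_steps)
    return times, path

if __name__ == "__main__":
    t, W = scaled_random_walk(100_000)
    # W(T) should be approximately N(0, T) for large n_steps.
    print(f"W(T) = {W[-1]:.3f}")
```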
Course syllabus II
1. Introduction to the mathematics of derivative products
- Case study: the one-period model (see the pricing sketch after this list)
- Simplified model
- Absence of arbitrage opportunities
- Geometric and probabilistic interpretation
- The N-period case
- Relationship with martingales
- Discrete Black-Scholes formula
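The following is a minimal sketch of risk-neutral pricing in the one-period binomial model mentioned above; the function name one_period_call_price and the numerical values in the example are illustrative, not taken from the course.

```python
def one_period_call_price(S0, u, d, r, K):
    """Price a European call in the one-period binomial model.
    The stock moves from S0 to u*S0 or d*S0; r is the one-period
    risk-free rate. Absence of arbitrage requires d < 1 + r < u,
    and the price is the discounted expected payoff under the
    risk-neutral probability q."""
    assert d < 1 + r < u, "arbitrage-free condition violated"
    q = (1 + r - d) / (u - d)            # risk-neutral probability of the up move
    payoff_up = max(u * S0 - K, 0.0)
    payoff_down = max(d * S0 - K, 0.0)
    return (q * payoff_up + (1 - q) * payoff_down) / (1 + r)

if __name__ == "__main__":
    # Example: S0 = 100, up factor 1.2, down factor 0.8, 5% rate, strike 100.
    print(f"call price: {one_period_call_price(100, 1.2, 0.8, 0.05, 100):.4f}")
```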
2. Stochastic integral
- Formal definition
- Properties and first examples
- Local martingales
- Itô processes
- Itô's fundamental lemma (stated below)
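For reference, a standard statement of Itô's lemma for an Itô process, as it is usually given in stochastic calculus texts; the notation below is illustrative.

```latex
% Itô's lemma: for an Itô process dX_t = \mu_t\, dt + \sigma_t\, dW_t
% and a function f of class C^{1,2},
f(t, X_t) = f(0, X_0)
  + \int_0^t \partial_s f(s, X_s)\, ds
  + \int_0^t \partial_x f(s, X_s)\, dX_s
  + \frac{1}{2} \int_0^t \partial_{xx} f(s, X_s)\, \sigma_s^2\, ds
```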
3. Applications and relationships with deep learning