Meetings: TuTh 9:30-10:45

Instructor: Timo Seppäläinen

Office: 425 Van Vleck. Office Hours: after class, or by appointment.

Phone: 263-3624

E-mail: seppalai and then at math dot wisc dot edu

Teaching Assistant: Congfang Huang. Office hours 1-4 PM Mondays, Van Vleck B205.

This is the course homepage. Part of this information is repeated in the course syllabus that you find on Canvas. Here you will find our weekly schedule and updates on scheduling matters. The Mathematics Department also has a general information page on this course. Deadlines are listed on the Registrar's page.

632 is a survey of several important classes of stochastic processes: Markov chains in both discrete and continuous time, point processes, and renewal processes. The material is treated at a level that does not require measure theory. Consequently, the technical prerequisites for this course are light: calculus and linear algebra are sufficient. However, the material is sophisticated, so a degree of intellectual maturity and a willingness to work hard are required. For this reason some 500-level work in mathematics is recommended as background, preferably in analysis (521).
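To give a flavor of the computations that come up in the course, here is a small illustrative sketch (not part of the course materials, and with a made-up transition matrix) that iterates the distribution of a two-state Markov chain and watches it converge to the stationary distribution:

```python
# Illustrative sketch only: a two-state Markov chain with a made-up
# transition matrix P.  Row i lists the probabilities of jumping
# from state i to states 0 and 1.
P = [[0.7, 0.3],
     [0.4, 0.6]]

def step(dist, P):
    """One step of the chain: new_dist = dist * P (row vector times matrix)."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

dist = [1.0, 0.0]      # start deterministically in state 0
for _ in range(100):   # iterate the one-step update
    dist = step(dist, P)

# The stationary distribution solves pi = pi * P; for this P it is
# pi = (4/7, 3/7), and the iterated distribution converges to it.
print(dist)
```

The convergence seen here is exactly the content of the Markov chain convergence theorem covered in Weeks 4-6.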

Good knowledge of undergraduate probability at the level
of UW-Madison Math 431 (or an equivalent course) is required.
This means familiarity with basic probability models,
random variables and their probability mass functions
and distributions, expectations, joint distributions,
independence,
conditional probabilities, the law of large numbers
and the central limit theorem.
If you wish to acquire a book for review, the Math 431 textbook *Introduction to
Probability* by Anderson, Seppäläinen and Valkó is recommended.

In class we go through theory, examples to illuminate the theory, and techniques for solving problems. Homework exercises and exam problems are paper-and-pencil calculations with examples and special cases, together with short proofs.

A typical advanced math course follows a strict theorem-proof format. 632 is not of this type. Mathematical theory is discussed in a precise fashion, but only some results can be rigorously proved in class. This is a consequence of time limitations and the desire to leave measure theory outside the scope of this course. Interested students can find the proofs in the literature. For a thoroughly rigorous probability course, students should sign up for the graduate probability sequence Math/Stat 733-734, which requires a background in measure theory from Math 629 or 721. An undergraduate sequel to 632 in stochastic processes is Math 635, Introduction to Brownian Motion and Stochastic Calculus.

We will cover Chapters 1-4 of Durrett's book and, if time permits, part of Chapter 5.

Course grades will be based on homework, with occasional quizzes possible (20%), two midterm exams (20% + 20%), and a comprehensive final exam (40%). Midterm exams will be in class on the following dates:

- Exam 1 Thursday October 11 (Week 6)
- Exam 2 Thursday November 15 (Week 11)

- Final exam: Thursday December 20, 12:25-2:25 PM, EDUC SCI 228.

Here are grade lines that can be guaranteed in advance. A percentage score in the indicated range guarantees at least the letter grade next to it.

[100,90] A, (90,87) AB, [87,76) B, [76,74) BC, [74,62) C, [62,50) D, [50,0] F.
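Read literally, the interval endpoints above say, for example, that a score of exactly 90 earns an A, exactly 87 a B, and exactly 50 an F. A sketch of the mapping as a function (purely illustrative; the function name is made up):

```python
def letter_grade(p):
    """Map a percentage score p to the guaranteed letter grade,
    following the bracket notation of the grade lines literally:
    90 is an A, 87 is a B, 76 is a BC, 50 is an F."""
    if p >= 90: return "A"
    if p > 87:  return "AB"
    if p > 76:  return "B"
    if p > 74:  return "BC"
    if p > 62:  return "C"
    if p > 50:  return "D"
    return "F"

print(letter_grade(90))  # A
print(letter_grade(87))  # B
```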

*Course grades will not be curved, but depending on the outcome of the exams the tentative grade lines above may be adjusted.*

- Homework is collected in class on the due date, or alternatively can be brought to the instructor's office or mailbox by 2 PM on the due date. Homework may also be turned in earlier at the instructor's office or mailbox. No late papers will be accepted.
- **Observe rules of academic integrity.** Handing in plagiarized work, whether copied from a fellow student or off the web, is not acceptable. Plagiarism cases will lead to sanctions. You are certainly encouraged to discuss the problems with your fellow students, but in the end you must write up and hand in your own solutions.
- Organize your work neatly. Use proper English. Write in complete English or mathematical sentences. Answers should be simplified as much as possible. If the answer is a simple fraction or expression, a decimal answer from a calculator is not necessary. But for some exercises you need a calculator to get the final answer.
- As always in mathematics, numerical answers alone carry no credit. It's all in the reasoning you write down.
- Put problems in the correct order and staple your pages together. Do not use paper torn out of a binder.
- Be neat. There should not be text crossed out. Recopy your problems. Do not hand in your rough draft or first attempt. Papers that are messy, disorganized or unreadable cannot be graded.

Week | Tuesday | Thursday |
---|---|---|
1 | BEFORE SEMESTER | Lecture notes on Canvas: IID processes and Markov chains. Simple random walk. Transition probability. |
2 | Multistep transition probabilities. | Recurrence, transience, strong Markov property. |
3 | Strong Markov property revisited from lecture notes on Canvas. Durrett's book, Section 1.3: Lemma 1.4, Theorem 1.5, definitions of closed and irreducible sets. Homework 1 due. | Section 1.3 from Durrett finished. |
4 | Examples of canonical decomposition and recurrence/transience (gambler's ruin, simple random walk). 1.4 Stationary distributions. Examples: two-state MC, gambler's ruin. | 1.4 Stationary distributions: renewal chain example (Durrett 1.22). Summary of facts concerning invariant measures and distributions. 1.5 Begin development towards MC convergence theorem: period of a recurrent state. Homework 2 due. |
5 | Discussion and examples of the Markov chain convergence theorem: renewal chain, two-state Markov chain, convergence to a stationary stochastic process. Beginning the proof of the convergence theorem. | Conclusion of the proof of the Markov chain convergence theorem. Review strong law of large numbers (SLLN) for IID random variables. |
6 | Dissection principle for Markov chains. Limiting frequency of visits to a state. SLLN for Markov chains. Homework 3 due. | Exam 1. Covers Sections 1.1-1.5 from Durrett's book. |
7 | 1.6 Detailed balance and reversible Markov chains. | 1.8 Exit distributions. 1.9 Exit times. |
8 | 2.1 Exponential, gamma and Poisson distributions. Homework 4 due. | 2.1 Exponential, gamma and Poisson distributions. 2.2 Poisson process on the line. |
9 | 2.2 Properties of the homogeneous Poisson process on the line. Exercise 2.22. Poisson process as a limit from a sequence of independent trials. | 2.3 Compound Poisson processes. 2.4 Thinning. Homework 5 due. |
10 | 2.4 Thinning, superposition and conditioning. | Conditioning: derivation of the joint conditional density f(x,y) = 2/t^2 of (T_1, T_2) conditional on N(t) = 2. Examples. |
11 | Review of Poisson processes. Homework 6 due. | Exam 2. Poisson processes. |
12 | 3.1 Laws of large numbers for renewal processes and renewal-reward processes. 3.3 Definition of age and residual lifetime. Stopping times for IID sequences. | THANKSGIVING |
13 | 3.3 Limit distributions for age and residual lifetime. Stationary renewal process. | 4.1 Continuous-time Markov chains: definitions and first examples. |
14 | 4.1 Construction of continuous-time Markov chains from given rates. 4.2 Kolmogorov's forward and backward equations. Generator matrix. 4.3 Invariant distributions and the convergence theorem. Homework 7 due. | 4.2 Exponentials of matrices. 4.3 Detailed balance and reversibility. Queueing examples. |
15 | Introduction to Brownian motion. Review questions. Homework 8 due. | SEMESTER OVER |
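The Week 10 fact that, given N(t) = 2, the Poisson arrival times (T_1, T_2) have joint conditional density 2/t^2 (i.e. they are distributed like the order statistics of two independent uniforms on [0, t]) can be checked numerically. A small simulation sketch, with an arbitrarily chosen rate and time horizon:

```python
import random

random.seed(0)
rate, t, trials = 2.0, 1.0, 200_000
t1_sum = t2_sum = kept = 0

for _ in range(trials):
    # Build a Poisson process on [0, t] from exponential interarrival times.
    arrivals, s = [], random.expovariate(rate)
    while s <= t:
        arrivals.append(s)
        s += random.expovariate(rate)
    if len(arrivals) == 2:      # condition on the event N(t) = 2
        kept += 1
        t1_sum += arrivals[0]
        t2_sum += arrivals[1]

# Under the uniform order-statistics description, E[T_1 | N(t)=2] = t/3
# and E[T_2 | N(t)=2] = 2t/3, so the averages should be near 1/3 and 2/3.
print(t1_sum / kept, t2_sum / kept)
```

Note that the conditional answer does not involve the rate at all, which the simulation reflects: changing `rate` changes how many trials are kept, but not the limiting averages.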

Check out the Probability Seminar for talks on topics that might interest you.

Timo Seppäläinen