It finally happened: both this year and the next I will be teaching in French. Trying to stay optimistic here, so here are the upsides:
- The load is concentrated in the Fall terms, with the salary spread throughout the year (+340 EUR, roughly, for 64 hours of classes a year).
- It’s the final opportunity to learn French (yep: first you start teaching, and then you learn the language).
- This is the way to deal with the “common” and “professional” ECTS credits. Yes, that was an occurrence of RAS syndrome.
The courses are MAT116 (Calculus for Engineers, lectures and seminars combined) and STA230 (Applied Statistics, seminars). Given my level of French, a cheat sheet with the corresponding vocabulary will probably be useful. Here we go.
Strong convexity provides an improved lower bound on a convex function; therefore we are within our rights to expect an improved rate of convergence for PGD. In this post we will find out how much we actually gain.
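As a minimal sketch of the setting (the function, projection set, and step size are my own assumptions, not from the post): projected gradient descent on the strongly convex quadratic f(x) = ½‖x − b‖², with μ = L = 1, projected onto the unit ball.

```python
import numpy as np

def project_unit_ball(x):
    """Euclidean projection onto the unit ball."""
    n = np.linalg.norm(x)
    return x if n <= 1.0 else x / n

def pgd(b, steps=50, lr=1.0):
    """Projected gradient descent for f(x) = 0.5 * ||x - b||^2."""
    x = np.zeros_like(b)
    for _ in range(steps):
        grad = x - b                      # gradient of 0.5 * ||x - b||^2
        x = project_unit_ball(x - lr * grad)
    return x

b = np.array([3.0, 4.0])                  # minimizer over the ball is b / ||b||
x_star = b / np.linalg.norm(b)
print(np.linalg.norm(pgd(b) - x_star))    # essentially zero
```

Since μ = L here, the step size 1/L lands on the unconstrained minimizer in one step and the projection finishes the job; for general μ < L one instead sees the geometric contraction that the post quantifies.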
Discovered an interesting discussion on math.stackexchange.
This is the first in a series of two posts aiming to provide some background on information theory. The itinerary for today is as follows: we will start by introducing the basic information quantities, then prove tensorization statements for them, i.e. show that they behave well under product measures. Tensorization, in turn, will let us prove a much stronger counterpart of McDiarmid’s inequality in the next post.
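A quick numerical illustration of what tensorization means in the simplest case (the function names and the toy distributions are mine): for the product measure P ⊗ Q, Shannon entropy splits additively, H(P ⊗ Q) = H(P) + H(Q).

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in nats) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                          # 0 * log 0 = 0 by convention
    return -np.sum(p * np.log(p))

P = np.array([0.2, 0.8])
Q = np.array([0.5, 0.3, 0.2])
PQ = np.outer(P, Q).ravel()               # product measure on the product space

print(entropy(PQ), entropy(P) + entropy(Q))  # equal up to floating-point error
```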
In the previous post of the mini-course, we proved the Azuma–Hoeffding inequality. We were able to relax the assumption that the random variables were independent, but we were dealing only with their sum. Now we will demonstrate that the martingale method also lets us control more general functions (now of independent arguments). The idea is to apply the Azuma–Hoeffding inequality to a generic random sequence called the Doob martingale.
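A small sketch under my own assumptions (the example and names are not from the post): the Doob martingale of f(X₁, …, Xₙ) = X₁ + ⋯ + Xₙ for fair coin flips Xᵢ ∈ {0, 1} is Dₖ = E[f | X₁, …, Xₖ], computable in closed form; each increment Dₖ − Dₖ₋₁ = Xₖ − ½ lies in {−½, +½}, which is exactly the bounded-differences condition Azuma–Hoeffding needs.

```python
import itertools
import numpy as np

n = 4  # number of fair coin flips

def doob(prefix):
    """E[f(X) | X_1..X_k = prefix] for f = sum of n fair {0,1} flips."""
    k = len(prefix)
    return sum(prefix) + 0.5 * (n - k)    # unrevealed flips average 1/2 each

# Check the bounded-increment property on every sample path.
for bits in itertools.product([0, 1], repeat=n):
    values = [doob(bits[:k]) for k in range(n + 1)]   # D_0, ..., D_n
    increments = np.diff(values)
    assert np.all(np.abs(increments) <= 0.5)

print("all increments lie in {-1/2, +1/2}")
```

Note that D₀ = E[f] = n/2 and Dₙ = f(X₁, …, Xₙ), so the martingale interpolates between the mean and the value we want to concentrate.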