One can also think of probability generating functions as (flipped) Z transforms, moment generating functions as (flipped) Laplace transforms, and characteristic functions as Fourier transforms of the respective PMF/PDF. A lot of their properties then follow from simple properties of Signals and Systems.


Do you have a reference that explains this in more detail? I'd be curious to know.


Don't have a reference off the top of my head, but the main idea is as follows:

The definition of the MGF of a random variable X with PDF f(x) is

E[e^{sX}] = int_{-inf}^{inf} f(x) e^{sx} dx

The definition of the (bilateral) Laplace transform of a signal f(t) is

F(s) = int _{-inf}^{inf} f(t) e^{-st} dt

Hence the MGF is a 'flipped' Laplace transform: M_X(s) = F(-s).
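As a quick sanity check of this correspondence, here is a sketch using SymPy with an Exp(lam) density as a hypothetical example (the specific distribution is my choice, not from the thread):

```python
import sympy as sp

x, lam, p = sp.symbols('x lam p', positive=True)
s = sp.Symbol('s', negative=True)  # s < 0 guarantees the MGF integral converges here

f = lam * sp.exp(-lam * x)  # PDF of an Exp(lam) random variable (zero for x < 0)

# MGF: E[e^{sX}] = int_0^oo f(x) e^{sx} dx
mgf = sp.integrate(f * sp.exp(s * x), (x, 0, sp.oo))

# One-sided Laplace transform F(p) of the same density
F = sp.laplace_transform(f, x, p, noconds=True)

# The MGF is the Laplace transform with the sign of the argument flipped
print(sp.simplify(mgf - F.subs(p, -s)))  # 0
```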

Now, we know that the MGF of a sum of independent RVs is the product of their MGFs. So if we take the inverse Laplace transform, the density of the sum is the convolution of the individual densities.
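One can check the convolution property numerically. A sketch with NumPy, taking two independent Exp(1) variables as a hypothetical example (their sum has a Gamma(2, 1) density):

```python
import numpy as np

dx = 0.001
x = np.arange(0.0, 20.0, dx)
f = np.exp(-x)  # PDF of Exp(1)

# Density of X1 + X2 for independent X1, X2 ~ Exp(1): convolution of the PDFs
conv = np.convolve(f, f)[:len(x)] * dx

# Closed form for the sum: Gamma(2, 1) density, x * e^{-x}
exact = x * np.exp(-x)

print(np.max(np.abs(conv - exact)))  # small discretization error, on the order of dx
```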

Similarly, taking a derivative in the frequency domain is the same as multiplying by the variable in the time domain: M'_X(s) is the 'flipped Laplace transform' of x f(x), and its value at s=0 is the 'DC gain' of that signal, i.e. E[X].
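Concretely, differentiating a numerically computed MGF at s = 0 recovers the mean. A sketch with SciPy, using an Exp(2) density (mean 0.5) as a hypothetical example:

```python
import numpy as np
from scipy.integrate import quad

lam = 2.0
pdf = lambda x: lam * np.exp(-lam * x)  # Exp(2) density; E[X] = 1/lam = 0.5

def mgf(s):
    # E[e^{sX}] by numerical integration (converges for s < lam)
    return quad(lambda x: pdf(x) * np.exp(s * x), 0, np.inf)[0]

# Central difference approximation of M'(0), the 'DC gain' of x * f(x)
h = 1e-5
mean = (mgf(h) - mgf(-h)) / (2 * h)
print(mean)  # ≈ 0.5
```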

And so on... the properties are all immediate consequences of the definition of the MGF, and since that definition is essentially the same as that of a Laplace transform, there is an equivalent property in signals and systems for each one.



