Information Theory, Inference, and Learning Algorithms (free ebook edition) (cam.ac.uk)
72 points by mbrubeck on Sept 30, 2009 | hide | past | favorite | 21 comments


For anyone interested in learning algorithms (and not just free books), I would also suggest Pattern Recognition and Machine Learning by Chris Bishop. Not free, but worth while. http://research.microsoft.com/en-us/um/people/cmbishop/prml/


I've been reading through this bit by bit over the last few days, mostly to get a handle on how to implement MCMC for Bayesian posteriors, and I have to say it's fantastically written. I wouldn't call it comprehensive or unbiased, but it lays out the connections between noisy channels, information theory, statistics, and machine learning about as effortlessly as possible.

Note: I'm buying it entirely because it has wide margins. Many of the calculations he outlines deserve to be worked out in full. Wide margins are absolutely the most important publishing concern for a math/science/engineering-based text.


If you want to learn how to implement MCMC I recommend:

Bayesian Logical Data Analysis for the Physical Sciences by Gregory

Gregory's book explains a lot more of the engineering (autocorrelations, step-size tuning, etc.). Even better, it discusses how to perform model selection using a clever annealing technique. Though model selection may not be of interest to you.
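For anyone who wants a concrete starting point for the kind of MCMC being discussed, here is a minimal random-walk Metropolis sketch. This is my own illustration, not code from either book; the function names and the target posterior are made up for the example. The `step_size` argument is the tuning knob the "step size jumping" comment refers to.

```python
import math
import random

def metropolis(log_post, x0, n_steps, step_size=1.0, seed=0):
    """Random-walk Metropolis sampler for a 1-D posterior.

    log_post: returns the log posterior density, up to an additive constant.
    step_size: width of the Gaussian proposal; tune it so roughly
    20-50% of proposals are accepted, and check autocorrelations.
    """
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    accepted = 0
    for _ in range(n_steps):
        x_new = x + rng.gauss(0.0, step_size)
        lp_new = log_post(x_new)
        # Accept with probability min(1, p(x_new)/p(x)), done in log space.
        if math.log(rng.random()) < lp_new - lp:
            x, lp = x_new, lp_new
            accepted += 1
        samples.append(x)
    return samples, accepted / n_steps

# Toy example: sample a standard normal "posterior".
samples, acc_rate = metropolis(lambda x: -0.5 * x * x, x0=0.0, n_steps=20000)
```

Real use needs burn-in discarding and autocorrelation checks, which is exactly the engineering Gregory's book covers.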

ps - MacKay's book is my nightly reading, so I'm not dissing MacKay :)


Fantastic book. The problems are interesting, and nicely bring out the connections between topics that would, on the surface, seem disparate.

Cover and Thomas is more textbookish, and in some ways, more detailed. Personally, I'd read this first, and then take on the interesting topics in Cover and Thomas.

I read a lot of math books, and I'd put this right on top along with Needham's 'Visual Complex Analysis'.


One of my favorite machine learning textbooks, if you can call it that. It's a little oddball though. For any topic, it's extremely interesting and insightful, though usually not comprehensive enough to rely on it as a reference.


I've said it before and I'll say it again: this book is great. It provides a really solid foundational understanding of ML, not the toolbox approach of other textbooks. I think the exposition of coding theory is especially nice compared to, say, Cover and Thomas.


God, this MacKay guy gets around: Sustainable Energy, this, and the Dasher project.


This has to be the only free CS theory book out there. It comes up way too often.



Wow! The Jaynes book is on the order of 3,000 pages! Looks like a potentially comprehensive reference for all things probability.

Can anyone comment on the quality of the writing?


Jaynes died before he finished it. It's still one of the most important books I ever read.


It's good, with a heavy focus on frequentist statistics and the maximum entropy approach.


I think to say the book has a heavy focus on frequentist statistics is a little misleading. Jaynes discusses a lot of frequentist methods but the emphasis is on doing so from a very Bayesian point of view.


He uses Bayesian methods in some cases, and non-Bayesian methods in others. For instance, on page 1412 he describes a problem for which Bayesian methods are "not appropriate." http://omega.albany.edu:8008/ETJ-PS/cc14g.ps

This is anathema to purist Bayesians like Radford Neal, who say that Jaynes's Maximum Entropy method is not consistent with Bayesian methods and that it "doesn't make any sense":

http://groups.google.com/group/sci.stat.consult/msg/2cf57ceb...
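For readers unfamiliar with the method being debated: Jaynes's Maximum Entropy principle picks, among all distributions satisfying known expectation constraints, the one with the largest entropy. A standard statement of the result (my summary, not from the thread):

```latex
% Maximize  H(p) = -\int p(x)\,\log p(x)\,dx
% subject to  \int p(x)\,dx = 1  and  \int f_k(x)\,p(x)\,dx = F_k .
% Lagrange multipliers give the exponential-family solution
p(x) = \frac{1}{Z(\lambda)} \exp\!\Big(-\sum_k \lambda_k f_k(x)\Big),
\qquad
Z(\lambda) = \int \exp\!\Big(-\sum_k \lambda_k f_k(x)\Big)\,dx,
```

with the multipliers \lambda_k chosen so the constraints hold. Fixing the mean and variance, for example, yields a Gaussian. The dispute is over whether this constitutes valid probabilistic inference or a separate procedure bolted onto it.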


More long-winded than most, but it's normal "physics" writing. Very clear. I recommend the preface whether or not you intend to read the whole thing.


It's probably pretty good. (Sorry.)



The other reason why it comes up so often is that it's damn good.


I love the irony that it's written by a member of the Physics faculty.

Ah, Cambridge.


Machine learning and the like is often done by people outside the computer science department. EE, statistics, math and physics are common.


At Cambridge, you find high-level theoretical machine learning research in the Engineering, Computer Science and Physics faculties. (My spies inside the DAMTP don't point to much going on there - but who knows). None of the above ever seem to talk to each other, sadly.



