Mar 11th, 2009| 01:04 pm | Posted by hlee
by T. Cover and J. Thomas website: http://www.elementsofinformationtheory.com/
Once, perhaps more than once, I mentioned this book in my post on the most celebrated paper by Shannon (see the posting). I have also recommended the book in answer to offline inquiries, and it has always been on the list of favorite books that I like to use for teaching. So I am not shy about recommending this book to astronomers for its modern, objective perspectives and its practicality. Before advancing to more praise, I must say that those admiring words do not imply that I understand every line and problem of the book. Like many fields, information theory has grown fast since Shannon's monumental debut paper (1948), much as astronomers' observation techniques have. Without the contents of this book, most of which came after Shannon (1948), the internet, wireless communication, compression, etc. could not have been conceived. Since the notion of “entropy“, the core of information theory, is familiar to astronomers (physicists), the book may be received better among them than among statisticians; it should read more easily to astronomers than to statisticians. Continue reading ‘[Book] Elements of Information Theory’ »
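Since entropy is the quantity everything in the book builds on, here is a minimal sketch of Shannon's entropy, H(X) = -Σ pᵢ log₂ pᵢ, measured in bits. The function name and the example distributions are my own illustration, not taken from the book:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum_i p_i * log2(p_i), in bits.

    Terms with p_i = 0 contribute nothing, by the convention 0*log(0) = 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, hence carries less entropy.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

The same quantity, with natural logarithms and Boltzmann's constant, is the statistical-mechanical entropy physicists already know, which is why the book tends to feel familiar to them.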
Tags:
bandwidth,
book,
Cover,
data mining,
education,
Entropy,
Information theory,
Kolmogorov complexity,
Shannon,
Thomas Category:
Algorithms,
arXiv,
Cross-Cultural,
Data Processing,
Jargon,
Quotes |
Comment
Sep 5th, 2008| 08:28 pm | Posted by hlee
My greatest concern was what to call it. I thought of calling it “information”, but the word was overly used, so I decided to call it “uncertainty”. When I discussed it with John von Neumann, he had a better idea. Von Neumann told me, “You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name, so it already has a name. In the second place, and more important, nobody knows what entropy really is, so in a debate you will always have the advantage.”
Continue reading ‘An anecdote on entropy’ »
Tags:
anecdote,
Entropy,
epistemology,
Information,
Information theory,
Shannon,
von Neumann,
Wikiquote Category:
Cross-Cultural,
Jargon,
Quotes,
Uncertainty |
Comment
Aug 27th, 2008| 02:35 pm | Posted by hlee
I didn’t realize this post had been sitting for a month, during which I almost neglected the slog. Just as there are great books on probability and information theory for statisticians and engineers, I believe there are great statistical physics books for physicists. On the other hand, relatively few exist that introduce one subject to the other kind of audience. In this regard, I thought this lecture note could be useful.
[arxiv:physics.data-an:0808.0012]
Lectures on Probability, Entropy, and Statistical Physics by Ariel Caticha
Abstract: Continue reading ‘A lecture note of great utility’ »
Tags:
Bayes Theorem,
Boltzmann,
Carnot,
Entropy,
Gibbs paradox,
Information,
laws of thermodynamics,
lecture note,
maximum likelihood,
probability,
Shannon,
statistical physics,
Tchebyshev inequality,
thermodynamics Category:
arXiv,
Bayesian,
Cross-Cultural,
Data Processing,
Fitting,
Physics,
Stat |
Comment
Oct 12th, 2007| 04:00 pm | Posted by hlee
Frankly, there was no astrostatistically interesting paper from astro-ph this week, but profitable papers from the statistics side were posted. For the list, click Continue reading ‘[ArXiv] 2nd week, Oct. 2007’ »