Document retrieved from a search engine cache. Original URL: http://hea-www.harvard.edu/AstroStat/slog/groundtruth.info/AstroStat/slog/2009/book-elements-of-information-theory/index.html
Indexed: Sat Mar 1 15:48:12 2014
[Book] Elements of Information Theory

by T. Cover and J. Thomas; website: http://www.elementsofinformationtheory.com/

Once, perhaps more than once, I mentioned this book in my post on Shannon's most celebrated paper (see the posting). I have also recommended it in reply to offline inquiries, and it has long been on the list of books I like to use for teaching. So I am not shy about recommending it to astronomers for its modern perspective and practicality. Before offering more praise, I must say that these admiring words do not imply that I understand every line and problem in the book. Like many fields, information theory has grown rapidly since Shannon's monumental debut paper (1948), much like astronomers' observing techniques. Without the material in this book, most of which came after Shannon (1948), the internet, wireless communication, data compression, and so on could not have been conceived. Since the notion of "entropy," the core of information theory, is already familiar to astronomers (physicists), the book would be received better among them than among statisticians, and it should read more easily for astronomers than for statisticians.

My reason for recommending this book is that, in my view, some knowledge of information theory (data compression and channel capacity, in particular) would help in coping with limited bandwidth in this era of unprecedented, massive astronomical survey projects with satellites or ground-based telescopes.
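To give a flavor of the compression side, here is a minimal sketch of the quantity at the heart of it: the Shannon entropy of a source, which lower-bounds the average number of bits per symbol any lossless code can achieve. The symbol stream below is made up purely for illustration; a real pipeline would feed in, say, detector counts or pixel values.

```python
import math
from collections import Counter

def shannon_entropy(data):
    """Shannon entropy in bits per symbol: H = -sum p * log2(p)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# A made-up stream of symbols (stand-in for detector counts).
stream = [0, 0, 0, 0, 1, 1, 2, 3]
H = shannon_entropy(stream)
print(f"entropy: {H:.2f} bits/symbol")  # 1.75 for this 4-symbol distribution
# No lossless code can average fewer than H bits/symbol for this source,
# whereas a naive fixed-length code for 4 symbols spends 2 bits each.
```

The gap between the fixed-length 2 bits and the 1.75-bit entropy is exactly what variable-length codes (Huffman, arithmetic coding) exploit.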

The content can also be viewed from the perspective of applied probability; the basics of probability theory, including distributions and uncertainty, become familiar to astronomers without their having to wade through dense probability textbooks.

Many of my [MADS] series are motivated by the content of this book, from which I learned many practical data processing ideas and objectives (data compression, data transmission, network information theory, ergodic theory, hypothesis testing, statistical mechanics, quantum mechanics, inference, probability theory, lossless coding/decoding, convex optimization, etc.), although those [MADS] postings are not yet visible on the slog (I hope to get through most of them within several months; otherwise, someone else should continue my [MADS] series and introduce modern statistics to astronomers). Ideas commonly practiced in engineering could help accelerate data processing procedures in astronomy and make astronomical inference more efficient and consistent, goals that have been neglected because of many other demands. Here, I would rather defer discussing particular topics from the book and how astronomers have applied them (quite a few statistical jewels are hidden in ADS but not well explored). Through [MADS], I will discuss further how information theory can help in processing astronomical data, from collecting, pipelining, storing, extracting, and exploring to summarizing, modeling, estimating, inference, and prediction. Instead of covering the book's topics here, I would like to quote some interesting statements from its introductory chapter to offer a taste and to tempt you to read it.

… it [information theory] has fundamental contributions to make in statistical physics (thermodynamics), computer science (Kolmogorov complexity or algorithmic complexity), statistical inference (Occam’s Razor: The simplest explanation is best), and to probability and statistics (error exponents for optimal hypothesis testing and estimation).

… information theory intersects physics (statistical mechanics), mathematics (probability theory), electrical engineering (communication theory), and computer science (algorithmic complexity).

There is a pleasing complementary relationship between algorithmic complexity and computational complexity. One can think about computational complexity (time complexity) and Kolmogorov complexity (program length or descriptive complexity) as two axes corresponding to program running time and program length. Kolmogorov complexity focuses on minimizing along the second axis, and computational complexity focuses on minimizing along the first axis. Little work has been done on the simultaneous minimization of the two.
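A toy contrast may make the "program length" axis concrete. True Kolmogorov complexity is uncomputable, so the sketch below uses the length of a Python expression as a crude stand-in for descriptive complexity; the particular string is invented for the example.

```python
# A highly regular string admits a description far shorter than itself.
literal = "01" * 1000             # the string spelled out: 2000 characters
program = '"01" * 1000'           # a short description of the same string

assert eval(program) == literal   # both descriptions yield the same object
print(len(literal), len(program)) # 2000 vs 11: short program, same output
# A random 2000-character string would admit no such shortcut: its shortest
# description would be roughly the string itself.
```

This is the sense in which Kolmogorov complexity minimizes along the length axis, independently of how long the short program takes to run.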

The concept of entropy in information theory is related to the concept of entropy in statistical mechanics.
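The connection the quote points at can be sketched in a few lines: for W equally likely microstates, the Shannon entropy is log2(W) bits, so Boltzmann's S = k ln W differs from it only by the constant factor k ln 2. A minimal illustration (the choice W = 16 is arbitrary):

```python
import math

def shannon_entropy_bits(probs):
    """H = -sum p * log2(p), in bits; zero-probability terms are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# W equally likely microstates: H = log2(W) bits, maximal for fixed W.
W = 16
H = shannon_entropy_bits([1 / W] * W)
print(H)  # 4.0 bits = log2(16)
# Boltzmann's S = k * ln(W) is then just (k * ln 2) * H.
```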

In addition to the book's website, googling the title turns up tons of links, ranging from gambling and portfolio selection to computational complexity, with statistics, probability, statistical mechanics, communication theory, data compression, etc. in between (the order does not imply the relevance or importance of the subjects). This breadth is discussed in the introductory chapter. If you have the book in hand, regardless of edition, you might first want to look at Fig. 1.1, "Relationship of information theory to other fields," a diagram explaining the connections and similarities among these subjects.

Data analysis tools, methods, algorithms, and theories, including statistics (both exploratory data analysis and inference), should serve the goal of retrieving meaningful information from observations. Sometimes I feel that this priority is lost, like a ship without a captain, and that statistics or information science is treated as a black box without any interest in knowing what is inside.

I don't know how many astronomy departments offer classes in data analysis, data mining, information theory, machine learning, or statistics for graduate students. I saw none at my alma mater, although it recently started offering the famous summer school. The closest class I had was computational physics, which focused on solving differential equations (stochastic differential equations were not included) and optimization (I learned game theory there, unexpectedly; overall, I am still fond of what I learned in that class). I haven't seen astronomy graduate students in statistics classes or in EE/CS classes on signal processing, information theory, or data mining (some departments offer statistics classes for their own students, like courses on experimental design for students of agricultural science). What I sense in astronomy is insufficient educational effort for the new information era and the big survey projects. Yet I am very happy to see some apprenticeships coping with these new patterns in astronomical science. I only hope the effort grows beyond a few small guilds, and I wish they had more resources to make their work efficient as time goes on.