Wednesday 4 July, 11:20
In the mid-twentieth century, Claude Shannon introduced the concepts of entropy and information theory in the study of languages. Since then, information theory has spread across a diverse range of disciplines, so a natural question to ask is whether these ideas can provide new insights for exoplanets too. I will start by introducing a means of quantifying the entropy of planetary system architectures, with compelling evidence that Kepler multi-planet systems are non-random and thus retain some information content regarding their origins. Borrowing another idea from linguistics, I will also show how Zipf's Law - which describes the rank occurrence of words in languages - remarkably also appears in exoplanetary distributions. These insights can be used to define the principal features governing a predictive model for planetary systems, with the promise of predicting planets much as predictive text works on your smartphone. Together, then, the ideas of grammatical induction and linguistic information theory may provide important ways of quantifying the planetary census.
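To make the two borrowed ideas concrete, here is a minimal sketch of both quantities in Python. The multiplicity counts are hypothetical illustration values, not real Kepler data, and this is not the speaker's actual method - just the textbook definitions of Shannon entropy and an idealized Zipf rank-frequency distribution.

```python
import math

def shannon_entropy(counts):
    """Shannon entropy H = -sum(p * log2(p)) of a discrete distribution,
    in bits. `counts` are raw occurrence counts; zeros are skipped."""
    total = sum(counts)
    return -sum((c / total) * math.log2(c / total) for c in counts if c > 0)

# Hypothetical counts of systems with 1, 2, 3, 4 detected transiting
# planets (illustration only). A non-uniform, structured distribution
# has lower entropy than the maximally random uniform case.
multiplicity_counts = [500, 120, 40, 10]
print(f"Entropy: {shannon_entropy(multiplicity_counts):.3f} bits")
print(f"Maximum (uniform over 4 bins): {math.log2(4):.3f} bits")

def zipf_frequencies(n):
    """Idealized Zipf's Law: the item of rank r occurs with frequency
    proportional to 1/r, normalized over the first n ranks."""
    norm = sum(1 / r for r in range(1, n + 1))
    return [(1 / r) / norm for r in range(1, n + 1)]

print(zipf_frequencies(4))
```

Comparing the measured entropy of the observed architecture distribution against its uniform-distribution maximum is one simple way to express "non-random, therefore information-bearing": any shortfall from the maximum reflects structure in the population.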