Mutual information, one of the key concepts in information theory, provides a natural measure of dependence between two random variables. Its generalization to multivariate dependence, termed multi-information, will be considered in this lecture. In particular, I will discuss in detail how, with the help of the maximum entropy principle, the multi-information can be decomposed hierarchically into a sum of contributions from pairwise, triple-wise, and higher-order dependencies. The theoretical concepts will be illustrated with simple examples involving logic functions, such as the AND, OR, and XOR operations. Brief comments on applications of these concepts to real many-body systems will also be given as directions for further study.
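As a concrete illustration, the multi-information of a triple of binary variables can be computed directly as the sum of the marginal entropies minus the joint entropy. The sketch below (the function names are my own, and it computes only the total multi-information, not the full maximum-entropy hierarchical decomposition) applies this to the XOR and AND gates with uniform inputs:

```python
import itertools
import math
from collections import Counter

def entropy(probs):
    """Shannon entropy in bits of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def multi_information(triples):
    """Multi-information of three variables: sum of the marginal
    entropies minus the joint entropy. `triples` is a list of
    equally likely (x, y, z) outcomes."""
    n = len(triples)
    joint = entropy(c / n for c in Counter(triples).values())
    marginals = sum(
        entropy(c / n for c in Counter(t[i] for t in triples).values())
        for i in range(3)
    )
    return marginals - joint

# Outcomes (x, y, gate(x, y)) for uniform inputs x, y in {0, 1}
xor_gate = [(x, y, x ^ y) for x, y in itertools.product([0, 1], repeat=2)]
and_gate = [(x, y, x & y) for x, y in itertools.product([0, 1], repeat=2)]

print(multi_information(xor_gate))  # 1.0 bit
print(multi_information(and_gate))  # ~0.811 bits
```

For XOR the result is instructive: every pair of variables is independent (each pairwise mutual information vanishes), so the entire 1 bit of multi-information is a genuinely third-order dependence, whereas for AND part of the dependence is already captured at the pairwise level.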