Workshop

Information-Theoretic Principles in Cognitive Systems (InfoCog)

Noga Zaslavsky · Rava Azeredo da Silveira · Ronit Bustin · Ron M. Hecht

Room 215 - 216
Workshop Website
Fri 15 Dec, 6:15 a.m. PST

Information theory provides a mathematical framework for formulating and quantifying the fundamental limits of data compression and communication. These notions, rooted in analog and digital communication, are also relevant to other domains; as such, information theory spans a number of research fields. The aim of formulating, understanding, and quantifying the storage and processing of information is a thread that ties together these disparate fields, and especially the study of cognition in humans and machines. In particular, attempts to reach an integrative computational theory of human and artificial cognition often leverage information-theoretic principles as bridges between various cognitive functions and neural representations.

Insights from information-theoretic formalization have also led to tangible outcomes that influence the operation of artificial intelligent systems. One example is the information bottleneck (IB) approach, which has yielded insights into learning in neural networks (NNs) as well as tools for slow feature analysis and speech recognition. A central application of the IB approach to NNs views the data transfer between layers as an autoencoder. The approach then uses a variational approximation of the IB to produce a minimization objective that is feasible and results in efficient training (the variational IB, or VIB). In the other direction, the variational autoencoder (VAE) framework has also been used to explain cognitive functions. The IB approach has further been applied to emergent communication (EC) in both humans and machines, using a vector-quantized VIB (VQ-VIB) method that extends the aforementioned VIB method. Another example is the trade-off between information and value in the context of sequential decision making.
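For concreteness, the IB objective and its variational approximation mentioned above can be sketched as follows. This is the standard textbook formulation, not notation taken from the workshop itself; the decoder q, prior r, and trade-off parameter β are the usual variational choices.

```latex
% Information bottleneck: compress X into a representation T while
% preserving information about a relevance variable Y,
% under the Markov chain  T - X - Y.
\min_{p(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y)

% Variational IB (VIB) training objective: q(y \mid t) is a variational
% decoder and r(t) a variational prior on the representation.
\mathcal{L}_{\mathrm{VIB}}
  = \mathbb{E}_{p(x,y)} \, \mathbb{E}_{p(t \mid x)}
      \left[ -\log q(y \mid t) \right]
  \;+\; \beta \, \mathbb{E}_{p(x)}
      \left[ \mathrm{KL}\!\left( p(t \mid x) \,\middle\|\, r(t) \right) \right]
```

Minimizing the first term keeps the representation predictive of Y, while the KL term limits how much information T retains about X; β controls the trade-off.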
The corresponding formalism has led to tangible methods for solving sequential decision-making problems, and has even been used in an experimental study of mouse navigation and in studies of drivers' eye-gaze patterns and language models. In aiming to understand machine learning (ML), specifically in the context of NNs, or cognition, we need theoretical principles (hypotheses) that can be tested. To quote Shannon: "I personally believe that many of the concepts of information theory will prove useful in these other fields - and, indeed, some results are already quite promising - but the establishing of such applications is not a trivial matter of translating words to a new domain, but rather the slow tedious process of hypothesis and experimental verification. If, for example, the human being acts in some situations like an ideal decoder, this is an experimental and not a mathematical fact, and as such must be tested under a wide variety of experimental situations." Today, both ML and cognition research can draw on huge amounts of data, and establishing quantitative theories and corresponding computational methods could have a massive impact on progress in these fields.

Broadly, this workshop aims to further the understanding of information flow in cognitive processes and in neural network models of cognition. More concretely, this year's workshop goals are twofold. On the one hand, we wish to provide a fruitful platform for discussions on formulating the storage and processing of information in human or artificial cognitive systems via information-theoretic measures, such as the formalisms mentioned above. In particular, the workshop is designed to let information theory researchers take part in such discussions, allowing first-hand sharing of knowledge and ideas.
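The information-value trade-off for sequential decision making mentioned above is commonly written as an information-constrained objective. The following is a hedged sketch in our own notation (not the workshop's): the agent maximizes expected value while paying a cost for the information its policy carries about the state.

```latex
% Information-value trade-off: maximize expected value V under policy \pi,
% penalized by the mutual information between states S and actions A;
% 1/\beta sets the price of information.
\max_{\pi} \; \mathbb{E}_{\pi}\!\left[ V \right]
  \;-\; \frac{1}{\beta} \, I(S;A)
```

As β grows, information becomes cheap and the policy approaches the value-optimal one; as β shrinks, the policy is forced toward state-independent (low-information) behavior.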
On the other hand, we hope this workshop can advance and sharpen the research done on computing information-theoretic quantities, specifically for the needs and benefit of cognition research. The two aims of the workshop are not independent of one another: any information-theoretic formalism that we wish to verify experimentally must be, in some sense, computationally feasible. Moreover, we want computation and estimation methods to be developed in a way that is tailored to the open questions in human and artificial cognition. The workshop therefore focuses on bringing together researchers interested in integrating information-theoretic approaches with researchers focused on the computation and estimation of information-theoretic quantities, with the aim of tightening the collaboration between the two communities. The former come from cognitive science, neuroscience, linguistics, economics, and beyond. The latter pursue the computation and estimation of information-theoretic quantities for many reasons, and this line of research is gaining increasing attention due to advances in ML; in recent years, these researchers have developed new methods to measure information-related quantities.

Timezone: America/Los_Angeles

Schedule