TBA
Panel recording and full program
Date: Apr 12, 2-3:30 pm PDT
Panel recording and full program
Date: Feb 22, 2022, 9-11 am PST
Panel recording and full program
Date: Jan 20, 2022, 9-11 am PST
Panelists: Michael Graziano, Jonathan Cohen, Vasudev Lal, Joscha Bach
The seminal contribution "Attention Is All You Need" (Vaswani et al., 2017), which introduced the Transformer architecture, triggered a small revolution in machine learning. Unlike convolutional neural networks, which construct each feature out of a fixed neighborhood of signals, Transformers learn which data a feature on the next layer of a neural network should attend to. However, attention in neural networks is very different from the integrated attention of a human mind. In our minds, attention seems to be part of a top-down mechanism that actively creates a coherent, dynamic model of reality, and it plays a crucial role in planning, inference, reflection, and creative problem solving. Our consciousness appears to be involved in maintaining the control model of our attention.
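The learned-attention mechanism referenced above can be illustrated with a minimal sketch of scaled dot-product attention, the core operation of the Transformer. This is an illustrative NumPy implementation, not code from the panel; the variable names and toy dimensions are our own.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Scaled dot-product attention in the style of Vaswani et al. (2017).

    Each query computes a weighting over all keys, so the output mixes
    values from anywhere in the sequence -- unlike a convolution, whose
    receptive field is a fixed local neighborhood.
    """
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # softmax over keys (numerically stabilized)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output row is a learned mixture of the values

# toy example: 3 tokens, 4-dimensional embeddings
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 4)
```

In a full Transformer, Q, K, and V are learned linear projections of the input, so the network itself learns where to attend; the contrast the panel draws is between this purely bottom-up, feed-forward weighting and the top-down, model-maintaining attention of human minds.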
In this panel, we want to discuss avenues into our understanding of attention, in the context of machine learning, cognitive science and future developments of AI.
Panel recording and full program
Date: Dec 9, 2021, 9-11 am PST
Cristiano Castelfranchi: Grounding Sociality in Goal Theory
Christian Balkenius: Motivation, Emotion, and Attention
Dietrich Dörner: The Competence Motivation
Joscha Bach: Motivation for individual and collective agency
Panel recording and full program
Date: Oct 21, 2021, 9-11 am PDT
Mark Bickhard: Cognition and Truth Value
Stephen Grossberg: How Each Brain Makes a Mind: From Brain Resonances to Conscious Experiences
Yulia Sandamirskaya: Memory, intentionality, and autonomy enabled by neuronal attractor dynamics
Jerome Busemeyer: Modeling cognition and decision using quantum probability theory
Steven Rogers: What are the tenets for machine representations (artificial qualia?) that enable flexible behaviors?
Joscha Bach: Perception, Reflection and Coherence