Stephen Grossberg
Stephen Grossberg is a cognitive scientist, theoretical and computational psychologist, neuroscientist, mathematician, biomedical engineer, and neuromorphic technologist. He is the Wang Professor of Cognitive and Neural Systems and a professor emeritus of Mathematics & Statistics, Psychological & Brain Sciences, and Biomedical Engineering at Boston University.
Career
Early life and education
Grossberg first lived in Woodside, Queens, in New York City. His father died from Hodgkin's lymphoma when he was one year old. His mother remarried when he was five years old. He then moved with his mother, stepfather, and older brother, Mitchell, to Jackson Heights, Queens. He attended Stuyvesant High School in lower Manhattan after passing its competitive entrance exam, and graduated first in his class in 1957.

He began undergraduate studies at Dartmouth College in 1957, where he first conceived of the paradigm of using nonlinear differential equations to describe neural networks that model brain dynamics, as well as the basic equations that many scientists use for this purpose today. He went on to study both psychology and neuroscience, and received a B.A. from Dartmouth in 1961 as its first joint major in mathematics and psychology.
Grossberg then went to Stanford University, from which he graduated in 1964 with an MS in mathematics, before transferring to The Rockefeller Institute for Medical Research in Manhattan. In his first year at Rockefeller, he wrote a 500-page monograph, The Theory of Embedding Fields with Applications to Psychology and Neurophysiology, summarizing his discoveries up to that time. Grossberg received a PhD in mathematics from Rockefeller in 1967 for a thesis that proved the first global content addressable memory theorems about the neural learning models that he had discovered at Dartmouth. His PhD thesis advisor was Gian-Carlo Rota.
Entering academia
Grossberg was hired in 1967 as an assistant professor of applied mathematics at MIT following strong recommendations from Mark Kac and Rota. In 1969, Grossberg was promoted to associate professor after publishing a stream of conceptual and mathematical results about many aspects of neural networks, including a series of foundational articles in the Proceedings of the National Academy of Sciences between 1967 and 1971.

Grossberg was hired as a full professor at Boston University in 1975, where he is still on the faculty today. While at Boston University, he founded the Department of Cognitive and Neural Systems, several interdisciplinary research centers, and various international institutions.
Research
Grossberg is a pioneer of the fields of computational neuroscience, connectionist cognitive science, and neuromorphic technology. His work focuses on the design principles and mechanisms that enable the behavior of individuals, or machines, to adapt autonomously in real time to unexpected environmental challenges. This research has included neural models of vision and image processing; object, scene, and event learning, pattern recognition, and search; audition, speech, and language; cognitive information processing and planning; reinforcement learning and cognitive-emotional interactions; autonomous navigation; adaptive sensory-motor control and robotics; self-organizing neurodynamics; and mental disorders. Grossberg also collaborates with experimentalists to design experiments that test theoretical predictions and fill conceptually important gaps in the experimental literature, carries out analyses of the mathematical dynamics of neural systems, and transfers biological neural models to applications in engineering and technology. He has published 18 books or journal special issues and over 560 research articles, and holds 7 patents.

Grossberg has studied how brains give rise to minds since he took the introductory psychology course as a freshman at Dartmouth College in 1957. At that time, he introduced the paradigm of using nonlinear systems of differential equations to show how brain mechanisms can give rise to behavioral functions. This paradigm is helping to solve the classical mind/body problem, and it is the basic mathematical formalism used in biological neural network research today. In particular, in 1957–1958, Grossberg discovered widely used equations for short-term memory, or neuronal activation; medium-term memory, or activity-dependent habituation; and long-term memory, or neuronal learning. One variant of these learning equations, called Instar Learning, was introduced by Grossberg in 1976 into Adaptive Resonance Theory and Self-Organizing Maps for the learning of adaptive filters in these models. This learning equation was also used by Kohonen in his applications of Self-Organizing Maps starting in 1984. Another variant, called Outstar Learning, was used by Grossberg starting in 1967 for spatial pattern learning. Outstar and Instar learning were combined by Grossberg in 1976 in a three-layer network for learning multi-dimensional maps from any m-dimensional input space to any n-dimensional output space, an application that Hecht-Nielsen called Counter-propagation in 1987.
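In frequently cited forms (the coefficients A–F, learning rate ε, and signal function f below are illustrative placeholders; the exact terms vary from model to model in Grossberg's papers), these three equation families can be sketched as follows:

```latex
% Short-term memory (STM): shunting (membrane-equation) activation dynamics,
% where J_i^+ and J_i^- denote the total excitatory and inhibitory inputs to cell i
\[ \frac{dx_i}{dt} = -A x_i + (B - x_i)\,J_i^{+} - (x_i + C)\,J_i^{-} \]

% Medium-term memory (MTM): activity-dependent habituation of a transmitter gate z_i
\[ \frac{dz_i}{dt} = D\,(E - z_i) - F\,f(x_i)\,z_i \]

% Long-term memory (LTM), instar form: weights converging on an active target cell x_j
% track the presynaptic input pattern (the variant used in ART and self-organizing maps)
\[ \frac{dw_{ij}}{dt} = \epsilon\, f(x_j)\,\bigl(x_i - w_{ij}\bigr) \]

% LTM, outstar form: weights fanning out from an active source cell x_i
% learn the spatial pattern of activity across the target cells x_j
\[ \frac{dw_{ij}}{dt} = \epsilon\, f(x_i)\,\bigl(x_j - w_{ij}\bigr) \]
```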
Building on his Rockefeller PhD thesis, in the 1960s and 1970s Grossberg generalized the Additive and Shunting models to a class of dynamical systems that included these models as well as non-neural biological models, and proved content addressable memory theorems for this more general class. As part of this analysis, he introduced a Liapunov functional method to help classify the limiting and oscillatory dynamics of competitive systems by keeping track of which population is winning through time. This Liapunov method led him and Michael Cohen to discover in 1981, and to publish in 1982 and 1983, a Liapunov function that they used to prove that global limits exist in a class of dynamical systems with symmetric interaction coefficients that includes the Additive and Shunting models. This system is often called the Cohen-Grossberg model and Liapunov function. John Hopfield published the special case of the Cohen-Grossberg Liapunov function for the Additive model in 1984. In 1987, Bart Kosko adapted the Cohen-Grossberg model and Liapunov function, which proved global convergence of STM, to define an Adaptive Bidirectional Associative Memory that combines STM and LTM and also converges globally to a limit.
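A standard statement of the Cohen-Grossberg model and its Liapunov function (the notation follows common textbook presentations rather than any single paper) is:

```latex
% Cohen-Grossberg system with symmetric interaction coefficients c_{ij} = c_{ji},
% nonnegative amplification functions a_i, and nondecreasing signal functions d_j
\[ \frac{dx_i}{dt} = a_i(x_i)\Bigl[\, b_i(x_i) - \sum_{j=1}^{n} c_{ij}\, d_j(x_j) \Bigr] \]

% Liapunov function used to prove that trajectories converge to equilibria
\[ V(x) = -\sum_{i=1}^{n} \int_{0}^{x_i} b_i(s)\, d_i'(s)\, ds
          + \frac{1}{2} \sum_{j=1}^{n} \sum_{k=1}^{n} c_{jk}\, d_j(x_j)\, d_k(x_k) \]

% Differentiating along trajectories and using the symmetry of c_{ij} gives
\[ \frac{dV}{dt} = -\sum_{i=1}^{n} a_i(x_i)\, d_i'(x_i)
                   \Bigl[\, b_i(x_i) - \sum_{j=1}^{n} c_{ij}\, d_j(x_j) \Bigr]^{2} \le 0 \]

% The Additive (Hopfield, 1984) special case takes each a_i constant, each b_i
% linear in x_i plus an input term, and each d_j a sigmoid signal function.
```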
Grossberg has introduced, and developed with his colleagues, fundamental concepts, mechanisms, models, and architectures across a wide spectrum of topics about brain and behavior. He has collaborated with over 100 PhD students and postdoctoral fellows.
These models have provided unified and principled explanations of psychological and neurobiological data about processes including auditory and visual perception, attention, consciousness, cognition, cognitive-emotional interactions, and action in both typical, or normal, individuals and clinical patients. This work models how particular brain breakdowns or lesions cause behavioral symptoms of mental disorders such as Alzheimer's disease, autism, amnesia, PTSD, ADHD, visual and auditory agnosia and neglect, and disruptions of slow-wave sleep.
The models have also been used in many large-scale applications in engineering, technology, and AI. Taken together, they provide a blueprint for designing autonomous adaptive intelligent algorithms, agents, and mobile robots.
These results are brought together in a self-contained, non-technical, and conversational exposition in Grossberg's 2021 book Conscious Mind, Resonant Brain: How Each Brain Makes a Mind.
The book won the 2022 PROSE Award in Neuroscience from the Association of American Publishers.
Models that Grossberg introduced and helped to develop include:
- the foundations of neural network research: competitive learning, self-organizing maps, instars, outstars, avalanches, gated dipoles, and masking fields;
- perceptual and cognitive development, social cognition, working memory, cognitive information processing, planning, numerical estimation, and attention: Adaptive Resonance Theory, ARTMAP, STORE, CORT-X, SpaN, LIST PARSE, lisTELOS, SMART, CRIB;
- visual perception, attention, consciousness, object and scene learning, recognition, predictive remapping, and search: BCS/FCS, FACADE, 3D LAMINART, aFILM, LIGHTSHAFT, Motion BCS, 3D FORMOTION, MODE, VIEWNET, dARTEX, cART, ARTSCAN, pARTSCAN, dARTSCAN, 3D ARTSCAN, ARTSCAN Search, ARTSCENE, ARTSCENE Search;
- auditory streaming, perception, speech, and language processing: SPINET, ARTSTREAM, NormNet, PHONET, ARTPHONE, ARTWORD;
- cognitive-emotional dynamics, reinforcement learning, motivated attention, and adaptively timed behavior: CogEM, START, MOTIVATOR, and Spectral Timing;
- visual and spatial navigation: SOVEREIGN, STARS, ViSTARS, GRIDSmap, GridPlaceMap, Spectral Spacing;
- adaptive sensory-motor control of eye, arm, and leg movements: VITE, FLETE, VITEWRITE, DIRECT, VAM, CPG, SACCART, TELOS, SAC-SPEM;
- autism: iSTART.
Career and infrastructure development
The research centers and international institutions that Grossberg founded were aimed at answering two related questions: i) How does the brain control behavior? ii) How can technology emulate biological intelligence?
In 1987, Grossberg founded and was the first President of the International Neural Network Society (INNS), which grew to 3,700 members in 49 U.S. states and 38 countries during the fourteen months of his presidency. The formation of INNS soon led to the formation of the European Neural Network Society (ENNS) and the Japanese Neural Network Society (JNNS). Grossberg also founded the INNS official journal, Neural Networks, and was its Editor-in-Chief from 1987 to 2010. Neural Networks is also the archival journal of ENNS and JNNS.
Grossberg's lecture series at MIT Lincoln Laboratory triggered the national DARPA Neural Network Study in 1987–88, which led to heightened government interest in neural network research. He was General Chairman of the first IEEE International Conference on Neural Networks in 1987 and played a key role in organizing the first INNS annual meeting in 1988; the fusion of these two meetings in 1989 produced the International Joint Conference on Neural Networks (IJCNN), which remains the largest annual meeting devoted to neural network research. Grossberg also organized and chaired the annual International Conference on Cognitive and Neural Systems from 1997 to 2013, as well as many other conferences in the neural networks field.
Grossberg has served on the editorial boards of 30 journals, including Journal of Cognitive Neuroscience, Behavioral and Brain Sciences, Cognitive Brain Research, Cognitive Science, Neural Computation, IEEE Transactions on Neural Networks, IEEE Expert, and the International Journal of Humanoid Robotics.