Geoffrey Hinton


Geoffrey Everest Hinton is a British-Canadian computer scientist, cognitive scientist, and cognitive psychologist known for his work on artificial neural networks, which earned him the title "the Godfather of AI".
Hinton is University Professor Emeritus at the University of Toronto. From 2013 to 2023, he divided his time between Google Brain and the University of Toronto, before publicly announcing his departure from Google in May 2023, citing concerns about the risks of artificial intelligence technology. In 2017, he co-founded and became the chief scientific advisor of the Vector Institute in Toronto.
With David Rumelhart and Ronald J. Williams, Hinton co-authored a highly cited paper published in 1986 that popularised the backpropagation algorithm for training multi-layer neural networks, although they were not the first to propose the approach. Hinton is viewed as a leading figure in the deep learning community. The image-recognition milestone of AlexNet, designed in collaboration with his students Alex Krizhevsky and Ilya Sutskever for the 2012 ImageNet challenge, was a breakthrough in the field of computer vision.
Hinton received the 2018 Turing Award, together with Yoshua Bengio and Yann LeCun for their work on deep learning. They are sometimes referred to as the "Godfathers of Deep Learning" and have continued to give public talks together. He was also awarded, along with John Hopfield, the 2024 Nobel Prize in Physics for "foundational discoveries and inventions that enable machine learning with artificial neural networks".
In May 2023, Hinton announced his resignation from Google to be able to "freely speak out about the risks of A.I." He has voiced concerns about deliberate misuse by malicious actors, technological unemployment, and existential risk from artificial general intelligence. He noted that establishing safety guidelines will require cooperation among those competing in the use of AI in order to avoid the worst outcomes. After receiving the Nobel Prize, he called for urgent research into AI safety to work out how to control AI systems smarter than humans.

Education

Hinton was born on 6 December 1947 in Wimbledon, England, and was educated at Clifton College in Bristol. In 1967, he matriculated as an undergraduate at King's College, Cambridge, and, after repeatedly switching between fields including natural sciences, history of art, and philosophy, graduated with a Bachelor of Arts degree in experimental psychology from the University of Cambridge in 1970. He spent a year as a carpenter's apprentice before returning to academic study. From 1972 to 1975, he continued his studies at the University of Edinburgh, where he was awarded a PhD in artificial intelligence in 1978 for research supervised by Christopher Longuet-Higgins, who favoured the symbolic AI approach over the neural network approach.

Career

After his PhD, Hinton initially worked at the University of Sussex and at the MRC Applied Psychology Unit. After having difficulty getting funding in Britain, he worked in the US at the University of California, San Diego and at Carnegie Mellon University. He was the founding director of the Gatsby Charitable Foundation Computational Neuroscience Unit at University College London. He is University Professor Emeritus in the Department of Computer Science at the University of Toronto, where he has been affiliated since 1987.
Upon his arrival in Canada in 1987, Hinton was appointed a Fellow of the Canadian Institute for Advanced Research's first research program, Artificial Intelligence, Robotics & Society. In 2004, Hinton and collaborators successfully proposed the launch of a new program at CIFAR, "Neural Computation and Adaptive Perception" (NCAP), which today is named "Learning in Machines & Brains". Hinton went on to lead NCAP for ten years. Among the members of the program are Yoshua Bengio and Yann LeCun, with whom Hinton would go on to win the ACM A.M. Turing Award in 2018. All three Turing winners continue to be members of the CIFAR Learning in Machines & Brains program.
Hinton taught a free online course on Neural Networks on the education platform Coursera in 2012. He co-founded DNNresearch Inc. in 2012 with his two graduate students, Alex Krizhevsky and Ilya Sutskever, at the University of Toronto's department of computer science. In March 2013, Google acquired DNNresearch Inc. for $44 million, and Hinton planned to "divide his time between his university research and his work at Google".
In May 2023, Hinton publicly announced his resignation from Google. He explained his decision, saying he wanted to "freely speak out about the risks of A.I." and added that part of him now regrets his life's work.
Notable former PhD students and postdoctoral researchers from his group include Peter Dayan, Sam Roweis, Max Welling, Richard Zemel, Brendan Frey, Radford M. Neal, Yee Whye Teh, Ruslan Salakhutdinov, Ilya Sutskever, Yann LeCun, Alex Graves, Zoubin Ghahramani, and Peter Fitzhugh Brown.

Research

Hinton's research concerns the use of neural networks for machine learning, memory, perception, and symbol processing. He has written or co-written more than 200 peer-reviewed publications.
In the 1980s, Hinton was part of the "Parallel Distributed Processing" group at Carnegie Mellon University, which included notable scientists such as Terrence Sejnowski, Francis Crick, David Rumelhart, and James McClelland. This group favoured the connectionist approach during the AI winter, and their findings were published in a two-volume set. The connectionist approach adopted by Hinton holds that capabilities in areas like logic and grammar can be encoded into the parameters of neural networks, and that neural networks can learn them from data. Symbolists, on the other hand, advocated for explicitly programming knowledge and rules into AI systems.
In 1985, Hinton co-invented Boltzmann machines with David Ackley and Terry Sejnowski. His other contributions to neural network research include distributed representations, time delay neural networks, mixtures of experts, Helmholtz machines, and products of experts. An accessible introduction to his research can be found in his Scientific American articles of September 1992 and October 1993. In 1995, Hinton and colleagues proposed the wake-sleep algorithm, in which a neural network with separate pathways for recognition and generation is trained with alternating "wake" and "sleep" phases. In 2007, Hinton co-authored an unsupervised learning paper titled "Unsupervised learning of image transformations". In 2008, he developed the visualization method t-SNE with Laurens van der Maaten.

[Image: In 2016, from left to right: Russ Salakhutdinov, Richard S. Sutton, Geoffrey Hinton, Yoshua Bengio, and Steve Jurvetson]

While Hinton was a postdoc at UC San Diego, David Rumelhart, Hinton, and Ronald J. Williams applied the backpropagation algorithm to multi-layer neural networks. Their experiments showed that such networks can learn useful internal representations of data. In a 2018 interview, Hinton said that "David Rumelhart came up with the basic idea of backpropagation, so it's his invention". Although this work was important in popularising backpropagation, it was not the first to suggest the approach: reverse-mode automatic differentiation, of which backpropagation is a special case, was proposed by Seppo Linnainmaa in 1970, and Paul Werbos proposed using it to train neural networks in 1974.
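The kind of experiment described above can be sketched in a few lines of modern NumPy: a two-layer network trained with backpropagation learns the linearly inseparable XOR function by forming a useful hidden representation. This is an illustrative sketch, not the 1986 implementation; the architecture, seed, and hyperparameters are arbitrary choices.

```python
import numpy as np

# Two-layer sigmoid network trained on XOR with backpropagation.
rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
lr = 1.0

for _ in range(20000):
    h = sigmoid(X @ W1 + b1)      # forward pass: hidden representation
    out = sigmoid(h @ W2 + b2)    # forward pass: prediction
    # backward pass: propagate error gradients layer by layer
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= lr * h.T @ d_out
    b2 -= lr * d_out.sum(axis=0)
    W1 -= lr * X.T @ d_h
    b1 -= lr * d_h.sum(axis=0)
```

The hidden layer ends up encoding intermediate features that make XOR separable, which is the sense in which such networks "learn useful internal representations".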
In 2017, Hinton co-authored two open-access research papers about capsule neural networks, extending the concept of "capsule" introduced by Hinton in 2011. The architecture aims to better model part-whole relationships within objects in visual data. In 2021, Hinton presented GLOM, a speculative architecture idea also aiming to improve image understanding by modeling part-whole relationships in neural networks. In 2021, Hinton co-authored a widely cited paper proposing a framework for contrastive learning in computer vision. The technique involves pulling together representations of augmented versions of the same image, and pushing apart dissimilar representations.
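The pull-together/push-apart mechanism can be sketched as a SimCLR-style normalized-temperature cross-entropy loss over a batch of paired embeddings. This NumPy sketch follows the general recipe rather than the paper's exact code; the function name and default temperature are illustrative.

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.5):
    """Contrastive loss: z1[i] and z2[i] embed two augmented views of
    image i; positives are pulled together, all other pairs pushed apart."""
    z = np.concatenate([z1, z2], axis=0)
    z = z / np.linalg.norm(z, axis=1, keepdims=True)  # cosine similarity
    sim = z @ z.T / temperature
    n = len(z1)
    np.fill_diagonal(sim, -np.inf)                    # exclude self-pairs
    # each sample's positive partner sits n rows away (i <-> i + n)
    pos = np.concatenate([np.arange(n) + n, np.arange(n)])
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

When the two views of each image embed close together, the loss is low; when the pairing carries no information, it is high, which is what drives the encoder to produce augmentation-invariant representations.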
At the 2022 Conference on Neural Information Processing Systems, Hinton introduced a new learning algorithm for neural networks that he calls the "Forward-Forward" algorithm. The idea is to replace the traditional forward and backward passes of backpropagation with two forward passes, one with positive data and the other with negative data that could be generated solely by the network. The Forward-Forward algorithm is well suited to what Hinton calls "mortal computation", where the knowledge learned is not transferable to other systems and thus dies with the hardware, as can be the case for certain analog computers used for machine learning.
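A toy single-layer sketch of the idea follows, assuming Hinton's simple "goodness" measure (the sum of squared activations): the layer is nudged, using only local gradients and no backward pass, so that goodness exceeds a threshold on positive data and falls below it on negative data. The synthetic Gaussian data here is a stand-in; a real system would derive negative data from the network itself.

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(10, 32))  # one layer's weights
theta, lr = 2.0, 0.03                     # goodness threshold, step size

def layer(x, W):
    h = np.maximum(x @ W, 0.0)            # ReLU forward pass
    return h, (h ** 2).sum(axis=1)        # activations and "goodness"

for _ in range(500):
    x_pos = rng.normal(loc=0.5, size=(64, 10))   # stand-in positive data
    x_neg = rng.normal(loc=-0.5, size=(64, 10))  # stand-in negative data
    for x, sign in [(x_pos, +1.0), (x_neg, -1.0)]:
        h, g = layer(x, W)
        # logistic pressure pushing goodness above (pos) or below (neg) theta
        p = 1.0 / (1.0 + np.exp(-sign * (g - theta)))
        # local gradient of -log p with respect to W: no backward pass
        grad = x.T @ (((1.0 - p) * sign)[:, None] * 2.0 * h)
        W += lr * grad / len(x)
```

Because each layer's objective is purely local, the update never needs gradients propagated back from later layers, which is what makes the scheme plausible for hardware whose exact analog behaviour cannot be replicated elsewhere.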

Honours and awards

Hinton has been a Fellow of the US Association for the Advancement of Artificial Intelligence since 1990. He was elected a Fellow of the Royal Society of Canada in 1996, and then a Fellow of the Royal Society of London in 1998. He was the first winner of the Rumelhart Prize in 2001.
In 2001, Hinton was awarded an honorary Doctor of Science degree from the University of Edinburgh. He was elected an International Honorary Member of the American Academy of Arts and Sciences in 2003, and in the same year was elected a Fellow of the US Cognitive Science Society. He was the 2005 recipient of the IJCAI Award for Research Excellence, a lifetime-achievement award. He was awarded the 2011 Herzberg Canada Gold Medal for Science and Engineering, and in that same year an honorary DSc degree from the University of Sussex. In 2012, he received the Canada Council Killam Prize in Engineering. In 2013, he was awarded an honorary doctorate from the Université de Sherbrooke. Hinton was elected an Honorary Foreign Member of the Spanish Royal Academy of Engineering in 2015.
In 2016, Hinton was elected an International Member of the US National Academy of Engineering "for contributions to the theory and practice of artificial neural networks and their application to speech recognition and computer vision". He received the 2016 IEEE/RSE Wolfson James Clerk Maxwell Award. In 2016, he furthermore won the BBVA Foundation Frontiers of Knowledge Award in the Information and Communication Technologies category, "for his pioneering and highly influential work" to endow machines with the ability to learn.
Together with Yann LeCun and Yoshua Bengio, Hinton won the 2018 Turing Award for conceptual and engineering breakthroughs that have made deep neural networks a critical component of computing. Also in 2018, he became a Companion of the Order of Canada.
In 2021, he received the Dickson Prize in Science from Carnegie Mellon University, and in 2022 the Princess of Asturias Award in the Scientific Research category, along with Yann LeCun, Yoshua Bengio, and Demis Hassabis. In the same year, Hinton received an honorary DSc degree from the University of Toronto. In 2023, he was named an ACM Fellow, elected an International Member of the US National Academy of Sciences, and received the Lifeboat Foundation's 2023 Guardian Award along with Ilya Sutskever.
In 2024, he was jointly awarded the Nobel Prize in Physics with John Hopfield "for foundational discoveries and inventions that enable machine learning with artificial neural networks." His development of the Boltzmann machine was explicitly mentioned in the citation. When the New York Times reporter Cade Metz asked Hinton to explain in simpler terms how the Boltzmann machine could "pretrain" backpropagation networks, Hinton quipped that Richard Feynman reportedly said: "Listen, buddy, if I could explain it in a couple of minutes, it wouldn't be worth the Nobel Prize." That same year, he received the VinFuture Prize grand award alongside Yoshua Bengio, Yann LeCun, Jen-Hsun Huang, and Fei-Fei Li for groundbreaking contributions to neural networks and deep learning algorithms.
In 2025 he was awarded the Queen Elizabeth Prize for Engineering jointly with Yoshua Bengio, Bill Dally, John Hopfield, Yann LeCun, Jen-Hsun Huang and Fei-Fei Li. He was also awarded the King Charles III Coronation Medal.