Hebbian Learning

The Basic Idea

The neuroscientific concept of Hebbian learning was introduced by Donald Hebb in his 1949 book, The Organization of Behavior. Also known as Hebb's Rule or Cell Assembly Theory, Hebbian Learning attempts to connect the psychological and neurological underpinnings of learning.

The basis of the theory is that when our brains learn something new, neurons are activated and form connections with other neurons, creating a neural network. These connections start off weak, but each time the stimulus is repeated, they grow stronger and stronger, and the action becomes more intuitive.

A good example is the act of learning to drive. When you start out, everything you do is incredibly deliberate. You remind yourself to turn on your indicator, to check your blind spot, and so on. However, after years of experience, these processes become so automatic that you perform them without even thinking.
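
In computational terms, this strengthening-with-repetition is often summarized as a simple update rule: the weight of a connection grows in proportion to the joint activity of the neurons on either side of it. The snippet below is a minimal sketch of that rule in Python; the function name, learning rate, and toy activity values are our own illustrative choices rather than part of Hebb's original formulation.

    import numpy as np

    def hebbian_update(weights, pre, post, learning_rate=0.1):
        # Strengthen each connection in proportion to the co-activation
        # of its pre- and post-synaptic neurons: delta_w = eta * post * pre.
        return weights + learning_rate * np.outer(post, pre)

    # Two input neurons repeatedly co-active with one output neuron.
    w = np.zeros((1, 2))            # connections start off weak (here, zero)
    pre = np.array([1.0, 1.0])      # presynaptic activity
    post = np.array([1.0])          # postsynaptic activity
    for _ in range(5):              # each repetition strengthens the connection
        w = hebbian_update(w, pre, post)
    print(w)                        # weights have grown with repetition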

Neurons that fire together, wire together.

– Donald Hebb

Key Terms

Neuron

The basic building block of the brain, composed of a cell body, dendrites, and an axon. Neurons transmit information to other neurons via electrical impulses and chemical signals passed across synapses.

Synapse

The contact point where one neuron meets another, allowing them to pass messages to each other.

Synaptic Efficacy

How readily one neuron activates another through a synapse; this efficacy is strengthened with repetition.

Neural Network

A group of connected neurons.

Mirror Neurons

Neurons that not only fire when a person performs a certain action, but also fire when that same person observes someone else perform a similar action.

History

Canadian-born Donald Hebb originally wanted to be a writer, and received a BA from Dalhousie University. He worked as a teacher until he completed his Master's degree in psychology at McGill University. Intriguingly, Hebb sketched the basis of his idea of neural networks in his MA thesis, although he described the paper as 'nonsense' in later years.1

Showing an interest in the physiology of psychology, Hebb went on to do a PhD with Karl Lashley, a renowned behaviorist of the time, at the University of Chicago. It was likely this period of his career that sharpened his ability to combine behaviorist theories, particularly reinforcement learning, with physiology and neuroscience. Ivan Pavlov's influence is particularly apparent in Hebb's research, and many parallels have been drawn between Hebbian Learning and Pavlov's conditioning theory.2

When he finished his work with Lashley, Hebb returned to Montreal in 1937 to work with Wilder Penfield at the Montreal Neurological Institute. Influenced by Lashley and other psychologists who had observed how cognitive functions (particularly memory) were localized in specific regions of the brain, Hebb first outlined what would be labeled Hebbian Learning in his 1949 book, The Organization of Behavior.3 The idea that neurons are capable of forming networks that create and store memories, crucial to learning, was groundbreaking at the time, and remains hugely influential to this day.

Hebb's combining of psychological and neuroscientific concepts earned him the title of the "father of neuropsychology." Hebb is credited with connecting the abstract concept of 'the mind' to specific physiological and biological brain functions. To this day, psychologists and neuroscientists alike study neuropsychology, continuing to explore the connection between brain function and behavior.

People

Donald Hebb

Canadian psychologist and originator of Hebbian Learning, Donald Hebb is considered the 'father of neuropsychology.' His groundbreaking experiments linked conventional psychology with advances in physiology and biology.

Karl Lashley

American psychologist and behaviorist, Lashley is known for his many contributions to the study of learning and memory. He was one of the first psychologists to perform experiments on the brains of rats, studying the effects of lesions by removing specific areas of the cortex. Lashley made several groundbreaking discoveries about how the brain stores and processes information, which greatly influenced his PhD student, Donald Hebb.

Consequences

Hebb's theory of neural connections has had major implications for how neuroscientists and psychologists understand memory. One example is Long-Term Potentiation (LTP), a phenomenon identified in the late 1960s in which synapses are strengthened by recent patterns of activity, lending experimental support to Hebbian Learning.4 LTP is still a heavily researched topic today, with exciting developments in the areas of dementia, Alzheimer's disease, and addiction treatment.

When it was first introduced, Hebbian Learning was seen as one part of the puzzle surrounding memory function and storage. Around the 1950s, however, researchers grew increasingly confident that neural networks were responsible for storing and retrieving associations, which opened up a whole new area of neuroscientific discovery. Combined with Lashley's memory-localization experiments, these findings made it increasingly clear that human beings possess different types of memory, each located in a different part of the brain.

These hypotheses were supported by the fascinating case of Henry Molaison, who was, and still is, famously referred to as HM. HM suffered from epilepsy, and in 1953 his surgeon recommended that the medial temporal lobes of his brain be removed. The surgery largely controlled his epilepsy, but it severely impaired his explicit long-term memory (i.e., his ability to form new memories for factual information). HM could still deploy his short-term memory, and had no trouble acquiring new motor skills or using what psychologists call 'implicit' long-term memory (i.e., the ability to complete procedural tasks like riding a bike). HM was unable to recall, however, that he had ridden a bike the day before, since this was a "fact." As a result of HM's case, the medial temporal lobes are considered responsible for the formation of long-term explicit memories. Hebb's theory was used to support this approach to the study of memory, since the idea that memories are formed and stored through neural connections was consistent with HM's deficits.

More recent studies have looked at how neural connections can be strengthened, and therefore how learning can be enhanced. Indeed, many advancements in neuroplasticity and associative memory research can trace links back to Hebbian Learning.

Hebbian Learning is also the basis for several advances in computer science, especially artificial intelligence.

Controversies

In general, Hebbian Learning has been well-accepted across neuroscience and psychology.

A few criticisms do remain, especially when Hebbian learning is used as a basis for artificial intelligence or algorithm development. The primary issue is that incorrect or inappropriate neural connections can be made, and worse, strengthened, under Hebbian Learning. In some cases, this could lead to bad habits, poor learning, and potentially destructive behavior. Another issue is that there is no upper limit on how strong a connection can become, so from a computational perspective some form of control is required to prevent 'over-learning.' Finally, hundreds of experiments have shown that human learning is hugely influenced by feedback, both from other people and from the stimuli we encounter in our daily lives. Hebbian Learning does not fully account for this feedback mediation, although the area has received greater attention recently.5
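
To make the unbounded-growth criticism concrete, the sketch below contrasts a plain Hebbian update with Oja's rule, a well-known modification that adds a decay term so the weights stay bounded. The learning rate, input pattern, and iteration count are illustrative assumptions, not values from the literature.

    import numpy as np

    eta = 0.1
    x = np.array([1.0, 0.5])            # one fixed input pattern, presented repeatedly

    w_hebb = np.array([0.1, 0.1])       # plain Hebbian weights
    w_oja = np.array([0.1, 0.1])        # weights updated with Oja's rule

    for _ in range(1000):
        y = w_hebb @ x
        w_hebb = w_hebb + eta * y * x                 # plain Hebb: grows without limit

        y = w_oja @ x
        w_oja = w_oja + eta * y * (x - y * w_oja)     # Oja: decay term bounds the norm

    print(np.linalg.norm(w_hebb))       # astronomically large
    print(np.linalg.norm(w_oja))        # approximately 1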

Case Study

Hebbian Learning & Artificial Intelligence

The Hopfield Network, an artificial neural network introduced by John Hopfield in 1982, is based on rules derived from Hebbian Learning.6 By creating an artificial neural network, Hopfield showed that information can be stored and retrieved in ways similar to the human brain. Through repetition and continuous learning, an artificial system can strengthen particular connections, speeding up its processing of familiar situations and helping it 'learn' faster. This has proven especially useful for pattern recognition and the development of AI algorithms.
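
As a rough illustration of how such a network operates, the sketch below stores a bipolar pattern using a Hebbian outer-product rule and then retrieves it from a corrupted cue. The network size, pattern, and update schedule are illustrative assumptions rather than details from Hopfield's paper.

    import numpy as np

    def train_hopfield(patterns):
        # Store bipolar (+1/-1) patterns via the Hebbian outer-product rule.
        n = patterns.shape[1]
        W = np.zeros((n, n))
        for p in patterns:
            W += np.outer(p, p)      # strengthen links between co-active units
        np.fill_diagonal(W, 0)       # no self-connections
        return W / len(patterns)

    def recall(W, state, steps=10):
        # Repeatedly update the units until the network settles on a stored pattern.
        for _ in range(steps):
            state = np.sign(W @ state)
            state[state == 0] = 1
        return state

    # Store one 8-unit pattern, then retrieve it from a noisy version of itself.
    stored = np.array([[1, -1, 1, 1, -1, -1, 1, -1]], dtype=float)
    W = train_hopfield(stored)
    cue = stored[0].copy()
    cue[:2] *= -1                    # corrupt the first two units
    print(recall(W, cue))            # settles back to the stored pattern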

Related TDL resources

Algorithms for Simpler Decision-Making (1/2): The Case for Cognitive Prosthetics

Jason Burton explores how algorithms can and should be used to optimize our lives, but with a warning label attached.

Why do we retain information better when we learn it over a long time period?

The spacing effect occurs when information is repeatedly learned over a spaced-out period of time, resulting in an individual's increased ability to remember the information.

Tempting the Creation of Habits

Yasmine Kalkstein explores how we can create habits that become automatic behaviors. Much of this thinking stems from ideas that are linked to Hebbian Learning and reinforcement theory.

Sources

  1. Milner, P. (2003). A brief history of the Hebbian learning rule. Canadian Psychology, 44, 5-9.
  2. Langille, J. J., & Brown, R. E. (2018). The synaptic theory of memory: A historical survey and reconciliation of recent opposition. Frontiers in Systems Neuroscience, 12, 52.
  3. Hebb, D. O. (1949). The Organization of Behavior: A Neuropsychological Theory. New York: Wiley.
  4. Purves, D., Augustine, G., Fitzpatrick, D., Katz, L., LaMantia, A., McNamara, J., & Williams, S. (2001). Neuroscience (2nd ed.). Sunderland, MA: Sinauer Associates.
  5. McClelland, J. L. (2006). How far can you go with Hebbian learning, and when does it lead you astray? Processes of Change in Brain and Cognitive Development: Attention and Performance XXI, 21, 33-69.
  6. Sathasivam, S. (2008). Logic learning in Hopfield networks. arXiv:0804.4075 [cs.LO].
