Overview: From Nerve Signal to Memory
Information processing and storage in biology are about how nervous systems:
- receive and integrate signals from many sources,
- transform them into meaningful patterns,
- store some of these patterns as memories,
- retrieve and use stored information to guide behavior.
This chapter bridges the detailed cellular mechanisms covered under excitation, conduction, and sense organs with the systems-level topics that follow: nervous systems in different animals, memory and consciousness, sleep, and the effects of psychoactive substances. Here, the focus is on general principles that apply across many nervous systems, not on any one species.
Information is handled at three main levels:
- Electrical patterns in single neurons – changes in membrane potential, firing or not firing.
- Spatiotemporal patterns in networks – which neurons fire, in what order, and how strongly they are connected.
- Long‑term changes in synapses and networks – structural and functional modifications that embody memory.
These levels are tightly linked: moment-to-moment electrical activity can gradually reshape synapses; the reshaped network in turn constrains future activity patterns.
Coding and Representing Information
To process information, nervous systems must encode stimuli and internal states in a form neurons can handle. Several coding principles recur in many systems.
Rate Coding
In rate coding, information is represented by how often a neuron fires action potentials in a given time.
- A stronger stimulus (e.g., brighter light, louder sound) often leads to a higher firing rate.
- The exact timing of individual spikes is less critical than the average number per unit time.
Rate coding is particularly common in early sensory pathways and in many motor pathways.
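To make this concrete, here is a minimal Python sketch, assuming a simple Bernoulli approximation of Poisson spiking (an illustrative abstraction, not a model of any particular neuron): a downstream "decoder" recovers stimulus strength from spike counts alone.

```python
import numpy as np

rng = np.random.default_rng(0)

def poisson_spike_train(rate_hz, duration_s, dt=0.001):
    """Spike train as 0/1 per time bin, firing with probability rate*dt."""
    n_bins = int(duration_s / dt)
    return (rng.random(n_bins) < rate_hz * dt).astype(int)

# A stronger stimulus maps onto a higher firing rate; a downstream
# "decoder" only needs the spike count per window, not exact spike times.
for stimulus_rate in (5, 20, 80):              # Hz
    spikes = poisson_spike_train(stimulus_rate, duration_s=2.0)
    estimated = spikes.sum() / 2.0             # spikes per second
    print(f"true rate {stimulus_rate:3d} Hz -> estimated {estimated:.1f} Hz")
```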
Temporal Coding
In temporal coding, the precise timing of spikes carries information.
Examples:
- The phase of spikes relative to an ongoing rhythm (e.g., brain oscillations).
- Delays between spikes in different neurons.
- Short bursts vs isolated spikes.
Temporal coding is especially important when:
- signals must be very fast (e.g., sound localization),
- information is sparse but precise timing is possible.
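As an illustration of the sound-localization case above, the following hypothetical sketch decodes the delay between a "left" and a "right" spike train by sliding one past the other; the stimulus frequency, spike generation, and delay are illustrative choices, not measured values.

```python
import numpy as np

dt = 0.0001                          # 0.1 ms time resolution
true_itd_s = 0.0004                  # sound reaches one ear 0.4 ms later

t = np.arange(0, 0.05, dt)
left = (np.sin(2 * np.pi * 500 * t) > 0.99).astype(float)   # phase-locked spikes
right = np.roll(left, int(true_itd_s / dt))                 # same spikes, delayed

# Slide the two trains past each other; the lag with maximal overlap
# is the decoded interaural time difference.
lags = np.arange(-20, 21)
overlap = [np.sum(left * np.roll(right, -lag)) for lag in lags]
best_lag = lags[int(np.argmax(overlap))]
print(f"decoded delay: {best_lag * dt * 1e3:.2f} ms")       # ~0.40 ms
```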
Population Coding
A single neuron is usually ambiguous: it can respond to many different inputs. Population coding reduces ambiguity by using patterns across many neurons:
- Each neuron responds best to certain features (e.g., a particular orientation of a line in the visual field).
- The combination of responses across a population uniquely specifies the stimulus.
Population codes are extremely common in sensory and motor systems. For example:
- In motor areas, a movement direction can be represented by averaging the “preferred directions” of many active neurons.
- In olfaction, each odor activates a particular combination of receptor types, forming an “odor pattern.”
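A minimal sketch of the motor example above, assuming illustrative cosine tuning curves: the movement direction is decoded as the firing-rate-weighted average of the neurons' preferred directions (a population vector).

```python
import numpy as np

preferred = np.deg2rad(np.arange(0, 360, 30))        # 12 neurons, evenly spaced
movement = np.deg2rad(75)                            # true movement direction

# Illustrative cosine tuning: each neuron fires most for movements
# toward its preferred direction (negative rates clipped to zero).
rates = np.maximum(0, np.cos(movement - preferred))

# Population vector: sum of unit vectors along each preferred
# direction, weighted by that neuron's firing rate.
x = np.sum(rates * np.cos(preferred))
y = np.sum(rates * np.sin(preferred))
decoded = np.rad2deg(np.arctan2(y, x))
print(f"decoded direction: {decoded:.1f} deg")       # close to 75 deg
```

No single neuron in this population pins down the direction on its own; only the weighted combination does, which is the essence of a population code.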
Labeled Lines vs Distributed Codes
Two extreme strategies for representing different kinds of information:
- Labeled line: individual pathways are dedicated to particular stimulus types or qualities.
- Example: separate nerve fibers for warm vs cold receptors.
- Distributed code: no single line is dedicated; information is encoded by activity patterns over many shared neurons.
- Example: recognition of complex visual objects.
Most nervous systems use a mix: labeled lines at early, specialized stages; distributed codes for complex features and abstract categories.
Convergence, Divergence, and Integration
To process information, neurons must combine inputs and distribute outputs.
Convergence
Convergence occurs when many neurons form synapses onto a single neuron.
- A postsynaptic neuron receives:
- excitatory inputs (depolarizing),
- inhibitory inputs (hyperpolarizing),
- often from many different sources and sensory modalities.
Consequences:
- Integration across space: a neuron compares inputs from different regions (e.g., two eyes) or receptors.
- Integration across time: repeated inputs can summate.
- Increased sensitivity: weak signals from multiple sources can add up to reach threshold.
Divergence
Divergence occurs when one neuron sends branches to many targets.
- A single sensory neuron can influence many interneurons and motor neurons.
- Divergent circuits amplify signals and allow parallel processing in different pathways.
Convergence and divergence together create complex networks, enabling:
- feature extraction,
- comparison (e.g., left vs right),
- decision‑like operations (e.g., “fire only if input A and input B are active”).
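The last operation can be sketched as a toy coincidence detector: a unit whose weights and threshold (both illustrative choices) are set so that neither input alone reaches threshold, but both together do.

```python
def coincidence_detector(input_a, input_b, weight=0.6, threshold=1.0):
    """Fires only if the summed, weighted input crosses threshold."""
    return weight * input_a + weight * input_b >= threshold

# Neither input alone (0.6) reaches threshold (1.0); together (1.2) they do.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", "fires" if coincidence_detector(a, b) else "silent")
```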
Spatial and Temporal Summation
Neurons integrate inputs in two basic ways:
- Spatial summation:
- Inputs from different synapses on a neuron’s membrane add together.
- Many weak inputs at different locations can together bring the neuron to threshold.
- Temporal summation:
- Rapidly repeated inputs at the same synapse can add up if they arrive before the previous effect has fully decayed.
Whether the neuron fires depends on the balance of excitation and inhibition (see below).
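A leaky-integrator sketch of temporal summation, with illustrative EPSP amplitude, time constant, and threshold: identical inputs bring the neuron to threshold only when they arrive in rapid succession, before earlier depolarizations have decayed.

```python
import numpy as np

def peak_depolarization(spike_times_ms, epsp_mv=5.0, tau_ms=10.0):
    """Peak of summed, exponentially decaying EPSPs (relative to rest)."""
    t = np.arange(0, 100, 0.1)
    v = np.zeros_like(t)
    for ts in spike_times_ms:
        v += epsp_mv * np.exp(-(t - ts) / tau_ms) * (t >= ts)
    return v.max()

threshold_mv = 12.0
for spikes in ([0, 40, 80], [0, 2, 4]):     # widely vs closely spaced inputs
    peak = peak_depolarization(spikes)
    print(f"inputs at {spikes} ms -> peak {peak:.1f} mV, "
          f"fires: {peak >= threshold_mv}")
```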
Excitation, Inhibition, and Signal Filtering
Information processing is not just about exciting neurons, but also about controlling and shaping that excitation.
Excitatory vs Inhibitory Balance
- Excitatory synapses increase the likelihood of an action potential.
- Inhibitory synapses decrease the likelihood.
The balance between them:
- prevents runaway excitation (e.g., seizures),
- sharpens responses (e.g., focus on relevant inputs),
- shapes when and where activity can occur.
This balance is flexible and can be modified by learning, hormones, and neuromodulators.
Lateral Inhibition and Contrast Enhancement
A common circuit motif is lateral inhibition:
- Excited neurons inhibit their neighbors.
- Result: strong excitation in one region suppresses weaker activity nearby.
Functions:
- Enhances contrast in sensory systems (e.g., visual edges, tactile borders).
- Sharpens representations (e.g., distinguishing similar stimuli).
- Prevents different activity patterns from blending into each other.
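A one-dimensional sketch of contrast enhancement by lateral inhibition, with an illustrative inhibition strength: each unit subtracts a fraction of its neighbors' input, which exaggerates the step at an intensity border.

```python
import numpy as np

# Input: a step in intensity (an "edge") across a row of receptors.
stimulus = np.array([1, 1, 1, 1, 5, 5, 5, 5], dtype=float)

# Each unit's output is its own input minus a fraction of its
# neighbors' inputs (edge padding avoids boundary artifacts).
inhibition = 0.2
padded = np.pad(stimulus, 1, mode="edge")
response = stimulus - inhibition * (padded[:-2] + padded[2:])

print("stimulus:", stimulus)
print("response:", np.round(response, 2))
# The units just beside the border under- and overshoot their
# neighbors, so the edge stands out more than in the raw input.
```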
Feedforward and Feedback Inhibition
Two ways inhibitory neurons participate in circuits:
- Feedforward inhibition:
- An input excites both a principal neuron and an inhibitory neuron.
- The inhibitory neuron quickly limits or shortens the principal neuron’s response.
- Acts as a “brake” to prevent excessive excitation.
- Feedback inhibition:
- A principal neuron excites an inhibitory neuron, which then suppresses the principal neuron (and often its neighbors).
- Acts like an automatic gain control or stabilizer.
These motifs are key elements of filtering and shaping information flow.
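Feedback inhibition as gain control can be sketched as a simple loop that settles to a compressed output rate; the linear units, gain, and update scheme are illustrative simplifications.

```python
def settle(drive, feedback_gain=0.5, steps=50):
    """Iterate the loop until the principal cell's rate stabilizes."""
    rate = 0.0
    for _ in range(steps):
        inhibition = feedback_gain * rate     # inhibitory cell tracks the principal cell
        rate = max(0.0, drive - inhibition)   # principal cell: drive minus feedback
    return rate

for drive in (2, 10, 50):
    print(f"drive {drive:2d} -> settled rate {settle(drive):.1f}")
# The loop settles at drive / (1 + gain): strong inputs are compressed,
# keeping activity within a stable range (automatic gain control).
```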
Network Motifs and Simple Computations
Complex nervous systems are built from recurring network motifs that implement simple computational functions.
Feedforward Networks
In a feedforward network, information flows in one direction: from input to output without loops.
- Typical in early sensory processing.
- Implement simple transformations:
- feature detection (orientation, color, frequency),
- combination of multiple inputs into a single output.
Because there are no loops, feedforward networks respond quickly and predictably, but have limited internal memory.
Recurrent Networks
In recurrent networks, neurons form loops, feeding activity back into earlier stages.
- Can sustain activity after the original input ends.
- Support:
- short‑term representations,
- pattern completion (recreating a full pattern from partial input),
- decision‑like dynamics (settling into one of several stable states).
Recurrent circuits are crucial for working memory, attention, and complex sequence processing.
Pattern Separation and Pattern Completion
Two opposite but complementary functions in memory‑related networks:
- Pattern separation:
- Similar input patterns are transformed into more distinct activity patterns.
- Reduces confusion between similar memories.
- Pattern completion:
- Partial or noisy input triggers the full stored pattern.
- Enables recall from incomplete cues.
Networks can be arranged such that some regions emphasize separation (e.g., at encoding) and others emphasize completion (e.g., at retrieval).
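Pattern completion can be sketched with a small Hopfield-style network (one illustrative formalization, not the only biological mechanism): a stored pattern is imprinted in the weights by a Hebbian outer product, and recurrent updates drive a corrupted cue back to the full pattern.

```python
import numpy as np

rng = np.random.default_rng(1)

stored = rng.choice([-1, 1], size=20)            # one stored +/-1 pattern
weights = np.outer(stored, stored) / stored.size # Hebbian outer product
np.fill_diagonal(weights, 0)                     # no self-connections

cue = stored.copy()
flip = rng.choice(stored.size, size=6, replace=False)
cue[flip] *= -1                                  # corrupt 6 of 20 units

state = cue
for _ in range(5):                               # recurrent settling
    state = np.sign(weights @ state)

print("cue matches stored:    ", int(np.sum(cue == stored)), "/ 20")
print("settled matches stored:", int(np.sum(state == stored)), "/ 20")
```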
Short‑Term vs Long‑Term Information Storage
Information storage occurs over multiple timescales, using different mechanisms.
Short‑Term (Working) Storage
Short‑term storage holds information over seconds to minutes, often to support ongoing tasks.
Key features:
- Limited capacity.
- Labile: easily disrupted by new inputs or distractions.
- Often relies on sustained patterns of activity in recurrent networks.
Mechanisms include:
- Persistent firing of specific neuron groups as long as information is held.
- Temporary changes in synaptic strength due to recent activity (short‑term synaptic plasticity).
No major structural changes in neurons are required; changes are mostly functional and reversible.
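Persistent firing as a storage mechanism can be sketched with a single self-exciting unit; the binary unit, gains, and threshold are illustrative.

```python
def unit_step(rate, external, self_excitation=1.0, threshold=0.5):
    """Binary unit driven by its own recurrent output plus external input."""
    return 1.0 if self_excitation * rate + external >= threshold else 0.0

rate = 0.0
inputs = [0, 1, 0, 0, 0, -2, 0, 0]   # brief excitation at t=1, inhibition at t=5
for t, external in enumerate(inputs):
    rate = unit_step(rate, external)
    print(f"t={t}: input {external:+d}, rate {rate:.0f}")
# The unit keeps firing from t=1 until inhibited at t=5: the activity
# itself holds the information, with no synaptic change required.
```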
Long‑Term Storage
Long‑term storage keeps information for days to years.
Characteristics:
- More stable and resistant to interference.
- Often requires repeated or strong activation (e.g., practice, emotionally significant events).
Long‑term storage involves:
- Long‑lasting changes in synaptic strength (e.g., long‑term potentiation or depression; the exact details belong to later chapters on memory).
- Structural modifications:
- growth or elimination of synapses,
- changes in dendritic spines,
- altered expression of receptors and channels.
These changes provide a physical substrate (an engram) for stored information.
Synaptic Plasticity: The Basis of Learning and Memory
Synaptic plasticity is the activity‑dependent change in the strength or number of synapses. It links information processing to durable storage.
Activity‑Dependent Changes
Synapses can become:
- Stronger (more effective at exciting or inhibiting the postsynaptic neuron), or
- Weaker (less effective),
depending on specific patterns of activity.
General principles:
- Co‑activity: if presynaptic and postsynaptic neurons are active together, synapses between them often strengthen.
- Uncoordinated activity: if a presynaptic neuron is active but does not contribute effectively to postsynaptic firing, that synapse may weaken.
This implements a “correlation‑based” learning rule: cells that “participate together” in an event become more strongly linked.
Hebbian and Anti‑Hebbian Rules
A classic summary of one major principle is:
- “Cells that fire together, wire together” (Hebbian learning).
- If neuron A’s firing consistently helps neuron B fire, synapse A→B is strengthened.
- “Cells out of sync lose their link” (a related weakening rule).
- If neuron A’s activity does not correlate with neuron B’s firing, synapse A→B may be weakened.
Variations on these rules:
- Some circuits use anti‑Hebbian rules where co‑activity leads to weakening, often to enhance contrast or stabilize networks.
- Many forms are modulated by global signals (e.g., neuromodulators indicating reward or novelty).
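A minimal sketch of such a correlation-based rule, with an illustrative learning rate and weakening factor: a synapse whose input reliably coincides with postsynaptic firing strengthens, while one whose input does not decays.

```python
import numpy as np

rng = np.random.default_rng(2)
eta = 0.02                                   # learning rate

w_corr, w_uncorr = 0.5, 0.5
for _ in range(2000):
    pre = rng.random() < 0.5                 # presynaptic spike this step?
    post_corr = pre                          # postsynaptic cell driven by this input
    post_uncorr = rng.random() < 0.5         # postsynaptic cell independent of it

    if pre:                                  # update only when the input is active
        w_corr += eta * (post_corr - 0.75)   # +0.25 on co-activity, -0.75 otherwise
        w_uncorr += eta * (post_uncorr - 0.75)
    w_corr = min(max(w_corr, 0.0), 1.0)      # keep weights within [0, 1]
    w_uncorr = min(max(w_uncorr, 0.0), 1.0)

print(f"correlated synapse:   w = {w_corr:.2f}")    # grows to its ceiling
print(f"uncorrelated synapse: w = {w_uncorr:.2f}")  # decays toward zero
```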
Synaptic Scaling and Homeostasis
If plasticity were purely positive feedback (only strengthening frequently used synapses), networks would quickly saturate. To prevent this, neurons use homeostatic mechanisms:
- Synaptic scaling: global up‑ or down‑regulation of synaptic strengths to keep firing rates within useful ranges.
- Structural homeostasis: adjustments in number and distribution of synapses.
These mechanisms preserve overall stability while allowing specific connections to encode information.
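Synaptic scaling can be sketched as a multiplicative adjustment toward a target firing rate; the rates and weights are illustrative.

```python
import numpy as np

weights = np.array([0.2, 0.4, 1.2, 0.6])     # one neuron's synaptic weights
target_rate, actual_rate = 5.0, 8.0          # Hz; the cell fires too much

# Multiplicative scaling: every weight shrinks by the same factor, so
# overall drive falls while relative synaptic strengths are preserved.
scale = target_rate / actual_rate
scaled = weights * scale

print(f"before: {weights},  ratio w3/w1 = {weights[2] / weights[0]:.1f}")
print(f"after:  {np.round(scaled, 3)}, ratio w3/w1 = {scaled[2] / scaled[0]:.1f}")
```

Because every synapse is scaled by the same factor, the pattern of relative strengths that encodes information survives the homeostatic correction.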
Distributed Storage and Redundancy
In biological systems, information is rarely stored at a single location.
Engrams as Distributed Patterns
A memory trace (engram) is distributed:
- across many synapses and neurons,
- often across several brain regions.
Consequences:
- Robustness: loss or damage to some cells does not necessarily erase the memory.
- Graded degradation: damage may weaken or blur memories rather than eliminating them completely.
Redundancy and Degeneracy
Two types of protective design:
- Redundancy:
- Multiple copies of similar information are stored.
- Different neuron groups might encode similar patterns.
- Degeneracy:
- Different network configurations can produce functionally similar outputs.
- Allows flexible compensation after injury or developmental changes.
These properties make biological information storage more fault‑tolerant than a simple “one cell per item” scheme.
Internal Models and Prediction
Nervous systems do more than react to stimuli; they often build internal models of the body and environment.
Prediction and Error Signals
An internal model can:
- predict what sensory input should look like given current movements and past experience,
- compare predictions with actual input,
- generate error signals when reality deviates from prediction.
Error signals then:
- adjust synaptic strengths (learning),
- refine future predictions,
- guide adaptive behavior (e.g., correcting motor actions).
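A minimal sketch of this prediction-error loop as a delta-rule update (one simple formalization; the gains are illustrative): the model's error shrinks as its prediction converges on the actual input.

```python
learning_rate = 0.3
predicted_gain = 0.5   # the model's initial guess
true_gain = 2.0        # actual sensory consequence per unit of action

for step in range(8):
    action = 1.0
    predicted = predicted_gain * action          # what the model expects
    actual = true_gain * action                  # what the senses report
    error = actual - predicted                   # prediction error signal
    predicted_gain += learning_rate * error * action   # adjust the model
    print(f"step {step}: predicted {predicted:.2f}, error {error:.2f}")
# The error shrinks on each pass as the internal model converges on
# the true relationship between action and sensory input.
```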
Combining Bottom‑Up and Top‑Down Information
Information processing integrates:
- Bottom‑up signals from sensory organs (what is currently being sensed).
- Top‑down signals from internal models, memories, and goals (what is expected or desired).
Top‑down processes can:
- enhance relevant inputs (attention),
- suppress irrelevant or distracting signals,
- fill in missing information.
The balance between bottom‑up and top‑down inputs is central to perception, decision‑making, and many cognitive functions discussed in later chapters.
Information Flow Across the Nervous System
Finally, information processing and storage depend on how different regions are connected.
Hierarchical and Parallel Pathways
Many systems combine:
- Hierarchical organization:
- Early stages process simple features (e.g., brightness, edges).
- Later stages integrate features into complex representations (e.g., objects, scenes).
- Parallel pathways:
- Separate routes process different aspects of information simultaneously (e.g., motion vs form, “what” vs “where”).
This organization:
- increases processing speed,
- allows specialization,
- provides multiple routes to achieve similar functions.
Reentry and Loops Between Areas
Higher‑order regions send feedback to earlier ones, forming loops (reentry):
- Early sensory areas do not just feed forward; they are modulated by expectations, attention, and memory.
- Motor and sensory systems interact continuously (e.g., adjusting ongoing movements based on sensory feedback).
These loops are major sites where short‑term processing can be integrated with long‑term storage, shaping perception and behavior in context‑dependent ways.
In summary, information processing and storage in biological systems are inseparable:
- patterns of activity encode current information,
- plastic changes in synapses and networks convert activity patterns into stored traces,
- distributed and redundant organization makes storage robust,
- ongoing interaction between stored information and incoming signals underlies perception, decision‑making, and behavior.
Subsequent chapters on nervous systems, memory and consciousness, sleep, and psychoactive substances will examine how these general principles are implemented and modulated in specific organisms and brain systems.