
Information Processing and Storage

Overview: From Nerve Signal to Memory

Information processing and storage in biology concerns how nervous systems encode stimuli and internal states, transform and combine those signals, and retain the results over time.

This chapter bridges the detailed cellular mechanisms covered under excitation, conduction, and sense organs with the systems-level topics that follow: nervous systems in different animals, memory and consciousness, sleep, and the effects of psychoactive substances. Here, the focus is on general principles that apply across many nervous systems, not on any one species.

Information is handled at three main levels:

  1. Electrical patterns in single neurons – changes in membrane potential, firing or not firing.
  2. Spatiotemporal patterns in networks – which neurons fire, in what order, and how strongly they are connected.
  3. Long‑term changes in synapses and networks – structural and functional modifications that embody memory.

These levels are tightly linked: moment-to-moment electrical activity can gradually reshape synapses; the reshaped network in turn constrains future activity patterns.

Coding and Representing Information

To process information, nervous systems must encode stimuli and internal states in a form neurons can handle. Several coding principles recur in many systems.

Rate Coding

In rate coding, information is represented by how often a neuron fires action potentials in a given time window; a stronger stimulus typically produces a higher firing rate.

Rate coding is particularly common in early sensory pathways and in many motor pathways.
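As a toy illustration, a rate code can be read out by counting spikes in a time window. This is a minimal Python sketch; the spike times and window are invented for illustration:

```python
# Hypothetical sketch of a rate-code readout; all values are illustrative.

def firing_rate(spike_times, window_start, window_end):
    """Mean firing rate (spikes per second) within a time window."""
    duration = window_end - window_start
    n_spikes = sum(window_start <= t < window_end for t in spike_times)
    return n_spikes / duration

# A stronger stimulus is assumed to produce more spikes in the same window.
weak_response = firing_rate([0.10, 0.45, 0.80], 0.0, 1.0)
strong_response = firing_rate([0.05, 0.15, 0.30, 0.42, 0.55,
                               0.68, 0.81, 0.93], 0.0, 1.0)
```

Downstream neurons can compare such rates directly, which is why rate codes are robust to small timing jitter.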

Temporal Coding

In temporal coding, the precise timing of spikes carries information, not just their average rate.

A classic example is sound localization: sub-millisecond differences in spike timing between the two ears indicate the direction of a sound source.

Temporal coding is especially important when stimuli change rapidly, or when decisions must be made faster than an accurate rate estimate can be accumulated.
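A minimal sketch of a timing-based readout, using first-spike latency; the spike times are invented, and real sound-localization circuits are far more elaborate:

```python
# Hypothetical sketch: information carried by spike timing, not rate.

def first_spike_latency(spike_times, stimulus_onset):
    """Latency from stimulus onset to the first spike (seconds), or None."""
    later = [t for t in spike_times if t >= stimulus_onset]
    return min(later) - stimulus_onset if later else None

# Two ears receive the same sound with a tiny timing offset;
# the latency difference hints at the direction of the source.
left_spikes = [0.0102, 0.0150]
right_spikes = [0.0105, 0.0153]
itd = (first_spike_latency(right_spikes, 0.01)
       - first_spike_latency(left_spikes, 0.01))
# itd > 0 suggests the sound reached the left ear first.
```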

Population Coding

A single neuron is usually ambiguous: it can respond to many different inputs. Population coding reduces this ambiguity by reading out the pattern of activity across many neurons at once.

Population codes are extremely common in sensory and motor systems. A classic example is the coding of movement direction in motor cortex: each neuron fires most strongly for its preferred direction, and the population as a whole "votes" for the actual movement.
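One common way to read out such a code is the population vector: each neuron votes for its preferred direction, weighted by its firing rate. A minimal sketch, with invented tuning and rates:

```python
import math

# Hypothetical population-vector decoder; tuning and rates are invented.

def population_vector(preferred_dirs, rates):
    """Decode direction as the rate-weighted average of preferred directions."""
    x = sum(r * math.cos(d) for d, r in zip(preferred_dirs, rates))
    y = sum(r * math.sin(d) for d, r in zip(preferred_dirs, rates))
    return math.atan2(y, x)

# Four neurons tuned to 0, 90, 180, 270 degrees (in radians).
dirs = [0.0, math.pi / 2, math.pi, 3 * math.pi / 2]
rates = [10.0, 30.0, 10.0, 2.0]   # strongest vote: 90 degrees
decoded = population_vector(dirs, rates)
```

Because the estimate averages over many noisy neurons, it is more reliable than any single neuron's response.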

Labeled Lines vs Distributed Codes

Two extreme strategies exist for representing different kinds of information: labeled lines, in which each channel carries one specific type of signal, and distributed codes, in which meaning lies in the pattern of activity across many neurons.

Most nervous systems use a mix: labeled lines at early, specialized stages; distributed codes for complex features and abstract categories.

Convergence, Divergence, and Integration

To process information, neurons must combine inputs and distribute outputs.

Convergence

Convergence occurs when many presynaptic neurons form synapses onto a single target neuron.

As a consequence, the target can integrate multiple sources of information, weak signals can be pooled to improve sensitivity, and inputs can summate to reach threshold.

Divergence

Divergence occurs when one neuron sends branches to many targets.

Convergence and divergence together create complex networks, enabling signals to be broadcast widely, combined flexibly, and processed along parallel pathways.

Spatial and Temporal Summation

Neurons integrate inputs in two basic ways: spatial summation, in which inputs arriving simultaneously at different synapses add together, and temporal summation, in which inputs arriving in quick succession at the same synapse accumulate before the membrane potential has decayed.

Whether the neuron fires depends on the balance of excitation and inhibition (see below).
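The two forms of summation can be caricatured with a leaky integrator: the membrane potential decays between inputs, so only inputs arriving close together in time reach threshold. A sketch with invented amplitudes and decay factor:

```python
# Hypothetical leaky-integrator sketch; threshold, decay, and input
# amplitudes are illustrative, not physiological values.

def integrate(epsp_amplitudes, threshold=1.0, decay=0.5):
    """Temporal summation in a leaky neuron: each step, the membrane
    potential decays, then the next input (negative for IPSPs) adds.
    Returns True as soon as the threshold is crossed."""
    v = 0.0
    for amp in epsp_amplitudes:
        v = v * decay + amp
        if v >= threshold:
            return True
    return False

# A single subthreshold input fails, but inputs in quick succession summate.
single = integrate([0.6])
train = integrate([0.6, 0.6, 0.6])
```

Spatial summation looks the same in this caricature: several simultaneous inputs simply contribute one larger combined amplitude per step.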

Excitation, Inhibition, and Signal Filtering

Information processing is not only about exciting neurons but also about controlling and shaping that excitation.

Excitatory vs Inhibitory Balance

Most synapses are either excitatory (depolarizing) or inhibitory (hyperpolarizing). The balance between excitation and inhibition determines whether individual neurons fire, keeps overall network activity within a stable range, and shapes the timing and selectivity of responses.

This balance is flexible and can be modified by learning, hormones, and neuromodulators.

Lateral Inhibition and Contrast Enhancement

A common circuit motif is lateral inhibition: an active neuron suppresses its neighbors, usually via inhibitory interneurons.

This enhances contrast: strongly activated neurons stand out because they silence their weakly activated neighbors, sharpening edges and boundaries in sensory representations.
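A minimal sketch of lateral inhibition over a one-dimensional row of units; the inhibition strength and input values are invented:

```python
# Hypothetical contrast-enhancement sketch; values are illustrative.

def lateral_inhibition(activity, inhibition=0.3):
    """Each unit is suppressed in proportion to its neighbors' activity."""
    n = len(activity)
    out = []
    for i, a in enumerate(activity):
        neighbors = activity[max(0, i - 1):i] + activity[i + 1:i + 2]
        out.append(max(0.0, a - inhibition * sum(neighbors)))
    return out

# A step edge in the input: the dim unit next to the edge is suppressed
# most, so the transition is exaggerated after inhibition.
edge = [1.0, 1.0, 1.0, 5.0, 5.0, 5.0]
sharpened = lateral_inhibition(edge)
```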

Feedforward and Feedback Inhibition

Inhibitory neurons participate in circuits in two main ways. In feedforward inhibition, an input excites both a principal neuron and an interneuron that inhibits it, which limits how long the principal neuron can respond. In feedback inhibition, the principal neuron excites an interneuron that inhibits it in return, which stabilizes activity and can generate rhythms.

These motifs are key elements of filtering and shaping information flow.

Network Motifs and Simple Computations

Complex nervous systems are built from recurring network motifs that implement simple computational functions.

Feedforward Networks

In a feedforward network, information flows in one direction: from input to output without loops.

Because there are no loops, feedforward networks respond quickly and predictably, but have limited internal memory.
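As a toy example, even a two-layer feedforward net of threshold units can compute a nontrivial function; the weights below are hand-picked (purely illustrative) so the output fires for exactly one active input, an XOR-like response:

```python
# Hypothetical feedforward sketch with hand-picked weights.

def step(x):
    """Threshold unit: fire (1) when input reaches 1.0, else stay silent."""
    return 1 if x >= 1.0 else 0

def feedforward(inputs, w_hidden, w_out):
    """One pass through a two-layer feedforward net of threshold units.
    No loops: the output depends only on the current inputs."""
    hidden = [step(sum(w * x for w, x in zip(ws, inputs))) for ws in w_hidden]
    return step(sum(w * h for w, h in zip(w_out, hidden)))

w_hidden = [[1.0, 1.0],    # OR-like unit: fires when either input is active
            [0.5, 0.5]]    # AND-like unit: fires only when both are active
w_out = [1.0, -2.0]        # OR minus strong inhibition from AND -> XOR
```

The inhibitory output weight is doing the filtering described above: it vetoes the response when both inputs are active.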

Recurrent Networks

In recurrent networks, neurons form loops, feeding activity back into earlier stages.

Recurrent circuits are crucial for working memory, attention, and complex sequence processing.

Pattern Separation and Pattern Completion

Two opposite but complementary functions operate in memory-related networks: pattern separation, which makes similar input patterns more distinct so that stored memories do not interfere, and pattern completion, which recovers a full stored pattern from a partial or noisy cue.

Networks can be arranged such that some regions emphasize separation (e.g., at encoding) and others emphasize completion (e.g., at retrieval).
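Pattern completion can be sketched with a tiny Hopfield-style network: Hebbian weights store a pattern, and repeated updates pull a noisy cue back toward the stored state. A minimal single-pattern sketch (all values invented):

```python
# Hypothetical Hopfield-style completion sketch; one stored pattern.

def train_hopfield(patterns):
    """Hebbian weight matrix from +1/-1 patterns (zero diagonal)."""
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j] / len(patterns)
    return w

def complete(w, cue, steps=5):
    """Pattern completion: each unit repeatedly moves toward the sign
    of its weighted input until the pattern settles."""
    state = list(cue)
    for _ in range(steps):
        for i in range(len(state)):
            h = sum(w[i][j] * state[j] for j in range(len(state)))
            state[i] = 1 if h >= 0 else -1
    return state

stored = [1, 1, 1, -1, -1, -1]
w = train_hopfield([stored])
noisy_cue = [1, 1, -1, -1, -1, -1]   # one unit flipped
recovered = complete(w, noisy_cue)
```

With several stored patterns, the same dynamics fall into whichever stored state the cue most resembles, which is the attractor picture of retrieval.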

Short‑Term vs Long‑Term Information Storage

Information storage occurs over multiple timescales, using different mechanisms.

Short‑Term (Working) Storage

Short‑term (working) storage holds information over seconds to minutes, often to support ongoing tasks.

Its key features are limited capacity, rapid fading without active maintenance, and easy disruption.

Mechanisms include persistent activity in recurrent circuits and short-lived changes in synaptic efficacy, such as facilitation and depression.

No major structural changes in neurons are required; changes are mostly functional and reversible.

Long‑Term Storage

Long‑term storage keeps information for days to years.

It has a large capacity and is relatively stable, but typically requires consolidation to become durable.

Long‑term storage involves lasting changes in synaptic strength (long-term potentiation and depression), altered gene expression and protein synthesis, and structural changes such as the growth or elimination of synapses.

These changes provide a physical substrate (an engram) for stored information.

Synaptic Plasticity: The Basis of Learning and Memory

Synaptic plasticity is the activity‑dependent change in the strength or number of synapses. It links information processing to durable storage.

Activity‑Dependent Changes

Synapses can become stronger (potentiated) or weaker (depressed), depending on specific patterns of activity.

A general principle is that synapses strengthen when presynaptic and postsynaptic activity are correlated, and weaken when they are not.

This implements a “correlation‑based” learning rule: cells that “participate together” in an event become more strongly linked.
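A minimal sketch of such a correlation-based rule; the learning rate and activity values are invented:

```python
# Hypothetical Hebbian-style update; all numbers are illustrative.

def hebbian_update(weight, pre, post, lr=0.1):
    """Strengthen the synapse when pre- and postsynaptic activity coincide."""
    return weight + lr * pre * post

w = 0.5
# Correlated activity over several trials strengthens the synapse...
for pre, post in [(1, 1), (1, 1), (1, 1)]:
    w = hebbian_update(w, pre, post)
strengthened = w
# ...while uncorrelated activity (either side silent) leaves it unchanged.
for pre, post in [(1, 0), (0, 1), (0, 0)]:
    w = hebbian_update(w, pre, post)
unchanged = w
```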

Hebbian and Anti‑Hebbian Rules

A classic summary of one major principle is Hebb's rule, often paraphrased as “cells that fire together wire together.”

Variations on this rule include anti-Hebbian plasticity, in which correlated activity weakens a synapse, and spike-timing-dependent plasticity, in which the sign of the change depends on whether the presynaptic spike precedes or follows the postsynaptic one.

Synaptic Scaling and Homeostasis

If plasticity were purely positive feedback (only strengthening frequently used synapses), networks would quickly saturate. To prevent this, neurons use homeostatic mechanisms such as synaptic scaling, which multiplicatively adjusts all of a neuron's synaptic strengths to keep its average firing rate near a set point.

These mechanisms preserve overall stability while allowing specific connections to encode information.
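Synaptic scaling can be caricatured as a single multiplicative correction; the rates and weights below are invented:

```python
# Hypothetical synaptic-scaling sketch; values are illustrative.

def scale_synapses(weights, firing_rate, target_rate):
    """Multiplicatively scale all weights so average activity drifts
    back toward the target, preserving relative synaptic strengths."""
    factor = target_rate / firing_rate
    return [w * factor for w in weights]

weights = [0.2, 0.4, 0.8]           # relative strengths 1 : 2 : 4
scaled = scale_synapses(weights, firing_rate=20.0, target_rate=10.0)
# Every weight is halved, but the ratio between synapses is unchanged,
# so the information encoded in relative strengths survives.
```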

Distributed Storage and Redundancy

In biological systems, information is rarely stored at a single location.

Engrams as Distributed Patterns

A memory trace (engram) is distributed across many synapses and many neurons, often spanning several brain regions.

As a consequence, no single cell is indispensable for a given memory, and partial damage tends to degrade stored information gradually rather than erase it outright.

Redundancy and Degeneracy

Two types of protective design contribute to this fault tolerance: redundancy, in which multiple similar elements store the same information, and degeneracy, in which structurally different elements can perform the same function.

These properties make biological information storage more fault‑tolerant than a simple “one cell per item” scheme.

Internal Models and Prediction

Nervous systems do more than react to stimuli; they often build internal models of the body and environment.

Prediction and Error Signals

An internal model can predict the sensory consequences of the body's own actions and of external events; the mismatch between prediction and actual input is an error signal.

Error signals then drive learning, adjusting the model so that future predictions become more accurate.
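This prediction-error loop can be caricatured with a delta rule: the prediction moves toward each observation by a fraction of the error, so repeated exposure shrinks the error. The learning rate and values are invented:

```python
# Hypothetical delta-rule sketch of error-driven model updating.

def update_prediction(prediction, observation, lr=0.5):
    """Move the internal prediction toward the observation by a
    fraction of the prediction error."""
    error = observation - prediction
    return prediction + lr * error

# Repeated exposure to the same input shrinks the prediction error.
p = 0.0
errors = []
for _ in range(4):
    errors.append(abs(10.0 - p))
    p = update_prediction(p, 10.0)
```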

Combining Bottom‑Up and Top‑Down Information

Information processing integrates bottom-up signals arriving from the sense organs with top-down signals carrying expectations, goals, and attention.

Top-down processes can amplify relevant inputs, suppress irrelevant ones, and fill in missing information based on prior experience.

The balance between bottom‑up and top‑down inputs is central to perception, decision‑making, and many cognitive functions discussed in later chapters.

Information Flow Across the Nervous System

Finally, information processing and storage depend on how different regions are connected.

Hierarchical and Parallel Pathways

Many systems combine hierarchical pathways, in which successive stages extract increasingly abstract features, with parallel pathways that process different aspects of the same input simultaneously.

This organization speeds up processing and makes the system more robust to damage in any single pathway.

Reentry and Loops Between Areas

Higher‑order regions send feedback to earlier ones, forming loops (reentry) that allow context, expectation, and attention to shape processing at earlier stages.

These loops are major sites where short‑term processing can be integrated with long‑term storage, shaping perception and behavior in context‑dependent ways.


In summary, information processing and storage in biological systems are inseparable: the same synapses and circuits that transmit and transform signals are gradually modified by that activity, so processing leaves traces, and those traces shape all future processing.

Subsequent chapters on nervous systems, memory and consciousness, sleep, and psychoactive substances will examine how these general principles are implemented and modulated in specific organisms and brain systems.
