Computational Models of Memory Search
Memory models
Serial learning
participants study a list of words and must later recall them in their presented order
Associative chaining theory: forward bias
learn associations between each item and its neighbours
strength of association decays monotonically as the distance between item presentations increases
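A minimal numpy sketch of the chaining idea; the decay rate, the forward-bias factor, and the toy word list are illustrative assumptions, not parameters from any specific published model.

```python
import numpy as np

items = ["house", "shoe", "tree", "river"]
n = len(items)

# Association strength decays monotonically with presentation distance;
# forward associations get a boost (the forward bias).
decay = 0.5          # illustrative decay per unit of distance
forward_bias = 1.5   # forward links stronger than backward links

A = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            strength = decay ** abs(i - j)
            if j > i:            # forward association
                strength *= forward_bias
            A[i, j] = strength

# Cueing with an item: the strongest association proposes the next recall.
cue = items.index("shoe")
print(items[int(np.argmax(A[cue]))])   # -> "tree", the forward neighbour
```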
Positional encoding theory: no bias
learn a representation of each item's position in the list (its index)
position cues item: house -> position 1 -> position 2 -> shoe (recalling an item retrieves its position; advancing to the next position cues the next item)
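By contrast, positional coding retrieves by position rather than by inter-item links. A sketch assuming random distributed codes for items and positions, bound together by outer products; the binding scheme is an illustrative choice, not the only way to implement positional coding.

```python
import numpy as np

rng = np.random.default_rng(0)
items = ["house", "shoe", "tree", "river"]
d = 64                                     # illustrative feature dimension
F = rng.standard_normal((len(items), d))   # random item vectors
P = rng.standard_normal((len(items), d))   # random positional codes

# Store position-to-item associations as summed outer products.
M = sum(np.outer(P[i], F[i]) for i in range(len(items)))

# Recall by position: cue with a position code and match the retrieved
# pattern against the known items. No item-item links are used.
retrieved = P[1] @ M                       # cue with position 2's code
print(items[int(np.argmax(F @ retrieved))])  # -> "shoe" (with high probability)
```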

Representational assumptions
Memory matrix
two-dimensional matrix, each column is a memory vector
static (memory vectors do not change once stored)
Although one can model such memories as a vector function of time, theorists usually eschew this added complexity, adopting a unitization assumption that underlies nearly all modern memory models.
Localist models
each item vector has a single, unique, nonzero element
each element corresponds to a unique item in memory
Distributed models
features representing an item are distributed across many or all of the elements
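A brief numpy illustration of the two representational schemes; the dimensions are arbitrary.

```python
import numpy as np

n_items, d = 4, 4   # localist case: one dimension per known item

# Localist: each item vector has a single unique nonzero element,
# so the matrix holding all item vectors is simply the identity.
localist = np.eye(n_items)     # column i represents item i
print(localist[:, 2])          # item 3 -> [0. 0. 1. 0.]

# Distributed: an item's features are spread across many elements.
rng = np.random.default_rng(1)
distributed = rng.standard_normal((d, n_items))  # each column a memory vector

# In both cases the memory matrix is two-dimensional and static:
# columns are memory vectors that do not change once stored.
```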
The unitization assumption dovetails nicely with the classic list recall method in which the presentation of known items constitutes the miniexperiences to be stored and retrieved. But one can also create sequences out of unitary items, and by recalling and reactivating these sequences of items, one can model memories that include temporal dynamics.
Multi-trace theory
This model assumes that each item vector (memory) occupies its own “address,” much like memory stored on a computer is indexed by an address in the computer’s random-access memory. Repeating an item does not strengthen its existing entry but rather lays down a new memory trace.
retrieval of an encoded item also creates a new memory trace
this model implies the number of traces can grow without bound, yet the brain's storage capacity is finite
if the search for an item were serial, retrieval would take implausibly long as traces accumulate; if parallel, it would place a heavy demand on the nervous system
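A sketch of the multi-trace bookkeeping (names and dimensions are illustrative): every encoding and every retrieval appends a fresh trace, so the store only grows.

```python
import numpy as np

rng = np.random.default_rng(2)
d = 32
traces = []                    # the ever-growing store of memory traces

def encode(item_vec):
    # Repetition does not strengthen an old entry; it adds a new trace.
    traces.append(item_vec.copy())

def retrieve(probe):
    # Compare the probe against every trace. Done in parallel this is
    # demanding; done serially it slows as traces accumulate.
    sims = np.array([t @ probe for t in traces])
    best = traces[int(np.argmax(sims))]
    encode(best)               # retrieval itself lays down another trace
    return best

item = rng.standard_normal(d)
encode(item)
encode(item)                   # a repetition: two traces, not one stronger one
retrieve(item)
print(len(traces))             # -> 3: the store grows without bound
```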
Composite memories
composite storage model
for recognition memory
storage equation: the composite memory m is the sum of the studied item vectors, m = f_1 + f_2 + ... + f_L
Rather than summing item vectors directly, which results in substantial loss of information, we can first expand an item's representation into a matrix form (e.g., the outer product f_i f_iᵀ) and then sum the resultant matrices: M = Σ f_i f_iᵀ.
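A numpy sketch contrasting the two storage schemes and showing how a studied (old) probe yields a larger match against the store than a novel one; the dimensions and the simple dot-product match rule are illustrative simplifications.

```python
import numpy as np

rng = np.random.default_rng(3)
d, L = 64, 5
F = rng.standard_normal((L, d))      # studied item vectors f_1 ... f_L

# Composite vector storage: m = f_1 + ... + f_L (much information is lost).
m = F.sum(axis=0)

# Matrix storage: expand each item into its outer product, then sum.
M = sum(np.outer(f, f) for f in F)

# A simple match rule: studied probes resonate more with the store.
old_probe, new_probe = F[0], rng.standard_normal(d)
for probe in (old_probe, new_probe):
    print(probe @ M @ probe)         # old probe -> much larger value
```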
Summed similarity
Pattern completion
Contextual coding
Associative models
Recognition and recall
Serial learning
Recall phenomena
Serial position effects
Contiguity and similarity effects
Recall errors
Inter-response times
Memory search models
Dual-store theory
Retrieved context theory
context and item serve as retrieval cues for each other
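A minimal sketch in the spirit of retrieved-context models (e.g., temporal context models such as CMR); the drift rate and the Hebbian outer-product learning rule are standard simplifying assumptions here, not a complete published model.

```python
import numpy as np

rng = np.random.default_rng(4)
d, L = 64, 5
F = rng.standard_normal((L, d))
F /= np.linalg.norm(F, axis=1, keepdims=True)    # unit item vectors

beta = 0.5                        # context drift rate (illustrative)
rho = np.sqrt(1 - beta ** 2)      # keeps context near unit length

c = rng.standard_normal(d); c /= np.linalg.norm(c)
M_cf = np.zeros((d, d))           # context -> item ("context cues item")
M_fc = np.zeros((d, d))           # item -> context ("item cues context")

for f in F:                       # study phase
    M_cf += np.outer(f, c)        # Hebbian outer-product learning
    M_fc += np.outer(c, f)
    c = rho * c + beta * f        # the item updates (drifts) the context

# Recall: context cues items; a recalled item then retrieves the context
# in which it was studied, which in turn cues its temporal neighbours.
activations = F @ (M_cf @ c)
first = int(np.argmax(activations))          # recent items are favoured
retrieved_ctx = M_fc @ F[first]
c = rho * c + beta * retrieved_ctx / np.linalg.norm(retrieved_ctx)
print("first recall: item", first)
```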