Hello,
Thanks for asking your question. You asked the following:
Define and evaluate the theories of long term and short term
memories.
1000 words.
1) Long term memory
Definition
The term long-term memory is somewhat of a catch-all phrase because it
can refer to facts learned a few minutes ago, personal memories many
decades old, or skills learned with practice. Generally, however,
long-term memory describes a system in the brain that can store vast
amounts of information on a relatively enduring basis.
Theories of Long Term Memory
General theories of memory, which are based on performance on memory
tasks, incorporate constraints on storage and retrieval that are
assumed to apply to any type of activity. When investigators began
studying more complex cognitive processes, such as problem solving,
decision making, and concept formation, the models they developed had
to be consistent with these theories. An adequate model of performance
in a task had to specify the relevant background knowledge and skills
subjects had as well as sequences of processes that did not violate
the constraints on the amount of information kept available in memory
(that is, working memory).
In standard theories of memory (Atkinson & Shiffrin, 1968), information
can be stored in LTM only after it has been stored in STM, and even
then, storage in LTM is a probabilistic event. Originally, Atkinson
and Shiffrin proposed that the probability of storage in LTM is a
function of the time an item was maintained in STM. More recently,
Anderson (1983) suggested that the probability of storage is a
function of the number of times an item enters STM. Subjects' control
of the storage of information appears to be limited, as shown, for
example, by low levels of free recall in list learning. Furthermore,
in more meaningful tasks subjects' recall of presented information is
not improved when they are instructed to study that information for
later recall (Craik & Lockhart, 1972). This finding implies that
subjects cannot achieve reliable storage of information in many of the
standard memory tasks. Anderson (1983) even goes so far as to argue
that subjects' inability to control storage in LTM is beneficial since
they cannot predict what information will be useful later on. The
memory performance exhibited by subjects in standard memory tasks is
clearly consistent with the view that storage of information in LTM
and efficient access of that information is too unreliable to be an
effective source of working memory. It can be argued, however, that
the performance of untrained subjects who memorize lists
of unrelated items in the laboratory does not accurately describe the
efficient storage and retrieval that experts in specific domains can
achieve after many years of practice.
Newell and Simon (1972) proposed a production-system architecture for
cognitive processes that has influenced most subsequent efforts to
build models and theories. In this architecture the conditions of a
large number of productions (condition-action pairs) are matched
against the currently active elements (working memory). In more recent
models, such as Anderson's (1983) ACT*, working memory is the
transiently activated portion of LTM. The limits on the number of
elements in working memory are not determined by a fixed number but
rather by the amount of available activation. In his work on building
ACT* models of cognitive processing Anderson found that working memory
can sometimes contain over 20 units at one time. To reconcile such a
large capacity of working memory with the much smaller capacity of
STM, Anderson (1983) argued as follows: The activation of elements
decays very rapidly. For this reason the number of units that can be
actively maintained long enough to be included in immediate recall is
much less than all of the information activated at the start of
recall. Most investigators argue, however, that the capacity of
working memory must be far greater than the capacity of traditional
STM (Newell, 1990).
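The recognize-act cycle at the heart of a production system is easy to
sketch in code. The toy Python version below is my own illustration,
not Newell and Simon's implementation: a production fires when all of
its conditions are present among the active working-memory elements,
and firing adds the action's elements to working memory.

    def run_production_system(working_memory, productions, max_cycles=10):
        for _ in range(max_cycles):
            for conditions, action in productions:
                # Fire only if all conditions match and the action would
                # add something new (a crude refractoriness rule).
                if conditions <= working_memory and not action <= working_memory:
                    working_memory |= action
                    break
            else:
                return working_memory  # no production fired; halt
        return working_memory

    # Hypothetical productions for one step of column addition.
    productions = [
        ({"goal: add column", "digits: 4+8"}, {"sum: 12"}),
        ({"sum: 12"}, {"write: 2", "carry: 1"}),
    ]
    wm = {"goal: add column", "digits: 4+8"}
    print(run_production_system(wm, productions))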
Working memory in production-system architectures was originally
viewed as a single construct with general resources. In a very
influential line of empirical research initiated by Baddeley and Hitch
(1974), investigators examined this assertion by studying the effect
on cognitive performance from an additional concurrent task
specifically designed to interfere with the capacity of working
memory. The result of over a decade's active research on that and
related paradigms (reviewed by Baddeley, 1986) conflicted with the
single-construct view. Although reliable decrements in speed and/or
accuracy of cognitive processes were often obtained for the main task
when an additional memory task was introduced, the primary task
performance decreased for the most part only slightly even when
subjects had to maintain 3 to 6 digits (near their digit span) in
working memory while executing the primary task. To account for these
findings Baddeley (1986) proposed that in addition to a central
executive there are two slave systems, the articulatory loop and the
visuo-spatial scratch pad, in which the central executive can store
information temporarily. Investigators have obtained converging
evidence for these subsystems by examining the relation between
individual differences on the main task and on tasks measuring various
types of memory performance. Of particular interest are findings from
neuro-psychological patients who exhibit exceptionally poor
performance on tasks that measure the capacity of one of the
subsystems, for example, memory span for words, which is assumed to
measure the capacity of the articulatory loop. Consistent with the
independence of the subsystems in Baddeley's model, patients with
dramatically impaired subsystems are still able to perform text
comprehension tasks at normal levels. At the same time this finding
means that working memory in such skilled activities as text
comprehension must be accounted for by the central executive and thus
remains essentially unexplained.
In sum, recent research has shown that working memory does not consist
of a single general capacity, but rather consists of several
subsystems that can be relied on to complete various types of tasks
(Baddeley, 1986).
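Schematically, Baddeley's tripartite architecture can be written down
as a data structure. The Python sketch below is only an organizational
diagram in code; the capacity of four items per slave system is an
assumption for illustration, not a figure from Baddeley (1986):

    from dataclasses import dataclass, field

    @dataclass
    class SlaveSystem:
        # A passive, modality-specific store with limited capacity.
        name: str
        capacity: int
        contents: list = field(default_factory=list)

        def store(self, item):
            if len(self.contents) >= self.capacity:
                self.contents.pop(0)  # oldest trace is displaced
            self.contents.append(item)

    @dataclass
    class CentralExecutive:
        # Routes material to the appropriate slave system.
        loop: SlaveSystem       # articulatory loop (verbal)
        sketchpad: SlaveSystem  # visuo-spatial scratch pad (visual)

        def store(self, item, modality):
            target = self.loop if modality == "verbal" else self.sketchpad
            target.store(item)

    executive = CentralExecutive(
        loop=SlaveSystem("articulatory loop", capacity=4),
        sketchpad=SlaveSystem("visuo-spatial scratch pad", capacity=4),
    )
    executive.store("seven", modality="verbal")
    executive.store("red triangle", modality="visual")
    print(executive.loop.contents, executive.sketchpad.contents)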
From Microsoft Encarta:
There seems to be no finite capacity to long-term memory. People can
learn and retain new facts and skills throughout their lives. Although
older adults may show a decline in certain capacities (for example,
recalling recent events), they can still profit from experience even in
old age. For example, vocabulary increases over the entire life span.
The brain remains plastic and capable of new learning throughout one's
lifetime, at least under normal conditions. Certain neurological
diseases, such as Alzheimer's disease, can greatly diminish the
capacity for new learning.
Psychologists once thought of long-term memory as a single system.
Today, most researchers distinguish three long-term memory systems:
episodic memory, semantic memory, and procedural memory.
Episodic Memory
Episodic memory refers to memories of specific episodes in one's life
and is what most people think of as memory. Episodic memories are
connected with a specific time and place. If you were asked to recount
everything you did yesterday, you would rely on episodic memory to
recall the events. Similarly, you would draw on episodic memory to
describe a family vacation, the way you felt when you won an award, or
the circumstances of a childhood accident. Episodic memory contains
the personal, autobiographical details of our lives.
Semantic Memory
Semantic memory refers to our general knowledge of the world and all
of the facts we know. Semantic memory allows a person to know that the
chemical symbol for salt is NaCl, that dogs have four legs, that
Thomas Jefferson was president of the United States, that 3 × 3 equals
9, and thousands of other facts. Semantic memories are not tied to the
particular time and place of learning. For example, in order to
remember that Thomas Jefferson was president, people do not have to
recall the time and place that they first learned this fact. The
knowledge transcends the original context in which it was learned. In
this respect, semantic memory differs from episodic memory, which is
closely related to time and place. Semantic memory also seems to have
a different neural basis than episodic memory. Brain-damaged patients
who have great difficulties remembering their own recent personal
experiences often can access their permanent knowledge quite readily.
Thus, episodic memory and semantic memory seem to represent
independent capacities.
Procedural Memory
Procedural memory refers to the skills that humans possess. Tying
shoelaces, riding a bicycle, swimming, and hitting a baseball are
examples of procedural memory. Procedural memory is often contrasted
with episodic and semantic memory. Episodic and semantic memory are
both classified as types of declarative memory because people can
consciously recall facts, events, and experiences and then verbally
declare or describe their recollections. In contrast, nondeclarative,
or procedural, memory is expressed through performance and typically
does not require a conscious effort to recall.
http://encarta.msn.com/encnet/refpages/RefArticle.aspx?refid=761578303&pn=1#s2
Here is another summary on the selected theories of long-term memory
from the psychology syllabus of CUNY:
THE STANDARD VIEW
Atkinson and Shiffrin's model
STM - temporary buffer: information is in a "temporary state".
LTM - permanent store: information in this system is in a "permanent
state".
Control processes - operations that move information from STM to LTM;
they include rehearsal, coding, and imaging.
Tests of the model demonstrated that participants learned more when
they focused on information that was in either an unlearned state
(e.g., not in STM or LTM) or a temporary state (e.g., in STM)
(Atkinson, 1972a, 1972b).
THE LEVELS OF PROCESSING APPROACH (LOP)
According to the Atkinson-Shiffrin model, information is remembered if
it makes it into LTM. The levels of processing view is an attempt to
specify what kinds of processes result in "good" memory and what kinds
result in "poor" memory. This emphasis is a departure from the
standard view because the type of learning, rather than the location
of the information, is stressed.
http://www.lehman.cuny.edu/depts/psychology/sailor/cognition/ltm.html#standard
2) Short term memory
Definition
From the Academic Press:
The part of the human memory system that stores information shortly
after the material is presented; characterized by rapid decay and a
limited capacity.
http://www.academicpress.com/inscight/12241997/short-t1.htm
Theories of Short Term Memory
Information Processing Theory
George A. Miller has provided two theoretical ideas that are
fundamental to the information processing framework and cognitive
psychology more generally. The first concept is "chunking" and the
capacity of short term (working) memory. Miller (1956) presented the
idea that short-term memory could only hold 5-9 chunks of information
(seven plus or minus two) where a chunk is any meaningful unit. A
chunk could refer to digits, words, chess positions, or people's
faces. The concept of chunking and the limited capacity of short term
memory became a basic element of all subsequent theories of memory.
The second concept, that of information processing, uses the computer
as a model for human learning. Like the computer, the human mind takes
in information, performs operations on it to change its form and
content, stores and locates it, and generates responses to it. Thus,
processing involves gathering and representing information, or
encoding; holding information, or retention; and getting at the
information when needed, or retrieval. Information processing
theorists approach learning primarily through a study of memory.
(Miller, 1956)
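Chunking is straightforward to demonstrate in code. In the Python
sketch below, the recoding patterns (a hypothetical toll-free phone
number) are my own example; the point is only that capacity is counted
in meaningful units rather than raw symbols:

    def count_chunks(items, known_patterns):
        # Greedily recode a symbol sequence into known chunks; any
        # symbol not covered by a pattern counts as a chunk by itself.
        chunks, i = [], 0
        while i < len(items):
            for pattern in known_patterns:
                if items[i:i + len(pattern)] == pattern:
                    chunks.append(pattern)
                    i += len(pattern)
                    break
            else:
                chunks.append(items[i])
                i += 1
        return chunks

    digits = "18005551234"
    # Unchunked: 11 items, well beyond Miller's 7 plus or minus 2.
    print(len(count_chunks(digits, [])))                       # 11
    # Recoded into familiar groups: only 3 chunks to hold.
    print(len(count_chunks(digits, ["1800", "555", "1234"])))  # 3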
From Microsoft Encarta:
Psychologists originally used the term short-term memory to refer to
the ability to hold information in mind over a brief period of time.
As conceptions of short-term memory expanded to include more than just
the brief storage of information, psychologists created new
terminology. The term working memory is now commonly used to refer to
a broader system that both stores information briefly and allows
manipulation and use of the stored information.
Psychologists often study working memory storage by examining how well
people remember a list of items. In a typical experiment, people are
presented with a series of words, one every few seconds. Then they are
instructed to recall as many of the words as they can, in any order.
Most people remember the words at the beginning and end of the series
better than those in the middle. This phenomenon is called the serial
position effect because the chance of recalling an item is related to
its position in the series. The results from one such experiment are
shown in the accompanying chart entitled Serial Position Effect. In
this experiment, recall was tested either immediately after
presentation of the list items or after 30 seconds. Subjects in both
conditions demonstrated what is known as the primacy effect, which is
better recall of the first few list items. Psychologists believe this
effect occurs because people tend to process the first few items more
than later items. Subjects in the immediate-recall condition also
showed the recency effect, or better recall of the last items on the
list. The recency effect occurs because people can store recently
presented information temporarily in working memory. When the recall
test is delayed for 30 seconds, however, the information in working
memory fades, and the recency effect disappears.
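A toy simulation reproduces the qualitative shape of the curve
described above. All the constants in this Python sketch are
assumptions chosen only so that early items get a rehearsal boost
(primacy) and late items get a buffer boost that fades with delay
(recency); nothing here is fitted to real data:

    import math

    def recall_probability(position, list_length, delay_seconds):
        base = 0.3
        # Primacy: early items receive extra rehearsal.
        primacy = 0.4 * math.exp(-0.5 * position)
        # Recency: late items sit in a temporary buffer that fades.
        steps_from_end = list_length - 1 - position
        recency = (0.5 * math.exp(-0.8 * steps_from_end)
                   * math.exp(-0.1 * delay_seconds))
        return min(1.0, base + primacy + recency)

    for delay in (0, 30):  # immediate vs. 30-second delayed recall
        curve = [round(recall_probability(p, 15, delay), 2)
                 for p in range(15)]
        print(f"delay={delay:>2}s:", curve)

With a 30-second delay the recency term is multiplied by roughly 0.05,
so the late-item advantage disappears while the primacy advantage
survives, just as in the experiment described above.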
Working memory has a basic limitation: It can hold only a limited
amount of information at one time. Early research on short-term
storage of information focused on memory span: how many items people
can correctly recall in order. Researchers would show people
increasingly long sequences of digits or letters and then ask them to
recall as many of the items as they could. In 1956 American
psychologist George Miller reviewed many experiments on memory span
and concluded that people could hold an average of seven items in
short-term memory. He referred to this limit as the "magical number
seven, plus or minus two" because the results of the studies were so
consistent. More recent studies have attempted to separate true
storage capacity from processing capacity by using tests more complex
than memory span. These studies have estimated a somewhat lower
short-term storage capacity than did the earlier experiments. People
can overcome such storage limitations by grouping information into
chunks, or meaningful units.
Working memory is critical for mental work, or thinking. Suppose you
are trying to solve the arithmetic problem 64 × 9 in your head. You
probably would need to perform some intermediate calculations in your
head before arriving at the final answer. The ability to carry out
these kinds of calculations depends on working memory capacity, which
varies individually. Studies have also shown that working memory
changes with age. As children grow older, their working memory
capacity increases. Working memory declines in old age and in some
types of brain diseases, such as Alzheimer's disease.
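To see what "intermediate calculations" means for the 64 × 9 example
above, here is one common decomposition strategy written out in
Python (the strategy itself is just one of several you might use):

    # Split 64 into 60 + 4, hold the two partial products in
    # working memory, then add them.
    partial_1 = 60 * 9            # 540
    partial_2 = 4 * 9             # 36
    print(partial_1 + partial_2)  # 576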
Working memory capacity is correlated with intelligence (as measured
by intelligence tests). This correlation has led some psychologists to
argue that working memory abilities are essentially those that
underlie general intelligence. The more capacity people have to hold
information in mind while they think, the more intelligent they are.
In addition, research suggests that there are different types of
working memory. For example, the ability to hold visual images in mind
seems independent of the ability to retain verbal information.
http://encarta.msn.com/encnet/refpages/RefArticle.aspx?refid=761578303&pn=1#s2
Working memory, the more contemporary term for short-term memory, is
conceptualised as an active system for temporarily storing and
manipulating information needed in the execution of complex cognitive
tasks (Baddeley 1986) (e.g., learning, reasoning, and comprehension).
Experimental evidence has shown that working memory is of limited
size (Miller, 1956); hence, the high conceptual demands, the
complexity of laboratory experiments, and the potential for
information overload associated with problem solving in both chemistry
and physics carry clear instructional implications.
http://dbweb.liv.ac.uk/ltsnpsc/AB/AB-html/node10.html
Articulatory Loop Theory of Working Memory from the University of
Alberta:
The articulatory loop (AL) is one of two passive slave systems within
Baddeley's (1986) tripartite model of working memory. The AL,
responsible for storing speech based information, is comprised of two
components. The first component is a phonological memory store which
can hold traces of acoustic or speech based material. Material in this
short term store lasts about two seconds unless it is maintained
through the use of the second subcomponent, articulatory subvocal
rehearsal. Prevention of articulatory rehearsal results in very rapid
forgetting. Try this experiment with a friend. Present your friend
with three consonants (e.g., C-X-Q) and ask them to recall the
consonants after a 10 second delay. During the 10 second interval,
prevent your friend from rehearsing the consonants by having them
count 'backwards by threes' starting at 100. You will find that your
friend's recall is significantly impaired! See Murdock (1961) and
Baddeley (1986) for a complete review.
http://www.psych.ualberta.ca/~mike/Pearl_Street/Dictionary/contents/A/articulatory_loop.html
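The roughly two-second phonological trace and the effect of blocked
rehearsal can be mimicked with a simple decay process. The exponential
form, half-life, rehearsal interval, and recall threshold below are
all assumptions chosen to give about two seconds of useful trace life;
they are not parameters from Baddeley (1986):

    import math

    def trace_strength(seconds, rehearsing, half_life=0.7):
        # Subvocal rehearsal is assumed to refresh the trace about
        # every 1.5 seconds; otherwise it decays exponentially.
        if rehearsing:
            seconds = seconds % 1.5
        return math.exp(-math.log(2) * seconds / half_life)

    THRESHOLD = 0.1  # assumed minimum strength for successful recall
    for rehearsing in (True, False):
        strength = trace_strength(10, rehearsing)
        outcome = "recalled" if strength >= THRESHOLD else "forgotten"
        print(f"rehearsal={rehearsing}: {strength:.3f} -> {outcome}")

Counting backwards by threes occupies the articulatory machinery, so
the "rehearsing" branch never runs and the trace is gone well before
the 10-second test.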
Mechanisms of Working Memory from the Georgia Institute of Technology:
Given that working memory plays a key role in our ability to predict
human performance, what do we know about how the human working memory
system actually works? It turns out that several important mechanisms
have been identified, such as:
- Decay. Items in working memory decay over time. That is, the longer
it has been since an item was needed in working memory, the less
likely it is that it is currently available.
- Displacement/interference. As new items enter working memory, there
are at least two repercussions in the rest of the system: other items
tend to become harder to access, and the cognitive system becomes less
efficient, effectively slowing down.
- Basic processing speed. Partly as a result of attempting to
determine what causes the age-related decline in working memory
capacity, researchers have discovered a strong link between simple
processing speed and working memory capacity.
http://www.acm.org/sigchi/chi96/proceedings/doctoral/Byrne/mdb_txt.htm
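The decay and displacement/interference mechanisms in the list above
can be combined into one small activation model. The decay and
interference constants in this Python sketch are illustrative
assumptions; the point is that each newly entering item both ages the
existing traces and imposes an interference cost, so only the most
recent handful survive:

    def simulate_working_memory(items, decay=0.15, interference=0.1):
        activations = {}
        for item in items:
            for old in activations:
                # Each new arrival decays old traces and interferes.
                activations[old] = max(
                    0.0, activations[old] * (1 - decay) - interference)
            activations[item] = 1.0
        # Traces driven to zero are effectively displaced.
        return {k: round(v, 2) for k, v in activations.items() if v > 0}

    print(simulate_working_memory(list("ABCDEFGH")))

With these particular constants the first two items are driven to zero
by the time the eighth arrives, leaving roughly a half-dozen usable
traces, in the same general range as classical span estimates.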
Here are some excerpts from a book entitled Models of Working Memory.
It gives abstracts of current theories of working memory:
Working Memory - Multiple Component Model:
The authors' own definition of working memory is that it comprises
those functional components of cognition that allow humans to
comprehend and mentally represent their immediate environment, to
retain information about their immediate past experience, to support
the acquisition of new knowledge, to solve problems, and to formulate,
relate, and act on current goals. Their theoretical approach has
developed in the framework of working memory comprising multiple
specialized subcomponents of cognition. Although the Baddeley-Logie
model maintains the original tripartite structure proposed by A. D.
Baddeley and G. J. Hitch (1974), it has undergone a number of
important changes, particularly in regard to specifying functions of
the central executive.
The Embedded Process Model:
This chapter presents the author's Embedded-Processes Model, a
broad-scope information processing framework originally developed to
synthesize a vast array of findings on attention and memory. The
mnemonic functions preserving information that can be used to do the
necessary work collectively make up working memory. This is a
functional definition in that any processing mechanism contributing to
the desired outcome is said to participate in the working memory
system. In contrast, some researchers appear to prefer to define
working memory according to the mechanisms themselves. Though my
framework has much in common with those of other researchers, a
functional definition of working memory seems more likely to encourage
a consideration of diverse relevant mechanisms. Some theories of
working memory equate it to the focus of attention and awareness and
some equate it to the sum of activated information. Often, the
distinction between activation and awareness is left unclear, but I
argue that the distinction is important and that working memory must
involve both, and some long-term memory information as well.
The Soar Cognitive Architecture:
We examine the various phenomena of human working memory (WM) from a
cognitive architecture of broad scope, the Soar architecture, which
focuses on the functional capabilities needed for a memory system to
support performance in a range of cognitive tasks. We argue for and
demonstrate three points concerning the limitations of WM:
1. We show how the cognitive system, even with a limited-capacity
short-term store, can handle complex tasks that require large
quantities of information, by relying heavily on recognition-based
long-term memory working in concert with the external environment.
2. We argue that limitations on WM arise even in purely functional
cognitive systems built without preset capacity constraints, and hence
that empirically demonstrated limitations of effective WM do not
necessarily imply a capacity-constrained underlying memory system.
3. We show how a specific mechanism of similarity-based interference
can act as a resource constraint on the cognitive system and offer a
coherent account of a wide range of psycholinguistic phenomena.
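The first Soar point, a small short-term store working together with
recognition-based long-term memory and the external environment, can
be caricatured in a few lines. This Python fragment is an illustration
of the idea only, not the Soar architecture, and the lookup table is
hypothetical:

    # Recognition-based LTM: a cue retrieves knowledge directly,
    # so the knowledge never has to sit in the short-term store.
    LONG_TERM_MEMORY = {
        "area code 212": "New York City",
        "capital of France": "Paris",
    }

    def work_on(task_cues, stm_capacity=3):
        stm = task_cues[:stm_capacity]  # only a few cues held at once
        # Unrecognized cues are resolved from the external environment
        # (e.g., by looking at the page again), not from memory.
        return [LONG_TERM_MEMORY.get(cue, "consult the environment")
                for cue in stm]

    print(work_on(["area code 212", "capital of France", "area code 998"]))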
Working memory in a multilevel hybrid connectionist control
architecture (CAP2):
In a connectionist control network, working memory is implemented via
short-term activation and connection changes that support cognitive
operations. The Controlled Automatic Processing version 2 (CAP2)
approach is a model of skilled processing and learning. When applied
to working memory the model instantiates multiple forms and mechanisms
of working memory.... The chapter provides an interpretation of
working memory based on biological, information-processing, and
behavioral constraints. The chapter interrelates previously published
themes, which include the distinction between controlled and automatic
processing, the role of multileveled connectionist control structure
in working memory, the use of that control structure to enable
learning by instruction, attentional control, and the role of
consciousness in the effective control of processing. Topics include
(a) a brief review of physiological themes and mechanisms underlying
working memory; (b) a detailed description of the CAP2; (c) some brief
brain-imaging results about the changes that occur as skills are
acquired; and (d) the conclusion with a discussion of the specific
working memory questions, as well as a commentary about the other
models presented in this volume.
A biologically based computational model of working memory:
This chapter presents a biologically based model of working memory.
The authors' connectionist framework represents an attempt to start
developing an explicit computational model of working memory and
executive control that is biologically plausible and is firmly rooted
in the principles of cognitive processing in the brain. This chapter
brings studies of working memory into closer alignment with our
rapidly expanding knowledge of its underlying biological and neural
basis. We define working memory as controlled processing involving
active maintenance and/or rapid learning, where controlled processing
is an emergent property of the dynamic interactions of multiple brain
systems, but the prefrontal cortex (PFC) and hippocampus (HCMP) are
especially influential owing to their specialized processing abilities
and their privileged locations within the processing hierarchy (both
the PFC and HCMP are well connected with a wide range of brain areas,
allowing them to influence behavior at a global level).
http://cogweb.ucla.edu/index.html
Various theories on short term memory can be found here:
http://www.stehouwer.com/LearningOHch8.pdf
Here is a nice summary on working memory:
http://www.geocities.com/Athens/Acropolis/3041/workingmem.html
Richard Young of the University of Hertfordshire discusses the Soar
Architecture of Working Memory in this thorough document:
http://phoenix.herts.ac.uk/pub/R.M.Young/publications/99.Young-Lewis.pdf
Randall O'Reilly of the University of Colorado discusses the
Biologically Based Computational Model of Working Memory in this
thorough document:
http://psych.colorado.edu/~oreilly/papers/OReillyBraverCohen99.pdf
Please request an answer clarification before rating this answer. I will
be happy to explain or expand on any issue you may have.
Thanks,
Kevin, M.D.
Internet search strategy:
Using FAST, Google, Inktomi and Teoma via Hotbot.com
1) long term memory
2) short term memory
3) working memory
4) all of the above AND theories
5) all of the above AND models
Bibliography:
Anderson, J. R. (1983). The architecture of cognition. Cambridge,
Mass.: Harvard University Press.
Atkinson, R. C., & Shiffrin, R. M. (1968). Human memory: A proposed
system and its control processes. In K. Spence & J. Spence (Eds.),
The psychology of learning and motivation (Vol. 2, pp. 89-195). New
York: Academic Press.
Baddeley, A. D. (1986). Working memory. New York: Oxford University
Press.
Baddeley, A. D., & Hitch, G. J. (1974). Working memory. In G. H. Bower
(Ed.), The psychology of learning and motivation (Vol. 8, pp. 47-90).
New York: Academic Press.
Craik, F. I. M., & Lockhart, R. S. (1972). Levels of processing: A
framework for memory research. Journal of Verbal Learning and Verbal
Behavior, 11, 671-684.
"Memory (psychology)." Microsoft® Encarta® Online Encyclopedia 2002.
http://encarta.msn.com © 1997-2002 Microsoft Corporation. All Rights
Reserved.
Miller, G. A. (1956). The magical number seven, plus or minus two:
Some limits on our capacity for processing information. Psychological
Review, 63, 81-97.
Murdock, B. B., Jr. (1961). The retention of individual items. Journal
of Experimental Psychology, 62, 618-625.
Newell, A. (1990). Unified theories of cognition. Cambridge, Mass.:
Harvard University Press.
Newell, A., & Simon, H. A. (1972). Human problem solving. Englewood
Cliffs, N.J.: Prentice-Hall.
Links:
Theories of Long Term Memory
http://mailbox.univie.ac.at/~trimmem2/kogpsych_ws2001-2002/lengauer.pdf
Memory Theories and Processes
http://brain.web-us.com/memory/theories_and_processes.htm
The Difference Between Short Term and Long Term Memory
http://www2.ntu.ac.uk/soc/bscpsych/memory/goodhead.htm
Long Term Memory
http://www.lehman.cuny.edu/depts/psychology/sailor/cognition/ltm.html
Neural Pathways to Long Term Memory
http://ahsmail.uwaterloo.ca/kin356/ltm/ltm.htm
Learning Theories
http://www.emtech.net/learning_theories.htm
Short Term Memory
http://www.sandiego.edu/~taylor/stm.html
Ohio State University: Working Memory
http://www.psy.ohio-state.edu/psy312/wm.html
Theories of Memory Slide Show
http://psychol-carp.oswego.edu/klatsky/psy405/memoryintro/
Psychpapers.com Memory
http://search.psychpapers.com:9000/cgi-bin/query?mss=psychpapers&q=memory |