Tag Archives: working memory

Review of “The Sciences of the Artificial,” by Herbert Simon

It is perhaps the curse of the successful explorer that, after new lands have been found and the surveys made, his own writings become collections of obvious cliches and bizarre assertions. Surely Christopher Columbus felt this. In 1492 you are a visionary and a hero. But by 1505 everyone knows there is a large land mass west of Europe, and no one believes it is China.


Herbert Simon’s The Sciences of the Artificial is, objectively, one of the worst books I have ever read.
There is some obvious discussion of local maxima and satisficing (picking the best solution you can find rather than holding out for the perfect one), some more-or-less obvious if somewhat simplistic speculation about how a business functions (it might be modeled as a problem-solving entity), and an incredibly tedious discussion of “standard social sciences model” (SSSM) psychology, in which some early psychology studies are tortuously interpreted to imply that the human mind runs on a relatively small number of simple algorithms, albeit in a complex environment.

(If you read the Wikipedia page on the SSSM, you will find criticisms that the SSSM is a ridiculous straw man, and that it was “comical” to assert that anyone ever believed it. But Herbert Simon very definitely pushes such a view in his book.)


Indeed, Simon’s discussion of psychology is so dangerously wrong-headed that I will spend a paragraph here refuting it. Simon describes the human memory system as two systems: long-term memory (which he is generally accurate about) and “short-term memory” (which appears to be a confused mix of working memory, associated with general intelligence, and sensory memory, which provides the awareness of taste and so on). In mainstream psychology, long-term memory and working memory are associated with, respectively, the automatic, highly parallel, intuitive, and effortless “System 1” cognition system and the manual, serial, logical, and painfully slow “System 2” cognition system. In academia these two systems are often studied under “dual process theory,” and in the military they are described as part of the OODA loop.


While Simon does not use the “System” nomenclature, his description of human cognition is strangely incomplete, missing the most important studies of the last few decades and providing an oddly limited view of human thinking. The only mental process he implies occurs in System 1 is the passive maintenance of memories. And while he cites such famous work as “The Magical Number Seven, Plus or Minus Two” (a measure of the range of working memory) and B.F. Skinner, influential System 2 studies simply do not appear.

Now… the reason for this is that Simon wrote The Sciences of the Artificial in 1969, and the last edition was published in 1996, perhaps the last year his view of psychology could be taken seriously. Simon is a Nobel Prize winner and, even more prestigious, a Turing Award winner. The reason for his digressions into “satisficing” and “organizational behavior” is that he coined the term satisficing and is a founding father of organizational behavior. The Sciences of the Artificial is like a letter from Columbus in 1505, describing his views on geography: cliched, tired, ridiculous, and an artifact of a pathfinder.

I read The Sciences of the Artificial in the Nook edition.

The Unfairness of Working Memory

Several interrelated posts this morning, including “Intelligence and the President of the United States,” “Capturing my Thoughts: How could Demographic Warfare be used with 5GW?,” “Fixing Milwaukee Notes: Milwaukee School District Governance,” and “U.S. college panel calls for less focus on SATs.”

The topics all revolve around Working Memory, the capacity of the adult mind to keep 7 (ish) things in mind at the same time. Some people have more, some have less. Working memory is heritable and impacts life outcomes. Working memory is not “fair.” It is predicted by your class origin, your socio-economic status, your race, and so on, while its variance is predicted by your sex. (Being male is risky business.)

Many social problems will be alleviated when we can use retroviruses or stem-cell therapy to increase the working memory of the underclass. At the same time, any individual with low working memory can more than compensate by building up his long-term memory (his knowledge and experience), his self-efficacy (how he responds to failure), and his behavior.

Working Memory and Orientation

Three articles this week on working memory.

Three articles today: “An Embedded-Processes Model of Working Memory” by Nelson Cowan, “Working Memory: The Multiple-Component Model” by Alan D. Baddeley and Robert H. Logie, and “Modeling Working Memory in a Unified Architecture: The ACT-R Perspective” by Marsha C. Lovett, Lynne M. Reder, and Christian Lebiere.

The ACT-R paper (Lovett et al.) is not very relevant to what I am doing. It continues the attempt to apply literal information-processing theory to human thinking, in the tradition of George Miller and nowadays of John Robert Anderson. ACT-R, like the other theories, is perhaps better for building a computer that works in ways analogous to the brain than for understanding the brain itself.

The Baddeley piece was assigned to set the stage for the episodic buffer, which he covered in the Nature Reviews Neuroscience article I read a bit ago. So: an OK article, but recognized by everyone (including Baddeley) as out of date.

What was really exciting was Nelson Cowan’s “embedded” working memory model, which is actually a dual-processing model. It appears to date from the same time as Boyd’s final presentation, and it even includes orientation! An excerpt:

The focus of attention is controlled conjointly by voluntary processes (a central executive system) and involuntary processes (the attentional orienting system).

All of this is exciting to read this morning, especially as I present the OODA loop as a “Dual-Processing Theory of Learning” to some colleagues this afternoon. Talk about neat!

Lack of working memory to be the curse of 2008?

My application of the OODA loop to educational psychology has been centering on “working memory” (in other words, “general intelligence” or “attention”). More working memory lets you consciously think about more things at the same time, letting you make better decisions than you could otherwise.

Some tasks require more attention / intelligence / working memory than you have. In those situations you should rely, where possible, on your orientation (which you can sometimes sense as a gut- or fingertip-feeling). But often you are called on to make decisions where your gut feeling just isn’t good enough — and you can’t pay attention to everything you need to! This is called “cognitive load” or “information overload,” and it has been the main application of working memory research in educational psychology.

Thus, I may end up with a trendy paper at the end of all of this because, as Wired (and Slashdot) note, information overload has been predicted to be the problem of the year for 2008:

“It’s too much information. It’s too many interruptions. It’s too much lost time,” Basex chief analyst Jonathan Spira declared. “It’s always too much of a good thing.”

Information overload isn’t exactly new, but Spira said the problem has grown as technology increases societal expectations for instantaneous response. And more information available, he said, also means more time wasted looking for the right information, whether in an old e-mail or through a search engine.

Hilariously, Wired’s page on information overload is so bad at preserving working memory that I feel dumber just looking at it:


Attention-thieving page about attention-thieving problems

How to measure working memory

Working memory is nearly the same thing as general intelligence. It is highly heritable, it determines how much attention you pay to tasks, and without it logical reasoning is impossible.

For the spring experiment I will be stressing participants’ working memory (like I did in The wary student), but this time I will be measuring the amount of working memory the high-workload condition consumes, too.

Therefore, I’m incredibly grateful for Randall Engle’s lab at the University of Georgia, as well as for open-access articles such as “How does running memory span work?” by Bunting, Cowan, and Saults (a toy sketch of a running span trial appears at the end of this post).


Measures of working memory capacity that work

Working memory capacity and attentional control: An electrophysiological analysis

(Working memory can also be improved through pharmaceuticals, by the way.)
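
To make the running memory span idea concrete, here is a toy console sketch of a single trial. It is only an illustration under my own assumptions (digit stimuli, self-paced presentation, recall of the last six items, strict serial scoring); the actual tasks used in Engle’s lab and in Bunting, Cowan, and Saults control timing, stimuli, and scoring far more carefully.

```python
"""Toy sketch of one running memory span trial (illustration only)."""
import random

def run_trial(list_length: int, recall_target: int = 6) -> float:
    """Present list_length digits one at a time, then score recall of the
    final recall_target digits in their correct serial positions."""
    digits = [random.randint(0, 9) for _ in range(list_length)]
    for d in digits:
        input(f"  {d}   (press Enter for the next digit)")
    response = input(f"Type the LAST {recall_target} digits, in order: ")
    answer = digits[-recall_target:]
    guess = [int(c) for c in response if c.isdigit()][:recall_target]
    hits = sum(1 for a, g in zip(answer, guess) if a == g)
    return hits / recall_target

if __name__ == "__main__":
    # The list length is unpredictable, which is the point of a *running*
    # span: rehearsal strategies are hard to apply when you never know
    # which digits will turn out to matter.
    scores = [run_trial(random.randint(12, 20)) for _ in range(3)]
    print(f"Mean proportion correct: {sum(scores) / len(scores):.2f}")
```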

Can intelligence be taught?

Klingberg, T., Forssberg, H., & Westerberg, H. (2002). Training of working memory in children with ADHD. Journal of Clinical and Experimental Neuropsychology, 24(6), 781-791.

It’s a good time for social scientists who like biology. From Shakespeare’s wordplay to accelerating human evolution, biology is being used to explain more and more of our world. This is also true of topics that some people react to very emotionally, such as the genetic and environmental causes of human diversity in general intelligence.

The New Yorker has a mostly good article criticizing the role of genetics. Aside from a personal attack near the beginning, the article mainly emphasizes that changing the environment changes general intelligence. It got me wondering whether general intelligence itself could be changed by instruction, especially considering the finding that working memory correlates almost perfectly with IQ.

Well, Torkel Klingberg and colleagues asked that same question in 2002, and the answer is yes.

The researchers looked not only at measures of working memory, such as visuospatial working memory, and measures of general intelligence, such as Raven’s progressive matrices, but also at things you wouldn’t expect, like head-bobbing.

The measurement of head movements has been described in previous publications (Teicher et al., 1996). An infrared motion analysis system (OPTAx Systems, Burlington, MA) recorded the movements of a small reflective marker attached to the back of the head of the child. A movement was defined to begin when the marker moved 1.0 mm or more from its most recent resting location. The number of movements was recorded during a 15-min period when the child was performing a version of a continuous performance task. In this task subjects were asked to respond to a target and withhold response to nontargets, with no requirement of holding any information in WM. Stimuli were presented every 2.0 s, and 50% of stimuli were targets.
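
The counting rule in that excerpt is simple enough to sketch. The 1.0 mm threshold comes straight from the quote; everything else below (the sampling rate, and how a new “resting location” is established once a movement ends) is my own assumption, since the excerpt does not spell those details out.

```python
"""Rough sketch of the head-movement count described above. A movement is
counted whenever the marker strays 1.0 mm or more from its most recent
resting location (per the quoted method); the rest-detection rule is an
assumption of mine, not something stated in the paper."""
import math

SAMPLE_HZ = 50            # assumed sampling rate of the motion tracker
THRESHOLD_MM = 1.0        # displacement that starts a movement (from the quote)
REST_SAMPLES = SAMPLE_HZ  # assumed: ~1 s of staying put counts as "at rest"

def count_movements(positions: list[tuple[float, float]]) -> int:
    """positions: (x, y) marker coordinates in millimeters, one per sample."""
    rest = positions[0]        # most recent resting location
    candidate = positions[0]   # possible new resting location
    movements, still_run, moving = 0, 0, False
    for p in positions[1:]:
        if not moving:
            if math.dist(p, rest) >= THRESHOLD_MM:
                movements += 1          # marker left its resting spot
                moving = True
                candidate, still_run = p, 0
        else:
            if math.dist(p, candidate) < THRESHOLD_MM:
                still_run += 1
                if still_run >= REST_SAMPLES:
                    rest, moving = candidate, False  # settled at a new rest point
            else:
                candidate, still_run = p, 0
    return movements
```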

Why include head-bobbing, you ask? Well, head-bobbing among ADHD students is already subject to medication — so you can compare the benefits of training with the benefits of drugging:

The number of head movements was significantly reduced in the treatment group compared to the control group (Table 1, Fig. 1c). Again, this effect was evident in all subjects in the treatment group (Fig. 1c). The number of head movements during retest in the control group was about 6% higher than during the first testing. This is consistent with previous data on test-retest changes after administration of pharmacological placebo, where an increase of about 8% was found on the second testing (Teicher et al., 2000; Teicher, personal communication). The reduction of head movements in the treatment group was 74% (SEM 7). In comparison, a probe dose of methylphenidate (approximately 0.4 mg/kg) reduced the number of head movements by 62% (Teicher et al., 2000).

While the authors warn that more work is needed to see if this really leads to an increase in general intelligence, things look hopeful:

The present study showed that intensive and adaptive, computerized [working memory] WM training gradually increased the amount of information that the subjects could keep in WM (Tables 1 and 3, Figs. 1 and 2). The improved performance occurred over weeks of training, and is in this respect similar to the slow acquisition of a perceptual skill or a motor skill (Karni et al., 1995; Recanzone, Schreiner, & Merzenich, 1993; Tallal et al., 1996). Furthermore, the improvement from training was evident both for a group of children with ADHD (Experiment 1), as well as for adult subjects without ADHD (Experiment 2). This shows that an initial deficit in WM was not necessary for improvement to occur.

All learners have two broad sources of ability: knowledge of what they are doing, and the intelligence to apply it. Both of these can be improved with a positive environment, and both can be weakened by a bad environment.

To the extent that we wish to have a functioning systems administration at home and abroad, we must encourage those institutions that help develop skills and intelligence, and discourage those institutions that diminish them both.

General intelligence, working memory, and how American Public Schools hurt those who need them most

Colom, R., Rebollo, I., Palacios, A., Juan-Espinosa, M., & Kyllonen, P.C. (2004). Working memory is (almost) perfectly predicted by g. Intelligence, 32(3), 277-296. doi:10.1016/j.intell.2003.12.002.

Andrew Sullivan, Ezra Klein, Half Sigma, and other bloggers of note are going around on the question of the heritability of intelligence in general, and the possibility of biological causes for the differences in general intelligence observed in different groups. While occasionally people speak carelessly, it’s remarkable how far the Standard Social Sciences Model (SSSM) of all human differences being the result of different environments has already collapsed. There are three traditional ways to attack the notion of biologically driven racial differences in general intelligence:

  1. There is no such thing as general intelligence
  2. There are no such things as races
  3. The environmental conditions in which the races tend to exist are unequal

The first two criticisms are discredited. One can deny g or ancestry in the same way that one can deny Darwinian selection or the old Earth: through determined dogmatism.

The third criticism remains, if only because of the horrifying inequalities in the world today. Of course, environmental inequalities can rapidly turn into biological inequalities. One only needs to look at the Inbred Gap to know that. Yet it’s also true that one can be trained to perform better on any subset of the tests that are used to measure general intelligence. Thus the Flynn Effect: this or that measure will suddenly deviate from the rest, causing illusory growth or shrinkage in differences.

One measure that very closely approximates g (“(almost) perfectly predicts,” in the words of the paper’s excited authors) is working memory.

This article analyzes if working memory (WM) is especially important to understand g. WM comprises the functions of focusing attention, conscious rehearsal, and transformation and mental manipulation of information, while g reflects the component variance that is common to all tests of ability. The centrality of WM in individual differences in information processing leads some cognitive theorists to equate it with g. There are several studies relating WM with psychometric abilities like reasoning, fluid intelligence, spatial visualization, spatial relations, or perceptual speed, but there are very few studies relating WM with g, defined by several diverse tests. In three studies, we assessed crystallised intelligence (Gc), spatial ability (Gv), fluid intelligence (Gf), and psychometric speed (Gs) using various tests from the psychometric literature. Moreover, we assessed WM and processing speed (PS). WM tasks involve storage requirements, plus concurrent processing. PS tasks measure the speed by which the participants take a quick decision about the identity of some stimuli; 594 participants were tested. Confirmatory factor analyses yielded consistently high estimates of the loading of g over WM (.96 on average). WM is the latent factor best predicted by g. It is proposed that this is so because the latter has much in common with the main characteristic of the former.
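
The .96 figure comes from confirmatory factor analysis over real test batteries, which I will not reproduce here. Just to show the shape of the claim, here is a heavily simplified sketch on synthetic data: the first principal component of a diverse ability battery stands in as a crude proxy for g, and its correlation with a working memory composite is the quantity of interest. The sample size is borrowed from the abstract; every other number is made up for illustration, and PCA on fake data is of course not the paper’s method.

```python
"""Simplified, synthetic-data sketch of the g / working-memory relation
estimated (with proper confirmatory factor analysis) by Colom et al. (2004).
All loadings below are invented for illustration."""
import numpy as np

rng = np.random.default_rng(0)
n = 594  # sample size borrowed from the abstract

# Latent variables: g drives everything; latent WM is strongly g-loaded.
g = rng.standard_normal(n)
wm_latent = 0.9 * g + np.sqrt(1 - 0.9**2) * rng.standard_normal(n)

# Observed ability tests (stand-ins for Gc, Gv, Gf, Gs markers) = g + noise.
ability_tests = np.column_stack(
    [0.7 * g + 0.7 * rng.standard_normal(n) for _ in range(8)]
)
# Observed WM span tasks = latent WM + noise.
wm_tasks = np.column_stack(
    [0.7 * wm_latent + 0.7 * rng.standard_normal(n) for _ in range(3)]
)

# First principal component of the ability battery as a crude g proxy.
centered = ability_tests - ability_tests.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)
g_proxy = centered @ vt[0]

wm_composite = wm_tasks.mean(axis=1)
r = np.corrcoef(g_proxy, wm_composite)[0, 1]
print(f"correlation between g proxy and WM composite: {abs(r):.2f}")
```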

Working memory allows you to make sense of information so that you can remember it; its greatest importance is that it makes things easier to memorize. This also explains why school appears to lower the general intelligence of high-performing populations, such as the Chinese: if you are in an environment where high academic achievement is socially punished, excess working memory capacity naturally atrophies. Similarly, this may explain why the heritability of g increases with age: once out of the socialized public schools, an individual’s environment is more under his control, and an individual who enjoys tasks that involve the comprehension of complex materials will strengthen those neural connections more.

If g really is working memory, the educational implications are huge. The soft bigotry of low expectations is especially brutal to those with apparently low working memory capacity, because working memory does not matter once a task is memorized. Memorization is the way out of the trap of low working memory. And what’s needed for memorization is clear: practice, academic discipline, and practice. Yet who believes that, fifty years after Brown v. Board of Education, most majority-black schools are models of academic seriousness and discipline?

Even more tragic — if the link between general intelligence and working memory is strong — is that working memory is trivially easy to test. There’s no need for race-conscious policies at all to battle, through education, what may be the worst racial inequality. We could close much of the achievement gap, regardless of average biological differences between races.

Instead, we have America’s public schools.