John Sweller, “Cognitive Load Theory, Learning Difficulty, and Instructional Design”

In this article Sweller argues that, “when considering intellectual activities, schema acquisition and automation are the primary mechanisms of learning” (p. 295). He describes a “schema” as “a cognitive construct that organizes the elements of information according to the manner with which they will be dealt” (p. 296) and cognitive load theory as suggesting that “instructional techniques that require students to engage in activities that are not directed at schema acquisition and automation, frequently assume a processing capacity greater than our limits and so are likely to be defective [sic]” (p. 299). Sweller considers total cognitive load to be “an amalgam of at least two quite separate factors: extraneous cognitive load which is artificial because it is imposed by instructional methods and intrinsic cognitive load over which instructors have no control [sic]” (p. 307). Throughout the article, Sweller presents various arguments for reducing or eliminating extraneous cognitive load and various ways that educators (instructional designers and teachers) can do so. He sees such a reduction or elimination as necessary only under certain circumstances, however, because “extraneous cognitive load that interferes with learning only is a problem under conditions of high cognitive load caused by high element interactivity. Under conditions of low element interactivity, re-designing instruction to reduce extraneous cognitive load may have no appreciable consequences” (p. 295).
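
To restate the logic of Sweller’s argument in concrete terms, here is a small Python sketch of my own; the numeric load values, the capacity threshold, and the function names are illustrative assumptions, since Sweller offers no such quantities. It simply treats total load as the sum of intrinsic and extraneous load and flags a design only when that sum exceeds a fixed working-memory capacity.

```python
# Illustrative sketch of Sweller's decomposition: total cognitive load is the
# sum of intrinsic load (set by the material's element interactivity) and
# extraneous load (imposed by the instructional method). All numbers below
# are invented for illustration; Sweller does not quantify load this way.

WORKING_MEMORY_CAPACITY = 7   # assumed, arbitrary capacity limit

def total_load(intrinsic, extraneous):
    return intrinsic + extraneous

def likely_defective(intrinsic, extraneous):
    """Instruction is 'likely to be defective' when total load exceeds capacity."""
    return total_load(intrinsic, extraneous) > WORKING_MEMORY_CAPACITY

# High element interactivity: reducing extraneous load changes the outcome.
print(likely_defective(intrinsic=6, extraneous=3))   # True  -> redesign helps
print(likely_defective(intrinsic=6, extraneous=1))   # False

# Low element interactivity: redesign has "no appreciable consequences."
print(likely_defective(intrinsic=2, extraneous=3))   # False
print(likely_defective(intrinsic=2, extraneous=1))   # False
```

The last two calls restate the closing point of the paragraph above: under low element interactivity, trimming extraneous load makes no difference to whether capacity is exceeded.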

Driscoll, Chapters 4 & 5

In Chapter 4 Driscoll primarily discusses concepts put forward by the American psychologist David Ausubel. Principal among these is “meaningful learning,” which Ausubel contrasts with rote learning. The main processes that make up meaningful learning are derivative and correlative subsumption (wherein “new, incoming ideas are subsumed under more general and inclusive anchoring ideas already in memory” [p. 118]), superordinate and combinatorial learning (which is learning “that is not subordinate in nature” [p. 121]), and assimilation theory (which states that the “‘result of the interaction that takes place between the new material to be learned and the existing cognitive structure is an assimilation of old and new meanings to form a more highly differentiated cognitive structure’” [p. 123]).

The other principal concept this chapter explores is “schema theory,” which is concerned with how schemata, “packets of knowledge… are represented and how that representation facilitates the use of the knowledge in particular ways” (p. 129). These schemata can then (and should, Ausubel argues) be used to activate prior knowledge by way of “advance organizers,” which are “relevant and inclusive introductory materials, provided in advance of the learning materials, that serve to ‘bridge the gap between what the learner already knows and what he needs to know before he can meaningfully learn the task at hand’” (p. 138).

In Chapter 5 Driscoll examines the theory of “situated cognition,” in which “cognition is assumed to be social and situated activity…; one learns a subject matter by doing what experts in that subject matter do” (p. 156). The theory further maintains that “’every human thought is adapted to the environment, that is, situated, because what people perceive, how they conceive of their activity, and what they physically do develop together’ (Clancey, 1997, pp. 1-2; italics in original). Moreover, what people perceive, think, and do develops in a fundamentally social context” (p. 157).

Mory and Wagner, “The Role of Questions in Learning”

Mory and Wagner state that “the premise of this chapter is that in order to understand variables affecting feedback [provided by educators to learners], research must ask the question, ‘Feedback for what?’” (p. 56). “Questions and feedback are inextricably related,” they contend, in that “[f]eedback is always related to a response generated by a question. In this sense, the meaning of feedback is dependent upon its context in the instruction” (p. 71). And so, when one asks “‘Feedback for what?’”, one has to “take into consideration the type of question, the stage of information processing, and conditions within the learner to arrive at an answer” (p. 72).

In the course of developing this premise, Mory and Wagner explore the various properties and functions of short-term and long-term memory, as well as strategies that can be used to improve the functioning of both kinds of memory. One strategy employed for short-term memory is “chunking,” which works to increase the “small capacity of the short-term memory to process information” (p. 65). That is, “[b]y organizing the stimulus [experienced by a learner] into a sequence of chunks, the information is put into more manageable units (Miller, 1956). These chunks can then be remembered separately and chained together” (p. 65).
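
As a concrete illustration of chunking, here is a small Python sketch of my own; the three-digit grouping and the digit-string example are assumptions made for illustration, not anything Mory and Wagner specify. It regroups a ten-item string of digits into a handful of larger units, which is the kind of reorganization the quoted passage describes.

```python
# Minimal sketch of "chunking": regroup a long sequence of individual items
# into a few larger units so fewer things must be held in short-term memory.
# The chunk size (3) and the digit-string example are illustrative assumptions.

def chunk(sequence, size):
    """Split a sequence into consecutive chunks of at most `size` items."""
    return [sequence[i:i + size] for i in range(0, len(sequence), size)]

digits = "8505551234"        # ten separate digits, near short-term capacity
chunks = chunk(digits, 3)    # ['850', '555', '123', '4']
print(len(digits), "items regrouped into", len(chunks), "chunks:", chunks)
```

Each chunk can then be rehearsed as a single unit and the chunks chained together, per the passage above.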

Smith and Ragan, Chapters 1 & 2

Smith and Ragan open their book with a series of solid, foundational definitions. As they describe it, “instructional design” is “the systematic and reflective process of translating principles of learning and instruction into plans for instructional materials, activities, information resources, and evaluation” (p. 2). “Instruction” is defined (for the authors’ expressed purposes, at least) partly by distinguishing it from related terms such as “training,” “teaching,” and “education.” The term “design,” in turn, Smith and Ragan note, “implies a systematic or intensive planning and ideation process prior to the development of something or the execution of some plan in order to solve a problem” (p. 4).

The authors go on to describe the instructional design process—or, rather, indicating that there is no “one size fits all solution” to the design process, they offer a framework or guide to some of the more commonly used and accepted approaches to it. Smith and Ragan also qualify their descriptions by noting that, while the instructional design process “may often be portrayed as linear, in practice it is frequently iterative, moving back and forth between activities as the project develops” (p. 11).

The two foundational poles the authors present in the second chapter are rationalism (of which constructivism is one form), which holds that “reason is the primary source of knowledge and that reality is constructed rather than discovered” (p. 19), and empiricism, which, “sometimes termed objectivism,… postulates that knowledge is acquired through experience…. [, and that] experience allows an individual to come to know a reality that is objective and singular” (p. 22).

Smith and Ragan subcategorize constructivism into individual constructivism, social constructivism, and contextualism. Between rationalism-constructivism and empiricism-objectivism, the authors place their own position, “‘Pragmatism,’ which might be considered a ‘middle ground’ between rationalism (constructivism) and empiricism…. Although pragmatists, like empiricists, believe that knowledge is acquired through experience, they believe that this knowledge is interpreted through reason and is temporary and tentative” (p. 22).

Guenther, Chapter 1

Guenther opens Human Cognition (1998) with a historical overview of the terrain he intends to traverse in the book. The first, introductory chapter, he notes, is meant to serve as “a discussion of the important events in our history that brought about the transition from the predominantly supernatural perspective that characterized the cosmology of medieval Europe to the natural perspective on human mental life that characterizes the cosmology of our modern culture” (p. 1). We arrive at the view of cognitive science most easily recognized and readily accepted today by way of other theories of the mind (which Guenther undoes one by one): spiritual, mechanistic, behaviorist, and, finally, computational. Guenther spends the bulk of the chapter addressing the problems with applying a computer metaphor to human cognition, though he explains the several ways in which (as of 1998) neural nets may hold significant potential for thinking about our thinking. He concludes by arguing against those who would say that the materialistic approaches accepted by many today are, ultimately, dehumanizing.

Driscoll, pp. 71-77

In these opening sections to Chapter 3, “Cognitive Information Processing,” Driscoll—by way of some illustrative scenario-based examples (pp. 72-77)—provides an “Overview of the Information-Processing System” (p. 74), outlining both “The Stages of Information Processing” (p. 74) and the “The Flow of Information During Learning” (pp. 76-77).

Driscoll contextualizes her overview by describing the development of Cognitive Information Processing (CIP) in the field of psychology—wherein, following developments in computer science and technology after WWII, “the computer metaphor [was] adopted for conceptualizing cognition” (p. 74).

“Most models of information processing,” Driscoll notes, “can be traced to Atkinson and Shiffrin…, who proposed a multistore, multistage theory of memory. That is, from the time information is received by the processing system, it undergoes a series of transformations until it can be permanently stored in memory” (p. 74). “This flow of information,” Driscoll continues, is generally understood as a “memory system” composed of “three basic stages”—“sensory memory, short-term memory, and long-term memory”—“along with the processes assumed to be responsible for transferring information from one stage to the next” (p. 74).
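
To make this multistore flow easier to picture, here is a minimal Python sketch of my own; the class, its attribute names, and the transfer methods are illustrative assumptions rather than Atkinson and Shiffrin’s or Driscoll’s formal model. It represents the three stages as simple lists and the transfer processes (attention and encoding) as moves between them.

```python
# Illustrative sketch of the three-stage flow Driscoll describes:
# sensory memory -> short-term memory -> long-term memory, with transfer
# processes moving information from one stage to the next. Names are assumed.

class MemorySystem:
    def __init__(self):
        self.sensory = []      # brief, high-capacity store of raw stimuli
        self.short_term = []   # limited-capacity working store
        self.long_term = []    # effectively permanent store

    def perceive(self, stimulus):
        """Incoming information first enters sensory memory."""
        self.sensory.append(stimulus)

    def attend(self, stimulus):
        """Attention transfers selected information into short-term memory."""
        if stimulus in self.sensory:
            self.sensory.remove(stimulus)
            self.short_term.append(stimulus)

    def encode(self, stimulus):
        """Rehearsal and encoding transfer information into long-term memory."""
        if stimulus in self.short_term:
            self.short_term.remove(stimulus)
            self.long_term.append(stimulus)

memory = MemorySystem()
memory.perceive("new term")
memory.attend("new term")
memory.encode("new term")
print(memory.long_term)   # ['new term'], i.e. "permanently stored in memory"
```

Each method stands in for one of the transfer processes the quoted passage says is “assumed to be responsible for transferring information from one stage to the next.”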