I am not currently teaching ASL3370: Sign to Spoken English Interpreting. Please note: All information currently available on this site represents work and due dates relevant to a previous semester/course. Please check back during later semesters for updated information on this course. Thank you.
Managing memory issues in an interpretation
Arguably the greatest challenge to equivalency in an interpreting event is retaining and controlling the information shared during the event: who did what to whom? What was the number? Did s/he just spell 'car' or 'cat'?
Here are some learning outcomes to keep in mind:
- How do I learn? Am I more of a visual, auditory, or kinesthetic learner? How does that impact my ability to produce a real-time spoken language interpretation?
- How do I deal with message ‘decay’? Why do I make miscues?
- How can I manage cognitive load while producing a spoken language interpretation?
Take this learning-styles assessment to see where your learning tendencies lie. Required
Related: Take a Myers-Briggs personality exam for more detailed information on your learning and psychological preferences. The goal here is not to create a diagnosis, but to identify how you personally process (or dual process) information in an interpreting event.
What is the brain doing when it’s processing and/or looking at information? Required
This Canadian Science Writers' Association article discusses some of the near-impossible science happening in our brains as we move between two languages. "Neurologically speaking, the job is tough. Speaking while listening and having two languages active at once are extreme feats of multitasking. Simultaneous interpretation is spontaneous, unpredictable, and demands attention to tone, body language, facial expressions, and word order, which differs across languages." Required
A Quora discussion/explanation of how information is actually stored and recalled; cf. some of our other readings (on this page) on the same question. What is the application for sign language interpreting?
I’m not a huge fan of the ‘10 Things’ genre, but this is especially germane to our discussion.
Tangential to interpreting, the field of user experience devotes a great deal of effort to understanding how users cognitively and psychologically process presented information. This article discusses cognitive effort, schema/schemata, and gestalt principles as ways of taxonomizing information. Nerd alert, but helpful in seeing that interpreters aren't the only ones doing this.
Science shows our memory can easily be distorted and erased — but our forgetfulness also helps us survive.
Less an article and more a list of other articles, this 'article' helps us reevaluate why we make bad (or inequivalent) decisions in our interpreting work.
Ever wonder why some people don’t see or think that they make errors? It’s because we don’t know what we don’t know. Here’s the paper/research. Beware the juice.
2011 NYT article on the very real, finite mental energy needed to make decisions and exercise self-control. What is the relationship to SL interpreting?
Brain Games and Training
Games, exercises, and training to improve brain health and performance. (First few games free, paid account required)
Readings related to Cognitive Load Theory
CLT has roots in work by instructional psychologists Miller (1956; the 7±2 concept) and Simon & Chase (1973; "chunking") and is the subject of a large body of study by John Sweller (1988, 1998, 1999, 2001, 2002, among others) at the University of New South Wales. Sweller's work primarily deals with the amount of information/cognition transmitted in instruction (SL interpreters as 'instructors', anyone?), but the findings are clearly germane to SL interpreters and their work. Required
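The chunking idea can be sketched in a toy Python example (not from the readings; the function name and the phone-number example are illustrative only): grouping raw items into larger meaningful units lowers the number of things working memory must hold at once.

```python
def chunk(items, size=3):
    """Group a flat sequence into fixed-size chunks.

    Chunking (Simon & Chase, 1973) reduces the number of units
    working memory must track: ten raw digits push past Miller's
    7±2 span, but three or four chunks fit comfortably.
    """
    return [items[i:i + size] for i in range(0, len(items), size)]

digits = "4155551234"       # ten digits: near the 7±2 limit as raw items
groups = chunk(digits, 3)   # four chunks: well within the span
print(groups)               # -> ['415', '555', '123', '4']
```

The same principle is why phone numbers are written 415-555-1234 rather than as an unbroken digit string, and (by analogy) why interpreters group source-language details into larger semantic units rather than holding each word separately.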
An introduction to efficiency in learning (read pp. 9–13)
This is Chapter 1 of Clark, R. C., Nguyen, F., & Sweller, J. (2006), Efficiency in learning: Evidence-based guidelines to manage cognitive load (pp. 1–13). New York: John Wiley and Sons. This short chapter gives an introduction to CLT, explains a broad taxonomy of types of cognitive load ('intrinsic', 'germane/relevant', and 'extraneous/irrelevant'), and introduces techniques for managing each kind of load.
This excerpt (pp. 603–605) from Wilson, B., & Cole, P. (1996), Cognitive teaching models, in D. H. Jonassen (Ed.), Handbook of research for educational communications and technology (pp. 601–621), New York: Macmillan, is a simplified explanation of CLT; how does it apply to SL interpreting work?
First discussed by Allan Paivio, dual-coding theory hypothesizes that visual and verbal information are processed in different areas of the brain, with schemas for each produced in separate visual and verbal channels. What implications does this have for how SL interpreters create schemas for information in each channel? Required
Muscle memory is the concept of training behaviors (or, literally, gross/fine motor skills) to be automatic: we perform tasks almost subconsciously and/or without deliberate thought because we have performed them many times. There is an analog in the design of the user interfaces of our phones, iPods, etc. Does muscle memory also apply to SL interpreting? What skills (if any) can be made automatic, and what skills (if any) must remain outside of automaticity? Additional reading in Karni, Meyer, et al. (1998). Here's a visual example of how a two-and-a-half-year-old creates muscle memory with an iPad. Required