
UI Design Update newsletter - September, 1999 (fwd)

From: Lloyd G. Rasmussen <lras@loc.gov>
Date: Wed, 06 Oct 1999 00:09:04 -0400
Message-Id: <>
To: w3c-wai-ua@w3.org
 >This was forwarded through our LOC web design mailing list.  Maybe you've
already seen it.  If not, it looks like good ammo for the user agent
working group meeting.
 > -- Lloyd
 >>Approved-By:  "Elizabeth F. Miller" <emil@LOC.GOV>
 >>Date:         Fri, 24 Sep 1999 07:55:04 -0400
 >>Reply-To: "Library of Congress HTML Users' Forum" <HTML@loc.gov>
 >>Sender: "Library of Congress HTML Users' Forum" <HTML@loc.gov>
 >>From: "Elizabeth F. Miller" <emil@loc.gov>
 >>Subject:      UI Design Update newsletter - September, 1999 (fwd)
 >>Comments: To: html@loc.gov
 >>To: Multiple recipients of list HTML <HTML@RS8.LOC.GOV>
 >>---------- Forwarded message ----------
 >>Date: Thu, 23 Sep 1999 11:56:29 -0500
 >>From: hfi@humanfactors.com
 >>To: Newsletter@humanfactors.com
 >>Subject: UI Design Update newsletter - September, 1999
 >>Insights from Human Factors International, Inc. (HFI)
 >>Providing consulting and training in software ergonomics.
 >>Every month HFI reviews the most useful developments in
 >>UI research from major conferences and publications.
 >>In this issue Dr. Bob Bailey reviews:
 >>1. Multimedia and Working Memory Limitations - with complex
 >>    tasks, working memory capacity can be "increased" by using
 >>    two senses rather than one.
 >>2. Using Multimedia in Instruction - some guidelines.
 >>When Two Sensory Modes are Better than One, Tindall-Ford, S.,
 >>Chandler, P. and Sweller, J., Journal of Experimental Psychology:
 >>Applied, 3(4), 257-287 (1997).
 >>When designing user interfaces, when is it worth taking the time
 >>to let users use two senses rather than one? In other words, when
 >>is there a human performance advantage to having users both read
 >>and hear information?
 >>Tindall-Ford, Chandler and Sweller (1997) compared performance
 >>outcomes when participants read text and evaluated visual
 >>diagrams (visual-only) versus heard text and evaluated visual
 >>diagrams (auditory and visual). They postulated that any improved
 >>performance was due primarily to an effective expansion of
 >>"working memory" limitations. Human working memory consists
 >>of both a visual-spatial sketch pad for dealing with visual material
 >>(text, pictures, diagrams), and a phonological loop for dealing with
 >>auditory information. These two processors are assumed to
 >>operate independently.
 >>Several past studies were reviewed. They showed that:
 >>1. People were better able to carry out two tasks simultaneously
 >>    (repeating spoken words and learning new words) if each task
 >>    involved a different modality (visual vs. auditory),
 >>2. People were presented with a verbal description of a layout
 >>    that was sufficiently complex to be unintelligible unless visualized.
 >>    The description was presented either in auditory form or
 >>    simultaneously in auditory and written form. The auditory-alone
 >>    mode resulted in superior performance.
 >>3. Two groups of children were asked to either listen to or read
 >>    a story. Half of each group were instructed to visualize the story
 >>    while it was being presented. On a test given after the story, the
 >>    "visualizers" performed better, but only those who listened to
 >>    the story, not those who read it.
 >>These findings suggest that in certain complex situations, working
 >>memory capacity can be "increased" by using two senses rather
 >>than one. Conversely, performance can be substantially degraded
 >>when people must attend to multiple sources of information that
 >>must be mentally integrated before meaning can be derived. Thus,
 >>designers should present information to users in ways that reduce
 >>the need for mental integration, and consequently reduce the
 >>demands on working memory.
 >>Tindall-Ford et al. conducted three experiments using electrical
 >>trade apprentices. In the first study, one group learned by using a
 >>diagram and separated written text, a second group used a diagram
 >>and integrated written text, and a final group used a diagram and
 >>auditory instructions. The latter two groups performed reliably
 >>better because their working memory resources were not exceeded.
 >>In the second study they evaluated user performance on a complex
 >>task when using:
 >>   (a) a table and related text, versus
 >>   (b) a table with an auditory explanation.
 >>The visual-audio group performed reliably better, again because
 >>of the reduced load on working memory.
 >>In the third study, they had participants either look at diagrams
 >>and read instructional materials (visual-only) or look at diagrams
 >>while listening to instructional material (audio-visual). They
 >>performed two easy tasks and one difficult task. In the easy tasks
 >>there were no differences between visual-only and visual-audio.
 >>In the difficult task, the participants using two senses (vision and
 >>hearing) performed reliably better.
 >>This article makes a strong case for having designers take the time
 >>to physically integrate information in computer systems, i.e., put
 >>all required information within close proximity. When this is not
 >>possible, and when the task is complex, working memory capacity
 >>can be extended by presenting information using both visual and
 >>auditory modes.
 >>The Use of Multimedia in Instruction, Williams, J. R., Proceedings
 >>of the Human Factors and Ergonomics Society 42nd Annual
 >>Meeting, 1447-1451 (1998).
 >>Williams (1998) reviewed the literature on using multimedia in
 >>instruction. He extracted numerous guidelines on the effective use
 >>of multimedia after reviewing about 100 literature sources. One
 >>of his many discussions was a section on using combined visual
 >>and verbal information.
 >>In general, the past research seems to indicate that combining
 >>visual and verbal (auditory) information can lead to enhanced
 >>comprehension, when compared to their use alone (see the
 >>Tindall-Ford et al. article discussed above). But designers should also
 >>aware that having both visual and audio modes may result in no
 >>performance improvements (if the task is too simple), and may
 >>or may not increase user satisfaction.
 >>Some guidelines:
 >>-- Past research suggests that visual and narrative information
 >>    should be presented simultaneously, or the visuals should
 >>    precede the narrative by no more than seven seconds.
 >>-- Both the visual and auditory information should be totally
 >>    relevant to the task being performed.
 >>-- When words are spoken, the content should be simple,
 >>    and the speed of narration should be about 160 words per
 >>    minute. The narration should be slowed when used to
 >>    introduce new ideas or concepts.
 >>-- Off-screen narration should be used rather than on-screen
 >>    narration, unless the narrator is a recognized authority on
 >>    the topic.
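 >>The pacing guideline above (narration at about 160 words per
 >>minute) can be turned into a quick planning check. The sketch
 >>below is not from the newsletter; it is a minimal illustration,
 >>and the sample script and function name are hypothetical:

```python
def narration_seconds(script: str, words_per_minute: float = 160.0) -> float:
    """Estimate the spoken duration of a narration script at the
    recommended pace of roughly 160 words per minute."""
    word_count = len(script.split())
    return word_count * 60.0 / words_per_minute

# Hypothetical 160-word script: at 160 wpm it should take one minute.
script = "Connect the red lead to the positive terminal. " * 20
print(round(narration_seconds(script), 1))  # prints 60.0
```

 >>A writer could use such an estimate to keep each narrated segment
 >>short, and to budget extra time where new concepts are introduced,
 >>per the guideline above.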
 >>NOTE FROM BOB: If there are any references you feel are
 >>important to include in the Annual User Interface Update - 2000,
 >>or in future newsletters, please let us know. They could be either
 >>published articles, conference proceedings, or internal research
 >>papers. Send references to mailto:hfi@humanfactors.com.
 >>3-day 1999 Annual User Interface Update Seminar presented by
 >>Dr. Robert Bailey.
 >>Register for UI Update Seminar in Seattle - November 2-4/99.
 >>Suggestions, comments, questions?
 >>HFI editors at mailto:hfi@humanfactors.com.
 >>Want past issues?
 >>Subscribe? - http://www.humanfactors.com/library/subscribe.asp
 >>Do NOT want this newsletter?
 >>E-mail mailto:unsubscribe@humanfactors.com with a Subject of:
 >>"Unsubscribe Newsletter"
Lloyd Rasmussen, Senior Staff Engineer
National Library Service f/t Blind and Physically Handicapped
Library of Congress    (202) 707-0535  <lras@loc.gov>
HOME:  <lras@sprynet.com>   <http://lras.home.sprynet.com>
Received on Wednesday, 6 October 1999 01:21:56 UTC
