
©2001 CRC Press LLC

19 Removing the HCI Bottleneck: How the Human-Computer Interface (HCI) Affects the Performance of Data Fusion Systems*

19.1 Introduction
19.2 A Multimedia Experiment
     SBIR Objective • Experimental Design and Test Approach • CBT Implementation
19.3 Summary of Results
19.4 Implications for Data Fusion Systems
Acknowledgment
References

19.1 Introduction

Mary Jane M. Hall, TECH REACH Inc.
Capt. Sonya A. Hall, Minot AFB
Timothy Tate, Naval Training Command

During the past two decades, an enormous amount of effort has focused on the development of automated multisensor data fusion systems.1-3 These systems seek to combine data from multiple sensors to improve the ability to detect, locate, characterize, and identify targets. Since the early 1970s, numerous data fusion systems have been developed for a wide variety of applications, such as automatic target recognition, identification-friend-foe-neutral (IFFN), situation assessment, and threat assessment.4 At this time, an extensive legacy exists for Department of Defense (DoD) applications. That legacy includes a hierarchical process model produced by the Joint Directors of Laboratories (shown in Figure 19.1), a taxonomy of algorithms,5 training material,6 and engineering guidelines for algorithm selection.7

The traditional approach to data fusion progresses from the sensor data (shown on the left side of Figure 19.1) toward the human user (on the right side of Figure 19.1). Conceptually, sensor data are preprocessed using signal processing or image processing algorithms. The sensor data are input to a Level 1 fusion process that involves data association and correlation, state vector estimation, and identity estimation. The Level 1 process results in an evolving database that contains estimates of the position, velocity, attributes, and identities of physically constrained entities (e.g., targets and emitters). Subsequently, automated reasoning methods, drawn from the discipline of artificial intelligence, are applied in an attempt to perform automated situation assessment and threat assessment.

*This chapter is based on a paper by Mary Jane Hall et al., "Removing the HCI bottleneck: How the human-computer interface (HCI) affects the performance of data fusion systems," Proceedings of the 2000 MSS National Symposium on Sensor and Data Fusion, Vol. II, June 2000, pp. 89-104.
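The Level 1 step described above, association of incoming measurements with existing tracks followed by state estimation, can be illustrated with a minimal sketch. This is not the chapter's implementation: the nearest-neighbor gating and the alpha-filter update below are deliberately simplified stand-ins for the association and Kalman-style estimation algorithms a real fusion system would use, and all names and parameter values are illustrative.

```python
import math

def associate(tracks, measurements, gate=5.0):
    """Nearest-neighbor association: pair each measurement with the
    closest existing track, provided it falls within a distance gate."""
    pairs, unmatched = [], []
    for z in measurements:
        best, best_d = None, gate
        for i, t in enumerate(tracks):
            d = math.dist(t["pos"], z)
            if d < best_d:
                best, best_d = i, d
        if best is None:
            unmatched.append(z)      # no track within the gate: candidate new track
        else:
            pairs.append((best, z))
    return pairs, unmatched

def update(tracks, pairs, unmatched, alpha=0.5):
    """State estimation: blend each track's position estimate toward its
    associated measurement (an alpha filter, a crude Kalman surrogate),
    and spawn new tracks for unassociated measurements."""
    for i, z in pairs:
        t = tracks[i]
        t["pos"] = tuple(p + alpha * (m - p) for p, m in zip(t["pos"], z))
    for z in unmatched:
        tracks.append({"pos": z})
    return tracks

# Two existing tracks; one measurement near the first, one far from both.
tracks = [{"pos": (0.0, 0.0)}, {"pos": (10.0, 10.0)}]
pairs, new = associate(tracks, [(1.0, 0.0), (50.0, 50.0)])
tracks = update(tracks, pairs, new)
print(tracks)  # first track nudged toward (1, 0); (50, 50) spawns a third track
```

Iterating these two steps over successive sensor reports yields exactly the evolving track database the text describes; real systems replace the gate test with statistical gating and the alpha filter with a full Kalman update.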

Ultimately, the results of this dynamic process are displayed for a human user or analyst via a human-computer interface (HCI) function. Note that this description of the data fusion process has been greatly simplified for conceptual purposes. Actual data fusion processing is much more complicated and involves an interleaving of the Level 1 through Level 3 (and Level 4) processes. Nevertheless, this basic orientation is often used in developing data fusion systems: the sensors are viewed as the information source, and the human is viewed as the information user or sink. In one sense, the rich information from the sensors (e.g., the radio frequency time series and imagery) is compressed for display on a small, two-dimensional computer screen.

Bram Ferran, the vice president of research and development at Disney Imagineering Company, recently pointed out to a government agency that this approach is a problem for the intelligence community. Ferran8 argues that the broadband sensor data are funneled through a very narrow channel (i.e., the computer screen on a typical workstation) to be processed by a broadband human analyst. In his view, the HCI becomes a bottleneck or very narrow filter that prohibits the analyst from using his extensive pattern recognition and analytical capabilities. Ferran suggests that the computer bottleneck effectively defeats one million years of evolution that have made humans excellent data gatherers and processors. Interestingly, Clifford Stoll9,10 makes a similar argument about personal computers and the multimedia misnomer.

Researchers in the data fusion community have not ignored this problem. Waltz and Llinas3 noted that the overall effectiveness of a data fusion system (from sensing to decisions) is affected by the efficacy of the HCI. Llinas and his colleagues11 investigated the effects of human trust in aided adversarial decision
