Tactile reasoning and adaptive architecture for intelligence sense-making

Wong, B. L. William and Choudhury, Sharmin (Tinni) (2011) Tactile reasoning and adaptive architecture for intelligence sense-making. In: Symposium on "Emerged/Emerging 'Disruptive' Technologies" (E2DT), 09-10 May 2011, Madrid, Spain.

Full text: PDF, 578kB

Official URL: http://ftp.rta.nato.int/public/PubFullText/RTO/MP/...

Abstract

Visual analytics is the science of analytical reasoning facilitated by interactive visual interfaces [1]. Visual analytics combines automated analysis techniques with interactive visualisations to facilitate reasoning and sense-making over large and complex data sets [2]. A key component of visual analytics is information visualisation: the communication of abstract data through visual representations that simplify, aggregate and reveal important relationships [3]. However, information visualisation is only one part of the visual analytics equation. The other major component is the ability to manipulate the data directly, and to query and initiate analytic processes through that manipulation and the resulting information [1]. Together, interaction, visualisation and analytics combine to create powerful tools for supporting analysis and reasoning with large, mixed-format, multi-source data sets. We are interested in the application of tactile reasoning to visual analytics. We define tactile reasoning as an interaction technique that supports the analytical reasoning process through the direct manipulation of information objects in a graphical user interface (GUI). In a study by Maglio et al. [4], participants using Scrabble pieces (individual letter tiles) generated more words when they were allowed to manipulate the pieces than when they were not. The act of tactile manipulation, i.e. the ability to rearrange the tiles, allowed participants to form words that they could not form without interaction. We therefore hypothesise that tactile reasoning, through the manipulation, rearrangement and other interaction with information objects, enables individuals to see patterns in visually presented data sets that they might otherwise not see.
In this paper we describe the concept of tactile reasoning in the context of visual analytics, and the adaptive architecture needed to support it during real-time manipulation. We conduct our investigation through a lab prototype, INVISQUE (Interactive Visual Search and Query Environment) [4,5]. INVISQUE provides an information visualisation interface coupled with a "reasoning workspace" that facilitates tactile reasoning. INVISQUE was funded by JISC to provide an alternative interface that improves information search, retrieval and sense-making in electronic library resource discovery systems such as the Emerald and ISI electronic journal databases. We have developed an adaptive architecture that underlies INVISQUE and supports sense-making by giving the system the capability to adapt rapidly to changing circumstances.

Item Type: Conference or Workshop Item (Paper)
Keywords (uncontrolled): visual analytics, tactile reasoning, human-computer interaction
Research Areas: Middlesex University Schools and Centres > School of Science and Technology > Computer Science
ID Code: 7930
Deposited On: 06 Jun 2011 15:30
Last Modified: 10 Dec 2014 20:17

