



The Natural Interactive Walking project (NIW) will proceed from the hypothesis that walking, by enabling rich interactions with floor surfaces, consistently conveys enactive information that manifests itself predominantly through haptic and auditory cues. Vision will be regarded as playing an integrative role, linking locomotion to obstacle avoidance, navigation, balance, and the understanding of details occurring at ground level. The ecological information we obtain from interaction with ground surfaces allows us to navigate and orient during everyday tasks in unfamiliar environments, by means of the invariant ecological meaning that we have learned through prior experience with walking tasks.
Objectives

The project's two main objectives are:

i) the production of a set of foot-floor multimodal interaction methods, for the virtual rendering of ground attributes, whose perceptual saliency has been validated;

ii) the synthesis of an immersive floor installation displaying a scenario of ground attributes and floor events on which to perform walking tasks, designed to be of interest in areas such as rehabilitation and entertainment.

Given the lack of fundamental knowledge of foot-floor interaction, as opposed to the fairly large body of existing know-how in locomotion interfaces and related tasks, the consortium envisions that achieving these two objectives will mark a significant breakthrough in the field of walking interfaces.

In more detail

The project intends to define knowledge, methods and tools for the design of walking experiences through the auditory, haptic, and visual augmentation of otherwise perceptually neutral floors. This will be accomplished through the realization of human-computer walking interfaces conveying virtual ecological cues of ground attributes and floor events. NIW will focus on walking as an everyday experience in which to situate the sensing and display of rich haptic and acoustic information for the simulation of basic floor properties such as material, composition and texture. It will draw on vision in a complementary role, to communicate properties like deformation and the presence of objects through the animation of visually rendered ground features and events. NIW will link floor properties to the physical, material attributes that are considered crucial for characterizing familiar contact interactions of the feet with ground surfaces. Taken together, such attributes permit the characterization of a broad spectrum of materials, including: uniform solids such as concrete and wood; textured surfaces such as carpeted floors; and loose, aggregate materials such as gravel, sand, and snow.
While the rendering of realistic surface textures and contact interactions has been a significant aim in manually operated virtual environments, interaction with ground surfaces also brings to the fore the additional material dimensions possessed by aggregate materials. Recent research indicates that the human haptic and auditory sensory channels are particularly sensitive to material properties explored during walking, and earlier studies have demonstrated strong links between the physical attributes of sounding objects and the auditory percepts they generate. Among these attributes, the project will select those that evoke the most salient perceptual cues in subjects.

With regard to events, the project will concentrate on basic dynamic effects such as impact, friction, and deformation, for the characterization of a simple but rich phenomenology of ecological foot-floor interaction events. Such effects account for normal and tangential contact interactions between feet and floors. They are elicited by diverse component actions undertaken during walking, such as the impact of a shoe against the floor, the deformation of the surface underfoot as weight is shifted onto it, the scraping of a shoe against the ground, and the bouncing and rolling of a stone that is kicked. From the standpoint of auditory events, the three effects above, together with rolling, constitute the four basic solid interaction types in Gaver's taxonomy of everyday sound.

NIW is strongly aimed at developing approaches to interaction that are complementary to the costly and complex robotic walking interfaces that have been a main focus of prior research on virtual walking experiences. The NIW project will develop knowledge toward the approximation of such experiences by means of simple, repeatable, and low-cost interaction methods and devices.
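As a purely illustrative sketch (the event labels are hypothetical, chosen for this example and not taken from the project), the component walking actions just described can be mapped onto Gaver's four basic solid interaction types:

```python
# Illustrative sketch only: mapping component walking actions onto
# Gaver's four basic solid interaction types. Event labels are
# hypothetical names invented for this example.
GAVER_TYPES = {"impact", "friction", "deformation", "rolling"}

FOOT_FLOOR_EVENTS = {
    "heel_strike": "impact",               # shoe hitting the floor
    "surface_compression": "deformation",  # ground yielding as weight shifts
    "shoe_scrape": "friction",             # shoe scraping against the ground
    "kicked_stone": "rolling",             # stone bouncing and rolling away
}

def classify(event: str) -> str:
    """Return the Gaver interaction type for a foot-floor event."""
    interaction = FOOT_FLOOR_EVENTS[event]
    assert interaction in GAVER_TYPES
    return interaction
```

Such a mapping is only a classification convenience; the project's actual rendering methods attach synthesis models, not labels, to each interaction type.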
Such experiences will be made effective through the careful application and validation of abstracted, illusory, and cross-modal perceptual and perceptual-motor effects. As described below, the walking interaction methods innovated in NIW will be applicable to closed VR environments, and readily exportable to many other unconstrained contexts that do not depend on the presence of VR display systems. A wide range of haptic illusions is known, and many of them have been successfully used in the design of new haptic interactive devices. These include some that depend on unusual features of cutaneous or proprioceptive haptic perception, and others that simulate ecologically based phenomena, such as the rolling of a stone along a manipulated rod. Such effects will be at the heart of the development effort in NIW.

In the area of auditory display, much has recently been accomplished in the interactive generation of everyday sounds by means of abstracted, lumped signal-processing models, and these constitute another research area of the project. Physically based sound synthesis models are capable of representing sustained and transient interactions between objects of different forms and material types, and such methods will be used in NIW to model and synthesize the sonic effects of basic interactions between feet and ground materials, including impacts, friction, and the rolling of loose materials.

The project will likewise explore unfamiliar effects that are capable of thrusting users into paradoxical situations, starting from the phenomenon of "pseudo-haptic feedback". As an example, the impression of being absorbed into the ground, as if stuck in a swamp or in mud, might be conveyed through visual feedback.
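A minimal sketch of how such a "stuck in the mud" impression could be driven by visual feedback alone, assuming a simple scalar visual gain (the function name and all values are illustrative, not taken from the project):

```python
def displayed_position(physical_displacement, visual_gain):
    """Pseudo-haptic rendering via a scalar visual gain.

    A gain below 1 makes on-screen motion lag the user's real motion,
    which tends to be read as resistance (sticky or muddy ground);
    a gain above 1 suggests a slippery or compliant surface.
    Illustrative sketch only, not the project's implementation.
    """
    return visual_gain * physical_displacement

# One real step of 0.7 m rendered on a "muddy" patch with gain 0.4
# yields roughly 0.28 m of on-screen displacement:
on_screen = displayed_position(0.7, 0.4)
```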
As in previous examples of pseudo-haptic feedback (e.g., those related to the texture of a manipulated object), the origin of the illusion and of the cross-modal transfer is the modification of the control/display ratio, i.e., the visual gain, or ratio between the user's real physical movement and the visual displacement on the screen.

Activities

To date, the generation and use of simplified but perceptually salient ecological cues in walking, such as those noted above, has been missing. Such phenomena are nonetheless highly relevant to current and emerging augmented and virtual environment applications in diverse contexts. The NIW project will innovate a number of methods for closed-loop interaction centred on phenomena and problems where multisensory feedback and sensory substitution can be exploited to create unitary multimodal percepts of the sort encountered in everyday walking tasks. The S&T plan of the project is centred around three activities, following a methodology that will lead to a progressive development of methods and prototypes. Two of these activities are mutually and tightly interconnected, and will both start at the beginning of the project:

i) the design and prototyping of haptic and auditory methods, devices, and computational models for the synthesis of virtual floor attributes of material, texture, and elasticity, and for the facilitation of ecologically based multimodal interaction in walking (leading to milestone M1);

ii) the systematic experimental validation of such devices and the measurement of the perceptual impact of cross- and multimodal effects associated with different synthetic ground properties and floor events, via non-navigational walking tasks providing visual information in addition to haptic and auditory cues (leading to milestone M2).
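The computational synthesis models mentioned above could take, for instance, the form of the physically based impact models discussed earlier. A deliberately simplified, textbook-style sketch (modal parameters are guesses, not measured floor materials) renders an impact as a sum of exponentially decaying sinusoids:

```python
import numpy as np

SR = 44100  # audio sample rate in Hz

def impact_sound(modes, duration=0.3, sr=SR):
    """Render an impact as a sum of exponentially decaying sinusoids.

    `modes` is a list of (frequency_hz, decay_rate, amplitude) triples.
    This is a generic modal-synthesis sketch, not the project's models.
    """
    t = np.arange(int(duration * sr)) / sr
    out = np.zeros_like(t)
    for freq, decay, amp in modes:
        out += amp * np.exp(-decay * t) * np.sin(2.0 * np.pi * freq * t)
    return out

# Guessed parameters for a hard, wood-like floor: a few sharp,
# fast-decaying modes (values are illustrative, not measured).
wood_like = impact_sound([(220.0, 30.0, 1.0),
                          (560.0, 45.0, 0.5),
                          (1150.0, 60.0, 0.25)])
```

Varying the mode frequencies, decay rates, and amplitudes is what would distinguish, say, wood from concrete; aggregate materials such as gravel would instead require many small, stochastically triggered impacts.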
Since the success of the experiments depends largely on the quality of the interface, and vice versa, it is expected that the design and validation activities will be iterated, progressively leading to improved interface designs and more robust verifications of the experimental hypotheses. Beginning in the second year, walking interaction methods and prototypes validated in terms of perceptual saliency will be progressively exported to an immersive, integrated setting that also affords users some freedom of navigation. This research will be presented in milestone M2, and will lead to milestone M3 through the following activity:

iii) the integration of the interaction designs and prototypes delivered in milestone M1 and validated in milestone M2 into an immersive multimodal floor installation capable of engaging users in realistic tasks of walking across grounds of different natures and in the presence of simple floor events.

Taken together, these activities lead to the three milestones of the project listed below:

M1 – Design, engineering, and prototyping of floor interaction technologies (month 12)
M2 – Validated set of ecological foot-based interaction methods, paradigms and prototypes, and designs for interactive scenarios using these paradigms (month 24)
M3 – Integration and usability testing of floor interaction technologies in immersive scenarios (month 36)