News Item: Conference Proceedings
Posted by e107
Monday 14 November 2011 - 16:09:37

  1. L. Terziman, A. Lécuyer, S. Hillaire, J. Wiener, Can Camera Motions Improve the Perception of Traveled Distance in Virtual Environments?, IEEE International Conference on Virtual Reality, Lafayette, US, 2009

    We evaluated the influence of oscillating camera motions on the perception of traveled distances in virtual environments. In the experiment, participants viewed visual projections of translations along straight paths. They were then asked to reproduce the traveled distance during a navigation phase using keyboard keys. Each participant had to complete the task (1) with linear camera motion, and (2) with oscillating camera motion that simulates the visual flow generated by natural human walking. Taken together, our results suggest that oscillating camera motions allow a more accurate distance reproduction for short traveled distances.

  2. G. Cirio, M. Marchal, T. Regia-Corte, A. Lécuyer, The Magic Barrier Tape: a Novel Metaphor for Infinite Navigation in Virtual Worlds with a Restricted Walking Workspace, ACM Symposium on Virtual Reality Software and Technology, Yokohama, Japan, 2009

    We have developed a novel interaction metaphor called the Magic Barrier Tape, which allows a user to navigate in a potentially infinite virtual scene while confined to a restricted walking workspace. The technique relies on the barrier tape metaphor and its “do not cross” implicit message by surrounding the walking workspace with a virtual barrier tape in the scene. It uses a hybrid position/rate control mechanism to enable real walking inside the workspace and rate control navigation to move beyond the boundaries by “pushing” on the tape.
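The hybrid position/rate control mechanism can be illustrated with a small sketch. This is our reconstruction, not the authors' code; the function name, the circular workspace, and the gain are assumptions. Inside the workspace the viewpoint simply follows the tracked user, while pushing past the tape produces an outward rate-control velocity proportional to the penetration depth.

```python
import math

def navigation_velocity(user_pos, workspace_radius, gain=1.5):
    """Hybrid position/rate control sketch (hypothetical parameters).

    Inside the workspace the virtual viewpoint follows the user 1:1
    (pure position control, so no extra velocity is returned).
    When the user "pushes" past the virtual barrier tape, the
    penetration depth drives a rate-control velocity directed outward.
    """
    x, y = user_pos
    dist = math.hypot(x, y)
    if dist <= workspace_radius:
        return (0.0, 0.0)            # real walking only: 1:1 position control
    depth = dist - workspace_radius  # how far the tape is pushed
    # rate control: velocity proportional to penetration, directed outward
    scale = gain * depth / dist
    return (x * scale, y * scale)
```

Standing anywhere inside the tape yields a zero velocity, so the user simply walks; only deliberate pressure on the boundary translates into continuous travel through the scene.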

  3. C. Drioli, P. Polotti, S. Delle Monache, D. Rocchesso, K. Adiloglu, R. Annies, K. Obermayer, Auditory Representations as Landmarks in the Sound Design Space, Proc. SMC09, Porto, Jul. 2009.

    A graphical tool for timbre space exploration and interactive design of complex sounds by physical modeling synthesis is presented. It is built around an auditory representation of sounds based on spike functions and provides the designer with both a graphical and an auditory insight. The auditory representation of a number of reference sounds, located as landmarks in a 2D sound design space, provides the designer with an effective aid to direct his search along the paths that lie in the proximity of the most inspiring landmarks.

  4. S. Papetti, F. Fontana, M. Civolani, A shoe-based interface for ecological ground augmentation, Proc. 4th Int. Haptic and Auditory Interaction Design Workshop, Dresden, 10th-11th Sep. 2009, vol. II, pp. 28-29, 2009.

    The prototype of a wearable shoe-based interface is presented, which provides the user with auditory cues of the ground. Data supplied by sensors embedded in the shoes drive a set of physics-based synthesis models running on a laptop, whose audio output is sent to a pair of shoe-mounted loudspeakers. By informing the synthesis models with ecological properties of grounds, neutral floors can be interactively augmented so as to react as if they were made of a different material.

  5. S. Serafin, R. Nordahl, L. Turchet, Extraction of ground reaction forces for real-time synthesis of walking sounds, Proc. Audiomostly, Glasgow, Sep. 2009.

    A shoe-independent system to synthesize real-time footstep sounds on different materials has been developed. A footstep sound is considered as the result of an interaction between an exciter (the shoe) and a resonator (the floor). To achieve our goal, we propose two different solutions. The first solution is based on contact microphones attached to the exterior part of each shoe, which capture the sound of a footstep. The second approach consists of using microphones placed on the floor. In both situations, the captured sound is analysed and used to control a sound synthesis engine. We discuss advantages and disadvantages of the two approaches.
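A common way to turn a captured footstep sound into a control signal is an envelope follower: rectify the microphone signal, then smooth it with fast attack and slow decay. The sketch below is purely illustrative of that idea, not the paper's actual analysis stage, and the function name and coefficient are ours.

```python
def amplitude_envelope(samples, alpha=0.99):
    """One-pole envelope follower over a rectified microphone signal.

    Illustrative stand-in (not necessarily the authors' method) for
    deriving a force-like control signal from a footstep recording:
    rectify each sample, then smooth with fast attack and slow decay.
    """
    env = []
    state = 0.0
    for s in samples:
        state = max(abs(s), alpha * state)  # jump up instantly, decay slowly
        env.append(state)
    return env
```

The resulting envelope rises sharply at each impact and decays between them, which is the shape a synthesis engine typically expects as an excitation control.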

  6. R. Nordahl, S. Serafin, F. Fontana. Exploring sonic interaction design and presence: Natural Interactive Walking in Porto. Proc. Presence conference, Los Angeles, CA, November 11-13, 2009.

    In this paper we report on the results of a three-day workshop whose goal was to combine interactive sounds and soundscape design to simulate the sensation of walking in a specific location of Porto. We discuss advantages and disadvantages of the different solutions proposed in terms of the technology used, and how sonic interaction combined with soundscape design affects presence in virtual environments.

  7. C. Drioli, D. Rocchesso, Acoustic Rendering of Particle-based Simulation of Liquids in Motion, Proc. DAFX09, Como, Italy, Sept. 1-4, 2009.

    This paper presents an approach to the synthesis of acoustic emission due to liquids in motion. First, the models for the liquid motion description, based on a particle-based fluid dynamics representation, and for the acoustic emission are described, along with the criteria for the control of the audio algorithms through the parameters of the particle system. Then, the experimental results are discussed for a configuration representing the falling of a liquid volume into an underlying rigid container.


  8. Y. Visell, J. Cooperstock, Design of a Vibrotactile Device via a Rigid Surface. Proc. IEEE Haptics Symposium, 2010.

    This paper describes the analysis, optimized redesign and evaluation of a high fidelity vibrotactile interface integrated in a rigid surface. The main application of the embodiment described here is vibrotactile display of virtual ground surface material properties for immersive environments, although the design principles are general. The device consists of a light, composite plate mounted on an elastic suspension, with integrated force sensors. It is actuated by a single voice coil motor. The structural dynamics of the device were optimized, within constraints imposed by the requirements of user interaction, and corrected via digital inverse filtering, in order to enable accurate simulation of virtual ground materials. Measurements of the resulting display demonstrate that it is capable of accurately reproducing forces of more than 40 N across a usable frequency band from 50 Hz to 750 Hz.


  9. Y. Visell, A. Law, S. Smith, J. Ip, J. Cooperstock, Interaction Capture in Immersive Environments via an Intelligent Floor Surface. IEEE International Conference on Virtual Reality, 2010.

    We present techniques to enable users to interact on foot with simulated natural ground surfaces, such as soil or ice, in immersive virtual environments. Position and force estimates from in-floor force sensors are used to synthesize plausible auditory and vibrotactile feedback in response. Relevant rendering techniques are discussed in the context of walking on a virtual frozen pond.


  10. Y. Visell, A. Law, S. Smith, R. Rajalingham, J. Cooperstock, Contact Sensing and Interaction Techniques for a Distributed, Multimodal Floor Display (Tech note). IEEE 3D User Interfaces, 2010.

    This paper presents a novel interface and set of techniques enabling users to interact via the feet with augmented floor surfaces. The interface consists of an array of instrumented floor tiles distributed over an area of several square meters. Intrinsic force sensing is used to capture foot-floor contact at resolutions as fine as 1 cm. These sensing techniques are combined with multimodal display channels in order to enable users to interact with floor-based touch surface interfaces. We present the results of a preliminary evaluation of the usability of such a display.


  11. Y. Visell, A. Law, J. Cooperstock, Toward Iconic Vibrotactile Information Display via Floor Surfaces. Proc. of World Haptics, 2009.

    This paper presents preliminary research on haptic information displays integrated in floor surfaces. We emphasize potential roles for the latter as vibrotactile communication channels that might be seamlessly integrated in everyday environments. We describe the interactive floor component that is the platform for our research, and an application we have developed to support the design of vibrotactile icon sets for display with the device. The results of a preliminary evaluation of this display method are presented, and future directions for this research are described.


  12. R. Nordahl, S. Serafin, L. Turchet, Sound synthesis and evaluation of interactive footsteps for virtual reality applications. Proc. of IEEE International Conference on Virtual Reality, 2010.

    A system to synthesize in real-time the sound of footsteps on different materials is presented. The system is based on microphones which allow the user to interact with his own footwear. This solution distinguishes our system from previous efforts that require specific shoes enhanced with sensors. The microphones detect real footstep sounds from users, from which the ground reaction force (GRF) is estimated. The estimated GRF is used to control a sound synthesis engine based on physical models. Evaluations of the system in terms of sound validity and fidelity of interaction are described.

  13. L. Turchet, S. Serafin, R. Nordahl, Physically based sound synthesis and control of footsteps sounds. Proc of Digital Audio Effects Conference, 2010.

    We describe a system to synthesize footstep sounds in real-time. The sound engine is based on physical models and physically inspired models reproducing the act of walking on several surfaces. To control the real-time engine, three solutions are proposed. The first two solutions are based on floor microphones, while the third one is based on shoes enhanced with sensors. The different solutions are discussed in the paper.

  14. R. Nordahl, A. Berrezag, S. Dimitrov, L. Turchet, V. Hayward, S. Serafin, Preliminary Experiment Combining Virtual Reality Haptic Shoes And Audio Synthesis. Proc. of Eurohaptics, 2010.

    We describe a system that can provide the combined auditory and haptic sensations that arise while walking on different grounds. The simulation is based on a physical model that drives both haptic transducers embedded in sandals and headphones. The model is able to represent walking interactions with solid surfaces that can creak or be covered with crumpling material. The simulation responds to pressure on the floor with a vibrotactile signal felt by the feet. In a preliminary discrimination experiment, 15 participants were asked to recognize four different surfaces in a list of sixteen possibilities and under three different conditions: haptics only, audition only, and combined haptics and audition. The results indicate that subjects are able to recognize most of the stimuli in the audition-only condition, and some of the material properties, such as hardness, in the haptics-only condition. The combination of auditory and haptic cues does not significantly improve recognition.

  15. S. Serafin, L. Turchet, R. Nordahl, S. Dimitrov, A. Berrezag, V. Hayward, Identification of virtual grounds using virtual reality haptic shoes and sound synthesis. Proc. of Eurohaptics symposium on Haptic and Audio-Visual Stimuli, 2010.

    We describe a system which simulates in real-time the auditory and haptic sensation of walking on different surfaces. The system is based on physical models that drive both the haptic and audio synthesizers, and a pair of shoes enhanced with sensors and actuators. In a discrimination experiment, subjects were asked to recognize the different simulated surfaces they were exposed to with uni-modal (auditory or haptic) and bi-modal (auditory and haptic) cues. Results show that subjects perform the recognition task better when using auditory feedback than haptic feedback. The combination of auditory and haptic feedback significantly enhances recognition only in some conditions.

  16. S. Serafin, L. Turchet, R. Nordahl, Do you hear a bump or a hole? An experiment on temporal aspects in footsteps recognition. Proc. of Digital Audio Effects Conference, 2010.

    In this paper, we present a preliminary experiment whose goal is to assess the role of temporal aspects in sonically simulating the act of walking on a bump or a hole. In particular, we investigated whether the timing between heel and toe and the timing between footsteps affected the perception of walking on uneven surfaces. Results show that it is possible to simulate a bump or a hole using temporal information only.

  17. L. Turchet, R. Nordahl, S. Serafin, Examining the role of context in the recognition of walking sounds. Proc. of Sound and Music Computing Conference, 2010.

    In this paper, we present an experiment whose goal was to assess the role of contextual information in the recognition of environmental sounds. Forty-three subjects participated in a between-subjects experiment where they were asked to walk in a limited area in a laboratory, while the illusion of walking on different surfaces was simulated, with and without an accompanying soundscape. Results show that, in some conditions, adding a soundscape significantly improves surface recognition.

  18. L. Turchet, R. Nordahl, A. Berrezag, S. Dimitrov, V. Hayward, S. Serafin, Audio-haptic physically based simulation of walking sounds. Proc. of IEEE International Workshop on Multimedia Signal Processing, 2010.

    We describe a system which simulates in real-time the auditory and haptic sensation of walking on different surfaces. The system is based on physical models that drive both the haptic and audio synthesizers, and a pair of shoes enhanced with sensors and actuators. We describe the algorithms developed, which simulate both aggregate and solid surfaces. Experiments were run to examine the ability of subjects to recognize the different surfaces with uni-modal (auditory or haptic) and bi-modal cues. Results show that subjects perform the recognition task better with auditory than with haptic feedback. The combination of auditory and haptic feedback significantly enhances recognition only in some conditions.

  19. L. Turchet, S. Serafin, S. Dimitrov, R. Nordahl, Conflicting audio-haptic feedback in physically based simulation of walking sounds. Proc. of Haptic Audio Interaction Design Conference, 2010.

    We describe an audio-haptic experiment conducted using a system which simulates in real-time the auditory and haptic sensation of walking on different surfaces. The system is based on physical models that drive both the haptic and audio synthesizers, and a pair of shoes enhanced with sensors and actuators. The experiment was run to examine the ability of subjects to recognize the different surfaces with both coherent and incoherent audio-haptic stimuli. Results show that in this kind of task the auditory modality is dominant over the haptic one.

  20. S. Papetti, F. Fontana, M. Civolani, A. Berrezag, V. Hayward, Audio-tactile Display of Ground Properties Using Interactive Shoes. Haptic and Audio Interaction Design, Lecture Notes in Computer Science, 2010, Volume 6306/2010, 117-128, DOI: 10.1007/978-3-642-15841-4_13.

    We describe a wearable audio-tactile stimulation system capable of providing the sensation of walking over different types of ground. The system includes miniature loudspeakers and broadband vibrotactile transducers embedded in the soles. The system is particularly effective at suggesting grounds that have granular or crumpling properties. By offering a broad spectrum of floor augmentations with moderate technological requirements, the proposed prototype represents a solution that can be easily replicated in the research laboratory. This paper documents in detail the design and features of the diverse components that characterize the prototype, as well as its current limits.

  21. M. Civolani, F. Fontana, S. Papetti, Efficient Acquisition of Force Data in Interactive Shoe Designs. Haptic and Audio Interaction Design, Lecture Notes in Computer Science, 2010, Volume 6306/2010, 129-138, DOI: 10.1007/978-3-642-15841-4_14.

    A four-channel sensing system is proposed for the capture of force data from the feet during walking tasks. Developed for an instrumented shoe design prototype, the system solves general issues of latency of the response, accuracy of the data, and robustness of the transmission of digital signals to the host computer. Such issues are often left partially unanswered by solutions for which compactness, accessibility and cost are taken into primary consideration. By adopting widely used force sensing (Interlink) and analog-to-digital conversion and pre-processing (Arduino) components, the proposed system is expected to raise interest among interaction designers of interfaces, in which the reliable and sufficiently broadband acquisition of force signals is desired.

  22. R. Bresin, A. de Witt, S. Papetti, M. Civolani, F. Fontana, Expressive sonification of footstep sounds. Proc. Interactive Sonification Workshop (iSon2010), pp. 51-54, KTH Stockholm, Sweden, Apr. 2010.

    In this study we present the evaluation of a model for the interactive sonification of footsteps. The sonification is achieved by means of specially designed sensored shoes which control the expressive parameters of novel sound synthesis models capable of reproducing continuous auditory feedback for walking. In a previous study, sounds corresponding to different grounds were associated with different emotions and gender. In this study, we used an interactive sonification actuated by the sensored shoes to provide auditory feedback to walkers. In an experiment we asked subjects to walk (using the sensored shoes) with four different emotional intentions (happy, sad, aggressive, tender), and for each emotion we manipulated the ground texture sound four times (wood panels, linoleum, muddy ground, and iced snow). Preliminary results show that walkers used a more active walking style (faster pace) when the sound of the walking surface was characterized by a higher spectral centroid (e.g. iced snow), and a less active style (slower pace) when the spectral centroid was low (e.g. muddy ground). Harder texture sounds led to more aggressive walking patterns, while softer ones led to more tender and sad walking styles.
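The spectral centroid invoked in the abstract is the standard brightness measure: the magnitude-weighted mean frequency of a signal frame. A minimal, naive-DFT sketch (illustrative only, not the study's analysis code) computes it as follows.

```python
import math

def spectral_centroid(samples, sample_rate):
    """Magnitude-weighted mean frequency of a short signal frame.

    Illustrative implementation: a naive DFT up to Nyquist is fine for
    a short frame. Bright textures (e.g. icy snow) yield a higher
    centroid than dull ones (e.g. muddy ground).
    """
    n = len(samples)
    mags, freqs = [], []
    for k in range(n // 2 + 1):
        re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))          # magnitude of bin k
        freqs.append(k * sample_rate / n)        # frequency of bin k in Hz
    total = sum(mags)
    return sum(f * m for f, m in zip(freqs, mags)) / total if total else 0.0
```

For a pure tone the centroid coincides with the tone's frequency; for footstep textures it summarizes where the spectral energy sits, which is what the pace comparison in the study turns on.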

  23. K. Adiloglu, C. Drioli, P. Polotti, D. Rocchesso, S. delle Monache, Physics-Based and Spike-Guided Tools for Sound Design. Proc. of the 13th Int. Conference on Digital Audio Effects (DAFx-10), Graz, Austria , September 6-10, 2010.

    In this paper we present graphical tools and parameter search algorithms for the timbre space exploration and design of complex sounds generated by physical modeling synthesis. The tools are built around a sparse representation of sounds based on Gammatone functions and provide the designer with both a graphical and an auditory insight. The auditory representation of a number of reference sounds, located as landmarks in a 2D sound design space, provides the designer with an effective aid to direct the search for new sounds. The sonic landmarks can either be synthetic sounds chosen by the user or be automatically derived by using clever parameter search and clustering algorithms. The probabilistic method proposed in this paper makes use of the sparse representations to model the distance between sparsely represented sounds. A subsequent optimization model minimizes those distances to estimate the optimal parameters, which generate the landmark sounds on the given auditory landscape.

  24. S. Delle Monache, P. Polotti, D. Rocchesso, A Toolkit for Explorations in Sonic Interaction Design. Proc. of AudioMostly, Piteå, Sweden, 15-17 Sep. 2010.

    Physics-based sound synthesis represents a promising paradigm for the design of a veridical and effective continuous feedback in augmented everyday contexts. In this paper, we introduce the Sound Design Toolkit (SDT), a software package available as a complete front-end application, providing a palette of virtual lutheries and foley pits, that can be exploited in sonic interaction design research and education. In particular, the package includes polyphonic features and connectivity to multiple external devices and sensors in order to facilitate the embedding of sonic attributes in interactive artifacts. The present release represents an initial version towards an effective and usable tool for sonic interaction designers.

  25. R. Rajalingham, Y. Visell, J. Cooperstock, Probabilistic Tracking of Pedestrian Movements from In-floor Force Measurements. Proceedings of the 7th Canadian Conference on Computer and Robot Vision (CRV'10), 2010.

    This article presents a probabilistic approach to the tracking and estimation of the lower body posture of users moving on foot over an instrumented floor surface. The latter consists of an array of low-cost force platforms providing intermittent foot-floor contact data with limited spatial resolution. We use this data to track body posture in 3D space using Bayesian filters with a switching state-space model. Potential applications of this work to person tracking and human-computer interaction are described.

  26. L. Turchet, M. Marchal, A. Lécuyer, R. Nordahl, S. Serafin, Influence of auditory and visual feedback for perceiving walking over bumps and holes in desktop VR. Proc. of the ACM Symposium on Virtual Reality Software and Technology, Hong Kong, China, 22-24 Nov. 2010.

    In this paper, we present an experiment whose goal is to investigate the role of sound and vision in the recognition of different surface profiles in a walking scenario. Fifteen subjects participated in two within-subjects experiments where they were asked to interact with a desktop system simulating bumps, holes and flat surfaces by means of audio, visual and audio-visual cues. Results of the first experiment show that participants are able to successfully identify the surface profiles provided through the proposed audio-visual techniques. Results of the second experiment, in which conflicting audio-visual stimuli were presented, reveal that for some of the proposed visual effects the visual feedback is dominant over the auditory one, while for the others the dominance is inverted.

  27. T. Regia-Corte, M. Marchal, A. Lécuyer, Can You Stand on Virtual Grounds? A Study on Postural Affordances in Virtual Reality. Proc. of the IEEE International Conference on Virtual Reality, 2010.

    The concept of affordance, introduced by the psychologist James Gibson, can be defined as the functional utility of an object, a surface or an event. The purpose of this article was to evaluate the perception of affordances in virtual environments (VE). In order to test this perception, we considered the affordances for standing on a virtual slanted surface. The participants were asked to judge whether a virtual slanted surface supported upright stance. Two dimensions were considered for this perception: (a) the properties of the virtual environment and (b) the properties of the avatar in the VE. The first dimension (environment) was investigated in a first experiment by manipulating the texture of the slanted surface (Wooden texture vs. Ice texture). The second dimension (avatar) was investigated in a second experiment by manipulating the participant’s virtual eye height. Regarding Experiment 1, results showed an effect of the texture: the perceptual boundary (or critical angle) with the Ice texture was significantly lower than with the Wooden texture. Regarding Experiment 2, results showed an effect of virtual eye height manipulations: participants overestimated their ability to stand on the slanted surface when their virtual eye height was reduced. Taken together, these results reveal that perception of affordances for standing on a slanted surface in virtual reality is possible and comparable to previous studies conducted in real environments. More interestingly, it appears that virtual information about friction can be detected and used in VE and that the virtual eye height seems to be an important factor involved in the perception of affordances for standing on virtual grounds.

  28. M. Marchal, A. Lécuyer, G. Cirio, L. Bonnet, T. Regia-Corte, M. Emily, Walking Up and Down in Immersive Virtual Worlds: Novel Interaction Techniques Based on Visual Feedback. Proc. of the IEEE International Symposium on 3D User Interfaces, 2010.

    We introduce novel interactive techniques to simulate the sensation of walking up and down in immersive virtual worlds based on visual feedback. Our method consists in modifying the motion of the virtual subjective camera while the user is really walking in an immersive virtual environment. The modification of the virtual viewpoint is a function of the variations in the height of the virtual ground. Three effects are proposed: (1) a straightforward modification of the camera’s height, (2) a modification of the camera’s navigation velocity, and (3) a modification of the camera’s orientation. They were tested in an immersive virtual reality setup in which the user is really walking. A desktop configuration, where the user is seated and controls input devices, was also tested and compared to the real walking configuration. Experimental results show that our visual techniques are very efficient for the simulation of two canonical shapes: bumps and holes located on the ground. Interestingly, a strong “orientation-height illusion” is found, as changes in pitch viewing orientation produce a perception of height changes (although the camera’s height remains strictly the same in this case). Our visual effects could be applied in various virtual reality applications such as urban or architectural project reviews or training, as well as in videogames, in order to provide the sensation of walking on uneven grounds.
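The first and third effects can be sketched together: the camera height follows the virtual ground profile under the walker, and the pitch follows the local slope. This is our illustrative reconstruction, not the authors' code; the gains, the eye height, and the finite-difference slope estimate are all assumptions.

```python
def camera_pose(walk_pos, ground_height, eye_height=1.7,
                height_gain=1.0, pitch_gain=0.5):
    """Visual-feedback sketch for walking over bumps and holes.

    Hypothetical parameters throughout: the camera height tracks the
    virtual ground profile (effect 1) and the pitch tracks the local
    slope (effect 3), producing the sensation of ascent or descent.
    """
    h = ground_height(walk_pos)
    # finite-difference slope of the ground along the walking direction
    eps = 0.01
    slope = (ground_height(walk_pos + eps) - ground_height(walk_pos - eps)) / (2 * eps)
    camera_height = eye_height + height_gain * h
    camera_pitch = pitch_gain * slope  # radians, nose-up on an ascent
    return camera_height, camera_pitch
```

On a flat ground both terms vanish and the camera rides at eye height; over a bump the viewpoint rises and tilts up on the way in, then sinks and tilts down on the way out.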


  29. L. Terziman, M. Marchal, M. Emily, F. Multon, B. Arnaldi, A. Lécuyer, Shake-Your-Head: Revisiting Walking-In-Place for Desktop Virtual Reality. ACM Symposium on Virtual Reality Software and Technology, 2010.

    In this paper, we propose to revisit the whole pipeline of the Walking-In-Place technique to match a larger set of configurations and notably to apply it to the context of desktop Virtual Reality. With our novel “Shake-Your-Head” technique, the user is given the possibility to sit down, and to use small screens and standard input devices such as a basic webcam for tracking. The locomotion simulation can compute various motions such as turning, jumping and crawling, using as sole input the head movements of the user. We also introduce the use of additional visual feedback based on camera motions to enhance the walking sensations. An experiment was conducted to compare our technique with classical input devices used for navigating in desktop VR. Interestingly, the results showed that our technique could even allow faster navigation when sitting, after a short learning period. Our technique was also perceived as more fun and as increasing presence, and was generally more appreciated for VR navigation.

  30. A. Lécuyer, M. Marchal, A. Hamelin, D. Wolinski, F. Fontana, M. Civolani, S. Papetti, S. Serafin, "Shoes-Your-Style": Changing Sound of Footsteps to Create New Walking Experiences, AIMI Workshop, CHItaly, Alghero, Italy, 2011.

    With “Shoes-Your-Style” we introduce the entertaining possibility of changing the auditory feedback and sound of footsteps when walking indoors and outdoors, to alter the nature (or “style”) of shoes and contacts with the ground. This approach requires two components: a step detector and a sound generator. Artificial sounds of footsteps can be generated and tuned by the system, or by the user, to simulate any type of shoe (high-toe, sneakers, flip-flops, etc.) or ground (marble, snow, mud, gravel, etc.), generate artificial walking events (collisions, cracking objects, etc.) or even unrealistic sounds (noises, words, music, etc.). Such a new kind of feedback should pave the way for creative interactive walking experiences. Potential applications range from entertainment and multimedia to sport training, medical rehabilitation and reeducation.

  31. G. Cirio, M. Marchal, A. LeGentil, A. Lécuyer, "Tap, Squeeze and Stir" the Virtual World: Touching the Different States of Matter Through 6DoF Haptic Interaction, (short paper) IEEE International Conference on Virtual Reality (IEEE VR), Singapore, 2011.

    Haptic interaction with virtual objects is a major concern in the virtual reality field. There are many physically-based efficient models that enable the simulation of a specific type of media, e.g. fluid volumes, deformable and rigid bodies. However, combining these often heterogeneous algorithms in the same virtual scene in order to simulate and interact with different types of media can be a complex task. In this paper, we propose the first haptic rendering technique for the simulation and the interaction with multistate media, namely fluids, deformable bodies and rigid bodies, in real-time and with 6DoF haptic feedback. Based on the Smoothed-Particle Hydrodynamics (SPH) physical model for all three types of media, our method avoids the complexity of dealing with different algorithms and their coupling. We achieve high update rates while simulating a physically-based virtual world governed by fluid and elasticity theories, and show how to render interaction forces and torques through a 6DoF haptic device.

  32. G. Cirio, M. Marchal, S. Hillaire, A. Lécuyer, The Virtual Crepe Factory: 6DoF Haptic Interaction with Fluids, ACM SIGGRAPH Emerging Technologies (ETech), Vancouver, Canada, 2011.

    The Virtual Crepe Factory illustrates our novel approach for 6DoF haptic interaction with fluids. It showcases a two-handed interactive haptic scenario: a recipe that uses different types of fluid to make a special pancake, also known as a “crepe”. The scenario guides the user through all the steps required to prepare a crepe: from the stirring and pouring of the dough to the spreading of different toppings, without forgetting the challenging flipping of the crepe. With the Virtual Crepe Factory, users can experience for the first time 6DoF haptic interactions with fluids of varying viscosity. Our novel approach is based on a Smoothed-Particle Hydrodynamics (SPH) physically-based simulation.

  33. J. Pettré, O. Siret, M. Marchal, A. Lécuyer, Joyman: an immersive and entertaining interface for virtual locomotion, ACM SIGGRAPH ASIA Emerging Technologies (ETech), Hong-Kong, 2011.

    We propose to demonstrate a novel interface called Joyman, designed for immersive locomotion in virtual environments. The interface is based on the metaphor of a “human-scale joystick”. The device has a simple mechanical design that allows a user to indicate his virtual navigation intentions by leaning in the corresponding direction. It is often said that walking is equivalent to constantly falling ahead. The Joyman follows this assertion: whereas many previous interfaces preserve or stimulate the user’s proprioception, the Joyman aims at preserving equilibrioception in order to improve the feeling of immersion during virtual locomotion tasks. Thus, the interface could be applied in various virtual reality applications such as urban or architectural project reviews or training, as well as in videogames.

  34. L. Terziman, M. Marchal, F. Multon, B. Arnaldi, A. Lécuyer, Comparing Virtual Trajectories Made in Slalom Using Walking-In-Place and Joystick Techniques, (short paper) Joint Virtual Reality Conference (Joint Eurographics Symposium on Virtual Environments-EuroVR Conference) (JVRC), 2011.

    In this paper we analyze and compare the trajectories made in a Virtual Environment with two different navigation techniques. The first is a standard joystick technique and the second is the Walking-In-Place (WIP) technique. We propose a spatial and temporal analysis of the trajectories produced with both techniques during a virtual slalom task. We found that trajectories and users' behaviors are very different across the two conditions. Our results notably show that with the WIP technique the users turned more often and navigated more sequentially, i.e. waited to cross obstacles before changing their direction. However, the users were also able to modulate their speed more precisely with the WIP. These results could be used to optimize the design and future implementations of WIP techniques. Our analysis could also become the basis of a future framework to compare other navigation techniques.

  35. M. Marchal, J. Pettre, A. Lécuyer, Joyman: a Human-Scale Joystick for Navigating in Virtual Worlds, IEEE International Symposium on 3D User Interfaces (IEEE 3DUI), Singapore, 2011.

    In this paper, we propose a novel interface called Joyman, designed for immersive locomotion in virtual environments. Whereas many previous interfaces preserve or stimulate the user's proprioception, the Joyman aims at preserving equilibrioception in order to improve the feeling of immersion during virtual locomotion tasks. The proposed interface is based on the metaphor of a human-scale joystick. The device has a simple mechanical design that allows a user to indicate his virtual navigation intentions by leaning accordingly. We also propose a control law inspired by the biomechanics of human locomotion to transform the measured leaning angle into a walking direction and speed, i.e., a virtual velocity vector. A preliminary evaluation was conducted in order to assess the advantages and drawbacks of the proposed interface and to better outline future expectations for such a device.
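
    The abstract does not give the control law's exact form, but the general idea of mapping a measured lean to a virtual velocity vector can be illustrated by a minimal sketch (all parameter names and values below are hypothetical, not taken from the paper): the lean is decomposed into a direction and a tilt magnitude, and the magnitude, beyond a small dead zone, is mapped to forward speed.

```python
import math

def lean_to_velocity(lean_x, lean_y, dead_zone=2.0, max_tilt=15.0, max_speed=1.5):
    """Map a measured lean (degrees along two horizontal axes) to a
    virtual velocity vector in the ground plane.

    dead_zone -- tilt below this threshold produces no motion (hypothetical value)
    max_tilt  -- tilt at which the walking speed saturates (hypothetical value)
    max_speed -- maximum virtual walking speed, in m/s (hypothetical value)
    """
    tilt = math.hypot(lean_x, lean_y)          # overall tilt magnitude
    if tilt <= dead_zone:
        return (0.0, 0.0)                      # standing upright: no motion
    # Normalize tilt into [0, 1] between the dead zone and saturation.
    gain = min((tilt - dead_zone) / (max_tilt - dead_zone), 1.0)
    speed = max_speed * gain
    # The walking direction is the direction of the lean.
    return (speed * lean_x / tilt, speed * lean_y / tilt)
```

    A real control law would likely be nonlinear and tuned to gait biomechanics; this linear ramp only conveys the interface concept.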

  36. M. Otis, G. Millet, S. Beniak, J. R. Cooperstock, Modeling of Lower Limbs for Vibrotactile Compensation, Canadian Medical and Biological Engineering Conference, 2011.

    A rehabilitation device with a vibrotactile actuator is designed for analysis of slipping behaviour on different surfaces. However, the coupling between a subject's lower limbs and the device reduces the accuracy of the ground simulation. A real-time algorithm is developed, based on an FFT filter-bank and a rheological limb model, which compensates for lower limb effects on vibrotactile signals.
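
    The paper's actual limb model is not reproduced here, but the principle of filter-bank compensation can be sketched as follows (the band edges and per-band gains are illustrative assumptions): the drive signal's spectrum is split into bands, each band is pre-amplified by the inverse of the attenuation the coupled limb is estimated to introduce there, and the compensated signal is resynthesized.

```python
import numpy as np

def compensate(signal, limb_gains, band_edges, sample_rate):
    """Pre-compensate a vibrotactile drive signal for estimated limb attenuation.

    limb_gains -- estimated limb transfer-function magnitude per band (illustrative)
    band_edges -- band boundaries in Hz, len(band_edges) == len(limb_gains) + 1
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
    for gain, lo, hi in zip(limb_gains, band_edges[:-1], band_edges[1:]):
        band = (freqs >= lo) & (freqs < hi)
        spectrum[band] /= gain            # boost the bands the limb attenuates
    return np.fft.irfft(spectrum, n=len(signal))
```

    In the paper this runs in real time, which calls for a streaming (block-wise) formulation rather than a whole-signal FFT as above.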

  37. G. Millet, M. Otis, G. Chaw, J. R. Cooperstock, Initial Development of a Variable-Friction Floor Surface, Canadian Medical and Biological Engineering Conference, 2011.

    This paper investigates the design of a variable-friction floor device for clinical and rehabilitation applications. Several designs based on rolling elements were proposed and investigated. Coefficients of static friction were measured to compare the capabilities of floor tiles made of ball transfer units or covered with PTFE. The measurements showed that while a device based on rolling elements is limited in the footwear it supports and in simulating heel strike, and can lead to a complex friction-variation system, it can simulate friction coefficients as low as that of ice.

  38. F. Fontana, M. Civolani, S. Papetti, V. dal Bello, B. Bank, An exploration on the influence of vibrotactile cues during digital piano playing, 8th Sound and Music Computing Conference (SMC2011), Padova, Italy, 2011.

    An exploratory experiment was carried out in which subjects with different musical skills were asked to play a digital piano keyboard, first by following a specific key sequence and style of execution, and then by performing freely. Judgments of perceived sound quality were recorded in three different settings, including standard use of the digital piano with its own internal loudspeakers, and conversely use of the same keyboard to control a physics-based piano sound synthesis model running on a laptop in real time. Through its audio card, the laptop drove a pair of external loudspeakers and, occasionally, a pair of shakers screwed to the bottom of the keyboard. The experiment showed that subjects prefer the combination of sonic and vibrotactile feedback provided by the synthesis model when playing the key sequences, whereas they rate the quality of the original instrument higher when performing freely. Although these results come from a preliminary evaluation, they were in good accordance with the development stage of the synthesis software at the time of the experiment. They suggest that vibrotactile feedback modifies, and potentially improves, the performer's experience when playing on a digital piano keyboard.

  39. S. Papetti, F. Fontana, M. Civolani, Effects of ecological auditory and vibrotactile underfoot feedback on human gait: a preliminary investigation, Haptic and Audio Interaction Design Conference (HAID2011), Kyoto, Japan, 2011.

    We describe an experiment where subjects were asked to walk along a predefined path while wearing a pair of instrumented sandals which provide auditory and vibrotactile feedback. Three experimental conditions were set up, corresponding to different feedback: two conditions simulating different ground materials, and one neutral (i.e. control) condition without artificial feedback. Preliminary results indicate that non-visual interactive feedback in a walking task influences, although not significantly, the subjects’ gait patterns. Interaction designers may exploit these results in foot interfaces which provide vibrotactile and auditory cues to display virtual ground properties for non-intrusive navigation aid.

  40. F. Fontana, F. Morreale, T. Regia-Corte, A. Lécuyer, M. Marchal, Auditory recognition of floor surfaces by temporal and spectral cues of walking, 17th International Conference on Auditory Display (ICAD2011), Budapest, Hungary, 2011.

    In a multiple-choice auditory experimental task, listeners had to discriminate walks over floors made of concrete, wood, gravel, or dried twigs. Sound stimuli were obtained by mixing temporal and spectral signal components, resulting in hybrid formulations of such materials. In this way we analyzed the saliency of the corresponding cues of time and frequency in the recognition of a specific floor. Results show that listeners weigh such cues differently during recognition; however, this tendency is not polarized enough to enable interaction designers to reduce the functionality of a walking sound synthesizer to simple operations in the temporal or spectral domain depending on the simulated material.

  41. S. Papetti, M. Civolani, F. Fontana, Rhythm’n’shoes: a wearable foot tapping interface with audio-tactile feedback, Int. Conf. of New Interfaces for Musical Expression (NIME’11), Oslo, Norway, 2011.

    A wearable shoe-based interface is presented, which enables users to play percussive virtual instruments by tapping with their feet. The interface consists of a pair of sandals equipped with four force sensors and actuators affording audio-tactile feedback. The sensors provide data via wireless transmission to a host computer, where they are processed and mapped to a physically based sound synthesis engine. Since the system provides OSC and MIDI compatibility, alternative electronic instruments can be used as well. The audio signals are then sent back wirelessly to the audio-tactile exciters embedded in the sandals' soles, and optionally to headphones and additional external loudspeakers. The round-trip wireless communication introduces only a small latency (about 10 ms), thus allowing tight timing while playing.
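
    The abstract does not detail how force readings become note triggers, but a minimal rising-edge threshold detector over sensor samples (the threshold value is hypothetical) conveys the idea of turning foot pressure into percussive events:

```python
def detect_taps(samples, threshold=0.6):
    """Return the indices where the force signal rises through the threshold.

    samples   -- normalized force-sensor readings in [0, 1]
    threshold -- hypothetical trigger level; each rising edge fires one tap
    """
    taps = []
    above = False
    for i, value in enumerate(samples):
        if value >= threshold and not above:
            taps.append(i)            # rising edge: trigger a percussive note
        above = value >= threshold
    return taps
```

    In a real system each detected tap would be sent as an OSC or MIDI note-on to the synthesis engine, with the peak force mapped to velocity.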

  42. S. Serafin, L. Turchet, R. Nordahl. Auditory feedback in a multimodal balancing task: walking on a virtual rope. Proceedings of Sound and Music Computing Conference (SMC2011), Jul. 2011.

    We describe a multimodal system which exploits the use of footwear-based interaction in virtual environments. We developed a pair of shoes enhanced with pressure sensors, actuators, and markers. These shoes control a multichannel surround sound system and drive a physically based sound synthesis engine which simulates the act of walking on different surfaces. We present the system in all its components, and explain its ability to simulate natural interactive walking in virtual environments. The system was used in an experiment whose goal was to assess the ability of subjects to walk blindfolded on a virtual plank. Results show that subjects perform the task slightly better when exposed to haptic feedback as opposed to auditory feedback, although no significant differences were measured. The combination of auditory and haptic feedback does not significantly enhance task performance.

  43. L. Turchet, S. Serafin. An investigation on temporal aspects in the audio-haptic simulation of footsteps. A. Vatakis et al. (Eds.): Time and Time Perception 2010, pages 101-115, 2011.

    In this paper, we present an experiment whose goal is to assess the role of temporal aspects in sonically and haptically simulating the act of walking on a bump or a hole. In particular, we investigated whether the timing between heel and toe and the timing between footsteps affected the perception of walking on uneven surfaces. Results show that it is possible to sonically and haptically simulate a bump or a hole by varying temporal information alone.

  44. L. Turchet, S. Serafin, A preliminary study on sound delivery methods for footstep sounds. Proceedings of the 14th Conf. On Digital Audio Effects Conference, Paris, France, 2011.

    In this paper, we describe a sound delivery method for footstep sounds, investigating whether subjects prefer static versus dynamic rendering. In this case, dynamic means that the sound delivery method simulates footsteps following the subject. An experiment was run in order to assess subjects' preferences regarding the sound delivery methods. Results show that static rendering is not significantly preferred to dynamic rendering, but subjects disliked renderings where footstep sounds followed a trajectory different from the one they were walking along.
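
    The delivery method itself is not specified in the abstract; purely as an illustration of what "footsteps following the subject" can mean in practice, here is a hypothetical constant-power pan that distributes a footstep sound between the two loudspeakers bracketing the walker's position along a line of speakers:

```python
import math

def pan_gains(position, speaker_positions):
    """Constant-power gains so a sound appears at `position` along a speaker line.

    position          -- walker position along the line (same units as speakers)
    speaker_positions -- sorted positions of the loudspeakers

    Returns one gain per speaker; only the two speakers bracketing the
    position are active.
    """
    gains = [0.0] * len(speaker_positions)
    # Clamp positions outside the array to the nearest speaker.
    if position <= speaker_positions[0]:
        gains[0] = 1.0
        return gains
    if position >= speaker_positions[-1]:
        gains[-1] = 1.0
        return gains
    # Find the bracketing pair and pan between them.
    for i in range(len(speaker_positions) - 1):
        lo, hi = speaker_positions[i], speaker_positions[i + 1]
        if lo <= position <= hi:
            frac = (position - lo) / (hi - lo)
            gains[i] = math.cos(frac * math.pi / 2)      # constant-power law
            gains[i + 1] = math.sin(frac * math.pi / 2)
            return gains
    return gains
```

    The constant-power law keeps perceived loudness roughly even as the virtual footstep glides between speakers, which is one plausible way to realize the dynamic condition.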





This news item is from NIW Project
( http://www.niwproject.eu/news.php?extend.6 )