Jean Sreng, Ph.D.


+33 1 46 54 84 09

CEA - Fontenay-aux-Roses center
DTSI / Virtual reality, Cognitics and Interfaces Unit
Route du panorama
BP6 - F92265 Fontenay-aux-Roses Cedex
France
Last update: 20 July 2009

Current position

I am currently a researcher/engineer at CEA LIST (French Atomic Energy Commission) / DTSI, in the Laboratory of Interactive Simulation (LSI) / Virtual Reality, Cognitics and Interfaces Unit (SRCI). My current research focuses on multimodal interaction in virtual environments.

Education

2005 - 2008
CEA - INRIA: Ph.D. in computer science
Contribution to the study of visual, auditory and haptic rendering of information of contact in virtual environments
2004 - 2005
Paris XI Orsay: M2 (Master's degree) in cognitive science
Internship at CEA on visual aids for virtual prototyping
2002 - 2005
École Centrale Paris: Engineer's degree
Third year specialization in embedded systems

Publications

VIS06
Using Visual Cues of Contact to Improve Interactive Manipulation of Virtual Objects in Industrial Assembly/Maintenance Simulations
Jean Sreng, Anatole Lécuyer, Christine Mégard and Claude Andriot
In IEEE Transactions on Visualization and Computer Graphics, 12(5):1013-1020, 2006
This paper describes a set of visual cues of contact designed to improve the interactive manipulation of virtual objects in industrial assembly/maintenance simulations. These visual cues display proximity, contact and effort information between virtual objects when the user manipulates a part inside a digital mock-up. The set of visual cues includes the display of glyphs (arrow, disk, or sphere) when the manipulated object is close to or in contact with another part of the virtual environment. Light sources can also be added at the contact points. A filtering technique is proposed to decrease the number of glyphs displayed at the same time. Various effects, such as changes in color, changes in size, and deformation of shape, can be applied to the glyphs as a function of proximity to other objects or the amplitude of the contact forces. A preliminary evaluation was conducted to gather the subjective preferences of a group of participants during the simulation of an automotive assembly operation. The collected questionnaires showed that participants globally appreciated our visual cues of contact. Changes in color were preferred for displaying distance and proximity information, while changes in size and deformation effects were preferred for perceiving contact forces between the parts. Lastly, light sources were selected to focus the attention of the user on the contact areas.
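The abstract describes glyph color and size as functions of proximity and contact force. A minimal sketch of such a mapping is shown below; the thresholds, ranges and function names are illustrative assumptions, not taken from the paper.

```python
def glyph_color(distance, d_max=0.05):
    """Interpolate from green (far) to red (touching) as distance shrinks.

    d_max is a hypothetical proximity threshold (meters) beyond which
    the glyph is fully green.
    """
    t = max(0.0, min(1.0, distance / d_max))  # 1 = far, 0 = in contact
    return (1.0 - t, t, 0.0)  # (R, G, B)

def glyph_size(force, f_max=10.0, s_min=0.005, s_max=0.02):
    """Grow the glyph with contact-force magnitude, clamped at f_max (N)."""
    t = max(0.0, min(1.0, force / f_max))
    return s_min + t * (s_max - s_min)
```

In an actual simulation, such mappings would be evaluated per contact point each frame, after the filtering step that limits how many glyphs are shown at once.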

VRST07
Using an Event-Based Approach to Improve the Multimodal Rendering of 6DOF Virtual Contact
Jean Sreng, Florian Bergez, Jérémie Le Garrec, Anatole Lécuyer and Claude Andriot
In Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST), 165-173, 2007
This paper describes a general event-based approach to improve the multimodal rendering of 6DOF (degree of freedom) contact between objects in interactive virtual object simulations. The contact events represent the different steps of two objects colliding with each other: (1) the state of free motion, (2) the impact event at the moment of collision, (3) the friction state during the contact and (4) the detachment event at the end of the contact. These events are used to improve the classical feedback by superimposing specific rendering techniques based on them. First, we propose a general method to generate these events based only on the objects' positions given by the simulation. Second, we describe a set of different types of multimodal feedback associated with the different events, which we implemented in a complex virtual simulation dedicated to virtual assembly. For instance, we propose a visual rendering of impact, friction and detachment based on particle effects. We used the impact event to improve the 6DOF haptic rendering by superimposing a high-frequency force pattern on the classical force feedback. We also implemented a realistic audio rendering using impact and friction sounds on the corresponding events. All these first implementations can easily be extended with other event-based effects on various rigid-body simulations thanks to our modular approach.
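The four contact states and events above can be sketched as a small per-step state machine. This is a hypothetical reconstruction for illustration, assuming a boolean contact query derived from object positions; the paper's actual event-generation method is more involved.

```python
# States carried between simulation steps.
FREE, CONTACT = "free", "contact"

def contact_events(in_contact_now, state):
    """Return (new_state, event) for one simulation step.

    in_contact_now: bool result of a collision query on object positions.
    state: FREE or CONTACT, carried over from the previous step.
    """
    if state == FREE and in_contact_now:
        return CONTACT, "impact"       # (2) moment of collision
    if state == CONTACT and in_contact_now:
        return CONTACT, "friction"     # (3) sustained contact
    if state == CONTACT and not in_contact_now:
        return FREE, "detachment"      # (4) end of contact
    return FREE, None                  # (1) free motion
```

Each emitted event would then trigger the corresponding superimposed rendering: particle effects, a high-frequency force pattern, or an impact/friction sound.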

EH08
Using vibration patterns to provide impact position information in haptic manipulation of virtual objects
Jean Sreng, Anatole Lécuyer and Claude Andriot
In LNCS(5024), Proceedings of EuroHaptics, 589-598, 2008
While the standard closed haptic control loop used in haptic simulation of rigid bodies is limited to low-frequency force restitution, event-based (open-loop) haptics, by superimposing a high-frequency transient force pattern, can provide a realistic feeling of impact. This high-frequency transient can provide the user with rich information about the contact, such as the material properties of the object. Similarly, an impact at different locations on an object produces different vibration patterns that can be used to determine the impact location. This paper investigates the use of such high-frequency vibration patterns to provide impact position information on a simulated long rod held at one edge. We propose different vibration pattern models to convey the position information: a realistic model based on a numerical simulation of a beam and three simplified empirical models based on exponentially decaying sinusoids. A preliminary evaluation was conducted with 15 participants. Taken together, our results showed that users are able to efficiently associate vibration information with impact position.
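A minimal sketch of the simplified empirical model mentioned above, an exponentially decaying sinusoid superimposed on the force feedback. The parameter values and sampling scheme here are assumptions for illustration; in the paper the amplitude, frequency and decay would be chosen as a function of impact position.

```python
import math

def vibration_pattern(t, amplitude, frequency, decay):
    """Transient force A * exp(-decay * t) * sin(2*pi*f*t) at time t (s)."""
    return amplitude * math.exp(-decay * t) * math.sin(2 * math.pi * frequency * t)

def sample_pattern(amplitude, frequency, decay, duration=0.05, rate=1000):
    """Sample the transient at `rate` Hz, ready to superimpose on the
    low-frequency force feedback for the duration of the impact."""
    n = int(duration * rate)
    return [vibration_pattern(i / rate, amplitude, frequency, decay)
            for i in range(n)]
```

Varying `frequency` and `decay` per impact position is what lets distinct locations produce distinguishable transients.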

VR09
Spatialized Haptic Rendering: Providing Impact Position Information in 6DOF Haptic Simulations Using Vibrations
Jean Sreng, Anatole Lécuyer, Claude Andriot and Bruno Arnaldi
In Proceedings of IEEE VR, 3-9, 2009
In this paper we introduce a "Spatialized Haptic Rendering" technique to enhance 6DOF haptic manipulation of virtual objects with impact position information using vibrations. This rendering technique exploits our perceptual ability to determine the contact position from the vibrations generated by the impact. In particular, the different vibrations generated by a beam are used to convey the impact position information. We present two experiments conducted to tune and evaluate our spatialized haptic rendering technique. The first experiment investigates the vibration parameters (amplitudes/frequencies) needed to enable efficient discrimination of the force patterns used for spatialized haptic rendering. The second experiment is an evaluation of spatialized haptic rendering during 6DOF manipulation. Taken together, the results suggest that spatialized haptic rendering can be used to improve the haptic perception of impact position in complex 6DOF interactions.

STH09
Perception tactile de la localisation spatiale des contacts
Jean Sreng, Anatole Lécuyer
Sciences et Technologies pour le Handicap, 3(1), 2009
This article addresses the haptic perception of the spatial localization of contacts. An impact on a solid object generates vibrations that propagate along it according to its physical properties. These vibrations, felt by the hand manipulating the object, make it possible, for example, to distinguish between different materials of manipulated objects. Moreover, the shape of these vibrations also depends on the position of the impact on the object. The objective of this article is to study the ability to determine this impact position through the perception of these vibrations. Several vibration models based on the description of a vibrating cantilever beam are proposed. The perceptual evaluation conducted on these models suggests that it is possible to perceive an impact position through these vibrations.

Posters

AFRV07
Approche Evénementielle pour l’Amélioration du Rendu Multimodal 6DDL de Contact Virtuel
Jean Sreng, Florian Bergez, Jérémie Le Garrec, Anatole Lécuyer and Claude Andriot
Actes des journées de l'Association Française de Réalité Virtuelle, 97-104, 2007
This article presents a general event-based approach to improve the 6DOF (degree of freedom) multimodal rendering of contact between objects in interactive virtual simulations. The contact events used represent the different stages of a collision between two objects: (1) free motion, (2) the impact event at the moment of collision, (3) the friction state during contact and (4) the detachment event at the end of contact. These events and states are used to improve the classical sensory feedback by superimposing specific rendering techniques based on them. First, a general method is proposed to generate these events based solely on the object positions provided by the simulation. Second, we describe a set of multimodal feedback techniques associated with these events, implemented in a complex virtual prototyping simulation. To illustrate these developments, a visual rendering of impact, friction and detachment based on particle effects is proposed. We also present a method based on the impact event to improve the realism of the 6DOF haptic feedback through high-frequency force patterns superimposed on the classical rendering. Furthermore, a realistic audio rendering of the impact and friction events was also implemented. These first implementations can easily be extended with other event-based renderings on various virtual simulations thanks to the modular approach we adopted.

Ph.D. Thesis

I conducted my Ph.D. studies in computer science with the Bunraku team at INRIA Rennes and at the CEA (French Atomic Energy Commission), under the supervision of Dr. Anatole Lécuyer and Dr. Claude Andriot and the direction of Prof. Bruno Arnaldi. I publicly defended my Ph.D. thesis on December 9th, 2008, in Rennes (France).

Title: Contribution to the study of visual, auditory and haptic rendering of information of contact in virtual environments

Abstract: Virtual reality technologies are increasingly used in numerous domains. In industrial applications, for instance, virtual prototyping enables engineers to interactively test whether one part can be assembled with another. In such simulations, contact between virtual objects is an essential notion, as it tightly governs the movement of objects, constraining their trajectories with respect to their direct surroundings. Contact helps the user understand the interaction between geometries, notably through interactive manipulation. Providing contact information in virtual environments raises many challenges, for instance due to the growing complexity of simulated scenes. In this thesis, we investigate multimodal (visual, auditory and haptic) rendering techniques focused on contact information in virtual environments. First, we propose an integrated approach for the multimodal rendering of contact information in 6DOF manipulation, with a generic formulation independent of the underlying simulation. We then introduce a multimodal rendering architecture delivering visual, auditory, tactile and kinesthetic feedback of contact. Next, we investigate the specific issues raised by objects with complex shapes, whose interactive manipulation generates multiple contacts that are difficult to perceive. We propose to address this issue by providing the position information associated with each contact. We first present and evaluate a visual rendering technique based on glyph and light effects to provide contact information in situations of multiple contacts between objects. We then introduce a haptic rendering technique based on high-frequency vibrations to convey impact position information. A 1DOF haptic case study based on a vibrating beam is first presented and evaluated experimentally. We then generalize this approach with a spatialized haptic rendering technique for 6DOF manipulation. Two experiments are presented to optimize the rendering parameters of spatialized haptic rendering and to provide a subjective evaluation.

Keywords: virtual reality, rendering, contact, collision, impact, multimodal, visual, auditory, tactile, kinesthetic, haptic, event-based, vibration, 6DOF

Reviewer