
Physics-based collaboration in a shared virtual environment

For physics-based collaboration in a shared virtual environment, both architectural and implementation extensions are scheduled as maintenance activities of the Laboratory's virtual reality simulator. These will allow multiple users to interact immersively, including interfacing with the rich stream of multi-person tracking data. The first implementation targets local operation, with participants co-located in the Laboratory; the next is designed to test and demonstrate remote collaboration among people in suitably equipped laboratories networked with the CIRA Laboratory. The geographically distributed version will also deal with the latency between input and output (the effect visually perceived by each concurrent user), in order to minimize annoying and inhibitory simulator sickness.

A large-area multi-person tracking system will enable a new, advanced contactless optical skeleton tracking of the participants (two or more people) in a collaborative immersive experience. It will consist of four Microsoft Kinect v2 units, a modern RGB + depth camera that provides real-time tracking of the main parts of the body (skeleton) of up to six people at once. Each Kinect unit will cover a single tracking quadrant, and its data, after initial co-registration, will be "fused" into a single data set used to drive the motion of the virtual mannequins in the common virtual environment that serve as the avatars of each participant in the immersive collaborative simulation. Seeing the other participants through their mannequin-avatars, animated in real time by the corresponding tracking data, is decisive in the collaborative environment. It is important that the avatars be as realistic and life-like as possible; to achieve that, one has to attend in real time to all the main sources of Kinect v2 data: the depth map, the high-level skeleton data, and the HD color video stream.
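The initial co-registration mentioned above can be illustrated with a standard rigid-alignment step: given the same set of 3-D points (e.g. skeleton joints of a calibration pose) seen by two Kinect units, the Kabsch algorithm recovers the rotation and translation that map one unit's coordinate frame onto the common one. The sketch below is illustrative only; the function name and the use of Kabsch are assumptions, not a description of the Laboratory's actual calibration pipeline.

```python
import numpy as np

def kabsch(src, dst):
    """Rigid transform (R, t) mapping points `src` onto `dst` (Kabsch algorithm).

    src, dst: (N, 3) arrays of corresponding 3-D points, e.g. joint positions
    of a calibration pose observed by two different Kinect units.
    """
    src_c = src - src.mean(axis=0)          # center both point clouds
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                     # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t

# Hypothetical example: a second unit rotated 90 degrees about the vertical
# axis and shifted; the transform is recovered from 8 corresponding points.
theta = np.pi / 2
R_true = np.array([[np.cos(theta), 0.0, np.sin(theta)],
                   [0.0, 1.0, 0.0],
                   [-np.sin(theta), 0.0, np.cos(theta)]])
t_true = np.array([1.0, 0.0, 2.0])
pts = np.random.default_rng(0).normal(size=(8, 3))
R, t = kabsch(pts, pts @ R_true.T + t_true)
assert np.allclose(R, R_true) and np.allclose(t, t_true)
```

Once each unit's (R, t) is known, every joint it reports can be transformed into the common frame before fusion.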

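After co-registration, the per-quadrant skeleton streams must be merged into the single data set that drives each mannequin-avatar. One simple way to sketch this "fusion" step is a confidence-weighted average of each joint across the units that see it, using the per-joint tracking state Kinect reports (tracked, inferred, not seen). The weighting scheme and function below are a minimal illustrative assumption, not the simulator's actual algorithm.

```python
def fuse_joint(observations):
    """Fuse one skeleton joint observed by several Kinect units.

    observations: list of ((x, y, z), confidence) pairs, with positions
    already transformed into the common frame. A plausible confidence
    mapping (assumed here) is 1.0 = tracked, 0.5 = inferred, 0.0 = not seen.
    Returns the weighted-average position, or None if no unit sees the joint.
    """
    total = sum(c for _, c in observations)
    if total == 0.0:
        return None  # joint occluded in every tracking quadrant
    return tuple(
        sum(c * p[i] for p, c in observations) / total
        for i in range(3)
    )

# Two units see the same elbow with equal confidence: result is the midpoint.
fused = fuse_joint([((0.0, 0.0, 0.0), 1.0), ((2.0, 0.0, 0.0), 1.0)])
assert fused == (1.0, 0.0, 0.0)
```

Running this per joint, per frame, yields one fused skeleton per participant, which can then animate the corresponding virtual mannequin.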
 

 

2017-02-09
