Integrating Live Skeleton data into a VR Environment.

Hoxey, T. and Stephenson, I., 2018. Integrating Live Skeleton data into a VR Environment. In: Eurographics 2018, 16-20 April 2018, Delft, The Netherlands.

Full text available as:

integrating-live-skeleton.pdf - Accepted Version
Available under License Creative Commons Attribution Non-commercial No Derivatives.


Official URL:

DOI: 10.2312/egp.20181014


The aim of this project is to visualise live skeleton-tracking data in a virtual analogue of a real-world environment, viewed in VR. Using a single RGBD camera for motion tracking is a cost-effective way to obtain real-time 3D skeleton-tracking data, and the people being tracked need no special markers, which makes the approach far more practical outside a studio or lab environment. However, the skeleton it provides is not as accurate as that of a traditional multiple-camera system. With a single fixed viewpoint the body can easily occlude itself, for example by standing side-on to the camera. Secondly, without marked tracking points the joints may be identified inconsistently, leading to inconsistent body proportions. In this paper we outline a method for improving the quality of motion-capture data in real time, providing an off-the-shelf framework for importing the data into a virtual scene. Our method uses a two-stage approach to smooth smaller inconsistencies and to estimate the position of improperly proportioned or occluded joints.
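The poster itself does not spell out the filters used in the two stages, but a minimal sketch of such a two-stage pipeline, assuming an exponential-smoothing pass for frame-to-frame jitter followed by a bone-length constraint to correct inconsistent proportions, might look like this (the bone table, joint indices, and `alpha` value are illustrative assumptions, not values from the paper):

```python
import numpy as np

# Hypothetical reference bone table: (parent_joint, child_joint, length_in_metres).
# In practice these lengths would be calibrated per tracked person.
BONES = [(0, 1, 0.25), (1, 2, 0.30)]  # e.g. a hip -> knee -> ankle chain

def smooth_joints(prev, raw, alpha=0.5):
    """Stage 1: exponential smoothing to damp small frame-to-frame jitter.

    `prev` and `raw` are (num_joints, 3) arrays of joint positions.
    """
    return alpha * raw + (1.0 - alpha) * prev

def enforce_bone_lengths(joints, bones):
    """Stage 2: re-project each child joint along its bone direction so the
    skeleton keeps consistent proportions, compensating for joints the
    camera identified in slightly the wrong place."""
    out = joints.copy()
    for parent, child, length in bones:
        v = out[child] - out[parent]
        norm = np.linalg.norm(v)
        if norm > 1e-6:  # leave degenerate (fully occluded) bones untouched
            out[child] = out[parent] + v * (length / norm)
    return out

# Example per-frame update for a three-joint chain:
prev = np.zeros((3, 3))
raw = np.array([[0.0, 1.0, 2.0],
                [0.0, 0.8, 2.0],
                [0.0, 0.4, 2.0]])
smoothed = smooth_joints(prev, raw)
corrected = enforce_bone_lengths(smoothed, BONES)
```

Running the stages in this order means the length constraint operates on already-denoised positions, so a single noisy frame cannot stretch a bone; other filter choices (e.g. a Kalman filter for stage one) would slot into the same structure.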

Item Type: Conference or Workshop Item (Poster)
Additional Information: The definitive version is available at
Uncontrolled Keywords: motion tracking; motion capture; skeleton tracking; kinect; real time; virtual reality; telepresence
Group: Faculty of Media & Communication
ID Code: 30826
Deposited By: Symplectic RT2
Deposited On: 05 Jun 2018 14:42
Last Modified: 14 Mar 2022 14:11
