Positional tracking
==Introduction==
Positional tracking is a technology that allows a device to estimate its position relative to the environment around it. It uses a combination of hardware and software to detect its absolute position. Positional tracking is an essential technology for [[virtual reality]] (VR), making it possible to track movement with six [[degrees of freedom]] (6DOF) <ref name="1">StereoLabs. Positional Tracking. Retrieved from https://www.stereolabs.com/documentation/overview/positional-tracking/introduction.html</ref> <ref name="2">Lang, B. (2013). An introduction to positional tracking and degrees of freedom (DOF). Retrieved from http://www.roadtovr.com/introduction-positional-tracking-degrees-freedom-dof/</ref>. It has to be noted that head tracking is not the same as positional tracking. While head tracking only registers the rotation of the head ([[Rotational tracking]]), with movements such as pitch, yaw, and roll, positional tracking registers the exact position of the headset in space, recognizing forward/backward, up/down, and left/right movement <ref name="3">Rohr, F. (2015). Positional tracking in VR: what it is and how it works. Retrieved from http://data-reality.com/positional-tracking-in-vr-what-it-is-and-how-it-works</ref>. Precise, low-[[latency]] positional tracking is essential for a good VR experience and for producing [[presence]].

Positional tracking brings various benefits to the VR experience. It can change the viewpoint of the user to reflect actions like jumping, ducking, or leaning forward; allow for an exact representation of the user's hands and other objects in the virtual environment; increase the connection between the physical and virtual worlds by, for example, using hand position to move virtual objects by touch; and detect gestures by analyzing position over time <ref name="2"></ref> <ref name="4">Boger, Y. (2014). Overview of positional tracking technologies for virtual reality. Retrieved from http://www.roadtovr.com/overview-of-positional-tracking-technologies-virtual-reality/</ref>. Positional tracking also improves the 3D perception of the virtual environment because of parallax (the way objects closer to the eyes move faster than objects farther away), which, along with stereoscopy, helps inform the brain about the perception of distance <ref name="2"></ref> <ref name="4"></ref>. In addition, 6DOF tracking drastically reduces the motion sickness caused by the disconnect between what is seen by the eyes and what is felt by the vestibular system of the ear <ref name="2"></ref> <ref name="3"></ref>.

There are different methods of positional tracking. Choosing which one to apply depends on various factors such as the tracking accuracy and refresh rate required, the size of the tracking area, whether the tracking is indoor or outdoor, cost, power consumption, available computational power, whether the tracked object is rigid or flexible, and whether the tracked objects are well known or can change <ref name="4"></ref>. Positional tracking is a necessity for VR to work properly, since an accurate representation of objects like the head or the hands in the virtual world contributes towards achieving immersion and a greater sense of presence <ref name="2"></ref> <ref name="3"></ref> <ref name="4"></ref> <ref name="5">RealVision. The dilemma of positional tracking in cinematic VR films. Retrieved from http://realvision.ae/blog/2016/06/the-dilemma-of-positional-tracking-in-cinematic-vr-films/</ref>.

__TOC__

==Methods of positional tracking==
[[File:HMD and markers.png|thumb|1. Markers on an HMD (Image: www.roadtovr.com)]]
[[File:Optical marker.png|thumb|2. Optical marker by Intersense (Image: www.roadtovr.com)]]
There are various methods of positional tracking.
The description of the methods provided below is based on Boger (2014) <ref name="4"></ref>.

===Acoustic Tracking===
Acoustic tracking measures the time it takes for a known acoustic signal to travel between an emitter and a receiver. Generally, several transmitters are placed around the tracked area and various receivers are placed on the tracked objects. The distance between a receiver and a transmitter is calculated from the amount of time the acoustic signal takes to reach the receiver. For this to work, however, the system must know when the acoustic signal was sent. The orientation of a rigid object can be determined if the object has multiple receivers placed at known positions: the differences between the arrival times of the acoustic signal at the various receivers provide data about the orientation of the object relative to the transmitters. One of the downsides of acoustic tracking is that it requires time-consuming calibration to function properly. Acoustic trackers are also susceptible to measurement error due to ambient disturbances such as noise, and they do not provide high update rates. Due to these disadvantages, acoustic tracking systems are commonly combined with other sensors (e.g. inertial sensors) to provide better accuracy. Intersense, an American technology company, has developed successful acoustic tracking systems.

===Inertial Tracking===
Inertial tracking is made possible by the use of accelerometers and gyroscopes. Accelerometers measure linear acceleration, which is used to calculate velocity and the position of the object relative to an initial point. This is possible due to the mathematical relationship between position and velocity, and between velocity and acceleration <ref name="4"></ref>. A gyroscope measures angular velocity. It is a solid-state component based on microelectromechanical systems (MEMS) technology and operates on the same principles as a mechanical gyro.
From the angular velocity data provided by the gyroscope, the angular position relative to the initial point is calculated. This technology is inexpensive and can provide high update rates as well as low latency. On the other hand, the calculations (i.e. integration and double integration) of the values given by the accelerometers (acceleration) and gyroscopes (angular velocity) that lead to the object's position can result in a significant drift in the position information, decreasing this method's accuracy.

===Magnetic Tracking===
This method measures the magnitude of the magnetic field in different directions. Normally, the system has a base station that generates a magnetic field, with the strength of the field diminishing as the distance between the measurement point and the base station increases. A magnetic field also allows for the determination of orientation: if the measured object is rotated, the distribution of the magnetic field along the various axes changes. In a controlled environment, magnetic tracking's accuracy is good. However, it can be influenced by interference from conductive materials near the emitter or sensors, from magnetic fields generated by other devices, and from ferromagnetic materials in the tracking area. The [[Razer Hydra]] motion controllers are an example of this type of positional tracking implemented in a product. Most [[Head-mounted display|head-mounted displays]] (HMDs) and smartphones contain [[IMUs]] or [[magnetometer|magnetometers]] that detect the magnetic field of Earth.

===Optical Tracking===
For optical tracking, there are various methods available. The commonality between them all is the use of cameras to gather positional information. One weakness of optical tracking is [[occlusion]], which occurs when the tracked objects are hidden from the camera because they are behind other objects.

====Tracking with markers====
This optical tracking method uses a specific pattern of markers placed on an object (Figure 1). One or more cameras then seek the markers, using algorithms to extract the position of the object from the visible markers. From the difference between what the video camera is detecting and the known marker pattern, an algorithm calculates the position and orientation of the tracked object. The pattern of markers placed on the tracked object is not random: the number, location, and arrangement of the markers are carefully chosen in order to provide the system with as much information as possible so that the algorithms do not have missing data.
There are two types of markers: passive and active. Passive markers reflect infrared (IR) light back towards the light source; in this case, the camera provides the IR signal that is reflected from the markers for detection. Active markers are IR lights that flash periodically and are detected by the cameras. Choosing between the two types of markers depends on several variables like distance, type of surface, required viewing direction, and others. Multiple markers with unique designs can also be deployed, allowing the camera(s) to track the positions of several objects at the same time.
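As a simplified illustration of the underlying geometry (a sketch only, not any particular product's algorithm; the focal length and camera baseline are made-up values), the 3D position of a single marker seen by two calibrated cameras can be recovered by triangulation:

```python
# Hypothetical stereo triangulation of one marker from its pixel
# coordinates in two identical pinhole cameras separated by a known
# baseline along x, both looking down +z. All values are illustrative.

def triangulate_marker(x_left, x_right, y, f=800.0, baseline=0.5):
    """Return the (X, Y, Z) position of a marker in metres, given its
    pixel coordinates in the left/right images (principal point at 0, 0,
    focal length f in pixels, baseline in metres)."""
    disparity = x_left - x_right       # pixels; grows as the marker gets closer
    if disparity <= 0:
        raise ValueError("marker must be in front of both cameras")
    z = f * baseline / disparity       # depth, from similar triangles
    x = x_left * z / f                 # back-project the left-image coordinates
    y3d = y * z / f
    return x, y3d, z

# A marker 2 m away and 0.25 m to the right of the left camera:
print(triangulate_marker(100.0, -100.0, 0.0))  # -> (0.25, 0.0, 2.0)
```

A real system repeats this for every visible marker and then fits the known marker pattern to the resulting points to obtain the object's orientation as well.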
====Tracking with visible markers====
Visible markers (Figure 2) placed in a predetermined arrangement are also used in optical tracking. The camera detects the markers and their positions, leading to the determination of the position and orientation of the object. For example, visible markers can be placed in a specific pattern over the tracking area, and an HMD with cameras would then use this pattern to calculate its position. The shape and size of these markers can vary; what is important is that they can be easily identified by the cameras.
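The idea can be sketched in two dimensions: a camera with a known heading that measures the directions to two markers at known room positions can compute its own location by intersecting the two sight lines (a toy example with invented coordinates, not code from any shipping system):

```python
import math

# Toy 2-D "resection" from bearings to two markers at known positions.
# theta1/theta2 are the absolute angles (radians) at which the camera
# sees markers p1 and p2; the camera's heading is assumed known.

def locate_from_bearings(p1, theta1, p2, theta2):
    """Return the (x, y) camera position given bearings to two markers."""
    d1 = (math.cos(theta1), math.sin(theta1))
    d2 = (math.cos(theta2), math.sin(theta2))
    # camera + r1*d1 = p1 and camera + r2*d2 = p2, so
    # r1*d1 - r2*d2 = p1 - p2: a 2x2 linear system solved by Cramer's rule.
    bx, by = p1[0] - p2[0], p1[1] - p2[1]
    det = -d1[0] * d2[1] + d2[0] * d1[1]
    r1 = (-bx * d2[1] + d2[0] * by) / det
    return (p1[0] - r1 * d1[0], p1[1] - r1 * d1[1])

# Markers at (1, 1) and (-1, 1), seen at 45 and 135 degrees: the camera
# sits at the origin (up to floating-point rounding).
print(locate_from_bearings((1.0, 1.0), math.pi / 4, (-1.0, 1.0), 3 * math.pi / 4))
```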
====Markerless tracking====
Objects can be tracked without markers if their geometry is known. With markerless tracking, the system camera searches the received image and compares it with the known 3D model, looking for features like edges or color transitions, for example. Certain identifiable features, such as the fingers of a hand, can be tracked this way; [[Leap Motion]] employs this approach to find the position and movement of the user's fingers.
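At its core, this is a search for the placement at which the known model best explains the image. A minimal one-dimensional sketch of that search (invented numbers, far simpler than a real 3D matcher):

```python
# Toy markerless matching: slide a known edge profile of an object along
# an image scanline and keep the offset with the smallest sum of
# absolute differences (SAD).

def best_offset(scanline, template):
    """Return the offset where `template` best matches `scanline`."""
    scores = []
    for off in range(len(scanline) - len(template) + 1):
        sad = sum(abs(scanline[off + i] - t) for i, t in enumerate(template))
        scores.append((sad, off))
    return min(scores)[1]   # offset of the lowest-score match

edge_profile = [0, 9, 9, 0]               # known appearance of the object
image_row = [0, 0, 0, 0, 9, 9, 0, 0]      # one row of the camera image
print(best_offset(image_row, edge_profile))  # -> 3
```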
====Depth map tracking====
A depth camera uses various technologies to create a real-time map of the distances of the objects in the tracking area from the camera. Tracking is performed by extracting the object to be tracked (e.g. a hand) from the general depth map and analyzing it. An example of a depth map camera is Microsoft's Kinect.
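The extraction step can be illustrated with a toy depth map in which the tracked object is simply assumed to be the closest surface (made-up values; real segmentation is considerably more involved):

```python
# Extract the pixels of the closest object (e.g. a hand in front of a
# wall) from a small synthetic depth map; 0.0 marks "no reading".

def extract_closest_object(depth, tolerance=0.1):
    """Return the (row, col) pixels within `tolerance` metres of the
    closest measured point."""
    valid = [(r, c) for r, row in enumerate(depth)
             for c, d in enumerate(row) if d > 0.0]
    nearest = min(depth[r][c] for r, c in valid)
    return [(r, c) for r, c in valid if depth[r][c] - nearest <= tolerance]

depth_map = [
    [0.0, 2.5, 2.5],    # background wall at 2.5 m
    [0.6, 0.65, 2.5],   # hand at about 0.6 m
    [0.6, 2.5, 2.5],
]
print(extract_closest_object(depth_map))  # -> [(1, 0), (1, 1), (2, 0)]
```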
====Sensor Fusion====
Sensor fusion is a method of using more than one tracking technique in order to improve the detection of the position and orientation of the tracked object. By using a combination of techniques, one method's disadvantages can be compensated for by another. An example of this would be the combination of inertial tracking and optical tracking: the former can develop drift, and the latter is susceptible to markers being hidden (occlusion). By combining both, if the markers are occluded, the position can be estimated by the inertial trackers; and even when the optical markers are completely visible, the inertial sensors provide updates at a higher rate, improving the overall positional tracking.
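The trade-off can be sketched in one dimension: a drifting inertial estimate is periodically pulled back by an optical fix whenever the markers are visible (all numbers are invented; a real system would use something like a Kalman filter and would also correct velocity):

```python
# 1-D toy fusion: integrate a (biased) accelerometer at a high rate and
# blend in an absolute optical position whenever one is available.

def fuse(samples, dt=0.01, blend=0.5):
    """samples: list of (acceleration, optical_position_or_None).
    Returns the fused position estimate after each sample."""
    pos, vel, out = 0.0, 0.0, []
    for accel, optical in samples:
        vel += accel * dt              # acceleration -> velocity
        pos += vel * dt                # velocity -> position (drifts!)
        if optical is not None:        # markers visible: correct the drift
            pos += blend * (optical - pos)
        out.append(pos)
    return out

# A stationary object with a 0.5 m/s^2 accelerometer bias: pure inertial
# tracking drifts away, while an optical fix every 5th sample (true
# position 0.0) keeps the estimate close to zero.
drifting = fuse([(0.5, None)] * 100)
corrected = fuse([(0.5, 0.0 if i % 5 == 4 else None) for i in range(100)])
print(drifting[-1], corrected[-1])
```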
===Oculus Rift and HTC Vive's positional tracking===
The [[Oculus Rift]]'s positional tracking is different from the one the HTC Vive uses. While the Oculus Rift uses [[Constellation]], an IR-LED array that is tracked by a camera, the HTC Vive uses Valve's [[Lighthouse]] technology, which is a laser-based system <ref name="3"></ref>.
In the Oculus Rift, movement is limited to the sight area of the camera; when not enough LEDs are in sight of the camera, the software relies on data sent by the headset's IMU sensors. With Valve's position tracking system, the tracking area is flooded with non-visible light which the HTC Vive detects using photosensors <ref name="3"></ref> <ref name="6">Buckley, S. (2015). This is how Valve's amazing lighthouse tracking technology works. Retrieved from http://gizmodo.com/this-is-how-valve-s-amazing-lighthouse-tracking-technol-1705356768</ref>.
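The principle behind a laser-sweep system can be illustrated by the timing-to-angle conversion at its heart (a hypothetical sketch with an assumed sweep rate, not Valve's actual parameters):

```python
# A base station flashes a sync pulse, then sweeps a laser line across
# the room at a fixed rate; the moment a photosensor on the headset is
# hit encodes its angle from the station. The sweep rate is assumed.

SWEEP_PERIOD = 1 / 60   # seconds per full 360-degree sweep (assumed)

def sweep_time_to_angle(t_since_sync):
    """Angle (degrees) of a photosensor, from its laser-hit time."""
    return 360.0 * t_since_sync / SWEEP_PERIOD

# A sensor hit 1/240 s after the sync pulse lies a quarter-turn
# (about 90 degrees) into the sweep.
print(sweep_time_to_angle(1 / 240))
```

Angles from two such sweeps (plus the known layout of the sensors on the headset) are enough to solve for position and orientation.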
===Positional tracking and smartphones===
Positional tracking in mobile VR still struggles to achieve a good level of accuracy, mainly due to the processing power needed to handle a positional tracking system and the fact that using QR codes and cameras for tracking would contradict the essence of having a simple, intuitive, and mobile VR experience <ref name="3"></ref>. Currently, mobile devices are limited by their form factor and can only track the movements of a user's head. Nevertheless, companies are still investing in the development of an accurate positional tracking system for smartphones. Having such a system available to anyone with a phone capable of VR would facilitate the adoption of VR by the general public, possibly unlocking the potential of the VR market <ref name="3"></ref> <ref name="7">Grubb, J. (2016). Why positional tracking for mobile virtual reality is so damn hard. Retrieved from https://venturebeat.com/2016/02/24/why-positional-tracking-for-mobile-virtual-reality-is-so-damn-hard</ref>.
==Types of positional tracking==
'''[[Inside-out tracking]]''' - tracking camera is placed on the device ([[HMD]]) being tracked.
'''[[Outside-in tracking]]''' - tracking camera(s) is placed in the external environment where the tracked device (HMD) is within its view.
'''[[Markerless tracking]]''' - tracking system that does not use [[fiducial markers]].
'''[[Markerless inside-out tracking]]'''
==References==
<references/>

[[Category:Terms]]