Positional tracking

==Introduction==
{{TOCRIGHT}}
{{see also|Tracking}}
Positional tracking is a technology that allows a device to estimate its position relative to the environment around it. It uses a combination of hardware and software to achieve the detection of its absolute position. It is an essential technology for [[virtual reality]] (VR), making it possible to track movement with six [[degrees of freedom]] (6DOF).<ref name="1">StereoLabs. Positional Tracking. Retrieved from https://www.stereolabs.com/documentation/overview/positional-tracking/introduction.html</ref><ref name="2">Lang, B. (2013). An introduction to positional tracking and degrees of freedom (DOF). Retrieved from http://www.roadtovr.com/introduction-positional-tracking-degrees-freedom-dof/</ref>
By contrast, positional tracking is not the same as 3DOF head tracking, which only registers the rotation of the head ([[rotational tracking]]): pitch, yaw, and roll. Positional tracking registers the exact position and orientation of the headset in space, recognizing forward/backward, up/down, and left/right movement.<ref name="3">Rohr, F. (2015). Positional tracking in VR: what it is and how it works. Retrieved from http://data-reality.com/positional-tracking-in-vr-what-it-is-and-how-it-works</ref> Positional tracking brings various benefits to the VR experience. It can change the viewpoint of the user to reflect actions like jumping, ducking, or leaning forward; allow for an exact representation of the user's hands and other objects in the virtual environment; increase the connection between the physical and virtual world by, for example, using hand position to move virtual objects by touch; and detect gestures by analyzing position over time.<ref name="2" /><ref name="4">Boger, Y. (2014). Overview of positional tracking technologies for virtual reality. Retrieved from http://www.roadtovr.com/overview-of-positional-tracking-technologies-virtual-reality/</ref> Positional tracking also improves the 3D perception of the virtual environment through parallax (the way objects closer to the eyes move faster than objects farther away), which, along with stereoscopy, helps inform the brain about distance.<ref name="2" /><ref name="4" /> In addition, 6DOF tracking drastically reduces the motion sickness caused by the disconnect between what the eyes see and what the vestibular system feels.<ref name="2" /><ref name="3" />

There are different methods of positional tracking. Choosing which one to apply depends on factors such as the required tracking accuracy and refresh rate, the size of the tracking area, whether the tracking is indoor or outdoor, cost, power consumption, the available computational power, whether the tracked object is rigid or flexible, and whether the tracked objects are well known or can change.<ref name="4" /> Positional tracking is a necessity for VR to work properly, since an accurate representation of objects like the head or the hands in the virtual world contributes towards achieving immersion and a greater sense of presence.<ref name="2" /><ref name="3" /><ref name="4" /><ref name="5">RealVision. The dilemma of positional tracking in cinematic VR films. Retrieved from http://realvision.ae/blog/2016/06/the-dilemma-of-positional-tracking-in-cinematic-vr-films/</ref>

==Methods of positional tracking==
[[File:HMD and markers.png|thumb|1. Markers on an HMD (Image: www.roadtovr.com)]]
[[File:Optical marker.png|thumb|2. Optical marker by Intersense (Image: www.roadtovr.com)]]
There are various methods of positional tracking. The descriptions below are based on Boger (2014).<ref name="4" />

===Acoustic Tracking===
The measurement of the time it takes for a known acoustic signal to travel between an emitter and a receiver is known as acoustic tracking. Generally, several transmitters are placed around the tracked area and various receivers are placed on the tracked objects. The distance between a receiver and a transmitter is calculated from the time the acoustic signal takes to reach the receiver; for this to work, the system must know when the acoustic signal was sent. The orientation of a rigid object can also be determined if the object carries multiple receivers in known positions: the differences between the arrival times of the acoustic signal at those receivers provide data about the orientation of the object relative to the transmitters.

One downside of acoustic tracking is that it requires time-consuming calibration to function properly. Acoustic trackers are also susceptible to measurement errors caused by ambient noise and other acoustic disturbances, and they do not provide high update rates. Due to these disadvantages, acoustic tracking systems are commonly combined with other sensors (e.g. inertial sensors) to provide better accuracy. Intersense, an American technology company, has developed successful acoustic tracking systems.
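As a simplified illustration of the underlying calculation, the sketch below converts times of flight from three transmitters at known positions into distances and then trilaterates the 2D position of a receiver. The transmitter coordinates and timings are made-up example values, not data from any real tracking product.
<syntaxhighlight lang="python">
import numpy as np

SPEED_OF_SOUND = 343.0  # metres per second in air at roughly 20 degrees C

# Known 2D positions of three acoustic transmitters (example values, metres).
transmitters = np.array([[0.0, 0.0],
                         [4.0, 0.0],
                         [0.0, 3.0]])

# Measured times of flight from each transmitter to the receiver (seconds, example values).
times_of_flight = np.array([0.00526, 0.00785, 0.00729])

# Each time of flight converts directly into a distance.
distances = SPEED_OF_SOUND * times_of_flight

# Trilateration: subtracting the circle equation of transmitter 0 from the others
# gives a linear system A @ [x, y] = b for the receiver position.
x0, y0 = transmitters[0]
d0 = distances[0]
A, b = [], []
for (xi, yi), di in zip(transmitters[1:], distances[1:]):
    A.append([2 * (xi - x0), 2 * (yi - y0)])
    b.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)

position = np.linalg.solve(np.array(A), np.array(b))
print("Estimated receiver position (m):", position)  # roughly (1.5, 1.0)
</syntaxhighlight>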
===Wireless tracking===
Wireless tracking uses a set of anchors that are placed around the perimeter of the tracking space and one or more tags that are tracked. This system is similar in concept to GPS, but it works both indoors and outdoors, and it is sometimes referred to as indoor GPS. The tags [[triangulation (computer vision)|triangulate]] their 3D position using the anchors placed around the perimeter. The wireless technology Ultra Wideband has enabled position tracking to reach a precision of under 100 mm. By using sensor fusion and high-speed algorithms, the tracking precision can reach the 5 mm level with update speeds of 200 Hz, or 5 ms [[latency]].<ref name="6">IndoTraq. Positional Tracking. Retrieved from http://indotraq.com/?page_id=122</ref><ref name="7">Hands-On With Indotraq. Retrieved from https://www.vrfocus.com/2016/01/hands-on-with-indotraq/</ref><ref name="8">IndoTraq indoor tracking for virtual reality. Retrieved from https://blog.abt.com/2016/01/ces-2016-indotraq-indoor-tracking-for-virtual-reality/</ref>

===Inertial Tracking===
Inertial tracking is made possible by accelerometers and gyroscopes. An accelerometer measures linear acceleration, which is used to calculate the velocity and the position of the object relative to an initial point; this is possible because of the mathematical relationship between position and velocity, and between velocity and acceleration.<ref name="4" /> A gyroscope measures angular velocity. It is a solid-state component based on microelectromechanical systems (MEMS) technology that operates on the same principles as a mechanical gyro. From the angular velocity data provided by the gyroscope, the angular position relative to the initial point is calculated. Inertial sensors are inexpensive and provide high update rates as well as low latency. On the other hand, the calculations (i.e. integration and double integration) that turn the accelerometer and gyroscope readings into a position can produce significant drift in the position information, decreasing the method's accuracy.
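The drift problem can be illustrated with a short sketch. The example below uses made-up numbers rather than a model of any particular IMU: it double-integrates an accelerometer signal carrying a small constant bias, and even though the simulated device is at rest, the estimated position error grows quadratically with time.
<syntaxhighlight lang="python">
import numpy as np

dt = 0.001            # 1 kHz IMU update rate (assumed example value)
duration = 10.0       # seconds of tracking
bias = 0.01           # constant accelerometer bias in m/s^2 (example value)

steps = int(duration / dt)
true_acceleration = np.zeros(steps)                  # the device is actually at rest
measured_acceleration = true_acceleration + bias     # sensor reading with bias

# Double integration: acceleration -> velocity -> position.
velocity = np.cumsum(measured_acceleration) * dt
position = np.cumsum(velocity) * dt

# The error after t seconds is roughly 0.5 * bias * t^2 (about 0.5 m here).
print(f"Estimated position error after {duration:.0f} s: {position[-1]:.3f} m")
</syntaxhighlight>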
===Magnetic Tracking===
Magnetic tracking measures the magnitude of the magnetic field in different directions. Normally, the system has a base station that generates a magnetic field, with the strength of the field diminishing as the distance between the measurement point and the base station increases. A magnetic field also allows the orientation to be determined: if the measured object is rotated, the distribution of the magnetic field along the various axes changes. In a controlled environment, magnetic tracking is accurate and does not require a Kalman filter; under such conditions it can deliver higher-quality tracking than many other methods. However, it can be affected by interference from conductive materials near the emitter or the sensors, from magnetic fields generated by other devices, and from ferromagnetic materials in the tracking area, so it cannot be used in environments containing a lot of metal. Magnetic tracking can be AC or DC. The [[Razer Hydra]] motion controllers are an example of a product that implements this type of positional tracking. Most [[Head-mounted display|head-mounted displays]] (HMDs) and smartphones also contain [[IMU]]s or [[magnetometer]]s that detect the magnetic field of the Earth.
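As a rough illustration of how field strength relates to distance, the sketch below assumes an idealized dipole source whose field magnitude falls off with the cube of the distance and inverts that model to recover a range. The constant and the reading are made-up calibration values; a real system measures the field along several axes to recover full position and orientation.
<syntaxhighlight lang="python">
# Simplified magnetic ranging: for an idealized dipole source, the field
# magnitude falls off roughly with the cube of the distance, B(r) = k / r^3.
# The constant k and the reading below are made-up calibration values.

k = 2.0e-6          # field strength at 1 m from the base station (tesla, example)

def distance_from_field(b_measured: float) -> float:
    """Invert B = k / r^3 to recover the distance r in metres."""
    return (k / b_measured) ** (1.0 / 3.0)

# Example: a reading of 2.5e-7 T gives k/B = 8, i.e. a distance of 2 m.
print(distance_from_field(2.5e-7))
</syntaxhighlight>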
===Optical Tracking===
Several optical tracking methods exist; what they have in common is the use of cameras to gather positional information.

====Tracking with markers====
This method uses a specific pattern of markers placed on an object (Figure 1). One or more cameras search for the markers and use algorithms to extract the position of the object from the visible markers. From the difference between what the camera detects and the known marker pattern, an algorithm calculates the position and orientation of the tracked object. The pattern of markers placed on the tracked object is not random: the number, location, and arrangement of the markers are carefully chosen to provide the system with as much information as possible, so that the algorithms are not left with missing data. There are two types of markers: passive and active. Passive markers reflect infrared (IR) light back towards the light source; in this case, the camera emits the IR signal that the markers reflect for detection. Active markers are IR lights that flash periodically and are detected by the cameras. Choosing between the two types depends on several variables such as distance, type of surface, and the required viewing direction.
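Once the 2D image positions of markers with a known 3D arrangement have been found, recovering the object's pose is a perspective-n-point (PnP) problem. The sketch below shows one way to solve it with OpenCV; the marker layout, detected pixel coordinates, and camera intrinsics are placeholder example values rather than data from a real tracker.
<syntaxhighlight lang="python">
import numpy as np
import cv2

# Known 3D positions of four markers on the tracked object, in the object's
# own coordinate frame (metres, example layout).
object_points = np.array([[0.0, 0.0, 0.0],
                          [0.1, 0.0, 0.0],
                          [0.1, 0.1, 0.0],
                          [0.0, 0.1, 0.0]], dtype=np.float64)

# Pixel coordinates where those markers were detected in the camera image
# (placeholder values standing in for a real detection step).
image_points = np.array([[320.0, 240.0],
                         [400.0, 242.0],
                         [402.0, 320.0],
                         [322.0, 318.0]], dtype=np.float64)

# Intrinsic camera matrix (focal length and principal point, example values).
camera_matrix = np.array([[800.0, 0.0, 320.0],
                          [0.0, 800.0, 240.0],
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.zeros(5)  # assume no lens distortion

# Solve the perspective-n-point problem for the object's pose.
ok, rvec, tvec = cv2.solvePnP(object_points, image_points, camera_matrix, dist_coeffs)
if ok:
    print("Rotation vector:", rvec.ravel())
    print("Translation (camera frame, metres):", tvec.ravel())
</syntaxhighlight>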
====Tracking with visible markers====
Visible markers (Figure 2) placed in a predetermined arrangement are also used in optical tracking. The camera(s) detect the markers and their positions, which leads to the determination of the position and orientation of the object. For example, visible markers can be placed in a specific pattern around the tracking area, and an HMD with cameras would then use them to calculate its own position. The shape and size of this type of marker can vary; what is important is that they can be easily identified by the cameras.
====Markerless tracking====
Objects can also be tracked without markers if their geometry is known. With markerless tracking, the camera searches the received image and compares it with the known 3D model, looking for features such as edges or color transitions.
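A common building block of markerless approaches is detecting and matching natural image features between a reference view of the object and the live camera image. The sketch below uses OpenCV's ORB features for that matching step; the image file names are placeholders, and in a complete system the matched points would feed a pose estimator against the known 3D model.
<syntaxhighlight lang="python">
import cv2

# Load a reference view of the object and the current camera frame
# (file names are placeholders for a real capture pipeline).
reference = cv2.imread("reference_view.png", cv2.IMREAD_GRAYSCALE)
frame = cv2.imread("camera_frame.png", cv2.IMREAD_GRAYSCALE)

# Detect ORB keypoints (corners, edges, texture) and compute descriptors.
orb = cv2.ORB_create(nfeatures=500)
kp_ref, des_ref = orb.detectAndCompute(reference, None)
kp_frame, des_frame = orb.detectAndCompute(frame, None)

# Match descriptors between the two images with a brute-force Hamming matcher.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_ref, des_frame), key=lambda m: m.distance)

# The matched keypoint pairs would then be fed into a pose estimator
# (e.g. a PnP solver against the known 3D model of the object).
print(f"{len(matches)} feature matches found")
</syntaxhighlight>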
====Depth map tracking====
A depth camera uses various technologies to create a real-time map of the distances between the objects in the tracking area and the camera. Tracking is performed by extracting the object to be tracked (e.g. a hand) from the overall depth map and analyzing it. An example of a depth camera is Microsoft's Kinect.
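For example, a hand held in front of a depth camera can be separated from the scene simply because it is closer than the background. The sketch below is a toy example on a synthetic depth image with a made-up threshold: it segments the nearest region and computes its centroid, which could then be tracked from frame to frame.
<syntaxhighlight lang="python">
import numpy as np

# Synthetic 480x640 depth map in metres: background at ~2.0 m with a
# hand-sized blob at ~0.6 m (stand-in for a real depth camera frame).
depth = np.full((480, 640), 2.0)
depth[200:280, 300:380] = 0.6

# Segment everything closer than 1 metre as the tracked object.
mask = depth < 1.0

# Centroid of the segmented pixels, in image coordinates (row, column).
rows, cols = np.nonzero(mask)
centroid = (rows.mean(), cols.mean())
mean_depth = depth[mask].mean()

print(f"Object centroid (row, col): {centroid}, mean depth: {mean_depth:.2f} m")
</syntaxhighlight>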
===Sensor Fusion===
Sensor fusion is the use of more than one tracking technique to improve the detection of the position and orientation of the tracked object. By combining techniques, one method's disadvantage can be compensated for by another. An example is the combination of inertial tracking and optical tracking: the former tends to drift, while the latter is susceptible to markers being hidden (occlusion). By combining both, the position can still be estimated by the inertial trackers when the markers are occluded, and even when the optical markers are fully visible, the inertial sensors provide updates at a higher rate, improving the overall positional tracking.
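One simple way to blend a high-rate inertial estimate with a lower-rate optical measurement is a complementary filter, sketched below with made-up rates, blend factor, and readings. Production systems typically use more sophisticated estimators, such as a Kalman filter.
<syntaxhighlight lang="python">
import numpy as np

DT = 0.001        # inertial update period: 1 kHz (example value)
ALPHA = 0.98      # weight given to the inertial prediction when an optical fix arrives

position, velocity = 0.0, 0.0  # 1D state for simplicity

def fuse_step(acceleration, optical_position=None):
    """Advance the estimate by one inertial step, correcting with optics when available."""
    global position, velocity
    # High-rate prediction from the inertial measurement.
    velocity += acceleration * DT
    position += velocity * DT
    # Low-rate correction: blend toward the optical measurement when one arrives.
    if optical_position is not None:
        position = ALPHA * position + (1.0 - ALPHA) * optical_position
    return position

# Simulate 100 inertial samples with an optical fix every 10th sample.
rng = np.random.default_rng(0)
for step in range(100):
    accel = rng.normal(0.0, 0.05)                    # noisy accelerometer reading
    optical = 0.0 if step % 10 == 0 else None        # optical system sees the object at 0.0
    fuse_step(accel, optical)

print(f"Fused position estimate: {position:.4f} m")
</syntaxhighlight>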
===Oculus Rift and HTC Vive's positional tracking===
The [[Oculus Rift]]'s positional tracking is different from the one the HTC Vive uses. While the Oculus Rift uses [[Constellation]], an IR-LED array that is tracked by a camera, the HTC Vive uses Valve's [[Lighthouse]] technology, which is a laser-based system.<ref name="3" />
In the Oculus Rift, movement is limited to the sight area of the camera: when not enough LEDs are in view of the camera, the software relies on data sent by the headset's IMU sensors. With Valve's position tracking system, the tracking area is flooded with non-visible light, which the HTC Vive detects using photosensors.<ref name="3" /><ref name="9">Buckley, S. (2015). This is how Valve's amazing Lighthouse tracking technology works. Retrieved from http://gizmodo.com/this-is-how-valve-s-amazing-lighthouse-tracking-technol-1705356768</ref>
===Positional tracking and smartphones===
Positional tracking in mobile VR still struggles to achieve good accuracy, mainly because of the power needed to handle a positional tracking VR system and because using QR codes and cameras for tracking would contradict the essence of a simple, intuitive, and mobile VR experience.<ref name="3" /> Currently, mobile devices are limited by their form factor and can only track the rotational movements of a user's head. Nevertheless, companies are still investing in the development of accurate positional tracking systems for smartphones. Having such a system available to anyone with a VR-capable phone would facilitate the adoption of VR by the general public, possibly unlocking the potential of the VR market.<ref name="3" /><ref name="10">Grubb, J. (2016). Why positional tracking for mobile virtual reality is so damn hard. Retrieved from https://venturebeat.com/2016/02/24/why-positional-tracking-for-mobile-virtual-reality-is-so-damn-hard</ref>
==Types of positional tracking==
'''[[Inside-out tracking]]''' - the tracking camera is placed on the device ([[HMD]]) being tracked.
'''[[Outside-in tracking]]''' - the tracking camera(s) are placed in the external environment, with the tracked device (HMD) within their view.
'''[[Markerless tracking]]''' - a tracking system that does not use [[fiducial markers]].
'''[[Markerless inside-out tracking]]''' - combines markerless tracking with inside-out tracking.
'''[[Markerless outside-in tracking]]''' - combines markerless tracking with outside-in tracking.
==Comparison of tracking systems==
{{see also|Comparison of tracking systems}}
{{:Comparison of tracking systems}}
==References==
<references />
[[Category:Terms]] [[Category:Technical Terms]]