Michael Abrash

Michael Abrash is a game programmer and technical writer who has worked in the computer industry for 37 years. The first game he vividly remembers playing is Breakout. Later, he wrote his first game, an arcade-style title for the Vector Graphics VIP CP/M system, in FORTRAN and assembly language. His first commercially published title was Space Strike, in 1982. He subsequently collaborated with Dan Illowsky on several published titles during the 1980s, and has credits as a programmer or designer on many others over the years, including Quake and Portal 2 [1] [2].

Abrash worked as the graphics lead on Windows NT during a hiatus from game programming. He returned to games to work on Quake (1996) after joining Id Software. He lists Quake as his favorite game, and some of the technology behind that title is documented in Abrash's Ramblings in Realtime, published in Dr. Dobb's Journal [1] [3]. Afterwards, he worked as a programmer at RAD Game Tools and also on Microsoft’s Xbox. He was a developer at Valve Software from 2010 until 2014, where he researched virtual reality. Currently, he is the Chief Scientist at Oculus VR [1] [2] [4].

As a technical writer, he is the author of several books, including Zen of Assembly Language Volume 1: Knowledge (1990) and the Graphics Programming Black Book (1997) [1].

Career history

Around 1993 or 1994, Michael Abrash found the book Snow Crash on a shelf, bought it, and started reading it. It had a great impact on him: he was fascinated by the idea of the Metaverse and thought that many of the concepts explored in the book could already be implemented. He credits it as an essential inspiration for the path that would eventually lead him to work in the field of virtual reality. At the time he was working at Microsoft, and he was approached by John Carmack of Id Software, which was developing the seminal game Quake. Abrash already knew Carmack from the M&T bulletin board, where both had been learning to write 3D graphics code. During a meeting, after talking about persistent internet game servers, level building by players, and cyberspace, Carmack invited Abrash to come work for his company. Abrash was fascinated by the challenge and joined Id Software and the small team developing Quake. The game's fast development pace helped him grow as a programmer, and the title was technically groundbreaking, giving rise to a genre and a community that continue to this day [5] [6].

In 1996, two Microsoft employees, Mike Harrington and Gabe Newell, approached Id to license the Quake source code. They were leaving the company to start Valve Corporation and wanted the code to build their first game on. Id had no particular interest in licensing the code, but Abrash stepped in and facilitated the deal. It turned out to be good business for the parties involved, and Valve would later build Half-Life from that source code. He parted ways with Id Software shortly after helping with the deal, and wandered through a series of projects until joining Valve in October 2010. Upon his arrival at the new company, he expected to be handed technical work, such as visibility determination in the Source engine or fog-of-war calculation in Dota 2. After a while, he understood that what was expected of him was research into high-impact areas that no one else was working on: looking for the next platform shift. This led him to augmented reality, and he started researching it with several other people. They had a set of challenging questions and issues to explore. According to Abrash, some of these were: “what does a wearable UI look like, and how does it interact with wearable input? How does the computer know where you are and what you’re looking at? When the human visual system sees two superimposed views, one real and one virtual, what will it accept and what will it reject? To what extent is augmented reality useful – and if it’s useful, to what extent is it affordably implementable in the near future? What hardware advances are needed to enable the software?” This project was purely research and development, an initial investigation into a promising area, with no product expected in the immediate future [4] [5] [6].

The research group reached the conclusion that virtual reality was potentially more interesting than they had previously thought. Consequently, they switched over to working on VR instead of AR. This decision was influenced by the founding of Oculus VR and the success of its Kickstarter campaign, and also by the book Ready Player One, which Abrash recommended that his team read. This change of direction resulted in a VR system that could create a sense of presence only a year and a half after they started research. This effort, along with the success of Oculus VR's technology, helped resurrect virtual reality, making it the most exciting technology around and solidifying Abrash’s reputation and expertise in the VR field [6] [7]. Indeed, the two companies shared intelligence thanks to the relationship between Abrash and John Carmack, who had become Oculus' CTO (Chief Technology Officer). After roughly four years at Valve, Abrash was hired by Oculus VR as its Chief Scientist in 2014 [7].

According to Abrash, Facebook’s investment in Oculus VR represents the best-case scenario for those who want to see VR become a reality. His hiring also provided a rebuff to the idea that top VR talent wouldn’t be inclined to join Oculus after the company was bought by Facebook. The financial support from the social media giant is a big factor in Abrash’s confidence in Oculus: its resources and long-term commitment give reason to expect that the hard problems of VR will be solved. Indeed, Abrash thinks that VR is not The Next Big Platform but The Final Platform, the one that will end all other platforms [6] [7].

The future of VR

At Oculus Connect 3 in 2016, Michael Abrash gave a presentation about the near-future development of VR. While it is often easier to predict what the world will look like in 20 years than in three, the Oculus Chief Scientist has been almost spot on in his predictions, as attested by a presentation he gave some years earlier while at Valve [8] [9]. The annual developer event, and Abrash's presentation in particular, projects forward thinking and an inspirational outlook on the future of VR. During his 2016 presentation he made specific predictions about the state of VR technology in five years. In general terms, he predicted that the boundaries between virtual reality and the real world will blur thanks to advances in eye tracking, optics, and audio. He also suggested that VR will leap ahead in the near future, and that we are currently on the edge of one of the most important technological revolutions of our lifetime [8] [10] [11].

Specifically, he talked about six main points, predicting improvements that will help current VR technology become a seamless experience, indistinguishable from the real world:

  • Field of view and resolution. Displays will improve, although still not to the level of 20/20 vision. Visuals are a critical area for near-term improvement: present high-end head-mounted displays (HMDs) offer around 15 pixels per degree over roughly a 100-degree field of view, using 1080x1200 display panels. With 20/20 vision, humans can see at least a 220-degree field of view at approximately 120 pixels per degree. In five years, he predicts pixel density will increase to 30 pixels per degree and the field of view to 140 degrees, with a resolution of around 4000x4000 per eye (see the back-of-the-envelope sketch after this list). The currently fixed depth of focus of headsets will also become variable.
  • Eye tracking. There will be improvements in eye tracking technology. This is essential for estimating the position of the fovea (the part of the retina responsible for sharp detail in vision) and necessary for foveated rendering, a technique in which only the portion of the image that lands on the fovea is rendered at full quality while the rest remains at lower fidelity. This massively reduces the computational rendering requirements (a minimal sketch of the idea follows this list).
  • Audio. Head-related transfer functions (HRTFs) will enhance the realism of positional audio. While the Oculus Rift's 3D audio generates a real-time HRTF based on head tracking, it is generic across users; since HRTFs vary between individuals, personalized HRTFs should improve the audio experience. He also suggested that better modelling of reflection, diffraction, and interference should improve sound propagation.
  • Controllers and hand tracking. Physical controllers will continue to dominate. Hand-held motion devices like Oculus Touch could remain the main interaction technology for decades to come, with natural improvements in ergonomics, functionality, and accuracy. At the same time, hand tracking (without controllers or gloves) will become standard, accurate enough to represent precise hand movements in VR and to handle simple interactions such as web browsing or launching a movie. He compared this to smartphones, where touchscreens are good for casual interaction but physical buttons remain superior for typing or intense gaming.
  • Ergonomics. HMDs will become lighter by 2021, and high-end headsets will become wireless, untethered from PCs.
  • Augmented VR. Finally, he explored the potential of bringing the real world into the virtual space by scanning the real environment in real time and rendering it realistically in the headset. The user could also be placed in another scanned environment and interact with it, effectively a mixed-reality system that lets users feel as if they were anywhere on the planet. Today it is possible to create high-resolution recreations of many environments, but not in real time in a consumer product, which Abrash believes will be solved in the coming years [8] [10].
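
The display figures above can be sanity-checked with simple arithmetic. The sketch below is a back-of-the-envelope Python calculation, not anything taken from Abrash's talk itself; it assumes pixels are spread uniformly across the field of view, whereas real HMD optics concentrate pixels toward the center of the lens, which is why 2016-era headsets are quoted at roughly 15 pixels per degree rather than the naive average.

```python
# Back-of-the-envelope angular-resolution figures, assuming a uniform
# spread of pixels across the field of view (a simplification).

def pixels_per_degree(panel_pixels: int, fov_degrees: float) -> float:
    """Approximate angular pixel density along one axis."""
    return panel_pixels / fov_degrees

# 2016-era high-end HMD: 1080x1200 per eye over roughly 100 degrees.
current = pixels_per_degree(1200, 100)      # ~12 ppd by this naive average

# Abrash's five-year prediction: 4000x4000 per eye over 140 degrees.
predicted = pixels_per_degree(4000, 140)    # ~28.6, i.e. roughly 30 ppd

# Retinal target he cites: ~120 ppd over a ~220-degree field of view.
target_pixels = 120 * 220                   # 26,400 pixels along that axis

# Raw pixel count per eye grows by more than an order of magnitude,
# which is why eye tracking and foveated rendering matter so much.
growth = (4000 * 4000) / (1080 * 1200)      # ~12.3x more pixels to shade

print(f"current ~{current:.1f} ppd, predicted ~{predicted:.1f} ppd")
print(f"full retinal resolution would need ~{target_pixels} pixels per axis")
print(f"predicted panels carry ~{growth:.1f}x the pixels of today's")
```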
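
Foveated rendering, which the eye-tracking prediction enables, can be illustrated with a minimal sketch. The function below is purely illustrative: the eccentricity thresholds and shading-rate tiers are assumptions, not figures from any shipping headset, and real systems implement the idea with variable-rate shading or multi-resolution render targets rather than a per-pixel function like this.

```python
# A minimal sketch of foveated rendering: shade the region the fovea lands
# on at full quality and reduce quality with angular distance from the gaze.
# Thresholds and rates below are illustrative assumptions only.
import math

def shading_rate(pixel_dir, gaze_dir) -> float:
    """Return a relative shading rate (1.0 = full quality) for a pixel,
    given unit 3D direction vectors for the pixel and the tracked gaze."""
    dot = max(-1.0, min(1.0, sum(p * g for p, g in zip(pixel_dir, gaze_dir))))
    eccentricity = math.degrees(math.acos(dot))  # angle from the fovea
    if eccentricity < 5.0:       # roughly the foveal region
        return 1.0               # full-resolution shading
    elif eccentricity < 20.0:    # near periphery
        return 0.5               # half-rate shading
    else:                        # far periphery
        return 0.25              # quarter-rate shading

# Example: a pixel 30 degrees away from where the eye is looking.
gaze = (0.0, 0.0, 1.0)
pixel = (math.sin(math.radians(30)), 0.0, math.cos(math.radians(30)))
print(shading_rate(pixel, gaze))   # 0.25
```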

Bibliography

  • Abrash, M. (1997). Graphics Programming Black Book Special Edition. Coriolis Group Books
  • Abrash, M. (1996). Zen of Graphics Programming. Coriolis Group Books
  • Abrash, M. (1994). Zen of Code Optimization. Coriolis Group Books
  • Abrash, M. (1990). Zen of Assembly Language: Volume I. Pearson Scott Foresman
  • Abrash, M. (1989). Power Graphics Programming (Programming series). Que

Game Development

Over the years, Abrash was involved in the development of several games. Although his main role in the majority of these titles was as a programmer, he also has credits in some in which he did not take on that role.

  • Portal 2 (2011)
  • Motocross Madness 2 (2000)
  • Microsoft Flight Simulator 2000 (1999)
  • Quake: The Offering (1999)
  • Half-Life (1998)
  • DOOM 64 (1997)
  • Quake Mission Pack No. 2: Dissolution of Eternity (1997)
  • Quake Mission Pack No. 1: Scourge of Armagon (1997)
  • Final DOOM (1996)
  • Hexen: Beyond Heretic (1996)
  • Quake (1996)
  • DOOM (1995)
  • The Ultimate DOOM (1995)
  • Big Top (1983)
  • Cosmic Crusader (1982)
  • Snack Attack II (1982)
  • Space Strike (1982) [1]

References

  1. Moby Games. Michael Abrash. Retrieved from https://www.mobygames.com/developer/sheet/view/developerId,213
  2. Valve Pipeline (2013). Pipeline interviews: Michael Abrash on virtual reality & the future of gaming [video]. Retrieved from https://www.youtube.com/watch?v=l461vIVtDZM
  3. Abrash, M. (1996). Ramblings in real time. Retrieved from http://www.drdobbs.com/ramblings-in-real-time/184410037
  4. LinkedIn. Michael Abrash. Retrieved from https://www.linkedin.com/in/michael-abrash-25a01933
  5. Abrash, M. (2012). Valve: How I got here, what it’s like, and what I’m doing. Retrieved from http://blogs.valvesoftware.com/abrash/valve-how-i-got-here-what-its-like-and-what-im-doing-2/
  6. Abrash, M. (2014). Introducing Michael Abrash, Oculus Chief Scientist. Retrieved from https://www3.oculus.com/en-us/blog/introducing-michael-abrash-oculus-chief-scientist/
  7. Ingraham, N. (2014). Oculus VR hires Michael Abrash away from Valve as its new chief scientist. Retrieved from http://www.theverge.com/2014/3/28/5558286/oculus-vr-hires-michael-abrash-as-its-new-chief-scientist
  8. Langley, H. (2016). This is what virtual reality will (probably) look like in 2021. Retrieved from https://www.wareable.com/vr/michael-abrash-what-vr-will-look-like-in-2021
  9. Abrash, M. (2014). What VR could, should, and almost certainly will be within two years. Retrieved from http://media.steampowered.com/apps/abrashblog/Abrash%20Dev%20Days%202014.pdf
  10. Brennan, D. (2016). Oculus chief scientist predicts the next 5 years of VR technology. Retrieved from http://www.roadtovr.com/michael-abrash-explores-next-5-years-vr-technology/
  11. Brown, M. (2016). Michael Abrash: “No sharp line between VR and reality” in 5 years. Retrieved from https://www.inverse.com/article/21885-oculus-michael-abrash-augmented-vr-ar-facebook