
Mohamed Ismail
Optical Technologies and Architectures for Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR) Head-Mounted Displays (HMDs)
taught by Bernard C. Kress
June 7, 2018
Level: Intermediate
Length: 4 hours
Format: In-Person Lecture
Intended Audience:
Optical, mechanical and electrical engineers involved in the design and development of enterprise and consumer HMDs in all their variations. Product and project managers involved in defining current and next generation HMD products, technology product roadmaps and next generation optical sub-systems.
Description:
The course starts by providing an extensive overview of the numerous optical technologies and architectures implemented in today’s wearable display consumer products, such as smart glasses and digital eyewear, Augmented Reality (AR) and Mixed Reality (MR) headsets, and Virtual Reality (VR) and Merged Reality headsets.
The course describes the optical backbone of such head-worn systems, and the various optical sub-system building blocks are listed and analyzed. They include:
- depth mapping sensors (either through structured illumination or time of flight)
- head tracking sensors (either IMU or camera based)
- gaze tracking sensors
- display engines, including microdisplay panels, scanner-based light engines and diffractive phase panels
- optical combiners integrated either in free-space or waveguide platforms
Emphasis is put on the design and fabrication techniques that provide the best immersion and comfort to the end user, along the following guidelines:
- wearable comfort (size/weight, center of gravity)
- visual comfort (eyebox size and IPD coverage, resolution, field of view, distortion, dynamic range, stereo overlap amount)
- vergence/accommodation disparity (varifocal, multifocal, light field and holographic displays)
- foveated rendering and peripheral displays
- pupil swim and active distortion compensation
The advantages and limitations of the various optical technologies addressing such specifications are reviewed and analyzed.
More specifically, emphasis is put on defining the eyebox as a spec experienced by the user, on the subsequent eyebox replication and eyebox enlargement techniques, as well as on alternative eyebox generation techniques.
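To make the eyebox trade-off concrete, the minimal sketch below (using assumed, illustrative numbers rather than any design from the course) shows how, for a simple free-space combiner, the eyebox over which the full field of view is seen without vignetting shrinks as eye relief and FOV grow, which is what motivates eyebox replication in waveguide combiners.

```python
# Minimal sketch of the eyebox / eye-relief / FOV trade-off for a simple
# free-space combiner. All numbers are illustrative assumptions.
import math

def unvignetted_eyebox_mm(clear_aperture_mm: float,
                          eye_relief_mm: float,
                          full_fov_deg: float) -> float:
    """Approximate eyebox width over which the full FOV is visible without vignetting."""
    half_fov_rad = math.radians(full_fov_deg / 2.0)
    return clear_aperture_mm - 2.0 * eye_relief_mm * math.tan(half_fov_rad)

if __name__ == "__main__":
    # Hypothetical combiner: 25 mm clear aperture at 18 mm eye relief.
    for fov in (20.0, 40.0, 60.0):
        box = unvignetted_eyebox_mm(25.0, 18.0, fov)
        print(f"FOV {fov:4.0f} deg -> eyebox ~ {box:5.1f} mm")
```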
To design next-generation head-worn systems, one needs to fully understand the specifics and limitations of the human visual system, and design the optics and the optical architecture around them.
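As one illustration of designing around the eye (again with assumed numbers, not figures from the course), the sketch below estimates the angular resolution a display delivers in pixels per degree and compares it with the roughly 60 pixels per degree resolved by the fovea, which is why FOV and panel resolution must be traded off together.

```python
# Minimal sketch: angular resolution (pixels per degree) delivered by a display
# versus the ~60 ppd limit of human foveal acuity. Numbers are illustrative.

FOVEAL_ACUITY_PPD = 60.0  # approx. pixels per degree resolvable by the fovea

def pixels_per_degree(horizontal_pixels: int, horizontal_fov_deg: float) -> float:
    """Average angular resolution across the horizontal field of view."""
    return horizontal_pixels / horizontal_fov_deg

if __name__ == "__main__":
    # Hypothetical per-eye configurations, not actual product specs.
    configs = [("wide-FOV VR display", 1440, 100.0),
               ("narrow-FOV AR display", 1280, 30.0)]
    for name, h_pixels, h_fov in configs:
        ppd = pixels_per_degree(h_pixels, h_fov)
        print(f"{name}: {ppd:.1f} ppd ({ppd / FOVEAL_ACUITY_PPD:.0%} of foveal acuity)")
```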
The course also lists the main challenges still lying ahead for next-generation head-worn systems, where immersion and comfort need to be addressed in concert, and reviews how such demanding optomechanical specs may be met without compromising the features required to provide the user with the ultimate AR/VR experience.
Finally, the course reviews the major market analysts’ expectations for VR and AR, projected over the next 5 to 10 years, and lists the main actors (major consumer companies as well as start-ups and their current investment rounds). Demonstrations of some state-of-the-art AR, MR and VR headsets will be offered to attendees at the end of the course.
Learning Outcomes:
This course will enable you to:
- identify the various consumer and enterprise head-worn systems available in industry today, defined as smart glasses, digital eyewear, AR, MR and VR HMDs, and understand their fundamental differences and specifics
- explain the current optical technologies and sub-systems used in VR, AR and MR head-worn systems, and their advantages and limitations (such as depth mapping sensors, head tracking sensors, display engines, combiner optics, gaze trackers, etc.)
- explain the relations and implications between FOV, resolution, MTF, eyebox size, effective IPD coverage, screen door effects, pupil swim, vergence/accommodation disparity, foveated rendering, peripheral displays, etc.
- explain the limitations of current optical architectures and how they can be overcome by designing the optics around the human visual system, and gain a good understanding of the human visual system, its specifics and limitations
- explain the requirements for next-generation head-worn AR and VR systems, and review the critical enabling technologies
- describe the current AR/VR market status as well as the upcoming market expectations for each field (smart glasses, AR and VR)
Instructor(s):
Bernard C. Kress
Over the past two decades, Bernard Kress has made significant scientific contributions as an engineer, researcher, associate professor, consultant, instructor, and author. He has been instrumental in developing numerous optical sub-systems for consumer and industrial products, generating IP, teaching, and transferring technological solutions to industry. Application sectors include laser materials processing, optical anti-counterfeiting, biotech sensors, optical telecom devices, optical data storage, optical computing, optical motion sensors, digital display systems, and eventually HUD and HMD displays (smart glasses, AR/MR/VR). Bernard has been specifically involved in the fields of micro-optics, wafer-scale optics, holography and nano-photonics. He has published half a dozen books and has more than 35 patents granted. He is a short course instructor for SPIE, has chaired various SPIE conferences, has been an SPIE Fellow since 2013, and has been elected to the SPIE Board of Directors (2017-19). Bernard joined Google [X] Labs in 2011 as the Principal Optical Architect on the Google Glass project and has been, since 2015, the Partner Optical Architect at Microsoft Corp. on the HoloLens project.
Event: SPIE Photonics Europe 2018
Course Held: 22 April 2018
Issued on: June 7, 2018
Expires on: Does not expire