
60 PPD: by fast LCD but not by micro-OLED

Increasing image resolution has always been one of the challenges in VR due to the lack of a sufficient number of pixels in available displays (let's set computational power aside for now). The universal target is an angular resolution of 60 Pixels Per Degree (PPD), matching normal human vision with the naked eye. Currently, the highest resolution in a compact headset is offered only by the Apple Vision Pro, which achieves about 40 PPD thanks to 4K micro-OLED displays by Sony and the best commercially available pancake lenses.

 

The early model of the AVP visual engine is presented in the web page below, and the evolution of VR/MR visual engines is presented in the web page above. As a result of this evolution, there are today two leading visual engine approaches: (1) pancake optics based on micro-OLED and (2) pancake optics based on fast LCD. Although the resolution of commercially available fast LCDs is still below 3K, display makers are working to release 4K LCDs. For the end user the underlying display technology is invisible; what makes the difference is the feeling of immersion determined by FoV and PPD. So which approach will prevail in reaching 60 PPD and a wide FoV, as asked in the chart below?

Chart: human field of view (FoV) and visual acuity (PPD) vs VR headsets, and the 60 PPD holy grail - micro-OLED with AVP or fast LCD with HO140

The HO140 visual engine by Hypervision already provides up to a 180x130 deg configurable FoV and currently uses a 2.16K 2.56" fast LCD. The HO140 will be available soon for wide-FoV VR/MR developers - see details on the PanoVR1.RDK page.

PanoVR1 side-back view

There is a lot of enthusiasm around micro-OLED displays in the VR industry, as it is believed that such displays will enable the most compact and highest-resolution VR systems, finally achieving the holy grail of 60 PPD. The drawbacks of micro-OLED displays, such as high price, small field of view (FoV) and minimal eye-box, are often ignored. The charts below show the difference between the fast-LCD (left) and micro-OLED (right) approaches:

Chart: large eye-box margin of the pancake fast-LCD approach vs micro-OLED

There are two myths that some experts repeat about micro-OLED, and this article challenges both assumptions:

1. FoV could be increased by a bigger micro-OLED

              <= maybe not: the panel dimension is limited to about 1.4" by standard lithographic equipment

2. By making the pixels smaller and reaching 6K resolution (on the same 1.4" micro-OLED), the resolution will reach 60 PPD

              <= maybe not: at the end of the article we show that the current 40 PPD is a limit even for a 6K micro-OLED

This article was prompted by the fact that many Apple Vision Pro users noticed that the projected images look "soft", or not very sharp. A detailed technical investigation of AVP image quality has been reported by Karl Guttag. Several conclusions can be drawn from his analysis:

  1. The images projected through the AVP optics are indeed slightly blurred, and individual pixels are impossible to resolve. Meta Quest 3 provides noticeably sharper images in comparison to the AVP.

  2. One can disassemble an AVP optics module and take the display and lens apart. In some AVP modules, if the display is moved closer to the lens from its design position, the image becomes sharper. This may indicate that in those modules the virtual image was located behind the user's back, as will be explained in this technical note. It could have been done intentionally by AVP engineers, or it could be a consequence of the high sensitivity of the optics to assembly and manufacturing tolerances.

  3. One could conclude that the blurring of the image is embedded in the design in order to mask the screen-door effect. <== maybe not, continue reading ...

Considering that some people (including ourselves) experience some AVP headsets as providing sharper images and other AVP headsets as less sharp, the image blurring does not look intentional. At the very least it is not repeatable, and it can be caused either by lens aberrations, or by the position of the virtual image behind the user, or by a combination of both.

To explain this behavior we use the simplest catadioptric (pancake) lens model, which includes just two light-folding surfaces and no refractive surfaces, as shown on the right. The lens consists of two surfaces with optical power: L1 (focal length f1, the 50% mirror surface) and L2 (focal length f2, the reflective polarizer surface), separated by the distance a. The display is located at the distance b from the surface L1. If the display is placed at the back focal plane distance (BFL) from L2, meaning b = BFL - a, the virtual image is located at infinity (collimated), as shown by the green rays. If the display is moved closer to the lens, b < (BFL - a), the virtual image moves closer to the observer, as shown by the blue rays. The image is formed by divergent rays in this case. The eye can accommodate and see the image sharp as long as it is not too close to the eye.

Figure: simple catadioptric pancake model used to explain manufacturing tolerances

Finally, if the display is moved further away from the lens, b>(BFL-a), the virtual image moves behind the observer as shown by red rays. The image is formed by convergent rays. An eye can never accommodate to the convergent image and the image will always be perceived as blurred.

The focal length F is calculated according to 1/F = 1/f1 + 1/f2 - a/(f1*f2), and the back focal length BFL of a system of two thin lenses separated by the distance a is BFL = f1*(a - f2)/(a - (f1 + f2)). Using these equations, one can calculate the position of the virtual image for the system defined by the parameters a, b, f1, and f2.
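
To make the model concrete, here is a minimal Python sketch of the two-thin-lens calculation described above. The f1, f2, a and b values are illustrative placeholders, not the AVP or HO140 prescriptions (which the article does not disclose), and the nominal display position is taken here as the object-side focal distance that collimates the image under the stated sign convention.

```python
# Minimal sketch of the simplified two-surface pancake model described above.
# All numbers are illustrative placeholders, NOT AVP or HO140 design values.
#
# Sign convention: 1/v - 1/u = 1/f for each surface, distances measured along
# the light path and positive toward the eye; a real object gives u < 0.

def system_focal_length(f1: float, f2: float, a: float) -> float:
    """Combined focal length of two thin lenses: 1/F = 1/f1 + 1/f2 - a/(f1*f2)."""
    return 1.0 / (1.0 / f1 + 1.0 / f2 - a / (f1 * f2))

def virtual_image_distance(f1: float, f2: float, a: float, b: float) -> float:
    """Image distance from L2 [mm] for a display placed b mm before L1.

    Negative: virtual image on the display side, i.e. in front of the observer
              (divergent rays, the eye can accommodate) - the blue/green cases.
    Positive: rays leaving L2 converge; if they cross beyond the eye, the image
              is "behind the user" and always blurred - the red case.
    """
    v1 = 1.0 / (1.0 / f1 - 1.0 / b)   # image of the display formed by L1 (object at u1 = -b)
    u2 = v1 - a                       # that intermediate image is the object for L2
    return 1.0 / (1.0 / f2 + 1.0 / u2)

def collimating_display_distance(f1: float, f2: float, a: float) -> float:
    """Display-to-L1 distance that sends the image to infinity (the green rays)."""
    return f1 * (a - f2) / (a - (f1 + f2))

if __name__ == "__main__":
    f1, f2, a = 50.0, 40.0, 3.0              # placeholder surface powers and gap [mm]
    b0 = collimating_display_distance(f1, f2, a)
    print(f"F = {system_focal_length(f1, f2, a):.1f} mm, collimating b = {b0:.2f} mm")
    for db in (-0.2, -0.05, +0.05, +0.2):    # move the display by a fraction of a mm
        v = virtual_image_distance(f1, f2, a, b0 + db)
        side = "in front of the observer" if v < 0 else "behind the observer"
        print(f"display shift {db:+.2f} mm -> image {v:+.0f} mm from L2 ({side})")
```

Running the sketch shows the behavior described above: a fraction of a millimetre of display travel is enough to swing the virtual image from a few metres in front of the observer to behind the observer.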

Assume that the position of the virtual image is set at L0 = 2 m in front of the observer, and consider two pancake lenses: one designed for use with a micro-OLED display and one for use with a fast LCD. For the first case we take parameters close to the AVP lens; for the second case we consider a lens with a focal length of 25 mm and a hypothetical pixel size of 11 um to achieve the same PPD as the AVP. The remaining parameters of the fast-LCD based HO140 visual engine are confidential.
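
As a quick sanity check of these numbers, the sketch below converts focal length and pixel pitch into PPD and estimates how much display travel spans the whole infinity-to-2 m image range (Newtonian approximation: offset from the focal plane = F^2 / image distance). The 25 mm / 11 um fast-LCD values come from the text; the micro-OLED focal length (~17 mm) and pixel pitch (~7.5 um) are assumptions chosen only to land near the same ~40 PPD, since the article does not quote the AVP numbers.

```python
import math

def ppd(focal_length_mm: float, pixel_um: float) -> float:
    """Pixels per degree at the field centre: 1 / (angular pixel pitch in degrees)."""
    pitch_deg = math.degrees(math.atan(pixel_um * 1e-3 / focal_length_mm))
    return 1.0 / pitch_deg

def display_shift_mm(image_distance_mm: float, focal_length_mm: float) -> float:
    """Display offset from the focal plane that places the virtual image at the
    given distance (Newtonian approximation: offset = F^2 / image distance)."""
    return focal_length_mm ** 2 / image_distance_mm

if __name__ == "__main__":
    cases = {
        "fast LCD   (F = 25 mm, 11.0 um px, from the text)": (25.0, 11.0),
        "micro-OLED (F = 17 mm,  7.5 um px, ASSUMED)":       (17.0, 7.5),
    }
    for name, (F, px) in cases.items():
        print(f"{name}: {ppd(F, px):.0f} PPD, "
              f"{display_shift_mm(2000.0, F):.2f} mm of display shift spans infinity .. 2 m")
```

The quadratic dependence of the display-shift budget on F already hints at why the longer-focal-length fast-LCD lens will turn out to be less sensitive to tolerances.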

Simple pancake model: micro-OLED vs HO140 focal length comparison

Let's simulate manufacturing tolerances of +/- 0.2 mm for the a and b parameters and calculate the map of possible virtual image positions. The figure below shows such maps of the virtual image position, in mm, for the lens designed for a micro-OLED (left) and for a fast LCD (right). The blue-colored area on the maps corresponds to negative image distances, meaning that the image is behind the user and is seen as blurred. For example, one can see that if the two folding surfaces in a pancake lens end up too far from each other, the image distance becomes negative and the user sees the image as blurred.

Tolerance sensitivity, micro-OLED vs fast LCD: color maps showing where the focus falls behind infinity

Also, one can see that errors in a and b of less than 0.1 mm are enough to bring the image distance into the blue area for the micro-OLED lens. On the other hand, for the fast-LCD lens the image remains sharp for element-positioning errors of up to 0.1 mm. This lower tolerance sensitivity of the fast-LCD lens is due to its longer focal length.
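
For completeness, here is a sketch of how such a tolerance map can be generated, again with placeholder f1/f2/a values rather than the real (undisclosed) prescriptions. Note that the sign convention differs from the article's maps: there, "behind the user" is plotted as a negative distance, whereas in this sketch convergent rays show up as a positive value.

```python
import numpy as np

def image_distance(f1, f2, a, b):
    """Two-thin-lens imaging as in the earlier sketch; result measured from L2 [mm]."""
    v1 = 1.0 / (1.0 / f1 - 1.0 / b)
    return 1.0 / (1.0 / f2 + 1.0 / (v1 - a))

def tolerance_map(f1, f2, a0, b0, span_mm=0.2, n=81):
    """Virtual image distance over a grid of (da, db) errors within +/- span_mm."""
    errs = np.linspace(-span_mm, span_mm, n)
    grid = np.array([[image_distance(f1, f2, a0 + da, b0 + db) for db in errs]
                     for da in errs])
    return grid, errs

if __name__ == "__main__":
    f1, f2, a0 = 50.0, 40.0, 3.0                        # placeholder values [mm]
    b0 = f1 * (a0 - f2) / (a0 - (f1 + f2)) - 0.2        # nominal image a few metres in front
    grid, errs = tolerance_map(f1, f2, a0, b0)
    blurred = (grid > 0).mean()                         # positive = convergent = blurred
    print(f"{blurred:.0%} of the +/-0.2 mm (da, db) box puts the image behind the user")
```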

Typically, a virtual image projected by VR optics is located at a distance of 1 m to 2 m from the user. This distance is chosen by headset manufacturers as a trade-off to reduce the Vergence Accommodation Conflict. If, for example, due to manufacturing tolerances the display in a headset optics module ends up further away from the lens than the design prescribes, the virtual image moves further away from the user. In a similar way, if the distance between the light-folding surfaces of a VR pancake lens (the reflective polarizer and the half-mirror) is larger than designed, the virtual image also moves away from the user. Typically this image shift is limited, and the eye can accommodate to the new distance and still see the image sharp. However, if the lens design is too sensitive to tolerances, the virtual image may move too far - to infinity and even beyond, behind the user's back. This means that the virtual objects are located behind the user, and the light entering the user's eyes is converging. The human eye can never accommodate to see such an image sharp, and hence the image is perceived as blurred.

We performed a tolerance analysis to determine the maximal angular resolution achievable with a theoretical 6K 2.56" fast LCD vs a theoretical 6K 1.4" micro-OLED under a commercial-quality manufacturing process. We simulated (1) our HO140 visual engine (based on a fast LCD) vs (2) a reverse-engineered AVP visual engine (based on a micro-OLED), as presented below:

HO140 by Hypervision visual engine vs AVP by Apple

When performing a lens tolerance analysis, one lets the lens design parameters vary within certain limits defined by the manufacturing tolerances. At the same time, to compensate for the decreased lens performance when the design parameters deviate from the nominal ones, it is possible to introduce a compensator. A compensator is a parameter that can be adjusted to optimize the performance of a lens whose design has been disturbed by the manufacturing errors.

In our analysis we assume that eye accommodation can vary from 0.5 m to 5 m and can serve as such a compensator. Naturally, when viewing a virtual image the eye automatically tries to adjust its accommodation to obtain the sharpest possible image.
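
A toy illustration of this compensator, assuming only the 0.5 m to 5 m accommodation range stated above: the eye picks the focus distance inside that range that best matches the vergence of the (tolerance-shifted) virtual image, and whatever mismatch remains is the residual defocus. Convergent light ("image behind the user") has a vergence that no accommodation state can match. This is deliberately much simpler than the full tolerancing analysis in the detailed article.

```python
def residual_defocus_D(image_dioptric: float,
                       near_m: float = 0.5, far_m: float = 5.0) -> float:
    """Defocus [diopters] left after the eye picks its best focus in [1/far_m, 1/near_m].

    image_dioptric = 1/d, with d the virtual image distance in metres in front of
    the eye; convergent light ("image behind the user") corresponds to a negative
    value and can never be fully compensated.
    """
    best_focus = min(max(image_dioptric, 1.0 / far_m), 1.0 / near_m)  # clamp to the range
    return abs(image_dioptric - best_focus)

if __name__ == "__main__":
    for d_m in (2.0, 5.0, 50.0, float("inf"), -3.0):   # -3.0 m: image behind the user
        D = 0.0 if d_m == float("inf") else 1.0 / d_m
        print(f"image at {d_m:>6} m -> residual defocus {residual_defocus_D(D):.2f} D")
```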

We set the manufacturing errors to the values shown in the table on the right, which are tighter than the typical values for commercial optics (see, for example, lenses by Edmund Optics).

 

Commercial precision optics tolerance table

The detailed article (prepared by Shimon Grabarnik, Ph.D.) is available from us on request for discussion. It contains the following additional points:

1. Reverse-engineered AVP optics - ray tracing and RMS spot size vs 1.5 m and 2 m eye accommodation

2. Monte Carlo simulation for the AVP and HO140 with the use of the compensator over the 0-20 deg range of view angles.

3. Statistics showing that for the AVP only 50% of systems have an RMS spot size smaller than 6.4 um, and only 10% smaller than 4.8 um. For a 6K 1.4" screen the pixel is about 5 um, which is why the increased resolution would not contribute more PPD (see the pixel-pitch sketch after this list).

    As for the HO140, in 90% of systems the RMS spot radius is below the diffraction limit of 4.5 um - enough to support a 6K LCD (7.3 um pixel)

Monte Carlo RMS spot statistics: AVP optics by Apple vs HO140 by Hypervision

4. Image simulation of 3x3 RGB pixels projected through the AVP lens, possibly explaining the "soft" or blurred image.

5. Real imaging through the HO140 with a prosumer Sony camera, showing a clear distinction between R/G/B 7 um subpixels

6. AVP MTF Monte Carlo tolerancing analysis showing that more than 50% of the manufactured systems have a cut-off frequency below 40 PPD.

7. Eye contrast sensitivity analysis showing that commercial-quality optics cannot support the image quality of a 60 PPD micro-OLED

8. Comparison of brightness, color and contrast between micro-OLED and fast LCD

9. Absence of the screen-door effect for fast LCD, despite the small fill factor, at 4K and 6K resolution on a 2.56" display

    (the spatial frequency of the possible screen-door effect is beyond the sensitivity of the human eye)
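
The pixel-pitch figures quoted in point 3 can be sanity-checked with a one-line helper. The exact 6K pixel formats are not given in the article, so the resolutions below are assumptions and the results only roughly reproduce the quoted ~5 um and ~7.3 um pitches.

```python
import math

def pixel_pitch_um(diagonal_inch: float, h_px: int, v_px: int) -> float:
    """Pixel pitch [um] from the panel diagonal and pixel counts (square pixels assumed)."""
    return diagonal_inch * 25.4 * 1000.0 / math.hypot(h_px, v_px)

if __name__ == "__main__":
    # Assumed "6K" formats; the article only gives the panel sizes.
    print(f'6K 1.4"  micro-OLED (6000 x 5200 assumed): ~{pixel_pitch_um(1.4, 6000, 5200):.1f} um')
    print(f'6K 2.56" fast LCD   (6000 x 6000 assumed): ~{pixel_pitch_um(2.56, 6000, 6000):.1f} um')
```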

The summary of the detailed article is presented in the table below:

Summary comparison: AVP by Apple vs HO140 by Hypervision

Conclusion

Manufacturing errors in lens thicknesses and positions shift the virtual image, while errors in the lens surface shape (irregularities) make the image blurry even if it is in the right position. Some combination of these two effects probably results in the blurry (so-called "soft", or "screen-door-effect masking") images in some of the AVPs. The conclusion that the blurry images in the AVP are due to lens manufacturing errors is also supported by the fact that Apple is hiring (as of May 2024) a senior precision-optics manufacturing engineer for the next releases of the AVP.

Analysis of Apple's AVP senior optical manufacturing engineer job posting and the challenges Apple may be experiencing with blur in the current AVP optics

The next release of Apple's mixed reality visual engine will certainly be improved, given the attention Apple pays to optics manufacturing precision. However, we doubt that 60 PPD will ever be supported by a visual engine based on a micro-OLED display built to a quality grade below "precision optics". Given the significantly lower cost of fast LCD, together with its much larger FoV and eye-box and its ability to reach 60 PPD (plus a backlight upgrade that resolves the Vergence Accommodation Conflict and boosts brightness - to be published in Q1 2025), we believe that within several years fast LCD will be the mainstream technology in Mixed Reality.

Meanwhile, if you are a Virtual/Mixed Reality developer, we would like you to advise us on your wish-list characteristics for the ultimate VR/MR device by filling in the questionnaire on the PanoVR1 Reference Design Kit page. While the PanoVR1.RDK already supports up to a 180x130 deg FoV based on a 2.16K fast LCD (22 PPD), we plan to upgrade it with a 3.84K fast LCD in 2026 (40 PPD), and theoretically, once a 6K fast LCD becomes available, the HO140 visual engine will provide the holy-grail 60 PPD.

PanoVR1 back side view
To obtain and discuss with us the detailed article "Can 60 PPD be reached with micro-OLED displays?", please fill in the query below:

