A dispersion-driven 3D color near-eye meta-display
Chromatic dispersion, an inherent wavelength-dependent phenomenon in optical systems, has traditionally been regarded as a detrimental effect to be minimized in imaging and display. Here, we present a paradigm shift by deliberately engineering and harnessing metalens dispersion as a functional mechanism for three-dimensional (3D) near-eye displays. Specifically, we exploit lateral dispersion to transform the transverse offset between green and red objects into image-space angular separations that cause their images to intersect virtually, thereby creating color-merged 3D virtual-image perception. This meta-display architecture preserves the compactness of a conventional planar display while demanding less data and lower hardware complexity than other near-eye 3D displays. Experimentally, we demonstrate a multi-color near-eye 3D system achieving an 11° field of view, 22 pixels-per-degree angular resolution, a 0.9 m depth of field, and 19 distinct image planes. This work establishes a new pathway for metasurfaces toward visual displays and highlights their great potential for future virtual/augmented reality.
💡 Research Summary
The authors present a novel near-eye three-dimensional (3D) display that deliberately exploits chromatic dispersion rather than suppressing it. By engineering a metalens that provides wavelength-dependent lateral focal shifts while keeping the axial focal length constant, they convert transverse offsets between red and green objects into angular separations that generate depth-dependent virtual images. The metalens consists of two interleaved metasurfaces shifted by one pixel; each metasurface is composed of rotated silicon nanobricks, each imparting a geometric phase equal to twice its rotation angle. Finite-difference time-domain simulations identified optimal nanobrick dimensions (110 nm × 80 nm for 520 nm light, 200 nm × 100 nm for 660 nm light) that maximize the conversion-efficiency contrast and minimize cross-talk. The fabricated lens (10 mm focal length, 1 mm lateral separation of the red/green foci, 1 mm × 3 mm aperture) achieved measured efficiencies of 20.6 % (green) and 40 % (red), close to the theoretical values.
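The geometric-phase mechanism (a nanobrick acting as a miniature half-wave plate rotated by an angle θ imprints a phase of 2θ on circularly polarized light) can be verified with a few lines of Jones calculus. The sketch below is a generic textbook model, not the authors' design code; the handedness labels depend on sign convention:

```python
import numpy as np

def rotated_hwp(theta):
    """Jones matrix of an ideal half-wave-plate nanobrick rotated by theta."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])          # 2D rotation matrix
    return R @ np.diag([1.0, -1.0]) @ R.T    # R(theta) · HWP · R(-theta)

# Circularly polarized input (handedness is convention-dependent)
lcp = np.array([1.0, 1.0j]) / np.sqrt(2)

for theta in (0.0, np.pi / 8, np.pi / 4):
    out = rotated_hwp(theta) @ lcp
    # Output is the opposite circular polarization, carrying phase 2*theta
    phase = np.angle(out[0])
    print(f"rotation = {theta:.3f} rad -> geometric phase = {phase:.3f} rad")
```

For a rotation of π/8 the cross-polarized output picks up a phase of exactly π/4, confirming the "phase = twice the rotation angle" rule the summary cites.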
The imaging principle follows Δθ = Δx·d/f, where Δx is the object's transverse shift, d the designed lateral focal offset, and f the focal length. This angular separation causes the red and green rays to intersect at distinct virtual points after reflection from a beam splitter, producing depth perception when the eye accommodates to a particular plane. By varying Δx in integer multiples of the pixel pitch (3.6 µm), the system can theoretically generate 21 discrete image planes spanning 12 cm to 3.56 m, with non-uniform depth intervals that match human depth discrimination. In the experimental demonstration, three planes (15 cm, 25 cm, 50 cm) were realized, overlaying virtual ellipsoid, cylinder, and cone shapes onto physical panels depicting an elephant, a dog, and a lion. A camera acting as a proxy for the human eye captured sharp, color-merged images only when focused on the corresponding plane; the other planes exhibited color-dependent spatial shifts rather than conventional defocus blur.
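The pixel-pitch quantization of the depth planes can be sketched numerically. The script below is a simplified triangulation model, not the authors' code: it assumes a small-angle image shift Δθ ≈ Δx/f and an intersection depth z ≈ d/Δθ, and it omits the relay geometry (beam splitter, eye relief), so the absolute depths only roughly echo the reported 12 cm to 3.56 m span. What it does reproduce is the qualitative behavior: discrete planes whose spacing is coarse far away and dense up close.

```python
# Parameters taken from the summary above
f = 10e-3       # metalens focal length, 10 mm
d = 1e-3        # lateral separation of red/green foci, 1 mm
pitch = 3.6e-6  # display pixel pitch, 3.6 um

def plane_depth(n):
    """Virtual-image depth for a transverse offset of n pixels (toy model)."""
    dx = n * pitch
    dtheta = dx / f    # small-angle image shift (simplifying assumption)
    return d / dtheta  # triangulated intersection depth, metres

depths = [plane_depth(n) for n in range(1, 22)]  # 21 candidate planes

for n in (1, 5, 10, 21):
    print(f"offset = {n:2d} px -> virtual depth ~ {plane_depth(n):.2f} m")
```

Because z falls off as 1/n, consecutive planes are metres apart at the far end but only millimetres apart near the eye, which is consistent with the "non-uniform depth intervals that match human depth discrimination" noted above.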
Color control is achieved by independently adjusting the intensities of the red and green channels; equal intensities produce yellow, while dominance of one channel yields red or green hues. The current silicon platform absorbs strongly in the blue, limiting the system to two primary colors, but the authors note that alternative low‑loss materials (GaN, Si₃N₄, TiO₂) could enable full‑color operation.
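As a toy illustration of this two-primary color control (a hypothetical helper, not part of the authors' system), the hue at an intersection point can be modeled as a simple red/green intensity mix with the blue channel pinned at zero:

```python
def mixed_rgb(i_red, i_green):
    """8-bit RGB triple for red/green channel intensities in [0, 1];
    the silicon platform provides no blue primary, so blue stays 0."""
    return (round(255 * i_red), round(255 * i_green), 0)

print(mixed_rgb(1.0, 1.0))  # equal intensities -> yellow
print(mixed_rgb(1.0, 0.2))  # red-dominant hue
print(mixed_rgb(0.1, 1.0))  # green-dominant hue
```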
Compared with existing near‑eye 3D displays that rely on bulky optics, multiple projectors, or heavy computational rendering, this meta‑display offers an ultra‑compact form factor, modest data bandwidth, and low hardware complexity while delivering an 11° field of view, 22 pixels/degree angular resolution, 0.9 m depth of field, and up to 19 distinct image planes. The work demonstrates that engineered dispersion in metasurfaces can be a functional resource for visual display technologies, opening pathways toward lightweight, low‑cost augmented and virtual reality headsets. Future work will focus on expanding the color gamut, improving diffraction efficiency, and scaling the metasurface to larger apertures for commercial adoption.