
Optical devices: 3D without the glasses

A glasses-free three-dimensional display technology has been invented that may be an exciting alternative to current solutions for mobile devices. It makes use of an optical effect known to school physics students. See Letter p.348


NEIL A. DODGSON

We are all familiar with the glasses needed to watch three-dimensional (3D) movies. But glasses are inappropriate for many possible uses of a 3D display. There is a range of technologies that can achieve 3D without the glasses. On page 348 of this issue, Fattal et al.1 describe a new form of illumination device that could help to achieve it for mobile devices.

Humans see the world stereoscopically. That is, our two eyes see two slightly different views of the world because they are separated by about 63 millimetres (ref. 2). Our brains combine these two views into an internal 3D model of whatever we are seeing. Conventional 2D displays provide only a single flat image, so the two eyes both see the same picture on the screen. A 3D display device must instead present a different image to each eye.

Glasses-based 3D systems can work in several ways. One way, known as active shuttering, uses rapidly flickering shutters in the glasses, synchronizing them with rapid alternation of left and right images on the screen to give a different image to each eye. This is how most home 3D televisions work. A cheaper solution is passive polarization3, in which the two lenses in the glasses are polarized in different directions, with the display showing two images simultaneously, one with each polarization. This is how most 3D cinema has worked since the 1950s. And then there is the cheap and cheerful anaglyph solution: the familiar red–green glasses4, where each eye’s image is produced using filters of different colours (red and green). Anaglyph was used for the earliest experiments in 3D movies, in 1922, but was quickly found to cause headaches, hence the migration to polarized systems. However, anaglyph is still used as a cheap alternative, especially in print media.

Consider now how you could achieve 3D without using any special glasses. You would need to send a different image to each eye, but the mechanism for doing this now needs to be in the display itself. The individual pixels in the display therefore need directional control. Conventional 2D displays have pixels that send light in all directions. A 3D display needs pixels that send light in carefully constrained directions, so that different light reaches each eye and the viewer’s two eyes thus see different images on the screen. You might assume that just two constrained directions are needed: one for each eye (Fig. 1a,b). But this is not the case, because the display does not know where your eyes are and therefore it has to provide imagery across the entire viewing space, to give the correct 3D effect no matter where the eyes are placed.

Figure 1 | Autostereoscopic effects. a–c, Three examples of a 3D imaging display in which the 3D effect is encoded in the display and can be viewed with no need for glasses. a, Only two viewing zones, one for each eye. A viewer with their nose in the right place sees 3D. Anyone else sees just 2D: the same image to each eye. b, Only two views, but the zones repeat. A viewer in the right place sees 3D. A viewer in the wrong place sees pseudoscopically: the wrong image to each eye. c, Sixteen views. Everyone sees a 3D image. This is the effect of multiview autostereoscopic displays such as that proposed by Fattal and colleagues1.

[Figure 1 diagram: panels a, b and c show the screen, the optimal viewing distance, and the left-eye and right-eye viewing zones.]

The ideal solution is to have pixels that are able to display different colours in every possible direction. This can be achieved using holographic techniques but, currently, holography is practical only for still images. For updatable displays, or 3D video, we need the next best thing to holography: autostereoscopic multiview 3D display. That is, a stereoscopic display (a different picture to each eye) that works automatically (without requiring the user to wear any special device) and displays multiple different images on one display screen, each visible from particular places in front of the screen. Figure 1c shows such a display with 16 different viewing zones5. In each zone, a different image is visible on the screen. Every viewer sees in 3D (a different picture to each eye) and each viewer sees 3D from their particular position (different pictures as you move your head from side to side).
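To make the geometry of Figure 1c concrete, here is a minimal sketch in Python; the zone width and viewer position are illustrative assumptions, not values from the paper, and only the roughly 63-millimetre eye separation comes from the text. It maps a lateral eye position at the optimal viewing distance to the index of the viewing zone, and hence the image, that the eye falls into; because the zones are narrower than the eye separation, the two eyes land in different zones and so see different images.

    # Illustrative sketch: which of N repeating viewing zones does each eye fall into?
    # ZONE_WIDTH_MM and the viewer position are assumed example values, not figures
    # from Fattal et al.; they simply show why the two eyes receive different views.

    NUM_VIEWS = 16            # as in Figure 1c
    ZONE_WIDTH_MM = 40.0      # assumed width of one viewing zone at the optimal distance
    EYE_SEPARATION_MM = 63.0  # typical human interocular distance (ref. 2)

    def view_index(lateral_position_mm: float) -> int:
        """Return the index (0..NUM_VIEWS-1) of the view seen from this position.

        Zones repeat across the viewing space, so positions beyond one full set
        wrap around (the repeating-zone effect sketched in Fig. 1b,c).
        """
        zone = int(lateral_position_mm // ZONE_WIDTH_MM)
        return zone % NUM_VIEWS

    # A viewer whose head is centred 300 mm from the left edge of the viewing space:
    left_eye = 300.0 - EYE_SEPARATION_MM / 2
    right_eye = 300.0 + EYE_SEPARATION_MM / 2
    print(view_index(left_eye), view_index(right_eye))  # two different indices -> stereo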

There are several ways to achieve this for large-scale displays: optical elements such as lenticular arrays6 or parallax barriers7 in front of a liquid-crystal-display panel, multiple projectors behind a large lens system8, or rapid display of multiple images combined with complex optical switching9. Lenticular arrays are a relatively successful mechanism for desktop 3D. They have the advantage that their constituent lenticules (tiny abutting cylindrical lenses) comprise a thin sheet in front of a liquid-crystal display, making them cheap to manufacture. The disadvantage of lenticular arrays, shared with parallax barriers, is that they reduce the native resolution of the underlying display by a factor equal to the number of distinct viewing directions.
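As a rough illustration of that resolution penalty (a sketch with an assumed panel width, not figures for any particular product), a lenticular or parallax-barrier display shares its native pixel columns among all of the views it presents:

    # Rough illustration of the resolution cost of lenticular arrays and parallax
    # barriers: native pixels are shared among the views, so each view gets a fraction.
    # The 1,920-column panel is an assumed example, not a specific device.

    NATIVE_COLUMNS = 1920  # assumed horizontal resolution of the underlying panel
    for num_views in (2, 8, 16):
        per_view = NATIVE_COLUMNS // num_views
        print(f"{num_views:2d} views -> about {per_view} pixel columns per view")
    # 2 views leave 960 columns per view; 16 views leave only 120 columns per view.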

The most familiar glasses-free mobile display is the Nintendo 3DS, which uses a parallax-barrier system to create two viewing zones. With only two views, the user has to be roughly the right distance from the screen and must ensure that their nose is on the boundary between the two zones, so that each eye sees the appropriate view. The optics of these devices mean that the viewing zones are repeated as the head is moved left–right (Fig. 1b). Lenticular arrays, producing the same kind of effect, have also been used for other mobile devices.

Fattal et al. offer a different approach to producing an autostereoscopic multiview display. They use a light guide combined with diffraction gratings (see Fig. 1a,b of the paper1) to direct light in specific directions. Their demonstration models can send light in 14 distinct viewing directions, and they believe that the design can be made to produce up to 64. The implication is that their devices could be used to produce an autostereoscopic display with up to 64 distinct viewing zones. With that many zones, the inter-zone spacing could be considerably smaller than the distance between the eyes, producing a smooth 3D effect that would be similar to viewing a white-light hologram.
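To see why that many zones would feel hologram-like, here is a back-of-the-envelope sketch; the overall width of the viewing region is an assumed value, while the 63-millimetre eye separation and the 2, 14 and 64 view counts come from the text. Dividing the same viewing region into 64 zones rather than 2 makes each zone far narrower than the gap between the eyes, so a small head movement slides each eye smoothly from one view to the next.

    # Back-of-the-envelope comparison of inter-zone spacing for different view counts.
    # The 500 mm width of the viewing region is an assumption for illustration; the
    # 63 mm eye separation and the 2/14/64 view counts come from the article.

    VIEWING_REGION_MM = 500.0   # assumed lateral extent covered by one set of zones
    EYE_SEPARATION_MM = 63.0

    for num_views in (2, 14, 64):
        spacing = VIEWING_REGION_MM / num_views
        ratio = EYE_SEPARATION_MM / spacing
        print(f"{num_views:2d} views: zones ~{spacing:5.1f} mm wide "
              f"(~{ratio:.1f} zone widths between the eyes)")
    # With 64 views the zones are only a few millimetres wide, so each eye always
    # receives some view and head movement steps smoothly through neighbouring views.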

However, there are considerable challenges to overcome before we will see hologram-like displays on a mobile device. First, we need to make sure that 3D will not compromise spatial resolution. Most current mobile devices have pixels about 250 micrometres across. The new illumination system has individual directional pixels 36 micrometres across, theoretically allowing us to pack up to 48 directions into the space of a single pixel. This device thus seems to give us the desired directional control with no loss of spatial resolution. Next, the 3D effect must not compromise image quality. The example images in the paper show that there is considerable work to be done to improve quality to an acceptable level. Then, the display must have the 64 images to display. This is straightforward, although computationally expensive, with computer-generated imagery, but is a significant challenge for live action.
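The figure of up to 48 directions is consistent with a simple area argument, sketched below under the assumption that the 36-micrometre directional pixels tile the footprint of a conventional 250-micrometre pixel in both dimensions; the two sizes are taken from the article, the tiling assumption is mine.

    # Sanity check of the packing argument: how many 36-um directional pixels fit
    # into the area of one conventional ~250-um mobile-display pixel? Assumes square
    # pixels tiled in two dimensions; the 250 um and 36 um figures are from the article.

    CONVENTIONAL_PIXEL_UM = 250.0
    DIRECTIONAL_PIXEL_UM = 36.0

    per_side = CONVENTIONAL_PIXEL_UM / DIRECTIONAL_PIXEL_UM  # about 6.9 per dimension
    per_area = per_side ** 2                                  # about 48 per pixel footprint
    print(f"{per_side:.1f} per side, roughly {per_area:.0f} directions per pixel")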

Finally, the new illumination system must be manufactured reliably, robustly and in quantity. This is a matter of careful engineering that can take a long time. I am reminded of the nine-year development of Texas Instruments’ digital micromirror device, now widely used in digital projectors, between the invention in 1987 and the first commercial product in 1996.

By analogy, if the authors can solve the practical problems, then they have a compelling alternative to existing 3D display technology. All that remains is the more nebulous question of whether humans want or need 3D displays. Time will tell. ■

Neil A. Dodgson is at the Computer Laboratory, University of Cambridge, Cambridge CB3 0FD, UK.
e-mail: [email protected]

1. Fattal, D. et al. Nature 495, 348–351 (2013).
2. Dodgson, N. A. Proc. SPIE 5291, 36–46 (2004).
3. Land, E. H. US patent 2,099,694 (1937).
4. du Hauron, L. D. US patent 544,666 (1895).
5. Dodgson, N. A. Appl. Opt. 35, 1705–1710 (1996).
6. van Berkel, C. & Clarke, J. A. Proc. SPIE 3012, 179–186 (1997).
7. Konrad, J. & Halle, M. IEEE Signal Process. Mag. 24(6), 97–111 (2007).
8. Kawakita, M. et al. Proc. SPIE 8288, 82880B (2012).
9. Travis, A. R. L. Appl. Opt. 29, 4341–4343 (1990).
