Real-time generation of computer-generated hologram for 360-degree panoramic views using cylindrical object light

Ryotaro Kon, Yuji Sakamoto
Abstract

The enormous amount of time required to generate hologram data in electro-holography hinders the real-time display of holographic video. With the aim of achieving holometric video streaming, i.e., volumetric video streaming by means of holography, we propose a method that corrects cylindrical object light into planar object light using a graphics processing unit and generates holograms in real time in accordance with observer movement. We confirmed through experiments using an optical system that the proposed method enables real-time 360-deg panoramic viewing of three-dimensional video by multiple users.

1. Introduction

The use of augmented reality (AR) and virtual reality (VR) content with a head-mounted display (HMD) has been increasing. However, current three-dimensional (3D) display technology using HMDs does not satisfy some of the physiological factors by which humans perceive a stereoscopic effect, which can cause VR sickness and eyestrain.

Electro-holography, which electronically displays reconstructed images, is an ideal 3D display technology that can satisfy all physiological factors contributing to stereoscopic vision in humans. HMDs that use electro-holography are now attracting attention as next-generation HMDs that can display ideal 3D video. This type of holographic HMD is called a holo-HMD.1–3

A key problem in electro-holography is the enormous amount of time required to generate holographic video. The use of an HMD requires the real-time generation and display of video in accordance with the direction of the user's head, but calculating hologram data in real time with a holo-HMD has been difficult. Various methods for making such real-time calculations have been studied and can be broadly divided into methods of devising and speeding up algorithms4–7 and those that use high-speed calculation hardware such as a cluster machine.8 There has been much activity in achieving high-speed calculations using a graphics processing unit (GPU),9–12 which has made it possible to generate hologram data of simple objects in real time. In practical terms, the display of realistic reconstructed images that include complex objects and hidden surface removal is important,13–16 but generating such images in real time has not yet been achieved. Supercomputers that can execute even faster calculations have been considered, but using such expensive equipment for a single HMD is unrealistic. In addition, the amount of hologram data is considerable, and many problems remain even in terms of compression.

Transmitting hologram data that differ among multiple users can require an enormous amount of communication if each holo-HMD is served over a 1:1 connection. Holometric video streaming has been proposed to solve these problems.17 In this technique, object light data, which form the basis of hologram data, are calculated on a large-scale computer, delivered to users simultaneously in a broadcast format, and used on each user's side to generate hologram data for any viewpoint in accordance with the user's viewing direction. Building on these ideas, a method has been proposed that assumes a 360-deg cylindrical surface enclosing the target object and broadcasts the object light on that surface.18 This method enables the high-speed generation of hologram data for holo-HMD use from the broadcast cylindrical object light in accordance with the user's position, so that the object can be observed from 360 deg. This method, however, cannot provide 360-deg images (360-deg panoramic views) of the user's surroundings and therefore cannot provide holographic video within an immersive environment.

We propose a holometric video-streaming method that enables a user with a holo-HMD to obtain holographic video within an immersive environment. Our method converts cylindrical object light, calculated from objects outside a cylinder surrounding the user, into planar object light.

2. Related Research

2.1. Holometric Video Streaming

Volumetric video streaming has been studied as a way to capture a target object using multiple cameras or a range camera to obtain 3D information and transmit that information to a user on the receiving side wearing a device such as an HMD, enabling free-viewpoint viewing. This technology is expected to find use in a variety of fields, such as entertainment and medical care.19 In contrast, holometric video streaming records and transmits all light waves (object light) emitted from an object, in place of the object's 3D information, to enable viewing of 3D video. There are two holometric video-streaming methods for measuring the light waves (object light) reflected off a 3D object:

1. Digital holography, which directly measures light waves from a real object.

2. The computer-generated hologram (CGH) method, which captures an object using multiple cameras or a range camera to obtain 3D information and uses those data to calculate the reflected light waves through computer simulation.

With digital holography, measurements can only be done in a dark room into which outside light cannot enter, which is a major limitation. The CGH method, in contrast, can be used under natural light, but an enormous amount of calculation is needed to generate the object light. The authors of Ref. 18 therefore proposed a method for calculating the object light (the most time-consuming step) on a high-performance, large-scale server or computer cluster, transmitting the calculated object light data using high-speed broadcast technology, such as 5G, and using the transmitted data to generate a free-viewpoint hologram on the client side consisting, for example, of an HMD (Fig. 1). Since this method transmits object light data instead of a generated hologram, one advantage is that it can display different 3D videos for multiple users more efficiently than calculating and transmitting a hologram frame-by-frame for each viewpoint. The shape of the surface on which the object light is transmitted can be freely determined, and an ordinary hologram can be calculated using a planar shape matched to the device displaying the hologram. In addition to the planar shape20 used for the object light in holometric video streaming, calculation using a cylindrical shape has been proposed, the details of which are given in the following section.

Fig. 1 Overview of holometric video streaming.

The range of movement available to a user wearing an HMD is described by degrees of freedom (DoF). Commonly used HMDs provide 3- or 6-DoF.21 As shown in Fig. 2 [in this paper, we use a left-handed coordinate system, which is often used in computer graphics (CG)], 3-DoF corresponds to three types of head rotation, pitch, yaw, and roll, which correspond to the vertical movement of the neck, the sideways movement of the neck, and the sideways tilting of the neck, respectively. 6-DoF adds forward/backward, left/right, and up/down movement of the user to these rotations. Displaying 3D video corresponding to these movements can heighten the observer's sense of immersion. Our proposed method enables observation corresponding to 3-DoF (a small code sketch of these rotations follows Fig. 2).

Fig. 2 (a) 3- and (b) 6-DoF movements.
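For concreteness, the following minimal Python sketch builds the combined 3-DoF rotation matrix used to orient the hologram plane for a given head pose. It is an illustration, not the paper's code: the axis assignments follow the pitch/yaw/roll description above, and the composition order is our assumption.

```python
import numpy as np

def head_rotation(pitch, yaw, roll):
    """Combined 3-DoF rotation matrix (angles in radians).

    Pitch rotates about the x-axis, yaw about the y-axis, and roll about
    the z-axis of the left-handed coordinate system; the composition
    order Rz @ Ry @ Rx is an assumption for illustration.
    """
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cr, sr = np.cos(roll), np.sin(roll)
    Rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])  # pitch
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])  # yaw
    Rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])  # roll
    return Rz @ Ry @ Rx

# Example: orient the hologram plane for a yaw of +1 deg
R = head_rotation(0.0, np.deg2rad(1.0), 0.0)
```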

2.2. Conventional Method (Viewing of Object from 360 deg)

Reference 18 can be cited as a conventional method that uses cylindrical object light. As shown in Fig. 3, the authors assumed that an object is arranged near the center of a cylinder and is viewed from any position outside the cylinder. Given that this method uses cylindrical object light, the planar display mounted on ordinary HMDs cannot be used directly. It is therefore necessary to generate planar object light from the cylindrical object light through correction calculations.

Fig. 3 Overview of conventional method.

This conventional method converts cylindrical object light into planar object light using an approximation in which light waves propagate from a point at the center of a cylinder. It is assumed that the object and a planar-object-light surface are small compared with the radius of the cylinder and that the planar-object-light surface is situated near the cylinder. The wavefront is therefore considered to propagate radially from the center of the object. Based on this approximation, the point corresponding to the light wave of a single pixel on the planar object light lies on a line connected to the center of the object, and the wave at the point where that line intersects the cylindrical surface propagates outward. As shown in Fig. 4, the light wave of a pixel P on the planar object light is propagated from the light wave at a single point C where the straight line connecting the origin O and point P intersects with the cylinder. The propagation calculations are conducted using the path difference l between P and C. By denoting the complex amplitude of the cylindrical object light as Oc, and the wave number as k, planar object light Op can be determined as

Eq. (1)

$$O_p = O_c \exp(jkl).$$

Fig. 4 Overview of correction calculations (conventional method).

3. Proposed Method

3.1. Overview

As shown in Fig. 5, our method situates the user at the center of the cylinder and has the user observe an object defined outside the cylinder. The viewpoint of the proposed method is opposite that of the conventional method. Compared with the conventional method, our method has the advantage of enabling the arrangement and observation of objects 360 deg around the user, making it easy for the user to feel a sense of immersion.

Fig. 5 Overview of the proposed method.

Calculations of Oc are conducted using the point-light method. Although there are other methods that use, for example, the fast Fourier transform to conduct object-light calculations at high speed,22–26 they are still not fast enough for real-time calculation.

For the environment envisioned in our study, however, the plan is to conduct the object-light calculations on a large-scale, high-performance server, so we used the point-light method27 to reconstruct detailed objects and thus enhance the user's sense of immersion. We calculated Oc by breaking down a CG model generated with polygons into a point cloud and applying the point-light method to that cloud. Denoting the cylindrical coordinates as $(x_c, y_c, z_c)$, the point-light-source coordinates as $(x_l, y_l, z_l)$, the amplitude as $a_i$, the light wavelength as $\lambda$, and the initial phase of the point-light source as $\phi_i$, the complex amplitude distribution $u_i$ on the cylindrical surface emitted from the source can be calculated as

Eq. (2)

$$u_i(x_c, y_c, z_c) = \frac{a_i}{r_i}\exp\{j(k r_i + \phi_i)\}, \qquad r_i = \sqrt{(x_l - x_c)^2 + (y_l - y_c)^2 + (z_l - z_c)^2}.$$

When the virtual object is defined as N point-light sources, the cylindrical object light $O_c(x_c, y_c, z_c)$ at any point on the cylindrical surface can be expressed as

Eq. (3)

$$O_c(x_c, y_c, z_c) = \sum_{i=1}^{N} u_i(x_c, y_c, z_c).$$
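A minimal NumPy sketch of Eqs. (2) and (3) is given below. It samples the cylindrical surface on a regular angle-height grid and accumulates the spherical wave of each point-light source; the function name, argument names, and sampling scheme are illustrative assumptions, and the zone-plate limitation described later in this section is omitted for brevity.

```python
import numpy as np

def cylindrical_object_light(sources, R, n_theta, n_y, height, wavelength):
    """Accumulate spherical waves from point-light sources on a sampled
    cylinder of radius R whose axis lies along y.

    `sources` is an iterable of (x_l, y_l, z_l, amplitude, initial_phase).
    """
    k = 2.0 * np.pi / wavelength
    theta = np.linspace(0.0, 2.0 * np.pi, n_theta, endpoint=False)
    y = np.linspace(-height / 2.0, height / 2.0, n_y)
    T, Y = np.meshgrid(theta, y)                      # sample grid on the cylinder
    xc, yc, zc = R * np.cos(T), Y, R * np.sin(T)
    Oc = np.zeros_like(T, dtype=np.complex128)
    for x_l, y_l, z_l, a, phi in sources:
        r = np.sqrt((x_l - xc)**2 + (y_l - yc)**2 + (z_l - zc)**2)
        Oc += (a / r) * np.exp(1j * (k * r + phi))    # Eq. (2), summed as in Eq. (3)
    return Oc
```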

In the conventional method, light waves are assumed to be emitted radially from the center of the cylinder. Our proposed method, however, must allow the object to be placed at any position outside the cylinder, so the object wave cannot be assumed to be emitted from the center of the cylinder. We therefore introduced a "phase-correction point," which is treated as the point from which light waves are emitted in the correction calculations for Op. With this concept, the correction calculation used in the conventional method can also be used in our method. The phase-correction point is basically set at the center of the object, but when two or more objects are to be displayed on the screen, it is set at the center of the generated hologram (viewpoint direction). We tested through experiments how the reconstructed image changes with the position of the phase-correction point relative to the target objects. When converting from Oc to Op, the correction calculations are conducted using the path difference l, the same as in Ref. 18. Op can be generated after determining where the line connecting the phase-correction point with each pixel $(x_p, y_p, z_p)$ on the plane, whose position and rotation are set by the viewpoint, intersects the cylinder at $(x_c, y_c, z_c)$ (Fig. 6). From the positional relationship among the phase-correction point, Op, and Oc, the Op at any coordinate can then be calculated from Oc and l using the following equation (a code sketch of this correction follows Fig. 6):

Eq. (4)

$$O_p = O_c \exp(jkl).$$

Fig. 6 Overview of correction calculation for surrounding object.
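The following Python sketch makes the geometry of Fig. 6 and Eq. (4) concrete: for each hologram pixel P, the line from the phase-correction point q through P is intersected with the cylinder, and the cylindrical object light at that crossing is propagated over the path difference l. This is an illustration under our own conventions (cylinder axis along y), not the authors' implementation; `sample_oc`, which returns Oc at a 3D point on the cylinder (e.g., by nearest-neighbor lookup), is an assumed helper.

```python
import numpy as np

def correct_to_planar(plane_pixels, q, R, wavelength, sample_oc):
    """Convert cylindrical object light to planar object light, Eq. (4)."""
    k = 2.0 * np.pi / wavelength
    Op = np.empty(len(plane_pixels), dtype=np.complex128)
    for i, p in enumerate(plane_pixels):               # p, q: 3D points (x, y, z)
        d = (p - q) / np.linalg.norm(p - q)            # ray direction q -> P
        # Solve (q_x + t*d_x)^2 + (q_z + t*d_z)^2 = R^2 for t.
        a = d[0]**2 + d[2]**2
        b = 2.0 * (q[0] * d[0] + q[2] * d[2])
        c = q[0]**2 + q[2]**2 - R**2
        t = (-b - np.sqrt(b**2 - 4.0 * a * c)) / (2.0 * a)  # crossing nearer q
        C = q + t * d                                  # point on the cylinder
        l = np.linalg.norm(p - C)                      # path difference
        Op[i] = sample_oc(C) * np.exp(1j * k * l)      # Eq. (4)
    return Op
```

The per-pixel loop is exactly what is parallelized on the GPU in Sec. 3.2, since the correction of each pixel is independent of all others.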

Since the maximum spatial frequency of the object light sampled on the cylindrical surface is determined by the sampling theorem, there is a limit on the size of the zone plate, called the zone-plate limitation.28 When calculating object light on the cylinder, the zone-plate limitation must be applied to prevent high-order diffracted images that would hinder observation. Denoting a certain pixel on the cylinder as $p_i$, its neighboring pixel as $p_{i+1}$, and the distances between those pixels and the point-light source as $r_i$ and $r_{i+1}$, respectively, high-order diffracted images can be prevented by not conducting any object-light calculations in the range where the difference between $r_i$ and $r_{i+1}$ exceeds $\lambda/2$ (Fig. 7). As a result of this zone-plate limitation, the area over which object light is recorded on the surface of the cylinder is limited. For simplicity, we consider points on the z-axis. We denote the coordinates of the point-light source as $(0, 0, z_l)$ and those of the point on the cylinder used for correction as $(x_A, 0, z_A)$. The distance L between these two points can be expressed as

Eq. (5)

$$L = \sqrt{(z_l - z_A)^2 + x_A^2}.$$

Fig. 7 Zone-plate-limitation diagram.

If we assume that x is small compared with $z_l$ and z, the path difference l in the horizontal direction can be given as the product of the directional derivative of L and the pixel pitch p as follows:

Eq. (6)

$$\frac{\partial L}{\partial x}\,p = \frac{x_A}{z_l - z_A}\,p,$$
where z changes in accordance with the position on the surface of the cylinder as follows:

Eq. (7)

$$z = \sqrt{R^2 - x^2}.$$

Since the corresponding contribution to l is expressed by multiplying $\partial z/\partial x$ by p, and assuming that $R \gg x$, $\partial z/\partial x$ can be approximated as

Eq. (8)

$$\frac{\partial z}{\partial x} \approx \frac{x_A}{R}.$$

Therefore, the difference in the distance between the point-light source and each of these pixels can be calculated as

Eq. (9)

$$r_{p_{i+1}} - r_{p_i} = \frac{x_A}{z_l - z_A}\,p + \frac{x_A}{R}\,p.$$

The zone-plate limitation when calculating object light on a planar shape covers the range in which the first term of Eq. (9) does not exceed $\lambda/2$; the range in which recording can be executed on a cylindrical shape is therefore slightly smaller. The half-width $x_{\max 1/2}$ of the zone-plate-limited area can be expressed as

Eq. (10)

$$x_{\max 1/2} = \frac{\lambda}{2p}\left(\frac{R(z_l - z_A)}{z_l - z_A + R}\right).$$
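The half-width of Eq. (10) is simple to evaluate; the sketch below is a direct transcription, with the function and argument names as our assumptions rather than the authors' code.

```python
def x_max_half(wavelength, p, R, z_l, z_A):
    """Eq. (10): half-width of the zone-plate-limited recording area on
    the cylinder for a single point-light source, where p is the pixel
    pitch on the cylinder, R the cylinder radius, and z_l, z_A the depths
    of the source and the cylinder point."""
    dz = z_l - z_A
    return wavelength / (2.0 * p) * (R * dz) / (dz + R)
```

Note that as R grows, the factor R(z_l - z_A)/(z_l - z_A + R) approaches z_l - z_A, recovering the planar zone-plate limit given by the first term of Eq. (9) alone.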

On the basis of the above, we can consider the field of view (FOV), i.e., the maximum definable size of an object, referring to Fig. 8. Denoting the pixel pitch on the hologram surface as d, the maximum electro-holographic diffraction angle $\theta$ can be expressed as

Eq. (11)

$$\theta = \sin^{-1}\frac{\lambda}{2d}.$$

Fig. 8 Field of view.

From this, we obtain the following for $H_{\max 1/2}$ using the plane depth $z_p$ and $\theta$:

Eq. (12)

$$H_{\max 1/2} \approx \theta z_p \approx \frac{\lambda z_p}{2d}.$$

The $Q_x$ in the figure can be expressed as follows from the equation for the zone-plate limitation on the plane:

Eq. (13)

$$Q_x = \frac{\lambda}{2d}(z_l - z_p).$$

The maximum FOV $V_{\max 1/2}$ can therefore be expressed as

Eq. (14)

$$V_{\max 1/2} = Q_x + H_{\max 1/2} = \frac{\lambda z_l}{2d}.$$

When calculating Op from Oc, $H_{\max 1/2}$ must be taken into account. If the range to be corrected exceeds $H_{\max 1/2}$, high-order diffracted images will be generated, so processing that omits the correction calculations beyond this range is needed.
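Equations (11) through (14) chain together directly; the following sketch (names are our assumptions) evaluates them for a given pixel pitch and source depth.

```python
import numpy as np

def fov_half(wavelength, d, z_l, z_p):
    """Half-widths of the viewable region for a point source at depth z_l
    seen through a hologram of pixel pitch d, with the object-light plane
    at depth z_p (a sketch of Eqs. (11)-(14))."""
    theta = np.arcsin(wavelength / (2.0 * d))     # Eq. (11), diffraction angle
    H = theta * z_p                               # Eq. (12), small-angle form
    Q = wavelength / (2.0 * d) * (z_l - z_p)      # Eq. (13)
    V = Q + H                                     # Eq. (14), ~ lambda * z_l / (2 d)
    return theta, H, Q, V
```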

3.2. Implementation

To directly calculate object light on a hologram plane from point-light sources, the calculation order is O(NXY), where X denotes the number of pixels in the horizontal direction on the hologram plane and Y the number in the vertical direction. Calculations for a complex object having many hologram pixels and a large number of point-light sources therefore incur extremely high computational complexity. In contrast, the proposed method involves only processing on a 2D plane between the cylindrical surface and the hologram surface, which is unrelated to N. The time needed to generate the hologram is therefore short, and the calculation order is O(XY). While this calculation order is sufficiently low, calculations using a central processing unit (CPU) had not reached a real-time level (30 fps) according to Ref. 18.
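A back-of-the-envelope count makes the gap concrete (illustrative arithmetic only, using the resolution and the largest point count that appear in Sec. 4):

```python
X, Y = 1920, 1080          # hologram (SLM) resolution from Sec. 4.1
N = 50_000                 # point-light sources, largest case in Table 1
direct = N * X * Y         # O(NXY): ~1.0e11 spherical-wave evaluations
proposed = X * Y           # O(XY):  ~2.1e6 per-pixel corrections
print(direct // proposed)  # = N: the correction step is independent of N
```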

With the proposed method, we aim to conduct the correction calculations of Oc at high speed in real time using a GPU, which features many calculation cores enabling massively parallel computing. Since the proposed method enables the object light at each pixel on the hologram surface to be calculated independently, we can achieve high-speed processing through parallel computing in units of pixels. In the experiments described below, the calculations were conducted using CUDA (Compute Unified Device Architecture), NVIDIA's parallel computing platform for its GPUs.

4. Experiments

4.1. Experimental Equipment

We conducted an experiment using an optical system to demonstrate the effectiveness of the proposed method.

Figures 9 and 10 show a photo and a block diagram of the electro-holography system with a 4f optical system, in which the spatial light modulator (SLM) displaying the hologram has 1920×1080 pixels (W×H), the pixel pitch is 8×8 μm, the optical wavelength is 512 nm, and the cylinder radius is 0.25 m. The pixel pitches of Op and Oc are the same. Rotation was taken to be counterclockwise, with the positive direction of the x-axis at 0 deg. With respect to the range of correction calculations described in Sec. 3.1 [Eq. (12)], $x_{\max 1/2}$ per single point-light source in Eq. (10) was 0.004 m for the parameters of this experiment. Calculations never needed to be conducted in a range exceeding $H_{\max 1/2} = 0.008$ m, so no special processing was required.
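As a quick sanity check, the stated limits follow from Eqs. (10) and (12) with this section's numbers. The depths used below, z_l - z_A = 0.25 m and z_p = 0.25 m, are our assumptions based on the 0.50-m object distance of Sec. 4.2 and the 0.25-m cylinder radius.

```python
lam, pitch = 512e-9, 8e-6   # wavelength and pixel pitch (this section)
R = 0.25                    # cylinder radius (m)
dz = 0.25                   # z_l - z_A (assumed: 0.50 m object, 0.25 m cylinder)
x_max = lam / (2 * pitch) * R * dz / (dz + R)   # Eq. (10)
H_max = lam / (2 * pitch) * 0.25                # Eq. (12), z_p = 0.25 m (assumed)
print(x_max, H_max)         # -> ~0.004 m and ~0.008 m, matching the text
```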

Fig. 9 Optical system layout.

Fig. 10 Schematic of optical system.

4.2. 360-Deg Panoramic Views

We determined whether Op corresponding to the roll, pitch, and yaw rotations could be generated using the proposed method. As shown in Fig. 11, a teapot was placed 0.5 m from the origin in the z direction, and the phase-correction point was defined at the center of the teapot. The Oc with respect to the plane tangent to the cylinder was corrected, and Op was generated. Reconstructed images were captured for a roll rotation of ±45 deg and for pitch and yaw rotations of ±1 deg each from this plane.

Fig. 11 Definition of object.

The results are shown in Fig. 12. Figure 12(a) shows the reconstructed image obtained by conducting the correction calculations with respect to the plane along the cylinder, and Figs. 12(b)–12(d) show the reconstructed images for the pitch, yaw, and roll rotations, respectively. Compared with the reconstructed image corresponding to the plane tangent to the cylinder (original position), the position and orientation of these reconstructed images change in accordance with each type of rotation. As shown in Fig. 13, we then placed a rabbit, a dragon, and a teapot at a distance of 0.50 m from the origin at 1-deg intervals and generated holograms after rotating the plane tangent to the cylinder. The results are shown in Fig. 14.

Fig. 12 Reconstructed images rotated in pitch, yaw, and roll. (a) Original position, (b) pitch ±1 deg, (c) yaw ±1 deg, and (d) roll ±45 deg.

Fig. 13 Arrangement of objects.

Fig. 14 Reconstructed images of panoramic view. (a) 91.2 deg, (b) 91.3 deg, (c) 91.4 deg, (d) 91.5 deg, (e) 91.6 deg, (f) 91.7 deg, (g) 91.8 deg, (h) 91.9 deg, (i) 92.0 deg, (j) 92.1 deg, (k) 92.2 deg, (l) 92.3 deg, (m) 92.4 deg, (n) 92.5 deg, (o) 92.6 deg, (p) 92.7 deg, (q) 92.8 deg, (r) 92.9 deg, and (s) 93.0 deg (Video 1, MP4, 8.03 MB [URL: https://doi.org/10.1117/1.OE.63.6.065101.s1]).

The phase-correction point was set 0.50 m from the origin, the same as the distance to the objects, but at a position corresponding to the viewpoint direction rather than at the center of an object. The plane was rotated about its center in increments of 0.1 deg. The defined objects could be observed seamlessly with the proposed method. On this basis, a user's surroundings can be observed in the horizontal direction with a planar device using Oc. These results confirm that a hologram can be generated on the user's side in accordance with the user's head rotation. Since different reconstructed images can be observed from the same Oc, it should be possible to obtain 3-DoF reconstructed images for each user in the manner of holometric video streaming.

4.3. Testing of Reconstructed Images

We also conducted experiments to examine the change in the reconstructed images when the phase-correction point cannot be placed at the center of an object. In the following two experiments, the center of the teapot was defined to be at the position (0, 0, 0.50). We first examined the results of changing the phase-correction point in the depth direction while fixing its x and y coordinates at those of the object's center. Specifically, we compared the reconstructed image obtained from direct calculation of Op with the reconstructed images obtained by moving the phase-correction point away from the origin in intervals of 0.5 m (Fig. 15).

Fig. 15 Depth change of phase-correction point.

The results are shown in Fig. 16. Figure 16(a) shows the reconstructed image from a hologram obtained by direct calculation of Op with no correction calculations, and Figs. 16(b)–16(i) show the reconstructed images of holograms from Op generated by the correction calculations. The phase-correction point was defined at the center of the object in Fig. 16(b) and was moved deeper in +0.5-m intervals in Figs. 16(c)–16(i). When the position of the phase-correction point was changed from the center of the object up to +1.5 m away, no major changes were observed compared with the reconstructed image obtained from direct calculation of Op. However, the brightness of the reconstructed image began to drop once the offset of the phase-correction point exceeded +2.0 m, and the reconstructed images became increasingly blurry from that point on. These results indicate the need to place the phase-correction point within +1.5 m of the objects when defining multiple objects.

Fig. 16 Changes in reconstructed images in accordance with depth of phase-correction point. (a) Direct calculation of planar object light, (b) (0, 0, 0.50), (c) (0, 0, 1.00), (d) (0, 0, 1.50), (e) (0, 0, 2.00), (f) (0, 0, 2.50), (g) (0, 0, 3.00), (h) (0, 0, 3.50), and (i) (0, 0, 4.00).

Next, we examined the results of changing the phase-correction point in the horizontal and vertical directions. The SLM size was ∼0.016×0.008 m, and following the concept of the phase-correction point described in Sec. 3.1, correction was carried out from the vicinity of the FOV. The phase-correction points were therefore set at the four corners of a 0.004-m square and of a 0.008-m square, as shown in Fig. 17, and the changes in the reconstructed image were tested under these conditions. The position of the plane was defined to be 0.25 m from the origin.

Fig. 17 Horizontal and vertical change in phase-correction point.

The results are shown in Fig. 18. Figure 18(a) shows the reconstructed image of a hologram obtained from direct calculation of Op with no correction calculations, and Figs. 18(b)–18(j) show the reconstructed images of holograms from Op generated by the correction calculations. Similar to the results obtained for changes in the depth direction, changes in brightness can be observed in accordance with the position of the phase-correction point, but no major changes were observed with respect to the reconstructed image obtained from direct calculation of Op. These results indicate that degradation of the reconstructed-image quality can be suppressed if the phase-correction point is in the vicinity of the FOV, so it is appropriate to define it at the center of the object being displayed or at the center of the FOV.

Fig. 18 Changes in reconstructed images in accordance with position of phase-correction point. (a) Direct calculation of planar object light, (b) (0, 0, 0.5), (c) (−0.008, 0.008, 0.5), (d) (0.008, 0.008, 0.5), (e) (−0.004, 0.004, 0.5), (f) (0.004, 0.004, 0.5), (g) (−0.004, −0.004, 0.5), (h) (0.004, −0.004, 0.5), (i) (−0.008, −0.008, 0.5), and (j) (0.008, −0.008, 0.5).

4.4. Evaluation of Depth Expression

As shown in Fig. 19(a), we defined two stars with a depth difference of 0.1 m, calculated Oc, conducted the correction calculations with respect to the plane tangent to the cylinder to generate holograms, and examined the reconstructed images. The results of this experiment are shown in Figs. 19(b) and 19(c). Figure 19(b) shows the reconstructed image when the camera was focused on the front star, and Fig. 19(c) shows the reconstructed image when the camera was focused on the back star. When changing the camera's focal length in this way, one of the objects was blurry while the other was in focus. These results indicate that depth expression can be obtained even for holograms created from Op generated by the correction calculations.

Fig. 19 Depth expression: (a) arrangement of objects, (b) reconstructed image focused on 0.5 m, and (c) on 0.6 m.

4.5. Computation Time

Finally, we measured the computation time for generating Op with the proposed method. We compared the computation times of direct calculation of Op using a GPU, the proposed method with a CPU, and the proposed method with a GPU. The CPU was an Intel® Core™ i7-8700K (3.20 GHz) with 16.0 GB of memory, the operating system was Windows 10 Pro 64-bit, and the GPU was an NVIDIA GeForce GTX 1080 with 8 GB of memory. For each scenario, we took the average of five computation-time measurements. The results are shown in Fig. 20.

Fig. 20 Comparison of computation times.

As discussed in Sec. 3.2, when directly calculating Op, the computation time increased in proportion to the number of point-light sources, but with the proposed method, the computation time stayed nearly constant since the correction calculations are conducted regardless of the number of point-light sources. While the calculations took ∼170 ms using the CPU, they took ∼6 ms using the GPU, meaning that the calculations were ∼28 times faster. Table 1 lists the measured computation times, which are consistent with the theoretical considerations. The frame rate with the CPU was ∼6 fps, but that with the GPU was ∼166 fps, achieving the target for real-time calculation (30 fps). These results indicate that the correction calculations for generating Op from Oc can be conducted in real time.
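The reported rates follow directly from the measured times (simple arithmetic on the approximate values from Table 1):

```python
cpu_ms, gpu_ms = 170.0, 6.0              # approximate per-frame times from Table 1
print(cpu_ms / gpu_ms)                   # ~28x speedup
print(1000.0 / cpu_ms, 1000.0 / gpu_ms)  # ~5.9 fps (CPU) vs ~166.7 fps (GPU)
```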

Table 1

Computation times.

Number of point-light sources | Direct calculation of Op (ms) | Proposed method, CPU (ms) | Proposed method, GPU (ms)
1000 | 2072.06 | 171.12 | 5.68
5000 | 10,732.02 | 174.56 | 5.44
10,000 | 22,289.74 | 170.12 | 5.58
50,000 | 112,891.4 | 167.76 | 6.27

5. Conclusion

We proposed a method for generating 360-deg panoramic views of holographic images in real time as a step toward the practical implementation of holometric video streaming. The proposed method enables the observation of reconstructed images that accompany head rotation and differ among multiple users by calculating planar object light from cylindrical object light through correction calculations. It was confirmed through optical experiments that reconstructed images corresponding to head rotation could be displayed using these correction calculations. It was also shown that the amount of calculation in our method is small and that a frame rate of 166 fps could be achieved using a GPU. By transmitting cylindrical object light, conducting the correction calculations in real time on an HMD to display 3D video, and enabling 360-deg observation of an object in combination with the method of Ref. 18, we anticipate the creation of systems that can provide the same VR experience as those commonly used today.

Disclosures

There are no potential conflicts of interest, financial or otherwise, identified for this study.

Code and Data Availability

The data that support the findings of this work are available from the corresponding author upon reasonable request.

Acknowledgments

These research results were obtained from commissioned research (Grant No. PJ012368C06801) funded by the National Institute of Information and Communications Technology (NICT), Japan.

References

1. A. Maimone, A. Georgiou, and J. S. Kollin, "Holographic near-eye displays for virtual and augmented reality," ACM Trans. Graphics 36(4), 1–16 (2017). https://doi.org/10.1145/3072959.3073624

2. T. Yoneyama et al., "Holographic head-mounted display with correct accommodation and vergence stimuli," Opt. Eng. 57(6), 061619 (2018). https://doi.org/10.1117/1.OE.57.6.061619

3. C. Chang et al., "Toward the next-generation VR/AR," Optica 7(11), 1563–1578 (2020). https://doi.org/10.1364/OPTICA.406004

4. H. Sakata and Y. Sakamoto, "Fast computation method for a Fresnel hologram using three-dimensional affine transformations in real space," Appl. Opt. 48(34), H212–H221 (2009). https://doi.org/10.1364/AO.48.00H212

5. E. Zschau et al., "Generation, encoding, and presentation of content on holographic displays in real time," Proc. SPIE 7690, 76900E (2010). https://doi.org/10.1117/12.851015

6. C. Gao et al., "Accurate compressed look up table method for CGH in 3D holographic display," Opt. Express 23(26), 33194–33204 (2015). https://doi.org/10.1364/OE.23.033194

7. T. Shimobaba, N. Masuda, and T. Ito, "Simple and fast calculation algorithm for computer-generated hologram with wavefront recording plane," Opt. Lett. 34(20), 3133–3135 (2009). https://doi.org/10.1364/OL.34.003133

8. T. Ito et al., "Special-purpose computer HORN-5 for a real-time electroholography," Opt. Express 13(6), 1923–1932 (2005). https://doi.org/10.1364/OPEX.13.001923

9. T. Shimobaba et al., "Fast calculation of computer-generated-hologram on AMD HD5000 series GPU and OpenCL," Opt. Express 18(10), 9955–9960 (2010). https://doi.org/10.1364/OE.18.009955

10. H. Sakai and Y. Sakamoto, "Autotuning GPU code for acceleration of CGH calculation," Opt. Eng. 61(2), 023102 (2022). https://doi.org/10.1117/1.OE.61.2.023102

11. Y.-H. Lee et al., "High-performance computer-generated hologram by optimized implementation of parallel GPGPUs," J. Opt. Soc. Korea 18(6), 698–705 (2014). https://doi.org/10.3807/JOSK.2014.18.6.698

12. S. Ikawa et al., "Real-time color holographic video reconstruction using multiple-graphics processing unit cluster acceleration and three spatial light modulators," Chin. Opt. Lett. 18(1), 010901 (2020). https://doi.org/10.3788/COL202018.010901

13. T. Ichikawa, K. Yamaguchi, and Y. Sakamoto, "Realistic expression for full-parallax computer-generated holograms with the ray-tracing method," Appl. Opt. 52(1), A201–A209 (2013). https://doi.org/10.1364/AO.52.00A201

14. K. Matsushima, M. Nakamura, and S. Nakahara, "Silhouette method for hidden surface removal in computer holography and its acceleration using the switch-back technique," Opt. Express 22(20), 24450–24465 (2014). https://doi.org/10.1364/OE.22.024450

15. D. Blinder et al., "Photorealistic computer generated holography with global illumination and path tracing," Opt. Lett. 46(9), 2188–2191 (2021). https://doi.org/10.1364/OL.422159

16. D. Kumar and N. K. Nishchal, "Synthesis and reconstruction of multi-plane phase-only Fresnel holograms," Optik 127(24), 12069–12077 (2016). https://doi.org/10.1016/j.ijleo.2016.09.114

17. Y. Sakamoto, "Holometric video streaming," in Proc. 13th Int. Conf. 3D Syst. and Appl., 14–15 (2022).

18. T. Baba, R. Kon, and Y. Sakamoto, "Method for generating planar computer-generated hologram at free viewpoint from cylindrical object light," Opt. Eng. 61(11), 113101 (2022). https://doi.org/10.1117/1.OE.61.11.113101

19. Y. Alkhalili, T. Meuser, and R. Steinmetz, "A survey of volumetric content streaming approaches," in Proc. 2020 IEEE Sixth Int. Conf. Multimedia Big Data (BigMM), 191–199 (2020). https://doi.org/10.1109/BigMM50055.2020.00035

20. T. Jodo and Y. Sakamoto, "Fast calculation system for head-mounted display using user's attitude angle in CGH," Proc. SPIE 12592, 125920A (2023). https://doi.org/10.1117/12.2666823

21. S. Subramanyam et al., "Comparing the quality of highly realistic digital humans in 3DoF and 6DoF: a volumetric video case study," in Proc. 2020 IEEE Conf. Virtual Reality and 3D User Interfaces (VR), 127–136 (2020). https://doi.org/10.1109/VR46266.2020.00031

22. Y. Sakamoto and M. Tobise, "Computer generated cylindrical hologram," Proc. SPIE 5742, 267–274 (2005). https://doi.org/10.1117/12.589727

23. Y. Sando, M. Itoh, and T. Yatagai, "Fast calculation method for cylindrical computer-generated holograms," Opt. Express 13(5), 1418–1423 (2005). https://doi.org/10.1364/OPEX.13.001418

24. T. Yamaguchi, T. Fujii, and H. Yoshikawa, "Fast calculation method for computer-generated cylindrical holograms," Appl. Opt. 47(19), D63–D70 (2008). https://doi.org/10.1364/AO.47.000D63

25. X. Zhang et al., "Fast generation of 360-degree cylindrical photorealistic hologram using ray-optics based methods," Opt. Express 29(13), 20632–20648 (2021). https://doi.org/10.1364/OE.428475

26. A. Goncharsky and S. Durlevich, "Cylindrical computer-generated hologram for displaying 3D images," Opt. Express 26(17), 22160–22167 (2018). https://doi.org/10.1364/OE.26.022160

27. J. P. Waters, "Holographic image synthesis utilizing theoretical methods," Appl. Phys. Lett. 9(11), 405–407 (1966). https://doi.org/10.1063/1.1754630

28. Y. Takaki and Y. Tanemoto, "Band-limited zone plates for single-sideband holography," Appl. Opt. 48(34), H64–H70 (2009). https://doi.org/10.1364/AO.48.000H64

Biography

Ryotaro Kon received his MS degree in engineering from Hokkaido University in 2024.

Yuji Sakamoto is a professor at the Graduate School of Information Science and Technology, Hokkaido University. He has been engaged in research on computer-generated holograms, 3D image processing, and computer graphics.

CC BY: © The Authors. Published by SPIE under a Creative Commons Attribution 4.0 International License. Distribution or reproduction of this work in whole or in part requires full attribution of the original publication, including its DOI.
Ryotaro Kon and Yuji Sakamoto "Real-time generation of computer-generated hologram for 360-degree panoramic views using cylindrical object light," Optical Engineering 63(6), 065101 (10 June 2024). https://doi.org/10.1117/1.OE.63.6.065101
Received: 14 December 2023; Accepted: 18 May 2024; Published: 10 June 2024
KEYWORDS
Holograms

3D image reconstruction

Video

Computer generated holography

Head-mounted displays

3D video streaming

Optical engineering
