1 of 2

Results

Poster Title

Author1, Author2, and Corresponding Author *

School of Electronic and Electrical Engineering, Kyungpook National University

Introduction

1. Do not change the slide size.

Width: 40 cm

Height: 90 cm

※ The size above is ½ of the actual printed output.

2. When saving the PPT file, be sure to check "Embed fonts in the file" under File > Options > Save.

※ If a font-embedding error occurs, you must change the font via Replace > Replace Fonts.

3. In the poster area, place the university emblem to the left of the top title and the lab logo to the right.

※ The bar colors, font sizes, etc. may be changed as desired.

4. Check that characters in equations are not broken; if unsure, convert the equations to images before saving.

5. Delete these instructions and submit the file in PPT or PPTX format.

Lab logo

KNU-EERC Paper Awards: Undergraduate Poster

P-01

2 of 2

Results

Integral Imaging Display Interactive with External Illumination

Seongju Lee, Minwoo Jung, and Joonku Hahn*

School of Electronic and Electrical Engineering, Kyungpook National University

Introduction

References

Method

In the principle of depth perception, humans rely on several depth cues. To provide 3D vision by a physical method, binocular parallax is typically used. Based on binocular parallax, glasses-free 3D displays have been studied for over 100 years. Lippmann's integral imaging [1], which uses a lens array to implement two-dimensional parallax, is a well-known approach that has been studied steadily to date [2].

However, physical methods are not the only way to perceive 3D. There are also psychological depth cues that allow a person to perceive a 2D image as a 3D object, even without binocular parallax. The most representative psychological depth cue is the shadow effect. In particular, the shadow effect is most remarkable when distinguishing the embossing/engraving of a 2D image and the depth order [3]. This depth cue can have an even greater effect when it interacts with the external environment or the user [4].

In this paper, we propose a binocular-parallax display that interacts with external illumination to maximize 3D perception. The system implements two-dimensional parallax using a lens array. To measure the external light source, we used the method of inverting the light field [5], which allowed us to place a camera for measuring the external light source inside the system. Additionally, unlike Nayar's system [4], our system can measure not only the three-dimensional coordinates of the external light source but also its direction and shape.

[1] G. Lippmann, “Épreuves réversibles donnant la sensation du relief,” Journal de Physique Théorique et Appliquée, vol. 7, no. 1, pp. 821–825, 1908.

[2] X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: Sensing, display, and applications [invited],” Applied Optics, vol. 52, no. 4, p. 546, 2013.

[3] N. Sugano, H. Kato, and K. Tachibana, “The effects of shadow representation of virtual objects in augmented reality,” in Proc. 2nd IEEE/ACM International Symposium on Mixed and Augmented Reality (ISMAR), 2003.

[4] S. K. Nayar, P. N. Belhumeur, and T. E. Boult, “Lighting sensitive display,” ACM Transactions on Graphics, vol. 23, no. 4, pp. 963–979, 2004.

[5] M. Hirsch, D. Lanman, H. Holtzman, and R. Raskar, “BiDi screen: A thin, depth-sensing LCD for 3D interaction using light fields,” in Proc. ACM SIGGRAPH Asia, 2009.

In the 3D display, the viewing distance is infinite and the interval between adjacent views is 5 degrees, giving a total viewing angle of 50 degrees. The integral imaging code reproduces the 3D image at a frame rate of 20 with a processing time of 0.73 seconds. When executed together with the code for measuring the external light source, the 3D image is produced at a frame rate of 7.7. The system interacts with external light sources and can also control content in real time through other input devices. Compared with a square lens array, the hexagonal lens array provides a viewing angle up to 1.22 times wider for the same number of viewpoints, while maintaining the resolution of each view.

Discussion & Conclusions

Figure 2. Photographs of the results taken from the far right and far left of the viewing angle (±25 degrees).

Figure 3. Photographs of the results with various positions of the external light source (flashlight).

The results show that the proposed system not only implements a 3D display using integral imaging but also measures information about external light sources. However, while the system can measure the shape, direction, and position of an external light source, it cannot distinguish multiple light sources: this occurs when the light field created by one source coincides with the field created by another. In addition, the quality of the displayed image is degraded by the mismatch between the hexagonal arrangement of the lens array and the square pixel array of the LCD panel. Simply increasing the panel resolution would improve the quality, but the frame rate would inevitably drop due to the increased computational load.

These limitations are clear, but there is room for development: by improving the method used to capture external light, objects as well as external light sources could be recognized, as in light field cameras. It is therefore worth pursuing these improvements to further strengthen 3D perception.

To provide binocular parallax, we used an integral imaging method. Integral imaging uses the property that a lens maps spatial position to angular direction. An image from a certain point of view can be considered a set of pixels emitting parallel light traveling in the same angular direction. A lens converts a point in the spatial domain into parallel light traveling at a certain angle. Therefore, by combining a display with a lens array, different images can be formed for different angles.
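The pixel-to-angle property above can be sketched with a simple paraxial model. The focal length and pixel pitch below are hypothetical illustrative values, not the system's actual parameters:

```python
import math

# Display-side mapping in integral imaging: a pixel at the lenslet's
# focal plane is emitted as parallel light whose angle depends on the
# pixel's offset from the optical axis.
FOCAL_LENGTH_MM = 10.0   # lenslet focal length (hypothetical)
PIXEL_PITCH_MM = 0.17    # display pixel pitch (hypothetical)

def pixel_to_ray_angle_deg(pixel_offset):
    """A pixel displaced `pixel_offset` pixels from the lenslet axis
    emits parallel light at theta = atan(x / f)."""
    x_mm = pixel_offset * PIXEL_PITCH_MM
    return math.degrees(math.atan2(x_mm, FOCAL_LENGTH_MM))

# The on-axis pixel emits straight ahead; offset pixels emit at an angle.
print(pixel_to_ray_angle_deg(0))            # 0.0
print(round(pixel_to_ray_angle_deg(5), 2))  # ≈ 4.86
```

Sweeping the offset over the pixels under one lenslet thus assigns each pixel to one viewing direction, which is how different images are steered to different angles.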

Conversely, light coming from the outside can be measured using the same property. When an external light source with a certain wavefront is divided locally, it can be seen as a set of parallel rays with various angles. Since each lens converges parallel light of a given angle to one point, the wavefront of the external light source can be measured from the location of the converged point under each lens.
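This inverse measurement can be sketched in the spirit of a Shack-Hartmann wavefront sensor (the focal length is again a hypothetical value): each lenslet's spot displacement gives the local incidence angle, and the map of angles over the array describes the external wavefront.

```python
import math

# Measurement-side mapping: parallel light at angle theta converges to a
# spot displaced x = f * tan(theta) from the lenslet axis, so the spot
# location under each lenslet encodes the local direction of the light.
FOCAL_LENGTH_MM = 10.0  # lenslet focal length (hypothetical)

def local_incidence_angles_deg(spot_offsets_mm):
    """spot_offsets_mm: per-lenslet spot displacement (2D grid, in mm).
    Returns the local incidence angle at each lenslet: a uniform map
    indicates a planar wavefront, a varying map a diverging one whose
    center locates the source."""
    return [[math.degrees(math.atan2(dx, FOCAL_LENGTH_MM)) for dx in row]
            for row in spot_offsets_mm]

# A tilted planar wavefront shifts every spot by the same amount:
angles = local_incidence_angles_deg([[0.5, 0.5], [0.5, 0.5]])
```

Because the angle map is sampled per lenslet, the same lens array that displays the 3D image also serves as the measurement optics.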

We implemented the display using a 15” LCD panel (BOE Co.) with a resolution of 1920 x 1080. We used a hexagonal lens array (Fresnel Tech Inc.) to implement 11x11 viewpoints with a per-view resolution of 147 x 85. We removed the backlight from the LCD panel to use it as a transparent display, attaching a diffuser to the back of the panel so that external light is screened onto it. Four LED pillars were placed inside to replace the removed backlight. A 60 Hz webcam (Logitech Co.) with a resolution of 1920 x 1080 was placed inside the system to image the external light screened on the diffuser, and the captured image was analyzed to calculate the information of the external light source.
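As a sketch of how the elemental image shown on the panel could be composed from the 11x11 view set, here is a square-grid interleaving. This is a simplification: the actual system uses a hexagonal lens array, which would require offsetting every other lenslet row.

```python
import numpy as np

N_VIEWS = 11            # 11 x 11 viewpoints (from the poster)
VIEW_RES = (85, 147)    # per-view resolution, rows x cols (from the poster)

def compose_elemental_image(views):
    """views: array of shape (N, N, H, W) holding the N x N view images.
    Pixel (r, c) of view (i, j) goes under lenslet (r, c) at the
    intra-lenslet offset (i, j), i.e. stride-N interleaving."""
    n, m, h, w = views.shape
    panel = np.zeros((h * n, w * m), dtype=views.dtype)
    for i in range(n):
        for j in range(m):
            panel[i::n, j::m] = views[i, j]
    return panel

views = np.random.rand(N_VIEWS, N_VIEWS, *VIEW_RES)
panel = compose_elemental_image(views)
print(panel.shape)  # (935, 1617)
```

Each lenslet then covers an 11x11 block of panel pixels, so the composed image directs one pixel per lenslet toward each of the 11x11 viewing directions.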

(Figure labels: Aligner, LCD & Diffuser, Lens array, Cover glass, CCD, LED)

Implementation

Figure 1. Design of our proposed system.

(Figure labels — left: Pixels, 3D Image, Display panel, Lens array; right: Spotted wavefront of external light, Sensor, Lens array)

Figure 1. Principle of our proposed method. Left: Diagram of integral imaging. Right: Diagram of measuring external light source.

Event name area (size cannot be changed)

Poster number area (size cannot be changed)

Lab logo

KNU emblem

Poster body area

Example poster

