Japan is rich in cultural properties of great historical and artistic value, the most important of which are protected as nationally designated cultural properties. Intangible cultural properties are techniques that have been handed down from generation to generation, such as theater, music, and craft techniques. In Japan, where the population is aging rapidly, digital archiving is essential for the transmission of intangible cultural assets. In this study, we focused on the musical accompaniment (ohayashi) of the Kanuma Imamiya Shrine Festival (Kanuma Autumn Festival), which is inscribed as a UNESCO Intangible Cultural Heritage and designated as a nationally designated Important Intangible Folk Cultural Property. Although the Kanuma Autumn Festival had to be canceled last year because of the state of emergency declared in response to the spread of COVID-19, the tradition is still being carried on. In this study, we developed a multi-viewpoint viewing system for ohayashi content with the cooperation of the Kamifukatsu Music Preservation Society and evaluated the system through experiments with participants. We received positive feedback from the participants overall, while those with experience performing music pointed out aspects that need improvement. A controller was used as the method of interacting with the content, but it has not yet been compared with other methods. Therefore, to support the inheritance of this tradition, we aim to identify the most user-friendly interaction method in terms of operability.
In recent years, research on augmented reality (AR) has been active, and eye gaze input has attracted attention as one of the input methods in AR. One problem with gaze input is that it is difficult when targets are small or crowded. The bubble cursor addresses this problem by changing the size of a circular cursor according to the distance to the nearest target, making gaze input easier. In addition, by using the convergence distance, i.e., the distance from the eyes to the point where the lines of sight of both eyes intersect, it is believed that the user's gaze point can be determined accurately without relying on collision detection with virtual objects. In this study, we use the convergence distance to determine the gaze point in an AR environment and consider a bubble cursor centered on that gaze point.
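The geometry described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the convergence point is estimated as the midpoint of the shortest segment between the two gaze rays (in practice the rays rarely intersect exactly), and the bubble-cursor radius is taken as the distance to the nearest target. All function and parameter names are illustrative.

```python
import numpy as np

def convergence_point(p_l, d_l, p_r, d_r):
    """Midpoint of the shortest segment between the two gaze rays
    p + t*d (eye positions p_l, p_r; gaze directions d_l, d_r)."""
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)
    w = p_l - p_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w, d_r @ w
    denom = a * c - b * b
    if abs(denom) < 1e-9:            # parallel lines of sight: no convergence
        return None
    t = (b * e - c * d) / denom      # parameter along the left ray
    s = (a * e - b * d) / denom      # parameter along the right ray
    q_l = p_l + t * d_l              # closest point on the left ray
    q_r = p_r + s * d_r              # closest point on the right ray
    return 0.5 * (q_l + q_r)

def bubble_radius(gaze_point, targets):
    """Bubble-cursor radius: the cursor grows just enough to
    contain the target nearest to the gaze point."""
    return min(np.linalg.norm(np.asarray(t) - gaze_point) for t in targets)
```

The convergence distance itself would then be the norm of the vector from the midpoint between the eyes to the returned point.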
KEYWORDS: Head-mounted displays, Heart, Time metrology, Virtual reality, Visualization, Infrared sensors, Information visualization, Visual process modeling, Space operations, Software development
The time that people perceive is called "psychological time" and is classified as "time perception" when the time range is less than 5 s and "time estimation" when the time range is 5 s or more. It is known that "psychological time" is perceived as longer under fear, but there is little research on how fear affects "time perception" and "time estimation" separately. Therefore, using VR technology, which can give users a strong sense of immersion, we present VR content that safely evokes strong fear in both the "time perception" and "time estimation" ranges and investigate the differences between the effects of fear on each.
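The 5 s boundary described above can be expressed as a simple rule; this one-liner (names are illustrative, not from the study) is how a trial duration would be assigned to one of the two ranges:

```python
def classify_psychological_time(duration_s: float) -> str:
    """Classify a duration by the 5 s boundary: under 5 s is
    'time perception', 5 s or more is 'time estimation'."""
    return "time perception" if duration_s < 5.0 else "time estimation"
```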
In recent years, due to the COVID-19 pandemic and the widespread use of technology, restaurant websites and apps are often used for take-out orders and dining reservations, and information such as reviews and photos on these platforms has a significant impact on revenue. In this study, toward developing an appetite-enhancing application, we focus on food images as a factor that strongly influences appetite and analyze which image features stimulate it. Then, based on the results of the analysis, we generate appetizing images using a generative adversarial network (GAN).
In virtual and augmented reality, there has been a great deal of research on interaction with virtual objects. In particular, eye-tracking techniques are used to achieve hands-free interaction. In this paper, we focus on quick eye movements and intuitive pointing to move a virtual object with the gaze alone. This study is basic research on methods for interacting with virtual objects in an augmented reality environment by gaze. We explore interaction with both a typical controller and the gaze and investigate the characteristics of each. The results show that the gaze may be as useful as the controller for pointing at distant positions. To achieve more accurate hands-free interaction using the gaze, it is necessary to understand how users perceive the distance between virtual and real objects. We also need to examine gaze interactions that use actions other than eye closure.
The purpose of this study is to determine whether microsaccades and pupil diameter can be used to estimate human covert attention while subjects observe natural scenes. In the experiment, natural images were presented, and eye movements and pupil responses were measured. The analysis focused on microsaccades and gaze time. As a result, the histogram of microsaccade angles showed that microsaccades may be attracted toward the object of interest. In addition, the rank of the objects of interest was not proportional to the gaze time, indicating that the factor determining this rank did not depend on gaze time.
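The angle-histogram analysis mentioned above can be sketched as follows. This is a minimal illustration under simple assumptions (not the authors' code): each microsaccade is reduced to its onset and offset gaze positions, its direction angle is computed, and the angles are binned to see whether they cluster toward a particular object. Names and the bin count are illustrative.

```python
import numpy as np

def microsaccade_angles(starts, ends):
    """Direction angle (degrees, in [0, 360)) of each microsaccade,
    taken from its onset to its offset gaze position."""
    v = np.asarray(ends, float) - np.asarray(starts, float)
    return np.degrees(np.arctan2(v[:, 1], v[:, 0])) % 360.0

def angle_histogram(angles, n_bins=12):
    """Counts per angular bin (default: 12 bins of 30 degrees);
    a peak in the bin facing an object suggests attraction toward it."""
    counts, _ = np.histogram(angles, bins=n_bins, range=(0.0, 360.0))
    return counts
```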