We propose a system that allows an actor in a virtual studio to look directly at virtual objects and to manipulate them as if they were real objects.
In scenes created with a virtual studio system, the audience sees virtual objects as if they existed in the same world as the real objects. The actor, however, cannot see the virtual objects from his or her own viewpoint, and therefore cannot manipulate them directly.
We employ a monocular see-through wearable display that is small enough not to hide the actor's face from the audience. The system lets the actor see virtual objects as if they were real: it continuously updates the images of the virtual objects on the wearable display using a perspective projection computed from the actor's current viewpoint.
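The viewpoint-dependent rendering described above can be sketched as an off-axis perspective projection: a world-space point on a virtual object is projected onto the display plane along the ray from the actor's eye. The function name and the planar-display model below are illustrative assumptions, not the thesis implementation:

```python
import numpy as np

def project_to_display(point_w, eye_w, disp_origin, disp_u, disp_v):
    """Project a world-space point onto a planar display as seen from the
    actor's eye: intersect the eye->point ray with the display plane and
    return (s, t) coordinates along the display's basis vectors."""
    n = np.cross(disp_u, disp_v)          # display plane normal
    d = point_w - eye_w                   # viewing-ray direction
    denom = n @ d
    if abs(denom) < 1e-9:
        return None                       # ray parallel to the display plane
    t_ray = (n @ (disp_origin - eye_w)) / denom
    hit = eye_w + t_ray * d               # intersection with the plane
    rel = hit - disp_origin
    # Express the hit point in the (u, v) display basis (assumed orthogonal).
    s = rel @ disp_u / (disp_u @ disp_u)
    t = rel @ disp_v / (disp_v @ disp_v)
    return s, t
```

Re-running this projection whenever the tracked eye position changes is what makes the virtual objects appear fixed in the real scene from the actor's point of view.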
To let the actor manipulate virtual objects like real objects, we propose a method that combines image processing of a camera image with the position and orientation of the actor's wrist, measured by a single position sensor attached to the wrist. By fusing the camera image with the sensor data, the system can estimate the position of the real object the actor uses to manipulate the virtual objects, without attaching position sensors to the real objects themselves.
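The single-sensor idea can be sketched as follows: the wrist sensor's pose predicts where a hand-held object should be (a fixed grip offset expressed in the wrist's local frame), and the camera-based measurement refines that prediction. The function names, the fixed-offset grip model, and the weighted-average fusion are simplifying assumptions standing in for the thesis's image-processing step:

```python
import numpy as np

def predict_object_position(wrist_pos, wrist_rot, grip_offset):
    """Predict the world position of a hand-held object from the single
    wrist sensor: rotate the grip offset (given in the wrist's local
    frame) into world coordinates and add the wrist position."""
    return wrist_pos + wrist_rot @ grip_offset

def refine_with_image(predicted_w, detected_w, alpha=0.5):
    """Fuse the sensor-based prediction with a position recovered from
    the camera image; a weighted average is used here as a minimal
    stand-in for the actual fusion."""
    return alpha * predicted_w + (1.0 - alpha) * detected_w
```

The sensor prediction narrows the image region the camera processing has to search, which is why one wrist sensor plus a camera can replace per-object position sensors.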