
OntheGiant@CHI'19

We propose Giant-Miniature Collaboration (GMC), a multi-scale Mixed Reality (MR) collaboration between the Giant, a local Augmented Reality (AR) user, and the Miniature, a remote Virtual Reality (VR) user. The Miniature is immersed in a 360 video shared by the Giant, who can physically manipulate the Miniature through a tangible interface: a 360 camera combined with a 6-DOF tracker.
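
A rough sketch of the core idea (assumed math, not the paper's implementation): the Miniature's view into the 360 video is the tangible camera's tracked orientation composed with the VR user's own head orientation, so when the Giant turns the camera, the Miniature's world turns with it.

```python
# Sketch only: compose the tangible 360-camera's tracked rotation with the
# VR user's head rotation to orient their view inside the shared 360 video.
# Quaternions are (w, x, y, z); none of these names come from the paper.
import numpy as np

def quat_mul(q, r):
    """Hamilton product q * r."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def miniature_view(tracker_rot, hmd_rot):
    """Apply the tracker's rotation first, then the user's own head rotation."""
    return quat_mul(tracker_rot, hmd_rot)

# Giant yaws the tangible camera 90 degrees; Miniature looks straight ahead.
half = np.radians(90.0) / 2.0
tracker = np.array([np.cos(half), 0.0, np.sin(half), 0.0])  # 90 deg about y
hmd = np.array([1.0, 0.0, 0.0, 0.0])                        # identity
print(miniature_view(tracker, hmd))  # the Miniature's view yaws with the camera
```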

SvG@ISMAR'18

Advances in Mixed Reality (MR), Unmanned Aerial Vehicles (UAVs), and multi-scale collaborative virtual environments (MCVEs) have led to new interface opportunities for remote collaboration. This paper explores a novel concept of flying telepresence for multi-scale mixed reality remote collaboration. This work could enable remote collaboration at larger scales, such as building construction.

Emotion Sharing@CHI-Play'18

We explore sharing and augmenting facial expressions in cooperative social Virtual Reality (VR) games. We implemented a prototype system for capturing facial expressions and sharing them between VR players through their avatars.

Mini-Me@CHI'18

We present Mini-Me, an adaptive avatar for enhancing Mixed Reality (MR) remote collaboration between a local Augmented Reality (AR) user and a remote Virtual Reality (VR) user. The Mini-Me avatar represents the VR user’s gaze direction and body gestures while it transforms in size and orientation to stay within the AR user’s field of view.
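
A minimal sketch of the stay-in-view behaviour, under my own assumed geometry rather than the published implementation: if the avatar leaves a viewing cone around the AR user's forward direction, rotate its direction back onto the cone boundary, and scale it with distance so it keeps a constant apparent size.

```python
# Sketch: clamp an avatar's position into a viewing cone around the AR user's
# forward direction and scale it with distance. All names are illustrative.
import numpy as np

def rodrigues(v, axis, theta):
    """Rotate vector v by angle theta (radians) about a unit axis."""
    axis = axis / np.linalg.norm(axis)
    return (v * np.cos(theta)
            + np.cross(axis, v) * np.sin(theta)
            + axis * (axis @ v) * (1.0 - np.cos(theta)))

def keep_in_view(avatar_pos, cam_pos, cam_fwd, half_fov=np.radians(25)):
    to_avatar = avatar_pos - cam_pos
    dist = np.linalg.norm(to_avatar)
    d = to_avatar / dist
    f = cam_fwd / np.linalg.norm(cam_fwd)
    angle = np.arccos(np.clip(d @ f, -1.0, 1.0))
    if angle > half_fov:
        # Rotate the direction toward the forward vector until it sits on
        # the cone boundary, keeping the original distance.
        axis = np.cross(d, f)
        d = rodrigues(d, axis, angle - half_fov)
    scale = 0.2 * dist  # constant apparent size: scale grows with distance
    return cam_pos + d * dist, scale

pos, s = keep_in_view(np.array([2.0, 0.0, 0.5]),
                      np.array([0.0, 0.0, 0.0]),
                      np.array([0.0, 0.0, 1.0]))
print(pos, s)
```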

Pinpointing@CHI'18

This work investigates precise, multimodal selection techniques using head motion and eye gaze. A comparison of speed and pointing accuracy reveals the relative merits of each method, including the achievable target size for robust selection. 
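
One way such a multimodal combination can work, sketched under my own assumptions rather than as the paper's exact technique: eye gaze supplies the coarse pointing direction, and subsequent head rotation nudges the cursor with a reduced gain for fine adjustment.

```python
# Sketch: coarse eye-gaze pointing refined by damped head motion.
# Angles are (yaw, pitch) in degrees; the 0.2 gain is an assumed value.
import numpy as np

def refine_pointer(gaze_angles, head_angles_t0, head_angles_t1, gain=0.2):
    """Anchor the cursor at the gaze direction, then apply the head-rotation
    delta since refinement began, scaled down so small head moves give
    sub-degree control."""
    head_delta = np.asarray(head_angles_t1) - np.asarray(head_angles_t0)
    return np.asarray(gaze_angles) + gain * head_delta

gaze = (31.0, -4.0)                        # coarse eye-gaze estimate
head0, head1 = (30.0, -5.0), (34.0, -3.0)  # head turned 4 deg right, 2 deg up
print(refine_pointer(gaze, head0, head1))  # -> [31.8 -3.6]
```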

SnowDome@CHI'18

We present Snow Dome, a Mixed Reality (MR) remote collaboration application that supports multi-scale interaction for a Virtual Reality (VR) user. We share a local Augmented Reality (AR) user's reconstructed space with a remote VR user, who can scale themselves up into a giant or down into a miniature for different perspectives and interaction at that scale within the shared space.
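
A minimal sketch of how such user scaling can be realised (an assumed approach, not necessarily Snow Dome's): scale the tracked eye positions about a pivot on the floor, which changes eye height and inter-pupillary distance together and shifts the perceived scale of the shared space.

```python
# Sketch: scale a VR user's tracked eye positions about a floor pivot.
# Scaling eye height and IPD together is what shifts perceived world scale.
import numpy as np

def scale_user(eye_pos, pivot, s):
    return pivot + s * (eye_pos - pivot)

left  = np.array([-0.032, 1.60, 0.0])   # tracked left eye, metres
right = np.array([ 0.032, 1.60, 0.0])   # tracked right eye, metres
pivot = np.array([0.0, 0.0, 0.0])       # point on the floor under the user

for s in (0.1, 1.0, 10.0):              # miniature, normal, giant
    L, R = scale_user(left, pivot, s), scale_user(right, pivot, s)
    ipd_cm = np.linalg.norm(R - L) * 100.0
    print(f"scale {s:>4}: eye height {L[1]:.2f} m, IPD {ipd_cm:.2f} cm")
```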

Counterpoint@CHI'18

This video presents a design exploration that interleaves micro gestures with other types of gestures from the greater lexicon of gestures for computer interaction. We describe three prototype applications that show various facets of this multi-dimensional design space. These applications portray various tasks on a HoloLens AR display, using different combinations of wearable sensors.

ECL@CHI'18

Sharing the atmosphere of CHI 2018 from ECL members' perspectives.

CoVAR@SIGGRAPH Asia'17

We introduce CoVAR, a collaborative Virtual and Augmented Reality system for room-scale remote collaboration and interaction. CoVAR combines Augmented Reality (AR) and Virtual Reality (VR) technologies to build on the strengths of each platform. In our system, we represent remote users with a virtual head and hands, and we reconstruct the real environment of the AR user and share it with a remote VR user, so they both feel as if they are sharing the same space.

GazeInteraction@3DUI'17

This video shows three novel eye-gaze-based interaction techniques: (1) Duo-Reticles, alignment-based selection using eye-gaze and inertial reticles; (2) Radial Pursuit, selection of an object in clutter by tracking it with smooth-pursuit eye movement; and (3) Nod and Roll, head-gesture-based interaction built on the vestibulo-ocular reflex.
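
For Radial Pursuit, a common way to detect smooth pursuit of a moving target is to correlate the recent gaze trace against each candidate object's trajectory; the sketch below assumes that correlation approach and is not taken from the paper's implementation.

```python
# Sketch: select the moving object whose trajectory best correlates with the
# gaze trace over a short window (Pearson r on x and y separately).
import numpy as np

def pursuit_select(gaze, traces, threshold=0.9):
    best_id, best_r = None, threshold
    for obj_id, trace in traces.items():
        rx = np.corrcoef(gaze[:, 0], trace[:, 0])[0, 1]
        ry = np.corrcoef(gaze[:, 1], trace[:, 1])[0, 1]
        r = min(rx, ry)               # both axes must track the object
        if r > best_r:
            best_id, best_r = obj_id, r
    return best_id

# Three objects on a circle at different phases; gaze noisily follows object 1.
t = np.linspace(0.0, 1.0, 60)
traces = {i: np.column_stack([np.cos(2*np.pi*t + i), np.sin(2*np.pi*t + i)])
          for i in range(3)}
rng = np.random.default_rng(7)
gaze = traces[1] + rng.normal(scale=0.05, size=traces[1].shape)
print(pursuit_select(gaze, traces))  # -> 1
```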

SharedPhysioVR@CHI'17

We demonstrate two collaborative immersive VR games that display the real-time heart rate of one player to the other. The two different games elicited different emotions, one joyous and the other scary.

@QuiverVision'15

From late 2014 to early 2016, I took a break from my PhD and joined QuiverVision, a mobile AR startup based in Tokyo.

For more information about "Quiver: 3D Coloring App", please visit the QuiverVision website.

GSIAR@AWE'14

We showed off our G-SIAR system at Augmented World Expo, the largest AR exhibition in the world.

GSIAR@ISMAR'14

We conducted a study with the G-SIAR system, comparing direct manipulation using natural hands against indirect manipulation using gesture and speech. We presented our results and gave a demonstration at ISMAR'14 in Munich.

KITE@ISMAR'13

We demonstrated KITE at ISMAR'13.

GestureLIB@HITLABNZ'12

We worked on hand tracking, classification, and pose estimation using a random forest classification method.
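
As an illustration of the classification step only, with synthetic features standing in for the library's actual depth-image features, a random forest separating a few hand-pose classes might look like this:

```python
# Illustrative sketch: random forest over synthetic per-frame hand features,
# standing in for the real depth features used for hand tracking.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n_frames, n_features = 600, 20
X = rng.normal(size=(n_frames, n_features))
y = rng.integers(0, 3, size=n_frames)   # e.g. open hand / fist / point
X[y == 1] += 1.5                        # separate the classes artificially
X[y == 2] -= 1.5

forest = RandomForestClassifier(n_estimators=100, max_depth=10, random_state=0)
forest.fit(X[:500], y[:500])
print("held-out accuracy:", forest.score(X[500:], y[500:]))
```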

PhysicalInteract@3DUI'12

We demonstrated physically-based interaction for AR.

Best Video @HRI'12

In this video, "Nao Haka", four robots and a haka leader perform a traditional Māori haka. The haka leader, who performs the main actions, is supported by Aldebaran Nao robots controlled by an external performer using a Microsoft Kinect, which provides full-body user tracking. This video was made as a supportive gesture towards the All Blacks' 2011 Rugby World Cup campaign.
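
One hypothetical retargeting step from Kinect skeleton data to a robot joint, shown purely as an illustration rather than the code behind the video: compute the bend at the performer's elbow from three tracked joint positions and send it as the corresponding Nao joint target.

```python
# Sketch: derive an elbow angle from Kinect joint positions. The mapping to
# a Nao elbow joint is a hypothetical example, not the performance's code.
import numpy as np

def angle_at(b, a, c):
    """Angle at joint b between segments b->a and b->c, in radians."""
    u, v = a - b, c - b
    cosang = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.arccos(np.clip(cosang, -1.0, 1.0))

shoulder = np.array([0.00, 1.40, 0.0])   # tracked positions, metres
elbow    = np.array([0.25, 1.40, 0.0])
wrist    = np.array([0.25, 1.65, 0.0])

bend = np.pi - angle_at(elbow, shoulder, wrist)  # 0 rad = straight arm
print(f"elbow bend to send as joint target: {bend:.2f} rad")
```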

ARMM@SIGGRAPH Asia'11

We demonstrated "ARMicroMachine", the first AR car racing game on a tabletop with real-time surface reconstruction for occlusion and environment awareness. 
