
H2020 Levitate Project | Haptic VR gaming

Dr Orestis Georgiou, Program Manager, Ultrahaptics, writes about how an H2020 Levitate intern project synchronized modalities to create a groundbreaking haptic VR game.

The H2020 Levitate Project is a joint effort between Ultrahaptics and four universities: Aarhus University, Chalmers University of Technology, University of Glasgow and the University of Sussex.

The aim of the project is to create, prototype and evaluate a radically new human-computer interaction paradigm based on levitating matter. The project utilises ultrasonic waves controlled by Ultrahaptics’ hardware to perform the levitation.

One of the requirements of such an innovative interface is the synchronization between audio, haptic and visual modalities. To explore this, four interns from Manchester University were recruited and given the challenge of integrating these modalities to create a high-quality experience that you can simply walk up and use.

Haptic VR gaming

The result was a haptic VR rhythm game, a cross between a race-car driving game and Guitar Hero, with a futuristic visual feel. As musical notes travel down the “note highway”, the player uses their bare hands to tap them or swipe them left or right.

The students used Ultrahaptics’ technology to create mid-air haptic sensations that could be felt directly on the skin and synced these with the beat of the music and the graphics.
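As a rough sketch of what syncing haptics to a music beat involves (the names, tolerances and tempo here are illustrative assumptions, not the interns' actual code): note timestamps are derived from the track's tempo, and a tap counts as a hit if it lands within a small window around a note — which in turn decides whether the "hit" or "miss" haptic cue plays.

```python
# Hypothetical rhythm-game beat synchronisation. Note times come from the
# track's tempo (BPM); a tap "hits" if it falls within a tolerance window
# around a note. All names and thresholds are illustrative only.

HIT_WINDOW_S = 0.1  # a tap must land within 100 ms of a note to count


def note_times(bpm: float, num_beats: int, offset_s: float = 0.0) -> list[float]:
    """Timestamps (in seconds) of each beat for a track at a given tempo."""
    beat_period = 60.0 / bpm
    return [offset_s + i * beat_period for i in range(num_beats)]


def judge_tap(tap_time_s: float, notes: list[float]) -> str:
    """Return 'hit' if the tap is within the window of any note, else 'miss'."""
    if any(abs(tap_time_s - t) <= HIT_WINDOW_S for t in notes):
        return "hit"
    return "miss"


notes = note_times(bpm=120, num_beats=8)  # beats at 0.0, 0.5, 1.0, ...
print(judge_tap(1.02, notes))  # 20 ms from the beat at 1.0 -> "hit"
print(judge_tap(1.30, notes))  # 200 ms from the nearest beat -> "miss"
```

The same beat clock can then drive all three modalities, so audio, visuals and haptic pulses stay aligned to a single timeline.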

“We felt that haptics could give useful in-game feedback in a very intuitive way, so we decided to use different sensations to indicate hitting and missing. If there was no haptic feedback, players wouldn’t really know if they made the correct gestures or if they hit the note.”

– Bao ZiaoTong, Shing Hei Chan, Boyin Yang and Ziyuan Chen

Each gesture in the game is associated with a different haptic sensation, specially designed for that gesture. The game was developed using the Unity game engine, which binds seamlessly to the Ultrahaptics Software Development Kit.
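A minimal sketch of a gesture-to-sensation lookup like the one described above — the class, gesture names and parameters are hypothetical, and the actual Unity/Ultrahaptics SDK calls are deliberately not reproduced here:

```python
# Hypothetical gesture-to-sensation mapping. Each gesture selects a mid-air
# haptic sensation described by illustrative parameters (modulation frequency
# and intensity). The real project drove the Ultrahaptics hardware via its SDK
# from Unity; this sketch only shows the lookup logic.
from dataclasses import dataclass


@dataclass(frozen=True)
class Sensation:
    name: str
    frequency_hz: float  # modulation frequency of the focal point
    intensity: float     # output strength, 0.0 - 1.0


SENSATIONS = {
    "tap": Sensation("pulse", frequency_hz=200.0, intensity=1.0),
    "swipe_left": Sensation("sweep", frequency_hz=120.0, intensity=0.8),
    "swipe_right": Sensation("sweep", frequency_hz=120.0, intensity=0.8),
    "miss": Sensation("buzz", frequency_hz=60.0, intensity=0.4),
}


def sensation_for(gesture: str, hit: bool) -> Sensation:
    """Pick the sensation to render: the gesture's own cue on a hit,
    or a distinct 'miss' cue so the player knows the note was lost."""
    return SENSATIONS[gesture] if hit else SENSATIONS["miss"]
```

Keeping the hit and miss cues clearly distinct is the point the interns make in the quote above: the sensation itself tells the player whether the gesture registered.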

Levitate Project video

Everyone at Ultrahaptics was incredibly impressed by what the interns came up with. As program manager, what I’m most proud of is the way the interns took ownership of the project and, despite a steep initial learning curve, by the end had put together a groundbreaking, fully functioning mid-air haptics VR game.

The H2020 Levitate project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 737087. The students involved were Bao ZiaoTong (graphics developer), Shing Hei Chan (haptics developer), Boyin Yang (audio developer) and Ziyuan Chen (game developer).
