Update (June 8, 2017): The UI Input Module has been deprecated, as 3D interfaces are now handled by the Leap Motion Interaction Engine. Learn more on our blog.
Unity Widgets are back – with a new name and massively streamlined functionality! Just released for our Unity Core Assets, the UI Input Module provides a simplified interface for physically interacting with World Space Canvases within Unity’s UI System. This makes it simple for developers to create one-to-one tactile user interfaces in VR.
The module also provides “CompressibleUI” scripts that enable UI elements to pop up and flatten in response to touch. You can try the new inputs in our latest Developer Gallery demo, or download the module from our Unity page.
What’s Inside?
One of the goals of our module releases is to provide you with the tools for interpreting and using hand tracking data to make building hand-enabled experiences faster and easier. The UI Input Module aims to do just that. By using the new LeapEventSystem prefab, developers can easily use their existing world space user interfaces and menus with hand tracking. Setting it up is as simple as ensuring that there’s a LeapEventSystem in the scene and that menus are close enough to touch.
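If you’re building the canvas yourself rather than starting from the example scenes, the essentials are that it’s a World Space canvas, scaled down to sensible physical units, and positioned within arm’s reach. Here’s a rough sketch of those settings in code; the scale and distance values are ballpark suggestions, not numbers taken from the module, and they can just as easily be set by hand in the Inspector:

```csharp
using UnityEngine;

// Rough sketch: put an existing scene canvas into World Space, scale it to
// physical units, and park it within arm's reach of the headset. The numbers
// are illustrative only; tune them (or set them in the Inspector) to taste.
public class WorldSpaceCanvasSetup : MonoBehaviour
{
    public Canvas canvas;          // the canvas that holds your UI elements
    public Transform headAnchor;   // e.g. the camera inside LMHeadMountedRig

    void Start()
    {
        canvas.renderMode = RenderMode.WorldSpace;

        RectTransform rect = canvas.GetComponent<RectTransform>();
        rect.localScale = Vector3.one * 0.001f;   // 1 canvas unit is roughly 1 mm

        // Roughly half a meter ahead and a little below eye level keeps the
        // menu comfortably within reach of the tracked hands.
        rect.position = headAnchor.position
                      + headAnchor.forward * 0.5f
                      - headAnchor.up * 0.2f;
        rect.rotation = Quaternion.LookRotation(rect.position - headAnchor.position);
    }
}
```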
Additionally, our pre-constructed UI Widgets demonstrate how to put together a diegetic UI element that works well with compression and touch. Their built-in sound effects add audio cues that give the sense of each button and slider having an internal mechanism, making interactions more satisfying. We’ve included examples for Buttons, Sliders, and Scroll Panels in the module.
The CompressibleUI helper utility makes it easy to build animated 3D UIs that respond to touch and interaction. It also animates the opacity of drop shadows, giving your UI elements the extra sense of depth needed for fulfilling interactions, and it is used in each of our example Widgets.
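To make the behavior concrete, here’s a heavily simplified sketch of the idea (this is not the shipped CompressibleUI source; the field names and heights are made up):

```csharp
using UnityEngine;

// Illustrative sketch only, not the shipped CompressibleUI script. It raises a
// UI element off the canvas along its local Z axis and fades a drop shadow as
// the element flattens under a fingertip.
public class CompressibleSketch : MonoBehaviour
{
    public RectTransform element;        // the element that pops out of the canvas
    public CanvasGroup dropShadow;       // shadow that fades as the element flattens
    public float expandedHeight = 20f;   // local Z offset when untouched (made-up value)
    public float compressedHeight = 2f;  // local Z offset when fully pressed (made-up value)
    public float speed = 10f;

    private float _current;
    private float _target;

    void Start()
    {
        _current = _target = expandedHeight;
    }

    // Call this from your touch/press handling, e.g. OnPointerDown / OnPointerUp.
    public void SetCompressed(bool pressed)
    {
        _target = pressed ? compressedHeight : expandedHeight;
    }

    void Update()
    {
        // Ease toward the target height each frame.
        _current = Mathf.Lerp(_current, _target, Time.deltaTime * speed);

        // Negative local Z lifts the element toward the viewer on a typical
        // world space canvas; flip the sign if your canvas faces the other way.
        Vector3 pos = element.localPosition;
        pos.z = -_current;
        element.localPosition = pos;

        // The shadow is strongest when the element is fully raised.
        dropShadow.alpha = Mathf.InverseLerp(compressedHeight, expandedHeight, _current);
    }
}
```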
Quick Setup Guide
- Set up a Leap Camera Rig normally by dragging in an LMHeadMountedRig prefab from the Unity Core Assets
- Go to the LeapMotionModules/UIInput/Prefabs folder and drag a “LeapEventSystem” prefab onto your scene
- Create a Canvas object and add UI Elements to it
  - Standard GUI elements can be added by right-clicking on the parent Canvas and selecting UI->Button/Slider/etc.; the Leap UI Input Module works out of the box with Unity’s uGUI system
  - Alternatively, use the special Leap UI Elements found in the Prefabs folder; these prefabs are also compatible with mouse interaction
- Test out your new menu in VR!
Note: The UI Module does not recognize Canvases that are instantiated at runtime. For custom UI Elements, make sure the GameObject with the “Selectable” component is the only one in its hierarchy that has “RaycastTarget” enabled.
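If you’d rather enforce that RaycastTarget rule in code than check it by hand, a small helper along these lines can do it (this is a hypothetical convenience script, not something that ships with the module):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Hypothetical helper, not part of the module. Attach it to a custom UI element
// to enforce the rule above: only the GameObject carrying the Selectable keeps
// "Raycast Target" enabled; every other Graphic in the hierarchy has it turned off.
[RequireComponent(typeof(Selectable))]
public class RaycastTargetFixer : MonoBehaviour
{
    void Awake()
    {
        foreach (Graphic graphic in GetComponentsInChildren<Graphic>(true))
        {
            graphic.raycastTarget = (graphic.gameObject == gameObject);
        }
    }
}
```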
Designing with the UI Input Module
Because this is a new type of interface, it’s very important that developers use the UI Input Module in ways that feel natural and intuitive to first-time users. Here are a few tips for developing with the module:
Big buttons. They’re easier to read in VR and easier to select.
Drop shadows. Use drop shadows on your UI elements to signify when they’re depressed or elevated. Shadows and shading are powerful depth cues for conveying button states.
Sound effects. These are essential for VR interfaces; without them, interactions feel weird or disappointing. They’re also a powerful way to signify the success or failure of an action. It’s very important to use a sound effect upon both the initiation of an interaction and its termination; a missing sound effect on the termination of an action can leave the user feeling confused or unfulfilled.
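Since the module drives Unity’s standard event system, the usual uGUI pointer callbacks are a natural place to hook these sounds in. A minimal sketch, with placeholder clip fields (the example Widgets ship with their own sounds, so treat this as a pattern rather than the module’s implementation):

```csharp
using UnityEngine;
using UnityEngine.EventSystems;

// Minimal sketch: play one clip when an interaction starts and another when it
// ends. Attach this to the same GameObject as the element's Selectable so the
// pointer events reach it. Clip fields are placeholders.
[RequireComponent(typeof(AudioSource))]
public class PressReleaseSounds : MonoBehaviour, IPointerDownHandler, IPointerUpHandler
{
    public AudioClip pressClip;    // plays when the element is first touched
    public AudioClip releaseClip;  // plays when the interaction ends

    private AudioSource _source;

    void Awake() { _source = GetComponent<AudioSource>(); }

    public void OnPointerDown(PointerEventData eventData)
    {
        if (pressClip != null) _source.PlayOneShot(pressClip);
    }

    public void OnPointerUp(PointerEventData eventData)
    {
        if (releaseClip != null) _source.PlayOneShot(releaseClip);
    }
}
```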
CompressibleUI. This is a small helper utility within the Module that allows UI elements to expand and compress in relation to the surface of the canvas – in response to both touch and general interaction. It can also control the opacity of drop shadows. It’s a powerful tool for making your UI elements more dynamic.
Start with examples. Use the prefabs included in the UI Input Module as examples for setting up your own UI and components. You’ll also find tooltips on the Event System parameters that will help you learn how everything works.
That’s all for now! Next week we’ll be featuring an experimental approach to UI input that we’ve been playing with. In the meantime, we’d love to hear what you think about the UI Input Module – leave your feedback in the comments!
Quite complicated to use. The Getting Started Guide doesn’t say anything about changing the scale of the canvas, the distance at which the menu should be placed, or setting the canvas to World Space. This should work out of the box; what about a prefab that includes the canvas? Why do you have a slider prefab if Unity’s slider works too? Is it better than Unity’s?
I delved into some of those default values using the sample scene, but even with that, my interaction doesn’t work as well as in the sample scene. I’m missing something, but it’s hard to know what.
Used the UI Modules in a simple test. Here are the steps.
1) New scene; edit Project Settings->Player->VR enabled (set).
2) Add the LeapMotion packages via Assets->Import Package->Custom Package,
from wherever in your file system you downloaded LeapMotion_CoreAsset_Orion4.1.2unitypackage and
UIInputModule-1.10.unitypackage (or whatever later versions of these you have).
3) Instantiate: navigate in the Unity Project panel to LeapMotion->Prefabs, choose LMHeadMountedRig, and drag it to the Unity Hierarchy.
4) Instantiate: navigate in the Unity Project panel to LeapMotionModules->Prefabs, choose LeapEventSystem, and drag it to the Unity Hierarchy.
5) Instantiate: navigate in the Unity Project panel to LeapMotionModules->Prefabs->Examples, choose MinimalPanel-Private, and drag it to the Unity Hierarchy.
6) Add hands to the LeapController
On the LMHeadMountedRig in the Hierarchy (i.e. the Hierarchy panel in Unity), click on the > and traverse down until you find LeapHandController.
In the Project, find LeapMotion->Prefabs->HandModelsNonHuman and HandModelsPhysical
Choose CapsuleHand_L, drag it, and drop it right on LeapHandController in the Hierarchy. Then
choose CapsuleHand_R, drag it, and drop it right on LeapHandController in the Hierarchy.
In the Project, find LeapMotion->Prefabs->HandModelsPhysical.
Choose RigidRoundHand_L, drag it, and drop it right on LeapHandController in the Hierarchy. Then
choose RigidRoundHand_R, drag it, and drop it right on LeapHandController in the Hierarchy.
Go up to the Hierarchy and select LeapHandController. Find the HandPool component. In ModelPool, set Size to 2.
You’ll see Element 0 and Element 1;
Open Element 0 and drag, from the Hierarchy: RigidRoundHand_L to the Left Model, RigidRoundHand_R to the Right Model.
Open Element 1 and drag, from the Hierarchy: CapsuleHand_L to the Left Model, CapsuleHand_R to the Right Model.
Check Is_Enabled for Element 0 and Element 1
7) Set Up the LeapEventSystem
In the Hierarchy, click on LeapEventSystem. In the Leap Data Provider field, drag LeapHandController from the Hierarchy and drop it into the slot.
In the Project panel, select All Materials; I just grabbed Background and dragged it to Pointer Material.
8) Positioning
My LMHeadMountedRig is set at 0 0 0; I set the MinimalPanel-Private transform values in the Inspector to 0.01, 0.88, 0.15.
After that, click the Play button (>) at the top center of Unity, and you should be able to click away at the panel.
Sorry if this is too much detail or too simplistic, but I’m elated at the value this feature has for VR. Everyone I show VR to gets bored in about 30 seconds if there isn’t something interactive. Everyone I show LeapMotion + OVR is amazed, as I am, at the accuracy of LM hands. I think the paradigm of an Xbox controller for motion and then LM hands for touch is lacking. You need an in-scene element you can touch for scene navigation, so you never leave the scene, so to speak.
Next, I’ll see how to hook the Leap events up to custom Unity scripts to turn on lights, start movement, etc.
Oh yes, get rid of the default Main Camera: disable it in the Inspector by unchecking the uppermost box.
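As a starting point for that scripting step, a button’s onClick can call any public method on your own script. A rough sketch (the names here are just placeholders):

```csharp
using UnityEngine;
using UnityEngine.UI;

// Sketch of the "turn on lights" idea: drop this on any GameObject, assign a
// Button and a Light in the Inspector, and the button press toggles the light.
public class LightToggle : MonoBehaviour
{
    public Button toggleButton;   // e.g. a button inside MinimalPanel-Private
    public Light sceneLight;      // the light to switch on and off

    void OnEnable()  { toggleButton.onClick.AddListener(Toggle); }
    void OnDisable() { toggleButton.onClick.RemoveListener(Toggle); }

    void Toggle() { sceneLight.enabled = !sceneLight.enabled; }
}
```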
You are my savior. This is a much better and more detailed tutorial than the blog post above.
Dunno if this would be a problem: I didn’t use VR mode for this one, since my project uses Orion but not in VR. Everything works just fine, but I cannot find the cursor, so that might be a problem.
Make sure you drag and drop a Pointer Sprite and a Pointer Material in the Pointer Setup section of the Leap Input Module (Script) component of the Leap Event System.
Thanks for the support. Good luck.
Oh, apparently I have to use Sprites-Default on the Pointer Material to make it visible with the standard MinimalPanel-Private. Problem solved. Thanks for your support too 🙂
Hi there,
I’m integrating this module into a larger UI for academic research in data visualization interaction. I was trying to use the Environmental Pointer option in the advanced experimental options, but noticed that the pointers position themselves on the Leap hand models 90% of the time. Is there a way to add a tag or a component to a mesh object and have the pointers correctly project onto those? Thanks,
Sergio.
Just noticed that it works correctly in the original scene…
Cool!
How can I use “raycast” in this?
Is there a UE4 version also?
Not yet, but we are planning on one.
Thanks ! Looking forward to it. Will help us all immensely in the UE4 community.