On the 8th of April 2022, I released handWavey to the world. I released a video and a few Patreon posts, including a getting started guide (which is now here), but I wasn’t doing regular blog posts at the time, so I never published a general one here. It seems time to fix that.
handWavey is a way of controlling your keyboard and mouse using a LeapMotion/UltraLeap controller.
There have been many projects attempting this concept over the years, and I haven’t liked any of them. They have all been either proof-of-concept quality, or so heavily optimised for a narrow set of use-cases that they wouldn’t be useful for me, and were not reasonable to extend.
handWavey is not that. It is designed to be the tool that you get to know, configure to your needs, and become awesome with. There is a learning curve to becoming proficient with it. But if you invest that time, you get a system that:
- Will reliably do what you want it to do.
- Is highly customisable to you.
- Is incredibly fun and satisfying to use.
Here is the video that I put out at the time:
Table of contents
- Table of contents
- Quick links
- How it works
- What it can do
- What else can it do?
- Why?
- Accessibility
- User testing
- Where I came from
- Getting up to speed more easily
- grabClick vs tiltClick
- Rotating it like a track pad
- The ergonomics of gestures
- A focus on configuration
- My usage
- Getting started
- What’s next
Quick links
- The most important blog posts:
- Tutorial
- Documentation
- Releases
- Github
How it works
It uses a LeapMotion controller
Above: A LeapMotion controller.
to understand the position and orientation of many bones in your hand.
Above: Using LeapMotions’s visualiser to show what the data is doing in real-time.
handWavey is an app that I wrote that uses that data to control your computer.
You control the computer by moving your first hand around to move the mouse cursor, and by moving your hand in specific ways (gestures) to tell it to do something.
Above: Footage of me actually using it, but with rings added to show what is being interpreted.
I used the rings in the video to visualise what handWavey was interpreting:
- The blue rings are the primary hand (the first hand to be introduced).
  - A bar on either side indicates that the hand is closed.
  - The bar between the two rings, slightly offset from the top, indicates that the hand is in segment 0.
- The purple rings are the secondary hand (the second hand to be introduced).
  - The bar between the two rings, slightly offset from the top, indicates that the hand is in segment 0.
The secondary hand is not necessary. In fact, I optimise most gestureLayouts to use only the primary hand for common tasks, so it’s rare that I ever use the secondary hand. While we are used to removing one hand from the keyboard to control the mouse/trackpad/trackpoint, we’re not used to removing both hands. So I find there is an extra reluctance to use the secondary hand for day-to-day tasks.
Having said that, it’s particularly useful when you want to perform a lot of actions without touching your keyboard and mouse at all. You can really get into Minority Report-style gestures when doing so.
So the above visualised the segments and open vs closed hands. I then visualised the zones separately:
Above: A visualisation of the zones.
- Action zone: The hand can trigger actions (eg clicking and dragging).
- Active zone: Moving the hand moves the cursor.
- noMove zone: The hand is tracked, but the cursor does not move.
- absent: The hand is not present.
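The zones above can be sketched as simple height thresholds above the sensor. This is purely illustrative: the threshold values, the ordering of the zones by height, and the function name here are my assumptions, not handWavey’s actual values or code.

```python
# Classify a hand's height above the sensor into a zone.
# The zone names come from handWavey; the thresholds and the
# ordering of zones by height are made up for illustration.

def classify_zone(height_mm, hand_present=True):
    """Return the zone for a hand at height_mm above the sensor."""
    if not hand_present:
        return "absent"   # the hand is not present
    if height_mm < 150:
        return "noMove"   # hand tracked, cursor stays still
    if height_mm < 300:
        return "active"   # hand movement moves the cursor
    return "action"       # movements can trigger actions

print(classify_zone(100))  # noMove
```

In practice, the boundaries between zones are exactly the sort of thing the audio feedback (described below) helps you develop a feel for.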
What it can do
User input
At the most basic level, it controls the keyboard and mouse. It can control them individually, but also in combination with each other, such as SHIFT+Click, and CTRL+Scroll.
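Conceptually, a combined action like SHIFT+Click is just an ordered sequence of key and button events: press the modifiers, perform the mouse action, then release the modifiers in reverse order. A minimal sketch of that idea (the event tuples and function name are hypothetical, not handWavey’s internals):

```python
# Expand a modifier + mouse action into an ordered event sequence.
# Event tuples are (device, action, name); purely illustrative.

def combo(modifiers, mouse_action):
    """Expand e.g. SHIFT+Click into press/act/release events."""
    events = [("keyboard", "down", m) for m in modifiers]
    events.append(("mouse", "do", mouse_action))
    # Release modifiers in reverse order of pressing.
    events += [("keyboard", "up", m) for m in reversed(modifiers)]
    return events

print(combo(["shift"], "click"))
# [('keyboard', 'down', 'shift'), ('mouse', 'do', 'click'), ('keyboard', 'up', 'shift')]
```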
Feedback
Without something physical to touch, handWavey lacks probably the biggest form of implicit feedback that users rely on.
To address this, every action and state can trigger audio feedback. Notifying for everything would be completely unusable, but getting that feedback on just a few things can make a remarkable difference to developing a feel for the boundaries and how physical movements relate to gestures.
It therefore ships with some feedback enabled by default, and I have various example audio configurations available that you can use to tune it to your needs as you progress.
Left vs Right hand
handWavey doesn’t care about the left hand vs the right hand. It only cares about which hand is first, and which is second. So you can intuitively swap hands, or a left-handed person could use it after a right-handed person, and everything just works.
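Order-based role assignment can be sketched like this. The class, method names, and the behaviour when the primary hand leaves are my assumptions for illustration, not handWavey’s actual implementation:

```python
# Assign hand roles by order of appearance, ignoring left vs right.

class HandRoles:
    def __init__(self):
        self.order = []  # hand IDs, first seen first

    def seen(self, hand_id):
        """Register a hand when the sensor first sees it."""
        if hand_id not in self.order:
            self.order.append(hand_id)

    def lost(self, hand_id):
        """Forget a hand when it leaves the sensor's view."""
        if hand_id in self.order:
            self.order.remove(hand_id)

    def role(self, hand_id):
        if hand_id not in self.order:
            return "absent"
        return "primary" if self.order[0] == hand_id else "secondary"

roles = HandRoles()
roles.seen("left")   # left hand arrives first -> primary
roles.seen("right")  # right hand arrives second -> secondary
```

Because roles are keyed on arrival order rather than handedness, the same configuration works identically for left- and right-handed users.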
Which computer?
By default it controls the local computer. But it’s also possible to control another computer via VNC. This is useful if your system is not yet supported by handWavey.
Above: handWavey controlling a laptop wirelessly via a direct VNC connection.
I’ve made it really easy to add extra control methods. So when it becomes clear how Wayland is going to work in this regard, supporting it will be as easy as Wayland makes it.
I’ve also laid most of the groundwork for it to be device agnostic. So if another device becomes available that produces similar metrics to the LeapMotion, it should be easy to add support for it.
I have not yet tried a Leap Motion 2 controller.
What else can it do?
The one that really blows people’s minds when they see it is scrolling in the tiltClick gestureLayout: you literally grab the page and move it.
Above: Me scrolling in the tiltClick gestureLayout by closing my hand and lifting it up or down.
A gesture can combine multiple keyboard and mouse inputs. Here I am moving a window, which is ALT + Left-click + drag in Linux:
Above: Me moving a window from a single gesture.
Above: Here I’m zooming into an image using a two-handed gesture.
Why?
For me personally, I had two motivators:
- The technology is fricken cool.
- I’ve been programming for over 30 years, so dividing up mouse usage with another input method seems like a good idea to prevent something like OOS (occupational overuse syndrome).
But on top of that, I’ve also been getting involved with accessibility.
Accessibility
Almost every time I talk to someone about this, their eyes light up as they start thinking of various accessibility use-cases for this.
Let’s be clear, I’m not an expert in any of these. So if any of them might be relevant to you; please discuss it with someone who is. If you/they need a hand fine-tuning it to your needs, please create an issue on the handWavey repo. handWavey is extremely customisable, so there is a good chance that we can configure it to do what you need. It would be great to fine-tune the examples to what would actually be useful to people with those conditions.
Shakey hands
(Think Parkinson’s disease.)
I have some shakey-hands examples which trade-off stability vs responsiveness.
They work by increasing the size of the moving mean that is used to reduce error in the measurements. Normally the mean size would be around 4, but in these examples it ranges from 10 to 100 for cursor movement, and 20 to 200 for scrolling. 100 and 200 are not hard maximums; they are just the highest values that I thought were useful to create examples for.
In my experimentation, I was able to really throw my hand around and still get stable movement of the cursor. I’d need to spend some time with someone who has this condition to really understand what good values are, and what else needs to be tweaked.
In general, you want the means to be as small as possible while still keeping the cursor stable without strain. When the numbers are too high, the cursor will lag behind what you are asking it to do.
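The trade-off described above can be sketched with a minimal moving mean over position samples. This is an illustrative implementation, not handWavey’s actual code; only the window sizes echo the numbers discussed above.

```python
from collections import deque

class MovingMean:
    """Smooth noisy position samples with a fixed-size moving mean.

    A larger window gives a steadier cursor but more lag, which is
    the stability-vs-responsiveness trade-off described above.
    """
    def __init__(self, window=4):
        self.samples = deque(maxlen=window)

    def update(self, value):
        """Add a new sample and return the smoothed value."""
        self.samples.append(value)
        return sum(self.samples) / len(self.samples)

# A jittery hand position settles to a stable cursor position.
smooth = MovingMean(window=4)
for x in [100, 104, 96, 100]:
    cursor_x = smooth.update(x)
print(cursor_x)  # 100.0
```

With `window=100`, the same jitter would barely register, but the cursor would also take many frames to catch up to a deliberate movement, which is exactly the lag you feel when the numbers are too high.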
Here is the R&D episode where I demoed the feature:
Reduced mobility
(Thinking about struggling to do fine movements with the hands.)
The reducedMobility gestureLayouts are for people with reduced mobility in their hands. The current examples at the time of writing assume reduced fine control of the hands, but not the arms. This is addressed by splitting pointing and actions across two separate hands. Specifically:
- The primary hand (first to be introduced) moves the cursor.
- The secondary hand performs the actions.
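That split can be sketched as a simple dispatch on the hand’s role. The function, event names, and frame structure here are hypothetical, purely to illustrate the idea:

```python
# Route frames by hand role: the primary hand moves the cursor,
# the secondary hand performs actions. Purely illustrative.

def route(role, frame):
    """Return what a frame from a given hand should drive."""
    if role == "primary":
        return ("moveCursor", frame["x"], frame["y"])
    if role == "secondary" and frame.get("gesture"):
        return ("performAction", frame["gesture"])
    return ("ignore",)

print(route("primary", {"x": 10, "y": 20}))
# ('moveCursor', 10, 20)
print(route("secondary", {"x": 0, "y": 0, "gesture": "grab"}))
# ('performAction', 'grab')
```

The point of the split is that the pointing hand never has to hold a precise gesture while moving, and the gesturing hand never has to be precise about position.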
I demoed this feature in this R&D episode:
Controlling it with your feet
Above: Me using the foot gestureLayout.
The foot gestureLayout is another approach to reduced mobility. It would require a fair bit more work to get to a usable level, and I don’t know anyone who needs it, so I’m not going to invest more time until I know that someone would benefit from it. If you would benefit from it, please let me know.
It’s not easy to use yet. But for now, just know that it works and is a possibility. It just needs more time investment, and input from someone who would make use of it.
User testing
Testing handWavey with real-world users has been a fist-full of insight to the face, but very necessary.
Above: A shot from the R&D video about user testing where the users played with a painting canvas.
Where I came from
When I got handWavey working just well enough to use, I moved my mouse out of reach and used handWavey full-time so that I’d feel what things were important, and what could wait. I did this for several months while I refined things and added functionality. Effectively, I learnt the whole thing in a series of steps over a long period of time. It felt so incredibly natural to me that when I first started testing with other people, it was a bit of a surprise to see how much trouble they were having with it. They hadn’t had the step-by-step introduction that I had had.
Getting up to speed more easily
To make handWavey more approachable, I created a quick course to help you practise the individual concepts without the stress of triggering unintentional actions on the computer. It’s basically a series of configurations with different things disabled, and some accompanying text guiding you through what to concentrate on.
grabClick vs tiltClick
One of the big lessons from the user testing was that people found grabClick much easier to learn than tiltClick. However, once you learn tiltClick, you can interact with the computer much more precisely and effortlessly. So I recommend starting with grabClick, then migrating to tiltClick as soon as you feel comfortable with it.
There are other gestureLayouts that can be experimented with.
Rotating it like a track pad
Another suggestion that frequently comes up is to rotate it so that the movement is horizontal, more like a trackpad. I actually made a configuration for horizontal movement that people can try.
I can’t emphasise enough how much of a bad idea this is. Feel free to try it, but please consider carefully whether it’s actually what you want.
The rationale is that while the movement of the mouse cursor on the screen is vertical, the movement of the devices we are used to (like the trackpad, or a traditional mouse) is horizontal. So if you move your hand the same way you move a traditional pointing device, it will be more intuitive, right?
Probably. But, there are two huge downsides:
- In this orientation, your arm spends a lot more time extended further out from you. This will make you much more prone to injury.
- Switching between the two mentalities within handWavey is very difficult. If you start with the horizontal gestureLayout, it will be very hard to graduate to the standard vertical ones, and you will never get the most out of handWavey.
The ergonomics of gestures
One of the things that always shocks me about other projects like this is the reliance on fine, repetitive gestures to perform frequent actions. Combine that with the unreliable detection of those gestures that is so often present in these solutions, and you have a recipe for injury. If you’re involved with the development of these systems, please give this some more thought.
By contrast, handWavey uses large, gentle gestures that are designed to be simple and easy to perform without tensing up the muscles. I have data correction and cut-offs to make the input more reliable, with more coming. All of it is configurable with sensible defaults, so that you can make it fit you, not the other way around.
A focus on configuration
Over the last several years, I’ve seen a growing trend of arrogance in the tech space of telling users “This is what you need. Love it, or use a non-existent alternative.” All the while, businesses usually focus on the biggest subset of the population to get the biggest return on investment, while leaving large portions of the population totally un-catered to.
handWavey has a huge potential to help in the vast, and contrasting, world of accessibility. I wanted a solution where I can really geek out, but it can also be configured to be really simple, or to the specific needs of an individual.
The cost of this decision is that the configuration may be overwhelming to someone without a tech background. So for now the expectation is that someone who finds the customisation too daunting would get assistance from someone with a technical background. Again, the defaults are intended to be sensible, so you hopefully don’t have to change anything. But if you want or need to, you can. AND there are lots of ready-to-use examples to help you succeed.
My usage
I’ve just taken a very long (several months) break from handWavey. I wanted to test how my wrists responded to using and not using handWavey to make sure that I wasn’t causing myself damage by using it. I’m confident that I am not, but will re-assess this from time to time.
I have no scientific basis to tell you whether it is safe for you to use. Please use your judgement, and get professional advice when appropriate.
Above: Using a conventional mouse.
I’m going to continue using a mix of a traditional mouse and handWavey. I’m slowly reducing the number of limitations that prevent using it full-time. But more importantly, dividing my time between a traditional mouse and handWavey gives my body more variety, and will hopefully reduce the chances of injury from using one solution exclusively for years at a time.
Getting started
View the Getting started guide, and make sure to look at the training course.
The source code is publicly available.
What’s next
This project has been on hold for quite a while now while I focussed on other things. But coming back to it with fresh eyes, I see lots of niggles that I’m itching to fix, and I have lots of ideas for ways to improve it. I’ve also had a few requests from users that I’d really like to get to. So I’ll be doing more blog posts over the next while as I start to work through all of those.
In short: There’s a lot more coming on this.