Now that .NET Framework 4.0 Beta 2 is out, let's take another look at what is available for building multi-touch applications in WPF. In Beta 1 we got only a preview of the manipulation and inertia components. With Beta 2 we finally get access to the whole touch input system, and it looks very close to what was shown at PDC last year. Here is an overview from MSDN:
Elements in WPF now accept touch input. The UIElement, UIElement3D, and ContentElement classes expose events that occur when a user touches an element on a touch-enabled screen. The following events are defined on UIElement and respond to touch input: PreviewTouchDown, TouchDown, PreviewTouchMove, TouchMove, PreviewTouchUp, TouchUp, GotTouchCapture, LostTouchCapture, TouchEnter, and TouchLeave. Note that these events are also defined on UIElement3D and ContentElement.
In addition to the touch events, UIElement supports manipulation. A manipulation is interpreted to scale, rotate, or translate the UIElement. For example, a photo viewing application might allow users to move, zoom, resize, and rotate a photo by touching the computer screen over the photo. The following events have been added to UIElement: ManipulationStarting, ManipulationStarted, ManipulationDelta, ManipulationInertiaStarting, ManipulationBoundaryFeedback, and ManipulationCompleted.
To enable manipulation, set the IsManipulationEnabled property to true.
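To make this concrete, here is a minimal sketch (mine, not from the MSDN documentation or from the sample at the end of this post) of enabling manipulation on a Rectangle and applying the reported translation, rotation, and scale through a MatrixTransform:

```csharp
using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;
using System.Windows.Shapes;

// Illustrative only; the class and element names are not from the original sample.
public class ManipulationDemoWindow : Window
{
    public ManipulationDemoWindow()
    {
        var rect = new Rectangle
        {
            Width = 200,
            Height = 150,
            Fill = Brushes.SteelBlue,
            RenderTransform = new MatrixTransform(),
            IsManipulationEnabled = true            // opt the element into manipulation events
        };

        // Make all manipulation coordinates relative to the window.
        rect.ManipulationStarting += (s, e) => e.ManipulationContainer = this;

        rect.ManipulationDelta += (s, e) =>
        {
            // Apply the incremental rotation, scale, and translation reported by WPF.
            var transform = (MatrixTransform)rect.RenderTransform;
            var m = transform.Matrix;
            var delta = e.DeltaManipulation;
            var origin = e.ManipulationOrigin;

            m.RotateAt(delta.Rotation, origin.X, origin.Y);
            m.ScaleAt(delta.Scale.X, delta.Scale.Y, origin.X, origin.Y);
            m.Translate(delta.Translation.X, delta.Translation.Y);

            transform.Matrix = m;
        };

        Content = new Canvas { Children = { rect } };
    }
}
```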
This is exactly what we have been waiting for. I started playing with this new API and will try to share some samples soon. But here is one problem I ran into already. When I tried to use touch in an ordinary WPF application, I noticed that all Buttons behave strangely: when I touch a Button it doesn't go into the pressed state immediately (I'm testing this on an HP TouchSmart). Instead it only fires the Click event when I lift my finger. Testing further, I noticed that no UIElement fires the MouseDown event until I either lift my finger or slide it quite a bit.
To help me diagnose this issue I created a simple test application shown here:
From left to right I have gray rectangles that react to input events from the mouse, the stylus, and touch. Each rectangle changes color to orange when the mouse or stylus is over it, and to green when it is pressed. In addition, on press I capture the appropriate device, and while the capture is held the border changes color to red.
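The actual test app is in the download at the end of the post; as a rough illustration only, the touch rectangle could be wired up along these lines (class name, colors, and sizes here are mine, not the author's):

```csharp
using System.Windows.Input;
using System.Windows.Media;
using System.Windows.Shapes;

// Rough sketch of one test rectangle reacting to raw touch input;
// identifiers and colors are illustrative, not taken from the downloadable sample.
public static class TouchTestRectangle
{
    public static Rectangle Create()
    {
        var rect = new Rectangle
        {
            Width = 150,
            Height = 150,
            Fill = Brushes.LightGray,
            Stroke = Brushes.Transparent,
            StrokeThickness = 4
        };

        rect.TouchEnter += (s, e) => rect.Fill = Brushes.Orange;      // finger over the element
        rect.TouchLeave += (s, e) => rect.Fill = Brushes.LightGray;

        rect.TouchDown += (s, e) =>
        {
            rect.Fill = Brushes.Green;            // "pressed" state
            rect.CaptureTouch(e.TouchDevice);     // capture the touch device
            e.Handled = true;
        };

        rect.GotTouchCapture += (s, e) => rect.Stroke = Brushes.Red;  // capture indicator
        rect.LostTouchCapture += (s, e) => rect.Stroke = Brushes.Transparent;

        rect.TouchUp += (s, e) =>
        {
            rect.Fill = Brushes.LightGray;
            rect.ReleaseTouchCapture(e.TouchDevice);
            e.Handled = true;
        };

        return rect;
    }
}
```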
This test confirms that when I use the touch panel, the xxxDown events don't arrive immediately, regardless of which device's events I listen to. I spent quite a while trying to figure this out, experimenting with various settings in both the Touch and Pen Control Panel and NextWindow's USB Config utility.
Finally I noticed that, somehow, everything works perfectly fine when I use manipulations. From there I quickly found that the problem goes away when you set the IsManipulationEnabled property on the element. It turns out that a side effect of setting this property to true is that the stylus properties on the element are changed to disable the PressAndHold and Flicks gestures. That explains the problem: the stylus engine has to postpone these events while it tries to interpret the gestures. You can see the difference by selecting the appropriate checkbox in the window.
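If that explanation is right, you should be able to get the same effect without enabling manipulation by setting the corresponding Stylus attached properties on the element yourself. A small sketch (the helper name is mine, and whether this fully matches the IsManipulationEnabled side effect is an assumption):

```csharp
using System.Windows;
using System.Windows.Input;

// Sketch: turn off gesture detection on an element directly, without enabling manipulation.
// Assumes this reproduces the side effect of IsManipulationEnabled described above.
static void DisableStylusGestures(UIElement element)
{
    Stylus.SetIsPressAndHoldEnabled(element, false); // no press-and-hold (right-click) detection
    Stylus.SetIsFlicksEnabled(element, false);       // no flick gesture detection
}
```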
However, although we now get the xxxDown events immediately (which will be very useful for manipulations), this doesn't fix the problem with the Button pressed state. You will notice that the MouseDown event is still deferred regardless of the IsManipulationEnabled setting. I believe that in this case the delay might be caused by input logic in Windows itself. In fact, the only way I could affect it was to force the touch panel to report its input as mouse events (using NextWindow's USB Config tool).
In the end, I believe the proper fix would require extending all of the WPF built-in controls so that they understand touch events and react accordingly. At the same time, some controls might gain multi-touch-specific behaviors as well. For example, it has already been announced that ScrollViewer will be enhanced to support multi-touch panning. In the case of Button, it has been mentioned several times that when simultaneous touches occur, the correct behavior is to fire the Click event only after the last touch is lifted. I hope we get some of these enhancements in the RTM timeframe.
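Just to illustrate that last idea (this is purely a sketch of mine, not part of the downloadable sample below and not necessarily how the built-in Button will end up behaving), a touch-aware button could capture each touch and raise its click logic only when the last captured touch is released:

```csharp
using System.Collections.Generic;
using System.Windows.Controls;
using System.Windows.Input;

// Sketch of the "raise Click only after the last touch lifts" idea.
// The class name and details are hypothetical.
public class TouchAwareButton : Button
{
    private readonly HashSet<TouchDevice> _activeTouches = new HashSet<TouchDevice>();

    protected override void OnTouchDown(TouchEventArgs e)
    {
        base.OnTouchDown(e);
        if (CaptureTouch(e.TouchDevice))
        {
            _activeTouches.Add(e.TouchDevice);
            e.Handled = true;
        }
    }

    protected override void OnTouchUp(TouchEventArgs e)
    {
        base.OnTouchUp(e);
        if (_activeTouches.Remove(e.TouchDevice))
        {
            ReleaseTouchCapture(e.TouchDevice);
            if (_activeTouches.Count == 0)
                OnClick();                     // fire Click once the last finger is lifted
            e.Handled = true;
        }
    }
}
```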
You can download the sample code here.