
Touch Screen in Windows Phone 7

Posted by Charles Petzold | Articles | Windows Phone | November 18, 2010
Windows Phone 7 comes with a feature that is likely to be new and unusual. The screen on the phone is sensitive to touch. The multi-touch screen on a Windows Phone 7 device can detect at least four simultaneous fingers. It is the interaction of these fingers that makes multi-touch so challenging for programmers.


This chapter is taken from the book "Programming Windows Phone 7" by Charles Petzold, published by Microsoft Press: http://www.charlespetzold.com/phone/index.html
In a Silverlight program, touch input is obtained through events. In an XNA program, touch input comes through a static class polled during the Update method. One of the primary purposes of the XNA Update method is to check the state of touch input and make changes that affect what goes out to the screen during the Draw method.
The multi-touch input device is referred to in XNA as a touch panel. It is possible to obtain information about the multi-touch device itself by calling the static TouchPanel.GetCapabilities method. The TouchPanelCapabilities object returned from this method has two properties:

  • IsConnected, true if the touch panel is available
  • MaximumTouchCount, the number of touch points the panel can detect (at least 4 on the phone)
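A minimal sketch of querying these capabilities once during initialization (for example, in the game's constructor or Initialize override):

```csharp
// Query the touch panel's capabilities once at startup.
TouchPanelCapabilities capabilities = TouchPanel.GetCapabilities();

if (capabilities.IsConnected)
{
    // On Windows Phone 7 hardware this is at least 4.
    int maxTouches = capabilities.MaximumTouchCount;
}
```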

To obtain low-level touch input, you'll probably be calling this method during every call to Update after program initialization:

TouchCollection touchLocations = TouchPanel.GetState();
The TouchCollection is a collection of zero or more TouchLocation objects. TouchLocation has three properties:

  • State, a member of the TouchLocationState enumeration: Pressed, Moved, or Released
  • Position, a Vector2 giving the finger's position relative to the upper-left corner of the display
  • Id, an integer that identifies a particular finger from its Pressed state through its Released state

One exception: if a finger is tapped and released on the screen very quickly, that is, within about 1/30th of a second, it's possible that a TouchLocation with State equal to Pressed will be followed by one with State equal to Released, with no Moved states in between.
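The three states suggest the typical shape of low-level touch processing: a loop over the collection that dispatches on State. A sketch:

```csharp
// Dispatch on TouchLocationState inside the Update method.
foreach (TouchLocation touchLocation in TouchPanel.GetState())
{
    switch (touchLocation.State)
    {
        case TouchLocationState.Pressed:
            // A finger has just touched the screen.
            break;

        case TouchLocationState.Moved:
            // The finger is still down; Position may have changed.
            break;

        case TouchLocationState.Released:
            // The finger has lifted; last chance to use this Id.
            break;
    }
}
```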
TouchLocation also has a very handy method called TryGetPreviousLocation, which you call like this:
TouchLocation previousTouchLocation;
bool success = touchLocation.TryGetPreviousLocation(out previousTouchLocation);
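If the method returns false, there is no previous location (the touch has just been Pressed); otherwise the two positions can be subtracted to track how far the finger moved since the last Update, a common pattern for dragging. A sketch:

```csharp
// Compute the finger's movement since the previous frame.
TouchLocation previousTouchLocation;

if (touchLocation.TryGetPreviousLocation(out previousTouchLocation) &&
    previousTouchLocation.State != TouchLocationState.Invalid)
{
    Vector2 delta = touchLocation.Position - previousTouchLocation.Position;
    // Use delta to drag an object, scroll content, and so forth.
}
```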

The program I've proposed changes the text color when the user touches the text string, so the processing of TouchPanel.GetState will be relatively simple. The program needs a font, which I've made a little larger so it provides a more substantial touch target, and a few more fields:
Example

public class Game1 : Microsoft.Xna.Framework.Game
{
    GraphicsDeviceManager graphics;
    SpriteBatch spriteBatch;
    Random rand = new Random();
    string text = "Hello, Windows Phone 7!";
    SpriteFont segoe36;
    Vector2 textSize;
    Vector2 textPosition;
    Color textColor = Color.White;
    ...
}

The LoadContent method is similar to earlier versions except that textSize is saved as a field because it needs to be accessed in later calculations:

protected override void LoadContent()
{
    spriteBatch = new SpriteBatch(GraphicsDevice);
    segoe36 = this.Content.Load<SpriteFont>("Segoe36");
    textSize = segoe36.MeasureString(text);
    Viewport viewport = this.GraphicsDevice.Viewport;
    textPosition = new Vector2((viewport.Width - textSize.X) / 2,
                               (viewport.Height - textSize.Y) / 2);
}

As is typical with XNA programs, much of the "action" occurs in the Update method. The method calls TouchPanel.GetState and then loops through the collection of TouchLocation objects to find only those with State equal to Pressed.

protected override void Update(GameTime gameTime)
{
    // Allows the game to exit
    if (GamePad.GetState(PlayerIndex.One).Buttons.Back == ButtonState.Pressed)
         this.Exit();
    TouchCollection touchLocations = TouchPanel.GetState();
    foreach (TouchLocation touchLocation in touchLocations)
    {
        if (touchLocation.State == TouchLocationState.Pressed)
        {
            Vector2 touchPosition = touchLocation.Position;
            if (touchPosition.X >= textPosition.X &&
                touchPosition.X < textPosition.X + textSize.X &&
                touchPosition.Y >= textPosition.Y &&
                touchPosition.Y < textPosition.Y + textSize.Y)
            {
                textColor = new Color((byte)rand.Next(256),
                                      (byte)rand.Next(256),
                                      (byte)rand.Next(256));
            }
            else
            {
                textColor = Color.White;
            }
        }
    }
    base.Update(gameTime);
}

If the Position is inside the rectangle occupied by the text string, the textColor field is set to a random RGB color value using one of the constructors of the Color structure. Otherwise, textColor is set to Color.White.
The Draw method looks very similar to the versions you've seen before, except that the text color is a variable:

protected override void Draw(GameTime gameTime)
{
    this.GraphicsDevice.Clear(Color.Navy);
    spriteBatch.Begin();
    spriteBatch.DrawString(segoe36, text, textPosition, textColor);
    spriteBatch.End();
    base.Draw(gameTime);
}

Gesture Interface

The TouchPanel class also includes gesture recognition, which is demonstrated by the XnaTapHello project. The fields of this project are the same as those in XnaTouchHello, but the LoadContent method is a little different:

protected override void LoadContent()
{
    spriteBatch = new SpriteBatch(GraphicsDevice);
    segoe36 = this.Content.Load<SpriteFont>("Segoe36");
    textSize = segoe36.MeasureString(text);
    Viewport viewport = this.GraphicsDevice.Viewport;
    textPosition = new Vector2((viewport.Width - textSize.X) / 2,
                               (viewport.Height - textSize.Y) / 2);
    TouchPanel.EnabledGestures = GestureType.Tap;
}

Notice the final statement. GestureType is an enumeration with members Tap, DoubleTap, Flick, Hold, Pinch, PinchComplete, FreeDrag, HorizontalDrag, VerticalDrag, and DragComplete, defined as bit flags so you can combine the ones you want with the C# bitwise OR operator.
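For example, a hypothetical program that wants to respond to more than one gesture type would set them all at once:

```csharp
// Assumption: a program interested in taps, flicks, and horizontal drags.
// The bit flags are combined with the C# bitwise OR operator.
TouchPanel.EnabledGestures = GestureType.Tap |
                             GestureType.Flick |
                             GestureType.HorizontalDrag;
```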
The Update method is very different.

protected override void Update(GameTime gameTime)
{
    // Allows the game to exit
    if (GamePad.GetState(PlayerIndex.One).Buttons.Back == ButtonState.Pressed)
        this.Exit();
    while (TouchPanel.IsGestureAvailable)
    {
        GestureSample gestureSample = TouchPanel.ReadGesture();
        if (gestureSample.GestureType == GestureType.Tap)
        {
            Vector2 touchPosition = gestureSample.Position;
            if (touchPosition.X >= textPosition.X &&
                touchPosition.X < textPosition.X + textSize.X &&
                touchPosition.Y >= textPosition.Y &&
                touchPosition.Y < textPosition.Y + textSize.Y)
            {
                textColor = new Color((byte)rand.Next(256),
                                      (byte)rand.Next(256),
                                      (byte)rand.Next(256));
            }
            else
            {
                textColor = Color.White;
            }
        }
    }
    base.Update(gameTime);
}

Although this program is interested in only one type of gesture, the code is rather generalized. If a gesture is available, it is returned from the TouchPanel.ReadGesture method as an object of type GestureSample.
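Because only gestures enabled through TouchPanel.EnabledGestures are ever reported, a program handling several types typically switches on the GestureType property of each GestureSample. A sketch:

```csharp
// A more general gesture loop dispatching on GestureType.
while (TouchPanel.IsGestureAvailable)
{
    GestureSample gestureSample = TouchPanel.ReadGesture();

    switch (gestureSample.GestureType)
    {
        case GestureType.Tap:
            // gestureSample.Position is the tap location.
            break;

        case GestureType.FreeDrag:
            // gestureSample.Delta is the movement since the last sample.
            break;

        case GestureType.Flick:
            // For flicks, gestureSample.Delta is a velocity
            // in pixels per second.
            break;
    }
}
```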
Touch Events in Silverlight

Like XNA, Silverlight also supports two different programming interfaces for working with multi-touch, which can be most easily categorized as low-level and high-level. The low-level interface is based around the static Touch.FrameReported event, which is very similar to the XNA TouchPanel except that it's an event and it doesn't include gestures.
The high-level interface consists of three events defined by the UIElement class: ManipulationStarted, ManipulationDelta, and ManipulationCompleted. The Manipulation events, as they're collectively called, consolidate the interaction of multiple fingers into movement and scaling factors.
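In a page class, the Manipulation events can be handled either by attaching event handlers or by overriding the corresponding protected methods that Control defines. A minimal sketch of the override approach, assuming a class derived from PhoneApplicationPage:

```csharp
// A sketch of handling the start of a manipulation in a page class.
protected override void OnManipulationStarted(ManipulationStartedEventArgs args)
{
    // args.ManipulationOrigin is the touch point relative to
    // args.ManipulationContainer.
    args.Complete();     // optionally suppress further Delta/Completed events
    args.Handled = true;
    base.OnManipulationStarted(args);
}
```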
The program I want to write is only interested in the primary touch point when it has a TouchAction of Down, so I can use that same logic. The SilverlightTouchHello project has a TextBlock in the XAML file:

<Grid x:Name="ContentPanel" Grid.Row="1" Margin="12,0,12,0">
     <TextBlock Name="txtblk"
               Text="Hello, Windows Phone 7!"
               Padding="0 22"
               HorizontalAlignment="Center"
               VerticalAlignment="Center" />
</Grid>
I used Padding rather than Margin because Padding is space inside the TextBlock. The TextBlock actually becomes larger than the text size would imply. Here's the complete code-behind file. The constructor of MainPage installs the Touch.FrameReported event handler.

using System;
using System.Windows.Input;
using System.Windows.Media;
using Microsoft.Phone.Controls;
namespace SilverlightTouchHello
{
    public partial class MainPage : PhoneApplicationPage
    {
        Random rand = new Random();
        Brush originalBrush;       
        public MainPage()
        {
            InitializeComponent();
            originalBrush = txtblk.Foreground;
            Touch.FrameReported += OnTouchFrameReported;
        }
        void OnTouchFrameReported(object sender, TouchFrameEventArgs args)
        {
            TouchPoint primaryTouchPoint = args.GetPrimaryTouchPoint(null); 
            if (primaryTouchPoint != null && primaryTouchPoint.Action == TouchAction.Down)
            {
                if (primaryTouchPoint.TouchDevice.DirectlyOver == txtblk)
                {
                    txtblk.Foreground = new SolidColorBrush(
                                Color.FromArgb(255, (byte)rand.Next(256),
                                                    (byte)rand.Next(256),
                                                    (byte)rand.Next(256)));
                }
                else
                {
                    txtblk.Foreground = originalBrush;
                }
            }
        }
    }
}
The event handler is only interested in primary touch points with an Action of Down. If the DirectlyOver property is the element named txtblk, a random color is created.

Summary
I hope this article helps clarify touch interface programming for Windows Phone 7.
