
Android Multi-touch

How to use Multi-touch in Android 2

This is the first in a series of articles on developing multi-touch applications with Android 2.x. It is excerpted from Chapter 11 of the book “Hello, Android! (3rd edition)”, available in beta now at The Pragmatic Programmers.

Introducing multi-touch

Multi-touch is simply an extension of the regular touch-screen user interface, using two or more fingers instead of one. We’ve used single-finger gestures before, although we didn’t call it that. In Chapter 4 we let the user touch a tile in the Sudoku game in order to change it. That’s called a “tap” gesture. Another gesture is called “drag”. That’s where you hold one finger on the screen and move it around, causing the content under your finger to scroll.

Tap, drag, and a few other single-fingered gestures have always been supported in Android. But due to the popularity of the Apple iPhone, early Android users suffered from a kind of gesture envy. The iPhone supported multi-touch, in particular the “pinch zoom” gesture.

Three common touch gestures: a) tap, b) drag, and c) pinch zoom. (Image courtesy of GestureWorks.com)

With pinch zoom, you place two fingers on the screen and squeeze them together to make the item you’re viewing smaller, or pull them apart to make it bigger. Before Android 2.0 you had to use a clunky zoom control with icons that you pressed to zoom in and out (for example, the one enabled by setBuiltInZoomControls() in the MyMap example). But thanks to its new multi-touch support, you can now pinch to zoom on Android too! As long as the application supports it, of course.

Note: If you try to run the example in this chapter on Android 1.5 or 1.6, it will crash because those versions do not support multi-touch. We’ll learn how to work around that in Chapter 13, “Write Once, Test Everywhere”.

Warning: Multi-bugs ahead

Multi-touch, as implemented on current Android phones, is extremely buggy. In fact, it’s so buggy that it borders on the unusable. The API routinely reports invalid or impossible data points, especially during the transition from one finger to two fingers on the screen and vice versa.

On the developer forums you can find complaints of fingers getting swapped, x and y axes flipping, and multiple fingers sometimes being treated as one. With a lot of trial and error, I was able to get the example in this chapter working because the gesture it implements is so simple. Until Google acknowledges and fixes the problems, that may be about all you can do. Luckily, pinch zoom seems to be the only multi-touch gesture most people want.

The Touch example

To demonstrate multi-touch, we’re going to build a simple image viewer application that lets you zoom in and scroll around an image. Here’s a screenshot of the finished product:

The Touch example implements a simple image viewer with drag and pinch zoom.

Building the Touch example

To demonstrate multi-touch, we’re going to build a simple image viewer application that lets you zoom in and scroll around an image. See Part 1 for a screenshot of the finished product.

Begin by creating a new “Hello, Android” project with the following parameters in the New Android Project dialog box:

Project name: Touch
Build Target: Android 2.1
Application name: Touch
Package name: org.example.touch
Create Activity: Touch

This will create Touch.java to contain your main activity. Let’s edit it to show a sample image, put in a touch listener, and add a few imports we’ll need later:

From Touchv1/src/org/example/touch/Touch.java:

package org.example.touch;

import android.app.Activity;
import android.graphics.Matrix;
import android.graphics.PointF;
import android.os.Bundle;
import android.util.FloatMath;
import android.util.Log;
import android.view.MotionEvent;
import android.view.View;
import android.view.View.OnTouchListener;
import android.widget.ImageView;

public class Touch extends Activity implements OnTouchListener {
   private static final String TAG = "Touch";

   @Override
   public void onCreate(Bundle savedInstanceState) {
      super.onCreate(savedInstanceState);
      setContentView(R.layout.main);
      ImageView view = (ImageView) findViewById(R.id.imageView);
      view.setOnTouchListener(this);
   }

   @Override
   public boolean onTouch(View v, MotionEvent event) {
      // Handle touch events here...
      return true; // indicate the event was handled (we'll flesh this out below)
   }
}

We’ll fill out that onTouch( ) method in a moment. First we need to define the layout for our activity:

From Touchv1/res/layout/main.xml:

xmlns:android=\"http://schemas.android.com/apk/res/android\" android:layout_width=\"fill_parent\" android:layout_height=\"fill_parent\" >

The entire interface is a big ImageView control that covers the whole screen. The android:src="@drawable/butterfly" value refers to the butterfly image used in the example. You can use any JPG or PNG image you like; just put it in the res/drawable-nodpi directory. The android:scaleType="matrix" attribute indicates we’re going to use a matrix to control the position and scale of the image. More on that later. The AndroidManifest.xml file is untouched except for the addition of the android:theme= attribute:

From Touchv1/AndroidManifest.xml:

android:theme=\"@android:style/Theme.NoTitleBar.Fullscreen\" > android:label=\"@string/app_name\" >

@android:style/Theme.NoTitleBar.Fullscreen, as the name suggests, tells Android to use the entire screen with no title bar or status bar at the top. You can run the application now and it will simply display the picture.

Understanding touch events

Whenever I learn a new API, I like to first put in some code to dump everything out, so I can get a feel for what the methods do and in what order events happen. So let’s start with that. First add a call to the dumpEvent() method inside onTouch():

From Touchv1/src/org/example/touch/Touch.java:

@Override
public boolean onTouch(View v, MotionEvent event) {
   // Dump touch event to log
   dumpEvent(event);
   return true; // indicate event was handled
}

Note that we need to return true to indicate to Android that the event has been handled. Next, define the dumpEvent() method. The only parameter is the event that we want to dump.

From Touchv1/src/org/example/touch/Touch.java:

/** Show an event in the LogCat view, for debugging */
private void dumpEvent(MotionEvent event) {
   String names[] = { "DOWN", "UP", "MOVE", "CANCEL", "OUTSIDE",
         "POINTER_DOWN", "POINTER_UP", "7?", "8?", "9?" };
   StringBuilder sb = new StringBuilder();
   int action = event.getAction();
   int actionCode = action & MotionEvent.ACTION_MASK;
   sb.append("event ACTION_").append(names[actionCode]);
   if (actionCode == MotionEvent.ACTION_POINTER_DOWN
         || actionCode == MotionEvent.ACTION_POINTER_UP) {
      sb.append("(pid ").append(
            action >> MotionEvent.ACTION_POINTER_ID_SHIFT);
      sb.append(")");
   }
   sb.append("[");
   for (int i = 0; i < event.getPointerCount(); i++) {
      sb.append("#").append(i);
      sb.append("(pid ").append(event.getPointerId(i));
      sb.append(")=").append((int) event.getX(i));
      sb.append(",").append((int) event.getY(i));
      if (i + 1 < event.getPointerCount())
         sb.append(";");
   }
   sb.append("]");
   Log.d(TAG, sb.toString());
}

Output will go to the Android debug log, which you can see by opening the LogCat view (see Section 3.10, Debugging with Log Messages).

The easiest way to understand this code is to run it. Unfortunately you can’t run this program on the Emulator (actually you can, but the Emulator doesn’t support multi-touch so the results won’t be very interesting). So hook up a real phone to your USB port and run the sample there (see Section 1.4, Running on a Real Phone). When I tried it on my phone and performed a few quick gestures, I received the output below:

1. event ACTION_DOWN[#0(pid 0)=135,179]
2. event ACTION_MOVE[#0(pid 0)=135,184]
3. event ACTION_MOVE[#0(pid 0)=144,205]
4. event ACTION_MOVE[#0(pid 0)=152,227]
5. event ACTION_POINTER_DOWN(pid 1)[#0(pid 0)=153,230;#1(pid 1)=380,538]
6. event ACTION_MOVE[#0(pid 0)=153,231;#1(pid 1)=380,538]
7. event ACTION_MOVE[#0(pid 0)=155,236;#1(pid 1)=364,512]
8. event ACTION_MOVE[#0(pid 0)=157,240;#1(pid 1)=350,498]
9. event ACTION_MOVE[#0(pid 0)=158,245;#1(pid 1)=343,494]
10. event ACTION_POINTER_UP(pid 0)[#0(pid 0)=158,247;#1(pid 1)=336,484]
11. event ACTION_MOVE[#0(pid 1)=334,481]
12. event ACTION_MOVE[#0(pid 1)=328,472]
13. event ACTION_UP[#0(pid 1)=327,471]

Here’s how to interpret the events:

On line 1 we see an ACTION_DOWN event, so the user must have pressed one finger on the screen. The finger was positioned at coordinates x=135, y=179, which is near the upper left of the display. You can’t tell yet whether they’re trying to do a tap or a drag.

Next, starting on line 2 there are some ACTION_MOVE events, indicating the user moved their finger around a bit to those coordinates given in the events. (It’s actually very hard to put your finger on the screen and not move it at all, so you’ll get a lot of these.) By the amount moved you can tell the user is doing a drag gesture.

The next event, ACTION_POINTER_DOWN on line 5, means the user pressed down another finger. “pid 1” means that pointer id 1 (that is, finger number 1) was pressed. Finger number 0 was already down, so we now have two fingers being tracked on the screen. In theory, the Android API can support up to 256 fingers at once, but the first crop of Android 2.x phones is limited to 2. The coordinates for both fingers come back as part of the event. It looks like the user is about to start a pinch zoom gesture.

Here’s where it gets interesting. The next thing we see is a series of

ACTION_MOVE events starting on line 6. Unlike before, now we have two fingers moving around. If you look closely at the coordinates you can see the fingers are moving closer together as part of a pinch zoom.

Then on line 10 we see an ACTION_POINTER_UP on pid 0. This means that finger number 0 was lifted off the screen. Finger number 1 is still there. Naturally, this ends the pinch zoom gesture.

We see a couple more ACTION_MOVE events starting on line 11, indicating the remaining finger is still moving around a little. If you compare these to the earlier move events, you’ll notice a different pointer id is reported. Unfortunately the touch API is so buggy you can’t always count on that.

Finally, on line 13 we get an ACTION_UP event as the last finger is removed from the screen.

Now the code for dumpEvent() should make a little more sense. The getAction() method returns the action being performed (up, down, or move). The lowest 8 bits of the action value are the action code itself, and the next 8 bits are the pointer (finger) id, so we have to use a bitwise AND (&) and a right shift (>>) to separate them.

Then we call the getPointerCount() method to see how many finger positions are included. getX() and getY() return the X and Y coordinates, respectively. The fingers can appear in any order, so we have to call getPointerId() to find out which fingers we’re really talking about.
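To make that decoding concrete, here is a minimal standalone sketch (not part of the book’s listing, but using only the MotionEvent constants and methods discussed above; assume it runs inside onTouch(), so event and TAG are in scope):

int action = event.getAction();
// Low 8 bits: the action code (DOWN, MOVE, POINTER_DOWN, POINTER_UP, ...)
int actionCode = action & MotionEvent.ACTION_MASK;
// Next 8 bits: which pointer an ACTION_POINTER_DOWN/UP refers to
int pid = (action & MotionEvent.ACTION_POINTER_ID_MASK)
      >> MotionEvent.ACTION_POINTER_ID_SHIFT;
Log.d(TAG, "actionCode=" + actionCode + " pid=" + pid);
// Walk every finger currently on the screen
for (int i = 0; i < event.getPointerCount(); i++) {
   Log.d(TAG, "pointer id " + event.getPointerId(i)
         + " at " + event.getX(i) + "," + event.getY(i));
}

The pid value is only meaningful for ACTION_POINTER_DOWN and ACTION_POINTER_UP, which is why dumpEvent() prints it only for those two action codes.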

That covers the raw touch event data. The trick, as you might imagine, is in interpreting and acting on that data.

Setting up for Image Transformation

In order to move and zoom the image we’ll use a neat little feature on the ImageView class called matrix transformation. Using a matrix we can represent any kind of translation, rotation, or skew that we want to do to the image. We already turned it on by specifying android:scaleType="matrix" in the res/layout/main.xml file. In the Touch class, we need to declare two matrices as fields (one for the current value and one for the original value before the transformation). We’ll use them in the onTouch() method to transform the image. We also need a mode variable to tell whether we’re in the middle of a drag or zoom gesture:

From Touchv1/src/org/example/touch/Touch.java:

public class Touch extends Activity implements OnTouchListener {
   // These matrices will be used to move and zoom image
   Matrix matrix = new Matrix();
   Matrix savedMatrix = new Matrix();

   // We can be in one of these 3 states
   static final int NONE = 0;
   static final int DRAG = 1;
   static final int ZOOM = 2;
   int mode = NONE;

   @Override
   public boolean onTouch(View v, MotionEvent event) {
      ImageView view = (ImageView) v;

      // Dump touch event to log
      dumpEvent(event);

      // Handle touch events here...
      switch (event.getAction() & MotionEvent.ACTION_MASK) {
      }

      // Perform the transformation
      view.setImageMatrix(matrix);

      return true; // indicate event was handled
   }
}
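One note on this excerpt: the drag and zoom handlers in the next two sections also read three fields that the listing above doesn’t repeat: the point where the current gesture started, the midpoint between the two fingers, and the finger spacing when the zoom began. A minimal sketch of those declarations, using the names the later snippets assume (start, mid, and oldDist), placed alongside the matrices:

// Remember some things for zooming
PointF start = new PointF();
PointF mid = new PointF();
float oldDist = 1f;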

The matrix variable will be calculated inside the switch statement when we implement the gestures.

Implementing the Drag Gesture

A drag gesture starts when the first finger is pressed to the screen (ACTION_DOWN) and ends when it is removed (ACTION_UP or ACTION_POINTER_UP).

From Touchv1/src/org/example/touch/Touch.java:

switch (event.getAction() & MotionEvent.ACTION_MASK) {
case MotionEvent.ACTION_DOWN:
   savedMatrix.set(matrix);
   start.set(event.getX(), event.getY());
   Log.d(TAG, "mode=DRAG");
   mode = DRAG;
   break;
case MotionEvent.ACTION_UP:
case MotionEvent.ACTION_POINTER_UP:
   mode = NONE;
   Log.d(TAG, "mode=NONE");
   break;
case MotionEvent.ACTION_MOVE:
   if (mode == DRAG) {
      matrix.set(savedMatrix);
      matrix.postTranslate(event.getX() - start.x,
            event.getY() - start.y);
   }
   break;
}

When the gesture starts we remember the current value of the transformation matrix and the starting position of the pointer. Every time the finger moves, we start the transformation matrix over at its original value and call the postTranslate() method to add a translation vector, the difference between the current and starting positions.

If you run the program now you should be able to drag the image around the screen using your finger. Neat, huh?

Joe Asks: Why not use the built-in gesture library?

The android.gesture package provides a way to create, recognize, load, and save gestures. Unfortunately it is not very useful in practice. Among other problems, it doesn’t come with a standard collection of built-in gestures (like tap and drag) and it doesn’t support multi-touch gestures such as pinch zoom. Perhaps a future version of Android will include a better gesture library so the code in this chapter could be simplified.

Implementing the Pinch Zoom Gesture

The pinch zoom gesture is similar to the drag gesture, except it starts when the second finger is pressed to the screen (ACTION_POINTER_DOWN).

From Touchv1/src/org/example/touch/Touch.java:

case MotionEvent.ACTION_POINTER_DOWN:
   oldDist = spacing(event);
   Log.d(TAG, "oldDist=" + oldDist);
   if (oldDist > 10f) {
      savedMatrix.set(matrix);
      midPoint(mid, event);
      mode = ZOOM;
      Log.d(TAG, "mode=ZOOM");
   }
   break;
case MotionEvent.ACTION_MOVE:
   if (mode == DRAG) {
      // ...
   }
   else if (mode == ZOOM) {
      float newDist = spacing(event);
      Log.d(TAG, "newDist=" + newDist);
      if (newDist > 10f) {
         matrix.set(savedMatrix);
         float scale = newDist / oldDist;
         matrix.postScale(scale, scale, mid.x, mid.y);
      }
   }
   break;

When we get the down event for the second finger, we calculate and remember the distance between the two fingers. In my testing, Android would sometimes tell me (incorrectly) that there were two fingers pressed down in almost exactly the same position. So I added a check to ignore the event if the distance is smaller than some arbitrary number of pixels. If it’s bigger than that, we remember the current transformation matrix, calculate the midpoint of the two fingers, and start the zoom.

When a move event arrives while we’re in zoom mode, we calculate the distance between the fingers again. If it’s too small, the event is ignored; otherwise we restore the transformation matrix and scale the image around the midpoint. The scale is simply the ratio of the new distance divided by the old distance. If the new distance is bigger (that is, the fingers have gotten further apart), then the scale will be greater than 1, making the image bigger. If it’s smaller (fingers closer together), then the scale will be less than 1, making the image smaller. And of course if everything is the same, the scale is equal to 1 and the image is not changed. For example, if the fingers started 200 pixels apart and are now 300 pixels apart, the scale is 1.5 and the image grows by half. Now let’s define the spacing() and midPoint() methods.

Distance Between Two Points

To find out how far apart two fingers are, we first construct a vector (x, y) which is the difference between the two points. Then we use the formula for Euclidean distance to calculate the spacing:

From Touchv1/src/org/example/touch/Touch.java:

private float spacing(MotionEvent event) {
   float x = event.getX(0) - event.getX(1);
   float y = event.getY(0) - event.getY(1);
   return FloatMath.sqrt(x * x + y * y);
}
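If you would rather stay with java.lang.Math, the same distance can be computed with a cast, since Math.sqrt() works in double. This is just an alternative sketch, not the book’s code, and the helper name spacingWithMath is made up:

private float spacingWithMath(MotionEvent event) {
   float x = event.getX(0) - event.getX(1);
   float y = event.getY(0) - event.getY(1);
   return (float) Math.sqrt(x * x + y * y); // double math internally, cast back to float
}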

The order of the points doesn’t matter because any negative signs will be lost when we square them. Note that all math is done using Java’s float type. While some Android devices may not have floating point hardware, we’re not doing this often enough to worry about its performance. Midpoint of Two Points

Calculating a point in the middle of two points is even easier:

From Touchv1/src/org/example/touch/Touch.java:

private void midPoint(PointF point, MotionEvent event) {
   float x = event.getX(0) + event.getX(1);
   float y = event.getY(0) + event.getY(1);
   point.set(x / 2, y / 2);
}

All we do is take the average of their X and Y coordinates. To avoid garbage collections that can cause noticeable pauses in the application, we reuse an existing object to store the result rather than allocating and returning a new one each time.

Try running the program now on your phone. Drag the image with one finger, and zoom it by pinching two fingers in or out. For best results, don’t let your fingers get closer than an inch or so apart. Otherwise you’ll start to run into some of those bugs in the API I mentioned earlier.

Fast-Forward >>

In this chapter we learned how to use the multi-touch API to create a pinch zoom gesture. There’s a nice site called GestureWorks that describes a whole library of gestures that have been implemented on the Adobe Flash platform. If you’re willing to push the limits of Android’s quirky multi-touch support, then perhaps you can find ideas there for other gestures to implement in your Android programs.

Because multi-touch code uses new methods that didn’t exist before Android 2.0, if you try to run the Touch example on earlier versions it will fail with a “Force close” error. Luckily there are ways around this limitation (described later in the book - Ed). You can’t teach an old phone new tricks, but you can at least keep it from crashing. In the next chapter we’ll investigate home screen extensions, including live wallpaper.

