Android and touch event


Previously we used OnClickListener on View components to catch short clicks. Now let's try to catch touches and finger movement across a component. They consist of three types of events:

— press (the finger touches the screen)
— move (the finger moves across the screen)
— release (the finger leaves the screen)

We can catch all of these events in an OnTouchListener handler assigned to a View component. The handler gives us a MotionEvent object, from which we extract the event type and the coordinates.

In this lesson we will only look at single touches. Multitouch is covered in the next lesson.

Project name: P1021_Touch
Build Target: Android 2.3.3
Application name: Touch
Package name: ru.startandroid.develop.p1021touch
Create Activity: MainActivity

We won't need strings.xml or main.xml, so leave them untouched.

MainActivity implements the OnTouchListener interface so that it can act as the touch handler.

In onCreate we create a new TextView, tell it that the Activity will be its touch handler, and place it on the screen.

The OnTouchListener interface requires the Activity to implement its onTouch method. The method receives the View the touch event occurred on and a MotionEvent object with information about the event.

The getX and getY methods give us the X and Y coordinates of the touch. The getAction method gives the type of the touch event:

ACTION_DOWN – press
ACTION_MOVE – move
ACTION_UP – release
ACTION_CANCEL – almost never occurs. As far as I understand, it happens on some kind of internal failure and should be treated like ACTION_UP.

On ACTION_DOWN we write the press coordinates to sDown.

On ACTION_MOVE we write the current finger position to sMove. If we drag the finger across the screen, this text will change continuously.

On ACTION_UP or ACTION_CANCEL we write the coordinates of the point where the finger was lifted to sUp.

At the end of each event we output all of this to the TextView and return true: we have handled the event ourselves.

Now, as we drag a finger across the screen (or the cursor across the emulator) in the app, we will see the coordinates where the movement started, the current coordinates, and the coordinates where the movement ended.
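The lesson's MainActivity then looks roughly like this (a sketch reconstructed from the description above; the exact string formatting is illustrative):

```java
import android.app.Activity;
import android.os.Bundle;
import android.view.MotionEvent;
import android.view.View;
import android.view.View.OnTouchListener;
import android.widget.TextView;

public class MainActivity extends Activity implements OnTouchListener {

  TextView tv;
  String sDown = "", sMove = "", sUp = "";

  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    tv = new TextView(this);
    tv.setOnTouchListener(this); // the Activity is the touch handler
    setContentView(tv);
  }

  @Override
  public boolean onTouch(View v, MotionEvent event) {
    switch (event.getAction()) {
      case MotionEvent.ACTION_DOWN: // finger pressed
        sDown = "Down: " + event.getX() + "," + event.getY();
        sMove = "";
        sUp = "";
        break;
      case MotionEvent.ACTION_MOVE: // finger is moving
        sMove = "Move: " + event.getX() + "," + event.getY();
        break;
      case MotionEvent.ACTION_UP: // finger lifted
      case MotionEvent.ACTION_CANCEL: // treat like ACTION_UP
        sUp = "Up: " + event.getX() + "," + event.getY();
        break;
    }
    tv.setText(sDown + "\n" + sMove + "\n" + sUp);
    return true; // we handled the event ourselves
  }
}
```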

Save everything and run the application.

Put a finger (the cursor) on the screen.

If last night went well, your head doesn't ache and your hand is steady :), the press coordinates appear.

If your hand did tremble, the move coordinates appear as well.

Keep moving the finger and watch the Move coordinates change.

Now lift the finger off the screen and see the coordinates of the point where that happened.

All in all it's not complicated. With multitouch the process gets a bit more involved: there we will track up to 10 touches.

If you are already familiar with drawing in Android, you could easily build an app that displays a geometric shape that can be dragged around with a finger. A simple example implementation is available here: http://forum.startandroid.ru/viewtopic.php?f=28&t=535.

In the next lesson:

— handling multiple touches



Understanding Touch Control and Events in Android

Almost all Android phones nowadays are touch-controlled; very few are not.

In this blog, we are going to talk about how to handle touch events in Android: how touch controls and touch events work when we tap the screen.

We are going to discuss how touch events in Android work internally for any view.

So, how do input events actually work, and what exactly happens when we touch the screen and there is a ViewGroup with different views inside it?

In this case, we are going to look at a LinearLayout containing a Button.

What we are going to discuss in this blog is:

  • What happens when we touch the screen?
  • How do we intercept touch events?
  • How is the touch handled?
  • Which touch events do we mostly work with in Android for handling touch control?

What happens when we touch the screen?

So, when we touch the screen, the Activity's root view (known as the DecorView in Android) gets the touch event first. We generally don't work with the DecorView's touches directly, so the touch gets transferred to the ViewGroup and subsequently to its children defined in the XML file.

But how does the touch event get transferred?

In Android, the ViewGroup passes the touch event from the top of its hierarchy down to its children using dispatchTouchEvent().

How do we intercept touch events?

First, when we perform a touch action, the ViewGroup gets the touch event, and it can be intercepted in the ViewGroup itself using onInterceptTouchEvent().

If onInterceptTouchEvent() returns true, the touch event is not passed to the children; if it returns false, the system knows the ViewGroup wants to dispatch the event to its children, in our case the Button.

In general, returning true means we handle the event in the ViewGroup itself, with no need to dispatch it to the children.

Now, the Button is the last view in our view tree, so it cannot pass the touch event any further: it has no children of its own.

Intercepting of the event can only happen in ViewGroups and not Views.
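As a sketch, interception in a custom ViewGroup looks like this (the class name and the unconditional return values are illustrative, not from the article):

```java
import android.content.Context;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.widget.LinearLayout;

// A LinearLayout subclass that steals all touch events from its children.
public class InterceptingLayout extends LinearLayout {

  public InterceptingLayout(Context context, AttributeSet attrs) {
    super(context, attrs);
  }

  @Override
  public boolean onInterceptTouchEvent(MotionEvent ev) {
    // true  -> children never see the event; it goes to this ViewGroup's onTouchEvent
    // false -> the event is dispatched down to the children (e.g. our Button)
    return true;
  }

  @Override
  public boolean onTouchEvent(MotionEvent event) {
    // The intercepted event lands here.
    return true;
  }
}
```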

Now, let’s discuss how the touch is handled in the view.


How is the touch handled in the view?

When dispatching the touch event from top to bottom in the view hierarchy, we need to determine at which position in the tree the touch should be handled.

While dispatching starts at the top of the hierarchy, handling works the other way around: the child view's onTouchEvent is always called first, and handling then moves up toward the ViewGroups.

In other words, touch handling follows the same path as dispatching, but in reverse order, from child to parent.

If we dispatch the event from a ViewGroup and intercept it there, the return value (true/false) determines whether the touch is handled by the ViewGroup or by its children.

So in our case, if the Button's onTouchEvent returns true, the event has been handled and it will not go to the LinearLayout.

Which touch events do we mostly work with in Android for handling touch control?

When we get a touch event, it is handled by onTouchEvent, which takes a parameter of type MotionEvent.

Everything related to the touch is available through that event parameter. For example, we can read the X and Y coordinates of the touch point on the screen.

It also carries the action: when we tap the screen, MotionEvent.ACTION_DOWN is delivered, and when we lift the finger, MotionEvent.ACTION_UP.

When dragging a finger across the screen, the action is MotionEvent.ACTION_MOVE.

So, when we tap the button, the flow on the views is:

Activity -> dispatchTouchEvent (LinearLayout) -> dispatchTouchEvent(Button) -> onTouchEvent(Button).

and when we don't tap the button itself but want to handle the click on the LinearLayout, the flow is:

Activity -> dispatchTouchEvent (LinearLayout) -> dispatchTouchEvent(Button) -> onTouchEvent(Button) (will return false) -> onTouchEvent(LinearLayout).
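The two flows above can be reproduced with a small plain-Java simulation (no Android classes; TouchNode and its fields are illustrative names, not framework APIs). It mirrors dispatchTouchEvent offering the event to the child first and falling back to the node's own onTouchEvent; onInterceptTouchEvent is omitted for brevity:

```java
import java.util.ArrayList;
import java.util.List;

class TouchNode {
  final String name;
  final boolean handles;   // what onTouchEvent returns
  final TouchNode child;   // null for a leaf View like our Button
  final List<String> log;

  TouchNode(String name, boolean handles, TouchNode child, List<String> log) {
    this.name = name; this.handles = handles; this.child = child; this.log = log;
  }

  // Mirrors dispatchTouchEvent: offer the event to the child first,
  // then fall back to this node's own onTouchEvent.
  boolean dispatchTouchEvent() {
    log.add("dispatch:" + name);
    if (child != null && child.dispatchTouchEvent()) return true;
    return onTouchEvent();
  }

  boolean onTouchEvent() {
    log.add("onTouch:" + name);
    return handles;
  }
}

class TouchFlowDemo {
  public static void main(String[] args) {
    // Case 1: the Button handles the touch -> LinearLayout's onTouchEvent never runs.
    List<String> log = new ArrayList<>();
    TouchNode button = new TouchNode("Button", true, null, log);
    TouchNode layout = new TouchNode("LinearLayout", true, button, log);
    layout.dispatchTouchEvent();
    System.out.println(String.join(" -> ", log));
    // dispatch:LinearLayout -> dispatch:Button -> onTouch:Button

    // Case 2: the Button returns false -> the event bubbles up to LinearLayout.
    log.clear();
    button = new TouchNode("Button", false, null, log);
    layout = new TouchNode("LinearLayout", true, button, log);
    layout.dispatchTouchEvent();
    System.out.println(String.join(" -> ", log));
    // dispatch:LinearLayout -> dispatch:Button -> onTouch:Button -> onTouch:LinearLayout
  }
}
```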

Summary

To summarise how touch control works:

  • When we press a view and then lift the gesture (our finger/stylus), MotionEvent.ACTION_DOWN and MotionEvent.ACTION_UP are delivered, respectively.
  • When a child is already receiving the gesture and the ViewGroup then starts intercepting, MotionEvent.ACTION_CANCEL is delivered to the child, and the remaining events of the gesture go to the ViewGroup.
  • Everything depends on onInterceptTouchEvent() and its return value: if it returns true, dispatching to the children is cancelled and the ViewGroup takes over the event; if it returns false, dispatching of the touch event continues down until the event is consumed.
  • If onTouchEvent() returns true, the touch has been handled; if it returns false, it has not.

Now we know how touch events work and how to control them. This is very useful when designing custom views or your own view library.


Gestures and Touch Events

Gesture recognition and handling touch events is an important part of developing user interactions. Handling standard events such as clicks, long clicks, and key presses is basic and covered in other guides. This guide focuses on handling other, more specialized gestures such as:

  • Swiping in a direction
  • Double tapping for zooming
  • Pinch to zoom in or out
  • Dragging and dropping
  • Effects while scrolling a list

You can see a visual guide of common gestures on the gestures design patterns guide. See the new Material Design information about the touch mechanics behind gestures too.

At the heart of all gestures are the OnTouchListener and its onTouch method, which has access to MotionEvent data. Every view can have an OnTouchListener specified:
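For example (myView stands in for any view obtained from your layout):

```java
myView.setOnTouchListener(new View.OnTouchListener() {
  @Override
  public boolean onTouch(View v, MotionEvent event) {
    // event carries the action code and the coordinates
    return true; // true = we consumed the event
  }
});
```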

Each onTouch event has access to the MotionEvent, which describes movements in terms of an action code and a set of axis values. The action code specifies the state change that occurred, such as a pointer going down or up. The axis values describe the position and other movement properties:

  • getAction() — Returns an integer constant such as MotionEvent.ACTION_DOWN , MotionEvent.ACTION_MOVE , and MotionEvent.ACTION_UP
  • getX() — Returns the x coordinate of the touch event
  • getY() — Returns the y coordinate of the touch event

Note that every touch event can be propagated through the entire affected view hierarchy. Not only can the touched view respond to the event but every layout that contains the view has an opportunity as well. Refer to the understanding touch events section for a detailed overview.

Note that getAction() normally includes information about both the action and the pointer index. In single-touch events there is only one pointer (index 0), so no bit mask is needed. In multi-touch events (i.e. pinch open or pinch close), however, there are multiple fingers involved and a non-zero pointer index may be included when calling getAction(). As a result, there are other methods that should be used to determine the touch event:

  • getActionMasked() — extract the action event without the pointer index
  • getActionIndex() — extract the pointer index used

The events associated with other pointers usually start with MotionEvent.ACTION_POINTER such as MotionEvent.ACTION_POINTER_DOWN and MotionEvent.ACTION_POINTER_UP . The getPointerCount() on the MotionEvent can be used to determine how many pointers are active in this touch sequence.
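A sketch of the masked-action pattern inside a custom view's onTouchEvent (the handling branches are placeholders):

```java
@Override
public boolean onTouchEvent(MotionEvent event) {
  int action = event.getActionMasked(); // the action without the pointer index
  int index = event.getActionIndex();   // which pointer triggered this event
  switch (action) {
    case MotionEvent.ACTION_DOWN:          // first finger down
    case MotionEvent.ACTION_POINTER_DOWN: { // any additional finger down
      float x = event.getX(index);
      float y = event.getY(index);
      // track this pointer...
      break;
    }
    case MotionEvent.ACTION_POINTER_UP: // a non-primary finger lifted
    case MotionEvent.ACTION_UP:         // last finger lifted
      break;
  }
  // event.getPointerCount() tells how many fingers are currently down
  return true;
}
```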

Within an onTouch event, we can then use a GestureDetector to understand gestures based on a series of motion events. Gestures are often used for user interactions within an app. Let’s take a look at how to implement common gestures.

For easy gesture detection using a third-party library, check out the popular Sensey library which greatly simplifies the process of attaching multiple gestures to your views.

You can enable double tap events for any view within your activity using the OnDoubleTapListener. First, copy the code for OnDoubleTapListener into your application and then you can apply the listener with:
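The guide's OnDoubleTapListener class is not reproduced here, but the same effect can be sketched with the framework's GestureDetector (myView is a placeholder):

```java
final GestureDetector detector = new GestureDetector(this,
    new GestureDetector.SimpleOnGestureListener() {
      @Override
      public boolean onDoubleTap(MotionEvent e) {
        // react to the double tap (e.g. toggle zoom)
        return true;
      }
    });

myView.setOnTouchListener(new View.OnTouchListener() {
  @Override
  public boolean onTouch(View v, MotionEvent event) {
    return detector.onTouchEvent(event); // feed touch events to the detector
  }
});
```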

Now that view will be able to respond to a double tap event and you can handle the event accordingly.

Detecting finger swipes in a particular direction is best done using the built-in onFling event in the GestureDetector.OnGestureListener .
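A minimal fling-based swipe listener might look like this (the distance and velocity thresholds are assumptions, not values from the guide):

```java
class SwipeListener extends GestureDetector.SimpleOnGestureListener {
  private static final int SWIPE_DISTANCE = 100; // px, assumed threshold
  private static final int SWIPE_VELOCITY = 100; // px/s, assumed threshold

  @Override
  public boolean onFling(MotionEvent e1, MotionEvent e2,
                         float velocityX, float velocityY) {
    float dx = e2.getX() - e1.getX();
    if (Math.abs(dx) > SWIPE_DISTANCE && Math.abs(velocityX) > SWIPE_VELOCITY) {
      if (dx > 0) {
        // swipe right
      } else {
        // swipe left
      }
      return true;
    }
    return false;
  }
}
```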

A helper class that makes handling swipes as easy as possible can be found in the OnSwipeTouchListener class. Copy the OnSwipeTouchListener class to your own application and then you can use the listener to manage the swipe events with:

With that code in place, swipe gestures should be easily manageable.

If you intend to implement pull-to-refresh capabilities in your RecyclerView, you can leverage the built-in SwipeRefreshLayout as described here. If you wish to handle your own swipe detection, you can use the new OnFlingListener as described in this section.

If you are interested in having a ListView that recognizes swipe gestures for each item, consider using the popular third-party library android-swipelistview, which is a ListView replacement that supports swipeable items. Once set up, you can configure a layout that will appear when an item is swiped.

Check out the swipelistview project for more details, but the general usage is to declare the SwipeListView in your layout and then define an individual list item layout with a front view and a back view. The front is displayed by default, and if you swipe left on an item, the back is displayed for that item. This simplifies swipes for the common case of menus for a ListView.

Another more recent alternative is the AndroidSwipeLayout library which can be more flexible and is worth checking out as an alternative.

Supporting pinch to zoom in and out is fairly straightforward thanks to the ScaleGestureDetector class. The easiest way to manage pinch events is to subclass a view and manage the pinch event from within:
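A sketch of such a subclass (the class name and the clamping range are illustrative):

```java
import android.content.Context;
import android.graphics.Canvas;
import android.util.AttributeSet;
import android.view.MotionEvent;
import android.view.ScaleGestureDetector;
import android.view.View;

public class ZoomableView extends View {
  private final ScaleGestureDetector scaleDetector;
  private float scaleFactor = 1f;

  public ZoomableView(Context context, AttributeSet attrs) {
    super(context, attrs);
    scaleDetector = new ScaleGestureDetector(context,
        new ScaleGestureDetector.SimpleOnScaleGestureListener() {
          @Override
          public boolean onScale(ScaleGestureDetector detector) {
            scaleFactor *= detector.getScaleFactor();
            // clamp to a sane range
            scaleFactor = Math.max(0.1f, Math.min(scaleFactor, 5f));
            invalidate();
            return true;
          }
        });
  }

  @Override
  public boolean onTouchEvent(MotionEvent event) {
    scaleDetector.onTouchEvent(event); // feed touch events to the detector
    return true;
  }

  @Override
  protected void onDraw(Canvas canvas) {
    canvas.save();
    canvas.scale(scaleFactor, scaleFactor); // draw content at the current scale
    // ...draw the view's content here...
    canvas.restore();
  }
}
```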

Using the ScaleGestureDetector makes managing this fairly straightforward.

One of the most common use cases for a pinch or pannable view is for an ImageView that displays a Photo which can be zoomed or panned around on screen similar to the Facebook client. To achieve the zooming image view, rather than developing this yourself, be sure to check out the PhotoView third-party library. Using the PhotoView just requires the XML:
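For example, assuming the com.github.chrisbanes PhotoView artifact (older releases used the uk.co.senab package and a PhotoViewAttacher):

```xml
<com.github.chrisbanes.photoview.PhotoView
    android:id="@+id/photo_view"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
```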

and then in the Java:
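For example (the id and drawable names are placeholders, assuming the chrisbanes PhotoView artifact):

```java
PhotoView photoView = (PhotoView) findViewById(R.id.photo_view);
photoView.setImageResource(R.drawable.my_photo); // zooming and panning work out of the box
```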

Check out the PhotoView readme and sample for more details. You can also check the TouchImageView library which is a nice alternative.

Scrolling is a common gesture associated with lists of items within a ListView or RecyclerView . Often the scrolling is associated with the hiding of certain elements (toolbar) or the shrinking or morphing of elements such as a parallax header. If you are using a RecyclerView , check out the addOnScrollListener. With a ListView , we can use the setOnScrollListener instead.
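With a RecyclerView, the hook-up can be sketched as (recyclerView is a placeholder):

```java
recyclerView.addOnScrollListener(new RecyclerView.OnScrollListener() {
  @Override
  public void onScrolled(RecyclerView rv, int dx, int dy) {
    // dy > 0: scrolling down -> e.g. hide the toolbar; dy < 0: show it again
  }
});
```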

With Android «M» and the release of the Design Support Library, the CoordinatorLayout was introduced which enables handling changes associated with the scrolling of a RecyclerView . Review the Handling Scrolls with CoordinatorLayout guide for a detailed breakdown of how to manage scrolls using this new layout to collapse the toolbar or hide and reveal header content.

Dragging and dropping views is not particularly difficult thanks to the OnDragListener built in since API 11. Unfortunately, to support Gingerbread, managing drag and drop becomes much more manual, as you have to implement it using onTouch handlers. With API 11 and above, you can leverage the built-in drag handling.

First, we want to attach an onTouch handler on the views that are draggable which will start the drag by creating a DragShadow with the DragShadowBuilder which is then dragged around the Activity once startDrag is invoked on the view:
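For example (myDraggableView and the clip label are placeholders; startDrag is deprecated in favor of startDragAndDrop on API 24+):

```java
myDraggableView.setOnTouchListener(new View.OnTouchListener() {
  @Override
  public boolean onTouch(View v, MotionEvent event) {
    if (event.getAction() == MotionEvent.ACTION_DOWN) {
      ClipData data = ClipData.newPlainText("label", "dragged item");
      View.DragShadowBuilder shadow = new View.DragShadowBuilder(v);
      v.startDrag(data, shadow, null, 0); // begin dragging the shadow around
      return true;
    }
    return false;
  }
});
```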

If we want to add «drag» or «drop» events, we should create a DragListener that is attached to a drop zone for the draggable object. We hook up the listener and manage the different dragging and dropping events for the zone:
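A sketch of such a drop-zone listener (dropZone is a placeholder; the handling branches are left empty):

```java
dropZone.setOnDragListener(new View.OnDragListener() {
  @Override
  public boolean onDrag(View v, DragEvent event) {
    switch (event.getAction()) {
      case DragEvent.ACTION_DRAG_STARTED:
        return true; // we want to receive further drag events
      case DragEvent.ACTION_DRAG_ENTERED:
        // e.g. highlight the drop zone
        return true;
      case DragEvent.ACTION_DRAG_EXITED:
        // remove the highlight
        return true;
      case DragEvent.ACTION_DROP:
        ClipData data = event.getClipData(); // the payload from startDrag
        return true;
      case DragEvent.ACTION_DRAG_ENDED:
        return true;
      default:
        return false;
    }
  }
});
```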

Check out the vogella dragging tutorial or the javapapers dragging tutorial for a detailed look at handling dragging and dropping. Read the official drag and drop guide for a more detailed overview.

Detecting when the device is shaken requires using sensor data to determine movement. We can whip up a special listener which manages this shake recognition for us. First, copy the ShakeListener into your project. Now, we can implement ShakeListener.Callback in any activity:

Now we just have to implement the expected behavior for the shaking event in the two methods from the callback.
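The ShakeListener class itself is not reproduced here, but a minimal accelerometer-based detection can be sketched like this (the threshold of 12 m/s² above gravity is an assumption, and debouncing is omitted for brevity):

```java
import android.app.Activity;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.os.Bundle;

public class ShakeActivity extends Activity implements SensorEventListener {

  @Override
  protected void onCreate(Bundle savedInstanceState) {
    super.onCreate(savedInstanceState);
    SensorManager sensorManager = (SensorManager) getSystemService(SENSOR_SERVICE);
    sensorManager.registerListener(this,
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER),
        SensorManager.SENSOR_DELAY_NORMAL);
  }

  @Override
  public void onSensorChanged(SensorEvent event) {
    float x = event.values[0], y = event.values[1], z = event.values[2];
    double magnitude = Math.sqrt(x * x + y * y + z * z);
    if (magnitude - SensorManager.GRAVITY_EARTH > 12) {
      // shake detected
    }
  }

  @Override
  public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```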

For additional multi-touch events such as «rotation» of fingers, finger movement events, etc., be sure to check out libraries such as Sensey and multitouch-gesture-detectors third-party library. Read the documentation for more details about how to handle multi-touch gestures. Also, for a more generic approach, read the official multitouch guide. See this blog post for more details about how multi-touch events work.


This section briefly summarizes touch propagation within the view hierarchy. There are three distinct touch related methods which will be outlined below:

Order | Method                | Invoked on | Description
1st   | dispatchTouchEvent    | A, VG, V   | Dispatch touch events to affected child views
2nd   | onInterceptTouchEvent | VG         | Intercept touch events before passing to children
3rd   | onTouchEvent          | VG, V      | Handle the touch event and stop propagation

"Order" above defines which of these methods gets invoked first when a touch is initiated. In "invoked on", A represents Activity, VG ViewGroup, and V View, describing where the method is invoked during a touch event.

Keep in mind that a gesture is simply a series of touch events as follows:

  1. DOWN. Begins with a single DOWN event when the user touches the screen
  2. MOVE. Zero or more MOVE events when the user moves the finger around
  3. UP. Ends with a single UP (or CANCEL) event when the user releases the screen

These touch events trigger a very particular set of method invocations on affected views. To further illustrate, assume a "View C" is contained within a "ViewGroup B" which is in turn contained within "ViewGroup A", such as:

Review this example carefully as all sections below will be referring to the example presented here.

When a touch DOWN event occurs on "View C" and the view has registered a touch listener, the following series of actions happens as the onTouchEvent is triggered on the view:

  • The DOWN touch event is passed to "View C" onTouchEvent and the boolean result of TRUE or FALSE determines if the action is captured.
  • Returning TRUE: If the "View C" onTouchEvent returns true then this view captures the gesture
    • Because "View C" returns true and is handling the gesture, the event is not passed to "ViewGroup B"'s nor "ViewGroup A"'s onTouchEvent methods.
    • Because "View C" says it's handling the gesture, any additional events in this gesture will also be passed to "View C"'s onTouchEvent method until the gesture ends with an UP touch event.
  • Returning FALSE: If the "View C" onTouchEvent returns false then the gesture is propagated upwards
    • The DOWN event is passed upward to "ViewGroup B" onTouchEvent method, and the boolean result determines if the event continues to propagate.
    • If "ViewGroup B" doesn't return true then the event is passed upward to "ViewGroup A" onTouchEvent

In addition to onTouchEvent, there is also a separate onInterceptTouchEvent that exists only on ViewGroups such as layouts. Before onTouchEvent is called on any View, all of its ancestors are first given the chance to intercept the event. In other words, a containing layout can choose to steal the event from a touched view before the view even receives it. With this added in, the series of events from above becomes:

  • The DOWN event on "View C" is first passed to "ViewGroup A" onInterceptTouchEvent, which can return false or true depending on whether it wants to intercept the touch.
  • If false, the event is then passed to "ViewGroup B" onInterceptTouchEvent, which can also return false or true depending on whether it wants to intercept the touch.
  • Next the DOWN event is passed to "View C" onTouchEvent, which can return true to handle the event.
  • Additional touch events within the same gesture are still passed to A and B's onInterceptTouchEvent before "View C" onTouchEvent is called, even if the ancestors chose not to intercept previously.

The takeaway here is that any ViewGroup can choose to implement onInterceptTouchEvent and effectively decide to steal touch events from any of its child views. If the children choose not to respond to the touch once received, then the touch event is propagated back upwards through the onTouchEvent of each containing ViewGroup as described in the previous section.

As the table much earlier shows, all of the above behavior of touch and interception is managed by "dispatchers" via the dispatchTouchEvent method invoked on each view. As soon as a touch event occurs on top of "View C", the following dispatching occurs:

  1. Activity.dispatchTouchEvent() is called for the current activity containing the view.
  2. If the activity chooses not to "consume" the event (and stop propagation), the event is passed to "ViewGroup A" dispatchTouchEvent, since A is the outermost ViewGroup affected by the touch.
  3. "ViewGroup A" dispatchTouchEvent triggers "ViewGroup A" onInterceptTouchEvent first, and if that method chooses not to intercept, the touch event is then sent to "ViewGroup B" dispatchTouchEvent.
  4. In turn, "ViewGroup B" dispatchTouchEvent triggers "ViewGroup B" onInterceptTouchEvent, and if that method chooses not to intercept, the touch event is then sent to "View C" dispatchTouchEvent.
  5. "View C" dispatchTouchEvent then invokes "View C" onTouchEvent.

To recap, dispatchTouchEvent is called at every level of the way, starting with the Activity. The dispatcher is responsible for identifying which methods to invoke next. On a ViewGroup, the dispatcher triggers onInterceptTouchEvent before triggering dispatchTouchEvent on the next child in the view hierarchy.

The explanation above has been simplified and abridged for clarity. For additional reading on the touch propagation system, please review this detailed article as well as this doc on ViewGroup touch handling and this useful blog post.

