When the user touches the screen, ecore_drm generates a MOUSE_MOVE event before MOUSE_BUTTON_DOWN.
But when ecore_wayland receives touch_motion before touch_down, the touch_focus window is still NULL and sending the MOUSE_MOVE event fails.
Also, for 'touch' input, touch_focus needs to be set in cb_touch_down, not in cb_pointer_enter.
So this commit makes sure that ecore_wayland only generates MOUSE_MOVE once touch_focus is set.
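Roughly, the touch callbacks then behave like the sketch below. This is an illustrative sketch only: the type and helper names (Touch_Input, window_lookup, send_mouse_move, send_mouse_down) are hypothetical stand-ins for the ecore_wayland internals, not the real EFL code. The point is the ordering: touch_focus is assigned in cb_touch_down, and MOUSE_MOVE is only emitted once it is set.

```c
#include <stdio.h>

/* Hypothetical stand-ins for ecore_wayland internals, for illustration only. */
typedef struct { const char *name; } Window;
typedef struct { Window *touch_focus; } Touch_Input;

static Window the_app_window = { "app" };

static Window *window_lookup(void *surface) { (void)surface; return &the_app_window; }
static void send_mouse_move(Window *w, int x, int y) { printf("MOUSE_MOVE -> %s (%d,%d)\n", w->name, x, y); }
static void send_mouse_down(Window *w, int x, int y) { printf("MOUSE_DOWN -> %s (%d,%d)\n", w->name, x, y); }

static void
cb_touch_down(Touch_Input *in, void *surface, int x, int y)
{
   /* touch_focus is set here, not in cb_pointer_enter */
   in->touch_focus = window_lookup(surface);
   if (!in->touch_focus) return;

   /* emulate the mouse: deliver a move first, then the button press */
   send_mouse_move(in->touch_focus, x, y);
   send_mouse_down(in->touch_focus, x, y);
}

static void
cb_touch_motion(Touch_Input *in, int x, int y)
{
   /* before touch_down there is no focused window, so drop the event
    * instead of sending MOUSE_MOVE to a NULL target */
   if (!in->touch_focus) return;
   send_mouse_move(in->touch_focus, x, y);
}

int
main(void)
{
   Touch_Input in = { NULL };

   cb_touch_motion(&in, 10, 10);     /* ignored: no touch_focus yet */
   cb_touch_down(&in, NULL, 10, 10); /* sets focus, emits move + down */
   cb_touch_motion(&in, 12, 14);     /* now delivered */
   return 0;
}
```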
Details
When the application is launched for the first time and the user taps anywhere on the screen,
the position of the touch event is wrong.
Diff Detail
- Repository: rEFL core/efl
- Branch: work
- Lint: No Linters Available
- Unit: No Unit Test Coverage
- Build Status: Buildable 1044, Build 1109: arc lint + arc unit
This looks better to me. When we get around to implementing wl_touch instead of emulating a mouse, we won't get any surprise interactions between touch and mouse.
Thanks
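For context, the Wayland protocol already exposes touch as a first-class device: a client gets a wl_touch from the seat and handles down/up/motion/frame/cancel events directly instead of routing them through pointer emulation. A minimal sketch, assuming the seat has already been bound from the registry and advertises the touch capability (the handlers just log, and the setup_touch name is made up for illustration; newer protocol versions also add shape/orientation handlers, omitted here):

```c
#include <stdio.h>
#include <wayland-client.h>

static void
touch_down(void *data, struct wl_touch *touch, uint32_t serial, uint32_t time,
           struct wl_surface *surface, int32_t id, wl_fixed_t x, wl_fixed_t y)
{
   printf("touch %d down at %f,%f\n", id,
          wl_fixed_to_double(x), wl_fixed_to_double(y));
}

static void
touch_up(void *data, struct wl_touch *touch, uint32_t serial, uint32_t time, int32_t id)
{
   printf("touch %d up\n", id);
}

static void
touch_motion(void *data, struct wl_touch *touch, uint32_t time, int32_t id,
             wl_fixed_t x, wl_fixed_t y)
{
   printf("touch %d motion to %f,%f\n", id,
          wl_fixed_to_double(x), wl_fixed_to_double(y));
}

static void touch_frame(void *data, struct wl_touch *touch)  { /* end of event group */ }
static void touch_cancel(void *data, struct wl_touch *touch) { /* gesture taken over */ }

static const struct wl_touch_listener touch_listener = {
   .down   = touch_down,
   .up     = touch_up,
   .motion = touch_motion,
   .frame  = touch_frame,
   .cancel = touch_cancel,
};

/* Hypothetical helper: call once the seat with touch capability is available. */
void
setup_touch(struct wl_seat *seat)
{
   struct wl_touch *touch = wl_seat_get_touch(seat);
   wl_touch_add_listener(touch, &touch_listener, NULL);
}
```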