
e + touchscreen - actually tried it before?
Open, Normal, Public

Description

Have you actually tried E, in Wayland mode, on a touch screen? I know you guys have ATIV laptops with touch screens, right? It seems broken. Kind of works, kind of doesn't. Have you at least tried?

Like, run as normal and try to click on an icon in ibar to launch something - the mouse cursor doesn't even go there. Click, and the cursor stays where it is... unless I click somewhere on the desktop - sometimes the cursor goes there on the second click. Click-then-drag moves the cursor where it should. I can't click to drag any window around at all. I can click+drag to use a menu, but not click, then click again, then click again, etc.

So basically touch input seems pretty horribly broken, at least on this laptop here next to me...

raster created this task. Jan 11 2017, 2:04 PM

While I was developing/adding the touchscreen stuff, yes. It worked then. It may have bit-rotted from protocol updates, etc. Will take a look this week.

This isn't just protocol - basically the mouse (via touch) doesn't work in the compositor at all - forget clients and protocol... :)

Ok. Will investigate this week.

devilhorns triaged this task as Normal priority. Jan 12 2017, 4:22 AM
raster reopened this task as Open. Jan 12 2017, 3:36 PM

You only fixed one thing... now try using any Wayland client: Terminology, elementary_test, weston-terminal... touch doesn't work at all... :( The mouse does move. It flashes blue when I touch the screen, registering that E knows it clicked... but clients have no clue... clients are "dead to the touch" :)

That is strange, as I was able to select text in Terminology when I was testing the fix... I'll run some more tests and take a deeper look.

devilhorns added a comment. Edited Jan 13 2017, 4:24 AM

Ok, I can run elementary_test here, click on the buttons, and the tests run; I can scroll the elm_test window with my finger; I can get the Terminology Options popup to show, change settings there, etc. Seems to be working just fine here...

NB: this is with a full Arch system update and a full git rebuild of EFL & E as of this morning.

Unsure what could be causing your issue(s) :(

@zmike @ManMower Can you guys confirm a working or non-working state here??

devilhorns added a comment. Edited Jan 13 2017, 7:56 AM

Sorry for the crappy camera control, but I am not left-handed :)

I'll plug in my external touch screen as soon as I can and do some testing.

Hmm, well, a quick and dirty test of weston-terminal (stracing it) shows... nothing happening when I click/touch or otherwise try to interact with it. So the issue is that E is simply not sending any Wayland touch input events to clients. BUT... if I plug in a USB mouse, they get sent. Well, mouse events get sent...

Once I have plugged in a USB mouse, THEN touching weston-terminal produces events... and any new EFL apps I run after this work with BOTH touch AND mouse.

The issue, I think, is... my device has no mouse. No touchpad. ONLY a touch screen. No other built-in or attached mouse input device. It's a convertible laptop which really is a tablet when the kbd is hidden...

So we have an issue handling devices with no mouse device (just touch).

In T5094#79943, @raster wrote:

Hmm, well, a quick and dirty test of weston-terminal (stracing it) shows... nothing happening when I click/touch or otherwise try to interact with it. So the issue is that E is simply not sending any Wayland touch input events to clients.

No, E is working on that front, bro... it comes in as mouse input...

BUT... if I plug in a USB mouse, they get sent. Well, mouse events get sent...

No mouse at startup?? Maybe pointer input is not being initted because there is no raw pointer device...???

Once I have plugged in a USB mouse, THEN touching weston-terminal produces events... and any new EFL apps I run after this work with BOTH touch AND mouse.

Ok, so no mouse. Touchscreen at the start. With just touch, no pointer action. Got it... I have a good theory on what is happening...

Elput or libinput...

Likely no pointer device is getting created on startup; it will show up as "clickable" but not be recognized as a pointer. Elput issue (imo)... Not hard to fix, can do...

Will look...

The issue, I think, is... my device has no mouse. No touchpad. ONLY a touch screen. No other built-in or attached mouse input device. It's a convertible laptop which really is a tablet when the kbd is hidden...

So we have an issue handling devices with no mouse device (just touch).

Yes. Elput. Can fix ;)

Chris, I seem to recall you have one of those bullshit keyboards that reports as both a keyboard and a mouse (that never generates any events) - that dead mouse may be hiding the issue from you... Just a thought.

This is kind of the shitty downside of Wayland compositor dev. All the things Xorg has hidden or "just handled" over the years and decades with history, hacking, and everyone just making it work... are now up to us. Some of the work is in libinput, but other bits are ours to test. We need to use both shitty input devices like Chris's kbd and other odd ones like x86 tablets or convertibles, etc., and ARM devices, boards, hell, real phones and tablets too... we now have to solidify our code in the face of "odd stuff". :)

Ok, so I did some digging/testing this morning. If I disable all "pointer" abilities (basically just disabled creating any pointers in Elput) and just have "touch" abilities, then yes, there is an issue. For some reason, Enlightenment works with internal clients, and touching the main canvas brings up the main menu... However, when touching "client applications" (Terminology, elm_test, etc.), the events are not getting sent to the clients. So there is indeed an issue here. Still looking into it...

Update: Touch does work in Weston on EFL clients, so this is pointing to a compositor issue (so far)...

Ok. Partially fixed. The weston-simple-touch example app will now work in Enlightenment... but there is still an issue with EFL client apps in Enlightenment not getting touch events...

Shitty keyboards are an interesting case - not sure whether it's us or libinput that should be trying to hide that extra pointer device, and it can impact features - like trying to hide the mouse cursor when there's no real mouse attached.

See also: the ACPI power button is a "keyboard", so we don't really know when there are no REAL keyboards attached (if we wanted to use the absence of a keyboard to help us know when to launch an onscreen keyboard...).

Seems like that's not the problem here though. :)

Can we back way up for a minute though? The patches that just landed are harmful. weston-simple-touch *only* binds touch devices - we shouldn't be sending it any kind of mouse motion at all, but now we are.

I really dislike emulating a mouse pointer from touch events in the first place, as it's proved to be a horrific mound of work in X with fragile results, but emulating touch from a pointer is something IMHO we shouldn't even consider.

There are no legacy apps depending on the touch interface to worry about - if an app knows about touch it also knows about pointers, and should be expected to do the right thing with both.

Please consider reverting E commit 7906537 as it's a big step in a bad direction. :(

devilhorns added a comment. Edited Jan 17 2017, 9:50 AM

Well, libinput identifies it as a pointer device, so likely the fix for that is needed there (see the ACPI example wrt this also).

The issue here is how touch events are currently treated inside EFL. They are not separate; rather, when a touch happens, a mouse button/motion event is emulated.

I would agree that inside EFL they should be handled separately... but being close to an EFL release date, I didn't want to go breaking things too badly... I am just dealing with the cards I have ;)

This is really just an internal EFL thing. The first touch events produce both multi callbacks (multi-touch with device #0) AND mouse events, but that's inside EFL - you can figure out the difference... :) We should do the right thing in sending events to all WL clients... but how EFL then transforms Wayland input events (touch, mouse) back into callbacks is an EFL thing. We do "touch == mouse" because it means simpler development for making apps work across multiple input device setups. If you really want to know the difference, you can... :)
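
For illustration only, here is a minimal sketch (not from this thread) of how an app sees both sides of that "touch == mouse" behaviour, assuming the standard Evas callbacks (EVAS_CALLBACK_MOUSE_DOWN / EVAS_CALLBACK_MULTI_DOWN); the handler names are made up for the example:

```c
/* Sketch: the first finger arrives both as a mouse-down and as a multi-down
 * with device 0; additional fingers arrive only as multi events. */
#include <stdio.h>
#include <Evas.h>

static void
_mouse_down_cb(void *data, Evas *e, Evas_Object *obj, void *event_info)
{
   Evas_Event_Mouse_Down *ev = event_info;
   printf("mouse (or first finger) down at %d,%d, button %d\n",
          ev->canvas.x, ev->canvas.y, ev->button);
}

static void
_multi_down_cb(void *data, Evas *e, Evas_Object *obj, void *event_info)
{
   Evas_Event_Multi_Down *ev = event_info;
   printf("touch point %d down at %d,%d\n",
          ev->device, ev->canvas.x, ev->canvas.y);
}

/* hypothetical helper: register both callbacks on some canvas object */
static void
_track_input(Evas_Object *obj)
{
   evas_object_event_callback_add(obj, EVAS_CALLBACK_MOUSE_DOWN,
                                  _mouse_down_cb, NULL);
   evas_object_event_callback_add(obj, EVAS_CALLBACK_MULTI_DOWN,
                                  _multi_down_cb, NULL);
}
```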

ohduna added a subscriber: ohduna. Jan 25 2017, 8:16 PM
ohduna added subscribers: input.hacker, JHyun.

This task is about identifying the source device of an event. I had an interest in this issue and tried to contribute (refer to https://phab.enlightenment.org/D3860).
IMHO, we should fill in the 'dev' variable in events (Ecore_Event_XXX/Evas_Event_XXX) in the drm backend, similar to the wayland backend. Then Enlightenment could distinguish source devices from events and deal with that.
Do you agree? Shall I try this issue? Please let me know your ideas.
Thanks

Could we just have a new EVAS_EVENT_FLAG_TOUCH or something that we set on mouse events that were actually touch events?

@ohduna - correct. The dev field is meant to be the source device the event came from. If you REALLY care where it comes from, check that.
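
A rough sketch of what "check the dev field" could look like in client code, assuming the standard Evas device API (evas_device_class_get(), EVAS_DEVICE_CLASS_TOUCH) and assuming the backend actually fills in ev->dev, which is exactly the part being wired up below:

```c
/* Sketch only: relies on ev->dev being populated by the engine/backend,
 * which is the work still in progress in this task. */
#include <stdio.h>
#include <Evas.h>

static void
_mouse_move_cb(void *data, Evas *e, Evas_Object *obj, void *event_info)
{
   Evas_Event_Mouse_Move *ev = event_info;

   if (ev->dev && (evas_device_class_get(ev->dev) == EVAS_DEVICE_CLASS_TOUCH))
     printf("pointer motion synthesized from a touch device\n");
   else
     printf("pointer motion from a real pointer device\n");
}
```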

I already have patches which fill in the "dev" field for events coming from Elput. This works similarly to the way it was implemented in the wayland engines (ecore_evas creates the devices and calls Elput functions to set them). When Elput raises input events, it fills in the "dev" field of the event structure. I have not pushed these changes yet though, as I am still running tests. It did not, however, fix the touch issue...

ohduna added a comment. Feb 5 2017, 8:43 PM

@devilhorns - Thank you for letting me know. :)
If those patches are ready to share in a dev branch (not the master branch), I am happy to run tests in Enlightenment.

@ohduna I have pushed the EFL portion of this work into a branch (devs/devilhorns/drm_evas_devices): https://git.enlightenment.org/core/efl.git/log/?h=devs/devilhorns/drm_evas_devices

There are still some changes required in Enlightenment to figure out which device an event came from and pass it along accordingly. Ex: e_comp_wl.c: _e_comp_wl_send_mouse_move should likely test whether ev->dev is a "touch device" and, if so, send wl_touch_send_motion (and likewise for mouse button events, etc.).
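
Roughly the shape of that idea - a sketch only, not the real _e_comp_wl_send_mouse_move (which also does client/resource lookup and surface-relative coordinate handling). It assumes the dev field is populated by the branch above and that the caller passes the client's matching wl_touch or wl_pointer resource:

```c
/* Sketch of the routing idea: choose the wayland protocol call based on the
 * class of the source device. 'res' is assumed to be the client's wl_touch
 * or wl_pointer resource as appropriate. */
#include <wayland-server.h>
#include <Evas.h>

static void
_send_motion(struct wl_resource *res, Evas_Device *dev,
             uint32_t timestamp, int32_t touch_id, int x, int y)
{
   if (dev && (evas_device_class_get(dev) == EVAS_DEVICE_CLASS_TOUCH))
     wl_touch_send_motion(res, timestamp, touch_id,
                          wl_fixed_from_int(x), wl_fixed_from_int(y));
   else
     wl_pointer_send_motion(res, timestamp,
                            wl_fixed_from_int(x), wl_fixed_from_int(y));
}
```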

@raster I think I have found an issue here with the way ecore_input is dealing with multi.device...

Essentially, when we get the first touch event from Wayland, ev->multi.device is set to 0 (because the touch 'slot' is 0). Now, inside ecore_event_evas_mouse_move (and other functions in ecore_input_evas.c), IF ev->multi.device == 0, then the "evas multi events" are not sent :( This means that on a single touch, Enlightenment is never sent the "multi event", and thus the touch event is not sent to the client.

I am wondering if perhaps that check (if ev->multi.device == 0) should be (if ev->multi.device < 0), making ev->multi.device == -1 the default... This way, on a single touch point (where the touch slot == 0), we would still get a multi event being sent and could thus pass that along to clients...
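
In ecore_input_evas.c terms, the proposal is roughly this shape - a simplified sketch only; the real ecore_event_evas_mouse_move() also looks up the target canvas from the event window and handles more state:

```c
/* Simplified sketch of the proposed logic. Today the code treats
 * multi.device == 0 as "plain mouse" and never feeds a multi event, so a
 * single-finger touch (slot 0) is lost. Proposal: default multi.device to -1
 * for real pointer events and test "< 0", so slot 0 still produces an evas
 * multi event that E can forward to clients as wl_touch. */
#include <Ecore_Input.h>
#include <Evas.h>

static void
_feed_mouse_or_multi_move(Evas *evas, Ecore_Event_Mouse_Move *e)
{
   if (e->multi.device < 0) /* proposed; currently the check is == 0 */
     {
        /* real pointer device: plain mouse move */
        evas_event_feed_mouse_move(evas, e->x, e->y, e->timestamp, NULL);
     }
   else
     {
        /* touch point, including slot 0: feed a multi move */
        evas_event_feed_multi_move(evas, e->multi.device, e->x, e->y,
                                   e->multi.radius, e->multi.radius_x,
                                   e->multi.radius_y, e->multi.pressure,
                                   e->multi.angle, e->multi.x, e->multi.y,
                                   e->timestamp, NULL);
     }
}
```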

Just adding some links here that are related to this issue so that I do not lose them ;)

https://phab.enlightenment.org/D3860
https://phab.enlightenment.org/D4194

btw... still broken, fyi... :)

zmike added a comment. Jun 15 2017, 6:00 AM

Yes, I'm working on acquiring more varied types of input devices to handle things like this.

@zmike Not sure if this will help you or not, but what I did was: on my Samsung ATIV laptop, I disabled the touchpad in the BIOS, and when I went to start E-WL, I would just physically unplug the mouse. That essentially left no pointer devices and used the touchscreen only.