[Development] need to handle touch events rather than depending on mouse event synthesis


Shawn Rutledge

Feb 29, 2012, 11:20:36 AM
to devel...@qt-project.org
We've been chatting some more around the Oslo office about the fact that
even on platforms where the touchscreen is the primary pointing device,
both QWidgets and Qt Quick components are handling mostly mouse
events. In order to make that work, the Qt 4 approach (which is still
in place now) is for each incoming touch event, first let it propagate
as a touch event; then, if the event was not accepted, synthesize a
mouse event and let that propagate through. This method of achieving
backwards compatibility discourages handling touch events though: if
any variety of button widget or QML component, which handles mouse
events only, is a child (or descendant) of any component which handles
touch events, then when the touch event is accepted by the parent
component, the mouse event will not be synthesized. So the button (or
other mouse-only component) cannot be pressed.
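The delivery rule just described can be sketched as a toy model (plain C++, not actual Qt code; all names below are invented for illustration): the dispatcher propagates the touch event first, and synthesizes a mouse event only when no item accepted the touch, which is exactly why a touch-handling parent starves a mouse-only child.

```cpp
#include <cassert>
#include <vector>

// Toy model of the Qt 4 delivery rule described above (not Qt API).
struct Item {
    bool handlesTouch = false;
    bool handlesMouse = false;
    bool gotTouch = false;
    bool gotMouse = false;
};

// Deliver to items from topmost child to outermost parent.
// Returns true if some item accepted the touch event itself.
inline bool deliverTouch(std::vector<Item*>& chain) {
    for (Item* it : chain)
        if (it->handlesTouch) { it->gotTouch = true; return true; }
    return false;
}

inline void deliverMouse(std::vector<Item*>& chain) {
    for (Item* it : chain)
        if (it->handlesMouse) { it->gotMouse = true; return; }
}

// The boolean parameter stands in for the global attribute
// Qt::AA_SynthesizeMouseForUnhandledTouchEvents (true by default):
// a mouse event is synthesized only for an *unhandled* touch event.
inline void dispatch(std::vector<Item*>& chain, bool synthesizeMouse = true) {
    if (!deliverTouch(chain) && synthesizeMouse)
        deliverMouse(chain);
}
```

Under this model, as soon as any ancestor accepts the touch, a mouse-only button in the same chain never sees a press; that is the behavior WebKit has to work around with its own separate synthesis.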

The WebKit team has this problem, in that they want the QML web view to
be flickable, but obviously the user needs to be able to interact with
any mouse-only components (such as buttons) which might be on top. So
WebKit needs to separately synthesize mouse events from touch events
because the main synthesis mechanism doesn't work in that case.

In src/quick/items/qquickcanvas.cpp, there is a very recent new method
translateTouchToMouse which generates a new QQuickMouseEventEx event
type containing velocity information. (In fact maybe it makes sense to
just put the velocity in the base QMouseEvent, but that's separate from
what I'm about to propose.) This is IMO another case where mouse event
synthesis should not be necessary; I suspect the reason for it is the
same as in the WebKit case.

Graphics View has yet another way, but handling touch events there is
lower-priority than for QML.

If we set aside all the Qt history and think about what would have been
ideal if we were starting over from scratch, I think I'd have wanted a
"pointing" event type which has a bit less than what QMouseEvent does:
at least the coordinates and an enum for "what happened" (pressed,
released, clicked, double-clicked, entering, leaving and so on). The
mouse event could inherit that and add mouse-specific concepts like
having multiple buttons, and the touch event could inherit and add the
multiple-finger concept. The point being that naive widgets like
Button should not need to care where the event came from, just that it
was clicked or tapped, but not dragged; so QPushButton would just handle
the hypothetical Pointing event. Then most of the third-party legacy
apps would have already been doing the same thing, and we wouldn't have
trouble at this stage to introduce a touch event as a different
specialization of the Pointing event. (Alternatively multiple fingers
could be treated just like multiple mice: separate press/release events
for the independent fingers. But actually it's useful to group multiple
touch points together as long as they come from the same user; it
facilitates gestural UIs, in that the UI does not need to gather up
multiple points from multiple events occurring at different times, and
figure out that they are part of one gesture.)
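The from-scratch hierarchy imagined above could be sketched roughly like this (hypothetical types, not existing Qt API): the base event carries only coordinates and a "what happened" enum, with mouse buttons and grouped touch points added by the specializations, so a naive Button handles only the base type.

```cpp
#include <cassert>
#include <vector>

// Hypothetical device-agnostic hierarchy (not existing Qt API).
enum class PointingAction { Pressed, Released, Clicked, DoubleClicked,
                            Entered, Left };

struct PointingEvent {            // the lowest common denominator
    PointingAction action = PointingAction::Pressed;
    double x = 0, y = 0;          // scene coordinates
    virtual ~PointingEvent() = default;
};

struct MouseEvent : PointingEvent {
    enum Button { LeftButton = 1, MiddleButton = 2, RightButton = 4 };
    int buttons = LeftButton;     // mouse adds the multi-button concept
};

struct TouchPoint { int id; double x, y; };

struct TouchEvent : PointingEvent {
    std::vector<TouchPoint> points;  // touch adds grouped fingers
};

// A naive Button would react to the base type only, without caring
// which device produced the event.
inline bool buttonWouldActivate(const PointingEvent& ev) {
    return ev.action == PointingAction::Clicked;
}
```

The point of the sketch is that `buttonWouldActivate` accepts a click or a tap identically, while device-specific handlers can still downcast to `MouseEvent` or `TouchEvent` when they genuinely need buttons or multiple fingers.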

Anyway, back to reality... my next idea was let's have a flag to enable
the mouse event synthesis, which should be true by default, so that we
can at least turn it off and try to develop pure-touch UIs. But it turns
out this flag already exists: AA_SynthesizeMouseForUnhandledTouchEvents,
which is true by default. And there is even the reverse one:
AA_SynthesizeTouchForUnhandledMouseEvents. So that's a good start.

The proposal is this: I think we need to have a QML PointingArea element
which looks just like MouseArea except that it handles both mouse events
and single-touch events the same way. Then we need to start using it
instead of MouseArea in almost every case. That way
AA_SynthesizeMouseForUnhandledTouchEvents can eventually be set to false
for some applications.

We also need the existing QWidgets to start handling touch events.
After that is done, individual QWidget-based apps in the field
(especially those with custom widgets) can set the
AA_SynthesizeMouseForUnhandledTouchEvents flag or not, depending on what
works better for them; but we need to move towards a future in which we
do not need to synthesize mouse events.

Some apps may eventually have a use for the distinction between mouse
and touch, too. One reason I got interested again at this particular
time is that I was thinking it would be nice if the KDE Konsole
(terminal application) was flickable. But dragging with the left mouse
button is the way that you select text, and that is also useful. So on
my touchscreen laptop, I can select text by dragging, regardless whether
I drag with a finger on the screen, with the touchpad, or with an
external mouse; but with a finger on the screen, selecting text is not
really what I expect. If Konsole handled both mouse events and touch
events, it could do something appropriate for each of them, and there
could be some useful multi-finger gestures too. This is just an example,
and in fact it may already be possible to implement this use case
without changes to Qt (except for the lack of support for XInput 2.2
and/or direct support for the evdev driver on my laptop, which I'm also
interested in looking at separately.) But in a more complex case which
has more widgets or Qt Quick elements, if touch events are being
accepted at all, you need to have them understood everywhere.

But we have the issue that the QML MouseArea component is in very
widespread use. This is because QML started life on the desktop. There
is already QQuickMultiPointTouchArea and QQuickPinchArea; so in
applications which intend to be portable between touchscreen devices and
conventional desktop usage, it should be OK to have overlapping touch
areas and MouseAreas, and this will enable the app developer to
customize the behavior depending on whether the user is interacting with
a mouse or a finger. But naive components like any of the many Button
implementations should not necessarily need to care. They should be
using the proposed PointingArea instead of MouseArea.

Alternatively MouseArea could handle touch events itself, but then
pretty soon we will start thinking the name sounds rather dated. It
wouldn't even surprise me if people stop using mice in a decade or so;
whereas we will probably always have some kind of "pointing device",
thus the need for a generic name which won't be obsolete later. And, we
still need a mouse-only Area which can be used in combination with the
touch-only Areas so that it's possible to make a cross-platform UI.

In summary the consequences of mouse event synthesis present some real
problems, and I think we need to get the device-agnostic PointingArea
into Qt5 ASAP.

--
MVH, Shawn Rutledge ❖ "ecloud" on IRC
_______________________________________________
Development mailing list
Devel...@qt-project.org
http://lists.qt-project.org/mailman/listinfo/development

Atlant Schmidt

Feb 29, 2012, 12:08:19 PM
to Shawn Rutledge, devel...@qt-project.org
Shawn:

This sounds like the roots of a strong proposal -- carry on!

Two thoughts:

o Please be sure your "pointing device" proposal
can be generalized beyond "mice and touchscreens".
There are certainly other pointing devices already
in the world (who remembers Light Pens, Joy sticks,
and "Dial Boxes";-) ) and others that will become
fully practical soon (eye tracking where your point-
of-regard acts as the pointing device).

We should make sure the new approach is "future
proof" to the maximal possible degree.


o Please be sure that pinch/unpinch gestures fit
within the overall strategy.


Note: I only mention these points to make them explicit;
I'm not suggesting any omission on your part.


Atlant

This e-mail and the information, including any attachments, it contains are intended to be a confidential communication only to the person or entity to whom it is addressed and may contain information that is privileged. If the reader of this message is not the intended recipient, you are hereby notified that any dissemination, distribution or copying of this communication is strictly prohibited. If you have received this communication in error, please immediately notify the sender and destroy the original message.

Thank you.

Please consider the environment before printing this email.

Shawn Rutledge

Feb 29, 2012, 12:31:36 PM
to ext Atlant Schmidt, devel...@qt-project.org
On Wednesday, February 29, 2012 12:08:19 PM ext Atlant Schmidt wrote:
> Shawn:
>
> This sounds like the roots of a strong proposal -- carry on!
>
> Two thoughts:
>
> o Please be sure your "pointing device" proposal
> can be generalized beyond "mice and touchscreens".
> There are certainly other pointing devices already
> in the world (who remembers Light Pens, Joy sticks,
> and "Dial Boxes";-) ) and others that will become
> fully practical soon (eye tracking where your point-
> of-regard acts as the pointing device).
>
> We should make sure the new approach is "future
> proof" to the maximal possible degree.

I agree, but the idea of the PointingArea is more of a lowest-common-
denominator handler for things that mice and touchscreens can both do, and
hopefully the unknown future devices as well. Joysticks and lightpens are
also both similar enough in that you have at least 2 axes and one primary
button. I think you should be able to click the same Button component with
any of these devices, as long as there is a driver which can generate an
existing Qt event type (that is questionable, but at least it's a small thing
to do if the need arises).

> o Please be sure that pinch/unpinch gestures fit
> within the overall strategy.

That's covered by the PinchArea. As far as I know, there shouldn't be a
problem with stacking multiple Areas to handle multiple types of events in
case you want the same item to be interactive via mouse as well.

Thanks for the feedback.

kenneth.r.c...@nokia.com

Feb 29, 2012, 3:41:06 PM
to shawn.r...@nokia.com, devel...@qt-project.org
Sorry for top posting, but I just wanted to point out that the new Qt5 WebView could also use both mouse and touch for more or less the same reasons you stated for the terminal application. Actually, mouse events should not be able to cause a flick on the item (unless emulated for testing purposes, or for devices not supporting native touch events).

Kenneth
________________________________________
From: development-bounces+kenneth.r.christiansen=noki...@qt-project.org [development-bounces+kenneth.r.christiansen=noki...@qt-project.org] on behalf of Rutledge Shawn (Nokia-MP/Oslo)
Sent: Wednesday, February 29, 2012 5:20 PM


To: devel...@qt-project.org
Subject: [Development] need to handle touch events rather than depending on mouse event synthesis


Samuel Rødal

Mar 1, 2012, 3:26:28 AM
to devel...@qt-project.org
On 02/29/2012 05:20 PM, ext Shawn Rutledge wrote:
> The proposal is this: I think we need to have a QML PointingArea element
> which looks just like MouseArea except that it handles both mouse events
> and single-touch events the same way. Then we need to start using it
> instead of MouseArea in almost every case. That way
> AA_SynthesizeMouseForUnhandledTouchEvents can eventually be set to false
> for some applications.

PointingArea sounds a bit strange at first, but maybe it's hard to find
a better name for it; at least I can't think of one.

I agree with your argument that changing MouseArea to handle touch
events might not be a good idea after all, as applications might want to
still handle touch and mouse input slightly differently by putting both
a MouseArea and a TouchArea covering the same region.

> In summary the consequences of mouse event synthesis present some real
> problems, and I think we need to get the device-agnostic PointingArea
> into Qt5 ASAP.

Agreed, the synthesizing of mouse events has been nice to enable
existing applications to still work on touch devices, but it's not an
ideal solution in the long run.

--
Samuel

kenneth.r.c...@nokia.com

Mar 1, 2012, 4:09:44 AM
to samuel...@nokia.com, devel...@qt-project.org
Wouldn't PointerArea make a bit more sense? MS has also introduced MSPointer in IE10 [1], which represents a mouse, finger-touch or stylus.

[1] http://blogs.msdn.com/b/ie/archive/2011/09/20/touch-input-for-ie10-and-metro-style-apps.aspx

Kenneth
________________________________________
From: development-bounces+kenneth.r.christiansen=noki...@qt-project.org [development-bounces+kenneth.r.christiansen=noki...@qt-project.org] on behalf of Rodal Samuel (Nokia-MP/Oslo)
Sent: Thursday, March 01, 2012 9:26 AM
To: devel...@qt-project.org
Subject: Re: [Development] need to handle touch events rather than depending on mouse event synthesis

André Somers

Mar 1, 2012, 4:32:05 AM
to devel...@qt-project.org
While I find this discussion interesting, I am wondering about the
relation to the Qt 5 feature freeze that went by a few weeks ago,
and Lars' email about it on February 22 ("Important: Qt 5 freeze"). Do
you think it is possible to tackle this in 5.1, or does it need binary
or even source incompatible changes?

André

On 29-2-2012 17:20, Shawn Rutledge wrote:
> (...) In summary the consequences of mouse event synthesis present some real


> problems, and I think we need to get the device-agnostic PointingArea
> into Qt5 ASAP.
>


Alan Alpert

Mar 1, 2012, 4:48:15 AM
to devel...@qt-project.org

The overlapping MouseArea/TouchArea is an interesting idea, and might explain
why we'd need a TouchArea when it's virtually identical to the 'PointerArea'
element. But then we'd have three area interaction elements that are virtually
identical, with a few slight differences in functionality (like wheel events)
and some name changes (onTap instead of onClicked despite identical
functionality...).

Perhaps we could just add an enum to MouseArea? EventType { MouseEvents,
TouchEvents, MouseAndTouchEvents (default) }. That allows you more fine-grained
control, with identical default behavior that doesn't require event synthesis
that messes with other elements. Not to mention that in the common case you
don't care what pointy thing they used to say 'do that': there are devices
with both mouse and touch, and often the app really doesn't care which it was.
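The enum idea could work roughly like this toy model (hypothetical API, plain C++ rather than QML, names invented for illustration): an area accepts an event only when its filter matches the device type, and otherwise lets it propagate untouched, so no synthesis is needed for the other kind.

```cpp
#include <cassert>

// Sketch of the proposed per-area event-type filter (not existing Qt API).
enum EventType {
    MouseEvents = 1,
    TouchEvents = 2,
    MouseAndTouchEvents = MouseEvents | TouchEvents  // the proposed default
};

struct Area {
    EventType accepted = MouseAndTouchEvents;  // identical default behavior
    bool clicked = false;

    // Accepts (and consumes) the event only when the filter matches;
    // otherwise returns false so the event propagates to items underneath.
    bool handle(EventType incoming) {
        if ((accepted & incoming) == 0)
            return false;
        clicked = true;
        return true;
    }
};
```

With the default filter an app that doesn't care about the device gets today's behavior; an app that does care stacks one mouse-filtered and one touch-filtered area over the same region.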

It doesn't solve the name issue, but that one is a difficult one because it
leads to a lot of API differences which are purely naming. I'd rather use a
MouseArea's onClicked signal for a touch UI than have to switch to using
TouchArea's onTapped everywhere just because this is the mobile app UI.
PointerArea's onJab (onPointedWithEnthusiasm? We've run away from the metaphor
a little here...) might not solve this, but it would feel like an unnecessary
change during the transition period even if it did.

--
Alan Alpert
Senior Engineer
Nokia, Qt Development Frameworks

Thiago Macieira

Mar 1, 2012, 4:57:11 AM
to devel...@qt-project.org
On Thursday, 1 March 2012 10:32:05, André Somers wrote:
> While I find this discussion interesting, I am wondering about the
> relation to the Qt 5 feature freeze that went by a few weeks ago,
> and Lars' email about it on February 22 ("Important: Qt 5 freeze"). Do
> you think it is possible to tackle this in 5.1, or does it need binary
> or even source incompatible changes?

To be honest, this might be one of the things we _need_ to fix before 5.0,
especially if it means changing the event classes on the C++ side. Maybe it
can be done compatibly, maybe it can't. Shawn will have to investigate and
tell us as soon as possible.

--
Thiago Macieira - thiago.macieira (AT) intel.com
Software Architect - Intel Open Source Technology Center
Intel Sweden AB - Registration Number: 556189-6027
Knarrarnäsgatan 15, 164 40 Kista, Stockholm, Sweden


Michael Hasselmann

Mar 1, 2012, 6:30:21 AM
to devel...@qt-project.org
Hi,

> Wouldn't PointerArea make a bit more sense? MS has also introduced MSPointer in IE10 [1], which represents a mouse, finger-touch or stylus.
>
> [1] http://blogs.msdn.com/b/ie/archive/2011/09/20/touch-input-for-ie10-and-metro-style-apps.aspx

That "hardware agnostic" abstraction layer is a joke, isn't it? It has a
pointerType property and on top of that, pointerType specific
properties. So effectively, they still have different device classes.
Not a too good reference, IMO. I can only imagine what kind of API
consumer code this will produce.

regards,
Michael

Shawn Rutledge

Mar 1, 2012, 7:24:09 AM
to kenneth.r.c...@nokia.com, devel...@qt-project.org
On 1 March 2012 10:09, <kenneth.r.c...@nokia.com> wrote:
> Wouldn't PointerArea make a bit more sense? MS has also introduced MSPointer in IE10 [1], which represents a mouse, finger-touch or stylus.
>
> [1] http://blogs.msdn.com/b/ie/archive/2011/09/20/touch-input-for-ie10-and-metro-style-apps.aspx

I had thought of this as a possible name too, but maybe it sounds a
bit confusing if the first thing you think of is a C pointer rather
than a pointing device.

Frederik Gladhorn

Mar 1, 2012, 8:49:12 AM
to devel...@qt-project.org

I think I agree with Alan here. Adding the functionality to MouseArea will be
the least disruptive option and solve the things we discussed.
In an ideal world we might have started with a different name, but MouseArea
has become so widespread and predominant that it makes sense to simply keep it.

+1 for adding touch handling to MouseArea.

(My personal first idea was to merge mouse and single touch events in QEvent,
but it turns out that would break things on many levels, so I'm quite happy
this idea came up.)

As for the other mail, why do it now? Yes, it should have been done a long
time ago, but now is the chance to get it right. Actually the change now
proposed is quite limited in scope, so it shouldn't cause many problems :)

With WebKit in the game we are facing challenges that we simply cannot work
around in a sane fashion without major hacks. So why not get it right now?
When we researched the gesture recognition APIs we actually had major issues
of the same kind with QGraphicsView. Getting events twice is simply not nice.

Cheers
Frederik

Samuel Rødal

Mar 1, 2012, 9:16:49 AM
to devel...@qt-project.org

Hmm, true, then if you wanted to handle touch and mouse differently
you'd just put two of them on top of each other with different event types.

>> It doesn't solve the name issue, but that one is a difficult one because it
>> leads to a lot of API differences which are purely naming. I'd rather use a
>> MouseArea's onClicked signal for a touch UI than have to switch to using
>> TouchArea's onTapped everywhere just because this is the mobile app UI.
>> PointerArea's onJab (onPointedWithEnthusiasm? We've run away from the
>> metaphor a little here...) might not solve this, but it would feel like an
>> unnecessary change during the transition period even if it did.
>
> I think I agree with Alan here. Adding the functionality to MouseArea will be
> the least disruptive and solve the things we discussed.
> In an ideal world we might start with a different name, but MouseArea has
> become so widespread and predominant that it makes sense to simply keep it.
>
> +1 for adding touch handling to MouseArea.

It's possible to make MouseArea a sub-class of whatever we actually want
to call it, and mark it as deprecated. That way it could be removed down
the line, or made to only handle the MouseEvent type. Similarly we could
have a TouchArea convenience sub-class that only handles the TouchEvent
type.
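The deprecation path Samuel describes might look like this in skeleton form (hypothetical class names, not existing Qt API): the generic area carries the behavior, and the legacy names become thin sub-classes that merely restrict which device events reach it.

```cpp
#include <cassert>

// Sketch of the proposed class layering (hypothetical, not Qt API).
struct PointerArea {                   // the generic, device-agnostic area
    virtual bool acceptsMouse() const { return true; }
    virtual bool acceptsTouch() const { return true; }
    virtual ~PointerArea() = default;
};

struct MouseArea : PointerArea {       // deprecated convenience: mouse only
    bool acceptsTouch() const override { return false; }
};

struct TouchArea : PointerArea {       // convenience: touch only
    bool acceptsMouse() const override { return false; }
};
```

Existing code keeps compiling against MouseArea, while new code targets the base class, and the restricted names can be removed (or kept as pure filters) down the line.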

--
Samuel

Shawn Rutledge

Mar 1, 2012, 9:23:59 AM
to Samuel Rødal, devel...@qt-project.org
On 1 March 2012 15:16, Samuel Rødal <samuel...@nokia.com> wrote:
> It's possible to make MouseArea a sub-class of whatever we actually want
> to call it, and mark it as deprecated. That way it could be removed down
> the line, or made to only handle the MouseEvent type. Similarly we could
> have a TouchArea convenience sub-class that only handles the TouchEvent
> type.

That sounds like a good idea.

Shawn Rutledge

Mar 1, 2012, 9:26:00 AM
to devel...@qt-project.org
On 1 March 2012 10:48, Alan Alpert <alan....@nokia.com> wrote:
> The overlapping MouseArea/TouchArea is an interesting idea, and might explain
> why we'd need a TouchArea when it's virtually identical to the 'PointerArea'
> element. But then we'd have three area interaction elements that are virtually
> identical, with a few slight differences in functionality (like wheel events)
> and some name changes (onTap instead of onClicked despite identical
> functionality...).
>
> Perhaps we could just add an enum to MouseArea? EventType { MouseEvents,
> TouchEvents, MouseAndTouchEvents (default) }. That allows you more fine-grained
> control, with identical default behavior that doesn't require event synthesis
> that messes with other elements. Not to mention that the common case is you
> don't care what pointy thing they used to say 'do that', there are devices
> with both mouse and touch and often the app really doesn't care which it was.

This has the advantage of fewer QML elements to know about, but I
still wonder how silly the name will sound in a few years; it already
does sound silly on mobile devices.  Another thing is that if you
expect a right-click or a wheel movement, those are usually emulated
with gestures on touch devices, but should the do-everything MouseArea
really be responsible for doing that too?  We already have GestureArea
right?  I misspoke about having PointingArea be just like MouseArea; I
was thinking it would actually not have the multi-button concept, thus
no right-clicking, middle-clicking, back/forward or other buttons, or
wheel.  You would still need MouseArea because those events are
mouse-specific.

Renaming would be easier now than later when there are way more apps
using it.  But if we can live with the retro-sounding name for the
foreseeable future, and if we agree that for touch devices, the
mouse-emulation gestures should be in MouseArea but the rest of the
possible gestures should not, it would save some work for the existing
apps and QML component sets.  Right-clicking is pretty useful after
all, regardless of whether it is done with a real right mouse button,
or emulated via touch.

Already I think if a hypothetical QML component needed gesture
recognition, pinch-zoom functionality and dragging all on the same
area, it would need to have stacked Areas, right?  Of course I should
try it before assuming that it already works.  ;-)

We could try to have one InteractionArea that does it all, but maybe
it would tend to get unwieldy over time as new devices are introduced.
We would need to have good generic names for every event type which
we are able to imagine now, and be prepared to add more later on.
Before multi-touch devices were actually introduced, I would have
probably failed to imagine that the event could include the size,
shape and angle of the finger or other object, as either an ellipse or
a blob; but evdev apparently supports that.  If the old Sun Starfire
mockup were ever really implemented, the multi-touch surface becomes a
scanner too; then we would need to distinguish fingers from donuts and
coffee cups (some devices can already reject palms and just see the
fingers even today), scan any image that is intentionally pressed
against the surface, OCR any text, and any of these items can be
treated as "input".  Each item has a 2d location on the screen like a
finger does now.  So if that happens, QML could add an ImageScanArea
or some such, and that would be another Area type which applications
would need to start using.

The broader question is should it be considered normal and healthy for
applications and components to stack up Areas for all possible kinds
of input that they know how to support, or should we try to combine
them more than they are now?  To me the stacking seems more likely to
survive future changes, assuming that it works well enough.

> It doesn't solve the name issue, but that one is a difficult one because it
> leads to a lot of API differences which are purely naming. I'd rather use a
> MouseArea's onClicked signal for a touch UI than have to switch to using
> TouchArea's onTapped everywhere just because this is the mobile app UI.
> PointerArea's onJab (onPointedWithEnthusiasm? We've run away from the metaphor

onPointSelected maybe?

> a little here...) might not solve this, but it would feel like an unnecessary
> change during the transition period even if it did.

BTW back in the 80's I knew an old civil engineer who was new to
computers (more of a slide-rule guy) and thought that "mouse" referred
to what we usually call the cursor (the arrow on the screen).  I've
also seen in the context of CAD digitizing tablets that the puck you
move around on the tablet can be called a cursor, especially if it has
crosshairs for accurately digitizing existing drawings.  If this
confusion occurs again after younger generations forget about physical
mice, maybe the MouseArea name won't be so bad after all.

Anyway right now I think we need to reach a conclusion on whether to
1) leave MouseArea alone and add PointingArea (or other name), with
single-click only and a generic handler name for that
2) add touch support (touches emulating the mouse) for every feature
that MouseArea supports, into MouseArea itself
3) same as #2 but leave something out
4) same as #2 but rename it anyway
5) any better ideas?

Atlant Schmidt

Mar 1, 2012, 10:43:37 AM
to Shawn Rutledge, devel...@qt-project.org
Shawn:

> BTW back in the 80's I knew an old civil engineer who was new to
> computers (more of a slide-rule guy) and thought that "mouse" referred
> to what we usually call the cursor (the arrow on the screen). I've
> also seen in the context of CAD digitizing tablets that the puck you
> move around on the tablet can be called a cursor, especially if it has
> crosshairs for accurately digitizing existing drawings. If this
> confusion occurs again after younger generations forget about physical
> mice, maybe the MouseArea name won't be so bad after all.

People seem to survive calling the file a "core" file
even though magnetic core memory probably hasn't been
dumped into one of those files in several decades... ;-)

They'll probably survive these terminology "adjustments"
as well.

Atlant



martin...@nokia.com

Mar 1, 2012, 5:10:16 PM
to shawn.t....@gmail.com, devel...@qt-project.org
Something like PointerArea would be nice to have, but it is not something we can throw in while in feature freeze. What we have now is a little clunky, but it works, and has been very well tested and debugged over some years. I don't want to discourage anyone from working on this, but it's not the time to force it through.

Br,
Martin.

> -----Original Message-----
> From: development-bounces+martin.jones=noki...@qt-project.org
> [mailto:development-bounces+martin.jones=noki...@qt-project.org]
> On Behalf Of ext Shawn Rutledge
> Sent: Friday, March 02, 2012 12:26 AM
> To: devel...@qt-project.org
> Subject: Re: [Development] need to handle touch events rather than
> depending on mouse event synthesis
>

adriano....@nokia.com

Mar 14, 2012, 8:52:20 PM
to devel...@qt-project.org
> On Thursday, 1 March 2012 19:48:15, ext Alan Alpert wrote:
> Perhaps we could just add an enum to MouseArea? EventType { MouseEvents,
> TouchEvents, MouseAndTouchEvents (default) }. That allows you more
> fine-grained control, with identical default behavior that doesn't require
> event synthesis that messes with other elements. Not to mention that the
> common case is you don't care what pointy thing they used to say 'do that',
> there are devices with both mouse and touch and often the app really
> doesn't care which it was.

I agree. I think this is the best solution given the deadline.
It will not break any apps and we can get rid of this event synthesis for good.
Is there anybody already working on this fix?

Br,
Adriano

martin...@nokia.com

Mar 14, 2012, 11:15:03 PM
to adriano....@nokia.com, devel...@qt-project.org
Sure, if you don't touch the flag you don't break anything, but as soon as someone uses the flag somewhere in a hierarchy of interactive Items you're going to have problems. It's too late for 5.0.

Br,
Martin.

> -----Original Message-----
> From: development-bounces+martin.jones=noki...@qt-project.org
> [mailto:development-bounces+martin.jones=noki...@qt-project.org]
> On Behalf Of Rezende Adriano.1 (Nokia-MP/Oslo)
> Sent: Thursday, March 15, 2012 10:52 AM
> To: devel...@qt-project.org
> Subject: Re: [Development] need to handle touch events rather than
> depending on mouse event synthesis
>

Adriano Rezende

Mar 15, 2012, 5:02:52 AM
to Jones Martin (Nokia-MP/Brisbane), devel...@qt-project.org
On 03/15/2012 04:15 AM, Jones Martin (Nokia-MP/Brisbane) wrote:
> Sure, if you don't touch the flag you don't break anything, but as soon as someone uses the flag somewhere in a hierarchy of interactive Items you're going to have problems. It's too late for 5.0.

We are already having problems, since this event propagation is broken.
Currently we have to create our own SingleTouchArea to avoid touches
being propagated to the MultiPointTouchAreas underneath, because even
though the MouseArea 'handles' touch events, through synthesized mouse
events, it will not prevent the original touch event from passing
through it. The z-order matters here, and on touch devices this behavior
is not acceptable. If we don't fix this now, it will bring many more
problems in the future.

If someone changes this flag, they know exactly what they are doing and
should know the outcome.
