pyautogui package


Jeremy Gray

Dec 29, 2014, 11:56:31 PM
to psycho...@googlegroups.com
Hi folks esp Jon and Sol,

I just came across pyautogui (https://pyautogui.readthedocs.org/en/latest/), a cross-platform lib to control the keyboard and mouse. It allows setting the mouse position in OS X (10.9.5; I have not tried others, and should try 10.10 and other platforms before getting too excited about it). It does work with pyglet, based on a quick demo I mocked up.

I'm thinking it could be handy for automating tests of GUIs and the mouse, and possibly for end-user things too, like setting the mouse position without using iohub. I'm not currently up on the status of this in iohub, so maybe this is redundant. But maybe it's also a low-effort way to get those features. It looks fairly new (2014), and so hopefully will be developed and maintained for a while.

--Jeremy

Sol Simpson

Dec 31, 2014, 11:58:21 AM
to psycho...@googlegroups.com


On Monday, December 29, 2014 11:56:31 PM UTC-5, Jeremy Gray wrote:
> Hi folks esp Jon and Sol,
> ..... things for end-users too like setting the mouse position without using iohub. I'm not currently up on the status of this in iohub, so maybe this is redundant.


iohub originally had support for setting mouse position and visibility, but once it was integrated with psychopy that functionality became redundant because pyglet could be used to do these things (that was my thinking at least). So while this functionality has not been totally removed from the iohub mouse device api, it really should have been, and is likely in an unusable state on some, if not all, OSs.

I did not realize pyglet has become broken in this way, so if it is something that should be resurrected in iohub, it should not be very hard to do. Or would it make sense to look into patching pyglet and doing a pull request so it gets fixed in pyglet itself?

Jeremy Gray

Dec 31, 2014, 12:23:07 PM
to psycho...@googlegroups.com

>> Hi folks esp Jon and Sol,
>> ..... things for end-users too like setting the mouse position without using iohub. I'm not currently up on the status of this in iohub, so maybe this is redundant.

> iohub originally had support for setting mouse position and visibility, but once it was integrated with psychopy that functionality became redundant because pyglet could be used to do these things (that was my thinking at least). So while this functionality has not been totally removed from the iohub mouse device api, it really should have been, and is likely in an unusable state on some, if not all, OSs.

> I did not realize pyglet has become broken in this way, so if it is something that should be resurrected in iohub, it should not be very hard to do.

I am pretty sure that pyglet never supported setting the mouse position, whereas pygame does.

If iohub does not need that functionality, maybe it would be easier to just use pyautogui than to recreate it in iohub (and maintain it going forward). This would add a dependency, but it would (hopefully) be maintained cross-platform without needing our effort.
 
> Or would it make sense to look into patching pyglet and doing a pull request so it gets fixed in pyglet itself?

I like the idea a lot, but pyglet seems to have stalled as a project, so I don't think this would be so useful. I'm mildly concerned about pyglet's longer-term viability.

--Jeremy

Sol Simpson

Dec 31, 2014, 3:54:56 PM
to psycho...@googlegroups.com


On Wednesday, December 31, 2014 12:23:07 PM UTC-5, Jeremy Gray wrote:
> I am pretty sure that pyglet never supported setting the mouse position, whereas pygame does.


At least on Windows, pyglet's Window.set_mouse_position seems to work. There seems to be code in place for all OSs, though I have not tried the others. I'm running the pyglet 1.2 tip.

So for a psychopy window:

win.winHandle.set_mouse_position(pix_x,pix_y)


seems to work, where pix_x, pix_y are in pixel coords with a bottom-left window origin.
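
For completeness, here is a minimal sketch of the coordinate shift involved, since PsychoPy's 'pix' units put the origin at the window center while set_mouse_position() wants bottom-left-origin window coordinates. The helper name is mine, not an existing psychopy or pyglet API:

```python
def center_to_bottomleft(pos, win_size):
    """Convert a PsychoPy 'pix' position (origin at the window center,
    y increasing upward) to the bottom-left-origin window coordinates
    that pyglet's set_mouse_position() expects. Hypothetical helper,
    not part of psychopy or pyglet; both conventions have y increasing
    upward, so only the origin shifts."""
    x, y = pos
    w, h = win_size
    return (int(x + w // 2), int(y + h // 2))
```

so e.g. `win.winHandle.set_mouse_position(*center_to_bottomleft((0, 0), win.size))` should put the cursor at the window center.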

Jeremy Gray

Dec 31, 2014, 4:17:02 PM
to psycho...@googlegroups.com
Ah, cool! Yes, setting the mouse position in this way works on Mac as well (bottom left origin, same).

So the only use for pyautogui I can see would be in testing, e.g., of various wx things in the Builder interface (dialogs, etc.). That has always been lower priority to test, because issues there are more visible and do not affect experiment data. So maybe it's not worth it.

--Jeremy


Jon Peirce

Jan 1, 2015, 8:04:00 AM
to psycho...@googlegroups.com
Sorry to be late to the party. I agree: I don't think I'd bother including pyautogui just for setting the mouse position (we can find and add the necessary system calls if needed, as Sol had done for iohub).

The question of whether it's useful for the purpose of testing the gui... hmm... so far we have no way to test that menus and buttons in the gui work at all. This would potentially increase our test coverage a great deal. The problem is that it would be very fragile I think. If the button moves position very slightly on a different screen size then tests will fail for very boring reasons. I'd love instead to hear of a way to fire wx events to the app instead (open the dialog box and fire a wx.WX_OK event etc). I sort of imagine that such a thing must be possible, but I've not found out how. I think that would lead to more robust testing of the gui, which we could certainly use.

best wishes and happy new year to all!
Jon

Jeremy Gray

Jan 6, 2015, 10:55:11 AM
to psycho...@googlegroups.com

> I'd love instead to hear of a way to fire wx events to the app instead (open the dialog box and fire a wx.WX_OK event etc). I sort of imagine that such a thing must be possible, but I've not found out how. I think that would lead to more robust testing of the gui, which we could certainly use.

I agree this would be great. I googled around, did not see anything obvious for this. People seem to mostly test their application logic, and not try to test whether wx behaves correctly.
 
> The question of whether it's useful for the purpose of testing the gui... hmm... so far we have no way to test that menus and buttons in the gui work at all. This would potentially increase our test coverage a great deal. The problem is that it would be very fragile I think. If the button moves position very slightly on a different screen size then tests will fail for very boring reasons.

That fragility might not be an issue. Interestingly, pyautogui comes with a function for returning the location of something on-screen, based on an exact visual match to a template that you provide, like a .png file. It takes ~1 s to compute, and returns the coordinates in pixels. They appear to be off by exactly a factor of 2 (likely due to the Mac retina display?). Pretty nifty. It returns None if it cannot find anything, so you could skip a test, for example.

I have a small window pop up in a random location on the screen, draw a Rect, and then use the locateOnScreen() function to find it, move the mouse there (correcting for x2 effect), and then assert that the mouse is in the rectangle.
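
For anyone trying the same thing, the factor-of-2 correction could be sketched as a small helper. The function name is illustrative, and the scale=2 default is only what I observed on one retina Mac:

```python
def box_center_logical(box, scale=2):
    """Take the (left, top, width, height) box that
    pyautogui.locateOnScreen() returns (physical pixels on a
    HiDPI/retina screen) and return the box's center point divided
    by the display scale factor, i.e. logical coordinates suitable
    for passing to pyautogui.moveTo(). Illustrative helper; the
    scale=2 default matches one retina Mac, not every display."""
    left, top, width, height = box
    return ((left + width / 2) / scale, (top + height / 2) / scale)
```

Usage would be something like: `box = pyautogui.locateOnScreen('rect.png')`, then if box is not None, `pyautogui.moveTo(*box_center_logical(box))`.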

I have not tried testing on travis, nor with an actual GUI. 

Overall, triggering wx events would be smoother but this might sort of work too.

--Jeremy

Jeremy Gray

Jan 6, 2015, 1:23:04 PM
to psycho...@googlegroups.com
Apparently Al Sweigart (the pyautogui author) wrote a script to play an online flash game, completely automatically. http://inventwithpython.com/blog/2014/12/17/programming-a-bot-to-play-the-sushi-go-round-flash-game/

I verified that for PsychoPy, just a few lines are all that is needed to fully automate adding a new component to a routine, without needing to know in advance where anything is on the screen: 3 lines of code and a small .png file.

Complications would include everything we already know complicates a visual match cross-platform: fonts, colors, GUI and OS window themes, retina displays, etc. On Mac, the OK button is dynamic, sort of a pulsating blue, so locating it by visual matching might be a bit hit or miss. I think you'd have to run a while loop until the pulsation happens to be the right color at the moment you check. Getting the right visual snippets for travis might be an interesting exercise, but at least tests could be run locally, even if not on travis.
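
That while-loop idea could look something like the sketch below, with the actual screen lookup passed in as a callable so the pyautogui call itself stays out of the helper. Names and defaults are illustrative, not an established API:

```python
import time

def locate_with_retry(locate, timeout=5.0, interval=0.25):
    """Poll until the target is found or the timeout expires.
    `locate` is any zero-argument callable returning a box or None,
    e.g. lambda: pyautogui.locateOnScreen('ok_button.png').
    Retrying gives an animated target (like the pulsing macOS OK
    button) repeated chances to match the template exactly."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        box = locate()
        if box is not None:
            return box
        time.sleep(interval)
    return None  # caller can skip the test in this case
```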



--Jeremy

Jon Peirce

Jan 6, 2015, 7:44:19 PM
to psycho...@googlegroups.com
It looks to me like we could use wx.PostEvent(obj, eventID), but I haven't tried.
Jon