How to change accelerometer refresh rate?


MrMcChicken

Feb 25, 2016, 4:11:31 PM
to MIT App Inventor Forum
Hi, I am working on an app in which I want to use the accelerometer. My problem is that the accelerometer updates its data really fast.
In fact it is so fast that timers running in parallel won't work properly. Every time I set the timer faster, it gets more inaccurate.
I need to collect data from the accelerometer every 100 ms, down to every 20 ms. The timer should increment at the same rate.
The value I read from the accelerometer is pretty large too. In my case I only need one or two decimal digits.
Is it somehow possible to reduce the update rate and the precision of the accelerometer to save processing power?

I am running the latest version offline, so if this is only possible to change in the source code, I might have a look.

Scott Ferguson

Feb 25, 2016, 9:47:53 PM
to MIT App Inventor Forum
You may want to employ smoothing of the sensor values.
That works exactly the same way as averaging a set of grades.
On each cycle of the clock timer, you add the current value to a running total.
At the end of a specified number of milliseconds you divide the total by the number of samples taken.
Then display that average value.
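The averaging described above can be sketched in Python — a rough illustration only, since App Inventor uses blocks; the `smooth` name and the sample readings are made up:

```python
def smooth(samples):
    """Average a batch of raw sensor readings into one value."""
    return sum(samples) / len(samples)

# Pretend these arrived one per clock tick from AccelerationChanged:
samples = [9.78, 9.91, 9.65, 9.86]

# After the specified number of ticks, divide the total by the
# number of samples and display just that average (here rounded
# to two decimals, which also trims the excess digits):
print(round(smooth(samples), 2))
```

Rounding the average at display time also addresses the too-many-digits concern from the first post.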
---

MrMcChicken

Feb 26, 2016, 5:59:17 AM
to MIT App Inventor Forum
The raw values are okay for my use. The only problem I have is that they get refreshed so fast. This makes timers running in parallel inaccurate.
My Note II shows me different time values compared to my old LG GT540. I thought I could somehow reduce the refresh rate or the number of digits.
The reason for reducing the processing load is to make the app run the same on different mobile phones.

Scott Ferguson

Feb 26, 2016, 3:12:21 PM
to mitappinv...@googlegroups.com
You might set a clock timer and use the Duration block to wait x milliseconds before sampling the accelerometer value.
If you set the delay large enough most devices should be able to keep up.

Just today I had an issue with two different devices, one running 3x faster than the other, and my game animations were too fast on the faster device.
I had to create a clock timer set to 0 Interval and use the delay block to test for 1000 ms to elapse while adding 1 to a tick counter variable.
The slower device counted 150 ticks in one second and the faster one counted 450 ticks.
I then used those values with the linear interpolation formula y = y1 + (x - x1)*(y2-y1)/(x2-x1) to determine the gravity and jumping velocity.
The best gravity and jumping velocity values were determined by experimentation with the two devices.
These values were plugged into the formula along with the number of ticks counted for the current device.
This should work for most any device even if the device is slower or faster than my two devices -- the formula still works.
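The calibration step above can be sketched in Python. The tick counts come from the post; the gravity values are purely illustrative placeholders for whatever "played best" in your own experiments:

```python
def lerp(x, x1, y1, x2, y2):
    """Linear interpolation: y = y1 + (x - x1) * (y2 - y1) / (x2 - x1)."""
    return y1 + (x - x1) * (y2 - y1) / (x2 - x1)

# Calibration points found by experiment (gravity values made up here):
# the slow device counted 150 ticks/second and played best with gravity 2.0;
# the fast device counted 450 ticks/second and played best with gravity 6.0.
SLOW_TICKS, SLOW_GRAVITY = 150, 2.0
FAST_TICKS, FAST_GRAVITY = 450, 6.0

# Ticks counted on the current device during its one-second calibration run:
ticks = 300
gravity = lerp(ticks, SLOW_TICKS, SLOW_GRAVITY, FAST_TICKS, FAST_GRAVITY)
print(gravity)  # a device halfway between the two lands halfway between
```

The same call would be repeated for jumping velocity, or any other speed-sensitive constant.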
---

MrMcChicken

Feb 27, 2016, 11:25:58 AM
to MIT App Inventor Forum
Please correct me if I'm wrong, but I understand this the following way:
I set a timer which enables the accelerometer, stores the accelerometer data in three variables, and then disables it again.
This repeats every 100 ms. Is this the way to get single samples from the sensor at a given interval?
Unfortunately I can't test it now. Thanks for your effort so far.

Scott Ferguson

Feb 27, 2016, 11:58:44 AM
to mitappinv...@googlegroups.com
You only need to read the accelerometer reading every x ms inside the Timer event block.
You can set the TimerInterval to 0; the clock Duration block really determines how many milliseconds elapse before you test the accelerometer value.
So leave the accelerometer enabled at all times.
Why do this?
Because clock TimerInterval values below, say, 100 are not an accurate measure of elapsed milliseconds: different devices have different CPU speeds, and the slower ones cannot keep up.
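A minimal sketch of this pattern in Python, with a simulated fast timer — the 20 ms tick and `read_accelerometer` are made-up stand-ins for the Clock and sensor components:

```python
SAMPLE_INTERVAL_MS = 100  # only sample the sensor this often

def read_accelerometer():
    """Hypothetical stand-in for reading the sensor's current value."""
    return 9.81

last_sample_ms = -SAMPLE_INTERVAL_MS  # so the very first tick samples
readings = []

# The Timer event fires as fast as the device allows (TimerInterval 0);
# here we simulate one tick every 20 ms across 300 ms of elapsed time:
for now_ms in range(0, 301, 20):
    if now_ms - last_sample_ms >= SAMPLE_INTERVAL_MS:
        readings.append(read_accelerometer())  # sample only when due
        last_sample_ms = now_ms

print(len(readings))  # samples taken at 0, 100, 200, and 300 ms
```

Because the sampling decision is based on elapsed time rather than on how often the timer fires, a faster device simply skips more ticks between samples, and both devices end up with the same effective sample rate.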
---