Wytze van der Zee

Jul 25, 2017, 8:23:34 AM
to MIT App Inventor Forum
Hello, another newcomer here to this very nice development system.

I connected an accelerometer (JY-61 MPU6050) via Bluetooth to the app, which shows the data in real time. However, this data is structured as follows:

0x55 0x51 AxL AxH AyL AyH AzL AzH TL TH SUM


What is the best way to get this data 'labelled', so that I can pull out, for example, the AxL and AxH of every 'package' of 11 values?


Thanx,


Wytze


P.s. see attached image for the blocks

Question MIT app inventor.PNG

Evan Patton

Jul 25, 2017, 9:04:08 AM
to MIT App Inventor Forum
We don't have a built-in mechanism in the blocks language for destructuring an array into variables, but you can construct the code by hand. You would want to create some local or global variables for whichever values are of interest. The BluetoothClient's ReceiveSignedBytes will return you a list of values, so you will want to pull out the indices corresponding to the byte positions of the variables in the packet (remember too that App Inventor is 1-indexed, not 0-indexed). For AxL and AxH, these indices would be 3 and 4, respectively.
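In Python terms, Evan's advice might be sketched like this (the function name and sample values are my illustration, not App Inventor blocks; note the 1-indexed vs 0-indexed shift):

```python
def pick_ax_bytes(packet):
    """Pull AxL and AxH out of one 11-value packet.

    App Inventor's 'select list item' block is 1-indexed, so AxL and AxH
    sit at positions 3 and 4 there; Python lists are 0-indexed, so the
    same bytes live at indices 2 and 3.
    """
    ax_l = packet[2]  # AI2 position 3
    ax_h = packet[3]  # AI2 position 4
    return ax_l, ax_h

# packet layout: 0x55 0x51 AxL AxH AyL AyH AzL AzH TL TH SUM
packet = [0x55, 0x51, -123, -1, -41, 0, 19, 8, 122, -5, -111]
print(pick_ax_bytes(packet))  # (-123, -1)
```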

Evan

Wytze van der Zee

Jul 25, 2017, 11:03:45 AM
to mitappinv...@googlegroups.com
Thanx Evan for your response!

So I tried to create a global variable, put the ReceiveSignedBytes result in it, and asked for an index number from the list (see the image for the blocks code).
However, I end up with the following notifications: 1) "Select list item: List index too large" and 2) "Select list item: Attempt to get item number 3 of a list of length 0: ()"

Is this the right way of pulling out the wanted indices? 
Do you know what to do with the "List index too large" notification?


Question MIT app inventor.PNG

Abraham Getzler

Jul 25, 2017, 12:45:47 PM
to MIT App Inventor Forum
According to the manual I found at 

the data stream will be text, not bytes, with NL record breaks.
Also, from their picture of the data stream, I see what looks like
different spacing between fields, suggesting possibly a TAB character as the field delimiter.

Worse, using the limit of my vision, the decimal point used is a comma, not a period.

I suggest capturing the data using a BlueTooth Get Text block,
using a BlueTooth client Delimiter of 10, and adjusting your
byte count in the Get Text block accordingly.
See the post about delimiters in the Arduino/BlueTooth
section of this FAQ ...

Capture some of the text here using the Companion, so it can be copied/pasted,
to get the truth as to the data format.

ABG


Wytze van der Zee

Jul 26, 2017, 4:31:28 AM
to MIT App Inventor Forum
I also thought the sensor would send text, so I tried the block you suggested (BluetoothClientGetText). Unfortunately this gave me some strange data. After reading the same document you sent, I doubt whether it is text. The image of the 'raw' data suggests it is text, but that data has already been processed.

The document says the raw data is sent in 3 packages (1 acceleration, 2 velocity, 3 angle), all with the same sequence: 0x55 0x51 AxL AxH AyL AyH AzL AzH TL TH SUM
It says that H and L represent the high and low byte of the value concerned, from which you can calculate the actual value.

I found this video (https://www.youtube.com/watch?v=tp99s2fFmTk) that analyzes the data (at 8:30 he starts analyzing).
When I used the ReceiveSignedBytes block I received an array of values, so I guess this should be the right block.

Image 1 is a screenshot from when I used the 'BluetoothClientGetText' block;
Image 2 is from when I used ReceiveSignedBytes.
Image 1.png
Image 2.png

Abraham Getzler

Jul 26, 2017, 12:16:39 PM
to MIT App Inventor Forum
The data confirms that it's bytes.

Looking at the data stream, I see -5 appearing consistently 
in the same relative position in a sequence of 11 bytes.

Converting the 0x55 and 0x51 to decimal, we get 85, 81.
That appears (sorta) every 11 bytes, with the first block
broken off in the middle by chance of timing.

The second byte of the pair increases by 1 until 83, then repeats
from 81, confirming your statement that there are three 
blocks per set of readings.

So it looks like you have a solid basis for extracting the data
from the data stream into high and low bytes.
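That extraction might be sketched in Python like this (the names and code are my illustration, not App Inventor blocks): scan the signed-byte buffer for the 0x55 header followed by the record-type byte, and take the 11 bytes from there.

```python
def find_frame(buffer, record_type):
    """Return the next complete 11-byte frame whose first two bytes are
    0x55 and the given record type (0x51, 0x52 or 0x53), or None if the
    buffer holds no complete frame.  The bytes are signed, but the
    header values are below 0x80, so they compare equal either way."""
    for i in range(len(buffer) - 10):
        if buffer[i] == 0x55 and buffer[i + 1] == record_type:
            return buffer[i:i + 11]
    return None
```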

Do you need help with the math for reassembling
the signed bytes into integers?

ABG



Capture.PNG

Evan Patton

Jul 26, 2017, 12:16:58 PM
to MIT App Inventor Forum
Hi Wytze,

When you print out the list read using the signed bytes method, you will need to do some additional interpretation. For example, 0x55 0x51 will be 85 and 81 in decimal, respectively. You can see that signature occur 10 bytes into the list. Following that you have -123 (0x85) -1 (0xFF) -41 (0xD7) 0 (0x0) 19 (0x13) 8 (0x8) 122 (0x7A) -5 (0xFB) -111 (0x91).

So:
AxL = 0x85
AxH = 0xFF
AyL = 0xD7
AyH = 0x0
AzL = 0x13
AzH = 0x8
TL = 0x7A
TH = 0xFB
SUM = 0x91

Then if you reconstruct the 3 values:
Ax = 0xFF85 = -123
Ay = 0x00D7 = 215
Az = 0x0813 = 2067
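The reconstruction Evan describes can be sketched in Python like this (a sketch of the arithmetic only; `to_int16` is my name, not a block):

```python
def to_int16(low, high):
    """Combine two signed bytes (as returned by ReceiveSignedBytes)
    into one signed 16-bit integer, with the high byte as the MSB."""
    value = ((high & 0xFF) << 8) | (low & 0xFF)
    # Interpret the 16-bit pattern as two's complement.
    return value - 0x10000 if value >= 0x8000 else value

print(to_int16(-123, -1))  # 0xFF85 -> -123
print(to_int16(-41, 0))    # 0x00D7 -> 215
print(to_int16(19, 8))     # 0x0813 -> 2067
```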

Evan

Abraham Getzler

Jul 26, 2017, 1:53:03 PM
to MIT App Inventor Forum
See attached for some blocks to combine signed bytes
into a 16 bit integer, with test data for 0xFF85 = -123,
as confirmed by the Windows 7 calculator in programmer mode.

ABG

hexify.aia
signed_dec.PNG
FF85_dec.PNG
FF85_hex.PNG
int16.PNG

Wytze van der Zee

Jul 27, 2017, 7:59:27 AM
to MIT App Inventor Forum
Thank you both a lot for your help so far. 

I think I'm starting to understand the theory: converting the high and low bytes to hexadecimal, combining them, and converting that hex value to an integer, right?
However, I do not understand every part of ABG's blocks, which is why I'm struggling to integrate them into my own 'project'.

Could one of you maybe explain what I need to do to integrate these blocks into my blocks (from the previous posts)?

Thank you,

Wytze

Abraham Getzler

Jul 27, 2017, 11:20:20 AM
to MIT App Inventor Forum
Could one of you maybe explain what I need to do to integrate these blocks into my blocks (from the previous posts)?

Could you post your .aia file?

Also your latest Downloaded Blocks Image?

ABG

Wytze van der Zee

Jul 27, 2017, 12:29:06 PM
to MIT App Inventor Forum
Hello Abraham,

Attached is the .aia; it also includes the code you sent me.
Also the latest Blocks Images. Is this what you meant?

Wytze
I_jump.aia
Latest Blocks Image 1.PNG
Latest Blocks Image 2.PNG

Abraham Getzler

Jul 27, 2017, 4:24:20 PM
to MIT App Inventor Forum
See attached for how to advance the input buffer to a Type 81 record,
then match the input buffer with field names,
then build a JSON Object from the names and values,
and display it as JSON text.

See JSON.org for an explanation of JSON,
and the JSON section of this FAQ for how
to extract values from a JSON object in AI2.

See the free book in the Books and Tutorials section of that FAQ
for how to work with procedures and value procedures.

I unfortunately don't have hardware with which to test this.

ABG

JSON_Object.PNG
segment.PNG
align.PNG
blocks.png
Clock1_Timer.PNG
globals.PNG
I_jump_JSON_V1.aia

Abraham Getzler

Jul 27, 2017, 5:20:18 PM
to MIT App Inventor Forum
I extended the app to show all 3 record types, in JSON,
and fixed a bug where it would not advance past the first record.

See attached.

If you can post the buffer contents as text here, I can test.

ABG

blocks_V2.png
I_jump_JSON_V2.aia

Wytze van der Zee

Jul 31, 2017, 8:00:20 AM
to MIT App Inventor Forum
Again, thank you very much! That's quite a lot you added.

I tested the V2 of the .aia file you sent (and of course tried to follow it as far as I could).
However, I ended up with the error shown in 'Error V2.0' as a result of the code part shown in 'Image_V2.0'.

When I changed the '2' in "Initialize local position to: 2" back to "1", as it was in V1, I did not get this error.
After running the blocks code of Image_V2.1 I receive the content shown in 'Data V2.1'.
The data seems very structured, so I should be able to continue!

One more problem is that the data is coming in very slowly, and after a few seconds the memory seems full ('Error V2.1').
Could it be that the part of the blocks shown in Image_V2.1 is not working as it should?

Regards,

Wytze


Data V2.1.png
Error V2.1.png
Image_V2.0.PNG
Image_V2.1.PNG
Error V2.0.png

Abraham Getzler

Jul 31, 2017, 12:09:10 PM
to MIT App Inventor Forum
The two problems were opposite sides of the same coin ...

I was accumulating input data in one global list, adding on the end as the data arrived,
and removing from the front (slot 1) as I recognized and displayed header by header.

My removal from the front wasn't checking for a totally empty list in the input buffer.

Switching from 2 to 1 as the starting point disabled the removal, scanning only at position 1,
leading to severe constipation as the input buffer filled all of memory.

I added an empty list check and reinstated the forced advance from position 2.

I also renamed the app to reflect its hardware basis, and added a procedure
to combine pairs of bytes into words, by name.

I also took the repeating code for the three message types, and
turned it into a procedure, which I called once per pattern type.
The list parameters pointing to lists should update those lists,
giving test points for Do It debugging, thanks to AI2 call by reference.

Again, I have no test data (text form), and my OCR package can't read screen shots
of your test data.

ABG


I_jump_JY61_MPU6050_JSON_V3.aia
align.PNG
blocks_V3.png
Clock1_Timer.PNG
combine_bytes.PNG
extract.PNG
JSON_Object.PNG
globals.PNG

Wytze van der Zee

Aug 2, 2017, 5:43:39 AM
to MIT App Inventor Forum
It seems very nice, and I understand what you did.
Unfortunately it resulted in the same problem. 

I attached the buffer data.

Wytze

 
GlobalBuffer.txt

Ghica

Aug 2, 2017, 8:44:01 AM
to MIT App Inventor Forum
Hi,
I do not want to interfere with Abraham's JSON work. He is the best.
But I am wondering about the timing of your input. Your app receives messages every second, but how fast is your device sending data? Your app must receive the data slightly faster than the device is sending it, otherwise the buffers will eventually overflow. On the other hand, you should not make the timer interval in App Inventor too short; AI is not the fastest!
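Ghica's point, as back-of-envelope arithmetic (the 100 Hz figure is the device's default rate mentioned later in the thread; the one-message-per-second drain matches the original 1000 ms Clock):

```python
device_hz = 100                     # frames per second the sensor sends
clock_period_ms = 1000              # app's Clock TimerInterval
drain_hz = 1000 / clock_period_ms   # frames the app processes per second
backlog_growth = device_hz - drain_hz  # 99 frames pile up every second
```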
Cheers, Ghica.

Wytze van der Zee

Aug 2, 2017, 12:07:46 PM
to MIT App Inventor Forum
Hi Ghica,

That is a nice suggestion I hadn't considered!
The standard sample frequency of the device is 100Hz (at a baud rate of 115200). I tried to set the TimerInterval of Clock1 to 5ms instead of 1000ms. In the beginning the app seems to receive a lot faster than before, but after a few seconds the buffer still seems to overflow. Do you think 100Hz is too fast for AI?

I also found in the manual of the device that the rate can be configured to 9600 (20Hz). To do so, the manual (which Abraham also sent in his first response) says you should send the message "0xFF 0xAA 0x64" to the device. I tried this too (see Image_V2.2), but since the txt file (which is filled with the global variable 'input_buffer') shows +/- 200 series of data in +/- 2sec of connection, I guess the configuration did not work.

Do you think the sample frequency is too high, so that I need to figure out why the configuration to 20Hz did not work? Or is it more likely a result of the code?

Wytze
Image_V2.2.PNG

Ghica

Aug 2, 2017, 12:47:51 PM
to MIT App Inventor Forum
5 ms is much too fast. Realistically take at least 100ms.
Your configuration string was sent as text and not as hex bytes. Therefore it cannot have worked.
It escapes me forever how to do this, but Abraham is good at that, so I hope he will give a suggestion.
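The difference Ghica means can be illustrated in Python (this is only an illustration of text vs. raw bytes, not how to do it in AI2 blocks):

```python
# Sending the characters "0xFF 0xAA 0x64" transmits 14 ASCII bytes,
# none of which is the value 0xFF, 0xAA or 0x64.
as_text = "0xFF 0xAA 0x64".encode("ascii")

# The device expects exactly these three raw byte values instead.
as_bytes = bytes([0xFF, 0xAA, 0x64])

print(len(as_text), list(as_bytes))  # 14 [255, 170, 100]
```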
Cheers, Ghica.

Abraham Getzler

Aug 2, 2017, 2:02:50 PM
to MIT App Inventor Forum
Thank you for replying with your tests and the text form of test data.

After sleeping on the problem, it also occurred to me that my buffer parsing
technique was Fascist in its approach, and needs modification to make it
more Libertarian.  I was forcing it to seek out the three different message types
in succession, regardless of what was actually next in the buffer.

With some small modification, it should seek out the next message of ANY type,
and treat it according to its type.

I can start work on it after my delayed morning email perusal.

ABG

Abraham Getzler

Aug 2, 2017, 3:54:12 PM
to MIT App Inventor Forum
I added proper hex to decimal conversion to new version 4,
but have not yet revised buffer management,
to allow speed control testing in the meantime.
See attached, particularly the Set Speed button.

ABG

I_jump_JY61_MPU6050_JSON_V4.aia
blocks_V4.png
btnSetSpeedClick.PNG
signed_bytes.PNG

Abraham Getzler

Aug 2, 2017, 7:06:32 PM
to MIT App Inventor Forum
I got it working, thanks to the test data.

See attached.

Documentation to follow.

ABG

V5_run.PNG
blocks_V5.png
I_jump_JY61_MPU6050_JSON_V5.aia

Wytze van der Zee

Aug 3, 2017, 5:26:58 AM
to MIT App Inventor Forum
Abraham, thank you for all the work you did! 

The buffer is as fast as it was before. But when I connect the device and move/shake the accelerometer to see whether the data is changing, I only see the data change seconds later (even if this is not the real data yet). When I connect the device for, say, 5sec, I see the 'backlog' increasing in volume. After pressing 'stop', the backlog needs +/- 20-30sec until it is empty again.

I integrated the baud rate configuration into the block where the device gets connected, so it starts sending data at 20Hz from the beginning. With this configuration the backlog also fills up.

Do you think it is possible that AI is not fast enough for this much data and these calculations?

Wytze
Image_V5.PNG

Abraham Getzler

Aug 3, 2017, 12:36:44 PM
to MIT App Inventor Forum
The backlog is controlled by the difference in rates between accepting
a message and your device sending a message.

You are making goose liver, stuffing your AI2 device faster than
it has been set to work.

Last time I looked at your AI2 Clock, I had sped it up from 
1000 ms per cycle to 100 ms per cycle.

I had also restricted it to processing only one message from the buffer per
cycle, to avoid going into a while loop coma and to allow AI2
a chance to refresh the display with the newly arrived data.

The hex message you sent to your device ...
Does it control the serial bit rate (9600 vs 2*9600 bps) or
does it control how frequently the device decides to send a message?
(That's two different things.)

There are two possible models for how to deal with data transfer across
devices where you can't control the message frequency of the source ...
The Bear Swatting for Salmon At The WaterFall model,
or the Beaver Damming The Stream model.

My blocks used the Beaver Dam model, accumulating every
incoming message into the global input_buffer variable and
draining just 1 message per Clock cycle.

You could switch it to the Bear At The Waterfall model by just
replacing the contents of the global input_buffer with the newly arrived message.

You won't catch every message, but you will have a limited backlog.
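The two models, as a hypothetical Python sketch (the function names are mine, not App Inventor blocks):

```python
def beaver_dam(buffer, new_bytes):
    """Accumulate every arrival on the end of the buffer; a separate
    Clock tick drains one message at a time, so if the device sends
    faster than the app drains, the backlog grows without bound."""
    buffer.extend(new_bytes)
    return buffer

def bear_at_waterfall(buffer, new_bytes):
    """Keep only the newest arrival; unprocessed older messages are
    dropped, so the backlog stays bounded."""
    return list(new_bytes)
```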

ABG

Abraham Getzler

Aug 3, 2017, 2:21:36 PM
to MIT App Inventor Forum
I forgot to mention my error that was causing the undefined error message ...
(Never let a good mistake go to waste.)

I had used AI2 typeblocking to pull in a length of list block
to get the backlog value, during testing with that 26,000 byte
test data list.  Unfortunately, I did not notice that I had pulled in a text length
block instead, which forced the 26,000 item list into a single text value
before the internal text length calculation.  That crashed the Companion.

The error message was unfortunately vague.

ABG



Abraham Getzler

Aug 3, 2017, 2:53:32 PM
to MIT App Inventor Forum
I integrated the configuration for the baud rate in the block where the device gets connected so it starts sending data at 20Hz from the beginning

Somewhere in my clutter room I might still have a 110 baud acoustic coupler,
left over from when I had to dial into Compuserve from my kitchen rotary phone.

Yet you chose 20Hz for your baud rate?

Are you trying to communicate with a submarine running silently in
a deep water trench?  Do you mind having to take minutes to send "Hello"?

For a common baud rate across different wired serial devices,
try 9600 baud.  That would allow fast enough data traffic.

See my earlier post regarding the frequency of transmissions,
which might range from fast-talking auctioneer rates
to slow educational-TV kid-show speech rates.

ABG


Abraham Getzler

Aug 4, 2017, 1:20:51 PM
to MIT App Inventor Forum
I double checked the documentation on the chip against your post, and I
misunderstood your post.  

You are sending at a frame rate of 20 Hz, and a baud rate of 9600 baud.

So a better Clock1 period for your app would be 
  (1000 ms per second) / (20 cycles per second) = 50 ms per cycle.

ABG


Wytze van der Zee

Aug 7, 2017, 10:45:16 AM
to MIT App Inventor Forum
Hello Abraham,

Thank you for both the informative parts and the funny parts of your posts.

I will try the Bear At The Waterfall model and will also set Clock1 to 50ms instead of the 10ms I used earlier, in order to see if the backlog stays stable.
If it doesn't work I'll have to find other solutions.

Either way, thank you a lot for spending so much time on this problem and explaining it all very clearly.

Wytze

Abraham Getzler

Aug 7, 2017, 1:13:00 PM
to MIT App Inventor Forum
Let us know if you get it working?

We could use some good working examples of
byte encoded message streams.

Thanks,
ABG

Wytze van der Zee

Aug 9, 2017, 6:38:17 AM
to MIT App Inventor Forum
With some very small changes it worked!

In Clock1 I now set the SignedBytes from the Bluetooth connection directly into 'input_buffer', instead of appending them to the end of an extra variable (last_input); see the images. This was what caused the backlog to fill up.
Furthermore, I changed the Clock1 TimerInterval to 50, as you mentioned.

I attached the .aia file.
Thanx again for all your advice and help!

Wytze


Clock Block new.PNG
Clock Block old.PNG
I_jump_JY61_MPU6050_JSON_V5.aia

Abraham Getzler

Aug 9, 2017, 12:31:31 PM
to MIT App Inventor Forum
Thank you for testing!

Do the 16 bit numbers make sense?

Do they correspond to real physical measurements?

ABG

Wytze van der Zee

Aug 14, 2017, 8:51:46 AM
to MIT App Inventor Forum
The numbers make perfect sense and show the expected data (except for the temperature data, which always tells me it is somewhere between 30 and 35 degrees. Unfortunately that only happens once or twice a year in my country).

I do have one simple final question.

I need the Ax data of only the acceleration (JSON record 81). I tried to pull it out of the local variable "result" and put it into a global variable. However, I end up with the Ax data of acceleration, velocity and angle (JSON 81, 82 and 83), because they are all labelled 'Ax'. I also tried to pull it from the JSON81 label, where Ax, Ay, Az and Temp are added to the original JSON81 data, but because the 'result' data is added later, I do not see how to extract it from the JSON81.

Could you maybe give me some advice on how to extract only the Ax from JSON81 (acceleration data)?

Wytze

I_jump_JY61_MPU6050_JSON_V5_test.aia

Abraham Getzler

Aug 14, 2017, 10:05:03 AM
to MIT App Inventor Forum
the temperature data which allways tells me that it is something between 30 and 35 degrees. 
Maybe it's Centigrade?

Could you maybe give me some advice on how to extract only the Ax from JSON81 (acceleration data)?

Make a Label or a global variable to receive type81Ax.

Immediately after doing the combination of bytes into 16 bit integers,
test:
  if record type = 81 then
    set type81Ax.text to lookup in pairs (pairs, "Ax")

where pairs is the list of pairs built up from every reading.
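In Python terms, the test ABG describes might look like this (a sketch only; `lookup_in_pairs` mimics AI2's 'lookup in pairs' block, and the sample pairs reuse values from earlier in the thread):

```python
def lookup_in_pairs(pairs, key, not_found=None):
    """Minimal analogue of AI2's 'lookup in pairs' block: return the
    value paired with key, or not_found if the key is absent."""
    for k, v in pairs:
        if k == key:
            return v
    return not_found

# After combining bytes into 16-bit integers for one record:
record_type = 81
pairs = [("Ax", -123), ("Ay", 215), ("Az", 2067)]
if record_type == 81:
    type81_ax = lookup_in_pairs(pairs, "Ax")
print(type81_ax)  # -123
```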

ABG
 

Wytze van der Zee

Aug 14, 2017, 3:17:40 PM
to MIT App Inventor Forum
Thank you,

It worked!