Signal K sensor daemon, validation and multiplexing

Ilker Temir

Oct 5, 2016, 12:24:14 AM
to sig...@googlegroups.com

Hello,

I have developed a daemon for the DIY sensors that I have on my boat, starting with a BME280 temperature, pressure and humidity sensor. The daemon has a simple plugin-based architecture and can take data from multiple plugins, which are essentially Python modules. It acts as a Signal K server on the network. I have only implemented WebSocket so far.
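The plugin contract is roughly this (a simplified sketch, not the actual boatsensord code; the class and function names are illustrative):

```python
import json
from datetime import datetime, timezone

class FakeBME280Plugin:
    """Illustrative stand-in for a real sensor plugin module."""

    def read(self):
        # A real plugin would query the sensor over I2C here.
        return [
            ("environment.inside.temperature", 22.9),
            ("environment.outside.pressure", 101669.5),
            ("environment.inside.humidity", 48.6),
        ]

def build_delta(plugins, source="sources.boatsensord"):
    """Collect readings from all plugins into a single Signal K delta."""
    values = [
        {"path": path, "value": value}
        for plugin in plugins
        for path, value in plugin.read()
    ]
    return {
        "updates": [{
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "values": values,
            "source": source,
        }],
        "context": "vessels.self",
    }

delta = build_delta([FakeBME280Plugin()])
print(json.dumps(delta))
```

The daemon builds one delta like this per polling interval and pushes it to all connected WebSocket clients.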

I have a few questions:

  • I don't want to hassle the user with an MMSI or ID. I currently return 'vessels.self' for context. Is that compliant with the Signal K spec?
  • Is there a way to validate that my code complies with the Signal K spec? I saw the testclient, but I couldn't get it to work, even with the actual Node implementation. Is this the right way to validate?
  • Now that I have two Signal K servers on my Raspberry Pi (my own and Node server) running on two different ports, how can I multiplex them?
    • I saw the multiplexer code, but it doesn't have much documentation. Are there any instructions I can follow?
    • I also saw the mdns-ws provider for the Node server. I tried to leverage it, by tweaking it, to have the Node server read from my Signal K server, but that didn't succeed. I then tried the original mdns-ws provider against another Signal K Node server for comparison, and I couldn't make that work either. Is this the right approach for multiplexing two streams (one Signal K stream from my own code, and an NMEA stream from TCP)?

For reference, here is the output from my code:

Ilkers-MBP:~ itemir$ wscat --connect 'ws://192.168.1.2:1923/signalk/v1/stream?subscribe=all'
connected (press CTRL+C to quit)
< {"timestamp": "2016-10-05T04:15:42.839646+00:00", "self": "self", "version": "0.0.1", "name": "signalk-server"}
< {"updates": [{"timestamp": "2016-10-05T04:15:46.332212+00:00", "values": [{"path": "environment.inside.temperature", "value": 22.964940532787296}, {"path": "environment.outside.pressure", "value": 101669.46334475203}, {"path": "environment.inside.humidity", "value": 48.645098289722895}], "source": "sources.boatsensord"}], "context": "vessels.self"}
< {"updates": [{"timestamp": "2016-10-05T04:15:51.460690+00:00", "values": [{"path": "environment.inside.temperature", "value": 22.916005042948473}, {"path": "environment.outside.pressure", "value": 101670.2761034649}, {"path": "environment.inside.humidity", "value": 48.716331745739964}], "source": "sources.boatsensord"}], "context": "vessels.self"}
>

If you want to poke at it, it's available at https://github.com/itemir/boatsensord

Thanks,

Ilker


Teppo Kurki

Oct 5, 2016, 1:41:23 AM
to sig...@googlegroups.com
On 10/5/16, Ilker Temir <il...@ilkertemir.com> wrote:
> Hello,
>
> I have developed a daemon for the DIY sensors that I have on my boat,
> starting with a BME280 temperature, pressure and humidity sensor. The
> daemon is a simple plugin based architecture which can take data from
> multiple plugins which are essentially Python modules. It acts as a
> Signal K server on the network. I only implemented WebSocket for now.
>
> I have a few questions:
>
> * I don't want to hassle the user with a mmsi or ID. I currently
> return 'vessels.self' for context. Is that compliant with the Signal
> K spec?

Sort of. The schema says that context is a string:
https://github.com/SignalK/specification/blob/master/schemas/delta.json#L11

But context is not required, and the Node server treats a missing
context in a delta from a provider as self, so if I were you I would
just omit context. That's what I am doing in my I2C provider.
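For illustration, a delta with context omitted would look like this (values copied from the sample output above); the Node server then attributes it to self:

```json
{
  "updates": [
    {
      "timestamp": "2016-10-05T04:15:46.332212+00:00",
      "source": "sources.boatsensord",
      "values": [
        {"path": "environment.inside.temperature", "value": 22.96}
      ]
    }
  ]
}
```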

> * Is there a way to validate if my code complies with Signal K spec? I
> saw the testclient <https://github.com/SignalK/testclient> but I
> couldn't get it to work even with the actual node implementation. Is
> this the right way for validation?

Testclient would be the way to go, and last time I checked it worked
OK. It will work either via discovery or by explicitly giving it the
server to connect to.

But it will not work without HTTP, as it fetches /signalk over HTTP to
discover the versions and endpoints that the server supports.

You can use testclient without discovery with

bin/testclient demo.signalk.org 80

(I just pushed a small fix, function without discovery was broken)

For a lightweight sensor server I would not worry about full SK
compliance. For my own ESP8266 temperature sensing node, my plan is to
have it discover the primary SK server and start pushing line-oriented
SK deltas over TCP. Work in progress; it needs the TCP server part as well.
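As a sketch of that wire format (illustrative Python rather than ESP8266 code; the loopback listener here just stands in for the SK server's TCP input):

```python
import json
import socket
import threading
from datetime import datetime, timezone

def delta_line(path, value):
    """Serialize one Signal K delta as a newline-terminated JSON line."""
    delta = {
        "updates": [{
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "values": [{"path": path, "value": value}],
        }],
    }
    return json.dumps(delta) + "\n"

# Demonstrate the line-oriented protocol over a loopback TCP connection.
server = socket.socket()
server.bind(("127.0.0.1", 0))   # ephemeral port
server.listen(1)
port = server.getsockname()[1]

received = []
def accept_once():
    conn, _ = server.accept()
    received.append(conn.makefile().readline())  # one delta per line
    conn.close()

t = threading.Thread(target=accept_once)
t.start()

client = socket.create_connection(("127.0.0.1", port))
client.sendall(delta_line("environment.inside.temperature", 22.9).encode())
client.close()
t.join()
server.close()

print(received[0].strip())
```

One JSON document per line keeps the receiving side trivial: read a line, parse it, done.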

> * Now that I have two Signal K servers on my Raspberry Pi (my own and
> Node server) running on two different ports, how can I multiplex them?
> o I saw the multiplexer code
> <https://github.com/SignalK/signalk-multiplexer-node>, but
> doesn't have much documentation. Any instructions I can follow?

Multiplexer really isn't a component to use for multiplexing different
types of streams.

> o I also saw mdns-ws
>
> <https://github.com/SignalK/signalk-server-node/blob/master/providers/mdns-ws.js>
> provider for the Node server. I tried to leverage it to have
> Node server read from my Signal K server by tweaking it, which
> didn't succeed. I then tried original mdns-ws provider with
> another Signal K node server for comparison, I couldn't make it
> work either. Is this approach right to multiplex two streams
> (one Signal K from my own code, and another NMEA stream from TCP)?

mdns-ws out of the box connects to all discovered SK servers and
starts streaming from them over WS. As your server doesn't support
discovery, you need to tweak it to connect to a configured IP. Certainly
doable, and probably of general interest. Feel like doing that?

An NMEA stream over TCP is not related to mDNS or WS. You need to create
a separate pipedProvider for it: tcp => nmea0183-signalk. See
https://github.com/SignalK/signalk-server-node/blob/master/settings/volare-tcp-settings.json#L24-L46
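From memory, the entry in that settings file has roughly this shape (host and port are illustrative; check the linked volare-tcp-settings.json for the exact element names):

```json
{
  "id": "nmea-tcp",
  "pipeElements": [
    { "type": "providers/tcp", "options": { "host": "localhost", "port": 2000 } },
    { "type": "providers/liner" },
    { "type": "providers/nmea0183-signalk" }
  ]
}
```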

The idea with pipedProviders is that you create one for each of your
inputs and splice together the processing elements you need. For
example, if you have a TCP server sending out line-oriented SK deltas,
you would create the pipedProvider tcp => liner => from_json: tcp
provides the byte stream, liner chops it into lines, and from_json
parses each line from a JSON string into a JS object.
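That pipeline could be sketched as a settings entry like this (again from memory, with illustrative id, host, and port; verify the element names against an existing settings file):

```json
{
  "id": "sk-deltas-tcp",
  "pipeElements": [
    { "type": "providers/tcp", "options": { "host": "localhost", "port": 1923 } },
    { "type": "providers/liner" },
    { "type": "providers/from_json" }
  ]
}
```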

You need not do explicit multiplexing, the server merges the input
from all the providers.

Ilker Temir

Oct 5, 2016, 11:24:09 AM
to Signal K, t...@iki.fi
Thanks, please see inline:

That explains it.
 

You can use testclient without discovery with

bin/testclient demo.signalk.org 80

I did try this, but couldn't get it to work: definitely not with my code (which makes sense given the above explanation), but also not with the Node implementation. It could be due to the fix mentioned below. I will check again.


(I just pushed a small fix, function without discovery was broken) 
For a lightweight sensor server I would not worry about full SK
compliance. For my own ESP8266 temperature sensing node my plan is to
have it discover the primary SK server and start pushing line oriented
SK deltas over tcp. Work in progress, needs the tcp server part also.


Sounds like we have similar use cases. I think you prefer JS, but have a look to see if there is anything re-usable in my Python code.
 
>   * Now that I have two Signal K servers on my Raspberry Pi (my own and
>     Node server) running on two different ports, how can I multiplex them?
>       o I saw the multiplexer code
>         <https://github.com/SignalK/signalk-multiplexer-node>, but
>         doesn't have much documentation. Any instructions I can follow?

Multiplexer really isn't a component to use for multiplexing different
types of streams.

>       o I also saw mdns-ws
>
> <https://github.com/SignalK/signalk-server-node/blob/master/providers/mdns-ws.js>
>         provider for the Node server. I tried to leverage it to have
>         Node server read from my Signal K server by tweaking it, which
>         didn't succeed. I then tried original mdns-ws provider with
>         another Signal K node server for comparison, I couldn't make it
>         work either. Is this approach right to multiplex two streams
>         (one Signal K from my own code, and another NMEA stream from TCP)?

mdns-ws out of the box connects to all discovered SK servers and
starts streaming from them over ws. As your server doesn't support
discovery you need to tweak it to connect by configured ip. Certainly
doable and probably of general interest. Feel like doing that?


That's actually precisely what I did, and I already have some code (I called the static provider signalk-ws), but it didn't work. That made me take a step back and test against the actual Node implementation (on the assumption that my own server code could be non-compliant), but that didn't work either. I tried it against Node with and without mDNS, with the same result. In both cases I can get mdns-ws and signalk-ws (my tweaked provider that connects statically instead of via mDNS) to connect to the server, but there is no data flow.
 
NMEA stream over TCP is not related to mdns nor ws. You need to create
a separate pipedProvider for it: tcp => nmea0183-signalk. See
https://github.com/SignalK/signalk-server-node/blob/master/settings/volare-tcp-settings.json#L24-L46


I understand this. It is actually what I am doing, with a modified volare-tcp-settings.json configuration. My Node Signal K server reads from TCP on port 2000 and makes the data available on port 3000. This works without issues.

Now that I have another data set on the same Pi (on port 1923), I am trying to get the Signal K server to read from it in addition to port 2000. I noticed you can specify multiple sources, which is what I eventually want to do (TCP on port 2000 and Signal K on port 1923). The first part is working; I couldn't get the second part to work on its own (meaning, before combining the two), either with mdns-ws or with signalk-ws (the provider I derived from mdns-ws for static configuration).
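For reference, here is roughly the combined settings I'm aiming for (shape adapted from volare-tcp-settings.json; "providers/signalk-ws" is my own tweaked provider, and the ids, hosts, and ports are illustrative):

```json
"pipedProviders": [
  {
    "id": "nmea-tcp",
    "pipeElements": [
      { "type": "providers/tcp", "options": { "host": "localhost", "port": 2000 } },
      { "type": "providers/liner" },
      { "type": "providers/nmea0183-signalk" }
    ]
  },
  {
    "id": "boatsensord",
    "pipeElements": [
      { "type": "providers/signalk-ws", "options": { "host": "localhost", "port": 1923 } }
    ]
  }
]
```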
 
The idea with pipedProviders is that you create one for each of your
inputs and splice together the processing elements that you need. For
example if you have a tcp server sending out line oriented SK deltas
you would create pipedProvider tcp => liner => from_json: tcp
providers the byte stream, liner chops it to lines and from_json
parses each line from JSON string to js object.

Let me look into this; it may be the missing piece.