Suppressing duplicate messages


Robert Terzi

Nov 7, 2015, 6:58:13 PM
to rtl...@googlegroups.com
I'm interested in what people's thoughts are with regard to duplicate message suppression in rtl_433.

For LaCrosse TX, I suppress duplicate messages from the repeat.

However for the DSC security contacts I chose not to do that. Downstream from rtl_433, the repeated messages provide information on signal decoding reliability that would be lost if duplicates were eliminated.

Some of the Acurite devices send the data 3 times. The consoles give "signal quality" reports based on the number of times the data is successfully decoded -- all 3 - high, 2 - medium, 1 - low.

I think there is general agreement that the output of rtl_433 is primarily intended for use by other programs. Given that, I'm leaning toward the idea that duplicate suppression should be the job of the downstream parsers, not rtl_433 as it is throwing information away that might be useful.

Comments?

--Rob

Benjamin Larsson

Nov 8, 2015, 4:58:07 AM
to rtl...@googlegroups.com
Whatever consumer you have, it should be capable of handling near-similar/duplicate events. In some cases the protocol decoder can/should merge the events. But the main idea is that the protocol decoders should expose the hardware properties and be as simple as possible.

MvH
Benjamin Larsson

Tom Foth

Nov 8, 2015, 7:34:33 PM
to rtl_433
I'm new here... and just started to use rtl_433 with my Oregon Scientific sensors.  If I had my choice, I'd have command line options like:

--dont-dedup (continues current behavior)
--dedup n (where n is the window in milliseconds within which repeated messages are treated as duplicates)
--dedup-error (which, when used with --dedup, would cause an error message to be generated if messages from the same device didn't match)

The default behavior would be --dont-dedup.

Just a thought.

I'm dealing with the deduping in my consumer now.

But again, I'm just the new guy who hasn't done anything but use the software...

Tommy Vestermark

Nov 10, 2015, 5:02:10 PM
to rtl_433
Hi Robert,

Some thoughts...

For most transmitters the duplication is a matter of increasing the chance of reception of a single event. But for many transmitters the error detection/correction capability of the protocol is so crappy that (de)duplication is also essential to filter out corrupted data (it seems they typically have many repeats to compensate). Therefore I personally think we should really strive to do our best effort to actively use the duplicated data to increase quality of the data output - but only output the content of the transmitted event once. Reception quality could be a parameter of the data output.

Some transmitters, like remote controls, will output messages for as long as a key is pressed. In this situation the duration of the key press may actually carry valuable information (maybe even rolling codes), and all messages should be output as is.

Maybe a recommendation like this:
  • For transmitters with good protocol error detection/correction (CRC etc.), the first good message is output and duplicated messages identified to be from the same event are suppressed (e.g. time based).
  • For transmitters with crappy error detection/correction at least N of M messages from same event must be identical before outputting data as a single message. Some helper functions could be added to aid in this.
  • For transmitters where the repeats carry extra information (e.g. duration of a key press) all messages are output.
This is however not easy to generalize, and the huge variation among the device decoders seems to reflect that. Many device decoders currently just look at the first package in a message and ignore the rest... I made this overview some months ago.

/Tommy

Tom Foth

Nov 10, 2015, 11:53:40 PM
to rtl_433
Tommy,

Would implementing the parameters as I outlined them accomplish what you stated?

It seems like they would... at least I had the same intent as you mentioned.

Tom

ygator

Feb 24, 2016, 6:44:34 PM
to rtl_433
In my osv1 decoder I suppress the output by keeping the last decoded message along with the time of capture.  If the next decoded message is the same and the time is within, say, 5 seconds, I suppress the output.
This was easy to do since the decoded data is only 32 bits.  I am not sure if this is a good way, though, since I don't know whether a message from a different osv1 sensor could pop in between the repeats.
I know this cannot be done for my lacrossews, since I have seen os messages occur in between its repeats, so another lacrossews message could presumably come in as well.

Below is what I am doing now.  I just need to save the clist in case the script ends between repeats.

# helper functions (moved before first use so the script runs top to bottom)

# record a message so later repeats can be detected; entries live for 7 seconds
function enque() {
 local time
 local data
 time=${1}
 data=${2}
 let time=time+7

 clistTime[${clistCount}]=${time}
 clistData[${clistCount}]=${data}
 let clistCount=clistCount+1
}

# return non-zero if data matches a still-current entry
function duplicateData() {
 local time
 local data
 local ret
 local i
 local j

 # args
 time=${1}
 data=${2}

 # Not a duplicate to start
 ret=0

 # skip past old data
 i=0
 while [ ${i} -lt ${clistCount} ]; do
  if [ ${clistTime[${i}]} -ge ${time} ]; then
   break
  fi
  let i=i+1
 done

 # compact the list and look for a match among current entries
 j=0
 while [ ${i} -lt ${clistCount} ]; do
  clistTime[${j}]=${clistTime[${i}]}
  clistData[${j}]=${clistData[${i}]}
  if [ "${clistData[${i}]}" == "${data}" ]; then
   ret=1   # use 1 rather than -1; 'return' needs a valid exit status
  fi
  let i=i+1
  let j=j+1
 done
 let clistCount=${j}
 return ${ret}
}

clistCount=0

while true
do
 # read data from queue
 data=`./q getque ${QNAME}`

 # check if json
 if [ "${data:0:1}" = "{" ]; then
  # strip time
  dataWithoutTime=`echo ${data} | sed 's/"time" : [^,]*, //'`
  time=`echo ${data} | sed 's/.*"time" : "\([^"]*\).*/\1/'`
  time=`date -d "${time}" +%s`
  model=`echo ${data} | sed 's/.*"model" : "\([^"]*\).*/\1/'`

  # check if duplicate
  duplicateData ${time} "${dataWithoutTime}"

  # post if not duplicate
  if [ $? -eq 0 ]; then
   fixedData=`echo ${data} | sed 's/"id" :/"sid" :/'`
   if [ "${model}" == "Weather Sensor THGR122N" ]; then
    curl -s -X POST -d "${fixedData}" -H "Content-Type: application/json" "${POSTURL}" > /dev/null 2> /dev/null
   elif [ "${model}" == "OSv1 Temperature Sensor" ]; then
    echo OSV1 ${time} ${fixedData}
   else
    echo LCWS ${time} ${fixedData}
   fi
  else
   echo DUPE ${time} ${dataWithoutTime}
  fi
  enque ${time} "${dataWithoutTime}"
 # else
 #  echo JUNK ${data}
 fi
 ./q rlsque test.Q
done




Robert Terzi

Feb 25, 2016, 6:49:27 PM
to rtl_433
I think you are agreeing with me and/or making my case -- Duplicate message suppression is probably best handled in the downstream consumer that is best suited to make decisions about what it wants to keep and what it wants to discard.

(Implementing suppression in rtl_433 means keeping state in rtl_433.  This means making some (possibly hard coded) assumptions about the number of devices you need to track.  This can lead to some brittle code that isn't easy to change.)

ygator

Mar 5, 2016, 5:24:21 PM
to rtl_433
I am only familiar with the sensors I have, but with them it is a simple task to suppress duplicates.  I think there would be a benefit for users who want to feed less sophisticated back-ends from rtl_433.
In my opinion it would be nice to have an option to suppress duplicates.
