Has the RESET option in the INCRBY command been removed from Redis TSDB?


Antariksh Goyal

Jan 3, 2020, 6:39:09 AM
to RedisTimeSeries Discussion Forum
I was just revisiting the documentation of Redis TSDB on https://oss.redislabs.com/redistimeseries/commands/ and could not find the 'RESET' option in incrby command. Has there been any change regarding this, and if yes what is the alternative to work with counters and gauge in Redis TSDB.

Guy Korland

Jan 4, 2020, 2:37:39 PM
to RedisTimeSeries Discussion Forum
Hi,

Yes, we removed this feature since it was very confusing and inconsistent with the rest of the API.
What did you use it for?

Technically, if you want similar functionality, you can achieve it by creating a downsampling rule with a suitable time bucket and setting a minimal retention time on the original time series.


e.g.
TS.CREATE mykey RETENTION 1
TS.CREATE tokey
TS.CREATERULE mykey tokey AGGREGATION sum 10
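
For illustration, here is a rough sketch of how samples would then roll up into the compacted series, using the key names from the example above (the timestamps are made up, and the rule uses a 10 ms bucket, so adjust the bucket to whatever period you actually need):

TS.ADD mykey 1578300000001 1
TS.ADD mykey 1578300000005 1
TS.ADD mykey 1578300000012 1
TS.RANGE tokey 1578300000000 1578300000020

With the sum rule, the first two samples should roll up into the bucket starting at 1578300000000 with value 2 and the third into the next bucket; as far as I know, a bucket is only written to the destination series once it closes, i.e. once a sample for a later bucket arrives.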

Guy

Antariksh Goyal

Jan 5, 2020, 2:15:30 PM
to RedisTimeSeries Discussion Forum
I used the RESET option to have the value of a metric increase over a specific duration of time and then reset after that. With the downsampling approach, the time bucket will always give a value for the last 'n' minutes, which in turn won't give me the correct value for a specific period and will instead give overlapping, multiply-counted values. Right? So one event will be tracked in 'n' different time instances.

Ariel Shtul

Jan 6, 2020, 11:07:37 AM
to RedisTimeSeries Discussion Forum
You can TS.ADD all these values and then TS.RANGE with AGGREGATION sum. If you used TS.DECRBY, you will have to make sure you add a negative value instead.
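
Something along these lines, where the key name, the timestamps and the 60-second (60000 ms) bucket are only placeholders:

TS.CREATE mymetric
TS.ADD mymetric 1578300000000 2
TS.ADD mymetric 1578300030000 3
TS.ADD mymetric 1578300090000 -1
TS.RANGE mymetric 1578300000000 1578300120000 AGGREGATION sum 60000

The first bucket would sum to 5 and the second to -1; the negative sample stands in for what a TS.DECRBY call would have recorded.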

Antariksh Goyal

Jan 7, 2020, 12:25:01 AM
to Ariel Shtul, RedisTimeSeries Discussion Forum
TS.ADD won't do the job: when I write different values at the same timestamp for a metric, they will be overwritten. RESET helped in that case, as it maintained the counter very well.

Thanks,
Antariksh Goyal



Ariel Shtul

Jan 7, 2020, 1:25:41 AM
to RedisTimeSeries Discussion Forum
Due to the implementation of double delta compression we have disabled overwriting a sample. We may add an option to have multiple data points for one timestamp. Would that satisfy your use case?
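
In other words, something like the following is rejected rather than updated (key name and timestamp are only illustrative, and the exact error text depends on the version):

TS.CREATE hits
TS.ADD hits 1578300000000 5
TS.ADD hits 1578300000000 3

The second TS.ADD targets a timestamp that already holds a sample, so it fails instead of overwriting or summing the value.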

Antariksh Goyal

Jan 7, 2020, 1:52:24 AM
to Ariel Shtul, RedisTimeSeries Discussion Forum
So is this option added in the current release, or will it be added in the future? And what do you mean by double delta compression, and is it implemented in the current release?

If adding on top of an existing value at a timestamp becomes possible after that, then it would work for me.

Thanks,
Antariksh Goyal



Antariksh Goyal

Jan 7, 2020, 8:00:27 AM
to RedisTimeSeries Discussion Forum
Guy,

What exactly does the TIMEBUCKET mean when creating the rule from a source key to a destination key? So for example, if the time bucket is 10 seconds, then the destination key will have a value every 10 seconds, holding the aggregation (sum, avg, etc.) of the samples that arrived during that 10-second window. And no value will be counted twice in any time bucket, right?



Guy Korland

Jan 7, 2020, 11:18:58 AM
to RedisTimeSeries Discussion Forum
>What exactly does the TIMEBUCKET mean when creating the rule from a source key to a destination key?
>So for example, if the time bucket is 10 seconds, then the destination key will have a value every 10 seconds, holding the aggregation (sum, avg, etc.) of the samples that arrived during that 10-second window.

Right

> And no value will be counted twice in any time bucket, right?

Right
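
As a concrete sketch (key names and timestamps are illustrative; a 10-second bucket is 10000 ms):

TS.CREATE events RETENTION 1
TS.CREATE events:10s
TS.CREATERULE events events:10s AGGREGATION sum 10000
TS.ADD events 1578300005000 1
TS.ADD events 1578300007000 1
TS.ADD events 1578300012000 1

Buckets are aligned, non-overlapping windows (1578300000000 to 1578300009999, then 1578300010000 to 1578300019999, and so on), so each sample lands in exactly one bucket: the first two samples contribute a sum of 2 to the first bucket and the third a sum of 1 to the next.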

Antariksh Goyal

Jan 8, 2020, 2:26:55 AM
to RedisTimeSeries Discussion Forum
@ariel, awaiting your response.



Ariel Shtul

Jan 8, 2020, 7:11:32 AM
to RedisTimeSeries Discussion Forum


On Tuesday, 7 January 2020 08:52:24 UTC+2, Antariksh Goyal wrote:
> So is this option added in the current release, or will it be added in the future? And what do you mean by double delta compression, and is it implemented in the current release?

Multiple values on one timestamp will not be included in this version.
Double delta compression saves memory and is faster. It is a win-win and it is included in the coming v1.2 release.
