Compaction for one datasource suddenly stopped working


pathik poddar

Jun 10, 2025, 2:32:13 AM
to druid...@googlegroups.com
Hi All,

In our cluster, most datasources have compaction enabled, and all are working except one.
It has 11 TB of data to compact, and compaction is not running for it. We have checked the coordinator log, but there is no warning or error for that datasource.
What can we do?

Thanks,
Pathik 

Daniel Nash

Jun 10, 2025, 1:32:48 PM
to druid...@googlegroups.com
Pathik,

Would you be able to post the auto-compaction configuration for that datasource here?

The only thing that immediately occurred to me is that perhaps you are not using the default inputSegmentSizeBytes of 100 TB, and it is set smaller than the amount of data you have in a single granularity time chunk; e.g., assuming you have day granularity, inputSegmentSizeBytes is set to 1 TB and you have more than 1 TB of data in a single day.  The documentation for that parameter states: "Since a time chunk must be processed in its entirety, if the segments for a particular time chunk have a total size in bytes greater than this parameter, compaction will not run for that time chunk."
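For reference, a minimal auto-compaction spec with inputSegmentSizeBytes set explicitly might look like the sketch below (the datasource name and the tuning values are illustrative, not taken from your cluster):

```json
{
  "dataSource": "example_datasource",
  "inputSegmentSizeBytes": 100000000000000,
  "maxRowsPerSegment": 5000000,
  "skipOffsetFromLatest": "P1D"
}
```

100000000000000 bytes is the 100 TB default; if this field was lowered at some point, any time chunk larger than the configured value is silently skipped by auto-compaction.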

~Dan



pathik poddar

Jun 10, 2025, 4:00:11 PM
to druid...@googlegroups.com
Hi,

It was set to the default of 100 TB. I have submitted a compaction task for that datasource manually from the backend, and it is working.
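For anyone following along: submitting a one-off compaction task manually generally means POSTing a spec like this to the Overlord (the datasource name and interval here are illustrative):

```json
{
  "type": "compact",
  "dataSource": "example_datasource",
  "ioConfig": {
    "type": "compact",
    "inputSpec": {
      "type": "interval",
      "interval": "2025-06-01/2025-06-02"
    }
  }
}
```

The spec can be submitted with something like `curl -X POST -H 'Content-Type: application/json' -d @compact-task.json http://overlord-host:8090/druid/indexer/v1/task` (host and port are illustrative; adjust for your deployment).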

Thanks,
Pathik 

Ben Krug

Jun 10, 2025, 6:00:12 PM
to druid...@googlegroups.com
I noticed that you mentioned checking the coordinator log. Have you also checked the overlord logs and, if tasks are kicking off, the task logs?
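A quick way to do that, assuming the standard Overlord APIs and default port (host, port, datasource name, and task ID below are placeholders):

```shell
# List recent tasks for the datasource via the Overlord API
curl "http://overlord-host:8090/druid/indexer/v1/tasks?datasource=example_datasource"

# Fetch the log for a specific task to look for errors
curl "http://overlord-host:8090/druid/indexer/v1/task/<taskId>/log"
```

If no compaction tasks appear at all for the datasource, the problem is on the coordinator/auto-compaction side; if tasks appear but fail, the task logs should show why.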
