sub directory too large to scan

Ken Carlile

Nov 11, 2025, 9:03:33 AM
to Discuss
Hi folks, 

I'm in the midst of a migration from an old storage system to a new one. I'm not using Globus as my primary data mover, but to help the process along, I'm using it to copy from an S3 bucket (on a local-ish Ceph cluster) that holds a copy of the data to the new storage system.

In any case, I've been seeing "a resource or processing limit was exceeded" errors on some of the transfers, with the details as follows: 
{
  "error": {
    "details": "A sub directory is too large to scan"
  }
}
I'm guessing there's probably nothing I can do about this, but I'm curious what limit I'm hitting. I also wonder whether it has to do with the fact that the source is a bucket rather than a POSIX share.
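
For context, here's roughly how I'd count what sits under a prefix on the S3 side (a quick boto3 sketch; the endpoint URL, bucket, and prefix are placeholders, not my actual setup):

import boto3

# Count keys under a prefix to see how many entries a single
# "directory" listing would have to walk. All names are placeholders.
s3 = boto3.client(
    "s3",
    endpoint_url="https://ceph.example.org",  # local Ceph RGW endpoint (placeholder)
)

paginator = s3.get_paginator("list_objects_v2")
count = 0
for page in paginator.paginate(Bucket="my-bucket", Prefix="path/to/subdir/"):
    count += page.get("KeyCount", 0)

print(f"objects under prefix: {count}")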

Thanks, 
Ken

James Kube

Nov 11, 2025, 10:27:23 AM
to Ken Carlile, Discuss
Hi Ken,

The error you're encountering generally occurs when a directory contains more than 5,000,000 files in its root.

We have more details on this, as well as other Transfer limits, available here.
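
If the failure comes from one parent directory with many immediate subdirectories, a possible workaround is to submit a separate transfer task per subdirectory, so that no single task has to scan the whole tree. Below is a rough Globus SDK sketch under that assumption; the endpoint UUIDs, paths, and token are placeholders. Note that this won't help if the 5,000,000+ files all sit directly in one directory's root.

import globus_sdk

# A rough sketch, not a verified fix. Assumes you already have a valid
# transfer token; endpoint UUIDs and paths are placeholders.
tc = globus_sdk.TransferClient(
    authorizer=globus_sdk.AccessTokenAuthorizer("TRANSFER_TOKEN")
)

SRC = "source-endpoint-uuid"
DST = "dest-endpoint-uuid"
PARENT = "/my-bucket/parent/"

# Submit one task per immediate subdirectory instead of one recursive
# task for the parent, so each task scans a smaller tree.
for entry in tc.operation_ls(SRC, path=PARENT):
    if entry["type"] == "dir":
        tdata = globus_sdk.TransferData(tc, SRC, DST)
        tdata.add_item(
            PARENT + entry["name"],
            "/dest/parent/" + entry["name"],
            recursive=True,
        )
        task = tc.submit_transfer(tdata)
        print(entry["name"], "->", task["task_id"])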

We're also more than happy to take a look at the transfer itself. If you'd like us to review further, please open a ticket by sending an e-mail to sup...@globus.org (or directly via the ticketing system here) and we'll investigate.

Thank you,
James
--
James Kube
Technical Support Engineer
University of Chicago, Globus
www.globus.org