Google Drive Download Quota Exceeded Reddit

Kipa Crawn

Aug 5, 2024, 6:54:48 AM
to mingligagar
So I have been using a Google business account for a while and have plenty of data on the drive. However, since yesterday it appears I have hit a limit on the number of files on the drive, as rclone is giving me the following IO error:

Files + folders total is approximately 4.5M, with lots of small files. Since yesterday I have removed some of the non-critical files (30,000 or so) and cleared them from the trash to see if that makes a difference, but the error has persisted.
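
In case it helps to sanity-check those numbers, rclone can report what it sees for the remote; a quick sketch, where "gdrive:" is a placeholder remote name rather than the poster's actual config:

# Count objects and total size under the remote (slow with millions of files; --fast-list reduces API calls on Drive).
rclone size gdrive: --fast-list
# Show the storage quota and usage Drive reports for the account.
rclone about gdrive: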


Thank you. I visited that page yesterday and it makes no mention of any quota for the maximum number of files/folders allowed on Google Drive, which is what the error implies I have exceeded. Has anyone else had this error before? Surely I am not alone.


I have been having the same issue since the 14th of February.

rclone and other services uploading to my Google Drive come up with errors.

I cannot upload anything through the web interface either.

Running on a "Google Workspace Enterprise Standard" subscription with a single account.

This carried over from a "G Suite Enterprise" subscription with the same single user.

The admin console shows 107.91 TB used of the shared 5 TB.

This subscription has been above 100 TB for a long time now.

Google Workspace support told me to contact API support for the error mentioned by OP.


Mind you, I have 800TB of available space across all of my accounts and am only using 111TB currently. The folder in question is well below the 400k item limit as well; according to the admin interface, the shared drive in question is only using 26% of its 400k quota.


I am unable to sync/copy/download files from one team drive to another. I am getting the errors 'file has been downloaded too many times' and 'user rate limit exceeded'; however, this has been happening for over a week, so I don't think it is a bandwidth issue.
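
A hedged aside, not from the posters here: if 'user rate limit exceeded' is being triggered by bursts of API requests rather than the daily download quota, slowing rclone down can sometimes help. The remote names and numbers below are placeholders:

# Throttle API calls so Drive's per-user request quota is less likely to trip.
#   --tpslimit 5        cap transactions per second against the Drive API
#   --tpslimit-burst 5  allow short bursts of up to 5 calls
#   --transfers 2       fewer parallel transfers means fewer concurrent requests
rclone copy teamdriveA:media teamdriveB:media --tpslimit 5 --tpslimit-burst 5 --transfers 2 -P -v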


I started getting this issue when I upgraded to a different server; however, some friends are also having a similar issue, and I have found a few posts on Reddit describing something similar. I should note that I have been using this setup for several months and it has suddenly broken.


The two remotes are two different team drives on different accounts. I have made sure that both accounts have access to each team drive. I have also set up a different API client for each remote, so there should not be any issues there.
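
For context, a rough sketch of what two Drive remotes with their own API clients look like in rclone.conf; every value here is a placeholder, not taken from this thread:

[maindrive]
type = drive
client_id = <client-id-for-account-A>.apps.googleusercontent.com
client_secret = <client-secret-A>
scope = drive
team_drive = <team-drive-id-A>
token = {"access_token":"...","refresh_token":"...","expiry":"..."}

[backupdrive]
type = drive
client_id = <client-id-for-account-B>.apps.googleusercontent.com
client_secret = <client-secret-B>
scope = drive
team_drive = <team-drive-id-B>
token = {"access_token":"...","refresh_token":"...","expiry":"..."}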


I realised that the backup account is not able to download directly from the main drive (download quota exceeded); however, I am able to download from the main drive account. As a quick fix I have set up a remote pointing to the backup drive using the main account, but this uses the bandwidth of the main account, which I use for other purposes, so it is not ideal.


I do not have the mount script I had been using, as I lost it when I migrated servers; however, I was using the old cache remote with crypt (gdrive -> cache -> crypt) and did not have any issues with it for the few months it was in use. In the past couple of days I have been trying out the new rclone vfs cache mode full, but this does not seem to have fixed anything.
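
For anyone rebuilding a similar mount, a rough sketch of mounting a crypt remote (layered over a Drive remote) with the VFS full cache; the remote name, mount point, and cache sizes are assumptions, not the lost script:

# "mediacrypt" is the crypt remote wrapping the Drive remote; paths and sizes are illustrative.
rclone mount mediacrypt: /mnt/media \
    --vfs-cache-mode full \
    --vfs-cache-max-size 100G \
    --vfs-cache-max-age 24h \
    --dir-cache-time 1h \
    --poll-interval 1m \
    --allow-other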


My issue is that I have not been able to copy files from the main drive to the backup drive for over a week, so I'm not sure whether this is because of the 750GB limit. I was planning on making this thread a few days ago, but I wanted to make sure it wasn't just the limit.


I get the same error with server-side copying. From what I remember, Google has different bandwidth limits for server-side copies and regular uploads. I'll give it a few days of not doing any transfers and see if my issue gets fixed.


Hey dude, I have been having this issue for about a month now. Previously I was able to copy 750 GB/day without any problem; now I can't even copy 1 GB of data. Any update on your end on this issue?


After I made my last post it seemed to be working fine, until last night. Yesterday I was adding a lot of media to my Plex server, and I guess the intro detection downloaded a lot and caused the bandwidth issue. I believe something similar happened last week when I first had the issue - I had added a lot of media and used Plex to read the metadata from the files instead of getting it elsewhere. This must've downloaded the files and caused a bandwidth issue.


What's confused me is that I'm getting a bandwidth issue on my backups as well as my main drive. A few months ago, if I got a bandwidth error on my main drive, I would just mount the backup and everything would be fine; now the backups seem to be linked to the main drive. Whether this is because of the way Google deduplicates content on Drive, I don't know. I use crypt, so technically I should be the only one with that file stored in all of Drive.


It seems like I'm getting throttled per directory rather than per file. For example, last night I could not download anything from the 'TV Shows' folder, while the 'Youtube' folder in my drive was fine.


In terms of fixing your problem, I can only suggest waiting it out and hoping that it gets fixed; I didn't do anything special the first time. If you're also using Plex (or an alternative), I'd recommend turning off extras like intro detection, or at least limiting them to the maintenance window, as this seems to have caused my issues.


Perhaps this was the case; as it stands, I'm getting a download quota error for all of the files in this drive now. I'm not sure what I did last week to fix the issue. I've not made any transfers to or from this remote for a few days now and I'm still getting bandwidth issues. I'm 90% sure Google has changed something behind the scenes, because I've not done anything different in the past 9 months.


I am having the same issue. Even on new files I upload just to test, I get "user rate limit exceeded" when trying to server-side copy from my team drive to my backup team drive. I can't watch certain files on Plex either, some of which I was previously able to view. I've copied TBs of data before with zero issues, and now, even though I haven't copied any data in over a month, I'm suddenly getting this weird issue. At first I thought it was an rclone issue, but I think this is something Google has changed behind the scenes.


Do you have your drive encrypted using the crypt remote? I'm sure this isn't a fault of rclone but something Google Drive has changed behind the scenes. We are not the only ones having this issue, and I can't find a way to fix it other than waiting and hoping for the best.


So the team drive I'm using is "owned" by email "A", and I usually upload things with that email. However, I've noticed that all of the files which aren't working were uploaded with email "B", since I sometimes upload files to that team drive with whatever email address I happen to be logged in with at the time.


If I delete the files which aren't working and re-upload them with email "A", they suddenly work. If I delete that same file and upload it with email "B", it no longer works on Plex, which is weird since both accounts have the same manager rights. And like I said, files which used to work are now not working. So I don't know what changed, but anything uploaded to my team drive with my other email isn't being read properly. rclone has access to both accounts, so I don't know what's happening.


Hello,

I want to do a server-side copy of my files from one Google team drive to another. I am the manager of both team drives.

I am completely inexperienced when it comes to rclone or running commands, which means I don't know how to do even the most basic things. I only managed to mount my team drive because of a step-by-step guide on Reddit.


Step 3 - Transfer files normally using the regular move/copy/sync commands

rclone will detect whether it can use server-side copying (assuming it is enabled) and use that instead of piping the data through your local computer. If you have verbose output on, it will indicate this in the output, but of course it will also be obvious from just looking at your network graph...


If you are completely fresh off the boat - an example of how to copy from X to Y:

rclone copy MyRemoteName1:/backupfolder MyRemoteName2:/backupfolder -P -v

(-P gives you a progress indicator and -v will show verbose output; neither is required, but they can be useful to see what is going on while you are still unfamiliar with the process).
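
One hedged addition to the example above: when the source and destination are two separately configured Drive remotes, rclone does not attempt a server-side copy across them by default, so the Drive backend flag for that may be needed:

# Allow server-side copies between two separately configured Drive remotes.
rclone copy MyRemoteName1:/backupfolder MyRemoteName2:/backupfolder --drive-server-side-across-configs -P -v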


Give that a try and come back to ask questions once you get stuck on something.

I am more than willing to help - but there is little point in me explaining how to set up a basic remote when there is both a guide and a built-in configuration menu to help you do this. I'd much prefer answering more specific questions if you get stuck.


Thanks, I'll try it out tomorrow and let you know how it went. I already know how to set up remotes. Quick question though: should I set up a separate API client for the new remote or use the same one I'm using for the first team drive?


The benefit of using the same one across all remotes is that making sure the required data access is in place for the user is much easier. Otherwise you have to make sure to share access for all backup accounts back to the primary user. Team drives are easier to manage in this regard.


Tried it out, and it works perfectly! Thanks a lot, man. I'm assuming I'll still run into the 750GB daily limit - will it just throw up an error if I do? And when I start it up the next day, do I have to add anything so it doesn't try to recopy anything it's already copied, or will it just automatically skip those items?


But if you run into the limit nothing bad happens... rclone will just keep retrying for a while and eventually error out on transferring some files. Simply running the same command again when you have more quota will transfer any files that errored out because of this; files that were already copied are skipped automatically, since copy only transfers files that are missing or different on the destination.
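
A hedged extra, not something suggested in the thread: if you'd rather have rclone stop cleanly once the daily upload quota is reached instead of retrying until it errors, the Drive backend has flags for that (remote names below are the same placeholders as above):

# Stop after roughly one day's upload allowance, and abort early if Drive reports the upload limit was hit.
rclone copy MyRemoteName1:/backupfolder MyRemoteName2:/backupfolder --max-transfer 750G --drive-stop-on-upload-limit -P -v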
