Download Quota Exceeded Google Drive 2022 ((FREE))


Sunniva Delrio

Jan 18, 2024, 3:36:10 PM
to wanpepenma

It's probably widely known, but Google is an asshole and puts a download quota on every file. If you ever come across a file that has exceeded the quota, you need to grab your own Google account and follow these steps:


You'll have to wait for Google to zip the file, but at least now it's downloadable. It's absurd to me that Google would rather waste energy and CPU time zipping a file that already exists instead of just letting you download it directly. This is anti-consumer. I will never ever buy the premium version of Google Drive in my life.

However, there is a Google Drive download limit. If a file on Google Drive is being viewed and downloaded by a large number of users, the file may be locked for 24 hours before the quota resets, because Google Drive wants to minimize potential abuse of its servers.

I need to store a dataset (120 GB) in my Drive to later train a model on it. If I upload it from my PC, it takes a hell of a lot of time. So I thought I would download it directly to Drive through Colab. I downloaded a 20 GB tar file from the source straight to Drive using wget, with the following:
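A minimal sketch of that approach in Python rather than raw wget (this assumes Drive is already mounted at /content/drive via drive.mount in Colab; the URL, paths, and function name are placeholders, not the poster's actual command):

```python
import shutil
import urllib.request

def stream_download(url, dest_path, chunk_size=1 << 20):
    """Stream a remote file to dest_path in fixed-size chunks,
    so the whole file never has to fit in memory."""
    with urllib.request.urlopen(url) as resp, open(dest_path, "wb") as out:
        shutil.copyfileobj(resp, out, length=chunk_size)
    return dest_path

# In Colab, after drive.mount("/content/drive"), something like:
# stream_download("https://example.com/dataset.tar",
#                 "/content/drive/MyDrive/dataset.tar")
```

Writing into the mounted folder still goes through Drive's sync layer, so the per-day upload quotas discussed below apply regardless of whether the transfer is wget or Python.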

I am using my college email ID, so there are TBs of Drive space available, and the internet connection is also good. I have tried this many times, but at some point this error pops up and the data doesn't go into Drive.

What is the problem? How do people work with large datasets on Colab? Furthermore, after I get these video clips, I have to preprocess them on Colab itself and store the final preprocessed data (120 GB) to Drive. Now I am not so sure it's as simple as just running the code, since I am stuck on the first part. Please suggest some ways to get around this issue.

You just need to wait until the activity stops. In my case, when the code finished running and the "Quota limit has been exceeded" pop-up showed up, I thought the data had stopped being uploaded, but in the background it slowly finishes adding the whole dataset to Drive given enough time, so essentially there is no problem.

Why do Drive operations sometimes fail due to quota?
Google Drive enforces various limits, including per-user and per-file operation count and bandwidth quotas. Exceeding these limits will trigger an Input/output error like the one above and show a notification in the Colab UI. A typical cause is accessing a popular shared file, or accessing too many distinct files too quickly. Workarounds include:

- Copy the file using drive.google.com and don't share it widely, so that other users don't use up its limits.
- Avoid making many small I/O reads, instead opting to copy data from Drive to the Colab VM in an archive format (e.g. .zip or .tar.gz files) and unarchive the data locally on the VM instead of in the mounted Drive directory.
- Wait a day for quota limits to reset.
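The archive-then-extract workaround can be sketched in a few lines of Python. The paths here are the usual Colab conventions (/content/drive/MyDrive is where drive.mount puts My Drive) and the function name is just illustrative:

```python
import os
import shutil
import tarfile

def fetch_and_extract(drive_archive, local_dir):
    """Copy an archive out of the mounted Drive folder with one large
    sequential read, then unpack it on the VM's local disk, so that
    later training code never issues many small reads against Drive."""
    os.makedirs(local_dir, exist_ok=True)
    local_archive = shutil.copy(drive_archive, local_dir)
    with tarfile.open(local_archive) as tar:
        tar.extractall(local_dir)
    return local_dir

# e.g. fetch_and_extract("/content/drive/MyDrive/dataset.tar.gz",
#                        "/content/data")
```

One bulk copy counts as a single large Drive read, whereas training directly off the mounted directory can generate thousands of small reads that eat into the per-user operation quota.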

Individual users can only upload 750 GB each day between My Drive and all shared drives. Users who reach the 750-GB limit or upload a file larger than 750 GB cannot upload additional files that day. Uploads that are in progress will complete. The maximum individual file size that you can upload or synchronize is 5 TB.

I am unable to sync/copy/download files from one team drive to another team drive. I am getting the errors 'file has been downloaded too many times' and 'user rate limit exceeded'; however, this has been happening for over a week, so I don't think it is a bandwidth issue.

The two remotes are two different team drives on different accounts. I have made sure that both accounts have access to each team drive. I have also set up different API credentials for each remote, so there should not be any issues there.

I realised that the backup account is not able to download directly from the main drive (download quota exceeded); however, I am able to download this from the main drive account. As a quick fix, I have set up a remote pointing to the backup drive using the main account, but this uses the bandwidth of the main account, which I use for other purposes, so it is not ideal.

I do not have the mount script I had been using, as I lost it when I migrated servers; however, I was using the old cache remote with crypt (gdrive -> cache -> crypt) and did not have any issues for the few months it was in use. In the past couple of days, I have been trying out the new rclone vfs cache mode full; however, this does not seem to have fixed anything.
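For reference, a drive-plus-crypt chain like the one described might look roughly like this in rclone.conf. The remote names and placeholder values here are illustrative, not the poster's lost config; the deprecated cache layer is dropped in favour of the VFS cache, which is a mount-time flag rather than a config entry:

```ini
# Illustrative rclone.conf sketch (not the poster's actual remotes)
[gdrive]
type = drive
scope = drive
team_drive = <shared-drive-id>

[gcrypt]
type = crypt
remote = gdrive:media
password = <password as obscured by rclone config>
```

Mounting would then be something like `rclone mount gcrypt: /mnt/media --vfs-cache-mode full`, which replaces what the old cache remote used to do.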

My issue is that I have not been able to copy files from the main drive to the backup drive for over a week, so I'm not sure if this is because of the 750 GB limit. I was planning on making this thread a few days ago, but I wanted to make sure that this wasn't because of the limit.

What's confused me is that I'm getting a bandwidth issue on my backups as well as my main drive. A few months ago, if I got a bandwidth error on my main drive, I would just mount the backup and everything would be fine; now, however, the backups seem to be linked to the main drive. Whether this is because of the way Google deduplicates content on Drive, I don't know. I use crypt, so technically I should be the only one in all of Drive with that file stored.

It seems like I'm getting throttled per directory rather than per file. For example, last night I could not download anything from the 'TV Shows' folder, while the 'Youtube' folder in my drive was fine.

Perhaps this was the case; as it stands, I'm getting a download quota error for all of the files in this drive now. I'm not sure what I did last week to fix the issue. I've not made any transfers to or from this remote for a few days now, and I'm still getting bandwidth issues. I'm 90% sure Google has changed something behind the scenes, because I've not done anything different in the past 9 months.

I am having the same issue. Even on new files I upload just to test, I get the "user rate limit exceeded" error when trying to server-side copy from my team drive to my backup team drive. I can't watch certain files on Plex either, some of which I was previously able to view. I've copied TBs of data before with zero issues, and now, even though I haven't copied any data in over a month, I'm suddenly getting this weird issue. At first I thought it was an rclone issue, but I think this is something Google has changed behind the scenes.

Do you have your drive encrypted using the crypt remote? I'm sure this isn't a fault of rclone; Google has changed something behind the scenes. We are not the only ones having this issue, and I can't find a way to fix it other than waiting and hoping for the best.

So the team drive I'm using is "owned" by email "A", and I usually upload things with that email. However, I've noticed that all of the files which aren't working were uploaded with email "B", since I sometimes upload files to that team drive with whatever email address I happen to be logged in with at the time.

If I delete files which aren't working and reupload them with email "A", they suddenly work. If I delete that same file and upload it with email "B", it no longer works on Plex, which is weird since both accounts have the same managing rights. And like I said, files which used to work are now not working. So I don't know what changed, but anything uploaded to my team drive with my other email isn't being read properly. Rclone has access to both accounts, so I don't know what's happening.

I have tried searching both the developer console IAM (I can't find any quotas related to storage for the service account) and admin.google.com for our organisation, but no user-based quotas are enabled there, nor can I find the service account.

I'm also in the same situation.
I can't find what the quota is for the service account.
In my Google Drive, uploads made by the service account appear under "Shared with me" and don't consume space from my Google Drive.
When I delete them from Google Drive, it doesn't seem to release the service account's quota.
I'm uploading via a shell script.
Does anybody know how to solve this?
My Google Drive is not for business.

EDIT2: Apparently the space is counted against the service account, not the user. In order to use the user's space, the service account needs to be configured to *impersonate* the user (see -to-fix-the-storage-exceeded-issue-for-google-drive-...). I couldn't be bothered with this, so I just created a new service account, shared my gdrive folder with it, used the new key in my shell script, and got it working again.

So I have an HTML5 video player that plays a video from Google Drive. The video is more than 100 MB in size, so I created a Google Drive API key and included it in the code for the video. However, I can only play about five videos before my quota runs out. Am I doing something wrong? Is there a more efficient way to play the video without constantly running out of quota? This is the error I'm getting:

User Rate Limit Exceeded. Rate of requests for user exceed configured project quota. You may consider re-evaluating expected per-user traffic to the API and adjust project quota limits accordingly. You may monitor aggregate quota usage and adjust limits in the API Console: =XXXXXXX"
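One common way to avoid burning API quota on playback, assuming the file is shared as "anyone with the link", is to embed Drive's own preview player rather than fetching the bytes through the API. FILE_ID below is a placeholder for the file's Drive ID:

```html
<!-- Drive's built-in preview player streams the file itself,
     so playback requests don't go through your API project.
     Replace FILE_ID with the file's actual Drive ID. -->
<iframe src="https://drive.google.com/file/d/FILE_ID/preview"
        width="640" height="360" allow="fullscreen"></iframe>
```

Note that the per-file download quota discussed elsewhere in this thread still applies to the file itself; this only sidesteps the API project's request quota.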

In the API Console, there is a similar quota referred to as Requests per 100 seconds per user. By default, it is set to 100 requests per 100 seconds per user and can be adjusted to a maximum value of 1,000. But the number of requests to the API is restricted to a maximum of 10 requests per second per user.
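Google's general guidance for rate-limit errors like this is to retry with exponential backoff. A minimal, library-agnostic sketch (the helper name and the exact retry policy are illustrative, not a Drive API client):

```python
import random
import time

def with_backoff(call, is_retryable, max_tries=5, base_delay=1.0):
    """Retry `call` with exponential backoff plus jitter, the usual
    pattern for 403 userRateLimitExceeded / 429 responses."""
    for attempt in range(max_tries):
        try:
            return call()
        except Exception as exc:
            if not is_retryable(exc) or attempt == max_tries - 1:
                raise
            # Sleep base_delay, 2x, 4x, ... plus a little random jitter
            # so many clients don't all retry at the same instant.
            time.sleep(base_delay * (2 ** attempt) +
                       random.random() * base_delay)
```

Backoff only smooths over short bursts; it cannot raise the hard per-user ceiling described above, so sustained traffic still needs a quota increase or a different serving strategy.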


On demand I can make a video of how Google fixed this issue; you can no longer get around their Drive quota error by simply copying to your Drive.
What happens when you copy to Drive is that it creates a virtual shortcut to Zibo's original file (remember? the one with the quota error?), and even though you see it under "My Drive", the file is still owned by Ziboman himself and it's not a PHYSICAL copy (think of all the bandwidth and disk space Google would have to waste because of our arrogance - amazing, huh?). Hence, each time I try to download it, even after linking it to my drive, it gives a quota error.
