Google cloudstorage returns 404 Not Found when trying to delete from bucket (localhost)


Richard Cheesmar

May 7, 2017, 9:49:02 AM5/7/17
to Google App Engine
I am trying to delete a file (video) from Google Cloud Storage via the cloudstorage API, but although the file exists I'm getting the following error:

cloudstorage.delete('/catchamove-video/products/6411421952770048.mp4')

*** NotFoundError: Expect status [204] from Google Storage. But got status 404.
Path: '/catchamove-video/products/6411421952770048.mp4'.
Request headers: None.
Response headers: {'transfer-encoding': 'chunked', 'date': 'Sun, 07 May 2017 12:31:47 GMT', 'server': 'Development/2.0'}.
Body: ''.
Extra info: None.
Both the bucket and the file are present in the console.

I have several buckets and this file is not in the default bucket...

Adam (Cloud Platform Support)

May 7, 2017, 2:11:38 PM5/7/17
to Google App Engine
Is this called from an App Engine standard app in production? Are you authenticating with specific credentials or with application default credentials?

Richard Cheesmar

May 7, 2017, 3:05:52 PM5/7/17
to Google App Engine
I'm calling this from an App Engine standard app on localhost... so I'm assuming application default credentials.

Richard Cheesmar

May 7, 2017, 3:10:43 PM5/7/17
to Google App Engine
It works when I delete an image with cloudstorage.delete(image); however, that image is in the app's default bucket.
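
To illustrate (the image object path below is hypothetical; the video path is the one from my original post):

import cloudstorage
from google.appengine.api import app_identity

# Deleting from the app's default bucket works on the dev server.
default_bucket = '/' + app_identity.get_default_gcs_bucket_name()
cloudstorage.delete(default_bucket + '/products/some-image.jpg')  # hypothetical object

# Deleting from a non-default bucket fails with NotFoundError (404) on localhost.
cloudstorage.delete('/catchamove-video/products/6411421952770048.mp4')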




Adam (Cloud Platform Support)

May 8, 2017, 1:54:45 PM5/8/17
to google-a...@googlegroups.com
Just noticed the Development/2.0 string in the response. When testing on the development server, the Cloud Storage client only interacts with the default bucket or the bucket set as the default on the command line. You can change the default bucket by running dev_appserver.py with the --default_gcs_bucket_name [BUCKET_NAME] flag.
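
For example, assuming your app.yaml is in the current directory (and substituting the bucket name from your error message):

dev_appserver.py --default_gcs_bucket_name catchamove-video app.yaml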

Richard Cheesmar

May 10, 2017, 10:37:59 AM5/10/17
to Google App Engine
Ok, Adam, thanks, I'll check that out when I finish what I'm on...

It is a bit of a pain, though: if you need to delete items that live in two different buckets in the same RPC while testing on localhost, you can't.

Jordan (Cloud Platform Support)

May 16, 2017, 10:08:24 AM5/16/17
to Google App Engine
Ideally, performing local testing should never affect your production environment. Limiting the local devserver to a single test bucket helps prevent potential harm to your Google Cloud Storage buckets and objects. This test bucket should also be just that, a 'test' bucket completely separate from the buckets used in production.

If you find that you require additional access, you can open a feature request and provide your business requirements for allowing the devserver to have access to all of your Cloud Storage buckets at once.  

Lawrence Schlosser

May 11, 2018, 9:52:33 AM5/11/18
to Google App Engine
I think I had the same or similar issue. Posted on SO:  https://stackoverflow.com/a/50286894/3765426

Lawrence Schlosser

May 11, 2018, 9:52:33 AM5/11/18
to Google App Engine
Not sure if this is the same exact issue, but I found that all of my GCS requests from dev_appserver.py were getting routed to

localhost:8080/_ah/gcs/<BUCKET_NAME>/<OBJECT_NAME>

... as opposed to the "real" GCS URL, e.g.

https://www.googleapis.com/storage/v1/b/<BUCKET_NAME>/o/<OBJECT_NAME>

Therefore my GET requests to GCS always resulted in 404 errors.

After digging through the source code of appengine-gcs-client, it looks like you must first set an access token if you would like dev_appserver to access live/remote content in GCS.

def set_access_token(access_token):
  """Set the shared access token to authenticate with Google Cloud Storage.

  When set, the library will always attempt to communicate with the
  real Google Cloud Storage with this token even when running on dev appserver.
  Note the token could expire so it's up to you to renew it.

  When absent, the library will automatically request and refresh a token
  on appserver, or when on dev appserver, talk to a Google Cloud Storage
  stub.

  Args:
    access_token: you can get one by run 'gsutil -d ls' and copy the
      str after 'Bearer'.
  """
  global _access_token
  _access_token = access_token


Some additional hints/options can be found in cloudstorage.storage_api:
def _get_storage_api(retry_params, account_id=None):
  """Returns storage_api instance for API methods.

  Args:
    retry_params: An instance of api_utils.RetryParams. If none,
      thread's default will be used.
    account_id: Internal-use only.

  Returns:
    A storage_api instance to handle urlfetch work to GCS.
    On dev appserver, this instance will talk to a local stub by default.
    However, if you pass the arguments --appidentity_email_address and
    --appidentity_private_key_path to dev_appserver.py it will attempt to use
    the real GCS with these credentials.  Alternatively, you can set a specific
    access token with common.set_access_token.  You can also pass
    --default_gcs_bucket_name to set the default bucket.
  """



In short, I was able to remedy the issue by simply setting an access token, e.g.

cloudstorage.common.set_access_token("<TOKEN>")

See the note in the set_access_token docstring about how to get an access token.

Once I did that, all of my GCS requests were properly routed.
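
Putting it together, a minimal sketch (the token and object path are placeholders; per the docstring above, you can copy a token from the 'Bearer' string printed by gsutil -d ls, and it will expire eventually):

import cloudstorage

# Placeholder token; copy a real one from the 'Bearer ...' string printed by
# 'gsutil -d ls'. Tokens expire, so this is only suitable for local testing.
cloudstorage.common.set_access_token('<TOKEN>')

# With the token set, dev_appserver talks to the real GCS, so objects in any
# bucket (not just the default one) are reachable.
cloudstorage.delete('/<BUCKET_NAME>/<OBJECT_NAME>')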


libs

pip:
google-api-python-client==1.6.4
GoogleAppEngineCloudStorageClient==1.9.22.1

gcloud:
Google Cloud SDK 200.0.0
alpha 2018.04.30
app-engine-python 1.9.69
app-engine-python-extras 1.9.69
beta 2018.04.30
bq 2.0.33
cloud-datastore-emulator 1.4.1
core 2018.04.30
gsutil 4.31


