I've been trying to save some data to the Datastore following the Google Bookshelf Python example, using the gcloud Python library, as explained in the tutorial.
Saving small strings and timestamps worked fine. The problem is with other formats. For example, if I want to save a large JSON string:
    data['result'] = json.dumps(tags[0])

    ds = datastore.Client(current_app.config['PROJECT_ID'])
    if id:
        key = ds.key('Pred', id)
    else:
        key = ds.key('Pred')
    entity = datastore.Entity(
        key=key,
        exclude_from_indexes=['description'])
    entity.update(data)
    ds.put(entity)
    return from_datastore(entity)
It fails with an error saying the string is too long (more than 1500 bytes) to be accepted.
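For context, here is a quick way I checked that the serialized payload really does exceed the 1500-byte limit on indexed string properties (the dictionary below is a made-up stand-in for my tags[0]; the real data is larger still):

```python
import json

# Made-up stand-in for tags[0]; the real payload is larger.
tags = [{"label_%d" % i: "confidence %.6f" % (i / 100.0) for i in range(100)}]

blob = json.dumps(tags[0])
payload_size = len(blob.encode("utf-8"))

# Indexed string properties are capped at 1500 bytes, so a value of this
# size has to live in a property that is excluded from indexes.
print(payload_size, payload_size > 1500)
```

So the value clearly needs to go into an unindexed property.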
I've tried other solutions, such as the Google App Engine ndb library, but when using that library from a machine outside Compute Engine, it complained about a missing proxy API for Datastore v3. It also seems that the example in the tutorial follows DB practices, not NDB. The same question about how to insert a Text-type field in the Datastore applies to the other formats as well.
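For reference, this is the kind of thing I was attempting with ndb — just a sketch, since it needs the App Engine runtime, and the model and property names here are my own invention; as I understand it, ndb.TextProperty is the unindexed text type there:

```python
from google.appengine.ext import ndb

class Pred(ndb.Model):
    # TextProperty is not indexed, so it is not subject to the
    # 1500-byte limit that applies to indexed StringProperty values.
    result = ndb.TextProperty()

# Pred(result=json.dumps(tags[0])).put() is what fails for me from
# outside Compute Engine with the "proxy api for datastore v3" error.
```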
Thanks