Hello,
I'm trying to load data into the datastore for use by my application.
I'm loading more than 13,000 entities of one kind from a CSV file via the bulkloader.
Since the new free quota, loading fails by consuming the entire quota while loading only 1,000-2,000 entities.
I'm using the Python SDK 1.5.3 (I have not yet tested 1.6.0).
The command line I use is:
appcfg.py upload_data --batch_size=1000 --rps_limit=1000 --config_file=bulkloader.yaml --filename=contrats.csv --kind=Contrat --url "https://%APP_HOST%.appspot.com/remote_api"
My bulkloader.yaml looks like this (not every property is shown):
transformers:
- kind: Contrat
  connector: csv
  connector_options:
    # TODO: Add connector options here--these are specific to each connector.
    encoding: windows-1252
    import_options:
      dialect: 'excel'
      delimiter: ';'
    export_options:
      dialect: 'excel'
      delimiter: ';'
  property_map:
    - property: __key__
      external_name: key
      export_transform: transform.key_id_or_name_as_string
      import_transform: transform.create_foreign_key('Contrat', key_is_id=True)
    - property: canalClientId
      external_name: canalClientId
      # Type: Integer Stats: 6010 properties of this type in this kind.
      import_transform: transform.none_if_empty(int)
    - property: codePostalPDL
      external_name: codePostalPDL
      # Type: String Stats: 7135 properties of this type in this kind.
My file contrats.csv was obtained by downloading the data from another server using the bulkloader.
The upload works on the local dev server (with a different command line) and on another paid App Engine app.
How can trying to load 13k entities consume 50k Datastore Write Operations (while loading only 2k of them)?
(Datastore Write Operations was at 0% before the first try.)
Thanks,
Mathieu