google app engine - How to avoid "safety" over quota panic when accessing the datastore? (billing is enabled)


I deployed a site to Google App Engine (using Golang, with about 1000 records in the datastore). Billing is enabled and a daily budget is established. The quota details page indicates I am under quota. I do a urlfetch to obtain a TSV file, which I use to build data entities in the datastore.

Two problems:

  1. Only 778 entities get created - the log indicates that the long-running process appears to terminate prematurely, without any error message. I can't tell from the docs whether this is normal.
  2. The second step involves creating a JSON file from the entities in the datastore. This process causes a "panic: overquota", presumably because the process takes too long.

How should I proceed? Should I divide the TSV data file into several smaller files? Can I request "more time" so that I don't run into the safety quotas?

An important note: the datastore section of the Developers Console is also showing problems. Although the application has access to 778 datastore entities, the console reports 484 entities of that kind and a total of 704 entities across all kinds (the actual total is 933).

I've been working at this for a while and am wondering whether there is something going on with the system, or whether there are things I can do to get the data entities set up properly. I wish I could find more to read about these safety quotas... The remote API is working, by the way. Thanks!

It depends on how you are doing the processing, but both of these use cases can be handled within the App Engine platform.

For example, if you are performing the urlfetch and processing the file within a frontend instance, you only have 60 seconds of processing time: App Engine requires frontend instances to respond to each request within 60 seconds.

I'm making the assumption that this is what you are doing, and that your request is being terminated for that reason. To get around this time restriction you should move this type of batch data processing to a task queue, where each task only has to complete within 10 minutes.
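A minimal sketch of that pattern, assuming the classic App Engine Go SDK (google.golang.org/appengine). The handler paths, queue name ("default"), source URL, and the Record struct are all hypothetical placeholders, not anything from your app:

```go
package app

import (
	"encoding/csv"
	"net/http"
	"net/url"

	"google.golang.org/appengine"
	"google.golang.org/appengine/datastore"
	"google.golang.org/appengine/taskqueue"
	"google.golang.org/appengine/urlfetch"
)

// Record is a stand-in for whatever entity your TSV rows map to.
type Record struct {
	Name  string
	Value string
}

func init() {
	http.HandleFunc("/import", startImport)         // user-facing, must answer in 60s
	http.HandleFunc("/tasks/import-tsv", importTSV) // task handler, gets up to 10 minutes
}

// startImport enqueues the long-running import and returns immediately,
// so the frontend request stays well under the 60 second limit.
func startImport(w http.ResponseWriter, r *http.Request) {
	ctx := appengine.NewContext(r)
	t := taskqueue.NewPOSTTask("/tasks/import-tsv", url.Values{
		"source": {"https://example.com/data.tsv"}, // hypothetical source URL
	})
	if _, err := taskqueue.Add(ctx, t, "default"); err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}
	w.Write([]byte("import queued"))
}

// importTSV runs inside the task queue: fetch the TSV, parse it, and
// write the entities in batches rather than one enormous request.
func importTSV(w http.ResponseWriter, r *http.Request) {
	ctx := appengine.NewContext(r)

	resp, err := urlfetch.Client(ctx).Get(r.FormValue("source"))
	if err != nil {
		http.Error(w, err.Error(), http.StatusInternalServerError)
		return
	}
	defer resp.Body.Close()

	reader := csv.NewReader(resp.Body)
	reader.Comma = '\t' // TSV, not CSV

	var keys []*datastore.Key
	var records []Record
	for {
		row, err := reader.Read()
		if err != nil { // io.EOF or a parse error ends the loop
			break
		}
		if len(row) < 2 {
			continue
		}
		keys = append(keys, datastore.NewIncompleteKey(ctx, "Record", nil))
		records = append(records, Record{Name: row[0], Value: row[1]})

		// Flush in batches of 100 to keep individual datastore calls small.
		if len(records) == 100 {
			if _, err := datastore.PutMulti(ctx, keys, records); err != nil {
				http.Error(w, err.Error(), http.StatusInternalServerError)
				return
			}
			keys, records = nil, nil
		}
	}
	if len(records) > 0 {
		if _, err := datastore.PutMulti(ctx, keys, records); err != nil {
			http.Error(w, err.Error(), http.StatusInternalServerError)
		}
	}
}
```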

The same holds true for the reads. You either need to look at how you are reading the data out of the datastore, or you need to batch the reads in either a deferred task or a pipeline.
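For the read side, here is a minimal sketch of paging through a kind with datastore cursors, again assuming the classic App Engine Go SDK; it reuses the hypothetical Record type from the sketch above, and the kind name and page size are also assumptions:

```go
package app

import (
	"context"
	"encoding/json"
	"io"

	"google.golang.org/appengine/datastore"
)

// exportJSON streams every Record entity as JSON, reading in pages so
// that no single datastore call has to pull the whole kind at once.
// Call it from a task queue handler so it has up to 10 minutes to run.
func exportJSON(ctx context.Context, w io.Writer) error {
	enc := json.NewEncoder(w)
	base := datastore.NewQuery("Record").Limit(500) // page size is arbitrary

	var cursor *datastore.Cursor
	for {
		q := base
		if cursor != nil {
			q = q.Start(*cursor) // resume where the previous page ended
		}
		it := q.Run(ctx)

		n := 0
		for {
			var rec Record
			_, err := it.Next(&rec)
			if err == datastore.Done {
				break
			}
			if err != nil {
				return err
			}
			if err := enc.Encode(&rec); err != nil {
				return err
			}
			n++
		}
		if n == 0 {
			return nil // finished: the last page was empty
		}

		c, err := it.Cursor()
		if err != nil {
			return err
		}
		cursor = &c
	}
}
```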

Do you have a snippet you can share of how you are composing the JSON?

