python - Google App Engine: Efficient Datastore Read/Write Operations to Save Quota
I've created a Google App Engine app using Python. The application deals with a lot of usernames: the datastore holds about 50k of them, and each username has a unique hash value stored alongside it.
When a user submits a username, the application first checks whether that username already exists in the datastore.
If it is a new username, the application calculates a new hash for the name and stores both the name and the hash in the datastore.
If the username already exists, the application retrieves the old hash from the datastore.
Sample code:
    class Names(db.Model):
        name = db.StringProperty(required=True)
        hash = db.StringProperty(required=True)

    username = "debasish"
    user_db = db.GqlQuery("SELECT * FROM Names WHERE name = :1", username)
    user = user_db.get()
    if user is None:
        # Doesn't exist in the db, so calculate a new hash for the name and store both.
        e = Names(name=username, hash="badasdbashdbhasbdasbdbjasbdjbasjdbasbdbasjdbjasbd")
        e.put()
    else:
        # Retrieve the old hash.
        self.response.out.write('{"name":"' + user.name + '","hash":"' + user.hash + '"}')
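(As an aside, hand-concatenating the JSON response breaks if a field ever contains a quote character; the standard library's json module handles quoting and escaping. A small sketch, independent of GAE; the helper name is illustrative:)

```python
import json

# json.dumps handles quoting and escaping of field values,
# unlike manual string concatenation.
def response_body(name, hash_value):
    return json.dumps({"name": name, "hash": hash_value})

# response_body("debasish", "abc") -> '{"name": "debasish", "hash": "abc"}'
```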
The problem I'm facing is GAE's free datastore read operation quota: the app exceeds it and stops working.
I've tried to implement memcache by adding the entire db to memcache, but that was a failure; the result was even worse.
    def get_fresh_all(self):
        all_names = db.GqlQuery("SELECT * FROM Names")
        memcache.add('full_db', all_names, 3600)
        return all_names
So guys, please suggest what I'm doing wrong. How can I make datastore read operations more efficient?
Thanks in advance.
you can:
- switch to ndb, whose caching is automatic
- query for keys instead of entities:
      SELECT __key__ FROM Names WHERE name = :1
- reduce the related indexes (this surely decreases write ops, and perhaps read ops as well)
- rewrite your entities so that the username is the key_name, and use the method get_or_insert():
      user = Names.get_or_insert("debasish", hash="badasdbashdbhasbd")