I'm doing some research for an iPhone application idea I've had. The idea has two parts: the app itself, and an online back end. The back end is some form of database that the app logs to. The data structure is straightforward, but the app could potentially update and log to the back end multiple times per minute. Now, let's speculate that the app becomes successful and 1,000,000 iPhones are each updating twice per minute. If each log entry is 512 bytes, that's 1,024,000,000 bytes per minute (512 * 2 * 1,000,000), or roughly 1 GB/min, which works out to 1,474,560,000,000 bytes per day, or about 1.5 TB/day. That's a lot of data being produced.
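A quick back-of-the-envelope check of the figures above (the workload numbers are the hypothetical ones from the question, not measurements):

```python
# Hypothetical workload from the question: 1M devices, 2 updates/min,
# 512 bytes per update.
payload_bytes = 512
updates_per_minute = 2
devices = 1_000_000

per_minute = payload_bytes * updates_per_minute * devices
per_day = per_minute * 60 * 24      # 1,440 minutes in a day

print(per_minute)   # 1024000000 bytes/min  (~1 GB/min)
print(per_day)      # 1474560000000 bytes/day  (~1.5 TB/day)
```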
My question is: how would Core Data cope with such large quantities of data?
I've read some of the developer docs on it, and from what I can gather, data is stored in memory until you tell it to save to disk:
What is the intended use of the save function? Every time the application closes? Every time you have X amount of objects in memory and feel it's time to free some up?
And what about using it on the back end? On a Mac?
There isn't really an issue in terms of the quantity of data; apart from maybe the bandwidth of the host's internet connection, you shouldn't have a problem.
I'm not sure what Core Data does for you here, though; someone else will have to answer that part.
What is writing the objects? Does the iPhone app need real-time access to the DB, or is the DB only used for statistics?
You can cache the data on the back end and put it in a queue, then have a separate thread read from the queue and write to disk or the database as needed.
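The cache-and-queue idea can be sketched like this (a minimal illustration, not a production design; `write_record` is a hypothetical stand-in for the real disk or database write):

```python
import queue
import threading

# Front-end handlers enqueue log records and return immediately;
# a single background thread drains the queue and persists them.
written = []

def write_record(record):
    written.append(record)          # placeholder for a file/DB write

log_queue = queue.Queue()

def writer_loop():
    while True:
        record = log_queue.get()
        if record is None:          # sentinel: shut the writer down
            break
        write_record(record)

writer = threading.Thread(target=writer_loop)
writer.start()

# A request handler just enqueues and returns:
log_queue.put({"device": "abc123", "payload": b"\x00" * 512})

# On shutdown, send the sentinel and wait for the writer to finish:
log_queue.put(None)
writer.join()
print(len(written))  # 1
```

The point of the design is that the slow write happens off the request path, so bursts of incoming logs are absorbed by the queue.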
If you want to write directly to the DB, you'll need to batch the rows (multiple rows per commit). You'll have to work out what speed your DB can sustain.
If you're not depending on live access, one strategy I've seen is to write a flat file in CSV format and cut it off at 50 MB. Each time the limit is reached, create a new file, incrementing a number in the filename, i.e. myfile0001.csv, myfile0002.csv, etc. Then have scripts that check for the next file in the sequence and load it using normal DB utilities.
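The rotating flat-file strategy can be sketched as follows (a toy size limit is used here so the rotation actually triggers; the answer's real cutoff is 50 MB, and the filename pattern follows the answer's myfile0001.csv example):

```python
import csv
import os

LIMIT_BYTES = 200                   # 50 MB in practice; tiny here for demo
file_index = 1

def current_name():
    return f"myfile{file_index:04d}.csv"

def write_row(row):
    global file_index
    # Cut off the current file once it reaches the limit and start the next.
    if os.path.exists(current_name()) and os.path.getsize(current_name()) >= LIMIT_BYTES:
        file_index += 1
    with open(current_name(), "a", newline="") as f:
        csv.writer(f).writerow(row)

for i in range(20):
    write_row([f"device-{i}", 512])

# A separate loader script would then pick up myfile0001.csv,
# myfile0002.csv, ... in sequence and bulk-load them with normal DB utilities.
```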