django - Python - Parsing large text file and inserting data into database


I may be undertaking too large of a project for a noob, but I'm trying to host an unofficial API for KickassTorrents. Currently, they offer text dumps of their entire database that are around 650MB.

Right now I'm reading the text file with Python and inserting it into the database using Django's ORM:

    with open('hourlydump.txt', 'r') as f:
        for line in f:
            sections = line.split('|')
            Torrent.objects.create(...)

Using the hourly dump as a test (which is ~900KB), I came up with an execution time of about two minutes. Scaling that up to 700MB, this speed is obviously impractical.

I'm thinking this problem must have a solution, but I'm not sure what it would be. I'm sure the time to load the entire database on my own will still be significant, but I'm hoping there's a more efficient solution I don't know about that would reduce the execution time to less than 25 hours.

Edit: The bottleneck is inserting into the database.

Inserting with the ORM:

    $ python manage.py create_data
    Execution time: 134.284000158

Just creating the objects and storing them in a list:

    $ python manage.py create_data
    Execution time: 1.18499994278
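For context, here is a minimal sketch of what a create_data management command like the one timed above could look like. The app name (torrents), the model fields, and the field-to-column mapping are all hypothetical, not from the original post:

    from time import time

    from django.core.management.base import BaseCommand

    from torrents.models import Torrent  # hypothetical app/model names


    class Command(BaseCommand):
        help = 'Parse the dump file and insert rows, timing the run'

        def handle(self, *args, **options):
            start = time()
            with open('hourlydump.txt', 'r') as f:
                for line in f:
                    sections = line.split('|')
                    # One INSERT per call -- this is the slow path
                    # measured above. Field names are illustrative only.
                    Torrent.objects.create(name=sections[0],
                                           size=sections[1])
            self.stdout.write('Execution time: %s' % (time() - start))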

I'd appreciate any guidance you might have.

Welp, I'm dumb.

bulk_create is my new friend.

If anyone has the same problem: don't insert rows one at a time. Each create() call is one INSERT statement. Instead, add the objects to a list, and then bulk_create(the_list).
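A minimal sketch of the fix, assuming the same hypothetical Torrent model and field names as above:

    from torrents.models import Torrent  # hypothetical app/model names

    torrents = []
    with open('hourlydump.txt', 'r') as f:
        for line in f:
            sections = line.split('|')
            # Build unsaved instances in memory; fields are illustrative.
            torrents.append(Torrent(name=sections[0], size=sections[1]))

    # One bulk INSERT for the whole list instead of one per row.
    Torrent.objects.bulk_create(torrents)

For a 650MB dump it may also be worth passing batch_size to bulk_create (e.g. bulk_create(torrents, batch_size=10000)) so each individual query stays a manageable size, and flushing the list in chunks so memory usage stays bounded.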

