Inserting millions of records into a MySQL database using Python


I have a txt file with 100 million records (numbers). I am reading the file in Python and inserting the values into a MySQL database using a simple INSERT statement. It is taking very long, and it looks like the script will never finish. What is the optimal way to carry out this process? The script is using less than 1% of memory and 10 to 15% of CPU.

Any suggestions on how to handle such a large amount of data and insert it efficiently into the database would be appreciated.

Thanks.

Sticking with Python, you may want to try building a list of tuples from the input and passing it to the executemany() method of the Python MySQL connector, so each call sends a whole batch instead of one row at a time.

If the file is too big to hold in memory at once, you can use a generator to chunk it into something more digestible.

http://dev.mysql.com/doc/connector-python/en/connector-python-api-mysqlcursor-executemany.html
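Putting the two ideas together, here is a minimal sketch using mysql-connector-python. The generator yields batches of 1-tuples so the 100-million-line file is never fully loaded, and each batch goes through a single executemany() call. The table name `numbers`, its column `n`, the batch size, and the connection credentials are all hypothetical placeholders, adjust them to your schema.

```python
def batches(path, size=10_000):
    """Lazily read one number per line and yield lists of 1-tuples,
    so the whole file is never held in memory at once."""
    batch = []
    with open(path) as fh:
        for line in fh:
            batch.append((int(line),))
            if len(batch) == size:
                yield batch
                batch = []
    if batch:  # don't drop a trailing partial batch
        yield batch

def load(cnx, path, size=10_000):
    """Insert the file's numbers in batches via executemany()."""
    cur = cnx.cursor()
    for batch in batches(path, size):
        # one multi-row INSERT per batch instead of one per line
        cur.executemany("INSERT INTO numbers (n) VALUES (%s)", batch)
        cnx.commit()  # committing per batch keeps transactions bounded

if __name__ == "__main__":
    import mysql.connector  # assumes the mysql-connector-python package
    # hypothetical connection details -- replace with your own
    cnx = mysql.connector.connect(user="user", password="secret",
                                  database="test")
    load(cnx, "numbers.txt")
    cnx.close()
```

Batching also helps on the server side: Connector/Python rewrites an executemany() of simple INSERTs into a single multi-row INSERT statement, which is far cheaper than 100 million individual statements.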

