Python script slowing the data load to a database

Hi, I am trying to load data into a database (MySQL and SQL Server) using INSERT statements in Python via cur.execute(query, (data_list)). After inserting about 100 MB of data, the loading process slows down, with each record taking approximately 7 minutes. I am running the Python code on Linux Red Hat 8. How can I optimise this so that it keeps running at the same speed as the first 100 MB?

Env: Linux, RAM: 8 GB, instance type: t2.medium, Python 3.8


Python 23-10-20, 6:04 p.m. samrtr
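For reference, a common cause of this kind of slowdown is executing and committing one row at a time. Below is a minimal sketch of batching the inserts with executemany() and committing once per batch. The driver (mysql-connector-python), the table measurements, its columns, and the connection details are assumptions for illustration only; the same pattern applies to pyodbc against SQL Server.

import mysql.connector

BATCH_SIZE = 5000  # tune to row size and available memory (8 GB here)

conn = mysql.connector.connect(
    host="localhost", user="loader", password="secret", database="mydb"
)
cur = conn.cursor()

# Hypothetical target table: measurements(id INT, value DOUBLE)
query = "INSERT INTO measurements (id, value) VALUES (%s, %s)"

def load_rows(rows):
    """Insert rows in fixed-size batches, committing once per batch
    instead of once per record."""
    batch = []
    for row in rows:
        batch.append(row)
        if len(batch) >= BATCH_SIZE:
            cur.executemany(query, batch)
            conn.commit()      # one commit per batch, not per row
            batch.clear()
    if batch:                  # flush the final partial batch
        cur.executemany(query, batch)
        conn.commit()

# Example usage with generated data; replace with the real data source.
load_rows((i, i * 0.5) for i in range(100_000))

cur.close()
conn.close()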
This is beyond the scope of the tutorials. Please consider using an ORM such as SQLAlchemy. Please check the active connections on your SQL DB and increase the connection limit if necessary. You can also try SQLite.
22-04-21, 10:41 a.m. ankitrj.iitb
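A minimal sketch of the SQLAlchemy suggestion above, using the Core API rather than the full ORM; the connection URL, the table measurements, and its columns are assumptions for illustration. Passing a list of dictionaries to conn.execute() makes SQLAlchemy issue an executemany-style insert for each chunk.

from sqlalchemy import (
    create_engine, MetaData, Table, Column, Integer, Float, insert
)

engine = create_engine("mysql+mysqlconnector://loader:secret@localhost/mydb")

metadata = MetaData()
measurements = Table(
    "measurements", metadata,
    Column("id", Integer, primary_key=True),
    Column("value", Float),
)
metadata.create_all(engine)   # create the table if it does not exist

rows = [{"id": i, "value": i * 0.5} for i in range(100_000)]

CHUNK = 5000
with engine.begin() as conn:  # one transaction covering the whole load
    for start in range(0, len(rows), CHUNK):
        conn.execute(insert(measurements), rows[start:start + CHUNK])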

