I'm trying to import data created by a production scheduling system into a database that serves my own system.
Thing is, the scheduler generates its results as text files (in a specific format), and I need to import that data into my system DB.
This file can contain a huge amount of data, easily 10,000+ lines, and each line in the text file becomes a record in the DB. The way I was thinking of importing it is sketched below: read the file line by line, parse each line accordingly, and insert it into the DB. But as you can see, if 1 line = 1 INSERT statement, my program will hit the RDBMS 10,000+ times and performance will suffer really badly. And since rescheduling might be performed several times a day, this import might also need to run several times a day.
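For reference, this is roughly what I had in mind (a minimal sketch only, assuming a Python DB-API driver, a tab-delimited line layout, and a "schedule" table with three columns; the real file format, schema, and RDBMS are different):

    import sqlite3  # stand-in driver here; the real target is whatever RDBMS my system uses

    def import_schedule(text_path, db_path):
        conn = sqlite3.connect(db_path)
        cur = conn.cursor()
        with open(text_path) as f:
            for line in f:
                # hypothetical layout: one tab-delimited record per line
                job_id, start_time, machine = line.rstrip("\n").split("\t")
                # 1 line = 1 INSERT, so the DB gets hit once per line
                cur.execute(
                    "INSERT INTO schedule (job_id, start_time, machine) VALUES (?, ?, ?)",
                    (job_id, start_time, machine),
                )
        conn.commit()
        conn.close()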
I can't see another, lighter way of doing this, so I'd appreciate hearing some alternatives that could help me reduce the load of the import.
TIA