I have a situation where I am receiving very large data files in CSV or text format which (due to our less-than-good network) I have to break into smaller files in order to load them into SQL Server. I need files of no more than 500,000 records each - so it needs to be split by record count as opposed to file size.
I'd like to do this using Access VBA, but I am open to other tools as well. All of the free/shareware tools I see out there split files by byte size - and they don't seem to create usable text files as part of the process.
My thought is to use GetRows() and CacheSize instead of just looping through the recordset - but I am unfamiliar with how to make efficient use of these. I suspect looping through 6 or 7 million records one at a time would just plain take too long...
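To illustrate what I'm after, here's a rough sketch in Python (one of the "other tools" I'd consider) - the file names and the `split_csv` helper are just placeholders, not anything I have working. It streams the source file one row at a time rather than pulling everything into a recordset, writing a new numbered chunk file every 500,000 records and repeating the header row in each chunk so every piece is a usable CSV on its own:

```python
import csv

def split_csv(src_path, max_records=500_000):
    """Split src_path into numbered chunk files of at most max_records
    data rows each, repeating the header row in every chunk.
    Returns the number of chunk files written."""
    with open(src_path, newline="") as src:
        reader = csv.reader(src)
        header = next(reader)          # keep the header for every chunk
        chunk_num = 0
        out = None
        writer = None
        count = max_records            # force a new file on the first data row
        for row in reader:
            if count >= max_records:   # current chunk is full; start a new one
                if out:
                    out.close()
                chunk_num += 1
                out = open(f"{src_path}.part{chunk_num}.csv", "w", newline="")
                writer = csv.writer(out)
                writer.writerow(header)
                count = 0
            writer.writerow(row)
            count += 1
        if out:
            out.close()
    return chunk_num
```

Because it never holds more than one row in memory, this should handle 6 or 7 million records without the slowdown I'm worried about in a VBA recordset loop - though I'd still prefer an Access VBA equivalent if one is practical.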
I'm open to any suggestions!
Thanks in advance!