Right then, I have a CSV file provided by a third party that needs to be used on a website we are developing. The CSV file currently has 22,644 rows with 16 fields per row! I know this is not the most efficient approach; database replication would be better and preferred, but our hand has been forced on this one!
When I try to read the CSV file using CFHTTP I get an error stating: "Invalid file format". This only happens when I use the full file (2.2 MB); when I use a cut-down version of the CSV file (27 rows, 16 fields per row) it executes as it is meant to.
Is there only so much data that CFHTTP can drag down?
I can't seem to find any mention of a file size limitation in the documentation. Could this be the case?
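For reference, the call I'm making looks roughly like this (the URL and query name here are placeholders, not the real ones):

```cfml
<!--- Fetch the third-party CSV over HTTP and have CFHTTP parse it
      into a query object. Supplying "name" tells CFHTTP to treat the
      response body as a delimited file; "firstrowasheaders" uses the
      first row as the query column names. --->
<cfhttp url="https://example.com/data/feed.csv"
        method="get"
        name="csvData"
        delimiter=","
        textqualifier=""""
        firstrowasheaders="yes" />

<cfoutput>#csvData.recordCount# rows retrieved</cfoutput>
```

It is when CFHTTP tries to convert the response into the query (the `name` attribute) that the "Invalid file format" error appears, and only with the full 2.2 MB file.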