ralphtrent
Programmer
Hi
I have a program that reads a file line by line. Each line contains a date/time stamp, a stored procedure name, all of its parameters, and then some additional details.
I need to see how many stored procedures were executed with the exact same parameters. What I currently do is create two DataTables.
The idea of the first DataTable is to hold the name of the SP and the parameters. It gets populated as follows: I read a line of the file, then check whether the sp_name/parameters combination already exists in the table using DataTable.Rows.Find. If the row exists I move on; if not, I add the row and also add the SP name and the other details to the second DataTable. Once I've gone through the whole file, I dump the second DataTable to a file. As I write each row out (using a foreach over the Rows collection), I run DataTable.Select against my first DataTable for rows where SP_NAME equals the current row's SP_NAME. That puts the matching rows into an array, and I report the array's Length as the number of unique executions of that SP.
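If I've described that correctly, the flow looks roughly like this (a minimal sketch, not my actual code — the column names, the file name, and the pipe-delimited line format are assumptions for illustration):

```csharp
using System;
using System.Data;
using System.IO;

class SpCounter
{
    static void Main()
    {
        // First table: one row per unique (SP name, parameter string) pair.
        var unique = new DataTable();
        unique.Columns.Add("SP_NAME", typeof(string));
        unique.Columns.Add("PARAMS", typeof(string));
        // Rows.Find requires a primary key on the columns being searched.
        unique.PrimaryKey = new[] { unique.Columns["SP_NAME"], unique.Columns["PARAMS"] };

        // Second table: one row per new combination, with its extra details.
        var details = new DataTable();
        details.Columns.Add("SP_NAME", typeof(string));
        details.Columns.Add("DETAILS", typeof(string));

        // ReadLines streams the file instead of loading it all at once.
        foreach (string line in File.ReadLines("trace.log"))   // assumed file name
        {
            // Assumed format: timestamp|sp_name|params|details
            string[] parts = line.Split('|');
            string name = parts[1], prms = parts[2], extra = parts[3];

            if (unique.Rows.Find(new object[] { name, prms }) == null)
            {
                unique.Rows.Add(name, prms);
                details.Rows.Add(name, extra);
            }
        }

        foreach (DataRow row in details.Rows)
        {
            // Count the distinct parameter sets seen for this SP.
            int count = unique.Select("SP_NAME = '" + row["SP_NAME"] + "'").Length;
            Console.WriteLine("{0}: {1} unique executions", row["SP_NAME"], count);
        }
    }
}
```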
Now for the problem. I am working with a 340 MB file, and the program's memory usage has peaked over 400k. What can I do to limit the amount of memory this program uses?
One thing I tried, instead of holding onto the text of the parameters, was to turn the parameter string into a number using String.GetHashCode(). But the rare case happened where the hash value was the same for two different strings, and that threw my numbers off.
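For what it's worth, String.GetHashCode() only gives 32 bits, so on a file this size collisions are almost guaranteed (the birthday problem). One way to keep the memory savings of storing a digest instead of the full parameter text, without the collisions, is a cryptographic hash: with a 256-bit digest an accidental collision is astronomically unlikely. A sketch of that idea (the helper name is mine, not from any library):

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

static class ParamKey
{
    // Collapse an arbitrarily long parameter string into a fixed 64-character key.
    // Unlike GetHashCode()'s 32 bits, a 256-bit SHA digest makes two different
    // parameter strings hashing to the same key practically impossible,
    // so the uniqueness counts stay exact.
    public static string Hash(string parameters)
    {
        using (SHA256 sha = SHA256.Create())
        {
            byte[] digest = sha.ComputeHash(Encoding.UTF8.GetBytes(parameters));
            return BitConverter.ToString(digest).Replace("-", "");
        }
    }
}
```

The trade-off is that each key is a fixed 64 characters regardless of how long the parameter string was, so this only saves memory when the parameter strings are typically longer than that.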
Any ideas are GREATLY appreciated.
Thanks.