Some things to consider:
1) How important is this feature to the whole application? That is, how much time and effort should be spent on it?
2) How accurate do you need to be? Will it matter if the calculations are off by 10%, 50%, 200%, etc.?
3) How long are the downloads going to be? It seems to me that more accurate calculations are needed if the expected download time is minutes or hours rather than seconds or fractions of a second.
For fun, let's say that it is very important, needs to be highly accurate, and the download times run from minutes to hours. I will add the assumption that recent download-speed samples matter more than older ones.
1) Determine the speed over the last unit of time (say, one second) and append it to a data structure (an array).
2) Average the array elements in groups (say the first third, second third, and last third; or maybe the first 60%, the next 30%, and the last 10%). More groups should help, but I suspect there will be diminishing returns.
3) Add up the averages with weight factors, for example:
( Avg Group-1 * 1 ) + ( Avg Group-2 * 3 ) + ( Avg Group-3 * 6 ). In this case 60% of the weight comes from the most recent download-speed information.
4) Final download speed = ( Weighted Sum ) / ( Total of the Weights )
Play with the number of groups, the number of elements per group, and the weights to find the best solution. While debugging, I suggest saving the group values, projections, and final actual times in a data structure so you can review them.
Lion Crest Software Services
Anthony L. Testi