Hi
I have a data logger that records different lengths of data. I download the data in my application and create graphs.
I need to show a progress bar to indicate how long the download is going to take. The logger tells me how long the data run was, and I have timed the transfer: it takes about 1 second to download 1.68 minutes of data.
If the logger run was 44 minutes, the download takes about 26 seconds.
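In other words, the calculation I am basing this on looks roughly like this (just a sketch, the helper name is my own):

// Estimated download time in seconds, assuming ~1.68 minutes of logged data
// transfer per second of download. e.g. a 44-minute run: 44 / 1.68 ≈ 26 seconds.
double EstimateDownloadSeconds(double runMinutes)
{
    return runMinutes / 1.68;
}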
When I set progressBar.Maximum = 100 and timer.Interval = 260 milliseconds, I expect the bar to take about 26 seconds to fill (100 increments at 260 ms each), but it finishes in about 6 seconds instead.
What am I doing wrong? Here is my timer event handler:
private void progressTimer(object sender, EventArgs e)
{
    // startValues1 is the status string read from the logger; field 13 holds the run length in minutes.
    subData1 = startValues1.Split(new char[] { ' ' });
    double progressTime = Convert.ToDouble(subData1[13]);
    int tValue = Convert.ToInt16(progressTime / 1.68);
    timer1.Interval = tValue;

    // Increment the value of the ProgressBar by one on each tick.
    progressBar1.Increment(1);

    // Stop the timer once the Value reaches the Maximum.
    if (progressBar1.Value == progressBar1.Maximum)
    {
        timer1.Stop();
    }
}
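For reference, this is how I expect the timer to relate to the total time (my own assumption spelled out):

// With Maximum = 100 and an increment of 1 per tick, the bar should take
// Maximum * Interval milliseconds to fill, e.g. 100 * 260 ms = 26,000 ms = 26 seconds.
int expectedTotalMs = progressBar1.Maximum * timer1.Interval;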