We need to write a simulation program that imitates the behavior of a single-CPU system with a preemptive round-robin (RR) scheduler, and to collect data about the operation of the simulation. We also need to create an input file representing process arrival and service times, e.g.:
30 0.7
54 17.28
...
Each line of this input file describes one process: the first process arrives at time 30 and requests 0.7 s of CPU time, the second arrives at time 54 and requests 17.28 s, and so on. After you have implemented the simulation, run experiments with this fixed input load while varying the dispatcher overhead over the values 0, 5, 10, 15, 20, and 25 milliseconds, and the time quantum over the values 50, 100, 250, and 500 milliseconds.
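Reading that input format could look something like this sketch (assuming whitespace-separated "arrival service" pairs, one per line, with times as floats; the function name is mine, not from the assignment):

```python
def parse_processes(lines):
    """Parse lines of 'arrival service' pairs into a list of
    (arrival_time, service_time) float tuples, skipping blank lines."""
    procs = []
    for line in lines:
        line = line.strip()
        if line:
            arrival, service = map(float, line.split())
            procs.append((arrival, service))
    return procs
```

It can then be fed a file object directly, e.g. `parse_processes(open("input.txt"))`, where `input.txt` is whatever you name your workload file.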
Finally, determine the average wait time and the average turnaround time over all processes.
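One way the whole simulation could be sketched is below. This is a minimal, assumption-laden version: it charges the dispatcher overhead before every time slice (including the last one for each process), assumes all times are in the same unit (so the millisecond quantum and overhead values must be converted to seconds before calling it), and breaks ties by admitting arrivals ahead of the preempted process. Your assignment's exact dispatcher semantics may differ.

```python
from collections import deque

def simulate_rr(processes, quantum, overhead):
    """Preemptive round-robin simulation on a single CPU.
    processes: list of (arrival_time, service_time) pairs.
    quantum, overhead: time slice length and dispatcher overhead,
    in the same time unit as the process list.
    Returns (average_wait, average_turnaround)."""
    n = len(processes)
    order = sorted(range(n), key=lambda i: processes[i][0])
    remaining = [s for _, s in processes]
    finish = [0.0] * n
    ready = deque()
    clock = 0.0
    i = 0       # index of next arrival in 'order'
    done = 0
    while done < n:
        # admit every process that has arrived by now
        while i < n and processes[order[i]][0] <= clock:
            ready.append(order[i]); i += 1
        if not ready:
            # CPU idle: jump to the next arrival
            clock = processes[order[i]][0]
            continue
        p = ready.popleft()
        clock += overhead                  # dispatcher overhead per slice
        run = min(quantum, remaining[p])
        clock += run
        remaining[p] -= run
        # arrivals during the slice enter the queue before the preempted process
        while i < n and processes[order[i]][0] <= clock:
            ready.append(order[i]); i += 1
        if remaining[p] > 1e-12:
            ready.append(p)                # quantum expired: requeue
        else:
            finish[p] = clock              # process completed
            done += 1
    turnaround = [finish[k] - processes[k][0] for k in range(n)]
    wait = [turnaround[k] - processes[k][1] for k in range(n)]
    return sum(wait) / n, sum(turnaround) / n
```

From there, the experiment sweep is just two nested loops over the overhead values (0, 5, 10, 15, 20, 25 ms) and the quanta (50, 100, 250, 500 ms), converting each to seconds and recording the two averages per combination.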
Can anyone help?