1) The UNIX operating system uses round-robin time slicing with multilevel feedback.
Assume that there are 10 processes waiting in a queue, which is implemented as a linked list of PCBs (process control blocks). Assume each PCB holds information about the process ID, the CPU burst time required, and the amount of memory being used.
Assume the time slice is 2 units. Simulate round-robin time slicing until all the jobs complete and find the average waiting time. Then modify your program to include random arrival of jobs with a fixed required burst time, and find the average waiting time of the jobs completed over a simulation time of 100 units.
Can any expert help explain what this question means? I need to do this in C or C++ for my assignment. Is there any example I can refer to?
I know round robin, but implementing a linked list of PCBs with the other information ("the PCB holds the process ID, the CPU burst time required, and the amount of memory being used") together with the time slice of 2 units confuses me. Please help me; I am really desperate for a solution and scared I will fail this subject. Thank you very much.
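So far this is my rough attempt at the PCB linked list and the round-robin loop. The burst times and memory values are just numbers I made up, and I am not sure the waiting-time calculation is right:

```c
#include <stdio.h>
#include <stdlib.h>

/* One PCB node in the ready queue (the field names are my own guesses). */
typedef struct PCB {
    int pid;            /* process ID */
    int burst_left;     /* CPU burst time still required */
    int memory;         /* amount of memory being used */
    int waiting_time;   /* total time spent waiting so far */
    struct PCB *next;
} PCB;

#define NPROC 10
#define TIME_SLICE 2

int main(void) {
    /* Build the ready queue as a singly linked FIFO list of 10 PCBs. */
    PCB *head = NULL, *tail = NULL;
    for (int i = 0; i < NPROC; i++) {
        PCB *p = malloc(sizeof(PCB));
        p->pid = i + 1;
        p->burst_left = 3 + i % 5;   /* made-up burst times */
        p->memory = 100 * (i + 1);   /* made-up memory usage */
        p->waiting_time = 0;
        p->next = NULL;
        if (tail) tail->next = p; else head = p;
        tail = p;
    }

    int clock = 0, done = 0;
    long total_wait = 0;

    while (head) {
        /* Dequeue the process at the front of the ready queue. */
        PCB *p = head;
        head = head->next;
        if (!head) tail = NULL;

        /* Run it for one time slice (or less if it finishes early). */
        int run = p->burst_left < TIME_SLICE ? p->burst_left : TIME_SLICE;
        p->burst_left -= run;
        clock += run;

        /* Everyone still in the queue waited while this process ran. */
        for (PCB *q = head; q; q = q->next)
            q->waiting_time += run;

        if (p->burst_left == 0) {
            total_wait += p->waiting_time;
            done++;
            printf("P%d finished at t=%d (waited %d)\n",
                   p->pid, clock, p->waiting_time);
            free(p);
        } else {
            /* Not finished yet: re-enqueue at the tail. */
            p->next = NULL;
            if (tail) tail->next = p; else head = p;
            tail = p;
        }
    }

    printf("Average waiting time = %.2f\n", (double)total_wait / done);
    return 0;
}
```

For the random-arrival part, my guess is that I would add an arrival_time field to the PCB and only enqueue a process once the clock reaches its (randomly generated) arrival time, then stop the loop at 100 time units and average over the jobs that completed. Is that the right idea?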