Problem:
Consider a computer with identical interpreters at levels 1, 2, and 3. It takes an interpreter n instructions to fetch, examine, and execute one instruction. A level 1 instruction takes k nanoseconds to execute. How long does it take to execute an instruction at levels 2, 3, and 4?
My work:
Level 4 - n^3 * k nanoseconds
Level 3 - n^2 * k nanoseconds
Level 2 - n * k nanoseconds (each level 2 instruction is carried out by n level 1 instructions)
Level 1 - k nanoseconds (given)
n = number of instructions an interpreter needs to fetch, examine, and execute one instruction at the level above it
k = nanoseconds to execute a level 1 instruction
Does anyone know if I set the problem up correctly? The question as a whole really confuses me.