9 October, 09:50

Suppose a program takes 1000 machine instructions to run from start to end, and can do that in 10 microseconds when no page faults occur. How long will this same program take if 1 in every 100 instructions has a page fault and each page fault takes 100 milliseconds to resolve?

Answers (1)
  1. 9 October, 12:47
    (10^6 + 9.9) microseconds ≈ 1.00001 seconds

    Explanation:

    Given:

    Total number of machine instructions = 1000

    Number of page faults per 100 instructions = 1

    Number of page faults in 1000 instructions = 10

    Time to service one page fault = 100 milliseconds

    Time to service ten page faults = 10 * 100 milliseconds = 1000 milliseconds = 10^6 microseconds

    Number of instructions that run without a page fault = 1000 - 10 = 990 (the execution of the 10 faulting instructions is counted as part of their page-fault service time)

    Time required to run all 1000 instructions = 10 microseconds

    So, time required to run 990 instructions = 10 * (990/1000) microseconds = 9.9 microseconds

    So, the total time required to run the program = (10^6 + 9.9) microseconds = 1,000,009.9 microseconds ≈ 1.00001 seconds
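
    The same arithmetic can be checked with a short script. Below is a minimal Python sketch, assuming (as in the explanation above) that the fault-service time absorbs the execution of the 10 faulting instructions; all variable names are just illustrative:

        # Check the page-fault timing arithmetic from the answer above.
        instructions = 1000
        cpu_time_us = 10.0            # time to run all 1000 instructions, in microseconds
        fault_rate = 1 / 100          # one page fault per 100 instructions
        fault_service_ms = 100.0      # time to resolve one page fault, in milliseconds

        num_faults = int(instructions * fault_rate)            # 10 page faults
        fault_time_us = num_faults * fault_service_ms * 1000   # 10 * 100 ms = 1,000,000 microseconds

        # Only the 990 non-faulting instructions are charged at the normal rate;
        # the 10 faulting instructions are assumed to be covered by the service time.
        normal_instr = instructions - num_faults
        normal_time_us = cpu_time_us * normal_instr / instructions   # 9.9 microseconds

        total_us = fault_time_us + normal_time_us
        print(f"Total: {total_us:,.1f} microseconds (about {total_us / 1e6:.5f} seconds)")
        # Output: Total: 1,000,009.9 microseconds (about 1.00001 seconds)

    If instead all 1000 instructions were charged at the normal rate, the total would be 10^6 + 10 = 1,000,010 microseconds; the difference between the two conventions is negligible here.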