Suppose a host has a 1-MB file that is to be sent to another host. The file takes 1 second of CPU time to compress 50%, or 2 seconds to compress 60%.

(a) Calculate the bandwidth at which each compression option takes the same total compression + transmission time.

(b) Explain why latency does not affect your answer.

Answers (1)
    Answer: (a) bandwidth = 0.10 MB/s; (b) the RTT appears in both totals and cancels out.

    Explanation:

    Given:

    Total Time = Compression Time + Transmission Time

    Transmission Time = RTT + (1 / Bandwidth) × Transfer Size

    Situation A (compress 50% in 1 s):

    Compression Time = 1 s

    Transfer Size = 0.50 MB (1 MB reduced by 50%)

    Total Time = Compression Time + RTT + (0.50 MB / Bandwidth)

    Total Time = 1 s + RTT + (0.50 MB / Bandwidth)

    Situation B (compress 60% in 2 s):

    Compression Time = 2 s

    Transfer Size = 0.40 MB (1 MB reduced by 60%)

    Total Time = Compression Time + RTT + (0.40 MB / Bandwidth)

    Total Time = 2 s + RTT + (0.40 MB / Bandwidth)
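
    To make the two expressions concrete, here is a minimal Python sketch of the two total-time formulas (the function names and the 1 MB/s sample bandwidth are illustrative, not part of the original problem):

        def total_time_a(bandwidth_mb_s, rtt_s=0.0):
            # Situation A: 1 s of compression leaves 0.50 MB to send
            return 1.0 + rtt_s + 0.50 / bandwidth_mb_s

        def total_time_b(bandwidth_mb_s, rtt_s=0.0):
            # Situation B: 2 s of compression leaves 0.40 MB to send
            return 2.0 + rtt_s + 0.40 / bandwidth_mb_s

        # Example: at 1 MB/s, situation A is faster (1.5 s vs 2.4 s)
        print(total_time_a(1.0), total_time_b(1.0))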

    Setting the total times equal:

    1 s + RTT + (0.50 MB / Bandwidth) = 2 s + RTT + (0.40 MB / Bandwidth)

    The RTT term appears on both sides and cancels out (this is the key to part (b)):

    1 s + (0.50 MB / Bandwidth) = 2 s + (0.40 MB / Bandwidth)

    Like terms are collected:

    (0.50 MB / Bandwidth) - (0.40 MB / Bandwidth) = 2 s - 1 s

    0.10 MB / Bandwidth = 1 s

    Solving for the bandwidth:

    0.10 MB / 1 s = Bandwidth

    Simplify:

    0.10 MB/s = Bandwidth

    The bandwidth at which the two total times are equal is 0.10 MB/s, or 800 kbps (taking 1 MB = 10^6 bytes).
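
    The same result can be reproduced numerically. A short Python check (variable names are illustrative): the crossover bandwidth is the extra data saved divided by the extra compression time spent.

        size_a, size_b = 0.50, 0.40    # MB left after each compression option
        comp_a, comp_b = 1.0, 2.0      # seconds of CPU time for each option
        bandwidth = (size_a - size_b) / (comp_b - comp_a)
        print(round(bandwidth, 2), "MB/s")      # 0.1 MB/s
        print(round(bandwidth * 8000), "kbps")  # 800 kbps (1 MB = 10^6 bytes)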

    (b) To see why latency does not affect the answer, assume the RTT for the network connection is 200 ms.

    For situation A:

    Total Time = Compression Time + RTT + (1 / Bandwidth) × Transfer Size

    Total Time = 1 s + 0.200 s + (0.50 MB / 0.10 MB/s)

    Total Time = 1.2 s + 5 s

    Total Time = 6.2 s

    For situation B:

    Total Time = Compression Time + RTT + (1 / Bandwidth) × Transfer Size

    Total Time = 2 s + 0.200 s + (0.40 MB / 0.10 MB/s)

    Total Time = 2.2 s + 4 s

    Total Time = 6.2 s

    Thus, latency is not a factor: the RTT adds the same 0.2 s to both options, so it cancels when the two total times are set equal. Latency changes the total time, but not the bandwidth at which the two options break even.
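
    To see the cancellation concretely, here is a short Python sketch that evaluates both totals at the crossover bandwidth for a few arbitrary RTT values (the RTT values themselves are illustrative):

        BANDWIDTH = 0.10                      # MB/s, from part (a)

        for rtt in (0.0, 0.200, 1.0):         # arbitrary RTTs in seconds
            a = 1.0 + rtt + 0.50 / BANDWIDTH  # situation A total time
            b = 2.0 + rtt + 0.40 / BANDWIDTH  # situation B total time
            print(f"RTT={rtt:.3f} s: A={a:.1f} s, B={b:.1f} s")

    The two totals stay equal for every RTT, confirming that latency shifts both options by the same amount.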