
The time it takes to transmit a file always depends on the file size. Suppose you transmitted 30 files, with an average size of 126 Kbytes and a standard deviation of 35 Kbytes. The average transmission time was 0.04 seconds, with a standard deviation of 0.01 seconds. The correlation coefficient between the time and the size was 0.86.

Based on this data, fit a linear regression model and predict the time it will take to transmit a 400 Kbyte file.

Answers (1)
    Y = 0.009042 + 0.0002457X

    Y = 0.1073 seconds

    Step-by-step explanation:

    In this problem we have two variables: the transmission time and the file size.

    Y = transmission time (in seconds)

    X = file size (in Kbytes)

    The linear regression model is given by

    Y = a + bX

    The slope b is given by

    b = r * (SDy / SDx)

    where r is the correlation coefficient, SDy is the standard deviation of the transmission times, and SDx is the standard deviation of the file sizes.

    b = 0.86 * (0.01 / 35)

    b ≈ 0.0002457

    The y-intercept a is given by

    a = ȳ - b * x̄

    where ȳ = 0.04 and x̄ = 126 are the sample means of the time and the size.

    a = 0.04 - (0.0002457) (126)

    a = 0.04 - 0.030958

    a ≈ 0.009042
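
    As a quick check on these two numbers, here is a minimal Python sketch (the variable names are my own, not part of the problem) that computes the slope and intercept directly from the summary statistics:

        r = 0.86                 # correlation between time and size
        sx, sy = 35, 0.01        # standard deviations: size (Kbytes), time (seconds)
        xbar, ybar = 126, 0.04   # means: size (Kbytes), time (seconds)

        b = r * sy / sx          # slope, approx. 0.0002457
        a = ybar - b * xbar      # intercept, approx. 0.00904
        print(b, a)

    (Carrying the unrounded slope gives a ≈ 0.00904; the 0.009042 above comes from rounding b first. The difference does not affect the final answer.)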

    Therefore, the linear regression model is

    Y = 0.009042 + 0.0002457X

    To predict the time it will take to transmit a 400-Kbyte file, substitute X = 400 into the regression model:

    Y = 0.009042 + 0.0002457 (400)

    Y = 0.009042 + 0.09828

    Y = 0.1073 seconds

    Therefore, the predicted time to transmit a 400 Kbyte file is 0.1073 seconds.
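
    The prediction can be checked the same way (a self-contained Python sketch under the same assumptions):

        r, sx, sy = 0.86, 35, 0.01
        xbar, ybar = 126, 0.04
        b = r * sy / sx
        a = ybar - b * xbar
        y_pred = a + b * 400      # predicted time for a 400-Kbyte file
        print(round(y_pred, 4))   # prints 0.1073

    which agrees with the hand calculation above.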