15 December, 05:00

Radio signals travel at a rate of 3 × 10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 9.6 × 10^6 meters? (Hint: time is distance divided by speed.)

Answers (1)
  1. 15 December, 08:09
    3.2x10^-2 seconds (0.032 seconds)

    This is a simple matter of division. I also suspect it's an exercise in scientific notation, so here is how you divide in scientific notation:

    9.6 x 10^6 m / 3 x 10^8 m/s

    First, divide the significands like you would normally.

    9.6 / 3 = 3.2

    Then subtract the exponents:

    6 - 8 = -2

    So the answer is 3.2 x 10^-2

    And since the significand is less than 10 and at least 1, we don't need to normalize it.

    So it takes 3.2 x 10^-2 seconds for the radio signal to reach the surface of Earth from the satellite.
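    The arithmetic above can be double-checked with a short Python snippet (Python's `e` notation is the same scientific notation used in the worked answer):

    ```python
    # time = distance / speed
    distance = 9.6e6   # meters: satellite altitude, i.e. 9.6 x 10^6 m
    speed = 3e8        # meters per second: speed of radio waves, 3 x 10^8 m/s

    time = distance / speed
    print(time)  # 0.032 seconds, i.e. 3.2 x 10^-2 s
    ```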