4 January, 17:53

Radio signals travel at a rate of 3x10^8 meters per second. How many seconds would it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 9.6*10^6 meters? (Hint: Divide distance by speed.)

A) 3.2*10^2 seconds

B) 3.2*10^-2 seconds

C) 3.13*10^1 seconds

D) 2.88*10^15 seconds

Answers (1)
  1. 4 January, 18:27
    Distance = velocity x time

    distance = 9.6 x 10^6 meters

    velocity = 3x10^8 meters/second

    time = distance / velocity = (9.6 x 10^6 m) / (3 x 10^8 m/s) = 3.2 x 10^-2 seconds

    The units check out: m / (m/s) = s. So the answer is B) 3.2*10^-2 seconds.
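    The division above can be verified with a quick Python sketch (variable names are illustrative):

    ```python
    # Time for a radio signal to cover the satellite's altitude at light speed
    distance = 9.6e6  # meters (satellite altitude, from the question)
    speed = 3e8       # meters/second (given signal speed)

    time = distance / speed
    print(time)  # 0.032, i.e. 3.2 x 10^-2 seconds -> choice B
    ```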