1 August, 12:06

A radio signal travels at 3.00 • 10^8 meters per second. How many seconds will it take for a radio signal to travel from a satellite to the surface of Earth if the satellite is orbiting at a height of 3.54 • 10^7 meters? Show your work.

Would the answer be rate = 3 * 10^8 m/s,

distance = 3.6 * 10^7 m,

time = distance/rate = 3.6/30?

Answers (1)
  1. 1 August, 12:20
    The answer is 0.118 seconds.

    The velocity (v) is the distance (d) divided by time (t):

    v = d / t

    It is given:

    v = 3.00 * 10⁸ meters per second

    d = 3.54 * 10⁷ meters

    It is unknown:

    t = ?

    If:

    v = d / t

    Then:

    t = d / v

    t = (3.54 * 10⁷ m) / (3.00 * 10⁸ m/s)

    t = 1.18 * 10⁻¹ seconds

    t = 0.118 seconds

    Therefore, the radio signal will travel from the satellite to the surface of Earth in 0.118 seconds.
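
    As a quick check, the same division can be done in a couple of lines of Python (a minimal sketch; the variable names are just illustrative):

        # Time for a radio signal to travel from the satellite to Earth's surface
        speed = 3.00e8      # signal speed in meters per second
        distance = 3.54e7   # satellite altitude in meters

        time = distance / speed          # t = d / v
        print(f"t = {time:.3f} seconds") # prints: t = 0.118 seconds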