5 July, 04:32

A rock is thrown horizontally at a speed of 5.0 m/s from the top of a cliff 64.7 m high. The rock hits the ground 18.0 m from the base of the cliff. How would this distance change if the rock was thrown at 10.0 m/s?

Answers (1)
  1. 5 July, 05:04

    Assume g = 9.8 m/s² and ignore air resistance.

    When the rock is launched from a height of 64.7 m,

    u = 5.0 m/s, the initial horizontal velocity

    v = 0, the initial vertical velocity

    The rock hits the ground 18.0 m from the base of the cliff, and the horizontal velocity stays constant, so the time of flight is

    t = (18.0 m) / (5.0 m/s) = 3.6 s

    The vertical distance traveled is

    s = (1/2) * (9.8 m/s²) * (3.6 s)² = 63.504 m

    Because this distance is less than 64.7 m, the ground where the rock lands must be slightly higher than the base of the cliff. It is higher by

    64.7 m - 63.504 m = 1.196 m
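
    As a quick numerical check, the steps above can be reproduced with a short Python sketch (values taken from the problem; the variable names are my own):

    g = 9.8      # m/s^2, acceleration due to gravity
    u = 5.0      # m/s, horizontal launch speed
    x = 18.0     # m, horizontal distance to the impact point
    h = 64.7     # m, cliff height

    t = x / u                 # time of flight from the horizontal motion -> 3.6 s
    drop = 0.5 * g * t**2     # vertical drop during that time -> 63.504 m
    print(t, drop, h - drop)  # landing point is higher than the base by about 1.196 m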

    If the rock is thrown at 10.0 m/s, the time of flight remains the same, because the vertical motion (starting from zero vertical velocity and falling the same height under the same gravity) is unchanged.

    Therefore the horizontal distance traveled is

    (10.0 m/s) * (3.6 s) = 36.0 m
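
    The same check for the 10.0 m/s case, as a minimal self-contained sketch (it assumes the landing elevation is the same as before):

    t = 18.0 / 5.0      # same time of flight as before, 3.6 s
    u_new = 10.0        # m/s, new horizontal launch speed
    print(u_new * t)    # horizontal distance -> 36.0 m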

    Answer: The distance doubles, from 18.0 m to 36.0 m.