11 December, 15:58

A particular baseball pitcher throws a baseball at a speed of 39.1 m/s (about 87.5 mi/hr) toward home plate. We use g = 9.8 m/s² and ignore air friction.

(a) Assuming the pitcher releases the ball 16.6 m from home plate and throws it so the ball is initially moving horizontally, how long does it take the ball to reach home plate?

Answers (1)
  1. 11 December, 16:54
    There is no acceleration in the horizontal direction (gravity acts only vertically), so the horizontal motion obeys v = d/t, where v is velocity, d is distance, and t is time. Solving for time gives t = d/v. Because the ball is released moving horizontally, the full 39.1 m/s is the horizontal component, so there is no need to resolve it with sines and cosines. Plugging in: t = (16.6 m) / (39.1 m/s) ≈ 0.42 s. Note that during this time the ball falls only about ½ g t² ≈ 0.88 m, so it is still in the air as it crosses home plate; the exact height at which it crosses depends on the release height, which isn't given. Cheers!
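    A minimal sketch of the arithmetic, assuming the same constant-velocity horizontal motion and constant-acceleration vertical drop used in the answer above (the function names are just illustrative):

    ```python
    def time_to_plate(distance_m: float, speed_m_s: float) -> float:
        """Horizontal flight time with no horizontal acceleration: t = d / v."""
        return distance_m / speed_m_s

    def vertical_drop(t_s: float, g: float = 9.8) -> float:
        """Distance fallen from the release height in time t: (1/2) * g * t^2."""
        return 0.5 * g * t_s ** 2

    if __name__ == "__main__":
        d = 16.6   # release point to home plate, in metres
        v = 39.1   # pitch speed, in m/s (thrown horizontally)

        t = time_to_plate(d, v)
        drop = vertical_drop(t)

        print(f"time to reach home plate: {t:.2f} s")       # about 0.42 s
        print(f"vertical drop during flight: {drop:.2f} m")  # about 0.88 m
    ```

    Running this reproduces the ≈ 0.42 s answer and shows the ≈ 0.88 m drop mentioned above.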