16 June, 10:31

A sample of 100 independent random numbers is taken from this distribution, and its average is used to estimate the mean of the distribution. What is the standard error of this estimate?

Answers (1)
  1. 16 June, 14:06
    The standard error in estimating the mean is 0.1σ, i.e., one-tenth of the standard deviation of the distribution.

    Step-by-step explanation:

    The standard error of the sample mean, σₓ, is related to the standard deviation of the distribution, σ, by

    σₓ = σ / √n

    This holds because the variance of the mean of n independent observations is σ²/n, so its standard deviation is σ/√n.

    n = sample size = 100

    σₓ = σ / (√100)

    σₓ = (σ/10) = 0.1σ

    Hence the standard error in estimating the mean is 0.1 × the standard deviation of the distribution.
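
    As a sanity check, here is a short simulation in Python (a minimal sketch: the question never names "this distribution", so the exponential distribution with σ = 2 is a hypothetical choice; any distribution with finite variance behaves the same way):

    import numpy as np

    # Minimal check of SE = σ/√n. The exponential distribution and σ = 2.0
    # are illustrative assumptions, since the question does not specify them.
    rng = np.random.default_rng(42)
    sigma = 2.0        # assumed standard deviation of the distribution
    n = 100            # sample size from the question
    trials = 100_000   # number of repeated samples

    # Each row is one sample of n independent draws
    # (for the exponential distribution, scale = σ gives std dev σ).
    samples = rng.exponential(scale=sigma, size=(trials, n))
    sample_means = samples.mean(axis=1)  # one mean estimate per trial

    print("theoretical SE:", sigma / np.sqrt(n))        # 0.1σ = 0.2
    print("empirical SE  :", sample_means.std(ddof=1))  # ≈ 0.2

    The empirical standard deviation of the 100,000 sample means comes out very close to σ/10, matching the formula above.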