Q:

IQ scores based on the Stanford-Binet IQ test are normally distributed with a mean of 100 and standard deviation 15. If you were to obtain 100 different simple random samples of size 20 from the population of all adult humans and determine 95% confidence intervals for each of them, how many of the intervals would you expect to include 100? One would expect 95 of the 100 intervals to include the mean 100.

Accepted Solution

A:
Answer: 95 intervals

Step-by-step explanation:

Given: population mean = 100, standard deviation = 15, sample size = 20, confidence level = 95%.

A 95% confidence level means that, in repeated sampling, about 95% of the confidence intervals constructed this way will contain the true population mean. (By the same reasoning, if we built 99% confidence intervals, we would expect about 99 out of 100 of them to capture the population mean.)

With 100 independent intervals at the 95% level, the expected number that include the mean 100 is 0.95 × 100 = 95.

Therefore, the answer is 95 intervals.
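
As an illustration only (not part of the original solution), here is a small Python sketch of the setup: it draws 100 samples of size 20 from a Normal(100, 15) population, builds a 95% z-interval for each (the population standard deviation is treated as known, per the problem), and counts how many intervals contain 100. The seed and variable names are arbitrary choices; the exact count varies from run to run but averages about 95.

```python
import numpy as np

rng = np.random.default_rng(42)  # arbitrary seed for reproducibility

mu, sigma = 100, 15        # population parameters from the problem
n, num_samples = 20, 100   # sample size and number of samples
z = 1.96                   # critical value for a 95% z-interval (sigma known)

covered = 0
for _ in range(num_samples):
    sample = rng.normal(mu, sigma, size=n)
    xbar = sample.mean()
    margin = z * sigma / np.sqrt(n)
    if xbar - margin <= mu <= xbar + margin:
        covered += 1

print(f"{covered} of {num_samples} intervals contain the true mean {mu}")
# Typically prints a count near 95, matching the expected value 0.95 * 100.
```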