# Standard Error vs. Confidence Interval (not confidence level)


#### tahashamim

Dear All, (I am not sure if there is already a post for this; if there is, please point me to it, as I am at work and cannot spend too much time searching.)

Once again, I am here posting (probably) a very simple question. I have been reading about the definitions of standard error and confidence interval, and I have learned that a Z value goes into the formula to calculate a confidence interval.

So the question really is: what is the difference between the two, when both give a margin in plus or minus? For example, say the sample mean is 60 lb and a formula gives a standard error of +/- 3 lb. Then later they say the confidence interval is 60 lb +/- (x lb)... Both give a range, so what is the difference between these two ranges?

Any help, in the most non-technical way possible?

Thanks in advance to all of you who make an attempt.

Thanks,
Taha.

#### Tim Folkerts

Trusted Information Resource
The two ideas - standard error and confidence interval - are closely connected, as you seem to have learned.

Standard error is a particular calculation -- you take the standard deviation of a set of numbers and divide by the square root of the number of items in the set.
SE = s / n^0.5
Given the typical assumptions about normal distributions and such, you can conclude that the true mean will be within +/- one standard error of the sample mean about 68% of the time.

Confidence intervals are more general -- you can choose any % certainty you want by multiplying by the appropriate Z value. The most common choice is a 95% confidence interval. In that case you would use limits of +/- 1.96*SE.

For example, suppose you measured 100 widgets and found that the mean was 10 and the standard deviation was 2. The standard error would be 2 / 100^0.5 = 0.2. The 95% confidence interval would be +/- 1.96 * 0.2 = +/- 0.39.
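A minimal sketch of that calculation in Python, using the same numbers as the example above (n = 100, mean = 10, standard deviation = 2):

```python
import math

# Example data from the widget measurement above.
n = 100       # number of widgets measured
mean = 10.0   # sample mean
s = 2.0       # sample standard deviation

# Standard error: SE = s / sqrt(n)
se = s / math.sqrt(n)   # 2 / 10 = 0.2

# 95% confidence interval: mean +/- 1.96 * SE
z = 1.96
margin = z * se         # 1.96 * 0.2 = 0.392
ci_low, ci_high = mean - margin, mean + margin

print(f"SE = {se:.2f}")                           # SE = 0.20
print(f"95% CI = {ci_low:.2f} to {ci_high:.2f}")  # 95% CI = 9.61 to 10.39
```

Note that the 1.96 here is the Z value for 95% confidence; for a different confidence level you would swap in a different Z (e.g. 1.645 for 90%, 2.576 for 99%).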

Tim

PS The Wikipedia article Standard_error_(statistics) has a pretty basic discussion of the topic.