This is getting a bit technical, but here goes. The confidence level is 1 minus the selected alpha (risk) level. Since the typical alpha is .05 (5% risk), the typical confidence level is 95%. Most people, when they talk about this confidence level, say "a 95% confidence interval," so in everyday use the two terms get treated as interchangeable.
Now for any distribution with a given mean and standard deviation, there is also a distribution of averages. The variation in that distribution of averages (estimated by taking the standard deviation of the individual distribution and dividing by the square root of n) is called the standard error of the mean, and the confidence interval is built from it: the sample mean plus or minus roughly 2 standard errors (really 1.96) covers the true mean with 95% confidence. The interval is an estimate of how much the mean would vary if you drew another sample from the same distribution. Maybe more than you needed, but I hope that helps!
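To make that concrete, here is a minimal sketch in plain Python. The sample data here is made up for illustration (random draws from a normal distribution with mean 50 and standard deviation 10); the point is just the arithmetic: sample standard deviation, divide by the square root of n to get the standard error, then go plus or minus 1.96 standard errors around the mean for a 95% interval.

```python
import math
import random

random.seed(0)
# Hypothetical sample: 100 draws from a normal(50, 10) population
sample = [random.gauss(50, 10) for _ in range(100)]

n = len(sample)
mean = sum(sample) / n

# Sample standard deviation (n - 1 in the denominator)
sd = math.sqrt(sum((x - mean) ** 2 for x in sample) / (n - 1))

# Standard error of the mean: sd divided by sqrt(n)
se = sd / math.sqrt(n)

# 95% confidence level (alpha = .05) -> z of 1.96
z = 1.96
ci = (mean - z * se, mean + z * se)

print(f"mean = {mean:.2f}")
print(f"95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

Note how the interval shrinks as n grows: quadrupling the sample size halves the standard error, so bigger samples pin down the mean more tightly.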