The CLT has no such requirement; it says and assumes nothing about the actual underlying distribution.
A random variable (for example, runs scored in an innings) can have any underlying distribution; runs scored in an innings is definitely not normally distributed, it has a very strong right skew. If you take samples of that random variable and calculate the sample mean, that sample mean is itself a new random variable. The CLT says that, irrespective of the underlying distribution, the distribution of the sample mean approaches a normal distribution as the sample size goes up.
It is a very, very strong result in statistics, the very basis of much of applied statistics. You can see it in action in simulations. There is an animation I found on the web that shows it: the underlying distribution is bimodal (anything but normal), but if you calculate the mean of ever larger samples, the distribution of the sample mean approaches normal.
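You can run the same experiment yourself. A minimal sketch, using only Python's standard library and a toy bimodal mixture I made up for illustration (two well-separated normal bumps): the spread of the sample mean shrinks like sigma/sqrt(n), and the two bumps merge into a single bell as n grows.

```python
import random
import statistics

random.seed(0)

def bimodal():
    # A mixture of two well-separated normals: anything but normal.
    if random.random() < 0.5:
        return random.gauss(-3, 1)
    return random.gauss(3, 1)

def sample_means(n, num_samples=2000):
    # The sample mean of n draws is itself a new random variable;
    # here we draw 2000 realisations of it.
    return [statistics.fmean(bimodal() for _ in range(n))
            for _ in range(num_samples)]

# The standard deviation of the sample mean shrinks as n grows,
# and a histogram of the means would show one normal bell, not two bumps.
for n in (1, 5, 30):
    means = sample_means(n)
    print(n, round(statistics.stdev(means), 2))
```

With n = 1 you just recover the bimodal distribution itself; by n = 30 the means cluster tightly and symmetrically around zero.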
EDIT: Some more illustrations (animation links):

- Uniform distribution - Central Limit Theorem
- Triangular distribution - Central Limit Theorem (triangle)
- 1/X distribution - Central Limit Theorem 1/X
- Parabolic distribution - Central Limit Theorem (parabola)
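The uniform case is easy to check numerically as well. A quick sketch (standard library only, thresholds are my own illustrative choice): one hallmark of a normal distribution is that about 68% of values fall within one standard deviation of the mean. Raw uniform draws are flat and miss that mark; means of 30 draws already hit it.

```python
import random
import statistics

random.seed(1)

def frac_within_1sd(values):
    # Fraction of values within one standard deviation of their mean;
    # for a normal distribution this is about 68%.
    mu, sd = statistics.fmean(values), statistics.stdev(values)
    return sum(abs(v - mu) <= sd for v in values) / len(values)

def uniform_means(n, num_samples=2000):
    # 2000 realisations of the mean of n draws from Uniform(0, 1).
    return [statistics.fmean(random.random() for _ in range(n))
            for _ in range(num_samples)]

# Raw uniform draws: only ~58% within one sd (flat, not bell-shaped).
# Means of 30 draws: ~68%, the normal-distribution signature.
print(round(frac_within_1sd(uniform_means(1)), 2))
print(round(frac_within_1sd(uniform_means(30)), 2))
```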