If the random number generator is doing what it claims to do, we should be able to check it statistically. For example, if we generate many random numbers uniformly distributed between 0 and 1, their average should be near 0.5, and their standard deviation should be near 1/√12, which is about 0.289.
As the array size gets bigger, does the average value converge on 0.5?
σ = √( Σ (xᵢ − x̄)² / n )

where n is the number of elements, xᵢ is the
ith element of the array, and x̄ is the average, as
calculated by avgArray.
HINT: Since you need to know the average to calculate the standard deviation, you might want stdArray to invoke avgArray. Or, you might change the interface to stdArray so that it takes the average as a third argument.
As the size of the array increases, does the standard deviation converge on 1/√12 ≈ 0.289 as expected? Which takes longer to converge, the mean or the standard deviation? In other words, how big does the array have to be for the mean to be within 0.01 of 0.5? How big does the array have to be for the standard deviation to be within 0.01 of 0.289?