I am seeing strange behaviour in my attempt to implement Excel's NORMINV() in C. For norminv() I use a function I got from a mathematician; it is probably correct, since I also tried several other implementations with the same result. Here is the code:
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

double norminv(double p); /* inverse standard normal CDF, taken from the mathematician's code (not shown) */

/* Returns a uniform random number in [x0, x1]. */
double calculate_probability(double x0, double x1)
{
    return x0 + (x1 - x0) * rand() / ((double)RAND_MAX);
}

int main(void)
{
    long double probability = 0.0;
    long double mean = 0.0;
    long double stddev = 0.001;
    long double change_percentage = 0.0;
    long double current_price = 100.0;
    long double prob_sum = 0.0;
    long double price_sum = 0.0;
    int runs = 0;

    srand((unsigned)time(NULL));

    while (runs < 100000)
    {
        probability = calculate_probability(0.00001, 0.99999);
        /* norminv(p, mu, sigma) = mu + sigma * norminv(p) */
        change_percentage = mean + stddev * norminv(probability);
        current_price = current_price * (1.0 + change_percentage);
        runs++;
        prob_sum += probability;
        price_sum += current_price;
    }

    /* %Lf is needed for long double; plain %f here is undefined behavior */
    printf("\n\n%Lf %Lf\n", price_sum / runs, prob_sum / runs);
    return 0;
}
Now I want to simulate Excel's NORMINV(rand(), 0, 0.001), where rand() produces a value strictly between 0 and 1, 0 is the mean, and 0.001 is the standard deviation.
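As a cross-check one could bypass norminv() entirely and draw the normal variates directly with the Box-Muller transform; norm_draw() below is a hypothetical helper of my own, not part of the code above, and assumes only the C standard library:

#include <math.h>
#include <stdlib.h>

#ifndef M_PI
#define M_PI 3.14159265358979323846
#endif

/* Hypothetical cross-check: draw from N(mu, sigma) via Box-Muller
   instead of inverting the normal CDF. */
double norm_draw(double mu, double sigma)
{
    /* shift rand() so u1 and u2 lie strictly in (0, 1) */
    double u1 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
    double u2 = (rand() + 1.0) / ((double)RAND_MAX + 2.0);
    return mu + sigma * sqrt(-2.0 * log(u1)) * cos(2.0 * M_PI * u2);
}

If norminv() is correct, replacing mean + stddev * norminv(probability) with norm_draw(0.0, 0.001) should give statistically indistinguishable results.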
With 1000 values it looks okay:
100.729780 0.501135
With 10000 values it spreads too much:
107.781909 0.502301
And with 100000 values it sometimes spreads even more:
87.876500 0.498738
I don't understand why this happens. My assumption was that the random number generator might need to be normally distributed as well, but the probability values seem fine, since their mean stays very close to 0.5. So I don't see why the deviation of the average price keeps growing. Can somebody help me?
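For scale, here is a minimal back-of-the-envelope sketch (my own addition, assuming the draws really are N(0, 0.001)): each step multiplies the price by (1 + r), so log(current_price) performs a random walk whose standard deviation grows like 0.001 * sqrt(n), roughly 0.316 after 100000 steps:

#include <math.h>
#include <stdio.h>

/* Rough 1-sigma band for the final price after n multiplicative
   steps of (1 + N(0, sigma)): log(price) walks with step size sigma. */
int main(void)
{
    double sigma = 0.001;
    int n = 100000;
    double log_sd = sigma * sqrt((double)n); /* ~0.316 */
    printf("1-sigma band: %.1f .. %.1f\n",
           100.0 * exp(-log_sd), 100.0 * exp(log_sd)); /* ~72.9 .. 137.2 */
    return 0;
}

That band (about 72.9 to 137.2 around a start of 100) would cover the 87.9 and 107.8 results above, if this reading of the setup is right.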