Saturday, January 1, 2022

What's wrong with f(random(..), random(..)) in C?

I came across a problem that really confused me.
I was trying to calculate the values of $f(x,y)=\frac{y^3}{x^2+y^2}$ at some random points in the region $-1<x<1$, $-1<y<1$.
Here is my code:

#include<stdio.h>
#include<stdlib.h>
#include<math.h>

#define f(x, y) ( pow((y),3) / (pow((x),2) + pow((y),2)) )

double random(double min, double max)
{
    double m;
    m = rand()/(double)RAND_MAX;
    return (min + m * (max - min));
}

void main()
{
    int i; 
    double v;
    
    for(i = 0; i < 500; i++)
    {
        v = f(random(-1,1), random(-1,1));
        printf("%lf\n", v);
    }
}

And then I got some very weird values, greater than 1 or smaller than -1, even though 1 and -1 bound the values f can take for $-1<x<1$ and $-1<y<1$.
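
(To spell out why I expect that range, unless I'm miscalculating: since $x^2+y^2 \ge y^2$, for $y \ne 0$

$$|f(x,y)| = \frac{|y|^3}{x^2+y^2} \le \frac{|y|^3}{y^2} = |y| < 1,$$

and $f(x,0)=0$ whenever $x \ne 0$, so every value should lie strictly between $-1$ and $1$.)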

On the other hand, when I changed my code to use two variables holding the random x and y and then plugged them into f:

    double x, y;
    for(i = 0; i < 500; i++)
    {
        x = random(-1,1);
        y = random(-1,1);
        v = f(x, y);
        printf("%lf\n", v);
    }

the problem disappeared.
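
For reference, gluing that second variant into a complete program (the same includes, random() helper, and f macro as in the first listing) gives roughly this:

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

#define f(x, y) ( pow((y),3) / (pow((x),2) + pow((y),2)) )

/* uniform double in [min, max], same helper as in the first listing */
double random(double min, double max)
{
    double m = rand() / (double)RAND_MAX;
    return min + m * (max - min);
}

int main(void)
{
    int i;
    double x, y, v;

    for (i = 0; i < 500; i++)
    {
        /* draw x and y once each, then evaluate f on those fixed values */
        x = random(-1, 1);
        y = random(-1, 1);
        v = f(x, y);
        printf("%lf\n", v);
    }
    return 0;
}

(On some systems the name random already clashes with the random() declared in <stdlib.h>, so the helper may need a different name to compile there.)
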
Can anyone please explain this to me? Thanks in advance!



