samedi 5 novembre 2022

Why do most random methods (or functions) return a value from zero (0) to one (1) by default?

I wonder why most random functions, such as random() in Python's random module or Math.random() in Java, return a value in the range from zero to one by default.

I assume it is related to probability, or to the difficulty of picking an upper bound for the range. For instance, if the default range were the natural numbers, the upper limit would be positive infinity. So it is clearer to set the range from zero to one, for readability and clarity.
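One practical consequence of the unit-interval default is that it composes: a value in [0, 1) can be scaled and shifted into any other range, so the library only needs to provide one primitive. A minimal sketch in Python (the helper names `uniform_between` and `roll_die` are my own, for illustration):

```python
import random

def uniform_between(low, high):
    """Map a unit-interval sample onto [low, high).

    random.random() returns a float in [0.0, 1.0); multiplying by
    (high - low) stretches it, and adding low shifts it into place.
    """
    return low + (high - low) * random.random()

def roll_die():
    """Build a die roll from the same unit-interval primitive."""
    return 1 + int(random.random() * 6)  # integer in 1..6

sample = uniform_between(5.0, 10.0)
assert 5.0 <= sample < 10.0
assert 1 <= roll_die() <= 6
```

This is essentially what convenience functions like Python's random.uniform(a, b) do internally, so the [0, 1) default acts as a building block rather than a limitation.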

What do you think?

