Like the asker of this question, I was wondering why Math.ceil(Math.random() * 10) was not preferred over Math.floor(Math.random() * 10) + 1, and found that it was because Math.random has a tiny (but relevant) chance of returning exactly 0. But how tiny?
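To make the edge case concrete, here is a small sketch of my own (not code from the question I linked): if Math.random() ever returned exactly 0, the ceil version would produce 0, which falls outside the intended 1 to 10 range, while the floor-plus-one version would still return 1.

    // Both expressions are meant to produce an integer from 1 to 10.
    function viaCeil() {
      return Math.ceil(Math.random() * 10);      // returns 0 if Math.random() === 0
    }

    function viaFloor() {
      return Math.floor(Math.random() * 10) + 1; // always in 1..10
    }

    // Simulate the edge case directly, since waiting for a real 0 is hopeless:
    const r = 0; // pretend Math.random() returned exactly 0
    console.log(Math.ceil(r * 10));      // 0 -> outside the 1..10 range
    console.log(Math.floor(r * 10) + 1); // 1 -> still in range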
Further research told me that this random number is accurate to 16 decimal places... well, sort of. And it's the "sort of" that I'm curious about.
I understand that floating-point numbers work differently from decimals, though I struggle with the specifics. If the number were a strict decimal value, I believe the chance would be one in ten billiard (or ten quadrillion, in the American system), i.e. 1:10^16.
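Here is the back-of-the-envelope model I have in mind (purely my own assumption, not how Math.random is actually specified): if the result were a uniformly chosen decimal with exactly 16 digits after the point, there would be 10^16 equally likely values, only one of which is 0.

    // Hypothetical model: 10^16 equally likely 16-digit decimals in [0, 1).
    const distinctValues = 10 ** 16;
    const probabilityOfZero = 1 / distinctValues;
    console.log(probabilityOfZero); // 1e-16, i.e. one in ten quadrillion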
Is this correct, or have I messed up, or does the floating point thing make a difference?