I need to generate a random number between 0 and 1 in C#. It only needs to be accurate to a single decimal place, but more precision isn't a problem.
I can either do Random.Next(0, 10) / 10.0 or Random.NextDouble().
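For clarity, here is roughly what I mean by each option (a minimal sketch; I'm assuming an instance of Random rather than a static call, and the variable names are just for illustration):

```csharp
using System;

class Program
{
    static void Main()
    {
        var rng = new Random();

        // Option 1: an integer from Next plus a division.
        // Next(0, 10) returns an int in [0, 9] (upper bound exclusive),
        // so dividing by 10.0 gives 0.0, 0.1, ..., 0.9.
        double viaNext = rng.Next(0, 10) / 10.0;

        // Option 2: NextDouble directly, which returns a double in [0.0, 1.0).
        double viaNextDouble = rng.NextDouble();

        Console.WriteLine($"Next + division: {viaNext}");
        Console.WriteLine($"NextDouble:      {viaNextDouble}");
    }
}
```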
I could not find any concrete information on the time complexity of either method. I assume Random.Next() will be more efficient, as it is in Java, but the added division (whose cost depends on how C# implements it) complicates things.
Is it possible to find out which is more efficient from a purely theoretical standpoint? I realise I could time both over a series of tests, but I want to understand why one has better complexity than the other.
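For reference, the kind of empirical test I'd otherwise run would be something like the rough Stopwatch micro-benchmark below (the iteration count and seed are arbitrary, and this measures wall-clock time rather than complexity):

```csharp
using System;
using System.Diagnostics;

class Benchmark
{
    static void Main()
    {
        const int iterations = 10_000_000;
        var rng = new Random(12345); // fixed seed so the run is reproducible
        double sink = 0;             // accumulate results so the calls aren't optimised away

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            sink += rng.Next(0, 10) / 10.0;
        sw.Stop();
        Console.WriteLine($"Next + division: {sw.ElapsedMilliseconds} ms");

        sw.Restart();
        for (int i = 0; i < iterations; i++)
            sink += rng.NextDouble();
        sw.Stop();
        Console.WriteLine($"NextDouble:      {sw.ElapsedMilliseconds} ms (sink = {sink})");
    }
}
```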