Tuesday, January 31, 2017

Don't understand code that gets a random number between two values

So, I understand this seems like a stupid question. With that said, however, in one of my classes concerned with teaching proper code I came upon this: Min + (Math.random() * ((Max - Min) + 1)). Essentially the code says: take your minimum value and add a random number between 0.0 and 1.0 multiplied by (the maximum value minus the minimum value, plus 1). The book presents this code as the basis for retrieving a random value within certain bounds, e.g. with max = 40 and min = 20 it would produce a value between 20 and 40.
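For concreteness, here is a small sketch of how I have been wrapping that line up (the class and method names are just mine for illustration, and I am assuming the result is meant to be truncated to an int, as the book's integer example implies):

```java
// A minimal, runnable sketch of the book's formula as I understand it;
// the names RandomRange and randomBetween are mine, not from the book.
public class RandomRange {

    // Returns a pseudo-random integer between min and max, inclusive.
    static int randomBetween(int min, int max) {
        // Math.random() gives a double in [0.0, 1.0), so the product covers
        // [0.0, max - min + 1) and the int cast truncates it to 0..(max - min).
        return min + (int) (Math.random() * ((max - min) + 1));
    }

    public static void main(String[] args) {
        // With min = 20 and max = 40, every printed value lands in 20..40.
        for (int i = 0; i < 5; i++) {
            System.out.println(randomBetween(20, 40));
        }
    }
}
```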

The thing is, I know what the code is saying and doing. I was using it to generate a random character by adding a (char) cast to the front and using 'a' and 'z' as the values. What I don't understand is how, mathematically speaking, this even works. I realize that probably makes me a pretty poor programmer; I never claimed to be great or brilliant. I know algebra and some basic higher math concepts, but there are some stupidly basic formulas like this that leave me scratching my head.
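Concretely, this is more or less what I was doing to get a random lowercase letter (again, the class and method names are just mine for illustration):

```java
public class RandomLetter {

    // Picks a pseudo-random lowercase letter between 'a' and 'z', inclusive,
    // using the same min + random * (max - min + 1) pattern from the book.
    static char randomLetter() {
        char min = 'a'; // numeric value 97
        char max = 'z'; // numeric value 122
        // The chars act as their numeric values in the arithmetic, and the
        // (char) cast truncates the double back down to a single letter.
        return (char) (min + (Math.random() * ((max - min) + 1)));
    }

    public static void main(String[] args) {
        // Prints five letters, each somewhere between 'a' and 'z'.
        for (int i = 0; i < 5; i++) {
            System.out.print(randomLetter());
        }
        System.out.println();
    }
}
```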

In terms of programming logic this isn't much of an issue for me, but when I see concepts like this I'm confused: I don't get the mathematical logic behind the code. Am I missing something? With Math.random() producing a value between 0.0 and 1.0, I don't see how the expression ends up with a value between the minimum and the maximum. Would anybody be willing to give me a layman's explanation of how this works?
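To make the question concrete, here is a little experiment I put together that breaks the expression into pieces and prints the intermediate values for min = 20 and max = 40; I can see the numbers it produces, but not the reasoning behind them:

```java
public class ScalingDemo {
    public static void main(String[] args) {
        int min = 20;
        int max = 40;
        // Break the book's expression into its pieces so each step is visible.
        double r = Math.random();              // somewhere in [0.0, 1.0)
        double scaled = r * ((max - min) + 1); // somewhere in [0.0, 21.0)
        double shifted = min + scaled;         // somewhere in [20.0, 41.0)
        int result = (int) shifted;            // truncated to 20..40
        System.out.println("random  = " + r);
        System.out.println("scaled  = " + scaled);
        System.out.println("shifted = " + shifted);
        System.out.println("result  = " + result);
    }
}
```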



