I have the following function, written in JavaScript:
function genNumArray(n) {
  let answer = []
  // Keep drawing until we have n distinct values.
  while (answer.length < n) {
    // Math.random() is in [0, 1), so this yields an integer in 1..n.
    let randNum = Math.floor(Math.random() * n + 1)
    // Only keep the draw if we haven't seen it yet (linear scan).
    if (!answer.includes(randNum)) {
      answer.push(randNum)
    }
  }
  return answer
}
I want to know the time complexity of this code expressed in Big O notation, assuming that my function's n is the growing term that tends to infinity. Please include an explanation with your answer; I'm hoping for some statistics-based models or reasoning.
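In case it's useful, here's a rough sketch of how I'd sanity-check the growth empirically (the doubling sizes are arbitrary, and performance.now assumes a browser or a recent Node runtime):

// Time genNumArray at doubling input sizes and print the results.
// The ratio between successive timings hints at the growth rate.
for (const n of [1000, 2000, 4000, 8000, 16000]) {
  const start = performance.now()
  genNumArray(n)
  const elapsed = performance.now() - start
  console.log(`n = ${n}: ${elapsed.toFixed(1)} ms`)
}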
Supplementary Thoughts:
This function takes a positive integer, n, and returns an array of length n with the values 1 through n as its contents, ordered at random.
I am aware that the code above is not the most efficient way of implementing the behavior I just described (see the sketch below).
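For context, the more efficient approach I have in mind is something like a Fisher–Yates shuffle, which fills the array first and then shuffles it in place in O(n) time. A minimal sketch (genNumArrayShuffle is just an illustrative name):

// Build [1..n] up front, then shuffle in place with Fisher–Yates.
// Each element is touched a constant number of times, so this is O(n).
function genNumArrayShuffle(n) {
  const answer = Array.from({ length: n }, (_, i) => i + 1)
  for (let i = answer.length - 1; i > 0; i--) {
    const j = Math.floor(Math.random() * (i + 1)) // random index in 0..i
    const tmp = answer[i] // swap answer[i] and answer[j]
    answer[i] = answer[j]
    answer[j] = tmp
  }
  return answer
}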
For those of you less familiar with JavaScript, Math.random returns a pseudo-random floating-point number greater than or equal to 0 and less than 1. See the docs for more info.
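To illustrate, this is the mapping my function relies on (using n = 6 as an arbitrary example):

// Math.random() is in [0, 1), so scaling by n and flooring gives an
// integer in 0..n-1; adding 1 shifts the range to 1..n.
const n = 6
console.log(Math.floor(Math.random() * n) + 1) // an integer from 1 to 6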
Regarding your answer to this question, I would prefer you pretend that JavaScript's Math.random produces "true randomness" rather than considering JavaScript or computer-architecture shortcomings that make the randomness less pure (although any thoughts on how the difference affects time complexity are welcome). However, I'm unsure whether this would change the answer either way.
For any sticklers out there who say "well, Math.random has a finite number of decimal places, so your function fails to create a random, equal distribution of numbers when n is very large," let's also pretend that any given result of Math.random somehow doesn't have a finite number of decimal places.