I can't seem to work out the Big-O complexity of a while loop that has an indeterminate number of iterations.
In my code for a personal project, I have a list of ten values, all initialized to 0. I then run a while loop that generates a random integer between 0 and 9. If the value in the list at that index is 0, it is set to 1 and the loop exits. Otherwise, another random number is generated and the process repeats.
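Here's a simplified Python sketch of what I'm doing (the function name and the iteration counter are just for illustration, not my real code):

import random

slots = [0] * 10  # the list containing all 0s

def claim_random_slot(slots):
    # Keep drawing random indices until we land on one that still holds a 0.
    iterations = 0
    while True:
        iterations += 1
        i = random.randint(0, 9)  # random integer between 0 and 9, inclusive
        if slots[i] == 0:
            slots[i] = 1          # write the 1, then exit the loop
            return iterations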
I'm not entirely sure what the time complexity of this would be, however. For example, say that after 9 successful iterations every value in the list except the one at index 9 is 1, and the random number generator then happens not to produce a 9 for, say, 99 draws in a row; the whole process would only finish after 99 + 9 iterations. Wouldn't the worst case then be O(infinity)? I don't think that's possible, but I figured I'd ask since I wasn't sure.
My textbooks and online resources don't offer much insight into examples like this. I'm fairly confident the best case is O(1), but I'm unsure about the average and worst cases.
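To at least get a feel for the average case, I tried a quick Monte Carlo sketch (again, the names are made up; it simulates only the hardest situation, where a single 0 remains):

import random

def draws_until_success(p):
    # Number of independent draws until one succeeds with probability p.
    draws = 1
    while random.random() >= p:
        draws += 1
    return draws

# Average draws needed to hit the one remaining 0 when 9 of the 10
# slots are already 1, i.e. success probability p = 1/10. The sample
# mean should come out near 1/p = 10 over enough trials.
trials = 100_000
print(sum(draws_until_success(0.1) for _ in range(trials)) / trials)

The printed average is finite even though no single run has a guaranteed finite bound, which is exactly the distinction I don't know how to express in Big-O terms.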
I found a related problem with the same premise. Here's the pseudocode, where n is some integer of arbitrary size:
sample_found = false
while (!sample_found) {
    if (rand(0, n) == 0) {
        sample_found = true
    }
}
In the worst case, this would run forever, right? I'm not sure about the average case, either.
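Here's my rough attempt at modeling the average case, assuming rand(0, n) returns a uniform integer over the n + 1 values 0..n, so each iteration succeeds independently with probability p = 1/(n + 1) and the iteration count X is geometrically distributed:

\Pr[X = k] = \left(1 - \frac{1}{n + 1}\right)^{k - 1} \cdot \frac{1}{n + 1},
\qquad \mathbb{E}[X] = \frac{1}{p} = n + 1

If that modeling is right, the average case would be linear in n, but I still don't know how (or whether) the unbounded worst case fits into Big-O notation.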