I am taking a new course at the university on Simulation, and one of the number-generation methods we learned is von Neumann's middle-square method. We were asked a question from "The Art of Computer Programming", and I have thought a lot without a clue on how to answer it:
Question 9 [M14] in The Art of Computer Programming: "Prove that the middle-square method using 2n-digit numbers to the base b has the following disadvantage: If the sequence includes any number whose most significant n digits are zero, the succeeding numbers will get smaller and smaller until zero occurs repeatedly."
I would be happy for a clue on how to start proving such a thing. In practice I tried it, and it does indeed behave this way. But how would one prove this mathematically?
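For reference, here is a small Python sketch of the experiment I ran (my own illustration, not from the book). With base b = 10 and n = 2, a step of the method squares a 4-digit number and keeps the middle 4 digits of the 8-digit square; starting from a seed whose top n digits are zero (i.e. a seed below b^n), the values visibly shrink to zero:

```python
def middle_square(x, n, b=10):
    """One step of the middle-square method for 2n-digit numbers base b:
    square x (at most 4n digits) and keep the middle 2n digits."""
    return (x * x // b**n) % b**(2 * n)

n = 2          # 2n = 4 decimal digits
seq = [42]     # seed 0042: its most significant n digits are zero
while seq[-1] != 0:
    seq.append(middle_square(seq[-1], n))
print(seq)     # prints [42, 17, 2, 0] -- strictly decreasing to zero
```

Note that for a seed x < b^n the step reduces to floor(x^2 / b^n), which may be a useful thing to stare at when looking for the proof.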