I need to write a program that exploits the cryptographic insecurity of JavaScript's Math.random() for a CTF challenge. However, I do not know how JavaScript converts the generator's integer output into a float between 0 and 1. As far as I understand, JavaScript's pseudo-random algorithm is xorshift128+, which in Python would look something like this:
def cast_to_int32(x):
    return x & 0xffffffff

def xorshift128plus():
    global state0
    global state1
    # Update each 32-bit half of the state: multiply the low 16 bits
    # and add in the high 16 bits.
    state0 = cast_to_int32(cast_to_int32(18030 * (state0 & 0xffff)) + cast_to_int32(state0 >> 16))
    state1 = cast_to_int32(cast_to_int32(30903 * (state1 & 0xffff)) + cast_to_int32(state1 >> 16))
    # Combine: high 16 bits from state0, low 16 bits from state1.
    return cast_to_int32(cast_to_int32(state0 << 16) + (state1 & 0xffff))
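For reference, this is how I am driving the generator; the seed values below are arbitrary placeholders I picked for testing, not values taken from a real engine:

state0 = 0x12345678  # placeholder seed
state1 = 0x9abcdef0  # placeholder seed

for _ in range(5):
    print(hex(xorshift128plus()))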
However, these bitwise operations are performed on integers, while the result of JavaScript's Math.random() is a float between 0 and 1. How does JavaScript do that conversion? The most intuitive answer would be a simple division, so I tried dividing the integer result by 2^31 - 1, 2^31, 2^32 - 1, and 2^32, but none of these reproduced the output of Math.random() from the same state.
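To be concrete, this is roughly the check I am doing; observed is just a placeholder standing in for a value I read off Math.random() from the same state:

observed = 0.7322407105091214  # placeholder for a real Math.random() output

raw = xorshift128plus()
for divisor in (2**31 - 1, 2**31, 2**32 - 1, 2**32):
    print(divisor, raw / divisor, abs(raw / divisor - observed) < 1e-12)

I have also wondered whether the engine skips the division entirely and builds the double straight from the bit pattern (for example, filling the mantissa of a double in [1, 2) and subtracting 1), but I have not been able to confirm that either.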