I am trying to implement a pair of encode and decode steganography functions in C, where I use rand() to scatter my data randomly across an array.
I calculate a random index like so:
unsigned int getRandIndex(int size) {
    /* map a rand() value to an index in [0, size) */
    return ((unsigned)(rand() * 8)) % size;
}
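
During encode, the scatter loop uses this roughly as follows (a simplified sketch; the names scatter, carrier, and data are illustrative, and index collisions are ignored here):

/* Simplified sketch of the encode-side scatter. Names are
   illustrative; real code also has to handle index collisions. */
void scatter(unsigned char *carrier, int carrierSize,
             const unsigned char *data, int len) {
    for (int i = 0; i < len; i++) {
        carrier[getRandIndex(carrierSize)] = data[i];
    }
}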
I seed rand like so:
unsigned long seed = time(NULL);
srand(seed);
I embed the seed alongside my data, as part of a header that also contains a checksum and the payload length.
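
Conceptually the header looks like this (a simplified sketch; the exact field types and order in my code may differ):

/* Rough shape of the embedded header (simplified sketch) */
struct Header {
    unsigned long seed;     /* srand() seed used at encode time */
    unsigned int  length;   /* number of payload bytes          */
    unsigned int  checksum; /* checksum over the payload        */
};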
The problem I have is that when decoding, after re-seeding rand() with the seed recovered from the data, rand() occasionally produces slightly different values, like so:
Index at encode: | Index at decode:
-----------------|-----------------
...              | ...
At: 142568       | At: 142568
At: 155560       | At: 155552
--               | --
At: 168184       | At: 168184
...              | ...
These mismatches corrupt my decoded data.
Is this a limitation of the rand() function? I am 100% sure the seed itself is being decoded correctly, as I have verified that it matches bit-for-bit.
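
For reference, this is the kind of single-process check I would expect to hold, using getRandIndex() from above (the count of 16 indices and the carrier size of 200000 are arbitrary values chosen for illustration):

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void) {
    unsigned long seed = time(NULL);
    unsigned int enc[16], dec[16];

    srand(seed);                          /* encode side */
    for (int i = 0; i < 16; i++) enc[i] = getRandIndex(200000);

    srand(seed);                          /* decode side, same seed */
    for (int i = 0; i < 16; i++) dec[i] = getRandIndex(200000);

    for (int i = 0; i < 16; i++)
        printf("At: %u | At: %u\n", enc[i], dec[i]);
    return 0;
}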