Wednesday, 28 October 2015

Encog NEAT Network evolving to a maximum size? (for approximating a complex function)

I am trying to create a neural network to approximate the output of a deterministic pseudo-random number generator function. I am starting off by using Encog's NEAT network.

Generating a pseudo-random number involves a lot of arithmetic (yes, I know NNs are not ideal for this), so I expect quite a few hidden layers and neurons will be needed to approximate this function.
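To make the task concrete, here is a minimal sketch of the kind of arithmetic a deterministic PRNG performs and how one might build training pairs from it. The original post does not say which generator is being approximated; a linear congruential generator (LCG) with the classic Numerical Recipes constants is used here purely as an illustrative stand-in, and the function names are my own.

```python
# Illustrative stand-in for the PRNG in the question: a linear congruential
# generator (LCG). The constants a, c, m are the Numerical Recipes choices,
# picked only for the example.
def lcg(seed, n, a=1664525, c=1013904223, m=2**32):
    """Return n pseudo-random values in [0, 1) generated from `seed`."""
    state = seed
    out = []
    for _ in range(n):
        state = (a * state + c) % m   # one step: multiply, add, reduce mod m
        out.append(state / m)         # normalize the integer state to [0, 1)
    return out

# Hypothetical training set for the approximation task:
# input = normalized seed, target = the generator's first output for that seed.
data = [(s / 1000.0, lcg(s, 1)[0]) for s in range(1000)]
```

Even this single update step mixes multiplication, addition, and a modulo reduction, which is why a smooth function approximator needs considerable capacity to fit it.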

Is there any limit on the growth in size (number of neurons) of the NN offspring during evolution that might prevent populations from evolving into the fittest state possible?

Hoping that Jeff (http://ift.tt/1SayGJc) picks up on this question.
