I'm making a neural network, and when I assign random weight values with np.random.rand(797, 600), for example, they all come out positive (between 0 and 1). Normally this would be fine, but I have up to 800 nodes, so with that many synapses the weighted sums grow so large by the end of the initial forward pass that the sigmoid output is always 1.
To solve this, I wanted a function that randomly multiplies each weight by 1 or -1. With a random mix of positive and negative weights, the sums would stay closer to 0 and the sigmoid function would return an actual prediction instead of 1 every time. Here are the two methods I have tried, and neither of them worked.
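To illustrate the saturation, here is a quick sketch (assuming 800 inputs and a standard sigmoid, just for demonstration):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Inputs and weights both average ~0.5, so the pre-activation sum is
# roughly 800 * 0.25 = 200, which saturates the sigmoid to 1.0 in float64.
inputs = np.random.rand(800)
weights = np.random.rand(800)
print(sigmoid(inputs @ weights))  # effectively 1.0
```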
# method 1
import random as rand
import numpy as np
def random_positive_or_negative(value):
    return rand.choice([1, -1]) * value
example_weights = np.random.rand(4, 4)
print(random_positive_or_negative(example_weights))
prints either something like this:
[[0.89098337 0.82291754 0.7730489 0.371631 ]
[0.22790221 0.19964653 0.94609767 0.57070762]
[0.35840034 0.06689964 0.71565062 0.43360395]
[0.57860037 0.11338668 0.338402 0.30737682]]
or like this:
[[-0.79750561 -0.94206793 -0.389792 -0.18541991]
[-0.36132547 -0.66040689 -0.06270979 -0.90775857]
[-0.22350726 -0.21148559 -0.78874412 -0.9702534 ]
[-0.74124928 -0.31675956 -0.97471565 -0.18389436]]
expected output something like this:
[[0.2158195 0.16492544 0.25672823 -0.5392236 ]
[-0.54530676 0.98215902 -0.14348151 0.02629328]
[-0.8642513 -0.71726141 -0.15890395 -0.08488439]
[0.54413198 -0.69790104 0.05317512 -0.06144755]]
# method 2
import random as rand
import numpy as np
def random_positive_or_negative(value):
    return (i * rand.choice([-1, 1]) for i in value)
example_weights = np.random.rand(4, 4)
print(random_positive_or_negative(example_weights))
prints this:
<generator object random_positive_or_negative.<locals>.<genexpr> at 0x114c474a0>
expected output something like this:
[[0.2158195 0.16492544 0.25672823 -0.5392236 ]
[-0.54530676 0.98215902 -0.14348151 0.02629328]
[-0.8642513 -0.71726141 -0.15890395 -0.08488439]
[0.54413198 -0.69790104 0.05317512 -0.06144755]]
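For reference, here is a sketch of what I'm after done with NumPy directly (method 1 fails because rand.choice picks a single sign for the whole array; drawing one sign per element avoids that):

```python
import numpy as np

def random_positive_or_negative(value):
    # Draw an independent +1 or -1 for each element, then flip signs
    # elementwise, instead of one sign for the entire matrix.
    signs = np.random.choice([-1, 1], size=value.shape)
    return signs * value

example_weights = np.random.rand(4, 4)
print(random_positive_or_negative(example_weights))

# Equivalently, sample the weights from a symmetric range in one step:
# np.random.uniform(-1, 1, size=(4, 4))
```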