Tuesday, July 24, 2018

Randomly initialized embeddings with TensorFlow

I am building an embedding layer with randomly initialized embeddings. I proceeded this way:

import tensorflow as tf

vocab_size = 10
embed_dim = 4

mapping_strings = tf.constant(["hello", "lake", "palmer"])  # input tokens

table = tf.contrib.lookup.string_to_index_table_from_tensor(
    mapping=mapping_strings, num_oov_buckets=1, default_value=-1)

ids = table.lookup(mapping_strings)  # ids for each token

embedding_matrix = tf.random_normal(name="embedding_matrix",
                                    dtype=tf.float32,
                                    shape=[vocab_size, embed_dim])

# embedding for each id
embedded_inputs = tf.nn.embedding_lookup(embedding_matrix, ids)

with tf.Session() as sess:
    table.init.run()
    print(sess.run(embedded_inputs))

This works and gives me the expected output, but I want these randomly initialized embeddings to be trained later. Where are the weights and biases set, and how will backpropagation be performed so that the embeddings are learned? Also, is tf.random_normal giving me a random variable for embedding_matrix?
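One direction worth noting: in TF 1.x, tf.random_normal is only an op that draws a fresh random tensor on every session run, not a stateful variable, so as written nothing can be trained; wrapping it in a variable (e.g. tf.Variable(tf.random_normal([vocab_size, embed_dim])) or tf.get_variable with a random initializer) gives the optimizer a parameter to update. There are no separate weights and biases in an embedding layer: the embedding matrix itself is the trainable parameter, and backpropagation sends gradients only to the rows that were looked up. A minimal sketch of that mechanism in plain numpy (the toy loss and learning rate are made up purely for illustration, not part of the original question):

```python
import numpy as np

# Hypothetical toy setup mirroring the question: a randomly
# initialized embedding matrix is itself the trainable parameter.
rng = np.random.default_rng(0)
vocab_size, embed_dim = 10, 4
embedding_matrix = rng.standard_normal((vocab_size, embed_dim))

ids = np.array([0, 1, 2])            # ids produced by the lookup table
embedded = embedding_matrix[ids]     # what tf.nn.embedding_lookup does

# One SGD step on a toy loss L = 0.5 * ||embedded||^2,
# so dL/d(embedded) = embedded. The gradient is scattered back
# only onto the rows of embedding_matrix that were looked up.
grad = np.zeros_like(embedding_matrix)
np.add.at(grad, ids, embedded)       # scatter-add handles repeated ids
lr = 0.1
embedding_matrix -= lr * grad        # looked-up rows move; others don't

print(embedding_matrix[ids])         # updated embeddings for these tokens
```

After this step the rows for ids 0..2 have shrunk toward zero while every other row is untouched, which is exactly how an embedding layer learns: only the vectors of tokens seen in a batch receive gradient updates.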



