I have been thinking about this problem for several days.
As before: if I initialize one weight matrix for the encoder and then transpose it for the decoder, does that give tied weights? Today I ran an experiment and saw that it does. See the code below:

```python
import tensorflow as tf

# Encoder: 4 -> 2
x = tf.placeholder(dtype=tf.float32, shape=[None, 4])
w = tf.Variable(tf.truncated_normal(shape=[4, 2], mean=0.0, stddev=0.1, dtype=tf.float32))
b = tf.Variable(tf.constant(0.0, shape=[2], dtype=tf.float32))
z = tf.nn.relu(tf.matmul(x, w) + b)

# Decoder: 2 -> 4, reusing the transposed encoder weight (tied weights).
w_inv = tf.transpose(w)
b_inv = tf.Variable(tf.constant(0.0, shape=[4], dtype=tf.float32))
rec = tf.nn.relu(tf.matmul(z, w_inv) + b_inv)

# Reduce the element-wise reconstruction error to a scalar loss for the optimizer.
loss = tf.reduce_mean(tf.squared_difference(x, rec))
train_op = tf.train.AdamOptimizer().minimize(loss)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    # Print every trainable variable in the graph.
    for v in tf.trainable_variables():
        print(v)
```
And the output:
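Roughly the following (a sketch of the expected output; the exact variable names depend on how the graph was built):

```
<tf.Variable 'Variable:0' shape=(4, 2) dtype=float32_ref>
<tf.Variable 'Variable_1:0' shape=(2,) dtype=float32_ref>
<tf.Variable 'Variable_2:0' shape=(4,) dtype=float32_ref>
```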
Clearly, the tied weights work: only w, b, and b_inv show up as trainable variables. w_inv does not appear because tf.transpose(w) is an op that reads w, not a separate Variable, so the encoder and decoder share a single weight matrix and gradients from both paths update the same w.
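As an extra sanity check, here is a minimal sketch (reusing the graph built above, with NumPy random data as a stand-in for a real dataset) that trains for a few steps and confirms the decoder weight still equals the transpose of the encoder weight:

```python
import numpy as np

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    data = np.random.rand(8, 4).astype(np.float32)  # dummy batch, stand-in for real data
    for _ in range(100):
        sess.run(train_op, feed_dict={x: data})
    w_val, w_inv_val = sess.run([w, w_inv])
    # If the weights are truly tied, the transpose relationship survives training.
    print(np.allclose(w_val.T, w_inv_val))  # expected: True
```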