`def pullaway_loss(embeddings):`
In the paper, the cosine similarities are squared element-wise after they are computed, which this function seems to omit. Perhaps the following is right:
```python
import tensorflow as tf

# Normalize each embedding to unit length along the feature axis.
embeddings_norm = tf.nn.l2_normalize(embeddings, axis=1)
# Cosine-similarity matrix, squared element-wise as in the paper.
up = tf.reduce_sum(tf.square(tf.matmul(embeddings_norm, embeddings_norm, transpose_b=True)))
batch = tf.cast(tf.shape(embeddings)[0], dtype=tf.float32)
# Subtract the N diagonal ones and average over the N * (N - 1) off-diagonal pairs.
f = (up - batch) / (batch * (batch - 1))

with tf.Session() as sess:
    print(sess.run(f))
    print(sess.run(embeddings_norm))  # was `s_norm`, which is undefined
```
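For a quick sanity check independent of TensorFlow, the pull-away term can be sketched in NumPy and compared against the explicit double sum from the EBGAN paper (the function name and the random embeddings below are illustrative, not from the repo):

```python
import numpy as np

def pullaway_term(embeddings):
    # Normalize each row (embedding) to unit length.
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    e = embeddings / norms
    # Cosine-similarity matrix, squared element-wise as in the paper.
    sim_sq = (e @ e.T) ** 2
    n = embeddings.shape[0]
    # Diagonal entries are exactly 1 (self-similarity); subtract them,
    # then average over the n * (n - 1) off-diagonal pairs.
    return (sim_sq.sum() - n) / (n * (n - 1))

rng = np.random.default_rng(0)
emb = rng.standard_normal((4, 8))
pt = pullaway_term(emb)
print(pt)  # squared cosine similarities lie in [0, 1], so the mean does too
```

Since the diagonal of the squared similarity matrix is all ones, subtracting the batch size and dividing by `n * (n - 1)` yields exactly the mean over the off-diagonal pairs.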
EBGAN.tensorflow/EBGAN/Faces_EBGAN.py
Line 209 in 7c12383