pullaway_loss compute wrong #3

@YanWenqiang

Description

def pullaway_loss(embeddings):

In the paper, the similarity is squared element-wise after it is computed, but the current implementation omits that square. Maybe the version below is right:

import tensorflow as tf

# L2-normalize each embedding so the matmul gives cosine similarities
embeddings_norm = tf.nn.l2_normalize(embeddings, axis=1)
# Element-wise square of the pairwise similarities, as in the paper
up = tf.reduce_sum(tf.square(tf.matmul(embeddings_norm, embeddings_norm, transpose_b=True)))
batch = tf.cast(tf.shape(embeddings)[0], dtype=tf.float32)
# Subtract the diagonal (each self-similarity is 1) and average over the N(N-1) ordered pairs
f = (up - batch) / (batch * (batch - 1))
with tf.Session() as sess:
    print(sess.run(f))
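For a quick sanity check of the formula itself, here is a minimal NumPy sketch of the same pull-away term (function name and test inputs are mine, not from the repo). Identical embeddings should give a loss of 1, orthogonal embeddings a loss of 0:

```python
import numpy as np

def pullaway_loss_np(embeddings):
    # L2-normalize each row
    normalized = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    # Cosine similarity matrix, then element-wise square
    squared = (normalized @ normalized.T) ** 2
    n = embeddings.shape[0]
    # Drop the diagonal (self-similarity is 1) and average over ordered pairs
    return (squared.sum() - n) / (n * (n - 1))

# Identical rows: maximally similar, loss = 1
print(pullaway_loss_np(np.array([[1.0, 0.0], [1.0, 0.0]])))  # → 1.0
# Orthogonal rows: loss = 0
print(pullaway_loss_np(np.array([[1.0, 0.0], [0.0, 1.0]])))  # → 0.0
```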
