Unusual behaviour of the relation encoder #7

@ecklyn

Description

Hi,
I am working on a personal project that uses HINormer and while debugging the model I've seen that the heterogeneous relations embedding (expressed as r in the code) follows a weird behaviour. When printed out during a training process, it seems that some columns end up filled with zeros during the first epoch and then don't change much during training. There is no recognisable pattern on how these columns are filled with zeros, sometimes it's one random column, sometimes two, sometimes all of them are filled with zeros.

I found that the problem stems from the activation function at the end of the forward pass in the REConv module, which is a ReLU by default and therefore sets all negative values to zero. You can verify this by printing rst just before the activation inside REConv. Is this the intended behaviour? Zeroing out a large part of the heterogeneous relation embedding at the start of training could make the pipeline less consistent, and since ReLU's gradient is zero for negative inputs, the zeroed columns may have little chance to recover during training.
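To make the observation concrete, here is a minimal dependency-free sketch (not the actual REConv code; the 3x4 matrix and names are made up for illustration) of how a final ReLU wipes any embedding column whose pre-activation values happen to be all negative:

```python
# Illustrative sketch only: shows how a ReLU at the end of a forward pass
# zeroes out a relation-embedding column whose entries are all negative.

def relu(x):
    return [[max(v, 0.0) for v in row] for row in x]

# Hypothetical pre-activation embedding "rst" (3 relations x 4 dims);
# column 2 is entirely negative, so ReLU maps it to all zeros.
rst = [
    [ 0.5, -0.1, -0.3,  0.2],
    [ 1.2,  0.4, -0.7, -0.6],
    [-0.2,  0.9, -0.1,  0.8],
]

r = relu(rst)
zeroed_cols = [j for j in range(4) if all(row[j] == 0.0 for row in r)]
print(zeroed_cols)  # column 2 ends up entirely zeroed
```

Because ReLU also passes zero gradient for those entries, such a column would receive no update signal through the activation, matching the "doesn't change much during training" symptom.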

I noticed this behaviour while trying to reproduce the provided experiment with: python run.py --dataset DBLP --len-seq 50 --dropout 0.5 --beta 0.1 --temperature 2. However, I used newer versions of PyTorch and DGL than those pinned in the requirements (other models used alongside HINormer in my project require them), so I cannot rule out that the updated packages contribute to the negative values in the embedding.
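For anyone who wants to reproduce the check without editing REConv directly, a forward pre-hook can capture the tensor entering the final ReLU. This is a hedged sketch (the tensor values and names are illustrative, not taken from HINormer):

```python
import torch
import torch.nn as nn

# Capture the pre-activation tensor ("rst") entering a ReLU via a
# forward pre-hook, so it can be inspected without modifying the module.
captured = {}

def grab_pre_activation(module, inputs):
    # inputs is a tuple of the tensors passed to the module's forward()
    captured["rst"] = inputs[0].detach().clone()

relu = nn.ReLU()
relu.register_forward_pre_hook(grab_pre_activation)

rst = torch.tensor([[-0.3, 0.5], [-0.7, 1.2]])
r = relu(rst)

print(captured["rst"])  # still contains the negative pre-activation values
print(r)                # negatives are zeroed after the ReLU
```

In the real model one would register the hook on the ReLU (or on REConv itself) inside the instantiated network rather than on a standalone module.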
