IntegratedGradient with XLM type models (FlauBERT) #414

Open
@carodupdup

Description

Hi there,

I am facing some issues trying to apply the IntegratedGradients algorithm to a pretrained FlauBERT model I have fine-tuned.
First, I followed the tutorial on BERT for SQuAD, and while scrolling through the issues I came upon this gist: https://gist.github.com/davidefiocco/3e1a0ed030792230a33c726c61f6b3a5
which let me adapt it to the FlauBERT model with only small changes.

However, that notebook uses the LayerIntegratedGradients (LIG) algorithm, not IntegratedGradients (IG). Going through the tutorial, I found this paragraph explaining how to switch from LIG to IG:
"we can also use IntegratedGradients class instead (of LayerIntegratedGradients), however in that case we need to precompute the embeddings and wrap Embedding layer with InterpretableEmbeddingBase module. This is necessary because we cannot perform input scaling and subtraction on the level of word/token indices and need access to the embedding layer."
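For concreteness, here is how I understand the precomputation that paragraph describes, sketched on a toy model in plain PyTorch (no Captum, so the mechanics are visible; the model and all names are purely illustrative, not FlauBERT):

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy stand-in for a language model: embedding lookup + linear scorer.
class ToyModel(nn.Module):
    def __init__(self, vocab=100, dim=8):
        super().__init__()
        self.embeddings = nn.Embedding(vocab, dim)
        self.scorer = nn.Linear(dim, 1)

    def forward_from_embeddings(self, emb):
        # emb: (batch, seq, dim) -> one scalar score per example
        return self.scorer(emb.mean(dim=1)).squeeze(-1)

model = ToyModel()
input_ids = torch.tensor([[3, 17, 42]])

# 1) Precompute embeddings. This is the step IG needs: IG scales its
#    input continuously between baseline and input, which makes no
#    sense for integer token indices, only for embedding vectors.
inp_emb = model.embeddings(input_ids)    # (1, 3, 8)
base_emb = torch.zeros_like(inp_emb)     # baseline: all-zero embeddings

# 2) Riemann approximation of IG's path integral from baseline to input.
steps = 50
total_grad = torch.zeros_like(inp_emb)
for i in range(1, steps + 1):
    alpha = i / steps
    point = (base_emb + alpha * (inp_emb - base_emb)).detach().requires_grad_(True)
    out = model.forward_from_embeddings(point)
    grad, = torch.autograd.grad(out.sum(), point)
    total_grad += grad
attributions = (inp_emb - base_emb) * total_grad / steps

# Completeness axiom: attributions should sum to f(input) - f(baseline).
delta = attributions.sum() - (
    model.forward_from_embeddings(inp_emb)
    - model.forward_from_embeddings(base_emb)
)
```

The `InterpretableEmbeddingBase` wrapper in Captum automates step 1, replacing the embedding layer with an identity so the model's forward can accept precomputed embeddings directly.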

I'm afraid that paragraph confused me very much. Finally, issue #150 (the code written by vfdev-5) is where I am blocked. First, I am wondering where that code actually uses the InterpretableEmbeddingBase module. Furthermore, when I try to adapt the code, the `encoder` attribute, which exists in the BERT models, does not exist in XLM (at least to my knowledge). So I am wondering whether it is possible to use the IntegratedGradients algorithm with an XLM-style model at all.

If anyone has been working on this specific problem or is willing to help me, it would be much appreciated. Thank you in advance!
