@MichelangeloDomina MichelangeloDomina commented Jul 10, 2025

Refactor the native PET code to make it more readable and better documented, while maintaining backward compatibility with previous models' checkpoints.
The main changes so far are:

  • transformer.py
    Added a TokenEncoder class to factor out the logic for building new tokens from edges, distances, and input messages.
  • module.py
    Several pieces of functionality have been moved into private helpers to make `__init__` and the forward pass more readable. The helpers require further testing to ensure they are all TorchScript-compatible.
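The factorization described above might look roughly like this. This is a minimal sketch: the class name TokenEncoder comes from the PR, but the constructor signature, layer names, and fusion-by-summation design are assumptions, not the actual metatrain code.

```python
import torch
from torch import nn


class TokenEncoder(nn.Module):
    """Builds transformer tokens from edge vectors, distances, and messages.

    Hypothetical sketch: the real TokenEncoder in metatrain's PET code
    may differ in both signature and internals.
    """

    def __init__(self, d_model: int, d_edge: int = 3):
        super().__init__()
        # One linear embedding per input stream, fused by summation.
        self.edge_embed = nn.Linear(d_edge, d_model)
        self.dist_embed = nn.Linear(1, d_model)
        self.msg_embed = nn.Linear(d_model, d_model)

    def forward(
        self,
        edge_vectors: torch.Tensor,  # (n_edges, d_edge)
        distances: torch.Tensor,     # (n_edges,)
        messages: torch.Tensor,      # (n_edges, d_model)
    ) -> torch.Tensor:
        # Sum the three embedded streams into a single token per edge.
        return (
            self.edge_embed(edge_vectors)
            + self.dist_embed(distances.unsqueeze(-1))
            + self.msg_embed(messages)
        )
```

Keeping the forward fully annotated with tensor types, as here, also helps with the TorchScript compatibility the PR is tracking.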

TODO

  • After rebasing the branch onto main, the test_compatibility.py module broke; it must be fixed to ensure backward compatibility with previous checkpoints.
  • Possibly modify how the edge and node tokens from the heads and the last layers are created and stored, to reduce code repetition inside the forward pass.
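The checkpoint-compatibility idea (remapping old state-dict keys onto the refactored module structure, as done in test_compatibility.py) can be sketched as below. The old/new prefixes in the example are invented for illustration; the real mapping depends on how transformer.py and module.py were actually restructured.

```python
def remap_checkpoint_keys(state_dict: dict, key_map: dict) -> dict:
    """Rename state-dict keys whose prefix changed during a refactor."""
    remapped = {}
    for key, value in state_dict.items():
        new_key = key
        for old_prefix, new_prefix in key_map.items():
            if key.startswith(old_prefix):
                new_key = new_prefix + key[len(old_prefix):]
                break
        remapped[new_key] = value
    return remapped


# Example: weights that used to live directly on the transformer now
# belong to a (hypothetical) token_encoder submodule.
old = {"transformer.edge_proj.weight": 1.0, "head.linear.bias": 2.0}
key_map = {"transformer.edge_proj.": "transformer.token_encoder.edge_embed."}
new = remap_checkpoint_keys(old, key_map)
# "transformer.edge_proj.weight" is renamed; "head.linear.bias" is untouched.
```

The remapped dict can then be passed to `load_state_dict` on the refactored model.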

📚 Documentation preview 📚: https://metatrain--662.org.readthedocs.build/en/662/

abmazitov and others added 4 commits July 4, 2025 13:51
…f mask; trying to improve readability of the architecture.

test_compatibility.py: remap checkpoint keys for compatibility with the new structure of transformer.py;
module.py: created a few private helpers to make the __init__ more readable.
@tulga-rdn
Collaborator

Currently not torchscriptable, working on it
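A quick way to check whether a refactored helper remains TorchScript-compatible is to script it directly and run the scripted module. This is a generic pattern, not the project's actual test suite; the Helper class below is a hypothetical stand-in for one of the new private helpers.

```python
import torch
from torch import nn


class Helper(nn.Module):
    """Minimal stand-in for one of the refactored private helpers."""

    def __init__(self):
        super().__init__()
        self.linear = nn.Linear(4, 4)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.relu(self.linear(x))


# torch.jit.script raises at compile time if the module uses
# non-scriptable constructs (e.g. untyped containers, **kwargs,
# or Python-only control flow), so this doubles as a smoke test.
scripted = torch.jit.script(Helper())
out = scripted(torch.zeros(2, 4))
```

Running each helper through `torch.jit.script` in a unit test catches scriptability regressions before they reach a release.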
