GSoC 2025: Annotations support for entities #889
Replies: 3 comments 4 replies
Reading through your descriptions it all looks good! Re-usability of the draw code would be very beneficial and it seems like you are on top of it. Looking forward to seeing what you cook up :)
I got the basic skeleton of the API and GUI interface working with the extensible annotation classes. I want to run it by you guys before I proceed with further changes. I haven't done any of the actual drawing related to annotations yet, but there aren't any unknowns there as of now (based on prototype experience). Here is a short video that shows the progress so far: annotations-api-gui-interface.mp4

As you can see from the above, the whole registration process of custom annotation classes (the ones we write as well as the ones that users can write) works fine. The dynamic methods added for annotation types work fine, and user-created annotations also work fine, as seen in the video.

The initial approach I had for annotations (the same as in the prototype) was to use Blender properties to save all of the annotation-specific data. Though Blender provides ways to create dynamic properties (which we need to support extensibility), it still seems like a lot of property management that we would have to deal with, and it wasn't quite straightforward.

The other approach I had in mind was to use Geometry Nodes for this: essentially, use the nodes themselves to store the data, as opposed to properties. We already have nice interfaces to expose and manage these, both for the API and the GUI, so it seemed natural to re-use them. This is what you might have noticed towards the end of the video above. We could potentially create a new modifier etc., but I thought it would be easy, and maybe beneficial in the future too, to use the existing tree.

Are you guys ok with me going down this path? It might seem unconventional to use Geometry Nodes for annotations, but maybe it is a good thing and fits in with the overall MN theme of using nodes?

With this, a lot of what we have for nodes can be re-used, whereas with properties it would be something new: we would have to figure out how to expose those (given the dynamic nature we are seeking now) and see if there are any other unknowns. Could you please share your thoughts and let me know if I can continue this way?

Hi @BradyAJohnston and @yuxuanzhuang ,
I got started on the annotations support. I am planning to do this in such a way that it is common to all entities and not just Trajectories, though it is for Trajectories that I will add the basic annotations initially.
Here is the example GUI from the prototype that shows various options that will be supported both through the GUI and API:
Here is the high level approach I have in mind:
I will add a `BaseAnnotation` class that is completely independent and includes all the `blf`, `gpu` and other Blender code (for the drawing and rendering support). This will include all the basic methods to draw text (labels) and lines anywhere in 3D and 2D (viewport) space. All our high-level annotations can be built using these basic methods. Entities will extend this to provide access to the entity-specific details; for example, `TrajectoryAnnotation(BaseAnnotation)` provides access to the underlying entity: the universe, the Blender object, etc.

I will add an `AnnotationsManager` class that is responsible for the overall management of annotations, including the ones we provide and custom ones that users can add (see below). This will include all the Blender property management that is needed for both the API and the GUI, the viewport draw handler, etc. It will provide APIs to add different annotation types, delete annotations, get annotations, and so on, and will also provide iterable and subscriptable access to the added annotations through the API. I will make a base class that has all this functionality so that it can be reused across different entities.

What shows up in the UI (and through the specific API) are the different annotation types that we support. For example, for Trajectories, the prototype supported the following:

- `atom_info`: basic atom info (name and optional res id, seg id)
- `bond_angles`: bond details between 3 atoms (lengths, angle)
- `com`: center of mass of a selection
- `com_distance`: distance between two centers of mass
- `canonical_dihedrals`: canonical dihedral details of a given resid

All annotation types will have the following configurable attributes (in addition to the annotation-specific input attributes), like what is shown in the image above: font color, font size, text alignment, text rotation, offset x/y (viewport), line color, line width, arrow size, pointer length, and others that control all aspects of what is drawn.
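For concreteness, the common configurable attributes could be grouped roughly like this. This is only a hypothetical sketch: the names and defaults are illustrative, and in MN these would be backed by Blender properties rather than a plain dataclass.

```python
from dataclasses import dataclass

# Hypothetical grouping of the common style attributes shared by all
# annotation types (illustrative names/defaults, not the final MN set).
@dataclass
class AnnotationStyle:
    font_color: tuple = (1.0, 1.0, 1.0, 1.0)  # RGBA
    font_size: int = 12
    text_align: str = "left"
    text_rotation: float = 0.0      # degrees
    offset_x: int = 0               # viewport pixels
    offset_y: int = 0
    line_color: tuple = (1.0, 1.0, 1.0, 1.0)
    line_width: float = 1.0
    arrow_size: float = 8.0
    pointer_length: float = 20.0

# Annotation-specific inputs would come on top of this shared style.
style = AnnotationStyle(font_size=14, text_align="center")
```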
I also want users to be able to create their own annotation types with ease (without having to modify MN code), thereby getting the corresponding API and GUI support. To allow this, I want to build the above Trajectory annotation types the same way that users would create their custom annotation types. This will let us build with extensibility in mind right from the beginning. This is also a bit similar to how MDAnalysis provides `AnalysisBase` for creating custom analysis classes.

For example, the `atom_info` annotation type could look something like a small class definition, and you will already see a similarity to Blender's panels and operators. Any class derived from `TrajectoryAnnotation` will get auto-registered with the `AnnotationsManager`, using a metaclass `__new__` or the `__init_subclass__` hook. There will also be `register` and `unregister` methods for manually controlling the same. The `annotation_type` class attribute will specify the annotation type; this will also be a value in the dropdown in the GUI. A corresponding API method (here `add_atom_info`) will be added to the `AnnotationsManager`, and it will return the `AtomInfo` object using a simple factory pattern in the manager class. Python's annotations in this class (`selection: str`, etc.) will be used to create the corresponding input Blender properties. These will show up in the GUI along with the rest of the common annotation configuration (font color, font size, etc.). All of these will also be exposed to the API as dynamic attributes linked to the internal Blender properties, so that the API and GUI stay in sync. The `draw` method is called during the viewport draw handler to actually draw the annotation in the viewport or in renders.

I did some very quick prototyping of the above approach, but I will need to explore a bit more to make sure I'm not missing anything. I will update this discussion if any of this changes.
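As a rough pure-Python sketch of what such an `atom_info` class and the auto-registration could look like (no Blender code here; everything beyond the names already mentioned above is a placeholder, not the final implementation):

```python
# Sketch only: demonstrates __init_subclass__ auto-registration and the
# factory method (add_<annotation_type>) idea, without any bpy/blf/gpu code.
class AnnotationsManager:
    _registry = {}

    @classmethod
    def register(cls, annotation_cls):
        name = annotation_cls.annotation_type
        cls._registry[name] = annotation_cls

        # Factory method: manager.add_<type>(...) creates and returns
        # the annotation instance.
        def adder(self, **params):
            ann = annotation_cls(**params)
            self._annotations.append(ann)
            return ann

        setattr(cls, f"add_{name}", adder)

    def __init__(self):
        self._annotations = []

    def __iter__(self):
        return iter(self._annotations)

    def __getitem__(self, index):
        return self._annotations[index]


class TrajectoryAnnotation:
    annotation_type = None

    def __init_subclass__(cls, **kwargs):
        super().__init_subclass__(**kwargs)
        AnnotationsManager.register(cls)  # auto-registration on subclassing

    def __init__(self, **params):
        # In MN, the class annotations would drive creation of Blender
        # input properties; here we just store the values directly.
        for key in type(self).__annotations__:
            setattr(self, key, params.get(key))

    def draw(self):
        raise NotImplementedError


class AtomInfo(TrajectoryAnnotation):
    annotation_type = "atom_info"
    selection: str
    show_resid: bool

    def draw(self):
        # Real code would call the blf/gpu helpers from BaseAnnotation.
        return f"atom_info({self.selection})"


manager = AnnotationsManager()
ann = manager.add_atom_info(selection="resid 42", show_resid=True)
```

The same subclassing pattern would be what user-defined annotation types follow, so our built-in types and custom ones go through one code path.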
The actual APIs for the end users would look something like:
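(The original code snippet here did not survive; as a rough stand-in, the shape could be something like the following. All names are illustrative, and the tiny in-memory manager below only mimics the add/iterate/index access described above; the real attributes would be linked to Blender properties.)

```python
# Hypothetical end-user API shape, with a minimal stand-in manager.
class AtomInfoAnnotation:
    def __init__(self, selection, font_size=12):
        self.annotation_type = "atom_info"
        self.selection = selection
        self.font_size = font_size

class Annotations:
    def __init__(self):
        self._items = []

    def add_atom_info(self, selection, **style):
        ann = AtomInfoAnnotation(selection, **style)
        self._items.append(ann)
        return ann

    def __iter__(self):
        return iter(self._items)

    def __getitem__(self, index):
        return self._items[index]

    def __len__(self):
        return len(self._items)

# A Trajectory entity would expose such a manager, e.g. traj.annotations:
annotations = Annotations()
label = annotations.add_atom_info(selection="resid 42", font_size=14)
label.font_size = 16  # attribute edits would stay in sync with the GUI
```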
I will ensure that the common code is separated out so that this can be easily extended to other entities like `Molecule` later.

I will update this discussion as I keep making progress.