Sunday, June 30, 2024

Using GNN property predictors as molecule generators

Félix Therrien, Edward H. Sargent, and Oleksandr Voznyy (2024)
Highlighted by Jan Jensen

Figure 1 from the paper. (c) 2024 the authors

Now this is a very neat idea. Normally, we use backpropagation to alter the weights in order to minimise the difference between the output and the ground truth. Instead, the authors use backpropagation to alter the input to minimise the difference between the output and a desired value. In this case the input is the molecular adjacency matrix and the result is a molecule with the desired property.
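To make the idea concrete, here is a rough PyTorch sketch of what "optimising the input instead of the weights" looks like. The predictor here is just a dummy stand-in for a trained GNN, and the shapes and target value are hypothetical, but the mechanics are the same: freeze the model, make the adjacency matrix the trainable tensor, and descend on the property loss.

import torch
import torch.nn as nn

# Stand-in for a pretrained property predictor; in the paper this is a
# trained GNN, here it is a dummy network so the sketch runs end to end.
n_atoms = 10
predictor = nn.Sequential(nn.Flatten(), nn.Linear(n_atoms * n_atoms, 1))
for p in predictor.parameters():
    p.requires_grad_(False)          # the model weights stay frozen

# The *input* (a relaxed, continuous adjacency matrix) is the optimisation
# variable instead of the network weights.
adj = torch.rand(1, n_atoms, n_atoms, requires_grad=True)
target = torch.tensor([[5.0]])       # desired property value (hypothetical units)

opt = torch.optim.Adam([adj], lr=0.05)
for step in range(200):
    opt.zero_grad()
    pred = predictor(adj)
    loss = (pred - target).pow(2).mean()
    loss.backward()                  # gradients flow into the input, not the weights
    opt.step()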

It's one of those "why didn't I think of this?" ideas, but, in practice, there are a few tricky problems to overcome. These include recasting the integer adjacency matrix as a smooth float matrix, finding the right constraints to yield valid molecules, and finding the right loss function. The authors manage to find clever solutions to all these problems and show that this simple idea actually works quite well. As I read it, the current implementation is limited to HCNOF molecules, but generalising it should not be an insurmountable task.
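To give a flavour of what those tricky parts involve, here is one plausible way to relax the adjacency matrix and to nudge the optimisation towards valid molecules with a valence penalty. This is my own sketch under those assumptions, not necessarily the authors' exact formulation, and the penalty weight and valence limits are hypothetical.

import torch

def relaxed_adjacency(raw):
    """Map an unconstrained float matrix to a symmetric, non-negative
    'soft' adjacency matrix with zero diagonal (bonds are undirected,
    no self-bonds)."""
    sym = 0.5 * (raw + raw.transpose(-1, -2))
    soft = torch.sigmoid(sym)                        # soft bond orders in (0, 1)
    soft = soft * (1 - torch.eye(soft.shape[-1]))    # zero the diagonal
    return soft

def valence_penalty(adj, max_valence):
    """Penalise atoms whose total soft bond order exceeds a maximum valence
    (e.g. 4 for carbon); added to the property loss to favour valid molecules."""
    degree = adj.sum(dim=-1)
    return torch.relu(degree - max_valence).pow(2).sum()

# Hypothetical combined objective: match the target property *and* stay valid.
# total_loss = (pred - target).pow(2).mean() + 10.0 * valence_penalty(adj, 4.0)

At the end of the optimisation the soft matrix still has to be rounded back to integer bond orders, which is another place where the details matter.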

Even if this approach doesn't turn out to be the best generative model, it is one of those obvious (in hindsight) methods that have to be tested to justify more complicated approaches.



This work is licensed under a Creative Commons Attribution 4.0 International License.


