Marvin Friede, Christian Hölzer, Sebastian Ehlert, and Stefan Grimme (2024)
Highlighted by Jan Jensen
When I first saw this paper on SoMe I was incredibly excited because I first thought it was the release of the long-anticipated g-xTB method (I have a bad habit of reading very superficially and seeing what I want to see). But then I skimmed the abstract, saw my mistake, and promptly forgot about the paper until I saw Jan-Michael Mewes' recent Bluesky thread.
The paper describes a fully differentiable Python-based PyTorch implementation of GFN1-xTB. In the paper they use it to compute some new molecular properties, but the real strength will be in developing new xTB methods for specific applications, i.e. a physics-based alternative to ML potentials. Jan gives an illustrative example of this in his thread.
While this application is mentioned in the paper, the paper doesn't contain an actual example of it. It remains to be seen how fiddly the actual retraining will be compared to MLPs, but the hope is that bespoke xTB methods will require significantly less training data and be more broadly applicable than MLPs.
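To make the retraining idea concrete, here is a minimal toy sketch (deliberately not using the dxtb API, and with a made-up one-parameter "repulsion" energy) of what a differentiable implementation buys you: because the energy is a smooth function of its parameters, a parameter can be refit to a small reference set by plain gradient descent. In the real code PyTorch's autograd supplies the derivatives; here the gradient is written out by hand to keep the example dependency-free.

```python
import math

# Hypothetical one-parameter pairwise energy: E(r) = z * exp(-alpha * r).
# This stands in for one term of a tight-binding energy expression.
def energy(r, alpha, z=1.0):
    return z * math.exp(-alpha * r)

# Analytic derivative dE/dalpha (what autograd would provide for free).
def d_energy_d_alpha(r, alpha, z=1.0):
    return -r * z * math.exp(-alpha * r)

# Tiny synthetic "training set": reference energies generated with a
# target alpha of 2.0 at four interatomic distances.
target_alpha = 2.0
data = [(r, energy(r, target_alpha)) for r in (0.8, 1.0, 1.5, 2.0)]

# Refit alpha from a poor initial guess by gradient descent on the
# sum-of-squares error against the reference energies.
alpha, lr = 1.0, 0.5
for _ in range(500):
    grad = sum(
        2.0 * (energy(r, alpha) - e_ref) * d_energy_d_alpha(r, alpha)
        for r, e_ref in data
    )
    alpha -= lr * grad

print(f"fitted alpha = {alpha:.4f}")  # recovers ~2.0
```

The point of the sketch is that once every term in the energy is differentiable, this loop scales from one toy parameter to the full xTB parameter set, with the reference data swapped for whatever bespoke application data you have.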
That's assuming that g-xTB doesn't solve all our problems, which is very much my expectation based on Grimme's talks about it (but keep in mind that my listening skills are even worse than my reading skills).
This work is licensed under a Creative Commons Attribution 4.0 International License.