Perhaps we could "mount" a model such as gpt-neo-1.3B locally, so the Python and JavaScript communities could get familiar with using AI text generation.
The model above is rather large (the weights run to several gigabytes), so copying it to and running it on each instance is prohibitive. But if the weights lived in one common local location, running the model from there would be feasible.
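As a rough sketch of what that could look like with Hugging Face `transformers`: the shared path `/srv/models/gpt-neo-1.3B` and the `GPT_NEO_DIR` environment variable below are assumptions of mine, not an existing convention — point them at wherever the weights actually get mounted.

```python
import os

# Hypothetical shared mount point -- adjust to wherever the weights
# actually live on your machines. GPT_NEO_DIR is an invented override,
# not a variable that transformers itself reads.
SHARED_MODEL_DIR = os.environ.get("GPT_NEO_DIR", "/srv/models/gpt-neo-1.3B")

def load_generator(model_dir: str = SHARED_MODEL_DIR):
    """Load tokenizer and model from the shared local directory.

    Because from_pretrained() is given a local path, nothing is
    downloaded; every process on the box reads the same weights.
    """
    from transformers import GPT2Tokenizer, GPTNeoForCausalLM  # GPT-Neo reuses the GPT-2 tokenizer
    tokenizer = GPT2Tokenizer.from_pretrained(model_dir)
    model = GPTNeoForCausalLM.from_pretrained(model_dir)
    return tokenizer, model
```

Once loaded, `tokenizer` and `model` can be used for ordinary `model.generate(...)` calls, and only one copy of the weights ever sits on disk.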
Yes, I would like natural language: a Python model that I can train on my own text files, and that then speaks like a human based on what it has learned from those files.
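A minimal sketch of the first step — turning a folder of text files into training chunks for causal-LM fine-tuning. The folder layout, chunk size, and stride here are placeholder choices, and the fine-tuning loop itself (e.g. `transformers.Trainer`) is only hinted at in the final comment:

```python
from pathlib import Path

def load_corpus(folder: str) -> str:
    """Concatenate every .txt file in `folder` into one training corpus."""
    parts = [p.read_text(encoding="utf-8") for p in sorted(Path(folder).glob("*.txt"))]
    return "\n\n".join(parts)

def chunk(text: str, size: int = 1024, stride: int = 512):
    """Yield overlapping windows of the corpus, sized for causal-LM training."""
    for start in range(0, len(text), stride):
        piece = text[start:start + size]
        if piece:
            yield piece

# Each chunk would then be tokenized and fed to a fine-tuning loop,
# e.g. transformers.Trainer with a causal language-modeling objective.
```

The overlap (stride smaller than chunk size) just gives the model more context continuity across window boundaries; it is a common choice, not a requirement.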