Who tells the AI what to create? And how well does the AI understand the exact thing the person is attempting to do?
It would be no different from prompt engineers now knowing exactly what to say or type in order to get the image output they want.
That prompt work would be a kind of programming code in and of itself.
Imagine if, or when, an entire model’s weights are lost.
Imagine you have a personal AI that you’ve been training for years, and it’s learning from you, and there’s a backup failure. It might be like losing a pet…