The next title has to generate more revenue than the last. Forever. Eternally.
Companies expect perpetual growth. No ceiling.
This has a breaking point that this meme captures pretty nicely :)
Ah, but that’s the thing. Training isn’t copying; it’s pattern recognition. If you train a model on “The dog says woof” and then ask it “What does the dog say?”, it’s not guaranteed to say “woof”.
Similarly, just because a model was trained on Harry Potter, all that means is it has learned the statistical patterns of how the sentences in that book tend to go.
Thus the distinction. Can I train on a comment section discussing the book?
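To make that concrete, here’s a toy sketch of the idea (nothing like a real LLM, just a bigram frequency model over a made-up three-sentence corpus I invented for illustration): the “model” stores only next-word counts learned from the text and samples from them, so the same prompt can come back with different continuations, including ones that never appeared together in training.

```python
import random
from collections import Counter, defaultdict

# Hypothetical tiny corpus, invented for this example.
corpus = [
    "the dog says woof",
    "the dog says hello",
    "the cat says meow",
]

# "Training": count which word follows which. The model keeps
# these counts, not the sentences themselves.
counts = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for prev, nxt in zip(words, words[1:]):
        counts[prev][nxt] += 1

def generate(prompt_word, steps=3):
    word = prompt_word
    out = [word]
    for _ in range(steps):
        dist = counts.get(word)
        if not dist:
            break
        # Sample from the learned distribution: the continuation is
        # probable, not guaranteed, so "dog says" may be followed by
        # "woof", "hello", or even "meow".
        word = random.choices(list(dist), weights=dist.values())[0]
        out.append(word)
    return " ".join(out)

print(generate("dog"))
```

Run it a few times and you’ll see the point: it can even answer “meow”, because “says” was a pattern learned across sentences, not a line retrieved from one of them.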
We have to distinguish between LLMs trained on the book itself and LLMs trained on discussions of it. They are not one and the same.
There must be more profit than last year.
We’re witnessing gaming devour itself from the inside in pursuit of this impossible goal.
Buy indie