So if you’re planning to have the kid play on Switch or something like that, it’s not going to work.
You can run Geyser (a proxy that translates the Bedrock protocol to the Java protocol) to let Bedrock clients play on your Java server.
Which is weird, because one of Rossmann’s sources claimed that they were on the phone with Brother, asked how to do manual registration, and were told it couldn’t be done unless a genuine Brother toner cartridge was installed.
“Training a new model” is equivalent to “making a new game” with better graphics.
I’ve already explained that analogy several times.
If people pay you for the existing model you have no reason to immediately train a better one.
You have no idea the rage I had at online fanboys defending Lucas.
When you throw more hardware at these models, they are supposed to develop new abilities.
I don’t think you understand how it works at all.
Data is collected. Training is done on the data, and further training is done on that trained output (DeepSeek). You now have a model. That model is a static software program. It requires about 700 GB of RAM to run (DeepSeek). Throwing more hardware at the model does nothing but give you a quicker response.
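If it helps, here’s that lifecycle as a toy Python sketch (every name and number in it is made up for illustration): training consumes data once and produces frozen weights; inference only reads them, so extra hardware buys latency, not new abilities.

    # Toy sketch; all names and numbers are hypothetical.
    def train(data):
        # Training consumes data and produces a fixed set of weights.
        return sum(data) / len(data)    # stand-in for gradient descent

    def infer(weights, prompt, num_gpus=1):
        # Inference only reads the frozen weights. Extra GPUs cut
        # latency; the answer itself is identical.
        latency_s = 10.0 / num_gpus
        answer = weights * len(prompt)  # stand-in for a forward pass
        return answer, latency_s

    weights = train([1.0, 2.0, 3.0])            # done once, offline
    print(infer(weights, "hello", num_gpus=1))  # (10.0, 10.0)
    print(infer(weights, "hello", num_gpus=8))  # (10.0, 1.25): same answer, faster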
If everyone pays you to use your model, you have no reason to develop a new one. Like Skyrim.
I’m supposed to be able to take a model architecture from today, scale it up 100x and get an improvement.
You can make Crysis run at higher FPS. You can add polygons (remember ATI’s clown feet?). You can add detail to textures: https://research.nvidia.com/publication/2016-06_infinite-resolution-textures
But really the “game” is the model. Throwing more hardware at the same model is like throwing more hardware at the same game.
Which part of diminishing returns not offering as much profit did you not understand?
Current models give MS an extra 30% revenue. If they spend billions on a new model, will customers pay even more? How much more would you pay for a marginally better AI?
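To put rough numbers on that question (every figure below is hypothetical, just to frame the math):

    # Back-of-the-envelope payback math; every number is made up.
    training_cost   = 5e9    # hypothetical: $5B to train the next model
    current_revenue = 10e9   # hypothetical: $10B/yr from the current model
    uplift          = 0.05   # hypothetical: customers pay 5% more for "marginally better"

    extra_revenue = current_revenue * uplift       # $0.5B/yr
    payback_years = training_cost / extra_revenue  # 10 years
    print(f"Payback: {payback_years:.0f} years")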
Current games have a limit. Current models have a limit. New games could scale until people don’t see a quality improvement. New models can scale until people don’t see a quality improvement.
As long as the hobby doesn’t involve making something that takes more than a day.
Sorry, I meant WDM (wavelength-division multiplexing). Multimode was for home installs.
Just as games have diminishing returns on better graphics (they’re already photorealistic; few pay $2k for a GPU to render a few more hairs), AI has a plateau where it gives good-enough answers that people will pay for the service.
If people are paying you money and the next level of performance is not appreciated by the general consumer, why spend billions that will take longer to recoup?
And again, data centers aren’t just used for AI.
If buying a new video card made me money, yes
But that supposes that not buying a video card makes you the same money. You’re forecasting free performance upgrades, so there’s no need to spend money now when you can wait and upgrade the hardware once software improvements stop.
And that’s assuming it has anything to do with AI at all, rather than the long-term macroeconomics of Trump destroying the economy, with MS putting off spending because businesses will be slowing down due to the tariff war.
More efficient hardware use should be amazing for AI since it allows you to scale even further.
If you can achieve scaling with software, you can delay current plans for expensive hardware. If a new driver came out that gave games RTX 5090 performance on GTX 1080-class hardware, would you still buy a new video card this year?
When all the telcos scaled back on building fiber in 2000, was that because they didn’t have a positive outlook for the Internet?
Or when video game companies went bankrupt in the 1980s, was that because video games were over as entertainment?
There’s a huge leap between not spending billions on new data centers (which are used for more than just AI) and claiming that’s the reason AI is over.
Cancelling new data centers because DeepSeek has shown a more efficient path isn’t proof that AI is dead, as the author claims.
Fiber buildouts were cancelled back in 2000 because multimode made existing fiber more efficient. The Internet investment bubble popped. That didn’t mean the Internet was dead.
There are over a hundred Xbox games that never got a port. The Virtual Boy had only 22 games and got an emulator. There are obscure arcade games that, by their nature, are each a single game, and they have been emulated.
Being hackable is why I would have expected ports to be easier.
It’s crazy that you can now play 360 games, which used a PowerPC CPU, but still can’t play most original Xbox games, which ran on a regular x86 CPU and an Nvidia GPU.
So George Lucas lied when he said the original theatrical cut would never be released again because the original print was modified in the process of making the special version.
Told you so, Lucas fanboys who said it was impossible!
You missed the part where DeepSeek uses a separate inference engine to take the LLM output and reason through it to see if it makes sense.
No, it’s not perfect. But it isn’t just predicting text the way AI did a couple of years ago.
LLMs can now generate answers. Watch this:
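(For illustration, here’s a rough Python sketch of the generate-then-check loop I’m describing. This is not DeepSeek’s actual internals; generate() and critique() are hypothetical stand-ins.)

    # Rough sketch of a generate-then-verify loop; NOT DeepSeek's
    # real architecture. generate() and critique() are stand-ins.
    def generate(prompt):
        return "draft answer to: " + prompt

    def critique(prompt, draft):
        # A second pass reasons about the draft and flags problems.
        ok = draft.startswith("draft answer")  # stand-in for a real check
        return ok, "" if ok else "revise"

    def answer(prompt, max_rounds=3):
        draft = generate(prompt)
        for _ in range(max_rounds):
            ok, feedback = critique(prompt, draft)
            if ok:
                return draft
            draft = generate(prompt + " (feedback: " + feedback + ")")
        return draft

    print(answer("why is the sky blue?"))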
There’s CAMM2, the new standard for high-speed removable memory. ASUS has already released a motherboard that uses it, and it matches the 8000 MT/s of the Framework, which won’t be out until Q3 this year.
Framework chose non-upgradeable memory because it was easier/cheaper. That’s fine, except Framework’s entire marketing has been built around upgradeable hardware.
With Minecraft’s current monthly updates, Paper is always behind on the latest features. There are also minor problems that Paper’s performance improvements introduce.
Years ago Paper was critical for a good Minecraft experience, but a newish PC (less than six years old) runs vanilla great.