• 1 Post
  • 440 Comments
Joined 2 years ago
Cake day: June 29th, 2023

  • More power to the rear makes sense because you get more traction at the rear under normal acceleration, not just when carrying a load. It’s pretty typical of electric cars to do this, just like it’s typical to have bigger brakes on the front of all cars, because there’s more traction at the front under braking.

    There’s also the issue of torque vectoring. Without a differential, torque vectoring is essential, but under acceleration torque vectoring at the rear wheels is much more effective than at the front, so that’s another reason to split the power between two rear motors while leaving the front undivided.


  • He proposed a moon cannon. The moon cannon was wrong, as wrong as thinking an LLM can have any fidelity whatsoever. That’s all that’s needed for my analogy to make the point I want to make. Whether rockets count as artillery or not really doesn’t change that.

    Cannons are not rockets. LLMs are not thinking machines.

    Being occasionally right like a stopped clock is not what “fidelity” means in this context. Fidelity implies some level of adherence to a model of the world, but the LLM simply has no model, so it has zero fidelity.


  • Interesting article, but you have to be aware of the flipside: “people said flight was impossible”, “people said the earth didn’t revolve around the sun”, “people said the internet was a fad, and now people think AI is a fad”.

    It’s cherry-picking. They’re taking the relatively rare examples of transformative technology and projecting that level of impact and prestige onto their new favoured fad.

    And here’s the thing, the “information superhighway” was a fad that also happened to be an important technology.

    Also the rock argument vanishes the moment anyone arrives with actual reasoning that goes beyond the heuristic. So here’s some actual reasoning:

    GenAI is interesting, but it has zero fidelity. Information without fidelity is just noise, and information work requires fidelity, so a system that can’t solve the fidelity problem can’t do information work.

    And “fidelity” is just a fancy way of saying “truth”, or maybe “meaning”. Even as conscious beings we haven’t really cracked that issue, and I don’t think you can make a machine that understands meaning without creating AGI.

    Saying we can solve the fidelity problem is like Jules Verne in 1867 saying we could get to the moon with a cannon because of “what progress artillery science has made during the last few years”. We’re just not there yet, and until we are, the cannon might have some uses, but it’s not space technology.

    Interestingly, artillery science had its role in getting us to the moon, but that was because it gave us the rotating workpiece lathe for making smooth bore holes, which gave us efficient steam engines, which gave us the industrial revolution. Verne didn’t know it, but that critical development had already happened nearly a century prior. Cannons weren’t really a factor in space beyond that.

    Edit: actually metallurgy and solid fuel propellants were crucial for space too, and cannons had a lot to do with that as well. This is all beside the point.


  • And any time I see anyone advocating this crap, it’s always because it gets the job done “faster”. But the rule is “fast; cheap; good; pick two”, and LLMs don’t break that rule.

    Yeah, they get it done super fast, and super shitty. I’ve yet to see anyone explain how an LLM gets the job done better, not even the most rabid apologists.

    LLMs have zero fidelity, and information without fidelity is just noise. It is not good at doing information work. In fact, I don’t see how you get information with fidelity without a person in the loop, like on a fundamental, philosophical level I don’t think it’s possible. Fidelity requires truth, which requires meaning, and I don’t think you get a machine that understands meaning without AGI.


  • Yup, and honestly even according to that anti-art logic it was a strategic failure. Funny meme gifs were part of how the game gained notoriety, but you don’t maintain a game long term on meme status alone.

    Even if “haha funni physics glitches” were still the in thing - I think people got over them fast, like with any comedy style - the longevity of the game came from the deep mechanics and impressive missions people could do, and the community support.

    I actually think that sequels to breakout sandbox games are always doomed to fail. Like, what if they tried to release Minecraft 2? It would be awful, and I think we all instinctively know it would be, which is kind of a self-fulfilling prophecy.

    Minecraft doesn’t have a monopoly on the special sauce that makes their game good. It has a decade and a half of support and cultural recognition from a dedicated following. You can’t make that happen a second time. I don’t like what’s been done with the franchise commercially, but they figured out how to milk it without doing a direct sequel, which I think is part of why it’s still relevant.


  • If you like factory designing games, I can recommend anything by Zachtronics.

    They’re all esoteric programming/automation type puzzle games, and they all have their own unique solitaire games built-in for whenever you get tired of the main game.

    My personal favourites are SpaceChem - scifi molecule factories - and Opus Magnum - steampunk alchemical molecule factories. Something about the molecules just works for me, don’t know why. Plus the Opus Magnum solitaire game is really unique and fun, and it has a user-made level feature, so you can keep playing.

    Last Call BBS is a collection of minigames they made as their final release before shutting up shop, so it’s a lot more casual than the others, but a lot of fun.