I once met a person who never drank water, only soft drinks. It’s not the unhealthiness of this that disturbed me, but the fact that they did it without the requisite paperwork.

Unlike those disorganised people, I have a formal waiver. I primarily drink steam and crushed glaciers.

  • 0 Posts
  • 42 Comments
Joined 2 years ago
Cake day: June 14th, 2023

  • Ooh, thank you for the link.

    “We can leverage it [ray tracing] for things we haven’t been able to do in the past, which is giving accurate hit detection”

    “So when you fire your weapon, the [hit] detection would be able to tell if you’re hitting a pixel that is leather sitting next to a pixel that is metal”

    “Before ray tracing, we couldn’t distinguish between two pixels very easily, and we would pick one or the other because the materials were too complex. Ray tracing can do this on a per-pixel basis and showcase if you’re hitting metal or even something that’s fur. It makes the game more immersive, and you get that direct feedback as the player.”

    It sounds like they’re assigning materials based on the pixels of a texture map, rather than making each mesh in a model a different material, i.e. you paint materials onto a character rather than selecting chunks of the character and assigning a material to each.

    I suspect this either won’t be noticeable to players at all, or it will be a very minor improvement (at best). It’s not something worth going for in exchange for losing compatibility with other GPUs. It will require a different work pipeline for the 3D modellers (they have to paint materials on now rather than assign them per-mesh), but that’s neither here nor there; it might be easier for them or it might be hell-awful depending on the tooling.

    This particular sentence upsets me:

    Before ray tracing, we couldn’t distinguish between two pixels very easily

    Uhuh. You’re not selling me on your game company.

    “Before” ray tracing, the technology that has been around for decades, and that could have handled this exact material-sensing task on a CPU or GPU, without players noticing, for around 20 years: interpolate UVs across the colliding triangle and sample a texture (there’s a rough sketch at the end of this comment).

    I suspect the “more immersion” and “direct feedback” are veils over the real reasoning:

    During NVIDIA’s big GeForce RTX 50 Series reveal, we learned that id has been working closely with the GeForce team on the game for several years (source)

    With such a strong emphasis on RT and DLSS, it remains to be seen how these games will perform for AMD Radeon users

    No one sane builds Nvidia-exclusive or AMD-exclusive (or anyone else’s) libraries into their game unless they’re paid to do it. A game dev that cares about its players will make their game run well on all brands and flavours of graphics card.

    At the end of the day this hurts consumers. If games run competitively on every GPU brand, you have more choice and the card companies have a stronger incentive to compete. Whatever Nvidia is paying the game devs to do this must be smaller than what it earns back from consumers buying its cards instead of a competitor’s.
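
    The pre-ray-tracing approach mentioned above, as a rough sketch in Python: interpolate the hit point’s UV from the triangle’s vertex UVs and look it up in a material-ID mask. Every name here (MATERIALS, material_mask, and so on) is invented for illustration; it’s not from id’s engine or anyone else’s.

    ```python
    # Hypothetical per-pixel material lookup without ray tracing: interpolate
    # UVs across the hit triangle, then sample an artist-painted material mask.

    MATERIALS = {0: "metal", 1: "leather", 2: "fur"}  # invented material IDs

    def material_at_hit(tri_uvs, bary, material_mask, mask_w, mask_h):
        """tri_uvs: (u, v) for each of the triangle's three vertices.
        bary: barycentric coordinates of the hit point (they sum to 1).
        material_mask: 2D grid of material IDs painted by the artist."""
        # Interpolate the UV coordinate at the hit point.
        u = sum(b * uv[0] for b, uv in zip(bary, tri_uvs))
        v = sum(b * uv[1] for b, uv in zip(bary, tri_uvs))
        # Nearest-texel sample from the mask (no filtering needed for IDs).
        x = min(int(u * mask_w), mask_w - 1)
        y = min(int(v * mask_h), mask_h - 1)
        return MATERIALS[material_mask[y][x]]

    # Example: a 4x4 mask whose left half is metal and right half is leather.
    mask = [[0, 0, 1, 1] for _ in range(4)]
    print(material_at_hit([(0.1, 0.1), (0.9, 0.1), (0.5, 0.9)],
                          (0.8, 0.1, 0.1), mask, 4, 4))  # -> metal
    ```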


  • “really flashy guns and there is a very intricate damage system that runs at least partially on the GPU.”

    Short opinion: no, CPUs can do that fine (possibly better), and it’s a tiny corner of game logic.

    Long opinion: intersecting projectile paths with geometry gains nothing from being moved from the CPU to the GPU unless you’re dealing with a ridiculous number of projectiles every single frame. In most games this is less than 1% of CPU time, and moving it to the GPU will probably reduce overall performance due to the latency costs (…but a lot of modern engines already have awful frame latency, so it might fit right in).

    You would only do this if the higher-ups have told you that you have to, OR if you have a really unusual new game design (thousands of new projectile paths every frame, i.e. hundreds of thousands of bullets per second). Even detailed multi-layer enemy models with vital components are just a few extra traces per shot; using a GPU to calculate that would make the engine dev’s job harder for no gain (a minimal sketch of the per-ray cost is at the end of this comment).

    Fun answer: check out CNlohr’s noeuclid. Sadly there’s no Windows build (I tried cross-compiling but ended up in dependency hell), but it still compiles and runs under Linux. The physics run on the GPU and the world geometry is very non-traditional. https://github.com/cnlohr/noeuclid
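
    As a ballpark for how cheap the CPU path is: a standard Möller-Trumbore ray/triangle test in plain Python, as a throwaway sketch. None of this is from any particular engine; a real engine would do it in C++ with a BVH in front of it, which only makes it cheaper.

    ```python
    # Minimal Möller-Trumbore ray/triangle intersection: this is the whole
    # per-ray cost of CPU-side hit detection against a single triangle.

    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def cross(a, b):
        return (a[1] * b[2] - a[2] * b[1],
                a[2] * b[0] - a[0] * b[2],
                a[0] * b[1] - a[1] * b[0])

    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]

    def ray_hits_triangle(origin, direction, v0, v1, v2, eps=1e-7):
        """Return the distance along the ray to the hit, or None on a miss."""
        edge1, edge2 = sub(v1, v0), sub(v2, v0)
        pvec = cross(direction, edge2)
        det = dot(edge1, pvec)
        if abs(det) < eps:          # ray is parallel to the triangle plane
            return None
        inv_det = 1.0 / det
        tvec = sub(origin, v0)
        u = dot(tvec, pvec) * inv_det
        if u < 0.0 or u > 1.0:
            return None
        qvec = cross(tvec, edge1)
        v = dot(direction, qvec) * inv_det
        if v < 0.0 or u + v > 1.0:
            return None
        t = dot(edge2, qvec) * inv_det
        return t if t > eps else None

    # A shot fired straight down the z-axis into a triangle facing the camera.
    print(ray_hits_triangle((0, 0, -1), (0, 0, 1),
                            (-1, -1, 0), (1, -1, 0), (0, 1, 0)))  # -> 1.0
    ```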


  • Meanwhile the fan PC port is absolutely amazing. I couldn’t play my copy of PD on my actual N64 because the low framerate made me motion sick, but the fan-made PC port runs smoothly.

    This reminds me of what happened with the re3 and revc (GTA III and GTA Vice City) projects. Fans fixed so much in those games, in their spare time, and published it as a patch (so you still had to own the games). Take-Two DMCA’d and sued them just before releasing their maligned “GTA Trilogy”. I wonder if Microsoft would have done the same before releasing new Perfect Dark content?



  • I did this previously by using MultiMC (now PrismLauncher): make a mix of mods, send them the zip of the whole game instance, and ask them to drag and drop it onto MultiMC. The biggest issue I encountered was one family member having a black in-game world until they changed a setting in the graphics mod; otherwise it didn’t seem to be too hard.

    By comparison, Minetest is much easier for playing mods with family: everyone downloads the server’s mods when they join. But the interest is lower.


  • “10 sec”

    Filthy casual.

    My family has a Chiq U58G7P. Warm boots take about 30 sec and cold boots a few minutes.

    Don’t try to press any buttons on the remote during this time, or for a minute or two afterwards. They might work 15 seconds later, or they might get ignored. Sometimes your button presses get re-ordered too.

    Factory resets do work, but then all it can do is broadcast TV. If you let it update and install streaming apps then you will be back to the same problems.

    I suspect that it might be running out of RAM and thrashing some poor innocent MMC as swap, but I can’t find a USB ADB port to properly find out (maybe it has one internally?). If I ever get at it, the sketch below is roughly what I’d check.
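
    A rough sketch only: it assumes adb is installed on the PC, the TV accepts the connection, and its /proc/meminfo looks like a normal Linux one. Nothing here is specific to this TV model.

    ```python
    # Poll /proc/meminfo over adb and watch for swap use creeping up.
    # Assumes `adb` is on PATH and the TV exposes a reachable debug bridge.
    import re
    import subprocess
    import time

    FIELDS = ("MemAvailable", "SwapTotal", "SwapFree")

    def meminfo():
        out = subprocess.run(["adb", "shell", "cat", "/proc/meminfo"],
                             capture_output=True, text=True, check=True).stdout
        pairs = re.findall(r"(\w+):\s+(\d+) kB", out)
        return {k: int(v) for k, v in pairs if k in FIELDS}

    while True:
        info = meminfo()
        swap_used = info.get("SwapTotal", 0) - info.get("SwapFree", 0)
        print(f"available {info.get('MemAvailable', 0)} kB, "
              f"swap used {swap_used} kB")
        time.sleep(5)
    ```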