

Yeah, but it can definitely be fun to mess around with. Linux Mesa supports the 40 series. I wonder how one would do in a BE-PowerPC system of around the same era?
Hmm, sure.
I am not optimistic about it, but considering the Index (both the kit and the controllers) is sold out, it wouldn’t surprise me if they released a 2nd gen and announced HL3 alongside it.
Linux uses swap for more than overflow; the kernel swaps out idle pages to keep RAM free for caches and to help compact (effectively defrag) fragmented memory. Could be related to that. In any case this is pretty normal. I have 43GiB available and it is using 700MiB of swap.
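If you want to see this yourself, these should work on pretty much any Linux with procfs (just a quick sketch; exact numbers will obviously differ per system):

    free -h                        # RAM vs swap usage at a glance
    cat /proc/sys/vm/swappiness    # how eagerly the kernel swaps idle pages (default is usually 60)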
I agree. There are times when I speak to someone with an unfamiliar accent and my brain has trouble processing it. They can be speaking fluent English, but I still struggle.
Wouldn’t it be better to just stop acknowledging Tesla and stop pretending that it is a premium brand?
It is easier. Raster-based rendering uses a lot of cheats to look realistic, and the devs have to come up with and write every one of them. To ray trace at 4K you need to calculate over 8 million rays plus their bounces. Very processor intensive, but much simpler from a programming point of view.
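Quick back-of-the-envelope, assuming the usual baseline of one primary ray per pixel:

    3840 x 2160 = 8,294,400 pixels ≈ 8.3 million primary rays per frame
    x 60 fps ≈ 500 million rays per second, before any bounces or extra samples per pixel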
Now, the difference between some of those settings on low and ultra isn’t really worth it in some cases, as they look basically the same.
Overall these ray tracing methods are leading to more realistic raster-based rendering, as the devs can actively see the “problem” areas and create or fix the cheats so they look better.
Glad they came to their senses. I had trouble setting one up, as I have/had a Sony group account with no security info, which is needed to reset the password. Turns out Sony doesn’t strip periods from the username part of an email address, so I just ended up adding a couple to my email address for a new account.
Not sure why you are so worried. Based on those requirements, a Radeon RX 6800 should be able to do ultra 4K @ 60. The main differentiator seems to be VRAM.
Quantum computing. The nuclear fusion of computers. I wonder how long they will be 10 years away.
I think this is what I am thinking of. Kind of a predecessor of modern machine learning.
I remember that as well.
Edit: moved comment to correct reply.
This isn’t exactly new. I heard a few years ago about a situation where an AI-designed chip had these wires that shouldn’t have done anything, as they didn’t go anywhere, but if they were removed the chip stopped working correctly.
For a full 32TB at the max sustained speed (275MB/s), it’s 32ish hours to transfer the full amount, 36 if you assume 250MB/s the whole run. Probably optimistic; CPU overhead could slow that down in a rebuild. That said, in a RAID5 of 5 disks that is a combined transfer speed of about 1GB/s even if you assume not getting close to the max rate. For a small business or home NAS that would be plenty unless you are running faster than 10 gigabit Ethernet.
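Showing my math, assuming the drive actually sustains those rates end to end:

    32 TB / 275 MB/s ≈ 116,400 s ≈ 32.3 hours
    32 TB / 250 MB/s = 128,000 s ≈ 35.6 hours
    RAID5 with 5 disks = 4 data disks x ~250 MB/s ≈ 1 GB/s, while 10GbE tops out around 1.25 GB/s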
I wouldn’t mind a different city, or multiple cities.
On top of the thermal paste idea, try running a RAM test. You could also try adding “nvidia_drm.modeset=0” to your kernel command line, or adding “options nvidia-drm modeset=0” in /etc/modprobe.d/nvidia.conf. It might have that set to 1 somewhere.
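You can check what it is currently set to before changing anything; this sysfs path should exist whenever the nvidia_drm module is loaded:

    cat /sys/module/nvidia_drm/parameters/modeset    # prints Y or N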
There may be more in the logs as well. Are you able to test in Windows? Maybe the GPU is failing.
I can’t tell from that output if you are using Wayland or X11. If it is Wayland, I’d try X11, as the drivers before 555(?) don’t work as well and some compositors may be more unstable with the Nvidia drivers.
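A quick way to check which one you are on (this variable is set in most desktop sessions):

    echo $XDG_SESSION_TYPE    # prints wayland or x11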
I’d also try checking the logs. ‘journalctl -b -1 -xe’ will show the prior boot.
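If that is too noisy, you can filter by priority; these are standard journalctl flags:

    journalctl -b -1 -p err    # only error-level and worse messages from the previous boot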
Those are my only other suggestions.
If I am understanding that output, your BIOS is 5 years old and there have been a LOT of updates since. If that is correct, I’d try updating, as BIOS updates fix things like PCIe compatibility with hardware released after that date.
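If you want to double-check the BIOS version and date yourself, dmidecode can read them (needs root, and assumes your distro ships dmidecode):

    sudo dmidecode -s bios-version
    sudo dmidecode -s bios-release-date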
I wasn’t saying AMD would shut down, but that Intel would take market share from them before truly affecting Nvidia’s market share, i.e. AMD and Intel would be fighting over the same 25ish% of the PC market.
Honestly, I think it is because of DLSS. If you can get a $300 card that can do 4K DLSS Performance well, why would you need to buy a xx70 (Ti) or xx80 card?