Absolutely anything to avoid the metric system! If we, as Americans, have to measure in quotation marks, we damn sure will!
You bastard!
My first thought was: why have I never heard of this? The answer seems to be: 2" x 3" photo paper costs $0.41/ea.
Mail-order professional photo prints are quite a bit cheaper than that. So, the main thing this contends with is other instant photo creation methods, which have been pretty niche ever since digital cameras became good.
Ugh, why the hell aren’t those air-gapped?
Same thing in cars. Why is the infotainment system that is connected to the internet not air-gapped from the critical car functions?
These things aren’t hard to do. I guess we just need people to die before we take such basic safety measures.
I was able to pull out what used to be there:
Making sure their games run well with Wine is probably what many devs who specifically want to support Linux are doing. Right now the vast majority of games run out of the box on Wine, so there probably isn’t much a dev has to do if they want to make sure it runs great.
Why would you give up games to move to Linux? Been enjoying Cyberpunk and Guild Wars lately, and many other games over the last year. Honestly, at this point I don’t even check if games work with Linux, I just assume they do unless proven otherwise.
Check out Proton DB. Gives reports on how well things run. Anything Gold or higher is going to be a non-concern to play.
It was because developers historically were familiar with Windows and would just default to making a Windows product. You want a POS interface? Your developer is probably going to hand you a .exe and not a .deb. Then your next move is to tell the hardware division to put that .exe into production systems, at which point it is too late for the hardware division to argue that you just chose the more expensive option without thinking.
This is changing, particularly as many platforms make it trivial to compile for different OSes.
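This isn’t tied to any one stack, but as a minimal sketch of what “trivial” means here: in Go, targeting another OS is just an environment variable on the build command.

```go
// main.go — a trivial program; the exact same source builds for
// Windows or Linux with no code changes.
package main

import (
	"fmt"
	"runtime"
)

func main() {
	// runtime.GOOS/GOARCH report what the binary was compiled for.
	fmt.Printf("Running on %s/%s\n", runtime.GOOS, runtime.GOARCH)
}
```

`GOOS=windows go build` hands you a .exe, `GOOS=linux go build` hands you a Linux binary, both from the same source tree.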
In addition to what the other guy said, Mint is also more focused on desktop. A bunch of apps are pre-installed that one would expect on a desktop OS. Additionally, the default Mint UI, Cinnamon, feels very familiar to a Windows user. It has a start menu, task bar, tray, etc.
Debian is in the same family, but is more oriented toward servers. It is super minimal out of the box, which is perfect when you want it to sit in the other room and perform specific tasks. However, you can install all the same programs, even the Cinnamon UI, on Debian.
Really, the difference is the out-of-the-box experience; they are otherwise pretty similar.
Framework laptops can optionally come with no OS if you choose, and I can attest that their build quality is quite good.
I know there are some brands that will have Linux pre-installed, but I don’t know enough about them to comment.
Writing code is easier than understanding and reviewing another’s code. There is good reason code reviewers aren’t the interns and new hires.
My question to others is, why would you want to turn into a code reviewer for AI code? It’s a shitload harder. And if the goal is anything but a weekend project, you damn well better be understanding and reviewing it critically, otherwise you’re shitting up the code base and forcing others to clean up your mess.
ASRock is my go-to now. Funnily enough they split off of ASUS a while ago. One continually got better, and the other worse.
Edit: I was wrong about that last part. I thought they had split off, but apparently they are a subsidiary. Well, either way, they seem better.
The amount of damage a newbie programmer without a tight leash can do to a code base/product is immense. Once something is out in production, that is something you have to deal with forever. That temporary fix they push is still going to be in use a decade later, and if you break it, now you have to explain to the customer why the thing that’s been working for them for years is gone and what you plan to do to remedy the situation.
A newbie without a leash just pushing whatever an AI hands them into production. Oh boy, are senior programmers going to be sad for a long, long time.
Yeah, going along these lines. There is probably a USB header on the motherboard. These have pretty darn good speeds. You can get an adapter that lets you turn those into a USB-C port and then use a standard USB-C to Ethernet adapter. Something like this or this. No guarantee on either of those specific adapters being good though. Looks like slim pickings for such things and both of those are garbage brands.
If you have a USB-C port on the back of your motherboard, you can get an adapter for that directly.
Also, motherboards generally come with 2.5Gb/s ports now too. Some even have two. Something to consider.
Am I correct in understanding that the card will run at PCIe gen 3 X1 if I do this?
Correct. The situation you described in the original post would result in Gen 3 x1 speeds.
The interface will always default to the fastest standard that both sides can support. If one is gen 2 and the other is gen 4, gen 2 is the highest that can be supported. If one side is x8 and the other is x4, x4 is the highest that can be supported.
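A minimal sketch of that rule (the per-lane rates are the standard PCIe figures; the function itself is just illustrative):

```go
package main

import "fmt"

// perLaneGBps is the approximate usable bandwidth per lane, per
// direction, for each PCIe generation (encoding overhead included).
var perLaneGBps = map[int]float64{
	1: 0.25,  // 2.5 GT/s, 8b/10b encoding
	2: 0.5,   // 5 GT/s, 8b/10b encoding
	3: 0.985, // 8 GT/s, 128b/130b encoding
	4: 1.969, // 16 GT/s, 128b/130b encoding
}

// negotiate returns the link both sides settle on: the lower
// generation and the narrower lane width of the two.
func negotiate(cardGen, cardLanes, slotGen, slotLanes int) (gen, lanes int) {
	gen = min(cardGen, slotGen)
	lanes = min(cardLanes, slotLanes)
	return
}

func main() {
	// Gen 2 x8 card in a Gen 4 x1 slot -> Gen 2 x1 link.
	gen, lanes := negotiate(2, 8, 4, 1)
	fmt.Printf("Gen %d x%d, ~%.2f GB/s each direction\n",
		gen, lanes, perLaneGBps[gen]*float64(lanes))
}
```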
What can I do if the card is PCIe gen 2 x8?
If you put a Gen 2 x8 card in a Gen 4 x1 slot, you will get a Gen 2 x1 link.
File a small slit in the end of the slot so the card fits into it, but runs past the back. The card will run at Gen 3 x1 speed, but otherwise work properly.
Many motherboards even come with the end of the PCIe slots open for this exact purpose.
Edit: Gen 3 x1 runs at almost a full GB/s, so a 2.5Gb/s card (notice the change in size of the “B”) should have more than enough bandwidth on Gen 3 x1, even at 2.5Gb/s full duplex.
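To spell out that arithmetic: Gen 3 signals at 8 GT/s with 128b/130b encoding, so one lane carries 8 × (128/130) ÷ 8 ≈ 0.985 GB/s in each direction. A 2.5Gb/s NIC needs 2.5 ÷ 8 = 0.3125 GB/s per direction, and since PCIe is full duplex just like Ethernet, both directions fit with roughly 3x headroom.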
If you don’t have enough money to pay for delivery, you need to be picking it up yourself or, even better, cooking it yourself. DoorDash is horrendously expensive.
Assassin’s Creed Shadows takes place in Japan circa 1579…300 years before the advent of the incandescent light bulb
These little jabs are great.
Now the question is whether they will actually keep it around this time. This was a feature they introduced in Vista and killed soon after.
Or car infotainment software…which for some reason is on the same communications network as all of a car’s safety-critical systems…