

Architect, so in the neighborhood… I mostly interact with UL in the context of fire-rated assemblies, though.
Laboratory planner by day, toddler parent by night, enthusiastic everything-hobbyist in the thirty minutes a day I get to myself.
I recently switched phones and forgot I didn’t have an adblocker installed yet. Clicked on an article and holy shit the modern mobile web is a toxic hellscape without it…
In fairness(?) Ford bet big on small cars in the wake of the Great Recession, and that worked well for a while, but by the time they decided that the only non-truck (from a CAFE standpoint) that they were going to keep selling was the Mustang, they were losing money on every Focus and Fiesta they sold.
A lot of that was their godawful automatic transmission that was forcing them to spend zillions in warranty repairs, but at the end of the day the margin on economy cars is so slim that you can’t afford to make mistakes. Rather than bet on perfect execution in a market that was already shrinking in the US, they decided to focus on higher-margin products… and that’s fine in the short term, but as you mention it’s going to leave them exposed once nobody can afford to spend $50k+ on a horrifically overpriced big pickup anymore.
In AEC work we’ve moved almost exclusively to a competing PDF tool called Bluebeam, which is proprietary but well worth the price, with tools for scaling, dimensioning, and producing material takeoffs from PDF drawings. Much of what you’d use Acrobat for in a more typical office environment is absent or limited, though.
All the people mentioned in the article are alt-right lunatics and/or Trumpworld grifters. The only other place they might conceivably take their schtick is Truth Social – this is really only interesting as confirmation that the thin-skinned and insecure FrEe SpEeCh AbSoLuTiSt running that shithole is absolutely willing to silence anybody who annoys him, over the pettiest of disputes, regardless of political affiliation.
Fair point. My thrust was more that things like system boot times and software launch speeds don’t benefit as much as they seem like they should when moving from, say, a good SATA SSD (peak R/W speed: 600 MB/sec) to a fast M.2 drive with listed speeds 20+ times higher, because QD1 performance of that M.2 drive might only be 3 or 4 times better than the SATA drive. Both are a big step up from spinning media, but the gap between the two in random read speed isn’t big enough to make a huge subjective difference in many desktop use cases.
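To make that arithmetic concrete, here’s a back-of-envelope sketch in Python. All the drive numbers below are illustrative assumptions (not benchmarks of any real product), but they show why QD1 random throughput, not the headline sequential number, dominates perceived launch time:

```python
# Rough model: an app launch mixes sequential reads with small random
# reads, and the random portion runs at QD1 random speed, not the
# headline sequential speed. All numbers here are assumed for illustration.

def launch_time(total_mb: float, random_fraction: float,
                seq_mbps: float, qd1_random_mbps: float) -> float:
    """Estimated seconds to read an app's data, split between
    sequential I/O and QD1 random I/O."""
    seq_mb = total_mb * (1 - random_fraction)
    rand_mb = total_mb * random_fraction
    return seq_mb / seq_mbps + rand_mb / qd1_random_mbps

# Hypothetical SATA SSD vs. a Gen5 M.2 with ~20x the sequential speed
# but only ~2x the QD1 random speed (assumed figures):
sata = launch_time(500, 0.7, seq_mbps=550, qd1_random_mbps=40)
gen5 = launch_time(500, 0.7, seq_mbps=12000, qd1_random_mbps=90)
print(f"SATA SSD: {sata:.1f}s, Gen5 M.2: {gen5:.1f}s")
# Subjective gap ends up roughly 2-3x, nowhere near the 20x on the box.
```

Tweak `random_fraction` and the QD1 numbers to taste; the point is that the sequential term is tiny either way, so the random term sets the feel.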
Femme V really nails the emotional arc of V’s story in ways that the VA for masc V doesn’t, to the point that I’m truthfully less invested in my current playthrough (male street kid origin) than I was in my original (female corpo).
That said, everybody who did voice work in BG3 did fantastic work, in ways that have made other things I’ve touched since feel hit-and-miss – I nearly dropped Avowed because some mediocre early voice work had me worried about the overall quality. Neil Newbon has been rightfully getting acclaim for Astarion, and you have to hand it to him – they gave him the character, but he was the one who decided “I’m gonna chew the scenery so hard I shit splinters” and made it work so damn well.
The trouble with ridiculous R/W numbers like these is not that there’s no theoretical benefit to faster storage, it’s that the quoted numbers are always for sequential access, whereas most desktop workloads are more frequently closer to random, which flash memory kinda sucks at. Even really good SSDs only deliver ~100 MB/sec in pure random access scenarios. This is why you don’t really feel any difference between a decent PCIe 3.0 M.2 drive and one of these insane-o PCIe 5.0 drives, unless you’re doing a lot of bulk copying of large files on a regular basis.
It’s also why Intel Optane drives became the steal of the century when they went on clearance after Intel abandoned the tech. Optane is basically as fast in random access as in sequential access, which means that in some scenarios even a PCIe 3.0 Optane drive can feel much, much snappier than a PCIe 4.0 or 5.0 SSD that looks faster on paper.
Elon in his Cave Johnson era and we’re here for it
Look, I’m in no position to talk seeing as I once wrote a cron job in PHP, but the profusion of JavaScript in the late aughts and early teens for things that weren’t “make my website prettier!” feels very much like a bunch of “webmasters” dealing with the fact that the job market had shifted out from under them while they weren’t looking and rebranding as “developers” whose only tool was Hammer.js, and thinking all their problems could be recontextualized as Nail.js.
Point me towards systems that don’t have a human in the loop, particularly any that utilize fully-autonomous swarms, and I’ll agree. Scary as the former are, there’s a world of difference between a handful of FPV suicide drones, and a cloud of HL2-Manhack-esque things operating on face-recognition-guided autopilot.
I’ve low-key started to think the only reason we haven’t seen autonomous hunter-killer drones yet is that nobody’s willing to break the seal, and I’m scared for what happens when somebody finally does.
I was the last of my immediate family on Facebook, and I only stuck around to keep in touch with a couple hobby groups. I decided to cut the cord once Zuck went mask-off, and honestly I haven’t regretted it. The family group text is still chugging along fine, and most of the people I actually want to talk to are on other platforms at this point.
I don’t blame anybody who feels like they have to keep Facebook to stay in touch with loved ones… but man, it feels good not to have that spammy time suck on my phone anymore.
Nobara is just Fedora with a heavy layer of gaming-focused polish applied. In that regard it’s quite a bit more familiar than something like Arch, which makes a point of not holding anybody’s hand, and (just in terms of ease of use and overall userbase) feels a lot closer to what Gentoo was like back when I last was in this space.
I was heavily in the camp of Debian-based distros back in the day, but Debian proper has never been a great choice for desktop, and Ubuntu’s star is much faded of late, so I decided to give an RPM-based distro a chance before jumping way off into the deep end. I don’t have the time to fiddle that I used to, and (at least until yesterday’s hiccup) Nobara was much closer to “it just works” out of the box than anything like Arch would have been.
I’m an ex-sysadmin so I guess I get to be the middle head, but blundering my way through the current distro scene after not having touched a desktop Linux install in, oh… twenty years or so, I feel more like the right. I suppose on the one hand I had the good sense not to jump right into Arch or Nix, but even more familiar territory like Nobara has its pitfalls. Just today I had to clean up a botched release upgrade because the primary maintainer had left conflicting packages in the repository for an extended period. Not laying blame per se, that’s what you get when you sign on to a one-man effort, but it was a real pain in the butt to diagnose and correct.
Say it with me now: model collapse! I think this approach is especially insidious in that rather than dumping obvious nonsense into the training corpus that can then be scrubbed, it pushes the downstream LLM invisibly towards spontaneously imploding.
Keep in mind that when 10nm was in planning, EUV light sources looked very exotic relative to current tech, and even though we can see in hindsight that the tech works it is still expensive to operate – TSMC’s wafer costs increased 2x-3x for EUV nodes. If I was running Intel and my engineers told me that they thought they could extend the runway for DUV lithography for a node or two without sacrificing performance or yields, I’d take that bet in a heartbeat. Continuing to commit resources to 10nm DUV for years after it didn’t pan out and competitors moved on to smaller nodes just reeks of sunk-cost fallacy, though.
Intel’s problems, IMO, have not been an issue of strategy but of engineering. Trying to do 10nm without EUV was a forgivable error, but refusing to change course when the node failed over and over and over to generate acceptable yield was not, and that willful ceding of process leadership has put them in a hole relative to their competition, and arguably lost them a lucrative sole-source relationship with Apple.
If Intel wants to chart a course that lets them meaningfully outcompete AMD (and everyone else fighting for capacity at TSMC) they need to get their process technology back on track. 18A looks good according to rumors, but it only takes one short-sighted bean counter of a CEO to spin off fabs in favor of outsourcing to TSMC, and once that’s out of house it’s gone forever. Intel had an engineer-CEO in Gelsinger; they desperately need another, but my fear is that the board will choose to “go another direction” and pick some Welchian MBA ghoul who’ll progressively gut the enterprise to show quarterly gains.
This feels like complaints over asset flips bleeding over into first-party asset reuse, because the people complaining don’t understand why the former is objectionable. It’s not that seeing existing art get repurposed is inherently bad (especially environmental art… nobody needs to be remaking every rock and bush for every game), but asset flips tend to be low-effort, lightly-reskinned game templates with no original content. Gamers just started taking the term at face value and assumed the use of asset packs was the problem, rather than just a symptom of a complete lack of effort or care on the developers’ part.
Sorta yes and no. T-Mobile US is its own corporate entity, but their majority shareholder is Deutsche Telekom, and they take their name from that company’s mobile service brand.