

I can likely fall into some version of a category of learned professional. IMO, it’s fine, many of us have made our migration to Lemmy. Reddit can burn.
Some IT guy, IDK.
Bitch please.
Just try to install a program that isn’t in the repo without dropping to the CLI.
I dare you, I double dare you.
I’m pretty sure they’re saying that customization, while present in Linux, is not accessible to most because of a lack of GUI options to configure a nontrivial number of the customization settings.
This is correct. I work with the “average user” of technology daily as IT support, and honestly, they don’t give any shits at all about why it’s messed up, or what needs to be done to correct the problem. Box broken, make fix.
The argument I think the poster is trying to make is that if a user needs to do any self-troubleshooting, which is basically inevitable with technology at the moment, having to use a CLI to get things done is undesirable for the average person. They barely want to bother opening Control Panel in Windows (or the new “Settings” app… Ugh.), never mind understand any of it.
Box broken. Make fix.
IMO, this is a demonstration of the problem. You’re blaming the poster/their equipment. Rather than any real solution to the problem, the de facto answer is “well, it works for me, so what’s wrong with you?”
I’ve never heard this kind of toxicity from other communities (like the apple/Windows crowds). Often you’ll get useful answers indicating what to check or pointing to another resource. There’s always the chance that the hardware is busted, but let’s face it, in the modern era, that’s far less likely to happen now than it was even 10 years ago.
Immediately blaming the user for their issue isn’t going to solve the problem, nor does it endear any average user to the Linux community or the Linux OS. This attitude is not going to help adoption even if the poster’s concerns are invalidated by newer/better drivers/software, and all they need to do is update and/or try again.
This kind of statement actively harms Linux adoption.
Exactly this. And pretty much everyone here is a techie in some way, shape, or form.
Why does anyone think that a non-tech would take the time to troubleshoot their system the way we do? A user would hit their first issue and in the process of trying to solve it, just go and buy a MacBook.
This isn’t going to endear people to Linux.
We will not win the majority of the market with Linux in its current form. We need better integration and package management. Self-repairing subsystems. We need Linux to basically fix itself when these ridiculous issues come up that non techs simply can’t be arsed to try to fix.
There’s a long way to go before pushing Linux on anyone outside of tech circles. Unless you want to be the 24/7 free tech support, it’s easier just to throw a cheap Windows system or Mac at them and let them deal with it instead.
I hate the term “it just works” because it’s almost never true, but I can say that for non techs, Windows and Mac “just work” more often than Linux does.
I love Linux. I love everything about it: the origin story, the ability to make your system lean and clean, running at optimal performance, and being able to adjust every knob and setting to my heart’s content. I love it. But I’m a realist. All the things I love about Linux are largely reasons that non techs would hate Linux.
I can’t tell you how pissed I was when they did that. They invalidated so many links to solutions.
Granted, there was a lot of useless slop on there too, mostly from EOL versions of Windows like 2000 and Millennium Edition…
They threw all of it away, both good and bad, without warning. Without any opportunity for anyone to archive it. WTF Microsoft.
To their credit, their new documentation seems to be much better, they actually have useful help articles on not only how to do something, but also explaining the mechanisms, requirements and limitations of things. Not everything is in their new docs but I have to give credit where it’s due, the technical document writers are doing good work.
With all that being said, it doesn’t mean that Windows, or Microsoft are on a good trajectory.
Their newest operating systems and updates include some of the worst changes I’ve seen to their systems. Adding ads and basically spying on paying customers…
There are some controversial changes I’m in favor of, like the TPM requirement. A lot of people don’t realize it, but Apple has integrated a TPM into basically everything they’ve made over the years. The migration was slow, but it happened almost silently, without anyone really noticing. All major smartphones have some version of a TPM, so the last bastion of not having/needing one is the PC market.
The PC market has known they should include this stuff for years before Windows 11 was released. If you go and look at mid to high end motherboards, even for custom/retail units, there are at least TPM headers on most of them. OEMs knew this was coming and instead of just integrating it into their product, like everyone else did, they made it an optional feature. Since nobody knew what the fuck a TPM is, nobody bought into that option. Now millions of computers are destined for ewaste because manufacturers couldn’t be bothered to add a small IC to the system without being obligated to do so by someone like Microsoft. An entire industry of technology has this one thing that nobody even fucking knows exists, and they’re the hold out.
… And everyone is mad at Microsoft about it.
I’m not. TPM chips are a good addition to systems. It shouldn’t even be a debate. I blame OEMs for not bothering to add them when they could have/should have, and for not making them standard on all prebuilts, all retail motherboards, all boutique systems, all custom builds… Everything. The cost difference would have been into the tens of dollars at most. It would have barely made any difference at all.
Anyways. I’ll stop now.
This is the problem I see with most people adopting Linux.
It’s great when it works but when things go awry you end up sinking hours of time into an issue. Generally on Windows or Mac, the most you’ll have to do is remove it and re-add it.
If more is needed, the userbase is so large that there’s a high probability that someone has had your exact issue and posted a solution about it somewhere online, you just need to go and find it.
Linux is very hit and miss on a lot of these points. Sometimes it’s great, sometimes it sucks.
Windows tends to suck all the time, but the vast majority of the time it only sucks a little bit, because it’s Windows… It works, but it’s not great.
I’m all for Linux, but as someone who is more interested in doing useful work on my computer, not troubleshooting my system to get it to operate at all, I’ve stuck to Windows for a while now. I support Linux and prefer it to alternatives when running any server-based service, but for my desktop? I can’t justify the time investment in getting it to the same operational level as my current Windows install.
This is the same reason I bought a Dell, knowing full well that I could get more performance and better value by building my own system. I absolutely can build a system for myself; I choose not to because it’s simply more work that I don’t care to spend time on. To be fair, my system is a Precision 2RU HEDT, but that’s another discussion entirely.
Please don’t get me wrong: Linux is great and should see more adoption. My argument is that there’s a nontrivial number of people who want a system that simply operates, not one that turns into a science project because of a borked update. Windows updates have caused problems, but usually not everything-is-broken type problems… More that printing doesn’t work, or something like that…
IT guy checking in.
The only time I’ve even seen drive temp sensor alarms is on server RAID arrays and similar enterprise hard drives/SSDs… Never in my life have I seen one available on a consumer device, nor have I seen a drive temp alarm actually go off. It just doesn’t happen.
IMO, this is one of those language barriers where people call their computer chassis (and everything in it) the “hard drive”.
Applying that assumption, their updated statement is: his computer overheated.
Idk what kind of shit system he’s running on that 60k rows would cause overheating, but ok.
I’m not sure they’re going to be able to give those things away.
Seems to me, at this point, this is a bit like owning anything with a swastika on it in 1946.
I know a lot of people who use and like Brother printers. Years ago the go-to was HP, then it was Xerox for a while when they had decent small-format printers, but they seem to have gone back to their roots of large multifunction printers for the most part and priced themselves out of most markets. They’re still good, but you pay for the name.
Toshiba’s printing division was absorbed by Xerox, no help there. Dell… Has printers? I guess?
Brother is kind of the standout. Almost everything else you can buy as a consumer is HP, which went completely nuts on the whole “genuine” printer ink/toner thing, which is why a lot of people ran away screaming. The quality of the printers declined as they tried to force people into what is basically printer ink as a service. Stupid.
But yeah. Brother is a decent mix of functional, affordable, and being low on the bullshit of using a printer. … That is, as long as the article isn’t a sign of things to come…
I’m hoping that by the time I need a replacement for what I have right now, there will be something open source… Cries for an open alternative to the current printer market have been ongoing pretty much anytime printers are mentioned. I expect someone is, or will be developing something to the effect of an open source hardware printer.
To be fair, this is also me when I look at a network setup years later. (I do IT with a specialty in networks)
Quick story, I bought a bubble jet printer for college in the mid 2000s, with all the fixings.
I set it up and got it working and promptly never used it. Almost all of my courses allowed either digital submissions or provided the printouts you actually needed, like course work that you would fill out. So I basically wasted my money, especially considering I could always use the large format printers at the school for like 5 cents per page.
Anyways. I did a few test prints and everything was fine and I got to work in college. Almost every time I needed the printer in order to actually print something, I more or less had to go and buy new ink. At first I was like “I guess I printed more than I thought?” But it kept happening. I would print maybe twice a year. Eventually I stopped using it as a printer (it was a multifunction, so I kept it as a scanner), and just used the printers at school. It was cheaper, considering the fact that printer ink is worth more by volume than basically any other substance; and while I was only buying a small amount, maybe $20 or so (adjusted for inflation, this is probably like $50 today) each time, it was a lot for a broke college student.
After college, I picked up a random laser printer; the printer cost more up front (I got another multifunction, but this time with a network port, because I’m a nerd). I basically never bought any toner for it, given how little I had to print year over year, and it was always ready to go. I had it for years until a new Windows version (maybe the OG Windows 10? Maybe Windows 8/8.1) made the drivers stop working and the manufacturer wouldn’t make drivers for that model that worked with the new requirements from Windows… I ran a little print server for a bit to give it some more longevity, but ultimately it had to go to the IT storage in the sky.
Why thermal? Seems odd, but alright.
I recommend laser for just about everyone.
Don’t print much? Get a laser. Otherwise your ink will dry out and you’ll have to get new ink every time you want to print.
Print a lot? Laser. Super reliable, can do tens of thousands of sheets before there’s a problem, maybe more.
In fact, the only time I’d recommend an ink printer is for color accurate work like photo printing, and if you’re not using photo paper for it, then there’s not really much of a point, is there?
I used to think bubble jet/ink jet was the shit, then I started working in IT professionally and discovered the truth.
Just buy a laser printer, folks. Don’t bother with all the rest of this shit. If you want/need inkjet, then you already know you need it and why. If you’re not sure, get a laser. You’ll pay wayyyyyy less on materials to keep it running.
1000x this.
We’ve got all this figured out for 3D printers with all kinds of cool tools to make the job easier, and yet, take away a dimension and there’s crickets?
The hell?
Let’s make a 3D printable 2D printer.
Similar to Florida?
Well. I think I’m officially out of touch with the newest generation’s slang terms. I only understood about half of that.
A small but important distinction.
Thank you for the information. Have a good day.
AFAIK: Gentoo used to be just source repos, but times have changed. Gentoo repos now have binaries. You can opt out of them, so it’s up to you.
With binaries, it works like any other distro. Download the updated binaries, install, done.
If you go from source, then it will download all the source code, and do the whole makefile thing, and install the new binaries when the compile is done, every time you do an update.
So the direct answer to your question is: it depends. If you’re compiling everything then yes, you need to recompile everything that is updated. If you’re going to opt for binaries in the package manager, then no.
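To make the binary-vs-source choice concrete, here’s a minimal sketch of opting into Gentoo’s official binary packages. The binhost URI shown is illustrative (it depends on your architecture and profile), so check the Gentoo binary package guide before copying it verbatim.

```shell
# /etc/portage/binrepos.conf/gentoobinhost.conf -- point Portage at a binhost.
# The exact sync-uri varies by arch/profile; this one is an example only:
#   [binhost]
#   priority = 9999
#   sync-uri = https://distfiles.gentoo.org/releases/amd64/binpackages/23.0/x86_64/

# In /etc/portage/make.conf, prefer prebuilt packages and require signatures:
#   FEATURES="getbinpkg binpkg-request-signature"

# Update the system: Portage fetches binaries where the binhost has them
# and falls back to compiling from source for everything else.
emerge --ask --update --deep --newuse @world

# One-off: pull a binary for a single package if one exists (-g = --getbinpkg).
emerge -g app-editors/vim
```

With `getbinpkg` unset, the same `emerge` commands do the traditional thing: fetch source, compile, and install, which is exactly the “recompile everything that is updated” case described above.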
Narrator: they didn’t.