• 1 Post
  • 111 Comments
Joined 2 years ago
Cake day: June 20th, 2023

  • Hell, pass init=/bin/yes and you’ll see even greater RAM savings!

    ❯ # sum the RSS column (KiB) of every systemd-suite process;
    ❯ # the [/] keeps grep from matching its own command line
    ❯ ps aux | grep '[/]usr/lib/sys' | awk '{print $6}' | sed 's/$/+/' | tr -d '\n' | sed 's/+$/\n/' | bc
    266516
    

    So that’s 260 MiB of RSS (assuming no shared libs, which is certainly false, so the real figure is lower; a tidier awk equivalent is sketched after this list) for:

    • Daemon manager
    • Syslog daemon
    • DNS daemon (which I need and would have to replace with dnsmasq if it did not exist)
    • udev daemon
    • network daemon
    • login daemon
    • VM daemon (ever hear of the principle of least privilege?)
    • user daemon manager (I STG anyone who writes a user daemon by doing nohup & needs to be fired into the sun. pkill is not the tool I should have to use to manage my user’s daemons)
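
    The promised awk equivalent, for anyone allergic to the sed/tr/bc dance (a quick sketch; matching on the command field also keeps the pipeline from counting itself):

    ❯ ps aux | awk '$11 ~ /^\/usr\/lib\/sys/ { sum += $6 } END { print sum }'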

    For comparison the web page I’m writing this on uses 117 MiB, about half. I’ll very gladly make the tradeoff of two sh.itjust.works tabs for one systemd suite. Or did you send that comment using curl because web browsers are bloated?

    For another comparison, 200 MiB of RAM is less than two dollars at current prices. I don’t value my time so low that I’ll save two bucks by spending hours debugging whatever bash-scripting spaghetti hell other init systems cling to in the name of avoiding “bloat”. I’ve done it, don’t miss it.


  • On the one hand, deanonymization attacks are never entirely avoidable against unhardened targets, and this one isn’t particularly sophisticated and leaks relatively little information.

    On the other hand, deanonymization attacks are always bad, and it’s a good reminder to people of the risks they are taking. This is also slightly non-obvious behavior, even if it makes sense to the technically competent, as something like an IP grabber normally requires user interaction such as clicking a link. It’s also a vector that Cloudflare might be able to mitigate by patching out the ability to query a given cache directly.


  • What? I’m not privy to Red Hat/IBM/Google’s internal processes, but they are all massive FOSS contributors, at least some of which I assume are using Agile internally. The Linux kernel is mostly corpo-backed nowadays.

    The development cycle of FOSS is highly compatible with Agile processes, especially as you tend towards the Linux Kernel style of contributing where every patch is expected to be small and atomic. A scrum team can 100% set as a Sprint Goal “implement and submit patches for XYZ in kernel”.

    Also agile ≠ scrum. If you’re managing a small github project by sorting issues by votes and working on the top result, then congratulations, you’re following an ad-hoc agile process.

    I think what you’re actually mad at is corporate structures. They systematically breed misaligned incentives proportional to the structure’s size, and the top-down hierarchy means you can’t just fork a project when disagreements lead to dead ends. This will be true whether you’re doing waterfall or scrum.


  • Any source on any significant number of children wasting time talking to AIs, or just anecdotes and a bad case of “youth these days”?

    The whole concept smells like fringe NEET 4chan-adjacent behavior. LLMs aren’t capable of maintaining an even remotely convincing simulacrum of human connection, and anyone who would project companionship onto these soulless computer programs obviously has preexisting and severe mental issues (relying on AIs to fill a void in human connection is certainly unhealthy but a symptom, not the root cause).

    The potential market for these AIs will never be any bigger than the market for anime waifu body pillows, because it’s the same audience, different decade. Literally everyone else thinks AI girlfriends and body-pillow waifus are weird as all hell, and that’s not going to change, because neurotypical people want and need human connection and can tell the difference between a rock with googly eyes and a friend.

    Also arguably a rock with googly eyes has more charm and personality than Zuck’s horror show.



  • Oh they definitely exist. At a high level the bullshit is driven by malicious greed, but there are also people who are naive and ignorant and hopeful enough to hear that drivel and truly believe in it.

    Like when Microsoft shoves GPT-4 into notepad.exe. Obviously a terrible, terrible product from a UX/CX perspective. But also extremely expensive for Microsoft, right? They don’t gain anything by stuffing their products with useless, annoying features that eat expensive cloud compute like a kid eats candy. That only happens because their management people truly believe, honest to god, that this is a sound business strategy. Which would only be the case if they completely misunderstood what GPT-4 is and could be, and actually thought future improvements would be so great that there’s a path to mass monetization somehow.




  • Or just :set mouse=a if your terminal emulator was updated in the past decade. gVim has nothing to offer anymore, except that it bundles its own weird terminal emulator that doesn’t inherit any of the fonts, themes, settings or shortcuts of one’s default terminal. Blegh.

    Also, if you’re not going to leverage Vim’s main feature and just want to click around on stuff, install VSCod(e|ium) instead, which is genuinely amazingly good.
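
    And to make the mouse setting stick across sessions (a one-liner, assuming classic Vim rather than Neovim):

    ❯ echo 'set mouse=a' >> ~/.vimrc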


  • He was really popular on Twitter, and if he says Mastodon is worse despite his smaller audience there, I trust his judgement. It’s literally his pinned toot.

    “First replies shown are the ones the author replied to and/or liked” seems like an obvious, simple, and transparent algorithm. Like youtube comments. Give lazy reply guys an opportunity to see without scrolling down that they aren’t as original as they think they are. The fact that this isn’t implemented in even a basic form is absolutely insane and shows a very fundamental ideological disconnect between people who want “open twitter with decent moderation” and whatever the fuck it is that the mastodon OGs/devs are trying to achieve.
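
    Mechanically it’s a one-line sort (a jq sketch over a hypothetical thread.json; author_engaged is a made-up flag meaning the author liked or replied to that reply):

    ❯ # replies the author engaged with float to the top, then oldest first
    ❯ jq '.replies | sort_by([(.author_engaged | not), .created_at])' thread.json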



  • Some people don’t want a suggestion algorithm but do want full reply federation.

    Alec from Technology Connections stopped using Mastodon because of this: every post he made would get nitpicked by 20 different people from instances that did not federate the replies with each other, so each reply guy thought they were the first.

    I have a single user instance and I use a relay, but most replies are still missing if I click on a post unless I go to the original webpage.

    Lazily federating replies when a post is viewed sounds like an obvious solution, but AFAIK the Mastodon devs are very opposed to it.
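
    The data all exists on the origin server; only the fetching is missing. A sketch of what lazy federation could pull on view, using Mastodon’s existing context endpoint (instance and status ID made up):

    ❯ # the origin instance knows every reply; remotes only know what was pushed to them
    ❯ curl -s https://origin.example/api/v1/statuses/110000000000000001/context | jq '.descendants | length'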



  • I wasn’t very old then but the main thing was RAM. Fuckers in Microsoft sales/marketing made 1 GB the minimum requirement for OEMs to install Vista.

    So guess what? Every OEM installed Vista with 1 GB of RAM and a 5400 RPM hard drive (the “standard” config for XP, which is what most of those SKUs were originally meant to target). That hard drive would inevitably spend its short life thrashing, because if you opened IE it would immediately start swapping. Even worse with OEM bloat, but even a clean Vista install would swap real bad under light web browsing.

    It was utterly unusable. Like, everything would be unbearably slow and all you could do was (slowly) open task manager and say “yep, literally nothing running, all nonessential programs killed, only got two tabs open, still swapping like it’s the sex party of the century”.

    “Fixing” those hellspawns by adding a spare DDR2 stick is a big part of how I learned to fix computer hardware. All ya had to do was chuck 30 € of RAM in there and suddenly Vista went from actually unusable to buttery smooth.

    By the time the OEMs wised up to Microsoft’s bullshit, Seven was around the corner so everyone thought Seven “fixed” the performance issues. It didn’t, it’s just that 2 GB of RAM had become the bare minimum standard by then.

    EDIT: Just installed a Vista VM because I ain’t got nothing better to do at 2 am, apparently. Not connected to the internet, didn’t install a thing, got all of 12 processes listed by Task Manager, and it already uses 500 MB of RAM. Aero isn’t even enabled, as I didn’t configure graphics acceleration.



  • Bro I wouldn’t trust most companies not to store their only copy of super_duper_important_financial_data_2024.xlsx on an old AliExpress thumb drive attached to the CFO’s laptop in a coffee shop while he’s taking a shit.

    If your company has an actual DRP (disaster recovery plan) for your datacenter catching fire or your cloud provider disappearing, you are already doing better than 98% of your competitors, and these aren’t far-fetched disaster scenarios. Maintaining an entire separate pen-and-paper shadow process, and training people for it? That’s orders of magnitude more expensive than the simplest of DRPs most companies already don’t have.

    Friendly wave to all the companies currently paying millions a year extra to Broadcom/VMware because their tools and processes are too rigid to work with literally any other hypervisor, when realistically all their needs could be covered by the free tier of Proxmox and/or OpenStack.


  • There’s probably a whole thesis or five to be written on the subject.

    The “traditional” AAA pipeline is “make big games with loooots of assets and mechanics, maximize playtime, must be an open world and/or GaaS”. That’s partly institutional pressure: lowest-common-denominator design, investor expectations that everyone copy the R* formula, and GaaS being money-printing machines. And partly technical: open worlds are easy to do sloppily, since you can deliver the game half finished and have it work (e.g. Cyberpunk), and GaaS/open worlds are a somewhat natural consequence of extremely massive development teams that simply could not work together on a more narrowly focused genre.

    That’s not to say there aren’t good expensive games being bankrolled by massive studios like Sony or Microsoft. But AAA is a specific subset of those, and blandness comes with the territory. However, if I were a betting man, I’d say we’re nearing the end of this cycle given the high-profile market failures of the last few years, and the AAA industry will have to reinvent itself at least somewhat. Investors won’t want to be left holding the bag for the next Concord.


  • Wait until you learn about debhelper.

    If you use a Debian-based system, then unless you have actually looked at the DH source (the one thing that built virtually every package on your system), you do not get to say anything about “bloat” or “KISS”.

    DH is a monstrous pile of perl scripts, only partially documented, with a core design that revolves around a spaghetti of complex defaults, unique syntax, and enough surprising side effects and crazy heuristics to spook even the most grizzled greybeards. The number of times I’ve had to look at the DH perl source to understand a (badly/un)documented behavior while packaging something is not insignificant.
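
    For scale, this is a complete, canonical debian/rules these days. Every build, install, and packaging step is an implicit default supplied by dh’s sequencer, which is exactly where those heuristics hide:

    #!/usr/bin/make -f
    %:
    	dh $@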

    But when we replaced a bazillion bash scripts with an (admittedly opinionated, but also stable and well-documented) daemon, suddenly the greybeards acted like Debian was going to collapse under the weight of its own complexity.


  • Congrats. So you think that since you can do it (as a clearly very tech-literate person), the government shouldn’t do anything? Do you think everyone else researched the issues with these companies and decided to actively support them, or should their apathy be taken as an encouragement to continue?

    You are so haughty you’ve circled back around to being libertarian. This is genuinely a terrible but unfortunately common take that is honestly entirely indistinguishable from the kind of shit you’d hear coming from a FAANG lobby group.