• 7 Posts
  • 781 Comments
Joined 1 year ago
Cake day: October 4th, 2023

  • In August 1993, the project was canceled. A year of my work evaporated, my contract ended, and I was unemployed.

    I was frustrated by all the wasted effort, so I decided to uncancel my small part of the project. I had been paid to do a job, and I wanted to finish it. My electronic badge still opened Apple’s doors, so I just kept showing up.

    I asked my friend Greg Robbins to help me. His contract in another division at Apple had just ended, so he told his manager that he would start reporting to me. She didn’t ask who I was and let him keep his office and badge. In turn, I told people that I was reporting to him. Since that left no managers in the loop, we had no meetings and could be extremely productive.

    They created a pretty handy app that was bundled with the base OS, and which I remember having fun using. So it’s probably just as well that Apple didn’t hassle them. But in all seriousness, that’s not the most amazing building security ever.

    reads further

    Hah!

    We wanted to release a Windows version as part of Windows 98, but sadly, Microsoft has effective building security.


  • And it’s not the battery itself because I’ve tried getting new batteries for it. It’s something in the charging circuitry. It works fine when it’s on wall power, but it just does not charge the battery.

    At least some Dell laptops authenticate the charger so that only “authentic Dell chargers” can charge the battery, though they’ll run off third-party chargers without charging the battery.

    Unfortunately, it’s a common problem – and I’ve seen this myself – for the authentication pin on an “authentic Dell charger” to become slightly bent or otherwise damaged, at which point it will no longer authenticate and the laptop will refuse to charge the battery.

    I bet the charger on yours is a barrel charger with that pin down the middle.

    hits Amazon

    Yeah, looks like it.

    https://www.amazon.com/dp/B086VYSZVL?psc=1

    I don’t have a great picture for the 65W one, but the 45W charger here has an image looking down the charger barrel showing that internal pin.

    If you want to keep using that laptop and want to use the battery, I’d try swapping out the charger. If you don’t have an official Dell charger, make sure that the one you get is one of those (unless some “universal charger” has managed to break their authentication scheme in the intervening years; I haven’t been following things).

    EDIT: Even one of the top reviews on that Amazon page mentions it:

    I have a DELL, that has the straight barrel plug with the pin in it. THEY REALLY made a BAD DECISION when they made these DELL laptops with that type of plug instead of making it with a dog leg style plug. I have to replace my charger cord A LOT because the pin gets bent inside and it stops charging at that plug, but the rest of the charger is still good…


  • Up until the early 2000s, serial computation speed doubled about every 18 months. That meant that virtually all software just ran twice as quickly after every 18 months of CPU advances. And since taking advantage of that was trivial, new software releases did: they traded CPU cycles for shorter development time or more functionality, and demanded current hardware to run at a reasonable clip.

    In that environment, it was quite important to upgrade the CPU.

    But that hasn’t been happening for about twenty years now. Serial computation speed still increases, but not nearly as quickly any more.

    This is about ten years old now:

    https://preshing.com/20120208/a-look-back-at-single-threaded-cpu-performance/

    Throughout the 80’s and 90’s, CPUs were able to run virtually any kind of software twice as fast every 18-20 months. The rate of change was incredible. Your 486SX-16 was almost obsolete by the time you got it through the door. But eventually, at some point in the mid-2000’s, progress slowed down considerably for single-threaded software – which was most software.

    Perhaps the turning point came in May 2004, when Intel canceled its latest single-core development effort to focus on multicore designs. Later that year, Herb Sutter wrote his now-famous article, The Free Lunch Is Over. Not all software will run remarkably faster year-over-year anymore, he warned us. Concurrent software would continue its meteoric rise, but single-threaded software was about to get left in the dust.

    If you’re willing to trust this line, it seems that in the eight years since January 2004, mainstream performance has increased by a factor of about 4.6x, which works out to 21% per year. Compare that to the 28x increase between 1996 and 2004! Things have really slowed down.

    We can also look at the roughly twelve years since then, over which progress has been even slower:

    https://www.cpubenchmark.net/compare/2026vs6296/Intel-i7-4960X-vs-Intel-Ultra-9-285K

    This is using a benchmark to compare the single-threaded performance of the i7-4960X (Intel’s high-end processor back in 2013) to that of the Intel Ultra 9 285K, the current one. In those ~12 years, the latest processor has managed single-threaded performance about (5068/2070) ≈ 2.45 times that of the 12-year-old processor. That’s (5068/2070)^(1/12) ≈ 1.077, about a 7.7% performance improvement per year. The age of a processor doesn’t matter nearly as much in that environment.
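
    If you want to plug in other processors, the annualized figure is just a compound-growth calculation over the two single-thread scores; here’s a quick Python sketch using the numbers from the comparison page above:

    ```python
    # Annualized single-threaded improvement between two benchmark scores.
    old_score = 2070   # i7-4960X single-thread score (2013)
    new_score = 5068   # Ultra 9 285K single-thread score
    years = 12

    total = new_score / old_score         # ~2.45x overall
    per_year = total ** (1 / years) - 1   # ~0.077, i.e. ~7.7% per year

    print(f"{total:.2f}x total, {per_year:.1%} per year")
    ```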

    We still have had significant parallel computation increases. GPUs in particular have gotten considerably more powerful. But unlike with serial compute, parallel compute isn’t a “free” performance improvement – software needs to be rewritten to take advantage of that, it’s often hard to parallelize solving problems, and some problems cannot be solved in parallel.

    Honestly, I’d say that the most-noticeable shift is away from rotational drives to SSDs – there are tasks for which SSDs can greatly outperform rotational drives.




  • I’ve kind of felt the same way; I’d rather have a somewhat-stronger focus on technology in this community.

    The current top few pages of posts are pretty much all just talking about drama at social media companies, which frankly isn’t really what I think of as technology.

    That being said, “technology” kind of runs the gamut in various news sources. I’ve often seen “technology news” basically amount to promoting new consumer gadgets, which isn’t exactly what I’d like to see from this community either. Nor do I really want to see leaked photos of the latest Android tablet from Lenovo or whoever.

    I’d be more interested in reading about technological advances and changes.

    I suppose that if someone wants to start a more-focused community, I’d also be willing to join that and give it a shot.

    EDIT: I’d note that the current content here kind of mirrors what’s on Reddit at /r/Technology, which is also basically drama at social media companies. I suppose that there’s probably interest from some in that. It’s just not really what I’m primarily looking for.



  • I think that California should take keeping itself competitive as a tech center more-seriously. I think that a lot of what has made California competitive for tech is that it had tech there earlier, and that past a certain threshold, it becomes advantageous to start more companies in an area – you have a pool of employees and investors and such. But what matters is having a sufficiently-large pool, and if you let that advantage erode sufficiently, your edge also goes away.

    We were just talking about high California electricity prices, for example. A number of datacenters have shifted out of California because the cost of electricity is a significant input. Now, okay – you don’t have to be right on top of your datacenters to be doing tech work. You can run a Silicon Valley-based company that has its hardware in Washington state, but it’s one more factor that makes it less appealing to be located in California.

    The electricity price issue came up a lot back when people were talking about Bitcoin mining more, since electricity was one of mining’s few significant inputs and it’s otherwise pretty location-agnostic.

    https://www.cnbc.com/2021/09/30/this-map-shows-the-best-us-states-to-mine-for-bitcoin.html

    In California and Connecticut, electricity costs 18 to 19 cents per kilowatt hour, more than double that in Texas, Wyoming, Washington, and Kentucky, according to the Global Energy Institute.

    (Prices are higher everywhere now, as this was written before the COVID-19-era inflation, but California remains expensive electricity-wise.)

    I think that there is a certain chunk of California that is kind of under the impression that the tech industry in California is a magic cash cow that is always going to be there, no matter what California does, and I think that that’s kind of a cavalier approach to take.

    EDIT: COVID-19’s remote-working also did a lot to seriously hurt California here, since a lot of people decided “if I don’t have to pay California cost-of-living and can still keep the same job, why should I pay those costs?” and just moved out of state. If you look at COVID-19-era population-change data in counties around the San Francisco Bay Area, it saw a pretty remarkable drop.

    https://www.apricitas.io/p/california-is-losing-tech-jobs

    California is Losing Tech Jobs

    The Golden State Used to Dominate Tech Employment—But Its Share of Total US Tech Jobs has Now Fallen to the Lowest Level in a Decade

    Nevertheless, many of the tech industry’s traditional hubs have indeed suffered significantly since the onset of the tech-cession—and nowhere more so than California. As the home of Silicon Valley, the state represented roughly 30% of total US tech sector output and got roughly 10% of its statewide GDP from the tech industry in 2021. However, the Golden State has been bleeding tech jobs over the last year and a half—since August 2022, California has lost 21k jobs in computer systems design & related, 15k in streaming & social networks, 11k in software publishing, and 7k in web search & related—while gaining less than 1k in computing infrastructure & data processing. Since the beginning of COVID, California has added a sum total of only 6k jobs in the tech industry—compared to roughly 570k across the rest of the United States.

    For California, the loss of tech jobs represents a major drag on the state’s economy, a driver of acute budgetary problems, and an upending of housing market dynamics—but most importantly, it represents a squandering of many of the opportunities the industry afforded the state throughout the 2010s.


  • tal@lemmy.today to Linux@lemmy.world · Finally did it · 8 days ago

    I mean, you can run an infrared remote. I don’t know if there’s specifically an open source infrared remote out there, but I wouldn’t be surprised, as they aren’t very complicated devices.

    On the software side, you’re going to want LIRC. It’ll have a list of supported receivers.

    https://www.lirc.org/

    kagis

    Here’s an open-source infrared remote:

    https://github.com/CoretechR/OMOTE

    Personally, I wouldn’t care if the remote is open source any more than I would my keyboard, but if it’s interesting to you…



  • If you seriously want to set something like this up, you’re going to need a device that can emit the smells that you want.

    This device, for instance, looks like it uses atomizers hooked up to different tanks:

    https://www.amazon.com/Automatically-Releases-Immersive-Compatible-Platforms/dp/B0CNMXSN2K

    I’d imagine that one could run as many tanks as one wanted.

    One limiting factor is that scent isn’t going to change immediately when you change your virtual environment. I’d guess that emitting the vapor close to your face, maybe running a hose up towards it, would help. You’d probably want some kind of exhaust to purge the previous smell from the room. My guess is that the reason a “booth” is used in the submitted article is to minimize the airspace surrounding the user and thus the clearing time.

    Second, you need some form of computer control – maybe a device with relays controlled via USB. A relay is an electromechanical switch that can cut power on and off; you could run one to each atomizer.

    https://ncd.io/usb-relay/

    Those guys sell USB devices with up to 64 relays. I haven’t looked, but such a board probably looks to the computer like a virtual serial port and takes text commands.
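
    I haven’t used their boards, so the text protocol below is made up for illustration – the vendor’s documentation will have the real commands – but driving a virtual serial port from Python with pyserial would look something like this:

    ```python
    import serial  # pyserial

    # Hypothetical: the relay board enumerates as a USB serial device and
    # takes simple text commands. "RELAY ON 3" is a placeholder, not the
    # vendor's actual protocol.
    ser = serial.Serial("/dev/ttyUSB0", 9600, timeout=1)

    def set_atomizer(relay, on):
        """Switch the atomizer wired through the given relay on or off."""
        ser.write(f"RELAY {'ON' if on else 'OFF'} {relay}\r\n".encode())

    set_atomizer(3, True)  # start emitting scent #3
    ```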

    Then you need some kind of daemon running on the computer to send these commands at appropriate times.

    And lastly, you need some way to trigger the daemon when the game is seeing some sort of event. You could monitor the game’s logfile, if it has one and it contains the necessary information – I recall some Skyrim-hooking software that does this – periodically take a screenshot and analyze it, or identify and then monitor the game’s memory, probably using either a technique called library injection (on Linux, library interposers are one way to do this) or the same API that debuggers use.
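
    As a sketch of the logfile approach – the log path, the strings to watch for, and the scent mapping are all hypothetical, and set_atomizer() is the relay function from the sketch above:

    ```python
    import time

    # Hypothetical mapping from strings in the game's log to relay numbers.
    SCENTS = {"entered beach": 1, "entered forest": 2}

    def follow(path):
        """Yield lines as they are appended to a logfile, tail -f style."""
        with open(path) as f:
            f.seek(0, 2)             # start at the current end of the file
            while True:
                line = f.readline()
                if not line:
                    time.sleep(0.5)  # wait for the game to write more
                    continue
                yield line

    for line in follow("/path/to/game.log"):
        for needle, relay in SCENTS.items():
            if needle in line:
                set_atomizer(relay, True)
    ```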

    If the hentai game that your friend is after is Ren’Py-based – a popular option for visual novels, which many such games are – and the game includes the Python source .rpy files, which some do, then the game’s source itself could simply be modified. If it contains only compiled .rpyc files, that won’t be an option.
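
    If the .rpy source is there, the hook could be a one-line Python statement at each scene. The label name and event file here are hypothetical – the daemon sketched above would tail /tmp/scent-events instead of a game log:

    ```
    label beach_scene:
        # Hypothetical hook: append an event for the scent daemon to pick up.
        $ open("/tmp/scent-events", "a").write("entered beach\n")
        scene bg beach
    ```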

    You’re going to need to obtain whatever scents you want to emit as well. You can get collections of essential oils – the aromatherapy crowd is into those – and mix them up to create blends that you want, stick 'em in the atomizer tanks.

    One issue is that hacking it into an existing game is going to mean that the game isn’t intentionally designed around the use of scent.



  • tal@lemmy.today to Technology@lemmy.world · World’s First MIDI Shellcode · 12 days ago

    The thing the guy is poking at is a synthesizer, a device that lets you compose music and synthesizes the audio.

    He got a service manual for a similar synthesizer, with technical information indicating that some of the pins on one of the chips were used for JTAG, a standard interface used to diagnose problems on devices. He guessed correctly that his synthesizer used the same pins for this.

    He made some guesses about what functionality was present, and was able to identify the microprocessor and download the device firmware using this port.

    He then went looking for interesting bits of text in the firmware. What he ran across was something that appeared to be a diagnostic shell (i.e., you enter commands and can see a response) as well as the password to access it.

    He didn’t know how one reached the shell. He went digging in the firmware further and discovered that the device – which acted as a MIDI device over USB to a host computer – took in special MIDI commands that would go to this shell.

    Now he had a way to access the shell any time he had one of these synths plugged into his computer via USB – he didn’t need to physically connect to the diagnostic pins on the chip.
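
    I haven’t checked the write-up’s actual byte sequences, so the manufacturer ID and framing below are placeholders, but sending vendor-specific MIDI System Exclusive (SysEx) messages from Python with mido looks roughly like this:

    ```python
    import mido

    out = mido.open_output("Synth Port Name")  # placeholder port name

    def send_shell_command(text):
        """Wrap an ASCII shell command in a SysEx message (placeholder framing)."""
        # SysEx data bytes must be 0-127; 0x7D is the generic
        # "non-commercial" manufacturer ID, standing in for the vendor's
        # real ID and whatever framing the diagnostic shell expects.
        payload = [0x7D] + [b & 0x7F for b in text.encode("ascii")]
        out.send(mido.Message("sysex", data=payload))

    send_shell_command("help")
    ```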

    One feature of the shell permitted modifying RAM on the synthesizer. It wasn’t intended to let one upload executable code, but he uploaded code into some unused memory anyway, then overwrote the return address on the stack used by the shell program (the value the processor uses to know where to continue executing after running a subroutine) so that it pointed at his code. When the shell routine returned, it jumped into his code – so he could not just upload code to the microprocessor but also run it.

    He wrote his own transfer program for high-speed data transfer over USB and modified the in-RAM code that displayed video.

    This then let him upload video to part of the display and display it at a relatively high frame rate, which is the anime video shown in the last section. I believe that the laptop in the foreground is showing the original frames.

    My understanding from two articles recently posted here is that it is a fad among hardware hackers to play this “Bad Apple” anime video on all sorts of old and low-end devices.