

This genuinely made me do an IRL spit take, holy shit.
Same, but they did set up a self hosted instance for us to use and, tbh, it works pretty good.
I think it’s a good tool specifically for helping when you dunno what’s going on, to help with brainstorming or exploring different solutions. Getting recommended names of tools, finding out “how do other people solve this”, generating documentation, etc
But for very straightforward tasks where you already know what you are doing, it’s not helpful, you already know what code you are going to write anyways.
Right tool for the right job.
I primarily use GPT style tools like ChatGPT and whatnot.
The key is, rather than asking it to generate code, specify that you don’t want code and instead want it to help you work through the solution. Tell it to ask you meaningful questions about your problem and effectively act as a rubber duck
Then, after you’ve chosen a solution with it, ask it to generate code based on all the above convo.
This will typically produce way higher quality results and helps avoid potential X/Y problems.
Humans are “trained” with maybe ten thousand “tokens” per day
Uhhh… you may wanna rerun those numbers.
It’s waaaaaaaay more than that lol.
and take only a couple dozen watts for even the most complex thinking
Mate’s literally got smoke coming out of his ears lol.
A single Wh
is 860 calories…
I think you either have no idea wtf you are talking about, or you just made up a bunch of extremely wrong numbers to try and look smart.
Humans will encounter hundreds of thousands of tokens per day, ramping up to millions in school.
A human, by my estimate, has burned about 13,000 kWh by the time they reach adulthood. Maybe more depending on activity levels.
While yes, an AI costs substantially more kWh
, it also is trained in weeks, so it’s obviously going to be way less energy efficient, since pushing that much power through hardware that fast comes with superlinear losses. If we grew a functional human in like 2 months it’d prolly require way WAY more than 13,000 kWh
during the process for similar reasons.
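A rough sanity check on that 13,000 kWh ballpark (the ~1,700 kcal/day average intake is my own assumption, averaging a kid’s lower intake against an adult’s, not a sourced figure):

```python
# Back-of-envelope: energy a human burns from birth to adulthood.
# Assumption (mine): ~1700 kcal/day averaged over ages 0-18.
KCAL_PER_DAY = 1700
DAYS = 18 * 365
WH_PER_KCAL = 1000 / 860   # 1 Wh ≈ 860 small calories, so 1 kcal ≈ 1.16 Wh

total_kwh = KCAL_PER_DAY * DAYS * WH_PER_KCAL / 1000
print(f"~{total_kwh:,.0f} kWh burned by adulthood")  # lands near 13,000 kWh
```

Nudge the daily intake up or down and you land anywhere in the 10,000–18,000 kWh range, which is why it’s an estimate.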
Once trained, a single model can be duplicated infinitely. So it’d be more fair to compare how much millions of people cost to raise, compared to a single model to be trained. Because once trained, you can now make millions of copies of it…
Operating costs are continuing to go down and down and down. Diffusion based text generation just made another huge leap forward, reporting around a twenty times efficiency increase over traditional gpt style LLMs. Improvements like this are coming out every month.
For sure, much like how a cab driver has to know how to drive a cab.
AI is absolutely a “garbage in, garbage out” tool. Just having it doesn’t automatically make you good at your job.
The difference between someone who can wield it well vs someone who has no idea what they are doing is palpable.
Good, fire 2 devs out of 3.
Companies that do this will fail.
Successful companies respond to this by hiring more developers.
Consider the taxi cab driver:
With the invention of the automobile, cab drivers could do their job way faster and way cheaper.
Did companies fire drivers in response? God no. They hired more
Why?
Because they became more affordable, less wealthy clients could now afford their services which means demand went way way up
If you can do your work for half the cost, usually demand goes up by way more than x2 because as you go down in wealth levels of target demographics, your pool of clients exponentially grows
If I go from “it costs me 100k to make you a website” to “it costs me 50k to make you a website” my pool of possible clients more than doubles
Which means… you need to hire more devs asap to start matching this newfound level of demand
If you fire devs when your demand is about to skyrocket, you fucked up bad lol
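A toy model of the “halve the price, more than double the clients” claim, assuming client budgets follow a Pareto (power-law) tail — a common rough model for wealth. The alpha and scale values are made up purely for illustration:

```python
# Toy demand model: how many clients can afford a service at a given price,
# if budgets follow a Pareto (power-law) tail.
def clients_who_can_afford(price, alpha=1.5, scale=10_000):
    """Relative number of clients with budget >= price."""
    return (scale / price) ** alpha

before = clients_who_can_afford(100_000)  # pool at the 100k price point
after = clients_who_can_afford(50_000)    # pool after halving the price

print(f"pool grows {after / before:.2f}x")  # 2^1.5 ≈ 2.83x, more than double
```

Any alpha above 1 makes halving the price more than double the pool, which is the whole argument in one line.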
We are seeing massive exponential increases in output with all sorts of innovations, every few weeks another big step forward happens
Wait til you realize that’s just what art literally is…
You skipped possibility 3, which is actively happening:
Advancements in tech enable us to produce results at a much much cheaper cost
Which is happening with diffusion style LLMs that simultaneously cost less to train, cost less to run, and also produce both faster and better quality outputs.
That’s a big part people forget about AI: it’s a feedback loop of improvement as soon as you can start using AI to develop AI
And we are past that mark now, most developers have easy access to AI as a tool to improve their performance, and AI is made by… software developers
So you get this loop where as we make better and better AIs, we get better and better at making AIs with the AIs…
It’s incredibly likely the new diffusion AI systems were built with AI assisting in the process, enabling them to make a whole new tech innovation much faster and easier.
We are now in the uptick of the singularity, and have been for about a year now.
Same goes for hardware, it’s very likely now that Nvidia has AI incorporated into their production process, using it for micro optimizations in its architectures and designs.
And then those same optimized gpus turn around and get used to train and run even better AIs…
In 5-10 years we will look back on 2024 as the start of a very wild ride.
Remember we are just now in the “computers that take up entire warehouses” step of the tech.
Remember that in the 80s, a “computer” cost a fortune, took tonnes of resources, multiple people to run it, took up an entire room, was slow as hell, and could only do basic stuff.
But now, 40 years later, they fit in our pockets and are (non hyperbole) billions of times faster.
I think by 2035 we will be looking at AI as something mass produced for consumers to just put in their homes, you go to Best Buy and compare different AI boxes to pick which one you are gonna get for your home.
We are still at the stage of people in the 80s looking at computers and pondering “why would someone even need to use this, why would someone put one in their house, let alone their pocket”
No, it’s just not something exposed to you to see
But under the hood it very much does shift gears depending on what you ask it to do
It’s why gpt can do stuff now like analyze contents of images, basic OCR, but also generate images too.
Yet it can also do math, talk about biology, give relationship advice…
I believe OpenAI called them “specialists” or something vaguely like that, at the time.
The thing is, the tech keeps advancing too so even if they tighten up deadlines, by the time they did that our productivity also took another gearshift up so we still are some degree ahead.
This isn’t new, in software we have always been getting new tools to do our jobs better and faster, or produce fancier results in the same time
This is just another tool in the toolbelt.
I am indeed getting more time off for PD
We delivered on a project 2 weeks ahead of schedule so we were given raises, I got a promotion, and we were given 2 weeks to just do some chill PD at our own discretion as a reward. All paid on the clock.
Some companies are indeed pretty cool about it.
I was asked to give some demos and do some chats with folks to spread info on how we had such success, and they were pretty fond of my methodology.
At its core delivering faster does translate to getting bigger bonuses and kickbacks at my company, so yeah there’s actual financial incentive for me to perform way better.
You also are ignoring the stress thing. If I can work 3x better, I can also just deliver in almost the same time, but spend all that freed up time instead focusing on quality, polishing the product up, documentation, double checking my work, testing, etc.
Instead of scraping past the deadline by the skin of our teeth, we hit the deadline with a week or 2 to spare and spent a buncha extra time going over everything with a fine tooth comb twice to make sure we didn’t miss anything.
And instead of mad rushing 8 hours straight, it’s just generally more casual. I can take it slower and do the same work but just in a less stressed out way. So I’m literally just physically working less hard, I feel happier, and overall my mood is way better, and I have way more energy.
Meanwhile a huge chunk of the software industry is now heavily using this “dead end” technology 👀
I work in a pretty massive tech company (think, the type that frequently acquires other smaller ones and absorbs them)
Everyone I know here is using it. A lot.
However my company also has tonnes of dedicated sessions and paid time to instruct its employees on how to use it well, and to get good value out of it, and the pitfalls it can have
So yeah turns out if you teach your employees how to use a tool, they start using it.
I’d say LLMs have made me about 3x as efficient or so at my job.
They did that a while ago, it was a big feature of GPT-3
We already did this like a year ago mate. That was like v3 of GPT
Eyup, it’s intuitive overall but there’s just weirdly some people out there that are all or nothing, and don’t understand “right tool for the job” lol
Like I said, on the scale compared to actual high frequency data though, that’s still infrequent.
High frequency DBs are on the scale of many queries per second
Even with tonnes of data scientists and engineers querying the data, that’s still in the scale of queries per minute, which is low frequency in the data world.
They probably do use lots of NoSQL DBs too, which perform better for non relational “data lake” style architectures where you just wanna dump mountains of data as fast as possible into storage, to be perused later.
When you have cases where you have very very high volume of data in, but very low need to query it (but some potential need, just very low), nosql DBs excel
Stuff like census data where you just gotta legally store it for historical reasons, and very rarely some person will wanna query it for a study or something.
Keep in mind when I talk about low need to query, the opposite high need is on the scale of like, “this db gets queried multiple times per minute”
Stuff like… logins to a website, data that gets queried many times per minute or even second, then sometimes nosql DBs fall off.
Depends what is queried.
Super basic “lookup by ID” stuff that operates as just a big ole KeyValuePair mapping ID -> Value? And that’s all you gotta query?
NoSql is still the right tool for the job.
The moment any kind of JOIN
enters the discussion though, chances are you actually wanna use sql now
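The tradeoff above in miniature, using a plain dict as a stand-in KV store and Python’s built-in sqlite3 as a stand-in relational DB. The table and column names are made up for illustration:

```python
import sqlite3

# KV-style "lookup by ID": a dict (or any NoSQL KV store) is all you need.
users = {1: "alice", 2: "bob"}
print(users[1])  # direct ID -> value, no query planner involved

# The moment you need a JOIN, relational SQL earns its keep:
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, user_id INTEGER, item TEXT);
    INSERT INTO users VALUES (1, 'alice'), (2, 'bob');
    INSERT INTO orders VALUES (10, 1, 'keyboard'), (11, 2, 'mouse');
""")
rows = db.execute(
    "SELECT u.name, o.item FROM users u "
    "JOIN orders o ON o.user_id = u.id ORDER BY u.name"
).fetchall()
print(rows)  # [('alice', 'keyboard'), ('bob', 'mouse')]
```

Doing that join by hand over two KV namespaces means fetching and stitching records yourself, which is exactly the point where sql starts paying rent.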
sent American technology stocks plummeting
Oh yeah, that’s what did it, totally
Wow, that sure is something else.