Flying Squid@lemmy.world to Technology@lemmy.world · English · 1 year ago
Your AI Girlfriend Is a Data-Harvesting Horror Show (gizmodo.com)
BetaDoggo_@lemmy.world: This is why you should always selfhost your AI girlfriend.
Pennomi@lemmy.world: Woah there, I’m not sure I’m ready for that level of commitment.
foggy@lemmy.world: I ain’t got that kinda GPU to spare. It’s either the games or my AI girlfriend…
ɐɥO@lemmy.ohaa.xyz: You don’t need that much power. Something like an RX 6600 XT, RTX 3060, or RX 580 is plenty.
neutron@thelemmy.club: Is support for AMD cards better these days? Last time I checked, it involved verifying ROCm compatibility, because CUDA needs Nvidia cards exclusively.
ɐɥO@lemmy.ohaa.xyz: GPT4All worked out of the box for me.
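For anyone curious what “out of the box” looks like, here is a minimal local-only sketch using the GPT4All Python bindings; the model filename and the device setting are illustrative assumptions, not recommendations:

```python
# Minimal local chat loop via the GPT4All Python bindings (pip install gpt4all).
# Nothing leaves your machine: the model file is downloaded once and runs locally.
from gpt4all import GPT4All

# Illustrative choices: pick a quantized .gguf that fits your VRAM,
# and use device="cpu" if you have no supported GPU.
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf", device="gpu")

with model.chat_session():
    while True:
        prompt = input("you> ")
        if not prompt:
            break
        print(model.generate(prompt, max_tokens=256))
```

GPT4All’s GPU path goes through a Vulkan-based backend rather than CUDA, which is likely why it “just worked” on AMD hardware without a ROCm setup.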
CouncilOfFriends@lemmy.world: I’d have to actually test my backup strategy
↳ 🤣
Match!!@pawb.social: I’d only date someone fully independent, so my AI joyfriend operates their own cloud cluster through a combination of a crypto wallet and findom
↳ They sound fun
↳ 🤣
neutron@thelemmy.club: I’m interested. So you rent a cluster to run an instance instead of self-hosting?
werefreeatlast@lemmy.world: I think my wife is cheating on me with my self-hosted AI girlfriend’s boyfriend that lives in the same database file. What do I do?
↳ Delete System32, obviously
Murdoc@sh.itjust.works: As long as everyone is using a virus checker, maybe you could try an open-source relationship.
DrWeevilJammer@lemmy.ml: Sounds like a job for Little Bobby Tables
z00s@lemmy.world: That’ll make you go blind
↳ And give you hairy palms
yildolw@lemmy.world: Now consider the number of normal people in the world who do not have a server rack in their closet, and how much they are about to be defrauded and blackmailed
UnderpantsWeevil@lemmy.world:
My Canadian Girlfriend: Broke, Busted, Burned Out
My Canadian Server-Farm Girlfriend: Smart, Sexy, Superconductive
↳ Let me go buy some milk.
NigelFrobisher@aussie.zone: You have to at least move out of your AI parents’ server rack before letting your AI girlfriend move in.
↳ Nvidia™
DarkThoughts@kbin.social: You need a little GPU server farm for proper models and context sizes, though. Single consumer GPUs don’t have enough VRAM for that.
𝚐𝚕𝚘𝚠𝚒𝚎@h4x0r.host: Might as well just pay for a prostitute
DarkThoughts@kbin.social: Yeah, I heard they’re also very privacy friendly.
↳ Unless you piss them off
DarkThoughts@kbin.social: I was being sarcastic. If you pay for a prostitute, you might as well pay for an AI service such as NovelAI.
neutron@thelemmy.club: Locally hosted AI sucking down on our dick through USB-plugged dildos. This is the future.