In general, how much VRAM do I need for 14B and 24B models?
I didn’t know that. I thought it was just one ROCm binary to install, then run Ollama and that’s it. Thanks for the explanation
Do you have any recommendations for running the Mistral Small model? I’m very interested in it, alongside CodeLlama, Oobabooga and others
Wait how does that work? How is 24GB enough for a 38B model?
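Trying the napkin math myself (assuming 4-bit quantisation; the 1.2 overhead factor for KV cache and runtime buffers is my guess):

```python
# Back-of-envelope VRAM estimate for a quantised model.
# The 1.2 overhead factor (KV cache, runtime buffers) is a guess.
def vram_gb(params_billion, bits_per_weight, overhead=1.2):
    return params_billion * bits_per_weight / 8 * overhead

for params in (14, 24, 38):
    print(f"{params}B @ 4-bit: ~{vram_gb(params, 4):.1f} GB")
# 14B @ 4-bit: ~8.4 GB
# 24B @ 4-bit: ~14.4 GB
# 38B @ 4-bit: ~22.8 GB  <- just squeezes into 24 GB
```

So at 4-bit a 38B model lands around 22.8 GB, which would explain how it fits, barely.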
The 7900 XTX was $1000 when it launched; I wouldn’t mind it used either.
I don’t mind multiple GPUs, but my motherboard doesn’t have two electrically connected x16 slots. I could build a new home server (I’ve been thinking about it), but consumer platforms simply don’t have the PCIe lanes for two actual x16 slots. I’d have to go back to Broadwell Xeons for that, and those are really power hungry. Oh well, I don’t think it matters considering how power hungry GPUs are now.
I am OK with either Nvidia or AMD, especially if Ollama supports it. That said, I’ve heard that AMD takes some manual effort whilst Nvidia is easier, so it depends on how difficult ROCm is
Thank you. Are 14B models the biggest you can run comfortably?
Do you have two PCIe x16 slots on your motherboard (speaking in terms of electrical connections)?
Seedboxes go from €2 to €100+ a month depending on how much you’ll torrent and how much space you need on the box, among other factors. My personal choices are Gigarapid and Ultra, but there are others
What ratio are you at with your Linux ISOs? *wink*
Ooh, Xe Iaso’s blog. I watched a couple of her videos on YouTube. What a talented person
Just lol at blocking Cloudflare. Do they seriously think this will bring down the number of pirates?
People have had bad experiences, and the founder of Njalla hasn’t helped. There are other domain registrars with a similar strategy to Njalla if you prefer them instead. One can’t deny that not having your name on a domain allows one to host… err, sensitive stuff
Njalla is what you use if you don’t want your name plastered all over the domain for various reasons
Isn’t this only for people running NGINX?
I’m trying to create a router + switch combo. I know bonding over CPU is considered a bad idea but I don’t want to run a proprietary OS on my switch to get VLANs. I’d rather run an OpenBSD VM and do everything in it.
This might delve into some networking, but if you can bear with me:
Whilst I like the idea of VLANs, I don’t like running proprietary firmware on my devices, which means a regular L2+/L3 switch is not going to cut it. But I’m starting to wonder: can I just use veths and subnetting to segregate traffic between different machines on my network?
Using your example, can I do:
PC (router) -> 10GbE port (3 veths) -> switch -> three different machines on different subnets?
Can I prevent the three machines from talking to each other directly through the switch if I put them on different subnets? Sorry for my lousy networking knowledge; it’s been a while.
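Roughly what I’m picturing on the router side, in case it helps (interface and table names are made up, and I haven’t tested any of this):

```shell
# Hang three subnets off the one 10GbE port (eth10g is hypothetical).
ip addr add 192.168.10.1/24 dev eth10g
ip addr add 192.168.20.1/24 dev eth10g
ip addr add 192.168.30.1/24 dev eth10g

# Drop routed traffic between two of the subnets with nftables.
nft add table inet fence
nft add chain inet fence forward \
    '{ type filter hook forward priority 0; policy accept; }'
nft add rule inet fence forward \
    ip saddr 192.168.10.0/24 ip daddr 192.168.20.0/24 drop
```

My worry is that this only filters traffic that actually passes through the router; since all three machines share one broadcast domain on the switch, they could still reach each other at L2, which I suspect is exactly the problem VLANs exist to solve.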
Is there a thriving, active selfhost/homelab type place?
I mean, you’re right here.
Is there any benefit to hosting your own Lemmy and meshing it with the other Lemmy instances out there?
If it’s your personal instance: altruism. You’re taking some of the burden off the main Lemmy servers by hosting your own instance with your content. You’re saving them bandwidth, storage and CPU time.
If it’s a public instance meant for others to use: you’re participating in decentralisation and keeping the Fediverse alive. Every new instance has its own mods, rules and policies. It’s like a little island connected to other islands to form a community.
I can wrap my 70 year old head around it.
Holy shit, I hope my brain can process new tech at that age like yours does. Good luck, and DM me if you have trouble.
My board has a PCIe gen 4 x1 slot; unfortunately, the really cheap card I found has six Ethernet ports but is PCIe gen 2 x8
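Did some napkin math on whether that combo would actually bottleneck (line rates and encodings are from the PCIe specs; the rest is my own arithmetic):

```python
# The link should negotiate down to the lower generation and lane
# count of the two sides, i.e. gen 2 x1 in my case.
def pcie_mb_s(gt_per_s, lanes, encoding):
    # Usable payload fraction after line encoding.
    payload = {"8b/10b": 8 / 10, "128b/130b": 128 / 130}[encoding]
    return gt_per_s * payload / 8 * lanes * 1000  # MB/s per direction

link = pcie_mb_s(5, 1, "8b/10b")  # gen 2 x1 -> 500 MB/s
nics = 6 * 125                    # six saturated 1GbE ports -> 750 MB/s
print(link, nics)
```

So if all six ports ran flat out in one direction, the gen 2 x1 link would cap out first; in practice I doubt I’d ever saturate all six at once.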
I have an alternative for you if your electricity is cheap: X99 motherboard + CPU combos from China