@prestonrho: go local #ai

prestonrhodes
Region: US
Saturday 19 July 2025 20:10:35 GMT
37776
2251
116
58

Comments

downey571
downey571 :
How much computing power (RAM, GPU, processor, storage, etc.) do you actually need to run decent models locally?
2025-07-19 20:56:35
60
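The comment above asks what hardware local models actually need. A rough rule of thumb (my sketch, not from the video) is that the weight footprint alone is parameter count times bytes per parameter at the chosen quantization, with extra headroom needed on top for the KV cache and runtime:

```python
# Rule-of-thumb memory estimate for local LLM weights.
# bits_per_param: 16 for fp16, 8 for Q8, ~4-5 for Q4-style quantization.
def weight_gb(params_billion: float, bits_per_param: float) -> float:
    """Return the approximate size of the model weights in decimal GB."""
    bytes_total = params_billion * 1e9 * bits_per_param / 8
    return bytes_total / 1e9

# An 8B model in fp16 needs ~16 GB for weights alone;
# the same model quantized to 4 bits needs ~4 GB.
print(round(weight_gb(8, 16), 1))  # 16.0
print(round(weight_gb(8, 4), 1))   # 4.0
```

This is why quantized 7B-8B models are the usual starting point on 16 GB machines, while fp16 weights of the same model already saturate them.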
gyeditz2.0
👽 :
locally? so what should we use
2025-07-20 01:26:26
8
burningdicegames
Burning Dice :
I ran DeepSeek locally and it worked okay. Def need a really good computer for true performance, though.
2025-07-23 18:26:35
0
suziengatia.africa
susanngatia :
Tried and failed... teach us your ways....
2025-07-20 04:58:07
16
kellenchase
kellenchase :
Ooof… this one hit me in the “I don’t have the budget for a Mac Studio this month”
2025-07-20 00:01:17
9
redbick15
Red Kings :
How different is the setup compared to GPU mining? I’ve got six 3070s lying around from back in the day mining Ethereum.
2025-07-20 16:57:14
1
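For the mining-rig question above, a back-of-the-envelope sketch (my numbers, not the poster's): inference frameworks can split a model's layers across multiple cards, so the pooled VRAM is what matters for fitting weights, though PCIe-connected consumer cards add real communication overhead compared to a single large GPU:

```python
# Each RTX 3070 has 8 GB of VRAM; layer-split or tensor-parallel
# inference can pool memory across cards (with interconnect overhead).
cards, vram_per_card_gb = 6, 8
total_vram_gb = cards * vram_per_card_gb  # 48 GB pooled

# A 70B model at ~4.5 bits/param (Q4-style quantization) needs roughly:
weights_gb = 70e9 * 4.5 / 8 / 1e9  # ~39.4 GB for weights

# Keep ~10% headroom for the KV cache and runtime buffers.
fits = weights_gb < total_vram_gb * 0.9
print(total_vram_gb, round(weights_gb, 1), fits)
```

So on paper a quantized 70B model squeezes into six 3070s, but the per-card 8 GB limit and PCIe bandwidth make it far slower than the same memory on one or two large cards.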
jacobluanjohnston
neuralnets432 :
basic fact of life that the Chinese knew way before neural networks existed bro
2025-07-21 22:06:13
0
railbreaker
Railbreaker :
So many people don’t get this. All the extra garbage tech (inference layers, MCP servers) is why people can’t prompt consistently.
2025-07-20 02:07:28
23
disentanglable.mortality
edoconnell425 :
Curious: how much of your work do you do 100% locally?
2025-07-23 15:38:49
0
wallsareclosingin
Warren Cain :
Everyone, you can run an 8B-parameter model on a 2020 MacBook Air (16 GB RAM, 512 GB SSD) SUPER easily, and the latest ones are amazing. Use Qwen 3 8B!
2025-07-21 02:21:35
11
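One concrete way to try the setup this comment describes (my example, not from the video; it assumes Ollama is installed, and the model tag reflects Ollama's current naming):

```shell
# Pull and chat with an 8B model; a 4-bit quantized build
# fits comfortably in 16 GB of unified memory.
ollama pull qwen3:8b
ollama run qwen3:8b "Explain in one sentence why quantization shrinks memory use."
```

Other runners (llama.cpp, LM Studio) work the same way in principle: download a quantized build of the model, then run it against local RAM.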
jay_albert_
Jonathan Albert :
You might be interested in Lilypad Network
2025-07-20 07:09:08
2
theladykelly
Lady Kelly McCoy Williams :
Preston, you are spot on.
2025-07-19 21:21:24
6
crjndyuknstjlp
ceco :
well, you know, everything you say sounds so plausible, but the moment I need to start somewhere I am lost...
2025-07-19 21:58:47
1
dexterjkm1
dexterjkm1 :
Aren't local models worse at performance tho?
2025-07-20 10:40:18
1
andylearnsai
Andy Learns AI :
Gpu costs tho 💀
2025-07-20 00:05:09
1
todolist44
todolist44 :
What about running it on your own cloud server? Not local, but still kind of under your control.
2025-07-20 08:22:49
3
yami.zxc
Yami.zxc :
yo, any ideas for a cloud GPU pool?
2025-07-23 03:46:47
1
alteredverse_sg
alt_verse :
I’m working on a local LLM packaged with Unreal Engine
2025-07-20 00:58:06
7
everythingis.music
Mac :
How can you expect to scale a local model properly without the compute power? It can only do so much on a local machine
2025-07-20 17:41:52
1
rosscon96
Cowner :
Yeah and then the local model melts after ten minutes of operation
2025-07-21 13:38:32
1
klaudefurlong
Klaude | AI Business Coach :
Exactly what I want to end up doing
2025-07-19 20:35:45
2
aiwithenoch
Ai With Enoch | Ai Automations :
thanks for sharing
2025-07-19 20:23:52
2
notoriousb.j.g
bg :
yooo check your dm bro
2025-07-19 20:14:18
1
burdensome888
burdensome :
but I only have an i5 laptop with an iGPU. what should I do tho?
2025-07-20 12:27:28
2
brunoblockchain
BRUNO&Co. :
This is HUGE!!!!
2025-07-19 22:03:05
2