@prestonrho: go local #ai

prestonrhodes
Region: US
Saturday 19 July 2025 20:10:35 GMT
38659
2312
117
60

Comments

downey571
downey571 :
How much computing power (RAM, GPU, processor, storage, etc.) do you actually need to run decent models locally?
2025-07-19 20:56:35
62
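A rough way to answer the sizing question above: weight memory is roughly parameter count times quantization width, plus some overhead for the KV cache and activations. The function below is a rule-of-thumb sketch (the 20% overhead factor is an assumption, not a benchmark), but it shows why an 8B model at 4-bit quantization fits easily in a 16 GB machine.

```python
def model_memory_gb(params_b: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough RAM/VRAM estimate: weights at the given quantization width,
    plus ~20% overhead for KV cache and activations (rule of thumb only)."""
    weight_gb = params_b * bits_per_weight / 8  # billions of params -> GB
    return round(weight_gb * overhead, 1)

# An 8B model at 4-bit quantization needs only a few GB:
print(model_memory_gb(8, 4))   # ~4.8 GB
# The same model unquantized at fp16 needs far more:
print(model_memory_gb(8, 16))  # ~19.2 GB
```

This is why quantized 7B-8B models are the usual starting point on consumer laptops.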
gyeditz2.0
👽 :
locally? so what should we use
2025-07-20 01:26:26
8
suziengatia.africa
susanngatia :
Tried and failed... teach us your ways....
2025-07-20 04:58:07
17
railbreaker
Railbreaker :
So many people don’t get this. All the extra garbage layered on top of inference, MCP servers and the rest, is why people can’t prompt consistently
2025-07-20 02:07:28
23
wallsareclosingin
Warren Cain :
Everyone, you can run an 8B-parameter model on a 2020 MacBook Air (16 GB RAM, 512 GB SSD) SUPER easily, and the latest ones are amazing. Use Qwen 3 8B!
2025-07-21 02:21:35
13
todolist44
todolist44 :
What about running it on your own cloud server? Not local, but still kind of under your control
2025-07-20 08:22:49
3
kellenchase
kellenchase :
Ooof… this one hit me in the “I don’t have the budget for a Mac Studio this month”
2025-07-20 00:01:17
9
everythingis.music
Mac :
How can you expect to scale a local model properly without the compute power? It can only do so much on a local machine
2025-07-20 17:41:52
1
redbick15
Red Kings :
How different is the setup compared to GPU mining? I’ve got six 3070s lying around from back in the day mining Ethereum.
2025-07-20 16:57:14
1
jay_albert_
Jonathan Albert :
You might be interested in Lilypad Network
2025-07-20 07:09:08
2
cliffordtmeece
Clifford T Meece :
I just got an Nvidia RTX 3060 12GB for 200 bucks and it does over 50 tokens a second with Llama 3.
2025-07-26 06:16:12
0
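Throughput numbers like that translate directly into response latency. A minimal sketch (the 400-token reply length is an arbitrary example, and this ignores prompt-processing time):

```python
def response_time_s(tokens: int, tokens_per_s: float) -> float:
    """Naive generation-time estimate: tokens to generate / generation rate.
    Ignores prompt processing and time-to-first-token."""
    return round(tokens / tokens_per_s, 1)

# At ~50 tokens/s, a 400-token answer streams out in about 8 seconds:
print(response_time_s(400, 50))  # 8.0
```

Anything above roughly 20-30 tokens/s already feels faster than reading speed, which is why a $200 card can be perfectly usable.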
yami.zxc
Yami.zxc :
yo, any ideas for a cloud GPU pool?
2025-07-23 03:46:47
1
jacobluanjohnston
neuralnets432 :
basic fact of life that the chinese knew way before neural networks existed bro
2025-07-21 22:06:13
0
finiteryan
Ryan :
Google LM Studio to get started with text-to-text AI gen; heaps of models to choose from. Nvidia video cards are the way to go
2025-07-20 08:01:37
1
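LM Studio can expose a local OpenAI-compatible server (by default at http://localhost:1234/v1 once enabled in the app). A stdlib-only sketch of calling it, assuming that server is running; the model name is a placeholder, since LM Studio routes to whatever model is loaded:

```python
import json
import urllib.request

# Build a chat request against LM Studio's OpenAI-compatible endpoint.
payload = {
    "model": "local-model",  # placeholder; LM Studio uses the loaded model
    "messages": [{"role": "user", "content": "Write a haiku about local AI."}],
}
req = urllib.request.Request(
    "http://localhost:1234/v1/chat/completions",
    data=json.dumps(payload).encode(),
    headers={"Content-Type": "application/json"},
)

# Uncomment once the LM Studio local server is running:
# with urllib.request.urlopen(req) as r:
#     print(json.load(r)["choices"][0]["message"]["content"])
```

Because the API shape matches OpenAI's, most existing client code can be pointed at the local server just by changing the base URL.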
theladykelly
Lady Kelly McCoy Williams :
Preston, you are spot on.
2025-07-19 21:21:24
6
notoriousb.j.g
bg :
yooo check your dm bro
2025-07-19 20:14:18
1
marxeeeeeee
Marxe :
Local models atm just don’t match Claude Opus or Sonnet. Every agent I’ve written just works better with Sonnet 3.7 or GPT-4 compared to everything else
2025-07-22 17:16:06
0
alteredverse_sg
alt_verse :
I’m working on a local llm packaged with unreal engine
2025-07-20 00:58:06
7
burningdicegames
Burning Dice :
I had Deepseek locally and it worked okay - Def need a really good computer though for true performance
2025-07-23 18:26:35
0
lurkingshadow008
lurkingshadow008 :
I’m not super techy but I’m trying to figure out how to run models locally
2025-07-22 04:18:45
0
fromdirttostars
fromdirttostars :
Basic security as well, right?
2025-07-20 22:25:28
0
westonheyer
Yamex :
The gooner market is huge
2025-07-22 02:22:45
0
rosscon96
Cowner :
Yeah and then the local model melts after ten minutes of operation
2025-07-21 13:38:32
1
zee.loves.u
Zee.Loves.U :
Omg can u teach us. I’m Chinese too
2025-07-20 17:13:07
0
ewigunddreitage
Ab3mad :
bro, it is not safe what you are broadcasting, take care, stay sharp.
2025-07-20 07:18:27
1