@prestonrho: The great AI migration is here. Businesses are waking up and joining the local AI discussion. What side will you stand on? #techbro #ai #llm #chatgpt #airevolution #businessowner

prestonrhodes
Region: US
Wednesday 28 May 2025 21:47:07 GMT
Comments

plx01131
Shack Studios :
Last night I made a spec on pcpartpicker specifically designed for running a 32B local LLM for AI dev work, and it was over $10,000. Wish it weren't so, but it is. For a solo dev or small business that's a serious price tag. If you're interested in having agents, backup agents (like Kubernetes for agents), LLM-powered search, and LLM chatbots with RAG plus hallucination detection and correction systems, etc., I'm not even sure how much the hardware and networking infrastructure would cost.
2025-05-29 12:20:54
4
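For anyone wondering what the software side of that stack looks like before the hardware bill, here is a rough sketch of a local RAG answer flow, assuming an Ollama server running on its default port (11434); the model names ("llama3", "nomic-embed-text") and the sample documents are placeholders, not recommendations.

```python
# Minimal local RAG sketch against a locally running Ollama server.
# Assumptions: Ollama is serving on http://localhost:11434 and the
# "llama3" and "nomic-embed-text" models have been pulled locally.
import math
import requests

OLLAMA = "http://localhost:11434"

def embed(text: str) -> list[float]:
    # /api/embeddings returns {"embedding": [...]}
    r = requests.post(f"{OLLAMA}/api/embeddings",
                      json={"model": "nomic-embed-text", "prompt": text})
    r.raise_for_status()
    return r.json()["embedding"]

def cosine(a: list[float], b: list[float]) -> float:
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def answer(question: str, docs: list[str]) -> str:
    # Retrieve the most similar document, then ground the answer in it.
    doc_vecs = [embed(d) for d in docs]
    q_vec = embed(question)
    best_doc, _ = max(zip(docs, doc_vecs), key=lambda pair: cosine(q_vec, pair[1]))
    prompt = (f"Answer using only this context:\n{best_doc}\n\n"
              f"Question: {question}\n"
              "If the context does not contain the answer, say so.")
    r = requests.post(f"{OLLAMA}/api/chat",
                      json={"model": "llama3", "stream": False,
                            "messages": [{"role": "user", "content": prompt}]})
    r.raise_for_status()
    return r.json()["message"]["content"]

if __name__ == "__main__":
    docs = ["Our refund window is 30 days from delivery.",
            "Support hours are 9am-5pm ET, Monday through Friday."]
    print(answer("How long do customers have to request a refund?", docs))
```

Everything here runs on one box; the multi-thousand-dollar part of the comment is the GPU and RAM needed to serve the 32B model itself, not this glue code.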
autoaiadvantage
Auto Ai Advantage :
Local AI is really good and gives you full privacy. It requires some money, but if you already have a good PC you can start for free. If you're a company, it definitely takes more money depending on how big you are and what exactly you need.
2025-05-30 08:49:12
0
amin.os2
Amin Os :
Where is your program? Would you send me your link?
2025-05-31 05:50:02
0
jtechsolutions.ai
JT | JTech Solutions :
Totally agree on the privacy/control side. Local AI can be a game changer for the right use case. But for anyone aiming to scale globally or run high-load apps, managing your own compute is a whole new business. You go from building an app to running a data center, not to mention the manpower, networking, and ops overhead. Cloud still wins for velocity + reach, especially for enterprise.
2025-06-04 17:55:01
1
zem9735
zem9735 :
You made a point then tried to sell a course
2025-05-29 05:07:34
0
user101108496
Smurf :
Makes sense to go local
2025-05-28 23:19:22
1
amulekone
amulekone :
Awesome video brother
2025-05-28 23:24:25
1
williamgawrysiak
William Gawrysiak :
You will get left behind without the cloud. Think about it.
2025-05-29 00:21:01
0
user4345779655
Username Invalid :
Can you give an example of the difference between ChatGPT output vs. local model output for a sample business prompt?
2025-05-29 02:58:55
0
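No side-by-side answer appears in the thread, but the comparison itself is easy to run: send the same business prompt to a hosted model and to a local one and read both replies. A minimal sketch, assuming an OpenAI API key in the environment and a local Ollama server; both model names are placeholders.

```python
# Side-by-side sketch: the same business prompt sent to a hosted model
# (OpenAI API) and to a local model (Ollama on its default port).
# "gpt-4o-mini" and "llama3" are placeholder model names.
import requests
from openai import OpenAI

PROMPT = "Draft a 3-sentence follow-up email to a customer whose order shipped late."

def hosted(prompt: str) -> str:
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content

def local(prompt: str) -> str:
    r = requests.post("http://localhost:11434/api/chat",
                      json={"model": "llama3", "stream": False,
                            "messages": [{"role": "user", "content": prompt}]})
    r.raise_for_status()
    return r.json()["message"]["content"]

if __name__ == "__main__":
    print("--- hosted ---\n" + hosted(PROMPT))
    print("--- local ----\n" + local(PROMPT))
```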
amin.os2
Amin Os :
thanks
2025-05-31 05:50:07
0
anooko8
Anooko :
@prestonrhodes Can we own it if we rent a server on the cloud and install it ourselves?
2025-06-10 17:56:05
0
limitedbyhumanity
Limited By Humanity :
Local is the way. We should chat
2025-05-29 17:54:31
0
nutrichem.ai
nutrichem.ai :
You tried the new Mistral yet?
2025-05-28 22:25:03
0
user7778749598841
user7778749598841 :
The only issue is that at scale the time and LOE (level of effort) for local LLMs is huge… we're hybrid while securing a seed round, but…
2025-05-28 22:47:36
3
roadtoredemption45
carloscue❌ :
It's definitely safer for local AI to handle the AI services. Just get the necessary hardware for however the AI is being used and you're set.
2025-05-29 09:41:14
0
_marvharris
Marv Harris :
Hybrid approach like enterprises do
2025-05-28 23:14:58
0
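Several commenters land on the same hybrid idea: keep sensitive prompts on local hardware and push generic load to the cloud. A minimal routing sketch under that assumption; the keyword list and the stub backends are illustrative placeholders, not a real data-classification policy.

```python
# Rough sketch of the "hybrid" approach mentioned above: prompts that touch
# sensitive data stay on a local model, everything else goes to a hosted API.
SENSITIVE_MARKERS = ("ssn", "patient", "payroll", "contract")

def route(prompt: str, call_local, call_cloud) -> str:
    if any(marker in prompt.lower() for marker in SENSITIVE_MARKERS):
        return call_local(prompt)   # data never leaves your own hardware
    return call_cloud(prompt)       # cheaper/faster to scale for generic load

if __name__ == "__main__":
    # Stub backends so the routing logic can be exercised without any API keys.
    echo_local = lambda p: f"[local] {p}"
    echo_cloud = lambda p: f"[cloud] {p}"
    print(route("Summarize this payroll report", echo_local, echo_cloud))
    print(route("Write a product tagline", echo_local, echo_cloud))
```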