@dianasaurbytes: Interesting product lessons from Nvidia's recent release of ChatRTX, a "beta" product that lets you run LLMs locally on your Windows computer, as long as you have a supported Nvidia RTX GPU. #generativeai #ai #llm #productmanager #techstartup

dianasaurbytes
Monday 08 April 2024 21:33:10 GMT

Comments

diegomendez5183
Letitride:
so your GPU would have more spyware
2024-04-09 10:42:56
0
alexnautilus
Alex:
Very generous review, though; ChatRTX is very basic compared to LM Studio.
2024-04-08 21:39:46
0
dylans8036
Dylan S:
Pretty wild that any kind of LLM trained on what I assume is probably terabytes of data could even run locally
2024-04-08 21:53:13
0
evilspyboy
Andrew B:
I'll need to go check it out and see if it's better than LM Studio
2024-04-23 03:12:32
2
travis.hoyt
Travis Hoyt:
You can do it now with Ollama, and it doesn't require an Nvidia GPU. MacBooks with M1/M2 chips can also run it.
2024-04-09 10:58:56
0
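
For context on this comment: Ollama serves a local HTTP API (by default on http://localhost:11434), so a model pulled with its CLI can also be queried from code, with no Nvidia GPU required; on Apple silicon it runs on the machine's integrated GPU. A minimal Python sketch, assuming Ollama is running locally and a model tagged "llama3" has already been pulled (the model name and prompt are illustrative, not from the thread):

```python
# Minimal sketch: query a locally running Ollama server.
# Assumes `ollama serve` is running and `ollama pull llama3` has been done.
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",  # Ollama's default local endpoint
    json={
        "model": "llama3",                   # any locally pulled model tag
        "prompt": "In one sentence, why do local LLMs matter?",
        "stream": False,                     # one JSON object, not a stream
    },
    timeout=120,
)
resp.raise_for_status()
print(resp.json()["response"])               # the generated completion text
```
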
kiwi_feathers
kiwi_feathers:
too bad I don't have an RTX video card
2024-04-09 18:57:53
0
elonmusk_laminatedface0
ElonMusk:
Try “Faraday”
2024-04-08 22:42:19
1
boombanger
boombanger:
That chatbot must be reaching out to the full LLM?
2024-04-08 21:42:14
0
megatronite5
Mega Tronite:
Need this on a mobile device! Do a vid on that 😁
2024-04-11 19:23:50
0
nevin_esr
nsr:
Ollama is what you're looking for
2024-09-13 20:06:52
1
trevsbadcontent
Trev:
I'm hoping the local LLM movement keeps up its open-source alliance, so we can avoid centrally operated, closed, data-gathering, censored products
2024-04-08 22:46:32
1