@welchlabs: The Geometry of Deep Learning - Part 1

Welch Labs
Sunday 24 August 2025 23:35:04 GMT

Comments

mhmd_albazzon
محمد البزون :
I don't get it. Why did we draw the first lines like that? Is there a reason, or is it just random?
2025-08-25 11:22:47
36
acidayan5
acidayan :
I don't think they teach this in CS.
2025-08-25 00:02:30
139
ughbleh1
? :
The visualizations go crazy. Would not have been able to understand any of that without the visuals. Great job
2025-08-25 01:52:54
474
tj_cates
Taylor Cates :
What are these words 😩
2025-08-26 03:45:04
40
alex_p_howard
alex_p_howard :
Can this be used in Economics and Econometrics?
2025-08-25 23:17:04
0
flambruciofilliba
Jesus Christ • Creator :
What goes on when training it?
2025-08-25 05:51:59
1
bananapinsker
🎀Aeden🎀 :
It kinda feels like integration, to be honest, where continuously adding neurons brings the accuracy up.
2025-08-25 01:34:22
6
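
A minimal sketch of the intuition in the comment above (assumptions mine, not from the video): a single hidden layer of ReLU units with evenly spaced breakpoints, where only the output weights are fit by ordinary least squares. Widening the layer refines the piecewise-linear fit, loosely like refining a Riemann sum.

```python
import numpy as np

# Approximate sin(x) with a single hidden layer of ReLU units whose
# breakpoints are spread evenly over the interval. Only the output
# weights are fit (by least squares), so no gradient descent is involved.
def relu_features(x, n_hidden):
    # One ReLU unit per evenly spaced breakpoint: max(0, x - b_i).
    breaks = np.linspace(-np.pi, np.pi, n_hidden)
    return np.maximum(0.0, x[:, None] - breaks[None, :])

x = np.linspace(-np.pi, np.pi, 500)
y = np.sin(x)

for n_hidden in (2, 8, 32, 128):
    H = np.column_stack([relu_features(x, n_hidden), np.ones_like(x)])  # add a bias column
    w, *_ = np.linalg.lstsq(H, y, rcond=None)                           # fit output weights
    max_err = np.abs(y - H @ w).max()
    print(f"{n_hidden:4d} hidden units -> max error {max_err:.4f}")
```

Running it prints the maximum error shrinking as the hidden layer widens, which is the "adding neurons brings the accuracy up" effect the comment describes.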
ogrussy
ogrussy🥀 :
neural nets are definitely one of the biggest step innovations in human history
2025-08-25 05:37:36
30
yaaaaaahhhh11
glitch01 :
This is a theoretical result. We don’t know much about whether back prop gets us these clean classes
2025-08-25 03:34:33
7
oh.right.on
DT :
those are all definitely words.
2025-08-25 03:09:25
7
trent.greenan
Trent Greenan, MS, MBA :
Deep learning is so cool!
2025-08-25 06:23:55
0
memoakten
Memo Akten :
The universal approximation theorem only proves that for any given function, there exists a single-hidden-layer ANN that can approximate it to any given precision. *Finding* that ANN is a different problem :)
2025-08-25 04:24:58
54
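
One standard formal statement of what this comment describes (the sup-norm version on a compact domain, in the style of Cybenko 1989 / Leshno et al. 1993), added here as a point of reference:

```latex
% Universal approximation, single hidden layer (existence only).
% For any continuous f on a compact K \subset \mathbb{R}^d, any continuous
% non-polynomial activation \sigma, and any \varepsilon > 0, there exist a
% width N and parameters v_i, w_i, b_i such that
\[
  \sup_{x \in K} \left|\, f(x) \;-\; \sum_{i=1}^{N} v_i \, \sigma\!\left(w_i^{\top} x + b_i\right) \right| \;<\; \varepsilon .
\]
% The theorem guarantees that such a network exists; it says nothing
% about how large N must be or whether training will find it.
```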
atomicself001
atomicself001 🇨🇦 :
Could use this for space travel navigation.
2025-09-03 00:54:00
0
dingdong8111
ding-dong :
This is absolutely excellent educational content
2025-08-25 03:47:13
50
citlalisstuff
Citlali :
Can you explain some use case/application? Why would I elect to have this information via the 2D neural network vs some other means of storing/accessing the data? Just better compared to the 1000+ version?
2025-08-25 22:40:08
0
kevinanders0n
Kevin :
You’re asking questions that we cannot solve. Higher dimensional math is true
2025-09-28 06:53:58
2
nosnej7
nosnej7 :
I was just about to say that
2025-08-27 14:05:40
5
plumpestbadger
Dysentery Gary :
Ready for part 2
2025-08-25 06:27:06
2
josephlouislagrange
Joseph Louis Lagrange :
Slow convergence is still convergence. Fourier series are practically divergent for many waveforms simply because convergence is so slow. Idk anything about ML but all you’re doing is selecting a more efficient subsequence—that it gives the same limit is a trivial consequence of the first sequence converging in the first place
2025-09-02 17:03:09
1
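
The elementary fact this comment leans on, written out:

```latex
% A subsequence of a convergent sequence converges to the same limit:
\[
  a_n \to L \quad\Longrightarrow\quad a_{n_k} \to L
  \ \text{ for every strictly increasing } (n_k)_{k \ge 1},
\]
% so picking a faster-converging subsequence can improve efficiency,
% but it cannot change the limit being approximated.
```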
evod.333
Evo-D 333 :
Instead of a flat plane, each layer should be hyperbolic; when it flips scale, you'll get a whole other set of data with multiple intersecting flat planes.
2025-09-30 17:30:28
0
mcdonaldscupfullofpiss
mcdonald’s cup full of piss :
Universal approximation theorems are actually BAD news :-) there is no such thing as a free lunch.
2025-09-15 20:35:49
0
noorrkam
Noor Kamal :
Can I find these videos on YouTube?
2025-09-19 01:34:06
0
alg56y
fighter 23 :
The main problem is generalization, and the universal approximation theorem does not tell us anything about it. By the way, from statistical learning theory, neural networks are just dumb approximators that anyone can use.
2025-08-25 05:52:23
0
quelquechose05
quelque-chose :
So glad to see you back making content! Your earlier videos on neural networks are golden and contain such deep insights.
2025-08-25 07:58:10
0