AI Frontiers Meetup - Shared screen with speaker view
Joanne Sun
45:53
anything on the screen
Steve Chen
46:29
ok from here
Joanne Sun
46:45
how do you spell it?
Richard Yu
46:58
Hebbian learning
Joanne Sun
47:07
thank you!
Joanne Sun
01:02:32
Is this sparse thing like regularization?
Richard Yu
01:04:43
A lot of the results we saw in this presentation are in this paper: https://arxiv.org/pdf/1903.11257.pdf
Patrick Felong
01:04:48
Do sparse networks use less computational resources to train than equivalent-sized dense ones?
Stephen McInerney
01:05:07
@Joanne: sparsity has tons of effects: reduced power usage, stability, etc.
Joanne Sun
01:06:59
looks like it avoids overfitting like regularization
Edward Keyes
01:09:42
A random pointer: GrAI Matter Labs is working on ML chips which exploit some of this sort of sparse compute architecture to save power in edge-AI applications
Haider Ali
01:13:17
Hi Junling, will this presentation recording be available?
Junling Hu
01:13:47
yes
Joanne Sun
01:16:53
Zoom dropped me. I may have missed some posts
Richard Yu
01:18:23
There is a special class of neural networks called spiking neural networks.
Richard Yu
01:18:28
It includes time as a dimension.
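[Editor's note: a minimal sketch of the spiking-neuron idea mentioned above, using a leaky integrate-and-fire model in plain NumPy-style Python. This is an illustration of the general concept only, not code from the talk; all names and parameter values here are invented for the example.]

```python
def lif_neuron(inputs, leak=0.9, threshold=1.0):
    """Leaky integrate-and-fire neuron: the membrane potential decays
    each time step, accumulates the input, and emits a spike (then
    resets to zero) whenever it crosses the threshold. Time is an
    explicit dimension: the input is a sequence, not a single value."""
    v = 0.0
    spikes = []
    for current in inputs:
        v = leak * v + current      # decay, then integrate this step's input
        if v >= threshold:
            spikes.append(1)        # fire
            v = 0.0                 # reset after the spike
        else:
            spikes.append(0)
    return spikes

print(lif_neuron([0.4, 0.4, 0.4, 0.0, 0.9, 0.3]))  # -> [0, 0, 1, 0, 0, 1]
```

The spike train is sparse in time: most steps produce no output, which is one reason spiking hardware can be so power-efficient.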
Dave -
01:18:44
Nvidia’s new Ampere architecture can skip over 50% of zeros to effectively double performance.
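[Editor's note: the Ampere feature referred to above is 2:4 structured sparsity, where in every group of four consecutive weights at most two are nonzero, letting the hardware skip the zeros. A hedged NumPy sketch of the pruning pattern, not NVIDIA's actual implementation:]

```python
import numpy as np

def prune_2_4(weights):
    """Enforce a 2:4 structured-sparsity pattern: in every group of
    four consecutive weights, zero out the two smallest-magnitude
    entries and keep the two largest. Length must be a multiple of 4."""
    w = weights.reshape(-1, 4).copy()
    # per group, indices of the two smallest-magnitude entries
    drop = np.argsort(np.abs(w), axis=1)[:, :2]
    np.put_along_axis(w, drop, 0.0, axis=1)
    return w.reshape(-1)

w = np.array([0.5, -0.1, 0.05, 0.9, -0.3, 0.2, 0.8, -0.7])
print(prune_2_4(w))  # -> [ 0.5  0.   0.   0.9  0.   0.   0.8 -0.7]
```

Exactly half the weights become zero, which is what lets the tensor cores roughly double effective throughput.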
Dan Goncharov
01:19:19
Thank you, Subutai! Awesome lecture!
mickie winkler
01:20:15
thank you. outstanding. gotta go
Neha Jain
01:21:32
Does this new kind of sparse network solve problems in new domains like art, where current systems don't work well?
Rajeev Seth
01:22:21
Very good new approach for me to try out. Is there a library or code sample available to work with?
Shalini Keshavamurthy
01:25:04
What is the reason for the stability of sparse networks (apart from it being a property of it)?
Joanne Sun
01:26:54
cannot see the screen
Joanne Sun
01:27:40
where is the library
Joanne Sun
01:27:53
zoom is not nice to me
Edward Keyes
01:28:24
https://github.com/numenta/nupic.torch
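[Editor's note: the core idea behind the sparse activations discussed in the talk is k-winner-take-all: keep only the k largest activations and zero the rest. A minimal NumPy illustration of that mechanism; this is not the nupic.torch API linked above, just a sketch of the concept:]

```python
import numpy as np

def k_winners(x, k):
    """k-winner-take-all: keep the k largest activations, zero the rest.
    The result is a sparse activation vector regardless of the input."""
    out = np.zeros_like(x)
    top = np.argsort(x)[-k:]   # indices of the k largest values
    out[top] = x[top]
    return out

x = np.array([0.1, 0.9, 0.3, 0.7, 0.05, 0.6])
print(k_winners(x, 2))  # -> [0.  0.9 0.  0.7 0.   0. ]
```

Because only k units are ever active, downstream layers see mostly zeros, which relates to the earlier questions about regularization-like effects and reduced compute.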
Joanne Sun
01:28:53
thank you very much Edward
Michael Pon
01:30:55
Gordon, please mute
Dino Wun
01:36:55
Can I go now? I need to have dinner
Junling Hu
01:37:21
Sure. See you next time
Haider Ali
01:41:44
Is there an example in nature of animal or insect brains that have full connectivity instead of sparse connectivity?
Shalini Keshavamurthy
01:45:15
Very interesting talk! Thank you so much Subutai.
Kaiwei chen
01:45:32
Thank you for the interesting talk!
Patrick Felong
01:45:34
Thanks, Subutai!
Patrick Felong
01:45:43
Have a great week, all