
45:53
Can you see anything on the screen?

46:29
It's OK from here.

46:45
How do you spell that?

46:58
Hebbian learning

47:07
thank you!

01:02:32
Is this sparsity similar to regularization?

01:04:43
A lot of the results we saw in this presentation are in this paper: https://arxiv.org/pdf/1903.11257.pdf

01:04:48
Do sparse networks use fewer computational resources to train than equivalent-sized dense ones?

01:05:07
@Joanne: sparsity has many effects: reduced power usage, stability, etc.

01:06:59
It looks like it avoids overfitting, similar to regularization.
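
To make the regularization comparison concrete, here is an illustrative sketch (not from the talk; the function names are my own) contrasting classic L1 weight regularization, which only encourages small weights, with an explicit k-winners rule that zeroes all but the top-k activations:

```python
# Illustrative sketch, not from the talk: contrasting L1 regularization,
# which merely encourages small weights, with k-winners activation sparsity,
# which zeroes all but the k largest activations outright.
import torch

def l1_penalty(model, lam=1e-4):
    # Classic regularization: add lambda * sum(|w|) to the training loss.
    return lam * sum(p.abs().sum() for p in model.parameters())

def k_winners(x, k):
    # Enforced sparsity: keep only the k largest activations in each row.
    topk = torch.topk(x, k, dim=-1)
    mask = torch.zeros_like(x).scatter_(-1, topk.indices, 1.0)
    return x * mask

model = torch.nn.Linear(10, 5)
print(l1_penalty(model))                   # scalar penalty added to the loss
print(k_winners(torch.randn(2, 10), k=3))  # each row: exactly 3 nonzeros
```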

01:09:42
A random pointer: GrAI Matter Labs is working on ML chips that exploit this sort of sparse compute architecture to save power in edge-AI applications.

01:13:17
Hi Junling, will this presentation recording be available?

01:13:47
yes

01:16:53
Zoom dropped me. I may have missed some posts.

01:18:23
There is a special class of neural networks called spiking neural networks.

01:18:28
It includes time as a dimension.
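
For illustration, here is a minimal leaky integrate-and-fire (LIF) neuron sketch, with made-up constants, showing how a spiking model is simulated explicitly over time steps:

```python
# Minimal leaky integrate-and-fire (LIF) neuron, with illustrative constants,
# showing how a spiking model is simulated step by step over time.
import numpy as np

def lif(input_current, dt=1.0, tau=20.0, v_thresh=1.0, v_reset=0.0):
    v, spikes = 0.0, []
    for i in input_current:            # one loop iteration per time step
        v += dt / tau * (-v + i)       # leaky integration of the input
        if v >= v_thresh:              # crossing threshold emits a spike
            spikes.append(1)
            v = v_reset                # membrane potential resets after a spike
        else:
            spikes.append(0)
    return spikes

print(lif(np.full(100, 1.5)))  # constant drive produces a regular spike train
```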

01:18:44
Nvidia's new Ampere architecture can skip the 50% of weights that are zero (its 2:4 structured sparsity pattern) to effectively double performance.
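
For context, Ampere's sparse tensor cores expect a 2:4 structured pattern: at most 2 nonzeros in every aligned group of 4 weights. The sketch below, in plain NumPy with a hypothetical helper name, shows what enforcing that pattern looks like; real pruning would be done with NVIDIA's tooling rather than by hand:

```python
# Sketch of the 2:4 structured pattern (hypothetical helper name): in every
# aligned group of 4 weights, only the 2 largest-magnitude entries survive,
# which is what lets the hardware skip the zeros.
import numpy as np

def prune_2_to_4(w):
    groups = w.reshape(-1, 4)                   # view weights in groups of 4
    order = np.argsort(np.abs(groups), axis=1)  # rank each group by magnitude
    mask = np.ones_like(groups)
    np.put_along_axis(mask, order[:, :2], 0.0, axis=1)  # zero the 2 smallest
    return (groups * mask).reshape(w.shape)

w = np.random.randn(2, 8)
print(prune_2_to_4(w))  # every aligned group of 4 now has exactly 2 zeros
```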

01:19:19
Thank you, Subutai! Awesome lecture!

01:20:15
Thank you. Outstanding. Gotta go.

01:21:32
Does this new kind of sparse network solve problems in new domains, like art, where current systems don't work well?

01:22:21
A very good new approach for me to try out. Is there a library or code sample available to work with?

01:25:04
What is the reason for the stability of sparse networks (apart from it simply being an inherent property)?

01:26:54
I cannot see the screen.

01:27:40
Where is the library?

01:27:53
Zoom is not being nice to me.

01:28:24
https://github.com/numenta/nupic.torch
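
Not from the talk itself: below is a minimal plain-PyTorch sketch (class and parameter names are my own) of the sparse-layer idea that nupic.torch packages as ready-made modules, combining a fixed random weight mask with a k-winners-take-all activation. Check the repo above for the library's actual API, which will differ from this toy version:

```python
# Minimal plain-PyTorch sketch (names are my own, not nupic.torch's API) of
# static weight sparsity plus k-winners activation sparsity in one layer.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseLayer(nn.Module):
    def __init__(self, in_dim, out_dim, weight_sparsity=0.5, percent_on=0.1):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.k = max(1, int(out_dim * percent_on))
        # Fixed random mask: this fraction of weights stays zero permanently.
        mask = (torch.rand(out_dim, in_dim) > weight_sparsity).float()
        self.register_buffer("mask", mask)

    def forward(self, x):
        y = F.linear(x, self.linear.weight * self.mask, self.linear.bias)
        topk = torch.topk(y, self.k, dim=-1)   # k-winners-take-all activation
        return torch.zeros_like(y).scatter_(-1, topk.indices, topk.values)

layer = SparseLayer(128, 64)
out = layer(torch.randn(4, 128))
print((out != 0).sum(dim=-1))  # ~6 active units per row (10% of 64)
```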

01:28:53
Thank you very much, Edward.

01:30:55
Gordon, please mute.

01:36:55
Can I go now? I need to have dinner.

01:37:21
Sure. See you next time

01:41:44
Is there an example in nature of animal or insect brains that have full connectivity instead of sparse connectivity?

01:45:15
Very interesting talk! Thank you so much Subutai.

01:45:32
Thank you for the interesting talk!

01:45:34
Thanks, Subutai!

01:45:43
Have a great week, all