Who can see your viewing activity?
anything on the screen
ok from here
how to spell
Is this sparse thing like regularization?
A lot of the results we saw in this presentation are in this paper: https://arxiv.org/pdf/1903.11257.pdf
Do sparse networks use less computational resources to train than equivalent-sized dense ones?
@Joanne: sparsity has many effects: reduced power usage, improved stability, etc.
Looks like it avoids overfitting, like regularization does.
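A minimal sketch (not from the talk, plain Python for illustration) of the distinction behind this exchange: L1 regularization *encourages* zeros by shrinking weights during training, whereas a sparse network *enforces* zeros with a fixed connectivity mask, before any training happens.

```python
import random

random.seed(0)
weights = [random.gauss(0, 1) for _ in range(10)]

def soft_threshold(w, lam):
    """One L1-regularization (proximal) step: shrink a weight toward
    zero by lam, zeroing it entirely if its magnitude is below lam."""
    if abs(w) <= lam:
        return 0.0
    return w - lam if w > 0 else w + lam

# L1 regularization: small weights get shrunk to exactly zero.
lam = 0.5
w_l1 = [soft_threshold(w, lam) for w in weights]

# Sparse network: a fixed binary mask decides which connections
# exist at all; masked-out weights are zero by construction.
mask = [random.random() < 0.3 for _ in range(10)]  # keep ~30% of connections
w_sparse = [w if m else 0.0 for w, m in zip(weights, mask)]

print("zeros after L1 shrinkage:", sum(w == 0 for w in w_l1))
print("zeros under sparse mask: ", sum(w == 0 for w in w_sparse))
```

So the two are related (both produce zeros), but sparse networks fix the zero pattern up front rather than discovering it through a training penalty.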
A random pointer: GrAI Matter Labs is working on ML chips which exploit some of this sort of sparse compute architecture to save power in edge-AI applications
Hi Junling, will this presentation recording be available?
zoom dropped me. May have missed some posts
There is a special class of neural networks called spiking neural networks, which include time as a dimension.
Nvidia’s new Ampere architecture can skip over the 50% of weights that are zero to effectively double performance.
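For context: Ampere's sparse tensor cores rely on 2:4 structured sparsity, i.e. in every group of four weights at least two are zero, which is what lets the hardware skip half the multiplies. A hypothetical pure-Python sketch of pruning weights to that pattern (the hardware does this natively; this is just to show the shape of the constraint):

```python
def prune_2_to_4(weights):
    """Prune a flat weight list to 2:4 structured sparsity: in each
    group of four, keep the two largest-magnitude values and zero
    the other two (the pattern Ampere's sparse tensor cores skip)."""
    out = []
    for i in range(0, len(weights), 4):
        group = weights[i:i + 4]
        # indices of the two largest-magnitude entries in this group
        keep = sorted(range(len(group)), key=lambda j: abs(group[j]),
                      reverse=True)[:2]
        out.extend(v if j in keep else 0.0 for j, v in enumerate(group))
    return out

w = [0.9, -0.1, 0.05, -0.7, 0.2, 0.3, -0.01, 0.4]
print(prune_2_to_4(w))  # -> [0.9, 0.0, 0.0, -0.7, 0.0, 0.3, 0.0, 0.4]
```

Exactly half the entries in each group of four become zero, so a matrix multiply only has to touch the surviving half.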
Thank you, Subutai! Awesome lecture!
thank you. outstanding. gotta go
Does this new kind of sparse network solve problems in new domains like art, where current systems don't work well?
Very good new approach for me to try out. Is there a library or code sample available to work with?
What is the reason for the stability of sparse networks (apart from it simply being an inherent property)?
cannot see the screen
where is the library
zoom is not nice to me
thank you very much Edward
Gordon, please mute
Can I go now? I need to have dinner
Sure. See you next time
Is there an example in nature of animal or insect brains that have full connectivity instead of sparse connectivity?
Very interesting talk! Thank you so much Subutai.
Thank you for the interesting talk!
Have a great week, all