NeurIPS 2020 Meetup Day 2b

  Machine learning

By attending, you agree to the NeurIPS Code of Conduct.

1) Niklas Maximilian Heim: Neural Arithmetic - Learning to extrapolate beyond the training range

Neural networks are great at approximating functions but often fail to generalize beyond the training range. Neural Arithmetic aims to overcome this issue by assuming that the underlying function is composed of simple arithmetic operations. During the talk we will review the current state of the art in Neural Arithmetic Units, with their advantages and drawbacks, possible applications, and finally our Neural Power Unit (NPU), which was published at this year's NeurIPS.
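As background for the abstract above, a common idea behind multiplicative neural arithmetic units is the log-exp trick: a linear layer applied in log-space computes products and powers of its inputs, which extrapolate exactly once the right integer-like weights are learned. The sketch below is an illustrative toy, not the authors' NPU implementation; the function name and the eps guard are our own assumptions.

```python
import numpy as np

def arithmetic_unit(x, W, eps=1e-7):
    """Toy multiplicative arithmetic unit: y = exp(W @ log(|x|)).

    A weighted sum in log-space becomes a product of powers,
    y_j = prod_i x_i ** W[j, i].  NOTE: this is a simplified sketch,
    not the NPU from the talk; eps guards log(0), and real units
    handle signs and zeros more carefully.
    """
    return np.exp(W @ np.log(np.abs(x) + eps))

x = np.array([2.0, 3.0])
# With weights [1, 1] the unit recovers multiplication: x1 * x2 = 6.
print(arithmetic_unit(x, np.array([[1.0, 1.0]])))
# With weights [2, -1] it recovers x1**2 / x2 = 4/3.
print(arithmetic_unit(x, np.array([[2.0, -1.0]])))
```

Because the weights encode the arithmetic expression itself rather than a local fit, such a unit keeps producing correct outputs far outside the training range, which is the extrapolation property the talk focuses on.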

Short Bio
Niklas just started his PhD at the Artificial Intelligence Center in Prague after completing his MSc in Physics at the University of Copenhagen. He is passionate about problems in physics that can be tackled with computer science and loves using Julia to try to solve them.

2) Lightning Talks: Industry Perspective on Research
Talks by , , and Rossum