EE Special Seminar - William J. Dally
Number Representation for Deep Learning
The current resurgence of artificial intelligence is due to advances in deep learning. Systems based on deep learning now exceed human capability in speech recognition, object classification, and playing games like Go. Deep learning has been enabled by powerful, efficient computing hardware. The algorithms used have been around since the 1980s, but it is only in the last decade - when powerful GPUs became available to train networks - that the technology has become practical. Advances in DL are now gated by hardware performance. In the last decade, the efficiency of DL inference on GPUs has improved by 1000x. Much of this gain was due to improvements in data representation, starting with FP32 in the Kepler generation of GPUs and scaling to INT8 and FP8 in the Hopper generation. This talk will review this history and discuss further improvements in number representation, including logarithmic representation, optimal clipping, and per-vector quantization.
Host: Azita Emami
Refreshments after seminar in Moore Courtyard
Contact: Caroline Murphy at 626-395-2084 firstname.lastname@example.org