Towards this end, 8-bit floating point representations (FP8) were recently proposed for DNN training. However, their applicability was demonstrated on only a few selected models, and significant degradation is observed when popular networks such as MobileNet and Transformer are trained using FP8. This degradation is due to the inherent difference in precision requirements between the forward and backward passes of DNN training.
Hybrid 8-bit Floating Point (HFP8) Training and Inference for Deep Neural Networks. Xiao Sun, Jungwook Choi, Chia-Yu Chen, Naigang Wang, Swagath Venkataramani, et al.
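To make the format concrete, the following is a minimal NumPy sketch of rounding a tensor onto a generic (1 sign, e exponent, m mantissa)-bit floating point grid; the defaults give the 1-5-2 FP8 layout referred to above. The function name quantize_fp8 and the parameters exp_bits and man_bits are assumptions of this sketch, not the authors' implementation, and conventions for special values vary between FP8 proposals.

import numpy as np

def quantize_fp8(x, exp_bits=5, man_bits=2):
    """Round values to a simulated (1, exp_bits, man_bits) floating point
    grid; defaults give the 1-5-2 FP8 layout. Illustrative sketch only:
    real FP8 hardware also fixes rounding modes and special-value codes."""
    bias = 2 ** (exp_bits - 1) - 1
    max_exp = (2 ** exp_bits - 2) - bias     # top exponent code reserved
    min_exp = 1 - bias                       # smallest normal exponent
    max_val = (2 - 2.0 ** -man_bits) * 2.0 ** max_exp

    x = np.asarray(x, dtype=np.float64)
    sign = np.sign(x)
    mag = np.abs(x)

    # Per-value exponent, clamped to the normal range; values below the
    # smallest normal fall onto the subnormal grid at min_exp.
    exp = np.clip(np.floor(np.log2(np.maximum(mag, 2.0 ** min_exp))),
                  min_exp, max_exp)

    # Round the mantissa to man_bits fractional bits and saturate overflow.
    step = 2.0 ** (exp - man_bits)
    return sign * np.minimum(np.round(mag / step) * step, max_val)

For example, quantize_fp8(0.31) returns 0.3125, the nearest 1-5-2 value, while quantize_fp8(1e5) saturates to the format maximum of 57344. Calling the same function with exp_bits=4, man_bits=3 models a 1-4-3 layout of the kind HFP8 uses in the forward pass, paired with 1-5-2 for gradients in the backward pass.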