Open-Source Posit Dot-Product Unit (PDPU) for Deep Learning Applications: A Presentation at ISCAS 2023


Chao Fang (方超)

May 7, 2023, 5:07:50 AM
to Unum Computing
Dear colleagues,

We are excited to share our latest work, "PDPU: An Open-Source Posit Dot-Product Unit for Deep Learning Applications", which will be presented at the 2023 IEEE International Symposium on Circuits and Systems (ISCAS) on Tuesday, May 23rd. Our research team at Nanjing University, consisting of Qiong Li, Chao Fang, and Zhongfeng Wang, will present this work in the Data Path & Arithmetic Circuits & Systems Session. You can find our paper and slides through the provided links.

PDPU is a highly efficient hardware module that computes the dot product of two input vectors, Va and Vb, in a low-precision format and accumulates the products together with the previous output acc into a high-precision result out. This enables more efficient computation and better performance for deep learning applications, which involve large numbers of dot-product operations.
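To illustrate the semantics, here is a minimal behavioral sketch (not the RTL) of the fused dot-product operation described above: all products and the previous accumulator are combined with exact intermediate arithmetic, mimicking a fused unit that rounds only once at the end. The function name and the use of Python's Fraction type are illustrative assumptions, not part of the PDPU implementation.

```python
from fractions import Fraction

def fused_dot_product(va, vb, acc):
    """Behavioral model: out = acc + sum(va[i] * vb[i]).

    Fraction keeps the intermediate sum exact, standing in for the
    wide internal accumulator of a fused hardware unit; a real PDPU
    would round the exact value to the output posit format once,
    at the very end.
    """
    exact = Fraction(acc)
    for a, b in zip(va, vb):
        exact += Fraction(a) * Fraction(b)
    # Single final rounding; float is a stand-in for the
    # high-precision posit output `out`.
    return float(exact)
```

For example, fused_dot_product([1.0, 2.0], [3.0, 4.0], 0.5) combines both products and the prior accumulator before any rounding occurs.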

Our proposed PDPU makes several contributions. First, it implements efficient dot-product operations with fused and mixed-precision properties, reducing area, latency, and power consumption by up to 43%, 64%, and 70%, respectively, compared with discrete architectures. Second, it features a fine-grained six-stage pipeline that minimizes the critical path and improves computational efficiency; the paper details PDPU's structure with a per-stage breakdown of latency and resources. Finally, a configurable PDPU generator allows PDPU to flexibly support various posit data types, dot-product sizes, and alignment widths.
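The configurability described above can be pictured as a small parameter record. The actual open-source generator produces hardware, and the parameter names below are assumptions chosen to mirror the features listed (posit word width and exponent size, dot-product size, alignment width), not the generator's real interface.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class PDPUConfig:
    """Illustrative configuration for a PDPU-style generator (names assumed)."""
    n: int = 8             # posit word width of the low-precision inputs
    es: int = 2            # posit exponent-field width
    size: int = 4          # dot-product size: number of element pairs in Va, Vb
    align_width: int = 16  # alignment (shift) width used before accumulation

    def __post_init__(self):
        # The exponent field must fit inside the posit word alongside
        # the sign bit and at least one regime bit.
        assert self.es < self.n, "exponent field must fit in the posit word"
```

A generator parameterized this way could emit, for instance, an 8-bit (es = 2) unit for inference or a wider configuration for training, without changing the surrounding datapath.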

We believe that PDPU can make a significant contribution to the field of posit arithmetic units and deep learning. We encourage you to explore the provided links to learn more about our research and to use and contribute to the open-source code. If you have any questions or comments, please feel free to reach out to us.

Best regards,
Chao Fang