AI Inference with Versal AI Core Series

2021-10-21

● 2.7X performance/watt vs. competing FPGAs for cloud acceleration
● Accelerates the whole application, from pre-processing to post-processing
● Adaptable to evolving AI algorithms
CHALLENGE
Applied machine learning techniques have become pervasive across a wide range of applications, with especially strong growth in vision and video. FPGA-based AI/ML acceleration has already shown performance and latency advantages over GPU accelerators, but next-generation CNN-based workloads demand compute density beyond what traditional FPGA programmable logic and multipliers can offer. Fabric-based DSP blocks offer flexible precision and remain capable accelerators, but their bit-level interconnect and fine-grained programmability carry overhead that limits scalability for the most compute-intensive CNN workloads.
SOLUTION: VERSAL AI CORE ACAP FOR AI COMPUTE ACCELERATION
The Versal™ AI Core adaptive compute acceleration platform (ACAP) is a highly integrated, multicore, heterogeneous device that can dynamically adapt at both the hardware and software levels to a wide range of AI workloads, making it well suited to cloud accelerator cards. The platform integrates next-generation Scalar Engines for embedded compute, Adaptable Engines for hardware flexibility, and Intelligent Engines comprising DSP Engines and revolutionary AI Engines for inference and signal processing. The result is an adaptable accelerator that exceeds the performance, latency, and power efficiency of traditional FPGAs and GPUs for AI/ML workloads.
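
The brief stops at the architecture level and does not name a software flow. As one hedged illustration only, the sketch below assumes the Vitis AI runtime (VART) Python API and a CNN already compiled to an .xmodel targeting the device's AI Engine-backed DPU; the model file name, tensor shapes, and int8 quantization are assumptions for illustration, not details taken from this document.

    # Hypothetical sketch: dispatching a compiled CNN (.xmodel) through the
    # Vitis AI runtime (VART) on a Versal AI Core device. The model file name,
    # batch size, and int8 quantization are illustrative assumptions.
    import numpy as np
    import vart
    import xir

    # Load the compiled model and pick the subgraph mapped to the DPU/AI Engines.
    graph = xir.Graph.deserialize("cnn_int8.xmodel")   # hypothetical model file
    subgraphs = graph.get_root_subgraph().toposort_child_subgraph()
    dpu_subgraph = next(s for s in subgraphs
                        if s.has_attr("device") and s.get_attr("device").upper() == "DPU")

    # Create a runner bound to that subgraph.
    runner = vart.Runner.create_runner(dpu_subgraph, "run")

    # Allocate host buffers matching the model's input/output tensor shapes.
    in_dims = tuple(runner.get_input_tensors()[0].dims)    # e.g. (1, 224, 224, 3)
    out_dims = tuple(runner.get_output_tensors()[0].dims)
    input_buf = [np.zeros(in_dims, dtype=np.int8)]          # pre-processed, quantized frame
    output_buf = [np.empty(out_dims, dtype=np.int8)]

    # Submit the inference job asynchronously and wait for the accelerator.
    job_id = runner.execute_async(input_buf, output_buf)
    runner.wait(job_id)
    # output_buf[0] now holds the raw results for post-processing on the host.

In a flow like this, the Scalar Engines and Adaptable Engines would typically handle the pre- and post-processing around the accelerated subgraph, which is the whole-application acceleration the brief highlights.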
