Deep Learning on a Xilinx FPGA with MATLAB Code

Published on 12/17/22 in How-to & Learning

FPGA-based hardware is a good fit for deep learning inferencing on embedded devices because it delivers low latency and low power consumption. Early prototyping is essential to developing a deep learning network that can be efficiently deployed to an FPGA.

See how Deep Learning HDL Toolbox™ automates FPGA prototyping of deep learning networks directly from MATLAB®. With a few lines of MATLAB code, you can deploy to and run inferencing on a Xilinx® ZCU102 FPGA board. This direct connection allows you to run deep learning inferencing on the FPGA as part of your application in MATLAB, so you can converge more quickly on a network that meets your system requirements.
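As a minimal sketch of that workflow, the MATLAB code below connects to a ZCU102 board, compiles and deploys a network, and runs inferencing from MATLAB. It assumes you already have a pretrained network in a variable net and a preprocessed input image img; the JTAG interface and the zcu102_single bitstream are illustrative choices, and your board setup may differ.

% Connect to the Xilinx board (JTAG shown here; Ethernet is also supported).
hTarget = dlhdl.Target('Xilinx', 'Interface', 'JTAG');

% Create a workflow for a pretrained network using a ZCU102
% single-precision bitstream shipped with the toolbox.
hW = dlhdl.Workflow('Network', net, ...
    'Bitstream', 'zcu102_single', ...
    'Target', hTarget);

% Compile the network into instructions and weights for the FPGA's
% deep learning processor, then program the board.
hW.compile;
hW.deploy;

% Run inferencing on the FPGA from MATLAB and profile the result.
[prediction, speed] = hW.predict(img, 'Profile', 'on');
[~, idx] = max(prediction);

Because predict returns results directly to the MATLAB workspace, you can call it inside a larger MATLAB application and iterate on the network until it meets your latency and accuracy requirements.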
--------------------------------------------------------------------------------------------------------
Get a free product trial: https://goo.gl/ZHFb5u
Learn more about MATLAB: https://goo.gl/8QV7ZZ
Learn more about Simulink: https://goo.gl/nqnbLe
See what's new in MATLAB and Simulink: https://goo.gl/pgGtod

© 2020 The MathWorks, Inc. MATLAB and Simulink are registered trademarks of The MathWorks, Inc.
See www.mathworks.com/trademarks for a list of additional trademarks. Other product or brand names may be trademarks or registered trademarks of their respective holders.
