ISA: LoongArch
03-12 2024

ONNX Runtime, the renowned AI inference framework, officially supports LoongArch

Recently, the open-source community behind ONNX Runtime, the renowned AI inference framework, released official version 1.17.0, which includes support for LoongArch. Users can now take this release directly from the ONNX Runtime open-source community to develop and deploy AI inference applications on the Loongson platform, further enriching the LoongArch AI ecosystem.

ONNX Runtime (ORT) has gained significant popularity as an AI inference framework in recent years. It serves as a foundational inference engine for numerous AI applications, running models converted from frameworks such as PyTorch, TensorFlow, and TFLite. In addition, it accommodates various computing backends, including CPU, GPU, NPU, and FPGA, across cloud, edge, and IoT deployments.

During the development of ORT version 1.17.0, Loongson's technology team collaborated closely with the ONNX Runtime community. They contributed 7,697 lines of code to the community's repository and carried out comprehensive vector optimization of core operators such as matrix multiplication, convolution, and transpose. With the community's support, the LoongArch-optimized code passed the community's quality assurance process, including code review and test verification. As of version 1.17.0, the ONNX Runtime community officially provides native support for LoongArch.