AMD Ryzen AI CPUs and Radeon RX 7000 GPUs now support running LLMs and AI chatbots locally

AMD says users can run LLMs and AI chatbots locally on Ryzen 7000 and Ryzen 8000 series APUs equipped with the company's new XDNA NPU, as well as on Radeon RX 7000 series GPUs with built-in AI acceleration cores.

In its announcement, AMD walks through the steps in detail: to run the 7-billion-parameter Mistral model, search for and download "TheBloke/OpenHermes-2.5-Mistral-7B-GGUF"; to run the 7-billion-parameter Llama 2, search for and download "TheBloke/Llama-2-7B-Chat-GGUF".
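
For readers who prefer to script these steps rather than use a desktop app, the sketch below is one possible route; it is an illustration on my part, not part of AMD's instructions. It downloads one of the GGUF files named above from Hugging Face and runs it locally with llama-cpp-python. The quantization filename is one of the files TheBloke's repository ships, and the n_gpu_layers setting assumes a GPU-enabled llama.cpp build.

```python
# Minimal sketch (assumes: pip install huggingface_hub llama-cpp-python,
# with llama.cpp built for your GPU backend if you want GPU offload).
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

# Fetch a 4-bit quantized Llama 2 7B Chat file from TheBloke's GGUF repository.
model_path = hf_hub_download(
    repo_id="TheBloke/Llama-2-7B-Chat-GGUF",
    filename="llama-2-7b-chat.Q4_K_M.gguf",
)

# n_gpu_layers=-1 offloads all layers to the GPU; set it to 0 for CPU-only inference.
llm = Llama(model_path=model_path, n_ctx=4096, n_gpu_layers=-1)

reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Why might someone run an LLM locally instead of in the cloud?"}]
)
print(reply["choices"][0]["message"]["content"])
```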

AMD is not the first company to do this. NVIDIA recently launched "Chat with RTX", an AI chatbot that runs on GeForce RTX 40 and RTX 30 series GPUs. It uses TensorRT-LLM for acceleration and quickly generates answers grounded in the user's local data.
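
Chat with RTX itself is a packaged NVIDIA application built on TensorRT-LLM, so the following is only a rough illustration of the general pattern described above: answering questions from documents on the local machine with a locally hosted model. The retrieval step is deliberately naive keyword overlap, and the model file path and notes directory are placeholders.

```python
# Illustrative local question-answering over local files; not NVIDIA's implementation.
from pathlib import Path
from llama_cpp import Llama

llm = Llama(model_path="llama-2-7b-chat.Q4_K_M.gguf", n_ctx=4096)  # any local GGUF model

def answer_from_local_docs(question: str, doc_dir: str = "./my_notes") -> str:
    # Split every .txt file in the folder into paragraphs.
    paragraphs: list[str] = []
    for path in Path(doc_dir).glob("*.txt"):
        paragraphs += [p for p in path.read_text(encoding="utf-8").split("\n\n") if p.strip()]

    # Naive retrieval: rank paragraphs by how many words they share with the question.
    q_words = set(question.lower().split())
    ranked = sorted(paragraphs, key=lambda p: len(q_words & set(p.lower().split())), reverse=True)
    context = "\n\n".join(ranked[:3])

    prompt = (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    out = llm(prompt, max_tokens=256, stop=["Question:"])
    return out["choices"][0]["text"].strip()

print(answer_from_local_docs("What do my notes say about the project deadline?"))
```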

Time: 2024-03-08
AMD announced today that users can run GPT-style large language models (LLMs) locally to build their own dedicated AI chatbots.