Local LLM Plugin

Load and run Large Language Models (LLMs) locally in your project

  • Supported Platforms
  • Supported Engine Versions
    5.3 - 5.4
  • Download Type
    Engine Plugin
    This product contains a code plugin, complete with pre-built binaries and all of its source code, that integrates with Unreal Engine. It can be installed into an engine version of your choice and then enabled on a per-project basis.

Documentation: Link

Demo Project: GitHub, EXE

Demo Video: English, Japanese


This product is designed to integrate AI chatbots into games without relying on online services.

The plugin lets you load large language models (LLMs) in GGUF format and run them in Unreal Engine.



Run locally and within BP/C++
  • Runs offline on a local PC.
  • Just add one component to your BP and you are ready to use it.
  • No Python or dedicated server is required.



Useful features
  • Runs asynchronously; additional questions can be asked at any time while an answer is being generated.
  • You can save and load a "state" that preserves the context of a conversation, allowing you to resume a previous conversation later.
  • Supports multibyte characters.



Hardware Requirements

Requires a CPU that supports AVX, AVX2 and FMA.

The following CPUs should work, but please try our free demo EXE to verify that your CPU is supported.

  • Intel: 4th Generation (Haswell) and above
  • AMD: All Ryzen series
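
Whether a CPU meets this requirement can also be checked programmatically. The sketch below is illustrative and not part of the plugin: it parses the "flags" line of a Linux-style /proc/cpuinfo dump for the three required instruction sets. On Windows, a tool such as CPU-Z, or simply running the demo EXE, serves the same purpose.

```python
# Check whether a CPU reports the AVX, AVX2, and FMA instruction sets.
# Illustrative sketch only: parses the "flags" line found in a
# Linux-style /proc/cpuinfo dump.

REQUIRED_FLAGS = {"avx", "avx2", "fma"}

def supports_required_flags(cpuinfo_text: str) -> bool:
    """Return True if every required instruction-set flag is present."""
    flags = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("flags"):
            # e.g. "flags : fpu vme ... avx ... fma ... avx2 ..."
            flags.update(line.split(":", 1)[1].split())
    return REQUIRED_FLAGS.issubset(flags)

# A Haswell-era (Intel 4th gen) CPU advertises all three flags.
haswell_like = "flags\t: fpu sse sse2 avx fma avx2 bmi1"
print(supports_required_flags(haswell_like))  # True

# An older Sandy Bridge CPU has AVX but not AVX2 or FMA.
sandy_bridge_like = "flags\t: fpu sse sse2 avx"
print(supports_required_flags(sandy_bridge_like))  # False
```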



Supported models

The following models have been tested and confirmed to work.

  • llama-3 7B
  • Phi-3-medium
  • Gemma 7B
  • Gemma-2 9B
  • Mistral 7B
  • ArrowPro 7B KUJIRA (Japanese model)
  • ELYZA JP 8B (Japanese model)


Technical Details

Features:

  • Load models in GGUF format
  • Asynchronous execution of text generation using the loaded models
  • Save and load a "state" that preserves the context of a conversation

Code Modules:

  • Local LLM (Runtime)

Number of Blueprints: 1

Number of C++ Classes: 3+

Network Replicated: No

Supported Development Platforms: Windows 64bit

Supported Target Build Platforms: Windows 64bit

Documentation: Link

Demo Project: GitHub, EXE