XCOM 2
[WotC] LLM Support
   
File Size: 62.498 KB
Posted: 31 Jul @ 1:53pm
1 Change Note
You need DLC to use this item.

Description
Adds support for external Large Language Models (LLMs) to generate dialogue and responses during gameplay.

This mod runs a local server to interact with your selected AI model (OpenAI, Gemini, Anthropic, or Ollama), allowing you to inject dynamic, AI-generated text into the game.
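
To give a feel for what such a bridge does, here is a minimal sketch (not the mod's actual server from GitHub) of a tiny local HTTP service that accepts a plain-text prompt and forwards it to Ollama's REST API. The listening port, route, and model name are placeholders; only the Ollama endpoint (http://localhost:11434/api/generate) is a real default.

# Minimal sketch of a prompt-forwarding bridge (not the mod's real server).
# Listens locally, reads a plain-text prompt from a POST body, and forwards
# it to Ollama; the port and model below are placeholders.
import json
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint
MODEL = "llama3"                                     # whichever local model you pulled

class PromptHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the prompt sent by the client.
        length = int(self.headers.get("Content-Length", 0))
        prompt = self.rfile.read(length).decode("utf-8")

        # Forward it to Ollama and wait for the full (non-streamed) reply.
        payload = json.dumps({"model": MODEL, "prompt": prompt, "stream": False}).encode()
        req = urllib.request.Request(OLLAMA_URL, data=payload,
                                     headers={"Content-Type": "application/json"})
        with urllib.request.urlopen(req) as resp:
            reply = json.loads(resp.read())["response"]

        # Hand the generated text back to the caller.
        self.send_response(200)
        self.send_header("Content-Type", "text/plain; charset=utf-8")
        self.end_headers()
        self.wfile.write(reply.encode("utf-8"))

if __name__ == "__main__":
    HTTPServer(("127.0.0.1", 5000), PromptHandler).serve_forever()  # placeholder port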

I expect you to know what you're doing before downloading this. If you've never used SillyTavern and don't know what an API key is, you should probably skip this mod.

🔧 Setup Instructions:

📥 Download the LLM server from GitHub:
https://github.com/Larryturbo/xcom-llm-server

🛠️ Configure your LLM:

Edit the XComLLM.ini file; you can find example set-ups in XComLLMExamples.ini.

Don't keep XComLLM.ini open during generation, so that the server can save the response to it. Edit it only while nothing is being generated (see the sketch after these setup steps for an illustration of this read/write cycle).

▶️ Always launch the server before starting XCOM 2; the mod won't work without it.

🎮 Launch XCOM 2: War of the Chosen normally.
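
As a rough illustration of the configuration step above, here is a minimal Python sketch of the read/write cycle on XComLLM.ini. The section and key names (Settings, Provider, Model, ApiKey, Response) are invented for illustration; the real layout is the one shown in XComLLMExamples.ini, and XCOM's own INI files follow Unreal's config format rather than this simplified one. The point it shows: the file is both read for your settings and written back with the generated response, which is why it shouldn't be held open mid-generation.

# Illustrative sketch only: key names are invented; see XComLLMExamples.ini
# for the real set-ups.
import configparser

INI_PATH = "XComLLM.ini"

def read_settings():
    cfg = configparser.ConfigParser()
    cfg.read(INI_PATH)
    # Hypothetical keys: provider, model, and API key chosen by the user.
    return {
        "provider": cfg.get("Settings", "Provider", fallback="Ollama"),
        "model": cfg.get("Settings", "Model", fallback="llama3"),
        "api_key": cfg.get("Settings", "ApiKey", fallback=""),
    }

def write_response(text):
    cfg = configparser.ConfigParser()
    cfg.read(INI_PATH)
    if not cfg.has_section("Response"):
        cfg.add_section("Response")
    cfg.set("Response", "Text", text)
    # If another program holds the file locked, this write can fail or be lost,
    # hence the advice not to keep XComLLM.ini open during generation.
    with open(INI_PATH, "w", encoding="utf-8") as f:
        cfg.write(f)

if __name__ == "__main__":
    settings = read_settings()
    print("Would query", settings["provider"], "with model", settings["model"])
    write_response("Example generated line of dialogue.")
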
2 Comments
Fading 1 Aug @ 11:12am 
Huh, I was actually looking into something like this. Good job!
jat11241976 1 Aug @ 9:15am 
Wish I knew what SillyTavern or an API key was, but congratulations on the release! :steamthumbsup: