# Local

The **Local LLM integration** allows you to connect to your **preferred** local models and use them directly in StackAI projects.

Once connected, Local Models behave like any other LLM provider in StackAI and can be selected inside AI Agent nodes in your workflows.

For detailed instructions on **setting up and managing** Local Model connections, see the full guide:

[**Local LLM Setup Guide**](https://docs.stackai.com/workflow-builder/llms/local-llm)

***

### Usage Overview

<div align="left"><figure><img src="https://3697023207-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FFSlso1Kjob5CLDrh0dVn%2Fuploads%2FUvxEcDfJsWrn3y7vLz5L%2F6.png?alt=media&#x26;token=0f513c7d-1d44-4e2e-894a-2cf5b7aa2d78" alt=""><figcaption></figcaption></figure></div>

1. Click **New Connection** to create a **Local Model** connection.
2. Enter the endpoint details for the server hosting your model.
3. Use **Test Connection** to verify the connection is **Healthy**.
4. Select the connection when configuring an **AI Agent node** in your workflow.

After the connection is created, it will appear in the connection dropdown when selecting **Local** as the AI Provider.
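Before entering the endpoint details in step 2, it can be useful to confirm the server is reachable and speaks the expected API. Many local model servers (for example, Ollama or vLLM) expose an OpenAI-compatible `/v1/chat/completions` endpoint. The sketch below builds such a request with only the standard library; the base URL, port, and model name are assumptions for illustration and are not specific to StackAI:

```python
import json
import urllib.request

# Hypothetical local endpoint -- replace with the server hosting your model.
BASE_URL = "http://localhost:11434/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build (but do not send) an OpenAI-style chat completion request."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("llama3", "Say hello")
print(req.full_url)  # http://localhost:11434/v1/chat/completions
# To actually test reachability, send it:
#   urllib.request.urlopen(req)
```

If sending the request returns a valid completion, the same endpoint details should pass StackAI's **Test Connection** check.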

***

### Workspace Setup

Workspace administrators can configure **default Local Model connections** and control provider access in the workspace settings.

For detailed instructions on:

* Creating default connections
* Enabling or disabling AI Providers
* Managing connections across the workspace
* Disabling StackAI-managed API keys

See the full guide: [**Local LLM Setup Guide**](https://docs.stackai.com/workflow-builder/llms/local-llm)
