
LocalAI

LocalAI is self-hosted AI that runs on consumer-grade hardware, providing a unified interface for serving and running local AI models. It is an open-source platform that supports local deployment of AI models, protects data privacy, and enables efficient AI application development.


Prepare

Before using LocalAI with this guide, please read and confirm the following:

  • Log in to the Websoft9 Console and locate or install LocalAI:

    • Go to My Apps to view installed applications
    • Go to App Store to install the target application
  • This application must be installed through the Websoft9 console.

  • Your use of this application must comply with the MIT open source license.

  • Configure a domain name, or open the required external network ports in the server security group, so the application can be accessed from outside the server.

Getting started

Configuring the AI Model

  1. After installing LocalAI from the Websoft9 console, open the application details under My Apps and obtain the access URL from the Access tab.

  2. Open the access URL in your browser. You cannot start an AI chat immediately; you must configure an AI model first.

  3. Click the [Browse Model Gallery] button on the page and select a model to download and install (recommended: lfm2.5-1.2b-nova-function-calling, which is compact and well suited for demonstrations).

  4. Once a model is configured, you can begin chatting on the page or call LocalAI's OpenAI-compatible API directly, as shown in the sketch after this list.
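Because LocalAI exposes an OpenAI-compatible API, you can also verify the installed model and chat with it programmatically. The following is a minimal sketch, assuming LocalAI is reachable at the access URL from step 1 (shown here as http://localhost:8080) and that the model installed in step 3 is named lfm2.5-1.2b-nova-function-calling; adjust both values to match your deployment.

```python
# Minimal sketch: query a LocalAI instance through its OpenAI-compatible API.
import requests

BASE_URL = "http://localhost:8080"  # assumption: replace with your LocalAI access URL
MODEL = "lfm2.5-1.2b-nova-function-calling"  # assumption: the model installed from the gallery

# List the models LocalAI currently serves, to confirm the install succeeded.
models = requests.get(f"{BASE_URL}/v1/models", timeout=30).json()
print([m["id"] for m in models.get("data", [])])

# Send a chat completion request, the same kind of call the built-in chat page makes.
response = requests.post(
    f"{BASE_URL}/v1/chat/completions",
    json={
        "model": MODEL,
        "messages": [{"role": "user", "content": "Hello, what can you do?"}],
    },
    timeout=120,
)
print(response.json()["choices"][0]["message"]["content"])
```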

Configuration options

Administer

Troubleshooting