@@ -1,13 +1,5 @@
-<h1 align="center">
-  <br>
-  <img height="300" src="https://user-images.githubusercontent.com/2420543/233147843-88697415-6dbf-4368-a862-ab217f9f7342.jpeg"> <br>
-    LocalAI
-<br>
-</h1>
-
-[](https://github.com/go-skynet/LocalAI/actions/workflows/test.yml) [](https://github.com/go-skynet/LocalAI/actions/workflows/image.yml)
-
-[](https://discord.gg/uJAeKSAGDy)
+# LocalAI
+Written in Go, built on the open-source llama.cpp backend for local models; it can be deployed with Docker and has a web UI.

**LocalAI** is a drop-in replacement REST API compatible with OpenAI API specifications for local inferencing. It allows you to run models locally or on-prem with consumer-grade hardware, supporting multiple model families compatible with the `ggml` format. For a list of the supported model families, see [the model compatibility table below](https://github.com/go-skynet/LocalAI#model-compatibility-table).
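
Because the API is a drop-in replacement for the OpenAI spec, an existing client can simply be pointed at the local server. The sketch below, in Go (the language LocalAI itself is written in), shows the idea; the base URL `http://localhost:8080` and the model name `ggml-gpt4all-j` are assumptions and depend on how the server is deployed and which models are installed.

```go
// Minimal sketch: sending an OpenAI-style chat completion request to a
// locally running LocalAI instance. The port and the model name below are
// assumptions; substitute whatever your deployment actually serves.
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"io"
	"net/http"
)

type chatMessage struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string        `json:"model"`
	Messages []chatMessage `json:"messages"`
}

func main() {
	reqBody, err := json.Marshal(chatRequest{
		Model: "ggml-gpt4all-j", // assumed model name; must match a model the server has loaded
		Messages: []chatMessage{
			{Role: "user", Content: "How are you?"},
		},
	})
	if err != nil {
		panic(err)
	}

	// Same request shape an OpenAI client would send, just aimed at the local endpoint.
	resp, err := http.Post("http://localhost:8080/v1/chat/completions",
		"application/json", bytes.NewReader(reqBody))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	body, err := io.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}
	fmt.Println(string(body))
}
```

The same request could be issued with any OpenAI-compatible client library by overriding its base URL.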