Update Local Mode guide

Stanislas0 2024-07-11 16:38:13 +08:00
parent 4ac9e8157e
commit 8be6c39820
6 changed files with 57 additions and 1 deletions


@@ -18,6 +18,15 @@ We introduce CodeGeeX4-ALL-9B, the open-source version of the latest CodeGeeX4 m
## Get Started
### Ollama
CodeGeeX4 is now available on [Ollama](https://ollama.com/library/codegeex4)!
Please install [Ollama 0.2](https://github.com/ollama/ollama/releases/tag/v0.2.0) or later and run the following command:
```bash
ollama run codegeex4
```
To connect the local model to our [VS Code](https://marketplace.visualstudio.com/items?itemName=aminer.codegeex) / [Jetbrains](https://plugins.jetbrains.com/plugin/20587-codegeex) extensions, please check [Local Mode Guideline](./guides/Local_mode_guideline.md).
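Once the model is running, it can also be queried directly over Ollama's HTTP API. The sketch below is a minimal example, assuming the default address `http://localhost:11434` and the model name `codegeex4` pulled above; it is not part of the extension setup.

```python
import json
import urllib.request

def build_generate_request(prompt: str,
                           model: str = "codegeex4",
                           host: str = "http://localhost:11434"):
    """Build a POST request against Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    req = build_generate_request("# write a bubble sort in Python")
    try:
        # Requires a running `ollama serve` with the model pulled.
        with urllib.request.urlopen(req, timeout=30) as resp:
            print(json.load(resp)["response"])
    except OSError:
        print("Ollama server not reachable; is it running?")
```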
### Huggingface transformers
Use `4.39.0<=transformers<=4.40.2` to quickly launch [codegeex4-all-9b](https://huggingface.co/THUDM/codegeex4-all-9b):
```python
@@ -39,6 +48,7 @@ with torch.no_grad():
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
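The `4.39.0<=transformers<=4.40.2` pin above can be checked before loading the model. Below is a small, hypothetical helper for plain `X.Y.Z` version strings (pre-release suffixes such as `4.40.0.dev0` are not handled):

```python
def transformers_version_supported(version: str) -> bool:
    """Return True if an X.Y.Z version string falls in the
    supported window 4.39.0 <= version <= 4.40.2."""
    parts = tuple(int(p) for p in version.split(".")[:3])
    return (4, 39, 0) <= parts <= (4, 40, 2)

# Usage sketch: guard model loading on the installed version, e.g.
#   import transformers
#   assert transformers_version_supported(transformers.__version__)
```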
### vLLM
Use `vllm==0.5.1` to quickly launch [codegeex4-all-9b](https://huggingface.co/THUDM/codegeex4-all-9b):
```python
from transformers import AutoTokenizer
@@ -86,6 +96,8 @@ CodeGeeX4-ALL-9B provides three user guides to help users quickly understand and
3. **[Repository Tasks Guideline](./guides/Repository_tasks_guideline.md)**: This guide demonstrates how to use repository tasks in CodeGeeX4-ALL-9B, including QA tasks at the repository level and how to trigger the aicommiter capability of CodeGeeX4-ALL-9B to perform deletions, additions, and changes to files at the repository level.
4. **[Local Mode Guideline](./guides/Local_mode_guideline.md)**: This guide introduces how to deploy CodeGeeX4-ALL-9B locally and connect it to the Visual Studio Code / Jetbrains extensions.
These guides aim to provide a comprehensive understanding and facilitate efficient use of the model.
## Evaluation


@@ -6,7 +6,7 @@
[English](./README.md) | [中文](./README_zh.md)
# CodeGeeX4: Open Multilingual Code Generation Model
# CodeGeeX4: All-in-One Open Multilingual Code Generation Model
We introduce CodeGeeX4-ALL-9B, the open-source version of the latest CodeGeeX4 model series. It is a multilingual code generation model continually trained on [GLM-4-9B](https://github.com/THUDM/GLM-4), significantly enhancing its code generation capabilities. A single CodeGeeX4-ALL-9B model supports comprehensive functions such as code completion and generation, code interpretation, web search, function calls, and repository-level code Q&A, covering various scenarios of software development. CodeGeeX4-ALL-9B has achieved highly competitive performance on public benchmarks such as [BigCodeBench](https://huggingface.co/datasets/bigcode/bigcodebench) and [NaturalCodeBench](https://github.com/THUDM/NaturalCodeBench). It is currently the most powerful code generation model with fewer than 10 billion parameters, even surpassing much larger general-purpose models, achieving the best balance between inference speed and model performance.
@@ -18,6 +18,15 @@
## Get Started
### Ollama
CodeGeeX4 is now officially available on [Ollama](https://ollama.com/library/codegeex4)!
Please install [Ollama 0.2](https://github.com/ollama/ollama/releases/tag/v0.2.0) or later and run the following command:
```bash
ollama run codegeex4
```
To connect the local model to the [VS Code](https://marketplace.visualstudio.com/items?itemName=aminer.codegeex) / [Jetbrains](https://plugins.jetbrains.com/plugin/20587-codegeex) extensions, please refer to the [Local Mode Tutorial](./guides/Local_mode_guideline_zh.md).
### Huggingface transformers
Use `4.39.0<=transformers<=4.40.2` to deploy [codegeex4-all-9b](https://huggingface.co/THUDM/codegeex4-all-9b):
```python
@@ -37,6 +46,8 @@ with torch.no_grad():
outputs = outputs[:, inputs['input_ids'].shape[1]:]
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
### vLLM
Use `vllm==0.5.1` to quickly launch [codegeex4-all-9b](https://huggingface.co/THUDM/codegeex4-all-9b):
```python
@@ -89,6 +100,8 @@ python -m vllm.entrypoints.openai.api_server \
3. **[Repository Tasks Guideline](./guides/Repository_tasks_guideline_zh.md)**: This guide demonstrates how to use repository-level tasks in CodeGeeX4-ALL-9B, including repository-level Q&A and how to trigger the aicommiter capability of CodeGeeX4-ALL-9B to delete, add, and modify files at the repository level.
4. **[Local Mode Guideline](./guides/Local_mode_guideline_zh.md)**: This guide shows how to deploy CodeGeeX4-ALL-9B locally and use it with the Visual Studio Code / Jetbrains extensions.
These guides aim to provide a comprehensive understanding of the model and help users make the best use of its capabilities.
## Evaluation


@@ -0,0 +1,16 @@
# Local Mode Tutorial: Local deployment with the Visual Studio Code / Jetbrains extensions
The steps are the same for both platforms.
1. Click [VS Code](https://marketplace.visualstudio.com/items?itemName=aminer.codegeex) / [Jetbrains](https://plugins.jetbrains.com/plugin/20587-codegeex) to download the extension.
2. Enable local mode in the extension settings (no login required).
3. Start the Ollama server with the following commands (other OpenAI-compatible APIs are also supported); keep the server running in the background:
```bash
export OLLAMA_ORIGINS="*"
ollama run codegeex4
ollama serve
```
4. Enter the API address and model name in the local mode settings. Then enjoy coding with CodeGeeX4!
![local mode](../resources/local_mode.png)
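Before entering the settings, you can sanity-check the server with a short script. The sketch below assumes Ollama's OpenAI-compatible endpoint at the default address `http://localhost:11434/v1/chat/completions` and the model name `codegeex4`; adjust both values for other OpenAI-compatible APIs.

```python
import json
import urllib.request

API_ADDRESS = "http://localhost:11434/v1/chat/completions"  # local mode "api address"
MODEL_NAME = "codegeex4"                                     # local mode "model name"

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build one OpenAI-style chat completion request for the local server."""
    payload = {
        "model": MODEL_NAME,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,
    }
    return urllib.request.Request(
        API_ADDRESS,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    try:
        with urllib.request.urlopen(build_chat_request("hello"), timeout=30) as resp:
            print(json.load(resp)["choices"][0]["message"]["content"])
    except OSError:
        print("No response; check that the Ollama server is still running.")
```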


@@ -0,0 +1,15 @@
# Local Mode Tutorial: Local deployment with the Visual Studio Code / Jetbrains extensions
The steps are the same for both platforms.
1. Click [VS Code](https://marketplace.visualstudio.com/items?itemName=aminer.codegeex) / [Jetbrains](https://plugins.jetbrains.com/plugin/20587-codegeex) to download the extension.
2. Enable local mode in the extension settings (no login required).
3. Start the Ollama server with the following commands (other OpenAI-compatible APIs are also supported):
```bash
export OLLAMA_ORIGINS="*"
ollama run codegeex4
ollama serve
```
4. Enter the API address and model name in the local mode settings. Then enjoy coding with CodeGeeX4!
![local mode](../resources/local_mode_zh.png)

BIN resources/local_mode.png (new file, 642 KiB; binary file not shown)

BIN resources/local_mode_zh.png (new file, 490 KiB; binary file not shown)