Mirror of https://github.com/JasonYANG170/CodeGeeX4.git (synced 2024-11-27 06:06:33 +00:00)
# CodeGeeX

## Welcome to My Chat Demo Application

This is a simple demonstration chat application built on the CodeGeeX4 model.
## Instructions
- Enter your question.
- Wait for a response.
- Enjoy the conversation!
## Features
- Supports multi-turn conversations.
- Supports online Q&A.
- Supports uploading local zip packages for project Q&A and modifications.
- Supports inputting GitHub project links for project Q&A and modifications.
## Installation

- Clone the repository locally.
- Start the model. You can deploy the model with vllm or ollama, expose an OpenAI-compatible API, and set the deployed `api_base` and `api_key`. Alternatively, visit the CodeGeeX API to get an API key.

  ```
  # use the open.bigmodel.cn API
  openai_api_key = "<|apikey|>"
  openai_api_base = "https://open.bigmodel.cn/api/paas/v4/"
  model_name = "codegeex-4"
  ```

  ```
  # use vllm
  openai_api_key = "EMPTY"
  openai_api_base = "http://xxxx:xxxx/v1"
  model_name = "codegeex4-all-9b"
  ```

- Fill in the corresponding model information, and `bing_search_api` if you want to use online search, in the `.env` file.
- Install dependencies: `pip install -r requirements.txt`.
- Run the application: `chainlit run run.py --port 8899`.
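Once the endpoint is configured, any OpenAI-compatible client can reach the deployed model. A minimal sketch of what such a request looks like, using only the Python standard library; the host/port placeholder and values are taken from the vllm example above, and `build_chat_request` is a hypothetical helper for illustration (it builds the request but does not send it):

```python
import json
from urllib import request

# Endpoint settings mirror the vllm example above; replace the placeholder host/port.
API_BASE = "http://xxxx:xxxx/v1"
API_KEY = "EMPTY"

def build_chat_request(prompt: str, model: str = "codegeex4-all-9b") -> request.Request:
    """Build an OpenAI-compatible chat completion request (not sent here)."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("Write a hello-world program in Python.")
print(req.full_url)
```

Sending the request with `urllib.request.urlopen(req)` (or any OpenAI SDK pointed at the same `base_url`) returns a standard chat completion response.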
## Note
Please ensure your network environment can access the CodeGeeX API.
## Disclaimer
This application is for educational and research purposes only and should not be used for any commercial purposes. The developer is not responsible for any loss or damage caused by the use of this application.
## Acknowledgements
Thank you for using our application. If you have any questions or suggestions, please feel free to contact us. We look forward to your feedback and are committed to providing you with better service.