How to Upgrade:
Scenario 1: Initial Deployment
If you are a newcomer and just getting started, you can follow the guidelines below to install and deploy the Computing Provider.
Build and install the Computing Provider
Step-by-step installation of the Computing Provider
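For a first deployment, a rough sketch of the typical clone-and-build flow is shown below; the repository URL and Makefile targets are assumptions, so follow the linked guides above for the authoritative steps:

```bash
# Clone the source and build the binary (repository URL and make targets assumed).
git clone https://github.com/lagrangedao/go-computing-provider.git
cd go-computing-provider
make clean && make
sudo make install   # assumed install target; adjust to your environment
```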
Scenario 2: Upgrading to v0.3.0
If you are already running version 0.2.0, please follow these steps to upgrade:
Step 1: Modify the configuration file config.toml
(Optional) Remove [MCS].AccessToken from the configuration file and generate a new [MCS].ApiKey as follows:
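A minimal sketch of the config change, assuming config.toml sits in your current directory and that MCS here refers to Multi-Chain Storage, whose dashboard issues the new API key; paste the generated key in by hand afterwards:

```bash
# Back up the config, then drop the obsolete AccessToken line from the [MCS] section.
cp config.toml config.toml.bak
sed -i '/^[[:space:]]*AccessToken[[:space:]]*=/d' config.toml

# Then set the freshly generated key, e.g.:
#   [MCS]
#   ApiKey = "<your new API key>"
```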
Step 2: Pull the latest version and compile the Computing Provider
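A sketch of the pull-and-rebuild flow, assuming the release is tagged v0.3.0 and the project's Makefile exposes the usual build and install targets:

```bash
cd go-computing-provider   # your existing checkout
git fetch --all --tags
git checkout v0.3.0        # assumed release tag name
make clean && make
sudo make install          # assumed install target
```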
Step 3: Install AI Inference Dependency
This dependency is required for the Computing Provider to deploy the AI inference endpoint.
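Once the dependency is installed, a quick reachability check can be useful; the endpoint URL below is purely a hypothetical placeholder, not a documented value:

```bash
# Hypothetical health check; replace the URL with your actual inference endpoint.
curl -fsS http://127.0.0.1:8085/health || echo "inference endpoint not reachable"
```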
Step 4: Migrate the Computing Provider's config.toml from the computing-provider directory to CP_PATH
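A sketch of the migration, assuming CP_PATH points at the Computing Provider's working directory (the path below is a placeholder):

```bash
# Placeholder path; point CP_PATH wherever your Computing Provider keeps its data.
export CP_PATH="$HOME/.computing-provider"
mkdir -p "$CP_PATH"
cp ./config.toml "$CP_PATH/config.toml"
```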
Important
Need Help?
If you encounter any problems, you can either leave a comment within the document or open an issue.