CESS AI-LINK
Byzantine-Robust Circuit with Privacy and Data Sovereignty from the CESS Network
CESS AI-LINK allows participants on each CESS node to collaboratively train a shared model without sharing their original data. During model training, CESS AI-LINK uses smart contracts to delegate local training tasks to computing nodes (GPUs, GPU clusters, or even other decentralized GPU-computing Web3 DePINs) in the CESS network, allowing participants to join in data sharing at any time.
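As an illustrative sketch only, the delegation described above can be modeled as a task handed to a compute node that returns a model update rather than raw data. The names here (`TrainingTask`, `ComputeNode`, the `Qm...` content ID) are invented for illustration and are not CESS AI-LINK APIs.

```python
from dataclasses import dataclass

@dataclass
class TrainingTask:
    model_cid: str      # content ID of the current global model (hypothetical)
    dataset_owner: str  # participant whose private data stays local
    rounds: int         # number of local training rounds requested

class ComputeNode:
    """A stand-in for a GPU node in the network that accepts delegated tasks."""

    def __init__(self, node_id: str, gpus: int):
        self.node_id = node_id
        self.gpus = gpus

    def accept(self, task: TrainingTask) -> dict:
        # The node trains locally and returns only a model update;
        # the participant's raw data never leaves its owner.
        return {"node": self.node_id, "task": task.model_cid, "update": "..."}

node = ComputeNode("gpu-node-1", gpus=4)
receipt = node.accept(TrainingTask("Qm...", "org-a", rounds=3))
```

The point of the shape is that only the task description and the resulting update cross the network boundary.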
Features
Architecture
Organizations can use the CESS network to exchange parameters and models, which are iterated under the encryption mechanism of CESS AI-LINK to establish a virtual shared global model. This global model is the optimal model built from everyone's aggregated data; however, while building it, the data itself never needs to move outside its owner, and no privacy is leaked.
Algorithm
In federated learning, multiple participants collaboratively train a global model without sharing their local private data with one another. Only the model, or updates to it, are transferred during training, so the original data is never exposed.
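The idea above can be sketched with a minimal federated round on a linear model: each participant runs gradient descent on its own private data, and only the resulting weight update is shared and averaged. This is a generic federated-averaging sketch, not the specific CESS AI-LINK training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0, 0.5])

# Two participants, each with a private dataset that never leaves them.
datasets = []
for n in (50, 80):
    X = rng.normal(size=(n, 3))
    datasets.append((X, X @ true_w))

def local_update(global_weights, X, y, lr=0.1, epochs=5):
    """One participant: gradient descent on a linear model, entirely local."""
    w = global_weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)  # MSE gradient
        w -= lr * grad
    return w - global_weights  # only the update leaves the device

def aggregate(global_weights, updates, sizes):
    """Server side: average updates weighted by local dataset size."""
    total = sum(sizes)
    step = sum(u * (n / total) for u, n in zip(updates, sizes))
    return global_weights + step

w_global = np.zeros(3)
for _ in range(20):  # federated rounds
    updates = [local_update(w_global, X, y) for X, y in datasets]
    w_global = aggregate(w_global, updates, [len(y) for _, y in datasets])
```

After a few rounds the shared model converges toward the weights that fit all participants' data, even though no raw dataset was ever transmitted.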
The training workflow proceeds as follows:

1. The Orgs obtain the global model from their associated CESS nodes.
2. Each participant initializes the local module, calls local (or distributed computing network) computing resources, executes the gradient descent optimization algorithms, and updates the model on the server.
3. After local training completes, the local model updates are sent to the CESS network and the smart contracts are called.
4. When receiving a request to execute a smart contract, the consensus node obtains and broadcasts local model updates while verifying the local model updates received from other nodes. Once verification finishes, the model updates are aggregated through a weighted algorithm to produce a new global model.
5. The participant computes the next global model update using the aggregated local model updates.
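The verify-then-aggregate step can tolerate misbehaving nodes. As a hedged illustration, the coordinate-wise median is one standard Byzantine-robust aggregation rule; the actual verification and weighting scheme used by CESS AI-LINK is not specified here.

```python
import numpy as np

def robust_aggregate(updates):
    """Coordinate-wise median: resists a minority of malicious updates."""
    return np.median(np.stack(updates), axis=0)

# Four honest updates clustered around the true direction...
honest = [np.array([1.0, 1.0]) + 0.01 * i for i in range(4)]
# ...and one poisoned update trying to wreck the global model.
malicious = [np.array([100.0, -100.0])]

agg = robust_aggregate(honest + malicious)
```

A plain weighted average would be dragged far off by the poisoned update, while the median stays near the honest cluster.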