
CESS AI-LINK

Byzantine-Robust Circuit with Privacy and Data Sovereignty from the CESS Network


CESS AI-LINK allows the participants behind each CESS node to collaboratively train a shared model without sharing their original data. During model training, CESS AI-LINK uses smart contracts to delegate local model training tasks to computing nodes in the CESS network (GPUs, GPU clusters, or even other decentralized GPU computation Web3 DePINs), allowing participants to join in data sharing at any time.
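A minimal sketch of one training round from a participant's point of view under these rules is shown below. The `AiLinkContract` interface and its method names are hypothetical placeholders, not the actual CESS AI-LINK API:

```python
# Sketch of one federated round: pull the shared model, train on private data,
# publish only the resulting update. All names here are illustrative assumptions.

class AiLinkContract:
    """Hypothetical wrapper around the CESS AI-LINK smart contract."""

    def fetch_global_model(self) -> list[float]:
        ...  # query the current global model parameters from the chain

    def submit_local_update(self, update: list[float]) -> None:
        ...  # send only the model update, never the raw training data


def run_round(contract: AiLinkContract, train_locally) -> None:
    # `train_locally` is a user-supplied function: weights in, update out.
    weights = contract.fetch_global_model()   # 1. pull the shared global model
    update = train_locally(weights)           # 2. train on private, local data
    contract.submit_local_update(update)      # 3. publish only the update
```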

Features

An aggregator-free federated learning environment built on an inherently decentralized blockchain network.
Smart contracts coordinate round scheduling, model aggregation, and model update tasks in federated learning.
The global model can be obtained at any time from the deployed smart contract (see the sketch after this list).
Built on Substrate, with powerful cross-chain interaction capabilities that make it easy to interoperate with BTC/ETH Layer 2 networks.
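As an illustration of reading the global model on demand, the sketch below queries a storage item over a Substrate RPC connection using the py-substrate-interface client. The endpoint URL, the `AiLink` pallet, and the `GlobalModel` storage name are assumptions, not the documented CESS interface:

```python
# Read the (hypothetical) latest aggregated model directly from chain storage.
from substrateinterface import SubstrateInterface

# Placeholder endpoint; substitute the real CESS RPC node address.
substrate = SubstrateInterface(url="wss://testnet-rpc.cess.network")

# Pallet and storage names are illustrative placeholders.
result = substrate.query(module="AiLink", storage_function="GlobalModel")
print(result.value)
```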

Architecture

Organizations can use the CESS network to exchange parameters and models, which are iterated under the encryption mechanism of CESS AI-LINK to establish a virtual shared global model. This global model is the optimal model that all participants jointly build, as if they had pooled their data. When building the global model, however, the data itself never leaves its owner and no privacy is leaked.
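The encryption mechanism itself is not specified here, so the sketch below only illustrates the general idea with simple pairwise additive masking (a common secure-aggregation technique, not necessarily what CESS AI-LINK uses): the values that leave each organization are protected, yet they still aggregate to the correct result.

```python
# Illustrative only: two organizations mask their updates with opposite signs
# of the same pseudo-random mask, so an observer sees neither true update,
# while the sum of the masked values equals the sum of the true updates.
import numpy as np

rng = np.random.default_rng(0)

def mask_update(update: np.ndarray, shared_seed: int, sign: int) -> np.ndarray:
    """Add (or subtract) a mask derived from a pairwise shared seed."""
    mask = np.random.default_rng(shared_seed).standard_normal(update.shape)
    return update + sign * mask

u_a, u_b = rng.standard_normal(4), rng.standard_normal(4)
masked_a = mask_update(u_a, shared_seed=42, sign=+1)
masked_b = mask_update(u_b, shared_seed=42, sign=-1)

# Only masked values are exchanged, yet aggregation is still exact.
assert np.allclose(masked_a + masked_b, u_a + u_b)
```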

Algorithm

In federated learning, multiple participants collaboratively train a global model without sharing their local private data with one another. Only the model, or updates to the model, are transferred during training, so the original data is never exposed.
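A minimal sketch of such a local update, assuming a plain linear model trained with gradient descent; only the weight delta is ever returned for transmission, never the data:

```python
# One local training step: start from the global weights, run gradient descent
# on private data, and return only the difference from the global model.
import numpy as np

def local_update(global_weights: np.ndarray,
                 X: np.ndarray, y: np.ndarray,
                 lr: float = 0.1, epochs: int = 5) -> np.ndarray:
    """Train on private (X, y) and return the model delta, not the data."""
    w = global_weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)   # mean-squared-error gradient
        w -= lr * grad
    return w - global_weights                    # the update to be shared
```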
After local training finishes on the server, the local model updates are sent to the CESS network and the smart contracts are invoked.
The organizations obtain the global model from their associated CESS nodes.
The user recomputes the global model update using the aggregated local model updates.
Initialize the local model, call local (or distributed computing network) computing resources, execute the gradient descent optimization algorithm, and update the model on the server.
When it receives a request to execute a smart contract, a consensus node obtains and broadcasts local model updates while verifying the local model updates received from other nodes. Once verification finishes, the model updates are aggregated through a weighted algorithm to produce a new global model (a minimal aggregation sketch follows below).
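A minimal sketch of that weighted aggregation step, assuming FedAvg-style weighting by local sample counts; the verification of updates is treated as a separate, preceding step and is out of scope here:

```python
# Combine verified local updates into the next global model using a weighted
# average. Weighting by local sample counts is an assumption for illustration.
import numpy as np

def aggregate(global_weights: np.ndarray,
              updates: list[np.ndarray],
              sample_counts: list[int]) -> np.ndarray:
    """Apply the weighted average of local updates to the current global model."""
    weights = np.asarray(sample_counts, dtype=float)
    weights /= weights.sum()                          # normalize the weights
    combined = sum(w * u for w, u in zip(weights, updates))
    return global_weights + combined                  # the new global model
```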