EZKL

What is EZKL and how does it work?

EZKL is a library and command-line tool for doing inference for deep learning models and other computational graphs in a ZK-SNARK (ZKML). It enables the following workflow:

  1. Define a computational graph, for instance a neural network (but really any arbitrary set of operations), as you would normally in pytorch or tensorflow.

  2. Export the final graph of operations as an .onnx file and some sample inputs as a .json file.

  3. Point EZKL to the .onnx and .json files to generate a ZK-SNARK circuit.

The generated proofs can then be used on-chain to verify computation; only the Ethereum Virtual Machine (EVM) is supported at the moment.

In the backend we use Halo2 as the proof system.
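As a concrete sketch of steps 1 and 2 above, the snippet below defines a toy pytorch model and exports both the graph and a sample input. The model, file names, and JSON layout are illustrative only; check the EZKL docs for the exact input schema your version expects.

```python
# Sketch of steps 1 and 2: a toy model and its export artifacts.
# TinyNet, the file names, and the input shape are illustrative only.
import json

import torch
import torch.nn as nn


class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(4, 2)

    def forward(self, x):
        return torch.relu(self.fc(x))


model = TinyNet()
model.eval()

# Sample input, used both to trace the ONNX export and as the .json input data.
x = torch.rand(1, 4)

# Step 2a: export the computational graph to network.onnx.
torch.onnx.export(
    model, x, "network.onnx",
    input_names=["input"], output_names=["output"],
)

# Step 2b: write the sample input to input.json.
# EZKL expects flattened input tensors; the exact schema is documented in the
# EZKL docs and may differ between versions.
with open("input.json", "w") as f:
    json.dump({"input_data": [x.reshape(-1).tolist()]}, f)
```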

EZKL takes a high-level description of your program and sets up a zero-knowledge prover and verifier. Our focus is on programs that are expressed as pytorch AI/ML models and other computational graphs. After setup, the prover can prove statements such as the following.

"I ran this publicly available neural network on some private data and it produced this output"

"I ran my private neural network on some public data and it produced this output"

"I correctly ran this publicly available neural network on some public data and it produced this output"

These proofs can be trusted by anyone with a copy of the verifier, and verified directly on Ethereum and compatible chains. EZKL can be used directly from Python; see this colab notebook and the python bindings docs. It can also be used from the command line.
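For orientation, a compressed sketch of the Python workflow follows. The function names come from the ezkl Python bindings, but argument order, SRS handling, and whether individual calls must be awaited vary between releases, so treat this as an outline rather than a drop-in script.

```python
import ezkl

# Assumed file names, matching the export sketch above.
model, data = "network.onnx", "input.json"

# 1. Generate circuit settings from the ONNX graph (optionally calibrated
#    against representative input data).
ezkl.gen_settings(model, "settings.json")
ezkl.calibrate_settings(data, model, "settings.json", "resources")

# 2. Compile the graph into a circuit and fetch a structured reference string.
ezkl.compile_circuit(model, "network.compiled", "settings.json")
ezkl.get_srs("settings.json")

# 3. Generate proving/verifying keys and a witness for this input.
ezkl.setup("network.compiled", "vk.key", "pk.key")
ezkl.gen_witness(data, "network.compiled", "witness.json")

# 4. Prove, then verify with only the proof, settings, and verifying key.
ezkl.prove("witness.json", "network.compiled", "pk.key", "proof.json")
assert ezkl.verify("proof.json", "settings.json", "vk.key")
```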

EZKL can prove an MNIST-sized inference in less than a second and in under 180 MB of memory, and verify it on the Ethereum Virtual Machine (or on the command line, or in the browser using WASM).

EZKL can be used to move large and complex computations off-chain in a way that is easy to program (you can write your own functions in Python) and manage. You are not limited to a pre-defined set of functions; there is no limit on input size (inputs can be hashed); and there is no centralized sequencer.
