In this episode, Anna Rose welcomes back Daniel Kang, professor at UIUC and founding technical advisor at VAIL, for an update on ZKML and how the space has evolved since early 2023. Daniel covers the 2023-2024 cohort of ZKML tools, including zkCNN, zkLLM, EZKL, and his original ZKML project, and introduces his new project ZKTorch, which offers a flexible hybrid of specialized and general-purpose approaches.
The discussion explores practical applications like verified FaceID, proof of prompt, and proof of training, along with the technical challenges of adding ZK proofs to machine learning models. Daniel shares insights on the performance trade-offs between specialized cryptographic systems and generic circuits, and how ZKTorch aims to offer both flexibility and speed for proving ML inference.
Related links:
* ZKTorch: Open-Sourcing the First Universal ZKML Compiler for Real-World AI
* ZKTorch: Compiling ML Inference to Zero-Knowledge Proofs via Parallel Proof Accumulation by Bing-Jyue Chen, Lilia Tang, Daniel Kang
* ZK Torch GitHub
* Episode 369: Ligero for Memory-Efficient ZK with Muthu
* Episode 356: ZK Benchmarks with Conner Swann
* Episode 364: AI and ZK Auditing with David Wong
* Episode 265: Where ZK and ML intersect with Yi Sun and Daniel Kang
* Bonus Episode: zkpod.ai & Attested Audio Experiment with Daniel Kang
* ZK13: ZKTorch: Efficiently Compiling ML Models to Zero-Knowledge Proof Protocols - Daniel Kang
* AI Agent Benchmarks are Broken
* VAIL
* zkCNN: Zero Knowledge Proofs for Convolutional Neural Network Predictions and Accuracy
* zkLLM: Zero Knowledge Proofs for Large Language Models
* MLPerf Inference: Datacenter
Check out the latest jobs in ZK at the ZK Podcast Jobs Board.
**If you like what we do:**
* Find all our links here! @ZeroKnowledge | Linktree
* Subscribe to our podcast newsletter
* Follow us on Twitter @zeroknowledgefm
* Join us on Telegram
* Catch us on YouTube
**Support the show:**
* Patreon
* ETH - Donation address
* BTC - Donation address
* SOL - Donation address
Read transcript