[Figure: overview of dataset distillation results.]
Dataset distillation on MNIST and CIFAR10: 60K MNIST images are distilled into 10 synthetic images that train a network from a fixed initialization to 94% accuracy; 50K CIFAR10 images are distilled into 100 images that reach 54% accuracy.
Dataset distillation can quickly fine-tune pre-trained networks on new datasets: a model trained for SVHN (73K images, 52% MNIST accuracy) reaches 85% MNIST accuracy after training on 100 distilled images encoding the domain difference.
Dataset distillation can maliciously attack classifier networks: training the attacked model (trained for CIFAR10) on 300 distilled attack images drops its accuracy on the class "plane" from 82% to 7%.
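A minimal sketch of the distill-then-train loop summarized in the figure, written against PyTorch >= 2.0 (torch.func.functional_call). The tiny MLP, the hyperparameters, and the make_net/distill helpers are illustrative assumptions, not the paper's exact architecture or settings.

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.func import functional_call

def make_net(n_classes: int = 10) -> nn.Module:
    # Stand-in classifier for 28x28 grayscale inputs (e.g. MNIST); illustrative only.
    return nn.Sequential(nn.Flatten(), nn.Linear(784, 128), nn.ReLU(),
                         nn.Linear(128, n_classes))

def distill(real_loader, n_distilled=10, n_classes=10, outer_steps=1000, device="cpu"):
    # Learnable synthetic images (one per class here) plus a learned inner step size.
    x_syn = torch.randn(n_distilled, 1, 28, 28, device=device, requires_grad=True)
    y_syn = torch.arange(n_distilled, device=device) % n_classes
    lr_inner = torch.tensor(0.02, device=device, requires_grad=True)
    opt = torch.optim.Adam([x_syn, lr_inner], lr=1e-3)

    # "Fixed init": the same network initialization is reused every outer step;
    # its parameters are only updated through the differentiable inner step below,
    # never in place.
    net = make_net(n_classes).to(device)
    init_params = dict(net.named_parameters())

    data_iter = iter(real_loader)
    for _ in range(outer_steps):
        try:
            x_real, y_real = next(data_iter)
        except StopIteration:
            data_iter = iter(real_loader)
            x_real, y_real = next(data_iter)
        x_real, y_real = x_real.to(device), y_real.to(device)

        # Inner step: one gradient-descent update of the network on the synthetic
        # data, kept differentiable (create_graph=True) so the outer loss can
        # reach the synthetic images.
        inner_loss = F.cross_entropy(functional_call(net, init_params, (x_syn,)), y_syn)
        grads = torch.autograd.grad(inner_loss, list(init_params.values()), create_graph=True)
        updated = {k: p - lr_inner * g for (k, p), g in zip(init_params.items(), grads)}

        # Outer step: evaluate the updated network on real data and backpropagate
        # through the inner update into the synthetic images and the step size.
        outer_loss = F.cross_entropy(functional_call(net, updated, (x_real,)), y_real)
        opt.zero_grad()
        outer_loss.backward()
        opt.step()

    return x_syn.detach(), y_syn

Training a network from the same initialization on the returned synthetic images corresponds to the "train" arrows in the figure; in spirit, swapping the real batch for a new target dataset or for adversarially labeled examples gives the fine-tuning and attack uses, though the paper's exact procedures may differ.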