#DatasetDistillation

Latest posts tagged with #DatasetDistillation on Bluesky


Automatic Inner-loop Optimization Boosts Dataset Distillation Performance

AT‑BPTT improves dataset distillation accuracy by 6.16%, speeds inner‑loop training by 3.9×, and cuts memory use by 63% on CIFAR‑10, CIFAR‑100 and ImageNet‑1K. Read more: getnews.me/automatic-inner-loop-opt... #datasetdistillation #atbptt

Rectified Decoupled Dataset Distillation Sets New Standard for AI Evaluation

Rectified Decoupled Dataset Distillation (RD³) standardizes augmentation, soft‑label handling and training schedules, matching prior state‑of‑the‑art results under a unified protocol. Read more: getnews.me/rectified-decoupled-data... #datasetdistillation #rd3

DD‑Ranking: Fair Evaluation Framework for Dataset Distillation

DD‑Ranking introduces four new metrics (Information Gain, Robustness, Generalization Gap, and Consistency) to evaluate dataset distillation beyond plain accuracy. September 2025. Read more: getnews.me/dd-ranking-fair-evaluati... #ddranking #datasetdistillation

EDGE: Fast Generative Approach for Multimodal Dataset Distillation

EDGE accelerates multimodal dataset distillation roughly 18× over previous state‑of‑the‑art methods while maintaining accuracy on the Flickr30K, COCO and CC3M benchmarks. Read more: getnews.me/edge-fast-generative-app... #multimodal #datasetdistillation #edge


Boost Self-Supervised Dataset Distillation via Parameterization, Predefined Augmentation, and Approximation
Jia-Jiun Yao, Sheng-Feng Yu et al.
#SelfSupervisedLearning #DatasetDistillation #ParameterizationTechniques
