We study a flexible and efficient zero-shot learning method. Given a zero-shot task, we first generate a dataset from scratch using pretrained language models (PLMs) in an unsupervised manner. We then train a tiny task-specific model under the supervision of the synthesized dataset. (EMNLP 2022)
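The two-stage pipeline described above can be sketched as follows. This is only an illustrative stand-in: a hypothetical template sampler plays the role of the PLM generator, and a word-count classifier plays the role of the tiny task model; neither is the paper's actual generator or architecture.

```python
import random
from collections import Counter

# Stage 1: dataset generation. A real system would prompt a PLM with a
# label-conditioned prompt and sample text; here a hypothetical
# template-based sampler stands in for the PLM (illustrative only).
POS_TEMPLATES = ["a great film", "truly wonderful", "loved every minute"]
NEG_TEMPLATES = ["a boring mess", "truly awful", "hated every minute"]

def generate_dataset(n_per_label, seed=0):
    """Synthesize a labeled sentiment dataset with no human annotation."""
    rng = random.Random(seed)
    data = [(rng.choice(POS_TEMPLATES), 1) for _ in range(n_per_label)]
    data += [(rng.choice(NEG_TEMPLATES), 0) for _ in range(n_per_label)]
    rng.shuffle(data)
    return data

# Stage 2: train a tiny task model under the supervision of the
# synthetic dataset. A simple per-label word-count model stands in
# for the small task model.
def train(data):
    counts = {0: Counter(), 1: Counter()}
    for text, label in data:
        counts[label].update(text.split())
    return counts

def predict(model, text):
    scores = {lab: sum(model[lab][w] for w in text.split()) for lab in model}
    return max(scores, key=scores.get)

synthetic = generate_dataset(50)
model = train(synthetic)
print(predict(model, "a wonderful film"))     # → 1 (positive)
print(predict(model, "a boring awful mess"))  # → 0 (negative)
```

The key property being illustrated is that no human-labeled data enters the pipeline: supervision for the tiny model comes entirely from the generated dataset.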