no code implementations • 3 Oct 2023 • Haoyu Zhou, Mingyu Ding, Weikun Peng, Masayoshi Tomizuka, Lin Shao, Chuang Gan
This work introduces a framework that harnesses the capabilities of Large Language Models (LLMs) to generate primitive task conditions for generalizable long-horizon manipulation involving novel objects and unseen tasks.
5 code implementations • CVPR 2022 • Dailan He, Ziming Yang, Weikun Peng, Rui Ma, Hongwei Qin, Yan Wang
Recently, learned image compression techniques have achieved remarkable performance, even surpassing the best manually designed lossy image coders.
Ranked #1 on Image Compression on Kodak