Search Results for author: Mingqin Li

Found 4 papers, 2 papers with code

DynaTune: Dynamic Tensor Program Optimization in Deep Neural Network Compilation

no code implementations ICLR 2021 Minjia Zhang, Menghao Li, Chi Wang, Mingqin Li

Recently, DL compilers, together with Learning to Compile, have proven to be a powerful technique for optimizing deep learning models.

Tasks: Decision Making, Uncertainty Quantification

AdaTune: Adaptive Tensor Program Compilation Made Efficient

no code implementations NeurIPS 2020 Menghao Li, Minjia Zhang, Chi Wang, Mingqin Li

Deep learning models are computationally intensive, and implementations often have to be highly optimized by experts or hardware vendors to be usable in practice.
