no code implementations • 19 Jul 2024 • Matthias Karlbauer, Danielle C. Maddix, Abdul Fatir Ansari, Boran Han, Gaurav Gupta, Yuyang Wang, Andrew Stuart, Michael W. Mahoney
Remarkable progress in the development of Deep Learning Weather Prediction (DLWP) models positions them to become competitive with traditional numerical weather prediction (NWP) models.
no code implementations • 11 Jun 2024 • Shikai Qiu, Boran Han, Danielle C. Maddix, Shuai Zhang, Yuyang Wang, Andrew Gordon Wilson
Furthermore, AFT reliably translates improvement in pre-trained models into improvement in downstream performance, even if the downstream model is over $50\times$ smaller, and can effectively transfer complementary information learned by multiple pre-trained models.
no code implementations • 5 Jun 2024 • Dyah Adila, Shuai Zhang, Boran Han, Yuyang Wang
The question-answering (QA) capabilities of foundation models are highly sensitive to prompt variations, rendering their performance susceptible to superficial, non-meaning-altering changes.
1 code implementation • 6 Apr 2024 • Bonan Liu, Guoyang Zhao, Jianhao Jiao, Guang Cai, Chengyang Li, Handi Yin, Yuyang Wang, Ming Liu, Pan Hui
A colored point cloud, as a simple and efficient 3D representation, has many advantages in various fields, including robotic navigation and scene reconstruction.
1 code implementation • 15 Mar 2024 • S. Chandra Mouli, Danielle C. Maddix, Shima Alizadeh, Gaurav Gupta, Andrew Stuart, Michael W. Mahoney, Yuyang Wang
Existing work in scientific machine learning (SciML) has shown that data-driven learning of solution operators can provide a fast approximate alternative to classical numerical partial differential equation (PDE) solvers.
1 code implementation • 12 Mar 2024 • Abdul Fatir Ansari, Lorenzo Stella, Caner Turkmen, Xiyuan Zhang, Pedro Mercado, Huibin Shen, Oleksandr Shchur, Syama Sundar Rangapuram, Sebastian Pineda Arango, Shubham Kapoor, Jasper Zschiegner, Danielle C. Maddix, Hao Wang, Michael W. Mahoney, Kari Torkkola, Andrew Gordon Wilson, Michael Bohlke-Schneider, Yuyang Wang
We introduce Chronos, a simple yet effective framework for pretrained probabilistic time series models.
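As an illustration, here is a minimal sketch of zero-shot forecasting with the released chronos-forecasting package; the checkpoint name, context values, and horizon are assumptions for this example, not details from the abstract above.

```python
import numpy as np
import torch
from chronos import ChronosPipeline  # pip install chronos-forecasting

# Load a pretrained Chronos checkpoint (checkpoint name assumed from the public release).
pipeline = ChronosPipeline.from_pretrained("amazon/chronos-t5-small")

# Forecast the next 12 steps from a univariate history.
context = torch.tensor([112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0, 148.0])
samples = pipeline.predict(context, prediction_length=12)  # [series, samples, horizon]

# Summarize the sample paths into quantile forecasts.
low, median, high = np.quantile(samples[0].numpy(), [0.1, 0.5, 0.9], axis=0)
```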
no code implementations • 7 Mar 2024 • Abdoulaye Gamatié, Yuyang Wang
To achieve this, we propose a methodology consisting of: 1) the development of relevant ML models for explaining silent store prediction, and 2) the application of XAI to explain these models.
no code implementations • 19 Feb 2024 • Sining Zhoubian, Yuyang Wang, Zhihuan Jiang
This boosts the model's performance, achieving an accuracy of 73.7% on the test dataset.
no code implementations • 14 Jan 2024 • Yuyang Wang, Yizhi Hao, Amando Xu Cong
Our application study contributes by applying and optimizing these advanced models for synthetic image detection, conducting a comparative analysis using various metrics, and demonstrating their superior capability in identifying AI-generated images over traditional machine learning techniques.
no code implementations • 22 Dec 2023 • Syama Sundar Rangapuram, Jan Gasthaus, Lorenzo Stella, Valentin Flunkert, David Salinas, Yuyang Wang, Tim Januschowski
This paper presents non-parametric baseline models for time series forecasting.
no code implementations • 27 Nov 2023 • Yuyang Wang, Ahmed A. Elhag, Navdeep Jaitly, Joshua M. Susskind, Miguel Angel Bautista
We present a novel way to predict molecular conformers through a simple formulation that sidesteps many of the heuristics of prior works and achieves state-of-the-art results by using the advantages of scale.
no code implementations • 20 Oct 2023 • Ze Gao, Xiang Li, Changkun Liu, Xian Wang, Anqi Wang, Liang Yang, Yuyang Wang, Pan Hui, Tristan Braud
We present VR PreM+, an innovative VR system designed to enhance web exploration beyond traditional computer screens.
2 code implementations • 10 Aug 2023 • Oleksandr Shchur, Caner Turkmen, Nick Erickson, Huibin Shen, Alexander Shirkov, Tony Hu, Yuyang Wang
We introduce AutoGluon-TimeSeries, an open-source AutoML library for probabilistic time series forecasting.
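A minimal usage sketch, assuming long-format data with item_id/timestamp/target columns; the file name, column names, and preset are illustrative choices, not prescribed by the library.

```python
import pandas as pd
from autogluon.timeseries import TimeSeriesDataFrame, TimeSeriesPredictor

# Long-format data: one row per (item_id, timestamp); column names are assumptions.
df = pd.read_csv("train.csv")
train_data = TimeSeriesDataFrame.from_data_frame(
    df, id_column="item_id", timestamp_column="timestamp"
)

# Fit an ensemble of forecasting models and produce probabilistic forecasts.
predictor = TimeSeriesPredictor(prediction_length=48, target="target")
predictor.fit(train_data, presets="medium_quality")
forecasts = predictor.predict(train_data)  # mean and quantile columns per future step
```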
1 code implementation • NeurIPS 2023 • Marcel Kollovieh, Abdul Fatir Ansari, Michael Bohlke-Schneider, Jasper Zschiegner, Hao Wang, Yuyang Wang
Prior works on time series diffusion models have primarily focused on developing conditional models tailored to specific forecasting or imputation tasks.
1 code implementation • NeurIPS 2023 • Zhihan Gao, Xingjian Shi, Boran Han, Hao Wang, Xiaoyong Jin, Danielle Maddix, Yi Zhu, Mu Li, Yuyang Wang
We conduct empirical studies on two datasets: N-body MNIST, a synthetic dataset with chaotic behavior, and SEVIR, a real-world precipitation nowcasting dataset.
no code implementations • 12 Jul 2023 • Ziru Zhang, Xuling Zhang, Guangzhi Zhu, Yuyang Wang, Pan Hui
In the era of Internet of Things (IoT), Digital Twin (DT) is envisioned to empower various areas as a bridge between physical objects and the digital world.
no code implementations • 25 May 2023 • Hilaf Hasson, Danielle C. Maddix, Yuyang Wang, Gaurav Gupta, Youngsuk Park
Ensembling is among the most popular tools in machine learning (ML) due to its effectiveness in minimizing variance and thus improving generalization.
no code implementations • 24 May 2023 • Ahmed A. Elhag, Yuyang Wang, Joshua M. Susskind, Miguel Angel Bautista
Our approach allows us to sample continuous functions on manifolds and is invariant with respect to rigid and isometric transformations of the manifold.
1 code implementation • 10 Apr 2023 • Zhonglin Cao, Yuyang Wang, Cooper Lorsung, Amir Barati Farimani
Overall, our deep learning model is a fast, flexible, and accurate surrogate model to predict ion concentration profiles in nanoconfinement.
no code implementations • 14 Mar 2023 • Arun Jambulapati, Hilaf Hasson, Youngsuk Park, Yuyang Wang
Determining causal relationships between high-dimensional observations is among the most important tasks in scientific discovery.
1 code implementation • 3 Mar 2023 • Yuyang Wang, Changwen Xu, Zijie Li, Amir Barati Farimani
These results highlight the potential for leveraging denoise pretraining approaches to build more generalizable neural potentials for complex molecular systems.
1 code implementation • 23 Feb 2023 • Alan Edelman, Ekin Akyurek, Yuyang Wang
This paper has three contributions: (i) it is of intellectual value to replace traditional treatments of automatic differentiation with a (left acting) operator theoretic, graph-based approach; (ii) operators can be readily placed in matrices in software in programming languages such as Julia as an implementation option; (iii) we introduce a novel notation, the "transpose dot" operator $\{\}^{T_\bullet}$, that allows for the reversal of operators.
no code implementations • 7 Feb 2023 • Yuyang Wang, Ruichen Li, Jean-Rémy Chardonnet, Pan Hui
This work presents a dataset collected to predict cybersickness in virtual reality environments.
1 code implementation • 15 Dec 2022 • Xiyuan Zhang, Xiaoyong Jin, Karthick Gopalswamy, Gaurav Gupta, Youngsuk Park, Xingjian Shi, Hao Wang, Danielle C. Maddix, Yuyang Wang
Transformer-based models have gained widespread popularity and demonstrated promising results in long-term time-series forecasting in recent years.
no code implementations • 7 Dec 2022 • Tim Januschowski, Jan Gasthaus, Yuyang Wang, David Salinas, Valentin Flunkert, Michael Bohlke-Schneider, Laurent Callot
Classifying forecasting methods as being either of a "machine learning" or "statistical" nature has become commonplace in parts of the forecasting literature and community, as exemplified by the M4 competition and the conclusion drawn by the organizers.
no code implementations • 26 Oct 2022 • Qing Wang, Hang Chen, Ya Jiang, Zhe Wang, Yuyang Wang, Jun Du, Chin-Hui Lee
In this paper, we propose a deep learning based multi-speaker direction of arrival (DOA) estimation with audio and visual signals by using permutation-free loss function.
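For intuition, a generic permutation-invariant training loss is sketched below: it scores every assignment of predicted sources to reference sources and keeps the cheapest one. This is a standard PIT-style formulation under assumed tensor shapes, not necessarily the exact loss proposed in the paper.

```python
from itertools import permutations

import torch
import torch.nn.functional as F

def permutation_invariant_loss(pred, target):
    """pred, target: [num_sources, ...] per-source outputs (e.g., DOA posteriors).
    Evaluates every source-to-source assignment and returns the minimum loss."""
    num_sources = target.shape[0]
    losses = [
        F.mse_loss(pred[list(perm)], target)
        for perm in permutations(range(num_sources))
    ]
    return torch.stack(losses).min()
```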
1 code implementation • 25 Oct 2022 • Zhonglin Cao, Rishikesh Magar, Yuyang Wang, Amir Barati Farimani
Furthermore, we revealed that MOFormer can be more data-efficient on quantum-chemical property prediction than structure-based CGCNN when training data is limited.
1 code implementation • 29 Sep 2022 • Yue Jian, Yuyang Wang, Amir Barati Farimani
Fingerprints operate on graph structure at the feature extraction stage, while GNNs directly handle molecular structure in both the feature extraction and model prediction stages.
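As a concrete example of the fingerprint route, the sketch below extracts a fixed-length Morgan fingerprint with RDKit; the SMILES string, radius, and bit length are illustrative choices, not the benchmarks' settings.

```python
import numpy as np
from rdkit import Chem
from rdkit.Chem import AllChem

# Fingerprint features: a fixed-length bit vector computed from the molecular graph.
mol = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")  # aspirin, chosen only as an example
fp = AllChem.GetMorganFingerprintAsBitVect(mol, 2, nBits=2048)  # radius-2 Morgan fingerprint
features = np.array(fp)  # 2048-dimensional input for a downstream property predictor
```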
no code implementations • 15 Sep 2022 • Richard Kurle, Ralf Herbrich, Tim Januschowski, Yuyang Wang, Jan Gasthaus
Then, we transfer our analysis of the linear model to neural networks.
no code implementations • 12 Sep 2022 • Yuyang Wang, Zijie Li, Amir Barati Farimani
Graph neural networks (GNNs), which are capable of learning representations from graphical data, are naturally suitable for modeling molecular systems.
1 code implementation • 3 Sep 2022 • Changwen Xu, Yuyang Wang, Amir Barati Farimani
Rigorous experiments on ten polymer property prediction benchmarks demonstrate the superior performance of TransPolymer.
2 code implementations • 12 Jul 2022 • Zhihan Gao, Xingjian Shi, Hao Wang, Yi Zhu, Yuyang Wang, Mu Li, Dit-yan Yeung
With the explosive growth of the spatiotemporal Earth observation data in the past decade, data-driven models that apply Deep Learning (DL) are demonstrating impressive potential for various Earth system forecasting tasks.
Ranked #1 on Earth Surface Forecasting on EarthNet2021 OOD Track
no code implementations • 4 May 2022 • Rishikesh Magar, Yuyang Wang, Amir Barati Farimani
Machine learning (ML) models have been widely successful in the prediction of material properties.
1 code implementation • 24 Feb 2022 • Taeho Yoon, Youngsuk Park, Ernest K. Ryu, Yuyang Wang
Probabilistic time series forecasting has played a critical role in decision-making processes due to its capability to quantify uncertainties.
1 code implementation • 18 Feb 2022 • Yuyang Wang, Rishikesh Magar, Chen Liang, Amir Barati Farimani
On most benchmarks, the generic GNN pre-trained by iMolCLR rivals or even surpasses supervised learning models with sophisticated architecture designs and engineered features.
1 code implementation • ICLR 2022 • Zihao Xu, Hao He, Guang-He Lee, Yuyang Wang, Hao Wang
In this work, we relax such uniform alignment by using a domain graph to encode domain adjacency, e.g., a graph of states in the US with each state as a domain and each edge indicating adjacency, thereby allowing domains to align flexibly based on the graph structure.
no code implementations • 1 Feb 2022 • Hao Wang, Yifei Ma, Hao Ding, Yuyang Wang
Recurrent neural networks have proven effective in modeling sequential user feedbacks for recommender systems.
no code implementations • 18 Dec 2021 • Ke Alexander Wang, Danielle Maddix, Yuyang Wang
We consider the problem of probabilistic forecasting over categories with graph structure, where the dynamics at a vertex depends on its local connectivity structure.
no code implementations • 14 Dec 2021 • Danielle C Maddix, Nadim Saad, Yuyang Wang
The transport of traffic flow can be modeled by the advection equation.
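For reference, the linear advection equation transports a quantity $u(x, t)$ (e.g., vehicle density) at constant speed $a$; the specific flux and parameters studied in the paper may differ.

```latex
\frac{\partial u}{\partial t} + a \, \frac{\partial u}{\partial x} = 0, \qquad u(x, 0) = u_0(x)
```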
1 code implementation • 30 Nov 2021 • Rishikesh Magar, Yuyang Wang, Cooper Lorsung, Chen Liang, Hariharan Ramasubramanian, Peiyuan Li, Amir Barati Farimani
Inspired by the success of data augmentations in computer vision and natural language processing, we developed AugLiChem: the data augmentation library for chemical structures.
no code implementations • 22 Nov 2021 • Dheeraj Baby, Hilaf Hasson, Yuyang Wang
When the loss functions are strongly convex or exp-concave, we demonstrate that Strongly Adaptive (SA) algorithms can be viewed as a principled way of controlling dynamic regret in terms of path variation $V_T$ of the comparator sequence.
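For context, the path variation of a comparator sequence $w_1, \dots, w_T$ is commonly defined as below; the specific norm and notation vary across papers and may differ from those used here.

```latex
V_T \;=\; \sum_{t=2}^{T} \lVert w_t - w_{t-1} \rVert
```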
no code implementations • 12 Nov 2021 • Youngsuk Park, Danielle Maddix, François-Xavier Aubet, Kelvin Kan, Jan Gasthaus, Yuyang Wang
Quantile regression is an effective technique to quantify uncertainty, fit challenging underlying distributions, and often provide full probabilistic predictions through joint learning over multiple quantile levels.
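The workhorse objective behind multi-quantile training is the pinball (quantile) loss; the sketch below is the standard formulation, not the specific regularized objective studied in the paper.

```python
import torch

def pinball_loss(y_true, y_pred, quantiles):
    """Average pinball loss over several quantile levels.
    y_pred: [..., num_quantiles]; quantiles: 1-D tensor of levels in (0, 1)."""
    diff = y_true.unsqueeze(-1) - y_pred
    return torch.maximum(quantiles * diff, (quantiles - 1.0) * diff).mean()
```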
1 code implementation • NeurIPS 2021 • Abdul Fatir Ansari, Konstantinos Benidis, Richard Kurle, Ali Caner Turkmen, Harold Soh, Alexander J. Smola, Yuyang Wang, Tim Januschowski
We propose the Recurrent Explicit Duration Switching Dynamical System (RED-SDS), a flexible model that is capable of identifying both state- and time-dependent switching dynamics.
no code implementations • 3 Jul 2021 • Hao Yen, Chao-Han Huck Yang, Hu Hu, Sabato Marco Siniscalchi, Qing Wang, Yuyang Wang, Xianjun Xia, Yuanjun Zhao, Yuzhong Wu, Yannan Wang, Jun Du, Chin-Hui Lee
We propose a novel neural model compression strategy combining data augmentation, knowledge transfer, pruning, and quantization for device-robust acoustic scene classification (ASC).
1 code implementation • 13 Jun 2021 • Shantanu Gupta, Hao Wang, Zachary C. Lipton, Yuyang Wang
Link prediction methods are frequently applied in recommender systems, e.g., to suggest citations for academic papers or friends in social networks.
no code implementations • 9 Jun 2021 • Aldebaro Klautau, Pedro Batista, Nuria Gonzalez-Prelcic, Yuyang Wang, Robert W. Heath Jr
The increasing complexity of configuring cellular networks suggests that machine learning (ML) can effectively improve 5G technologies.
no code implementations • 18 May 2021 • Hao Ding, Yifei Ma, Anoop Deoras, Yuyang Wang, Hao Wang
This poses a chicken-and-egg problem for early-stage products, whose amount of data, in turn, relies on the performance of their RS.
no code implementations • 2 Mar 2021 • Yucheng Lu, Youngsuk Park, Lifan Chen, Yuyang Wang, Christopher De Sa, Dean Foster
In large-scale time series forecasting, one often encounters the situation where the temporal patterns of time series, while drifting over time, differ from one another in the same dataset.
no code implementations • 27 Feb 2021 • Yuyang Wang, Nitin Jonathan Myers, Nuria González-Prelcic, Robert W. Heath Jr
We design fully-connected layers to optimize channel acquisition and beam alignment.
1 code implementation • 19 Feb 2021 • Yuyang Wang, Jianren Wang, Zhonglin Cao, Amir Barati Farimani
In this work, we present MolCLR: Molecular Contrastive Learning of Representations via Graph Neural Networks (GNNs), a self-supervised learning framework that leverages large unlabeled data (~10M unique molecules).
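The contrastive objective in frameworks of this kind is typically an NT-Xent loss over two augmented views of each molecule; the sketch below is a generic version of that loss, not the repository's exact implementation.

```python
import torch
import torch.nn.functional as F

def nt_xent_loss(z1, z2, temperature=0.1):
    """z1[i] and z2[i] are GNN embeddings of two augmented views of molecule i."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                        # [2N, d]
    sim = z @ z.t() / temperature                         # pairwise cosine similarities
    n = z1.shape[0]
    mask = torch.eye(2 * n, dtype=torch.bool, device=z.device)
    sim = sim.masked_fill(mask, float("-inf"))            # exclude self-similarity
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)]).to(z.device)
    return F.cross_entropy(sim, targets)                  # positives are the paired views
```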
1 code implementation • 13 Feb 2021 • Xiaoyong Jin, Youngsuk Park, Danielle C. Maddix, Hao Wang, Yuyang Wang
Recently, deep neural networks have gained increasing popularity in the field of time series forecasting.
no code implementations • 19 Jan 2021 • Yuyang Wang, Zhonglin Cao, Amir Barati Farimani
Structure and geometry optimization of nanopores on such materials is beneficial for their performance in real-world engineering applications, like water desalination.
no code implementations • 12 Jan 2021 • Yuyang Wang, Kenji Shimada, Amir Barati Farimani
Our model can (1) encode the existing airfoil into a latent vector and reconstruct the airfoil from that, (2) generate novel airfoils by randomly sampling the latent vectors and mapping the vectors to the airfoil coordinate domain, and (3) synthesize airfoils with desired aerodynamic properties by optimizing learned features via a genetic algorithm.
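A hedged sketch of step (2), sampling latent vectors and decoding them into airfoil coordinates; the decoder architecture, latent dimension, and point count are hypothetical placeholders, not the model described in the paper.

```python
import torch
import torch.nn as nn

latent_dim, num_points = 16, 100  # hypothetical sizes for illustration

# Placeholder decoder mapping a latent vector to (x, y) surface coordinates.
decoder = nn.Sequential(
    nn.Linear(latent_dim, 256),
    nn.ReLU(),
    nn.Linear(256, 2 * num_points),
)

# (2) generate novel airfoils by sampling latent vectors and decoding them.
z = torch.randn(8, latent_dim)
airfoils = decoder(z).reshape(8, num_points, 2)
```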
3 code implementations • 20 Nov 2020 • Rui Wang, Danielle Maddix, Christos Faloutsos, Yuyang Wang, Rose Yu
While much research on distribution shift has focused on changes in the data domain, our work calls attention to rethink generalization for learning dynamical systems.
no code implementations • 4 Oct 2020 • Ali Caner Turkmen, Tim Januschowski, Yuyang Wang, Ali Taylan Cemgil
Intermittency is a common and challenging problem in demand forecasting.
no code implementations • 11 May 2020 • Yuyang Wang, Nitin Jonathan Myers, Nuria González-Prelcic, Robert W. Heath Jr
Furthermore, based on the CS channel measurements, we develop techniques to update and learn such channel AoD statistics at the BS.
1 code implementation • 21 Apr 2020 • Konstantinos Benidis, Syama Sundar Rangapuram, Valentin Flunkert, Yuyang Wang, Danielle Maddix, Caner Turkmen, Jan Gasthaus, Michael Bohlke-Schneider, David Salinas, Lorenzo Stella, Francois-Xavier Aubet, Laurent Callot, Tim Januschowski
Deep learning based forecasting methods have become the methods of choice in many applications of time series prediction or forecasting, often outperforming other approaches.
1 code implementation • 23 Nov 2019 • Ali Caner Turkmen, Yuyang Wang, Tim Januschowski
Intermittent demand, where demand occurrences appear sporadically in time, is a common and challenging problem in forecasting.
7 code implementations • 12 Jun 2019 • Alexander Alexandrov, Konstantinos Benidis, Michael Bohlke-Schneider, Valentin Flunkert, Jan Gasthaus, Tim Januschowski, Danielle C. Maddix, Syama Rangapuram, David Salinas, Jasper Schulz, Lorenzo Stella, Ali Caner Türkmen, Yuyang Wang
We introduce Gluon Time Series (GluonTS, available at https://gluon-ts.mxnet.io), a library for deep-learning-based time series modeling.
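A minimal usage sketch with the PyTorch DeepAR estimator; module paths, the dataset name, and trainer arguments follow recent GluonTS releases and may differ across versions.

```python
from gluonts.dataset.repository.datasets import get_dataset
from gluonts.torch import DeepAREstimator

# Load a benchmark dataset bundled with GluonTS and train a DeepAR model on it.
dataset = get_dataset("m4_hourly")
estimator = DeepAREstimator(
    freq=dataset.metadata.freq,
    prediction_length=dataset.metadata.prediction_length,
    trainer_kwargs={"max_epochs": 5},
)
predictor = estimator.train(dataset.train)
forecasts = list(predictor.predict(dataset.test))  # probabilistic sample-path forecasts
```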
no code implementations • 28 May 2019 • Yuyang Wang, Alex Smola, Danielle C. Maddix, Jan Gasthaus, Dean Foster, Tim Januschowski
We provide both theoretical and empirical evidence for the soundness of our approach through a necessary and sufficient decomposition of exchangeable time series into a global and a local part.
2 code implementations • NeurIPS 2018 • Syama Sundar Rangapuram, Matthias W. Seeger, Jan Gasthaus, Lorenzo Stella, Yuyang Wang, Tim Januschowski
We present a novel approach to probabilistic time series forecasting that combines state space models with deep learning.
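For background, the linear-Gaussian filtering recursion that state space models build on is sketched below; how the paper's network parametrizes the per-series model is not shown, and the matrices here are generic placeholders.

```python
import numpy as np

def kalman_step(m, P, y, A, Q, H, R):
    """One predict/update step for z_t = A z_{t-1} + w_t,  y_t = H z_t + v_t,
    with w_t ~ N(0, Q) and v_t ~ N(0, R). Returns the filtered mean and covariance."""
    # Predict the next latent state.
    m_pred = A @ m
    P_pred = A @ P @ A.T + Q
    # Update with the new observation y.
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    m_new = m_pred + K @ (y - H @ m_pred)
    P_new = (np.eye(len(m_new)) - K @ H) @ P_pred
    return m_new, P_new
```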
no code implementations • 30 Nov 2018 • Danielle C. Maddix, Yuyang Wang, Alex Smola
A large collection of time series poses significant challenges for classical and neural forecasting approaches.
no code implementations • 7 Dec 2017 • Lucas Roberts, Leo Razoumov, Lin Su, Yuyang Wang
Moreover, we show that the Gini regularized OT problem converges to the classical OT problem, when the Gini regularized problem is considered as a function of $\lambda$, the regularization parameter.
no code implementations • 22 Sep 2017 • Matthias Seeger, Syama Rangapuram, Yuyang Wang, David Salinas, Jan Gasthaus, Tim Januschowski, Valentin Flunkert
We present a scalable and robust Bayesian inference method for linear state space models.
no code implementations • 5 Mar 2012 • Yuyang Wang, Roni Khardon, Pavlos Protopapas
The paper applies this framework for data where each task is a phase-shifted periodic time series.