Search Results for author: Sercan Arik

Found 9 papers, 3 papers with code

ASPEST: Bridging the Gap Between Active Learning and Selective Prediction

1 code implementation • 7 Apr 2023 • Jiefeng Chen, Jinsung Yoon, Sayna Ebrahimi, Sercan Arik, Somesh Jha, Tomas Pfister

In this work, we introduce a new learning paradigm, active selective prediction, which aims to query more informative samples from the shifted target domain while increasing accuracy and coverage.

Active Learning
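To make the idea of querying informative samples while keeping accuracy and coverage concrete, here is a minimal, generic sketch of an active selective prediction loop: query the least-confident target samples by softmax margin, and abstain below a confidence threshold on the rest. This is an illustrative assumption-laden sketch, not the ASPEST algorithm; the function names, the margin heuristic, and the threshold are all made up for illustration.

```python
# Generic sketch of active selective prediction mechanics (NOT ASPEST itself).
import numpy as np

def select_queries(probs: np.ndarray, budget: int) -> np.ndarray:
    """Pick the `budget` least-confident target samples (smallest top-2 margin) to label."""
    sorted_p = np.sort(probs, axis=1)
    margin = sorted_p[:, -1] - sorted_p[:, -2]
    return np.argsort(margin)[:budget]  # smallest margin = most informative

def selective_predict(probs: np.ndarray, threshold: float):
    """Predict only when top-class probability clears `threshold`; coverage = accepted.mean()."""
    conf = probs.max(axis=1)
    preds = probs.argmax(axis=1)
    accepted = conf >= threshold
    return preds, accepted

# usage (hypothetical model): probs = model.predict_proba(target_X)
```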

Data-Efficient and Interpretable Tabular Anomaly Detection

no code implementations • 3 Mar 2022 • Chun-Hao Chang, Jinsung Yoon, Sercan Arik, Madeleine Udell, Tomas Pfister

In addition, the proposed framework, DIAD, can incorporate a small amount of labeled data to further boost anomaly detection performance in semi-supervised settings.

Additive models • Anomaly Detection

Decoupling Local and Global Representations of Time Series

1 code implementation • 4 Feb 2022 • Sana Tonekaboni, Chun-Liang Li, Sercan Arik, Anna Goldenberg, Tomas Pfister

Learning representations that capture the factors contributing to this variability enables a better understanding of the data via its underlying generative process and improves performance on downstream machine learning tasks.

counterfactual • Time Series • +1

Distance-Based Learning from Errors for Confidence Calibration

no code implementations • ICLR 2020 • Chen Xing, Sercan Arik, Zizhao Zhang, Tomas Pfister

To circumvent this, we propose to train a confidence model jointly with the classification model, so that this distance can be inferred for every test sample.

Classification • General Classification
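A rough sketch of what "training a confidence model jointly with the classification model" can look like in code: a shared backbone with a classification head and a scalar confidence head, optimized with a combined loss. This is an assumed two-head design with a simple correctness target, not the paper's DBLE formulation; all names and the loss weighting are illustrative.

```python
# Illustrative joint classifier + confidence head (assumed setup, not exact DBLE).
import torch
import torch.nn as nn

class ClassifierWithConfidence(nn.Module):
    def __init__(self, in_dim: int, n_classes: int, hidden: int = 128):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.cls_head = nn.Linear(hidden, n_classes)  # class logits
        self.conf_head = nn.Linear(hidden, 1)         # predicted confidence score

    def forward(self, x):
        h = self.backbone(x)
        return self.cls_head(h), self.conf_head(h).squeeze(-1)

def joint_loss(logits, conf, y, alpha: float = 0.5):
    # Cross-entropy for classification plus a term pushing `conf` to track
    # whether the prediction is correct (a simple stand-in target).
    ce = nn.functional.cross_entropy(logits, y)
    correct = (logits.argmax(dim=1) == y).float()
    conf_loss = nn.functional.binary_cross_entropy_with_logits(conf, correct)
    return ce + alpha * conf_loss
```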

On Concept-Based Explanations in Deep Neural Networks

no code implementations • 25 Sep 2019 • Chih-Kuan Yeh, Been Kim, Sercan Arik, Chun-Liang Li, Pradeep Ravikumar, Tomas Pfister

Next, we propose a concept discovery method that considers two additional constraints to encourage the interpretability of the discovered concepts.

EPNAS: Efficient Progressive Neural Architecture Search

no code implementations • 7 Jul 2019 • Yanqi Zhou, Peng Wang, Sercan Arik, Haonan Yu, Syed Zawad, Feng Yan, Greg Diamos

In this paper, we propose Efficient Progressive Neural Architecture Search (EPNAS), a neural architecture search (NAS) method that efficiently handles a large search space through a novel progressive search policy with performance prediction based on REINFORCE (Williams, 1992).

Neural Architecture Search
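To make the REINFORCE-based search policy concrete, here is a generic policy-gradient controller that samples one operation per layer and updates its logits from a reward signal. This is an illustrative sketch, not the EPNAS implementation: `evaluate` is a hypothetical stand-in for scoring a candidate (EPNAS additionally uses performance prediction and a progressive policy), and the layer/operation counts are arbitrary.

```python
# Generic REINFORCE controller for architecture sampling (not EPNAS itself).
import torch

n_layers, n_ops = 4, 5
logits = torch.zeros(n_layers, n_ops, requires_grad=True)  # policy parameters
opt = torch.optim.Adam([logits], lr=0.05)
baseline = 0.0

def evaluate(arch):
    # Placeholder reward: in practice this would train/score the candidate.
    return torch.rand(()).item()

for step in range(100):
    dist = torch.distributions.Categorical(logits=logits)
    arch = dist.sample()                          # one op choice per layer
    reward = evaluate(arch.tolist())
    baseline = 0.9 * baseline + 0.1 * reward      # moving-average baseline
    loss = -(reward - baseline) * dist.log_prob(arch).sum()
    opt.zero_grad(); loss.backward(); opt.step()
```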

HybridNet: A Hybrid Neural Architecture to Speed-up Autoregressive Models

no code implementations • ICLR 2018 • Yanqi Zhou, Wei Ping, Sercan Arik, Kainan Peng, Greg Diamos

This paper introduces HybridNet, a hybrid neural network to speed up autoregressive models for raw audio waveform generation.

Speech Synthesis

Deep Voice 2: Multi-Speaker Neural Text-to-Speech

1 code implementation • NeurIPS 2017 • Sercan Arik, Gregory Diamos, Andrew Gibiansky, John Miller, Kainan Peng, Wei Ping, Jonathan Raiman, Yanqi Zhou

We introduce Deep Voice 2, which is based on a pipeline similar to Deep Voice 1 but constructed with higher-performance building blocks, and demonstrates a significant audio quality improvement over Deep Voice 1.

Speech Synthesis

Supervised classification-based stock prediction and portfolio optimization

no code implementations • 3 Jun 2014 • Sercan Arik, Sukru Burc Eryilmaz, Adam Goldberg

In this work, we apply machine learning techniques to automated stock picking, using a larger number of financial parameters for individual companies than previous studies.

Classification • General Classification • +4
