no code implementations • 16 Dec 2024 • Savinay Nagendra, Kashif Rashid, Chaopeng Shen, Daniel Kifer
Few-shot segmentation is the problem of learning to identify specific types of objects (e.g., airplanes) in images from a small set of labeled reference images.
no code implementations • 4 Oct 2024 • Neisarg Dave, Daniel Kifer, Lee Giles, Ankur Mali
However, our research challenges this notion by demonstrating that RNNs primarily operate as state machines, where their linguistic capabilities are heavily influenced by the precision of their embeddings and the strategies used for sampling negative examples.
1 code implementation • 1 Oct 2024 • Brett Mullins, Miguel Fuentes, Yingtai Xiao, Daniel Kifer, Cameron Musco, Daniel Sheldon
Reconstruction is an important subproblem for such mechanisms to economize the privacy budget, minimize error on reconstructed answers, and allow for scalability to high-dimensional datasets.
no code implementations • 21 May 2024 • Neisarg Dave, Daniel Kifer, C. Lee Giles, Ankur Mali
However, most research has predominantly focused on language-based reasoning and word problems, often overlooking the potential of LLMs in handling symbol-based calculations and reasoning.
no code implementations • 4 Feb 2024 • Neisarg Dave, Daniel Kifer, C. Lee Giles, Ankur Mali
We sampled datasets from $7$ Tomita and $4$ Dyck grammars and trained $4$ RNN cells on them: LSTM, GRU, O2RNN, and MIRNN.
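For concreteness about the grammar families above (an illustrative sketch, not the paper's data pipeline): a Dyck-2 string is a balanced sequence over two bracket pairs, and a labeling helper like the assumed one below can decide whether a candidate string is grammatical.

```python
def is_dyck2(s: str) -> bool:
    """Return True if s is balanced over the two bracket pairs () and []."""
    pairs = {')': '(', ']': '['}
    stack = []
    for ch in s:
        if ch in '([':
            stack.append(ch)
        elif ch in pairs:
            if not stack or stack.pop() != pairs[ch]:
                return False
        else:
            return False  # symbol outside the Dyck-2 alphabet
    return not stack

print(is_dyck2('([])[]'), is_dyck2('([)]'))  # True False
```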
no code implementations • 18 Nov 2023 • Savinay Nagendra, Chaopeng Shen, Daniel Kifer
Landslides are a recurring, widespread hazard.
no code implementations • 26 Sep 2023 • Ankur Mali, Alexander Ororbia, Daniel Kifer, Lee Giles
In this work, we extend the theoretical foundation for the $2^{nd}$-order recurrent network ($2^{nd}$ RNN) and prove that there exists a class of $2^{nd}$ RNNs that is Turing-complete with bounded time.
1 code implementation • NeurIPS 2023 • Yingtai Xiao, Guanlin He, Danfeng Zhang, Daniel Kifer
Noisy marginals are a common form of confidentiality-protecting data release and are useful for many downstream tasks such as contingency table analysis, construction of Bayesian networks, and even synthetic data generation.
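As a minimal sketch of what a noisy marginal is (Laplace mechanism applied to a one-way marginal; the column name, adjacency notion, and epsilon below are illustrative assumptions, not the paper's mechanism):

```python
import numpy as np
import pandas as pd

def noisy_marginal(df, column, epsilon, rng=None):
    """One-way marginal (histogram of counts) with Laplace noise calibrated
    to L1 sensitivity 1 under add/remove-one-record adjacency."""
    rng = np.random.default_rng() if rng is None else rng
    counts = df[column].value_counts().sort_index()
    noise = rng.laplace(scale=1.0 / epsilon, size=len(counts))
    return pd.Series(counts.to_numpy() + noise, index=counts.index)

# Toy usage with a hypothetical column.
toy = pd.DataFrame({"age_bracket": ["18-25", "26-40", "26-40", "41-65"]})
print(noisy_marginal(toy, "age_bracket", epsilon=1.0))
```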
no code implementations • 10 Jan 2023 • Chaopeng Shen, Alison P. Appling, Pierre Gentine, Toshiyuki Bandai, Hoshin Gupta, Alexandre Tartakovsky, Marco Baity-Jesi, Fabrizio Fenicia, Daniel Kifer, Li Li, Xiaofeng Liu, Wei Ren, Yi Zheng, Ciaran J. Harman, Martyn Clark, Matthew Farthing, Dapeng Feng, Praveen Kumar, Doaa Aboelyazeed, Farshid Rahmani, Hylke E. Beck, Tadd Bindas, Dipankar Dwivedi, Kuai Fang, Marvin Höge, Chris Rackauckas, Tirthankar Roy, Chonggang Xu, Binayak Mohanty, Kathryn Lawson
Here we present differentiable geoscientific modeling as a powerful pathway toward dissolving the perceived barrier between them and ushering in a paradigm shift.
1 code implementation • 30 Nov 2022 • Yingtai Xiao, Guanhong Wang, Danfeng Zhang, Daniel Kifer
Since M* will be used no matter what, the analyst can use its output to decide whether to subsequently run M1' (thus recreating the analysis supported by M1) or M2' (recreating the analysis supported by M2), without wasting privacy loss budget.
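A schematic sketch of the adaptive workflow described above; m_star, m1_prime, m2_prime, and choose_followup are hypothetical stand-ins, not the paper's mechanisms.

```python
# Schematic only: the callables below are hypothetical placeholders.
def adaptive_release(data, m_star, m1_prime, m2_prime, choose_followup):
    shared = m_star(data)               # released in every case, so its budget is spent regardless
    if choose_followup(shared):         # decision uses only already-noisy output (post-processing)
        return shared, m1_prime(data)   # recreate the analysis M1 would have supported
    return shared, m2_prime(data)       # recreate the analysis M2 would have supported
```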
1 code implementation • 12 Nov 2022 • Savinay Nagendra, Chaopeng Shen, Daniel Kifer
Given the logit scores produced by the base segmentation model, each pixel is given a pseudo-label that is obtained by optimally thresholding the logit scores in each image patch.
Ranked #1 on Polyp Segmentation on Kvasir-SEG (mIoU metric)
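A rough sketch of the pseudo-labeling step described in this entry, assuming a single-channel logit map and using Otsu's threshold purely as a stand-in for the paper's optimal per-patch thresholding:

```python
import numpy as np
from skimage.filters import threshold_otsu  # stand-in thresholding rule

def patchwise_pseudo_labels(logits, patch=64):
    """Threshold an (H, W) logit map patch by patch to obtain binary pseudo-labels."""
    h, w = logits.shape
    labels = np.zeros_like(logits, dtype=np.uint8)
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            tile = logits[i:i + patch, j:j + patch]
            t = threshold_otsu(tile) if tile.max() > tile.min() else tile.max()
            labels[i:i + patch, j:j + patch] = (tile > t).astype(np.uint8)
    return labels
```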
no code implementations • 27 Jan 2022 • Ankur Mali, Alexander Ororbia, Daniel Kifer, Lee Giles
In light of this, we propose a system that learns to improve the encoding performance by enhancing its internal neural representations on both the encoder and decoder ends, an approach we call Neural JPEG.
no code implementations • 27 Jan 2022 • Ankur Mali, Alexander Ororbia, Daniel Kifer, Lee Giles
Recent advances in deep learning have resulted in image compression algorithms that outperform JPEG and JPEG 2000 on the standard Kodak benchmark.
no code implementations • 19 Apr 2021 • Shivansh Rao, Vikas Kumar, Daniel Kifer, Lee Giles, Ankur Mali
A common approach has been to use standard convolutional networks to predict the corners and boundaries, followed by post-processing to generate the 3D layout.
no code implementations • 7 Apr 2021 • Ankur Mali, Alexander Ororbia, Daniel Kifer, C. Lee Giles
Two particular tasks that test this type of reasoning are (1) mathematical equation verification, which requires determining whether trigonometric and linear algebraic statements are valid identities or not, and (2) equation completion, which entails filling in a blank within an expression to make it true.
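To make the two task formats concrete (illustration only; the paper studies neural models for these tasks, not a symbolic solver):

```python
import sympy as sp

x, b = sp.symbols('x b')

# Equation verification: is sin^2(x) + cos^2(x) = 1 a valid identity?
print(sp.simplify(sp.sin(x)**2 + sp.cos(x)**2 - 1) == 0)   # True

# Equation completion: fill the blank b so that sin^2(x) + b = 1 holds.
print(sp.solve(sp.Eq(sp.sin(x)**2 + b, 1), b))             # [1 - sin(x)**2], i.e. cos(x)**2
```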
no code implementations • 6 Jan 2021 • Kuai Fang, Daniel Kifer, Kathryn Lawson, Dapeng Feng, Chaopeng Shen
We hypothesize that DL models automatically adjust their internal representations to identify commonalities while also providing sufficient discriminatory information to the model.
no code implementations • 7 Dec 2020 • Alexander Ororbia, Daniel Kifer
Neural generative models can be used to learn complex probability distributions from data, to sample from them, and to produce probability density estimates.
1 code implementation • 30 Nov 2020 • Yingtai Xiao, Zeyu Ding, Yuxin Wang, Danfeng Zhang, Daniel Kifer
In practice, differentially private data releases are designed to support a variety of applications.
Databases
no code implementations • 8 Oct 2020 • Jaewoo Lee, Daniel Kifer
Standard methods for differentially private training of deep neural networks replace back-propagated mini-batch gradients with biased and noisy approximations to the gradient.
2 code implementations • 7 Sep 2020 • Jaewoo Lee, Daniel Kifer
The reason for this slowdown is a crucial privacy-related step called "per-example gradient clipping" whose naive implementation undoes the benefits of batch training with GPUs.
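For reference, a minimal sketch of the naive per-example gradient clipping loop that this sentence refers to (generic DP-SGD, not the paper's faster reformulation); model, loss_fn, and batch shapes are assumptions.

```python
import torch

def clipped_noisy_grad(model, loss_fn, xb, yb, clip_norm=1.0, noise_mult=1.0):
    """Naive DP-SGD gradient: per-example backward passes, per-example L2
    clipping, summation, then Gaussian noise. The per-example loop is what
    undoes the benefit of batched GPU training."""
    params = [p for p in model.parameters() if p.requires_grad]
    summed = [torch.zeros_like(p) for p in params]
    for x, y in zip(xb, yb):                              # one backward pass per example
        loss = loss_fn(model(x.unsqueeze(0)), y.unsqueeze(0))
        grads = torch.autograd.grad(loss, params)
        norm = torch.sqrt(sum(g.pow(2).sum() for g in grads))
        scale = (clip_norm / (norm + 1e-12)).clamp(max=1.0)
        for s, g in zip(summed, grads):
            s.add_(g * scale)
    return [(s + noise_mult * clip_norm * torch.randn_like(s)) / len(xb)
            for s in summed]
```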
no code implementations • 17 Aug 2020 • Yuxin Wang, Zeyu Ding, Daniel Kifer, Danfeng Zhang
We propose CheckDP, the first automated and integrated approach for proving or disproving claims that a mechanism is differentially private.
Programming Languages D.3.1
no code implementations • 4 Apr 2020 • Ankur Mali, Alexander Ororbia, Daniel Kifer, Clyde Lee Giles
In this paper, we improve the memory-augmented RNN with important architectural and state updating mechanisms that ensure that the model learns to properly balance the use of its latent states with external memory.
no code implementations • 10 Feb 2020 • Alexander Ororbia, Ankur Mali, Daniel Kifer, C. Lee Giles
Training deep neural networks on large-scale datasets requires significant hardware resources whose costs (even on cloud platforms) put them out of reach of smaller organizations, groups, and individuals.
no code implementations • 10 Jun 2019 • Kuai Fang, Chaopeng Shen, Daniel Kifer
Soil moisture is an important variable that determines floods, vegetation health, agricultural productivity, and land-surface feedbacks to the atmosphere.
no code implementations • 25 May 2019 • Alexander Ororbia, Ankur Mali, Daniel Kifer, C. Lee Giles
In lifelong learning systems based on artificial neural networks, one of the biggest obstacles is the inability to retain old knowledge as new information is encountered.
no code implementations • 29 Apr 2019 • Zeyu Ding, Yuxin Wang, Danfeng Zhang, Daniel Kifer
We show that it can also release for free the noisy gap between the approximate maximizer and runner-up.
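A small sketch of the idea in this entry (noise scale and sensitivity assumptions are illustrative): report the argmax of noisy query answers together with the noisy gap to the runner-up, which the abstract says can be released at no additional privacy cost.

```python
import numpy as np

def noisy_max_with_gap(query_answers, epsilon, rng=None):
    """Report Noisy Max plus the gap between the top two noisy answers.
    Laplace scale 2/epsilon assumes sensitivity-1 counting queries."""
    rng = np.random.default_rng() if rng is None else rng
    noisy = np.asarray(query_answers, float) + rng.laplace(scale=2.0 / epsilon, size=len(query_answers))
    top, second = np.argsort(noisy)[::-1][:2]
    return int(top), float(noisy[top] - noisy[second])

print(noisy_max_with_gap([10, 42, 37, 5], epsilon=1.0))
```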
1 code implementation • 28 Mar 2019 • Yuxin Wang, Zeyu Ding, Guanhong Wang, Daniel Kifer, Danfeng Zhang
Sometimes, combining those two requires substantial changes to program logics: one recent paper is able to verify Report Noisy Max automatically, but it involves a complex verification system using customized program logics and verifiers.
Programming Languages D.2.4
2 code implementations • 17 Oct 2018 • Alexander Ororbia, Ankur Mali, C. Lee Giles, Daniel Kifer
We compare our model and learning procedure to other back-propagation through time alternatives (which also tend to be computationally expensive), including real-time recurrent learning, echo state networks, and unbiased online recurrent optimization.
no code implementations • 10 Oct 2018 • Songshan Yang, Jiawei Wen, Xiang Zhan, Daniel Kifer
The pseudo-features are constructed to be inactive by nature, which can be used to obtain a cutoff to select the tuning parameter that separates active and inactive features.
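A loose sketch of the pseudo-feature idea as stated above, under assumptions (Lasso regularization path, permuted columns as known-inactive pseudo-features); the paper's actual construction and cutoff rule may differ.

```python
import numpy as np
from sklearn.linear_model import lasso_path

def pseudo_feature_cutoff(X, y, rng=None):
    """Append permuted (hence inactive) copies of X's columns and return the
    smallest regularization value at which no pseudo-feature has a nonzero
    coefficient; that value separates active from inactive features."""
    rng = np.random.default_rng() if rng is None else rng
    X_pseudo = np.column_stack([rng.permutation(col) for col in X.T])
    alphas, coefs, _ = lasso_path(np.hstack([X, X_pseudo]), y)
    p = X.shape[1]
    pseudo_active = (np.abs(coefs[p:]) > 1e-12).any(axis=0)   # per-alpha indicator
    clean = alphas[~pseudo_active]
    return float(clean.min()) if clean.size else float(alphas.max())
```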
no code implementations • 9 Sep 2018 • Dafang He, Xiao Yang, Daniel Kifer, C. Lee Giles
We propose a novel and effective framework for this and experimentally demonstrate that: (1) a CNN can be effectively used to extract instance-level text contours from natural images.
1 code implementation • 28 Aug 2018 • Jaewoo Lee, Daniel Kifer
It outperforms prior algorithms for model fitting and is competitive with the state-of-the-art for $(\epsilon,\delta)$-differential privacy, a strictly weaker definition than zCDP.
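For context on the zCDP versus (ε,δ)-DP comparison: ρ-zCDP implies (ρ + 2√(ρ ln(1/δ)), δ)-DP for every δ > 0 (the standard conversion), which the snippet below evaluates for sample parameters.

```python
import math

def zcdp_to_approx_dp(rho, delta):
    """Standard conversion: rho-zCDP implies (eps, delta)-DP with
    eps = rho + 2*sqrt(rho*ln(1/delta)), for any delta in (0, 1)."""
    return rho + 2.0 * math.sqrt(rho * math.log(1.0 / delta))

print(zcdp_to_approx_dp(rho=0.5, delta=1e-6))  # ~5.76
```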
no code implementations • 26 Aug 2018 • Yu-Hsuan Kuo, Zhenhui Li, Daniel Kifer
Advances in sensor technology have enabled the collection of large-scale datasets.
2 code implementations • 25 May 2018 • Ding Ding, Yuxin Wang, Guanhong Wang, Danfeng Zhang, Daniel Kifer
The widespread acceptance of differential privacy has led to the publication of many sophisticated algorithms for protecting privacy.
Cryptography and Security
no code implementations • 23 Apr 2018 • Dafang He, Yeqing Li, Alexander Gorban, Derrall Heath, Julian Ibarz, Qian Yu, Daniel Kifer, C. Lee Giles
In this work, we propose a new framework that learns this task in an end-to-end way.
no code implementations • 22 Apr 2018 • Xiao Yang, Miaosen Wang, Wei Wang, Madian Khabsa, Ahmed Awadallah, Daniel Kifer, C. Lee Giles
We frame this task as a binary (relevant/irrelevant) classification problem, and present an adversarial training framework to alleviate the label imbalance issue.
no code implementations • 11 Apr 2018 • Yue Wang, Daniel Kifer, Jaewoo Lee
The process of data mining with differential privacy produces results that are affected by two types of noise: sampling noise due to data collection and privacy noise that is designed to prevent the reconstruction of sensitive information.
no code implementations • 5 Mar 2018 • Alexander G. Ororbia, Ankur Mali, Daniel Kifer, C. Lee Giles
Using back-propagation and its variants to train deep networks is often problematic for new users.
no code implementations • 20 Jul 2017 • Kuai Fang, Chaopeng Shen, Daniel Kifer, Xiao Yang
The Soil Moisture Active Passive (SMAP) mission has delivered valuable sensing of surface soil moisture since 2015.
no code implementations • CVPR 2017 • Xiao Yang, Ersin Yumer, Paul Asente, Mike Kraley, Daniel Kifer, C. Lee Giles
We present an end-to-end, multimodal, fully convolutional network for extracting semantic structures from document images.
no code implementations • CVPR 2017 • Dafang He, Xiao Yang, Chen Liang, Zihan Zhou, Alexander G. Ororbia II, Daniel Kifer, C. Lee Giles
Scene text detection has attracted great attention in recent years.
no code implementations • 22 Jan 2017 • Omar Montasser, Daniel Kifer
For the task of predicting gender and race/ethnicity counts at the blockgroup-level, an approach adapted from prior work to our problem achieves an average correlation of 0.389 (gender) and 0.569 (race) on a held-out test dataset.
1 code implementation • 24 Oct 2016 • Daniel Kifer, Ryan Rogers
In this paper, we develop new test statistics for private hypothesis testing.
Statistics Theory; Cryptography and Security
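A heavily simplified sketch of the setting in the entry above (not the paper's new test statistics): perturb contingency-table counts with a privacy mechanism, then compute a chi-squared-style statistic on the noisy counts; the paper develops new test statistics for this setting.

```python
import numpy as np

def noisy_chi_square(observed, expected, epsilon, rng=None):
    """Chi-squared-style statistic on Laplace-noised counts (sensitivity 2
    assumed under substitute-one-record adjacency). Illustrative only."""
    rng = np.random.default_rng() if rng is None else rng
    noisy = np.asarray(observed, float) + rng.laplace(scale=2.0 / epsilon, size=len(observed))
    expected = np.asarray(expected, float)
    return float(np.sum((noisy - expected) ** 2 / expected))

print(noisy_chi_square([30, 14, 6], [25, 15, 10], epsilon=1.0))
```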
no code implementations • 26 Jan 2016 • Alexander G. Ororbia II, C. Lee Giles, Daniel Kifer
Many previous proposals for adversarial training of deep neural nets have included directly modifying the gradient, training on a mix of original and adversarial examples, using contractive penalties, and approximately optimizing constrained adversarial objective functions.