1 code implementation • 21 Jul 2023 • Jiachen Yao, Yikai Zhang, Songzhu Zheng, Mayank Goswami, Prateek Prasanna, Chao Chen
However, segmentation label noise usually exhibits strong spatial correlation and a prominent distributional bias.
no code implementations • 15 Feb 2023 • Mayank Goswami
A simple-to-implement analytical equation is shown to work for brightness manipulation, yielding a 1% increase in mean pixel values and a 77% decrease in the number of zeros.
no code implementations • 27 May 2022 • Sunita Khod, Akshay Dvivedi, Mayank Goswami
It is found that ProJet MJP produces the best-quality printed samples, with the least surface roughness and a porosity closest to the actual value.
no code implementations • 16 May 2022 • Kajal Kumari, Mayank Goswami
The analysis shows that measurement data with normal distribution inflicts the least noise in inverse recovery.
no code implementations • 24 Mar 2022 • Wenjia Zhang, Yikai Zhang, Xiaoling Hu, Mayank Goswami, Chao Chen, Dimitris Metaxas
Assuming data lies in a manifold, we investigate two new types of adversarial risk, the normal adversarial risk due to perturbation along normal direction, and the in-manifold adversarial risk due to perturbation within the manifold.
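The decomposition above can be illustrated with a toy sketch (not the paper's construction): take the unit circle in R^2 as the data manifold and split a perturbation at a point into its normal and tangent (in-manifold) components by orthogonal projection.

```python
import numpy as np

# Toy illustration: on the unit circle, the normal direction at a point x
# is x itself, and the tangent (in-manifold) direction is perpendicular to x.
x = np.array([1.0, 0.0])          # a point on the manifold
delta = np.array([0.3, 0.4])      # an adversarial perturbation at x

normal = x / np.linalg.norm(x)                  # unit normal at x
delta_normal = np.dot(delta, normal) * normal   # component along the normal
delta_tangent = delta - delta_normal            # in-manifold component

# The two components are orthogonal and sum back to the full perturbation,
# mirroring the split of adversarial risk into normal and in-manifold parts.
assert np.allclose(delta_normal + delta_tangent, delta)
assert abs(np.dot(delta_normal, delta_tangent)) < 1e-12
print(delta_normal, delta_tangent)
```

The point names and the circle manifold are assumptions chosen for clarity; the paper's analysis applies to general data manifolds.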
no code implementations • 17 Aug 2021 • Ankur Kumar, Prasunika Khare, Mayank Goswami
Other performance indices show that the FFT method processes the UCT signal with the best recovery.
no code implementations • 9 Jul 2021 • Mayank Goswami
U-net with a VGG16 backbone performs best for the malignant tumor dataset with treatment (which has considerable variation), and U-net with an Inception backbone performs best for the benign tumor dataset (with minor variation).
no code implementations • 9 Jul 2021 • Prasunika Khare, Mayank Goswami
For this design, the algorithm selected Support Vector Machine (SVM) as the best option, with an accuracy of 96%, an F1-score of 95.83%, and an MCC of 92.30%.
no code implementations • NeurIPS 2021 • Songzhu Zheng, Yikai Zhang, Hubert Wagner, Mayank Goswami, Chao Chen
Deep neural networks are known to have security issues.
1 code implementation • ICLR 2021 • Yikai Zhang, Songzhu Zheng, Pengxiang Wu, Mayank Goswami, Chao Chen
Label noise is frequently observed in real-world large-scale datasets.
Ranked #12 on Learning with noisy labels on ANIMAL
no code implementations • 10 Feb 2021 • Yikai Zhang, Wenjia Zhang, Sammy Bald, Vamsi Pingali, Chao Chen, Mayank Goswami
This raises the question: is the stability analysis of [18] tight for smooth functions, and if not, for what kind of loss functions and data distributions can the stability analysis be improved?
no code implementations • 1 Jan 2021 • Yikai Zhang, Samuel Bald, Wenjia Zhang, Vamsi Pritham Pingali, Chao Chen, Mayank Goswami
We provide empirical evidence that this condition holds for several loss functions, and provide theoretical evidence that the known tight SGD stability bounds for convex and non-convex loss functions can be circumvented by HC loss functions, thus partially explaining the generalization of deep neural networks.
1 code implementation • NeurIPS 2020 • Pengxiang Wu, Songzhu Zheng, Mayank Goswami, Dimitris Metaxas, Chao Chen
Noisy labels can impair the performance of deep neural networks.
3 code implementations • ICML 2020 • Songzhu Zheng, Pengxiang Wu, Aman Goswami, Mayank Goswami, Dimitris Metaxas, Chao Chen
To be robust against label noise, many successful methods rely on noisy classifiers (i.e., models trained on the noisy training data) to determine whether a label is trustworthy.
Ranked #40 on Image Classification on Clothing1M
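The general idea of using a noisy classifier to judge label trustworthiness can be sketched as follows. This is a hedged toy example, not the paper's exact criterion: the function name, the probability threshold, and the example data are all illustrative assumptions.

```python
import numpy as np

def flag_untrustworthy(probs, labels, threshold=0.5):
    """Toy trust check (illustrative, not the paper's method):
    probs  -- (n, k) predicted class probabilities from a noisy classifier
    labels -- (n,) given (possibly noisy) labels
    Returns a boolean mask that is True where the given label looks
    untrustworthy, i.e., the model assigns it low probability."""
    label_prob = probs[np.arange(len(labels)), labels]
    return label_prob < threshold

probs = np.array([[0.9, 0.1],    # confidently class 0
                  [0.2, 0.8],    # confidently class 1
                  [0.3, 0.7]])   # leans class 1
labels = np.array([0, 0, 1])     # the second label disagrees with the model

print(flag_untrustworthy(probs, labels))  # [False  True False]
```

Only the second label is flagged, since the classifier assigns it probability 0.2, below the (assumed) 0.5 threshold.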
no code implementations • 25 Sep 2019 • Songzhu Zheng, Pengxiang Wu, Aman Goswami, Mayank Goswami, Dimitris Metaxas, Chao Chen
Collecting large-scale annotated data inevitably introduces label noise, i.e., incorrect class labels.