Search Results for author: Shanshan Tang

Found 6 papers, 1 paper with code

Recurrence-Free Survival Prediction for Anal Squamous Cell Carcinoma Chemoradiotherapy using Planning CT-based Radiomics Model

no code implementations • 5 Sep 2023 • Shanshan Tang, Kai Wang, David Hein, Gloria Lin, Nina N. Sanford, Jing Wang

Conclusions: A combined model of treatment planning CT-based radiomics and clinical features showed improved prognostic performance in predicting RFS for ASCC patients treated with CRT, compared to a model using clinical features only.
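
The listing gives only the abstract, but a minimal sketch can illustrate the general idea of a combined radiomics-plus-clinical survival model. The sketch below fits a Cox proportional-hazards model (via lifelines) on synthetic data; the feature names, the synthetic cohort, and the choice of a Cox model are assumptions for illustration, not the paper's actual pipeline.

```python
# Illustrative only: a combined radiomics + clinical Cox model for RFS.
# The synthetic cohort, feature names, and use of lifelines are assumptions.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n = 120
df = pd.DataFrame({
    # "radiomics" features, e.g. texture statistics from the planning CT
    "ct_glcm_entropy": rng.normal(2.0, 0.3, n),
    "ct_firstorder_mean": rng.normal(45.0, 5.0, n),
    # clinical covariates
    "age": rng.integers(40, 80, n),
    "tumor_stage": rng.integers(1, 4, n),
})
# Synthetic recurrence-free survival outcome loosely tied to the covariates.
risk = 0.8 * df["ct_glcm_entropy"] + 0.02 * df["age"] + 0.3 * df["tumor_stage"]
df["rfs_months"] = rng.exponential(36.0 * np.exp(-(risk - risk.mean())))
df["recurrence"] = rng.integers(0, 2, n)  # 1 = recurrence/event observed

# Combined model: radiomics and clinical features fitted together.
cph = CoxPHFitter()
cph.fit(df, duration_col="rfs_months", event_col="recurrence")
print(f"concordance index: {cph.concordance_index_:.3f}")
```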

Feature Selection, Survival Prediction

Joint localization and classification of breast tumors on ultrasound images using a novel auxiliary attention-based framework

no code implementations • 11 Oct 2022 • Zong Fan, Ping Gong, Shanshan Tang, Christine U. Lee, Xiaohui Zhang, Pengfei Song, Shigao Chen, Hua Li

By use of the attention mechanism, the auxiliary lesion-aware network can optimize multi-scale intermediate feature maps and extract rich semantic information to improve classification and localization performance.
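
As a rough illustration of the attention idea described above, here is a hedged PyTorch sketch in which a small attention module reweights multi-scale intermediate feature maps before separate classification and localization heads. The module names, the tiny stand-in backbone, and the head designs are assumptions, not the authors' architecture.

```python
# Hypothetical sketch: an attention module reweights multi-scale intermediate
# feature maps before classification and localization heads.
import torch
import torch.nn as nn

class SpatialAttention(nn.Module):
    """Produce a per-pixel attention map and reweight the feature map."""
    def __init__(self, channels):
        super().__init__()
        self.score = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, feat):
        attn = torch.sigmoid(self.score(feat))   # (B, 1, H, W), values in (0, 1)
        return feat * attn                        # lesion-aware reweighting

class AuxAttentionNet(nn.Module):
    def __init__(self, num_classes=2):
        super().__init__()
        # Tiny stand-in backbone producing two scales of feature maps.
        self.stage1 = nn.Sequential(nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU())
        self.stage2 = nn.Sequential(nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU())
        self.attn1, self.attn2 = SpatialAttention(16), SpatialAttention(32)
        self.cls_head = nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                                      nn.Linear(32, num_classes))
        self.loc_head = nn.Conv2d(32, 4, kernel_size=1)  # coarse box-regression map

    def forward(self, x):
        f1 = self.attn1(self.stage1(x))          # attended multi-scale features
        f2 = self.attn2(self.stage2(f1))
        return self.cls_head(f2), self.loc_head(f2)

# Usage: a batch of single-channel ultrasound-like images.
logits, boxes = AuxAttentionNet()(torch.randn(2, 1, 128, 128))
print(logits.shape, boxes.shape)  # torch.Size([2, 2]) torch.Size([2, 4, 32, 32])
```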

Classification, Lesion Detection

ChebNet: Efficient and Stable Constructions of Deep Neural Networks with Rectified Power Units via Chebyshev Approximations

2 code implementations • 7 Nov 2019 • Shanshan Tang, Bo Li, Haijun Yu

As spectral accuracy is hard to obtain by direct training of deep neural networks, ChebNets provide a practical way to achieve it and are expected to be useful in real applications that require efficient approximations of smooth functions.
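
For context on "spectral accuracy", the short sketch below shows the exponential error decay of Chebyshev approximations of a smooth function, which is the property the abstract says ChebNets carry over to RePU networks. It only illustrates the underlying Chebyshev fit with NumPy, not the paper's network construction.

```python
# Spectral accuracy of Chebyshev approximation for a smooth function:
# the max error drops roughly exponentially with the polynomial degree.
import numpy as np
from numpy.polynomial import chebyshev as C

f = np.exp                                          # smooth target on [-1, 1]
x = np.cos(np.pi * (np.arange(200) + 0.5) / 200)    # Chebyshev-type sample points
grid = np.linspace(-1.0, 1.0, 1001)

for deg in (2, 4, 8, 16):
    coefs = C.chebfit(x, f(x), deg)                 # least-squares Chebyshev fit
    err = np.max(np.abs(C.chebval(grid, coefs) - f(grid)))
    print(f"degree {deg:2d}: max error {err:.2e}")
```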

PowerNet: Efficient Representations of Polynomials and Smooth Functions by Deep Neural Networks with Rectified Power Units

no code implementations • 9 Sep 2019 • Bo Li, Shanshan Tang, Haijun Yu

In this paper, we construct deep neural networks with rectified power units (RePU), which can give better approximations for smooth functions.
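
A minimal sketch of the rectified power unit itself may help: RePU with power s is sigma_s(x) = max(0, x)^s, and with s = 2 two units already reproduce x^2 exactly, which hints at why RePU networks can represent polynomials efficiently. The function name and example below are illustrative, not taken from the paper's code.

```python
# Rectified power unit (RePU): sigma_s(x) = max(0, x)**s (s = 1 recovers ReLU).
# With s = 2, two RePU units reproduce x**2 exactly.
import numpy as np

def repu(x, s=2):
    """Rectified power unit: ReLU raised to the power s."""
    return np.maximum(0.0, x) ** s

x = np.linspace(-3.0, 3.0, 7)
exact_square = x ** 2
via_repu = repu(x, 2) + repu(-x, 2)     # sigma_2(x) + sigma_2(-x) == x**2
print(np.allclose(exact_square, via_repu))  # True
```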

Better Approximations of High Dimensional Smooth Functions by Deep Neural Networks with Rectified Power Units

no code implementations • 14 Mar 2019 • Bo Li, Shanshan Tang, Haijun Yu

Compared to the results on ReLU networks, the sizes of the RePU networks required to approximate functions in Sobolev space and Korobov space with an error tolerance $\varepsilon$, obtained by our constructive proofs, are in general $\mathcal{O}(\log \frac{1}{\varepsilon})$ times smaller than the sizes of the corresponding ReLU networks.
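
Stated symbolically (the notation is assumed here, with $N(\varepsilon)$ denoting the size of a network that reaches error tolerance $\varepsilon$ in the Sobolev or Korobov setting), the comparison above reads:

```latex
\[
  N_{\mathrm{ReLU}}(\varepsilon)
  \;=\;
  \mathcal{O}\!\left(\log \tfrac{1}{\varepsilon}\right) \cdot N_{\mathrm{RePU}}(\varepsilon).
\]
```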

Numerical Analysis

Application of Bounded Total Variation Denoising in Urban Traffic Analysis

no code implementations • 4 Aug 2018 • Shanshan Tang, Haijun Yu

While denoising is often believed to be unnecessary in many big data applications, we show in this paper that it is helpful in urban traffic analysis by applying bounded total variation denoising to the urban road traffic prediction and clustering problems.
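
For a concrete picture of what TV denoising does to a traffic signal, here is an illustrative sketch that applies standard 1-D total variation denoising from scikit-image to a synthetic speed series; the paper's bounded total variation variant is not reproduced here, and the sampling interval and signal parameters are assumptions.

```python
# Illustrative stand-in: standard total variation (TV) denoising from
# scikit-image applied to a synthetic traffic-speed signal.
import numpy as np
from skimage.restoration import denoise_tv_chambolle

rng = np.random.default_rng(1)
t = np.arange(288)                            # assumed 5-minute samples over one day
speed = np.full(t.size, 55.0)                 # free-flow speed (km/h, assumed)
speed[(t > 90) & (t < 120)] = 25.0            # morning congestion
speed[(t > 200) & (t < 230)] = 35.0           # evening congestion
noisy = speed + rng.normal(0.0, 5.0, t.size)  # sensor noise

denoised = denoise_tv_chambolle(noisy, weight=5.0)  # piecewise-constant fit
print("noisy RMSE   :", round(float(np.sqrt(np.mean((noisy - speed) ** 2))), 2))
print("denoised RMSE:", round(float(np.sqrt(np.mean((denoised - speed) ** 2))), 2))
```

In this toy example the TV fit typically tracks the piecewise speed profile more closely than the raw noisy signal, which is the kind of structure a prediction or clustering step can then exploit.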

Clustering, Denoising, +1
