no code implementations • ECNLP (ACL) 2022 • Fan Yang, Alireza Bagheri Garakani, Yifei Teng, Yan Gao, Jia Liu, Jingyuan Deng, Yi Sun
In E-commerce search, spelling correction plays an important role in helping customers find the desired products when processing user-typed search queries.
no code implementations • 14 Nov 2023 • Dalong Zheng, Zebin Wu, Jia Liu, Chih-Cheng Hung, Zhihui Wei
In order to fully mine these three kinds of change features, we propose a triple-branch network that combines a transformer and a convolutional neural network (CNN) to extract and fuse these change features from the two perspectives of global and local information, respectively.
no code implementations • 14 Nov 2023 • Zi Yin, Wei Ding, Jia Liu
Large Language Models (LLMs) are central to a multitude of applications but pose significant risks, notably in generating harmful content and biases.
no code implementations • 8 Nov 2023 • Tianchen Zhou, Jia Liu, Yang Jiao, Chaosheng Dong, Yetian Chen, Yan Gao, Yi Sun
Online learning to rank (ONL2R) is a foundational problem for recommender systems and has received increasing attention in recent years.
no code implementations • 12 Sep 2023 • Moyan Li, Jinmiao Fu, Shaoyuan Xu, Huidong Liu, Jia Liu, Bryan Wang
Unlike public data, another practical challenge on shopping websites is that some paired images are of low quality.
no code implementations • 22 Aug 2023 • Dalong Zheng, Zebin Wu, Jia Liu, Zhihui Wei
Therefore, based on Swin Transformer V2 (Swin V2) and VGG16, we propose an end-to-end compounded dense network, SwinV2DNet, to inherit the advantages of both the transformer and the CNN and to overcome the shortcomings of existing networks in feature learning.
1 code implementation • 22 Jul 2023 • Zhixing Zhang, Ziwei Zhao, Dong Wang, Shishuang Zhao, Yuhang Liu, Jia Liu, LiWei Wang
Automatic labeling of coronary arteries is an essential task in the practical diagnosis process of cardiovascular diseases.
no code implementations • 21 Jun 2023 • Hao Xu, Jia Liu, Yang shen, Kenan Lou, Yanxia Bao, Ruihua Zhang, Shuyue Zhou, Hongsen Zhao, Shuai Wang
However, by analyzing the statistical characteristics of activated units after pooling, we found that a large number of the units dropped by sorting pooling are negative-valued units that contain useful information and can contribute considerably to the final decision.
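For concreteness, here is a minimal numpy sketch of this observation, assuming a generic 1-D sort-pooling layer with Gaussian activations (both illustrative assumptions, not the paper's setup): the layer keeps only the k largest activations, and we check how many of the dropped units are negative-valued.

```python
# Minimal sketch: standard sorting pooling keeps the k largest activations,
# so negative-valued units are the first to be dropped.
import numpy as np

def sort_pooling(activations, k):
    """Keep the k largest activations in descending order (standard sort pooling)."""
    return np.sort(activations)[::-1][:k]

rng = np.random.default_rng(0)
units = rng.normal(size=128)          # illustrative post-convolution activations
k = 32
kept = sort_pooling(units, k)
dropped = np.sort(units)[::-1][k:]    # the units the pooling layer discards
print(f"dropped units that are negative-valued: {(dropped < 0).mean():.0%}")
```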
no code implementations • 19 Jun 2023 • Minghe Zhang, Chaosheng Dong, Jinmiao Fu, Tianchen Zhou, Jia Liang, Jia Liu, Bo Liu, Michinari Momma, Bryan Wang, Yan Gao, Yi Sun
In this paper, we introduce AdaSelection, an adaptive sub-sampling method to identify the most informative sub-samples within each minibatch to speed up the training of large-scale deep learning models without sacrificing model performance.
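A hedged PyTorch sketch of adaptive sub-sampling inside a minibatch; the per-sample-loss criterion and the keep_ratio used here are illustrative assumptions, not necessarily AdaSelection's actual selection rule.

```python
# Sketch: backpropagate only through the most informative fraction of each minibatch.
import torch
import torch.nn.functional as F

def adaptive_subsample_step(model, optimizer, x, y, keep_ratio=0.5):
    with torch.no_grad():
        per_sample_loss = F.cross_entropy(model(x), y, reduction="none")
    k = max(1, int(keep_ratio * x.size(0)))
    idx = per_sample_loss.topk(k).indices          # "most informative" = highest loss (assumption)
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x[idx]), y[idx])  # train on the selected sub-samples only
    loss.backward()
    optimizer.step()
    return loss.item()
```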
no code implementations • 7 Jun 2023 • Hongru Liang, Jia Liu, Weihong Du, dingnan jin, Wenqiang Lei, Zujie Wen, Jiancheng Lv
The machine reading comprehension (MRC) of user manuals has huge potential in customer service.
1 code implementation • 5 Jun 2023 • Xinrui Zhou, Yuhao Huang, Wufeng Xue, Xin Yang, Yuxin Zou, Qilong Ying, Yuanji Zhang, Jia Liu, Jie Ren, Dong Ni
First, to avoid the requirement of laborious and unreliable annotation, we propose a novel and effective video classification network for weakly-supervised CSG.
1 code implementation • 31 Mar 2023 • Anbai Jiang, Wei-Qiang Zhang, Yufeng Deng, Pingyi Fan, Jia Liu
Automatic detection of machine anomalies remains challenging for machine learning.
1 code implementation • 21 Mar 2023 • Haisong Liu, Tao Lu, Yihui Xu, Jia Liu, LiMin Wang
To fuse dense image features and sparse point features, we propose a learnable operator named bidirectional camera-LiDAR fusion module (Bi-CLFM).
Ranked #1 on Scene Flow Estimation on KITTI 2015 Scene Flow Test
no code implementations • 5 Mar 2023 • Zhuqing Liu, Xin Zhang, Songtao Lu, Jia Liu
Decentralized min-max optimization problems with domain constraints underpin many important ML applications, including multi-agent ML fairness assurance and policy evaluation in multi-agent reinforcement learning.
no code implementations • 20 Jan 2023 • Luyao Chen, Zhiqiang Chen, Longsheng Jiang, Xiang Liu, Linlu Xu, Bo Zhang, Xiaolong Zou, Jinying Gao, Yu Zhu, Xizi Gong, Shan Yu, Sen Song, Liangyi Chen, Fang Fang, Si Wu, Jia Liu
In recent years, we have witnessed the great success of AI in various applications, including image classification, game playing, protein structure analysis, language translation, and content generation.
no code implementations • 13 Dec 2022 • Minghong Fang, Jia Liu, Neil Zhenqiang Gong, Elizabeth S. Bentley
Asynchronous FL aims to address this challenge by enabling the server to update the model once any client's model update reaches it without waiting for other clients' model updates.
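A minimal Python sketch of the server-side rule described here: the global model is updated as soon as any client update arrives, with no synchronization barrier. The staleness-based damping weight is an illustrative assumption, not part of the description above.

```python
# Sketch: apply each client update on arrival instead of waiting for a full round.
def async_server_loop(global_model, incoming_updates, lr=1.0):
    """incoming_updates: iterable of (client_update, staleness) in order of arrival."""
    version = 0
    for update, staleness in incoming_updates:
        weight = 1.0 / (1.0 + staleness)            # damp stale updates (assumption)
        global_model = global_model + lr * weight * update
        version += 1                                # model advances without any barrier
    return global_model, version
```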
no code implementations • 13 Dec 2022 • Minghong Fang, Jia Liu, Michinari Momma, Yi Sun
In this paper, we propose a new approach called fair recommendation with optimized antidote data (FairRoad), which aims to improve the fairness performance of recommender systems through the construction of a small and carefully crafted antidote dataset.
no code implementations • 5 Dec 2022 • Peiwen Qiu, Yining Li, Zhuqing Liu, Prashant Khanduri, Jia Liu, Ness B. Shroff, Elizabeth Serena Bentley, Kurt Turck
Decentralized bilevel optimization has received increasing attention recently due to its foundational role in many emerging multi-agent learning paradigms (e.g., multi-agent meta-learning and multi-agent reinforcement learning) over peer-to-peer edge networks.
no code implementations • 3 Oct 2022 • Haibo Yang, Peiwen Qiu, Jia Liu
A key assumption in most existing works on FL algorithms' convergence analysis is that the noise in stochastic first-order information has a finite variance.
no code implementations • 2 Oct 2022 • Haibo Yang, Zhuqing Liu, Xin Zhang, Jia Liu
To lower the communication complexity of federated min-max learning, a natural approach is to use infrequent communication (through multiple local updates), as in conventional federated learning.
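A minimal numpy sketch of this idea, assuming per-client stochastic gradient oracles: each client runs several local stochastic gradient descent-ascent (SGDA) steps on the min variable x and the max variable y, and the server averages only once per round.

```python
# Sketch: federated min-max learning with multiple local updates per communication round.
import numpy as np

def local_sgda_round(clients, x, y, local_steps=5, eta=0.01):
    """clients: list of (grad_x, grad_y) stochastic gradient oracles."""
    xs, ys = [], []
    for grad_x, grad_y in clients:
        xi, yi = x.copy(), y.copy()
        for _ in range(local_steps):        # local updates, no communication
            xi = xi - eta * grad_x(xi, yi)  # descent on the min variable
            yi = yi + eta * grad_y(xi, yi)  # ascent on the max variable
        xs.append(xi)
        ys.append(yi)
    return np.mean(xs, axis=0), np.mean(ys, axis=0)  # one round of averaging
```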
no code implementations • 23 Sep 2022 • Liang Jiang, Zhenyu Huang, Jia Liu, Zujie Wen, Xi Peng
Such a process will inevitably introduce mismatched pairs (i.e., noisy correspondence) due to i) the unavailability of QA pairs in target documents, and ii) the domain shift when applying the QA construction model to the target domain.
no code implementations • 17 Aug 2022 • Zhuqing Liu, Xin Zhang, Jia Liu
To increase the training speed of distributed learning, recent years have witnessed a significant amount of interest in developing both synchronous and asynchronous distributed stochastic variance-reduced optimization methods.
no code implementations • 17 Aug 2022 • Xin Zhang, Minghong Fang, Zhuqing Liu, Haibo Yang, Jia Liu, Zhengyuan Zhu
Moreover, whether or not the linear speedup for convergence is achievable under fully decentralized FL with data heterogeneity remains an open question.
no code implementations • 27 Jul 2022 • Zhuqing Liu, Xin Zhang, Prashant Khanduri, Songtao Lu, Jia Liu
Our main contributions in this paper are two-fold: i) We first propose a deterministic algorithm called INTERACT (inner-gradient-descent-outer-tracked-gradient) that requires a sample complexity of $\mathcal{O}(n \epsilon^{-1})$ and a communication complexity of $\mathcal{O}(\epsilon^{-1})$ to solve the bilevel optimization problem, where $n$ and $\epsilon > 0$ are the number of samples at each agent and the desired stationarity gap, respectively.
no code implementations • 12 Jul 2022 • Jia Liu, Ran Cheng, Yaochu Jin
First, we formulate the NAS problem for enhancing adversarial robustness of deep neural networks into a multiobjective optimization problem.
no code implementations • 2 Jun 2022 • Donghui Li, Jia Liu, Fang Liu, Wenhua Zhang, Andi Zhang, Wenfei Gao, Jiao Shi
With the better representation capability of optical images, we propose to enrich SAR images with generated optical images via a generative adversarial network (GAN) trained on numerous SAR and optical images.
no code implementations • 12 May 2022 • Haibo Yang, Peiwen Qiu, Jia Liu, Aylin Yener
In order to fully utilize this advantage while providing comparable learning performance to conventional federated learning that presumes model aggregation via noiseless channels, we consider the joint design of transmission scaling and the number of local iterations at each round, given the power constraint at each edge device.
1 code implementation • 1 Apr 2022 • Jia Liu, Wenjie Xuan, Yuhang Gan, Juhua Liu, Bo Du
In this paper, we propose an end-to-end Supervised Domain Adaptation framework for cross-domain Change Detection, namely SDACD, to effectively alleviate the domain shift between bi-temporal images for better change predictions.
Change Detection • Change detection for remote sensing images
no code implementations • 14 Jan 2022 • Benjamin Remy, Francois Lanusse, Niall Jeffrey, Jia Liu, Jean-Luc Starck, Ken Osato, Tim Schrabback
We introduce a novel methodology that allows efficient sampling of the high-dimensional Bayesian posterior of the weak lensing mass-mapping problem and relies on simulations to define a fully non-Gaussian prior.
1 code implementation • NeurIPS 2021 • Wenbo Ren, Jia Liu, Ness Shroff
Here, a multi-wise comparison takes $m$ items as input and returns a (noisy) result about the best item (the winner feedback) or the order of these items (the full-ranking feedback).
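A toy oracle illustrating the two feedback types, under a Plackett-Luce utility model (an assumption not stated above): adding Gumbel noise to the log-utilities and sorting yields a sampled ranking, from which either the winner or the full ranking is returned.

```python
# Toy noisy m-wise comparison oracle (Plackett-Luce via the Gumbel-max trick).
import numpy as np

def mwise_oracle(log_utilities, items, feedback="winner", rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    items = np.asarray(items)
    noisy = log_utilities[items] + rng.gumbel(size=len(items))
    ranking = items[np.argsort(-noisy)]              # sampled order of the m items
    return ranking[0] if feedback == "winner" else ranking

log_w = np.log(np.array([3.0, 1.0, 2.0, 0.5]))       # illustrative item utilities
print(mwise_oracle(log_w, items=[0, 1, 2], feedback="full-ranking"))
```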
no code implementations • NeurIPS 2021 • Xin Zhang, Zhuqing Liu, Jia Liu, Zhengyuan Zhu, Songtao Lu
To our knowledge, this paper is the first work that achieves both $\mathcal{O}(\epsilon^{-2})$ sample complexity and $\mathcal{O}(\epsilon^{-2})$ communication complexity in decentralized policy evaluation for cooperative MARL.
Multi-agent Reinforcement Learning • Reinforcement Learning (RL)
1 code implementation • CVPR 2022 • Haisong Liu, Tao Lu, Yihui Xu, Jia Liu, Wenjie Li, Lijun Chen
In this paper, we study the problem of jointly estimating the optical flow and scene flow from synchronized 2D and 3D data.
Ranked #3 on Scene Flow Estimation on Spring
no code implementations • ICLR 2022 • Tianxiang Gao, Hailiang Liu, Jia Liu, Hridesh Rajan, Hongyang Gao
Implicit deep learning has received increasing attention recently because it generalizes the recursive prediction rules of many commonly used neural network architectures.
no code implementations • ICLR 2022 • Tianchen Zhou, Jia Liu, Chaosheng Dong, Yi Sun
We show that the delay impacts in both cases can still be upper bounded by an additive penalty on both the regret and total incentive costs.
no code implementations • ICLR 2022 • FNU Hairi, Jia Liu, Songtao Lu
In this paper, we establish the first finite-time convergence result of the actor-critic algorithm for fully decentralized multi-agent reinforcement learning (MARL) problems with average reward.
Multi-agent Reinforcement Learning • reinforcement-learning
no code implementations • ICLR 2022 • Prashant Khanduri, Haibo Yang, Mingyi Hong, Jia Liu, Hoi To Wai, Sijia Liu
We analyze the optimization and the generalization performance of the proposed framework for the $\ell_2$ loss.
1 code implementation • 17 Sep 2021 • Chris Cummins, Bram Wasti, Jiadong Guo, Brandon Cui, Jason Ansel, Sahir Gomez, Somya Jain, Jia Liu, Olivier Teytaud, Benoit Steiner, Yuandong Tian, Hugh Leather
What is needed is an easy, reusable experimental infrastructure for real world compiler optimization tasks that can serve as a common benchmark for comparing techniques, and as a platform to accelerate progress in the field.
no code implementations • 23 Aug 2021 • Haibo Yang, Xin Zhang, Prashant Khanduri, Jia Liu
To satisfy the need for flexible worker participation, we consider a new FL paradigm called "Anarchic Federated Learning" (AFL) in this paper.
no code implementations • 25 Jul 2021 • Fengjiao Li, Jia Liu, Bo Ji
Considering the achieved training accuracy of the global model as the utility of the selected workers, which is typically a monotone submodular function, we formulate the worker selection problem as a new multi-round monotone submodular maximization problem with cardinality and fairness constraints.
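A rough greedy sketch of this formulation; the specific fairness rule (first serving workers below a minimum long-run selection rate) and the utility oracle interface are illustrative assumptions, not the paper's algorithm.

```python
# Sketch: per-round greedy maximization of a monotone submodular utility
# under a cardinality constraint, with a simple fairness pass.
def greedy_fair_selection(workers, utility, cardinality, counts, round_idx, min_rate=0.1):
    selected = []
    for w in workers:                                   # fairness pass (assumption)
        if len(selected) < cardinality and counts[w] < min_rate * round_idx:
            selected.append(w)
    while len(selected) < min(cardinality, len(workers)):
        best = max((w for w in workers if w not in selected),
                   key=lambda w: utility(selected + [w]) - utility(selected))
        selected.append(best)                           # largest marginal utility gain
    for w in selected:
        counts[w] += 1                                  # track long-run selection counts
    return selected
```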
no code implementations • 4 Jul 2021 • Ziwei Cong, Jia Liu, Puneet Manchanda
Over the post-livestream period, the demand for the recorded version is still sensitive to price, but much less than in the pre-livestream period.
no code implementations • NeurIPS 2021 • Prashant Khanduri, Pranay Sharma, Haibo Yang, Mingyi Hong, Jia Liu, Ketan Rajawat, Pramod K. Varshney
Despite extensive research, for a generic non-convex FL problem, it is not clear how to choose the WNs' and the server's update directions, the minibatch sizes, and the local update frequency, so that the WNs use the minimum number of samples and communication rounds to achieve the desired solution.
no code implementations • 14 Jun 2021 • Haibo Yang, Jia Liu, Elizabeth S. Bentley
This matches the convergence rate of distributed/federated learning without compression, thus achieving high communication efficiency while not sacrificing learning accuracy in FL.
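As one common way to obtain such communication savings, a top-k sparsification compressor for model updates can be sketched as below; this is a generic compressor, not necessarily the scheme analyzed in the paper.

```python
# Sketch: keep only the k largest-magnitude entries of a model update.
import numpy as np

def topk_compress(update, k):
    idx = np.argsort(np.abs(update))[-k:]   # indices of the k largest-magnitude entries
    compressed = np.zeros_like(update)
    compressed[idx] = update[idx]
    return compressed                        # only k values (plus indices) need transmitting
```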
no code implementations • 19 May 2021 • Tianchen Zhou, Jia Liu, Chaosheng Dong, Jingyuan Deng
In this paper, we investigate a new multi-armed bandit (MAB) online learning model that considers real-world phenomena in many recommender systems: (i) the learning agent cannot pull the arms by itself and thus has to offer rewards to users to incentivize arm-pulling indirectly; and (ii) if users with specific arm preferences are well rewarded, they induce a "self-reinforcing" effect in the sense that they will attract more users of similar arm preferences.
no code implementations • 4 May 2021 • Xin Zhang, Jia Liu, Zhengyuan Zhu, Elizabeth S. Bentley
Decentralized nonconvex optimization has received increasing attention in recent years in machine learning due to its advantages in system robustness, data privacy, and implementation simplicity.
no code implementations • 18 Feb 2021 • Minghong Fang, Minghao Sun, Qi Li, Neil Zhenqiang Gong, Jin Tian, Jia Liu
Our empirical results show that the proposed defenses can substantially reduce the estimation errors of the data poisoning attacks.
no code implementations • ICLR 2021 • Haibo Yang, Minghong Fang, Jia Liu
Our results also reveal that the local steps in FL could help the convergence and show that the maximum number of local steps can be improved to $T/m$ in full worker participation.
no code implementations • 16 Jan 2021 • Jia Liu, Yaochu Jin
Many existing deep learning models are vulnerable to adversarial examples that are imperceptible to humans.
no code implementations • 4 Jan 2021 • Yu Mei, Jia Liu, Zhiping Chen
We consider a distributionally robust second-order stochastic dominance constrained optimization problem.
Optimization and Control (90C15, 91B70, 90C31, 90-08)
1 code implementation • 27 Dec 2020 • Xiaoyu Cao, Minghong Fang, Jia Liu, Neil Zhenqiang Gong
Finally, the service provider computes the average of the normalized local model updates weighted by their trust scores as a global model update, which is used to update the global model.
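A minimal numpy sketch of the aggregation rule in this sentence: normalize each local model update, then take the trust-score-weighted average. How the trust scores and the reference norm are computed is not shown here.

```python
# Sketch: trust-score-weighted average of normalized local model updates.
import numpy as np

def trusted_aggregate(local_updates, trust_scores, reference_norm=1.0):
    """local_updates: list of 1-D arrays; trust_scores: non-negative floats."""
    normalized = [reference_norm * u / (np.linalg.norm(u) + 1e-12) for u in local_updates]
    weights = np.asarray(trust_scores, dtype=float)
    weights = weights / (weights.sum() + 1e-12)
    return sum(w * u for w, u in zip(weights, normalized))   # global model update
```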
1 code implementation • 19 Oct 2020 • Ken Osato, Jia Liu, Zoltán Haiman
The $\kappa$TNG suite includes 10,000 realisations of $5 \times 5 \, \mathrm{deg}^2$ maps for 40 source redshifts up to $z_s = 2.6$, well covering the range of interest for existing and upcoming weak lensing surveys.
Cosmology and Nongalactic Astrophysics
no code implementations • 26 Aug 2020 • Jia Liu, Navin McGinnis, Carlos E. M. Wagner, Xiao-Ping Wang
Searches for weakly interacting particles are one of the main goals of the high-luminosity LHC run.
High Energy Physics - Phenomenology • High Energy Physics - Experiment
no code implementations • 6 Jul 2020 • Wenbo Ren, Xingyu Zhou, Jia Liu, Ness B. Shroff
To handle this dilemma, we adopt differential privacy and study the regret upper and lower bounds for MAB algorithms with a given LDP guarantee.
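A hedged sketch of one way to combine UCB with an LDP guarantee: every observed reward is privatized with the Laplace mechanism before the learner updates its statistics. The index and noise scale are illustrative; the paper's algorithms and bounds may differ.

```python
# Sketch: UCB where the learner only ever sees Laplace-privatized rewards.
import numpy as np

def ldp_ucb(arms, horizon, epsilon, rng=None):
    """arms: list of callables rng -> reward in [0, 1]; epsilon: LDP privacy budget."""
    if rng is None:
        rng = np.random.default_rng(0)
    n_arms = len(arms)
    counts, sums = np.zeros(n_arms), np.zeros(n_arms)
    for t in range(1, horizon + 1):
        if t <= n_arms:
            a = t - 1                                        # pull each arm once
        else:
            bonus = np.sqrt(2.0 * np.log(t) / counts)        # standard UCB exploration bonus
            a = int(np.argmax(sums / counts + bonus))
        reward = arms[a](rng)
        private = reward + rng.laplace(scale=1.0 / epsilon)  # Laplace mechanism (sensitivity 1)
        counts[a] += 1
        sums[a] += private
    return counts
```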
1 code implementation • ICML 2020 • Wenbo Ren, Jia Liu, Ness B. Shroff
From a given set of items, the learner can make pairwise comparisons on every pair of items, and each comparison returns an independent noisy result about the preferred item.
no code implementations • 25 Jun 2020 • Christina Gao, Jia Liu, Lian-Tao Wang, Xiao-Ping Wang, Wei Xue, Yi-Ming Zhong
Meanwhile, they can also scatter with the atoms through the inverse Primakoff process via the axion-photon coupling, which emits a photon and mimics the electronic recoil signals.
High Energy Physics - Phenomenology • High Energy Physics - Experiment
2 code implementations • 4 Mar 2020 • Chandan Singh, Wooseok Ha, Francois Lanusse, Vanessa Boehm, Jia Liu, Bin Yu
Machine learning lies at the heart of new possibilities for scientific discovery, knowledge generation, and artificial intelligence.
no code implementations • 19 Feb 2020 • Minghong Fang, Jia Liu
To address the high mining cost of blockchain networks, in this paper we propose a mining resource allocation algorithm to reduce the mining cost in PoW-based (proof-of-work-based) blockchain networks.
no code implementations • 19 Feb 2020 • Minghong Fang, Neil Zhenqiang Gong, Jia Liu
Given the number of fake users the attacker can inject, we formulate the crafting of rating scores for the fake users as an optimization problem.
1 code implementation • NeurIPS 2020 • Peizhong Ju, Xiaojun Lin, Jia Liu
Under a sparse true linear regression model with $p$ i.i.d.
no code implementations • 15 Jan 2020 • Tianxiang Gao, Songtao Lu, Jia Liu, Chris Chu
Further, we show that the iteration complexity of the proposed method is $\mathcal{O}(n\epsilon^{-2})$ to achieve an $\epsilon$-stationary point, where $n$ is the number of blocks of coordinates.
no code implementations • 12 Jan 2020 • Xin Zhang, Minghong Fang, Jia Liu, Zhengyuan Zhu
In this paper, we consider the problem of jointly improving data privacy and communication efficiency of distributed edge learning, both of which are critical performance metrics in wireless edge network computing.
no code implementations • 16 Dec 2019 • Tianxiang Gao, Songtao Lu, Jia Liu, Chris Chu
In the applications of signal processing and data analytics, there is a wide class of non-convex problems whose objective functions do not satisfy the common global Lipschitz continuous gradient assumption (e.g., the nonnegative matrix factorization (NMF) problem).
no code implementations • 8 Dec 2019 • Guangxia Li, Yulong Shen, Peilin Zhao, Xiao Lu, Jia Liu, Yangyang Liu, Steven C. H. Hoi
Similar to other information systems, a significant threat to industrial control systems is attack from cyberspace: offensive maneuvers launched by "anonymous" actors in the digital world that target computer-based assets with the goal of compromising a system's functions or probing for information.
1 code implementation • 2 Nov 2019 • Jia Liu, Quan Zhou, Yong Qiang, Bin Kang, Xiaofu Wu, Baoyu Zheng
The comprehensive experiments demonstrate that our model achieves state-of-the-art results in terms of the speed-accuracy trade-off on the CityScapes and CamVid datasets.
no code implementations • 13 Oct 2019 • Wenhua Zhang, Licheng Jiao, Jia Liu
Moreover, with the novel expert selection strategy, overfitting caused by fixed experts for each frame can be mitigated.
no code implementations • 10 Sep 2019 • Haibo Yang, Xin Zhang, Minghong Fang, Jia Liu
In this work, we consider the resilience of distributed algorithms based on stochastic gradient descent (SGD) in distributed learning with potentially Byzantine attackers, who could send arbitrary information to the parameter server to disrupt the training process.
1 code implementation • NeurIPS 2019 • Wenbo Ren, Jia Liu, Ness B. Shroff
This paper studies the problem of finding the exact ranking from noisy comparisons.
no code implementations • 26 Aug 2019 • Peng Xie, Tianrui Li, Jia Liu, Shengdong Du, Xin Yang, Junbo Zhang
Urban spatial-temporal flow prediction is of great importance to traffic management, land use, public safety, etc.
no code implementations • 28 May 2019 • Xin Zhang, Jia Liu, Zhengyuan Zhu
In this work, we consider improving model estimation efficiency by aggregating the neighbors' information as well as identifying the subgroup membership for each node in the network.
7 code implementations • 7 May 2019 • Yu Wang, Quan Zhou, Jia Liu, Jian Xiong, Guangwei Gao, Xiaofu Wu, Longin Jan Latecki
LEDNet: A Lightweight Encoder-Decoder Network for Real-time Semantic Segmentation
Ranked #29 on Real-Time Semantic Segmentation on Cityscapes test
no code implementations • 8 Apr 2019 • Jia Liu, Maoguo Gong, Haibo He
In this paper, we propose a nucleus neural network (NNN) and corresponding connecting architecture learning method.
12 code implementations • ICCV 2019 • Manolis Savva, Abhishek Kadian, Oleksandr Maksymets, Yili Zhao, Erik Wijmans, Bhavana Jain, Julian Straub, Jia Liu, Vladlen Koltun, Jitendra Malik, Devi Parikh, Dhruv Batra
We present Habitat, a platform for research in embodied artificial intelligence (AI).
Ranked #2 on PointGoal Navigation on Gibson PointGoal Navigation
no code implementations • 15 Jan 2019 • Fengjiao Li, Jia Liu, Bo Ji
To tackle this new problem, we extend an online learning algorithm, UCB, to deal with a critical tradeoff between exploitation and exploration and employ the virtual queue technique to properly handle the fairness constraints.
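A hedged sketch of one arm-selection step combining a UCB index with per-arm virtual queues, in a Lyapunov drift-plus-penalty style; the trade-off parameter V and the exact queue update are illustrative assumptions.

```python
# Sketch: pick the arm with the largest queue-plus-UCB score, then update virtual queues.
import numpy as np

def fair_ucb_step(t, counts, means, queues, min_rates, V=10.0):
    bonus = np.sqrt(2.0 * np.log(max(t, 2)) / np.maximum(counts, 1))
    scores = queues + V * (means + bonus)                     # exploration vs. fairness
    a = int(np.argmax(scores))
    pulled = np.zeros_like(queues)
    pulled[a] = 1.0
    queues[:] = np.maximum(queues + min_rates - pulled, 0.0)  # grow when an arm is under-served
    return a
```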
no code implementations • 28 Oct 2018 • Wenbo Ren, Jia Liu, Ness Shroff
Results in this paper provide up to $\rho n/k$ reductions compared with the "$k$-exploration" algorithms that focus on finding the (PAC) best $k$ arms out of $n$ arms.
no code implementations • 11 Sep 2018 • Minghong Fang, Guolei Yang, Neil Zhenqiang Gong, Jia Liu
To address the challenge, we formulate the poisoning attacks as an optimization problem, solving which determines the rating scores for the fake users.
2 code implementations • 25 Aug 2018 • Liang He, Xianhong Chen, Can Xu, Jia Liu
Most current state-of-the-art text-independent speaker verification systems take probabilistic linear discriminant analysis (PLDA) as their backend classifiers.
Multiobjective Optimization • Text-Independent Speaker Verification
no code implementations • 8 Jun 2018 • Wenbo Ren, Jia Liu, Ness B. Shroff
For the PAC top-$k$ ranking problem, we derive a lower bound on the sample complexity (aka number of queries), and propose an algorithm that is sample-complexity-optimal up to an $O(\log(k+l)/\log{k})$ factor.
no code implementations • 24 May 2018 • Xin Zhang, Jia Liu, Zhengyuan Zhu
Understanding the convergence performance of the asynchronous stochastic gradient descent method (Async-SGD) has received increasing attention in recent years due to its foundational role in machine learning.
no code implementations • 23 May 2018 • Hejian Sang, Jia Liu
In this paper, we propose a new adaptive stochastic gradient Langevin dynamics (ASGLD) algorithmic framework and its two specialized versions, namely adaptive stochastic gradient (ASG) and adaptive gradient Langevin dynamics (AGLD), for non-convex optimization problems.
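A minimal numpy sketch of a stochastic gradient Langevin dynamics update with an AdaGrad-style adaptive preconditioner, in the spirit of ASGLD; the paper's exact adaptation rule may differ.

```python
# Sketch: preconditioned SGLD with an adaptive second-moment accumulator.
import numpy as np

def adaptive_sgld(theta, stochastic_grad, steps=1000, eta=1e-3, eps=1e-8, rng=None):
    if rng is None:
        rng = np.random.default_rng(0)
    accum = np.zeros_like(theta)
    for _ in range(steps):
        g = stochastic_grad(theta)
        accum += g * g                                   # adaptive second-moment estimate
        precond = 1.0 / (np.sqrt(accum) + eps)
        noise = rng.normal(size=theta.shape)
        theta = theta - eta * precond * g + np.sqrt(2.0 * eta * precond) * noise
    return theta
```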
no code implementations • 26 Apr 2018 • Jia Liu, Yu Lei, Yan Ke, Jun Li, Min-qing Zhang, Xiaoyuan Yan
In this paper, a new data-driven information hiding scheme called generative steganography by sampling (GSS) is proposed.
no code implementations • 18 Dec 2017 • Ming-ming Liu, Min-qing Zhang, Jia Liu, Ying-nan Zhang, Yan Ke
The main idea of the method is that the class label of a generative adversarial network is replaced with the secret information, which acts as a driver to generate the hidden image directly; the secret information is then extracted from the hidden image through the discriminator.
Cryptography and Security • Multimedia
no code implementations • 14 Nov 2017 • Yan Ke, Min-qing Zhang, Jia Liu, Tingting Su, Xiaoyuan Yang
The secret messages can be output by the generator if and only if both the extraction key and the cover image are provided as inputs.
Multimedia
no code implementations • 14 Jul 2017 • Yi Liu, Liang He, Yao Tian, Zhuzi Chen, Jia Liu, Michael T. Johnson
Additionally, we also find that even though bottleneck features perform well for text-independent speaker verification, they do not outperform MFCCs on the most challenging Imposter-Correct trials on RedDots.
1 code implementation • 8 May 2017 • Juntao Gao, Yulong Shen, Jia Liu, Minoru Ito, Norio Shiratori
Adaptive traffic signal control, which adjusts traffic signal timing according to real-time traffic, has been shown to be an effective method to reduce traffic congestion.
Networking and Internet Architecture
1 code implementation • 27 May 2015 • Malte Buschmann, Joachim Kopp, Jia Liu, Pedro A. N. Machado
In this paper, we discuss lepton jets as a promising signature of an extended dark sector.
High Energy Physics - Phenomenology