Practical application improvement to Quantum SVM: theory to practice

14 Dec 2020  ·  Jae-Eun Park, Brian Quanz, Steve Wood, Heather Higgins, Ray Harishankar

Quantum machine learning (QML) has emerged as an important area for quantum applications, although useful QML applications would require many qubits. Our paper therefore explores the successful application of the Quantum Support Vector Machine (QSVM) algorithm while balancing several practical and technical considerations under the Noisy Intermediate-Scale Quantum (NISQ) assumption. For the quantum SVM under NISQ, we use quantum feature maps to translate data into quantum states, build the SVM kernel from these quantum states, and compare the result with a classical SVM using radial basis function (RBF) kernels. As data sets become more complex or abstracted, classical SVM with typical classical kernels loses accuracy relative to QSVM because it cannot easily separate the different classes. At the same time, QSVM should provide competitive performance over a broader range of data sets, including "simpler" cases in which smoother decision boundaries are required to avoid model variance issues (i.e., overfitting). To bridge the gap between "classical-looking" decision boundaries and complex quantum decision boundaries, we propose using general shallow unitary transformations to create feature maps with rotation factors that define a tunable quantum kernel, together with added regularization to smooth the separating hyperplane model. Our experiments show that this allows QSVM to perform on par with classical SVM regardless of data-set complexity and to outperform it on some commonly used reference data sets.
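The abstract describes building the SVM kernel from quantum states produced by a feature map with tunable rotation factors, plus regularization of the separating hyperplane. The sketch below is a minimal, classically simulated illustration of that idea, not the authors' circuit: it assumes a hypothetical product-state feature map in which each feature is encoded as a single-qubit RY rotation scaled by a rotation factor `lam`, computes a fidelity kernel |⟨φ(x)|φ(x′)⟩|², and passes it as a precomputed kernel to scikit-learn's SVC, whose C parameter stands in for the added regularization.

```python
# Minimal, classically simulated sketch of a fidelity-based quantum kernel SVM.
# Assumptions (illustrative, not the paper's circuit): a product-state feature
# map encoding each feature as RY(lam * x_i), a tunable rotation factor `lam`,
# and standard SVM regularization via the C parameter.
import numpy as np
from sklearn.svm import SVC

def encode(x, lam=1.0):
    """Map a feature vector x to a product state via RY(lam * x_i) on |0>."""
    state = np.array([1.0])
    for xi in x:
        theta = lam * xi
        qubit = np.array([np.cos(theta / 2.0), np.sin(theta / 2.0)])
        state = np.kron(state, qubit)           # tensor product of qubits
    return state

def quantum_kernel(X1, X2, lam=1.0):
    """Kernel matrix K[i, j] = |<phi(x1_i)|phi(x2_j)>|^2 (state fidelity)."""
    S1 = np.array([encode(x, lam) for x in X1])
    S2 = np.array([encode(x, lam) for x in X2])
    return np.abs(S1 @ S2.T) ** 2

# Toy usage: tune lam (rotation factor) and C (regularization) together.
rng = np.random.default_rng(0)
X_train = rng.uniform(0, np.pi, size=(40, 2))
y_train = (np.sin(X_train[:, 0]) * np.cos(X_train[:, 1]) > 0).astype(int)

lam, C = 0.8, 1.0                               # hypothetical hyperparameters
K_train = quantum_kernel(X_train, X_train, lam)
clf = SVC(kernel="precomputed", C=C).fit(K_train, y_train)

X_test = rng.uniform(0, np.pi, size=(10, 2))
K_test = quantum_kernel(X_test, X_train, lam)   # rows: test, cols: train
print(clf.predict(K_test))
```

In this toy version `lam` behaves like a kernel bandwidth: smaller values yield smoother decision boundaries, and tuning it together with C trades boundary complexity against smoothness, which is the kind of control the abstract attributes to the rotation factors and regularization.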
