Recent advances in characterizing the generalization ability of Support Vector Machines (SVMs) exploit refined concepts, such as Rademacher estimates of model complexity and nonlinear criteria for weighting empirical classification errors. These methods improve the representation ability of SVMs and tighten their generalization bounds. On the other hand, Quadratic-Programming algorithms are no longer applicable, so SVM training cannot benefit from the notable efficiency of those specialized techniques. This paper considers the use of Quantum Computing to solve the resulting optimization problem effectively, especially in the case of digital SVM implementations. The behavioral aspects of conventional and enhanced SVMs are compared, supported by experiments on both a synthetic and a real-world problem. Likewise, the related differences between Quadratic-Programming and Quantum-based optimization techniques are analyzed.
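For context, the Quadratic-Programming problem mentioned above is the standard SVM dual: maximize W(α) = Σα_i − ½ αᵀQα with Q_ij = y_i y_j ⟨x_i, x_j⟩, subject to 0 ≤ α_i ≤ C. The following is a minimal, illustrative sketch (not the paper's method) that solves this dual by projected gradient ascent on a toy dataset; the bias term is omitted so the equality constraint Σα_i y_i = 0 disappears, and the data, learning rate, and iteration count are all assumptions chosen for the example.

```python
import numpy as np

# Tiny linearly separable 2-D toy dataset (illustrative assumption).
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -1.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

C = 1.0                                      # box constraint on dual variables
Q = (y[:, None] * y[None, :]) * (X @ X.T)    # Q_ij = y_i y_j <x_i, x_j>

# Projected gradient ascent on W(a) = sum(a) - 0.5 a^T Q a,
# projecting each step back onto the box 0 <= a_i <= C.
alpha = np.zeros(len(y))
lr = 0.01
for _ in range(2000):
    grad = np.ones_like(alpha) - Q @ alpha   # gradient of the dual objective
    alpha = np.clip(alpha + lr * grad, 0.0, C)

w = (alpha * y) @ X                          # recover the primal weight vector
preds = np.sign(X @ w)
print(preds)                                 # matches y on this separable toy set
```

Specialized QP solvers exploit the structure of this problem (e.g. by decomposition over small working sets); the abstract's point is that once the training criterion is generalized beyond this quadratic form, such solvers no longer apply.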