Posted on 2022-07-25, 00:20. Authored by B Goodrich, D Albrecht, P Tischer
Support Vector Machine (SVM) parameter selection has previously been performed by minimizing the ratio of the radius of the Minimal Enclosing Ball (MEB) of the training data to the margin of the SVM. By considering the geometric properties of the SVM and MEB optimization problems, we show that upper and lower bounds on the radius-margin ratio of an SVM can be efficiently computed at any point during training. We use these bounds to accelerate radius-margin parameter selection by terminating training routines as early as possible, while still guaranteeing that the selected parameters minimize the radius-margin ratio. Once an SVM has been partially trained on any set of parameters, we also show that these bounds can be used to evaluate, and possibly reject, neighboring parameter values with little or no additional training. Empirical results show that this process can reduce the number of training iterations required for model selection by a factor of 10 or more, with no loss of precision in minimizing the radius-margin ratio.
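To make the quantity being minimized concrete, the following is a minimal sketch of computing a radius-margin ratio for a fully trained linear SVM, assuming scikit-learn and synthetic toy data. The MEB radius is approximated here by the maximum distance from the data centroid (an upper bound on the true MEB radius, which would require solving a small QP); this is an illustration of the ratio itself, not of the paper's bounding or early-termination procedure.

```python
import numpy as np
from sklearn.svm import SVC

# Hypothetical toy data: two Gaussian blobs (for illustration only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.3, (20, 2)), rng.normal(1, 0.3, (20, 2))])
y = np.array([0] * 20 + [1] * 20)

svm = SVC(kernel="linear", C=1.0).fit(X, y)

# Margin of a linear SVM: gamma = 1 / ||w||
w_norm = np.linalg.norm(svm.coef_)
margin = 1.0 / w_norm

# Crude upper bound on the MEB radius: max distance from the centroid.
# (The exact MEB center is found by a small QP; the centroid ball
# still encloses all points, so this radius is an upper bound.)
center = X.mean(axis=0)
R = np.max(np.linalg.norm(X - center, axis=1))

# Radius-margin ratio R^2 / gamma^2 = R^2 * ||w||^2
ratio = (R / margin) ** 2
print(f"R = {R:.3f}, margin = {margin:.3f}, ratio = {ratio:.3f}")
```

Parameter selection in the radius-margin framework amounts to repeating a computation like this across a grid of kernel and regularization parameters and keeping the minimizer; the paper's contribution is bounding this ratio cheaply during (rather than after) each training run.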