Keywords: cross-validation, evaluation, leave-one-out
Problem: How to estimate the error of an algorithm using leave-one-out cross-validation?
Solution: Use the sdcrossval function and specify the 'method' parameter.
Leave-one-out is an evaluation scheme that repeatedly trains the algorithm of interest on the full dataset excluding a single example and then tests on that held-out example. Because each training set is larger than in any other cross-validation scheme, the leave-one-out error estimate has low bias. However, its variance is high.
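To make the scheme concrete, here is a minimal, self-contained sketch of leave-one-out evaluation of a nearest mean classifier. This is generic illustrative Python, not PRSD Studio code; the function names and the toy data are invented for this example only.

```python
import numpy as np

def nearest_mean_predict(X_train, y_train, x):
    """Classify x by the nearest class mean (Euclidean distance)."""
    classes = np.unique(y_train)
    means = np.array([X_train[y_train == c].mean(axis=0) for c in classes])
    return classes[np.argmin(np.linalg.norm(means - x, axis=1))]

def leave_one_out_error(X, y):
    """For every sample: train on all other samples, test on the
    held-out one; return the fraction of misclassified samples."""
    n = len(y)
    errors = 0
    for i in range(n):
        mask = np.arange(n) != i          # exclude sample i from training
        pred = nearest_mean_predict(X[mask], y[mask], X[i])
        errors += int(pred != y[i])
    return errors / n

# Toy two-class data (hypothetical, for illustration only):
# class 0 centered at the origin, class 1 shifted by 4 in each dimension.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (20, 2)), rng.normal(4, 1, (20, 2))])
y = np.array([0] * 20 + [1] * 20)
print(leave_one_out_error(X, y))
```

Note that the classifier is retrained from scratch in every iteration, so a dataset of n samples costs n training runs; this is why leave-one-out is typically reserved for small datasets.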
PRSD Studio provides leave-one-out evaluation through the standard sdcrossval function. We only need to set the 'method' parameter to 'loo' or 'leave-one-out'. In the following example, we perform leave-one-out evaluation of the PRTools nearest mean classifier on a three-class problem:
>> load fruit; a
'Fruit set' 260 by 2 sddata, 3 classes: 'apple'(100) 'banana'(100) 'stone'(60)
>> pd=sdlinear*sddecide
untrained pipeline 2 steps: sdlinear+sdp_decide
>> res=sdcrossval(pd,a,'method','loo')
samples: ....................................................................................................
The decisions were made at the default operating point using equal weights for the classes.