perClass Documentation
version 5.4 (7-Dec-2018)
 SDPCA  Principal Component Analysis

    P=SDPCA(DATA)        % remove directions with zero variance
    P=SDPCA(DATA,DIM)    % project to DIM-D subspace
    P=SDPCA(DATA,FRAC)   % project preserving FRAC of variance
    [P,RES]=SDPCA(DATA,PPL)    % optimize PCA dimensionality for classifier PPL

   DATA    SDDATA set or data matrix
   DIM     Output dimensionality
   FRAC    Fraction of preserved variance (0,1)
   PPL     Untrained classifier pipeline (such as sdlinear)

   'no display'   Do not print any output
   'test',TS      When optimizing dimensionality, do not split data but
                  use externally provided set TS
   'tsfrac',F     If optimizing dimensionality by classifier error, set
                  the fraction of DATA used for testing/validation
                  (default: 0.2)

   P       PCA projection
   RES     Structure with details on optimization

   'weights'   Relative input feature importance in the projection

 SDPCA implements a Principal Component Analysis projection maximizing
 the variance in the data set DATA. SDPCA training is unsupervised,
 meaning that class labels are not used. If called without additional
 parameters, SDPCA projects the data onto the subspace spanned by the
 components with non-zero eigenvalues. This also orthogonalizes the
 data (useful e.g. for decision tree or random forest classifiers).
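
 A minimal sketch of this default use, assuming the usual perClass
 convention that a trained pipeline is applied to data with the *
 operator (the execution syntax is an assumption, not taken from this
 page):

   % remove zero-variance directions before training a classifier
   p   = sdpca(data);   % train the projection (unsupervised)
   out = data*p;        % apply the trained projection to the data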

 SDPCA may also optimize the dimensionality for a specific classifier.
 The DATA is split into training and validation subsets. The PPL
 classifier is trained on the training subset and evaluated on the
 validation subset. The dimensionality yielding the lowest mean error is
 selected. Details on the optimization process are returned in the RES
 structure.
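
 For example, a hedged sketch combining the optimization mode with the
 'tsfrac' option documented above (the exact contents of RES are not
 specified here, so the final comment only restates what this page says):

   % optimize PCA dimensionality for sdlinear, holding out 30% of DATA
   % for validation instead of the default 20%
   [p,res] = sdpca(data, sdlinear, 'tsfrac', 0.3);
   % res contains details on the optimization process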

 p=sdpca(data)           % projection removing dimensions with zero variance
 p=sdpca(data,10)        % projection to 10D subspace
 p=sdpca(data,0.99)      % projection to a subspace preserving 99% of variance
 p=sdpca(data,sdlinear)  % optimizing dimensionality minimizing SDLINEAR error



sdpca is referenced in examples: