Perform Wold- or Gabriel-style cross-validation for determining the appropriate rank for an SVD approximation of a matrix.

cv.svd.gabriel(
  x,
  krow = 2,
  kcol = 2,
  maxrank = floor(min(n - n/krow, p - p/kcol))
)

cv.svd.wold(x, k = 5, maxrank = 20, tol = 1e-04, maxiter = 20)

Arguments

x

the matrix to cross-validate.

krow

the number of row folds (for Gabriel-style CV).

kcol

the number of column folds (for Gabriel-style CV).

maxrank

the maximum rank to evaluate in the cross-validation.

k

the number of folds (for Wold-style CV).

tol

the convergence tolerance for impute.svd.

maxiter

the maximum number of iterations for impute.svd.

Value

call

the function call.

msep

the mean square error of prediction (MSEP); this is a matrix whose columns contain the mean square errors in the predictions of the holdout sets for ranks 0, 1, ..., maxrank across the different replicates.

maxrank

the maximum rank for which prediction error is estimated; this is equal to nrow(msep)-1.

krow

the number of row folds (for Gabriel-style only).

kcol

the number of column folds (for Gabriel-style only).

rowsets

the partition of the rows into krow holdout sets (for Gabriel-style only).

colsets

the partition of the columns into kcol holdout sets (for Gabriel-style only).

k

the number of folds (for Wold-style only).

sets

the partition of indices into k holdout sets (for Wold-style only).

Details

These functions cross-validate the SVD of a matrix. They assume a model $X = U D V' + E$, in which $U D V'$ is the signal and $E$ is noise, and they search for the rank at which to truncate the SVD of x so as to minimize prediction error. Here, prediction error is measured as the sum of squared residuals between the truncated SVD and the signal part.
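
For illustration, a rank-k truncation can be computed directly with base R's svd; the helper below is a sketch for this help page only (truncate.svd is not a function in the package):

  # rank-k truncation of the SVD: keep the k leading singular triplets
  truncate.svd <- function(x, k) {
    if (k == 0) return(matrix(0, nrow(x), ncol(x)))
    s <- svd(x, nu = k, nv = k)
    s$u %*% (s$d[seq_len(k)] * t(s$v))   # equals U_k D_k V_k'
  }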

For both types of cross-validation, in each replicate we leave out part of the matrix, fit an SVD approximation to the left-in part, and measure prediction error on the left-out part.

In Wold-style cross-validation, the holdout set is "speckled": a random set of elements of the matrix is left out, and the held-out elements are predicted using impute.svd.
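
The following is a minimal base-R sketch of one Wold-style fold, using a simple EM-style imputation loop in place of the package's impute.svd; the helper name speckled.msep and its defaults are illustrative only, not the package's internals:

  # one Wold-style fold: hold out a random "speckle" of entries, impute them
  # by iterating a rank-k truncated SVD, and score the held-out entries
  speckled.msep <- function(x, holdout, k, tol = 1e-4, maxiter = 20) {
    if (k == 0) return(mean(x[holdout]^2))     # rank-0 fit predicts zero
    xhat <- x
    xhat[holdout] <- mean(x[-holdout])         # start from the mean of the kept entries
    for (it in seq_len(maxiter)) {
      s <- svd(xhat, nu = k, nv = k)
      fit <- s$u %*% (s$d[1:k] * t(s$v))       # rank-k reconstruction
      delta <- mean((fit[holdout] - xhat[holdout])^2)
      xhat[holdout] <- fit[holdout]            # refill only the held-out entries
      if (delta < tol) break
    }
    mean((x[holdout] - xhat[holdout])^2)       # MSEP on the speckled holdout
  }
  # e.g. a random 20% speckle:
  # speckled.msep(x, sample(length(x), length(x) %/% 5), k = 2)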

In Gabriel-style cross-validation, the holdout set is "blocked": we permute the rows and columns of the matrix, leave out the lower-right block, and predict the held-out block using a modified Schur complement. In Gabriel-style cross-validation there are krow*kcol folds in total.
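
One Gabriel-style fold can be sketched in base R as below; gabriel.msep is a hypothetical helper that follows the Schur-complement idea described above (predicting the held-out block from the cross blocks and the pseudoinverse of a rank-k truncation of the retained block), not the package's exact implementation:

  # one Gabriel-style fold: hold out the rows.out x cols.out block and
  # predict it as C %*% pinv(A_k) %*% B, with A the retained block
  gabriel.msep <- function(x, rows.out, cols.out, k) {
    A <- x[-rows.out, -cols.out, drop = FALSE]  # retained block
    B <- x[-rows.out,  cols.out, drop = FALSE]  # retained rows, held-out columns
    C <- x[ rows.out, -cols.out, drop = FALSE]  # held-out rows, retained columns
    D <- x[ rows.out,  cols.out, drop = FALSE]  # held-out block
    if (k == 0) return(mean(D^2))               # rank-0 fit predicts zero
    s <- svd(A, nu = k, nv = k)
    A.pinv <- s$v %*% (t(s$u) / s$d[1:k])       # pseudoinverse of the rank-k truncation of A
    mean((D - C %*% A.pinv %*% B)^2)            # MSEP on the held-out block
  }
  # e.g. one of the four (2,2)-folds for the matrix in the Examples:
  # gabriel.msep(x, rows.out = 26:50, cols.out = 11:20, k = 2)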

Note

Gabriel's version of cross-validation left out a single element of the matrix at a time, which corresponds to n-by-p-fold cross-validation. Owen and Perry generalized Gabriel's idea to larger holdouts, showing that 2-by-2-fold cross-validation often works better.

Wold's original version of cross-validation did not use the EM algorithm to estimate the SVD. He recommended using the NIPALS algorithm instead, which has since faded into obscurity.

Wold-style cross-validation requires far more computation than Gabriel-style. The default values of maxrank, tol, and maxiter have been chosen to trade some accuracy for speed; they may need to be adjusted to get the best results.
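
For instance, a slower but more careful Wold-style run might look like the following (with x as in the Examples; the specific values here are illustrative only):

  # tighter, slower settings for Wold-style CV (illustrative values)
  cv.svd.wold(x, k = 5, maxrank = 15, tol = 1e-06, maxiter = 100)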

References

Gabriel, K.R. (2002). Le biplot - outil d'exploration de données multidimensionnelles. Journal de la Société française de statistique 143(3-4) 5--55.

Owen, A.B. and Perry, P.O. (2009). Bi-cross-validation of the SVD and the non-negative matrix factorization. Annals of Applied Statistics 3(2) 564--594.

Wold, S. (1978). Cross-validatory estimation of the number of components in factor and principal components models. Technometrics 20(4) 397--405.

Author

Patrick O. Perry

Examples


  # generate a rank-2 matrix plus noise
  n <- 50; p <- 20; k <- 2
  u <- matrix( rnorm( n*k ), n, k )
  v <- matrix( rnorm( p*k ), p, k )
  e <- matrix( rnorm( n*p ), n, p )
  x <- u %*% t(v) + e
  
  # perform 5-fold Wold-style cross-validation
  (cvw <- cv.svd.wold( x, 5, maxrank=10 ))
#> 
#> Call:
#> cv.svd.wold(x = x, k = 5, maxrank = 10)
#> 
#>   Rank   MSEP       SE
#>  ---------------------
#>      0  3.088  0.13374
#>      1  1.987  0.05414
#>      2  1.234  0.04487 *+ 
#>      3  1.601  0.09816
#>      4  2.107  0.14289
#>      5  2.447  0.15695
#>      6  2.811  0.19498
#>      7  3.114  0.18488
#>      8  3.496  0.14459
#>      9  4.023  0.10214
#>     10  4.280  0.24088
#> 
  
  # perform (2,2)-fold Gabriel-style cross-validation
  (cvg <- cv.svd.gabriel( x, 2, 2, maxrank=10 ))
#> 
#> Call:
#> cv.svd.gabriel(x = x, krow = 2, kcol = 2, maxrank = 10)
#> 
#>   Rank   MSEP       SE
#>  ---------------------
#>      0  3.088  0.37766
#>      1  2.462  0.26377
#>      2  1.344  0.05373 *+ 
#>      3  1.369  0.05730
#>      4  1.382  0.06219
#>      5  1.456  0.05943
#>      6  1.482  0.04690
#>      7  1.537  0.04474
#>      8  1.717  0.09854
#>      9  1.808  0.10306
#>     10  2.098  0.12385
#>
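
The msep component can also be summarized by hand. Continuing from the Gabriel-style fit above, and assuming the layout described under Value (one row per rank 0, 1, ..., maxrank and one column per fold):

  # average the per-fold estimates and recover the selected rank
  msep.by.rank <- rowMeans(cvg$msep)   # one entry per rank 0, 1, ..., maxrank
  which.min(msep.by.rank) - 1          # estimated rank (should match the "*" row above)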