nmfdiv {fabia} | R Documentation

nmfdiv

Description

nmfdiv: R implementation of nmfdiv.

Usage

nmfdiv(X, p, cyc = 100)
Arguments

X      the data matrix.
p      number of hidden factors.
cyc    maximal number of iterations.
Details

Non-negative Matrix Factorization represents a positive matrix X by positive matrices L and Z.
The objective for reconstruction is the Kullback-Leibler divergence.

X = L Z

X = sum_{i=1}^{p} L_i (Z_i)^T

The model selection is performed according to D. D. Lee and H. S. Seung, 1999, 2001.

The code is implemented in R.
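A minimal sketch of the Lee-Seung multiplicative updates for the Kullback-Leibler objective is given below. It is illustrative only, not the fabia implementation: the function name nmf_div_sketch, the random non-negative initialization, and the small eps added to avoid division by zero are assumptions.

nmf_div_sketch <- function(X, p, cyc = 100, eps = 1e-9) {
  n <- nrow(X); m <- ncol(X)
  L <- matrix(runif(n * p), n, p)                  # non-negative start values
  Z <- matrix(runif(p * m), p, m)
  for (it in seq_len(cyc)) {
    R <- X / (L %*% Z + eps)                       # element-wise ratio X / (L Z)
    Z <- Z * sweep(t(L) %*% R, 1, colSums(L) + eps, "/")   # update right matrix Z
    R <- X / (L %*% Z + eps)
    L <- L * sweep(R %*% t(Z), 2, rowSums(Z) + eps, "/")   # update left matrix L
  }
  list(L = L, Z = Z)
}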
Value

L      the left matrix L.
Z      the right matrix Z.
Author(s)

Sepp Hochreiter
References

D. D. Lee and H. S. Seung, ‘Algorithms for non-negative matrix factorization’, In Advances in Neural Information Processing Systems 13, 556-562, 2001.

D. D. Lee and H. S. Seung, ‘Learning the parts of objects by non-negative matrix factorization’, Nature, 401(6755):788-791, 1999.
See Also

fabi, fabia, fabiap, fabias, fabiasp, mfsc, nmfeu, nmfsc, nprojfunc, projfunc, make_fabi_data, make_fabi_data_blocks, make_fabi_data_pos, make_fabi_data_blocks_pos, extract_plot, extract_bic, myImagePlot, PlotBicluster, Breast_A, DLBCL_B, Multi_A, fabiaDemo, fabiaVersion
Examples

#---------------
# TEST
#---------------

dat <- make_fabi_data_blocks(n = 100, l = 50, p = 3, f1 = 5, f2 = 5,
       of1 = 5, of2 = 10, sd_noise = 3.0, sd_z_noise = 0.2, mean_z = 2.0,
       sd_z = 1.0, sd_l_noise = 0.2, mean_l = 3.0, sd_l = 1.0)

X <- dat[[1]]
Y <- dat[[2]]

X <- abs(X)
XX <- tcrossprod(X)
dXX <- 1/sqrt(diag(XX))
X <- dXX*X

resEx <- nmfdiv(as.matrix(abs(X)), 3)

## Not run:
#---------------
# DEMO
#---------------

dat <- make_fabi_data_blocks(n = 1000, l = 100, p = 10, f1 = 5, f2 = 5,
       of1 = 5, of2 = 10, sd_noise = 3.0, sd_z_noise = 0.2, mean_z = 2.0,
       sd_z = 1.0, sd_l_noise = 0.2, mean_l = 3.0, sd_l = 1.0)

X <- dat[[1]]
Y <- dat[[2]]

X <- abs(X)
XX <- tcrossprod(X)
dXX <- 1/sqrt(diag(XX))
X <- dXX*X

resToy <- nmfdiv(as.matrix(abs(X)), 8)

rToy <- extract_plot(X, resToy$L, resToy$Z, ti = "NMFDIV", Y = Y)

## End(Not run)
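A possible follow-up to the TEST example above (not part of the original examples): since the result holds the two factor matrices as list components L and Z (see Value), the quality of the fit X ≈ L Z can be inspected directly.

Xtest <- as.matrix(abs(X))          # the matrix that was passed to nmfdiv above
Xhat  <- resEx$L %*% resEx$Z        # reconstruction L Z
mean(abs(Xtest - Xhat))             # average absolute reconstruction error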