
rsparse: Statistical Learning on Sparse Matrices

Implements many algorithms for statistical learning on sparse matrices: matrix factorizations, matrix completion, elastic net regression, and factorization machines. 'rsparse' also enhances the 'Matrix' package by providing methods for multithreaded <sparse, dense> matrix products and native slicing of sparse matrices in Compressed Sparse Row (CSR) format.

Algorithms for regression problems:
1) Elastic Net regression via Follow The (Proximally) Regularized Leader (FTRL) Stochastic Gradient Descent (SGD), as per McMahan et al. (2013, <doi:10.1145/2487575.2488200>).
2) Factorization Machines via SGD, as per Rendle (2010, <doi:10.1109/ICDM.2010.127>).

Algorithms for matrix factorization and matrix completion:
1) Weighted Regularized Matrix Factorization (WRMF) via Alternating Least Squares (ALS), as per Hu, Koren, Volinsky (2008, <doi:10.1109/ICDM.2008.22>).
2) Maximum-Margin Matrix Factorization via ALS, as per Rennie, Srebro (2005, <doi:10.1145/1102351.1102441>).
3) Fast Truncated Singular Value Decomposition (SVD), Soft-Thresholded SVD, and Soft-Impute matrix completion via ALS, as per Hastie, Mazumder et al. (2014, <doi:10.48550/arXiv.1410.2596>).
4) Linear-Flow matrix factorization, from 'Practical Linear Models for Large-Scale One-Class Collaborative Filtering' by Sedhain, Bui, Kawale et al. (2016, ISBN:978-1-57735-770-4).
5) GlobalVectors (GloVe) matrix factorization via SGD, as per Pennington, Socher, Manning (2014, <https://aclanthology.org/D14-1162/>).

The package is reasonably fast and memory-efficient, so it can handle large datasets with millions of rows and millions of columns. This is particularly useful for practitioners working on recommender systems.
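As an illustration of the WRMF workflow described above, the following is a minimal sketch of fitting an implicit-feedback ALS factorization on the 'movielens100k' sparse matrix that ships with the package. The class, method, and dataset names here are assumptions based on the package documentation; verify the exact arguments against the reference manual.

library(rsparse)
library(Matrix)

# bundled sparse user x movie rating matrix (assumed dataset name)
data("movielens100k", package = "rsparse")

# WRMF: implicit-feedback ALS factorization (Hu, Koren, Volinsky 2008)
model = WRMF$new(rank = 16L, lambda = 0.01, feedback = "implicit")

# fit and obtain user embeddings; item embeddings are stored in the model
user_embeddings = model$fit_transform(movielens100k, n_iter = 10L)
item_embeddings = model$components   # rank x items factor matrix

# top-10 item recommendations for the first five users
top_items = model$predict(movielens100k[1:5, ], k = 10L)

The other models (FTRL, FM, SoftImpute, GloVe, LinearFlow) follow the same R6 pattern of constructing a model object and calling its fit/transform methods on a sparse matrix.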

Version: 0.5.1
Depends: R (≥ 3.6.0), methods, Matrix (≥ 1.3)
Imports: MatrixExtra (≥ 0.1.7), Rcpp (≥ 0.11), data.table (≥ 1.10.0), float (≥ 0.2-2), RhpcBLASctl, lgr (≥ 0.2)
LinkingTo: Rcpp, RcppArmadillo (≥ 0.9.100.5.0)
Suggests: testthat, covr
Published: 2022-09-11
Author: Dmitriy Selivanov [aut, cre, cph], David Cortes [ctb], Drew Schmidt [ctb] (configure script for BLAS, LAPACK detection), Wei-Chen Chen [ctb] (configure script and work on linking to float package)
Maintainer: Dmitriy Selivanov <ds at rexy.ai>
BugReports: https://github.com/rexyai/rsparse/issues
License: GPL-2 | GPL-3 [expanded from: GPL (≥ 2)]
URL: https://github.com/rexyai/rsparse
NeedsCompilation: yes
Materials: README NEWS
In views: MissingData
CRAN checks: rsparse results

Documentation:

Reference manual: rsparse.pdf

Downloads:

Package source: rsparse_0.5.1.tar.gz
Windows binaries: r-devel: rsparse_0.5.1.zip, r-release: rsparse_0.5.1.zip, r-oldrel: rsparse_0.5.1.zip
macOS binaries: r-release (arm64): rsparse_0.5.1.tgz, r-oldrel (arm64): rsparse_0.5.1.tgz, r-release (x86_64): rsparse_0.5.1.tgz, r-oldrel (x86_64): rsparse_0.5.1.tgz
Old sources: rsparse archive

Reverse dependencies:

Reverse imports: LSX, PsychWordVec, text2vec

Linking:

Please use the canonical form https://CRAN.R-project.org/package=rsparse to link to this page.
