```r
library(rcloner)
has_rclone <- rclone_available()
if (!has_rclone) {
  message("rclone is not installed on this system. ",
          "Code chunks that require rclone are skipped. ",
          "Install with install_rclone().")
}
```

rcloner provides an R interface to rclone, a command-line program that supports over 40 cloud storage backends, including Amazon S3, Google Cloud Storage, Google Drive, Dropbox, and SFTP.

All file operations (copy, sync, list, move, delete, …) use the same consistent interface regardless of the storage backend.
Install from CRAN:
Or the development version from GitHub:
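The standard commands are sketched below; the GitHub repository owner is a placeholder, not confirmed by this document:

```r
# From CRAN
install.packages("rcloner")

# Development version from GitHub (replace <owner> with the actual account)
pak::pak("<owner>/rcloner")
```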
rcloner automatically locates a system-installed rclone
binary. If rclone is not already on your PATH, install it
with:
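A minimal call, using the `install_rclone()` helper mentioned above:

```r
# Downloads a pre-built rclone binary for this OS and architecture
install_rclone()
```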
This downloads the appropriate pre-built binary from https://downloads.rclone.org for your operating system and architecture and stores it in a user-writable directory — no system privileges required.
Check the installed version:
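One way to check, using the generic `rclone()` wrapper described later (a dedicated version helper, if one exists, is not documented here):

```r
# Runs `rclone version` and prints the installed version
rclone("version")
```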
rcloner manages cloud storage credentials through
remotes, which are named configurations stored in rclone’s
config file.
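Remotes are created with `rclone_config_create()` (see the migration table below). A sketch, in which the argument names and the backend-specific options are assumptions for illustration:

```r
# Hypothetical: register an S3 remote named "aws".
# Credentials and other backend options are passed in addition;
# see ?rclone_config_create for the actual signature.
rclone_config_create(name = "aws", type = "s3")
```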
`rclone_ls()` returns a data frame of objects at a given path.

```r
# List a bucket on a configured remote
rclone_ls("aws:my-bucket")

# Recursive listing
rclone_ls("aws:my-bucket/data/", recursive = TRUE)

# Directories only
rclone_lsd("aws:my-bucket")
```

`rclone_ls()` parses `rclone lsjson` output and returns a data frame with columns `Path`, `Name`, `Size`, `MimeType`, `ModTime`, and `IsDir`.
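Because the result is a plain data frame, it can be filtered with base R. For example (bucket name and size threshold are hypothetical):

```r
files <- rclone_ls("aws:my-bucket", recursive = TRUE)

# Keep only CSV files larger than 1 MB
subset(files, !IsDir & Size > 1e6 & grepl("\\.csv$", Name))
```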
`rclone_copy()` copies files from source to destination, skipping identical files. It never deletes destination files.

```r
# Upload a local directory to S3
rclone_copy("/local/data", "aws:my-bucket/data")

# Download a file from S3
rclone_copy("aws:my-bucket/report.csv", "/local/downloads/")

# Copy a URL directly to cloud storage (no local intermediate)
rclone_copyurl(
  "https://raw.githubusercontent.com/tidyverse/readr/main/inst/extdata/mtcars.csv",
  "aws:my-bucket/mtcars.csv"
)
```

`rclone_sync()` makes the destination identical to the source, deleting destination files that are not in the source. Use with care.
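A sketch of a one-way mirror (paths are hypothetical):

```r
# Make the bucket prefix identical to the local directory.
# Files present in the bucket but not locally are DELETED.
rclone_sync("/local/data", "aws:my-bucket/data")
```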
```r
# Read a remote file into R
contents <- rclone_cat("aws:my-bucket/config.yaml")

# Get metadata for an object
rclone_stat("aws:my-bucket/data.csv")

# Total size of a path
rclone_size("aws:my-bucket")

# Create a bucket/directory
rclone_mkdir("aws:new-bucket")

# Delete files (keeps directories)
rclone_delete("aws:my-bucket/old-data/")

# Remove a path and all its contents
rclone_purge("aws:my-bucket/scratch")

# Generate a public link (where supported)
rclone_link("aws:my-bucket/report.html")

# Get storage quota info
rclone_about("aws:")
```

Every rclone subcommand is accessible via the `rclone()` function, which accepts a character vector of arguments:
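For example, subcommands and flags that have no dedicated wrapper can be invoked directly (remote and paths are hypothetical):

```r
# Equivalent to `rclone lsd aws:` on the command line
rclone(c("lsd", "aws:"))

# Flags are passed as extra elements of the argument vector
rclone(c("copy", "aws:my-bucket", "/local/dir", "--max-depth", "1"))
```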
If you are migrating from the minioclient package, the
function mapping is:
| minioclient | rcloner |
|---|---|
| `mc_alias_set()` | `rclone_config_create()` |
| `mc_cp()` | `rclone_copy()` |
| `mc_mv()` | `rclone_move()` |
| `mc_mirror()` | `rclone_sync()` |
| `mc_ls()` | `rclone_ls()` |
| `mc_cat()` | `rclone_cat()` |
| `mc_mb()` | `rclone_mkdir()` |
| `mc_rb()` | `rclone_purge()` |
| `mc_rm()` | `rclone_delete()` |
| `mc_du()` | `rclone_size()` |
| `mc_stat()` | `rclone_stat()` |
| `mc()` | `rclone()` |
The main difference is that rcloner uses *remotes* (e.g. `"aws:bucket"`) rather than *aliases* (e.g. `"alias/bucket"`). Remote configuration is done with `rclone_config_create()` instead of `mc_alias_set()`.