
shiny.ollama


R Shiny Interface for Chatting with LLMs Offline via Ollama

Experience seamless, private, and offline AI conversations right on your machine! shiny.ollama provides a user-friendly R Shiny interface to interact with LLMs locally, powered by Ollama.



⚠️ Disclaimer

Important: shiny.ollama requires Ollama to be installed on your system. Without it, this package will not function. Follow the Installation Guide below to set up Ollama first.

Installation

From CRAN (Stable Version)

install.packages("shiny.ollama")

From GitHub (Latest Development Version)

# Install devtools if not already installed
install.packages("devtools")

devtools::install_github("ineelhere/shiny.ollama")

Features

Quick Start

Launch the Shiny app in R with:

library(shiny.ollama)

# Start the application
shiny.ollama::run_app()
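Since the app talks to a local Ollama instance, it can help to confirm that the Ollama CLI is on your PATH before launching. A minimal, optional sketch using only base R (assuming Ollama has been installed as described in the next section):

# Optional check: only launch the app if the Ollama CLI can be found on the PATH
if (nzchar(Sys.which("ollama"))) {
  shiny.ollama::run_app()
} else {
  message("Ollama CLI not found - install Ollama first (see 'How to Install Ollama' below).")
}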

How to Install Ollama

To use this package, install Ollama first:

  1. Download Ollama from the official website at https://ollama.com (Mac, Windows, and Linux are supported).
  2. Install it by following the provided instructions.
  3. Verify your installation:
ollama --version

If successful, the version number will be displayed.

  4. Pull a model (e.g., deepseek-r1) to get started, as shown below.
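
For example, a model such as deepseek-r1 can be pulled from a terminal (any model available in the Ollama library can be substituted here):

# Download the deepseek-r1 model; this may take a while depending on model size
ollama pull deepseek-r1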

License and Declaration

This R package is an independent, passion-driven open source initiative, released under the Apache License 2.0. It is not affiliated with, owned by, funded by, or influenced by any external organization. The project is dedicated to fostering a community of developers who share a love for coding and collaborative innovation.

Contributions, feedback, and feature requests are always welcome!

Stay tuned for more updates. 🚀
