
chatLLM


Overview

chatLLM is an R package providing a single, consistent interface to multiple “OpenAI‑compatible” chat APIs (OpenAI, Groq, Anthropic, DeepSeek, Alibaba DashScope, and GitHub Models).

Key features:

- A single `call_llm()` interface across OpenAI, Groq, Anthropic, DeepSeek, Alibaba DashScope, and GitHub Models
- Support for both simple prompts and multi-message conversations
- A factory interface for building reusable, preconfigured LLM functions
- Model discovery across providers via `list_models()`

Installation

From CRAN:

install.packages("chatLLM")

Development version:

# install.packages("remotes")  # if needed
remotes::install_github("knowusuboaky/chatLLM")

Setup

Set your API keys or tokens once per session:

Sys.setenv(
  OPENAI_API_KEY     = "your-openai-key",
  GROQ_API_KEY       = "your-groq-key",
  ANTHROPIC_API_KEY  = "your-anthropic-key",
  DEEPSEEK_API_KEY   = "your-deepseek-key",
  DASHSCOPE_API_KEY  = "your-dashscope-key",
  GH_MODELS_TOKEN    = "your-github-models-token"
)
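
Before making any calls, it can help to confirm a key is actually visible in the session. This is a minimal sketch using only base R (`Sys.getenv()`/`nzchar()`), not a chatLLM function:

```r
# Check that a provider key is set; Sys.getenv() returns "" for unset variables
if (!nzchar(Sys.getenv("OPENAI_API_KEY"))) {
  stop("OPENAI_API_KEY is not set; see the Setup section above.")
}
```

The same check works for any of the other environment variables listed above.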

Usage

1. Simple Prompt

response <- call_llm(
  prompt     = "Who is Messi?",
  provider   = "openai",
  max_tokens = 300
)
cat(response)

2. Multi‑Message Conversation

conv <- list(
  list(role    = "system",    content = "You are a helpful assistant."),
  list(role    = "user",      content = "Explain recursion in R.")
)
response <- call_llm(
  messages          = conv,
  provider          = "openai",
  max_tokens        = 200,
  presence_penalty  = 0.2,
  frequency_penalty = 0.1,
  top_p             = 0.95
)
cat(response)

3. Verbose Off

Suppress informational messages:

res <- call_llm(
  prompt      = "Tell me a joke",
  provider    = "openai",
  verbose     = FALSE
)
cat(res)

4. Factory Interface

Create a reusable LLM function:

# Build a “GitHub Models” engine with defaults baked in
GitHubLLM <- call_llm(
  provider    = "github",
  max_tokens  = 60,
  verbose     = FALSE
)

# Invoke it like a function:
story <- GitHubLLM("Tell me a short story about libraries.")
cat(story)

5. Discover Available Models

# All providers at once
all_models <- list_models("all")
names(all_models)

# Only OpenAI models
openai_models <- list_models("openai")
head(openai_models)

6. Call a Specific Model

Pick from the list and pass it to call_llm():

anthro_models <- list_models("anthropic")
cat(call_llm(
  prompt     = "Write a haiku about autumn.",
  provider   = "anthropic",
  model      = anthro_models[1],
  max_tokens = 60
))

Troubleshooting


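Provider calls can fail for mundane reasons (a missing key, a rate limit, a mistyped model name). One defensive pattern, using base R's `tryCatch()` rather than any chatLLM-specific error API, is to catch the error and fall back gracefully:

```r
# Wrap a call in tryCatch() so a provider error doesn't abort your script
safe_response <- tryCatch(
  call_llm(prompt = "Ping", provider = "openai", max_tokens = 5),
  error = function(e) {
    message("call_llm() failed: ", conditionMessage(e))
    NA_character_
  }
)
```

If the call errors, `safe_response` is `NA_character_` and the underlying message is printed, which makes the failure easy to log or retry.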
Contributing & Support

Issues and PRs welcome at https://github.com/knowusuboaky/chatLLM


License

MIT © Kwadwo Daddy Nyame Owusu - Boakye


Acknowledgements

Inspired by RAGFlowChainR, powered by httr and the R community. Enjoy!
