
Simple chat with LLMR

knitr::opts_chunk$set(
  collapse = TRUE, comment = "#>",
  eval = identical(tolower(Sys.getenv("LLMR_RUN_VIGNETTES", "false")), "true")
)

This vignette shows basic chat usage with four providers and models:

- OpenAI: gpt-5-nano
- Anthropic: claude-sonnet-4-20250514
- Gemini: gemini-2.5-flash
- Groq: openai/gpt-oss-20b

You will need API keys in these environment variables: OPENAI_API_KEY, ANTHROPIC_API_KEY, GEMINI_API_KEY, GROQ_API_KEY.

To run these examples locally, set a local flag:

- Sys.setenv(LLMR_RUN_VIGNETTES = "true")
- or add LLMR_RUN_VIGNETTES=true to ~/.Renviron
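Before running, you can check which keys are visible to your R session (a quick sketch; nzchar() returns TRUE only for set, non-empty variables):

keys <- c("OPENAI_API_KEY", "ANTHROPIC_API_KEY", "GEMINI_API_KEY", "GROQ_API_KEY")
# TRUE for each key that is set and non-empty
vapply(keys, function(k) nzchar(Sys.getenv(k)), logical(1))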

OpenAI: gpt-5-nano

library(LLMR)

cfg_openai <- llm_config(
  provider = "openai",
  model    = "gpt-5-nano"
)

chat_oai <- chat_session(cfg_openai, system = "Be concise.")
chat_oai$send("Say a warm hello in one short sentence.")
chat_oai$send("Now say it in Esperanto.")

Anthropic: claude-sonnet-4-20250514

cfg_anthropic <- llm_config(
  provider = "anthropic",
  model    = "claude-sonnet-4-20250514",
  max_tokens = 512   # avoid warnings; Anthropic requires max_tokens
)

chat_claude <- chat_session(cfg_anthropic, system = "Be concise.")
chat_claude$send("Name one interesting fact about honey bees.")

Gemini: gemini-2.5-flash

cfg_gemini <- llm_config(
  provider = "gemini",
  model    = "gemini-2.5-flash"
)

chat_gem <- chat_session(cfg_gemini, system = "Be concise.")
chat_gem$send("Give me a single-sentence fun fact about volcanoes.")

Groq: openai/gpt-oss-20b

cfg_groq <- llm_config(
  provider = "groq",
  model    = "openai/gpt-oss-20b"
)

chat_groq <- chat_session(cfg_groq, system = "Be concise.")
chat_groq$send("Share a short fun fact about octopuses.")

Using the chat history

Chat sessions remember context automatically:

chat_oai$send("What did I ask you to do in my first message?")
# The model can reference the earlier "Say a warm hello" request

Inspect the full conversation

# View all messages
as.data.frame(chat_oai)

# Get summary statistics
summary(chat_oai)

Structured chat in one call (OpenAI example)

schema <- list(
  type = "object",
  properties = list(
    answer     = list(type = "string"),
    confidence = list(type = "number")
  ),
  required = list("answer", "confidence"),
  additionalProperties = FALSE
)

chat_oai$send_structured(
  "Return an answer and a confidence score (0-1) about: Why is the sky blue?",
  schema
)
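If you want to verify what the schema looks like on the wire before sending it, you can serialize it yourself (a sketch assuming the jsonlite package is installed; auto_unbox = TRUE keeps length-one values as JSON scalars rather than arrays):

# Inspect the JSON Schema as the provider will receive it
jsonlite::toJSON(schema, auto_unbox = TRUE, pretty = TRUE)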
