
llm.api

“Llamar” is Spanish for “to call”

Minimal-dependency LLM chat interface. Part of cornyverse.

Exports

Function               Purpose
chat(prompt, model)    Chat with any LLM
chat_openai(prompt)    OpenAI GPT models
chat_claude(prompt)    Anthropic Claude models
chat_ollama(prompt)    Local Ollama server
list_ollama_models()   List Ollama models
llm_base(url)          Set API endpoint
llm_key(key)           Set API key

Providers

OpenAI (GPT), Anthropic (Claude), Moonshot (Kimi), and local Ollama servers. The provider is inferred from the model name, or set explicitly via the provider argument.

Usage

# Auto-detect provider from model
chat("Hello", model = "gpt-4o")
chat("Hello", model = "claude-3-5-sonnet-latest")
chat("Hello", model = "kimi-k2")

# Use convenience wrappers
chat_ollama("What is R?")
chat_claude("Explain machine learning")

# Explicit Moonshot/Kimi provider
chat("Write a fast parser in R", provider = "moonshot", model = "kimi-k2")

# Conversation history
result <- chat("Hi, I'm Troy")
chat("What's my name?", history = result$history)

# Streaming
chat("Write a story", stream = TRUE)
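The auto-detection shown above can be sketched as simple prefix matching on the model name. The helper below (detect_provider) is hypothetical and illustrates the idea only; llm.api's actual routing logic may differ.

```r
# Hypothetical sketch of model-name-based provider detection; not the
# package's real implementation.
detect_provider <- function(model) {
  if (grepl("^gpt", model)) {
    "openai"
  } else if (grepl("^claude", model)) {
    "anthropic"
  } else if (grepl("^kimi", model)) {
    "moonshot"
  } else {
    "ollama"  # assume anything else is a local Ollama model
  }
}

detect_provider("gpt-4o")                    # "openai"
detect_provider("claude-3-5-sonnet-latest")  # "anthropic"
detect_provider("kimi-k2")                   # "moonshot"
```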

Set the MOONSHOT_API_KEY environment variable to use Moonshot/Kimi without overriding your OpenAI credentials.
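For the current R session, the variable can be set from R itself; the value below is a placeholder, not a real key.

```r
# Set the Moonshot key for this session only, leaving OPENAI_API_KEY
# untouched ("sk-..." is a placeholder value).
Sys.setenv(MOONSHOT_API_KEY = "sk-...")
Sys.getenv("MOONSHOT_API_KEY")  # "sk-..."
```

For persistence across sessions, the usual R convention is to put the line in ~/.Renviron instead.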

Dependencies

Only curl and jsonlite. No tidyverse, no compiled code.
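To illustrate why those two packages suffice, here is a sketch of the kind of request they can build and send. The endpoint, field names, and auth header follow the common OpenAI-style chat-completions shape and are assumptions, not a description of llm.api's internals.

```r
library(jsonlite)

# Build an OpenAI-style chat request body with jsonlite alone.
body <- toJSON(list(
  model    = "gpt-4o",
  messages = list(list(role = "user", content = "Hello"))
), auto_unbox = TRUE)

# Sending it needs only curl (commented out to avoid a live network call):
# h <- curl::new_handle()
# curl::handle_setheaders(h,
#   "Content-Type"  = "application/json",
#   "Authorization" = paste("Bearer", Sys.getenv("OPENAI_API_KEY")))
# curl::handle_setopt(h, postfields = body)
# resp <- curl::curl_fetch_memory("https://api.openai.com/v1/chat/completions", h)
# fromJSON(rawToChar(resp$content))
```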
