Installation

This guide covers how to install RubyLLM.

Table of contents

  1. Prerequisites
  2. Installation Methods
    1. Using Bundler (Recommended)
    2. Manual Installation
  3. Basic Configuration (Required)
  4. Verifying Installation
  5. Next Steps

After reading this guide, you will know:

  • RubyLLM’s prerequisites.
  • How to install RubyLLM using Bundler or manually.
  • Where to find configuration details.

Prerequisites

  • Ruby 3.1 or later.
  • API keys for the AI providers you plan to use (e.g., OpenAI, Anthropic).
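If you're unsure which Ruby you're running, a quick stdlib-only check against the 3.1 minimum (the `check_ruby.rb` filename is just illustrative):

```ruby
# check_ruby.rb — verify the interpreter meets RubyLLM's minimum (3.1)
required = Gem::Version.new('3.1')
current  = Gem::Version.new(RUBY_VERSION)

if current < required
  abort "Ruby #{current} is too old; RubyLLM needs #{required} or later."
end

puts "Ruby #{current} meets the requirement."
```

Using `Gem::Version` avoids string comparison pitfalls (e.g. "3.10" sorting before "3.2").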

Installation Methods

Using Bundler (Recommended)

Bundler is the standard way to manage gem dependencies in Ruby projects.

Add RubyLLM to your project’s Gemfile:

gem 'ruby_llm'

Then, from your terminal, install the dependencies:

bundle install

Manual Installation

If you’re not using Bundler (e.g., for a simple script), you can install RubyLLM directly using the gem command:

gem install ruby_llm

You’ll then need to require 'ruby_llm' in your script.

Basic Configuration (Required)

RubyLLM needs API keys to communicate with AI providers. You must configure keys for the services you intend to use.

Here’s a minimal example showing how to configure the OpenAI key:

# config/initializers/ruby_llm.rb (in Rails) or at the start of your script
require 'ruby_llm'

RubyLLM.configure do |config|
  # Set keys for the providers you need. Using environment variables is best practice.
  config.openai_api_key = ENV.fetch('OPENAI_API_KEY', nil)
  # Add other keys like config.anthropic_api_key if needed
end

RubyLLM will raise a ConfigurationError if you attempt to use a provider whose key is not configured.

For a complete list of all configuration options, including setting default models, timeouts, custom endpoints, and using configuration contexts, please refer to the Configuration Guide.

Verifying Installation

You can quickly verify your setup by running a simple query. Make sure you have configured at least one API key (like openai_api_key in the example below).

require 'ruby_llm'

# Configure with at least one API key
RubyLLM.configure do |config|
  config.openai_api_key = ENV['OPENAI_API_KEY']
end

# Try a simple query
chat = RubyLLM.chat
response = chat.ask "Hello, world!"
puts response.content

# Check available models
puts "Available models:"
RubyLLM.models.chat_models.each do |model|
  puts "- #{model.id} (#{model.provider})"
end

If this runs without raising a configuration error and prints a response (or a provider-side API error such as ‘Invalid API key’), the gem is installed correctly.

Next Steps

Now that you’ve installed RubyLLM: