---
title: "Using Ellmer Chat Models"
output: rmarkdown::html_vignette
vignette: >
  %\VignetteIndexEntry{Using Ellmer Chat Models}
  %\VignetteEncoding{UTF-8}
  %\VignetteEngine{knitr::rmarkdown}
editor_options: 
  markdown: 
    wrap: 72
---

```{r, include = FALSE}
knitr::opts_chunk$set(
  collapse = TRUE,
  comment = "#>"
)
```

There are two ways to supply an [ellmer chat
model](https://ellmer.tidyverse.org/reference/index.html) for batch
processing: 1) pass a chat model object, or 2) pass a chat model
function. Passing an object is the more flexible approach because the
configured object can be reused.

## Method 1: Passing an Object

The first method is to pass a chat model object. This is useful when you
want to reuse an existing model configuration:

``` r
library(hellmer)

openai <- chat_openai(
  model = "o3-mini",
  system_prompt = "Reply concisely, one sentence"
)

chat <- chat_sequential(openai)
```
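
Because the configuration lives in a standalone object, it can be
reused wherever a chat model is accepted. The sketch below is
illustrative only; it assumes your version of hellmer also provides
the parallel `chat_future()` wrapper, which accepts the same object.

``` r
# The same configured object can back more than one batch wrapper
chat_seq <- chat_sequential(openai)
chat_par <- chat_future(openai)  # parallel counterpart, if available
```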

## Method 2: Passing a Function

The second method is to pass an ellmer chat model function directly.
You might prefer it if you only need the model once or would rather
not nest function calls; otherwise it mainly remains available for
backward compatibility.

``` r
chat <- chat_sequential(
  chat_openai,
  model = "o3-mini",
  system_prompt = "Reply concisely, one sentence"
)
```
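
Either way, the resulting `chat` object is used the same way to submit
a batch of prompts. A minimal sketch, assuming the `$batch()` and
`$texts()` methods shown in hellmer's documentation:

``` r
prompts <- list(
  "What is the capital of France?",
  "Name one use of the pipe operator in R."
)

# Submit the prompts and collect the text of each response
result <- chat$batch(prompts)
result$texts()
```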