
ChatWriter

This notebook provides a quick overview for getting started with Writer chat models.

Writer has several chat models. You can find information about their latest models and their costs, context windows, and supported input types in the Writer docs.


Overview

Integration details

Class | Package | Local | Serializable | JS support | Package downloads | Package latest
ChatWriter | langchain-community |  |  |  |  |

Model features

Tool calling | Structured output | JSON mode | Image input | Audio input | Video input | Token-level streaming | Native async | Token usage | Logprobs

Setup

To access Writer models you'll need to create a Writer account, get an API key, and install the writer-sdk and langchain-community packages.

Credentials

Head to Writer AI Studio to sign up for a Writer account and generate an API key. Once you've done this, set the WRITER_API_KEY environment variable:

import getpass
import os

if not os.environ.get("WRITER_API_KEY"):
    os.environ["WRITER_API_KEY"] = getpass.getpass("Enter your Writer API key: ")

Installation

The LangChain Writer integration lives in the langchain-community package:

%pip install -qU langchain-community writer-sdk
Note: you may need to restart the kernel to use updated packages.

Instantiation

Now we can instantiate our model object and generate chat completions:

from langchain_community.chat_models.writer import ChatWriter

llm = ChatWriter(
    model="palmyra-x-004",
    temperature=0.7,
    max_tokens=1000,
    # api_key="...",  # if you prefer to pass the API key in directly instead of using env vars
    # base_url="...",
    # other params...
)
API Reference: ChatWriter

Invocation

messages = [
    (
        "system",
        "You are a helpful assistant that writes poems about the Python programming language.",
    ),
    ("human", "Write a poem about Python."),
]
ai_msg = llm.invoke(messages)
ai_msg
print(ai_msg.content)
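
Token-level streaming is listed in the model features above. As a minimal sketch using the standard LangChain .stream() interface (assuming the llm and messages objects defined above), you can print chunks as they arrive instead of waiting for the full response:

# Minimal streaming sketch; each chunk is an AIMessageChunk with partial content.
for chunk in llm.stream(messages):
    print(chunk.content, end="", flush=True)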

Chaining

We can chain our model with a prompt template like so:

from langchain_core.prompts import ChatPromptTemplate

prompt = ChatPromptTemplate.from_messages(
    [
        (
            "system",
            "You are a helpful assistant that writes poems about the {input_language} programming language.",
        ),
        ("human", "{input}"),
    ]
)

chain = prompt | llm
chain.invoke(
    {
        "input_language": "Java",
        "input": "Write a poem about Java.",
    }
)
API Reference: ChatPromptTemplate

Tool calling

Writer supports tool calling, which lets you describe tools and their arguments, and have the model return a JSON object with a tool to invoke and the inputs to that tool.

ChatWriter.bind_tools()

With ChatWriter.bind_tools, we can easily pass in Pydantic classes, dict schemas, LangChain tools, or even functions as tools to the model. Under the hood these are converted to tool schemas, which look like:

{
    "name": "...",
    "description": "...",
    "parameters": {...}  # JSONSchema
}

and passed in every model invocation.

from pydantic import BaseModel, Field


class GetWeather(BaseModel):
    """Get the current weather in a given location"""

    location: str = Field(..., description="The city and state, e.g. San Francisco, CA")


llm_with_tools = llm.bind_tools([GetWeather])
ai_msg = llm_with_tools.invoke(
    "what is the weather like in New York City",
)
ai_msg
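
As mentioned above, bind_tools also accepts dict schemas, LangChain tools, and plain functions. Here is a minimal sketch of the function form; get_population is a hypothetical example (not part of the Writer docs), and the tool schema is derived from its signature and docstring:

# Hypothetical function bound as a tool; bind_tools builds the schema
# from the type hints and docstring.
def get_population(city: str) -> int:
    """Get the current population of a given city."""
    ...


llm_with_fn_tool = llm.bind_tools([get_population])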

AIMessage.tool_calls

Notice that the AIMessage has a tool_calls attribute. This contains tool calls in a standardized ToolCall format that is model-provider agnostic.

ai_msg.tool_calls
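
Each entry in ai_msg.tool_calls is a dict with name, args, id, and type keys. For illustration, you could inspect them like this (the values in the comment are hypothetical, not actual model output):

# Each item follows LangChain's standardized ToolCall format, e.g.:
#   {"name": "GetWeather", "args": {"location": "New York City, NY"},
#    "id": "call_abc123", "type": "tool_call"}   # hypothetical values
for tool_call in ai_msg.tool_calls:
    print(tool_call["name"], tool_call["args"])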

For more on binding tools and tool call outputs, head to the tool calling docs.

API reference

For detailed documentation of all Writer features, head to our API reference.

