add prompt engineering QuickStarts #640

Merged · 18 commits · Jan 30, 2025
245 changes: 245 additions & 0 deletions docs/prompt_engineering/quickstarts/quickstart_sdk.mdx
@@ -0,0 +1,245 @@
---
sidebar_label: Quick Start (SDK)
sidebar_position: 0
table_of_contents: true
---

import {
CodeTabs,
python,
typescript,
ShellBlock,
} from "@site/src/components/InstructionsWithCode";
import { RegionalUrl } from "@site/src/components/RegionalUrls";

# Prompt Engineering Quick Start (SDK)

This quick start walks through how to create, test, and iterate on prompts using the SDK. This tutorial uses OpenAI, but you can use whichever LLM provider you prefer.

:::info QuickStart
This tutorial uses the SDK for prompt engineering. If you are interested in using the UI instead, read [this guide](./quickstart_ui).
:::

## 1. Setup

First, install the required packages:

<CodeTabs
tabs={[
{
value: "python",
label: "Python",
language: "bash",
content: `pip install -qU langsmith openai langchain_core`,
},
{
value: "typescript",
label: "TypeScript",
language: "bash",
content: `yarn add langsmith langchain @langchain/core @langchain/openai openai`,
},
]}
groupId="client-language"
/>

Next, make sure you have signed up for a [LangSmith](https://langsmith.com) account, then [create](../../administration/how_to_guides/organization_management/create_account_api_key#create-an-api-key) and set your API key.
You will also need an OpenAI API key to run the code in this tutorial.

```bash
export LANGSMITH_API_KEY='<your_api_key>'
export OPENAI_API_KEY='<your_api_key>'
```
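
If you prefer to configure these keys from inside a script rather than your shell, a minimal Python sketch (an assumption about your workflow, using interactive prompts so the keys never end up hard-coded) looks like this:

```python
import os
from getpass import getpass

# Prompt for the keys interactively and expose them as environment
# variables, which the LangSmith and OpenAI clients read on startup.
os.environ["LANGSMITH_API_KEY"] = getpass("LangSmith API key: ")
os.environ["OPENAI_API_KEY"] = getpass("OpenAI API key: ")
```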

## 2. Create a prompt

To create a prompt in LangSmith, define the list of messages you want in your prompt and then wrap them in a `ChatPromptTemplate` ([Python](https://python.langchain.com/api_reference/core/prompts/langchain_core.prompts.chat.ChatPromptTemplate.html) / [TypeScript](https://v03.api.js.langchain.com/classes/_langchain_core.prompts.ChatPromptTemplate.html)).
Then call [`push_prompt`](https://docs.smith.langchain.com/reference/python/client/langsmith.client.Client#langsmith.client.Client.push_prompt) (Python) or [`pushPrompt`](https://langsmith-docs-7jgx2bq8f-langchain.vercel.app/reference/js/classes/client.Client#pushprompt) (TypeScript) to send your prompt to LangSmith.

<CodeTabs
tabs={[
{
value: "python",
label: "Python",
language: "python",
content: `from langsmith import Client
from langchain_core.prompts import ChatPromptTemplate

# Connect to the LangSmith client
client = Client()

# Define the prompt
prompt = ChatPromptTemplate([
    ("system", "You are a helpful chatbot."),
    ("user", "{question}"),
])

# Push the prompt
client.push_prompt("my-prompt", object=prompt)`,
},
{
value: "typescript",
label: "TypeScript",
language: "typescript",
content: `import { Client } from "langsmith";
import { ChatPromptTemplate } from "@langchain/core/prompts";

// Connect to the LangSmith client
const client = new Client();

// Define the prompt
const prompt = ChatPromptTemplate.fromMessages([
["system", "You are a helpful chatbot."],
["user", "{question}"]
]);

// Push the prompt
await client.pushPrompt("my-prompt", {
object: prompt
});`,
},
]}
groupId="client-language"
/>

## 3. Test a prompt

To test a prompt, pull the prompt, invoke it with the input values you want to test, and then call the model with the formatted messages your LLM or application expects.

<CodeTabs
tabs={[
{
value: "python",
label: "Python",
language: "python",
content: `from langsmith import Client
from openai import OpenAI
from langchain_core.messages import convert_to_openai_messages

# Connect to LangSmith and OpenAI
client = Client()
oai_client = OpenAI()

# Pull the prompt to use
# You can also specify a specific commit by passing the commit hash "my-prompt:<commit-hash>"
prompt = client.pull_prompt("my-prompt")

# Since our prompt only has one variable we could also pass in the value directly
# The code below is equivalent to formatted_prompt = prompt.invoke("What is the color of the sky?")
formatted_prompt = prompt.invoke({"question": "What is the color of the sky?"})

# Test the prompt
response = oai_client.chat.completions.create(
    model="gpt-4o",
    messages=convert_to_openai_messages(formatted_prompt.messages),
)`,
},
{
value: "typescript",
label: "TypeScript",
language: "typescript",
content: `import { OpenAI } from "openai";
import { pull } from "langchain/hub";
import { ChatPromptTemplate } from "@langchain/core/prompts";
import { convertPromptToOpenAI } from "@langchain/openai";

// Connect to LangSmith and OpenAI
const oaiClient = new OpenAI();

// Pull the prompt to use
// You can also specify a specific commit by passing the commit hash "my-prompt:<commit-hash>"
const prompt = await pull<ChatPromptTemplate>("my-prompt");

// Format the prompt with the question
const formattedPrompt = await prompt.invoke({ question: "What is the color of the sky?" });

// Test the prompt
const response = await oaiClient.chat.completions.create({
model: "gpt-4o",
messages: convertPromptToOpenAI(formattedPrompt).messages,
});`,
},
]}
groupId="client-language"
/>
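
Once the call returns, you can inspect the generated text directly. For example, continuing from the Python snippet above (a minimal sketch; the response object follows the standard OpenAI chat completions shape):

```python
# The generated reply lives on the first choice of the completion
print(response.choices[0].message.content)
```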

## 4. Iterate on a prompt

LangSmith makes it easy to iterate on prompts with your entire team. Members of your workspace can select a prompt to iterate on, and once they are happy with their changes, they can simply save it as a new commit.

To improve your prompts:

- We recommend referencing the documentation provided by your model provider for best practices in prompt creation,
such as [Best practices for prompt engineering with the OpenAI API](https://help.openai.com/en/articles/6654000-best-practices-for-prompt-engineering-with-the-openai-api) and [Gemini’s Introduction to prompt design](https://ai.google.dev/gemini-api/docs/prompting-intro).

- To help with iterating on your prompts in LangSmith, we've created Prompt Canvas — an interactive tool to build and optimize your prompts. Learn about how to use [Prompt Canvas](../concepts#prompt-canvas).

To add a new commit to a prompt, use the same [`push_prompt`](https://docs.smith.langchain.com/reference/python/client/langsmith.client.Client#langsmith.client.Client.push_prompt) (Python) or [`pushPrompt`](https://langsmith-docs-7jgx2bq8f-langchain.vercel.app/reference/js/classes/client.Client#pushprompt) (TypeScript) method you used when you first created the prompt.

<CodeTabs
tabs={[
{
value: "python",
label: "Python",
language: "python",
content: `from langsmith import Client
from langchain_core.prompts import ChatPromptTemplate

# Connect to the LangSmith client
client = Client()

# Define the prompt to update
new_prompt = ChatPromptTemplate([
    ("system", "You are a helpful chatbot. Respond in Spanish."),
    ("user", "{question}"),
])

# Push the updated prompt making sure to use the correct prompt name
# Tags can help you remember specific versions in your commit history
client.push_prompt("my-prompt", object=new_prompt, tags=["Spanish"])`,
},
{
value: "typescript",
label: "TypeScript",
language: "typescript",
content: `import { Client } from "langsmith";
import { ChatPromptTemplate } from "@langchain/core/prompts";

// Connect to the LangSmith client
const client = new Client();

// Define the prompt
const newPrompt = ChatPromptTemplate.fromMessages([
["system", "You are a helpful chatbot. Speak in Spanish."],
["user", "{question}"]
]);

// Push the updated prompt making sure to use the correct prompt name
// Tags can help you remember specific versions in your commit history
await client.pushPrompt("my-prompt", {
object: newPrompt,
tags: ["Spanish"]
});`,
},
]}
groupId="client-language"
/>
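
Because every push to the same prompt name creates a new commit, downstream code can pin to a specific version instead of always pulling the latest one. A minimal Python sketch, reusing the `my-prompt:<commit-hash>` syntax mentioned in the code comments above (copy the hash from the prompt's commit history in LangSmith):

```python
from langsmith import Client

client = Client()

# Pulls the latest commit of the prompt
latest_prompt = client.pull_prompt("my-prompt")

# Pins to a specific version; replace <commit-hash> with a real hash
# from the prompt's commit history
pinned_prompt = client.pull_prompt("my-prompt:<commit-hash>")
```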

## 5. Next steps

- Learn more about how to store and manage prompts using the Prompt Hub in [these how-to guides](../how_to_guides#prompt-hub)
- Learn more about how to use the playground for prompt engineering in [these how-to guides](../how_to_guides#playground)
71 changes: 71 additions & 0 deletions docs/prompt_engineering/quickstarts/quickstart_ui.mdx
@@ -0,0 +1,71 @@
---
sidebar_label: Quick Start (UI)
sidebar_position: 0
table_of_contents: true
---

import {
CodeTabs,
python,
typescript,
ShellBlock,
} from "@site/src/components/InstructionsWithCode";
import { RegionalUrl } from "@site/src/components/RegionalUrls";

# Prompt Engineering Quick Start (UI)

This quick start will walk through how to create, test, and iterate on prompts in LangSmith.

:::info QuickStart
This tutorial uses the UI for prompt engineering. If you are interested in using the SDK instead, read [this guide](./quickstart_sdk).
:::

## 1. Setup

The only setup needed for this guide is to make sure you have signed up for a [LangSmith](https://langsmith.com) account.

## 2. Create a prompt

To create a prompt in LangSmith, navigate to the **Prompts** section of the left-hand sidebar and click on the “+ New Prompt” button.
You can then modify the prompt by editing/adding messages and input variables.

![](./static/create_prompt_ui.gif)

## 3. Test a prompt

To test a prompt, set the model configuration you want to use, add your LLM provider's API key, specify the prompt input values you want to test, and then click "Start".

To learn about more options for configuring your prompt in the playground, check out this [guide](../how_to_guides/playground/managing_model_configurations).
If you are interested in testing how your prompt performs over a dataset instead of individual examples, read [this page](../how_to_guides/playground/testing_over_dataset).

![](./static/test_prompt_ui.gif)

## 4. Save a prompt

Once you have run some tests and made your desired changes to your prompt, you can click the “Save” button to save your prompt for future use.

![](./static/save_prompt_ui.gif)

## 5. Iterate on a prompt

LangSmith makes it easy to iterate on prompts with your entire team. Members of your workspace can select a prompt to iterate on in the playground,
and once they are happy with their changes, they can simply save it as a new commit.

To improve your prompts:

- We recommend referencing the documentation provided by your model provider for best practices in prompt creation,
such as [Best practices for prompt engineering with the OpenAI API](https://help.openai.com/en/articles/6654000-best-practices-for-prompt-engineering-with-the-openai-api) and [Gemini’s Introduction to prompt design](https://ai.google.dev/gemini-api/docs/prompting-intro).

- To help with iterating on your prompts in LangSmith, we've created Prompt Canvas — an interactive tool to build and optimize your prompts.
Learn about how to use [Prompt Canvas](../concepts#prompt-canvas).

![](./static/save_prompt_commit_ui.gif)

You can also tag specific commits to mark important moments in your commit history:

![](./static/tag_prompt_ui.gif)

## 6. Next steps

- Learn more about how to store and manage prompts using the Prompt Hub in [these how-to guides](../how_to_guides#prompt-hub)
- Learn more about how to use the playground for prompt engineering in [these how-to guides](../how_to_guides#playground)
15 changes: 14 additions & 1 deletion sidebars.js
@@ -121,6 +121,19 @@ const sidebars = {
type: "category",
label: "Prompt Engineering",
items: [
{
type: "category",
label: "Quickstarts",
collapsible: true,
items: [
"prompt_engineering/quickstarts/quickstart_ui",
"prompt_engineering/quickstarts/quickstart_sdk",
],
link: {
type: "doc",
id: "prompt_engineering/quickstarts/quickstart_ui",
},
},
{
type: "category",
label: "Tutorials",
@@ -161,7 +174,7 @@
link: { type: "doc", id: "prompt_engineering/concepts/index" },
},
],
link: { type: "doc", id: "prompt_engineering/tutorials/index" },
link: { type: "doc", id: "prompt_engineering/quickstarts/quickstart_ui" },
},
"langgraph_cloud",
{