From 032e8f9daafee05cd78cad175fbb9a7188eacfda Mon Sep 17 00:00:00 2001
From: Isaac Francisco <78627776+isahers1@users.noreply.github.com>
Date: Wed, 29 Jan 2025 17:39:28 -0800
Subject: [PATCH] Update docs/observability/index.mdx

Co-authored-by: Tanushree <87711021+tanushree-sharma@users.noreply.github.com>
---
 docs/observability/index.mdx | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/observability/index.mdx b/docs/observability/index.mdx
index d588bb82..e084faf4 100644
--- a/docs/observability/index.mdx
+++ b/docs/observability/index.mdx
@@ -135,7 +135,7 @@ it has an LLM call!

 ## 5. Trace OpenAI calls

-The first thing you might want to trace is all your OpenAI calls. LangSmith makes this easy with the [`wrap_openai`](https://docs.smith.langchain.com/reference/python/wrappers/langsmith.wrappers._openai.wrap_openai_)/[`wrapOpenAI`](https://docs.smith.langchain.com/reference/js/functions/wrappers_openai.wrapOpenAI) wrappers.
+The first thing you might want to trace is all your OpenAI calls. LangSmith makes this easy with the [`wrap_openai`](https://docs.smith.langchain.com/reference/python/wrappers/langsmith.wrappers._openai.wrap_openai_) (Python) or [`wrapOpenAI`](https://docs.smith.langchain.com/reference/js/functions/wrappers_openai.wrapOpenAI) (TypeScript) wrappers.

 All you have to do is modify your code to use the wrapped client instead of using the `OpenAI` client directly.
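
For context, here is a minimal Python sketch of the usage the patched sentence describes: wrapping the OpenAI client with LangSmith's `wrap_openai` so calls made through it are traced. It assumes the `openai` and `langsmith` packages are installed and that your API keys and LangSmith tracing environment variables are already configured; the model name and prompt are illustrative only.

```python
# Sketch: trace OpenAI calls by using the wrapped client instead of OpenAI directly.
# Assumes OPENAI_API_KEY and LANGSMITH_API_KEY (plus tracing env vars) are set.
from openai import OpenAI
from langsmith import wrappers

# Wrap the client; calls made through it are logged as traces in LangSmith.
client = wrappers.wrap_openai(OpenAI())

# Use the wrapped client exactly as you would the plain OpenAI client.
response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[{"role": "user", "content": "Say hello."}],
)
print(response.choices[0].message.content)
```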