
PromptLayer OpenAI

Danger: This module has been deprecated and is no longer supported. The documentation below will not work in versions 0.2.0 or later.

LangChain integrates with PromptLayer for logging and debugging prompts and responses. To add support for PromptLayer:

  1. Create a PromptLayer account here: https://promptlayer.com.
  2. Create an API token and pass it either as the promptLayerApiKey argument to the PromptLayerOpenAI constructor, or set it in the PROMPTLAYER_API_KEY environment variable (a sketch of the environment-variable approach follows the example below).
import { PromptLayerOpenAI } from "langchain/llms/openai";

const model = new PromptLayerOpenAI({
  temperature: 0.9,
  apiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.OPENAI_API_KEY
  promptLayerApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.PROMPTLAYER_API_KEY
});

const res = await model.invoke(
  "What would be a good company name for a company that makes colorful socks?"
);
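
If the keys are already available as environment variables (as noted in the comments above), they can be omitted from the constructor. A minimal sketch, assuming OPENAI_API_KEY and PROMPTLAYER_API_KEY are set in the Node.js environment:

import { PromptLayerOpenAI } from "langchain/llms/openai";

// Reads OPENAI_API_KEY and PROMPTLAYER_API_KEY from process.env
const model = new PromptLayerOpenAI({ temperature: 0.9 });

const res = await model.invoke(
  "What would be a good company name for a company that makes colorful socks?"
);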

Azure PromptLayerOpenAI

LangChain also integrates with PromptLayer for Azure-hosted OpenAI instances:

import { PromptLayerOpenAI } from "langchain/llms/openai";

const model = new PromptLayerOpenAI({
  temperature: 0.9,
  azureOpenAIApiKey: "YOUR-AOAI-API-KEY", // In Node.js defaults to process.env.AZURE_OPENAI_API_KEY
  azureOpenAIApiInstanceName: "YOUR-AOAI-INSTANCE-NAME", // In Node.js defaults to process.env.AZURE_OPENAI_API_INSTANCE_NAME
  azureOpenAIApiDeploymentName: "YOUR-AOAI-DEPLOYMENT-NAME", // In Node.js defaults to process.env.AZURE_OPENAI_API_DEPLOYMENT_NAME
  azureOpenAIApiCompletionsDeploymentName: "YOUR-AOAI-COMPLETIONS-DEPLOYMENT-NAME", // In Node.js defaults to process.env.AZURE_OPENAI_API_COMPLETIONS_DEPLOYMENT_NAME
  azureOpenAIApiEmbeddingsDeploymentName: "YOUR-AOAI-EMBEDDINGS-DEPLOYMENT-NAME", // In Node.js defaults to process.env.AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME
  azureOpenAIApiVersion: "YOUR-AOAI-API-VERSION", // In Node.js defaults to process.env.AZURE_OPENAI_API_VERSION
  azureOpenAIBasePath: "YOUR-AZURE-OPENAI-BASE-PATH", // In Node.js defaults to process.env.AZURE_OPENAI_BASE_PATH
  promptLayerApiKey: "YOUR-API-KEY", // In Node.js defaults to process.env.PROMPTLAYER_API_KEY
});

const res = await model.invoke(
  "What would be a good company name for a company that makes colorful socks?"
);

The request and the response will be logged in the PromptLayer dashboard.

Note: In streaming mode PromptLayer will not log the response.
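
For illustration, a hedged sketch of streaming usage, assuming the standard .stream() method that LangChain LLMs expose: the streamed tokens still reach the caller, but per the note above the response is not recorded in PromptLayer.

// Streaming still yields tokens to the caller; PromptLayer will not log the response.
const stream = await model.stream(
  "What would be a good company name for a company that makes colorful socks?"
);

for await (const chunk of stream) {
  process.stdout.write(chunk);
}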

