A cache that uses Upstash Redis as the backing store. See https://docs.upstash.com/redis.

Example

import { ChatOpenAI } from "@langchain/openai";
import { UpstashRedisCache } from "@langchain/community/caches/upstash_redis";

// Initialize the OpenAI model with an Upstash Redis cache for caching responses
const model = new ChatOpenAI({
  cache: new UpstashRedisCache({
    config: {
      // Replace with your Upstash REST URL and token.
      url: "UPSTASH_REDIS_REST_URL",
      token: "UPSTASH_REDIS_REST_TOKEN",
    },
  }),
});

Methods

  • lookup — Look up LLM generations in the cache by prompt and associated LLM key.

    Parameters

    • prompt: string
    • llmKey: string

    Returns Promise<null | Generation[]>

  • update — Update the cache with the given generations.

    Note that this overwrites any existing generations for the given prompt and LLM key.

    Parameters

    • prompt: string
    • llmKey: string
    • value: Generation[]

    Returns Promise<void>
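Together, lookup and update define the read/write contract a cache implementation must satisfy: a miss returns null, and a write replaces whatever was stored for the same prompt/LLM-key pair. A minimal in-memory sketch of that contract (the class name, Generation shape, and key scheme below are illustrative; the real class persists entries in Upstash Redis):

```typescript
// Illustrative Generation shape; the real type carries more fields.
interface Generation {
  text: string;
}

// Hypothetical in-memory stand-in showing the lookup/update semantics.
class InMemoryCacheSketch {
  private store = new Map<string, Generation[]>();

  // Entries are scoped by both the prompt and the LLM key.
  private key(prompt: string, llmKey: string): string {
    return `${prompt}:${llmKey}`;
  }

  // Resolves to the cached generations, or null on a cache miss.
  async lookup(prompt: string, llmKey: string): Promise<Generation[] | null> {
    return this.store.get(this.key(prompt, llmKey)) ?? null;
  }

  // Overwrites any existing generations for the prompt/LLM key pair.
  async update(prompt: string, llmKey: string, value: Generation[]): Promise<void> {
    this.store.set(this.key(prompt, llmKey), value);
  }
}
```

A second update with the same prompt and LLM key replaces the earlier value rather than appending to it, matching the overwrite note above.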

Generated using TypeDoc