Support system prompt caching via system_prompt_cache_type provider option#410
Open
GioChocolateBro wants to merge 1 commit into `laravel:0.x`
Fixes #119
## Problem

Anthropic's prompt caching can dramatically reduce input token costs for agents with large system prompts. The `AnthropicGateway` currently passes system prompts as plain strings, so there is no way to attach the `cache_control` metadata that Anthropic requires for caching.

## Solution
Add a `system_prompt_cache_type` provider option. When set, `BuildsTextRequests` formats the system prompt as a cached content block before sending it to the API.

This is 12 lines in one trait. When the option is not set, behavior is completely unchanged: the `isset` check fails and the system prompt stays a plain string.

## Usage
Agents opt in via `HasProviderOptions`.

## How it works
In `BuildsTextRequests::buildTextRequestBody()`, after provider options are resolved:

- If `system_prompt_cache_type` is set, reformat `$body['system']` from a plain string to an array of content blocks with `cache_control`.
- Unset the key from `$providerOptions` so it doesn't leak into the API body via `array_merge`.

Both formats (string and content block array) are valid per the Anthropic Messages API.
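A self-contained sketch of the flow described above, combining the `HasProviderOptions` opt-in with the body-building step. The helper function name, model string, and prompt text are illustrative; the actual 12-line change lives inside `BuildsTextRequests::buildTextRequestBody()` and may differ in detail:

```php
<?php

// Illustrative sketch only: applySystemPromptCache() stands in for the
// change described inside BuildsTextRequests::buildTextRequestBody().
function applySystemPromptCache(array $body, array $providerOptions): array
{
    if (isset($providerOptions['system_prompt_cache_type'])) {
        // Reformat the plain-string system prompt into a content block
        // carrying the cache_control metadata Anthropic expects.
        $body['system'] = [[
            'type' => 'text',
            'text' => $body['system'],
            'cache_control' => [
                'type' => $providerOptions['system_prompt_cache_type'],
            ],
        ]];

        // Consume the key so it doesn't leak into the request body when
        // the remaining provider options are merged in.
        unset($providerOptions['system_prompt_cache_type']);
    }

    return array_merge($body, $providerOptions);
}

// An agent opts in via its provider options ("ephemeral" is the cache
// type Anthropic documents for prompt caching):
$providerOptions = ['system_prompt_cache_type' => 'ephemeral'];

$body = applySystemPromptCache(
    ['model' => 'claude-sonnet', 'system' => 'Long system prompt...'],
    $providerOptions,
);
// $body['system'] is now a cached content block; with no option set,
// the isset check fails and the string passes through untouched.
```

Because the key is unset before the merge, the remaining provider options still flow into the body exactly as before, which is what keeps `tool_result_cache_type` working.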
## Why not strip `cache_control` from provider options (PR #365)?

That approach was closed because it would break the existing `tool_result_cache_type` mechanism. This PR doesn't touch the provider options flow at all; it only consumes and removes its own key.

## Tests
Three new tests in `ProviderOptionsTest.php`:

- `system_prompt_cache_type` formats the system prompt as a cached content block
- `system_prompt_cache_type` is not included as a top-level key in the request body
- `system_prompt_cache_type`

## Results
Tested in production on a multi-step analytics agent with an ~8.5k-token system prompt: ~90% reduction in effective input token cost. The cache stays warm across conversations within the 5-minute TTL.
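The ~90% figure lines up with Anthropic's published pricing, where cache reads are billed at roughly one tenth of the base input rate (cache writes cost slightly more than base). A back-of-envelope check, treating the 0.1x read rate as an assumption:

```php
<?php

// Back-of-envelope check of the reported savings. The cache read rate
// is an assumption based on Anthropic's published pricing.
$promptTokens  = 8500;
$cacheReadRate = 0.10; // cache reads billed at ~10% of base input rate

// Effective input tokens billed when the cache is warm.
$effectiveTokens = (int) ($promptTokens * $cacheReadRate);

// Reduction on the cached portion of the prompt.
$savings = 1 - $cacheReadRate; // ~0.9, i.e. ~90%
```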
## Changes

- `BuildsTextRequests.php`: 12 lines added
- `CachedSystemPromptAgent.php`: test fixture
- `ProviderOptionsTest.php`: 3 new tests