Prompt caching: 10x cheaper LLM tokens, but how? | ngrok blog
https://ngrok.com/blog/prompt-caching/