theo made a useful site for viewing the prices of different LLMs: model-prices.vercel.app

I just check the low/mid-priced ones for any models I use through OpenRouter lmao; on ChatGPT/t3 chat I go wild :P

[Screenshot: a bar chart comparing input and output costs across models, with a settings panel on the right for customization.]
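If you'd rather do that low/mid check in code instead of eyeballing the chart, here's a minimal TypeScript sketch against OpenRouter's public models endpoint. The `pricing.prompt` / `pricing.completion` fields quoted as USD-per-token strings are an assumption based on its public docs, and the $1-per-million-input cutoff is just an arbitrary example threshold:

```ts
// Sketch: list OpenRouter models whose input price falls under a cutoff.
// Assumes the public GET /api/v1/models endpoint returns { data: [...] }
// with per-token USD prices as strings in `pricing`.

type OpenRouterModel = {
  id: string;
  pricing: { prompt: string; completion: string }; // USD per token, as strings
};

async function listCheapModels(maxUsdPerMillionInput = 1): Promise<void> {
  const res = await fetch("https://openrouter.ai/api/v1/models");
  if (!res.ok) throw new Error(`OpenRouter request failed: ${res.status}`);

  const { data } = (await res.json()) as { data: OpenRouterModel[] };

  for (const model of data) {
    // Convert per-token prices to the per-1M-tokens figures pricing sites show.
    const inputPerMillion = parseFloat(model.pricing.prompt) * 1_000_000;
    const outputPerMillion = parseFloat(model.pricing.completion) * 1_000_000;

    if (inputPerMillion <= maxUsdPerMillionInput) {
      console.log(
        `${model.id}: $${inputPerMillion.toFixed(2)} in / $${outputPerMillion.toFixed(2)} out per 1M tokens`
      );
    }
  }
}

listCheapModels().catch(console.error);
```

Runs as-is in Node 18+ or Deno (both ship a global `fetch`); tweak the cutoff to match whatever you consider "low/mid".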
