Context Rot: How increasing input tokens impacts LLM performance

254 points | by kellyhongsn 4 days ago | 63 comments