Today VS Code updated to v1.119.1, and suddenly LLMs are compacting conversations out of the blue, for example just after 50% of the context window on Claude Opus 4.7. It's happening with both Claude Opus 4.7 and GPT 5.5. (Just now it compacted a Claude Opus 4.7 conversation at only 43% of the context window!)
This is a huge blow to user progress; it's a major regression of the kind reported in #299810.
Version: 1.119.1 (user setup)
Commit: 974500e64f0d1cfdf7c9821a2a51c2cb3bf0e561
Date: 2026-05-12T01:20:22+09:00
Electron: 39.8.8
ElectronBuildId: 13870025
Chromium: 142.0.7444.265
Node.js: 22.22.1
V8: 14.2.231.22-electron.0
OS: Windows_NT x64 10.0.26200