🤖 fix: suppress browser notifications for idle compaction completions #2554

Merged: ammario merged 6 commits into main from compaction-9a6z on Feb 23, 2026
Conversation

ammar-agent (Collaborator) commented on Feb 23, 2026:

Summary

Suppress browser notification noise for compaction flows by moving idle-compaction detection to activity snapshots, skipping compaction notifications when an automatic follow-up message is queued, and hardening idle-marker cleanup across race/error paths.

Background

The previous idle-compaction suppression relied on an idle-compaction-started chat event. That signal is unreliable for true background work because onChat is only active for the currently selected workspace. During Codex review, additional edge cases surfaced where stale idle markers could leak across turns and incorrectly suppress non-idle notifications.

Implementation

  1. Activity snapshot contract

    • Added optional isIdleCompaction to WorkspaceActivitySnapshotSchema.
  2. Backend idle marker lifecycle (WorkspaceService)

    • Track in-flight idle maintenance with idleCompactingWorkspaces.
    • Set marker only after idle dispatch succeeds and the session is still busy.
    • Tag isIdleCompaction only on streaming=false snapshots (stop snapshots), not on streaming=true snapshots.
    • Always clear marker for streaming=false transitions in finally, so metadata write failures cannot leak state into future turns.
  3. Frontend completion suppression (WorkspaceStore + App)

    • Use activity transitions (streaming true→false + recency advance) for background completion detection.
    • Preserve idle detection via previous || snapshot for reconnect/restore resilience.
    • Route completion callbacks through a store-level wrapper.
    • For compaction completions, if a queued follow-up exists (auto-sent next turn), coerce hasContinueMessage=true so App suppresses intermediate "Compaction complete" notifications.
    • App.tsx suppresses notifications for compaction?.isIdle and compaction?.hasContinueMessage.
  4. Cleanup

    • Removed dead idle-compaction-started plumbing across backend/frontend:
      • backend emitter method
      • frontend callback subscription/export
      • stream schema branch and ignored-event handling
  5. Tests

    • Updated idle-compaction dispatch tests.
    • Added regression coverage for:
      • no stale marker when send succeeds without active stream
      • marker cleared when setStreaming fails on stream-stop
      • streaming=true snapshots are never idle-tagged
      • active compaction completion with queued follow-up suppression path
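Taken together, the backend marker lifecycle above can be sketched as follows. The names `idleCompactingWorkspaces`, `isIdleCompaction`, and `updateStreamingStatus` come from this PR; the surrounding types and method signatures are simplified assumptions, not the actual `WorkspaceService` code:

```typescript
interface ActivitySnapshot {
  streaming: boolean;
  isIdleCompaction?: boolean; // optional flag added by this PR
}

class WorkspaceServiceSketch {
  private idleCompactingWorkspaces = new Set<string>();

  markIdleDispatch(workspaceId: string, sessionBusy: boolean): void {
    // Set the marker only after the idle dispatch succeeded and the
    // session is still busy; otherwise clear any stale marker.
    if (sessionBusy) {
      this.idleCompactingWorkspaces.add(workspaceId);
    } else {
      this.idleCompactingWorkspaces.delete(workspaceId);
    }
  }

  updateStreamingStatus(workspaceId: string, streaming: boolean): ActivitySnapshot {
    try {
      const snapshot: ActivitySnapshot = { streaming };
      // Tag only stop snapshots, so streaming=true turns never inherit the flag.
      if (!streaming && this.idleCompactingWorkspaces.has(workspaceId)) {
        snapshot.isIdleCompaction = true;
      }
      return snapshot;
    } finally {
      // Always clear on stop, even if work above threw.
      if (!streaming) {
        this.idleCompactingWorkspaces.delete(workspaceId);
      }
    }
  }
}
```

The `finally` placement mirrors item 2: a metadata write failure on stream-stop still clears the marker, so it cannot bleed into a later turn.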

Validation

  • make static-check
  • bun test src/browser/stores/WorkspaceStore.test.ts
  • bun test src/browser/utils/messages/applyWorkspaceChatEventToAggregator.test.ts
  • bun test src/node/services/workspaceService.test.ts
  • make typecheck

Risks

Low risk and scoped to compaction completion notification behavior. Main risk is over-suppression in edge cases where queued follow-up intent is ambiguous, but this is preferable to duplicate/intermediate notifications and matches compaction auto-send semantics.

Pains

This needed several Codex-driven hardening passes for subtle ordering/failure edges (startup-abort, metadata write failure, and cross-turn marker bleed), which required tightening marker lifecycle guarantees rather than adding timing-based coordination.


Generated with mux • Model: openai:gpt-5.3-codex • Thinking: xhigh • Cost: $10.37

ammar-agent (Collaborator, Author) commented:

@codex review

chatgpt-codex-connector (bot) commented:

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 30b33f3ce2

ℹ️ About Codex in GitHub

Codex has been enabled to automatically review pull requests in this repo. Reviews are triggered when you

  • Open a pull request for review
  • Mark a draft as ready
  • Comment "@codex review".

If Codex has suggestions, it will comment; otherwise it will react with 👍.

When you sign up for Codex through ChatGPT, Codex can also answer questions or update the PR, like "@codex address that feedback".

ammar-agent (Collaborator, Author) commented:

@codex review

Addressed: now derive idle-compaction state from both the previous (start) and current (stop) snapshots, so UI reconnects that miss the transient flag on the start snapshot still suppress idle compaction notifications.

chatgpt-codex-connector (bot) commented:

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: ccb274fb8c


Idle compaction is background maintenance, not user-initiated work.
Previously, the frontend tracked idle compaction via a Set populated
from the idle-compaction-started chat event, but onChat only subscribes
for the active workspace — so the event never arrived for truly
background workspaces (the main idle compaction target).

Fix: thread isIdleCompaction through the activity snapshot, which is
the only data channel reaching the frontend for ALL workspaces. The
backend tags the snapshot during updateStreamingStatus when an idle
compaction is in progress. The frontend reads the flag on the
streaming true→false transition and suppresses the notification.

Changes:
- Add isIdleCompaction field to WorkspaceActivitySnapshotSchema
- Backend: track idle compaction Set, enrich snapshot, clear on stop
- Frontend: remove broken idleCompactingWorkspaces Set, read from
  previous snapshot's isIdleCompaction flag instead
- App.tsx: early-return before notification when compaction?.isIdle
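A hedged sketch of the transition check this commit message describes. The snapshot shape and function name are illustrative, not the real WorkspaceStore API:

```typescript
interface Snapshot {
  streaming: boolean;
  recency: number;
  isIdleCompaction?: boolean;
}

function shouldSuppressIdleNotification(
  prev: Snapshot | undefined,
  next: Snapshot
): boolean {
  // Background completion = streaming true→false with recency advancing.
  const completed =
    prev !== undefined && prev.streaming && !next.streaming && next.recency > prev.recency;
  if (!completed) return false;
  // Check both snapshots: a reconnect may have missed the transient flag
  // on the start snapshot, so the stop snapshot also carries it.
  return Boolean(prev?.isIdleCompaction || next.isIdleCompaction);
}
```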
ammar-agent (Collaborator, Author) commented:

@codex review

Addressed: moved idleCompactingWorkspaces.add() to after sendMessage returns successfully. At that point the idle stream is already running and the session is busy, so no concurrent user stream can race with the marker. The streaming=true snapshot is emitted without the flag, but the streaming=false snapshot picks it up — the frontend checks both.

chatgpt-codex-connector (bot) commented:

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 20f0834c51


- Clarify WorkspaceStore comment about when isIdleCompaction is present
- Simplify background compaction payload construction on stream stop
- Remove unused idle-compaction-started event plumbing across frontend/backend
- Update idle compaction dispatch tests to match removed started-event emission

ammar-agent (Collaborator, Author) commented:

@codex review

Latest idle-compaction cleanup is pushed (including dead idle-compaction-started plumbing removal and test updates). Please re-review.

Guard idle-compaction marker assignment behind a post-dispatch session.isBusy() check.
If sendMessage succeeds without an active stream (startup-abort or ultra-fast completion),
clear any stale marker so later user streams are never misclassified as idle compaction.

Also add regression tests for both active-stream and no-stream success paths.
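A sketch of the guard this commit adds, under assumed session and marker shapes (the actual dispatch signature in WorkspaceService may differ):

```typescript
interface IdleSession {
  sendMessage(): Promise<void>;
  isBusy(): boolean;
}

async function runIdleCompaction(
  workspaceId: string,
  session: IdleSession,
  markers: Set<string>
): Promise<void> {
  await session.sendMessage(); // idle compaction dispatch
  if (session.isBusy()) {
    // The idle stream is running; mark it so its stop snapshot is tagged.
    markers.add(workspaceId);
  } else {
    // Startup-abort or ultra-fast completion: no active stream, so clear
    // any stale marker rather than misclassifying the next user stream.
    markers.delete(workspaceId);
  }
}
```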

ammar-agent (Collaborator, Author) commented:

@codex review

Addressed stale idle-marker race:

  • only set idleCompactingWorkspaces when session.isBusy() is still true after dispatch
  • clear marker otherwise (startup-abort / ultra-fast completion path)
  • added regression tests for active-stream and no-stream-success cases

chatgpt-codex-connector (bot) commented:

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: e36d9cbc88


When a compaction stream ends with a queued follow-up message, mark the
completion as hasContinueMessage=true before forwarding onResponseComplete.
This prevents an intermediate "Compaction complete" notification and keeps
notification delivery focused on the follow-up response.

Also adds a WorkspaceStore regression test covering active compaction + queued follow-up.
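The coercion can be illustrated roughly like this; the event shape and function name are hypothetical stand-ins for the store-level completion wrapper:

```typescript
interface CompletionEvent {
  kind: "compaction" | "response";
  hasContinueMessage: boolean;
}

function wrapCompletion(event: CompletionEvent, hasQueuedFollowUp: boolean): CompletionEvent {
  // If a queued follow-up will auto-send next turn, mark the compaction
  // completion so App suppresses the intermediate notification and the
  // user is notified on the follow-up response instead.
  if (event.kind === "compaction" && hasQueuedFollowUp) {
    return { ...event, hasContinueMessage: true };
  }
  return event;
}
```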

ammar-agent (Collaborator, Author) commented:

@codex review

Added another notification suppression fix:

  • compaction completions now inherit hasContinueMessage=true when a queued follow-up is pending
  • prevents intermediate "Compaction complete" notification and notifies on the follow-up response instead
  • includes WorkspaceStore regression test coverage

Move idle-compaction marker cleanup for streaming=false transitions into
updateStreamingStatus finally block so extension metadata failures cannot leak
isIdleCompaction state into later user streams.

Adds a regression test that forces setStreaming failure and verifies the marker
is still cleared.
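Why the `finally` block matters here, in miniature (all names assumed):

```typescript
function stopStream(
  workspaceId: string,
  setStreaming: () => void, // may throw on extension metadata write failure
  markers: Set<string>
): void {
  try {
    setStreaming();
  } finally {
    // Runs even when setStreaming throws, so the idle marker
    // cannot leak into a later user stream.
    markers.delete(workspaceId);
  }
}
```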

ammar-agent (Collaborator, Author) commented:

@codex review

Addressed the new idle-marker leak case:

  • updateStreamingStatus now clears idleCompactingWorkspaces in finally for streaming=false
  • added regression test for setStreaming failure path

chatgpt-codex-connector (bot) commented:

💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: 1ac1363262


Restrict updateStreamingStatus idle tagging to streaming=false snapshots so
follow-up non-idle turns cannot inherit stale isIdleCompaction metadata from
prior maintenance turns.

Also adds a regression test for streaming=true behavior while the idle marker is set.
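The restriction in isolation (snapshot shape assumed):

```typescript
function tagSnapshot(
  streaming: boolean,
  markerSet: boolean
): { streaming: boolean; isIdleCompaction?: boolean } {
  // Only stop snapshots may carry the flag; the start snapshot of a
  // follow-up turn must never inherit stale maintenance metadata.
  return !streaming && markerSet ? { streaming, isIdleCompaction: true } : { streaming };
}
```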

ammar-agent (Collaborator, Author) commented:

@codex review

Addressed stream-stop tagging feedback:

  • idle marker now only tags streaming=false snapshots
  • added regression test that streaming=true snapshots stay untagged even when idle marker is set

chatgpt-codex-connector (bot) commented:

Codex Review: Didn't find any major issues. Breezy!


ammario merged commit 3ffbb82 into main on Feb 23, 2026. 22 checks passed.
ammario deleted the compaction-9a6z branch on February 23, 2026 at 19:56.