
CCM-15297: Prevent MeshDownloader from publishing duplicate events #243

Open
lapenna-bjss wants to merge 11 commits into main from
feature/CCM-15297_Prevent_MeshDownloader_from_publishing_duplicate_events

Conversation

@lapenna-bjss (Collaborator) commented Mar 16, 2026

Description

This PR ensures the MESH Downloader does not process or publish a message with the same meshMessageId more than once.

Context

Duplicate messages can occur internally due to retries or when a message is re-polled before it is removed from the MESH inbox.

Key changes

  • Added S3 conditional write (IfNoneMatch='*') to detect duplicates and raise DocumentAlreadyExistsError
  • Updated S3 key structure to include meshMessageId for guaranteed uniqueness
  • Skip event publishing on duplicates while still acknowledging messages
  • Added a mesh-duplicate-downloads CloudWatch metric
  • Added retry logic with backoff and jitter to EventPublisher (python) for transient failures
  • Updated unit and component tests to cover duplicate handling, retries, and new S3 key format

Duplicate message is not published, but is acknowledged:
[screenshots]

Duplicate metric:
[screenshot]
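The "retry logic with backoff and jitter" added to the EventPublisher could look roughly like the sketch below. The function and exception names are hypothetical, and exponential backoff with full jitter is one common reading of "backoff and jitter"; the actual EventPublisher implementation and tuning may differ.

```python
import random
import time

class TransientPublishError(Exception):
    """Stand-in for a retryable failure (throttling, timeout, 5xx)."""

def publish_with_retry(publish, event, max_attempts=3, base_delay=0.5, sleep=time.sleep):
    """Retry transient publish failures with exponential backoff and full jitter."""
    for attempt in range(1, max_attempts + 1):
        try:
            return publish(event)
        except TransientPublishError:
            if attempt == max_attempts:
                raise  # out of attempts: surface the failure to the caller
            # Full jitter: sleep a random amount up to an exponentially growing cap,
            # so retrying consumers do not hammer the broker in lockstep.
            sleep(random.uniform(0, base_delay * 2 ** (attempt - 1)))
```

The `sleep` parameter is injectable purely so the behaviour is testable without real delays.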

Type of changes

  • Refactoring (non-breaking change)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would change existing functionality)
  • Bug fix (non-breaking change which fixes an issue)

Checklist

  • I am familiar with the contributing guidelines
  • I have followed the code style of the project
  • I have added tests to cover my changes
  • I have updated the documentation accordingly
  • This PR is a result of pair or mob programming
  • If I have used the 'skip-trivy-package' label I have done so responsibly and in the knowledge that this is being fixed as part of a separate ticket/PR.

Sensitive Information Declaration

To ensure the utmost confidentiality and protect your and others' privacy, we kindly ask you NOT to include PII (Personally Identifiable Information) / PID (Personally Identifiable Data) or any other sensitive data in this PR (Pull Request) and the codebase changes. We will remove any PR that does contain sensitive information. We really appreciate your cooperation in this matter.

  • I confirm that neither PII/PID nor sensitive data are included in this PR and the codebase changes.

@lapenna-bjss lapenna-bjss added the skip-trivy-package Skip the Trivy Package Scan label Mar 16, 2026
@lapenna-bjss lapenna-bjss marked this pull request as ready for review March 25, 2026 14:55
@lapenna-bjss lapenna-bjss requested review from a team as code owners March 25, 2026 14:55
sidnhs previously approved these changes Mar 25, 2026
@aidenvaines-cgi (Contributor) left a comment:


You only need to do that if concatenating strings, etc.

simonlabarere previously approved these changes Mar 31, 2026
@simonlabarere simonlabarere dismissed stale reviews from sidnhs and themself via 3bf71d7 March 31, 2026 08:23
@nhsd-angel-pastor (Contributor) left a comment:


Just wondering whether we want to add, on the PII bucket, the If-None-Match header restriction to enforce that putObject requests include the header (https://docs.aws.amazon.com/AmazonS3/latest/userguide/conditional-writes-enforce.html).
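The AWS page linked above describes enforcing conditional writes with a bucket policy that denies `s3:PutObject` requests made without the header. A rough sketch of that kind of policy is below; the bucket name is a placeholder, and the exact condition key and operator (`s3:if-none-match` with a negated string match is my recollection of the documented pattern) should be verified against the linked AWS documentation before use.

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "DenyUnconditionalPutObject",
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::example-pii-bucket/*",
      "Condition": {
        "StringNotEquals": { "s3:if-none-match": "*" }
      }
    }
  ]
}
```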


Labels

skip-trivy-package Skip the Trivy Package Scan

Projects

None yet

Development

Successfully merging this pull request may close these issues.

5 participants