
test: add sql about time data type precision lost #4284

Open
wk989898 wants to merge 3 commits into master from wk989898-patch-4

Conversation


@wk989898 wk989898 commented Feb 26, 2026

What problem does this PR solve?

Issue Number: close #4283

What is changed and how it works?

Check List

Tests

  • Unit test
  • Integration test
  • Manual test (add detailed scripts or steps below)
  • No code

Questions

Will it cause performance regression or break compatibility?
Do you need to update user documentation, design documentation or monitoring documentation?

Release note

Please refer to [Release Notes Language Style Guide](https://pingcap.github.io/tidb-dev-guide/contribute-to-tidb/release-notes-style-guide.html) to write a quality release note.

If you don't think this PR needs a release note then fill it with `None`.

Summary by CodeRabbit

  • Tests
    • Added test coverage for handling time values used as primary keys, including inserts and deletes to validate behavior.
  • Improvements
    • Enhanced parsing/handling of date, time, timestamp and duration values to improve accuracy and consistency when importing or decoding CSV/open-format data.

@ti-chi-bot ti-chi-bot bot added the release-note Denotes a PR that will be considered when it comes time to generate release notes. label Feb 26, 2026
@gemini-code-assist

Summary of Changes

Hello @wk989898, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces a new integration test case to validate the correct handling of TIME data type precision, particularly when it serves as a primary key. The added test aims to prevent and detect potential data precision loss issues in such scenarios, ensuring data integrity for time-based primary keys.

Highlights

  • New Test Case for TIME Data Type Precision: A new SQL test case was added to test.sql to specifically verify the handling of TIME data type precision, especially when it is used as a primary key. This test includes creating a table with TIME(3) as a primary key, inserting values with high precision, and performing a delete operation to check data integrity.


Changelog
  • tests/integration_tests/common_1/data/test.sql
    • Added a new test case for TIME data type precision, including table creation, data insertion, and deletion.


coderabbitai bot commented Feb 26, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info

Configuration used: Organization UI

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 8e1aa27 and 8a1b719.

📒 Files selected for processing (2)
  • pkg/sink/codec/csv/csv_decoder.go
  • pkg/sink/codec/open/decoder.go

📝 Walkthrough

Adds test data reproducing a TIME primary-key scenario and adjusts CSV/open decoders to use tiTypes.MaxFsp for parsing date/time/duration values; also removes special-case decimal handling for duration in the open decoder.

Changes

  • Test data: TIME PK (tests/integration_tests/common_1/data/test.sql): adds a time_is_pk table (id TIME(3) PRIMARY KEY, t DATETIME DEFAULT CURRENT_TIMESTAMP), inserts two rows, and deletes one to reproduce the data-inconsistency scenario.
  • Sink CSV decoder (pkg/sink/codec/csv/csv_decoder.go): parsing for date/datetime/timestamp and duration values now uses types.MaxFsp instead of the per-field ft.GetDecimal(); no signature changes.
  • Open decoder (pkg/sink/codec/open/decoder.go): removed the special-case handling that set a default decimal for TypeDuration; ParseTime/ParseDuration now use tiTypes.MaxFsp rather than the field decimal. Error flow unchanged.
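The decoder change summarized above, parsing at the maximum fractional-seconds precision rather than the column's declared decimal, can be illustrated with a small standalone sketch. This is not the TiDB implementation: maxFsp and parseFrac below are hypothetical stand-ins for types.MaxFsp and the real parsing code, assuming round-half-up on the first dropped digit.

```go
package main

import (
	"fmt"
	"strconv"
)

// maxFsp mirrors TiDB's maximum fractional-seconds precision (6).
const maxFsp = 6

// parseFrac interprets a fractional-digits string at a given fsp, rounding
// away extra digits. A hypothetical sketch of the decode path, not TiDB code:
// parsing at the column's fsp can alter digits the upstream already emitted,
// while parsing at maxFsp keeps the string verbatim.
func parseFrac(digits string, fsp int) string {
	if len(digits) <= fsp {
		return digits // nothing to drop
	}
	kept, _ := strconv.Atoi(digits[:fsp])
	if digits[fsp] >= '5' { // round half up on the first dropped digit
		kept++ // carry past the kept digits is ignored in this sketch
	}
	return fmt.Sprintf("%0*d", fsp, kept)
}

func main() {
	// A wire value of ".0011" decoded at the column fsp (3) becomes ".001";
	// decoded at maxFsp it survives unchanged, matching what was sent.
	fmt.Println(parseFrac("0011", 3))      // 001
	fmt.Println(parseFrac("0011", maxFsp)) // 0011
}
```

The intuition: the encoded string is already the authoritative value, so re-rounding it to a per-field decimal during decode can only lose information, whereas maxFsp is always wide enough to accept it as-is.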

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes


Suggested reviewers

  • hongyunyan
  • lidezhu
  • 3AceShowHand

Suggested labels

lgtm, approved, size/XS

Poem

🐰 I nibble code where times align,
New tests and parsers, neat design,
Times parsed truer, rows checked twice,
A hop, a fix — bug's carrot, nice! 🥕

🚥 Pre-merge checks | ✅ 3 | ❌ 2

❌ Failed checks (1 warning, 1 inconclusive)

  • Docstring Coverage: ⚠️ Warning. Docstring coverage is 0.00%, below the required 80.00% threshold. Resolution: write docstrings for the functions missing them.
  • Description check: ❓ Inconclusive. The description references issue #4283 but lacks a detailed explanation of the changes; the 'What is changed and how it works?' section and the release note content are missing. Resolution: complete that section and provide a release note describing the test additions and bug-fix intent.
✅ Passed checks (3 passed)
  • Title check: ✅ Passed. The title accurately describes the main change: adding SQL test data for the TIME data type precision loss issue.
  • Linked Issues check: ✅ Passed. The code changes align with issue #4283; the test data matches the issue reproduction (TIME(3) PK with INSERT/DELETE), and the decoder modifications address TIME precision handling.
  • Out of Scope Changes check: ✅ Passed. All changes are in scope: test data for the TIME precision issue, CSV decoder precision updates, and open decoder TIME/duration handling all directly address issue #4283.

✏️ Tip: You can configure your own custom pre-merge checks in the settings.


@ti-chi-bot ti-chi-bot bot added the size/S Denotes a PR that changes 10-29 lines, ignoring generated files. label Feb 26, 2026

@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request adds a new integration test for an issue where using a time type with precision loss as a primary key could cause the downstream sink to panic. The added test case creates a table with a time(3) primary key and inserts a value with higher precision, which gets rounded. However, the test doesn't perform any subsequent UPDATE or DELETE operations on this specific row, which seems to be the core of the issue described in #4283. I've suggested an improvement to the test to make it more effective at catching the described bug.

Comment on lines +178 to +187

CREATE TABLE time_is_pk
(
id time(3) NOT NULL,
t datetime DEFAULT CURRENT_TIMESTAMP,
PRIMARY KEY (id)
);

INSERT INTO `time_is_pk`(id) VALUES ('517:51:04.777'),('-733:00:00.0011');
DELETE FROM `time_is_pk` WHERE id = '517:51:04.777';


Severity: medium

The added test case is a good start, but it may not fully cover the scenario described in issue #4283. The issue mentions that a panic can occur during DELETE or UPDATE operations on a row where the time primary key has lost precision. This test only inserts a row with a value that will be rounded (-733:00:00.0011) and then deletes a different, unrelated row.

To make the test more robust and ensure it covers the problematic case, it should perform UPDATE and DELETE operations on the row with the rounded key. This will better simulate the conditions that could lead to the panic.

CREATE TABLE time_is_pk
(
    id time(3) NOT NULL,
    t  int,
    PRIMARY KEY (id)
);
INSERT INTO `time_is_pk`(id, t) VALUES ('-733:00:00.0011', 1);
UPDATE `time_is_pk` SET t = 2 WHERE id = '-733:00:00.0011';
DELETE FROM `time_is_pk` WHERE id = '-733:00:00.0011';
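For intuition, the rounding that makes these literals interesting can be sketched in standalone Go. roundTimeLiteral below is a hypothetical helper, not TiDB's actual rounding logic; it assumes round-half-up on the first dropped digit and ignores carry past the kept digits.

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// roundTimeLiteral rounds the fractional part of a TIME literal to fsp
// digits, roughly the way a TIME(fsp) column rounds on storage. Sketch only:
// carry past the kept digits (e.g. ".9996" at fsp=3) is not handled.
func roundTimeLiteral(lit string, fsp int) string {
	dot := strings.IndexByte(lit, '.')
	if dot < 0 {
		return lit // no fractional part
	}
	base, frac := lit[:dot], lit[dot+1:]
	if len(frac) <= fsp {
		return lit // already within precision
	}
	kept, _ := strconv.Atoi(frac[:fsp])
	if frac[fsp] >= '5' { // round half up on the first dropped digit
		kept++
	}
	return fmt.Sprintf("%s.%0*d", base, fsp, kept)
}

func main() {
	// '517:51:04.777' fits TIME(3) exactly; '-733:00:00.0011' is rounded,
	// so the stored primary key differs from the inserted literal.
	fmt.Println(roundTimeLiteral("517:51:04.777", 3))   // 517:51:04.777
	fmt.Println(roundTimeLiteral("-733:00:00.0011", 3)) // -733:00:00.001
}
```

Because the stored key no longer equals the inserted literal, any path that re-derives the key at a different precision can target a row that never existed, which appears to be the mismatch the suggested UPDATE/DELETE on the rounded row would exercise.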


@coderabbitai coderabbitai bot left a comment


🧹 Nitpick comments (1)
tests/integration_tests/common_1/data/test.sql (1)

178-187: Add a comment to explain the purpose of this test.

Other test sections in this file include descriptive comments (e.g., -- column null test, -- bit column, -- recover test). Consider adding a similar comment here to document the test purpose and reference the linked issue for future maintainability.

📝 Suggested comment addition
 UPDATE `column_is_null`
 SET t = NULL
 WHERE id = 1;

+-- time as primary key precision test
+-- Test issue: https://github.com/pingcap/tiflow/issues/4283
 CREATE TABLE time_is_pk
 (
     id time(3) NOT NULL,
     t  datetime DEFAULT CURRENT_TIMESTAMP,
     PRIMARY KEY (id)
 );

 INSERT INTO `time_is_pk`(id) VALUES ('517:51:04.777'),('-733:00:00.0011');
 DELETE FROM `time_is_pk` WHERE id = '517:51:04.777';
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In `@tests/integration_tests/common_1/data/test.sql` around lines 178 - 187, Add a
descriptive SQL comment above the block that explains this test creates table
time_is_pk (with a time(3) primary key and a datetime default), inserts
boundary/negative time values and then deletes one row; mention the purpose
(verifying time type as primary key, handling of negative/overflow time
literals) and include the related issue reference/ID for future maintainability
so readers understand why values like '517:51:04.777' and '-733:00:00.0011' are
used.

ℹ️ Review info

Configuration used: defaults

Review profile: CHILL

Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 5129f10 and 8e1aa27.

📒 Files selected for processing (1)
  • tests/integration_tests/common_1/data/test.sql

@wk989898 (Collaborator, Author)

/test mysql

@ti-chi-bot ti-chi-bot bot added needs-1-more-lgtm Indicates a PR needs 1 more LGTM. approved labels Feb 27, 2026
@ti-chi-bot ti-chi-bot bot added the lgtm label Feb 27, 2026

ti-chi-bot bot commented Feb 27, 2026

[APPROVALNOTIFIER] This PR is APPROVED

This pull-request has been approved by: hongyunyan, lidezhu

The full list of commands accepted by this bot can be found here.

The pull request process is described here

Details: Needs approval from an approver in each of these files:

Approvers can indicate their approval by writing /approve in a comment
Approvers can cancel approval by writing /approve cancel in a comment

@ti-chi-bot ti-chi-bot bot removed the needs-1-more-lgtm Indicates a PR needs 1 more LGTM. label Feb 27, 2026

ti-chi-bot bot commented Feb 27, 2026

[LGTM Timeline notifier]

Timeline:

  • 2026-02-27 06:22:03.794886066 +0000 UTC m=+422396.309680675: ☑️ agreed by lidezhu.
  • 2026-02-27 06:22:29.972018759 +0000 UTC m=+422422.486813368: ☑️ agreed by hongyunyan.

@wk989898 (Collaborator, Author)

/retest


wk989898 commented Mar 2, 2026

/retest


wk989898 commented Mar 2, 2026

/retest


ti-chi-bot bot commented Mar 2, 2026

@wk989898: The following tests failed, say /retest to rerun all failed tests or /retest-required to rerun all mandatory failed tests:

  • pull-cdc-kafka-integration-light (commit 8a1b719, required: unknown). Rerun command: /test pull-cdc-kafka-integration-light
  • pull-cdc-pulsar-integration-heavy (commit 8a1b719, required: unknown). Rerun command: /test pull-cdc-pulsar-integration-heavy




Labels

approved lgtm release-note Denotes a PR that will be considered when it comes time to generate release notes. size/S Denotes a PR that changes 10-29 lines, ignoring generated files.


Development

Successfully merging this pull request may close these issues.

data inconsistent when time data type is pk

3 participants