1 change: 1 addition & 0 deletions .claude/commands/commit.md
19 changes: 19 additions & 0 deletions .claude/settings.json
@@ -0,0 +1,19 @@
{
  "permissions": {
    "allow": [
      "Bash(cat:*)",
      "Bash(make test:*)",
      "Bash(tee:*)",
      "Bash(echo:*)",
      "Bash(git show:*)",
      "Bash(git log:*)",
      "Bash(ls:*)",
      "Bash(find:*)",
      "Bash(git checkout:*)",
      "Bash(head:*)"
    ],
    "additionalDirectories": [
      "../pgxntool-test/"
    ]
  }
}
4 changes: 4 additions & 0 deletions .gitattributes
@@ -1,4 +1,8 @@
.gitattributes export-ignore
.claude/ export-ignore
CLAUDE.md export-ignore
PLAN-*.md export-ignore
.DS_Store export-ignore
*.asc export-ignore
*.adoc export-ignore
*.html export-ignore
1 change: 1 addition & 0 deletions .gitignore
@@ -1 +1,2 @@
.*.swp
.claude/*.local.json
279 changes: 279 additions & 0 deletions CLAUDE.md
@@ -0,0 +1,279 @@
# CLAUDE.md

This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.

## Git Commit Guidelines

**IMPORTANT**: When creating commit messages, do not attribute commits to yourself (Claude). Commit messages should reflect the work being done without AI attribution in the message body. The standard Co-Authored-By trailer is acceptable.

## Critical: What This Repo Actually Is

**pgxntool is NOT a standalone project.** It is a meta-framework that exists ONLY to be embedded into PostgreSQL extension projects via `git subtree`. This repo cannot be built, tested, or run directly.

**Think of it like this**: pgxntool is to PostgreSQL extensions what a Makefile template library is to C projects - it's infrastructure code that gets copied into other projects, not a project itself.

## Critical: Directory Purity - NO Temporary Files

**This directory contains ONLY files that get embedded into extension projects.** When extension developers run `git subtree add`, they pull the entire pgxntool directory into their project.

**ABSOLUTE RULE**: NO temporary files, scratch work, or development tools may be added to this directory.

**Examples of what NEVER belongs here:**
- Temporary files (scratch notes, test output, debugging artifacts)
- Development scripts or tools (these go in pgxntool-test/)
- Planning documents (PLAN-*.md files go in pgxntool-test/)
- Any file you wouldn't want in every extension project that uses pgxntool

**CLAUDE.md exception**: CLAUDE.md exists here for AI assistant guidance, but is excluded from distributions via `.gitattributes export-ignore`. Same with `.claude/` directory.

**Why this matters**: Any file you add here will be pulled into hundreds of extension projects via git subtree. Keep this directory lean and clean.

## Development Workflow: Work from pgxntool-test

**CRITICAL**: All development work on pgxntool should be done from the `../pgxntool-test/` repository, NOT from this repository.

**Why?** Because:
- This repository contains ONLY the framework files that get embedded into extension projects
- All development tools, test infrastructure, and convenience scripts live in `../pgxntool-test/`
- Adding development tools here would pollute every extension project that uses pgxntool

**What this means**:
- Clone and work from `../pgxntool-test/` when developing pgxntool
- Edit files in `../pgxntool/` but run all commands from `../pgxntool-test/`
- All testing, documentation of testing, and development utilities belong in pgxntool-test

## Two-Repository Development Pattern

This codebase uses a two-repository testing pattern:

1. **pgxntool/** (this repo) - The framework code that gets embedded into extension projects
2. **../pgxntool-test/** - The test harness that creates test repositories, exercises pgxntool's functionality, and validates outputs

The test harness contains template extension files in `../pgxntool-test/template/` which are used to create fresh test repositories.

**To test changes to pgxntool**, you must:
1. Ensure changes are visible to pgxntool-test (via git commit or using the rsync mechanism in the test harness)
2. Run tests from `../pgxntool-test/`
3. The test harness will create a fresh test repository from the template, sync in pgxntool changes, and validate behavior

## How Extension Developers Use pgxntool

Extension projects include pgxntool via git subtree:

```bash
git subtree add -P pgxntool --squash git@github.com:decibel/pgxntool.git release
pgxntool/setup.sh
```

After setup, their Makefile typically contains just:
```makefile
include pgxntool/base.mk
```

## Architecture: Two-Phase Build System

### Phase 1: Meta Generation (`build_meta.sh`)
- Processes `META.in.json` (template with placeholders/empty values)
- Strips out X_comment fields and empty values
- Produces clean `META.json`
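
For illustration, the transformation looks roughly like this (the field names below are hypothetical, not copied from pgxntool's template):

```bash
# Hypothetical META.in.json fragment with a comment field and an empty value.
cat > META.in.json <<'EOF'
{
   "name": "my_extension",
   "abstract": "Example extension",
   "X_comment": "Fill in or remove the optional fields before releasing",
   "release_status": ""
}
EOF

./pgxntool/build_meta.sh   # assumed direct invocation; normally run for you by make

# The generated META.json keeps only the populated fields:
#   { "name": "my_extension", "abstract": "Example extension" }
```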

### Phase 2: Variable Extraction (`meta.mk.sh`)
- Parses `META.json` using `JSON.sh` (a bash-based JSON parser)
- Generates `meta.mk` with Make variables:
  - `PGXN` - distribution name
  - `PGXNVERSION` - version number
  - `EXTENSIONS` - list of extensions provided
  - `EXTENSION_*_VERSION` - per-extension versions
  - `EXTENSION_VERSION_FILES` - auto-generated versioned SQL files
- `base.mk` includes `meta.mk` via `-include`
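
As a hypothetical example (values invented for illustration), the generated `meta.mk` contains assignments along these lines:

```bash
# Inspect the generated file; the real variable values come from META.json.
cat meta.mk
# PGXN = my_extension
# PGXNVERSION = 1.2.3
# EXTENSIONS = my_extension
# EXTENSION_my_extension_VERSION = 1.2.3
# EXTENSION_VERSION_FILES = sql/my_extension--1.2.3.sql
```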

### The Magic of base.mk

`base.mk` provides a complete PGXS-based build system:
- Auto-detects extension SQL files in `sql/`
- Auto-detects C modules in `src/*.c`
- Auto-detects tests in `test/sql/*.sql`
- Auto-generates versioned extension files (`extension--version.sql`)
- Handles Asciidoc → HTML conversion
- Integrates with PGXN distribution format
- Manages git tagging and release packaging

## File Structure for Consumer Projects

Projects using pgxntool follow this layout:
```
project/
├── Makefile              # include pgxntool/base.mk
├── META.in.json          # Template metadata (customize for your extension)
├── META.json             # Auto-generated from META.in.json
├── extension.control     # Standard PostgreSQL control file
├── pgxntool/             # This repo, embedded via git subtree
├── sql/
│   └── extension.sql     # Base extension SQL
├── src/                  # Optional C code (*.c files)
├── test/
│   ├── deps.sql          # Load extension and test dependencies
│   ├── sql/*.sql         # Test SQL files
│   └── expected/*.out    # Expected test outputs
└── doc/                  # Optional docs (*.adoc, *.asciidoc)
```

## Commands for Extension Developers (End Users)

These are the commands extension developers use (documented for context):

```bash
make                 # Build extension (generates versioned SQL, docs)
make test            # Full test: testdeps → install → installcheck → show diffs
make results         # Run tests and update expected output files
make html            # Generate HTML from Asciidoc sources
make tag             # Create git branch for current META.json version
make dist            # Create PGXN .zip (auto-tags, places in ../)
make pgtle           # Generate pg_tle registration SQL (see pg_tle Support below)
make check-pgtle     # Check pg_tle installation and report version
make install-pgtle   # Install pg_tle registration SQL files into database
make pgxntool-sync   # Update to latest pgxntool via git subtree pull
```
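
A typical session using only the targets above might look like this (assuming PostgreSQL is running and `PG*` environment variables are set):

```bash
make          # build versioned SQL and docs
make test     # install test dependencies, install, run installcheck, show diffs
make dist     # tag the release and create ../<distribution>-<version>.zip
```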

## Testing with pgxntool

### Critical Testing Rules

**NEVER use `make installcheck` directly**. Always use `make test` instead. The `make test` target ensures:
- Clean builds before testing
- Proper test isolation
- Correct test dependency installation
- Proper cleanup and result comparison

**Database Connection Requirement**: PostgreSQL must be running before executing `make test`. If you get connection errors (e.g., "could not connect to server"), stop and ask the user to start PostgreSQL.

**Claude Code MUST NEVER run `make results`**. This target updates test expected output files and requires manual human verification of test changes before execution. The workflow is:
1. Human runs `make test` and examines diffs
2. Human manually verifies changes are correct
3. Human manually runs `make results` to update expected files

### Test Output Mechanics

pgxntool uses PostgreSQL's pg_regress test framework:
- **Actual test output**: Written to `test/results/` directory
- **Expected output**: Stored in `test/expected/` directory
- **Test comparison**: `make test` compares actual vs expected and shows diffs
- **Updating expectations**: `make results` copies `test/results/` → `test/expected/`

When tests fail, examine the diff output carefully. The actual test output in `test/results/` shows what your code produced, while `test/expected/` shows what was expected.
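
To inspect a single failing test by hand, a direct comparison works along these lines (the test name `base` is hypothetical):

```bash
# Show what the code actually produced versus what was expected.
diff -u test/expected/base.out test/results/base.out
```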

## Development Workflow (for pgxntool Contributors)

When modifying pgxntool:

1. **Make changes** in this repo (pgxntool/)
2. **Test changes** by running `make test` in `../pgxntool-test/`
   - The test harness creates a fresh test repository from `../pgxntool-test/template/`
   - If pgxntool is dirty, it rsyncs your uncommitted changes
   - Runs setup, builds, tests, and validates outputs
3. **Examine results** in `../pgxntool-test/results/` and `../pgxntool-test/diffs/`
4. **Update expected outputs** if needed: `make sync-expected` in `../pgxntool-test/`
5. **Commit changes** once tests pass
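
A sketch of that loop, using the targets and directories named above:

```bash
cd ../pgxntool-test
make test              # builds a fresh test repo and rsyncs in uncommitted pgxntool changes
ls results/ diffs/     # examine outputs and any differences from expected
make sync-expected     # only after manually verifying the diffs are correct
cd ../pgxntool
git commit -a          # commit once tests pass
```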

## Key Implementation Details

### PostgreSQL Version Handling
- `MAJORVER` = version × 10 (e.g., 9.6 → 96, 13 → 130)
- Tests use `--load-language=plpgsql` for versions < 13
- Version detection via `pg_config --version`
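
A shell sketch of that calculation (not pgxntool's actual code, which lives in its makefiles and scripts):

```bash
# Derive a MAJORVER-style number from pg_config: 9.6 -> 96, 13 -> 130, 16 -> 160.
ver=$(pg_config --version | awk '{print $2}')     # e.g. "16.2" or "9.6.24"
case "$ver" in
  9.*) majorver="$(echo "$ver" | cut -d. -f1)$(echo "$ver" | cut -d. -f2)" ;;  # 9.6 -> 96
  *)   majorver=$(( ${ver%%.*} * 10 )) ;;                                      # 13 -> 130
esac
echo "MAJORVER=$majorver"
```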

### Test System (pg_regress based)
- Tests in `test/sql/*.sql`, outputs compared to `test/expected/*.out`
- Setup via `test/pgxntool/setup.sql` (loads pgTap and deps.sql)
- `.IGNORE: installcheck` prevents build failures on test errors
- `make results` updates expected outputs after test runs
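
A minimal pgTap-style test file, purely to illustrate where tests live and how they are compared (the file name and assertion are hypothetical):

```bash
mkdir -p test/sql
cat > test/sql/basic.sql <<'EOF'
SELECT plan(1);
SELECT ok(true, 'test scaffolding works');
SELECT * FROM finish();
EOF
# pg_regress compares this test's output against test/expected/basic.out
```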

### Document Generation
- Auto-detects `asciidoctor` or `asciidoc`
- Generates HTML from `*.adoc` and `*.asciidoc` in `$(DOC_DIRS)`
- HTML required for `make dist`, optional for `make install`
- Template-based rules via `ASCIIDOC_template`

### Distribution Packaging
- `make dist` creates `../PGXN-VERSION.zip`
- Always creates git branch tag matching version
- Uses `git archive` to package
- Validates repo is clean before tagging
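
Roughly speaking (the names below are placeholders, and `make dist` automates the tagging and naming), the packaging step is equivalent to:

```bash
# Archive the version-tagged tree into the parent directory.
git archive --format=zip --prefix=my_extension-1.2.3/ \
    -o ../my_extension-1.2.3.zip 1.2.3
```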

### Subtree Sync Support
- `make pgxntool-sync` pulls latest release
- Multiple sync targets: release, stable, local variants
- Uses `git subtree pull --squash`
- Requires clean repo (no uncommitted changes)
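
The sync target wraps a subtree pull along these lines, mirroring the `git subtree add` command shown earlier:

```bash
# Roughly what `make pgxntool-sync` runs for the release branch; requires a clean tree.
git subtree pull -P pgxntool --squash git@github.com:decibel/pgxntool.git release
```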

### pg_tle Support

pgxntool can generate pg_tle (Trusted Language Extensions) registration SQL for deploying extensions in AWS RDS/Aurora without filesystem access.

**Usage:** `make pgtle` or `make pgtle PGTLE_VERSION=1.5.0+`

**Output:** `pg_tle/{version_range}/{extension}.sql`

**Version ranges:**
- `1.0.0-1.5.0` - pg_tle 1.0.0 through 1.4.x (no schema parameter)
- `1.5.0+` - pg_tle 1.5.0 and later (schema parameter support)
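
For example, for a project providing a single (hypothetical) extension `my_extension`:

```bash
make pgtle PGTLE_VERSION=1.5.0+   # generate registration SQL for pg_tle 1.5.0 and later
ls pg_tle/1.5.0+/
# my_extension.sql
```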

**Installation targets:**

- `make check-pgtle` - Checks if pg_tle is installed and reports the version. Reports version from `pg_extension` if extension has been created, or newest available version from `pg_available_extension_versions` if available but not created. Errors if pg_tle not available in cluster. Assumes `PG*` environment variables are configured.

- `make install-pgtle` - Auto-detects pg_tle version and installs appropriate registration SQL files. Updates or creates pg_tle extension as needed. Determines which version range files to install based on detected version. Runs all generated SQL files via `psql` to register extensions with pg_tle. Assumes `PG*` environment variables are configured.

**Version notation:**
- `X.Y.Z+` means >= X.Y.Z
- `X.Y.Z-A.B.C` means >= X.Y.Z and < A.B.C (note the upper bound is exclusive)

**Key implementation details:**
- Script: `pgxntool/pgtle-wrap.sh` (bash)
- Parses `.control` files for metadata (NOT META.json)
- Fixed delimiter: `$_pgtle_wrap_delimiter_$` (validated not in source)
- Each output file contains ALL versions and ALL upgrade paths
- Multi-extension support (multiple .control files)
- Output directory `pg_tle/` excluded from git
- Depends on `make all` to ensure versioned SQL files exist first
- Only processes versioned files (`sql/{ext}--{version}.sql`), not base files

**SQL file handling:**
- **Version files** (`sql/{ext}--{version}.sql`): Generated automatically by `make all` from base `sql/{ext}.sql` file
- **Upgrade scripts** (`sql/{ext}--{v1}--{v2}.sql`): Created manually by users when adding new extension versions
- The script ensures the default_version file exists if the base file exists (creates it from base file if missing)
- All version files and upgrade scripts are discovered and included in the generated pg_tle registration SQL
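
Concretely, for a hypothetical extension `my_extension` whose default version is 1.1.0, the `sql/` directory might contain:

```bash
ls sql/
# my_extension.sql                  base file, maintained by hand
# my_extension--1.0.0.sql           versioned file from an earlier release
# my_extension--1.1.0.sql           generated by `make all` from the base file
# my_extension--1.0.0--1.1.0.sql    upgrade script, written by hand
```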

**Dependencies:**
Generated files depend on:
- Control file (metadata source)
- All SQL files (sql/{ext}--*.sql) - must run `make all` first
- Generator script itself

**Limitations:**
- No C code support (pg_tle requires trusted languages only)
- PostgreSQL 14.5+ required (pg_tle not available on earlier versions)

See `README-pgtle.md` for complete user documentation.

## Critical Gotchas

1. **Empty Variables**: If `DOCS` or `MODULES` would be empty, base.mk explicitly sets them to empty values to prevent PGXS errors
2. **testdeps Pattern**: Never add recipes to `testdeps` - create separate target and make it a prerequisite
3. **META.json is Generated**: Always edit `META.in.json`, never `META.json` directly
4. **Control File Versions**: No automatic validation that `.control` matches `META.json` version
5. **PGXNTOOL_NO_PGXS_INCLUDE**: Setting this skips PGXS inclusion (for special scenarios)
6. **Distribution Placement**: `.zip` files go in parent directory (`../`) to avoid repo clutter
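
As an example of gotcha 2, the testdeps pattern looks like this (the `install_pgtap` target and the use of the `pgxn` client are hypothetical):

```bash
# Declare a separate target and hook it into testdeps as a prerequisite;
# never add recipe lines to testdeps itself. The recipe line must start with a tab.
cat >> Makefile <<'EOF'
testdeps: install_pgtap

install_pgtap:
	pgxn install pgtap
EOF
```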

## Scripts

- **setup.sh** - Initializes pgxntool in a new extension project (copies templates, creates directories)
- **build_meta.sh** - Strips empty fields from META.in.json to create META.json
- **meta.mk.sh** - Parses META.json via JSON.sh and generates meta.mk with Make variables
- **JSON.sh** - Third-party bash JSON parser (MIT licensed)
- **safesed** - Utility for safe sed operations

## Related Repositories

- **../pgxntool-test/** - Test harness for validating pgxntool functionality (includes template extension files in `template/` directory)
- Never produce any kind of metrics or estimates unless you have data to back them up. If you do have data you MUST reference it.