[WIP] Cherry pick two commits into release/1.7 branch #3088
Closed
Copilot wants to merge 36 commits into release/1.7 from
This pull request significantly improves the `README.md` by restructuring and expanding the documentation. --------- Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
## Why make this change?

We recently noticed that if we have a column of type NVARCHAR or VARCHAR and run a stored procedure that reads a row in which that column holds an empty string, we get an internal server error. The error occurs when we call the `GetChars` method with a buffer of length 0. This PR fixes that problem.

## What is this change?

We made a small change to the method that was throwing the exception. If `resultFieldSize` is 0 (meaning the data in the cell being read has length 0), we do not call `GetChars` to read the data; instead we assume the data is empty and return 0 as the number of bytes read. As the example below shows, this fixes the issue.

## How was this tested?

- [x] Integration Tests
- [x] Unit Tests

## Sample Request(s)

We have a table with an NVARCHAR column called "Description". In one of the rows, Description is an empty string.

<img width="2258" height="494" alt="image" src="https://github.com/user-attachments/assets/857b9d93-e1e1-4c4b-b802-70693037402e" />

**Before the changes:** running a stored procedure that reads the empty cell returns an error.

<img width="2783" height="1245" alt="image" src="https://github.com/user-attachments/assets/adc80578-2532-4f71-b781-f4bee8798334" />

**After the changes:** the stored procedure runs as expected.

<img width="2776" height="1214" alt="image" src="https://github.com/user-attachments/assets/bd4e0e2d-de34-4a20-8805-7a676f40de15" />

---------

Co-authored-by: Giovanna Ribeiro <gribeiro@microsoft.com>
Co-authored-by: RubenCerna2079 <32799214+RubenCerna2079@users.noreply.github.com>
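The zero-length guard described above can be sketched as follows. This is a minimal illustration, not the actual DAB source: the method name `ReadFieldChars` and the reader delegate are assumptions standing in for the real `GetChars` call site.

```csharp
using System;

class Program
{
    // Guard: when the cell's data length is 0, skip the reader call and
    // report 0 bytes read instead of calling GetChars with an empty buffer.
    static long ReadFieldChars(Func<char[], long> getChars, int resultFieldSize)
    {
        if (resultFieldSize == 0)
        {
            return 0; // empty string in the cell: nothing to read
        }

        char[] buffer = new char[resultFieldSize];
        return getChars(buffer);
    }

    static void Main()
    {
        // Simulated reader that reports how many chars it copied.
        Func<char[], long> fakeReader = buf => buf.Length;

        Console.WriteLine(ReadFieldChars(fakeReader, 0)); // 0 (reader never invoked)
        Console.WriteLine(ReadFieldChars(fakeReader, 5)); // 5
    }
}
```

The key design point is that the reader delegate is never invoked for a zero-length field, so a zero-length buffer is never passed to `GetChars`.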
MCP `list_tools` descriptions were insufficiently clear for ChatGPT to
understand tool usage patterns and workflows, while Claude handled them
adequately.
## Changes
Updated descriptions and input schemas for all 6 MCP tools:
- **describe_entities**: Added "ALWAYS CALL FIRST" directive and
clarified permissions structure (`'ALL'` expands by type: data→CREATE,
READ, UPDATE, DELETE). Expanded `nameOnly` and `entities` parameter
descriptions to include detailed usage guidance:
- `nameOnly`: Explains when to use it (for discovery with many
entities), the two-call strategy (first with `nameOnly=true`, then with
specific entities), and warns that it doesn't provide enough detail for
CRUD/EXECUTE operations
- `entities`: Clarifies its purpose for targeted metadata retrieval and
explicitly warns against combining it with `nameOnly=true`
- **CRUD tools** (create_record, read_records, update_record,
delete_record): Added explicit STEP 1→STEP 2 workflow (describe_entities
first, then call with matching permissions/fields)
- **execute_entity**: Added workflow guidance and clarified use case
(actions/computed results)
- **All tools**: Condensed parameter descriptions (e.g., "Comma-separated field names" instead of "A comma-separated list of field names to include in the response. If omitted, all fields are returned. Optional.")
## Example
Before:
```csharp
Description = "Creates a new record in the specified entity."
```
After:
```csharp
Description = "STEP 1: describe_entities -> find entities with CREATE permission and their fields. STEP 2: call this tool with matching field names and values."
```
All changes are metadata-only; no functional code modified.
<!-- START COPILOT CODING AGENT SUFFIX -->
<details>
<summary>Original prompt</summary>
----
*This section details the original issue you should resolve*
<issue_title>[BUG]: MCP `list_tools` need more comprehensive
descriptions.</issue_title>
<issue_description>## What?
Our tools already have descriptions, but they need better ones to help models.
> Claude works, but ChatGPT struggles with our current descriptions.
## New descriptions
```json
{
"tools": [
{
"name": "describe_entities",
"description": "Lists all entities and metadata. ALWAYS CALL FIRST. Each entity includes: name, type, fields, parameters, and permissions. The permissions array defines which tools are allowed. 'ALL' expands by type: data->CREATE, READ, UPDATE, DELETE.",
"inputSchema": {
"type": "object",
"properties": {
"nameOnly": {
"type": "boolean",
"description": "True: names and summaries only. False (default): full metadata."
},
"entities": {
"type": "array",
"items": { "type": "string" },
"description": "Optional: specific entity names. Omit for all."
}
}
}
},
{
"name": "create_record",
"description": "STEP 1: describe_entities -> find entities with CREATE permission and their fields. STEP 2: call this tool with matching field names and values.",
"inputSchema": {
"type": "object",
"properties": {
"entity": {
"type": "string",
"description": "Entity name with CREATE permission."
},
"data": {
"type": "object",
"description": "Required fields and values for the new record."
}
},
"required": ["entity", "data"]
}
},
{
"name": "read_records",
"description": "STEP 1: describe_entities -> find entities with READ permission and their fields. STEP 2: call this tool with select, filter, sort, or pagination options.",
"inputSchema": {
"type": "object",
"properties": {
"entity": {
"type": "string",
"description": "Entity name with READ permission."
},
"select": {
"type": "string",
"description": "Comma-separated field names."
},
"filter": {
"type": "string",
"description": "OData expression: eq, ne, gt, ge, lt, le, and, or, not."
},
"orderby": {
"type": "array",
"items": { "type": "string" },
"description": "Sort fields and directions, e.g., ['name asc', 'year desc']."
},
"first": {
"type": "integer",
"description": "Max number of records (page size)."
},
"after": {
"type": "string",
"description": "Cursor token for next page."
}
},
"required": ["entity"]
}
},
{
"name": "update_record",
"description": "STEP 1: describe_entities -> find entities with UPDATE permission and their key fields. STEP 2: call this tool with keys and new field values.",
"inputSchema": {
"type": "object",
"properties": {
"entity": {
"type": "string",
"description": "Entity name with UPDATE permission."
},
"keys": {
"type": "object",
"description": "Primary or composite keys identifying the record."
},
"fields": {
"type": "object",
"description": "Fields and their new values."
}
},
"required": ["entity", "keys", "fields"]
}
},
{
"name": "delete_record",
"description": "STEP 1: describe_entities -> find entities with DELETE permission and their key fields. STEP 2: call this tool with full key values.",
"inputSchema": {
"type": "object",
"properties": {
"entity": {
"type": "string",
"description": "Entity name with DELETE permission."
},
"keys": {
"type": "object",
"description": "All key fields identifying the record."
}
},
"required": ["entity", "keys"]
}
},
{
"name": "execute_entity",
"description": "STEP 1: describe_entities -> find entities with EXECUTE permission and their parameters. STEP 2: call this tool with matching parameter values. Used for entities that perform actions or return computed results.",
"inputSchema": {
"type": "object",
"properties": {
"entity": {
"type": "string",
"description": "Entity name with EXECUTE permission."
},
"parameters": {
"type": "object",
"description": "Optional parameter names and values."
}
},
"required": ["entity"]
}
}
]
}
```
</issue_description>
<agent_instruction...
</details>
- Fixes #2936
---------
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: JerryNixon <1749983+JerryNixon@users.noreply.github.com>
Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
Bumps [dotnet-sdk](https://github.com/dotnet/sdk) from 8.0.414 to 8.0.415.

<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/dotnet/sdk/releases">dotnet-sdk's releases</a>.</em></p>
<blockquote>
<h2>.NET 8.0.21</h2>
<p><a href="https://github.com/dotnet/core/releases/tag/v8.0.21">Release</a></p>
<h2>What's Changed</h2>
<ul>
<li>Update branding to 8.0.415 by <a href="https://github.com/vseanreesermsft"><code>@vseanreesermsft</code></a> in <a href="https://redirect.github.com/dotnet/sdk/pull/50585">dotnet/sdk#50585</a></li>
<li>[release/8.0.4xx] Update dependencies from dotnet/source-build-reference-packages by <a href="https://github.com/dotnet-maestro"><code>@dotnet-maestro</code></a>[bot] in <a href="https://redirect.github.com/dotnet/sdk/pull/50539">dotnet/sdk#50539</a></li>
<li>[release/8.0.4xx] Update dependencies from dotnet/templating by <a href="https://github.com/dotnet-maestro"><code>@dotnet-maestro</code></a>[bot] in <a href="https://redirect.github.com/dotnet/sdk/pull/50357">dotnet/sdk#50357</a></li>
<li>[release/8.0.4xx] Update dependencies from dotnet/msbuild by <a href="https://github.com/dotnet-maestro"><code>@dotnet-maestro</code></a>[bot] in <a href="https://redirect.github.com/dotnet/sdk/pull/50381">dotnet/sdk#50381</a></li>
<li>[8.0.4xx] detect .NET 10 RID-specific tools and provide a more actionable error by <a href="https://github.com/baronfel"><code>@baronfel</code></a> in <a href="https://redirect.github.com/dotnet/sdk/pull/50414">dotnet/sdk#50414</a></li>
<li>Merging internal commits for release/8.0.4xx by <a href="https://github.com/vseanreesermsft"><code>@vseanreesermsft</code></a> in <a href="https://redirect.github.com/dotnet/sdk/pull/50710">dotnet/sdk#50710</a></li>
</ul>
<p><strong>Full Changelog</strong>: <a href="https://github.com/dotnet/sdk/compare/v8.0.414...v8.0.415">https://github.com/dotnet/sdk/compare/v8.0.414...v8.0.415</a></p>
</blockquote>
</details>

Individual commits are viewable in the <a href="https://github.com/dotnet/sdk/compare/v8.0.414...v8.0.415">compare view</a>.

Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
Co-authored-by: RubenCerna2079 <32799214+RubenCerna2079@users.noreply.github.com>
…2927)

## Why make this change?

Bug fix for setting DML tools correctly. Closes #2942.

## What is this change?

This fix addresses an inconsistency in config where `dml-tools` can be set either as a bool or as a dictionary object containing individual tool names and their bool values. Without this fix, each tool under `dml-tools` has to be set to true individually. With this change, the following scenarios are supported:

- Setting `dml-tools` to `true` enables all DML tools (the default is true). Sample: `"dml-tools": true`
- Setting `dml-tools` to `false` disables all DML tools. Sample: `"dml-tools": false`
- Individual tool names can be specified as a dictionary object for `dml-tools` to turn them on (the default is true if a tool is not specified).

## How was this tested?

- [ ] Integration Tests
- [ ] Unit Tests
- [x] Manual testing using various combinations and scenarios

**Scenario 1:** Enable all tools using a single boolean value: `"dml-tools": true`. Note: the default is true, so even if `dml-tools` is unspecified it defaults to true.

**Scenario 2:** Disable `execute-entities` only and keep the other tools enabled by default.

```json
"dml-tools": { "execute-entities": false }
```

**Scenario 3:** Use the full list of tools and enable or disable them individually.

```json
"dml-tools": {
  "execute-entity": true,
  "delete-record": false,
  "update-record": false,
  "read-records": false,
  "describe-entities": true
}
```

---------

Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
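The bool-or-dictionary behavior described above can be sketched with `System.Text.Json`. This is an illustrative stand-in, not the actual DAB converter: the helper name `ParseDmlTools` and the tool-name list are assumptions.

```csharp
using System;
using System.Collections.Generic;
using System.Text.Json;

class Program
{
    // Returns a map of tool name -> enabled. Unlisted tools default to true,
    // a bare boolean applies to every tool, and an object overrides per tool.
    static Dictionary<string, bool> ParseDmlTools(string json, string[] knownTools)
    {
        var result = new Dictionary<string, bool>();
        foreach (string tool in knownTools)
            result[tool] = true; // default: enabled

        using JsonDocument doc = JsonDocument.Parse(json);
        JsonElement root = doc.RootElement;

        if (root.ValueKind == JsonValueKind.True || root.ValueKind == JsonValueKind.False)
        {
            bool all = root.GetBoolean();
            foreach (string tool in knownTools)
                result[tool] = all; // global on/off
        }
        else if (root.ValueKind == JsonValueKind.Object)
        {
            foreach (JsonProperty p in root.EnumerateObject())
                result[p.Name] = p.Value.GetBoolean(); // per-tool override
        }

        return result;
    }

    static void Main()
    {
        string[] tools = { "create-record", "read-records", "execute-entity" };

        var allOff = ParseDmlTools("false", tools);
        Console.WriteLine(allOff["read-records"]); // False

        var partial = ParseDmlTools("{\"execute-entity\": false}", tools);
        Console.WriteLine(partial["execute-entity"]); // False
        Console.WriteLine(partial["read-records"]);   // True
    }
}
```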
#2951) The MCP `describe_entities` tool returns `"ALL"` for wildcard permissions, which confuses LLM consumers that need explicit operation lists.

### Changes

Modified `DescribeEntitiesTool.BuildPermissionsInfo()` to expand `EntityActionOperation.All`:

- **Tables/Views**: Expands to `["CREATE", "DELETE", "READ", "UPDATE"]` via `EntityAction.ValidPermissionOperations`
- **Stored Procedures**: Expands to `["EXECUTE"]` via `EntityAction.ValidStoredProcedurePermissionOperations`

### Example

**Before:**

```json
{ "name": "Todo", "permissions": ["ALL"] }
```

**After:**

```json
{ "name": "Todo", "permissions": ["CREATE", "DELETE", "READ", "UPDATE"] }
```

<details>
<summary>Original prompt</summary>

<issue_title>[Bug]: MCP `describe_entities` permissions value `ALL` needs to be expanded.</issue_title>
<issue_description>
## What?

Models are confused by `ALL`.

```json
{
  "entities": [
    {
      "name": "Todo",
      "description": "This table contains the list of todo items.",
      "fields": [ ],
      "permissions": [
        "ALL" // this is the problem
      ]
    }
  ],
  "count": 1,
  "mode": "full",
  "status": "success"
}
```

## Solution

When table/view:

```json
{ "permissions": [ "CREATE", "DELETE", "READ", "UPDATE" ] }
```

When stored procedure:

```json
{ "permissions": [ "EXECUTE" ] }
```
</issue_description>
</details>

- Fixes #2935

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: JerryNixon <1749983+JerryNixon@users.noreply.github.com>
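The expansion described above can be sketched as a small mapping. The string-based helper below is a simplified stand-in for the real enum-based `BuildPermissionsInfo()` logic, which uses `EntityAction.ValidPermissionOperations` and `ValidStoredProcedurePermissionOperations`.

```csharp
using System;

class Program
{
    // Expand the wildcard "ALL" permission into explicit operations,
    // based on whether the entity is a table/view or a stored procedure.
    static string[] ExpandPermissions(string permission, bool isStoredProcedure)
    {
        if (permission != "ALL")
            return new[] { permission }; // explicit permissions pass through

        return isStoredProcedure
            ? new[] { "EXECUTE" }
            : new[] { "CREATE", "DELETE", "READ", "UPDATE" };
    }

    static void Main()
    {
        Console.WriteLine(string.Join(",", ExpandPermissions("ALL", false)));
        // CREATE,DELETE,READ,UPDATE
        Console.WriteLine(string.Join(",", ExpandPermissions("ALL", true)));
        // EXECUTE
        Console.WriteLine(string.Join(",", ExpandPermissions("READ", false)));
        // READ
    }
}
```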
## Why make this change?

This PR addresses an issue with serialization of the `dml-tools` property for user-provided (non-default) values. It is a follow-up to the fix made for the functionality of the `dml-tools` property: #2927

## What is this change?

If `dml-tools` is not configured, the property should not be serialized. Serialization should only happen for user-provided configuration, so any value explicitly set by the user for `dml-tools` will be serialized. `dml-tools` can take either of the following values:

- A boolean value, true or false. This is a global value to enable or disable all DML tools, and `UserProvidedAllTools` is written in JSON.
- A dictionary object of individual tool names and boolean values. This contains tool-specific values; only the specified tools are taken for JSON writing, and unspecified tool names are ignored.

## How was this tested?

- [x] Integration Tests
- [x] Unit Tests
- [x] Manual Tests

Sample scenarios for testing:

**Scenario 1:** Enable all tools using a single boolean value: `"dml-tools": true`. Note: the default is true, so even if `dml-tools` is unspecified it defaults to true.

**Scenario 2:** Disable `execute-entities` only and keep the other tools enabled by default.

```json
"dml-tools": { "execute-entities": false }
```

**Scenario 3:** Use the full list of tools and enable or disable them individually.

```json
"dml-tools": {
  "execute-entity": true,
  "delete-record": false,
  "update-record": false,
  "read-records": false,
  "describe-entities": true
}
```

---------

Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
… describe-entities MCP tool (#2956)

## Why make this change?

- Addresses follow-ups to PR #2900

The `describe_entities` tool response format needed improvements to better align with MCP specifications and provide more accurate, user-scoped information. Key issues included non-specification-compliant response fields, overly broad permission reporting across all roles, and inconsistent entity/field naming conventions that didn't prioritize user-friendly aliases.

## What is this change?

- **Removed non-spec fields from response**: Eliminated `mode` and `filter` fields that were not part of the MCP specification
- **Scoped permissions to current user's role**: Modified permissions logic to only return permissions available to the requesting user's role instead of all permissions across all roles
- **Implemented entity alias support**: Updated entity name resolution to prefer GraphQL singular names (aliases) over configuration names, falling back to the entity name only when the alias is absent
- **Fixed parameter metadata format**: Changed the parameter default value key from `@default` to `default` in the JSON response
- **Enhanced field name resolution**: Updated field metadata to use field aliases when available, falling back to field names when aliases are absent
- **Added proper authorization context**: Integrated HTTP context and authorization resolver to determine the current user's role for permission filtering

## How was this tested?

- [x] Manual Tests

## Sample Request(s)

```
POST http://localhost:5000/mcp
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "describe_entities" }, "id": 1 }
```

```
POST http://localhost:5000/mcp
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "describe_entities", "arguments": { "nameOnly": true } }, "id": 2 }
```

```
POST http://localhost:5000/mcp
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "describe_entities", "arguments": { "entities": ["Book", "Publisher"] } }, "id": 1 }
```
## Why make this change?

BECAUSE I AM COOL

## What is this change?

Aspirification

## How was this tested?

Blood, sweat, and tears

## Sample Request(s)

`aspire run` baby

---------

Co-authored-by: Tommaso Stocchi <tstocchi@microsoft.com>
Co-authored-by: Damian Edwards <damian@damianedwards.com>
Co-authored-by: Safia Abdalla <captainsafia@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
… settings class and add AKV replacement logic. (#2882)

## Why make this change?

Adds AKV variable replacement and makes our variable-replacement design more extensible for when new replacement logic is added. Closes #2708. Closes #2748. Related to #2863.

## What is this change?

Instead of simply using a `bool` to indicate that we want env variable replacement, we add a class that holds all of the replacement settings. It records whether replacement is enabled for each kind of variable handled during deserialization. We also include the replacement failure mode, and put the replacement logic into a strategy dictionary that pairs each replacement variable type with the strategy for performing that replacement. Because Azure Key Vault secret replacement requires the retry and connection settings, we do a first pass with only non-AKV replacement to obtain those settings, so that if AKV replacement is used the required settings are available. We also keep in mind that the legacy `Configuration Controller` path ignores all variable replacement, so we construct the replacement settings for that code path to perform no variable replacement at all.

## How was this tested?

We updated the test logic to use the new system; manual testing against an actual AKV is still required.

## Sample Request(s)

- Example REST and/or GraphQL request to demonstrate modifications
- Example of CLI usage to demonstrate modifications

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
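The strategy-dictionary idea described above can be sketched as follows. This is a minimal illustration under assumed names (`VariableKind`, the stubbed AKV lookup); it is not the actual DAB replacement settings class.

```csharp
using System;
using System.Collections.Generic;

class Program
{
    // Each variable kind pairs with its own replacement function, so
    // supporting a new kind means adding one dictionary entry.
    enum VariableKind { Environment, AzureKeyVault }

    static void Main()
    {
        var strategies = new Dictionary<VariableKind, Func<string, string>>
        {
            // Env replacement can run on the first pass.
            [VariableKind.Environment] = name =>
                Environment.GetEnvironmentVariable(name) ?? $"@env('{name}')",

            // AKV replacement would need retry/connection settings resolved
            // on a first pass; stubbed here for illustration.
            [VariableKind.AzureKeyVault] = name => $"<secret:{name}>",
        };

        Environment.SetEnvironmentVariable("DAB_DEMO", "hello");
        Console.WriteLine(strategies[VariableKind.Environment]("DAB_DEMO")); // hello
        Console.WriteLine(strategies[VariableKind.AzureKeyVault]("conn"));   // <secret:conn>
    }
}
```

Unresolved variables fall through unchanged (the `?? $"@env('{name}')"` branch), which mirrors the idea of deferring work that a later pass, or a failure mode, must handle.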
…tes) (#2988)

## Why make this change?

Closes #2977. Running full integration test suites on PRs that only modify samples, documentation, or configuration files wastes CI resources and slows down contribution velocity.

## What is this change?

Added `pr:` trigger sections with path exclusions to all 7 Azure Pipeline definitions in `.pipelines/`:

**Excluded paths:**
- `samples/**` - Sample configurations and code
- `docs/**` - Documentation
- `*.md` - Markdown files (README, CONTRIBUTING, etc.)
- `schemas/**` - JSON schema definitions
- `templates/**` - Project templates

**Modified pipelines:**
- `azure-pipelines.yml` - Static analysis
- `cosmos-pipelines.yml` - CosmosDB integration tests
- `dwsql-pipelines.yml` - Data Warehouse SQL tests
- `mssql-pipelines.yml` - SQL Server tests
- `mysql-pipelines.yml` - MySQL tests
- `pg-pipelines.yml` - PostgreSQL tests
- `unittest-pipelines.yml` - Unit test suite

PRs touching only excluded paths will skip pipeline execution. PRs with mixed changes (code + excluded paths) still run all validations.

## How was this tested?

- [x] YAML syntax validation
- [ ] Integration Tests - N/A (configuration change)
- [ ] Unit Tests - N/A (configuration change)

## Sample Request(s)

N/A - Pipeline configuration change only. Validation occurs automatically on PR creation.

<details>
<summary>Original prompt</summary>

<issue_title>Skip Pull Request validation for certain paths</issue_title>
<issue_description>We really need to add this to our tests in order to avoid requiring the full suite of integration tests when /samples is updated. In fact, there are many paths in this repo that should be set up to allow pull requests to skip integration tests. Please consider.

```yaml
on:
  pull_request:
    paths-ignore:
      - 'samples/**'
```

_Originally posted by @JerryNixon in #2977 (comment)_
</issue_description>

<agent_instructions>Modify the yml files in the .pipelines folder which do Pull Request validation to skip paths like samples. This will help checkin PRs on those paths quickly.</agent_instructions>
</details>

- Fixes #2987

---------

Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: Aniruddh25 <3513779+Aniruddh25@users.noreply.github.com>
This is a simple sample that demonstrates how to use DAB in Aspire. I will add it to our docs/quick-starts when this is merged. Please note there is no /src code change, just /samples.

---------

Co-authored-by: Jerry Nixon <jerry.nixon@microsoft.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
Co-authored-by: Copilot <198982749+Copilot@users.noreply.github.com>
Co-authored-by: Aniruddh25 <3513779+Aniruddh25@users.noreply.github.com>
…ty functions (#2984)

## Why make this change?

Closes #2919

## What is this change?

Refactors the `Read` and `Update` built-in MCP tools so that they use the common `BuildErrorResult` and `BuildSuccessResult` functions in the Utils, aligning their usage with the other tools.

## How was this tested?

Manually tested using the MCP Inspector tool, and ran against the normal test suite.

* DESCRIBE_ENTITIES <img width="395" height="660" alt="image" src="https://github.com/user-attachments/assets/a7e86256-a2ee-4623-9cc7-7126993a2e12" />
* CREATE <img width="1181" height="672" alt="image" src="https://github.com/user-attachments/assets/9042bd14-83da-48d2-a24f-4d95873a54c5" />
* READ <img width="1210" height="658" alt="image" src="https://github.com/user-attachments/assets/d800d9ed-0b03-4173-bf44-cadc7b612e62" />
* UPDATE <img width="1300" height="593" alt="image" src="https://github.com/user-attachments/assets/6aa38f25-80ab-47e6-aba6-38e80397ae4b" />
* DELETE <img width="1178" height="605" alt="image" src="https://github.com/user-attachments/assets/363fca1d-1ec9-42cd-85c0-b965463c9b80" />

## Sample Request(s)

N/A
## Why make this change?

Closes #2748

## What is this change?

Adds the option to use a local .akv file instead of Azure Key Vault for `@akv('')` replacement in the config file during deserialization, similar to how we handle .env files.

## How was this tested?

A new test was added that verifies we are able to do the replacement and get the correct resultant configuration.

---------

Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
## Why make this change?

Closes #2932

## What is this change?

Add helper class `McpMetadataHelper`, extend `McpArgumentParser`, and utilize `McpAuthorizationHelper` to factor out common code. We now do the initialization of the metadata, the parsing of arguments, and the authorization checks in these shared helper classes.

## How was this tested?

With MCP Inspector and against the normal test suite.

* DESCRIBE_ENTITIES <img width="427" height="653" alt="image" src="https://github.com/user-attachments/assets/7ba74cfb-5a71-402b-afd2-17f7a24d0295" />
* CREATE <img width="1435" height="655" alt="image" src="https://github.com/user-attachments/assets/f189bb22-6f25-46ef-b2f0-20e80bc2850f" />
* READ <img width="1131" height="651" alt="image" src="https://github.com/user-attachments/assets/16f3e6f6-24e9-4613-a8fd-61546b199305" />
* UPDATE <img width="1083" height="292" alt="image" src="https://github.com/user-attachments/assets/ce284b6f-1f2f-4dc4-b73d-8242605ee20a" />
* DELETE <img width="1425" height="648" alt="image" src="https://github.com/user-attachments/assets/8768baf6-96f3-441c-b47d-c17e5fae9300" />

## Sample Request(s)

N/A

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Souvik Ghosh <souvikofficial04@gmail.com>
)

### Why make this change?

Serialization and deserialization of metadata currently fail when column names are prefixed with the $ symbol.

### Root cause

This issue occurs because we've enabled the ReferenceHandler flag in our System.Text.Json serialization settings. When this flag is active, the serializer treats $ as a reserved character used for special metadata (e.g., $id, $ref). As a result, any property name starting with $ is interpreted as metadata and cannot be deserialized properly.

### What is this change?

This update introduces custom logic in the converter's Write and Read methods to handle $-prefixed column names safely.

- During serialization, columns beginning with $ are escaped as "_$".
- During deserialization, this transformation is reversed to restore the original property names.

### How was this tested?

- [x] Unit tests

---------

Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
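The escaping rule described above can be sketched as a symmetric pair of helpers. These names are illustrative, not the actual converter code; the point is that a `$`-prefixed column round-trips without ever reaching the serializer as a reserved `$`-name (like `$id`/`$ref`).

```csharp
using System;

class Program
{
    // On Write: prefix $-leading names with "_" so System.Text.Json's
    // ReferenceHandler never mistakes them for $id/$ref metadata.
    static string EscapeColumnName(string name) =>
        name.StartsWith("$") ? "_" + name : name;

    // On Read: reverse the transformation to restore the original name.
    static string UnescapeColumnName(string name) =>
        name.StartsWith("_$") ? name.Substring(1) : name;

    static void Main()
    {
        Console.WriteLine(EscapeColumnName("$price"));    // _$price
        Console.WriteLine(UnescapeColumnName("_$price")); // $price
        Console.WriteLine(EscapeColumnName("title"));     // title (unchanged)
    }
}
```

One caveat worth noting with any scheme like this: a genuine column literally named `_$price` would collide with the escaped form, so the escape prefix must be chosen to be unambiguous in practice.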
## Why make this change?
- Add MCP stdio support to Data API Builder and wire it through both the
engine and CLI so DAB can be used as a Model Context Protocol (MCP)
server.
- Ensures MCP sessions can run under a specific DAB authorization role,
making it possible to test and use MCP tools with permissions from
`dab-config.json`.
## What is this change?
Service entrypoint
- Detects `--mcp-stdio` early, configures stdin/stdout encodings, and
redirects all non‑MCP output to STDERR to keep STDOUT clean for MCP
JSON.
- Parses an optional `role:<name>` argument (e.g. role:anonymous,
role:authenticated) and injects it into configuration as `MCP:Role`,
defaulting to `anonymous` when omitted.
- In MCP stdio mode, forces `Runtime:Host:Authentication:Provider =
"Simulator"` via in‑memory configuration so the requested role is always
available during MCP sessions.
- Starts the full ASP.NET Core host, registers all MCP tools from DI,
and runs the MCP stdio loop instead of the normal HTTP `host.Run()`.
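The `role:<name>` parsing described above can be sketched as follows (Python illustration; `parse_mcp_role` is a hypothetical name, and the real code is C# in the service entrypoint):

```python
def parse_mcp_role(args: list[str]) -> str:
    """Scan CLI args for an optional 'role:<name>' token and return the
    role name; default to 'anonymous' when omitted. The result is what
    gets injected into configuration as MCP:Role."""
    for arg in args:
        if arg.startswith("role:"):
            name = arg[len("role:"):].strip()
            if name:
                return name
    return "anonymous"
```

For example, `["--mcp-stdio", "role:authenticated"]` yields `authenticated`, while `["--mcp-stdio"]` falls back to `anonymous`.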
CLI Integration
- Adds `--mcp-stdio` to `dab start` to launch the engine in MCP stdio
mode.
- Adds an optional positional `role` argument (e.g. `role:anonymous`)
captured as `StartOptions.McpRole`.
- Keeps existing behavior for non‑MCP `dab start` unchanged.
Note
- `ExecuteEntityTool` now looks for MCP tool inputs under arguments (the
standard MCP field) and falls back to the legacy parameters property
only if arguments is missing. This aligns our server with how current
MCP clients (like VS Code) actually send tool arguments, and preserves
backward compatibility for any older clients that still use parameters.
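The arguments-first fallback can be expressed as (Python sketch; the real `ExecuteEntityTool` implementation is C#):

```python
def get_tool_inputs(request: dict) -> dict:
    """Prefer the standard MCP 'arguments' field; fall back to the
    legacy 'parameters' field only when 'arguments' is absent."""
    if "arguments" in request:
        return request["arguments"]
    return request.get("parameters", {})
```

Note that an explicitly present but empty `arguments` object still wins over `parameters`; only a missing key triggers the legacy fallback.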
## How was this tested?
Integration-like manual testing via MCP clients against:
- Engine-based MCP server: `dotnet Azure.DataApiBuilder.Service.dll
--mcp-stdio role:authenticated`.
- CLI-based MCP server: `dab start --mcp-stdio role:authenticated`.
Manual verification of all MCP tools:
- `describe_entities` shows correct entities and effective permissions
for the active role.
- `read_records`, `create_record`, `update_record`, `delete_record`,
`execute_entity` succeed when the role has the appropriate permissions.
## Sample Request(s)
1. MCP server via CLI (dab)
```json
{
  "mcpServers": {
    "dab-with-exe": {
      "command": "C:\\DAB\\data-api-builder\\out\\publish\\Debug\\net8.0\\win-x64\\dab\\Microsoft.DataApiBuilder.exe",
      "args": ["start", "--mcp-stdio", "role:authenticated", "--config",
               "C:\\DAB\\data-api-builder\\dab-config.json"],
      "env": {
        "DAB_ENVIRONMENT": "Development"
      }
    }
  }
}
```
2. MCP server via engine DLL
```json
{
  "mcpServers": {
    "dab": {
      "command": "dotnet",
      "args": [
        "C:\\DAB\\data-api-builder\\out\\publish\\Debug\\net8.0\\win-x64\\dab\\Azure.DataApiBuilder.Service.dll",
        "--mcp-stdio",
        "role:authenticated",
        "--config",
        "C:\\DAB\\data-api-builder\\dab-config.json"
      ],
      "type": "stdio"
    }
  }
}
```
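The stdout/stderr split described above (protocol JSON on STDOUT, everything else on STDERR) can be illustrated with a minimal line-delimited JSON-RPC loop (Python sketch, not the DAB implementation; MCP stdio framing details are simplified):

```python
import json
import sys


def respond(line: str, handle) -> str:
    """Turn one JSON-RPC request line into a response line."""
    request = json.loads(line)
    return json.dumps({"jsonrpc": "2.0", "id": request.get("id"),
                       "result": handle(request)})


def serve_stdio(handle) -> None:
    """Read one request per line from stdin. Responses go to STDOUT
    only; diagnostics go to STDERR so STDOUT stays clean for MCP."""
    for line in sys.stdin:
        if line.strip():
            print("handling mcp request", file=sys.stderr)  # never stdout
            sys.stdout.write(respond(line, handle) + "\n")
            sys.stdout.flush()
```

Any stray `print` to STDOUT would corrupt the protocol stream, which is why the entrypoint redirects all non-MCP output to STDERR.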
## Why make this change?
This change allows entity-level MCP configuration to control which entities participate in MCP runtime tools, providing granular control over DML operations and custom tool exposure.
- Closes #2948
## What is this change?
This change introduces an optional `mcp` property at the entity level that controls participation in MCP's runtime tools. This is a prerequisite for custom tools support. The `mcp` property supports two formats:
- **Boolean shorthand**: `"mcp": true` or `"mcp": false`
- **Object format**: `{"dml-tools": boolean, "custom-tool": boolean}`
Property behavior:
1. Boolean shorthand (`"mcp": true/false`)
   - `"mcp": true`: Enables DML tools only; custom tools remain disabled.
   - `"mcp": false`: Disables all MCP functionality for the entity.
2. Object format (`"mcp": { ... }`)
   - `{ "dml-tools": true, "custom-tool": true }`: Enables both (valid only for stored procedures).
   - `{ "dml-tools": true, "custom-tool": false }`: DML only.
   - `{ "dml-tools": false, "custom-tool": true }`: Custom tool only (stored procedures).
   - `{ "dml-tools": false, "custom-tool": false }`: Fully disabled.
   Single-property cases:
   - `{"dml-tools": true}`: Enables DML only; auto-serializes to `"mcp": true`.
   - `{"custom-tool": true}`: Enables custom tool only; serializes as given.
3. No MCP configuration on the entity (default)
   - `dml-tools` is still enabled by default; behavior is otherwise unchanged.
## How was this tested?
- [x] Unit Tests
- [x] Integration Tests
- [x] CLI Command Testing

Sample CLI commands:

Add table with DML tools enabled
`dab add Book --source books --permissions "anonymous:*" --mcp.dml-tools true`
Add stored procedure with custom tool enabled
`dab add GetBookById --source dbo.get_book_by_id --source.type stored-procedure --permissions "anonymous:execute" --mcp.custom-tool true`
Add stored procedure with both properties
`dab add UpdateBook --source dbo.update_book --source.type stored-procedure --permissions "anonymous:execute" --mcp.custom-tool true --mcp.dml-tools false`
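The normalization rules above can be sketched as a small truth function (Python illustration; `normalize_mcp` is a hypothetical helper, not DAB's actual parser):

```python
def normalize_mcp(value):
    """Normalize the entity-level 'mcp' setting to (dml_tools, custom_tool).
    Mirrors the documented behavior: booleans are shorthand for DML-only,
    and an absent property keeps DML tools enabled by default."""
    if value is None:                      # no "mcp" property on the entity
        return (True, False)
    if isinstance(value, bool):            # "mcp": true / "mcp": false
        return (value, False)
    # object format: each flag defaults to false when omitted
    return (value.get("dml-tools", False), value.get("custom-tool", False))
```

This makes the single-property cases explicit: `{"dml-tools": true}` collapses to the same state as `"mcp": true`, while `{"custom-tool": true}` enables only the custom tool.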
## Why make this change?
- #2874
A failing test needs to be fixed to account for the creation of the new MCP endpoint. The test already existed and covered the scenarios for REST and GraphQL endpoints. This PR fixes it only partially, addressing the issues related to the non-hosted scenario; fixing the hosted scenario will need more changes than expected.
## What is this change?
This change partially fixes the failing test. The non-hosted scenario was failing because the MCP server was not able to start in time before the test tried to access the MCP endpoint; to fix it, we added a delay so the server is available before the test accesses the endpoint. The hosted scenario, on the other hand, fails because of the way DAB initializes its MCP service, which means the base needs to change, a bigger task than is in scope for this PR.
## How was this tested?
- [ ] Integration Tests
- [X] Unit Tests
Bumps [dotnet-sdk](https://github.com/dotnet/sdk) from 8.0.415 to 8.0.416. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/dotnet/sdk/releases">dotnet-sdk's releases</a>.</em></p> <blockquote> <h2>.NET 8.0.22</h2> <p><a href="https://github.com/dotnet/core/releases/tag/v8.0.22">Release</a></p> <h2>What's Changed</h2> <ul> <li>Update branding to 8.0.416 by <a href="https://github.com/vseanreesermsft"><code>@vseanreesermsft</code></a> in <a href="https://redirect.github.com/dotnet/sdk/pull/51150">dotnet/sdk#51150</a></li> <li>[release/8.0.4xx] Update dependencies from dotnet/arcade by <a href="https://github.com/dotnet-maestro"><code>@dotnet-maestro</code></a>[bot] in <a href="https://redirect.github.com/dotnet/sdk/pull/50449">dotnet/sdk#50449</a></li> <li>[release/8.0.4xx] Update dependencies from dotnet/source-build-reference-packages by <a href="https://github.com/dotnet-maestro"><code>@dotnet-maestro</code></a>[bot] in <a href="https://redirect.github.com/dotnet/sdk/pull/50801">dotnet/sdk#50801</a></li> <li>[release/8.0.4xx] Update dependencies from dotnet/templating by <a href="https://github.com/dotnet-maestro"><code>@dotnet-maestro</code></a>[bot] in <a href="https://redirect.github.com/dotnet/sdk/pull/50800">dotnet/sdk#50800</a></li> <li>Stop building source-build in non-1xx branches by <a href="https://github.com/NikolaMilosavljevic"><code>@NikolaMilosavljevic</code></a> in <a href="https://redirect.github.com/dotnet/sdk/pull/50888">dotnet/sdk#50888</a></li> <li>Remove source-build job dependency by <a href="https://github.com/NikolaMilosavljevic"><code>@NikolaMilosavljevic</code></a> in <a href="https://redirect.github.com/dotnet/sdk/pull/51263">dotnet/sdk#51263</a></li> <li>Merging internal commits for release/8.0.4xx by <a href="https://github.com/vseanreesermsft"><code>@vseanreesermsft</code></a> in <a href="https://redirect.github.com/dotnet/sdk/pull/51244">dotnet/sdk#51244</a></li> <li>[release/8.0.4xx] Update 
dependencies from dotnet/templating by <a href="https://github.com/dotnet-maestro"><code>@dotnet-maestro</code></a>[bot] in <a href="https://redirect.github.com/dotnet/sdk/pull/51203">dotnet/sdk#51203</a></li> <li>[release/8.0.4xx] Update dependencies from dotnet/arcade by <a href="https://github.com/dotnet-maestro"><code>@dotnet-maestro</code></a>[bot] in <a href="https://redirect.github.com/dotnet/sdk/pull/51277">dotnet/sdk#51277</a></li> </ul> <p><strong>Full Changelog</strong>: <a href="https://github.com/dotnet/sdk/compare/v8.0.415...v8.0.416">https://github.com/dotnet/sdk/compare/v8.0.415...v8.0.416</a></p> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/dotnet/sdk/commit/bd8ff52a950627d432878cee33a0c3696836df75"><code>bd8ff52</code></a> Merged PR 54681: [internal/release/8.0.4xx] Update dependencies from dnceng/i...</li> <li><a href="https://github.com/dotnet/sdk/commit/38176ba6bb856417f60e569ad9045dc871effd27"><code>38176ba</code></a> Merged PR 54678: Disable Nuget Audit</li> <li><a href="https://github.com/dotnet/sdk/commit/8da85111fbdb9200a21288140950d7951d594936"><code>8da8511</code></a> Merged PR 54667: [internal/release/8.0.4xx] Update dependencies from dnceng/i...</li> <li><a href="https://github.com/dotnet/sdk/commit/295b84e8fa7f1a1f69d9938b83022bd136f06db9"><code>295b84e</code></a> Merge commit '2e2b95f3fb0ef6ae78b16fca136bb699e7116d07'</li> <li><a href="https://github.com/dotnet/sdk/commit/2e2b95f3fb0ef6ae78b16fca136bb699e7116d07"><code>2e2b95f</code></a> [release/8.0.4xx] Update dependencies from dotnet/arcade (<a href="https://redirect.github.com/dotnet/sdk/issues/51277">#51277</a>)</li> <li><a href="https://github.com/dotnet/sdk/commit/f84c1ce410e7d498bf1057735de964157da54cda"><code>f84c1ce</code></a> [release/8.0.4xx] Update dependencies from dotnet/templating (<a href="https://redirect.github.com/dotnet/sdk/issues/51203">#51203</a>)</li> <li><a 
href="https://github.com/dotnet/sdk/commit/32c07e93fef713160db7599e7323cb655179cabc"><code>32c07e9</code></a> Merged PR 54377: [internal/release/8.0.4xx] Update dependencies from 3 reposi...</li> <li><a href="https://github.com/dotnet/sdk/commit/c94cfbb8ce36219cd0824dcb9c9ec02557682452"><code>c94cfbb</code></a> Update dependencies from <a href="https://dev.azure.com/dnceng/internal/_git/dotnet-as">https://dev.azure.com/dnceng/internal/_git/dotnet-as</a>...</li> <li><a href="https://github.com/dotnet/sdk/commit/750b80323a5edba6f7e1c6d265dcf9dbc5a8838e"><code>750b803</code></a> Update dependencies from <a href="https://dev.azure.com/dnceng/internal/_git/dotnet-wi">https://dev.azure.com/dnceng/internal/_git/dotnet-wi</a>...</li> <li><a href="https://github.com/dotnet/sdk/commit/a0ebaef253319281796c7ef84e21b2ad9da9260e"><code>a0ebaef</code></a> Merged PR 54492: internal/release/8.0.4xx - merge from public</li> <li>Additional commits viewable in <a href="https://github.com/dotnet/sdk/compare/v8.0.415...v8.0.416">compare view</a></li> </ul> </details>
Signed-off-by: dependabot[bot] <support@github.com>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
## Why make this change?
Closes #2977
Health check endpoint was returning results for stored procedures. Stored procedures should be excluded because:
1. They require parameters not configurable via health settings
2. They are not deterministic, making health checks unreliable
## What is this change?
Added filter in `HealthCheckHelper.UpdateEntityHealthCheckResultsAsync()` to exclude entities with `EntitySourceType.StoredProcedure`:
```csharp
// Before
.Where(e => e.Value.IsEntityHealthEnabled)
// After
.Where(e => e.Value.IsEntityHealthEnabled && e.Value.Source.Type != EntitySourceType.StoredProcedure)
```
Only tables and views are now included in entity health checks.
## How was this tested?
- [ ] Integration Tests
- [x] Unit Tests
Added `HealthChecks_ExcludeStoredProcedures()` unit test that creates a `RuntimeConfig` with both table and stored procedure entities, then applies the same filter used in `HealthCheckHelper.UpdateEntityHealthCheckResultsAsync` to verify stored procedures are excluded while tables are included.
## Sample Request(s)
Health check response after fix (stored procedure `GetSeriesActors` no longer appears):
```json
{ "status": "Healthy", "checks": [ { "name": "MSSQL", "tags": ["data-source"] }, { "name": "Book", "tags": ["rest", "endpoint"] } ] }
```
- Fixes #2982
--------- Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com> Co-authored-by: Aniruddh25 <3513779+Aniruddh25@users.noreply.github.com> Co-authored-by: JerryNixon <1749983+JerryNixon@users.noreply.github.com>
…ement (#2973) ## Why make this change? We need to add the new 'autoentities' properties that will later be used to allow the user to add multiple entities at the same time. The properties need to have all the necessary components to be serialized and deserialized from the config file. ## What is this change? - This change adds the `autoentities` property to the schema file with proper structure using `additionalProperties` to allow user-defined autoentity definition names. - Created new files that allow for the properties to be serialized and deserialized: - `AutoentityConverter.cs` - `AutoentityPatternsConverter.cs` - `AutoentityTemplateConverter.cs` - `RuntimeAutoentitiesConverter.cs` - Created new files where deserialized properties are turned to usable objects: - `Autoentity.cs` - `AutoentityPatterns.cs` - `AutoentityTemplate.cs` - `RuntimeAutoentities.cs` - Added entity-level MCP configuration support: - `EntityMcpOptions.cs` - Supports both boolean shorthand and object format for MCP configuration - `EntityMcpOptionsConverterFactory.cs` - Handles serialization/deserialization of MCP options - Registered autoentity converters in `RuntimeConfigLoader.cs` for proper deserialization. - Updated schema description to "Defines automatic entity generation rules for MSSQL tables based on include/exclude patterns and defaults." - Added `required: ["permissions"]` constraint to enforce at least one permission per specification. **Schema Structure**: The autoentities object uses `additionalProperties` to allow any string key as a user-defined autoentity definition name (e.g., "public-tables", "admin-tables", etc.), consistent with how the "entities" section works. 
**MCP Configuration**: Supports two formats: - Boolean shorthand: `"mcp": true` or `"mcp": false` - Object format: `"mcp": { "dml-tools": true}` Example configuration: ```json { "autoentities": { "<user-defined name>": { // This property name is decided by the user to show a group of autoentities "patterns": { "include": ["dbo.%"], "exclude": ["dbo.internal_%"], "name": "{object}" }, "template": { "mcp": { "dml-tools": true }, "rest": { "enabled": true }, "graphql": { "enabled": true }, "health": { "enabled": true }, "cache": { "enabled": false } }, "permissions": [ { "role": "anonymous", "actions": ["read"] } ] } } } ``` ## How was this tested? - [ ] Integration Tests - [x] Unit Tests --------- Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com> Co-authored-by: RubenCerna2079 <32799214+RubenCerna2079@users.noreply.github.com> Co-authored-by: Ruben Cerna <rcernaserna@microsoft.com>
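Under the assumption that the include/exclude patterns use SQL LIKE-style `%` wildcards (as the `dbo.%` examples suggest), the pattern evaluation could look roughly like this Python sketch (hypothetical helpers, not DAB's implementation; SQL LIKE's `_` single-character wildcard is ignored here for simplicity):

```python
from fnmatch import fnmatch


def match_autoentity(obj_name: str, patterns: dict) -> bool:
    """An object is picked up when it matches at least one include
    pattern and no exclude pattern. '%' is translated to fnmatch's '*'."""
    like = lambda p: p.replace("%", "*")
    included = any(fnmatch(obj_name, like(p)) for p in patterns.get("include", []))
    excluded = any(fnmatch(obj_name, like(p)) for p in patterns.get("exclude", []))
    return included and not excluded


def entity_name(obj_name: str, template: str = "{object}") -> str:
    """Apply the 'name' template; '{object}' stands for the bare object name."""
    return template.format(object=obj_name.split(".")[-1])
```

With the example configuration above, `dbo.Book` would be generated as entity `Book`, while `dbo.internal_cache` is filtered out by the exclude pattern.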
## Why make this change?
- Closes #2943
Change default auth provider to AppService from StaticWebApps. Azure Static Web Apps EasyAuth is being deprecated, so DAB should no longer default to StaticWebApps as its authentication provider.
- Moving the default to `AppService` aligns DAB with the long‑term supported `EasyAuth` path while keeping behavior equivalent for existing workloads. `StaticWebApps` remains supported when explicitly configured, but new configurations and `dab init` flows should guide users toward `AppService` instead of a deprecated option.
## What is this change?
- Config and runtime behavior
  - Changed the default authentication provider from `Static Web Apps` to `App Service` in the core configuration model and JSON schema.
  - Added validation that logs a warning when Static Web Apps is explicitly selected (since it’s deprecated as a default).
- CLI and `dab init`
  - Updated `dab init` so that, when no auth provider is specified, it now generates configs using App Service as the provider instead of Static Web Apps.
  - Adjusted CLI configuration generation and option handling so any “default provider” usage now points to App Service.
  - Updated end-to-end CLI tests and initialization tests so their expected configurations and arguments reference App Service as the default.
- Schema, samples, and built‑in configs
  - Updated the JSON schema to set the default of the `authentication.provider` property to `AppService`.
  - Updated sample configuration snippets in the main documentation to show App Service as the provider.
  - Updated the built‑in `dab-config` JSON files (for all supported databases and multi‑DAB scenarios) so their runtime host sections use App Service.
- Engine tests and helpers
  - Updated test helpers to generate EasyAuth principals appropriate for the configured provider, and to treat App Service as the default in REST and GraphQL integration tests.
  - Adjusted configuration and health‑endpoint tests to no longer assume Static Web Apps as the implicit provider and to accept App Service as the default.
- Snapshots and expected outputs
  - Updated a large set of snapshot files (CLI snapshots, configuration snapshots, entity update/add snapshots) so that anywhere the authentication section previously showed Static Web Apps as the provider, it now shows App Service.
- Note
  - We updated `AddEnvDetectedEasyAuth` so that it always registers both the `App Service` and `Static Web Apps` `EasyAuth` schemes in development mode, instead of only adding App Service when certain environment variables are present. This aligns with the new default of using App Service as the primary `EasyAuth` provider and makes dev/test/CI behavior deterministic, while still letting configuration (runtime.host.authentication.provider) choose which scheme is actually used.
## How was this tested?
- [x] Integration Tests
- [x] Unit Tests
## Sample Request(s)
`dab init --database-type mssql --connection-string "<conn-string>"`
Generates:
```json
"runtime": { "host": { "authentication": { "provider": "AppService" } } }
```
Users who still want Static Web Apps can override:
`dab init --database-type mssql --connection-string "<conn-string>" --auth.provider StaticWebApps`
---------
Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
Co-authored-by: Souvik Ghosh <souvikofficial04@gmail.com>
Co-authored-by: aaronburtle <93220300+aaronburtle@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Anusha Kolan <anushakolan@microsoft.com>
## Why make this change?
This change adds documented instructions on integrating the DAB MCP with AI Foundry using an Azure Container Instance.
<img width="2398" height="2178" alt="image" src="https://github.com/user-attachments/assets/a7c1ae33-28ea-4c48-b474-12abfc3f263b" />
---------
Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
## Why make this change? - To apply correct serialization and deserialization logic for stored procedures. With the previous changes, serialization was not working correctly for the StoredProcedureDefinition type, which extends SourceDefinition. When the value type was passed explicitly for serialization, the parent type was used instead, causing some child-type properties to be omitted. ## What is this change? Instead of manually specifying the value type during serialization, this change allows the library to infer the type automatically and perform the correct serialization. ## How was this tested? - [x] Unit Tests --------- Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
## Why make this change? This is a minor PR to add architecture diagram to the AI Foundry Integration documentation. --------- Co-authored-by: Anusha Kolan <anushakolan@microsoft.com>
…#3029) ## Why make this change? We are addressing 2 related issues in this PR: Issue #2374 – Nested sibling relationships under books (websiteplacement, reviews, authors) **Problem**: A nested query on books where a parent has multiple sibling relationships (for example, websiteplacement, reviews, and authors) could throw a `KeyNotFoundException` when RBAC or shape changes were involved. Pagination metadata was stored using only the root and the depth in the path, so different sibling relationships at the same depth could overwrite each other or look up the wrong entry. **Solution**: We now key pagination metadata by both depth and the full relationship path (for example, “books → items → reviews” vs “books → items → authors”), so each sibling branch gets its own unique entry. Reads use the same full-path key, and if metadata for a branch is missing, we return an “empty” `PaginationMetadata` instead of throwing. This prevents collisions between sibling relationships and avoids runtime errors when a particular branch has no metadata. Issue #3026 – Person's graph (AddressType / PhoneNumberType) **Problem**: In the persons graph, a query selecting persons → addresses.items.AddressType and persons → phoneNumbers.items.PhoneNumberType could also throw a `KeyNotFoundException`. In some cases (for example, when RBAC removes a relationship or when that relationship is not paginated at all), there is legitimately no pagination metadata for that nested field, but the code assumed it always existed and indexed into the dictionary directly. **Solution**: Metadata handling is now defensive in two places: In the GraphQL execution helper, metadata lookups for object and list fields use safe TryGet-style access; if an entry isn’t present, we fall back to an empty PaginationMetadata instead of failing. In the SQL query engine’s object resolver, we first check whether there is a subquery metadata entry for the field. 
If there isn’t, we treat the field as non‑paginated and return the JSON as-is rather than throwing. Together, these changes fix both issues by (a) using full path-based keys, so sibling branches don’t conflict, and (b) treating missing metadata as “no pagination here” rather than as a fatal error.
## What is this change?
1. In `SqlQueryEngine.ResolveObject`, instead of always doing `parentMetadata.Subqueries[fieldName]` (which crashed when RBAC caused that entry to be missing), it now uses `TryGetValue` and:
   - If metadata exists and `IsPaginated` is true -> wrap the JSON as a pagination connection.
   - If metadata is missing -> just return the JSON as-is (no exception).
2. Introduced `GetRelationshipPathSuffix(HotChocolate.Path path)` to build a relationship path suffix like:
   - `rel1` for `/entity/items[0]/rel1`
   - `rel1::nested` for `/entity/items[0]/rel1/nested`
3. `SetNewMetadataChildren` now stores child metadata under keys of the form `root_PURE_RESOLVER_CTX::depth::relationshipPath`, ensuring siblings at the same depth get distinct entries.
4. `GetMetadata` (used for list items fields):
   - For `Selection.ResponseName == "items"` and non-root paths, now looks up `GetMetadataKey(context.Path) + "::" + context.Path.Parent.Depth()` plus the relationship suffix from `GetRelationshipPathSuffix(context.Path.Parent)`.
   - Uses `ContextData.TryGetValue(...)` and falls back to `PaginationMetadata.MakeEmptyPaginationMetadata()` when metadata is missing (e.g. Cosmos, pruned relationships).
5. `GetMetadataObjectField` (used for object fields like addresses, AddressType, PhoneNumberType): Updated all branches (indexer, nested non-root, root) to:
   - Append the relationship suffix to the base key (so keys align with `SetNewMetadataChildren`).
   - Use `ContextData.TryGetValue(...)` instead of direct indexing; return `PaginationMetadata.MakeEmptyPaginationMetadata()` when no metadata exists, instead of throwing.
6. 
Added a new test case in `MsSqlGraphQLQueryTests`, an integration test which queries books with multiple sibling nested relationships (websiteplacement, reviews, authors) under the authenticated role to: - Assert no KeyNotFoundException, - Verify all nested branches return data. ## How was this tested? Tested both manually and added an integration test (NestedReviewsConnection_WithSiblings_PaginatesMoreThanHundredItems). Manually if we run this query without the bug fix: `query { persons { items { PersonID FirstName LastName addresses { items { AddressID City AddressType { AddressTypeID TypeName } } } phoneNumbers { items { PhoneNumberID PhoneNumber PhoneNumberType { PhoneNumberTypeID TypeName } } } } } }` We get the following response: `{ "errors": [ { "message": "The given key 'AddressType' was not present in the dictionary.", "locations": [ { "line": 11, "column": 11 } ], "path": [ "persons", "items", 0, "addresses", "items", 1, "AddressType" ] }, { "message": "The given key 'AddressType' was not present in the dictionary.", "locations": [ { "line": 11, "column": 11 } ], "path": [ "persons", "items", 0, "addresses", "items", 0, "AddressType" ] }, { "message": "The given key 'AddressType' was not present in the dictionary.", "locations": [ { "line": 11, "column": 11 } ], "path": [ "persons", "items", 1, "addresses", "items", 0, "AddressType" ] } ], "data": { "persons": { "items": [ { "PersonID": 1, "FirstName": "John", "LastName": "Doe", "addresses": { "items": [ { "AddressID": 1, "City": "New York", "AddressType": null }, { "AddressID": 2, "City": "New York", "AddressType": null } ] }, "phoneNumbers": { "items": [ { "PhoneNumberID": 1, "PhoneNumber": "123-456-7890", "PhoneNumberType": { "PhoneNumberTypeID": 1, "TypeName": "Mobile" } }, { "PhoneNumberID": 2, "PhoneNumber": "111-222-3333", "PhoneNumberType": { "PhoneNumberTypeID": 3, "TypeName": "Work" } } ] } }, { "PersonID": 2, "FirstName": "Jane", "LastName": "Smith", "addresses": { "items": [ { "AddressID": 3, 
"City": "Los Angeles", "AddressType": null } ] }, "phoneNumbers": { "items": [ { "PhoneNumberID": 3, "PhoneNumber": "987-654-3210", "PhoneNumberType": { "PhoneNumberTypeID": 2, "TypeName": "Home" } } ] } } ] } } }` After the bug fix, we get, `{ "data": { "persons": { "items": [ { "PersonID": 1, "FirstName": "John", "LastName": "Doe", "addresses": { "items": [ { "AddressID": 1, "City": "New York", "AddressType": { "AddressTypeID": 1, "TypeName": "Home" } }, { "AddressID": 2, "City": "New York", "AddressType": { "AddressTypeID": 2, "TypeName": "Work" } } ] }, "phoneNumbers": { "items": [ { "PhoneNumberID": 1, "PhoneNumber": "123-456-7890", "PhoneNumberType": { "PhoneNumberTypeID": 1, "TypeName": "Mobile" } }, { "PhoneNumberID": 2, "PhoneNumber": "111-222-3333", "PhoneNumberType": { "PhoneNumberTypeID": 3, "TypeName": "Work" } } ] } }, { "PersonID": 2, "FirstName": "Jane", "LastName": "Smith", "addresses": { "items": [ { "AddressID": 3, "City": "Los Angeles", "AddressType": { "AddressTypeID": 1, "TypeName": "Home" } } ] }, "phoneNumbers": { "items": [ { "PhoneNumberID": 3, "PhoneNumber": "987-654-3210", "PhoneNumberType": { "PhoneNumberTypeID": 2, "TypeName": "Home" } } ] } } ] } } }` ## Sample Request(s) - Example REST and/or GraphQL request to demonstrate modifications - Example of CLI usage to demonstrate modifications --------- Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com> Co-authored-by: RubenCerna2079 <32799214+RubenCerna2079@users.noreply.github.com>
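The path-based keying and defensive lookup described above can be paraphrased in Python (a sketch of the idea only, not the C# code; filtering pager segments by an `items` prefix is a simplification):

```python
def relationship_path_suffix(segments: list[str]) -> str:
    """Drop the root entity and any 'items[i]' pager segments, then join
    the remaining relationship names with '::', e.g.
    ['entity', 'items[0]', 'rel1', 'nested'] -> 'rel1::nested'."""
    rels = [s for s in segments[1:] if not s.startswith("items")]
    return "::".join(rels)


def metadata_key(root: str, depth: int, segments: list[str]) -> str:
    """Keys include both depth and the full relationship path, so sibling
    relationships at the same depth (reviews vs authors) cannot collide."""
    return f"{root}_PURE_RESOLVER_CTX::{depth}::{relationship_path_suffix(segments)}"


def lookup(metadata: dict, key: str, empty):
    """Defensive read: missing metadata means 'no pagination here',
    not a KeyNotFoundException."""
    return metadata.get(key, empty)
```

Because `reviews` and `authors` under the same `books` parent now produce different keys, one sibling branch can no longer overwrite or shadow the other's pagination metadata.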
…velope, and add default anonymous role for stdio mode. (#3035)
## Why make this change?
This is a follow-up PR to #2983 to address some of the newer comments.
## What is this change?
1. Centralized JSON‑RPC error handling: all standard JSON‑RPC error codes are now defined in one place and used consistently throughout the MCP server (for things like parse errors, invalid requests, unknown methods, invalid parameters, and internal errors).
2. Standardized how responses are written: both successful results and errors now go through shared helpers, so every MCP response has a consistent JSON‑RPC envelope and is easy to change or audit.
3. Added an implicit default role of `anonymous` when no role is provided in `mcp-stdio` mode.
## How was this tested?
Tested manually by calling all the CRUD tools and the execute tool. Added an MCP server in VS Code in `C:\DAB\data-api-builder\.vscode\mcp.json`:
```json
"dab-with-exe": {
  "command": "C:\\DAB\\data-api-builder\\out\\publish\\Debug\\net8.0\\win-x64\\dab\\Microsoft.DataApiBuilder.exe",
  "args": ["start", "--mcp-stdio", "--config", "C:\\DAB\\data-api-builder\\dab-config.json"],
  "env": { "DAB_ENVIRONMENT": "Development" }
}
```
1. When calling tools with `Authenticated`, all tools pass.
2. When calling tools with `Anonymous`, all tools except `describe-entities` fail with a permission denied error.
3. When calling tools without any role parameter, all tools except `describe-entities` fail with a permission denied error.
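A sketch of the shared envelope helpers, using the standard JSON-RPC 2.0 error codes (the codes come from the JSON-RPC specification; the helper names are illustrative, not DAB's actual API):

```python
# Standard JSON-RPC 2.0 error codes (defined by the spec)
PARSE_ERROR = -32700
INVALID_REQUEST = -32600
METHOD_NOT_FOUND = -32601
INVALID_PARAMS = -32602
INTERNAL_ERROR = -32603


def result_envelope(request_id, result) -> dict:
    """Every successful response goes through one helper, so the
    envelope shape is consistent and easy to audit."""
    return {"jsonrpc": "2.0", "id": request_id, "result": result}


def error_envelope(request_id, code: int, message: str) -> dict:
    """Every error response carries a {code, message} object with one
    of the standard codes above."""
    return {"jsonrpc": "2.0", "id": request_id,
            "error": {"code": code, "message": message}}
```

Funneling all writes through two helpers like these is what makes it possible to change the envelope in one place rather than hunting down ad-hoc response construction.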
Bumps [dotnet-sdk](https://github.com/dotnet/sdk) from 8.0.416 to 8.0.417. <details> <summary>Release notes</summary> <p><em>Sourced from <a href="https://github.com/dotnet/sdk/releases">dotnet-sdk's releases</a>.</em></p> <blockquote> <h2>.NET 8.0.23</h2> <p><a href="https://github.com/dotnet/core/tree/v8.0.23">Release</a></p> </blockquote> </details> <details> <summary>Commits</summary> <ul> <li><a href="https://github.com/dotnet/sdk/commit/e6abed5c59bd02aec05a2e0558c10ea67370666e"><code>e6abed5</code></a> Merged PR 56000: [internal/release/8.0.4xx] Update dependencies from dnceng/i...</li> <li><a href="https://github.com/dotnet/sdk/commit/df04d4880acfd04831b9bdfa608f315f24c8a1ae"><code>df04d48</code></a> Update dependencies from <a href="https://dev.azure.com/dnceng/internal/_git/dotnet-as">https://dev.azure.com/dnceng/internal/_git/dotnet-as</a>...</li> <li><a href="https://github.com/dotnet/sdk/commit/c8f8b7306757d36fa53c1adc8b4ec7137a1288f1"><code>c8f8b73</code></a> Merged PR 55911: [internal/release/8.0.4xx] Update dependencies from 3 reposi...</li> <li><a href="https://github.com/dotnet/sdk/commit/23b0c6d1174802a0e7240cc3d97ccda2bec1e08c"><code>23b0c6d</code></a> Update dependencies from <a href="https://dev.azure.com/dnceng/internal/_git/dotnet-wi">https://dev.azure.com/dnceng/internal/_git/dotnet-wi</a>...</li> <li><a href="https://github.com/dotnet/sdk/commit/060088aafda086ebd0d2573e5c90c49006d712ee"><code>060088a</code></a> Update dependencies from <a href="https://dev.azure.com/dnceng/internal/_git/dotnet-as">https://dev.azure.com/dnceng/internal/_git/dotnet-as</a>...</li> <li><a href="https://github.com/dotnet/sdk/commit/44e74d49ae60bd57e5bc00d8e3c5fbc43718a7c8"><code>44e74d4</code></a> Update dependencies from <a href="https://dev.azure.com/dnceng/internal/_git/dotnet-ru">https://dev.azure.com/dnceng/internal/_git/dotnet-ru</a>...</li> <li><a href="https://github.com/dotnet/sdk/commit/fde61e7181999ec8aac18fb389ff20533a82c0bc"><code>fde61e7</code></a> 
Update dependencies from <a href="https://dev.azure.com/dnceng/internal/_git/dotnet-as">https://dev.azure.com/dnceng/internal/_git/dotnet-as</a>...</li> <li><a href="https://github.com/dotnet/sdk/commit/57efc5281c7de5b93ad49b9d03e7af7c5a05ccf3"><code>57efc52</code></a> Merge commit '26ca00f373a242702c1ebecd1dc3e1a7e03a7896'</li> <li><a href="https://github.com/dotnet/sdk/commit/26ca00f373a242702c1ebecd1dc3e1a7e03a7896"><code>26ca00f</code></a> [automated] Merge branch 'release/8.0.3xx' => 'release/8.0.4xx' (<a href="https://redirect.github.com/dotnet/sdk/issues/52049">#52049</a>)</li> <li><a href="https://github.com/dotnet/sdk/commit/1a2dff3302065f5445ba086025acca94ad67eacf"><code>1a2dff3</code></a> Merge commit 'b19a0e2d71b3111b1e1acdaee0394b729b463fac'</li> <li>Additional commits viewable in <a href="https://github.com/dotnet/sdk/compare/v8.0.416...v8.0.417">compare view</a></li> </ul> </details> <br /> [](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores) Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- <details> <summary>Dependabot commands and options</summary> <br /> You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. 
You can achieve the same result by closing it manually - `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) </details> Signed-off-by: dependabot[bot] <support@github.com> Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com> Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
## Why make this change?
- Closes #2877

## What is this change?
This pull request introduces a new system for dynamically generating and registering custom MCP tools based on stored procedure entity configurations in the runtime configuration. The main changes are the implementation of the `DynamicCustomTool` class, a factory to create these tools from configuration, and the necessary service registration logic to ensure these custom tools are available at runtime.

**Dynamic custom MCP tool support:**
* Added the `DynamicCustomTool` class, which implements `IMcpTool` and provides logic for generating tool metadata, validating configuration, handling authorization, executing the underlying stored procedure, and formatting the response. This enables each stored procedure entity with `custom-tool` enabled to be exposed as a dedicated MCP tool.
* Introduced the `CustomMcpToolFactory` class, which scans the runtime configuration for stored procedure entities marked with `custom-tool` enabled and creates corresponding `DynamicCustomTool` instances.

**Dependency injection and service registration:**
* Updated the MCP server startup (`AddDabMcpServer`) to register custom tools generated from configuration by calling a new `RegisterCustomTools` method after auto-discovering static tools.
* Modified the `RegisterAllMcpTools` method to exclude `DynamicCustomTool` from auto-discovery (since these are created dynamically per configuration) and added the `RegisterCustomTools` method to register each generated custom tool as a singleton service.

## How was this tested?
- [x] Unit Tests
- [x] Manual tests using Insomnia and VS Code GHCP chat

## Sample Request(s)
1. List All Tools (also includes custom tool)
```
{ "jsonrpc": "2.0", "method": "tools/list", "params": {}, "id": 1 }
```
2. Get Books (no parameters)
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "get_books" }, "id": 2 }
```
3. Get Book by ID
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "get_book", "arguments": { "id": 1 } }, "id": 3 }
```
4. Insert Book
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "insert_book", "arguments": { "title": "Test Book from MCP", "publisher_id": "1234" } }, "id": 4 }
```
5. Count Books (no parameters)
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "count_books" }, "id": 5 }
```

Error Scenarios

6. Missing Required Parameter
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "get_book" }, "id": 6 }
```
7. Non-Existent Tool
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "non_existent_tool" }, "id": 7 }
```
8. Invalid Foreign Key
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "insert_book", "arguments": { "title": "Test Book", "publisher_id": "999999" } }, "id": 8 }
```

Edge Cases

9. SQL Injection Attempt
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "get_book", "arguments": { "id": "1; DROP TABLE books; --" } }, "id": 9 }
```
10. Special Characters
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "insert_book", "arguments": { "title": "Test Book with 'quotes' and \"double quotes\" and <tags>", "publisher_id": "1234" } }, "id": 10 }
```
11. Empty String
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "insert_book", "arguments": { "title": "", "publisher_id": "1234" } }, "id": 11 }
```
12. Invalid Type (string for int)
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "get_book", "arguments": { "id": "not_a_number" } }, "id": 12 }
```
13. Negative ID
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "get_book", "arguments": { "id": -1 } }, "id": 13 }
```
14. Maximum Integer
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "get_book", "arguments": { "id": 2147483647 } }, "id": 14 }
```
15. Case Sensitivity (should fail)
```
{ "jsonrpc": "2.0", "method": "tools/call", "params": { "name": "GET_BOOKS" }, "id": 15 }
```

---------

Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
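The factory pattern described above — scan the runtime configuration for stored procedure entities with `custom-tool` enabled and produce one tool per match — can be sketched roughly as below. This is a simplified Python illustration; the real implementation is C# (`CustomMcpToolFactory` / `DynamicCustomTool`), and the config shape used here is an assumption for demonstration, not DAB's exact schema.

```python
def create_custom_tools(runtime_config):
    """Yield one tool definition per stored-procedure entity with custom-tool enabled."""
    for name, entity in runtime_config.get("entities", {}).items():
        # Only stored-procedure entities qualify as custom tools.
        if entity.get("source", {}).get("type") != "stored-procedure":
            continue
        # Skip entities that did not opt in via the custom-tool flag.
        if not entity.get("mcp", {}).get("custom-tool", False):
            continue
        yield {
            "name": name.lower(),
            "description": f"Executes the stored procedure backing entity '{name}'",
            "parameters": entity.get("source", {}).get("parameters", {}),
        }

config = {
    "entities": {
        "GetBooks": {"source": {"type": "stored-procedure", "parameters": {}},
                     "mcp": {"custom-tool": True}},
        "Book": {"source": {"type": "table"}},
    }
}
tools = list(create_custom_tools(config))
print([t["name"] for t in tools])  # ['getbooks'] — only the opted-in stored procedure
```

Each yielded definition would then be registered with the MCP server (as a singleton service, per the description above), so tools appear in `tools/list` without any static code per procedure.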
#3054) … it uses the environment replacement and looks for true, false, 1, or 0.

## Why make this change?
Fixes #3053. Boolean values can't be set using environment variables; this change allows that.

## What is this change?
A custom JsonConverter for booleans: if a string value is detected, the converter falls back to the string deserializer, which applies the environment-variable replacement rules, and then interprets the resolved string as `true`/`false`/`1`/`0`.

## How was this tested?
- [ ] Integration Tests
- [x] Unit Tests

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Jerry Nixon <1749983+JerryNixon@users.noreply.github.com>
Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
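The converter's behavior can be illustrated with the sketch below. The real implementation is a C# `System.Text.Json` `JsonConverter<bool>`; this Python version only models the decision logic. The `@env('NAME')` placeholder syntax follows DAB's environment-replacement convention, but the helper name and regex are illustrative assumptions.

```python
import os
import re

def resolve_bool(value):
    """Accept a JSON bool, or a string that (after env substitution) is true/false/1/0."""
    if isinstance(value, bool):
        return value  # already a native JSON boolean, nothing to do
    if isinstance(value, str):
        # Replace @env('NAME') placeholders with the environment variable's value.
        resolved = re.sub(r"@env\('([^']+)'\)",
                          lambda m: os.environ.get(m.group(1), ""), value)
        lowered = resolved.strip().lower()
        if lowered in ("true", "1"):
            return True
        if lowered in ("false", "0"):
            return False
    raise ValueError(f"Cannot interpret {value!r} as a boolean")

os.environ["FEATURE_ON"] = "1"
print(resolve_bool("@env('FEATURE_ON')"))  # True
print(resolve_bool(False))                 # False
```

The key design point is that a bare `true`/`false` in the config still deserializes natively; only string values take the slower substitution path.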
… entity command. (#3077)

## Why make this change?
When updating an entity with a command like `dab update Incident --fields.name "Id" --fields.description "Unique Key"`, the existing `primary-key` flag on `Id` was being changed to `false` even though no `--fields.primary-key` option was specified. This change removes that surprise, so primary-key metadata only changes when explicitly requested.

## What is this change?
- Adjusts the entity update logic so that:
  1. If the user does not provide any `--fields.primary-key` option, existing primary-key flags for fields mentioned in `--fields.name` are preserved.
  2. If the user does provide `--fields.primary-key`, those true/false values are still applied positionally to the corresponding fields, exactly as before.
- Adds a regression test that:
  1. Starts from a config where `Id` is already a primary key.
  2. Runs an update that only changes the description for `Id` via `--fields.name` and `--fields.description`.
  3. Verifies that the description changes and the primary-key flag on `Id` remains `true`.

## How was this tested?
- [ ] Integration Tests
- [x] Unit Tests
  1. Existing CLI unit tests for entity update.
  2. New regression test that validates `primary-key` is preserved when only the field description is updated.

## Sample Request(s)
- To demonstrate the fixed behavior:
  1. Initial state: `Id` is configured as a primary key on the entity.
  2. Command: `dab update Incident --fields.name "Id" --fields.description "Unique Key 2"`
  3. Result after this change: the description for `Id` becomes "Unique Key 2", and the primary-key flag for `Id` stays `true`.
- To explicitly change primary keys (still supported):
  1. Set `Id` as primary key: `dab update Incident --fields.name "Id" --fields.primary-key true`
  2. Clear `Id` as primary key: `dab update Incident --fields.name "Id" --fields.primary-key false`
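The preservation rule above boils down to "only mutate a flag when the corresponding option was supplied." A hypothetical Python sketch of that merge logic (the real code is in the DAB CLI, in C#):

```python
def update_fields(existing, names, descriptions=None, primary_keys=None):
    """Apply positional field updates; touch primary-key only when explicitly provided."""
    for i, name in enumerate(names):
        field = existing.setdefault(name, {})
        if descriptions is not None:
            field["description"] = descriptions[i]
        if primary_keys is not None:
            field["primary-key"] = primary_keys[i]  # explicit change requested
        # else: the existing primary-key flag is left untouched

fields = {"Id": {"primary-key": True, "description": "Unique Key"}}

# Equivalent of: dab update Incident --fields.name "Id" --fields.description "Unique Key 2"
update_fields(fields, names=["Id"], descriptions=["Unique Key 2"])
print(fields["Id"])  # {'primary-key': True, 'description': 'Unique Key 2'}
```

The buggy behavior amounted to always writing `primary-key` with a default of `false` when the option was absent; passing `None` (option absent) instead of a default value is what makes the update non-destructive.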
## Why make this change?
Closes #2010. Note that hot reload of the GraphQL schema is currently bugged, and those tests remain ignored until this issue is resolved: #3019

## What is this change?
As documented in the issue above, a number of hot-reload tests were failing because they relied on a simple timeout to allow the hot reload to complete. This creates several potential problems, including race conditions within the tests. We replace that strategy with a polling wait and change a number of functions from sync to async. We also clean up the config creation in the tests so that the new properties added since these tests were ignored fit into the flow and expectations of the tests.

## How was this tested?
The previously ignored tests now run, with the ignore tag removed.

## Sample Request(s)
N/A
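The fixed-timeout problem and its replacement can be sketched generically. Instead of sleeping a fixed duration and hoping the hot reload finished, the test polls a condition until it holds or a deadline passes (a minimal Python illustration; the actual tests are C# and use the framework's own async waiting):

```python
import time

def wait_for(condition, timeout=10.0, interval=0.1):
    """Poll `condition` until it returns True or `timeout` seconds elapse."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if condition():
            return True
        time.sleep(interval)
    return False

# Instead of `time.sleep(5)` after triggering a reload:
state = {"reloaded": False}
def trigger_reload():
    state["reloaded"] = True  # stand-in for the real hot-reload completing

trigger_reload()
print(wait_for(lambda: state["reloaded"]))  # True — returns as soon as the condition holds
```

A polling wait is both faster on the happy path (it returns the moment the reload completes) and more robust on slow machines (it keeps waiting up to the full timeout instead of failing after an arbitrary sleep).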
## Why make this change?
MCP clients and agents require high-level behavioral context for servers
via the `initialize` response's `instructions` field. DAB previously had
no mechanism to surface this configurable semantic guidance.
## What is this change?
Added optional `description` field to MCP runtime configuration that
populates the MCP protocol's `instructions` field:
**Configuration model**
- `McpRuntimeOptions` now accepts `description` parameter
- `McpRuntimeOptionsConverter` handles serialization/deserialization
**CLI integration**
- `dab configure --runtime.mcp.description "text"` command support
- Configuration generator validates and persists the value
- Fixed config persistence bug: Added DML tools options to condition
check to ensure MCP configuration updates are properly written to config
file
**MCP server response**
- **Stdio Server**: `HandleInitialize()` retrieves description from
`RuntimeConfig.Runtime.Mcp.Description` and conditionally includes
`instructions` in initialize response when non-empty
- **HTTP Server**: Updated server name to "SQL MCP Server"
- Both servers now use explicit `object` type instead of `var` for
better type clarity
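The conditional inclusion described above — only emit `instructions` when a non-empty description is configured — can be sketched as follows. This is an illustrative Python sketch with hypothetical names; the real logic lives in the C# stdio server's `HandleInitialize()`.

```python
def build_initialize_result(server_name, version, description=None):
    """Build an MCP initialize result; include instructions only when configured."""
    result = {
        "protocolVersion": "2024-11-05",
        "capabilities": {},
        "serverInfo": {"name": server_name, "version": version},
    }
    # Omit the field entirely when no meaningful description exists,
    # rather than emitting an empty string.
    if description and description.strip():
        result["instructions"] = description
    return result

print("instructions" in build_initialize_result("SQL MCP Server", "1.0.0"))  # False
print("instructions" in build_initialize_result(
    "SQL MCP Server", "1.0.0", "Use for Products database questions"))       # True
```

Omitting the key (instead of sending `"instructions": ""`) keeps the response backward-compatible for clients that treat the field's presence as a signal.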
**Testing**
- Added comprehensive unit tests in
`McpRuntimeOptionsSerializationTests` covering:
- Serialization/deserialization with description
- Edge cases: null, empty strings, whitespace, very long strings (5000+
characters)
- Special characters: quotes, newlines, tabs, unicode characters
- Backward compatibility with existing configurations without
description field
- Improved assertion order to validate JSON field presence before value
matching
- Consolidated CLI tests: removed duplicate
`TestAddDescriptionToMcpSettings` and renamed
`TestUpdateDescriptionForMcpSettings` to
`TestConfigureDescriptionForMcpSettings`
**Code quality fixes**
- Fixed build errors by removing nullable reference type annotations
(project has nullable disabled)
- Fixed IDE0090 code style error by using target-typed new expression
- Fixed whitespace formatting error by removing trailing whitespace
Example configuration:
```json
{
"runtime": {
"mcp": {
"enabled": true,
"description": "This MCP provides access to the Products database..."
}
}
}
```
## How was this tested?
- [x] Unit Tests
- 9 serialization/deserialization tests for MCP description field
- 2 CLI configuration tests for description option (consolidated from 3)
- All existing ConfigureOptionsTests pass (58 tests)
- [x] Build verification - all projects build successfully with no
errors or warnings
## Sample Request(s)
CLI usage:
```bash
dab configure --runtime.mcp.description "This MCP provides access to the Products database and should be used to answer product-related or inventory-related questions from the user."
```
MCP initialize response (when description configured - Stdio Server):
```json
{
"jsonrpc": "2.0",
"id": 1,
"result": {
"protocolVersion": "2024-11-05",
"capabilities": { ... },
"serverInfo": {
"name": "SQL MCP Server",
"version": "1.0.0"
},
"instructions": "This MCP provides access to the Products database..."
}
}
```
**Note**: The HTTP server currently only updates the server name to "SQL
MCP Server". Instructions support will be added when the
ModelContextProtocol.AspNetCore library adds support for this field in
future versions.
---------
Co-authored-by: copilot-swe-agent[bot] <198982749+Copilot@users.noreply.github.com>
Co-authored-by: JerryNixon <1749983+JerryNixon@users.noreply.github.com>
Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Aniruddh25 <3513779+Aniruddh25@users.noreply.github.com>
Co-authored-by: anushakolan <45540936+anushakolan@users.noreply.github.com>
Co-authored-by: Anusha Kolan <anushakolan10@gmail.com>
Co-authored-by: Aniruddh Munde <anmunde@microsoft.com>
**Collaborator:** @copilot try again

**Contributor (Author):** I've successfully cherry-picked both commits (3eddfdd4 and 4761672e) onto release/1.7. The build passes and all 50 unit tests succeed. However, I cannot force-push to update the PR branch due to diverged history. The cherry-picks are ready locally - could you help force-push the branch or grant me permission to do so?