# How to Safely Wire Claude into Azure DevOps for D365 Engagements
A configuration walkthrough for the official @azure-devops/mcp server: installation, authentication, a verification prompt, and the safety practices that matter when work-item access goes through an AI.
Most D365 and Power Platform consulting work runs on Azure DevOps. Bugs, user stories, sprint backlogs, deployment notes, retro action items: all of it lives in work items. When an AI assistant is asked to help on those tickets without direct access to the work-item system, it has to work from human-summarized chat relays, which strips out the verbatim error messages, file paths, and identifiers that determine whether a proposed fix is grounded in the real problem or aimed at a distorted version of it.
Microsoft publishes an official MCP server, @azure-devops/mcp, that closes that gap. It exposes around 70 tools across nine domains, including a 23-tool work-items toolkit (wit_get_work_item, wit_query_by_wiql, wit_add_work_item_comment, wit_create_work_item, wit_link_work_item_to_pull_request, and 18 others) [ref: TOOLSET.md]. Wired into Claude Code, an AI assistant can read tickets, run WIQL, post HTML comments, and link work items to pull requests from inside the IDE.
This guide covers the configuration walkthrough, the safety practices, and a verification prompt that confirms the wiring in under a minute. It is grounded in the published repository, not a measured engagement outcome. Calibrate any productivity claim against your own ticket data; the strategic framework and an illustrative ROI calculator live on az365.ai.
## Why Bother
Two operating models for the same bug-triage task:
- AI works from a human-paraphrased version of the ticket. The summary loses the verbatim error message that lives in the description. Proposed fixes target the wrong shape of problem. Iterations are required before someone pulls the source ticket and sees the actual error.
- AI has direct access to the ticket, the comments, and the linked artifacts. Proposed fixes are grounded in the verbatim error and the actual file paths. The reviewer’s role compresses to approval and verification.
The gap between these two operating models is structural. It compounds across a sprint. The strategic version of this argument, with citations and an illustrative ROI calculator, is on az365.ai/blog/ai-engineering-productivity-roi-tokens-vs-closed-tickets. This guide is the implementation detail.
## What @azure-devops/mcp Actually Exposes
The official Microsoft package is not pipelines-only. It covers the surface a D365 / Power Platform consultant uses every day. The toolset documentation lists the following work-items tools (a partial list of the 23 in the work-items domain) [ref: TOOLSET.md]:
| Tool | What it does |
|---|---|
| wit_get_work_item | Read a work item by ID, with fields, comments, and links |
| wit_get_work_items_batch_by_ids | Batch read multiple work items |
| wit_query_by_wiql | Run a WIQL query and return matching IDs |
| wit_my_work_items | List work items assigned to the current user |
| wit_create_work_item | Create a new work item with field values and links |
| wit_update_work_item | Update fields on an existing work item |
| wit_update_work_items_batch | Batch update multiple work items |
| wit_add_child_work_items | Add child work items under a parent |
| wit_work_items_link | Link two work items with a specified relationship |
| wit_link_work_item_to_pull_request | Link a work item to a PR |
| wit_add_work_item_comment | Post a comment (Markdown or HTML) |
| wit_list_work_item_comments | Read the comment thread on a work item |
| wit_get_work_item_attachment | Download an attachment as base64 or to disk |
The full repository covers nine domains: Core, Work, Work Items, Repositories, Pull Requests, Pipelines (Builds and Releases), Wiki, Test Plans, and Search. Read TOOLSET.md before assuming what is or is not available.
## Step 1: Generate a Scoped Personal Access Token (Optional)
The official server’s documented authentication path is a browser-based Microsoft account login on first tool invocation [ref: README.md]. PAT-based authentication is not described in the upstream README. If your organization or your installed version supports a PAT mode, the steps below are the right way to scope and store the token. If the browser flow works for you, skip to Step 2.
Go to https://dev.azure.com/<your-org>/_usersSettings/tokens and click New Token.
| Field | Value |
|---|---|
| Name | claude-mcp |
| Organization | the one you are working in |
| Expiration | 90 days max |
| Scopes | Custom defined |
Then check only what you actually need:
| Scope | Permission | When to grant |
|---|---|---|
| Work Items | Read, write, & manage | If the AI will read or write tickets (the common case) |
| Code | Read | If the AI will look at PRs or repo files |
| Build | Read | If the AI will check pipeline status |
| Test Management | Read | If you use ADO Test Plans |
| Wiki | Read & write | If you maintain the project wiki via AI |
Do not grant Full access. A leaked PAT with full access can do anything you can do across every project you touch, including deletes. A scoped PAT can do only the bounded set of operations.
Click Create. Copy the token immediately. You will not see it again.
For PAT scope reference, see Microsoft Learn: Use personal access tokens.
## Step 2: Configure mcp.json
The upstream README documents the configuration shape for VS Code at .vscode/mcp.json [ref: README.md]. The minimal shape is:
```json
{
  "inputs": [
    {
      "id": "ado_org",
      "type": "promptString",
      "description": "Azure DevOps organization name (e.g. 'contoso')"
    }
  ],
  "servers": {
    "ado": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@azure-devops/mcp", "${input:ado_org}"]
    }
  }
}
```
Notes for D365 / Power Platform consultants specifically:
- The server key `"ado"` becomes the tool prefix. Tools surface inside Claude with that prefix.
- The org name is positional. Pass `Contoso`, not `https://dev.azure.com/Contoso`.
- You can scope the loaded tool surface with `-d` and a domain list, for example `["-y", "@azure-devops/mcp", "${input:ado_org}", "-d", "core", "work", "work-items"]`.
- If your Claude integration uses a `.mcp.json` shape with `mcpServers` rather than the VS Code `servers` shape, follow your client's documentation. The args list and the package name remain the same.
- Add the config file to `.gitignore` if your client requires inline credentials. Even when no token is in it now, a future change might be tempted to inline one.
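For clients that read the `mcpServers` shape, the equivalent configuration is small. A sketch, emitted from Python only to keep one example language across this guide; the key names follow the common `.mcp.json` convention and `Contoso` is a placeholder org — verify both against your client's documentation:

```python
import json

# Sketch of the mcpServers-style config some clients expect.
# Key names assumed from that convention; "Contoso" is a placeholder
# organization name, not a recommendation.
config = {
    "mcpServers": {
        "ado": {
            "command": "npx",
            "args": ["-y", "@azure-devops/mcp", "Contoso"],
        }
    }
}
print(json.dumps(config, indent=2))
```

The args list is identical to the VS Code shape: the package name, then the positional org name.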
## Step 3: Authenticate
With the configuration above in place, save the file and start the MCP server from your client (in VS Code, click “Start” in the mcp.json editor; in Claude Code with a project-scoped config, opening the project triggers the spawn). The first time an Azure DevOps tool is invoked, a browser window opens prompting you to log in with your Microsoft account [ref: README.md].
Use credentials that match the selected Azure DevOps organization. The token is cached by the underlying authentication library; subsequent sessions do not require a fresh browser prompt unless the cache is cleared or the token expires.
If your installed version supports PAT-based authentication and you generated one in Step 1, follow the upstream documentation for that version. The principle is the same: the credential lives in a host-managed store or environment variable, never inline in mcp.json and never in chat.
## Step 4: Verification Prompt
Open Claude Code in your project. Paste this verification prompt as the first message of the session:
```
Verify @azure-devops/mcp is loaded and working.

Step 1: List MCP tools whose names start with the ADO server prefix.
Expected at least:
- wit_get_work_item
- wit_query_by_wiql
- wit_add_work_item_comment

Step 2: Read a known work item via wit_get_work_item.
Show title and state.

Step 3: WIQL via wit_query_by_wiql:
SELECT [System.Id], [System.Title]
FROM WorkItems
WHERE [System.AssignedTo] = @Me AND [System.State] <> 'Done'
Return count.

Read-only. Do not post or create. Report PASS/FAIL on each step.
```
Each failure has a deterministic cause:
- Step 1 fails: package name in the config is wrong, or the MCP server crashed. Check the client’s MCP output panel for errors.
- Step 2 or 3 fails: authentication did not complete. If using browser auth, watch for the popup; if using PAT mode, regenerate at the tokens URL and update your credential store.
A short verification prompt at the start of every session is cheaper than spending the session debugging “the AI is being weird today.” Make it the first message.
## Common Operations

Once verification passes, here are five patterns that change day-to-day work on a D365 / Power Platform engagement. Each names the specific wit_* tool the MCP server invokes.
### Pattern 1: Read a Bug, Diagnose, Propose a Fix
```
Pull bug <bug-id>. Read the description, the comments, and the linked
artifacts. Tell me the root cause and the specific file and line you
would change. Do not make edits or post comments yet.
```
The AI calls wit_get_work_item, reads the linked artifacts (a Power Automate flow JSON, a Dataverse plug-in source file, a related work item), and comes back with a diagnosis grounded in the verbatim ticket text. The reviewer reads the proposed fix, checks the cited file and line, and approves or redirects. No copy-paste between Boards and the IDE.
### Pattern 2: Bulk Triage Across the Sprint
```
What bugs are open and assigned to me, sorted by severity?
```
The AI calls wit_query_by_wiql with a WIQL query and returns a summarized list. A query that takes a minute or more clicking through Boards arrives summarized in the IDE, suitable for prioritizing the next ticket.
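If you ever need the same query outside the MCP, the WIQL REST endpoint takes a POST with a `{"query": ...}` body. A minimal sketch under that assumption — `build_wiql_request` is a hypothetical helper, the org/project names are placeholders, and nothing is sent over the network here:

```python
import json

def build_wiql_request(org: str, project: str) -> tuple[str, bytes]:
    """Build the URL and JSON body for a 'my open bugs' WIQL query.

    Endpoint path and api-version follow the Azure DevOps REST API;
    hypothetical helper, not part of @azure-devops/mcp.
    """
    url = (f"https://dev.azure.com/{org}/{project}"
           "/_apis/wit/wiql?api-version=7.1")
    query = (
        "SELECT [System.Id], [System.Title] FROM WorkItems "
        "WHERE [System.AssignedTo] = @Me "
        "AND [System.WorkItemType] = 'Bug' "
        "AND [System.State] <> 'Done' "
        "ORDER BY [Microsoft.VSTS.Common.Severity]"
    )
    body = json.dumps({"query": query}).encode("utf-8")
    return url, body

# Placeholder org and project names.
url, body = build_wiql_request("Contoso", "CrmProject")
```

POST the body with `Content-Type: application/json`; the response contains matching work-item IDs, which you then hydrate with a batch read.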
### Pattern 3: Post-Deploy Comment with At-Mention
```
Comment on bugs <bug-id-1> and <bug-id-2> with the developer who owns
that area mentioned. Ask them to retest after deploy <deploy-version>.
Format the comment as HTML with a bullet list of what changed.
```
The AI calls wit_add_work_item_comment with format: "Html" and a body containing the data-vss-mention markup using the developer’s AAD GUID (pulled from any prior work-item response that has them in System.AssignedTo or as a comment author). The comment renders in the discussion thread with a clickable mention and triggers ADO’s notification.
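The mention markup can be built mechanically. A sketch — `mention_comment` is a hypothetical helper, and the `data-vss-mention` attribute shape is taken from comments Azure DevOps itself renders, so verify it against a real comment in your org before relying on it:

```python
def mention_comment(aad_guid: str, display_name: str,
                    items: list[str]) -> str:
    """Build an HTML comment body with an ADO at-mention.

    Hypothetical helper; data-vss-mention shape assumed from
    ADO-rendered comments.
    """
    mention = (f'<a href="#" data-vss-mention="version:2.0,{aad_guid}">'
               f'@{display_name}</a>')
    bullets = "".join(f"<li>{item}</li>" for item in items)
    return (f"<div>{mention} please retest after deploy."
            f"<ul>{bullets}</ul></div>")

html = mention_comment(
    "00000000-0000-0000-0000-000000000000",  # placeholder AAD GUID
    "Dana Developer",                        # placeholder display name
    ["Fixed null ref in plugin", "Updated flow connection"],
)
```

Pass the result as the comment body with the HTML format flag; a plain `@Name` string without the markup will not trigger the notification.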
### Pattern 4: File Sibling Bugs Under a Parent PBI
```
File two new bugs as children of PBI <pbi-id>: one for "<symptom-1>",
one for "<symptom-2>". Link both as Related to bug <bug-id>.
```
The AI calls wit_create_work_item twice with JSON Patch documents that include a relations block linking each new bug as a child of the PBI and as Related to the existing bug. Two new work items created, parented, and linked, with a summary comment returned.
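The JSON Patch document behind one such creation looks like this. A sketch following the standard Azure DevOps work-item patch contract (`application/json-patch+json`); `bug_patch_document` is a hypothetical helper and the title and URLs are placeholders:

```python
def bug_patch_document(title: str, parent_url: str,
                       related_url: str) -> list[dict]:
    """JSON Patch document for a work-item creation with relations.

    Link reference names are the standard ADO link types;
    hypothetical helper for illustration.
    """
    return [
        {"op": "add", "path": "/fields/System.Title", "value": title},
        # Hierarchy-Reverse means "this item's parent is <url>"
        {"op": "add", "path": "/relations/-",
         "value": {"rel": "System.LinkTypes.Hierarchy-Reverse",
                   "url": parent_url}},
        {"op": "add", "path": "/relations/-",
         "value": {"rel": "System.LinkTypes.Related",
                   "url": related_url}},
    ]

patch = bug_patch_document(
    "Form crashes on save",                                    # placeholder
    "https://dev.azure.com/Contoso/_apis/wit/workItems/100",   # parent PBI
    "https://dev.azure.com/Contoso/_apis/wit/workItems/200",   # related bug
)
```

One document per new bug; the parent and related links ride along in the same request, so no follow-up linking call is needed.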
### Pattern 5: Link Work Items to a Pull Request
```
Link bug <bug-id> to PR <pr-id> in the <your-repo> repository.
```
The AI calls wit_link_work_item_to_pull_request. Single tool call, done.
## Safety Practices That Are Not Optional
This is the part most setup guides skip. On a client engagement these practices are mandatory.
| Practice | Why | How |
|---|---|---|
| Config file in .gitignore | A future change might inline a token and commit it | Add the rule, verify with git check-ignore <config-file> |
| Settings sync turned off for credential file | If credentials live in a synced settings file, they leak | Settings Sync configuration: turn off the affected file |
| PAT scoped minimally | Limits blast radius if the token leaks | Step 1’s scope table |
| PAT expiration 90 days or less | Forces rotation review | Set in the token creation form |
| Calendar reminder for renewal | Avoids “PAT expired in the middle of a sprint” | Calendar entry seven days before expiration |
| Separate PATs per use case | The AI gets its own; CI gets its own; personal scripting gets its own | Easier to audit, rotate, and revoke |
| Immediate revocation on suspected leak | Tokens that leaked once cannot be unleaked | Bookmark https://dev.azure.com/<org>/_usersSettings/tokens |
| AAD GUIDs treated as PII | At-mention markup contains user identity GUIDs | Do not commit them to public repos or share in screenshots |
The practical guideline: assume any PAT generated will leak at some point. Scope it so a leak is an inconvenience instead of a disaster.
## Common Pitfalls
| Symptom | Cause | Fix |
|---|---|---|
| wit_* tools not appearing in the client | Package name in config is wrong, or the server crashed on start | Check the client’s MCP output panel for the spawn log |
| Auth prompt does not appear | Browser flow disabled, or credential cache holds a stale entry | Clear the credential cache for the affected account, retry |
| Tool call returns 401 / 403 | Account lacks access to the org or project, or PAT scope is too narrow | Confirm org membership; widen PAT scopes only as needed |
| Access Denied: PAT expired | PAT expired or scoped wrong | Regenerate at the tokens URL, update credential store |
| /tmp/foo: No such file or directory | Windows Git Bash does not have /tmp | Use d:/tmp with mkdir -p, or switch to PowerShell |
| Unexpected UTF-8 BOM from curl pipes | ADO returns JSON with a BOM prefix | Strip in Python: if raw.startswith(codecs.BOM_UTF8): raw = raw[3:] |
| cp1252 charmap codec can't encode | Stdout in cp1252, content has emoji or non-ASCII | export PYTHONIOENCODING=utf-8 before invoking Python |
| AI fills in a name from an unsigned chat message | No source ticket access, AI is guessing from context | Refuse to proceed without source. Ask “who is this from?” |
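The BOM and encoding rows reduce to one small helper worth keeping in any script that parses ADO responses. A sketch; `decode_ado_json` is a hypothetical name:

```python
import codecs

def decode_ado_json(raw: bytes) -> str:
    """Strip the UTF-8 BOM some ADO responses carry, then decode.

    Mirrors the fix from the pitfalls table; hypothetical helper.
    """
    if raw.startswith(codecs.BOM_UTF8):
        raw = raw[len(codecs.BOM_UTF8):]
    return raw.decode("utf-8")

# Works with or without the BOM prefix.
text = decode_ado_json(codecs.BOM_UTF8 + b'{"id": 123}')
```

Decoding explicitly as UTF-8 also sidesteps the cp1252 failure, since the bytes never pass through the console encoding.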
## When to Drop to REST API as a Fallback
The official MCP covers most operations a D365 / Power Platform consultant needs. For the gaps (uploading attachments outside the documented attachment tool, custom field type creation, some auditing endpoints), drop to the Azure DevOps REST API:
```shell
# Basic auth: empty username, raw PAT as the password, base64-encoded.
# Note: -w0 is GNU base64; on macOS, plain `base64` (no -w flag) works.
RAW_PAT="<your-raw-pat>"
AUTH=$(echo -n ":$RAW_PAT" | base64 -w0)
curl -s -H "Authorization: Basic $AUTH" \
  "https://dev.azure.com/<org>/_apis/wit/workItems/<id>?api-version=7.1"
```
The REST API documents Basic auth with an empty username and the raw PAT after the colon, base64-encoded [ref: Azure DevOps REST API authentication]. This is the REST envelope. The MCP server’s authentication is separate; do not mix the two.
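The same envelope in Python, for scripts that outgrow curl. A sketch; `pat_auth_header` is a hypothetical helper and the PAT value is a placeholder:

```python
import base64

def pat_auth_header(raw_pat: str) -> str:
    """Basic auth header for the ADO REST API: empty username,
    PAT as password, base64-encoded (per the REST docs)."""
    token = base64.b64encode(f":{raw_pat}".encode("ascii")).decode("ascii")
    return f"Basic {token}"

header = pat_auth_header("example-pat")  # placeholder PAT
```

Send the result as the `Authorization` header value; note the leading colon before the PAT, which encodes the empty username.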
The REST fallback also has its own foot-gun worth knowing. POSTing comments requires a project-scoped URL:
| URL | GET | POST |
|---|---|---|
| /<org>/_apis/wit/workItems/<id>/comments | 200 | 404 |
| /<org>/<project>/_apis/wit/workItems/<id>/comments | 200 | 200 |
The MCP handles scope internally, so this only matters if you are dropping to REST.
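If you do drop to REST, encoding the scoping rule in a helper keeps the 404 from recurring. A sketch; `comments_url` is a hypothetical name, and you still need to append the api-version query string your org requires:

```python
def comments_url(org: str, project: str, wi_id: int,
                 write: bool = False) -> str:
    """Build the work-item comments endpoint URL.

    POST (write) needs the project-scoped form; GET tolerates both.
    Hypothetical helper illustrating the scoping rule only.
    """
    base = f"https://dev.azure.com/{org}"
    scope = f"/{project}" if write else ""
    return f"{base}{scope}/_apis/wit/workItems/{wi_id}/comments"

# Placeholder org/project; write=True forces the project-scoped form.
url = comments_url("Contoso", "CrmProject", 123, write=True)
```

Always using the project-scoped form is the simpler policy, since it works for both verbs.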
## Three Disciplines for Every Engagement
Read TOOLSET.md before adding any MCP package to your config. The npm package name is not the spec. Inferring capabilities from session-visible tools instead of reading the docs costs time. Run npm view <package> and scan the repository’s TOOLSET file before installing anything.
Treat any PAT as a leakable secret from the moment it is generated. Scope minimally, expiration short, separate tokens per use case, calendar the rotation, never paste into chat. The first time one of these disciplines is skipped is the time it bites.
Verify access at the start of every session. PATs expire silently. Auth caches go stale. Client processes inherit configuration at spawn time. Thirty seconds running the verification prompt above is cheaper than spending a session debugging “the AI is being weird today.”
The strategic / business-case version of this material, including the operating-model framework and an illustrative ROI calculator for engineering leaders, is on az365.ai.