
How to Safely Wire Claude into Azure DevOps for D365 Engagements

A configuration walkthrough for the official @azure-devops/mcp server: installation, authentication, a verification prompt, and the safety practices that matter when work-item access goes through an AI.

Victoria Pechenizka

Most D365 and Power Platform consulting work runs on Azure DevOps. Bugs, user stories, sprint backlogs, deployment notes, retro action items: all of it lives in work items. When an AI assistant is asked to help on those tickets without direct access to the work-item system, it has to work from human-summarized chat relays, which strip out the verbatim error messages, file paths, and identifiers that determine whether a proposed fix is grounded in the real problem or shaped by a paraphrase of it.

Microsoft publishes an official MCP server, @azure-devops/mcp, that closes that gap. It exposes around 70 tools across nine domains, including a 23-tool work-items toolkit (wit_get_work_item, wit_query_by_wiql, wit_add_work_item_comment, wit_create_work_item, wit_link_work_item_to_pull_request, and 18 others) [ref: TOOLSET.md]. Wired into Claude Code, an AI assistant can read tickets, run WIQL, post HTML comments, and link work items to pull requests from inside the IDE.

This guide is the configuration walkthrough, the safety practices, and a verification prompt that confirms the wiring in under a minute. It is grounded in the published repository, not a measured engagement outcome. Calibrate any productivity claim against your own ticket data; the strategic framework and an illustrative ROI calculator live on az365.ai.

Why Bother

Two operating models for the same bug-triage task:

  • AI works from a human-paraphrased version of the ticket. The summary loses the verbatim error message that lives in the description. Proposed fixes target the wrong shape of problem. Iterations are required before someone pulls the source ticket and sees the actual error.
  • AI has direct access to the ticket, the comments, and the linked artifacts. Proposed fixes are grounded in the verbatim error and the actual file paths. The reviewer’s role compresses to approval and verification.

The gap between these two operating models is structural. It compounds across a sprint. The strategic version of this argument, with citations and an illustrative ROI calculator, is on az365.ai/blog/ai-engineering-productivity-roi-tokens-vs-closed-tickets. This guide is the implementation detail.

What @azure-devops/mcp Actually Exposes

The official Microsoft package is not pipelines-only. It covers the surface a D365 / Power Platform consultant uses every day. The toolset documentation lists the following work-items tools (a partial list of the 23 in the work-items domain) [ref: TOOLSET.md]:

| Tool | What it does |
| --- | --- |
| wit_get_work_item | Read a work item by ID, with fields, comments, and links |
| wit_get_work_items_batch_by_ids | Batch read multiple work items |
| wit_query_by_wiql | Run a WIQL query and return matching IDs |
| wit_my_work_items | List work items assigned to the current user |
| wit_create_work_item | Create a new work item with field values and links |
| wit_update_work_item | Update fields on an existing work item |
| wit_update_work_items_batch | Batch update multiple work items |
| wit_add_child_work_items | Add child work items under a parent |
| wit_work_items_link | Link two work items with a specified relationship |
| wit_link_work_item_to_pull_request | Link a work item to a PR |
| wit_add_work_item_comment | Post a comment (Markdown or HTML) |
| wit_list_work_item_comments | Read the comment thread on a work item |
| wit_get_work_item_attachment | Download an attachment as base64 or to disk |

The full repository covers nine domains: Core, Work, Work Items, Repositories, Pull Requests, Pipelines (Builds and Releases), Wiki, Test Plans, and Search. Read TOOLSET.md before assuming what is or is not available.

Step 1: Generate a Scoped Personal Access Token (Optional)

The official server’s documented authentication path is a browser-based Microsoft account login on first tool invocation [ref: README.md]. PAT-based authentication is not described in the upstream README. If your organization or your installed version supports a PAT mode, the steps below are the right way to scope and store the token. If the browser flow works for you, skip to Step 2.

Go to https://dev.azure.com/<your-org>/_usersSettings/tokens and click New Token.

| Field | Value |
| --- | --- |
| Name | claude-mcp |
| Organization | the one you are working in |
| Expiration | 90 days max |
| Scopes | Custom defined |

Then check only what you actually need:

| Scope | Permission | When to grant |
| --- | --- | --- |
| Work Items | Read, write, & manage | If the AI will read or write tickets (the common case) |
| Code | Read | If the AI will look at PRs or repo files |
| Build | Read | If the AI will check pipeline status |
| Test Management | Read | If you use ADO Test Plans |
| Wiki | Read & write | If you maintain the project wiki via AI |

Do not grant Full access. A leaked PAT with full access can do anything you can do across every project you touch, including deletes. A scoped PAT can do only the bounded set of operations.

Click Create. Copy the token immediately. You will not see it again.

For PAT scope reference, see Microsoft Learn: Use personal access tokens.

Step 2: Configure mcp.json

The upstream README documents the configuration shape for VS Code at .vscode/mcp.json [ref: README.md]. The minimal shape is:

{
  "inputs": [
    {
      "id": "ado_org",
      "type": "promptString",
      "description": "Azure DevOps organization name (e.g. 'contoso')"
    }
  ],
  "servers": {
    "ado": {
      "type": "stdio",
      "command": "npx",
      "args": ["-y", "@azure-devops/mcp", "${input:ado_org}"]
    }
  }
}

Notes for D365 / Power Platform consultants specifically:

  • The server key "ado" becomes the tool prefix. Tools surface inside Claude with that prefix.
  • The org name is positional. Pass Contoso, not https://dev.azure.com/Contoso.
  • You can scope the loaded tool surface with -d and a domain list, for example ["-y", "@azure-devops/mcp", "${input:ado_org}", "-d", "core", "work", "work-items"].
  • If your Claude integration uses a .mcp.json shape with mcpServers rather than the VS Code servers shape, follow your client’s documentation. The args list and the package name remain the same.
  • Add the config file to .gitignore if your client requires inline credentials. Even when no token is in it now, a future change might be tempted to inline one.
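If your client uses the mcpServers shape rather than the VS Code servers shape, the equivalent configuration is a sketch along these lines. The server key ado and the hard-coded org name contoso are illustrative; check your client's documentation for the exact file location and whether input prompting is supported:

```json
{
  "mcpServers": {
    "ado": {
      "command": "npx",
      "args": ["-y", "@azure-devops/mcp", "contoso"]
    }
  }
}
```

The package name and the args list are identical in both shapes; only the outer envelope differs.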

Step 3: Authenticate

With the configuration above in place, save the file and start the MCP server from your client (in VS Code, click “Start” in the mcp.json editor; in Claude Code with a project-scoped config, opening the project triggers the spawn). The first time an Azure DevOps tool is invoked, a browser window opens prompting you to log in with your Microsoft account [ref: README.md].

Use credentials that match the selected Azure DevOps organization. The token is cached by the underlying authentication library; subsequent sessions do not require a fresh browser prompt unless the cache is cleared or the token expires.

If your installed version supports PAT-based authentication and you generated one in Step 1, follow the upstream documentation for that version. The principle is the same: the credential lives in a host-managed store or environment variable, never inline in mcp.json and never in chat.

Step 4: Verification Prompt

Open Claude Code in your project. Paste this verification prompt as the first message of the session:

Verify @azure-devops/mcp is loaded and working.

Step 1: List MCP tools whose names start with the ADO server prefix.
Expected at least:
  - wit_get_work_item
  - wit_query_by_wiql
  - wit_add_work_item_comment

Step 2: Read a known work item via wit_get_work_item.
  Show title and state.

Step 3: WIQL via wit_query_by_wiql:
  SELECT [System.Id], [System.Title]
  FROM WorkItems
  WHERE [System.AssignedTo] = @Me AND [System.State] <> 'Done'
  Return count.

Read-only. Do not post or create. Report PASS/FAIL on each step.

Each failure has a deterministic cause:

  • Step 1 fails: package name in the config is wrong, or the MCP server crashed. Check the client’s MCP output panel for errors.
  • Step 2 or 3 fails: authentication did not complete. If using browser auth, watch for the popup; if using PAT mode, regenerate at the tokens URL and update your credential store.

A short verification prompt at the start of every session is cheaper than spending the session debugging “the AI is being weird today.” Make it the first message.

Common Operations

Once verification passes, here are five patterns that change day-to-day work on a D365 / Power Platform engagement. Each pattern names the specific wit_* tool the MCP server dispatches.

Pattern 1: Read a Bug, Diagnose, Propose a Fix

Pull bug <bug-id>. Read the description, the comments, and the linked
artifacts. Tell me the root cause and the specific file and line you
would change. Do not make edits or post comments yet.

The AI calls wit_get_work_item, reads the linked artifacts (a Power Automate flow JSON, a Dataverse plug-in source file, a related work item), and comes back with a diagnosis grounded in the verbatim ticket text. The reviewer reads the proposed fix, checks the cited file and line, and approves or redirects. No copy-paste between Boards and the IDE.

Pattern 2: Bulk Triage Across the Sprint

What bugs are open and assigned to me, sorted by severity?

The AI calls wit_query_by_wiql with a WIQL query and returns a summarized list. A query that takes a minute or more clicking through Boards arrives summarized in the IDE, suitable for prioritizing the next ticket.
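The underlying WIQL is a sketch along these lines; the severity field name assumes a process template (Agile/Scrum) that exposes Microsoft.VSTS.Common.Severity, so substitute your own field if the template differs:

```sql
SELECT [System.Id], [System.Title], [Microsoft.VSTS.Common.Severity]
FROM WorkItems
WHERE [System.WorkItemType] = 'Bug'
  AND [System.AssignedTo] = @Me
  AND [System.State] <> 'Done'
ORDER BY [Microsoft.VSTS.Common.Severity] ASC
```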

Pattern 3: Post-Deploy Comment with At-Mention

Comment on bugs <bug-id-1> and <bug-id-2> with the developer who owns
that area mentioned. Ask them to retest after deploy <deploy-version>.
Format the comment as HTML with a bullet list of what changed.

The AI calls wit_add_work_item_comment with format: "Html" and a body containing the data-vss-mention markup using the developer’s AAD GUID (pulled from any prior work-item response that has them in System.AssignedTo or as a comment author). The comment renders in the discussion thread with a clickable mention and triggers ADO’s notification.
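The mention markup inside the HTML body looks roughly like this. The GUID and display name are placeholders; the safest source for the exact attribute format is a real mention copied from an existing comment in your own org:

```html
<div>
  Please retest after deploy 1.4.2:
  <ul>
    <li>Null check added in the plug-in's Execute step</li>
  </ul>
  <a href="#" data-vss-mention="version:2.0,00000000-0000-0000-0000-000000000000">@Jane Developer</a>
</div>
```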

Pattern 4: File Sibling Bugs Under a Parent PBI

File two new bugs as children of PBI <pbi-id>: one for "<symptom-1>",
one for "<symptom-2>". Link both as Related to bug <bug-id>.

The AI calls wit_create_work_item twice with JSON Patch documents that include a relations block linking each new bug as a child of the PBI and as Related to the existing bug. Two new work items created, parented, and linked, with a summary comment returned.
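The JSON Patch body for one of those creates is a sketch along these lines. The org URL and IDs are placeholders; the relation names are the standard ADO link types (Hierarchy-Reverse is the parent link as seen from the child):

```json
[
  { "op": "add", "path": "/fields/System.Title", "value": "<symptom-1>" },
  {
    "op": "add",
    "path": "/relations/-",
    "value": {
      "rel": "System.LinkTypes.Hierarchy-Reverse",
      "url": "https://dev.azure.com/<org>/_apis/wit/workItems/<pbi-id>"
    }
  },
  {
    "op": "add",
    "path": "/relations/-",
    "value": {
      "rel": "System.LinkTypes.Related",
      "url": "https://dev.azure.com/<org>/_apis/wit/workItems/<bug-id>"
    }
  }
]
```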

Pattern 5: Link a Work Item to a Pull Request

Link bug <bug-id> to PR <pr-id> in the <your-repo> repository.

The AI calls wit_link_work_item_to_pull_request. Single tool call, done.

Safety Practices That Are Not Optional

This is the part most setup guides skip. On a client engagement these practices are mandatory.

| Practice | Why | How |
| --- | --- | --- |
| Config file in .gitignore | A future change might inline a token and commit it | Add the rule, verify with git check-ignore <config-file> |
| Settings sync turned off for credential file | If credentials live in a synced settings file, they leak | Settings Sync configuration: turn off the affected file |
| PAT scoped minimally | Limits blast radius if the token leaks | Step 1's scope table |
| PAT expiration 90 days or less | Forces rotation review | Set in the token creation form |
| Calendar reminder for renewal | Avoids "PAT expired in the middle of a sprint" | Calendar entry seven days before expiration |
| Separate PATs per use case | The AI gets its own; CI gets its own; personal scripting gets its own | Easier to audit, rotate, and revoke |
| Immediate revocation on suspected leak | Tokens that leaked once cannot be unleaked | Bookmark https://dev.azure.com/<org>/_usersSettings/tokens |
| AAD GUIDs treated as PII | At-mention markup contains user identity GUIDs | Do not commit them to public repos or share in screenshots |

The practical guideline: assume any PAT generated will leak at some point. Scope it so a leak is an inconvenience instead of a disaster.

Common Pitfalls

| Symptom | Cause | Fix |
| --- | --- | --- |
| wit_* tools not appearing in the client | Package name in config is wrong, or the server crashed on start | Check the client's MCP output panel for the spawn log |
| Auth prompt does not appear | Browser flow disabled, or credential cache holds a stale entry | Clear the credential cache for the affected account, retry |
| Tool call returns 401 / 403 | Account lacks access to the org or project, or PAT scope is too narrow | Confirm org membership; widen PAT scopes only as needed |
| Access Denied: PAT expired | PAT expired or scoped wrong | Regenerate at the tokens URL, update credential store |
| /tmp/foo: No such file or directory | Windows Git Bash does not have /tmp | Use d:/tmp with mkdir -p, or switch to PowerShell |
| Unexpected UTF-8 BOM from curl pipes | ADO returns JSON with a BOM prefix | Strip in Python: `if raw.startswith(codecs.BOM_UTF8): raw = raw[3:]` |
| cp1252 charmap codec can't encode | Stdout in cp1252, content has emoji or non-ASCII | export PYTHONIOENCODING=utf-8 before invoking Python |
| AI fills in a name from an unsigned chat message | No source ticket access, AI is guessing from context | Refuse to proceed without source. Ask "who is this from?" |
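The BOM fix from the table, as a self-contained sketch; the payload here is simulated, not a live ADO response:

```python
import codecs
import json

def parse_ado_json(raw: bytes) -> dict:
    """Parse JSON bytes from an ADO response, tolerating a UTF-8 BOM prefix."""
    if raw.startswith(codecs.BOM_UTF8):
        raw = raw[len(codecs.BOM_UTF8):]  # drop the 3-byte BOM before decoding
    return json.loads(raw.decode("utf-8"))

# Simulated response body with a BOM prefix, as some curl pipes deliver it.
payload = codecs.BOM_UTF8 + b'{"id": 4711, "fields": {"System.Title": "Login fails"}}'
print(parse_ado_json(payload)["id"])  # prints 4711
```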

When to Drop to REST API as a Fallback

The official MCP covers most operations a D365 / Power Platform consultant needs. For the gaps (uploading attachments outside the documented attachment tool, custom field type creation, some auditing endpoints), drop to the Azure DevOps REST API:

RAW_PAT="<your-raw-pat>"
# -w0 disables line wrapping (GNU base64); macOS/BSD base64 does not wrap and lacks the flag
AUTH=$(echo -n ":$RAW_PAT" | base64 -w0)
curl -s -H "Authorization: Basic $AUTH" \
  "https://dev.azure.com/<org>/_apis/wit/workItems/<id>?api-version=7.1"

The REST API documents Basic auth with an empty username and the raw PAT after the colon, base64-encoded [ref: Azure DevOps REST API authentication]. This is the REST envelope. The MCP server’s authentication is separate; do not mix the two.
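The same envelope in Python, for scripts that outgrow curl; the PAT value is a placeholder for illustration only:

```python
import base64

def ado_auth_header(pat: str) -> dict:
    """Build the Basic auth header ADO expects: empty username, raw PAT after the colon."""
    token = base64.b64encode(f":{pat}".encode("ascii")).decode("ascii")
    return {"Authorization": f"Basic {token}"}

# Placeholder PAT; never hard-code a real one.
print(ado_auth_header("example-pat"))
```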

The REST fallback also has its own foot-gun worth knowing. POSTing comments requires a project-scoped URL:

| URL | GET | POST |
| --- | --- | --- |
| /<org>/_apis/wit/workItems/<id>/comments | 200 | 404 |
| /<org>/<project>/_apis/wit/workItems/<id>/comments | 200 | 200 |

The MCP handles scope internally, so this only matters if you are dropping to REST.

Three Disciplines for Every Engagement

Read TOOLSET.md before adding any MCP package to your config. The npm package name is not the spec. Inferring capabilities from session-visible tools instead of reading the docs costs time. Run npm view <package> and scan the repository’s TOOLSET file before installing anything.

Treat any PAT as a leakable secret from the moment it is generated. Scope minimally, expiration short, separate tokens per use case, calendar the rotation, never paste into chat. The first time one of these disciplines is skipped is the time it bites.

Verify access at the start of every session. PATs expire silently. Auth caches go stale. Client processes inherit configuration at spawn time. Thirty seconds running the verification prompt above is cheaper than spending a session debugging “the AI is being weird today.”


The strategic / business-case version of this material, including the operating-model framework and an illustrative ROI calculator for engineering leaders, is on az365.ai.
