Building Custom Connectors
Deep dive into building custom connectors to integrate your Copilot Studio agent with any API in government cloud environments.
Overview
The Power Platform connector gallery includes hundreds of pre-built connectors. But your agency’s internal case management system, that legacy HR API behind a gateway, or the cross-agency data sharing service your program depends on—those are not in the gallery. Custom connectors close this gap.
A custom connector lets you define your own integration to any REST API so your Copilot Studio agent can call it just like a pre-built connector. This video walks through the entire process: deciding when to build custom, creating the connector definition, configuring authentication for government clouds, testing, and deploying with proper governance.
What You’ll Learn
- When to build custom: Decision framework for pre-built vs. custom connectors
- Connector definition: Creating operations from an OpenAPI spec
- Authentication: OAuth 2.0, Azure AD, API keys, and government-specific endpoints
- Testing and deployment: Validating in your cloud and governing with DLP policies
Script
Hook: When no connector exists
The connector gallery has hundreds of options. SharePoint, Dataverse, ServiceNow, Salesforce—if a major platform has an API, there is probably a connector for it. But your agency’s case management system is not one of them. Neither is that internal HR API, the facility management platform, or the cross-agency data sharing service your program relies on.
Custom connectors let you define your own integration to any REST API—internal or external, on-premises or in the cloud. You tell Power Platform how to talk to the API, how to authenticate, and what data to expect back. From that point, your Copilot Studio agent can use it like any other connector.
In the next fifteen minutes, you will build a custom connector from scratch, configure authentication for a government cloud environment, and wire it into your agent.
When to build custom connectors
Before you start building, make sure you actually need a custom connector. Here is the decision framework.
First, check the connector gallery filtered for your cloud environment. If a pre-built connector exists and supports the operations you need, use it. Pre-built connectors are maintained by Microsoft or the publisher, which means less work for you over time.
Second, check if a third-party connector exists in commercial but is not yet available in your government cloud. If so, you may need to wait for certification or build a custom connector as a bridge until the pre-built version arrives.
Build custom when no connector exists for the system you need, or when an existing connector does not expose the specific operations your agent requires.
The most common government scenarios for custom connectors are internal agency APIs—case management, HR systems, facility management—that are built and maintained in-house. Legacy systems exposed through API gateways, where a SOAP service has been wrapped in a REST layer, are another frequent case. Cross-agency data sharing APIs and specialized government services like USAspending or SAM.gov also require custom connectors.
Know the limitations upfront. Custom connectors support REST APIs only—no direct SOAP, GraphQL, or gRPC. Request and response size limits apply. Throttling works the same as with pre-built connectors.
For government environments, custom connector endpoints must be reachable from your cloud boundary. If the API lives on-premises or behind a firewall, you need an on-premises data gateway. Your security team may also require a review before a new custom connector is deployed to production.
Creating a custom connector definition
You have three ways to start building a custom connector: import an OpenAPI (Swagger) file, import from a Postman collection, or create from blank. The recommended approach is to start with an OpenAPI definition—note that custom connectors expect the OpenAPI 2.0 (Swagger) format, so convert first if your team publishes OpenAPI 3.0. Most government APIs either have a definition already or can generate one from their codebase. Importing an OpenAPI file gives you operations, parameters, and response schemas automatically, saving significant manual work.
Open the Power Platform maker portal, navigate to Custom Connectors, and select “New custom connector.” Choose “Import an OpenAPI file” and upload your spec.
The connector wizard has four tabs. On the General tab, you set the connector name, description, host URL, and base URL. The host URL is the base address of your API—for example, api.agency.gov or your-app.azurewebsites.us. In government environments, this URL must be within your cloud boundary or accessible through a configured gateway.
The Security tab is where you configure authentication. We will cover this in detail in the next section.
On the Definition tab, you see the actions and triggers imported from your OpenAPI file. Each action maps to an API endpoint—a GET, POST, PUT, or DELETE operation. For each action, review the request parameters—path parameters, query parameters, headers, and request body. Then review the response schema, which tells Power Platform what fields the API returns. This is critical: if a field is not in the response schema, it will not be available as a variable in your Copilot Studio topic.
Add clear descriptions to every parameter and operation. These descriptions appear in the Copilot Studio authoring UI when a developer configures the action node. “CaseID” with no description forces someone to guess what format is expected. “CaseID - The unique identifier for the case, e.g., CS-2024-00145” makes it obvious.
The Test tab lets you validate connectivity before publishing. Create a test connection with valid credentials and execute each operation to confirm the responses match your schema.
Follow these best practices for defining operations. Use clear, verb-based operation IDs: GetCaseById, CreateTicket, UpdateStatus. Add a summary and description to every operation. Define response schemas accurately—test against the real API and update the schema if the actual response includes fields you missed. Include error response schemas for 400, 401, 404, and 500 status codes so your agent can handle failures.
Keep the definition focused. Only expose the operations your agent actually needs. Avoid including administrative or destructive operations—like DeleteAllRecords—through a connector that agents will call. Version your connector definition alongside your API so changes are tracked and coordinated.
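As a sketch of these practices, here is a minimal, hypothetical OpenAPI 2.0 fragment for a single focused operation. The api.agency.gov host, caseId format, and field names are illustrative placeholders, not a real agency API:

```yaml
swagger: "2.0"
info:
  title: Case Management API        # illustrative internal API
  version: "1.0"
host: api.agency.gov                # must be reachable from your cloud boundary
basePath: /v1
schemes: [https]
paths:
  /cases/{caseId}:
    get:
      operationId: GetCaseById      # clear, verb-based operation ID
      summary: Get a case by ID
      description: Returns the case record for the given case identifier.
      parameters:
        - name: caseId
          in: path
          required: true
          type: string
          description: "The unique identifier for the case, e.g., CS-2024-00145"
      responses:
        "200":
          description: Case found
          schema:                   # fields missing here will not surface in Copilot Studio
            type: object
            properties:
              caseId:     { type: string }
              status:     { type: string }
              assignedTo: { type: string }
        "404":
          description: Case not found   # error response lets the agent handle failures
```

Notice that the fragment exposes exactly one read operation and nothing destructive, which is the "keep it focused" principle in practice.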
Authentication options for custom connectors
Authentication is where government developers spend the most time configuring custom connectors. Here are your options.
No authentication is suitable only for truly public APIs with no access control. This is rare in government contexts and should be avoided for anything that handles agency data.
API Key authentication is the simplest method. The API expects a static key passed as a header or query parameter. It works well for internal APIs with straightforward access control. In government environments, establish a key rotation schedule and store keys securely—do not embed them in documentation or share them over email.
OAuth 2.0 is the most common authentication method for modern government APIs. It supports two main flows. The authorization code flow requires the user to authenticate interactively, which creates per-user connections—each user has their own authenticated session. The client credentials flow is service-to-service with no user interaction, which creates shared connections using a service account identity. Configuration requires a client ID, client secret, authorization URL, token URL, and scope—all obtained from the API’s app registration.
In government clouds, the identity provider endpoints differ from commercial. For GCC, use login.microsoftonline.com with your GCC tenant ID. For GCC High, use login.microsoftonline.us. For DoD, use login.microsoftonline.us with your DoD tenant ID. Getting these endpoints wrong is one of the most common causes of authentication failures in government custom connectors.
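The endpoint differences above can be captured in a few lines. This is an illustrative helper, not platform code—the cloud labels, placeholder tenant ID, and function names are assumptions for the example; only the login hosts come from the guidance above.

```python
# Sketch: picking the correct Entra ID token endpoint per government cloud.
# Tenant and client values are placeholders -- substitute your own app registration.
from urllib.parse import urlencode

TOKEN_HOSTS = {
    "commercial": "login.microsoftonline.com",
    "gcc":        "login.microsoftonline.com",   # GCC authenticates against the commercial endpoint
    "gcc-high":   "login.microsoftonline.us",
    "dod":        "login.microsoftonline.us",
}

def token_url(cloud: str, tenant_id: str) -> str:
    """Build the OAuth 2.0 token URL for the given cloud and tenant."""
    return f"https://{TOKEN_HOSTS[cloud]}/{tenant_id}/oauth2/v2.0/token"

def client_credentials_body(client_id: str, client_secret: str, scope: str) -> str:
    """Form-encoded request body for the client credentials flow (service-to-service)."""
    return urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "scope": scope,
    })

print(token_url("gcc-high", "00000000-0000-0000-0000-000000000000"))
```

A GCC High connector pointed at `login.microsoftonline.com` would fail with exactly the authentication errors described below, which is why the host lookup is worth making explicit.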
Azure Active Directory (now Entra ID) authentication is a specialized variant of OAuth 2.0 designed for APIs hosted in Azure. It simplifies configuration when both the API and the connector live in the same tenant. You provide the resource URL pointing to the API’s app registration, and Power Platform handles the token exchange.
Windows Authentication is used for on-premises APIs accessed through a data gateway. The gateway handles credential negotiation with the on-premises service.
Choosing the right method: if your API is hosted in Azure, use the Azure AD (Entra ID) option. If it is on-premises, use Windows Authentication via a gateway or API Key, depending on what the API supports. For external government APIs, follow their documentation—most will specify OAuth 2.0 or API Key.
Per-user connections are preferred in government environments because they create individual audit trails and enforce least-privilege access. Shared connections using service accounts are appropriate for service-to-service integrations where no user context is needed.
Never hardcode secrets in the connector definition. Use Azure Key Vault for secret rotation where possible, and document the credential lifecycle so your operations team knows when secrets need to be refreshed.
Testing and validating your connector
Thorough testing prevents issues from reaching your users.
Start in the connector wizard’s Test tab. Create a test connection using valid credentials for your environment. Execute each operation one at a time and verify the response. Check that every field in your response schema is populated with the expected data type. If the schema defines a field as a string but the API returns a number, the mapping will break in your agent.
Here are the most common issues you will encounter. A 401 Unauthorized error means your authentication configuration is wrong—check the token URL, scopes, and client secret. A 403 Forbidden means the authenticated identity does not have permission—check API role assignments and app registrations. A 404 Not Found means the host URL or path parameters are incorrect. Timeout errors mean the API is too slow to respond within the platform's limit—work with the API team to optimize the endpoint or restructure the operation to return faster.
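The failure modes above amount to a small triage table. This sketch is purely illustrative—the function and messages are not part of any platform API, and the real fix always happens in the connector configuration:

```python
# Sketch: triage table for failed connector test calls, mirroring the
# common issues described above. Illustrative only.
TRIAGE = {
    401: "Authentication misconfigured: check token URL, scopes, and client secret.",
    403: "Identity lacks permission: check API role assignments and app registration.",
    404: "Host URL or path parameters are wrong: verify the base URL and route.",
    408: "API exceeded the timeout: optimize the endpoint or return results faster.",
}

def triage(status_code: int) -> str:
    """Return the likely cause for a failed connector test call."""
    return TRIAGE.get(status_code, f"Unexpected status {status_code}: check API logs.")

print(triage(401))
```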
After validating in the connector wizard, test inside Copilot Studio. Add the custom connector as an action in a test topic. Walk through the full conversation flow—user input, question nodes, action call, response display. Verify that all output fields are populated and formatted correctly. The Test pane in Copilot Studio is your primary debugging tool here.
For government-specific testing, always test from within your actual cloud environment. A connector that works in a commercial sandbox may fail in GCC High because of different endpoints, network routing, or DLP policies. Verify that the connector works with your tenant’s DLP policies in place—a connector not yet classified in DLP may be blocked by default. Test with multiple user accounts to verify that per-user authentication works correctly across different roles and permissions.
Conduct load and performance testing before going live. Custom connectors are subject to the same throttling limits as pre-built connectors. Test with realistic request volumes to ensure the underlying API can handle the traffic. Monitor response times—an API that takes 10 seconds to respond will create a poor conversational experience.
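The response-time check can be sketched with standard-library timing. Here `call_api` is a placeholder you would supply—any callable that performs one connector-backed request—and the 10-second threshold echoes the guidance above; nothing here is platform tooling:

```python
# Sketch: measuring response times before go-live. call_api is a stand-in
# for one connector-backed request; supply your own implementation.
import statistics
import time

def measure_latency(call_api, samples: int = 20) -> dict:
    """Time repeated calls and summarize; flag anything near the 10-second mark."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        call_api()
        times.append(time.perf_counter() - start)
    return {
        "p50_s": statistics.median(times),
        "max_s": max(times),
        "too_slow": max(times) > 10.0,   # a 10-second response hurts conversation flow
    }

# Example with a stand-in API call that sleeps briefly:
result = measure_latency(lambda: time.sleep(0.01), samples=5)
print(result["too_slow"])   # prints: False
```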
Deploying custom connectors
Once testing is complete, you are ready to deploy.
Publishing is straightforward. Save your connector and it becomes available as a custom connector in your Power Platform environment. It appears in the connector list alongside pre-built connectors when developers add actions to their topics.
Share the connector with the people who need it. You can share with specific users or with the entire environment. In government settings, limit sharing to authorized developers and service accounts. Not everyone who builds agents should have access to every custom connector.
For environment management, follow the standard Power Platform promotion path. Build and test in a development environment. When the connector is ready, add it to a solution. Solutions package the connector with your agent, its topics, and any related components so you can deploy everything together from development to test to production.
When you need to update the connector, you can add new operations or modify existing ones. Existing connections continue to work as long as you do not change the authentication configuration. Breaking changes—removing operations, changing parameter names, altering response schemas—require updating the agent topics that depend on them. Communicate changes to your development team before deploying updates.
Governance is essential. Register your custom connector in your tenant’s DLP policies. Classify it as Business or Non-Business based on the sensitivity of the data it accesses. An unclassified connector may be blocked by default in strict DLP configurations. Document the connector: what API it connects to, what data it handles, who maintains it, and when credentials need rotation.
If your API is behind a firewall, the on-premises data gateway handles connectivity. Install the gateway on a server within your network, configure it in Power Platform, and point the connector’s host URL through the gateway. A gateway cluster is recommended for high availability. Gateway traffic stays within your network—data does not traverse the public internet.
Close: Custom connectors in practice
Here is a real-world government pattern that ties everything together. An agency has an internal case management system with a REST API. A developer obtains the OpenAPI definition from the development team and creates a custom connector with three operations: GetCase, CreateCase, and UpdateCase. The connector uses Azure AD authentication with per-user connections so every case lookup is tied to the authenticated user’s identity. An agent topic lets users check case status by providing a case number and submit new cases conversationally by answering a series of questions. The DLP policy classifies the new connector as Business alongside Dataverse and SharePoint, ensuring it can be used together with those services.
To recap the process: identify the API and obtain or create its OpenAPI definition. Create the custom connector in the Power Platform maker portal. Configure authentication appropriate to your government cloud—using the correct Entra ID endpoints. Test thoroughly within your actual cloud environment, not in commercial. Deploy via solutions and register the connector in your DLP policies.
Avoid these common pitfalls. Exposing too many API operations invites misuse—keep it focused on what the agent needs. Skipping error response schemas means your agent cannot handle failures gracefully. Testing only in commercial and assuming it works the same in GCC High will cause problems—always test in your target environment. Forgetting to update DLP policies can result in the connector being blocked by default.
Three things to do next. Pick an internal API your team manages and create or obtain its OpenAPI definition. Build a custom connector for it in your development environment. Wire it into a test agent and validate the full round trip—user question to API call to agent response.
Custom connectors are the ultimate extensibility point. If it has a REST API, your agent can talk to it.
Sources & References
- Create a custom connector in Power Platform — Official documentation for building custom connectors, including the connector wizard, authentication options, and OpenAPI import
- Use connectors in Copilot Studio — Documentation on using connectors, including custom connectors, as actions in Copilot Studio agents
- Power Platform documentation — Main hub covering environments, solutions, DLP policies, and governance
- Define your OpenAPI definition for a custom connector — Detailed guide on creating and editing OpenAPI definitions for custom connectors