Integrating with Dynamics 365 Finance & Operations using Package REST API
The Package REST API facilitates the integration, automation, or migration of data into Dynamics 365 Finance and Operations using data packages. This API mirrors the principles of manual data import through the user interface.
Import process in the Data Management module via the user interface:
- A user creates a project in the Data Management workspace.
- In the lines section, the user adds the necessary file format and data entity, then selects the file for import, or simply adds the package if the format is a package.
- D365FO imports the file into its internal blob storage.
- Upon executing the import project, D365FO processes the file from blob storage to the Data Entity staging table, and then to the Target table.
Export process of a data entity in the Data Management module via the user interface:
- A user creates a project in the Data Management workspace.
- In the lines section, the user adds the data entity and file format.
- Executing the export project generates a temporary file in the D365FO environment's internal blob storage.
- Downloading the file transfers it from blob storage to the user's local workstation.
- Downloading the package compresses all data entity files in the project into a zip file, including the data entity and project definition, for direct download to the user's workstation.
Package API Process
The process of importing or exporting data via the Package API mirrors the manual import and export of data packages in D365FO.
1. Authorization - This step involves authenticating the external system or integration middleware to establish a connection to D365FO as an application through OAuth. Refer to the Authentication section in the Microsoft article for instructions on setting up or registering an Active Directory application for authentication.
Below is an example of a Postman request illustrating what the Authorization step would look like.
In Postman, I've added a script that automatically saves the bearer token into a variable, so the access token doesn't need to be manually copied and pasted into the Authorization header of subsequent steps.
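Outside Postman, the same Authorization step can be sketched in code. Below is a minimal Python sketch of the OAuth client-credentials flow against the Azure AD token endpoint; the tenant ID, client ID, client secret, and environment URL are placeholders you would replace with the values from your app registration.

```python
# Sketch of the Authorization step: acquiring a bearer token via the OAuth
# client-credentials flow. All identifiers below are placeholders.
import json
import urllib.parse
import urllib.request

def build_token_request(tenant_id, client_id, client_secret, resource):
    """Build the POST request for the Azure AD token endpoint."""
    url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/token"
    body = urllib.parse.urlencode({
        "grant_type": "client_credentials",
        "client_id": client_id,
        "client_secret": client_secret,
        "resource": resource,  # the D365FO environment URL
    }).encode()
    return urllib.request.Request(url, data=body, method="POST")

def get_token(tenant_id, client_id, client_secret, resource):
    """Send the token request and return the access_token string."""
    req = build_token_request(tenant_id, client_id, client_secret, resource)
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

# Example (placeholder values):
# token = get_token("<tenant-id>", "<client-id>", "<client-secret>",
#                   "https://myenv.operations.dynamics.com")
```

The returned `access_token` plays the same role as the variable the Postman script saves: it is attached as the bearer token in the steps that follow.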
2. Get-AzureWriteUrl - This step generates a temporary internal blob URL from the D365FO environment you are connecting to. It is important to note that this URL includes a Shared Access Signature (SAS), so no further authentication is needed to read or write content in this location.
In the body of the Get-AzureWriteUrl request, you must specify a unique file name for the file you're going to import. Generate a unique filename, for example by appending a date-time stamp or a GUID, to avoid duplicates.
The Authorization header is set to the access_token received in step 1 for all subsequent steps, except for uploading the package to the blob.
Below is an example of a Postman request illustrating what the Get-AzureWriteUrl step would look like.
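As a code equivalent of that Postman request, the sketch below builds a unique, timestamped file name and calls the GetAzureWriteUrl OData action. The action path follows Microsoft's Data management package API documentation; the environment URL and token are placeholders, and the shape of the response (a "value" field containing a JSON string with BlobId and BlobUrl) should be verified against your environment.

```python
# Sketch of the Get-AzureWriteUrl step: request a SAS-protected blob URL
# using a unique file name. Environment URL and token are placeholders.
import datetime
import json
import urllib.request

ACTION = ("/data/DataManagementDefinitionGroups/"
          "Microsoft.Dynamics.DataEntities.GetAzureWriteUrl")

def unique_file_name(prefix="package"):
    """Build a unique file name, e.g. package-20240101T120000.zip."""
    stamp = datetime.datetime.utcnow().strftime("%Y%m%dT%H%M%S")
    return f"{prefix}-{stamp}.zip"

def build_write_url_request(environment_url, token, file_name):
    body = json.dumps({"uniqueFileName": file_name}).encode()
    return urllib.request.Request(
        environment_url + ACTION,
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST")

# The response's "value" field is itself a JSON string holding BlobId and
# BlobUrl (verify in your environment):
# blob = json.loads(response_json["value"])
# package_url = blob["BlobUrl"]
```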
3. Upload the package to the blob URL - Note the blob URL returned in the response body of the previous step. Upload your package containing the data entity definitions and the file with the data. Ensure that the filename matches the uniqueFileName specified in step 2.
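The upload itself is a plain HTTP PUT against the SAS URL, with no bearer token. A minimal sketch, assuming Azure Blob Storage's requirement of an `x-ms-blob-type: BlockBlob` header on Put Blob requests; the local file path is a placeholder.

```python
# Sketch of uploading the package zip to the SAS blob URL. No Authorization
# header is needed: the SAS embedded in the URL authorizes the write.
import urllib.request

def build_blob_upload_request(blob_sas_url, package_bytes):
    """Build the PUT request that writes the package to blob storage."""
    return urllib.request.Request(
        blob_sas_url,
        data=package_bytes,
        headers={"x-ms-blob-type": "BlockBlob"},  # required by Azure Blob Storage
        method="PUT")

# Example (placeholder file name):
# with open("package-20240101T120000.zip", "rb") as f:
#     req = build_blob_upload_request(blob_url, f.read())
# urllib.request.urlopen(req)  # a successful upload returns HTTP 201
```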
4. Import From package (from blob URL) - This step imports the data package from the blob URL into D365FO for processing into the data entity staging table and then into the Target table.
Body:
{
"packageUrl":"{{PackageUrl}}", // URL of the blob where the package was uploaded
"definitionGroupId":"{{DefinitionGroup}}", // the project name in D365FO; optional - if omitted, one is generated automatically. Ideally use a unique definition group ID
"executionId":"", // leave empty; the response from this step returns an execution ID for tracking
"execute":true, // set to true to execute the Target step
"overwrite":true, // overwrite any existing entities in the project
"legalEntityId":"USMF" // mandatory: the legal entity to import this data into
}
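The same body can be sent programmatically. The sketch below calls the ImportFromPackage OData action with the fields shown above; the action path follows Microsoft's Data management package API documentation, the helper name is mine, and the environment URL, token, and project name are placeholders.

```python
# Sketch of the ImportFromPackage call. The response's "value" field carries
# the execution ID used in the status and error steps that follow.
import json
import urllib.request

ACTION = ("/data/DataManagementDefinitionGroups/"
          "Microsoft.Dynamics.DataEntities.ImportFromPackage")

def build_import_request(environment_url, token, package_url,
                         definition_group_id, legal_entity_id):
    body = json.dumps({
        "packageUrl": package_url,            # blob URL from the upload step
        "definitionGroupId": definition_group_id,
        "executionId": "",                    # left empty; returned by the call
        "execute": True,                      # run the Target step after staging
        "overwrite": True,                    # overwrite entities in the project
        "legalEntityId": legal_entity_id,     # mandatory
    }).encode()
    return urllib.request.Request(
        environment_url + ACTION,
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST")

# Example (placeholders):
# req = build_import_request("https://myenv.operations.dynamics.com", token,
#                            package_url, "MyImportProject", "USMF")
# with urllib.request.urlopen(req) as resp:
#     execution_id = json.load(resp)["value"]
```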
5. Get ExecutionSummaryStatus - This step verifies the status of an import process by passing the executionId from the previous step in the body of the request.
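Since the import runs asynchronously, this status call is typically polled until a terminal state is reached. A sketch under the same placeholder assumptions; the set of terminal status strings shown here is an assumption to verify against your environment.

```python
# Sketch of polling GetExecutionSummaryStatus with the execution ID from
# ImportFromPackage. Terminal status names are assumptions - verify them.
import json
import time
import urllib.request

ACTION = ("/data/DataManagementDefinitionGroups/"
          "Microsoft.Dynamics.DataEntities.GetExecutionSummaryStatus")

def build_status_request(environment_url, token, execution_id):
    body = json.dumps({"executionId": execution_id}).encode()
    return urllib.request.Request(
        environment_url + ACTION,
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST")

def wait_for_completion(environment_url, token, execution_id,
                        poll_seconds=10, timeout_seconds=600):
    """Poll the execution status until it reaches a terminal value."""
    terminal = {"Succeeded", "PartiallySucceeded", "Failed", "Canceled"}
    deadline = time.time() + timeout_seconds
    while time.time() < deadline:
        req = build_status_request(environment_url, token, execution_id)
        with urllib.request.urlopen(req) as resp:
            status = json.load(resp)["value"]
        if status in terminal:
            return status
        time.sleep(poll_seconds)
    raise TimeoutError(f"Execution {execution_id} did not finish in time")
```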
6. Get ExecutionErrors - If the import failed due to errors in the data, this step provides far more detail on what went wrong. Pass the executionId of the import in the body of the request.
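The error lookup follows the same pattern as the status check, only against the GetExecutionErrors action. A sketch with the same placeholder environment URL and token:

```python
# Sketch of retrieving error details for a failed import via
# GetExecutionErrors, passing the executionId in the request body.
import json
import urllib.request

ACTION = ("/data/DataManagementDefinitionGroups/"
          "Microsoft.Dynamics.DataEntities.GetExecutionErrors")

def build_errors_request(environment_url, token, execution_id):
    body = json.dumps({"executionId": execution_id}).encode()
    return urllib.request.Request(
        environment_url + ACTION,
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST")

# Example (placeholders):
# req = build_errors_request(env_url, token, execution_id)
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["value"])  # error details for the execution
```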