Azure Log Collection Integration

Number of APIs: 4

Introduction

This collection recreates API requests (GET, POST, PUT, and DELETE) captured in Azure logs, placing them into a collection specified by the user.

Setup

To run the collection, you need the following setup complete:

  1. This collection requires a Qodex API key (learn more about the Qodex API here, and get your Qodex API key here)

  2. API Management and Blob Storage setup:

  • Resource Group Name (learn more about Azure resource groups here)
  • Service Name (note: this should be an API Management service; learn more about the Azure API Management service here)
  • Client ID, Client Secret, and Tenant ID (these are associated with the target Azure application; learn more about creating Azure apps and viewing app details here)

  • To start, you must have created an API Management Integration and an Application (learn more about creating an Application here). Connect the API Management Integration with the created Application, using the Application/Client ID from your Application (learn how to do this here).

  • When creating the OAuth2 service within the API Management Integration service, enter any placeholder URL for the Client registration page URL (as a dummy value, we recommend https://placeholder.com). Check Implicit under Authorization grant types. For the Authorization endpoint URL, go to the Endpoints page of the App Registration you created and use the OAuth 2.0 authorization endpoint (v2) URL; for the Token endpoint URL, use the OAuth 2.0 token endpoint (v2) URL from the same page. Check POST under Authorization request method. For client credentials, add the Client ID of the App Registration you have been using.

  • You also must add a role for the App Registration you just created in your API Management service. Under the API Management service you created, go to Access control (IAM) and add a role assignment of Contributor, assigning access to User, group, or service principal and selecting your App Registration by name.

  • Likewise, add a role for the App Registration in your Blob Storage account. Under the Blob Storage account you created, go to Access control (IAM) and add a role assignment of Contributor, assigning access to User, group, or service principal and selecting your App Registration by name.

  • The created Azure Application should be connected to both the Azure API Management service and the Azure Blob Storage service. (If you don't want both services connected to the same Application, create separate environment variables for the Client ID, Client Secret, and Tenant ID of each service.)
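To sanity-check the app registration values above, it can help to see the token request the collection will eventually make. The sketch below builds (but does not send) a client-credentials request against the Azure AD v2.0 token endpoint using only the standard library; the tenant ID, client ID, secret, and scope are placeholder assumptions, not values from this integration.

```python
from urllib.parse import urlencode
from urllib.request import Request

# Placeholder values -- substitute your own tenantId, clientId, and clientSecret.
tenant_id = "00000000-0000-0000-0000-000000000000"
client_id = "my-client-id"
client_secret = "my-client-secret"

# The v2.0 token endpoint, as shown on the App Registration's Endpoints page.
token_url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"

# Client-credentials grant; this scope targets Azure Resource Manager (an assumption).
form = urlencode({
    "grant_type": "client_credentials",
    "client_id": client_id,
    "client_secret": client_secret,
    "scope": "https://management.azure.com/.default",
}).encode()

request = Request(token_url, data=form, method="POST",
                  headers={"Content-Type": "application/x-www-form-urlencoded"})
print(request.full_url)
print(form.decode())
```

Sending this request (for example with `urllib.request.urlopen(request)`) returns a JSON body whose `access_token` field is what the collection stores in its `bearerToken` variable.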

Next, within the API to be tracked in API Management, click Operations. Under All Operations, go to the section named Inbound processing, click Add policy, and insert the following snippet of code, with {BlobStorageAccountName} and {ContainerName} replaced with your storage account name and the container where you want to store the API logs. (Due to markdown issues, you can view the code with better formatting by pasting it into an online XML formatter such as https://codebeautify.org/xmlviewer.)

<policies>
    <inbound>
        <authentication-managed-identity resource="https://storage.azure.com/" output-token-variable-name="msi-access-token" ignore-error="false" />
        <!-- Send the PUT request with metadata -->
        <send-request mode="new" response-variable-name="result" timeout="300" ignore-error="false">
            <!-- Get variables to configure your: storageaccount, destination container and file name with extension -->
            <set-url>@( string.Join("", "https://{BlobStorageAccountName}.blob.core.windows.net/{ContainerName}/", context.RequestId, ".json") )</set-url>
            <set-method>PUT</set-method>
            <set-header name="Host" exists-action="override">
                <value>{BlobStorageAccountName}.blob.core.windows.net</value>
            </set-header>
            <set-header name="X-Ms-Blob-Type" exists-action="override">
                <value>BlockBlob</value>
            </set-header>
            <set-header name="X-Ms-Blob-Cache-Control" exists-action="override">
                <value />
            </set-header>
            <set-header name="X-Ms-Blob-Content-Disposition" exists-action="override">
                <value />
            </set-header>
            <set-header name="X-Ms-Blob-Content-Encoding" exists-action="override">
                <value />
            </set-header>
            <set-header name="User-Agent" exists-action="override">
                <value>sample</value>
            </set-header>
            <set-header name="X-Ms-Blob-Content-Language" exists-action="override">
                <value />
            </set-header>
            <set-header name="X-Ms-Client-Request-Id" exists-action="override">
                <value>@{ return Guid.NewGuid().ToString(); }</value>
            </set-header>
            <set-header name="X-Ms-Version" exists-action="override">
                <value>2019-12-12</value>
            </set-header>
            <set-header name="Accept" exists-action="override">
                <value>application/json</value>
            </set-header>
            <!-- Set the header with authorization bearer token that was previously requested -->
            <set-header name="Authorization" exists-action="override">
                <value>@("Bearer " + (string)context.Variables["msi-access-token"])</value>
            </set-header>
            <!-- Set the file content from the original request body data -->
            <set-body>@{var url = String.Format("{0} {1} {2}", context.Request.Method, context.Request.OriginalUrl, context.Request.Url.Path + context.Request.Url.QueryString);
                // Note: context.Response is not available in the inbound section, so only the request is logged.
                var headers = context.Request.Headers;
                Dictionary<string, string> contextProperties = new Dictionary<string, string>();
                foreach (var h in headers) {
                    contextProperties.Add(string.Format("{0}", h.Key), String.Join(", ", h.Value));
                }
                var body = context.Request.Body;
                if (body != null){
                    var requestLogMessage = new {
                        Headers = contextProperties,
                        Body = context.Request.Body.As<string>(),
                        Url = url
                    };
                    return JsonConvert.SerializeObject(requestLogMessage);
                } else {
                    var requestLogMessage = new {
                        Headers = contextProperties,
                        Url = url
                    };
                    return JsonConvert.SerializeObject(requestLogMessage);
                }}</set-body>
        </send-request>
        <!-- Returns directly the response from the storage account -->
        <return-response response-variable-name="result" />
        <base />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <base />
    </on-error>
</policies>
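The `<set-body>` expression above writes one JSON document per request into the blob. The following sketch mirrors that expression in Python so you can see the blob's shape; the method, URLs, headers, and body are illustrative sample values standing in for APIM's `context.Request`.

```python
import json

# Sample request data standing in for APIM's context.Request (placeholder values).
method = "GET"
original_url = "https://myservice.azure-api.net/echo/resource?x=1"
path_and_query = "/echo/resource?x=1"
headers = {"Host": "myservice.azure-api.net", "Accept": "application/json"}
body = '{"hello": "world"}'

# Mirror of the policy's <set-body> expression: "METHOD originalUrl path?query".
url = f"{method} {original_url} {path_and_query}"
if body is not None:
    log_message = {"Headers": headers, "Body": body, "Url": url}
else:
    log_message = {"Headers": headers, "Url": url}

blob_json = json.dumps(log_message)
print(blob_json)
```

Each blob is named `<RequestId>.json`, so the container ends up holding one such document per logged API call.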

Forking Collections

In order to use this integration, you will need to fork this collection. You can do this from the collection's options menu by selecting Create a Fork. You will also need to fork the environment named Azure Log Collection (learn more about forking collections and environments here).

Monitors

In this integration, a monitor is used to automatically check for updates to the specified Log Group once a day and create a new collection containing the updated requests (since Qodex does not currently support modifying pre-existing collections through its API).

To create this monitor, enter the desired Monitor Name, select Azure Log Collection as the Collection, select the Azure Log Collection environment as the Environment, and choose how often you want it to run to check whether your logs have been updated (we recommend once a day). Your monitor should then be created (learn more about creating a monitor here).
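Because the monitor rebuilds the collection from scratch each run, the Create Collection request needs a complete collection document in its body. Below is a minimal sketch of such a payload, assuming a Postman-style v2.1 collection format; the collection name, request name, URL, and item layout are illustrative placeholders, not the exact schema Qodex requires.

```python
import json

# One rebuilt request per log entry found in blob storage (placeholder values).
recreated_requests = [
    {"name": "GET /echo/resource",
     "request": {"method": "GET",
                 "url": "https://myservice.azure-api.net/echo/resource?x=1"}},
]

# Hypothetical Create Collection body, modeled on the collection-format v2.1 layout.
payload = {
    "collection": {
        "info": {"name": "Azure Log Collection (rebuilt)",
                 "schema": "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"},
        "item": recreated_requests,
    }
}
print(json.dumps(payload, indent=2))
```

The monitor would POST this body to the Create Collection endpoint (request 4 in the list below), with `workspace={{workspaceID}}` selecting the destination workspace.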

Mocks

If you would like to see example responses and run and interact with the collection without having to input valid credentials, you will need to create a mock server. You can do this from the options menu of your forked copy of this integration by selecting Mock Collection (learn more about creating a mock server here).

Then you will also need to fork the [MOCK] Azure Log Collection environment. Copy the mock server URL and assign it to the environment variable mockUrl. If you run the collection with the mock environment selected, it will show you what a successful request looks like, as long as you fill out the environment variables with the correct variable type.

Environment Setup

To use this collection, the following environment variables must be defined:

Key | Description | Kind
x-api-key | Your Qodex API key (learn more about the Qodex API here, and get your Qodex API key here) | User input
subscriptionId | Your Azure subscription ID | User input
tenantId | The ID of your Active Directory (AD) tenant in Azure | User input
clientId | Client ID of the Azure app (found in Azure app registrations) | User input
clientSecret | Client Secret of the Azure app (found in Azure app registrations) | User input
bearerToken | Access token for authentication | Auto-generated
resourceGroupName | Name of the resource group created on Azure | User input
serviceName | Name of the API Management instance created on Azure | User input
apiId | The ID of your API (found under the API details tab) | User input
baseUrl | The base URL used to generate a bearer token required for various requests | Auto-generated
containerName | The name of the blob storage container where logs are stored (should match the value in the XML policy) | User input
storageName | Name of the Azure storage account | User input
sharedAccessSignature | SAS token generated from Blob Storage (note: prefix the SAS with a ?) | User input
workspaceID | ID of the Qodex Workspace | User input
blobList, blobName, logItems, blobListJSON | Auto-generated variables to parse and iterate through logs | Auto-generated
  1. GetAuthTokenBlob POST {{baseUrl}}

  2. List Blobs GET {{baseUrl}}?restype=container&comp=list

  3. Get Blob GET {{baseUrl}}/{{containerName}}/{{blobName}}{{sharedAccessSignature}}

  4. Create Collection POST {{baseUrl}}?workspace={{workspaceID}}
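Requests 2 and 3 above work together: List Blobs returns an XML listing of the logged blobs, and Get Blob fetches each one using the SAS token. The sketch below parses a trimmed sample of the standard List Blobs response with the standard library and builds the per-blob URLs; the storage account, container, blob names, and SAS value are placeholder assumptions.

```python
import xml.etree.ElementTree as ET

# A trimmed sample of the List Blobs response (request 2 above); values are placeholders.
listing = """<?xml version="1.0" encoding="utf-8"?>
<EnumerationResults ContainerName="https://mystorage.blob.core.windows.net/logs">
  <Blobs>
    <Blob><Name>4f1c9e1a-1111-2222-3333-444455556666.json</Name></Blob>
    <Blob><Name>8a2d7b3c-aaaa-bbbb-cccc-ddddeeeeffff.json</Name></Blob>
  </Blobs>
</EnumerationResults>"""

# Collect every <Name> under <Blobs> -- the values the collection keeps in its
# auto-generated blobList / blobName variables.
root = ET.fromstring(listing)
blob_names = [blob.findtext("Name") for blob in root.find("Obs" and "Blobs")]

# Build the Get Blob URL (request 3): note the SAS already starts with "?".
base_url = "https://mystorage.blob.core.windows.net"
container_name = "logs"
sas = "?sv=2019-12-12&sig=placeholder"
urls = [f"{base_url}/{container_name}/{name}{sas}" for name in blob_names]
print(urls)
```

Each resulting URL corresponds to one logged request, and each fetched blob body is the JSON document written by the inbound policy.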