Tuesday, November 26, 2019

D365FO / C# - Accessing the InfoLog / InfoLogEntries Object within a custom webservice

While testing a custom web service that originally tried to read the infolog and report it back through a field on the response via custom logic, I noticed that the error message being returned was actually pretty generic and told me nothing about the issue.

After debugging the code, it looks like the infolog itself is now returned automatically with a SOAP/JSON web service, which is great because it tells us in detail why a record can't be created, with no extra code. However, nothing appears to be documented online about this, and it doesn't behave like a contract class response, which would allow you to use a foreach() on the object. Instead we need to treat it like an array-based object, as the code below shows.

What's cool about this is that it also returns the type of each infolog message (Info, Warning, Error), so you can add logic that tells you exactly what the issue is.

 //read in the infolog from the response  
 [webserviceName].Infolog messages = response.Infolog;  
 //get the length of the log and read in each message  
 for (int messagePosition = 0; messagePosition < messages.Entries.Length; ++messagePosition)  
 {  
    AzureLog.AppendText(String.Format("Info Log Message {1}: {0}", messages.Entries[messagePosition].Message.ToString(), messages.Entries[messagePosition].Type.ToString()) + Environment.NewLine);  
 }  
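Since each entry carries its type, you can surface only the Error entries back to the caller instead of logging everything. The sketch below is a hedged example built on the same generated proxy objects as above; the string comparison on Type is an assumption and may need adjusting to your generated enum.

```csharp
//collect only the Error entries so the caller gets the real failure reason
List<string> errors = new List<string>();

for (int messagePosition = 0; messagePosition < messages.Entries.Length; ++messagePosition)
{
    //Type distinguishes Info, Warning and Error entries
    if (messages.Entries[messagePosition].Type.ToString() == "Error")
    {
        errors.Add(messages.Entries[messagePosition].Message.ToString());
    }
}

//bubble the detailed infolog errors up instead of returning a generic message
if (errors.Count > 0)
{
    throw new Exception(String.Join(Environment.NewLine, errors));
}
```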




The output will then look something like this:


Info Log Message Info: A custom info statement based on an extension of salesTable.ValidateWrite()
Info Log Message Warning: Site [SITENAME] does not exist.
Info Log Message Warning: Warehouse [WAREHOUSENAME] is not connected to site [SITENAME]
Info Log Message Warning: Warehouse [WAREHOUSE] does not exist.
Info Log Message Warning: Site [SITE] does not exist.
Info Log Message Error: Update has been canceled.







Wednesday, August 21, 2019

D365FO - BYODB Entity Export Failing with no error message (Catch Weight Exporting Issue)


Recently, after standing up a new export and trying to export the Sales order lines V2 and Purchase order lines V2 entities to an Azure SQL DB, the export started to fail. However, the execution log contains no error.





If you try running it in batch mode, the batch completes but nothing is exported, even though the export shows as completed.


If you dig into event viewer on the server under

Application and Services Logs > Microsoft > Dynamics > AX-DIXFRuntime > Operational

You will find your error, but it is very generic and gives no helpful information beyond the fact that the entity is crashing.







Error:
System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.Exception: Exception from HRESULT: 0xC0202009
   at Microsoft.Dynamics.AX.Framework.Tools.DMF.ServiceProxy.DmfEntitySharedTypesProxy.DoWork[T](Func`1 work)
   --- End of inner exception stack trace ---
   at System.RuntimeMethodHandle.InvokeMethod(Object target, Object[] arguments, Signature sig, Boolean constructor)
   at System.Reflection.RuntimeMethodInfo.UnsafeInvokeInternal(Object obj, Object[] parameters, Object[] arguments)
   at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture)
   at System.RuntimeType.InvokeMember(String name, BindingFlags bindingFlags, Binder binder, Object target, Object[] providedArgs, ParameterModifier[] modifiers, CultureInfo culture, String[] namedParams)
   at Microsoft.Dynamics.Ax.Xpp.CLRInterop.MakeReflectionCall(Object instance, String methodName, Object[] parameters)



After trying multiple fixes (regenerating the mapping, refreshing the entity list, builds, DB syncs), I was still getting the error.

However, when I deleted all fields within the mapping and mapped only a few base fields that the entity index is based on, I found that the entity would export. Because of this, I knew something was wrong with one of the field mappings.


After regenerating the complete list and removing fields one group at a time to see if the export would work, I was able to find the specific field that was causing the issue:

ORDEREDCATCHWEIGHTQUANTITY


Currently we have Catch Weight enabled within the configuration key, but we are not using the feature, as we are still in the design phase of our implementation.


If a base entity export is failing, one thing to try when defining the mapping or the schema is to search for "catch" and remove those fields from the mapping (in the data project assignment or the modify target mapping form); the export should then work. It does not matter whether the fields get published to the schema; it only matters whether these catch weight fields are included in the export.



Update 9/26/19: It turns out that disabling the catch weight configuration key does not stop the field from being added to the entity mapping.

Monday, July 15, 2019

D365FO - Copy custom value from Accounts Payable Invoice Register (LedgerJournalTrans) to Invoice Approval (VendTrans)

When creating an Accounts payable invoice register entry, a record is created in the LedgerJournalTrans table. When you post this journal, a record for the journal is created in the VendTrans table.

If you go into the Invoice approval form and click "Find vouchers", you will see that any custom fields you added to LedgerJournalTrans are available. However, once one of these vouchers is loaded into the approval form, you will find that the values are no longer available. This is because your LedgerJournalTrans record has been converted into its own VendTrans record.

In order to copy the values from LedgerJournalTrans to VendTrans, you need to extend the VendVoucher class, as shown below. It is good to note that, in theory, this method should also apply to CustVoucher (customer invoices), since it uses the same structure (the CustVendTrans map) and the common table pattern to pass in VendTrans or CustTrans.


[ExtensionOf(classStr(VendVoucher))]
final class APInvoiceModsVendVoucher_Extension
{
    

    /// <summary>
    /// COC of initCustVendTrans which sets the initial values of vendTrans from ledgerjournaltrans
    /// </summary>
    /// <param name = "_custVendTrans">VendTrans</param>
    /// <param name = "_ledgerPostingJournal">LedgerVoucher</param>
    /// <param name = "_useSubLedger">Use sub ledger default is false</param>
    protected void initCustVendTrans(
                                    CustVendTrans _custVendTrans,
                                    LedgerVoucher _ledgerPostingJournal,
                                    boolean _useSubLedger)
    {
        //execute the base functionality
        next initCustVendTrans(_custVendTrans, _ledgerPostingJournal, _useSubLedger);
        
        //get the vendTrans table buffer from the CustVendTrans map instance
        VendTrans vendTrans = _custVendTrans as VendTrans;
        
        //if the common instance is being initialized via the ledgerjournaltrans table we need to copy the custom field
        if (common && common.TableId == tableNum(LedgerJournalTrans))
        {
            LedgerJournalTrans ledgerJournalTrans = common;
            
            //copy internal invoice notes and the description from the ledgerjournal trans to the vendtrans
            vendTrans.CustomField = ledgerJournalTrans.CustomField;
            //this is not a custom field but the value does not get transferred
            vendTrans.Txt = ledgerJournalTrans.Txt;
        }
    }

}

D365FO / Setting default values for custom fields on a vendor invoice from a purchase order (VendInvoiceInfoTable / PurchFormletterParmDataInvoice)

When creating a vendor invoice and trying to populate custom fields on the VendInvoiceInfoTable table, you may find that the following only works when you create a blank invoice, not when it is executed from the invoice create button on a purchase order.



[DataEventHandler(tableStr(VendInvoiceInfoTable), DataEventType::InitializingRecord)]
public static void VendInvoiceInfoTable_onInitializingRecord(Common sender, DataEventArgs e)
{
    VendInvoiceInfoTable vendInvoiceInfoTable = sender as VendInvoiceInfoTable;
    vendInvoiceInfoTable.CustomField = VendParameters::find().DefaultCustomField;
}

This is due to the class method PurchFormletterParmDataInvoice.createInvoiceHeaderFromTempTable(), which calls .skipEvents(true), .skipDataMethods(true), and .skipDatabaseLog(true) and inserts via Query::insert_recordset(); because of this, none of the default events get triggered.

In order to properly populate custom fields on a vendor invoice, whether it is blank or created from a purchase order, the following needs to be done via extensions and CoC.


  1. Add custom fields to VendInvoiceInfoTable
  2. Add custom fields to VendInvoiceInfoTableTmp (name and types need to match what was created on VendInvoiceInfoTable)
    1. We have to add the fields to this table because PurchFormletterParmDataInvoice.insertParmTable() calls a buf2buf() which loops through the field list.
  3. Create an extension class of table VendInvoiceInfoTable and apply CoC to the defaultRow method, which will populate the value onto both the regular and tmp instances of VendInvoiceInfoTable. However, it will not save to the database unless the remaining steps are completed.
    1. /// <summary>
      /// Default values for a vendor invoice header from a purchase order
      /// </summary>
      [ExtensionOf(tableStr(VendInvoiceInfoTable))]
      final class APInvoiceModsVendInvoiceInfoTable_Extension
      {
          /// <summary>
          /// CoC of table method defaultRow which resets default values
          /// </summary>
          /// <param name = "_purchTable">PurchTable</param>
          /// <param name = "_ledgerJournalTrans">LedgerJournalTrans</param>
          /// <param name = "_resetFieldState">Reset field state</param>
          public void defaultRow(PurchTable _purchTable, LedgerJournalTrans _ledgerJournalTrans, boolean _resetFieldState)
          {
              CustomEDTType defaultCustomField;

              next defaultRow(_purchTable, _ledgerJournalTrans, _resetFieldState);

              //find the default custom field from the vendor parameters
              defaultCustomField = VendParameters::find().DefaultCustomField;

              //save the default value to the header; it will get copied to VendInvoiceInfoTableTmp and then back to the main table by PurchFormletterParmDataInvoice.createInvoiceHeaderFromTempTable
              this.CustomField = defaultCustomField;
          }

      }
  4. Create an extension class of PurchFormletterParmDataInvoice and apply CoC to buildCreateInvoiceHeaderFromTempTableFieldQuery and buildCreateInvoiceHeaderFromTempTableFieldMap, because PurchFormletterParmDataInvoice.createInvoiceHeaderFromTempTable() creates a dynamic map of the fields to copy from the temp table into the main table.
    1. [ExtensionOf(classStr(PurchFormletterParmDataInvoice))]
      final class APInvoiceModsPurchFormletterParmDataInvoice_Extension
      {
          /// <summary>
          /// CoC method that adds our custom field to the field query selection from the temp table
          /// </summary>
          /// <param name = "_qbdsVendInvoiceInfoTableTmp">VendInvoiceInfoTableTmp</param>
          protected void buildCreateInvoiceHeaderFromTempTableFieldQuery(QueryBuildDataSource _qbdsVendInvoiceInfoTableTmp)
          {
              next buildCreateInvoiceHeaderFromTempTableFieldQuery(_qbdsVendInvoiceInfoTableTmp);

              //add our custom selection field
              _qbdsVendInvoiceInfoTableTmp.addSelectionField(fieldNum(VendInvoiceInfoTableTmp, CustomField));
          }

          /// <summary>
          /// CoC method that adds our custom field to the dynamic field map against the main table, which will get copied from the selection query
          /// </summary>
          /// <param name = "_qbdsVendInvoiceInfoTableTmp">VendInvoiceInfoTableTmp</param>
          /// <returns>Modified dynamic map to insert into the db</returns>
          protected Map buildCreateInvoiceHeaderFromTempTableFieldMap(QueryBuildDataSource _qbdsVendInvoiceInfoTableTmp)
          {
              var targetToSourceMap = next buildCreateInvoiceHeaderFromTempTableFieldMap(_qbdsVendInvoiceInfoTableTmp);

              targetToSourceMap.insert(fieldStr(VendInvoiceInfoTable, CustomField), [_qbdsVendInvoiceInfoTableTmp.uniqueId(), fieldStr(VendInvoiceInfoTableTmp, CustomField)]);

              return targetToSourceMap;
          }

      }

By applying our populate logic in the defaultRow() method, the value will also be populated on a blank vendor invoice once a vendor account is selected.

Monday, May 20, 2019

D365FO / Azure Powershell NSG Rule Scripting to allow Azure Datacenters access


Currently we need to set up Azure-based VMs for ISV products. Because of this, we need to create NSG (network security group) rules to allow inbound and outbound communication so that D365FO can communicate with these machines. However, because the D365FO instances will not have a static IP, we need to account for all of the Azure data center IPs so that we do not leave the public IP/port open to unwanted communication.

This is my first stab at Azure-based PowerShell scripting, so I am not sure this is optimal, but there doesn't seem to be much information out there about D365FO communicating with outside servers that are also hosted on Azure. Hopefully this will help others.

The official list of Azure Data Center IP's can be found at the following locations:



In the following example we will open up port 21 for inbound and outbound communication. The script checks whether the NSG rule exists; if it does, it updates the rule, and if no rule is found, it creates the rule against the defined NSG.


You will need to gather the following data points in order for this to work.

Azure subscription id
Azure resource group that the network security group is associated with
Azure network security group name


All of the information can be found on either the overview page of the VM you are creating an NSG rule for or on the NSG itself.


It is good to note that NSGs currently have a limit of 1,000 rules; because of this we cannot define a rule per IP, but must instead create a single rule that contains multiple IP ranges.




# Sign-in with Azure account credentials
Login-AzureRmAccount

# Select Azure subscription
# Azure subscription id
$subscriptionId = '';
# Azure resource group name associated with the network security group
$rgName = '';
# Azure network security group that we need to create the rule against
$nsgname = '';

# Download current list of Azure public IP ranges
Write-Host "Downloading AzureCloud IP addresses..."
$downloadUri = "https://www.microsoft.com/en-us/download/confirmation.aspx?id=56519"
$downloadPage = Invoke-WebRequest -Uri $downloadUri;
$request = ($downloadPage.RawContent.Split('"') -like "*.json")[0];
$json = Invoke-WebRequest -Uri $request | ConvertFrom-Json | Select Values
$ipRanges = ($json.values | Where-Object {$_.Name -eq 'AzureCloud'}).properties.addressPrefixes

# Set the rule priority
$rulePriority = 200

# Define the rule names
$ruleNameOut = "Allow_AzureDataCenters_Out"
$ruleNameIn = "Allow_AzureDataCenters_In"

# Nonprod network security group
$nsg = Get-AzureRmNetworkSecurityGroup -Name $nsgname -ResourceGroupName $rgName -ErrorAction:Stop;
Write-Host "Applying AzureCloud IP addresses to non production NSG $nsgname..."

# Check to see if the inbound rule already exists
$inboundRule = ($nsg | Get-AzureRmNetworkSecurityRuleConfig -Name $ruleNameIn -ErrorAction SilentlyContinue)

if ($inboundRule -eq $null)
{
    # Create the inbound rule
    $nsg | Add-AzureRmNetworkSecurityRuleConfig -Name $ruleNameIn -Description "Allow inbound from Azure data centers" -Access Allow -Protocol * -Direction Inbound -Priority $rulePriority -SourceAddressPrefix $ipRanges -SourcePortRange * -DestinationAddressPrefix VirtualNetwork -DestinationPortRange 21 -ErrorAction:Stop | Set-AzureRmNetworkSecurityGroup -ErrorAction:Stop | Out-Null;
    Write-Host "Created NSG rule $ruleNameIn for $nsgname"
}
else
{
    # Update the inbound rule
    $nsg | Set-AzureRmNetworkSecurityRuleConfig -Name $ruleNameIn -Description "Allow inbound from Azure data centers" -Access Allow -Protocol * -Direction Inbound -Priority $rulePriority -SourceAddressPrefix $ipRanges -SourcePortRange * -DestinationAddressPrefix VirtualNetwork -DestinationPortRange 21 -ErrorAction:Stop | Set-AzureRmNetworkSecurityGroup -ErrorAction:Stop | Out-Null;
    Write-Host "Updated NSG rule $ruleNameIn for $nsgname"
}

# Check to see if the outbound rule already exists
$outboundRule = ($nsg | Get-AzureRmNetworkSecurityRuleConfig -Name $ruleNameOut -ErrorAction SilentlyContinue)

if ($outboundRule -eq $null)
{
    # Create the outbound rule
    $nsg | Add-AzureRmNetworkSecurityRuleConfig -Name $ruleNameOut -Description "Allow outbound to Azure data centers" -Access Allow -Protocol * -Direction Outbound -Priority $rulePriority -SourceAddressPrefix VirtualNetwork -SourcePortRange * -DestinationAddressPrefix $ipRanges -DestinationPortRange 21 -ErrorAction:Stop | Set-AzureRmNetworkSecurityGroup -ErrorAction:Stop | Out-Null;
    Write-Host "Created NSG rule $ruleNameOut for $nsgname"
}
else
{
    # Update the outbound rule
    $nsg | Set-AzureRmNetworkSecurityRuleConfig -Name $ruleNameOut -Description "Allow outbound to Azure data centers" -Access Allow -Protocol * -Direction Outbound -Priority $rulePriority -SourceAddressPrefix VirtualNetwork -SourcePortRange * -DestinationAddressPrefix $ipRanges -DestinationPortRange 21 -ErrorAction:Stop | Set-AzureRmNetworkSecurityGroup -ErrorAction:Stop | Out-Null;
    Write-Host "Updated NSG rule $ruleNameOut for $nsgname"
}

Write-Host "Finished."

Wednesday, December 5, 2018

D365FO / Azure - Create website redirect in azure to handle supplemental environment url changes

Currently, when dealing with D365FO supplemental environments within LCS that are all hosted on a single machine, anytime we update them and there is an error, Microsoft's recommended fix is to stand up a new environment. Because of this, the URL changes and you need to notify your users. To handle this scenario so your users do not need to worry about a new URL, you can create a redirect website in Azure that points to whatever URL you want; if your users bookmark the redirect URLs instead of the direct ones, you don't need to worry about who has what bookmarked.

Azure portal 

Step 1. In the Azure portal you need to create a web app. You do this by going to "Create a resource" and then choosing a web app.




Step 2. In the newly created webapp go to Advanced tools (Kudu)


Step 3. In Kudu go to Debug console > cmd


Step 4. Browse to /site/wwwroot and click on the add ("+") button and choose to create a new file and name it web.config

Step 5. Click on the edit button for the web.config file and add in the following and hit save.


<?xml version="1.0" encoding="utf-8" ?>
<configuration>
    <system.webServer>
        <rewrite>
            <rules>
                <rule name="Redirect test enviornment" stopProcessing="true">
                    <match url="^test" />
                    <action type="Redirect" url="https://testurl.cloudax.dynamics.com" redirectType="Permanent" />
                </rule>   
                <rule name="Redirect stage enviornment" stopProcessing="true">
                    <match url="^stage" />
                    <action type="Redirect" url="https://stageurl.cloudax.dynamics.com" redirectType="Permanent" />
                </rule>
                <rule name="Redirect dm enviornment" stopProcessing="true">
                    <match url="^dm" />
                    <action type="Redirect" url="https://datamigrationurl.cloudax.dynamics.com" redirectType="Permanent" />
                </rule>
                <rule name="Redirect qa enviornment" stopProcessing="true">
                    <match url="^qa" />
                    <action type="Redirect" url="https://qaurl.operations.dynamics.com/" redirectType="Permanent" />
                </rule>
                <rule name="Redirect header" stopProcessing="true">
                    <match url=".*" />
                <action type="Redirect" url="https://www.google.com" redirectType="Permanent" />
                </rule>
            </rules>
        </rewrite>
    </system.webServer>
</configuration>    


Step 6. After hitting save, you should now be able to access your Azure web app in the following manner:

https://yourwebappurl.com/test
https://yourwebappurl.com/qa
https://yourwebappurl.com/stage
https://yourwebappurl.com/dm

and they should now forward to their respective website.

If you do not include a path and just browse to https://yourwebappurl.com, the catch-all rule will forward you to google.com as well.
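To sanity-check the redirect rules without a browser, you can issue a request with auto-redirect disabled and inspect the status code and Location header. This is a hedged C# sketch; yourwebappurl.com is the placeholder web app URL from the steps above, and the Location value depends on your web.config mappings.

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

class RedirectCheck
{
    static async Task Main()
    {
        //disable auto-redirect so we can inspect the 301 response itself
        var handler = new HttpClientHandler { AllowAutoRedirect = false };
        using (var client = new HttpClient(handler))
        {
            HttpResponseMessage response = await client.GetAsync("https://yourwebappurl.com/test");

            //a permanent redirect should report 301 plus the mapped environment URL
            Console.WriteLine((int)response.StatusCode);
            Console.WriteLine(response.Headers.Location);
        }
    }
}
```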





D365FO - Cannot rename database - The database could not be exclusively locked to perform the operation

Currently in D365FO, when importing a new dataset via the bacpac method, you are supposed to import the data into a new database, rename the old AxDB to something else, and then rename the new AxDB_new to AxDB. In order to do this, previously we would turn off the following services:
  • Microsoft Dynamics 365 Unified Operations: Batch Management Service
  • Microsoft Dynamics 365 Unified Operations: Data Import Export Framework Service
  • Management Reporter 2012 Process Service
  • World Wide Web Publishing Service (IIS/W3SVC)

These SHOULD drop all connections to AxDB. However, I am noticing that now (in 8.1+) there are still connections from axOnline, axSystem, and a .NET provider that I am not sure where they are coming from, which causes the error:

Unable to rename AxDB

The database could not be exclusively locked to perform the operation.

I have tried putting the DB into single-user mode and restricted mode, but the connections still happen. So, in order to rename the databases and drop the connections, you can use the following script, which wraps everything into a single batch and seems to work well:



--set main db to single connections and drop the existing connections
ALTER DATABASE AxDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE
--rename the database to AxDB_old
ALTER DATABASE AxDB MODIFY NAME = AxDB_old
--set the old db to multi user
ALTER DATABASE AxDB_old SET MULTI_USER
--rename the new db to the main db name
ALTER DATABASE AxDB_new MODIFY NAME = AxDB        

Monday, November 5, 2018

D365FO - Oracle Virtual Box Dupe UUID found on new OneBox version

Currently I am using Oracle VM VirtualBox Manager to run my onebox instance of D365FO for development. However, anytime a new version comes out and I try to create a new instance of my onebox, I get the following error:

Cannot register the hard disk 'newD365FO.vhd' because a hard disk 'oldD365FO.vhd' with the same UUID already exists.

In order to fix the problem we need to do the following:

1. Open cmd.exe as admin and run the following
2. cd C:\Program Files\Oracle\VirtualBox\
3. VBOXMANAGE.EXE internalcommands sethduuid "C:\locationofd365vhd\FinandOps8.0withPlatUpdate15.vhd"



Once this has been completed, continue adding the disk to a new VM setup like you normally would, and you should no longer get the error.


Tuesday, August 28, 2018

D365FO - C#/X++ - Authentication & Custom web service examples

While starting my path with D365FO, needing to create custom web services to integrate outside systems with no previous Azure experience, I found myself spending a large amount of time figuring out how to properly authenticate with D365FO, along with the different options we now have compared to the old AX2012 AIF services.

There don't seem to be many resources online or in books about this, and each one I was able to find is structured differently depending on whether you are doing interactive logins, defined credentials, or using the web API/secret key, even though the differences are tiny.

Note: this is just my first stab at authentication so there might be better ways to do some of this. These are just examples that I have found to work well.


There are three different ways to accomplish authenticating with the D365FO system

1. Interactive login
2. Define username/password
3. Using secret key

Because option 2 requires the user to log in interactively before we can use it, the examples I am providing will not cover it in my C# app, but the code is listed at the bottom of this post.

If you receive the error "The user or administrator has not consented to use the application with ID # named #. Send an interactive authorization request for this user and resource", it is because the exact scenario I described has occurred, and you need to log in interactively to grant the application permission.

Files:
C# D365Auth - Download - This project includes the following:
The project includes the odataclient.cs/tt which is why it is so big

D365FO Azure Authentication - Interactive & Defined Creds (Secret key), Setting up a class to define multiple environments

SOAP - Calling custom webservice - single input/single output, No input/multiple output, Multiple inputs/multiple outputs, Single input/update record/single output(before and after update results)

JSON - No input/multiple outputs, Single input/single output

Odata - Calls customer entity to get basic customer info, filter customer entity by defined customer account

X++ D365FO - Download - This project includes the following:
Webservice class: a method that expects an input class and outputs a class, a method that expects a string and returns a string, a method with no parms that returns an output class, and a method that expects 2 parms and returns a string.

It is good to note that I found that if you give the service and the service group the same name, you may run into issues even though it will compile.



Outside of downloading the project, I thought I would post the 2 main methods I have defined for interactive and secret key login so it is easy to tell the difference.

Interactive:
        public static AuthenticationResult GenerateAzureToken(AXEnvironments _environmentName)
        {
            //create the context of the token
            AuthenticationContext context = new AuthenticationContext(_environmentName.AzureADTenant, TokenCache.DefaultShared);
            
            //create token for the client 
            Task<AuthenticationResult> task = context.AcquireTokenAsync(_environmentName.AzureResource, _environmentName.NativeAzureClientAppId, new Uri(_environmentName.UriString), new PlatformParameters(PromptBehavior.Always));
            
            //wait for the task to finish
            task.Wait();
            
            //return the result
            return task.Result;
        }

Defined:
        public static AuthenticationResult GenerateAzureToken(AXEnvironments _environment)
        {

            //create the credential object based upon the encrypted secret key in azure
            ClientCredential creds = new ClientCredential(_environment.WebAzureClientAppId, _environment.AzureClientSecret);

            //setup the context of the token
            AuthenticationContext context = new AuthenticationContext(_environment.AzureADTenant, TokenCache.DefaultShared);
            
            //generate the token
            Task<AuthenticationResult> task = context.AcquireTokenAsync(_environment.AzureResource, creds);
            
            //wait for the task to finish
            task.Wait();
            
            //return the result
            return task.Result;
        }


Both of the methods are called via
        public static string GetAzureAuthenticationHeader(AXEnvironments _environment)
        {
            AuthenticationResult result;
            string authHeader = "";

            try
            {
                //generate the token and get an authorization
                result = GenerateAzureToken(_environment);
                authHeader = result.CreateAuthorizationHeader();
            }
            catch
            {
                authHeader = "";
            }

            return authHeader;
        }

Which is passed to
string authHeader = D365Auth.AXAuthorizationDefined.GetAzureAuthenticationHeader(currentEnvironment);

which is then sent in the HTTP request header.
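For reference, here is a hedged sketch of attaching that header to an outgoing request; the endpoint URL and the environment variable are placeholders, and GetAzureAuthenticationHeader is the method shown above.

```csharp
//generate the bearer token header for the target environment
string authHeader = D365Auth.AXAuthorizationDefined.GetAzureAuthenticationHeader(environment);

//attach it to the outgoing request before calling the custom service
HttpWebRequest request = (HttpWebRequest)WebRequest.Create("https://yourenvironment.cloudax.dynamics.com/...");
request.Headers["Authorization"] = authHeader;
request.Method = "POST";
```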


Just for reference if you do want to use a defined username and password you can use the following

        public static AuthenticationResult GenerateAzureToken(AXEnvironments _environment)
        {

            //create the credential object based upon the stored username/password
            UserPasswordCredential credential = new UserPasswordCredential(_environment.UserName, _environment.Password);
            
            //setup the context of the token
            AuthenticationContext context = new AuthenticationContext(_environment.AzureADTenant, TokenCache.DefaultShared);
            
            //generate the token
            Task<AuthenticationResult> task = context.AcquireTokenAsync(_environment.AzureResource, _environment.AzureClientAppId, credential);
            
            //wait for the task to finish
            task.Wait();
            
            //return the result
            return task.Result;
        }




This post assumes you have already set up authentication in Azure and D365FO, which is covered in D365FO - Azure authentication setup for web services (defined creds and interactive login).






Azure - Delete button on native application registration is disabled / cannot remove

While figuring out how authentication between Azure <--> D365FO <--> a custom web service works, I found myself unable to delete one of the application registrations I created: the delete button was disabled for the native application I created, but not for the web API registration.


In order to fix this we need to do the following:



1: Click on manifest -> Edit

2: Change the property 'availableToOtherTenants' from true to false




3: Hit Save


The delete button should now be enabled.




If you have the same issue with a registered app of type Web app/API, you can use the GUI to update this property by going to Settings > Properties > Multi-tenanted and simply changing it from Yes to No.