Wednesday, December 5, 2018

D365FO / Azure - Create a website redirect in Azure to handle supplemental environment URL changes

Currently, when dealing with D365FO supplemental environments within LCS that are all hosted on a single machine, Microsoft's recommended fix anytime an update fails with an error is to stand up a new environment. Because of this, the URL changes and you have to notify your users. To handle this scenario so your users do not need to worry about a new URL, you can create a redirect website in Azure that points to whatever URL you want; if your users bookmark these redirect URLs instead of the direct ones, you don't need to worry about who has what bookmarked.

Azure portal 

Step 1. In the Azure portal, create a web app by going to "Create a resource" and choosing a web app.




Step 2. In the newly created web app, go to Advanced Tools (Kudu)


Step 3. In Kudu go to Debug console > cmd


Step 4. Browse to /site/wwwroot, click the add ("+") button, choose to create a new file, and name it web.config

Step 5. Click the edit button for the web.config file, add the following, and hit save.


<?xml version="1.0" encoding="utf-8" ?>
<configuration>
    <system.webServer>
        <rewrite>
            <rules>
                <rule name="Redirect test enviornment" stopProcessing="true">
                    <match url="^test" />
                    <action type="Redirect" url="https://testurl.cloudax.dynamics.com" redirectType="Permanent" />
                </rule>   
                <rule name="Redirect stage enviornment" stopProcessing="true">
                    <match url="^stage" />
                    <action type="Redirect" url="https://stageurl.cloudax.dynamics.com" redirectType="Permanent" />
                </rule>
                <rule name="Redirect dm enviornment" stopProcessing="true">
                    <match url="^dm" />
                    <action type="Redirect" url="https://datamigrationurl.cloudax.dynamics.com" redirectType="Permanent" />
                </rule>
                <rule name="Redirect qa enviornment" stopProcessing="true">
                    <match url="^qa" />
                    <action type="Redirect" url="https://qaurl.operations.dynamics.com/" redirectType="Permanent" />
                </rule>
                <rule name="Redirect header" stopProcessing="true">
                    <match url=".*" />
                <action type="Redirect" url="https://www.google.com" redirectType="Permanent" />
                </rule>
            </rules>
        </rewrite>
    </system.webServer>
</configuration>    


Step 6. After hitting save, you should be able to access your Azure web app in the following manner:

https://yourwebappurl.com/test
https://yourwebappurl.com/qa
https://yourwebappurl.com/stage
https://yourwebappurl.com/dm

and each should forward to its respective website.

If you browse to just https://yourwebappurl.com without including one of those paths, the catch-all rule will forward you to google.com as well.
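If you want to sanity-check the rules without clicking through a browser, a quick console sketch like the one below can request each path and print the Location header it gets back. The web app URL is a placeholder, and the paths simply match the rules above.

        using System;
        using System.Net.Http;
        using System.Threading.Tasks;

        class RedirectCheck
        {
            static async Task Main()
            {
                //don't follow redirects automatically so we can inspect the Location header ourselves
                HttpClientHandler handler = new HttpClientHandler { AllowAutoRedirect = false };
                using (HttpClient client = new HttpClient(handler))
                {
                    foreach (string path in new[] { "test", "qa", "stage", "dm" })
                    {
                        //placeholder web app URL - replace with your redirect site's address
                        HttpResponseMessage response = await client.GetAsync("https://yourwebappurl.com/" + path);
                        Console.WriteLine(path + " -> " + response.StatusCode + ": " + response.Headers.Location);
                    }
                }
            }
        }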





D365FO - Cannot rename database - The database could not be exclusively locked to perform the operation

Currently in D365FO, when importing a new dataset via the bacpac method, you are supposed to import the data into a new database, rename the old AxDB to something else, and then rename the new AxDB_new to AxDB. In order to do this, we would previously turn off the following services:
  • Microsoft Dynamics 365 Unified Operations: Batch Management Service
  • Microsoft Dynamics 365 Unified Operations: Data Import Export Framework Service
  • Management Reporter 2012 Process Service
  • World Wide Web Publishing Service (IIS/W3SVC)

which SHOULD drop all connections to AxDB. However, I am noticing that now (in 8.1+) there are still connections from axOnline, axSystem, and a .NET provider that I am not sure where they are coming from, which causes the error

Unable to rename AxDB

The database could not be exclusively locked to perform the operation.

I have tried putting the database into single-user mode and restricted mode, however the connections still come back. So, in order to drop the connections and rename the databases, you can use the following script, which wraps everything into a single batch and seems to work well:



--set main db to single user and drop the existing connections
ALTER DATABASE AxDB SET SINGLE_USER WITH ROLLBACK IMMEDIATE
--rename database to AxDB_old
ALTER DATABASE AxDB MODIFY NAME = AxDB_old
--set the old db to multi user
ALTER DATABASE AxDB_old SET MULTI_USER
--rename the new db to the main db name
ALTER DATABASE AxDB_new MODIFY NAME = AxDB        

Monday, November 5, 2018

D365FO - Oracle VirtualBox duplicate UUID found on new OneBox version

Currently I am using Oracle VM VirtualBox Manager to run my OneBox instance of D365FO for development. However, anytime a new version comes out and I try to create a new instance of my OneBox, I get the following error:

Cannot register the hard disk 'newD365FO.vhd' because a hard disk 'oldD365FO.vhd' with UUID already exists.

In order to fix the problem we need to do the following:

1. Open cmd.exe as admin and run the following
2. cd C:\Program Files\Oracle\VirtualBox\
3. VBOXMANAGE.EXE internalcommands sethduuid "C:\locationofd365vhd\FinandOps8.0withPlatUpdate15.vhd"



Once this has completed, continue adding the disk to a new VM setup like you normally would, and you should no longer get the error.


Tuesday, August 28, 2018

D365FO - C#/X++ - Authentication & Custom web service examples

While starting my path with D365FO, needing to create custom web services to integrate outside systems with it and having no previous Azure experience, I found myself spending a large amount of time trying to figure out how to properly authenticate with D365FO, along with understanding the different options we now have compared to the old AX 2012 AIF services.

There don't seem to be many resources online or in books about this, and each one I was able to find is structured differently depending on whether it covers interactive logins, defined creds, or the web API/secret key, even though the difference between them is tiny.

Note: this is just my first stab at authentication so there might be better ways to do some of this. These are just examples that I have found to work well.


There are three different ways to authenticate with the D365FO system:

1. Interactive login
2. Define username/password
3. Using secret key

Because option 2 requires the user to log in interactively once before we can use it, the examples I am providing will not go over it in my C# app, but the code is listed at the bottom of this post.

If you receive the error "The user or administrator has not consented to use the application with ID # named #. Send an interactive authorization request for this user and resource",

it is because that exact scenario has occurred, and you need to log in interactively to grant the application permission.

Files:
C# D365Auth - Download - This project includes the following (it also contains odataclient.cs/.tt, which is why the download is so big):

D365FO Azure Authentication - Interactive & Defined Creds (Secret key), Setting up a class to define multiple environments

SOAP - Calling custom webservice - single input/single output, No input/multiple output, Multiple inputs/multiple outputs, Single input/update record/single output(before and after update results)

JSON - No input/multiple outputs, Single input/single output

OData - Calls the customer entity to get basic customer info, filters the customer entity by a defined customer account

X++ D365FO - Download - This project includes the following:
Web service class: a method that expects an input class and returns an output class, a method that expects a string and returns a string, a method with no parms that returns an output class, and a method that expects 2 parms and returns a string.

It is good to note that I found that if you give the service and the service group the same name, you may run into issues even though it will compile.



Outside of downloading the project, I thought I would post the two main methods I have defined for interactive and secret-key login so it is easy to tell the difference.
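Both methods take an AXEnvironments object that describes the target environment. For reference, here is a minimal sketch of what such a class might contain; the property names come from the method bodies below, but the exact class in the downloadable project may differ.

        //Minimal sketch of an environment definition class - property names are taken
        //from the authentication methods below; the real class in the project may differ.
        public class AXEnvironments
        {
            public string AzureADTenant { get; set; }          //e.g. https://login.windows.net/yourtenant.onmicrosoft.com (assumed authority format)
            public string AzureResource { get; set; }          //the D365FO root URL the token is requested for
            public string UriString { get; set; }              //redirect URI registered on the native app
            public string NativeAzureClientAppId { get; set; } //application ID of the native app registration (interactive)
            public string WebAzureClientAppId { get; set; }    //application ID of the web app / API registration (defined creds)
            public string AzureClientSecret { get; set; }      //secret key generated for the web app / API registration
            public string AzureClientAppId { get; set; }       //application ID used with the username/password option
            public string UserName { get; set; }               //only needed for the username/password option
            public string Password { get; set; }
        }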

Interactive:
        public static AuthenticationResult GenerateAzureToken(AXEnvironments _environmentName)
        {
            //create the context of the token
            AuthenticationContext context = new AuthenticationContext(_environmentName.AzureADTenant, TokenCache.DefaultShared);
            
            //create token for the client 
            Task<AuthenticationResult> task = context.AcquireTokenAsync(_environmentName.AzureResource, _environmentName.NativeAzureClientAppId, new Uri(_environmentName.UriString), new PlatformParameters(PromptBehavior.Always));
            
            //wait for the task to finish
            task.Wait();
            
            //return the result
            return task.Result;
        }

Defined:
        public static AuthenticationResult GenerateAzureToken(AXEnvironments _environment)
        {

            //create the credential object based upon the encrypted secret key in azure
            ClientCredential creds = new ClientCredential(_environment.WebAzureClientAppId, _environment.AzureClientSecret);

            //setup the context of the token
            AuthenticationContext context = new AuthenticationContext(_environment.AzureADTenant, TokenCache.DefaultShared);
            
            //generate the token
            Task<AuthenticationResult> task = context.AcquireTokenAsync(_environment.AzureResource, creds);
            
            //wait for the task to finish
            task.Wait();
            
            //return the result
            return task.Result;
        }


Both of the methods are called via
        public static string GetAzureAuthenticationHeader(AXEnvironments _environment)
        {
            AuthenticationResult result;
            string authHeader = "";

            try
            {
                //generate the token and get an authorization
                result = GenerateAzureToken(_environment);
                authHeader = result.CreateAuthorizationHeader();
            }
            catch
            {
                authHeader = "";
            }

            return authHeader;
        }

which in turn is called like this:

string authHeader = D365Auth.AXAuthorizationDefined.GetAzureAuthenticationHeader(currentEviornment);

and the resulting header is then sent in the HTTP request's Authorization header.
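To make that last step concrete, the sketch below attaches the header to an HttpClient call against the OData endpoint. This is only an illustration: the entity name (Customers), the $top option, and the URL pattern are assumptions on my part, so substitute whatever entity or custom service URL you are actually calling.

        public static async Task<string> GetCustomersAsync(AXEnvironments environment)
        {
            //generate the bearer token header using the methods shown above
            string authHeader = D365Auth.AXAuthorizationDefined.GetAzureAuthenticationHeader(environment);

            using (HttpClient client = new HttpClient())
            {
                //authHeader comes back as "Bearer <token>", so parse it into scheme + parameter
                client.DefaultRequestHeaders.Authorization = AuthenticationHeaderValue.Parse(authHeader);
                client.DefaultRequestHeaders.Accept.Add(new MediaTypeWithQualityHeaderValue("application/json"));

                //hypothetical OData call - entity name and $top are just for illustration
                string url = environment.AzureResource.TrimEnd('/') + "/data/Customers?$top=10";
                HttpResponseMessage response = await client.GetAsync(url);
                response.EnsureSuccessStatusCode();
                return await response.Content.ReadAsStringAsync();
            }
        }

(This assumes using System.Net.Http, System.Net.Http.Headers, and System.Threading.Tasks.)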


Just for reference, if you do want to use a defined username and password, you can use the following:

        public static AuthenticationResult GenerateAzureToken(AXEnvironments _environment)
        {

            //create the credential object based upon the stored username/password
            UserPasswordCredential credential = new UserPasswordCredential(_environment.UserName, _environment.Password);
            
            //setup the context of the token
            AuthenticationContext context = new AuthenticationContext(_environment.AzureADTenant, TokenCache.DefaultShared);
            
            //generate the token
            Task<AuthenticationResult> task = context.AcquireTokenAsync(_environment.AzureResource, _environment.AzureClientAppId, credential);
            
            //wait for the task to finish
            task.Wait();
            
            //return the result
            return task.Result;
        }
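
For completeness, here is a rough sketch of how the same authorization header can be attached when calling one of the custom SOAP services from the X++ project. The service contract, method, and service group names are placeholders, and the /soap/services/ endpoint pattern is an assumption, so adjust it to match your own service.

        //hypothetical WCF contract matching a custom X++ service - names are placeholders
        [ServiceContract]
        public interface IMyCustomService
        {
            [OperationContract]
            string MyMethod(string input);
        }

        public static string CallCustomService(AXEnvironments environment, string authHeader)
        {
            //assumed endpoint pattern: <environment URL>/soap/services/<service group name>
            BasicHttpBinding binding = new BasicHttpBinding(BasicHttpSecurityMode.Transport);
            EndpointAddress endpoint = new EndpointAddress(environment.AzureResource.TrimEnd('/') + "/soap/services/MyServiceGroup");
            ChannelFactory<IMyCustomService> factory = new ChannelFactory<IMyCustomService>(binding, endpoint);
            IMyCustomService channel = factory.CreateChannel();

            using (new OperationContextScope((IContextChannel)channel))
            {
                //attach the bearer token to the outgoing HTTP request
                HttpRequestMessageProperty requestProperty = new HttpRequestMessageProperty();
                requestProperty.Headers[HttpRequestHeader.Authorization] = authHeader;
                OperationContext.Current.OutgoingMessageProperties[HttpRequestMessageProperty.Name] = requestProperty;

                return channel.MyMethod("some input");
            }
        }

(This uses System.ServiceModel, System.ServiceModel.Channels, and System.Net.)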




This post assumes you have already set up authentication in Azure and D365FO, which is covered in D365FO - Azure authentication setup for web services (defined creds and interactive login).






Azure - Delete button on native application registration is disabled / cannot remove

While figuring out how authentication between Azure <--> D365FO <--> a custom web service works, I found myself unable to delete one of the application registrations I had created: the delete button was disabled for the native application but not for the web API registration.


In order to fix this we need to do the following:



1: Click on manifest -> Edit

2: Change the property 'availableToOtherTenants' from true to false




3: Hit Save


The delete button should now be enabled.




If you have the same issue with a registered app that is of type Web app / API, you can update this property through the GUI by going to Settings > Properties > Multi-tenanted and simply changing it from Yes to No.




D365FO - Azure authentication setup for web services (defined creds and interactive login)

When developing web services within D365FO for other applications/languages, we need to follow a new authentication process compared to AX 2012 since everything is hosted on Azure. The following shows the settings that need to be defined within your Azure dashboard and within D365FO to enable authentication for executing a custom web service (SOAP/JSON) or OData calls.

It is good to note that you may need to set up a web API or a native application depending on what you are trying to accomplish. Both are about the same; the native application just does not require the key generation.

Before I go over the steps, the following link covers the multiple types of authentication for Azure and explains the difference between native and web API auth scenarios: https://docs.microsoft.com/en-us/azure/active-directory/develop/authentication-scenarios


Azure Setup for defined credentials/web api

Register a web app / api

Step 1: In the Azure portal, go to Azure Active Directory > App registrations > New Application registration

Step 2: Enter the environment information for D365 and hit Create

Step 3: Once initialized click on the settings button


Step 4: Click on required permissions > add > Select API


Step 5: Choose "Microsoft Dynamics ERP"

Select which options you want to give it access to. (usually all of them)

Choose done.

Step 6: Choose "Keys"

Under Passwords, input a description and choose when the key should expire






Hit the save button and copy the "value", aka the key. This is what will be used as the "handshake".


Step 7 (optional): Do the same thing but for a native application (interactive login). Just enter the main login URL as the redirect URL. You will not need the key generation part.



D365FO setup:

Step 8: In D365FO, go to System administration > Setup > Azure Active Directory applications and create a new record for the app registration with the client ID(s) from the Azure setup. If you are using defined creds via the web app / API type, the User ID listed on this screen is what the system will log on any new records created via the web service.
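
To tie this back to the C# authentication post above, here is a hypothetical illustration of how the values gathered in these steps could be plugged into the environment class used there. Every value shown is a placeholder, and the authority URL format is an assumption; use your own tenant, environment URL, application ID, and key.

        //hypothetical mapping of the Azure/D365FO values above into the environment object - all values are placeholders
        AXEnvironments environment = new AXEnvironments
        {
            //AAD tenant authority (assumed format)
            AzureADTenant       = "https://login.windows.net/yourtenant.onmicrosoft.com",
            //the D365FO environment URL the token is requested for
            AzureResource       = "https://yourenvironment.cloudax.dynamics.com",
            //application ID from the web app / API registration created in steps 1-3
            WebAzureClientAppId = "00000000-0000-0000-0000-000000000000",
            //the key "value" saved in step 6
            AzureClientSecret   = "<secret key value>"
        };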








Monday, August 6, 2018

D365FO - TableGroupAll dynamic lookup based on previous field example

The following is an example of how to add a lookup field to a form whose lookup source changes depending on a previous selection. In the example below we have a mandatory field that has to be filled in before the record can be inserted into a table; the field is populated by a dropdown from either CustomCustomerAccountView or CustomCustomerGroupsView, which is determined by a previous field in the datasource. The value that gets saved will be one of two different EDT types, or blank, but it is stored in the same field. This utilizes the idea behind the default enum TableGroupAll, which allows the user to select Table, GroupId, or All as an enum value and then select something that is of type table (specific), group, or all.

Our detailed scenario:

Form Details:
A custom form called MyCustomForm, where we need to look at the control MyCustomForm_AccountCode to determine what the lookup for MyCustomForm_AccountSelection will show.

Table Details:
Table: MyCustomRuleTable - this is where the values of AccountCode (enum TableGroupAll) and AccountSelection (lookup value) will be saved

Lookup Details:
TableGroupAll::Table = pull the data from CustomCustomerAccountView and save the customer account id
TableGroupAll::GroupId = pull the data from CustomCustomerGroupsView and save the customer group id
TableGroupAll::All = do not allow the user to select anything and the value should be blank

This field will be in a grid, so we also need to determine when to enable/disable the selection field as well as when to mark it as mandatory.



[ExtensionOf(formStr(MyCustomForm))]
final class MyCustomForm_Extension
{
    /// <summary>
    /// Lookup code for account selection based on TableGroupAll selection
    /// </summary>
    /// <param name="sender"></param>
    /// <param name="e"></param>
    [FormControlEventHandler(formControlStr(MyCustomForm, MyCustomForm_AccountSelection), FormControlEventType::Lookup)]
    public static void MyCustomForm_AccountSelection_OnLookup(FormControl sender, FormControlEventArgs e)
    {
        Query query = new Query();
        QueryBuildDataSource queryBuildDataSource;
        SysTableLookup sysTableLookup;
        FormRun formRun;
        FormControl formControl;
        TableGroupAll accountCodeType;

        //get the value of the account code selection
        formRun = sender.formRun();
        formControl = formRun.design().controlName(formControlStr(MyCustomForm, MyCustomForm_AccountCode));

     

        // check to see what the value of the account code field is to determine what lookup table we should use
        if(str2Enum(accountCodeType, formControl.valueStr()) == TableGroupAll::Table)
        {
            //look up that shows the customer number and name
            sysTableLookup = SysTableLookup::newParameters(tableNum(CustomCustomerAccountView),sender,true);
            sysTableLookup.addLookupfield(fieldNum(CustomCustomerAccountView, AccountNum),true);
            sysTableLookup.addLookupfield(fieldNum(CustomCustomerAccountView, Name));

            queryBuildDataSource = query.addDataSource(tableNum(CustomCustomerAccountView));
            queryBuildDataSource.addSortField(fieldnum(CustomCustomerAccountView,AccountNum));

            sysTableLookup.parmQuery(query);
            sysTableLookup.performFormLookup();
        }
        else if(str2Enum(accountCodeType, formControl.valueStr()) == TableGroupAll::GroupId)
        {
            //lookup that shows the customer group id and descriptions
            sysTableLookup = SysTableLookup::newParameters(tableNum(CustomCustomerGroupsView),sender,true);
            sysTableLookup.addLookupfield(fieldNum(CustomCustomerGroupsView, GroupCodeId),true);
            sysTableLookup.addLookupfield(fieldNum(CustomCustomerGroupsView, GroupCodeDescription));

            queryBuildDataSource = query.addDataSource(tableNum(CustomCustomerGroupsView));
            queryBuildDataSource.addSortField(fieldnum(CustomCustomerGroupsView,GroupCodeId));

            sysTableLookup.parmQuery(query);
            sysTableLookup.performFormLookup();
        }

        //cancel super() to prevent error.
        FormControlCancelableSuperEventArgs ce = e as FormControlCancelableSuperEventArgs;
        ce.CancelSuperCall();
    }
 
    /// <summary>
    ///  Selecting the current record from the forms data source
    /// </summary>
    /// <param name="sender">Forms data source as customtablename</param>
    /// <param name="e">Data source event</param>
    [FormDataSourceEventHandler(formDataSourceStr(MyCustomForm, MyCustomRuleTable), FormDataSourceEventType::Activated)]
    public static void MyCustomRuleTable_OnActivated(FormDataSource sender, FormDataSourceEventArgs e)
    {
        FormDataSource formDS = sender.formRun().dataSource(formDataSourceStr(MyCustomForm, MyCustomRuleTable));
        MyCustomRuleTable currentRule = formDS.cursor();
        boolean enableAccountSelection;


        //if the parent code is currently set to all then we should not allow the user to select the child selection field
        enableAccountSelection = currentRule.AccountCode == TableGroupAll::All ? false : true;

        //apply the allow edit/mandatory logic checks
        formDS.object(fieldNum(MyCustomRuleTable, AccountSelection)).allowEdit(enableAccountSelection);
        formDS.object(fieldNum(MyCustomRuleTable, AccountSelection)).mandatory(enableAccountSelection);
    }

    /// <summary>
    /// Account code modified event
    /// </summary>
    /// <param name="sender">MyCustomRuleTable.AccountCode</param>
    /// <param name="e">Event args</param>
    [FormDataFieldEventHandler(formDataFieldStr(MyCustomForm, MyCustomRuleTable, AccountCode), FormDataFieldEventType::Modified)]
    public static void AccountCode_OnModified(FormDataObject sender, FormDataFieldEventArgs e)
    {
        FormDataSource formDS = sender.datasource();
        MyCustomRuleTable currentRule = formDS.cursor();
        boolean enableAccountSelection;

        //clear the current account selection anytime the account code is changed
        currentRule.AccountSelection = "";

        //if the parent code is currently set to all then we should not allow the user to select the child selection field
        enableAccountSelection = currentRule.AccountCode == TableGroupAll::All ? false : true;

        //apply the allow edit/mandatory logic checks
        formDS.object(fieldNum(MyCustomRuleTable, AccountSelection)).allowEdit(enableAccountSelection);
        formDS.object(fieldNum(MyCustomRuleTable, AccountSelection)).mandatory(enableAccountSelection);
    }
}

D365FO - Extension method data accessor examples


Six months into learning how to transition from AX 2012 X++ to D365FO X++, one of the things I have been struggling with in the new extension model is how many different code structures you need in order to access the calling method's properties/datasource when subscribing to various types of events or methods. So I started to log each new pattern as I discovered it. I am sure there are plenty of others that I am missing, as I have only touched the tip of the iceberg when it comes to field events, but I decided it would be a good time to post this information as a reference. There are more than likely better ways to do some of these, however I am posting this as a starting point.

I will be updating this post as my experience with D365FO grows as well

Each entry below lists Source | Event | Parm, followed by the example.

Source: Class | Event: Pre/Post event | Parm: XppPrePostArgs
Get args and parameter values from a method that is being extended. Parm 1 = Object, Parm 2 = Common.
PurchCreateFromSalesOrder callingClass = args.getThis() as PurchCreateFromSalesOrder;
Object callerObject = args.getArgNum(1) as Object;
Common callerRecord = args.getArgNum(2) as Common;

Source: Class | Event: Pre/Post event | Parm: XppPrePostArgs
Class example:
SalesLineType salesLineType = args.getThis() as SalesLineType;

Source: Class | Event: main | Parm: args
Getting the caller record that is sent to a class via args. This is the same as 2012, however args.record().datasource() is now deprecated.
//check the caller
if(_args.callerName() == formStr(SalesTable))
{
        //check to see which dataset type was passed
        if(_args.dataset() == tableNum(SalesLine))
        {
                //get the forms datasource record (FormDataUtil::getFormDataSource() replaced _args.record().datasource())
                FormDataSource salesLineDS = FormDataUtil::getFormDataSource(_args.record());
        }
}

Source: Form | Event: Initialized | Parm: xFormRun
FormDataSource purchLine = sender.dataSource(formDataSourceStr([formname],[table]));

Source: Form | Event: DataSource | Parm: FormDataSource
FormDataSource formDS = sender.formRun().dataSource(formDataSourceStr(EcoResProductDetailsExtended, MHSmartATPItemSettings));
MHSmartATPItemSettings smartATPItemSettings = formDS.cursor();

Source: Form | Event: DataSource Field | Parm: FormDataObject
FormDataSource formDS = sender.datasource();
PurchLine purchLine = formDS.cursor();

Source: Form | Event: Form Control | Parm: FormControl
FormRun formRun;
FormControl formControl;
formRun = sender.formRun();
formControl = formRun.design().controlName(formControlStr(<form name>, <control name>));
someVariable = formControl.valueStr();

Source: Form | Event: onClicked | Parm: FormControl
FormRun formRun = sender.formRun();
FormDataSource formDSSalesTable = formRun.dataSource(formDataSourceStr(SalesTable, SalesTable));
FormDataSource formDSSalesLine = formRun.dataSource(formDataSourceStr(SalesTable, SalesLine));

SalesTable salesTable = formDSSalesTable.cursor();
SalesLine salesLine = formDSSalesLine.cursor();

Source: Form | Event: Pre/Post event | Parm: XppPrePostArgs
FormRun formRun = args.getThis();
FormDataSource formDSLogisticsPostalAddress = formRun.dataSource(formDataSourceStr(LogisticsPostalAddress, LogisticsPostalAddress));
LogisticsPostalAddress logisticsPostalAddress = formDSLogisticsPostalAddress.cursor();

Source: Table | Event: onDelete | Parm: Common
PurchLine purchLine = sender as PurchLine;

Source: Table | Event: Modified Field Value | Parm: Common
TableName itemSettings = sender as TableName;
ModifyFieldValueEventArgs fieldEvent = e as ModifyFieldValueEventArgs;

//check to see which field was modified
switch(fieldEvent.parmFieldName())
{
    case fieldStr([tablename], [fieldname]):
        //...do stuff
        break;
}

Source: Table | Event: ValidateFieldValue | Parm: Common/DataEventArgs
ValidateFieldValueEventArgs fieldEvent = e;
boolean isValid;
PurchLine purchLine = sender as PurchLine;

//declare the checkFailed
isValid = checkFailed("some error event");
//save the result
fieldEvent.parmValidateResult(isValid);

Source: Table | Event: Pre/Post event | Parm: XppPrePostArgs
PurchLine purchLine = args.getThis() as PurchLine;

Source: Form | Event: Getting a different datasource from the calling object | Parm: FormDataObject
Get a different datasource from a FormDataObject, such as a modified InventDim.InventLocationId -> PurchLine.
FormDataSource formDS = sender.datasource();
InventDim inventDim = formDS.cursor();
FormRun formRun = sender.datasource().formRun();
FormDataSource formPurchLineDS = formRun.datasource(formDataSourceStr(PurchTable, PurchLine));
PurchLine purchLine = formPurchLineDS.cursor();

It is good to note that when it comes to accessing anything on a form, the key component is getting access to the FormRun object. Once you have that, you can access anything that is public on the form, such as controls or datasources.

Update 8/7/18: added class args example as args.record().datasource() is now deprecated
Update 1/23/19: added getting different datasource example than the sender's main common source

D365FO – Adding a custom .dll to a project that can be deployed via source control.


I recently wrote a custom C# .dll add-in for D365FO that reaches out to the Google Maps API. It seemed easy to integrate into a D365FO project, and it was. However, when I went to move it, I realized that it requires some extra steps in order to be deployed to another machine via TFS properly.

You will first need to add the reference node within Visual Studio to source control.






However, adding this reference file to source control will not actually add the .dll like you would think. It is just an XML file that references the .dll, which can be found at:

C:\AOSService\PackagesLocalDirectory\[package name]\[model/project]\AxReference

In order to add the actual .dll, go to Source Control Explorer, browse to the main package folder, right-click, and choose “add items to folder…”



Browse to the folder: C:\AOSService\PackagesLocalDirectory\[package]\bin

Select the actual .dll that was added to the project.

This will create a bin folder in the main project node within TFS, but it will only include the file you selected and not every file in the folder.
At this point, just check in the two files, then do a pull/get latest on the destination system, and the project should now compile.

So, to sum it up, be sure to include the following two files:
C:\AOSService\PackagesLocalDirectory\[package name]\[model/project]\AxReference\
And
C:\AOSService\PackagesLocalDirectory\[package]\bin\dllname.dll

or else the package/project will not compile when loaded onto a different server. 

Friday, August 3, 2018

D365FO - BP Rule: [BPUnusedStrFmtArgument]:The placeholder '%4' to strFmt is not used in the format string

Currently I am using the strFmt("mylabel:labelDesc", somevariable) method to display an alert to the user. Originally the label had three placeholders (%1, %2, %3); however, later on I added a fourth placeholder, %4, and now I get the following best practice error:

BP Rule: [BPUnusedStrFmtArgument]:The placeholder '%4' to strFmt is not used in the format string.

I have tried compiling the project and the model, running a DB sync, and retyping the line of code, but no matter what, the error still happens. In order to fix the error, you need to regenerate the label resources, which can be found at C:\AOSService\PackagesLocalDirectory\[package name]\Resources.

Open cmd.exe (as admin) and run the following, adjusting the drive letter based on where your AOS service is installed:
C:\AosService\PackagesLocalDirectory\bin\labelc.exe -metadata="C:\AosService\PackagesLocalDirectory" -output="C:\AosService\PackagesLocalDirectory\[package name]\Resources" -modelmodule="[package name]"


Monday, June 25, 2018

D365FO - Importing License File (ISV/VAR Add-on) via deployable package into QA/Prod

Because we do not have the same access in QA/production/build as we do in the lower-level environments, installing a license file is a bit different compared to development, test, and stage (any environment within the supplemental scope).

In order to deploy this license file to production or QA, we need to do the following. It is good to note that creating this package is not environment specific, meaning you can create the package on dev and deploy it to prod, or create it on test and push it to QA or stage. It is just a set of pre-defined scripts, which is why it doesn’t matter.

Any AOS server:

Windows Explorer > \AOSService\PackagesLocalDirectory\Bin\CustomDeployablePackage
*usually C:\AOSService\PackagesLocalDirectory\Bin\CustomDeployablePackage or on the K-drive*



In here you will find the file ImportISVLicense.zip

Make a copy of this zip file and open it up. Then browse to the following location: ImportISVLicense.zip\AosService\Scripts\License

Copy the license .txt file into this folder and save the zip.


NOTE: During my first try at this, I extracted the zip, copied the file into it, and zipped it back up. However, during the steps below that apply the license file, I received an error message saying the package has an invalid HotfixInstallationInfo.xml, even though I did not touch that file. If you modify the zip directly, LCS has no issues with it, so I recommend not extracting the zip file but rather copying the default one somewhere else as a template and making the change there.

It is also good to note that you can put multiple license files in a single package, but they will be imported in alphabetical order.
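
If you would rather script this step than edit the zip by hand, here is a minimal sketch using System.IO.Compression that copies the template package and adds the license file directly into the copy, without extracting and re-zipping (the file paths are placeholders):

        using System.IO;
        using System.IO.Compression;

        class LicensePackager
        {
            static void Main()
            {
                //placeholder paths - point these at your own template, output package, and license file
                string template = @"C:\AOSService\PackagesLocalDirectory\Bin\CustomDeployablePackage\ImportISVLicense.zip";
                string package = @"C:\Temp\ImportISVLicense_MyLicense.zip";
                string license = @"C:\licensefolder\license2018.txt";

                //work on a copy so the original template package stays untouched
                File.Copy(template, package, true);

                //add the license file directly into the zip so HotfixInstallationInfo.xml and the rest of the package are left alone
                using (ZipArchive archive = ZipFile.Open(package, ZipArchiveMode.Update))
                {
                    archive.CreateEntryFromFile(license, "AosService/Scripts/License/" + Path.GetFileName(license));
                }
            }
        }

(ZipFile.Open and CreateEntryFromFile come from System.IO.Compression / System.IO.Compression.FileSystem.)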


LCS > Open the project for the environment you wish to apply the license to, which in this case is the QA environment, meaning we need to open the implementation project. Then go to the Asset library.


Go to Software deployable package > click the add button, fill out the info for the file being uploaded, and select the zip file we created in the steps above.

You will need to give it a couple of minutes before the "valid" box has a check mark. You will not be able to apply the package until the asset is marked as valid.

Once the asset has been marked as valid, we need to apply it to the environment.

Go to the full details page for the specific environment we need to apply this to and click Maintain > Apply updates.




From this window we need to choose the asset we just created and click on apply



After this the system will process the request and trigger emails whenever the process starts and completes.

Whenever you open the environment's main page, you will also see the current status of the import.


From this point we can choose to either (A) do nothing and allow people to test, or (B) mark the asset as a release candidate.

You should now be able to log back into the environment and see that the ISV solution is there,

and that the License configuration page now has the license checked.


Friday, June 22, 2018

D365FO - OneBox Default account information

It seems like you have to dig through the internet to find OneBox login information, so I thought I would put it here to make it easily findable.

OneBox default windows login:

Login: local\Administrator
Password: pass@word1

OneBox SQL login
Login: axdbadmin   
Password: AOSWebSite@123


I will add more as I run across the need for different accounts that are associated with the OneBox.

D365FO - Maintenance mode / Importing License File (ISV/VAR Add-on License import)

In AX 2012, in order to import a license file for an ISV or VAR add-on, all you did was go to the license management form, click Import, select the file, and run a full sync. In D365FO this has changed dramatically.


In D365FO there is now only a license configuration form, available via System administration > Setup > License configuration; however, you will notice that the data is read-only.

(This form is read-only unless the system is in the maintenance mode. Maintenance mode can be enabled in this environment by running maintenance job from LCS, or using Deployment.Setup tool locally)

In order to change this from read-only to modifiable, we need to put the system into maintenance mode. We do this by opening a command prompt (cmd.exe).

Change the directory to your local \AosService\PackagesLocalDirectory\Bin\ folder:

>cd C:\AosService\PackagesLocalDirectory\Bin\

Put the service into maintenance mode:

>Microsoft.Dynamics.AX.Deployment.Setup.exe --metadatadir C:\AosService\PackagesLocalDirectory --bindir C:\AosService\PackagesLocalDirectory\Bin --sqlserver . --sqldatabase axdb --sqluser [sql login] --sqlpwd [sqlpassword] --setupmode maintenancemode --isinmaintenancemode true

Reset IIS/Cycle the AOS: > iisreset


The system should now be in maintenance mode, so we can import the license file.

Via the cmd.exe prompt, run the following.

Import license file:
> Microsoft.Dynamics.AX.Deployment.Setup.exe --setupmode importlicensefile --metadatadir C:\AOSService\PackagesLocalDirectory --bindir C:\AOSService\PackagesLocalDirectory --sqlserver . --sqldatabase AxDB --sqluser [sql login] --sqlpwd [sql password] --licensefilename C:\licensefolder\license2018.txt


In D365FO you should now be able to open System administration > Setup > License configuration and not see the read-only warning. You should see the name of the license that was imported, and you should be able to enable the configuration for the new license file.





Turn off maintenance mode

> Microsoft.Dynamics.AX.Deployment.Setup.exe --metadatadir C:\AosService\PackagesLocalDirectory --bindir C:\AosService\PackagesLocalDirectory\Bin --sqlserver . --sqldatabase axdb --sqluser [sql login] --sqlpwd [sqlpassword] --setupmode maintenancemode --isinmaintenancemode false



Reset IIS/Cycle the AOS: > iisreset


Notes:
The example above was for importing into a OneBox. Depending on the environment, you may need to change the drive from C:\* to K:\*.
For the import command, the --bindir is pointed at the metadata directory (PackagesLocalDirectory) rather than the Bin folder.
The sqluser and sqlpwd values will be different for each environment.
The database name may be different depending on the environment.


Reference:  D365FO - Importing License File (ISV/VAR Add-on) via deployable package into QA/Prod (importing license via deployable package)