Thursday, February 29, 2024

D365FO - Generating ZPL Label code based on template replacement

While exploring how the advanced warehouse document routing feature generates ZPL code from a dynamically defined dataset, I kept running into roadblocks when using the default D365 code base because of internal flags set on these classes. The out-of-the-box code really only wants to print labels and never let you view them, which is kind of silly if you think about it. Of course, there is also nothing documented about these classes (yet) on any of the boards, blogs, or official channels.

Because of this, I thought I would document how the replacement feature within these classes works, which allows you to dynamically generate ZPL based on a user-defined query within the UI. This is quite interesting and could be used elsewhere within the platform to let you define a custom query and then run a replacement based on it. I know there are many products out there that already do this, but I figured I would show how to do it on your own, since the majority of people still follow the approach of hardcoding the tables and variables that need to be replaced within the code base (guilty as charged).


Key tables to understand:

WHSLabelLayout: houses the header record and the generic settings for the ZPL template.

WHSLabelLayoutVersion: houses the ZPL code and contains the template that we will need.

WHSLabelLayoutDataSource: houses the dynamic data source query object, which can be defined dynamically by the user based on the main layout's calling record.


Key classes to understand:

WhsCustomLabelPrintCommandGenerator: WHS class that extends the base class WhsLabelPrintCommandGenerator. It overrides some of the methods in order to define a custom query object based on what was defined in WHSLabelLayoutDataSource.

WhsLabelPrintCommandGenerator: WHS base class that contains the overall calling logic to generate the labels as well as print them, whether the layout is dynamic or the out-of-the-box structure.

WhsDocumentRoutingTranslator: WHS translator object that holds the replacement structure for placeholders like $Table.FieldName$, along with the queryRun and language objects related to those variables.

WhsDocumentRoutingTemplateTranslator: WHS class that takes the translator, the queryRun, and the ZPL template and executes a find-and-replace of the dynamic variables defined within the template, based on the query structure defined by the user (see the sample template below).
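
To make the placeholder syntax concrete, here is a minimal, hypothetical template held in an X++ string. The placeholder names (PurchTable.PurchId, PurchTable.PurchName) are only illustrative; the actual tokens depend on the data sources in the query you define on the layout.

        //hypothetical ZPL template using $Table.FieldName$ placeholders
        //the translator replaces each token with the matching field value from the records returned by the layout's query
        str zplTemplate = '^XA'
            + '^CFA,40'
            + '^FO50,50^FD$PurchTable.PurchId$^FS'
            + '^FO50,120^FD$PurchTable.PurchName$^FS'
            + '^XZ';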



What's really great about this approach from Microsoft is that it basically lets you apply this logic to any scenario where you want to dynamically define variables based on a field listing or table methods, instead of hardcoding a strReplace for each one.
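
For contrast, the hardcoded approach most of us are used to looks something like the sketch below (the template string and field choices are purely illustrative); every new field means another line of code, which is exactly what the translator framework avoids.

        //hardcoded replacement: every placeholder has to be known at compile time
        PurchTable purchTable = PurchTable::find('PO-000123');
        str zpl = '^XA^FO50,50^FD$PurchId$^FS^FO50,120^FD$PurchName$^FS^XZ';

        zpl = strReplace(zpl, '$PurchId$', purchTable.PurchId);
        zpl = strReplace(zpl, '$PurchName$', purchTable.PurchName);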


The following shows how to take a custom-query layout for a purchase order (PurchTable) and apply it to a ZPL template. It is worth noting there is another type of template (out of the box) that I did not tackle here.



 

        PurchTable purchTable = PurchTable::find(_purchId);
        //find the specific layout we want to use based on the name
        WHSLabelLayout labelLayout = WHSLabelLayout::find(WHSLabelLayoutType::CustomLabel, "Purchase orders");
        WHSLabelLayoutVersion layoutSourceVersion;

        //find the specific layout version which houses the zpl code
        select firstonly * from layoutSourceVersion where layoutSourceVersion.LabelLayoutId == labelLayout.LabelLayoutId;

       
        if(purchTable && labelLayout && layoutSourceVersion)
        {
            //create the initial translator object
            WhsDocumentRoutingTranslator translator = WhsDocumentRoutingTranslator::construct();

            if (labelLayout.LabelLocale)
            {
                //define the specific language
                translator.withLanguage(labelLayout.LabelLocale);
            }

            //is the layout based on a custom template
            if (labelLayout.EnableTemplateTranslator)
            {
                QueryRun queryRun;
                Query newQuery;
                WHSLabelLayoutDataSource layoutDataSource;
                
                //grab the dynamic query that is defined for the specific label layout
                select firstonly DataSourceQuery from layoutDataSource
                    where layoutDataSource.LabelLayoutDataSourceId == layoutSourceVersion.LabelLayoutId;

                if (layoutDataSource && layoutDataSource.DataSourceQuery != conNull())
                {
                    newQuery = new Query(layoutDataSource.DataSourceQuery);
                    const FieldId RecIdFieldId = 65534; // Platform defined FieldId for RecId field (see Query/Constants.cs in Platform for example)
                    //add in a filter to the new query object so we pull down the specific record
                    newQuery.dataSourceNo(1)
                    .addRange(RecIdFieldId)
                    .value(queryValue(purchTable.RecId));
                }

                //convert the query object so we can pass it to the translator
                queryRun = new QueryRun(newQuery);
                translator.withRecordsFromQueryRun(queryRun);
                
                //define the original layout ZPL code that needs to be populated with dynamic variables
                WHSZPL layoutSource = layoutSourceVersion.zpl;
                
                //define a new translator object for the document routing structure based on the translator object which has our query which needs to be applied to the zpl code source
                WhsDocumentRoutingTemplateTranslator newZPLLabels = WhsDocumentRoutingTemplateTranslator::newFromTemplateAndQueryRun(layoutSource, queryRun)
                    .withTranslator(translator);
                
                //apply the queryRun object to the template via the translator
                List labelList = newZPLLabels.translateTemplate();
                //convert the list of translated labels into a single string
                str zplStringWithData = labelList.toString();

                //TODO: now that we have the raw dynamic ZPL code we can do whatever we want with it, e.g. send it to a printer, generate an image as a print preview, create a PDF, or raise an event that handles the output without the built-in prompt
                

                

            }
            else
            {
                //TODO: handle the default (out of the box) label type. See the second half of WhsLabelPrintCommandGenerator.printLabels() where it calls the provider to generate the queryRun
                //WHSZPL outputLabel = translator.translate(layoutSource);
                //List labelsList = this.translateLabelTemplate();
            }
        }

Tuesday, February 27, 2024

D365FO - ZPL Printer Emulation (labelary.com) directly within D365FO

It would appear that the Chrome ZPL printer emulator extension recently stopped working, which is prompting some people to install a ZPL printer emulator in order to validate dynamic ZPL code within D365FO. After some research it appears that these extensions and printer emulators all use the free Labelary service https://labelary.com/

Since this API is already used throughout the industry, why not call it directly from within D365 instead of installing a printer emulator, which also requires a document routing agent to be installed?

The code snippet below shows how to connect to their RESTful API, which can generate various types of images from ZPL code. You could easily modularize this logic to create something powerful and dynamic based on any dataset.


Webservice to generate a label based on ZPL code: https://labelary.com/service.html

Online ZPL Viewer: https://labelary.com/viewer.html 
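
As documented on the service page above, the path segments in the request URL encode the print density (dpmm), the label size in inches, and the index of the label to render. A small helper along these lines (the method name is hypothetical) could build the URL from parameters instead of hardcoding it:

    //builds a Labelary request URL such as http://api.labelary.com/v1/printers/8dpmm/labels/4x6/0/
    //example: buildLabelaryUrl(8, '4x6', 0)
    private str buildLabelaryUrl(int _dpmm, str _labelSizeInches, int _labelIndex)
    {
        return strFmt('http://api.labelary.com/v1/printers/%1dpmm/labels/%2/%3/', _dpmm, _labelSizeInches, _labelIndex);
    }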


 

            System.Exception clrError;
            InteropPermission interopPermission;
            System.Net.HttpWebRequest request;
            System.IO.Stream stream;
            System.Byte[] zpl = System.Text.Encoding::UTF8.GetBytes("^xa^cfa,50^fo100,100^fdHello World^fs^xz"); //raw zpl code

            request = System.Net.WebRequest::Create("http://api.labelary.com/v1/printers/8dpmm/labels/4x6/0/") as System.Net.HttpWebRequest;
            request.Method = 'POST';
            request.Accept = "image/png";
            request.ContentType = 'application/x-www-form-urlencoded';
  
            try
            {
                interopPermission = new InteropPermission(InteropKind::ComInterop);
                interopPermission.assert();

                // send out the payload
                using (System.IO.Stream dataStream = request.GetRequestStream())
                {
                    dataStream.Write(zpl, 0, zpl.Length);
                }

                using (System.Net.HttpWebResponse response = request.GetResponse() as System.Net.HttpWebResponse)
                {
                    stream = response.GetResponseStream();
                    System.IO.MemoryStream filestream = new System.IO.MemoryStream();
                    
                    //we need to copy the response stream into a memory stream in order to access the binary data
                    stream.CopyTo(filestream);
                    //reset the position so the memory stream can be read from the beginning
                    filestream.Seek(0, System.IO.SeekOrigin::Begin);

                    //send the memory stream directly to the user, which now contains the file contents
                    File::SendFileToUser(filestream, "test.png");
                    
                    //convert the memory stream into an image object which can then be displayed to the user
                    BinData binData = new BinData();
                    container baseContainer;
                    Image image = new Image();

                    baseContainer = Binary::constructFromMemoryStream(filestream).getContainer();

                    binData.setData(baseContainer);
                    image.setData(binData.getData());
                    //display the newly created image to the user
                    FormImageControl1.image(image);
                }
                Info("done");
            }
            catch (Exception::CLRError)
            {
                //catch any clr error
                clrError = CLRInterop::getLastException();
                if (clrError != null)
                {
                    clrError = clrError.InnerException;
                    throw error(clrError.ToString());
                }
            }
            finally
            {
                CodeAccessPermission::revertAssert();
            }	 





D365FO - Converting a System.IO.MemoryStream object to Image

 Sometimes we need to either generate or download an image from a 3rd party service and want to serve up the image within the UI.


Below is an example of how to convert a MemoryStream object that was generated from an HttpWebResponse into something that is viewable on screen or can be saved into the database.



 
	 
                using (System.Net.HttpWebResponse response = request.GetResponse() as System.Net.HttpWebResponse)
                {
                    stream = response.GetResponseStream();
                    System.IO.MemoryStream filestream = new System.IO.MemoryStream();
                    
                    //we need to convert the main stream to a memory stream in order to access the binary objects
                    stream.CopyTo(filestream);
     
                   
                    BinData binData = new BinData();
                    container baseContainer;
                    Image image = new Image();

                    //convert the memory stream into a container, then into a BinData object that can be set on an Image object and displayed to the end user
                    baseContainer = Binary::constructFromMemoryStream(filestream).getContainer();

                    binData.setData(baseContainer);
                    image.setData(binData.getData());
                    //point the form's image control at the image that was generated from the memory stream
                    FormImageControl1.image(image);
                }

D365FO - Calling external RESTful API via POST to download a file and send to the user

 

I have recently found a mixture of AX 2012 / D365 code online for calling external REST web APIs directly from AX/D365FO. It seems that most people only care about sending and receiving JSON, XML, or some other text-based payload, which is true the majority of the time. But what happens when the API returns some type of file or binary object rather than raw text, and we want to save or serve up that file?

This is where it gets tricky, because System.IO.StreamReader, as shown in the majority of examples online, doesn't give us access to the binary data we need in order to send a file to the user or display something on screen. It is, however, great for reading text-based responses.
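
For reference, the text-only pattern those examples use looks roughly like this (a minimal sketch; the response variable is assumed to be the HttpWebResponse obtained from the request):

                //read a text-based response (JSON/XML) with a StreamReader
                using (System.IO.StreamReader reader = new System.IO.StreamReader(response.GetResponseStream()))
                {
                    str responseBody = reader.ReadToEnd();
                    info(responseBody);
                }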

The key to understanding the framework is that once we put the main System.IO.Stream into a System.IO.MemoryStream, we can easily convert the binary data into various outputs.

 

The best way to describe it is that System.IO.Stream operates much like a FormRun object: it is the topmost level, and once we have access to it we can work with related objects like StreamReader or MemoryStream, which can be handled differently based on the scenario.


Below is an example of how to call a generic RESTful API via a POST (the request structure may change based on the specific API being called).



 
	    System.Exception clrError;
            InteropPermission interopPermission;
            System.Net.HttpWebRequest request;
            System.IO.Stream stream;
            System.Byte[] zpl = System.Text.Encoding::UTF8.GetBytes("post body");

            request = System.Net.WebRequest::Create("API URL") as System.Net.HttpWebRequest;
            request.Method = 'POST';
            request.Accept = "image/png"; //document type to generate
            request.ContentType = 'application/x-www-form-urlencoded';
  
            try
            {
                interopPermission = new InteropPermission(InteropKind::ComInterop);
                interopPermission.assert();

                // send out the payload
                using (System.IO.Stream dataStream = request.GetRequestStream())
                {
                    dataStream.Write(zpl, 0, zpl.Length);
                }

                
                using (System.Net.HttpWebResponse response = request.GetResponse() as System.Net.HttpWebResponse)
                {
                    stream = response.GetResponseStream();
                    System.IO.MemoryStream filestream = new System.IO.MemoryStream();
                    
                    //we need to copy the response stream into a memory stream in order to access the binary data
                    stream.CopyTo(filestream);
                    //reset the position so the memory stream can be read from the beginning
                    filestream.Seek(0, System.IO.SeekOrigin::Begin);

                    //send the memory stream directly to the user, which now contains the file contents
                    File::SendFileToUser(filestream, "test.png");
                }
            }
            catch (Exception::CLRError)
            {
                //catch any clr error
                clrError = CLRInterop::getLastException();
                if (clrError != null)
                {
                    clrError = clrError.InnerException;
                    throw error(clrError.ToString());
                }
            }
            finally
            {
                CodeAccessPermission::revertAssert();
            }




Tuesday, November 26, 2019

D365FO / C# - Accessing the InfoLog / InfoLogEntries Object within a custom webservice

While testing a custom web service that originally tried to read the infolog and report it back to a field in the response via custom logic, I noticed that the error message being returned was pretty generic and told me nothing about the issue.

After debugging the code, it looks like the infolog itself is now returned automatically with a SOAP / JSON web service, which is great because it tells us in detail why a record can't be created or what the issue is, with no extra code. However, nothing seems to be documented online about this, and it doesn't behave like a contract class response that you could simply foreach() over. Instead we need to treat it like an array-based object, as the code below shows.

What's cool about this is that it also returns the type of each infolog message (Info, Warning, Error), so you can add logic that determines exactly what the issue is.

 //read the infolog object from the web service response  
 [webserviceName].Infolog messages = response.Infolog;  
 //loop through each infolog entry and log its type and message  
 for (int messagePosition = 0; messagePosition < messages.Entries.Length; ++messagePosition)  
 {  
    AzureLog.AppendText(String.Format("Info Log Message {1}: {0}", messages.Entries[messagePosition].Message.ToString(), messages.Entries[messagePosition].Type.ToString()) + Environment.NewLine);  
 }  




The output will then look something like this


Info Log Message Info: A custom info statement based on an extension of salesTable.ValidateWrite()
Info Log Message Warning: Site [SITENAME] does not exist.
Info Log Message Warning: Warehouse [WAREHOUSENAME] is not connected to site [SITENAME]
Info Log Message Warning: Warehouse [WAREHOUSE] does not exist.
Info Log Message Warning: Site [SITE] does not exist.
Info Log Message Error: Update has been canceled.







Wednesday, August 21, 2019

D365FO - BYODB Entity Export Failing with no error message (Catch Weight Exporting Issue)


Recently, after standing up a new export of the Sales order lines V2 and Purchase order lines V2 entities to an Azure SQL DB, the export started to fail. However, the execution log showed no error.





If you try running it in batch mode, the batch completes and the export shows as completed, but nothing is actually exported.


If you dig into Event Viewer on the server under

Application and Services Logs > Microsoft > Dynamics > AX-DIXFRuntime > Operational

you will find the error, but it is very generic and does not give you any helpful information beyond the fact that the entity export is crashing:







Error:
System.Reflection.TargetInvocationException: Exception has been thrown by the target of an invocation. ---> System.Exception: Exception from HRESULT: 0xC0202009 at Microsoft.Dynamics.AX.Framework.Tools.DMF.ServiceProxy.DmfEntitySharedTypesProxy.DoWork[T](Func`1 work) --- End of inner exception stack trace --- at System.RuntimeMethodHandle.InvokeMethod(Object target, Object[] arguments, Signature sig, Boolean constructor) at System.Reflection.RuntimeMethodInfo.UnsafeInvokeInternal(Object obj, Object[] parameters, Object[] arguments) at System.Reflection.RuntimeMethodInfo.Invoke(Object obj, BindingFlags invokeAttr, Binder binder, Object[] parameters, CultureInfo culture) at System.RuntimeType.InvokeMember(String name, BindingFlags bindingFlags, Binder binder, Object target, Object[] providedArgs, ParameterModifier[] modifiers, CultureInfo culture, String[] namedParams) at Microsoft.Dynamics.Ax.Xpp.CLRInterop.MakeReflectionCall(Object instance, String methodName, Object[] parameters)



After trying multiple fixes (regenerating the mapping, refreshing the entity list, builds, DB syncs) I was still getting the error.

However, when I deleted all fields within the mapping and kept just the few base fields the entity index is based on, the entity would export. Because of this I knew that something was wrong with one of the field mappings.


After regenerating the complete list and removing fields one by one to see when the export would work, I was able to find the specific field that was causing the issue:

ORDEREDCATCHWEIGHTQUANTITY


Currently we have Catch Weight enabled within the configuration key but are not using this feature as we are still in the design phase of our implementation


If an entity export is failing and it is a base entity, one thing to try when defining the mapping or the schema is to search for "catch" and remove those fields from the mapping (in the data project assignment or the main modify target mapping); the export will then work. It does not matter whether the fields get published to the schema, it only matters whether these catch weight fields are included in the export.



Update 9/26/19: It turns out that disabling the catchweight key does not stop the field from being added to the entity mapping.

Monday, July 15, 2019

D365FO - Copy custom value from Accounts Payable Invoice Register (LedgerJournalTrans) to Invoice Approval (VendTrans)

When you create an accounts payable invoice register entry, it creates a record in the LedgerJournalTrans table. When you post this journal, it creates a corresponding record in the VendTrans table.

If you go into the Invoice approval form and click on "Find vouchers", you will see that any custom fields you added to LedgerJournalTrans are available. However, once one of these vouchers is loaded into the approval form, you will find that the values are no longer available. This is because the LedgerJournalTrans record has been converted into its own VendTrans table instance.

In order to copy the values from LedgerJournalTrans to VendTrans you will need to extend the VendVoucher class as shown below. It is good to note that in theory the same approach should also apply to CustVoucher (customer invoices), since it uses the same structure (the CustVendTrans map) and the common table pattern to pass VendTrans or CustTrans; a sketch of that variant follows the example.


[ExtensionOf(classStr(VendVoucher))]
final class APInvoiceModsVendVoucher_Extension
{
    

    /// <summary>
    /// COC of initCustVendTrans which sets the initial values of vendTrans from ledgerjournaltrans
    /// </summary>
    /// <param name = "_custVendTrans">VendTrans</param>
    /// <param name = "_ledgerPostingJournal">LedgerVoucher</param>
    /// <param name = "_useSubLedger">Use sub ledger default is false</param>
    protected void initCustVendTrans(
                                    CustVendTrans _custVendTrans,
                                    LedgerVoucher _ledgerPostingJournal,
                                    boolean _useSubLedger)
    {
        //execute the base functionality
        next initCustVendTrans(_custVendTrans, _ledgerPostingJournal, false);
        
        //get the VendTrans table buffer from the CustVendTrans map instance
        VendTrans vendTrans = _custVendTrans as VendTrans;
        
        //if the common instance was initialized from the LedgerJournalTrans table we need to copy the custom field
        if (common && common.TableId == tableNum(LedgerJournalTrans))
        {
            LedgerJournalTrans ledgerJournalTrans = common;
            
            //copy the custom field and the description from LedgerJournalTrans to VendTrans
            vendTrans.CustomField = ledgerJournalTrans.CustomField;
            //Txt is not a custom field, but its value does not get transferred by default
            vendTrans.Txt = ledgerJournalTrans.Txt;
        }
    }

}
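
As a rough, untested sketch of the CustVoucher variant mentioned above (the class and field names are hypothetical, and it assumes CustVoucher exposes the same initCustVendTrans signature through the CustVendTrans map):

[ExtensionOf(classStr(CustVoucher))]
final class ARInvoiceModsCustVoucher_Extension
{
    /// <summary>
    /// CoC of initCustVendTrans which copies custom values from LedgerJournalTrans to CustTrans
    /// </summary>
    protected void initCustVendTrans(
                                    CustVendTrans _custVendTrans,
                                    LedgerVoucher _ledgerPostingJournal,
                                    boolean _useSubLedger)
    {
        next initCustVendTrans(_custVendTrans, _ledgerPostingJournal, _useSubLedger);

        //get the CustTrans table buffer from the CustVendTrans map instance
        CustTrans custTrans = _custVendTrans as CustTrans;

        //if the common instance was initialized from LedgerJournalTrans, copy the custom field across
        if (common && common.TableId == tableNum(LedgerJournalTrans))
        {
            LedgerJournalTrans ledgerJournalTrans = common;
            custTrans.CustomField = ledgerJournalTrans.CustomField;
        }
    }
}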

D365FO / Setting default values for custom fields on a vendor invoice from a purchase order (VendInvoiceInfoTable / PurchFormletterParmDataInvoice)

When creating a vendor invoice and trying to populate custom fields on the VendInvoiceInfoTable table, you may find that the following event handler only works when you create a blank invoice, not when you run it from the invoice Create button on a purchase order:



[DataEventHandler(tableStr(VendInvoiceInfoTable), DataEventType::InitializingRecord)] 
 public static void VendInvoiceInfoTable_onInitializingRecord(Common sender, DataEventArgs e) 
 { 
      VendInvoiceInfoTable vendInvoiceInfoTable = sender as VendInvoiceInfoTable; 
      //default the custom field from the vendor parameters
      vendInvoiceInfoTable.CustomField = VendParameters::find().DefaultCustomField; 
 }

This is due to the method PurchFormletterParmDataInvoice.createInvoiceHeaderFromTempTable(), which calls .skipEvents(true), .skipDataMethods(true), and .skipDatabaseLog(true) and then performs a Query::insert_recordset(); because of this, none of the default events get triggered.

In order to properly populate custom fields on a blank vendor invoice, or on one created from a purchase order, the following needs to be done via extensions and CoC:


  1. Add the custom fields to VendInvoiceInfoTable.
  2. Add the custom fields to VendInvoiceInfoTableTmp (names and types need to match what was created on VendInvoiceInfoTable).
    1. We have to add the fields to this table because PurchFormLetterParmDataInvoice.insertParmTable() calls buf2Buf(), which loops through the field list.
  3. Create an extension class of the VendInvoiceInfoTable table and apply a CoC of the defaultRow method, which will populate the value onto both the regular and tmp instances of VendInvoiceInfoTable. However, it will not save to the database unless the remaining steps are completed.
    1. /// <summary>
      /// Default values for a vendor invoice header from a purchase order
      /// </summary>
      [ExtensionOf(tableStr(VendInvoiceInfoTable))]
      final class APInvoiceModsVendInvoiceInfoTable_Extension
      {
          /// <summary>
          /// CoC of the defaultRow table method, which resets default values
          /// </summary>
          /// <param name = "_purchTable">PurchTable</param>
          /// <param name = "_ledgerJournalTrans">LedgerJournalTrans</param>
          /// <param name = "_resetFieldState">Reset field state</param>
          public void defaultRow(PurchTable _purchTable, LedgerJournalTrans _ledgerJournalTrans, boolean _resetFieldState)
          {
              CustomEDTType defaultCustomField;

              next defaultRow(_purchTable, _ledgerJournalTrans, _resetFieldState);

              //find the default custom field from the vendor parameters
              defaultCustomField = VendParameters::find().DefaultCustomField;

              //save the default value to the header; it will be copied to VendInvoiceInfoTableTmp and then back to the main table by PurchFormletterParmDataInvoice.createInvoiceHeaderFromTempTable
              this.CustomField = defaultCustomField;
          }

      }
  4. Create an extension class of PurchFormletterParmDataInvoice and apply a CoC to buildCreateInvoiceHeaderFromTempTableFieldQuery and buildCreateInvoiceHeaderFromTempTableFieldMap, because the method PurchFormletterParmDataInvoice.createInvoiceHeaderFromTempTable() creates a dynamic map of the fields to copy from the temp table into the main table.
    1. [ExtensionOf(classStr(PurchFormletterParmDataInvoice))]
      final class APInvoiceModsPurchFormletterParmDataInvoice_Extension
      {
          /// <summary>
          /// CoC method that adds our custom field to the field query selection from the temp table
          /// </summary>
          /// <param name = "_qbdsVendInvoiceInfoTableTmp">VendInvoiceInfoTableTmp</param>
          protected void buildCreateInvoiceHeaderFromTempTableFieldQuery(QueryBuildDataSource _qbdsVendInvoiceInfoTableTmp)
          {
              next buildCreateInvoiceHeaderFromTempTableFieldQuery(_qbdsVendInvoiceInfoTableTmp);

              //add our custom field to the selection
              _qbdsVendInvoiceInfoTableTmp.addSelectionField(fieldNum(VendInvoiceInfoTableTmp, CustomField));
          }

          /// <summary>
          /// CoC method that adds our custom field to the dynamic field map against the main table, which will be copied from the selection query
          /// </summary>
          /// <param name = "_qbdsVendInvoiceInfoTableTmp">VendInvoiceInfoTableTmp</param>
          /// <returns>Modified dynamic map to insert into the db</returns>
          protected Map buildCreateInvoiceHeaderFromTempTableFieldMap(QueryBuildDataSource _qbdsVendInvoiceInfoTableTmp)
          {
              var targetToSourceMap = next buildCreateInvoiceHeaderFromTempTableFieldMap(_qbdsVendInvoiceInfoTableTmp);

              //map the custom field on the main table to the custom field on the temp table data source
              targetToSourceMap.insert(fieldStr(VendInvoiceInfoTable, CustomField), [_qbdsVendInvoiceInfoTableTmp.uniqueId(), fieldStr(VendInvoiceInfoTableTmp, CustomField)]);

              return targetToSourceMap;
          }

      }

By applying our populate logic in the defaultRow() method, the value will also be populated on a blank vendor invoice once a vendor account is selected.

Monday, May 20, 2019

D365FO / Azure Powershell NSG Rule Scripting to allow Azure Datacenters access


Currently we need to set up Azure-based VMs for ISV products. Because of this, we need to create NSG (network security group) rules that allow inbound and outbound traffic so that D365FO can communicate with these machines. However, because the D365FO instances will not have a static IP, we need to account for all of the Azure data center IPs so that we do not leave the public IP/port open to unwanted traffic.

This is my first stab at Azure-based PowerShell scripting, so I am not sure if this is optimal, but there doesn't seem to be much information out there about D365FO communicating with outside servers that are also hosted on Azure. Hopefully this will help others.

The official list of Azure data center IPs can be downloaded from the Microsoft Download Center; the script below pulls the current JSON file from https://www.microsoft.com/en-us/download/confirmation.aspx?id=56519.



In the following example we will open up port 21 for inbound and outbound communication. The script checks whether the NSG rule already exists; if it does, it updates the rule, and if not, it creates the rule against the defined NSG.


You will need to gather the following data points in order for this to work.

Azure subscription id
Azure resource group that the network security group is associated with
Azure network security group name


All of the information can be found on either the overview page of the VM you are creating an NSG rule for or on the NSG itself.


It is good to note that NSGs currently have a limit of 1000 rules; because of this we cannot define a rule per IP, but must instead create a single rule that contains multiple IP ranges.




# Sign-in with Azure account credentials
Login-AzureRmAccount

# Select Azure subscription

# Azure subscription id
$subscriptionId = '';

# Azure resource group name associated with the network security group
$rgName = '';

# Azure network security group that we need to create the rule against
$nsgname = '';

# Select the subscription defined above
Select-AzureRmSubscription -SubscriptionId $subscriptionId

# Download current list of Azure public IP ranges
Write-Host "Downloading AzureCloud Ip addresses..."

$downloadUri = "https://www.microsoft.com/en-us/download/confirmation.aspx?id=56519"
$downloadPage = Invoke-WebRequest -Uri $downloadUri;
$request = ($downloadPage.RawContent.Split('"') -like "*.json")[0];
$json = Invoke-WebRequest -Uri $request | ConvertFrom-Json | Select Values
$ipRanges = ($json.values | Where-Object {$_.Name -eq 'AzureCloud'}).properties.addressPrefixes

# Set rule priority
$rulePriority = 200

# Define the rule names
$ruleNameOut = "Allow_AzureDataCenters_Out"
$ruleNameIn = "Allow_AzureDataCenters_In"

# Non-production network security group
$nsg = Get-AzureRmNetworkSecurityGroup -Name $nsgname -ResourceGroupName $rgName -ErrorAction:Stop;

Write-Host "Applying AzureCloud Ip addresses to non production NSG $nsgname..."

# Check to see if the inbound rule already exists
$inboundRule = ($nsg | Get-AzureRmNetworkSecurityRuleConfig -Name $ruleNameIn -ErrorAction SilentlyContinue)

if ($inboundRule -eq $null)
{
    # Create inbound rule
    $nsg | Add-AzureRmNetworkSecurityRuleConfig -Name $ruleNameIn -Description "Allow Inbound to Azure data centers" -Access Allow -Protocol * -Direction Inbound -Priority $rulePriority -SourceAddressPrefix $ipRanges -SourcePortRange * -DestinationAddressPrefix VirtualNetwork -DestinationPortRange 21 -ErrorAction:Stop | Set-AzureRmNetworkSecurityGroup -ErrorAction:Stop | Out-Null;
    Write-Host "Created NSG rule $ruleNameIn for $nsgname"
}
else
{
    # Update inbound rule
    $nsg | Set-AzureRmNetworkSecurityRuleConfig -Name $ruleNameIn -Description "Allow Inbound to Azure data centers" -Access Allow -Protocol * -Direction Inbound -Priority $rulePriority -SourceAddressPrefix $ipRanges -SourcePortRange * -DestinationAddressPrefix VirtualNetwork -DestinationPortRange 21 -ErrorAction:Stop | Set-AzureRmNetworkSecurityGroup -ErrorAction:Stop | Out-Null;
    Write-Host "Updated NSG rule $ruleNameIn for $nsgname"
}

# Check to see if the outbound rule already exists
$outboundRule = ($nsg | Get-AzureRmNetworkSecurityRuleConfig -Name $ruleNameOut -ErrorAction SilentlyContinue)

if ($outboundRule -eq $null)
{
    # Create outbound rule
    $nsg | Add-AzureRmNetworkSecurityRuleConfig -Name $ruleNameOut -Description "Allow outbound to Azure data centers" -Access Allow -Protocol * -Direction Outbound -Priority $rulePriority -SourceAddressPrefix VirtualNetwork -SourcePortRange * -DestinationAddressPrefix $ipRanges -DestinationPortRange 21 -ErrorAction:Stop | Set-AzureRmNetworkSecurityGroup -ErrorAction:Stop | Out-Null;
    Write-Host "Created NSG rule $ruleNameOut for $nsgname"
}
else
{
    # Update outbound rule
    $nsg | Set-AzureRmNetworkSecurityRuleConfig -Name $ruleNameOut -Description "Allow outbound to Azure data centers" -Access Allow -Protocol * -Direction Outbound -Priority $rulePriority -SourceAddressPrefix VirtualNetwork -SourcePortRange * -DestinationAddressPrefix $ipRanges -DestinationPortRange 21 -ErrorAction:Stop | Set-AzureRmNetworkSecurityGroup -ErrorAction:Stop | Out-Null;
    Write-Host "Updated NSG rule $ruleNameOut for $nsgname"
}

Write-Host "Finished."

Wednesday, December 5, 2018

D365FO / Azure - Create website redirect in azure to handle supplemental environment url changes

Currently, when dealing with D365FO supplemental environments within LCS that are all hosted on a single machine, any time we update them and there is an error, Microsoft's recommended fix is to stand up a new environment. Because of this the URL changes and you need to notify your users. To handle this scenario so that users do not need to worry about a new URL, you can create a redirect website in Azure that points to whatever URL you want; if your users bookmark these redirect URLs instead of the direct ones, then you don't need to worry about who has what bookmarked.

Azure portal 

Step 1. In the Azure portal you need to create a web app. You do this by going to "Create a resource" and then choosing a web app.


Step 2. In the newly created web app, go to Advanced Tools (Kudu).


Step 3. In Kudu, go to Debug console > CMD.


Step 4. Browse to /site/wwwroot, click on the add ("+") button, choose to create a new file, and name it web.config.

Step 5. Click on the edit button for the web.config file, add in the following, and hit save.


<?xml version="1.0" encoding="utf-8" ?>
<configuration>
    <system.webServer>
        <rewrite>
            <rules>
                <rule name="Redirect test enviornment" stopProcessing="true">
                    <match url="^test" />
                    <action type="Redirect" url="https://testurl.cloudax.dynamics.com" redirectType="Permanent" />
                </rule>   
                <rule name="Redirect stage enviornment" stopProcessing="true">
                    <match url="^stage" />
                    <action type="Redirect" url="https://stageurl.cloudax.dynamics.com" redirectType="Permanent" />
                </rule>
                <rule name="Redirect dm enviornment" stopProcessing="true">
                    <match url="^dm" />
                    <action type="Redirect" url="https://datamigrationurl.cloudax.dynamics.com" redirectType="Permanent" />
                </rule>
                <rule name="Redirect qa enviornment" stopProcessing="true">
                    <match url="^qa" />
                    <action type="Redirect" url="https://qaurl.operations.dynamics.com/" redirectType="Permanent" />
                </rule>
                <rule name="Redirect header" stopProcessing="true">
                    <match url=".*" />
                <action type="Redirect" url="https://www.google.com" redirectType="Permanent" />
                </rule>
            </rules>
        </rewrite>
    </system.webServer>
</configuration>    


Step 6. After hitting save, you should now be able to access your Azure web app in the following manner:

https://yourwebappurl.com/test
https://yourwebappurl.com/qa
https://yourwebappurl.com/stage
https://yourwebappurl.com/dm

and each should forward to its respective environment.

If you do not include a path and just browse to https://yourwebappurl.com, the catch-all rule (match url=".*") will forward you to google.com as well.