Sometimes when you need to generate a file and then do something else with it, the code can report that the file is created and ready for use even though the OS has not yet released the file lock. The same thing can happen when trying to grab or do something with a file located on a network share that other users may be modifying at any given time. In these scenarios AX can give you the error
"The process cannot access the file 'filelocation' because it is being used by another process."
The following code will allow you to check whether the OS has a file in a 'locked' status. I ended up adding a 4-second wait after retrying 10 times because I was finding that all of the retries were finishing in under 1 second, so the system didn't have a chance to finish what it was doing. You can remove the second retry if this clause does not suit your needs.
/// <summary>
/// Check to see if a file is locked
/// </summary>
/// <param name="fileName">
/// File name + path
/// </param>
/// <returns>
/// true if the file is locked; otherwise false
/// </returns>
/// <remarks>
/// The first 10 tries were executing in under 1 second, so a sleep was added if it is still locked after 10 tries to be safe.
/// </remarks>
public boolean isFileLocked(FileName fileName)
{
    Counter                 retryCount;
    System.Exception        clrException;
    System.Type             exceptionType;
    System.IO.FileStream    fileStream;
    boolean                 validRetry;
    boolean                 isLocked;

    new InteropPermission(InteropKind::ClrInterop);
    new FileIOPermission(fileName, 'rw').assert();

    try
    {
        // check to see if we can open the file; release our own handle right away
        fileStream = System.IO.File::Open(fileName, System.IO.FileMode::Open);
        fileStream.Dispose();
        CodeAccessPermission::revertAssert();
    }
    catch (Exception::CLRError)
    {
        validRetry = false;

        // get the last CLR error and inspect it
        clrException = CLRInterop::getLastException();
        if (clrException.get_InnerException() != null)
        {
            clrException  = clrException.get_InnerException();
            exceptionType = clrException.GetType();

            // check to see if the exception type is one that is valid to retry on
            switch (CLRInterop::getAnyTypeForObject(exceptionType.get_FullName()))
            {
                case 'System.UnauthorizedAccessException':
                    validRetry = true;
                    break;
                case 'System.IO.IOException':
                    validRetry = true;
                    break;
                default:
                    throw error(CLRInterop::getAnyTypeForObject(clrException.get_Message()));
            }
        }

        // check to see if this is a failure we should retry on
        if (validRetry)
        {
            ++retryCount;
            if (retryCount <= 10)
            {
                // execute the first 10 retries right away
                retry;
            }
            else if (retryCount == 11)
            {
                // on the last try, sleep for 4 seconds and then try again
                sleep(4000);
                retry;
            }
            else
            {
                isLocked = true;
                CodeAccessPermission::revertAssert();
            }
        }
    }

    return isLocked;
}
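As a quick illustration of how you might call this from a job before touching a freshly generated file, here is a minimal sketch. The job name, the file path, and the MyFileUtil class holding isFileLocked() are hypothetical placeholders; only the isFileLocked() call itself comes from the method above.

static void tryProcessGeneratedFile(Args _args)
{
    // Hypothetical example - the job name, path and MyFileUtil class are placeholders
    FileName   fileName = @"\\fileserver\share\export\SalesExport.csv";
    MyFileUtil fileUtil = new MyFileUtil(); // assumed class that contains isFileLocked()

    if (fileUtil.isFileLocked(fileName))
    {
        warning(strFmt("%1 is still locked by another process - try again later.", fileName));
    }
    else
    {
        // safe to continue working with the file (copy, parse, attach, etc.)
        info(strFmt("%1 is available for processing.", fileName));
    }
}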
I am currently using the tool Easy File Locker located here in order to test this code out.
Tuesday, October 27, 2015
CostSheetCache table is too large. How to clear it out
The CostSheetCache table, which is used as a temporary location for cost sheets to help generate prices, is not cleared automatically in AX unless you tell it to, so it can become very large inside of your DB (sometimes 10-30 GB+). There are two ways you can clear out this information properly and keep this table small in size.
1.) Manually trigger the clearing inside of AX (this is the only way I know how to do it via default AX)
Go to Production control / Setup / Routes / Cost groups. Modify any record on this form and it will trigger the cleanup whenever you close the form.
2.) Create a job with the following code.
static void CostSheetCacheClear(Args _args)
{
    // this will clear the table CostSheetCache
    CostSheetFactory::construct().clearCachedData();
    info("Cleared the table CostSheetCache");
}
This is the same code that gets triggered in step 1 and can be found in
\Tables\BOMCostGroup\insert(), update(), delete()
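If you would rather have this cleanup run on a recurring schedule instead of launching the job by hand, you can wrap the same call in a batchable class. The sketch below is only an outline under that assumption (the class name is my own invention, shown as the classDeclaration plus its methods); the only line that matters is the same CostSheetFactory::construct().clearCachedData() call shown above.

/// <summary>
/// Sketch of a batchable wrapper around the CostSheetCache cleanup (class name is hypothetical)
/// </summary>
class CostSheetCacheCleanupBatch extends RunBaseBatch
{
}

public container pack()
{
    // nothing to save between runs
    return conNull();
}

public boolean unpack(container _packedClass)
{
    return true;
}

public void run()
{
    // same call that \Tables\BOMCostGroup\insert(), update() and delete() trigger
    CostSheetFactory::construct().clearCachedData();
    info("Cleared the table CostSheetCache");
}

public static void main(Args _args)
{
    CostSheetCacheCleanupBatch cleanupBatch = new CostSheetCacheCleanupBatch();

    // prompt() shows the standard RunBase dialog so the job can also be scheduled as a batch
    if (cleanupBatch.prompt())
    {
        cleanupBatch.run();
    }
}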
Wednesday, October 21, 2015
AX SSRS - Repeat Row/Column Headers on every page
If you need to repeat a row header on every page of a report generated from AX, you will have noticed that most of the time the default "repeat header" setting under the row/column properties does not work. The method below will always work.
1.) Create a new group on the report at the top-most level. Let's give it a name of "RepeatHeader" (or whatever you like). When it asks what to group by, you need to input 0. Yes, the number 0: no spaces, no quotes, just 0.
2.) On the grouping "frame" there is a down arrow on the top right. Click it and choose "Advanced Mode". You will now see "static" rows within your grouping list.
3.) On the static row above your new group, go to properties and change the following:
KeepWithGroup = After
RepeatOnNewPage = true
You should now have a row that contains headers that repeat on every page. If they are not repeating then your existing header row may not be within the correct group and you may need to change it according to how your data's grouping structure is set.
CostSheets - Finding the costing header that is associated with a cost group id
If you take a look at costing sheets within AX 2012 (Company/Inventory and warehouse management/Setup/Costing/Costing sheets) you will find that this is where the item price costing sheet structure is set up (cost group id <-> headers). This allows you to break down/sum up an item's price by cost group id and then by summary/costing headers.
However, whenever you try to view the data that will populate your cost sheet via the table BOMCalcTrans, you will find that everything is related to the cost group id. That is good for displaying the "details" of the cost sheet, but we sometimes need to summarize the cost group ids by which header they fall under. By using the method below you can create a "summary" of your BOMCalcTrans table.
(Note: the CostGroupHeaderDescription EDT is just a generic str EDT I created to store a header group id + description.)
This method will only work if you use the Header checkbox shown on the costing sheet setup form listed above under Edit > Identification > Setup > Header.
(You can easily extend this logic to include the Total checkbox as well, or use both fields.)
/// <summary>
/// Find the costing header that a cost group id is associated with
/// </summary>
/// <param name="costGroupId">
/// Cost group id
/// </param>
/// <returns>
/// Costing header code + description
/// </returns>
/// <remarks>
/// This will pull the header node that a cost group id is under
/// </remarks>
public CostGroupHeaderDescription getCostCategoryByCostGroupId(CostGroupId costGroupId)
{
    CostSheetNodeTable          costSheetNodesBase, costSheetNodesParent;
    boolean                     foundHeaderNode;
    CostSheetNodeId             currentNodeIdToSearch;
    CostGroupHeaderDescription  description;

    // find the base record so we know which parent node to start the search from
    select NodeId, ParentNodeId, Code, Description, Total, Header from costSheetNodesBase
        where costSheetNodesBase.CostGroupId == costGroupId;

    // check to see if we are already at the header level when we are at the base
    if (costSheetNodesBase.Header == NoYes::Yes)
    {
        foundHeaderNode = true;
        description     = strFmt("%1 (%2)", costSheetNodesBase.Code, costSheetNodesBase.Description);
    }

    currentNodeIdToSearch = costSheetNodesBase.ParentNodeId;

    // execute this logic until we find a node flagged as a header
    while (!foundHeaderNode)
    {
        // find the parent node's record to see if it is a header record
        select NodeId, ParentNodeId, Code, Description, Total, Header from costSheetNodesParent
            where costSheetNodesParent.NodeId == currentNodeIdToSearch;

        if (costSheetNodesParent)
        {
            // check to see if the current node is a header, which is the topmost bucket that we need to log
            if (costSheetNodesParent.Header == NoYes::Yes)
            {
                foundHeaderNode = true;
                description     = strFmt("%1 (%2)", costSheetNodesParent.Code, costSheetNodesParent.Description);
            }
            else
            {
                // the current node is not a header, so we need to get the parent node of the parent
                currentNodeIdToSearch = costSheetNodesParent.ParentNodeId;
                if (currentNodeIdToSearch == 0)
                {
                    // if we have reached the topmost level there has to be a misconfiguration issue
                    throw error(strFmt("No header cost sheet node was found for cost group %1 (%2). Please check the configuration of the costing sheet.", costSheetNodesBase.Code, costSheetNodesBase.Description));
                }
            }
        }
        else
        {
            throw error("There was an issue finding the parent node cost sheet record.");
        }
    }

    return description;
}
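To show how this method can feed the header-level summary mentioned above, here is a rough sketch that walks BOMCalcTrans and totals the lines per costing header into a Map. The missing where clause and the amount field being summed are assumptions; substitute whichever calculation range and BOMCalcTrans cost field you actually report on.

// Sketch only: group BOMCalcTrans lines by costing header.
// The where clause and the summed amount field are assumptions - adjust them to your data.
public Map summarizeCalcTransByHeader()
{
    BOMCalcTrans                bomCalcTrans;
    Map                         headerTotals = new Map(Types::String, Types::Real);
    CostGroupHeaderDescription  header;
    real                        amount;

    while select bomCalcTrans
        // where bomCalcTrans.<your calculation range fields> == ...
    {
        header = this.getCostCategoryByCostGroupId(bomCalcTrans.CostGroupId);

        // hypothetical amount field - replace with the BOMCalcTrans cost field you summarize on
        amount = bomCalcTrans.CostPrice;

        if (headerTotals.exists(header))
        {
            headerTotals.insert(header, headerTotals.lookup(header) + amount);
        }
        else
        {
            headerTotals.insert(header, amount);
        }
    }

    return headerTotals;
}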
Tuesday, October 6, 2015
AX SSRS Report times out after 10 minutes (Error: A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond)
Sometimes when running a report that is going to return a large data set you may get the following error message after 10 minutes:
"A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond"
Whenever you receive this error, the first thing you should look at is your DP class and whether performance can be improved by rewriting some of your code. If it can't be, or this is a default AX report, sometimes your only option is to increase the default timeout period.
In order to increase the timeout for an AX SSRS report you need to do the following:
1.) Create a new client config (or copy an existing one). This is the same kind of config that could be used as the .axc file you use to access different environments/AOS instances. However, this one will have larger timeout values in it, so only use it on the server mentioned in step 4. You do not need your users' application config to have these settings.
2.) Update the binding node "QueryServiceEndPoint" and change the sendTimeout property from "00:10:00" (10 minutes) to whatever timeout value you like in hh:mm:ss format. You can also update any other timeout setting if you like. (Just search for 00:10:00 within the document and replace it with your new time.)
3.) Save the new config as "Microsoft.Dynamics.AX.ReportConfiguration.axc" (https://technet.microsoft.com/en-us/library/hh389774.aspx)
4.) Copy the new file Microsoft.Dynamics.AX.ReportConfiguration.axc into the following folder on your sql/ssrs server (inside of the os/windows file system)
C:\Program Files\Microsoft SQL Server\MSRS11.<AOS>\Reporting Services\ReportServer\bin
Note: This is for a SQL 2012 instance. If you are running a different version of MS SQL, refer to the following TechNet article to find the location of your report server bin folder: https://technet.microsoft.com/en-us/library/hh389774.aspx
5.) Run the report and it should no longer time out. You do not need to restart the AOS, SQL, or SSRS services for the changes to take effect; once the file is in this folder it should respect your new timeout settings.
If this does not fix your issue, refer to the following sites, which can also help you with timeout issues on SSRS reports:
TechNet: Tips to help prevent long-running reports from timing out [AX 2012]
You can also change the timeout settings on ssrs via the following method:
Global timeout setting: Go to site settings > General > Report Timeout > do not timeout report.
Report timeout setting: Go to the report > Manage > Processing options > Report time out> do not timeout report.
There are also some timeout settings if you connect to the reporting service via sql management studio. Right click on your server node and go to properties > Advanced.
Update: Also see AX SSRS - Regular Processing (SRSReportDataProviderBase) VS PreProcessing (SrsReportDataProviderPreProcess) (fixes timeout issues on reporting)
Update 2: After applying the Microsoft.axc file to the ssrs server you may run into a security issue (Microsoft Dynamics AX 2012 Reporting extensions: Error System.Security.Permissions.EnvironmentPermission while running report)
To fix this go to the SSRS installation folder and modify rssrvpolicy.config Report_Expressions_Default_Permissions node from "Execution" to "FullTrust" (https://blogs.msdn.microsoft.com/axsupport/2012/02/02/microsoft-dynamics-ax-2012-reporting-extensions-error-system-security-permissions-environmentpermission-while-running-report/)
"A connection attempt failed because the connected party did not properly respond after a period of time, or established connection failed because connected host has failed to respond"
Whenever you receive this error the first thing you should look at is your DP class and can performance be improved by rewriting some of your code. If it cant be or this is a default ax report sometimes your only option is to increase the default timeout period.
In order to increase the timeout for an AX SSRS report you need to do the following:
1.) Create a new client config(or copy an existing one.) This is the same config that could be used as your .axc file that is used to access different environments/aos. However this will have larger timeout values in it so only use this on the server mentioned in step 4. You do not need your user's applications config to have these settings.
2.) Update the binding node of "QueryServiceEndPoint" and change the sendTimeout property from "00:10:00" (10 minutes) to whatever timeout value you like in hh:mm:ss format. You can also update any other timeout setting as well if you like. (Just search for 00:10:00 within the document and replace it with your new time)
3.) Save the new config as "Microsoft.Dynamics.AX.ReportConfiguration.axc" (https://technet.microsoft.com/en-us/library/hh389774.aspx)
4.) Copy the new file Microsoft.Dynamics.AX.ReportConfiguration.axc into the following folder on your sql/ssrs server (inside of the os/windows file system)
C:\Program Files\Microsoft SQL Server\MSRS11.<AOS>\Reporting Services\ReportServer\bin
Note: This is for a sql2012 instance. If you are running a different version of ms sql please refer to the following TechNet article to find the location of your report server bin folder https://technet.microsoft.com/en-us/library/hh389774.aspx if you need help locating it.
5.) Run the report and the report should no longer time out. You do not need to restart the aos, sql or ssrs services in order for the changes to take place once the file is in this folder it should now respect your new timeout settings
If this does not fix your issue please refer to the following sites which can also help you on time out issues with ssrs reports
TechNet: Tips to help prevent long-running reports from timing out [AX 2012]
You can also change the timeout settings on ssrs via the following method:
Global timeout setting: Go to site settings > General > Report Timeout > do not timeout report.
Report timeout setting: Go to the report > Manage > Processing options > Report time out> do not timeout report.
There are also some timeout settings if you connect to the reporting service via sql management studio. Right click on your server node and go to properties > Advanced.
Update: Also see AX SSRS - Regular Processing (SRSReportDataProviderBase) VS PreProcessing (SrsReportDataProviderPreProcess) (fixes timeout issues on reporting)
Update 2: After applying the Microsoft.axc file to the ssrs server you may run into a security issue (Microsoft Dynamics AX 2012 Reporting extensions: Error System.Security.Permissions.EnvironmentPermission while running report)
To fix this go to the SSRS installation folder and modify rssrvpolicy.config Report_Expressions_Default_Permissions node from "Execution" to "FullTrust" (https://blogs.msdn.microsoft.com/axsupport/2012/02/02/microsoft-dynamics-ax-2012-reporting-extensions-error-system-security-permissions-environmentpermission-while-running-report/)
Thursday, August 27, 2015
Importing security roles & users via the TDTT may cause some roles to not appear under the users
If you followed my previous blog entry on exporting and importing users + role assignments via the TDTT (Test Data Transfer Tool), then in some cases where a custom role was created and assigned, the assignment may not show up in the UI. This is because the data is transferred but the root handle id has changed/is different from environment to environment, so we need to update the security role assignment with the correct root handle.
In order to fix this issue we need to do the following.
Run this on the model db of both the source and the target:
SELECT [RootHandle], [ElementHandle], [Name]
FROM [dbo].[ModelElement]
WHERE ElementType = '133'
ORDER BY [Name]
(ElementType = 133 restricts the results to security roles only.)
Then we need to compare the root handle ids to one another for each of the role names. If they do not match, then the following needs to happen on the target db (the data database):
update [ax_target_db].[dbo].[SECURITYUSERROLE]
set SECURITYROLE ='roothandlefromtargetdb'
where SECURITYROLE = 'roothandlefromsrcdb'
After running this, the root handles for your roles should match up.
Exporting/Importing specific tables via the TDTT (Test Data Transfer Tool) / Transferring only user tables with role assignments via TDTT
Recently, after starting to use the Test Data Transfer Tool (TDTT), it was brought to my attention that users and role associations were not coming over. This is because by default this tool has a list of tables to not export. You can control which tables do and do not get exported/imported by modifying these lists. Currently you can find these lists within the install folder of the TDTT (dp.exe). If you chose the default install location, they would be located at
C:\Program Files (x86)\Microsoft Dynamics AX 2012 Test Data Transfer Tool (Beta)\[Lists]
By default the tool will ignore any table names found within these txt files. However in our case we only want to export/import the specific user tables and nothing else. So first we need to look at how we can accomplish this.
To only include a specific table we can use the following code:
.*(?<!^Table1)
Or, if we only want to export two tables: .*(?<!^Table1)(?<!^Table2)
This gives us the functionality to export/import specific tables. However now we need to figure out which tables we actually need to transfer in order to correctly transfer users + their role assignments.
We can start by looking at the default file "Exclude - User.txt". However, this does not include the role assignments, which you may also need. The role assignments live in the table SECURITYUSERROLE. Combining the default list + the role table, we get the following:
.*(?<!^SYSBCPROXYUSERACCOUNT)(?<!^USERINFOSTARTUPMODEL)(?<!^SYSCOMPANYUSERINFO)(?<!^USERSIDCACHE)(?<!^(DEL_)?USERINFO)(?<!^(DEL_)?USERGROUPLIST)(?<!^(DEL_)?USERGROUPINFO)(?<!^SYSUSERPROFILES)(?<!^SYSUSERLOG)(?<!^SYSUSERINFO)(?<!^SYSUSER.*)(?<!^SECURITYUSERROLE)
Once we have the new file defined with the code above, we need to remove all other lists from this folder so that the tool will only export/import the tables defined in our new file, instead of exporting the entire db while excluding the tables defined by default. Once these files are removed and only one is left in the folder, you issue the same command to export/import the entire db, but this time it should only grab the tables defined in your new file.
Commands to export/import:
DP.exe export C:\AXExport src_db_name sql\AXInstance
DP.exe import C:\AXExport target_db_name sql\AXInstance
Per the TechNet article on the TDTT we can use the following regular expression syntax in our include/exclude lists
Regular expression syntax | Description
--- | ---
. | Match any single character.
* | Match the previous expression zero or more times.
(?<! subexpression) | Prevent a match if the end of the previously matched content matches the subexpression. This expression is known as a zero-width negative lookbehind assertion.
^ | Match the start of the string.
You can view the rest of the filtering options for this tool @ Run the Test Data Transfer Tool (beta) for Microsoft Dynamics AX [AX 2012]
The number of joins in the statement is 31. This exceeds the maximum of 26.
When modifying some base forms to include new tables you might get the error
The number of joins in the statement is 31. This exceeds the maximum of 26. Turn on exception for this warning to see the call stack.
In order to fix this we need to bump the max limit (the default is 26). We can do this by going to
System administration > Setup > System > Server configuration. Then go to the Performance optimization tab and change the field 'Maximum number of tables in join'. After making this change you may need to restart the AOS for the change to take effect.
This setting fixes the error, but before making the change you should take a look at what you are trying to accomplish with this many joins. Can we get rid of some of the tables? Have we split the data into too many tables? Can this be accomplished another way? While having a form with this many joins can cause performance issues, sometimes it's your only option, as default AX may already hit that 24-26 join mark and you are left with no choice but to uptick the number of joins allowed.
Wednesday, August 19, 2015
How to copy data from one Microsoft Dynamics AX 2012 R2 AOS to another (MSSQL VS Test Data Transfer Tool (TDTT))
Below are the steps to copy all of the data from one AOS to another.
Related: How to promote entire Microsoft Dynamics AX 2012 R2 Code Base (MSSQL VS AXUtil VS Powershell)
*Option 2 should only be used when promoting to non-production environments
Export Data
- Option 1 (MSSQL Server):
  - Stop AOS
  - Make a backup of the database from within MSSQL Management Studio
  - Be sure to compress the db! (via software method)
  - Note: this option can cause issues with syncing the db and you may need to manually drop tables and reimport the data
- Option 2 (Test Data Transfer Tool):
  - Info: TechNet
  - Download: Available in Microsoft Lifecycle Services > Downloadable tools
  - DP.exe <direction> <directory> <database> <server>
    - Example: DP.exe export C:\AXExport AX_PROD_01 SQL\AXPROD
  - This should never be used to import data into prod; only use it on environments such as QC, test, dev.
  - It also has the ability to skip certain tables such as system settings. See the link above for more details on skipping tables.
  - NOTE: This needs to be run on the SQL box where the db is located. It uses whatever account triggers dp.exe, so that user needs to have access to the raw SQL db as well.

Import Data
- Option 1 (MSSQL Server): Restore Database
  - Settings (General): Override the old db name with the db you want to restore to
  - Settings (Files): Change the file name under "restore as" to the db you want to import to
  - Settings (Options): Overwrite the existing database (WITH REPLACE) should be checked
  - Start AOS
  - Run a full compile via the client side (long compile)
    - Since the DB is different it's better to do this on the client to make sure no errors exist
  - Sync Data Dictionary (makes sure table indexes are good to go)
  - Full CIL
- Option 2 (Test Data Transfer Tool):
  - DP.exe <direction> <directory> <database> <server>
    - Example: DP.exe import C:\AXExport AX_PROD_01 SQL\AXPROD
  - NOTE: This needs to be run on the SQL box where the db is located. It uses whatever account triggers dp.exe, so that user needs to have access to the raw SQL db as well.
Settings that need to be updated
- System administration / Setup
  - Batch group
    - Remove the servers that came over from the transferred db
    - Assign all groups to the correct batch server(s) related to the current environment
  - System / Server configuration
    - Remove the servers that came over from the transferred db
    - Mark the batch servers related to the current environment
  - System / Cluster configuration (Prod)
    - Doing the previous step should remove the servers from the transferred db, but if it doesn't, remove them from here as well.
    - Create a group for batch servers
      - Assign batch servers into this group via the cluster name field on the grid
    - Create a group for non-batch servers
      - Assign non-batch servers into this group via the cluster name field on the grid
  - System / E-mail parameters
    - Local computer name: update to the current AOS
    - Password: enter the password for email
  - System / Help system parameters
    - Update the URL according to the environment
    - Example: http://aoswebserver:8080/DynamicsAX6HelpServer/HelpService.svc
  - Business intelligence / Analysis Services / Analysis servers
    - OLAP servers (tab) > OLAP server name: change to the db\instance of the current AOS SQL server
      - Example: sql\axprod
  - Business intelligence / Reporting Services / Report servers
    - Create an instance for every AOS and update it with the related SQL/SSRS information
    - Server name: <sql server name>
      - When dealing with a cluster it seems to work better to list the default SSRS box
    - Server instance name: <sql instance name>
    - Report Manager URL: http://aosssrs/Reports_AXPROD
    - Web service URL: http://aosssrs/ReportServer_AXPROD
    - Report folder: DynamicsAX
    - AOS: point to the AOS (this is the setting you change for every AOS when dealing with a cluster)
  - Enterprise Portal / Web sites
    - Change the internal/external URLs to the AX SharePoint URLs
      - Example: http://aoswebserver/sites/DynamicsAx
    - Type: Full
- With importing data there comes a risk that the code may be expecting certain number sequences to be set up which are not. THIS IS OPTIONAL AND NOT REQUIRED ON EVERY IMPORT
  - Because of this, run the Number sequence wizard under Organization administration \ Common \ Number sequences \ Number sequences list page \ "Generate" under "New" on the Action Bar
  - That will generate any un-generated number sequences.
    - Take note of what it generates, and the number sequence code, as further tuning may be necessary because the sequences default to Company-####, which may not be desired.
How to promote entire Microsoft Dynamics AX 2012 R2 Code Base (MSSQL VS AXUtil VS Powershell)
The following will walk you through multiple ways to export and import a code base from one AOS to another.
Microsoft recommends option 2 or 3, but option 2 utilizes PowerShell, which Microsoft has been pushing, so this is probably the best route.
All methods below assume that a clean compile has just occurred on the source you are exporting from.
Related: How to copy data from one Microsoft Dynamics AX 2012 R2 AOS to another (MSSQL VS Test Data Transfer Tool (TDTT))
Export Code
- Option 1 (MSSQL Server):
  - Stop AOS
  - Make a backup of the database from within MSSQL Management Studio
  - Be sure to compress the db! (via software method)
- Option 2 (AOS Server): Microsoft Dynamics AX Management Shell (Control Panel > Admin Tools)
  - Export-AXModel -Model <name> -File <Filename.axmodel>
    - Example: Export-AXModel -Model "CUS Model" -File Cus073115.axmodel
- Option 3 (AOS Server): AXUtil
  - CMD.EXE > CD %ProgramFiles%\Microsoft Dynamics AX\60\ManagementUtilities
  - CMD.EXE > axutil export /model:<Modelname> /file:<filename> /verbose
    - Example: axutil export /model:"CUS Model" /file:Cus073115

Import Code
- Option 1 (MSSQL Server): Restore Database
  - Settings (General): Override the old db name with the db you want to restore to
  - Settings (Files): Change the file name under "restore as" to the db you want to import to
  - Settings (Options): Overwrite the existing database (WITH REPLACE) should be checked
  - Start AOS
  - Run a full compile via the client side (long compile)
    - Since the DB is different it's better to do this on the client to make sure no errors exist
  - Sync Data Dictionary (makes sure table indexes are good to go)
  - Full CIL
  - Deploy ALL reports
- Option 2 (AOS Server): Microsoft Dynamics AX Management Shell (Control Panel > Admin Tools)
  - Install-AXModel -File <Filename.axmodel> -Details
    - Example: Install-AXModel -File Cus073115.axmodel -Details
  - Sync Data Dictionary (makes sure table indexes are good to go)
  - Full CIL
  - *It's good to note that a full compile is not necessary, but it won't hurt anything
  - You can run a server side compile (quick)
    - CMD.EXE (As Admin) > cd C:\Program Files\Microsoft Dynamics AX\60\Server\AX_TEST_01\bin
    - CMD.EXE > axbuild.exe xppcompileall /s=01 /altbin="C:\Program Files (x86)\Microsoft Dynamics AX\60\Client\Bin"
  - Deploy ALL reports
  - You may need to run the note at the bottom of this list as well
  - To re-index the model if need be: axutil optimize /config:ax_configuration_utility_instance_name
- Option 3 (AOS Server): AXUtil
  - CMD.EXE > CD %ProgramFiles%\Microsoft Dynamics AX\60\ManagementUtilities
  - CMD.EXE > axutil import /file:<filename> /verbose
    - Example: axutil import /file:Cus073115 /verbose
  - Sync Data Dictionary (makes sure table indexes are good to go)
  - Full CIL
  - *It's good to note that a full compile is not necessary, but it won't hurt anything
  - You can run a server side compile (quick)
    - CMD.EXE (As Admin) > cd C:\Program Files\Microsoft Dynamics AX\60\Server\AX_TEST_01\bin
    - CMD.EXE > axbuild.exe xppcompileall /s=01 /altbin="C:\Program Files (x86)\Microsoft Dynamics AX\60\Client\Bin"
  - Deploy ALL reports
  - You may need to run the note at the bottom of this list as well
  - To re-index the model if need be: axutil optimize /config:ax_configuration_utility_instance_name

Note: You may need to do the following if you have a clean code base but can't get a clean CIL
- Stop all AOS instances
- Delete all files on the AOS server under C:\Program Files\Microsoft Dynamics AX\60\Server\<aos instance name>\bin\XppIL
- Connect to the model db via SQL Management Studio
- Truncate the table SYSXPPASSEMBLY, which lists all of the files found in the XppIL folder above
  - You can do this via the command: TRUNCATE TABLE SYSXPPASSEMBLY
- Start 1 AOS instance if running under a cluster
- Run a Full CIL
- Start the remaining AOS instances
Wednesday, August 5, 2015
KB 2905338 - fix - Incorrect yields being applied when running price calculations for multiple parts at a time (2012 R2)
Recently we discovered a bug when running pricing calculations from Inventory and warehouse management/Setup/Costing/Costing versions.
It appeared that whenever you ran pricing from the item itself it ran correctly, but whenever you ran it from the generic costing versions screen the yield was always 100%.
We discovered that "KB 2905338, Bug Id 839225: When calculating price on the cost version form, AX ignores the batch size setup on formula version and uses the default order quantity instead" fixed our issue. The fix is located in \Classes\BOMCalcJob.constructBOMCalcItem(), because it added the clause that populates the yield % from the current bomVersionItem. However, this introduces a new bug into the system.
Whenever you run a single item from the main costing versions form, the correct yields are pulled in. The moment you run it for multiple parts (at the same time, when it's not wide open for an entire site) via Inventory/Warehouse management > Costing versions, it ever so slightly modifies your costing results incorrectly.
The issue is that when running pricing for multiple parts at a time, it takes the first item's info and uses that as the "main item" for every part you run, thus affecting your output. You only notice this issue whenever you run it for multiple parts at a time.
You can fix this by removing the piece of code this.parmPmfBOMVersion(bomVersionItem); in \Classes\BOMCalcJob.constructBOMCalcItem(). Once this is removed it appears to still respect the hotfix and also handles processing multiple specific items at a time correctly.
The issue is caused by the if statement at the top that checks to see if pmfBOMVersion.RecId has a value, which it now does.