Wednesday 5 November 2014

Rebuild indexes

And another small post for the archive. If you'd like to rebuild an index (due to fragmentation or growth), you can use the following statement: 


USE [namedatabase];

ALTER INDEX ALL ON [dbo].[nametable] REBUILD;
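
If you first want to check how fragmented the indexes actually are, a query like the one below should do the trick (a sketch using the standard sys.dm_db_index_physical_stats DMV; the database and table names are placeholders):

-- Show fragmentation per index for the table, highest first
SELECT i.name AS IndexName,
       ps.avg_fragmentation_in_percent,
       ps.page_count
FROM sys.dm_db_index_physical_stats(DB_ID(), OBJECT_ID('dbo.nametable'), NULL, NULL, 'LIMITED') AS ps
JOIN sys.indexes AS i
  ON i.object_id = ps.object_id
 AND i.index_id  = ps.index_id
ORDER BY ps.avg_fragmentation_in_percent DESC;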

Deploy reports AX 2012

Another small post for my own archive. To deploy a report, you can use the following command in PowerShell:

Publish-AXReport -ServicesAOSName 01@nameAOS -ServicesAOSWSDLPort 8101 -ReportName PurchPurchaseOrder

Tuesday 4 November 2014

Inventsum recalculation

Due to various kinds of problems, it can be necessary to recalculate the InventSum for an item. Most of the time the cause is incorrect manipulation of the InventTrans records (manual adjustments or, for example, faulty software modifications).

You can run the consistency check, which recalculates the on-hand stock for all items, but that is an operation which can take a serious amount of time. If you only want to recalculate a single item:

InventSumRecalcItem inventSumRecalcItem;

inventSumRecalcItem = new InventSumRecalcItem("itemnumber", true, CheckFix::Fix);
inventSumRecalcItem.updateNow();
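
To see what the recalculation actually changes, you can inspect the on-hand records for the item before and after running it, directly against the database (a sketch; the company id is a placeholder and the standard InventSum fields are assumed):

SELECT ITEMID, INVENTDIMID, POSTEDQTY, PHYSICALINVENT, AVAILPHYSICAL
FROM dbo.INVENTSUM
WHERE ITEMID = 'itemnumber'
  AND DATAAREAID = 'companyid'
  AND CLOSED = 0;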


Friday 5 September 2014

Shrink logfile dynamics AX

Just a small post for my own archive. After deleting a company in AX in my own test environment (recovery model set to simple), I ran into the problem of not having enough disk space left because the log file had become too big.

Shrinking did not solve it. I needed a full backup first, but there's the problem again: no disk space left :). It turns out you can fake a backup of both the database and the log by backing up to the NUL device, and after running that you can shrink the database.

ALTER DATABASE [dbname] SET RECOVERY FULL
GO
BACKUP DATABASE [dbname] TO DISK = 'NUL'
GO
BACKUP LOG [dbname] TO DISK = 'NUL'
GO
ALTER DATABASE [dbname] SET RECOVERY SIMPLE
GO

And finally shrink, and the log file is back in proportion.
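
For reference, the shrink itself can be done with DBCC SHRINKFILE (a sketch; 'dbname_log' is an assumed logical file name, check the actual name in sys.database_files first):

USE [dbname];
-- Find the logical file names first
SELECT name, type_desc, size FROM sys.database_files;
-- Shrink the log file to roughly 100 MB; 'dbname_log' is an assumption
DBCC SHRINKFILE ('dbname_log', 100);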



Edit: a small update. After restoring a database from another environment, use this script to be able to log in again:


UPDATE dbo.UserInfo
SET networkdomain = 'domain', networkalias = 'networkname', sid = 'sidid'
WHERE id = 'admin';
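
A practical way to get the right values is to look them up in a database where the account still works, and to verify the result afterwards (a sketch; 'admin' is the standard AX administrator user id):

-- Run this against a database where the login still works and copy the values from there
SELECT id, networkdomain, networkalias, sid
FROM dbo.UserInfo
WHERE id = 'admin';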

Friday 6 June 2014

Power Query and Dynamics AX

On May 28th, Microsoft released the latest version of Power Query. This is a free, lightweight BI tool to which you can attach a variety of data sources, including SQL Server. So I decided to link it to my AX database to see if I could create a small report and take a look at its usability.

First of all, you can download Power Query here and it is compatible with Excel 2010 and Excel 2013.

After installing it, you'll have a new add-in called Power Query. You can get your data from the web (search through Wikipedia for instance), from files and from other data sources. For AX I'll choose From SQL Server Database:


All you need to do is enter the name of the server on which SQL Server is installed; the database name is optional. If you do not enter the database name, you can select any database found on that server.

As you can see below, the tables are on the right. A cool thing is the preview when hovering over a table; a not so cool thing is that there is no way to search for a table name, and you cannot type the first couple of letters to jump to the table you'd like. In this example, I'll link SalesLine and SalesTable; my objective is to get an overview of sales and cost price per week in 2014 for sales orders which have been fully invoiced.

So tick the option 'Select multiple items' and select SalesLine and SalesTable. On the Load Settings infopane, select Load to data model. You'll need to do this to create relation(s) between tables.


To create a relation, click the Merge button in the ribbon bar. Select your tables and select a field in both tables to create the relation. If you'd like to relate on more than one field, hold the Control key and click the next two fields to relate. Hit Enter to open the Query Editor.

Click the Choose Columns button to include the fields you'd like. Since AX holds the information for all companies in one table, you'll probably want DataAreaId to be included in every query.

In this example, I've chosen 7 fields and included DataAreaId, on which I'll put a filter as can be seen in the screenshot below. Every action you take is recorded on the right, where you can remove or alter an action if needed. I like this part; it's very user-friendly.


You can create custom columns using formulas. In this example, I want the week number derived from CreatedDateTime, so insert a column and add the formula: =Date.WeekOfYear([CreatedDateTime]). All the formula options can be found here. Besides that, I created a new column CostPriceCustom by multiplying SalesQty and CostPrice.

The last two things are to filter CreatedDateTime to This Year and SalesStatus to 3 (invoiced). So all is set; hit Apply and Close and the data is loaded into Excel.
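
If you want to sanity-check the numbers coming out of Power Query, a rough T-SQL equivalent of the same question could look like this (a sketch: it assumes the standard AX 2012 SalesLine/SalesTable fields, uses LineAmount as the revenue measure, ISO week numbers and a placeholder company id, and ignores currencies):

SELECT DATEPART(ISO_WEEK, sl.CREATEDDATETIME) AS WeekNo,
       SUM(sl.LINEAMOUNT)              AS Revenue,
       SUM(sl.SALESQTY * sl.COSTPRICE) AS Cost
FROM dbo.SALESLINE sl
JOIN dbo.SALESTABLE st
  ON st.SALESID    = sl.SALESID
 AND st.DATAAREAID = sl.DATAAREAID
WHERE sl.DATAAREAID = 'companyid'
  AND st.SALESSTATUS = 3              -- Invoiced
  AND YEAR(sl.CREATEDDATETIME) = 2014
GROUP BY DATEPART(ISO_WEEK, sl.CREATEDDATETIME)
ORDER BY WeekNo;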

Now all that needs to be done is to create a simple PivotTable and, from that, a chart which shows my costs and revenues per week in the current year:

You can save the workbook; when reopening it, you'll have the same data. To refresh it, go to the Data tab and refresh, and if you use a PivotTable, refresh that as well.


My verdict: 
Pro:
- It's free!
- Quick to install and setup
- Easy to use

Con:
- The query screen is lightweight: no dragging and dropping, linking is very basic and it is difficult to find your table.
- Refreshing is basic. I'd like a more powerful option here, like scheduled refreshing and being able to quickly overview the tables / queries which can be refreshed.








Wednesday 28 May 2014

Test Data Transfer Tool Dynamics AX - How it works and what are the possibilities

Earlier I described how to import demo data for AX 2012 R3 using the Test Data Transfer Tool (beta).  

This post will look into its possibilities and how it works.

What actually is the Test Data Transfer Tool?
The Test Data Transfer Tool (TDTT) uses SQL Server's Bulk Copy Program (bcp) to move the data from one AX database to another. That means that the business logic validation in AX itself isn't used. On the other hand, it's fast and we like fast.
In order to be able to use the TDTT, you need to have SQL Server Client Tools installed and run the tool from the server which hosts the database. 

Expressions in the Export
There are four expressions you can use to export: 

.  Match any single character
* Match the previous expression zero or more times. 
(?<! subexpression) Prevent a match if the end of the previously matched content matches the subexpression. This expression is known as a zero-width negative lookbehind assertion.
^ Match the start of the string.

How to export / import
In the command prompt, go to C:\Program Files (x86)\Microsoft Dynamics AX 2012 Test Data Transfer Tool (Beta)\ 

To Export: 
dp export "c:\temp" <databasename> <servername>

To Import: 
dp import "c:\temp\importfilename" <databasename> <servername>

Some possibilities

Export one or two tables
After installing the tool, you'll have an interesting directory named: [Lists]. Here you can manipulate the tables which are to be excluded from the export (default installation path is: C:\Program Files (x86)\Microsoft Dynamics AX 2012 Test Data Transfer Tool (Beta)\[Lists]).

So, if we would like to exclude all tables except TaxAuthorityAddress, create a new file in the [Lists] folder with the following expression:

// Don't export any tables except TaxAuthorityAddress

.*(?<!^TaxAuthorityAddress)

If you only want to export TaxAuthorityAddress and CustTable, the expression would be: 
// Don't export any tables except TaxAuthorityAddress and CustTable

.*(?<!^TaxAuthorityAddress)(?<!^CustTable)


Export a table and filter rows from it
Now we want to export the TaxAuthorityAddress, but only for company NL. In order to do this, you need to create a new file named: filters.xml.

In it, enter the following and save the file in the [Lists] folder. Define the table name and, within it, the field on which you want to filter and the filter value.

<tables>
  <table name="TaxAuthorityAddress">
    <filter>
      <field>DataAreaID</field> = 'NL'
    </filter>
  </table>
</tables>
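
Since the filter effectively acts as a row selection on the exported table, you can preview which rows will end up in the export with a quick query against the source database (a sketch):

SELECT *
FROM dbo.TAXAUTHORITYADDRESS
WHERE DATAAREAID = 'NL';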


Import data and run a SQL Script after it
The import tool truncates the table before importing the file. So be aware, you cannot import data only for one company when you have multiple companies in your environment. 

You can create your own SQL scripts and place them in the [import] folder in the TDTT root folder. When the import of the data has been done, the script(s) will be executed. It could be very handy to include a script which updates your parameters to your test environment settings.

As an example, I created a simple script to update my TaxAuthorityId to '007':

USE [DAX2012R3];
UPDATE TaxAuthorityAddress
SET TaxAuthorityId = '007'
WHERE DataAreaId = 'NL';

After importing the TaxAuthorityAddress which I exported earlier, you can now see that the field is updated: 




My verdict: nice tool; clean, fast and simple, I like it. 




Monday 26 May 2014

Brand new setup: SQL 2014, Visual Studio 2010, TFS 2013 and Dynamics AX 2012 R3

I've been wanting to upgrade my software setup for some time now and after an inspiring presentation regarding automatic builds (by Kenny Saelen) I decided to fully upgrade my software. In two evenings I installed:

  • SQL Server 2014 developer edition
  • Visual Studio 2010 (AX 2012 isn't compatible with VS2012)
  • TFS 2013
  • Dynamics AX 2012 R3
I had already installed AX, so I only needed to move my databases to the SQL Server 2014 data dir and it was up and running again. 

My laptop is a Dell with an Intel Core i7, 2.9 GHz, 8GB RAM and a 128GB SSD. I must say, AX 2012 runs pretty smoothly and this is more than sufficient for my testing purposes. With the help of AXBuild, a complete compile took only 37 minutes.

A minor issue I ran into when using Team Explorer was that I could no longer create a new Team Project after linking it to TFS 2013. The error was:
TF30172: You are trying to create a team project either without required permissions or with an older version of Team Explorer. Contact your project administrator to check your permissions or to determine how to upgrade Team Explorer. 

You are required to update your Team Explorer for TFS 2013; you can use the following link: http://www.microsoft.com/en-us/download/details.aspx?id=40776.

Now everything is working and as you can see, TFS 2013 is up and running on my local laptop, nice :-).



Tuesday 20 May 2014

Data import export Framework - Copy entity data between companies

The Data Import Export Framework can import and export data from external sources, but you can also copy data from one company to another within AX itself.

Below you'll find the how-to (like importing data, this is really easy again), but first some downsides in my opinion:


  • You can't see the selected companies after having run the wizard. I'd like to know this information to double-check; you'd want to see it in the [Copy processing group] form.
  • Business validations cannot be run when copying the data. I'd want the possibility to check whether mandatory metadata is present in my new company and, if not, be warned about it, like when importing data. As it is, you should be careful to validate your data in the staging table and correct errors if there are any. If you forget to do this, you'll enter invalid data in AX and lose your data integrity.
How to copy the data


To use it, you must select the option [Copy entity data between companies]. It lets you select an entity (in this case Asset):


Next up is the selection of the Source and the Target company; simply select one Source company and one or more Target companies:


Click next and finish the wizard to set up the copy template. Now all that needs to be done is to Get the Staging data (hit run to fetch the records).


And finally run the job [Copy data to target] to copy the data in the actual AX table. 


The assets are now present in the selected company.




Wednesday 14 May 2014

Create and update production order through code

Here's how to create a production order and update it completely up until ending the order through code for AX2009: 


static void JLH_ProductionOrderViaCode(Args _args)
{
    NumberSeq                       numberSeq;
    ProdTable                       prodTable;
    InventDim                       inventDim;
    ProdMultiCostEstimation         prodMultiCostEstimation;
    ProdMultiScheduling             prodMultiScheduling;
    ProdMultiRelease                prodMultiRelease;
    ProdMultiStartUp                prodMultiStartUp;
    ProdMultiReportFinished         prodMultiReportFinished;
    ProdMultiHistoricalCost         prodMultiHistoricalCost;
    Args                            args   = new Args();
    ;

    ttsbegin;
    numberSeq                   = NumberSeq::newGetNumFromCode(ProdParameters::numRefProdId().numberSequence);
    prodTable.ProdId            = numberSeq.num();
    prodTable.ItemId            = '5001'; // Item

    prodTable.initFromInventTable(InventTable::find(prodTable.ItemId));
    prodTable.initBOMVersion();
    prodTable.initRouteVersion();
    inventDim.InventSiteId      = 'S1'; // Site
    inventDim.InventLocationId  = 'MW'; // Warehouse
    inventDim = InventDim::findOrCreate(inventDim);
    prodTable.InventDimId       = inventDim.inventDimId;
    prodTable.QtySched          = 2; // Quantity
    prodTable.insert();
    ttscommit;

    // Estimation
    prodMultiCostEstimation = ProdMultiCostEstimation::construct(args);
    RunBaseMultiParm::initParm(prodMultiCostEstimation);
    prodMultiCostEstimation.insert(prodTable, prodMultiCostEstimation.defaultParmBuffer());
    prodMultiCostEstimation.run();

    // Scheduling
    prodMultiScheduling = ProdMultiScheduling::construct(ProdSchedMethod::OperationScheduling);
    RunBaseMultiParm::initParm(prodMultiScheduling);
    prodMultiScheduling.insert(prodTable, prodMultiScheduling.defaultParmBuffer());
    prodMultiScheduling.run();

    // Release
    prodMultiRelease = ProdMultiRelease::construct(args);
    RunBaseMultiParm::initParm(prodMultiRelease);
    prodMultiRelease.insert(prodTable, prodMultiRelease.defaultParmBuffer());
    prodMultiRelease.run();

    // Start up
    prodMultiStartUp = ProdMultiStartUp::construct(args);
    RunBaseMultiParm::initParm(prodMultiStartUp);
    prodMultiStartUp.insert(prodTable, prodMultiStartUp.defaultParmBuffer());
    prodMultiStartUp.run();

    // Report as Finished
    prodMultiReportFinished = ProdMultiReportFinished::construct(args);
    RunBaseMultiParm::initParm(prodMultiReportFinished);
    prodMultiReportFinished.insert(prodTable, prodMultiReportFinished.defaultParmBuffer());
    prodMultiReportFinished.run();

    // End the Order
    prodMultiHistoricalCost = ProdMultiHistoricalCost::construct(args);
    RunBaseMultiParm::initParm(prodMultiHistoricalCost);
    prodMultiHistoricalCost.insert(prodTable, prodMultiHistoricalCost.defaultParmBuffer());
    prodMultiHistoricalCost.run();

}

Wednesday 7 May 2014

Data Import Export Framework in AX2012 R3

AX2012 R3 comes with the Data Import Export Framework which is a separate module in AX. It is also part of R2 CU7 and available as a download for 2012 Featurepack (via Information Source). 

I did install it before in the Feature Pack version, but it had some minor bugs which held me back from further investigating this possibly cool tool. After re-discovering it, my first impressions of the out-of-the-box options are:
Pros: It looks nice, it's comprehensive and it lets you preview the data you want to import.
Cons: It still seems a bit shaky; the slightest mistake can result in an error which doesn't explain the reason for the failure. Also, the drag and drop isn't very smooth (yet?), but it gets the job done.

So how does it work?

In this blog I'll briefly show how you can import data via CSV, but you can also upload via ODBC or via other AX instances. I'll import 4 new main accounts in this example.

First you need to create a Source Data format. In this case I've created CSV of the type File and have set the column delimiter to Semicolon. 



Next, create a new Processing Group (MA with description Main Account in my case) and click Entities; here you can select the type of import (I chose Main Account, but it could also be customers, sales orders, vendors, etc.). Select CSV as the source data format and then click Generate Source File to create the input template. Select the fields you want and generate a sample file.


In the Sample File Path, select your enriched file which you'll import. Then the other options in the top menu become available. 

Select Generate Source Mapping and based on the header line in your import file, the links are created. You can view (and modify) the link from the import file to the table via the button "Modify Source mapping" as can be seen below. When you modify a link (drag and drop), remember to hit the Save button. 


You can preview the file which is to be imported by clicking Preview Source File, it will show the upload to the staging table with the values and fields. 

Close the Entities form and return to the Processing Group, hit "Get staging data" to upload the data and click Run. In my case, it results in an infolog saying that 4 records have been inserted into the staging table. Via "Execution History" you can preview the data in the staging table, which can now be imported into the table(s) in AX. Do a last check that all data is correct, then click "Copy data to target" and click Run to import.

This resulted in 4 new records and my main accounts are created: 


It's really that simple, so go for it and check it out!

Saturday 3 May 2014

Importing Demo Data in DAX 2012 R3

Today I installed AX2012 R3 on my laptop and of course I want the demo data to be imported as well (download via Partnersource). You need to use the Test Data Transfer tool which can be downloaded via InformationSource.

The next step is to extract the demo data and to install the test data transfer tool. After that has been done, open the command prompt in admin mode. Navigate to the folder where the test tool is installed and enter the following command: 

dp [direction] [directory] [database] [server]

direction = Import or Export (import in this case)
directory = the directory which holds the extracted demo data
database = the name of the AX2012 R3 database
server = name of the server (can be left blank when importing on the local machine)



Hit Y to confirm that this is a test environment and you want to import the data. The import will start to process the files.

When the import is complete, you'll have the new companies! The whole import took approximately one hour, great work.


Wednesday 5 March 2014

Clean up your databaselog

A cool thing in AX is keeping track of vital changes using the database log. A problem, however, is that the database log can grow exponentially, so there are two things to keep in mind when using it:
1. Choose carefully which data you want to keep track of. Mostly it's enough to track updates on some fields and some delete actions. You should periodically review whether it is still necessary to log these changes.
2. You need to clean up your log, because it can grow enormously over time (a quick way to check its current size is shown below). The disadvantage of the clean-up functionality in AX is that it simply deletes your log.
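
To check how big the database log has become, you can run a quick query in SQL (sp_spaceused is a standard system procedure; SYSDATABASELOG is the table behind the AX database log, and the database name is a placeholder):

USE [AXDatabase];

-- Row count and space used by the database log table
EXEC sp_spaceused 'dbo.SYSDATABASELOG';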

Therefore I created a job which exports your log (with the old and the new value) to a CSV file which you can save; afterwards you can delete the log in AX. The job is just to show one possible way of saving your database log. You could improve it with date ranges, deleting records after saving them, a class which can run in batch, limiting the number of records per file to keep the file size reasonable and then continuing in the next file, etcetera.

static void JLH_DatabaseLogToCSV(Args _args)
{
    #AviFiles
    SysOperationProgress    progress = new SysOperationProgress();
    int                     i;
    AsciiIo                 asciiIo;
    Filename                filename;
    Str                     line;
    Container               con;
    SysDatabaseLog          databaseLog;
    str                     dataBaseLogOld;
    str                     dataBaseLogNew;
    str                     dataBaseLogDate;
    str                     dataBaseLogUser;
    str                     tableName;
    str                     fieldName;
    container               databaseLogContainer;
    int                     containerLength;
    int                     counter;
    int                     countRecords;
    int                     tableId;
    int                     fieldId;
    ;

    progress.setCaption("Export To CSV in progress...");
    progress.setAnimation(#AviTransfer);

    filename = "C:\\FileShare\\DAX2009\\ImportExport\\ExportToCSV.txt";
    asciiIo = new AsciiIo(filename,'W');
    asciiIo.outRecordDelimiter('\r\n');
    asciiIo.outFieldDelimiter(";");
    asciiIo.write('Table;Field;NewValue;OldValue;Date;User');

    while select dataBaseLog
    {
        progress.setText(strfmt("DataBaseLog Recid %1", databaselog.RecId));

        containerLength = conLen(dataBaseLog.Data);

        for (counter = 1; counter <= containerLength; counter++)
        {
            dataBaseLogContainer    = conPeek(databaseLog.Data, counter);
            dataBaseLogOld          = conpeek(databaselogContainer,3);
            dataBaseLogOld          = strKeep(dataBaseLogOld,' 1234567890,.abcdefghijklmnopqrstuvwxyz');
            dataBaseLogNew          = conpeek(databaselogContainer,2);
            dataBaseLogNew          = strKeep(dataBaseLogNew,' 1234567890,.abcdefghijklmnopqrstuvwxyz');
            dataBaseLogDate         = dateTime2str(databaseLog.createdDateTime);
            dataBaseLogUser         = databaseLog.createdBy;
            tableName               = tableid2name(databaseLog.table);

            tableID                 = databaseLog.table;
            fieldId                 = conpeek(databaseLogContainer,1);
            fieldName               = fieldid2name(tableID,fieldId);

            if (fieldName != 'RecVersion')
            {
                asciiIo.write(strfmt('%1;%2;%3;%4;%5;%6',
                tableName,
                fieldName,
                dataBaseLogNew,
                dataBaseLogOld,
                dataBaseLogDate,
                dataBaseLogUser));
            }
        }
    }
    asciiIo = null;
}

Tuesday 25 February 2014

SysQueryRangeUtil, dynamic dates in Reports

There are multiple reports in Dynamics AX which you run on a daily or weekly basis and where you need to change the date range every time you run the report. For instance, you want to see your turnover of the past week every Monday. That means that every Monday, you need to change your date range.

There's a clever query utility which lets you create dynamic ranges.

The different options are (today's date is 25-02-2014): 

  • (dayRange(-30,0)) - Results in a date range for the last 30 days: "26-01-2014".."25-02-2014"
  • (day(-1)) - Results in yesterday's date: 24-02-2014
  • (day(0)) - Results in today's date: 25-02-2014 
  • (day(1)) - Results in tomorrow's date: 26-02-2014
  • (greaterThanDate(2)) - Results in every date after today plus 2: > 27-02-2014
  • (lessThanDate(-1)) - Results in every date of today minus 1: < 24-02-2014
  • (monthRange(0,2)) - Results in the first day till the last day of the months chosen (0 = current month): "01-02-2014".."30-04-2014"
  • (yearRange(-1,-1)) - Results in first day till the last day of the chosen year: "01-01-2013".."31-12-2013"

Now, how to apply this in a report? In a date field, simply key in the range type you need. Below you see the range (dayRange(-30,0)). The result is that every day you run this report, it will display the transactions from today minus 30 days up until today. This way, you can set reports like this to run in batch and send them periodically to a file share or e-mail them, without ever needing to change the date range.



To see the results in a job or to play around with the different date options, you can copy and use the job below:
static void JLH_sysQueryRangeUtil(Args _args)
{
    str         dateRange;
    ;

    // In the following date examples, date of today (25-02-2014) is taken

    dateRange = sysQueryRangeUtil::dayRange(-30,0); // Results in a date range for the last 30 days: "26-01-2014".."25-02-2014"
    info(strfmt(dateRange));

    dateRange = sysQueryRangeUtil::day(-1); // Results in yesterday's date: 24-02-2014
    info(strfmt(dateRange));

    dateRange = sysQueryRangeUtil::day(1); // Results in tomorrow's date: 26-02-2014
    info(strfmt(dateRange));

    dateRange = sysQueryRangeUtil::greaterThanDate(2); // Results in every date after today plus 2: > 27-02-2014
    info(strfmt(dateRange));

    dateRange = sysQueryRangeUtil::lessThanDate(-1); // Results in every date of today minus 1: < 24-02-2014
    info(strfmt(dateRange));

    dateRange = sysQueryRangeUtil::monthRange(0,2); // Results in first day till the last day of the months chosen (0 = current month): "01-02-2014".."30-04-2014"
    info(strfmt(dateRange));

    dateRange = sysQueryRangeUtil::yearRange(-1,-1); // Results in first day till the last day of the chosen year: "01-01-2013".."31-12-2013"
    info(strfmt(dateRange));

}

Wednesday 8 January 2014

ICP / OB declaration via Digipoort in The Netherlands

In AX, the ICP and OB declarations could be sent electronically to the tax authority using the PIN method. However, as of 01-02-2014 it is no longer possible to submit your declaration via BAPI with PIN, and as of 01-06-2014 it is also no longer possible to send it via BAPI PKI (but this was never possible in AX). Declarations now have to be sent via Digipoort using a "PKIoverheid" services certificate and, besides that, the declaration should be in XBRL format instead of the old XML.

The new certificate can be obtained via:
• Digidentity (https://www.digidentity.eu/)
• ESG (http://www.de-electronische-signatuur.nl/)
• KPN (http://certificaat.kpn.com/)
• QuoVadis (http://www.quovadisglobal.nl/)

The new XBRL layout is mandatory (SBR - Standard Business Reporting) and the taxonomy can be found here: http://www.nltaxonomie.nl/ (version 8.0).

For Dynamics AX 2009, AX2012 and AX2012 R2, a hotfix has been created by Microsoft, you can download it here via Partnersource.

Currently, for AX2009 it is only possible to create the XBRL message; you cannot send it yet, but Microsoft is working on it and has said it will release this as soon as possible.


At this moment I've created the new messages in both AX2009 and AX2012, but have not yet sent them. As soon as the AX2009 fix is there and all connection strings are complete, I'll try to post an update on the sending process as well.

Creating the declaration

After installing the update, nothing changes in how you work. You create the ICP / OB declaration as usual in AX and then create the message via [General Ledger - Periodic - Sales tax payment - Netherlands - Electronic ICP declaration]. First create the declaration for a given period and then click [Send XML]. Your declaration isn't actually sent yet, but the message is created and you can view it in the outgoing BAPI folder. Open the file and you'll see the new layout according to the required XBRL format. This works the same for both AX2009 and AX2012.

Below you see the new output, looks good :-)!