
D365 F&O – Fast Track webinars


If you are implementing or have implemented Dynamics 365 F&O, please bookmark the following site:

https://infopedia.eventbuilder.com/index?landingpageid=92tzhl

Here the Microsoft FastTrack team conducts lots of excellent webinars, where the people creating Dynamics 365 share very good knowledge.

You can also sign up for live sessions or look at recorded sessions.

High value!


MPOS – Open full (kiosk) screen mode when having dual display


For a retailer, every saved “click” counts, and so does the ability to remove any noise.

When starting MPOS maximized, you will often see a title bar at the top and the app bar at the bottom.

In Windows 10 you can also use “tablet mode” to get MPOS into full-screen mode.

BUT! If you have a dual-display setup, tablet mode does not work.

If you want to remove the bars, there is a handy keyboard shortcut:

Shift-Windows-Enter

This puts MPOS into full-screen mode, giving a cleaner appearance without the bars.

The question is then how to make this happen automatically every time MPOS starts. This was actually not an easy task, but a colleague of mine (Espen) made it possible using a PowerShell script.

The following page contains a small PowerShell script that opens a UWP app in full (kiosk) screen mode:

Add this to a startup folder, and create a new PowerShell script containing:

    [Path]\StartUWPAppFullScreen.ps1 -app Shell:Appsfolder\Microsoft.Dynamics.Retail.Pos_tfm0as0j1fqzt!App

 
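Since the linked script may move, here is a minimal sketch of what such a StartUWPAppFullScreen.ps1 can look like. This is my own illustration, not Espen's original script: it launches the app by its application ID and then sends the Shift-Windows-Enter chord through user32.dll:

    param(
        # Application ID (AUMID) of the UWP app to start, defaulting to MPOS
        [string]$app = 'Microsoft.Dynamics.Retail.Pos_tfm0as0j1fqzt!App'
    )

    # Launch the UWP app through the shell
    Start-Process 'explorer.exe' -ArgumentList "shell:AppsFolder\$app"

    # Give the app a few seconds to open its main window
    Start-Sleep -Seconds 5

    # Expose keybd_event from user32.dll so the key chord can be sent
    $sig = '[DllImport("user32.dll")] public static extern void keybd_event(byte bVk, byte bScan, uint dwFlags, System.UIntPtr dwExtra);'
    Add-Type -MemberDefinition $sig -Name KeySender -Namespace Win32

    $KEYUP = 0x2
    # Press Shift + Win + Enter, then release the keys in reverse order
    [Win32.KeySender]::keybd_event(0x10, 0, 0, [UIntPtr]::Zero)      # Shift down
    [Win32.KeySender]::keybd_event(0x5B, 0, 0, [UIntPtr]::Zero)      # Win down
    [Win32.KeySender]::keybd_event(0x0D, 0, 0, [UIntPtr]::Zero)      # Enter down
    [Win32.KeySender]::keybd_event(0x0D, 0, $KEYUP, [UIntPtr]::Zero) # Enter up
    [Win32.KeySender]::keybd_event(0x5B, 0, $KEYUP, [UIntPtr]::Zero) # Win up
    [Win32.KeySender]::keybd_event(0x10, 0, $KEYUP, [UIntPtr]::Zero) # Shift up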

Then create a shortcut to this new PowerShell script.

Initial investigations (by Sven Erik) show that the MPOS app ID is Microsoft.Dynamics.Retail.Pos_tfm0as0j1fqzt!App; let’s hope this ID stays permanent.

MPOS then looks nicer for the user, without noise.

Retail Enterprise Architecture mapping using ArchiMate and ARDOQ


A warning: this blog post is high level, but the benefits can be mind-blowing.

Enterprise Architecture is about understanding and change. In today’s business, change is everywhere and essential to survival. But change is not easy. Insight into and understanding of your own organization is essential for change and risk assessment. Understanding how people, processes and technology are connected gives the focus needed to achieve high-value benefits. In my profession we use the Microsoft Dynamics technology stack as the main driver for implementing improvements. But we also acknowledge that Dynamics 365 is not the only system at work. Even though Dynamics 365 is a central component, there will always be many other systems, processes and technologies included in the enterprise architecture (EA). We need a way to describe all these connections in a uniform way that allows us to communicate a model of the enterprise dynamically.

But why should EA mapping be a central part of your business? Here are six business motivators and benefits of having a structured approach to EA mapping:

 Increased stability and availability. It is vital that all central systems have near-100% availability. POS and back-end systems must always work, and the supporting processes must be streamlined to ensure that risks related to business improvements and changes are minimized and understood. The EA mapping documents the relationships and shows the consequences of changes.
 Guaranteed performance. Acceptable system response 24/7, able to deal with business peaks, must be planned and built around the system. Systems must deal with variable load, handling sudden events that change the transaction volume. Any disruption quickly results in customers walking away. The EA mapping must document the components central to performance compliance, and the business actors involved.
 Scalable capacity. New stores or changes in the business model can quickly change the requirements for transaction and processing capacity. To be cost effective, capacity must scale dynamically according to actual need, both up and down. The EA mapping documents the components central to scalability, and the business actors involved.
 Strong security. Cyberattacks are increasing, and it is vitally important to secure information and transactions. Being GDPR compliant puts demands on systems and internal processes for how to handle your own and your customers’ information. Security, traceability and audit trails build trust in the system and document compliance. The EA mapping documents governance and role compliance, and the business actors involved.
 Right focus. There are always new business opportunities and process improvements. Keeping track of where to focus leads to better and faster implementation of changes in a secure and stable manner. New ideas must be analyzed and risk-assessed, and their implications understood. The EA mapping can assist in focusing on the changes with the highest priority and benefit.
 Cost control. Being a retailer involves large investments in technology like POS, mobile apps, customer portals and enterprise systems. Moreover, there may be large fluctuations in system usage throughout the year. By purchasing these capabilities as subscriptions, it is possible to even out the operating costs so that you only pay for what is needed. Good liquidity is achieved by balancing costly investments against the revenue stream and securing an actual return on these investments.

To move forward, a “language” is needed to describe an enterprise architecture model where you can visualize, plan, implement and maintain all relationships that exist today, in transitions, and in the final vision.

Architecture Layers using ArchiMate

The overall mapping can be modelled in five main layers. Here I would like to focus on the notation used to identify the elements. The notation is ArchiMate, an open and independent enterprise architecture modeling language that supports the description, analysis and visualization of architecture within and across business domains in an unambiguous way.

Motivation Elements define the overall drivers and goals that the enterprise has. Much of the vision is located here. The motivation elements can also be seen as a vertical layer, in close relationship to all the other layers.

The Strategy layer defines the overall course of action and a mapping towards resources and business capabilities.


The Business layer defines the business processes and the services the enterprise provides; here the main business processes are defined. To simplify the modeling, it is practical to start with Business Objects, Business Processes, Business Roles, Business Actors, Business Events, Business Services, and Business Rules and Logic.

The Application layer contains application services and capabilities, their interactions, and application processes. Here Dynamics 365 and much of the Power Platform are located. To simplify the modeling, it is practical to start with Data Objects, Application Functions and Application Components.


The Technology and Physical layer describes the software and hardware (physical or virtual) capabilities required to support the deployment of business, data and application services; this includes IT infrastructure, middleware, networks, communications, processing, standards, etc. The underlying structure of Microsoft Azure would typically be described here. To simplify the modeling, it is practical to start with Artifacts, System Software, Technology Services, Devices and Communication Networks.

Architecture Relationships using ArchiMate

The real beauty comes when the relationships between architecture elements are defined. But to do this, a set of predefined relationship types is needed. The most commonly used are the following:

Putting this together in a combined setup, I get the following relationship diagram of what is relevant to document.

(*Credits to Joon for this visualization)

As seen here, the application functions realize the business processes, and this clarifies how a proper enterprise architecture model is documented. With this model we can see which business actors are assigned to which business roles, which again shows the business processes assigned to each role. The business processes are there to realize business services.

Building the Architecture model using Ardoq

The architecture relationships can be challenging to describe using tools like Visio. Often we see that great work is done but not used to its potential. An alternative is to use a cloud-based mapping tool such as Ardoq, which covers most aspects of documenting relationships between business processes, applications, roles, risks and transitions. This is not a commercial for the tool, but I find it great. So I decided to try to use Ardoq to model the Contoso demo data.

Here I will focus on the Application Layer, as this is the layer where the application functionality and data are located. First, I create the application components:

Then I create the application functions, and I also import the business roles that are available in the Contoso demo dataset.

The next job is to build the relationships between the application functions (D365), business processes (vertical processes) and business roles. This allows me to visualize and trace dependencies across the entire EA mapping. Let’s take an example, looking into the responsibilities of an employee named April Mayer.

Here I can see that she is related to the business roles Accounts payable clerk and Accounts payable manager. If I click on “Accounts payable clerk”, I jump into the view of this business role, and I can see that it is related to the accounts payable business processes, with an association to April Mayer.

Jumping to accounts payable allows me to see the business processes involved.

I can also visualize the entire enterprise architecture map with all objects and relations,

and zoom in on specific relations. This graph shows me that April Mayer belongs to the role “Employee”, Accounts payable manager and Accounts payable clerk. The Accounts payable clerk role is associated with the business process “Accounts payable”, and the clerk role is associated with the Financial management modules in Dynamics 365.

Here is another visualization, showing how the business objective “Marketing” can be achieved, and which business roles, business processes, application functions and application components are involved.
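To make the traversal concrete, here is a small sketch (in PowerShell, with made-up element names) of the relation chain actor > role > process > application function > application component, and how it can be walked to answer “what does April Mayer touch?”:

    # Hypothetical mini-model of the relations described above
    $actorRoles    = @{ 'April Mayer' = @('Accounts payable clerk', 'Accounts payable manager') }
    $roleProcesses = @{ 'Accounts payable clerk' = @('Accounts payable') }
    $processFuncs  = @{ 'Accounts payable' = @('Vendor invoice processing') }   # made-up application function
    $funcComponent = @{ 'Vendor invoice processing' = 'Financial management (Dynamics 365)' }

    # Walk the chain from the actor down to the application component
    foreach ($role in $actorRoles['April Mayer']) {
        foreach ($process in $roleProcesses[$role]) {
            foreach ($func in $processFuncs[$process]) {
                '{0} -> {1} -> {2} -> {3}' -f $role, $process, $func, $funcComponent[$func]
            }
        }
    }

A tool like Ardoq maintains exactly this kind of graph for you, which is why the visualizations above can be generated and navigated dynamically.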

Knowing the relations, and being able to communicate them, is the key to happy enterprise architecture mapping.

Give it a try; the result can be very powerful.

Additional information

1. A high-value blogger on Enterprise Architecture: http://theenterprisingarchitect.blogspot.com/

2. ArchiMate homepage: http://pubs.opengroup.org/architecture/archimate3-doc/toc.html

3. Ardoq homepage: https://ardoq.com/ Give it a try!

Near real-time replenishment in Dynamics 365 F&O


There is a lot of good stuff on the horizon for Dynamics 365. I highly recommend that you check out the following article about some of the new planning services coming in the April 2019 release.

https://docs.microsoft.com/en-gb/business-applications-release-notes/April19/dynamics365-finance-operations/planning-service

To make this happen, I would expect the planning to go deeper into the SQL stack, and also to maximize the utilization of in-memory processing of the transactions.

For retailers this will be highly appreciated, as limited space in the stores means that shelf replenishment several times a day is common, especially for perishable products with limited shelf time. Keeping things fresh and presentable is a necessity for the customer to buy. The ability to react more quickly to customer demand ensures that customers actually find the products in your store. The same applies when sales are slower: the ability to adjust replenishment down according to activity saves cost and increases profit. In retail, it is the small improvements that in sum create the big results.

For the planning service to work, it needs transactions to act on. In Dynamics 365 for Retail we must choose between aggregating the transactions coming from the POS/channel databases, or posting the statements more quickly. I’m looking forward to many good discussions in this area.

The future is faster

Dynamics 365 F&O – Enabling new hidden functionality (SYSFlighting)


With Dynamics 365 version 10, the innovation wave from Microsoft continues to accelerate. All customers will use the same base source code of the Dynamics 365 solution, and it will be maintained and updated every month. But for many customers, stability also has its value. New functionality every month is not always what existing customers want to implement; new functionality can mean new training and new testing. I, on the other hand, love new features, because they enable new possibilities and solutions.

Microsoft has a solution for this: not all new functionality is enabled by default. Instead, the new functionality must be manually enabled, based on a support request through LCS. Two specific functionalities that are already documented are new features in the Data Management framework and Business events. In the documentation pages you can see how to enable this hidden functionality, but the essence is that you have to run a SQL command (only available for non-production environments):

INSERT INTO SYSFLIGHTING (FLIGHTNAME, ENABLED, FLIGHTSERVICEID) VALUES ('XXXXX', 1, 12719367)

PS! This is NOT something you can enable by yourself in a production system.

A small tip: search docs.microsoft.com for the term “SYSFLIGHTING”, and you will see the articles on the documented hidden features.
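If you want to script this on a Tier-1 development box, here is a minimal sketch, assuming the SqlServer PowerShell module is installed and the local AxDB database is used. The flight name and service ID must come from the docs article, and a restart of the AOS/IIS service is typically needed before a new flight takes effect:

    Import-Module SqlServer

    # List the flights that are currently registered
    Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'AxDB' -Query 'SELECT FLIGHTNAME, ENABLED FROM SYSFLIGHTING'

    # Enable a documented flight (replace XXXXX with the flight name from docs)
    Invoke-Sqlcmd -ServerInstance 'localhost' -Database 'AxDB' -Query "INSERT INTO SYSFLIGHTING (FLIGHTNAME, ENABLED, FLIGHTSERVICEID) VALUES ('XXXXX', 1, 12719367)"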

There are more, as yet undocumented, features in two categories: Application and Platform. These can be seen as two macros in the source code, named ApplicationPlatformFlights and ApplicationFoundationFlights. I have taken a snapshot of them here, and based on the names we get some indication of what they are used for. What they are and how to use them will, I expect, be documented in the future.

PS! I look forward to exploring “AnalyticsRealTimeReporting”, “DMFEnableAllCompanyExport”, “AnalyticsReportWebEditor”, “BusinessEventsMaster” and “ApplicationPlatformPowerAppsPersonalization”.

Happy Flighting

Dynamics 365 F&O – Selecting the correct Tier level on your sandboxes


When purchasing Dynamics 365 F&O, you get a set of Microsoft-managed (but self-service) environments included with the standard offer: Production, a Tier-2 Standard Acceptance Testing environment, and a Tier-1 Develop/Build and Test environment. Microsoft has described this in the environment planning docs. I will not discuss Tier-1 environments here, as they are optimized for the development experience; do not perform performance testing on a Tier-1 environment. Tier-2+ environments are based on the same architecture as a production environment and use the Azure SQL Database service.

When running an implementation project, it is common to purchase additional Tier-2+ environments that are used for different purposes, as shown in the table below (from Microsoft Docs).

Selecting the correct level is important and depends on what the environment is going to be used for. As guidance, Microsoft has the following baseline recommendation:

On the projects where I have been involved, we most often have 3 or 4 Tier-2+ environments, and their purpose changes through the project.

The flow of data between these environments can be included in a sprint cycle. The process starts with defining the general parameters in the golden configuration environment (1). Here all system setup, number sequences, and master data are uploaded/entered from the legacy systems. The Test/Stage/Migration environment (2) is created based on the golden environment plus transactional data packages/initial startup data. Then there is a database refresh from Test (2) → UAT (3), where all test scripts are run and approved. The results and configuration changes/master data are then fed back into the golden environment, ready for the next data movement cycle. The reason we do this is to ensure that the golden environment and the migration environment are not corrupted through testing. At go-live, when the UAT is approved (after a few iterations), the migration environment is copied to the production environment. This can only happen once. Subsequent updates to the production environment must be done manually or using data packages.

(1)- Tier-2 Golden environment (before PROD has been deployed). This environment is often changed to become a staging environment that contains an exact replica of the production environment. I prefer golden environments as Tier-2, as this simplifies the transfer of data using the LCS self-service database refresh.

(2)- Tier-2 Data migration. This environment is used for making transactional data ready to be imported into the production environment at go-live.

(3)- Tier-2/3 User acceptance. Here the system is really tested: lots of regression testing and running of test scripts. The focus is functionality. If there are concerns about performance, a Tier-5 environment can be purchased for a shorter period to validate that the system can handle the full load of a large-scale production environment. For performance testing, it is recommended to also invest in automation of the test scripts (unless you ask the entire organization to participate in a manual test).

The performance of the system is a combination of the raw computing power of the VMs hosting the AOS, and the sizing of the Azure SQL database. With Dynamics 365 we have no way of influencing this sizing. It is all managed by Microsoft, and they size the production environment according to the number of users and transactions per hour. But the Azure SQL sizing that Microsoft uses is most often related to the following steps.

I don’t know exactly how Microsoft maps Tier-2..5 to these steps, but I have experienced that a Tier-2 level in some cases is a P1, P2, P4 or P6. More information on the DTU capacity can be found here, and the summary is that we can expect 48 IOPS per DTU, so a P6 (1000 DTUs) will provide 48,000 IOPS. If you want to check your DTU limit, open SQL Server Management Studio against the Azure SQL database and execute the following script:

SELECT *
FROM sys.dm_db_resource_stats
ORDER BY end_time DESC;

And then the DTU limit should be shown here. This is from a Tier-2 environment belonging to the initial subscription, and it seems to have 250 DTUs (P2).

But what puzzles me is that another Tier-2 add-on environment has 500 DTUs (P4),

and a third Tier-2 add-on environment has 1000 DTUs (P6).

So there seems to be no consistency between the DTUs provided and the Tier-2 add-on purchased. As far as I know, the production environment is 1000 DTUs (P6) for some of my customers.
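For reference, here is a small sketch of the standard Azure SQL premium steps and the theoretical IOPS they give, using the 48-IOPS-per-DTU rule of thumb mentioned above:

    # Azure SQL premium tiers and their DTU counts
    $tiers = [ordered]@{ P1 = 125; P2 = 250; P4 = 500; P6 = 1000; P11 = 1750; P15 = 4000 }

    # Roughly 48 IOPS per DTU
    foreach ($tier in $tiers.GetEnumerator()) {
        '{0,-3} {1,4} DTUs = ~{2,6:n0} IOPS' -f $tier.Key, $tier.Value, ($tier.Value * 48)
    }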

The AOS servers on Tier-2 environments seem to mostly be D12/DS12/DS12_v2 VMs with 4 CPUs, 28 GB RAM and 8x500 GB storage, capable of delivering 12,800 IOPS.

What also puzzles me is the number of Tier-2 AOS instances that are deployed. Some environments have one AOS and one BI server,

while other Tier-2 environments have two AOS instances and one BI server.

I assume the differences are related to how the subscription estimator has been filled out, and that this may have an impact on what is deployed as sandbox Tier-2 environments.

Dynamics 365 does have some performance indicators under the System administration menu that give some numbers, but I cannot see a clear correlation between the environments and the performance. Maybe some of you smart guys can explain how to interpret these performance test results? What is good, and what is not?

If we take “LargeBufferReads”, how do your environments perform?

D365F&O – Auto-report as finished in a Retail scenario


For many years I have had the opportunity to work on Dynamics 365 topics involving kitting, value added services (VAS) and bills of material (BOM). Today I would like to write about the released-product parameter “Auto-report as finished” in a retail scenario; you can read more about report as finished at Microsoft docs. To explain the business scenario, let’s take hot-dogs. A hot-dog is normally assembled the way the customer wants it, but in this scenario we have a standardized hot-dog with 4 ingredients.

As a retailer, I would like to sell the finished product but keep track of the raw materials. To do this you need to create a BOM, and when the hot-dog is sold, Dynamics 365 will automatically report a hot-dog as finished and draw the ingredients from the store warehouse. It is possible to use a production order, but for retailers this is overkill; something much simpler is needed. Instead of exact BOMs, average BOMs can also be used, since knowing exactly how much onion or mustard the customer will apply is not an exact science.

Dynamics 365 has a nice feature for this: Auto-report as finished.

What this parameter does is that when the product is physically deducted (or sold), a BOM journal is automatically created and posted. This creates issue transactions (sold) from your inventory.
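To illustrate, here is a tiny sketch (with made-up quantities) of what the posted BOM journal amounts to when hot-dogs are sold: one receipt line for the finished product and issue lines for each component, scaled by the BOM quantities:

    # Hypothetical hot-dog BOM: component => quantity per hot-dog
    $bom = @{ 'Sausage' = 1; 'Bun' = 1; 'Mustard (g)' = 10; 'Onion (g)' = 15 }

    $qtySold = 3   # hot-dogs sold in the POS

    # Receipt of the finished product ...
    '{0,-12} {1,5}' -f 'Hot-dog', $qtySold
    # ... and issue transactions for the components
    foreach ($line in $bom.GetEnumerator()) {
        '{0,-12} {1,5}' -f $line.Key, (-1 * $line.Value * $qtySold)
    }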

Here I have created a BOM for my hot-dog:

When creating a sales order and posting a packing slip, you will see that a BOM journal is automatically created and posted.

The posted BOM journal looks like this, and here we see that a hot-dog is added to the warehouse, while the ingredients are subtracted from it.

For retailers, this means that we can sell goods in the POS, and when the statement posting process creates and posts sales orders, the auto-report as finished functionality kicks in. So there is no need for any production order or manually posted report-as-finished journals. Dynamics 365 thus has an alternative to retail kits if more standardized BOMs are used. The BOM can then also be used for cost calculations on food and other retail-produced items. Comparing counting with the actual transactions will also help you know how accurately the BOMs describe the cost picture of the products. Master planning will also catch this, and you can get replenishment to work on the ingredients.

BUT!!! There are some issues.
As a workaround, and to make this work, you will have to specify a default warehouse per site per item in the default order settings. (I know this is an impossible task if you have 500 products and 500 stores, as this would mean creating 250,000 default order settings.) I have a support request going with Microsoft to change this, so that it is not needed and the warehouse can be inherited from the parent transaction. So if you get an error like this, you have done nothing wrong, and hopefully it will be fixed in future releases.

 

STOP HERE, unless you like X++

Here is something for the “technical” guys. The code that automatically triggers auto-report as finished is the class InventUpd_Physical.updatePhysicalIssue(). Those of us who have worked some time with Dynamics understand that this class is very central, because all physical inventory transactions are posted through it. The behavior of auto-posting BOMs will therefore influence all places where a physical transaction is posted.

Microsoft has created a method on the movement classes named canBeAutoRepAsFinished(), which lets them refuse this behavior for certain transaction types.

If you don’t want to wait until Microsoft fixes the feature where the warehouse dimension is inherited from the parent BOM, you do have the option to extend the class BOMReportFinish.initInventDimFromBOMRoute() and set the InventLocationId from the parent there. Here is at least my suggestion for fixing the issue in the standard code (without extension):

Here is the code for validating that the warehouse storage dimension is used on the BOM line, and sending this back to the report-as-finished class.

Take care and I’ve got to get back to work. When I stop rowing, the mothership just goes in circles.

D365F&O – Community Driven Engineering


I have previously blogged about the importance of reporting new ideas, issues and bugs to Microsoft, and also why the community benefits from sharing. I have often experienced that skilled engineers already have the solution available and are more than willing to give it away for free, just to get the fixed-up code into the standard solution.

But the formalized support path requires time and energy, and remember that not all Microsoft support consultants are engineers you can discuss X++ topics with. So how can the process of contributing to the D365 community be made easier?

Did you know that Microsoft has a program for Community Driven Engineering with Dynamics 365 F&O? This covers not only bugs, but also new features. Community driven engineering (CDE) is a Microsoft effort to make external engineers more efficient at providing recommended bug fixes and minor features to Microsoft, as well as making Microsoft more efficient in accepting fixes from the community. If a fix is accepted, it will be merged into the main Dynamics 365 F&O branch. I have tried the program and reported a fix for auto-report as finished; the fix was accepted, and hopefully the entire community can benefit from it in the near future.

How to start?

If you have the right skills and the willingness to share and give away your fixes (or features), you can sign up at https://aka.ms/Communitydrivenengineering. You need to be accepted into the program, and your user must be whitelisted before you get access. The CDE also has a private Yammer group that you get access to when accepted. But I must warn you: this program is meant for the most experienced and technical people in our community, who are deep into X++ and Azure DevOps. You must have approval from CxO level in your organization to share code with Microsoft. (Lawyer stuff.)

Here is the overall flow for the external engineer:

  1. You create a bug or a Feature in CDE Azure DevOps
  2. The bug or Feature is reviewed by the MS team and accepted or rejected
  3. You create a branch for this work and commit in this branch
  4. When done you create a Pull Request
  5. The Pull Request is reviewed by the MS team and feedback is provided
  6. After some iterations the Pull Request will be approved and complete
  7. The MS team will take over the code and include it in a future release

Here are the more technical details of how it works.

The following text is copied from the onboarding documentation of the CDE.

It takes approximately one hour to get started with CDE, the majority of which is the initial build time.

  1. Obtain a development VM from LCS with build 8.1.195.20001 (app 8.1, platform update 22) or later. The latest branch I have access to is 10.0.80.19, which basically is 10.0.2 / PU 26.
  2. Make sure you have opened Visual Studio at least once on the VM to sign in and pick default settings.
  3. Install Git on the machine from https://git-scm.com/downloads. The default installation options should work fine.
  4. From an administrator command line instance, clone this repo to a location on the machine.
    pushd k:\
    mkdir git
    cd git
    git clone https://dev.azure.com/msdyncde/_git/cde

  5. Define your user name and email in Git
    git config --global user.name "John Doe"
    git config --global user.email johndoe@example.com

  6. Mount the git repo into the F&O deployment
    pushd K:\git\cde
    powershell .\Mount.ps1
  7. Open Visual Studio as administrator and rebuild the following models
    ApplicationSuite
    ApplicationWorkspaces
    FiscalBooks
    GeneralLedger
    Project
    Retail
    Tax

At this point you can start development (in the SYS layer, actually).

How to submit a change?

Changes submitted by the community are committed to the same REL branch matching the version on the dev VM. Once the pull request (PR) is completed, that signals that Microsoft has officially accepted the change and it will show up in a future official release, usually the next monthly release (depending on what day of the month the release closes). The change will only enter the master branch of msdyncde through a future official release. Syncing to the tip of a REL branch will pull in other community changes submitted from that version.

  1. Create a Bug or Feature depending on whether the change is related to incorrect behavior of existing code, or new behavior.
    https://dev.azure.com/msdyncde/cde/_workitems
    New work item > bug
    Fill in the title, assign it to yourself, and set the Area to your best guess as to where the behavior belongs (will help us review appropriately)
    In repro steps and system info, provide information on why this change is necessary
  2. In Git, create a topic branch to work on. Branches are usually named by username/bug number.
    git checkout -b johndoe/482
    git push --set-upstream origin johndoe/482

  3. In Visual Studio make changes to Application Suite SYS code as normal. Changes are actually being made directly in the Git folder.
  4. Push your changes to VSTS.
    git add -A
    git commit -m "Message explaining what is being changed"
    git push

  5. Send a pull request from VSTS
    https://dev.azure.com/msdyncde/_git/cde/pullrequests?_a=mine
    New pull request
    Source branch = johndoe/482, Destination branch = Rel_8.0.30.8022 (or whatever version you have)
    Fill in the title and description, link the work item > Create

Any feedback from Microsoft reviewers (or other community reviewers) will show up in the PR. Changes can be made to the PR by editing in Visual Studio and doing git add / commit / push again. Once Microsoft has signed off, all comments have been resolved, a work item is linked, and all other policies have been met, you can click Complete to complete the pull request. When a PR is completed, that is official acceptance by Microsoft that the change will become part of a future official release, usually the next monthly release.

Behind the scenes

  • The PowerShell script starts by checking which version of the source code exists on the VM, by examining the K:\AosService\PackagesLocalDirectory\ApplicationSuite\Descriptor\Foundation.xml file (a sketch of this check is shown after this list).
  • It then checks out the REL branch associated with that version, which matches the platform and other model versions currently on the machine.
  • The development config files are updated to allow changes to SYS models, which is normally disallowed on dev VMs.
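As a sketch of that version check (treat the element names as my assumption about the descriptor schema):

    # Read the application version from the ApplicationSuite descriptor
    $descriptor = 'K:\AosService\PackagesLocalDirectory\ApplicationSuite\Descriptor\Foundation.xml'
    [xml]$model = Get-Content $descriptor

    # Assumed version elements on the AxModelInfo root node
    $info = $model.AxModelInfo
    '{0}.{1}.{2}.{3}' -f $info.VersionMajor, $info.VersionMinor, $info.VersionBuild, $info.VersionRevision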

In addition to the accelerated approach for getting fixes into the main branch, participants get some more benefits. You will have access to the latest and greatest code changes through all the code branches Microsoft makes available. You can search through the code and see if there are code changes that affect extensions or code local to your installations. You can also see how the Microsoft code is evolving and how improvements are made available in the standard application. You will also gradually build a very valuable network with the best developers in the world, discussing technical topics with the actual people creating the world’s best ERP system.

One final joke for those considering going into this program: Git and sex are a lot alike. Both involve a lot of committing, pushing and pulling. Just don’t git push --force.

D365F&O – Address performance tips


Sometimes the smallest thing can make a huge difference. At a customer we experienced a huge load (DTU +70% on average), and LCS showed that a single SQL query was the reason for the load. The data composition here was close to half a million customers in the customer table, most of them with addresses, email and phone numbers assigned, except for the customers used for retail statement processing.

In LCS environment monitoring you can see this as spikes in the overview.

 

The query you typically see looks like this:

(@P1 int, @P2 nvarchar(256), @P3 int, @P4 bigint)
SELECT TOP 1 T1.COUNTRYREGIONCODE, T1.DESCRIPTION, T1.ISINSTANTMESSAGE, T1.ISMOBILEPHONE,
    T1.ISPRIMARY, T1.ISPRIVATE, T1.LOCATION, T1.LOCATOR, T1.LOCATOREXTENSION,
    T1.PRIVATEFORPARTY, T1.TYPE, T1.ELECTRONICADDRESSROLES, T1.MODIFIEDBY,
    T1.RECVERSION, T1.PARTITION, T1.RECID
FROM LOGISTICSELECTRONICADDRESS T1
WHERE ((T1.PARTITION = 5637144576) AND ((T1.TYPE = @P1) AND (T1.LOCATOR <> @P2)))
AND EXISTS (SELECT TOP 1 'x' FROM LOGISTICSLOCATION T2
    WHERE ((T2.PARTITION = 5637144576) AND (T2.RECID = T1.LOCATION))
    AND EXISTS (SELECT TOP 1 'x' FROM DIRPARTYLOCATION T3
        WHERE ((T3.PARTITION = 5637144576) AND (((T3.LOCATION = T2.PARENTLOCATION)
            AND (T3.ISPOSTALADDRESS = @P3)) AND (T3.PARTY = @P4)))))

By downloading the query plan, we see that there is an index seek on the table LOGISTICSELECTRONICADDRESS.

 

This results in the index not getting a good “hit” on LOGISTICSELECTRONICADDRESS.TYPE.

The solution was surprisingly easy. Add Phone, Email address and URL to the customers.

 

Then the DTU usage drops drastically, and normal expected performance was achieved.

 

Conclusion: when you have many customers, remember to fill in contact information.

This just had to be shared.

Meetings: Every minute counts, and snooze to 1 minute before meeting starts


As a consultant I’m used to having a lot of back-to-back meetings, and when the next meeting is near, I typically get an Outlook reminder 15 minutes prior to the meeting.

Then the “Snooze” button comes in handy. If I snooze until 5 minutes before, I am too early; 0 minutes before, and I am too late. Did you know that the minimum selection in the drop-down is 5 minutes? That is too much for me; I would like a new reminder 1 minute before the meeting starts. But did you also know that you can type into the field? You can actually write “1 minute”, and it will then remind you 1 minute before the meeting starts.

A slightly more advanced way is to set the default reminder to 16 minutes prior to the meeting.

And then when the reminder pops up, you can hit “Snooze” and select to be reminded in 15 minutes. That is exactly 1 minute before the meeting starts.

Now I have just “earned” 4 more minutes where I can create D365 customer value before the meeting starts.

D365: Search for code with Agent Ransack


When supporting customers we often get only small fragments of information about an issue, like a form not performing as expected, or an error message. The procedure is then often to log into LCS and find traces of the issue. Often we end up with a query that is the source of the issue. But to better understand and analyze how to fix it, we often need to find exactly where in the source code the query is executed. By being more exact and precise towards Microsoft support, you also get a quicker response.

Searching through the code in Visual Studio can be time consuming, and the built-in cross-reference is not always up to date, but there is an alternative I can recommend. Agent Ransack is a free file-searching utility that can quickly scan most D365 source code (the *.XML files placed in K:\AosService\PackagesLocalDirectory\).

Let’s say I see in LCS the query I need, and I want to find out from where it is executed.

From the query I can then search for the text “Join RetailEODTransactionTable”, and I get 25 results, even where the exact table is not specified.

I can then open the file in Explorer and validate whether I need to go into Visual Studio for further analysis.

This speeds up the process of finding the source code you are looking for. It is free; download it from https://www.mythicsoft.com/agentransack/ and install it in your development environment.
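If you prefer to script the same search, a rough PowerShell equivalent (assuming the standard dev-box folder layout) could be:

    # Find XML source files that mention the query fragment
    Get-ChildItem 'K:\AosService\PackagesLocalDirectory' -Recurse -Filter *.xml |
        Select-String -Pattern 'Join RetailEODTransactionTable' -List |
        Select-Object Path, LineNumber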

 

Take care, Daxers.


Retail statement trickle feed (public preview)


Retail statements are one of the most important (and complex) processes a retailer has. This is where retail sales and transactions are transformed into physical and financial transactions, so you can see the sales in finance and in inventory. Retail statement calculation and posting has been covered many times in my blog posts, and Microsoft has a large set of articles on docs about it. The volume of transactions that retail statements calculate and post is, to my knowledge, THE most complex and intense feature and business process in the entire Dynamics 365 solution. Imagine that every sale, in every store, is being processed. For larger retailers, Dynamics 365 for Retail processes millions of transactions daily. This area really puts computational pressure on the system, and it is also one of the areas where Microsoft is investing heavily.

Since the start of D365, hundreds of improvements have been made to retail statement posting, and the next “big thing” is retail statement trickle feed. One of the pains in today’s solution is a significant delay between when the retail sale is conducted and when the inventory transactions are financially posted; in short, when the inventory transaction gets a financial status like “Sold”. Why is this important? Because the inventory transactions define on-hand values, which again define how master planning/replenishment is calculated. We want this to be as accurate and up to date as possible; any delay in having accurate on-hand influences planned purchase orders. Also, the ability to spread out the processing of transactions through the day will reduce the “spikes” in Azure SQL load, leaving the nightly timeslot more open for other high-intensity transaction processing tasks.

Another critical benefit of trickle feed is the decoupling of transactional statements and financial statements. Now you can post transactional statements without even posting a financial statement, and the other way around. Together with the increased posting frequency that produces small bundles of transactional statements, this addresses the main reason for the compounding effect that prevents a series of statements from being posted due to a single invalid transaction. Right now, the only validation that impacts financial statements is that all retail transactions for a given shift must be present in HQ for a financial statement to be posted. However, the transactions do not need to be successfully posted for a financial statement to be posted.

There is also a new aggregation strategy, where unnamed transactions are always aggregated and named (customer) transactions are never aggregated. There is no longer an option to turn aggregation on or off.

Microsoft has made the following improvements to the statement posting process:

  1. Deprecate the “inventory job” that creates temporary reservations.
  2. Create a new job that will, at a predefined schedule, create sales orders, invoice them, and create, post, and apply payments for all the transactions that are synchronized to the HQ at that point of time. In addition, it will also create any ledger journals that need to be created for discounts, gift cards, and so on.
  3. The statement document that gets created at the end of the day will only be used to calculate and post any counting variances.

To enable the new preview (10.0.5) trickle feed solution, you have to enable the Retail Statement (trickle feed) – preview configuration key. Also remember to disable the other retail statement configuration keys, and make sure you don’t have any open statements when doing this.

When finally released (GA), I hope that the new Feature management is used for enabling this.

When this is done, you will see a set of new menu items under the menu \Retail\Retail IT\POS Posting.

The sequence of these batch jobs is designed to financially post most of the transactions, while the financial statement posting is only used to calculate and post any counting variances. There is no need to run the “Post inventory” job anymore. In reality there is a decoupling: the transactional statement and the financial statement can be posted independently of each other. The only actual requirement is that the P-job has fetched the retail transactions from the retail channel database.

If we look into the Retail Statement form, we now have the possibility to manually create a transactional posting and a financial reconciliation (which in essence is the financial statement).

When creating a “Transactional posting”, we see that the form has changed a bit compared to how it was before: there are no lines related to payments.

When posting the transactional statement, the following steps are performed:

When calculating and posting a financial statement, you see the more traditional statement posting screen, where you have the payment lines:

The steps in the posting are the following:

The summary of this is that with trickle feed, Dynamics 365 will support a much faster update frequency, giving proper on-hand values and scalability. Since the transactional statements will run more frequently, there will be less retail statement posting in the evening/night. The transactions will be smaller and therefore also easier to post. But there are a few things to keep in mind: if you trickle feed too often, you will miss out on the transaction aggregation of the unnamed transactions and will have to process more sales order invoicing per day. This can again slightly increase the load on your system.

This feature will also improve the scaling of the system, as posting of transactions can be better load-balanced across multiple AOS batch servers. I also have a feeling that there are more features to come in this area, further enabling close to real-time master planning, inventory services, and close to real-time Power BI reporting.

 

Next on the customers’ wish list is a super-duper-fast invoicing service for (retail) sales orders, as this is still the most resource-demanding task in the processing of retail transactions. The roadmap also includes the ability for the store manager to generate the financial statement when a shift is closed in POS. The financial statement in HQ will in this case post whatever the financial statement generated in POS defines, removing the requirement of having all transactions uploaded to the HQ database. Beyond this, Microsoft is, as always, improving general performance by working closely with customers and partners. We see that the data distribution and different usage of retail statements require different indexes, and Microsoft invests heavily in improving how queries are executed.

 

Great work to the Microsoft team working on the retail statement processing.

 

Here is a small joke for all of you that don’t care about retail statement posting

Analyzing Cloud POS performance in Dynamics 365 for Retail


It is a constant requirement that the systems retailers interact with directly should be Bigger, Better, Faster, Stronger (BBFS). In this blog post I will dig into how POS performance can be analyzed, to better understand the transactional performance of the Dynamics 365 POS. What I am especially interested in is how perceived performance compares to actual performance. What we think is good performance is relative to the observer. The average human reaction time to a visual stimulus is 250 ms, but newer studies show that we can identify visual stimuli down to 13 ms, and a typical screen refreshes every 17 ms. As the expected performance is close to real time, this can sometimes lead to performance expectations that are irrelevant to what one actually wants to achieve. We humans cannot react faster than about 250 ms to what we see, so this is important to keep in mind.

As you can see in the following video, four items are scanned and then a quick cash payment is made. The total time to complete this example transaction in CPOS is approximately 5 seconds.

But as you can see on the screen, there is a lot happening while the user interface is being redrawn. I wanted to go deeper to understand exactly what happens when scanning, more specifically what happens when adding the sales lines in the POS.

As CPOS is a 100% web-based application, we can use Google Chrome to take a deeper look into exactly what is happening. By pressing F12 (or Ctrl-Shift-I), you open the developer tools.

Then start the recording (CTRL-E), add a line in POS, and stop the recording. Then you will see:

1. CPU load, Activity bars, Network calls
2. The actual animation on the POS display each millisecond
3. Exactly how long calls to the Retail Server are taking.
4. The entire REST-call stack being executed on the CPOS client.

Here you see an example where I added one line to the POS basket, and this resulted in 2 calls to the retail server.

If we look at one of the calls happening:

ScanResults() (*.dynamics.com/Commerce/ScanResults(‘07100’)?$expand=Customer&api-version=7.3) scans the product/item barcode and sends it to the Retail Server. In the Chrome developer tools, we can analyze exactly what takes place in this call. Here we see that the total time was 559.54 ms, but the actual waiting time for the RCSU to respond was 263.69 ms (Waiting TTFB; TTFB stands for Time-To-First-Byte: the browser is waiting for the first byte of a response, and this timing includes one round trip of latency plus the time the server took to prepare the response). I have measured the network latency to this Tier-2 with RCSU to be 40 ms.

If I scan the item again, we see that caching (DNS etc.) kicks in, and the TTFB drops to 132.80 ms.
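If you want to sample round-trip times outside the browser, a rough sketch is shown below. The URL is hypothetical, and a real Retail Server call requires authentication, so this only illustrates the measuring technique, not an authenticated call:

    $url = 'https://yourenvironment.dynamics.com/Commerce/'   # hypothetical endpoint

    # Take five samples of the total request time
    1..5 | ForEach-Object {
        $t = Measure-Command {
            try { Invoke-WebRequest -Uri $url -UseBasicParsing | Out-Null } catch { }
        }
        '{0:n0} ms' -f $t.TotalMilliseconds
    }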


As you can see, you can go really deep and analyze everything that is happening, from client execution to server execution, without any debugging tools, down to the millisecond, and better understand the performance. The profile created can be exported and imported for deeper analysis. We can see that many factors influence performance, from network delays to form refresh. Microsoft can have the pleasure of shaving milliseconds off the animations, server calls and JavaScript, and this is an ongoing investment from an R&D perspective.

My honest opinion is that the cloud-based Dynamics 365 for Retail POS performs well. Network elements and the speed of light are fundamental restrictions. The use of animations also seems to affect how performance is perceived, but it does not affect the general performance and usability. Legacy systems that are on-premises have the benefit of no latency, but the cloud solution brings many other positive elements. If you choose MPOS instead, these tools are not available, and you can use Fiddler for analysis. A small tip is to have a CPOS client available when performance testing, as findings there will also apply to MPOS.

Bigger, Better, Faster, Stronger !

D365F&O, Lots of new high value content on DOC’s


The Microsoft Dynamics team has been quite busy after the vacation, producing a lot of valuable content for Dynamics 365. I would like to highlight some of the latest additions that are worth checking out and sharing in the Dynamics 365 ecosystem. This year alone, 714 articles have been published, and in just the last 2 months close to 300 articles have been made available. With this amount of information, I get questions about whether there are hidden gems on docs. Here are some of them:

1. Learning catalog

There are now more tailored learning paths for customers and partners, with references to free self-paced online learning paths, TechTalks, and formal instructor-led training. Here you will find articles, videos, and all you need to start learning Dynamics 365.

2. Test recorder and Regression suite automation tool for Retail Cloud POS

Now we can start creating regression tests for the Retail POS. Cool stuff, and in my mind this is where we actually see the true value of regression testing. Retail is detail, and this delivers quality.

3. Master planning setup wizard

Setting up master planning involves making many decisions, and here you can read how this is done in 10.0.5.

4. One Version service updates FAQ

This page answers a lot of questions about the One Version strategy and what it means for you. At many customers I see extensive, time-consuming and costly testing processes being manually executed each time Microsoft releases a new monthly update. Why? I do not see the need to perform full testing of all modules on a monthly basis. Yes, it is a fact that nobody releases flawless code (not even Microsoft), but if you follow the procedures and guidelines from Microsoft, the monthly updates should be safe to deploy. There are several release rings and programs in place ensuring that quality is in place at GA (General Availability). Please align to the release cadence, and focus on your essential core processes. If you find painful bugs, report them asap.

5. Environment planning

I have seen several projects where the focus is to save costs on implementation environments. This page explains a lot about Microsoft’s take on this. My simple advice: use a Tier-1/one-box for development on a cloud-hosted CSP subscription, and the rest of the environments as Tier-2 or higher (my recommendation is 2 additional Tier-2 environments for larger projects). The benefit of the self-service processes is priceless. Also keep in mind that Azure costs are very cheap compared to the consultancy hours spent maintaining and manually transferring databases between environments. Also take a look at Denis Trunin’s great blog post on development VM performance.

6. Business events overview

This is the future; start adopting this feature into your business processes. It is also a key enabler for working more closely with the Power Platform.

7. Regulatory updates

Here you find localized information for your country and how to comply with specific local requirements. This is updated very often.

8. Unified product experience

Do you want to keep the products from D365F&O synced with D365 Sales? This article explains how to achieve a near-real-time bi-directional integration with CDS. Great stuff, also explaining dual-write capabilities.

9. Adyen payment processing with omnichannel experience

The payment connector is far more versatile than just retail. Also check out the FAQ.

10. Asset management

Great stuff on the horizon. Keep track of your stuff

11. Franchising

This is no longer in the official 2019 Wave 2 release, so we must keep waiting for it.

 

Take care, and

DXC you later


D365 Retail – Buzz Alert !


THIS IS COOL!
Microsoft is launching several new product lines for retailers.

Dynamics 365 Commerce

Empower your business to create exceptional, insightful shopping experiences for every customer with Dynamics 365 Commerce—built on our proven Dynamics 365 Retail solution.

https://dynamics.microsoft.com/en-us/commerce/overview/

Microsoft Connected Store

Empower retailers with real-time observational data to improve in-store performance. From customer movement to the status of products and store devices, Dynamics 365 Connected Store will provide a better understanding of your retail space. (Check out the video)

https://dynamics.microsoft.com/en-us/ai/connected-store/

Dynamics 365 Branding and Commerce (Preview) Firsthand experience


PS! Remember to read the last lines in this blogpost

As I hope you have seen in your never-ending twitter/news feed, Microsoft is again adding lots of new apps and features to Dynamics 365, delivering on the communicated vision of Dynamics 365. We now have apps with a holistic approach to business processes. To solve business requirements, users will be using a combination of apps that work natively together. We see how the entire solution is being connected, and further split up into specific areas. In the old days we had large ERP suites and sold functional modules; we are now implementing connected apps that enable business processes per user. If anyone wonders what the "new" hashtag is, it is easy: "#MSDyn365", and get used to it. We no longer need to put things into additional silos to explain the legacy, and to succeed we must embrace and deliver the right combination of apps that solves the business requirements per business process.

One of the most exciting pieces of news in the current wave 2 release is the delivery of Dynamics 365 Commerce (preview). I have been privileged to validate and try out this solution over the last days. My current experience is: this rocks! Microsoft can finally deliver a complete suite to give a true omnichannel experience. One interesting finding is that Microsoft will rebrand their "Dynamics 365 for Retail" offering to "Dynamics 365 Commerce". Why? Because what is now being offered extends the boundaries of the traditional retail solutions. As seen in the following figure, you get a complete integrated end-to-end system. And this is not just for retailers: all companies that want to digitalize their processes and offer true omnichannel can benefit from this.

Figure 1: Picture from Microsoft presentations

To try out this new solution, you can request a preview here. When/if accepted, you will receive an email from Microsoft containing instructions on how to deploy the preview. The guide is also available here, and it is important that the guide is followed very carefully. To complete the guide, you need some assistance from your Azure AD tenant administrator. Also, the preview is currently only deployable to US Azure datacenters, and this puts some latency into the commerce experience.

One interesting thing with Commerce is that even though this is a Tier-1 environment, you get the possibility to deploy RCSU and the e-Commerce server. The data set is basically standard Dynamics 365 for Retail, with the configuration key for Retail essentials enabled. So we can showcase that Dynamics 365 Commerce can also be delivered as a standalone app, or be extended with the Finance and Supply Chain Management apps.

The preview commerce solution is what you expect an e-commerce solution to be:

The back-end site editor is easy to use, and it is easy to configure your site.

To get a full understanding of the solution, also head over to Microsoft Docs to learn more: https://docs.microsoft.com/en-us/dynamics365/commerce/

But I'll do something better for you; you can check out the preview solution yourself right now: https://d365commerceus2ecom-eval.commerce.dynamics.com/DXCCommerce (I expect that the site will only be available for a few days, so hurry)

If you want to buy something, use card number: 4111-1111-1111-1111, expiration: 10/20, CVV: 737. Also remember that this is a US-based Azure datacenter and NOT a production-grade scaled system.

Happy DAXing and DXCuLater!

Microsoft Bookings and Microsoft Graph


One common piece of feedback we get when implementing Dynamics 365 concerns the ability to handle appointments and bookings. There are many very good 3rd-party solutions, but did you know that Microsoft has an easy-to-use booking system that works online and is integrated with Outlook? It's called Microsoft Bookings, and it is worth taking a small look at, especially if you need to book your customers for appointments and simple services. Microsoft Bookings provides online and mobile apps that make appointment scheduling simple and efficient for small businesses and their customers. Any small business that provides service on an appointment basis, such as auto repair shops, hair salons, and law firms, can benefit from having their bookings managed so as to free up time for the more important task of growing their business. Microsoft Bookings is available to businesses that have an Office 365 Business Premium subscription.

Here is a small live demo for you my friends: https://outlook.office365.com/owa/calendar/DXCCommerce1@dxccommerce.onmicrosoft.com/bookings/

The first page an online customer arrives at is the following screen, which can be published on Facebook or any social media site. Here I choose to order my haircut from my favorite hairdresser. (Full manual is available here)

When booking I will get a confirmation email, and the booking coordinator will also get an email. The booking is also available on my phone:

On the back-office side, Microsoft has created a simplified view for managing and setting up your bookings:

Here you manage the calendar, customers and staff.

Here is the calendar for a specific day showing all appointments and bookings for today. Drag and drop of appointments between staff and dates is of course possible.

You can also manage your staff.

And the services you offer, mapping them to your staff.

If you are a functional person, then just stop reading here, because here comes the good part: there is a complete API interface for you to integrate towards Bookings (see also this link). Connecting this to Dynamics 365 or the commerce apps can be done by a developer, and makes it possible to expose booking services to POS and call center, with tight integration to your Dynamics 365 solution.

Check out Microsoft Bookings and the Microsoft Bookings API in Microsoft Graph.

Here are some sample pictures on how to access the booking system using Microsoft Graph. First, I list all the booking businesses in my tenant:

Pay attention to the fact that it returns an "id" that identifies my booking business for a specific store. If I now query for appointments on that id like this:

https://graph.microsoft.com/beta/bookingBusinesses/DXCCommerce1@dxccommerce.onmicrosoft.com/appointments (you will not get access to this link, but you are welcome to click it)

I get the following, where the service lists all bookings posted into Microsoft Bookings. Consent must be set up through the Azure portal. And the great thing is that it actually is a two-way service: I can also post bookings in.
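To illustrate the developer side, here is a minimal X++ sketch (using .NET interop) of how the same Graph call could be made from D365. This is my own sketch, not an official sample: token acquisition is omitted, and it assumes an Azure AD app registration with consent to read Bookings data.

// Minimal sketch: read Microsoft Bookings appointments through Microsoft Graph.
// Assumes a valid OAuth bearer token (_bearerToken) has already been acquired.
public static void listBookingAppointments(str _bearerToken)
{
    System.Net.Http.HttpClient client = new System.Net.Http.HttpClient();
    str url = 'https://graph.microsoft.com/beta/bookingBusinesses/DXCCommerce1@dxccommerce.onmicrosoft.com/appointments';

    // Attach the bearer token to the request headers.
    client.get_DefaultRequestHeaders().Add('Authorization', 'Bearer ' + _bearerToken);

    // Synchronous wait on the async calls; acceptable for a demo job.
    System.Net.Http.HttpResponseMessage response = client.GetAsync(url).get_Result();
    str json = response.get_Content().ReadAsStringAsync().get_Result();

    info(json); // Raw JSON containing the appointment collection
}

The same HttpClient with a POST towards the appointments endpoint is what makes the two-way scenario possible.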

BOOM! Take that! We now have a complete interface towards all the services that Microsoft Graph can expose, and it lets us integrate on a completely new level.

If I wanted, I could now connect my bookings to any planning engine that would add more value to the service, like picking me up in a golden limo-cab when I book my hairdressing hour. The possibilities are endless. Also remember that this is not restricted to Bookings, but applies to all services that Azure may provide. We in the Dynamics partner community have just scratched the surface of the possibilities that Microsoft now provides.

Happy DAX’ing friends.

D365 – To exist or not, that is the question! (part 2)


Some years ago I created a free community solution for "Not-Exists Join". Not-exists join means that we can filter and search on data that does not have any relational records. This answers questions like:

– Show me all customers that have no sales orders the last X days

– Show me all items with no inventory transactions, and items with no movement the last 30 days.

– Show me all items that have no price.

Countless community friends have used this for AX 2012, but since Dynamics 365 was released this solution could not be applied. To do it properly, I have decided to push a request through CDE (Community Driven Engineering), hopefully making it available to all D365 customers as part of the standard solution. All code is ready and checked in, and I'm just waiting for the Microsoft review.

The way the CDE works is that partners and customers that have code or bug fixes can work together with Microsoft on implementing changes. It is Microsoft that has the final decision, and they will also make it part of their IP. But for all you community friends, here is a sneak peek of what I'm working on together with Microsoft.

The advanced filter and query in Dynamics 365 is a very powerful tool. Here you can search and filter on most fields and add join relations to the query.

But there is one area that the advanced query screen does not handle: the "not-exists join". Let's say I want a list of all the customers that don't have sales orders; standard D365 will not help here. The purpose of this post is to show how to implement "not-exists join" into the standard product. (For developers: the X++ query framework already supports this join mode, as sketched below; what has been missing is surfacing it in the advanced filter UI.)
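A minimal X++ sketch of the programmatic equivalent, assuming the standard relation between CustTable and SalesTable (the proposed feature essentially exposes this join mode through the query UI):

// Build a query that returns customers WITHOUT any sales orders,
// using the query framework's NoExistsJoin mode.
static void customersWithoutSalesOrders()
{
    Query                query   = new Query();
    QueryBuildDataSource custDs  = query.addDataSource(tableNum(CustTable));
    QueryBuildDataSource salesDs = custDs.addDataSource(tableNum(SalesTable));
    QueryRun             queryRun;
    CustTable            custTable;

    salesDs.relations(true);                  // Join on the CustTable/SalesTable relation
    salesDs.joinMode(JoinMode::NoExistsJoin); // Keep only customers with NO related sales orders

    queryRun = new QueryRun(query);
    while (queryRun.next())
    {
        custTable = queryRun.get(tableNum(CustTable));
        info(strFmt('Customer without sales orders: %1', custTable.AccountNum));
    }
}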

Functional Solution

In the joins form, a new section of relations has been added, representing the tables that can be added as a "not exists" join:

In this sample, the customers with no sales orders will be in the query result/form. But the feature is generic, and all 1:n relations can be selected as a "Not exists" relation.

When will you have this in standard? Maybe 10.0.10? It depends on Microsoft and the final approval of the code and feature, but hopefully it is not too far in the future. So "cheer and share", and maybe we as a community can accelerate this much-requested feature.

The D365 community ROCKS, and Happy DAX'ing!!

Batch Jobs: Take control of the executions


Dynamics 365 can be automated quite a lot with the use of batch jobs. With batch jobs, your Dynamics 365 solution becomes "alive", and we can set up the system to automate many manual processes. Let's say we have the following "vanilla" process and want to automate as many steps as possible.



This post covers the batch jobs that need to be set up for this process to be as automated as possible. I wanted to put a structured system on all the batch jobs that are typically used in a production system. Batch processing also generates a lot of data that you don't normally need, so it is common to create both functional batch jobs that process and execute functionality, and cleanup jobs that remove irrelevant data.

Batch job Naming conventions

To make the batch jobs simpler to understand, a simple naming structure has been created. The first character is just "A", to ensure that the batch jobs sort nicely by name. Next comes a 3-digit number, and last a description that explains the batch job.

ID         Description
A001-A099  System administration batch jobs
A100-A199  Data management batch jobs
A200-A299  General ledger batch jobs
A300-A399  Procurement and sourcing batch jobs
A400-A499  Sales and marketing batch jobs
A500-A599  Retail batch jobs
A600-A699  Inventory management batch jobs
A700-A799  Warehouse management batch jobs

Each of these ranges is then set up as a batch group, so you can better control which AOS servers execute which types of batch jobs:


In this blog post, more than 87 batch jobs are specified that keep the Dynamics 365 system updated and as automated as possible.

Job description

A001 Notification clean-up
A002 Batch job history clean-up
A003 Batch job history clean-up (custom)
A004 Daily Diagnostics rule validation
A005 Weekly Diagnostics rule validation
A006 Monthly Diagnostics rule validation
A007 Named user license count reports processing
A008 Database log cleanup
A009 Delete the inactivated addresses
A010 Scan for orphaned document references
A011 Report data clean up
A012 Cryptography crawler system job that needs to regularly run at off hours
A013 Data cache refresh batch
A014 Updates system notification states
A015 Deletes non-active and orphaned system notifications
A016 Database compression system job that needs to regularly run at off hours
A017 Database index rebuild system job that needs to regularly run at off hours
A018 Deletes expired email history
A019 Process automation polling system job
A020 Scan for document files that have been scheduled for physical deletion
A021 System job to clean up expired batch heartbeat records
A022 System job to seed batch group associations to batch jobs
A023 System job to clean up unrecovered user session states
A024 Change based alerts
A025 Due date alerts
A026 Email distributor batch
A027 Email attachment distributor
A100 Staging cleanup (manual)
A101 Job history cleanup
A102 BYOD Data management export
A103 Refresh data entity
A200 Clean up ledger journals
A201 Import currency exchange rates
A202 Purchase budget to ledger
A203 Sales budget to ledger
A204 Update purchase and sales budget
A205 Batch transfer of subledger journal entries
A206 Source document line processing
A207 Source document line processing queue cleanup
A208 Ledger journal monitor
A300 Purchase update history cleanup
A301 Delete request for quotation
A302 Draft consignment replenishment order journal cleanup
A303 Run Forecast planning
A304 Run Master planning
A305 Post product receipt
A400 Delete sales orders
A401 Delete quotations
A402 Delete return orders
A403 Sales update history cleanup
A404 Order events cleanup
A405 Order packing slip
A406 Order invoice
A407 Calculate sales totals
A500 All retail distribution jobs (9999)
A501 Upload all channel transactions (P-0001)
A502 Process Assortment
A503 Update listing status
A504 Product availability
A505 Generate related products based on customer transactions
A506 Process delivery modes
A507 Synchronize orders job
A508 Update search Product data
A509 Update search Customer data
A510 DOM batch job
A511 DOM fulfillment data deletion job
A512 Default channel database batch job
A513 Recommendation batch job
A514 Retail scheduler history data removal batch job
A515 Create customers from async mode
A516 Retail transaction consistency checker orchestrator
A517 Retail transactional statement calculate batch scheduler
A518 Retail transactional statement post batch scheduler
A519 Retail financial statement calculate batch scheduler
A520 Retail financial statement post batch scheduler
A521 Process loyalty schemes
A522 Post earned points in batches
A523 Process loyalty lines for other activities
A524 Retail time zone information job
A600 Calculation of location load
A601 Inventory journals clean-up
A602 Inventory settlements clean up
A603 Inventory dimensions cleanup
A604 Dimension inconsistency cleanup
A605 On-hand entries cleanup
A606 Warehouse management on-hand entries cleanup
A607 On-hand entries aggregation by financial dimensions
A608 Cost calculation details
A609 CDS – Post integration inventory journals
A700 Work creation history purge
A701 Containerization history purge
A702 Wave batch cleanup
A703 Cycle count plan cleanup
A704 Mobile device activity log cleanup
A705 Work user session log cleanup
A706 Wave processing history log cleanup
A707 WMS Replenishment
A708 Automatic release of sales orders

I will not go into detail on all the jobs, but I will at least refer to where you can find the menu item, or which class is used in the batch job tasks. Also take a look at the blog post by the D365 solution architecture team, which covers a subset of the batch jobs presented in this blog post.

System administration batch jobs

These are general system batch jobs that can perform cleanups and other general executions.

ID

Name, path and recurrence

Description

A001 Notification clean-up

System administration > Periodic tasks > Notification clean up

Daily

This is used to periodically delete records from the tables EventInbox and EventInboxData. If you don't use the alert functionality, the recommendation is to disable this batch job.

A002 Batch job history clean-up

System administration > Periodic tasks > Batch job history clean-up

Daily

The regular version of batch job history clean-up allows you to quickly clean all history entries older than a specified timeframe (in days). Any entry created before that cutoff will be deleted from the BatchJobHistory table, as well as from linked tables with related records (BatchHistory and BatchConstraintsHistory). This form has improved performance optimization because it doesn't have to execute any filtering.

A003 Batch job history clean-up (custom)
System administration > Periodic tasks > Batch job history clean-up (custom)

Manually

The custom batch job clean-up form should be used only when specific entries need to be deleted. This form allows you to clean up selected types of batch job history records, based on criteria such as status, job description, company, or user. Other criteria can be added using the Filter button.

A004 Daily Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation

Daily

Incorrect configuration and setup of a module can adversely affect the availability of features, system performance, and the smooth operation of business processes. The quality of business data (for example, the correctness, completeness, and cleanliness of the data) also affects system performance, and an organization’s decision-making capabilities, productivity, and so on. The Optimization advisor workspace is a tool that lets you identify issues in module configuration and business data. Optimization advisor suggests best practices for module configuration and identifies business data that is obsolete or incorrect.
A005 Weekly Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation

Weekly

Performs a weekly validation and diagnostics.
A006 Monthly Diagnostics rule validation

System administration > Periodic tasks > Diagnostics rule validation

Monthly

Performs a monthly validation and diagnostics based on the rules.
A007 Named user license count reports processing

Class : SysUserLicenseMiner

Daily

Batch job that counts the number of users that have been using the system. The data is used in the "Named user license count" report. D365 creates this execution automatically, but you have to rename it to fit this structure.
A008 Database log cleanup

System administration > Inquiries > Database > Database Log

Weekly

This job cleans up the database log and makes sure that only (let's say) 100 days of history remain. In the query criteria I set "created date time" less than "d-100", to ensure that I keep 100 days of database log (see the small X++ sketch after this table). This is general housekeeping and dusting, keeping the system nice and tidy.
A009 Delete the inactivated addresses

Organizational administration > Periodic >Delete inactivated addresses

Weekly

Deletes addresses that have been set to inactive.
A010 Scan for orphaned document references

Class : DocuRefScanOrphansTask

Daily

Batch job that is setup automatically by the system, and scans for document references where the source record is deleted.
A011 Report data clean up

Class: SrsReportRunRdpPreProcessController

Daily

Cleans up any data generated for SSRS reports.
A012 Cryptography crawler system job that needs to regularly run at off hours

Class: SysCryptographyCrawlerTask

Every 3 days

Auto-created at D365 setup. Not sure what this is, yet…
A013 Data cache refresh batch

System administration > Setup > Data cache > Data cache parameters

Every 10 minutes

The data cache framework is used to cache data sets and tiles. Enabling the data cache framework will redirect certain queries to a cache table instead of executing them against the underlying source tables.
A014 Updates system notification states

Class : SystemNotificationUpdateBatch

Every minute

Updates notifications.
A015 Deletes non-active and orphaned system notifications

Class : SystemNotificationScanDeletionsBatch

Daily

Deletes non-active and orphaned system notifications
A016 Database compression system job that needs to regularly run at off hours

Class: SysDatabaseCompressionTask

Daily

Compresses the database
A017 Database index rebuild system job that needs to regularly run at off hours

Class: SysDatabaseIndexRebuildTask

Daily

Rebuilds indexes to ensure good index performance
A018 Deletes expired email history

Class: SysEmailHistoryCleanupBatch

Daily

Deletes expired email history
A019 Process automation polling system job

Class: ProcessAutomationPollingEngine

Every minute

Using business events, the polling use case can be redesigned to be asynchronous and triggered by the business event. Data will be processed only when it is available: the business logic that makes the data available triggers the business event, which can then be used to start the data-processing job/logic. This can save thousands of batch executions from running empty cycles and wasting system resources.
A020 Scan for document files that have been scheduled for physical deletion

Class: DocuDeletedFileScanTask

Hourly

Scan for document files that have been scheduled for physical deletion
A021 System job to clean up expired batch heartbeat records

Class : SysCleanupBatchHeartbeatTable

Daily

Cleans up the internal monitoring BatchHeartbeatTable table (only after PU32), which is used for priority-based batch scheduling.
A022 System job to seed batch group associations to batch jobs

Class: SysMigrateBatchGroupsForPriorityBasedScheduling

Daily

See priority-based batch scheduling.
A023 System job to clean up unrecovered user session states

Class: SysUnrecoveredUserSessionStateCleanup

Daily

Cleans up sessions that are unrecovered.
A024 Change based alerts

System administration > Periodic tasks > Alerts > Change based alerts

Hourly (or faster)

Events that are triggered by change-based events. These events are also referred to as create/delete and update events.

See also Microsoft docs.

A025 Due date alerts

System administration > Periodic tasks > Alerts > Due date alerts

Hourly (or faster)

Events that are triggered by due dates.

See also Microsoft docs.

A026 Email distributor batch

System administration > Periodic tasks > Email processing > Email distributor batch

Sends emails. See also Microsoft Docs.
A027 Email attachment distributor

Sends emails with attachments. Used by workflow.
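Before moving on: the relative date criterion mentioned for A008 can also be expressed in code. A minimal X++ sketch, assuming SysDatabaseLog and its CreatedDateTime field as the target (adjust to the table you are cleaning), and using the standard query range expression syntax:

// Sketch: the "keep only the last 100 days" criterion from A008, set up
// programmatically on a query range instead of in the filter dialog.
static void databaseLogRetentionRange()
{
    Query                query = new Query();
    QueryBuildDataSource ds    = query.addDataSource(tableNum(SysDatabaseLog));
    QueryBuildRange      range = ds.addRange(fieldNum(SysDatabaseLog, CreatedDateTime));

    // SysQueryRangeUtil expression: resolves to "older than 100 days ago",
    // the programmatic equivalent of typing "<d-100" in the filter dialog.
    range.value('(lessThanDate(-100))');

    info(ds.toString()); // Inspect the resolved range before wiring it into the cleanup
}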

Data management batch jobs

Data management executions can generate a lot of data, so to maintain performance and avoid data growth it is relevant to clean up staging tables and job executions. Also document any of your recurring executions, to make it easy to maintain an overview of your recurring data imports and exports.

ID

Name, path and recurrence

Description

A100

[Cannot be executed in batch]

Data management workspace > “Staging cleanup” tile

Manually

The Data management framework makes use of staging tables when running data migration. Once data migration is completed, this data can be deleted using the "Staging cleanup" tile.

A101

A101 Job history cleanup

Data management workspace > Job history cleanup

Daily

The cleanup job will execute for the specified amount of time. If more history remains after the specified amount of time has elapsed, the remaining history will be cleaned up in the next recurrence of the batch job, or it can be manually scheduled again.

A102

A102 BYOD Data management export

Data management workspace >export in batch

Hourly

If you have a data management export to BYOD, it can be executed in batch. There are other options that can also be evaluated for this purpose.

A103

A103 Refresh data entity

System administration > Setup > Entity Store

Monthly

To refresh the entity store (the built-in embedded Power BI). The refresh updates the aggregated measurements, and is only relevant if there are updates or changes that affect these.

General ledger batch jobs

ID

Name, path and recurrence

Description

A200

A200 Clean up ledger journals

Periodic tasks > Clean up ledger journals

Weekly

It deletes general ledger, accounts receivable, and accounts payable journals that have been posted. When you delete a posted ledger journal, all information related to the original transaction is removed. You should delete this information only if you're sure that you won't have to reverse the ledger journal transactions.

A201

A201 Import currency exchange rates

Currencies > Import currency exchange rates

Daily

Automatically imports exchange rates from the bank.

A202

A202 Purchase budget to ledger

Inventory management > Periodic tasks > Forecast updates > Purchase budget to ledger

Monthly

Posts the purchase budget to ledger

A203

A203 Sales budget to ledger

Inventory management > Periodic tasks > Forecast updates > Sales budget to ledger

Monthly

Posts sales budget to ledger

A204

A204 Update purchase and sales budget

Inventory management > Periodic tasks > Forecast updates > Update purchase and sales budget

Monthly

Updates the purchase and sales budget.

A205

A205 Create a scheduled task that will execute the batch transfer of subledger journal entries.

General Ledger > Periodic tasks > Batch transfer for subledger journals

Daily

Batch transfer for subledger journals

A206

A206 Source document line processing

Class: SourceDocumentLineProcessingController

Every 10 minutes

Used for accounting distribution. See Microsoft docs.

A207

A207 Source document line processing queue cleanup

Class: SourceDocumentLineProcessingQueueCleanupController

Weekly

Used for cleaning up accounting distribution. See Microsoft docs.

A208

A208 Ledger journal monitor

Class: LedgerJournalTableMonitorController

Every 6 hours

Monitors if ledger journals should be blocked or opened.

Procurement and sourcing batch jobs

ID

Name, path and recurrence

Description

A300

A300 Purchase update history cleanup

Periodic tasks > Clean up > Purchase update history cleanup

Weekly

All updates of confirmations, picking lists, product receipts, and invoices generate update history transactions; this job deletes that update history.

A301

A301 Delete request for quotation

Periodic tasks > Clean up > Delete requests for quotations

Manually

It is used to delete requests for quotation (RFQs) and RFQ replies. The corresponding RFQ journals are not deleted, but remain in the system.

A302

A302 Draft consignment replenishment order journal cleanup

Periodic tasks > Clean up > Draft consignment replenishment order journal cleanup

Weekly

It is used to clean up draft consignment replenishment order journals.

A303

A303 Run Forecast planning

Master planning > Forecasting > Forecast planning

Weekly

Demand forecasting is used to predict independent demand from sales orders and dependent demand at any decoupling point for customer orders. See also Microsoft Docs on using additional Azure services to perform the calculation.

A304

A304 Run Master planning

Master planning > Master planning > Run > Master planning

Daily

Master planning is used to generate planned (purchase) orders based on the coverage settings. We expect this service to be enhanced with a more real-time-oriented planning engine. Also check out the Microsoft Docs on this (large) subject.

A305

A305 Post product receipt

Procurement and Sourcing > Purchase orders > Receiving products > Post product receipt

Automatically posts the product receipt when all lines have been registered.

Sales and marketing batch jobs

ID

Name, path and recurrence

Description

A400

A400 Delete sales orders

Periodic tasks > Clean up > Delete sales orders

Manually

It deletes selected sales orders.

A401

A401 Delete quotations

Periodic tasks > Clean up > Delete quotations

Manually

It deletes selected quotations.

A402

A402 Delete return orders

Periodic tasks > Clean up > Delete return orders

Manually

It deletes selected return orders.

A403

A403 Sales update history cleanup

Periodic tasks > Clean up > Sales update history cleanup

Weekly

It deletes old update history transactions. All updates of confirmations, picking lists, packing slips, and invoices generate update history transactions. These transactions can be viewed in the "History on update" form.

A404

A404 Order events cleanup

Periodic tasks > Clean up > Order events cleanup

Weekly

Cleanup job for order events. The next step is to remove the unneeded order event check-boxes from the "Order event setup" form.

A405

A405 Order packing slip

Sales order > Order shipping > Post packing slip

Hourly

Sets up automatic packing slip posting when the sales order is completely picked (if this is the process). This means that as soon as the WMS has picked the order, it gets packing slip updated.

A406

A406 Order invoice

Accounts receivable > Invoices > Batch invoicing > Invoice

Hourly

Sets up automatic invoice posting when the sales order is completely packing slip updated (if this is the process).

A407

A407 Calculate sales totals

Periodic tasks > Calculate sales totals

Recalculates the totals for the sales order. This is typically used when the sales order is part of a "Prospect to cash" scenario. See docs.

Retail batch jobs

ID

Name, path and recurrence

Description

A500

A500 All retail distribution jobs (9999)

Retail and Commerce > Retail and Commerce IT > Distribution schedule

Hourly

This batch job sends all distribution jobs to the retail channel database, with data like products, prices, customers, stores, registers, etc. The distribution job is a "delta" distribution, meaning that only new and changed records are sent. There is a lot more to be discussed on how to optimize the 9999 distribution job, and for really large retail installations some deep thinking is required. For smaller installations it should be OK to just use the setup that is automatically generated when initializing D365 Retail/Commerce.
A501

A501 Upload all channel transactions (P-0001)

Retail and Commerce > Retail and Commerce IT > Distribution schedule

Hourly

The P-0001 job sends the retail transactions back from the POS to D365 HQ, where the retail transactions can be posted and financially updated.
A502

A502 Process Assortment

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Process Assortment

Hourly

This job processes the assortment based on the assortment categories set on an item and, based on the assortment setup, puts the items in the relevant stores' assortments. When defining an assortment, you have the possibility in D365 to connect organization hierarchies to retail category hierarchies. Process assortment performs the granulation of this, so that D365 has a detailed list of each product present in each store. The assortment is set up under Retail and Commerce > Catalogs and assortments > Assortments, and more details are available on Microsoft Docs.
A503

A503 Update listing status

Retail and Commerce > Retail and Commerce > Products and Inventory > Update listings

Daily

The listing status is related to publishing a retail catalog to an online store. The Microsoft documentation is not the best in this area, and the closest explanation I have is that it is related to the listing status on the catalog.
A504

A504 Product availability

Retail and Commerce > Retail and Commerce > Products and Inventory > Product availability

Daily

The batch job for product availability calculates whether a product is available in the online store. Check out this blog post for further details. SiteCore eCommerce integrations can benefit from this; in essence, it populates the data needed for distribution job 1130, which maintains the related tables in the channel database.
A505

A505 Generate related products based on customer transactions

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Generate related products

Daily

This job automatically populates related products based on sales transaction purchase history. The two relationships created are the "customers who bought this item also bought" and the "frequently bought together" relation types. This data can then be used further in eCommerce scenarios. For deep details, take a look at the class RetailRelatedProductsJob.
A506

A506 Process delivery modes

Retail and Commerce > Retail and Commerce IT > Products and Inventory > Process delivery modes

Daily

This job sets up delivery modes on a new store when it is added to the organization hierarchy "retail store by department". On the modes of delivery you can assign an organizational hierarchy, and this batch job assigns the specific modes of delivery to each store. The modes of delivery are used in omnichannel scenarios where the customer can have their products sent home, etc.
A507

A507 Synchronize orders job

Retail and Commerce > Retail and Commerce IT > Synchronize orders

Hourly

If you have set up your channels to create sales orders asynchronously, this job will create the sales orders and post payments. Also take a look at the following Microsoft Docs on how sales orders and payments are synchronized from an online store.
A508

A508 Update search Product data

Sales and marketing > Setup > Search> Search criteria

Daily

Creates an indexed search of products, making it faster and easier to search for products in the call center.
A509

A509 Update search Customer data

Sales and marketing > Setup > Search> Search criteria

Daily

Creates an indexed search of customers, making it faster and easier to search for customers in the call center.
A510

A510 DOM batch job

Workspace > Distributed Order Management > Dom processor job setup

Hourly

Runs distributed order management on retail sales orders to determine which warehouse should deliver the sales order.
A511

A511 DOM fulfillment data deletion job

Workspace > Distributed Order Management > DOM fulfillment data deletion job setup

Daily

Cleans up DOM data that is no longer part of a valid calculation.
A512

A512 Default channel database batch job

Class : RetailCdxChannelDbDirectAccess

Every 3 minutes

This job's main duty is to check all download and upload sessions with status "Available", and then apply the data to the respective target DBs (HQ or channel DB). See also this blog.
A513

A513 Recommendation batch job

Class FormRunConfigurationRecommendationBatch

Weekly

See Microsoft Docs.
A514

A514 Retail scheduler history data removal batch job

Retail and Commerce > Headquarters setup > Parameters > Retail scheduler parameters

Class: RetailCdxPurgeHistory

Daily

Deletes CDX history, typically only keeping 30 days of CDX history.
A515

A515 Create customers from async mode

Retail and Commerce > Retail and Commerce IT > Customer > Create customers from async mode

Hourly

If customers should be created asynchronously (a parameter), this job will create the customers.
A516

A516 Retail transaction consistency checker orchestrator

Retail and Commerce > Retail and Commerce IT > POS posting > Validate store transactions

Hourly

Performs validation on the unposted POS transactions. See Microsoft docs.
A517

A517 Retail transactional statement calculate batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Calculate transactional statement in batch

Hourly (or faster)

Retail statement trickle feed: transactional calculate. Creates the transactional statement. See the following blog post.
A518

A518 Retail transactional statement post batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Post transactional statement in batch

Hourly (or faster)

Retail statement trickle feed: transactional post. Creates and posts sales orders. See the following blog post.
A519

A519 Retail financial statement calculate batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Calculate financial statement in batch

Daily

Retail statement trickle feed: financial statement calculate. Creates the financial statement. See the following blog post.
A520

A520 Retail financial statement post batch scheduler

Retail and Commerce > Retail and Commerce IT > POS posting > Post financial statement in batch

Daily

Retail statement trickle feed: financial post. Posts the shift declaration. See the following blog post.
A521

A521 Process loyalty schemes

Retail and Commerce > Retail and Commerce IT > Loyalty > Process loyalty schemes

Processes loyalty schemes. See Microsoft docs.
A522

A522 Post earned points in batches

Retail and Commerce > Retail and Commerce IT > Loyalty > Post earned points in batches

Loyalty points should be posted in batch. See Microsoft docs.
A523

A523 Process loyalty lines for other activities

Retail and Commerce > Retail and Commerce IT > Loyalty > Process loyalty lines for other activities

Other Loyalty points in batch. See Microsoft docs.
A524

A524 Retail time zone information job

Monthly

Generates time zone information up until 2054. Ensures that the time zone used in the store does not cause inconsistent dates.

Inventory management batch jobs

ID

Name, path and recurrence

Description

A600

A600 Calculation of location load

Inventory management > Periodic tasks > Clean up > Calculation of location load

Daily

The WMSLocationLoad table is used for tracking the weight and volume of items and pallets. The summation of load adjustments job can be run to reduce the number of records in the WMSLocationLoad table and improve performance.

A601

A601 Inventory journals clean-up

Inventory management > Periodic tasks > Clean up > Inventory journals cleanup

Weekly

It is used to delete posted inventory journals.

A602

A602 Inventory settlements clean up

Inventory management > Periodic tasks > Clean up > Inventory settlements cleanup

Manually/Yearly

It is used to group closed inventory transactions or delete canceled inventory settlements. Cleaning up closed or deleted inventory settlements can help free system resources.

Do not group or delete inventory settlements too close to the current date or fiscal year, because part of the transaction information for the settlements is lost.

Closed inventory transactions cannot be changed after they have been grouped, because the transaction information for the settlements is lost.

Canceled inventory settlements cannot be reconciled with finance transactions if canceled inventory settlements are deleted.

A603

A603 Inventory dimensions cleanup

Inventory management > Periodic tasks > Clean up > Inventory dimensions cleanup

Manually/Yearly

This is used to maintain the InventDim table by deleting unused inventory dimension combination records that are not referenced by any transaction or master data. The records are deleted regardless of whether the transaction is open or closed.

An inventory dimension combination record that is still referenced cannot be deleted, because when an InventDim record is deleted, related transactions cannot be reopened.

A604

A604 Dimension inconsistency cleanup

Inventory management > Periodic tasks > Clean up > Dimension inconsistency cleanup

Manually/Yearly

This is used to resolve dimension inconsistencies on inventory transactions that have been financially updated and closed. Inconsistencies might be introduced when the multisite functionality was activated during or before the upgrade process. Use this batch job only to clean up the transactions that were closed before the multisite functionality was activated. Do not use this batch job periodically.

A605

A605 On-hand entries cleanup

Inventory management > Periodic tasks > Clean up > On-hand entries cleanup

Monthly

This is used to delete closed and unused entries for on-hand inventory that is assigned to one or more tracking dimensions. Closed transactions contain the value of zero for all quantities and cost values, and are marked as closed. Deleting these transactions can improve the performance of queries for on-hand inventory. Transactions will not be deleted for on-hand inventory that is not assigned to tracking dimensions.

A606

A606 Warehouse management on-hand entries cleanup

Inventory management > Periodic tasks > Clean up > Warehouse management on-hand entries cleanup

Weekly

Deletes records in the InventSum and WHSInventReserve tables. These tables are used to store on-hand information for items enabled for warehouse management processing (WHS items). Cleaning up these records can lead to significant improvements of the on-hand calculations.

A607

A607 On-hand entries aggregation by financial dimensions

Inventory management > Periodic tasks > Clean up > On-hand entries aggregation by financial dimensions

Weekly

Tool to aggregate InventSum rows with zero quantities.

This is basically extending the previously mentioned cleanup tool by also cleaning up records which have field Closed set to True!

The reason why this is needed is basically because in certain scenarios, you might have no more quantities in InventSum for a certain combination of inventory dimensions, but there is still a value. In some cases, these values will disappear, but current design does allow values to remain from time to time.

If you for example use Batch numbers, each batch number (and the combined site, warehouse, etc.) creates a new record in InventSum. When the batch number is sold, you will see quantity fields are set to 0. In most cases, the Financial/Physical value field is also set to 0, but in Standard cost revaluation or other scenarios, the value field may show some amount still. This is valid, and is the way Dynamics 365 for Finance and Operations handles the costs on Financial inventory level, e.g. site level.

Inventory value is determined in Dynamics 365 for Finance and Operations by records in InventSum, and in some cases by inventory transactions (InventTrans) when reporting inventory values in the past. In the above scenario, this means that when you run inventory value reports, Dynamics 365 for Finance and Operations looks (initially) at InventSum, aggregates all records to site level, and reports the value for the item per site. The data from the individual records on batch number level are never used. The tool therefore goes through all InventSum records and finds the ones where there is no more quantity (the "No open quantities" field is True). There is no reason to keep these records, so Dynamics 365 for Finance and Operations finds the record in InventSum for the same item with the same site, copies the values from the batch number level to the site level, and deletes the record. When you now run inventory value reports, Dynamics 365 for Finance and Operations still finds the same correct values. This reduces the number of InventSum records significantly in some cases, and can have a positive impact on the performance of any function that queries this table.

A608

A608 Cost calculation details

Inventory management > Periodic tasks > Clean up > Cost calculation details

Monthly

Used to clean up cost calculation details.

A609

A609 CDS – Post integration inventory journals

Inventory management > Periodic tasks > CDS integration > Post integration inventory journals

Fetches journals from CDS (Common Data Service) and posts them. This applies only if CDS is in use.

Warehouse management batch jobs

ID

Name, path and recurrence

Description

A700

A700 Work creation history purge

Warehouse management > Periodic tasks > Clean up > Work creation history purge

Weekly

This is used to delete work creation history records from the WHSWorkCreateHistory table, based on the number of days to keep the history specified in the dialog.

A701

A701 Containerization history purge

Warehouse management > Periodic tasks > Clean up > Containerization history purge

Weekly

This is used to delete containerization history from the WHSContainerizationHistory table, based on the number of days to keep the history specified in the dialog.

A702

A702 Wave batch cleanup

Warehouse management > Periodic tasks > Clean up > Wave batch cleanup

Weekly

This is used to clean up batch job history records related to Wave processing batch group.

A703

A703 Cycle count plan cleanup

Warehouse management > Periodic tasks > Clean up > Cycle count plan cleanup

Weekly

This is used to clean up batch job history records related to Cycle count plan configurations.

A704

A704 Mobile device activity log cleanup

Warehouse management > Periodic tasks > Clean up > Mobile device activity log cleanup

Weekly

This is used to delete mobile device activity log records from the WHSMobileDeviceActivityLog table, based on the number of days to keep the history specified in the dialog.

A705

A705 Work user session log cleanup

Warehouse management > Periodic tasks > Clean up > Work user session log cleanup

Weekly

This is used to delete work user session records from the WHSWorkUserSessionLog table, based on the number of hours to keep specified in the dialog.

A706

A706 Wave processing history log cleanup

Warehouse management > Periodic tasks > Clean up > Wave processing history log cleanup

Weekly

This is used to clean up history records related to Wave processing batch group.

A707

A707 WMS Replenishment

Warehouse management > Replenishment > Replenishments

Calculates location replenishment for the warehouse locations.

A708

A708 Automatic release of sales orders

Warehouse management > Automatic release of sales orders

Releases sales orders to the warehouse so that the picking can start.

Monitoring Distribution jobs

The Retail IT workspace is specifically created to monitor all distribution jobs sending data to RCSU and POS. If there are failed sessions, they will be seen here. The current download (to RCSU) and upload (from RCSU) sessions are also shown here.


Monitoring Batch jobs

The best place to monitor all current batch jobs is the System administration workspace. Here all failed, running, waiting, and withheld batch jobs are shown. This workspace also has additional system administration features.


