September 03, 2024

Installing the Planning Optimization Add-in in D365FO

Hi Folks, 

In this post, let's understand how to enable Planning Optimization in a D365FO sandbox.

What is Master planning?

Master planning fundamentally enables companies to forecast and align the future demand for raw materials and capacity with their objectives. It evaluates:

Current Availability: What raw materials and capacities are presently on hand?
Required Resources: What raw materials and capacities are needed to finalize production? 

This includes items that must be manufactured, purchased, transferred, or reserved as safety stock.
Using this data, master planning calculates the necessary requirements and generates planned orders.
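
If you want a quick way to see the output of a planning run from code, here is a minimal X++ sketch. It assumes the standard ReqPO table (where planned orders are stored) and simply counts the planned orders in the system:

static void CountPlannedOrders(Args _args)
{
    // Minimal sketch: count the planned orders that master planning has generated.
    // Assumes the standard ReqPO table, where planned orders are stored.
    ReqPO reqPO;

    select count(RecId) from reqPO;
    info(strFmt("Planned orders currently in the system: %1", reqPO.RecId));
}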

For more details, check the Master planning home page.

Prerequisite: Ensure the following prerequisites are met before installing the Planning Optimization Add-in:
  • Only works on an LCS-enabled high-availability environment (tier 2 or higher). OneBox environments are not supported.
  • Ensure your system is set up for Power Platform integration.
  • Your Microsoft Entra account must have a Supply Chain Management license. No extra license is needed for Planning Optimization itself.
  • Sign in to your Power Platform environment with an account that has administrator privileges and Read-Write access.
  • All steps should be performed by the same admin user.
Installing the Add-in

1. Confirm you are on the right environment and verify its current status; you can do this as shown in the image below.



2. Log in to LCS with admin access and navigate to the respective sandbox environment. Under Power Platform Integration, click Setup. This step installs the Power Platform environment (not the actual add-in yet).




On the next screen you will see two templates available. I would recommend the option with dual-write, so you keep the option to use more of its features later, since you only do this setup once. Here is a quick difference between the two:

'Dynamics 365 standard' means this Finance and Operations environment will be connected to a Power Platform environment with Dataverse. The platform solutions for dual-write and virtual entities will also be installed but not enabled. Data will not be written via dual-write to the Power Platform environment by default, but this can be configured as part of a later step.



You will be asked for confirmation and verification, as this step cannot be reversed, so make sure you are on the right environment with the right requirements.

This step may take an hour or more to finish. Once it is done, you will see an Environment URI listed there, which means the Power Platform environment is configured for this environment.

3. [This is an optional step, as we are not using dual-write for this add-in.] The next step is to enable the dual-write connection to start syncing with Dataverse. (And yes, you are not alone in this name-changing game; we are all in the same boat, dealing with new names almost every year.)

4. Now that we are connected to the Power Platform environment, you are ready to install add-ins.


Click 'Install a new add-in'; you will see a list of all available solutions/add-ins that can be installed on this environment. Scroll to the bottom of the list, find 'Planning Optimization', and select it.


On the next screen, click the Install button to start the installation. It may take just a few minutes to finish. Once done, you can see it under the 'Power Platform Integration' tab on the LCS environment page.


5. Now let's validate. Log in to the FO environment and navigate to 'Master planning > Setup > Planning Optimization parameters'. You should find Connection status = Connected.




Common Issues and tips:
1. Missing user permissions: 
If you get an error message regarding missing user permissions while installing the Planning Optimization add-in ('Validation failed. You must be an environment administrator in Microsoft Dataverse to perform this action. If the issue persists please contact Microsoft support'), follow these steps:

Go to the Power Platform admin center.
Open the environment where you want to install the add-in.
Go to Settings > Users and select your user account from the list to see its details.
From your user details page, select the Client Access License (CAL) information link.
On the Client Access License (CAL) information page, make sure that Access Mode is set to Read-Write.

2. Enable the configuration key
Put your system into maintenance mode from the LCS portal.
Go to System administration > Setup > License configuration.
On the Configuration keys tab, select the check box for Planning Optimization.
Turn off maintenance mode.

3. Connection Status is not showing Connected
If the Connection status is Connected, then you're ready to enable Planning Optimization. Use the 'Use Planning Optimization' option to choose which planning engine is used for master planning. Below are the other statuses you may see and what each one means:

Read more:

-Harry. Follow us on Facebook to keep in rhythm with us: https://fb.com/theaxapta

August 21, 2024

How to connect a cloud-hosted environment with Dataverse / Power Platform

Hello Everyone, 

By now, I hope you have started using the Power Platform admin center (PPAC) in your D365FO projects. A lot of new features can be enabled via Dataverse on your tier-2 and higher environments. But what about our beloved dev box(es)?

Well, there are architectural differences between cloud-hosted environments (tier-1, one-box) and sandbox/production environments (tier-2 onwards, multi-box). Because of this difference, Power Platform integration cannot be configured after creating the developer environment, unlike sandbox/production, where you can enable it from LCS even after deployment.

BUT the question remains the same: how do I access Power Platform / Dataverse on my dev box? The simple and only answer (at the moment) is that you need to redeploy the environment and enable the Power Platform environment during that process.

Here are the detailed steps:

1. Navigate to the LCS portal and select your project. 
2. Go to Cloud-hosted environments. 
3. Add a new environment and give it a meaningful name.

Environment deployment



4. Click Advanced settings, navigate to Power Platform Integration, and select values as shown in the image below.

Environment advanced settings

5. Once all the details are filled in, click Next and the system will start the deployment (it may stay in Queued status for some time, but that's fine).

6. On successful deployment, you will see the option 'Setup Dual-write Application' on the environment page in LCS.

LCS page

7. When you click 'Setup Dual-write Application', the system will take you through a wizard where you can review and proceed.

This will initiate 'Configuring Dual-write application' and may take about 30 minutes.


8. Once this step is finished, you will get an option to 'Enable Dual-write Connection'. You may ask what the previous step was for: that step configured the dual-write application, and this step enables the dual-write connection.


This step will link both environments. 



Note: This setup needs to be performed by the Lifecycle Services Environment Administrator.



-Harry. Follow us on Facebook to keep in rhythm with us: https://fb.com/theaxapta

August 02, 2024

QuickFix: Issue with CAR report generation (The source for referenced module '####' is missing from the model store. Please specify the -packagesRoot parameter to instead use binary metadata for referenced modules.)

Hi Everyone, 

We have multiple ISV models and, as per the MS documentation, I added the additional parameter to the command, but while generating the CAR report I got an error for the ISV references.

I tried removing the reference to one ISV, but the next time it started giving the error for another ISV. Here is the command I tried:

xppbp.exe -metadata='K:\AosService\PackagesLocalDirectory' -all -model='MyModel' -xmlLog=C:\CARreport\BPCheckLogcd.xml -module='MyModel' -car='c:\CARreport\CAReport.xlsx'-packagesroot=K:\AosService\PackagesLocalDirectory


Issue and Solution: 
The issue was that I had wrapped the -car parameter path in quotation marks and left no space between it and the -packagesroot parameter:

xppbp.exe -metadata='K:\AosService\PackagesLocalDirectory' -all -model='MyModel' -xmlLog=C:\CARreport\BPCheckLogcd.xml -module='MyModel' -car='c:\CARreport\CAReport.xlsx'-packagesroot=K:\AosService\PackagesLocalDirectory

The correct command should be as below:

xppbp.exe -metadata='K:\AosService\PackagesLocalDirectory' -all -model='MyModel' -xmlLog=C:\CARreport\BPCheckLogcd.xml -module='MyModel' -car=c:\CARreport\CAReport.xlsx -packagesroot=K:\AosService\PackagesLocalDirectory


Don't miss the space after .xlsx, between the -car value and the -packagesroot parameter.


-Harry. Follow us on Facebook to keep in rhythm with us: https://fb.com/theaxapta

July 19, 2024

[Solved] The container will need to be recreated with the new table metadata in order to be unpacked correctly.

I got the below error in a development instance.

Error message: 
A container for table (SalesTable) was packed with more fields than are currently defined on the table and cannot be safely unpacked. The container will need to be recreated with the new table metadata in order to be unpacked correctly.



Possible solutions:

Option 1: Reset usage data.
Navigate to User options and reset the usage data. If it's a dev instance, I would prefer to reset it for all users; otherwise, reset it just for the current user. A small X++ sketch for doing this from code is shown below.
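
The sketch below assumes the standard SysLastValue table, which is where usage data is stored:

static void ResetUsageData(Args _args)
{
    // Minimal sketch: reset usage data for the current user by deleting the
    // SysLastValue records that store it. Remove the where clause (on a dev
    // box only) to clear usage data for all users.
    SysLastValue sysLastValue;

    ttsbegin;
    delete_from sysLastValue
        where sysLastValue.userId == curUserId();
    ttscommit;

    info("Usage data cleared for the current user.");
}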

Option 2: Build your custom models and run a database sync.

-Harry. Follow us on Facebook to keep in rhythm with us: https://fb.com/theaxapta

July 16, 2024

How to Install 'Globalization Solution for Microsoft Dynamics 365 Finance' for ER reporting

Hi Folks, 

I have recently been working on ER reporting and was referring to the Microsoft documentation, Electronic reporting (ER) overview. The details in this document come with some conditions (as of today, July 11, 2024) that are not clearly mentioned: the repositories it describes are only applicable up to version 10.0.38; from that version onwards, things changed (yes, changed again!).

In this post I will explain how to set up ER reporting for your environment.

Things to note before you start:

1. Admin access to the Power Platform admin center
2. Admin access to D365FO
3. LCS and Operations resources repositories are not supported anymore; Regulatory Configuration Service (RCS) will be deprecated as well
4. The new solution will be deployed on Dataverse
5. This works only with tier-2 instances

Worth knowing:

1. If you check your ER reporting workspace (on the newer versions mentioned above), you will only get two options in the 'Configuration repository' list:
i. Dataverse
ii. Operations resources

If you try to set up the Dataverse repository and open it, you may get the below error when opening Dataverse in the ER repository:

ER reporting workspace

Request to Dataverse failed. Check that solution is installed and application user has access to Dataverse tables. Error code: d02c0bcd Timestamp: 2024-07-04 13:20:10 Correlation Id: <####> Exception thrown: System.ServiceModel.FaultException`1[Microsoft.Xrm.Sdk.OrganizationServiceFault]: RetrievePrivilegeForUser: The user with id <####> has not been assigned any roles. They need a role with the prvReadmsdyn_ElectronicReportingConfigurationsIndexFile privilege. (Fault Detail is equal to Exception details: ErrorCode: 0x80042F09 Message: RetrievePrivilegeForUser: The user with id <###> has not been assigned any roles. They need a role with the prvReadmsdyn_ElectronicReportingConfigurationsIndexFile privilege. TimeStamp: 2024-07-04T13:20:10.5918579Z -- Exception details: ErrorCode: 0x80042F09 Message: RetrievePrivilegeForUser: The user with id <###> has not been assigned any roles. They need a role with the prvReadmsdyn_ElectronicReportingConfigurationsIndexFile privilege. TimeStamp: 2024-07-04T13:20:10.5918579Z -- ).

2. If you had already set up some other repositories in a past version and then updated your environment, you will still be able to see and access those repositories, but you won't be able to create new repositories of the same type.

Now coming to the solution:

1. Navigate to Microsoft AppSource and search for 'Globalization Solution for Microsoft Dynamics 365 Finance', or click here.


Microsoft app store





2. Click 'Get it now'. It will navigate you to the https://admin.powerplatform.microsoft.com/ portal, where you need to select further details. In the select environment option you can only select a sandbox or production environment (no tier-1 option by default).

(Note: To install an app, you must have a successfully provisioned environment in the region where the application is available and the environment must have a database connection)


Dataverse install solution

3. After the installation, you will be able to see this under the Solutions tab. Check if there are updates for this solution.

Dataverse update solution

I would recommend updating it to the latest version.

Dataverse solution

Dataverse update solution

Once these steps have finished successfully, go back to your environment, select the Dataverse repository, and click Open.
D365FO ER reporting workspace

You should be able to see all the reporting configurations, choose what you need, and import them for use.

D365FO ER reporting workspace

Further references


Happy ER reporting, and may your data flow smoothly!


July 10, 2024

Thank you Everyone!!!

(I will try to be just human 👼 writing this post, no AI 🤐, because this is very special to me and close to my heart 👐)

Hello Everyone, 

I am writing this 'Thank you' note to each of you for your love and support throughout the year.

I started my Microsoft MVP journey back in 2013 with one mindset: 'Help others, help yourself'. Today marks my 11th consecutive year receiving this award. I've had the privilege of connecting with brilliant minds, sharing knowledge, and contributing to the growth of technical communities. Together, we've shaped the future of Microsoft products and services.

THANK YOU!!! To my family, friends, mentors, and communities for your love and support.

Looking forward to 'helping people, helping myself' for another year (and many more). Let's continue inspiring, learning, and making a difference in our communities.

If you want to know more about MVP Program : https://mvp.microsoft.com/

My Blog post: https://www.theaxapta.com/

YouTube channel: https://www.youtube.com/@TheAxapta

Twitter: https://x.com/d47447

LinkedIn DUG group: https://www.linkedin.com/groups/13988044/

June 17, 2024

Error 1067: The Process terminated unexpectedly.

Windows could not start the Microsoft Dynamics 365 Unified Operations: Batch Management Service service on Local Computer: Error 1067: The Process terminated unexpectedly. 


First, check if there’s an issue with your custom model build and synchronization process.
Ensure that your custom objects are correctly defined and synchronized.

Start by examining your custom model and its associated entities.
Look for any inconsistencies or errors in your code.
Pay attention to the specific line where the error occurs.

-Harry. Follow us on Facebook to keep in rhythm with us: https://fb.com/theaxapta

June 07, 2024

[Solved] Dynamics 365FO DB restore error Line 1 The permission 'KILL DATABASE CONNECTION' is not supported in this version of SQL Server.

Important Note: Microsoft frequently updates SqlPackage. Therefore, it’s crucial to download the latest version of SqlPackage each time you intend to use any of its commands. Download the latest SQLPackage.

Error Details: The error message for BacPac DB restore is as follows:

Dynamics 365FO DB restore error Line 1 The permission 'KILL DATABASE CONNECTION' is not supported in this version of SQL Server. Alternatively, use the server level 'ALTER ANY CONNECTION' permission.

Recommended solution: 

Step 1: Navigate to the folder where your BACPAC file is saved and change the file extension from .bacpac to .zip.

Step 2: Open the zip file and copy the model.xml file to a different location. Open the copied file in a text editor such as Notepad, VS Code, or Visual Studio. (Avoid editing the file directly in the zip folder or the original file).



The model file may not load in Notepad; I would prefer to open it in VS Code, Visual Studio itself, or Notepad++.


Step 3: In the copied model.xml file, find and delete the entire Element tag that contains “Grant.KillDatabaseConnection”. Save the modified file as ModelCopy1.xml.




Step 4: Copy the modified file (ModelCopy1.xml) and paste it into the SqlPackage folder. (Download the latest version of SQLPackage)


Step 5: Change the file extension of the zip file back to .bacpac (reverse of Step 1).

Step 6: Go to the downloaded SQLPackage folder and execute the following command:

SqlPackage.exe /a:import /sf:J:\MSSQL_BACKUP\PreProdDB.bacpac /tsn:localhost /tdn:AxDB_PreProd2005 /p:CommandTimeout=1200 /TargetUser:"axdbadmin" /TargetPassword:"<DbPassword>" /TargetTrustServerCertificate:True /mfp:"ModelCopy1.xml"
DB import should be successful this time. 

-Harry. Follow us on Facebook to keep in rhythm with us: https://fb.com/theaxapta

May 24, 2024

Convert timestamp (1716336000000) to ISO format (2020-08-20T23:00:00Z)

To convert a timestamp in milliseconds to a date format like 2020-08-20T23:00:00Z in X++, you can use the DateTimeUtil class to convert the timestamp to a utcdatetime value and then format it as needed. Here's an example of how you might do it:

static void ConvertTimestampToDate(Args _args)
{
    // Your timestamp in milliseconds since the Unix epoch (1970-01-01T00:00:00Z)
    int64 timestamp = 1716336000000;

    // Whole seconds since the epoch (fits in an int for dates before 2038)
    int seconds = any2Int(timestamp div 1000);

    // Start from the Unix epoch and add the seconds to get the utcdatetime
    utcdatetime epoch    = DateTimeUtil::newDateTime(1\1\1970, 0);
    utcdatetime dateTime = DateTimeUtil::addSeconds(epoch, seconds);

    // DateTimeUtil::toStr returns the value in ISO 8601 format
    str formattedDate = DateTimeUtil::toStr(dateTime);

    // Show the formatted date
    info(formattedDate);
}

This code snippet assumes that the timestamp 1716336000000 is the number of milliseconds since the Unix epoch (January 1, 1970). It divides by 1000 to get whole seconds and uses DateTimeUtil::addSeconds to add them to the Unix epoch, giving the correct utcdatetime. Then DateTimeUtil::toStr converts the utcdatetime to a string in ISO 8601 format, which matches the format shown above (for example, 2020-08-20T23:00:00Z). Please adjust the logic if your timestamp is based on a different epoch, or if you need to preserve the millisecond part.
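
As a quick sanity check of the arithmetic: 1716336000000 ms ÷ 1000 = 1,716,336,000 seconds after 1970-01-01T00:00:00Z, which is exactly 19,865 days, i.e. 2024-05-22T00:00:00Z, so that is the value the job above should print for this input.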

-Harry. Follow us on Facebook to keep in rhythm with us: https://fb.com/theaxapta

May 20, 2024

QuickFix: Error 1064 An Exception occurred in the service when handling the control request


Windows could not start the Microsoft Dynamics 365 Unified Operations: Batch Management Service service on Local Computer: Error 1064




Solution: Check the event logs and you may find the exact error. In my case, I was trying to update an ISV model and, somehow, a package reference was missing in another custom model, which was causing this error.

Also try the following, as it also helps to fix this issue:



-Harry. Follow us on Facebook to keep in rhythm with us: https://fb.com/theaxapta

May 15, 2024

Let's understand D365FO service updates

Today, we’re going to discuss some important updates from Microsoft that you should be aware of. Let’s dive in!

Service Updates

Microsoft’s service updates are continuous and touchless, providing new features and functionality. These updates eliminate the need for expensive upgrades every few years and maintain backward compatibility, so there’s no need to merge your code.

Here are some major changes to how you update your system after 10.0.37 and beginning with 10.0.39:

  • Customers can choose to pause one update at a time.
  • The number of service updates released annually is being reduced from seven to four. Customers can take up to four service updates per year and are required to take a minimum of two per year.
  • If there are multiple sandboxes, customers have to define which one is the designated UAT sandbox to be used for the production update. A sandbox auto-update occurs seven days before the production update.
  • Microsoft is releasing four service updates annually, in February, April, July, and October.
  • In LCS, there are two auto-update dates. If customers have not updated their system themselves, Microsoft will auto-update them based on the settings in LCS. Beginning with version 10.0.39, the service update auto-update window is divided into two windows separated by approximately a four-week gap.

The First Release Program

The First Release program is open to all customers. Customers who join it are the first, select group of customers to take the service update all the way to production. Microsoft manages the deployment of this service update to a UAT sandbox environment and then auto-deploys the update to production seven days later. Customers who participate in this program gain the benefit of having dedicated Microsoft engineers closely monitor the environments for any issues after updates are applied.

Regulatory Updates

A regulatory update is a new feature, or a change to an existing feature, that’s required by law, usually for a specific country or region. A regulatory update is always required by a specific law enforcement date (LED) and should be enabled by that date or earlier.

Expected Downtime During an Auto-update

The expected downtime for a successful update is approximately 15 minutes. However, Microsoft asks for three hours of downtime in case issues occur while the update is being applied.

PQUs (Proactive Quality Updates)

PQUs are cumulative builds of hotfixes that are delivered with near-zero downtime. PQUs follow a push model, where updates are applied to a Microsoft Dynamics Lifecycle Services environment in the background and have minimal impact on customers. Every PQU is deployed region by region, by following a “safe deployment process” that tracks issues that are found within each region during deployment. The safe deployment process helps identify and fix issues before the PQU is deployed to more regions. PQUs are 100-percent automated and contain important bug fixes that are ready after the service update is generally available.

That’s all for today! Stay tuned for more updates and don’t forget to follow us on Facebook to keep in rhythm with us.

Best, Harry