Export AOT objects (XPO) with or without IDs? - dynamics-ax-2009

I have a situation where we have a production Dynamics AX 2009 (RTM) environment whose AOT we used for a code-update project to RU8.
So now, I have a production environment (RTM) and one that is the same but patched to RU8.
The thing is that, since then, Page definitions have changed in the RTM environment and I want to export those to the RU8 environment.
I can export an XPO of the whole tree (Web->Web Files->Page definitions) and import it into the RU8 environment, but do I export with or without IDs? Does this change anything if objects exist in both environments?
Thanks!

In almost any situation you should prefer to EXPORT WITH IDs. There is nothing you can lose by doing so.
You should prefer to IMPORT WITH IDs when you want to maintain two copies of the same environment (DEV, TEST, ...).
You have to IMPORT WITHOUT IDs when you import objects into an environment they have never been in before, neither in that environment nor in a copy of it: for example, when you import objects from another company or objects created by another developer. This should usually be done in the development environment, and from there you export-import WITH IDs to the next applications (TEST, PRODUCTION, ...).
When you import objects WITHOUT IDs, the system assigns new IDs to the objects for that application, so you won't be able to export-import WITH IDs to other copies of the application. This will lead to problems sooner or later, so it should be avoided.
There are exceptions to this rule, but to be sure of picking the right option in any situation you have to understand these best practices well.

Related

Importing Unmanaged Solution from Dev does not remove Field from Form added in QA

In all environments, there is a field MXM_RemoveMe we want gone from the Account "Information" Form (one of the Account's OOTB Managed Forms, but of course we've customized it on our own a lot by now)
In Dev, I remove MXM_RemoveMe from the Account "Information" Form.
I put that Form in an Unmanaged solution in Dev, export and import into QA. Publish all.
Problem: the "MXM_RemoveMe" field is STILL on the form in QA.
What could cause this? Do we have to manually remove fields from Forms in all environments? I don't think that's the case normally.
I've verified this behavior in a specific test after the fact: if I add a field to the Account form in QA, then export/import that form (unmanaged) from Dev WITHOUT the field, the field still stays in QA! I encourage everyone else reading this to run the same simple test and see the behavior for themselves.
How should this be handled/understood?
I think this is because the form itself is managed. On import, the system now only adds fields to the form; it no longer simply overwrites unmanaged changes. In older versions of Dynamics CRM this worked differently.
If you prefer to continue working with unmanaged solutions (I feel there are valid reasons to do so), a best practice would be to always copy managed forms first and then modify, export, and import the copy instead.
The copy would be in its entirety an unmanaged form. Up to now I have never seen issues with those forms when imported in target environments in an unmanaged state.
I might be wrong here, but this may have changed in the modern make.powerapps.com experience compared to the classic import experience. You might have the option there to overwrite your customizations (not recommended).
https://learn.microsoft.com/en-us/powerapps/maker/data-platform/update-solutions#overwrite-customizations-option
Probably the safest way is to manually go about and delete the components.
It could have to do with solution layering and using the account's OOTB managed form. Usually, I'd say it is better to use a custom form.

Organizing Dynamics CRM customizations and update test environment

We are currently reorganizing our CRM customizations. Until now we had one main solution containing all the customizations, and now we would like to split it up by technical area.
So now, on our development instance, we have 4 unmanaged solutions that we would like to publish to the test instance, which still has the old managed solution.
We plan to do the following:
-> Export the 4 solutions to managed
-> Import them to test instance
-> Uninstall the old solution from test instance
I have a doubt concerning that procedure. Will it break something?
At some point we'll have the same customizations from different solutions. What do you think ?
I tested your steps on a trial environment with a couple of solutions. Although I initially believed (as Arun answered) that uninstalling a managed solution would delete all its objects regardless of whether other solutions use them, when I actually tested it, it did not delete them. Data was also kept.
So the steps:
-> Export the 4 solutions to managed
-> Import them to test instance
-> Uninstall the old solution from test instance
Might work with no issues.
I would recommend that you make sure to checklist all elements so nothing is left behind.
If you have an available instance I would also say that you first restore a backup and test that everything goes as planned, but from my test it worked out.
I'm also curious about this exercise. If this is just another sandbox where only the QA team is disrupted (with no risk to the Prod instance), I would go with the listed steps and see if it goes through. We can wipe this test org with a restore from Prod at any time later if it doesn't go all the way through.
Otherwise, spin up a new sandbox copy that is an exact replica of Test for a dry run.
At some point we'll have the same customizations from different solutions.
True, but uninstalling the existing managed solution will remove its components even though they are part of another managed solution, if I'm not wrong.
This is a common approach. We have also split our customizations into several solutions (e.g. one for plugins, security roles, web resources...).
You can split your customization work into as many solutions as you like, but don't overdo it.

Dynamics Importing Option Set Values that were Deleted

I added two values in an option set field on the production system (on premise Dynamics solution). I realized the right way would be to first introduce the two values into the option set on Dev, then export the solution as managed, and import it into production.
Now when I try to import the managed solution, I'm getting an error:
An error has occurred. {1}{0}
I believe it's because I had created those values in Dynamics before, and Dynamics only does soft deletes.
I'm wondering should I go to the StringMapBase table and force delete those option set values, in order for the import to work.
You added those options as unmanaged customizations, so you should be able to delete them in production using the regular customization tooling.
Playing with managed solutions is dangerous; that's why I always work unmanaged. I would suggest you recreate the component manually with the same name, or copy your production database back to dev.

Dynamics CRM 2011: Managed Solutions or deploying changes from DEV to PRD

For Dynamics CRM 2011, Microsoft suggests moving entity customizations from DEV to PRD by packaging the changes as managed (or unmanaged) solutions. Unmanaged is bad because you cannot remove the entities when you need to (deleting the solution only deletes the container, entities contained in the solution remain). In most lab examples during training, you’d customize the system, then export the customized entity as a managed solution, then import it into production. This solution-based approach is clean, makes it easier to control what’s in PRD, bundle related entities together, track dependencies, etc, so I get that.
There are times, however, when you need to dump the org on the DEV server and restore from PRD (to address a data-specific issue or for other reasons). We do that by disabling, then deleting the DEV org, then asking the DBA team to restore the CRM database from production, then we import the org back to the DEV server. But if we implement this “managed solutions”-based change migration process, won’t we lose the ability to change our entities after we dump DEV and recreate it from PRD, where these solutions are sitting in read-only mode? If we enable customizations in these managed solutions, will we be able to add new entities to the solutions or remove entities from inside the solutions without deleting the entire solution? Because I thought managed solutions are treated as a single unit of code, so it’s either delete all or delete none. Interested in learning how others have resolved this issue.
One way we have handled this is by using a separate, clean dev machine as the "configuration master". That machine is not used for any other dev or test work. The dev machines for plugins, etc. can be rebuilt from prod, but this machine remains the master for all solutions. Not an ideal solution, but it works around the missing ability to convert managed solutions back to unmanaged (perhaps via some password facility).
I would advise against using solutions in this type of dev-to-testing-to-prod situation.
If you are unsure about this, try to remove an entity in your dev environment and publish the change to your production environment.
Solutions are additive, meaning that CRM doesn't remove fields and entities that were deleted from your solution.
The only way to remove an entity is to uninstall your solution, thereby deleting the production data in all entities covered by your solution!
While in theory solutions seem perfect, they are really only useful for third-party vendors.
The goal of being able to roll back by uninstalling your solution is a pipe dream. Consider a data-model update that involves data conversion: no magic function will reverse that.
It is far simpler and more reliable to restore your backup.

Strategy for managing Oracle packages without breaking code

I'm curious to find out how people manage their packages in their applications.
For example, in our development instance, an application developer may want a change to a stored procedure. However, changing the stored procedure will break the existing Java code until the DAO layer is updated to accommodate for the changes.
My typical practice has been to put the new procedure implementation into a "DEV" package. The developer can then change his reference to this package, do his testing and then when we're ready, we can replace the procedure in the "production" package, delete it from DEV and the developer changes his reference back to the production package.
However, I'm finding it doesn't work as swimmingly as I'd like. First, if there's a bunch of Java code that depends on the DEV package, then I'm in the same situation as if I were editing the production package directly: if I break the package, I'll break a bunch of code.
Second, people get busy and we don't get around to moving the package into production as soon as we should. Then we have two versions of the stored procedure floating around, and it gets difficult to remember what has been moved into production and what hasn't.
The goal is to keep the developers working. Yes, it's a development server, but we don't want to be breaking code unexpectedly.
Can anyone suggest methodologies that have worked for them to address this issue?
If each developer has their own schema in the database and there are public synonyms for all objects in the shared schema and all the Java code uses non-qualified object names, then a local copy of the package in a particular developer's schema will have precedence over the shared version. So developer A can take the current version of the package, install it in his or her local schema, make whatever changes are desired to the package, and make whatever Java changes are necessary all within their own development environment (I'm assuming that developers have their own local app server). When both sets of changes are sufficiently stable that they can be checked in to the shared development environment, both the PL/SQL package and the Java changes can be built out to the shared development environment (the shared development app server and the real schema in the development database). The developer can then drop their local copy of the package.
That approach works reasonably well so long as the developers are checking the PL/SQL out of source control to start their changes rather than assuming that whatever local copy they have in their schema is current-- if developers keep old, local versions of code around in their local schema, they may end up with difficult to debug issues where their PL/SQL and Java versions are out of sync. You can resolve that problem by automating processes that, for example, drop packages from developer schemas if they haven't been modified in a reasonable period of time and if those packages aren't checked out by the developer in source control or by building scripts that let a developer automate the refresh of their schema as part of the build process.
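The schema-precedence setup described above can be sketched as follows; APP_OWNER, DEV_A, order_pkg and place_order are hypothetical names, and the statements would need to be run by a DBA or the respective schema owners:

```sql
-- The shared schema owns the real package; a public synonym exposes it
-- to everyone who uses an unqualified name.
CREATE PUBLIC SYNONYM order_pkg FOR app_owner.order_pkg;

-- Developer A installs a private copy in his own schema. Oracle resolves
-- unqualified names against the current schema before public synonyms,
-- so DEV_A's sessions now see this copy while everyone else still gets
-- APP_OWNER's version.
-- (connected as DEV_A)
CREATE OR REPLACE PACKAGE order_pkg IS
  PROCEDURE place_order(p_id IN NUMBER);
END order_pkg;
/

-- Once the change has been promoted to the shared schema, drop the
-- local override so DEV_A falls back to the shared version.
DROP PACKAGE order_pkg;
```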
The Java/DAO layer should only be affected if the procedure specification changes (i.e. the number, names, etc. of the parameters). Mitigation strategies for this are:
Add new parameters with DEFAULT values so that they don't need to be passed.
Don't change the order of parameters if they are called positionally [e.g. pkg.proc_a(1, 2, 3)], and don't rename them if they are called by name [e.g. pkg.proc_b(p_1 => 1, p_2 => 2)].
Use packages for procedures and functions so you can overload them
create or replace package pkg is
  procedure proc (p1 in varchar2);
  procedure proc (p1 in varchar2, p2 in number);
end pkg;
/
With overloading you can have multiple procedures with the same name in a package, differing only in the number and/or datatypes of their parameters.
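The first mitigation above (adding parameters with DEFAULT values) can be sketched like this; pkg2 and calc are hypothetical names:

```sql
CREATE OR REPLACE PACKAGE pkg2 IS
  -- p2 was added later with a DEFAULT, so existing call sites such as
  -- pkg2.calc('x') keep compiling and running unchanged, while new code
  -- can pass pkg2.calc('x', p2 => 42).
  PROCEDURE calc (p1 IN VARCHAR2,
                  p2 IN NUMBER DEFAULT 0);
END pkg2;
/
```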
11gR2 introduced Editioning (edition-based redefinition) to solve exactly this problem. It allows multiple versions of a package to coexist, and the application code chooses which 'edition' (version) of the code it wants to see: the default 'base' edition or a development version.
However, I suspect upgrading the database version isn't a practical solution.
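For completeness, a minimal sketch of how editions are used, assuming the appropriate privileges; app_owner, dev_edition and order_pkg are hypothetical names:

```sql
-- One-time setup: allow the schema to own editionable objects.
ALTER USER app_owner ENABLE EDITIONS;

-- Create a child edition for the in-progress change.
CREATE EDITION dev_edition AS CHILD OF ora$base;

-- A developer's session switches to the new edition and recompiles the
-- package there; sessions still on ORA$BASE keep seeing the old version.
ALTER SESSION SET EDITION = dev_edition;
CREATE OR REPLACE PACKAGE app_owner.order_pkg IS
  PROCEDURE place_order(p_id IN NUMBER, p_qty IN NUMBER);
END order_pkg;
/
```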
