I have a form with a plugin that updates a record in the DB. I can see the updated data on the record in the DB, but when I load the form the updated data does not show.
There are no business rules, and there is no JS OnLoad event.
The record is inactive in the DB when the data is updated, but I don't think that should matter.
Any ideas as to what I am overlooking?
You're correct that the changes should still save to an inactive record.
Under Advanced Settings > Administration > System Settings > Customization you can set "Enable logging to plug-in trace log" to "All".
Then in the plugin you can use the ITracingService to log messages, which are then visible in Advanced Settings > Plug-In Trace Log.
You could log the fields' values before and after you set them to confirm that they're getting set.
Or, for a "quick and dirty" option, store the fields' values before you set them, then after you set them, throw an InvalidPluginExecutionException containing the "before and after" values. The exception message will pop up right in the UI.
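To make that concrete, here is a rough sketch of both approaches; the attribute name "new_myfield" and the "updated value" assignment are placeholders, not taken from your code:

// Minimal sketch, assuming a plugin registered on Update with a "Target" entity.
// "new_myfield" is a placeholder attribute name.
using System;
using Microsoft.Xrm.Sdk;

public class TraceAndThrowPlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        var target = (Entity)context.InputParameters["Target"];

        object before = target.Contains("new_myfield") ? target["new_myfield"] : null;
        target["new_myfield"] = "updated value";   // stand-in for whatever your plugin sets
        object after = target["new_myfield"];

        // Visible in Advanced Settings > Plug-In Trace Log when trace logging is set to "All"
        tracing.Trace("new_myfield before: {0}, after: {1}", before, after);

        // "Quick and dirty": the exception message pops up right in the UI
        throw new InvalidPluginExecutionException(
            string.Format("new_myfield before: {0}, after: {1}", before, after));
    }
}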
We'd be better able to analyze the issue if you post your code.
On a related note, when writing plugins I often separate the logic out into a Visual Studio Shared Project. I reference that project from a console app and the plugin assembly. The console app enables me to test and debug locally with full VS debugging capabilities before publishing the plugin. Of course there are certain things from the context that can be tricky to mock in the Console app, so your mileage may vary depending on the application.
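For example, a minimal console harness might look something like the sketch below; UpdateLogic and the connection string are placeholders for whatever actually lives in your shared project and environment:

// Hypothetical harness: the plugin assembly and this console app both reference
// the shared project that contains UpdateLogic (a placeholder name).
using System;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;
using Microsoft.Xrm.Tooling.Connector;

static class UpdateLogic
{
    // Placeholder for the shared-project method that the plugin also calls.
    public static void Run(IOrganizationService service, Guid recordId)
    {
        var record = service.Retrieve("account", recordId, new ColumnSet("name"));
        // ... real update logic would go here ...
    }
}

class Program
{
    static void Main()
    {
        // CrmServiceClient implements IOrganizationService, so the shared logic
        // can't tell whether it is running in a plugin or in this console app.
        var client = new CrmServiceClient("<your connection string>");
        IOrganizationService service = client;

        // Run the shared logic against a known record and debug it in Visual Studio.
        UpdateLogic.Run(service, new Guid("00000000-0000-0000-0000-000000000000"));
    }
}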
There are also testing frameworks like FakeXrmEasy, but I have yet to try any of those.
I have recently started to use Google Analytics 4 for event tracking, using Google Tag Manager to send events to Analytics. I have set up the custom variables in Tag Manager as well as in GA4.
Everything seems to work very well in Tag Manager's debug mode, and if I look up the events in the real-time view I also get the requested event along with all the parameters I have set in Tag Manager. However, once I look up the events for the last couple of days in the Engagement -> Events view, a couple of my parameters are missing, and I'm also not getting the same hit count for each parameter, even though every event has all parameters set:
As you can see, 86 events have been recorded, but the event count for the parameters varies widely.
Here's a screenshot of my Tag Manager settings:
I have tried to set up a new event with the same parameters, but (logically) I got the same result. I am under the strong impression I'm missing something obvious here. Does anyone have experience with this, or has anyone come across this same issue?
To be honest, you can just try clicking "Mark as conversion" in the Events tab, and this should save them. I think for some reason GA4 doesn't save parameters otherwise...
I can't get this to work with fields and parameters for an event, so I'm trying to create a user custom dimension instead, and also to send it as a content group and as a user property.
I can also see the user ID under app instance in the standard User Explorer report,
but I can't filter on or select it myself.
I can't create the custom parameter reports like Gerrit did,
but yes, creating them as custom dimensions/metrics first should be the way to go.
GA4 has a bug when you edit a standard report: if you add 3 cards and deselect one, it still thinks you are maxed out and you have to start over...
The test/internal user filters don't work, though you can see data in debug mode and under Events; but since I know they work, I'm now trying to just make the filters active.
I have a customized Account "Main" Form in CRM.
I added a new tab, a new section, and new fields inside it.
I published all customizations beforehand, then exported the solution and imported it into my test environment, where I also published all customizations.
The unmanaged solution contains this form and its new fields. When I review the form XML in the exported zip's customizations.xml file, it indeed has the new tab label and the new section label, and I can see the fields in the XML.
PROBLEM: But after importing and publishing into the test environment... the changes don't show up!
What I see in its place is the way that section looked BEFORE (not sure how long ago last time it was changed). It's simply not getting my latest changes.
How can an unmanaged solution import have these issues? What else is there to check/consider? I don't think Solution Layers would cause this since it's unmanaged; nothing stood out there anyway.
Please follow this checklist. Also note that with Dynamics, troubleshooting usually needs more information about your environment, since the issue could be something specific to it.
1. Usual suspects in an unmanaged solution
Since yours is an unmanaged solution, I strongly suspect it's a security/permissions issue that isn't set up or enabled in the target environment; check your import logs for a related error.
Try these solution import steps:
Import your main solution (without field security profiles).
Publish all your customizations (you can enable/import the field security profiles after this step).
Lastly, import a second solution containing your field security profiles.
2. Troubleshooting Checklist
To hunt down the issue, try these checklist steps:
Check IIS: try resetting/restarting IIS after defining and publishing your fields (on-premises deployments).
Clear your client-side cache by pressing Ctrl+F5.
Clear your server-side cache:
https://learn.microsoft.com/en-us/powerapps/maker/portals/admin/clear-server-side-cache
Are you using an entity form?
Do you have the correct entity permissions for Notes & the sub-grid?
https://learn.microsoft.com/en-us/powerapps/maker/portals/configure-notes
Did you add the metadata?
Lastly, are you using activities?
I am testing/learning webhooks: how to send and receive them. So I thought I would use GAS.
I have this simple script, and I wonder why Logger does not work. In Projects/Executions I can see that doPost was executed, but there are no logs. The email was sent and the script returned the value.
I'm using the old, legacy editor (no idea how to get the new one).
Under Resources > Cloud Platform project it says "This script has an Apps Script-managed Cloud Platform project."
When I open the project in the editor I get the message "This project is running on our new Apps Script runtime powered by Chrome V8."
Exception logging is left at the default, "exceptionLogging": "STACKDRIVER".
I tried console.log(e); but it did not work for me.
function doPost(e) {
  Logger.log("I was called")
  if (typeof e !== 'undefined') {
    Logger.log(e.parameter);
    Logger.log("I was called 2")
    MailApp.sendEmail({
      to: "radek#gmail.com",
      subject: "Call Sucessful",
      htmlBody: "I am your <br>" +
        JSON.stringify(e) + "<br><br>" +
        JSON.stringify(e.parameters)
    });
  }
  return ContentService.createTextOutput(JSON.stringify(e))
}
Question 1: Can I make Logger work?
Question 2: I would like to see the received data in the debugger; is that possible?
Question 3: Is there any way GAS can push the data it received to my web browser? Of course the browser is NOT the one sending the data.
Question 4: Not related to the topic, but ... would you know what I need to do in order to be able to use the new editor?
If you want your own custom log information to go to Stackdriver, then you need to create a Google Cloud Platform project, and associate that GCP project with your Apps Script project.
First create a new GCP Project:
Go to your GCP dashboard:
https://console.cloud.google.com/home/dashboard?authuser=0
In the blue bar at the top, there is a drop down menu for project names. Click that, and a dialog should appear with a button to create a new project.
Create a new Google Cloud Platform project.
Copy the Project Number.
Go back to the Apps Script editor.
From the legacy Apps Script editor, click the "Resources" menu.
Click Cloud Platform project.
Paste in the Project Number and click the button.
Now, any console.log() statements you have will send the logs to Stackdriver.
And Stackdriver can be viewed in your browser.
Note: Some people set up their own logging system to log information from server side Apps Script code to a Google Sheet. There are some open source repos that are available.
The new code editor does have a "built-in" way to log server-side information to a log pane in the code editor window. But, of course, this assumes that you are running code from the code editor. This new feature avoids the need to change browser tabs to see your logging output. I don't know of any way to log server-side info to your browser console. You could save log info into an object, send it back to the client after the server code completes, and then log everything in the console.
The way that it might be possible to get logging information depends on how the code was originally triggered.
From code editor
From a user using your app
From an HTTP request to your Web App
Logging in Apps Script works differently depending upon:
The runtime version being used - V8 or DEPRECATED_ES5. This is set in the appsscript.json file, through the "Run" menu in the legacy editor, or in "Settings" in the new IDE. New Apps Script projects default to V8, so chances are your project is using V8.
Whether your Apps Script project is associated with a Google Cloud Platform (GCP) default or standard project.
Whether exception logging is set to "exceptionLogging": "STACKDRIVER" in the appsscript.json file. The default is to include it, so it is probably already correct unless you deleted it.
Whether you are using Logger.log or console.log.
Whether you are using console.log in a server-side ".gs" file or a client-side ".html" file. console.log can be used in both server-side and client-side code, but the log output goes to different places. You can't see logs in the browser dev tools from a console.log statement in your server code. If you use console.log in server-side .gs files and the Apps Script project is not associated with a standard GCP project, the log only gets sent to your "Executions". I believe the only way to get your logs sent to Stackdriver is by using a standard GCP project. The problem with that is that you only have so many GCP projects you can use without requesting a quota increase.
Plus there may be issues (bugs) depending upon how you have logging set up and other factors.
For example:
https://issuetracker.google.com/issues/134374008
As lots of people have pointed out, you can use console.log for this.
However, I also work with webhooks from time to time, and I find it much more comfortable to debug directly into a Google spreadsheet, using code like this:
function doPost(e) {
  log('# doPost', JSON.stringify(e));
  try {
    // Some webhook-processing logic here
    if (e.parameter.action == 'test-error') {
      item.something = nothing; // deliberately throws (undefined variable) to exercise the catch block
    }
    if (e.parameter.action == 'test-log') {
      log('# custom log', 'Some data');
    }
  } catch (error) {
    log('# error', JSON.stringify([error, error.stack]));
  }
}

function log(event, message) {
  // Appends a timestamped row to the "Log" sheet of the bound spreadsheet
  SpreadsheetApp.getActive().getSheetByName('Log').appendRow([new Date(), event, message]);
}
Example spreadsheet:
https://docs.google.com/spreadsheets/d/144i9pxDIB_C8ZRZ-6vd9DiPIw8Rt85hTtVoqk7dtHQg/edit?usp=sharing
You can trigger logging with something like this:
curl -X POST https://script.google.com/macros/s/AKfycby3EoaQ8DOt8H_Mc4vjH6JZkhsaAwYHk_sa9HE5Be3qVo0Ym0b2/exec?action=test-error
or
curl -X POST https://script.google.com/macros/s/AKfycby3EoaQ8DOt8H_Mc4vjH6JZkhsaAwYHk_sa9HE5Be3qVo0Ym0b2/exec?action=test-log
You can use the same log function for custom logging of intermediate variables during webhook resolution.
The reason I prefer this over standard Stackdriver logging is that Google Sheets are more explicit and easier to manage.
You can use console.log() to see things within "My Executions".
I have a simple plugin for a custom entity that is set to trigger on Update of my custom entity. It is registered in the Post Operation stage. I have noticed some strange behaviour when I make changes to the Owner field of the record in addition to other standard fields (e.g. text boxes, dates etc).
The plugin fires the first time, and the only attributes that come across in the image are the regular fields. The owner field does not come across.
The plugin then fires again, but the Depth property of the context is still only one (i.e. the plugin is not getting triggered by changes made in the plugin code). In this run of the plugin, the only attribute that comes across is the Owner field.
My theory is that because the owner fields are 'special', CRM is making two different requests - one to change the regular fields, and then another to change the owner via an AssignRequest. However, I cannot find any 'official' documentation for this behaviour.
Can someone explain why this is happening?
I am running Dynamics CRM 2013 UR2.
The Update event fires during the Assign event. So if an assignment takes place your plug-in will execute. The same is true for SetState - if you activate/deactivate a record an Update event takes place. These items are not documented in the SDK.
A good practice is to use attribute filtering on your Update plugin so it only fires for the fields it is concerned about - this will, assuming it isn't looking at the owner-related fields, avoid it firing twice. If you have logic specific to record ownership, you would put it in a plugin that is registered on the Assign event.
I was not able to find official documentation about this, but I think the Assign message is what you are looking for (if the entity is user-owned; see http://msdn.microsoft.com/en-us/library/gg328576.aspx). I would strongly recommend that you specify filtering attributes if you are registering a plugin on the Update message. You could also debug your plugin and inspect the MessageName property of the plugin context to see what message gets triggered. I hope this helps.
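The registration-time filtering attributes mentioned above are configured in the Plugin Registration Tool rather than in code, but as a rough, hypothetical sketch of inspecting what each execution actually receives (and guarding your logic on the fields you care about), something like this could help; "new_somefield" is a placeholder attribute name:

// Hypothetical sketch for an Update-registered plugin.
using System;
using Microsoft.Xrm.Sdk;

public class MyEntityUpdatePlugin : IPlugin
{
    public void Execute(IServiceProvider serviceProvider)
    {
        var context = (IPluginExecutionContext)serviceProvider.GetService(typeof(IPluginExecutionContext));
        var tracing = (ITracingService)serviceProvider.GetService(typeof(ITracingService));

        var target = (Entity)context.InputParameters["Target"];

        // See which message, depth, and attributes triggered this particular execution
        tracing.Trace("Message: {0}, Depth: {1}", context.MessageName, context.Depth);
        tracing.Trace("Target attributes: {0}", string.Join(", ", target.Attributes.Keys));

        // Only run the main logic when a field we actually care about is present,
        // which skips the ownership-only invocation.
        if (!target.Contains("new_somefield"))
        {
            return;
        }

        // ... main plugin logic here ...
    }
}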
Hey guys,
I have a strange behaviour and am wondering why this happens:
My managed bean holds three values (selected plant, selected year and selected month). When the page is opened, a @PostConstruct method is called and initializes the plant data according to the selected/predefined plant, year and month. When one of these three options changes, the data should be updated and displayed via an AJAX request. To switch to edit mode I click a button that flips a boolean from true to false, indicating whether the page should be displayed in view or edit mode.
Now here is my problem:
My local Weblogic Server (IntegratedWeblogicServer - standard configuration) works as expected. I open the page, see my current data, switch to edit mode, edit & save it. That's all. Works like a charm.
The production WebLogic server (configured by a colleague of mine) seems to do some kind of caching, I think. I open the page, see my current data, change the year value to last year and see the updated values. When I click the "edit" button, the old values are displayed instead of the updated values. This only happens as long as I do not switch the plant. My current workaround looks like this: open the page, switch the plant and then switch the year. After switching the plant everything works as expected. I can't figure out why the production machine behaves differently from the local machine. Each of the update methods setPlant(), setYear() and setMonth() calls refreshValues(), and they are identical as far as the JSF definitions are concerned. So I don't know if it's a caching problem or maybe a WebLogic configuration problem.
Let me know if you need more information or certain code snippets. I excluded them as it is a lot of code.
Kind regards,
Stefi
Enable HTTP header debugging in the browser and compare the responses in each environment. Also monitor the access.log on each domain's server.