I'm using MiniProfiler in my .NET MVC3 and MVC4 projects with the SQL storage provider. I would like to harvest those logs for display in other systems, but I need to store extra information with each log entry, such as the customer name, user id, report id, etc.
Is there a way to add custom fields and request-specific values to my current MiniProfiler setup so the logs include those values in the SQL tables?
Yes, you can achieve this by implementing a custom class that derives from DatabaseStorageBase.
DatabaseStorageBase is located here: DatabaseStorageBase.cs in the MiniProfiler GitHub repository.
Presumably you are using SqlServerStorage, correct? The source for it is here: SqlServerStorage.cs in the same repository.
You can copy it and change it to suit your needs, or you can extend SqlServerStorage, override Save (it's public), and implement your own private SaveTimings and SaveClientTimings methods.
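For illustration, here is a rough sketch of the second approach, assuming the v3 API in which SqlServerStorage.Save(MiniProfiler profiler) can be overridden (if it can't in your version, copy the class as suggested above). The MiniProfilerCustomFields table, its columns, and the HttpContext.Current.Items keys are hypothetical and would need to match your own schema and request pipeline:

using System;
using System.Data.SqlClient;
using System.Web;
using StackExchange.Profiling;
using StackExchange.Profiling.Storage;

public class ExtendedSqlServerStorage : SqlServerStorage
{
    private readonly string _connectionString;

    public ExtendedSqlServerStorage(string connectionString)
        : base(connectionString)
    {
        _connectionString = connectionString;
    }

    public override void Save(MiniProfiler profiler)
    {
        // Let the base class write the standard MiniProfilers/Timings rows.
        base.Save(profiler);

        // Then write the request-specific values to a side table keyed by the profiler id
        // (assumes an HttpContext is available and the values were stashed in Items earlier).
        var items = HttpContext.Current.Items;
        using (var conn = new SqlConnection(_connectionString))
        using (var cmd = new SqlCommand(
            "insert into MiniProfilerCustomFields (MiniProfilerId, CustomerName, UserId, ReportId) " +
            "values (@id, @customerName, @userId, @reportId)", conn))
        {
            cmd.Parameters.AddWithValue("@id", profiler.Id);
            cmd.Parameters.AddWithValue("@customerName", items["CustomerName"] ?? (object)DBNull.Value);
            cmd.Parameters.AddWithValue("@userId", items["UserId"] ?? (object)DBNull.Value);
            cmd.Parameters.AddWithValue("@reportId", items["ReportId"] ?? (object)DBNull.Value);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}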
I have embedded Teiid 12.3 in a Spring Boot application. I want to get at the metadata of my VDB in order to generate a diagram using graphviz-java. I assume that if I have an org.teiid.metadata.Table object, I can call getIncomingObjects() to get references to the tables that table depends on. I just can't figure out how to navigate from the EmbeddedServer to the Table objects.
I looked into using the administration API available via EmbeddedServer.getAdmin(). From there I can call getVDBs(), and from there I can navigate down to getModels(), but below that level there is only the model source via getSourceMetadataText(). I also tried subclassing EmbeddedServer to make getVDBRepository() public. I can call getVDBRepository().getModels(), but it returns the same Model objects, which only give me access to the source definition of the models, not the runtime metadata.
I tried getVDBRepository().getSystemStore() and VDBRepository.getODBCStore(), but those MetadataStores are not for the VDB I have deployed.
I haven't found any helpful examples via Google, the Teiid JIRA, the Teiid forum, or StackOverflow.
Take a look at the getSchema method on the Admin API [1]; it returns the string form of the metadata, but you can also grab the Schema object if you want the object form. If you don't want to go that way, Teiid also exposes the system catalog through a number of SYS tables, so you can issue SQL queries to retrieve the metadata of the schemas and schema items in a VDB. The first approach is for internal access; the second also works from external access.
BTW, one of our users created a dependency diagram tool that may be useful if you are trying to do something similar; see [2]. Let me know if you are interested in pushing that further.
[1] https://github.com/teiid/teiid/blob/master/runtime/src/main/java/org/teiid/runtime/EmbeddedAdminImpl.java#L544-L557
[2] https://github.com/teiid/metadata-catalog-ui
How do I run/call/kickoff an external program (custom code) whenever certain attributes or objects are added or modified in OpenDJ’s database?
Here is my real-world need (feel free to change my thought direction entirely).
Whenever a new email address gets created or changed in the OpenDJ database, I want to initiate some Java code that does email verification/validation (send the "click here" link with a token to prove the user owns the email address they just signed up with).
I know I could use OpenIDM/AM to accomplish this, but to take it a step further I also need to validate other information and custom credentials that users supply, which the OpenIDM/AM suites don't support.
Initiating custom code upon ADD or MODIFY of specific objects and attributes is what I want, and I would like to know how to accomplish it, preferably without having to scrape logs.
Please Help.
Chad
OpenDJ has a plugin interface that lets you hook Java code into Add or Modify operations. A sample of this kind of plugin is the attribute-uniqueness plugin, which verifies that certain attributes have a unique value across the directory.
The plugin interface javadoc can be found here: http://docs.forgerock.org/en/opendj/2.6.0/javadoc/org/opends/server/api/plugin/DirectoryServerPlugin.html
I'm using MiniProfiler in my ASP.NET Web API project and want to track the performance of some code that runs in a custom DelegatingHandler.
Calls to MiniProfiler.Current.Step() inside the DelegatingHandler don't show up in the results. Other calls in the same project show up fine.
Further investigation revealed that MiniProfiler.Current is retrieved from HttpContext.Current in the WebRequestProfilerProvider, and HttpContext.Current is null when called from a DelegatingHandler.
Is there a better way to retrieve the MiniProfiler.Current so that it works inside the handler?
MiniProfiler timings are stored in HttpContext.Current by default (as you discovered). Thus, if you are calling MiniProfiler from a place where HttpContext.Current is null, the results cannot be saved. The solution is to save (and retrieve) the results somewhere else.
MiniProfiler offers the option of changing the location where all results are stored and retrieved (using MiniProfiler.Settings.Storage). The new v3 MiniProfiler (available in beta on NuGet) adds the option of configuring a different IStorage for each request, and of using a MultiStorageProvider to designate multiple locations into which results can be stored and retrieved. You can see an example of this in the Sample.Mvc project on GitHub.
In your case, the best approach might be to set a MultiStorageProvider for your global MiniProfiler.Settings.Storage that will first save/retrieve from HttpRuntimeCacheStorage and then fall back to some other IStorage that is accessible from the DelegatingHandler. Then, in the DelegatingHandler, set MiniProfiler.Current.Storage to use only the second storage option from the MultiStorageProvider (since it is pointless to try to save to the HttpCache there). This way, profiles from the DelegatingHandler are saved into your second storage option and are still retrieved for viewing along with your other results (since MultiStorageProvider loads results from the first place it can find them: if it doesn't find the result in the HttpCache, it goes to the second option).
Note - having multiple storage options is useful in this case, but it can have a negative impact on the performance of retrieving profiles.
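A rough sketch of that arrangement, assuming the v3 types named above (MultiStorageProvider, HttpRuntimeCacheStorage, SqlServerStorage) and that the Storage property on the current profiler is settable as described; the handler, connection string, and step name are placeholders:

using System;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using StackExchange.Profiling;
using StackExchange.Profiling.Storage;

public static class ProfilerConfig
{
    // The second store must be reachable without an HttpContext; SQL Server is used here.
    public static IStorage HandlerStorage { get; private set; }

    // Call this once from Application_Start; the connection string is a placeholder.
    public static void Configure(string connectionString)
    {
        HandlerStorage = new SqlServerStorage(connectionString);
        MiniProfiler.Settings.Storage = new MultiStorageProvider(
            new HttpRuntimeCacheStorage(TimeSpan.FromHours(1)),
            HandlerStorage);
    }
}

public class ProfilingHandler : DelegatingHandler
{
    protected override async Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        var profiler = MiniProfiler.Current;
        if (profiler != null)
        {
            // Skip the HttpCache store here, since HttpContext.Current may be null.
            profiler.Storage = ProfilerConfig.HandlerStorage;
        }

        using (profiler.Step("DelegatingHandler work"))
        {
            return await base.SendAsync(request, cancellationToken);
        }
    }
}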
I wanted to get your opinions on the easiest way to track the changes users make when they perform CRUD operations. I am working on a system where the users are less interested in permissions but really want a sophisticated log of the changes each user made. I am using ASP.NET MVC 3, EF, and NLog.
Any advice is greatly appreciated :)
Steve
I use a convention-based approach. Each entity has an associated audit entity which includes all properties from the base entity plus information about the change, including whether it was successful or not. I override the SaveChanges method on the DB context. For each entity being changed, it creates an audit entity holding the new values. It attempts to save the changes, then uses a separate auditing context to save each of the audit entities with the result of the save operation. I use an injected utility in the data context to get access to the current user (via HttpContext.Current for web, via Environment.UserName for non-web) when constructing the audit entities.
I blogged about an earlier version of this for LINQ to SQL at http://farm-fresh-code.blogspot.com/2009/05/auditing-inserts-and-updates-using-linq.html. You should be able to get the basic idea from that.
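For illustration, a minimal sketch of that convention for a single entity using EF's DbContext API; the Customer/CustomerAudit types, the AuditContext, and the ICurrentUserProvider abstraction are hypothetical stand-ins for the pieces described above:

using System;
using System.Data;
using System.Data.Entity;
using System.Linq;

public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// Audit entity: the base entity's properties plus change metadata.
public class CustomerAudit
{
    public int Id { get; set; }
    public int CustomerId { get; set; }
    public string Name { get; set; }
    public string ChangeType { get; set; }   // Added / Modified / Deleted
    public string ChangedBy { get; set; }
    public DateTime ChangedAt { get; set; }
    public bool Succeeded { get; set; }
}

public interface ICurrentUserProvider
{
    // HttpContext.Current user for web, Environment.UserName otherwise.
    string CurrentUserName { get; }
}

public class AuditContext : DbContext
{
    public DbSet<CustomerAudit> CustomerAudits { get; set; }
}

public class AppContext : DbContext
{
    private readonly ICurrentUserProvider _userProvider;

    public AppContext(ICurrentUserProvider userProvider)
    {
        _userProvider = userProvider;
    }

    public DbSet<Customer> Customers { get; set; }

    public override int SaveChanges()
    {
        // Snapshot the pending changes and their new values before saving.
        var audits = ChangeTracker.Entries<Customer>()
            .Where(e => e.State == EntityState.Added
                     || e.State == EntityState.Modified
                     || e.State == EntityState.Deleted)
            .Select(e => new CustomerAudit
            {
                CustomerId = e.Entity.Id,
                Name = e.Entity.Name,
                ChangeType = e.State.ToString(),
                ChangedBy = _userProvider.CurrentUserName,
                ChangedAt = DateTime.UtcNow
            })
            .ToList();

        var succeeded = false;
        try
        {
            var result = base.SaveChanges();
            succeeded = true;
            return result;
        }
        finally
        {
            // A separate context writes the audit rows, so they are persisted
            // whether or not the main save succeeded.
            using (var auditContext = new AuditContext())
            {
                foreach (var audit in audits)
                {
                    audit.Succeeded = succeeded;
                    auditContext.CustomerAudits.Add(audit);
                }
                auditContext.SaveChanges();
            }
        }
    }
}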
I would like to have a workflow create a task, then email the assigned user that they have a new task, including a link to the newly created task in the body of the email. I have client-side code that correctly builds the edit URL using the entity's GUID and stores it in a custom attribute. However, when the task is created from within a workflow, the client script isn't run.
So I think a plug-in should work, but I can't figure out how to determine the URL of the CRM installation. I'm authoring this in a test environment and definitely don't want to have to change things when I move to production. I'm sure I could use a config file, but it seems like the plug-in should be able to figure this out at runtime.
Anyone have any ideas how to access the URL of the crm service from within a plug-in? Any other ideas?
There is no simple way to do this, but there is a way.
MSCRM_CONFIG is the deployment database that holds physical deployment properties, such as the URL from which users access the CRM deployment. The URL you probably want is the one stored under "ADWebApplicationRootDomain" in the MSCRM_CONFIG.dbo.DeploymentProperties table. You may need additional permissions to access this database.
Note that this doesn't work in an Internet-Facing Deployment.
Another way is to query the discovery service to retrieve the same information (in the case that you are on the Online edition of MSCRM 4).
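For example, a hedged sketch of reading that value with plain ADO.NET; the property-bag column names (ColumnName, NVarCharColumn) are an assumption about the DeploymentProperties layout and should be verified against your MSCRM_CONFIG database, and the connection string is a placeholder:

using System.Data.SqlClient;

public static class DeploymentUrlReader
{
    // Returns the web application root URL stored in the deployment database,
    // or null if the property is not found.
    public static string GetWebApplicationRootDomain(string configDbConnectionString)
    {
        using (var conn = new SqlConnection(configDbConnectionString))
        using (var cmd = new SqlCommand(
            "SELECT NVarCharColumn FROM dbo.DeploymentProperties " +
            "WHERE ColumnName = 'ADWebApplicationRootDomain'", conn))
        {
            conn.Open();
            return cmd.ExecuteScalar() as string;
        }
    }
}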
What do you mean by "change things"?
If you create a custom workflow assembly, you can give it a server url input. Once you register it with CRM, you can simply type in the server url when you configure the workflow. You'll have to update the url for any workflows that use the custom workflow assembly once you move to production, but you'll only have to do that once.
My apologies if this is what you meant you wanted to avoid.
Edit: Sounds like you may be able to use the CustomConfiguration attribute when you register the plugin. Here's some more info.
http://blogs.msdn.com/crm/archive/2008/10/24/storing-configuration-data-for-microsoft-dynamics-crm-plug-ins.aspx
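For illustration, a minimal sketch of a CRM 4.0-style plug-in that takes the server URL from the configuration strings supplied at step registration, so only the registration changes between test and production; the class name, URL format, and configuration contents are placeholders:

using Microsoft.Crm.Sdk;

public class TaskLinkPlugin : IPlugin
{
    private readonly string _serverUrl;

    // e.g. register the step with "https://crm.example.com" as the unsecure configuration
    public TaskLinkPlugin(string unsecureConfig, string secureConfig)
    {
        _serverUrl = unsecureConfig;
    }

    public void Execute(IPluginExecutionContext context)
    {
        // _serverUrl is now available for building the record link, e.g.
        // _serverUrl + "/main.aspx?etn=task&pagetype=entityrecord&id=" + taskId
    }
}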
// Reads the server URL from the registry on the CRM server itself (on-premise only;
// requires using Microsoft.Win32 and permission to read the registry).
string url = ((string)Registry.LocalMachine
    .OpenSubKey("Software\\Microsoft\\MSCRM")
    .GetValue("ServerUrl"))
    .Replace("MSCRMServices", "");