According to the documentation "Profile live Azure Cloud Services with Application Insights", the Application Insights instrumentation key must be provided in the WadCfg:
<WadCfg>
  <DiagnosticMonitorConfiguration>...</DiagnosticMonitorConfiguration>
  <SinksConfig>
    <Sink name="MyApplicationInsightsProfiler">
      <!-- Replace with your own Application Insights instrumentation key. -->
      <ApplicationInsightsProfiler>00000000-0000-0000-0000-000000000000</ApplicationInsightsProfiler>
    </Sink>
  </SinksConfig>
</WadCfg>
However, when using per-environment ServiceConfiguration.*.cscfg files, the Application Insights key is stored in the .cscfg files, and the <ApplicationInsightsProfiler> element doesn't appear to honor that location.
How do you link the Application Insights key from the .cscfg to the profiler sink in the WadCfg file? Or is there some other way to configure Application Insights Profiler per environment?
Ended up creating a WadCfg transform in the build pipeline, since there don't appear to be any built-in mechanisms to handle this.
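For anyone doing the same, here's a minimal sketch of such a transform step in PowerShell. The placeholder GUID matches the sample above; the file path and the APPINSIGHTS_KEY variable are assumptions about what your pipeline provides:

# Sketch only: $env:APPINSIGHTS_KEY and the .wadcfgx path are placeholders for your pipeline.
$wadcfgPath = ".\MyRole\diagnostics.wadcfgx"

# Swap the placeholder instrumentation key in the profiler sink for the environment's key.
(Get-Content $wadcfgPath -Raw) `
    -replace '00000000-0000-0000-0000-000000000000', $env:APPINSIGHTS_KEY |
    Set-Content $wadcfgPath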
Related
Is there a way to download debug snapshots for operations in Azure Application Insights? (It can be done for exceptions.) I need to use these to debug an ASP.NET Core project in Visual Studio.
It seems that the Azure portal admin needs to assign a role to a user before they can see and download snapshots from Application Insights.
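If the built-in "Application Insights Snapshot Debugger" role is what's needed here (my assumption), the assignment can be scripted with the Az PowerShell module; the IDs and names below are placeholders:

# Hypothetical subscription/resource names; replace with your own.
$scope = "/subscriptions/<subscription-id>/resourceGroups/my-rg/providers/microsoft.insights/components/my-app-insights"

# Grant the built-in Snapshot Debugger role so the user can view and download snapshots.
New-AzRoleAssignment -SignInName "user@contoso.com" `
                     -RoleDefinitionName "Application Insights Snapshot Debugger" `
                     -Scope $scope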
I currently have a desktop PBIX file that I manually publish to Power BI Web.
I have to keep different versions of the same PBIX file just to track the different data sources for each environment (Dev/QA/UAT/Prod, etc.).
I also have more than one data source per environment; i.e., in the same PBIX file I have data coming from, say, Application Insights and a REST API.
I scanned through the Power BI community forums to see how to do this but can't find relevant information. All pointers are for refreshing either the local PBIX file or using the Scheduled Refresh option in Power BI Web.
Someone even wrote code to trigger the Publish command via OLE automation, but that's not an acceptable solution.
https://community.powerbi.com
I would like to automate this process such that:
A. I can provide the data source connection strings/credentials externally, based on the environment I want to publish to.
B. The report is published to Power BI Web using a service account instead of my own.
Our current build and deployment tool set does allow use of PowerShell/Azure CLI, etc., so it would be helpful if the solution used those.
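For what it's worth, here is a rough sketch of how this could look with the MicrosoftPowerBIMgmt PowerShell module, assuming the PBIX exposes its connection info as report parameters and a service principal has been granted access to the workspace. The tenant/workspace IDs, report name, and the "ApiBaseUrl" parameter are all placeholders:

# Sketch: assumes Install-Module MicrosoftPowerBIMgmt has been run and the
# PBIX parameterizes its data sources. All IDs/names below are placeholders.
$tenantId    = "<tenant-guid>"
$workspaceId = "<workspace-guid>"
$pbixPath    = ".\MyReport.pbix"

# B. Sign in as a service principal rather than a personal account.
$secret = ConvertTo-SecureString $env:PBI_CLIENT_SECRET -AsPlainText -Force
$cred   = New-Object System.Management.Automation.PSCredential("<app-client-id>", $secret)
Connect-PowerBIServiceAccount -ServicePrincipal -Credential $cred -Tenant $tenantId

# Publish the PBIX, overwriting any existing report of the same name.
New-PowerBIReport -Path $pbixPath -Name "MyReport" -WorkspaceId $workspaceId -ConflictAction CreateOrOverwrite

# A. Point the dataset at the right environment by updating report parameters.
$dataset = Get-PowerBIDataset -WorkspaceId $workspaceId | Where-Object Name -eq "MyReport"
$body = '{ "updateDetails": [ { "name": "ApiBaseUrl", "newValue": "https://qa.example.com" } ] }'
Invoke-PowerBIRestMethod -Url "groups/$workspaceId/datasets/$($dataset.Id)/Default.UpdateParameters" -Method Post -Body $body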
Fetching data directly from SQL Azure won't need a refresh, but it's expensive.
In one of the organizations I worked for, they used views on SQL Azure to accomplish this task.
I'm using Jenkins to produce .cspkg files with msbuild. It stores build results in Azure blob storage, and I then deploy them through the management portal.
The biggest drawbacks I see are:
1. Deployments can be accidentally deleted easily.
2. There is no straightforward way to check which version the cloud service is running.
Is there a better way to manage deployments?
It's definitely not the best experience, is it?
The approach I tend to use is as follows:
Build the deployment package and add the version number (taken from AssemblyInfo.cs) to the package filename, e.g. MyCloudService-1.2.0.0.cspkg; this should be trivial using msbuild.
Push the package to Cloud Storage.
Perform the deployment of the package from Storage, with the Deployment Label '[CLOUD SERVICE NAME]-[VERSION] # [DATE & TIME]', e.g. 'MyCloudService-1.2.0.0 # 10-09-2015 16:30' (see the PowerShell sketch after this list).
Check the deployment package into a 'Packages' directory in source control.
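A minimal sketch of step 3 with the classic Azure Service Management PowerShell cmdlets; the service, storage, and file names below are placeholders, and it assumes the package was already pushed to blob storage:

# Sketch: placeholder names throughout; assumes the classic Azure module and
# that credentials/publish settings are already configured.
$serviceName = "MyCloudService"
$version     = "1.2.0.0"
$packageUrl  = "https://mystorageaccount.blob.core.windows.net/builds/MyCloudService-$version.cspkg"
$label       = "$serviceName-$version # $(Get-Date -Format 'dd-MM-yyyy HH:mm')"

# Deploy the versioned package to the staging slot, with a label recording what was shipped.
New-AzureDeployment -ServiceName $serviceName `
                    -Slot "Staging" `
                    -Package $packageUrl `
                    -Configuration ".\ServiceConfiguration.Cloud.cscfg" `
                    -Label $label

Get-AzureDeployment -ServiceName $serviceName -Slot Production returns the same label, so scripts can check the deployed version as well.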
If you need to identify the version of the package deployed to the cloud service, you can see the Deployment Label in both the 'old' portal (manage.windowsazure.com) and the 'new' portal (portal.azure.com).
I made a sample application using Windows Azure dedicated caching (preview).
The sample application runs perfectly in the emulated environment but does not deploy properly to the production environment: the project and the cache instances always show "waiting for role to startup". I can't tell whether it is a configuration issue or something else.
Which deployment mode are you using, colocated or dedicated?
If it is colocated, can you tell me what Cache Size (%) value you have set? And which VM size are you deploying?
Also, please check whether you have provided a storage account connection string in the Caching tab, under the "storage account credentials to use for maintaining the cache cluster's runtime state" section.
In the Cache Worker Role (the role where you have the caching service enabled), go to the Caching tab and you will see that section.
For the emulated environment this value is "UseDevelopmentStorage=true"; to deploy to the cloud you need to replace it with a valid cloud storage account. Please follow the detailed instructions for proper configuration at http://msdn.microsoft.com/en-us/library/windowsazure/jj131263.aspx
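If it helps, in ServiceConfiguration.Cloud.cscfg that setting ends up looking roughly like this (the setting name comes from the caching plugin; the account name and key are placeholders):

<Setting name="Microsoft.WindowsAzure.Plugins.Caching.ConfigStoreConnectionString"
         value="DefaultEndpointsProtocol=https;AccountName=mystorageaccount;AccountKey=..." />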
Further, if you are still facing issues, enabling Remote Desktop and checking the "Windows > Application" and/or "Microsoft > Windows > Application Server-System Services > Admin" event log channels would also help.
How can I simultaneously access my .db4o database from Visual Studio's Object Manager Enterprise (OME) db4o plugin and from my application?
I'm starting out with db4o, integrating it with an ASP.NET MVC application. I have a two-layer repository access pattern set up, using StructureMap for IoC, and I keep getting DatabaseFileLockedException errors in Visual Studio when debugging while using OME.
When you want to access a db4o database file from multiple applications at the same time, you need client-server mode. So either your application or a special "db-server only" application has to open the file as a server; then both your application and OME should be able to connect to that server.
The documentation has an example for this.
I never used OME, so I have no idea how to configure it there.
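As a rough illustration of the server side only (not OME-specific, and assuming the db4o .NET assemblies are available on disk), a standalone "db-server only" host can be as small as this; the paths, port, and credentials are placeholders:

# Sketch: requires the db4o .NET assemblies; paths, port, and credentials are placeholders.
Add-Type -Path ".\Db4objects.Db4o.dll"
Add-Type -Path ".\Db4objects.Db4o.CS.dll"

# Open the database file in server mode so multiple clients can share it.
$server = [Db4objects.Db4o.CS.Db4oClientServer]::OpenServer(".\app.db4o", 4488)
$server.GrantAccess("user", "secret")   # clients authenticate with these credentials

Read-Host "db4o server listening on port 4488; press Enter to stop"
$server.Close()

Your MVC application would then connect with Db4oClientServer.OpenClient("localhost", 4488, "user", "secret") instead of opening the file directly.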