Streaming Logs for Azure Function for local debugging VS 2019 - visual-studio

So where can one find the local logs when debugging an Azure Function locally with Visual Studio 2019?
Relevant documentation:
https://learn.microsoft.com/en-us/azure/azure-functions/streaming-logs
The Azure Functions documentation only explains this for VS Code, but for this project I need VS 2019, and the logs are not where I would expect to find them (in the Output window, under the Debug dropdown).
They show up fine in the Streaming logs for the deployed function, so they are being created, but for development speed I need them locally too.
I have been looking and googling for a day or two now, so I am reluctantly resorting to Stack Overflow. They must be there somewhere, right?

The logs should also appear in the console window when you start your project.

OK, the problem was that I was using Information-level messages, and those don't reach the console by default. There were some messages on the console, but I had no idea they were actually log messages; thanks to Justin for putting me onto this.
More googling revealed that you need to specify an additional argument on startup to see those messages, set in Visual Studio on the Properties dialog for the function app project.
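For anyone hitting the same thing, the extra argument usually cited for this is --verbose on the Functions host. This assumes your local debug session launches the Azure Functions Core Tools host (func.exe), which is what the VS tooling does under the hood; from a terminal the equivalent would be:

```shell
# Assumption: the project is run via Azure Functions Core Tools
# (func.exe). --verbose surfaces Information-level log messages
# that are otherwise hidden in the local console.
func host start --verbose
```

In Visual Studio the same flag goes into the project's Properties > Debug > Application arguments box. Raising the default log level in host.json (under logging > logLevel) is another route to a similar effect.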

Related

Is it possible to watch Application Insights locally

There is a Web API application with Application Insights plugged in. AI works like a charm when the app is published to Azure. Unfortunately it is sometimes necessary to launch the app in IIS Express for test purposes. Normally I do it from cmd like this: "c:\program files\iis express\iisexpress" /port:1337 /path:c:\tracker_pub.
Is it possible to watch AI statistics in such a case? In particular I would like to see exceptions that happen occasionally.
Please read this. You can use LINQPad to get all internal telemetry live. Also, if you have VS 2015 Update 1, there is an Application Insights hub where you can find AI telemetry (the same as in the VS output). You can read about it here. And also this.
Yes. As long as your application is receiving requests and your local machine has an internet connection so it can send events to the AI data collection endpoint, it should record activity when running in IIS Express. The recommended approach is to send this data to a different instrumentation key (after creating a new AI resource in the AI portal) so that your local test traffic is not mixed up with your production data; this is also a great way to test new custom events you are about to add. If you are not seeing any data when running in IIS Express, the best way to debug is to start your application in Visual Studio with F5; you will see every event that is about to be sent in your Debug output window.
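For the separate-key setup described above, the classic .NET AI SDK reads the key from ApplicationInsights.config, so it is a one-line change there. A sketch (the key below is a placeholder, not a real resource):

```xml
<!-- ApplicationInsights.config (classic .NET Application Insights SDK) -->
<ApplicationInsights xmlns="http://schemas.microsoft.com/ApplicationInsights/2013/Settings">
  <!-- Placeholder: create a separate AI resource for local/dev traffic
       and paste its instrumentation key here, keeping the production
       key only in the deployed configuration. -->
  <InstrumentationKey>00000000-0000-0000-0000-000000000000</InstrumentationKey>
</ApplicationInsights>
```

A config transform (or a build-specific copy of the file) can then swap the key per environment.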

Using TOAD's SQL Tracker with Visual Studio and IIS

My current project uses Visual Studio 2010 and TOAD. It is an MVC project with Oracle 11g. I can get the SQL I am producing with logging and/or debugging; however, since I already have a TOAD license, I would love to be able to use the Tools for Oracle product SQL Tracker to capture the SQL at runtime without having to use logging or breakpoints.
I assume I need to use the devenv.exe (the VS executable) process, but I cannot get it to add to the SQL Tracker tool. I get the error:
cannot create process; error=[740] (the requested operation requires elevation.)
Every Google search says it has to do with running as admin. I have tried to include the command-line argument "RUNAS Administrator" (as well as a variety of other options) with no luck.
Update:
I am now able to start monitoring the devenv.exe process. In order to do this I needed to run the application as administrator before I even started SQL Tracker. However, no output is being captured.
I think I probably need to monitor the IIS process instead (w3wp.exe). When I click to start monitoring that process, I now get the error:
Failed to create remote thread; error=8 (Not enough storage is available to process this command.)
I believe monitoring IIS is the correct approach; however, this error is now holding me up. Again, Google is not helping, and I am running everything as Administrator.
The root cause could be any of several things, but one that I suspect is caching on IIS. You can start by looking there.
Here are a few things that you can try out:
http://forums.iis.net/t/1150494.aspx?w3wp+exe+memory+usage+is+out+of+control
Running Visual Studio as admin will not necessarily run your project as admin too; imagine the security hole. I am fairly confident you can make your project run as administrator by modifying the manifest. There is an article on here at Forcing an application to Admin from config file, but no one ever confirmed whether the answer was right or not; that does not mean it's not. I have noticed on a number of occasions that a C# answer gets preference over a VB.NET one on Stack Overflow.
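The manifest change referred to above looks like this in the project's app.manifest (a standard Windows application manifest; requireAdministrator makes Windows show the UAC elevation prompt when the process starts):

```xml
<!-- app.manifest: request elevation for the application -->
<assembly xmlns="urn:schemas-microsoft-com:asm.v1" manifestVersion="1.0">
  <trustInfo xmlns="urn:schemas-microsoft-com:asm.v2">
    <security>
      <requestedPrivileges>
        <!-- requireAdministrator forces a UAC prompt at launch;
             asInvoker (the default) inherits the caller's token. -->
        <requestedExecutionLevel level="requireAdministrator" uiAccess="false" />
      </requestedPrivileges>
    </security>
  </trustInfo>
</assembly>
```

Note this elevates the application itself; it does not change the token of a worker process like w3wp.exe, which is managed by IIS.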

"No endpoint listening" when starting an Azure web role under Compute Emulator

I have two cloud solutions (.ccproj files). Each has a single distinct web role. One project runs under Compute Emulator without any problems but when I try to run another one (the first one not running) Visual Studio will package it and then display
Windows Azure Tools: There was no endpoint listening at net.pipe://localhost/dfagent/2/host that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details.
Windows Azure Tools: The Windows Azure compute emulator is not running or responding. Stopping the debugging session.
I'm using SDK version 1.4
I Googled for a while but couldn't find anything that helped. Force-starting the Compute Emulator (csrun /devfabric:start) doesn't seem to help.
How do I resolve this problem?
Although this is an old question, I hit this issue recently, and the reason for it was that the service or website in Azure had been removed or stopped while I was trying to publish to it. If this happens, check the publish profile to make sure you are pointing to the correct service/site, including the storage account etc., and correct it. Hope it helps someone.
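If the publish profile checks out, the other commonly suggested step for this error is a full emulator reset rather than just a start. A sketch using the 1.x-era SDK's csrun.exe (found under the SDK's bin directory; flag names from memory, so verify against csrun /? on your install):

```shell
# Stop the compute emulator, clear any stale deployments,
# then start it fresh before debugging again from Visual Studio.
csrun /devfabric:shutdown
csrun /devfabric:clean
csrun /devfabric:start
```

Stale deployment state in the dev fabric was a frequent cause of the emulator refusing connections on that net.pipe endpoint.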

Azure: just HOW do I debug this?

I'm really losing it here. Not being able to attach a debugger to a process is kind of a big deal for me. As such, I'm having a very hard time pinpointing the source of problems with an Azure-hosted application.
What's worse is that the app works fine in the Development Fabric, even when using online Storage Tables, but can go quite haywire when uploaded and running online.
I know IntelliTrace is one way to do it, but unfortunately I've got an x86 machine, and the application uses RIA Services. As such, publishing it from my machine results in an error caused by RIA Services. If I build the application specifying x64, the very same bug strikes again. (So far the only way I know of to deploy a RIA Services Azure application is to set it to Any CPU and build/publish it from an x64 machine.)
So IntelliTrace is not available. Online Azure doesn't have something to resemble the nice console log window of the Development Fabric, and as such, I'm at a loss. Thus far I've been just trying to get things to work and not crash by commenting out sections of code, but given the time it takes to upload and start an instance, this is hardly optimal.
Any suggestions would be appreciated at this point.
The Azure SDK has a logging / diagnostics mechanism built in:
http://msdn.microsoft.com/en-us/library/gg433120.aspx.
One route would be to deploy a version with some Azure specific instrumentation built in.
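A sketch of what that instrumentation looked like in the SDK 1.x era, wired up in the role entry point. This is from memory and not verified against any particular SDK build, so treat the type and plugin names as assumptions to check against your version:

```csharp
// WebRole.cs sketch (Azure SDK 1.x-era diagnostics API).
// Ships Trace.* output to storage on a schedule so it can be
// inspected after the role misbehaves in the cloud.
using System;
using System.Diagnostics;
using Microsoft.WindowsAzure.Diagnostics;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        var config = DiagnosticMonitor.GetDefaultInitialConfiguration();

        // Transfer everything down to Verbose, once a minute.
        config.Logs.ScheduledTransferLogLevelFilter = LogLevel.Verbose;
        config.Logs.ScheduledTransferPeriod = TimeSpan.FromMinutes(1);

        // The connection string name is declared by the Diagnostics
        // plugin in the service definition/configuration files.
        DiagnosticMonitor.Start(
            "Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString",
            config);

        Trace.TraceInformation("Diagnostics started");
        return base.OnStart();
    }
}
```

The transferred logs end up in the WADLogsTable in the configured storage account, where a storage browser can read them.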
You could try to RDP into an instance of the role and see if there's anything in any of the logs (event or files) that helps you identify where the failure is.
Barring that, I think Amasuriel has it right in that you REALLY need to architect instrumentation into your solutions. It's something that's on my "must" list when building a Windows Azure application.
If you have access to another workstation with an x64 version of Visual Studio, you can configure Azure diagnostics to collect and copy the crash dumps to Blob Storage:
// Must be called after the diagnostic monitor starts.
// Pass false for mini dumps; true enables full dumps.
CrashDumps.EnableCollection(false);
You can then download them (using a tool like Azure Storage Explorer) and debug them locally.
If you absolutely need to see what's going on on the console, Rob Blackwell has embedded a neat little trick in his AzureRunMe solution.
It pushes the console output of the Azure instance(s) out over the Service Bus. You can therefore consume that data locally and, in effect, monitor the console of the instances running on Azure right on your desktop.
AzureRunMe is available here, and it's open source, so you can take a look at how they've fed the console output to the Service Bus.
https://github.com/RobBlackwell/AzureRunMe

Visual Studio 2010 Debug Server Not Recognizing My Changes

Using Visual Studio 2010 on Windows 7 64-bit. I'm trying to test a website project (not a web application project) using the built-in dev server (Cassini). The problem I'm having is that when I make a change, I now have to actually stop debugging, kill Cassini, and restart before I can see my changes in the browser. I used to be able to edit and refresh. One of my fellow developers here is able to do this just fine with an identical setup (same project/VS version/OS, and the same settings near as I can tell). I'm beginning to suspect some sort of permissions issue. I've been all over Google trying to find an answer, to no avail. Any ideas?
As it turns out, this was my fault... I had experienced the dreaded "network BIOS command limit has been reached" issue. I found a post that recommended a registry hack, setting HKLM\Software\Microsoft\ASP.NET\FCNMode = 1, which basically turns off File Change Notifications. Changing this value to 2 and applying the changes recommended in Knowledge Base article 810886 fixed both problems.
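For reference, the registry half of the fix above can be sketched as a one-liner from an elevated command prompt (FCNMode = 2 uses a single change-notification object per application, as opposed to 1, which disables notifications entirely):

```bat
:: Run elevated. Sets ASP.NET file change notifications to
:: "single object" mode instead of disabling them outright.
reg add "HKLM\SOFTWARE\Microsoft\ASP.NET" /v FCNMode /t REG_DWORD /d 2 /f
```

An iisreset (or restarting Cassini) is needed afterwards for the setting to take effect.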
