Read application log written on Windows Azure

I have 10 applications that share the same logic for writing their log to a text file located in the application root folder.
I also have an application that reads the log files of all the applications and shows the details in a web page.
Can the same be achieved on Windows Azure? I don't want to use the 'DiagnosticMonitor' APIs, as I cannot change the logging logic of the applications.
Thanks,
Aman

Even if this is technically possible, it is not advisable, as the Fabric Controller can re-create any role at a whim (well, with good reasons, but unpredictably nonetheless), and whenever this happens you will lose any files stored locally on that role.
So, primarily, you should be looking for a different place to store those logs. There are many options, but all of them require changing the logging logic of the application.

You could do this, but aside from the issue Yossi pointed out (the log would be ephemeral; it could get deleted at any time), you'd have a different log file on each role instance (VM). That means when you hit your web page to view the log, you'd see whatever happened to be on the log on that particular VM, instead of what you presumably want (a roll-up of the log files across all VMs).
Windows Azure Diagnostics could help, since you can configure it to copy log files off to blob storage (so no need to change the logging). But honestly I find Diagnostics a bit cumbersome for this. It will end up creating a lot of different blobs, and you'll have to change the log viewer to read all those blobs and combine them.
I personally would suggest writing a separate piece of code that monitors the log file and, for each new line, stores the line as an entity (row) in table storage. This bit of code could be launched as a startup task and just run continuously as a separate process (leaving everything else unchanged). Then modify the log viewer to read the last n entities from table storage and display them.
(I'm assuming you can modify the log viewer even if you can't modify the apps that log to the file.)
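To make that idea concrete, here is a minimal sketch of such a log-forwarding process. It is purely illustrative and rests on assumptions the answer does not make: it uses the aztables package from the Azure SDK for Go, expects a storage connection string in AZURE_STORAGE_CONNECTION_STRING, and the log path and table name are invented.

// logforwarder: tail a local log file and write each new line to table storage.
package main

import (
	"bufio"
	"context"
	"encoding/json"
	"fmt"
	"io"
	"math"
	"os"
	"time"

	"github.com/Azure/azure-sdk-for-go/sdk/data/aztables"
)

func main() {
	svc, err := aztables.NewServiceClientFromConnectionString(
		os.Getenv("AZURE_STORAGE_CONNECTION_STRING"), nil)
	if err != nil {
		panic(err)
	}
	table := svc.NewClient("applog")             // illustrative table name
	table.CreateTable(context.Background(), nil) // ignore "already exists" errors

	f, err := os.Open(`C:\approot\app.log`) // hypothetical log path
	if err != nil {
		panic(err)
	}
	defer f.Close()
	f.Seek(0, io.SeekEnd) // forward only lines written from now on

	instance, _ := os.Hostname() // one partition per role instance
	reader := bufio.NewReader(f)
	pending := ""
	for {
		chunk, err := reader.ReadString('\n')
		pending += chunk
		if err == io.EOF {
			time.Sleep(2 * time.Second) // wait for the application to write more
			continue
		}
		if err != nil {
			panic(err)
		}
		line := pending
		pending = ""

		entity := aztables.EDMEntity{
			Entity: aztables.Entity{
				PartitionKey: instance,
				// Reversed timestamp so the newest rows sort (and page) first.
				RowKey: fmt.Sprintf("%020d", math.MaxInt64-time.Now().UnixNano()),
			},
			Properties: map[string]any{"Line": line},
		}
		payload, _ := json.Marshal(entity)
		table.AddEntity(context.Background(), payload, nil)
	}
}

The reversed-timestamp RowKey is just one convenient choice; anything unique and time-ordered within a partition works, and it lets the viewer read the newest lines first.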

What about writing logs to something like an Azure storage table? You just need to define a unique PartitionKey/RowKey, and then you can easily retrieve the log for the web page.
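To illustrate the retrieval side, here is a matching sketch of how the log-viewer page could pull the newest lines back out. It assumes the same aztables SDK, table name, and reversed-timestamp RowKey as the forwarder sketch above; the partition key value below is a placeholder.

// logviewer: read back the newest log lines for one role instance.
package main

import (
	"context"
	"encoding/json"
	"fmt"
	"os"

	"github.com/Azure/azure-sdk-for-go/sdk/data/aztables"
)

func main() {
	svc, err := aztables.NewServiceClientFromConnectionString(
		os.Getenv("AZURE_STORAGE_CONNECTION_STRING"), nil)
	if err != nil {
		panic(err)
	}
	table := svc.NewClient("applog")

	filter := "PartitionKey eq 'web-instance-0'" // placeholder partition key
	top := int32(100)                            // show the last 100 lines
	pager := table.NewListEntitiesPager(&aztables.ListEntitiesOptions{Filter: &filter, Top: &top})

	// One page is enough: the reversed-timestamp RowKey sorts the newest lines first.
	if pager.More() {
		page, err := pager.NextPage(context.Background())
		if err != nil {
			panic(err)
		}
		for _, raw := range page.Entities {
			var e aztables.EDMEntity
			if json.Unmarshal(raw, &e) == nil {
				fmt.Print(e.Properties["Line"])
			}
		}
	}
}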

Related

Azure Blob Storage lifecycle management - send report or log after run

I am considering using Azure Blob Storage's built-in lifecycle management feature for deleting blobs of a certain age.
However, due to a business requirement, it must be possible to generate a report or log statement after each daily execution of the defined ruleset. The report or log must state the number of blobs that were affected, e.g. deleted during the run.
I have read through the documentation and Googled to see if others have had similar inquiries, but so far without any luck.
So my question: do any of you know if and how I can get the built-in lifecycle management feature to do one of the following after each daily run:
Add a log statement to the storage account containing the Blob storage.
Generate and send a report to an endpoint I define.
If the above can't be done, I will have to code the daily deletion job and report generation myself, which I can certainly do, but I would like to use the built-in feature if possible.
To summarize the solution:
If you want to know which blobs are deleted every day, we can configure Diagnostics settings on the storage account. After doing that, we will get logs for the read, write, and delete requests made against the blob service. For more detail, please refer to here and here.
Regarding how to enable it, we can use the PowerShell command Set-AzStorageServiceLoggingProperty.
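If the requirement is specifically "how many blobs did the daily run delete", one option is to post-process those analytics logs, which Storage Analytics writes as semicolon-delimited text blobs into a special $logs container. The sketch below is illustrative only: it assumes logging is already enabled (for example via Set-AzStorageServiceLoggingProperty), it uses the azblob package from the Azure SDK for Go, and it only does a rough string match on the DeleteBlob operation type rather than parsing the log format properly.

// deletereport: count DeleteBlob operations recorded in the $logs container.
package main

import (
	"bufio"
	"context"
	"fmt"
	"os"
	"strings"

	"github.com/Azure/azure-sdk-for-go/sdk/storage/azblob"
)

func main() {
	client, err := azblob.NewClientFromConnectionString(
		os.Getenv("AZURE_STORAGE_CONNECTION_STRING"), nil)
	if err != nil {
		panic(err)
	}

	deleted := 0
	pager := client.NewListBlobsFlatPager("$logs", nil)
	for pager.More() {
		page, err := pager.NextPage(context.Background())
		if err != nil {
			panic(err)
		}
		for _, b := range page.Segment.BlobItems {
			resp, err := client.DownloadStream(context.Background(), "$logs", *b.Name, nil)
			if err != nil {
				continue
			}
			scanner := bufio.NewScanner(resp.Body)
			for scanner.Scan() {
				// Rough check: the operation type (e.g. DeleteBlob) appears as one of
				// the semicolon-delimited fields of each analytics log line.
				if strings.Contains(scanner.Text(), ";DeleteBlob;") {
					deleted++
				}
			}
			resp.Body.Close()
		}
	}
	fmt.Printf("blobs deleted (per analytics logs): %d\n", deleted)
}

From there the count could be written wherever the report needs to go, e.g. appended to a blob in the same account or posted to an endpoint you define.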

Clarion-based application in Windows Server 2016, frozen. Suggestions needed on tools we could use to gather more information

I have a Clarion application running on Windows Server 2016, talking to a Sybase DB. Over the past few weeks the application freezes for different users at a given time; however, the user can leave that session as it is and start a new one, and the new session works fine. The users are known to run multiple instances of the same application on one remote server or on multiple servers.
I wanted to get more information on the freeze-up, so I looked through the application event logs on the system. I do see explorer.exe crashes there, and these sometimes correlate with the time the issue occurs, but not always. I also checked the DB transaction logs from Sybase and do not find any crashes, errors, or stuck connections. Having exhausted all the options I know of, I am reaching out to ask whether there are any other places I can look to gather more information.
I would love to know of any applications or tools we could use to gather logs from a frozen Clarion application on Windows. It would also be good to know if anyone has faced such a situation, and where and how you looked into the issue.
Thanks in advance for your help in this.
Clarion's runtime library and database drivers expect a persistent connection. Disconnects, which are normal with remote ODBC, can cause problems (including app hangs) unless you test for them at the ABC file manager level and reconnect, or use similar steps to test and recover.
If you're looking for specifics about what's going on between the driver and the SQL backend, I suggest using Clarion's database driver trace facilities. From the help topic: "Logging Driver I/O for debugging":
To view the trace details in DebugView, name the target trace file "DEBUG:".
Logging opens the named logfile for exclusive access. If the file exists, the new log data is appended to the file.
On Demand Logging
For on-demand logging you can use property syntax within your program to conditionally turn various levels of logging on and off. The logging is effective for the target table and any view for which the target table is the primary table.
file{PROP:Profile}=Pathname   !Turns Clarion I/O logging on
file{PROP:Profile}='DEBUG:'   !Turns Clarion I/O logging on and
                              !sends output via OutputDebugString()
                              !(viewable via DebugView, etc.)
file{PROP:Profile}=''         !Turns Clarion I/O logging off
PathName = file{PROP:Profile} !Queries the name of the log file
file{PROP:Log}=string         !Writes the string to the log file
file{PROP:Log}='DEBUG:'       !Writes the string to the log file
file{PROP:Details}=1          !Turns Record Buffer logging on
file{PROP:Details}=0          !Turns Record Buffer logging off
where Pathname is the full pathname or the filename of the log file to create. If you do not specify a path, the driver writes the log file to the current directory.
You can also accomplish on demand logging with a SEND() command and the LOGFILE driver string. See LOGFILE for more information.
An example I use frequently, based on the help above:
SYSTEM{PROP:DriverTracing} = '1'     !Enable driver tracing for the application
CRMNotes{PROP:TraceFile} = 'DEBUG:'  !Send the trace to OutputDebugString() (DebugView)
CRMNotes{PROP:Details} = 1           !Include record buffer contents in the log
CRMNotes{PROP:Profile} = 'DEBUG:'    !Turn on driver I/O logging, also to the debugger
CRMNotes{PROP:LogSQL} = 1            !Log the generated SQL statements

Windows registry storage best practice

Background
I've recently been shunted into the world of Windows programming and I'm still trying to find my way around the best practices and ways of doing things, so I was just hoping for some pointers on use of the registry.
Not particularly relevant, but the background is that I am creating an installer in Golang. A couple of points to get out of the way on that:
I am aware MSIs would usually be best practice for an installer (I have my reasons for going with a custom exe)
I know there are more obvious language choices than Golang, just go with it
Current registry use
As part of the install process, I store several pieces of data in the registry (a sketch of what these writes look like in Go follows the list):
run once commands:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\RunOnce
I create a few entries here: one to restart the process after a system reboot, and one to delete some temp files on reboot after uninstall
an uninstall entry:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\Vendor
Product
The content here is the same as an MSI would create; I was careful not to create any additional custom fields here (all static data until uninstall)
an application entry:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Vendor\Product
I store some additional data about the installation here, some of which is needed for uninstall such as state info from before installation (again all static content)
a temporary entry:
Computer\HKEY_CURRENT_USER\SOFTWARE\Vendor\Product
I store some temporary data here, which can include some sensitive user-entered data (usernames/passwords). I run some symmetric encryption to obscure the data, though my understanding is that this area of the registry is encrypted so only the user could access it anyway (I would like confirmation on that)
This data is used to resume after restart and then deleted
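For reference, here is a small sketch of what writes like these look like from Go, using the golang.org/x/sys/windows/registry package. The key paths mirror the ones above, the value names and data are made-up placeholders, and writing under HKLM requires an elevated process.

// installer registry writes (sketch): RunOnce resume command and static install state.
package main

import "golang.org/x/sys/windows/registry"

func main() {
	// RunOnce entry: restart the installer after a system reboot.
	runOnce, _, err := registry.CreateKey(registry.LOCAL_MACHINE,
		`SOFTWARE\Microsoft\Windows\CurrentVersion\RunOnce`, registry.SET_VALUE)
	if err != nil {
		panic(err)
	}
	defer runOnce.Close()
	if err := runOnce.SetStringValue("VendorProductResume",
		`"C:\Program Files\Vendor\Product\installer.exe" /resume`); err != nil {
		panic(err)
	}

	// Application entry: static data about the installation, read back at uninstall.
	app, _, err := registry.CreateKey(registry.LOCAL_MACHINE,
		`SOFTWARE\Vendor\Product`, registry.SET_VALUE)
	if err != nil {
		panic(err)
	}
	defer app.Close()
	if err := app.SetStringValue("InstallDir", `C:\Program Files\Vendor\Product`); err != nil {
		panic(err)
	}
}

Note that this only compiles on Windows, since the registry package is Windows-specific.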
Questions
I'm looking for confirmation of, or corrections to, my current use of the registry.
I now also need to pass some data between an application and a running service; this data would be updated every 1-2 minutes and would be a few bytes of JSON. Does the registry seem like a reasonable place to store variable data like this? If so, is there a particular place that is better for variable data? I was going to add it to:
Computer\HKEY_LOCAL_MACHINE\SOFTWARE\Vendor\Product
HKCU isn't encrypted, to my knowledge. It's stored in a file called NTUSER.DAT and can be loaded as a hive under HKEY_USERS, visible to any other process with sufficient rights to do so.
You would need to open up the rights on HKLM\SOFTWARE\Vendor\Product if you expect a user-privileged process to be able to write to it. If you want to pass data to a service, you might want to use some sort of IPC pipe instead. Not sure what's available in Golang for this.
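For the IPC suggestion: Go does have Windows named-pipe support through the github.com/Microsoft/go-winio package. Here is a minimal sketch of the service side receiving those few bytes of JSON; the pipe name and message shape are placeholders, not anything from the question.

// service side: listen on a named pipe and decode small JSON status messages.
package main

import (
	"encoding/json"
	"log"

	"github.com/Microsoft/go-winio"
)

type status struct {
	State   string `json:"state"`
	Updated int64  `json:"updated"`
}

func main() {
	// nil config = default security descriptor; tighten it so only the intended
	// client account can connect.
	l, err := winio.ListenPipe(`\\.\pipe\VendorProduct`, nil)
	if err != nil {
		log.Fatal(err)
	}
	defer l.Close()

	for {
		conn, err := l.Accept()
		if err != nil {
			log.Fatal(err)
		}
		var s status
		if err := json.NewDecoder(conn).Decode(&s); err == nil {
			log.Printf("client reported state=%s", s.State)
		}
		conn.Close()
	}
}

The application side would connect with winio.DialPipe on its 1-2 minute cycle and encode the same struct onto the connection; unlike polling a registry value, the service sees each update as it arrives.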

WindowsAzure: Is it possible to set directory permissions within the web.config?

A PHP script of mine wants to write into a log folder; the resulting error is:
Unable to open the log file "E:\approot\framework\log/dev.log" for writing.
When I set the write permissions for the WebRole user RD001... manually, it works fine.
Now I want to set the folder permissions automatically. Is there an easy way to get it done?
Please note that I'm very new to IIS and the stuff around it; I would appreciate precise answers, thx.
Short/Technical Response:
You could probably set permissions on a particular folder using full trust and a startup task. However, you'd need to account for a stateless OS and changing drive letters (possible, though not likely) in this script, which would make it difficult. Also, local storage is not persisted, so you'd have no way to ensure this data survived a reboot.
Recommendation: Don't write local, read below ...
EDIT: Got to thinking about this, and while I still recommend against it, there is a third option: you can allocate local storage in the service config and then access it from PHP using a DLL reference, which gives you access to that folder. Please remember local storage is not persisted, so it's gone after a reboot.
Service Config for local:
http://blogs.mscommunity.net/blogs/dadamec/archive/2008/12/11/azure-reading-and-writing-with-localstorage.aspx
Accessing config from php:
http://phpazure.codeplex.com/discussions/64334?ProjectName=phpazure
Long / Detailed Response:
In Azure, you really are encouraged to approach things as a platform and not as "software on a server". What I mean is that ideas such as "write something to a local log file" are somewhat incompatible with the cloud "idea". Depending on your usage, you could (and should) convert this script to output this data to some cloud-based or external storage, rather than just placing it on the disk.
I would suggest modifying this script to leverage the PHP Azure SDK and write these log entries out to table or blob storage in Azure. If this sounds good, please provide the PHP and I can give an exact example.
The main reason for that (besides pushing the cloud idea) is that in Azure you cannot assume the host machine ("role instance") will maintain an OS state: while you can set some things such as folder permissions, you can't rely on them sticking. You have no real way to guarantee those permissions won't be reset when the fabric has to update your role or react to some lower-level problem. For example, a hard-drive cage in the rack where your current instance lives could fail. If the failure were bad enough, the Fabric Controller would need to rebuild your instance, and when that happens your code is moved to an entirely different server, so those permissions would need to be set again. Also, depending on the changes, the E:\ drive could all of a sudden need to be the F:\ or X:\ drive and you wouldn't know.
It's much better to pretend (at some level) that your application is running "in Azure" and not "on a server in Azure", so you make no assumptions about the hosting environment. Anything you need outside of your code (data, logs, audits, etc.) should be stored somewhere you control (Azure Storage, an external call-out, etc.).

Good way to demo a classic ASP web site

What is the best way to save data in session variables in a classic ASP web site?
I am maintaining a classic ASP web site and want to be able to let my users demo all the functionality of the site, which means allowing them to delete records.
The closest examples I have seen so far are the demos of the Telerik controls, where they save the dataset in session on first load and let the user manipulate that data.
How can I achieve the same in ASP with an MS Access backend?
If you want to persist the state over multiple pages (e.g. to demo your complete application) then it's a bit tricky.
I would suggest copying the MDB file for each session and using the copied version. This would ensure that every session uses its own data.
create a version of your access db which will be used as a fresh template for each user
on session start, copy the template and name it after the user's session ID
use the individual MDB
Note: The only drawback I can see here is that you need to remove the unused MDB files, as they can pile up after some time. You could do it with a scheduled task or even on session start, before you create a new one.
I am not sure what you can use to check whether a copy is still in use, but check the file's creation date, or maybe the .ldb lock file can help you as well (if it does not exist = unused).
You can store a connection or even an object in a session variable, as long as you remember what kind of variable you stored when it comes time to retrieve it. I have never stored a dataset in a session variable, but I have stored a lot of arrays in session variables, so you can use the ADO GetRows method to load a complete recordset into a session variable.
How big is the Access database? If your database is small enough (relative to the server capacity, expected number of users, and so forth) then I like the idea of using a fresh copy of the database for each user that runs the demo.
With this approach, you simplify your possible code paths. Otherwise this "are we in demo mode or not?" logic will permeate a heck of a lot of your code.
I'd do it like this...
When the user begins the demo, make a copy of the Access DB for that user to use. If your db is foo.mdb, copy it to /tempdb/foo_1234567890.mdb where 1234567890 is the user's session ID.
Alter the user's connection string to point to the fresh database copy. From this point on, your app can operate like "normal" with no further modifications.
Have a scheduled task that deletes all files in /tempdb with last-modified times more than __ hours in the past (a rough sketch of that cleanup follows). If you don't have the ability to schedule tasks on the server (perhaps you're in a shared hosting environment, etc.) then you could do this at the same time you do step #1.
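Whatever runs that cleanup, the logic is just "delete files in /tempdb whose last-modified time is older than some threshold". A rough sketch (shown in Go purely for illustration; a scheduled script in any language, or a bit of code run on Session_OnStart, does the same job; the folder and age below are placeholders):

// tempdb cleanup: remove per-session database copies that have gone stale.
package main

import (
	"log"
	"os"
	"path/filepath"
	"time"
)

func main() {
	const tempDir = `C:\inetpub\wwwroot\tempdb` // placeholder path
	maxAge := 4 * time.Hour                     // placeholder threshold

	entries, err := os.ReadDir(tempDir)
	if err != nil {
		log.Fatal(err)
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		info, err := e.Info()
		if err != nil {
			continue
		}
		if time.Since(info.ModTime()) > maxAge {
			// If a copy is still open by the web server the delete may fail;
			// ignoring that error is fine here, it will be picked up next run.
			os.Remove(filepath.Join(tempDir, e.Name()))
		}
	}
}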
