Getting Extra Information From WinAPI File Change Notifications - winapi

The MSDN has a pretty good example of getting notified when a file or directory is changed.
However, I can't find any way to get extra information such as the user/machine name associated with the change notification.
For example, I've set up a share X:\Foo from my machine. I would like to log the user/machine names that make changes to my shared directory.
Is this possible to accomplish?

You can't get that using FFCN (FindFirstChangeNotification). You can, however, set the security descriptor on the files in the shared directory so that file accesses are logged to the event log.
This KB article has some information about how to enable auditing on files.
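For illustration, here is a minimal sketch of the security-descriptor approach, assuming the "Audit object access" policy is enabled and the caller can acquire SeSecurityPrivilege (both are required before a SACL has any effect). The path X:\Foo\important.txt is a hypothetical file on the share from the question; link with advapi32.lib.

#include <windows.h>
#include <sddl.h>
#include <aclapi.h>
#include <stdio.h>

static BOOL enablePrivilege(LPCWSTR name)
{
    HANDLE tok;
    if (!OpenProcessToken(GetCurrentProcess(),
                          TOKEN_ADJUST_PRIVILEGES | TOKEN_QUERY, &tok))
        return FALSE;
    TOKEN_PRIVILEGES tp;
    tp.PrivilegeCount = 1;
    tp.Privileges[0].Attributes = SE_PRIVILEGE_ENABLED;
    if (!LookupPrivilegeValueW(NULL, name, &tp.Privileges[0].Luid)) {
        CloseHandle(tok);
        return FALSE;
    }
    BOOL ok = AdjustTokenPrivileges(tok, FALSE, &tp, 0, NULL, NULL) &&
              GetLastError() == ERROR_SUCCESS;
    CloseHandle(tok);
    return ok;
}

int main(void)
{
    // Setting a SACL requires SeSecurityPrivilege to be enabled in the token.
    if (!enablePrivilege(L"SeSecurityPrivilege"))
        return 1;

    // S:(AU;SAFA;FA;;;WD) = one audit ACE: log successful (SA) and failed
    // (FA) attempts at any file access (rights FA) by Everyone (WD).
    PSECURITY_DESCRIPTOR sd = NULL;
    if (!ConvertStringSecurityDescriptorToSecurityDescriptorW(
            L"S:(AU;SAFA;FA;;;WD)", SDDL_REVISION_1, &sd, NULL))
        return 1;

    BOOL present = FALSE, defaulted = FALSE;
    PACL sacl = NULL;
    GetSecurityDescriptorSacl(sd, &present, &sacl, &defaulted);

    DWORD rc = SetNamedSecurityInfoW((LPWSTR)L"X:\\Foo\\important.txt",
                                     SE_FILE_OBJECT, SACL_SECURITY_INFORMATION,
                                     NULL, NULL, NULL, sacl);
    printf("SetNamedSecurityInfo returned %lu\n", rc);
    LocalFree(sd);
    return rc == ERROR_SUCCESS ? 0 : 1;
}

Once the SACL is in place, access attempts on that file show up in the Security event log together with the account that made them.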

Related

How can I use fsecurity apply on a NetApp filer to reset NTFS permissions? (ONTAP 7-MODE)

I have a NetApp filer, with a CIFS export. The permissions have been locked down on it, to a point where it's no longer accessible.
I need to reset the permissions on this - I've figured out I can probably do this by changing the qtree to Unix security mode and back again (provided I'm prepared to unexport the share temporarily).
However, I think I should be able to use the fsecurity command to do this. There's just one problem - the manpage example refers to 'applying ACLs from a config file':
https://library.netapp.com/ecmdocs/ECMP1196890/html/man1/na_fsecurity_apply.1.html
But what it doesn't do is give me an example of what a 'security definition file' actually looks like.
Is anyone able to give me an example? Resetting a directory structure to Everyone/Full Control is sufficient for my needs, as re-applying permissions isn't a problem.
Create a conf file containing the following:
cb56f6f4
1,0,"/vol/vol_name/qtree_name/subdir",0,"D:P(A;CIOI;0x1f01ff;;;Everyone)"
Save it somewhere on your filer (the example in the manpage uses /etc/security.conf).
Run:
fsecurity show /vol/vol_name/qtree_name/subdir
fsecurity apply /etc/security.conf
fsecurity show /vol/vol_name/qtree_name/subdir
This will set Everyone / Full Control, inheritable, which is a massive security hole, so you should now IMMEDIATELY go and fix the permissions on that directory structure to something a little more sensible.
You can create more detailed ACLs using the 'secedit' utility, available from NetApp's support site, but this one did what I needed it to.
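As an aside, the ACE string in that line is Windows SDDL syntax, so you can sanity-check a definition on any Windows box before pushing it to the filer. A minimal sketch follows; note the Windows parser expects a SID string or the two-letter abbreviation WD where the filer line uses the name Everyone. Link with advapi32.lib.

#include <windows.h>
#include <sddl.h>
#include <stdio.h>

int main(void)
{
    // D:P(A;CIOI;0x1f01ff;;;WD)
    //   D:P      protected DACL (blocks inheritance from the parent)
    //   A        access-allowed ACE
    //   CIOI     container-inherit + object-inherit (applies to files/subdirs)
    //   0x1f01ff FILE_ALL_ACCESS, i.e. Full Control
    //   WD       the Everyone well-known group
    PSECURITY_DESCRIPTOR sd = NULL;
    if (ConvertStringSecurityDescriptorToSecurityDescriptorW(
            L"D:P(A;CIOI;0x1f01ff;;;WD)", SDDL_REVISION_1, &sd, NULL)) {
        printf("SDDL parses OK\n");
        LocalFree(sd);
        return 0;
    }
    printf("bad SDDL, error %lu\n", GetLastError());
    return 1;
}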

Looking for Windows Event Viewer system log message templates, where can I get them?

I need to get hold of at least the Microsoft Windows system event message templates.
Is there a place I can find those?
A template, for example:
Windows cannot access the file gpt.ini for GPO CN={31B2F340-016D-11D2-945F-00C04FB984F9},CN=Policies,CN=System,DC=,DC=com. The file must be present at the location <\\sysvol\\Policies{31B2F340-016D-11D2-945F-00C04FB984F9}\gpt.ini>. (.). Group Policy processing aborted.
where the parameters are surrounded by tags.
Thank you for your help.
Each event log in Windows has its own registry key. For example, the System event log has its entry at this path: HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\EventLog\System
Under the System key there is a subkey for each event source that writes events into the System event log.
Most event sources also contain an EventMessageFile value, which points to a .dll or .exe with message tables inside it. That is actually what you should look for. It would be useful to read about Event Logging in Windows.
There are also existing tools that let you see the messages for a particular event source; unfortunately, I don't remember their exact names.
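To illustrate, here is a rough sketch of pulling one template out of a message file: read the EventMessageFile value for a source, load that binary as a data file, and ask FormatMessage for a message by ID. The "Service Control Manager" source and ID 7036 are only examples; substitute whichever source and ID you're after. Link with advapi32.lib.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    wchar_t dllPath[MAX_PATH];
    DWORD size = sizeof(dllPath);

    // Read the EventMessageFile value for one event source under the
    // System log. RegGetValue expands REG_EXPAND_SZ values for us.
    if (RegGetValueW(HKEY_LOCAL_MACHINE,
            L"SYSTEM\\CurrentControlSet\\Services\\EventLog\\System\\Service Control Manager",
            L"EventMessageFile", RRF_RT_REG_SZ, NULL, dllPath, &size)
        != ERROR_SUCCESS)
        return 1;

    // Load the binary as a data file and look up a message by ID, keeping
    // the %1, %2 ... insertion placeholders visible.
    HMODULE mod = LoadLibraryExW(dllPath, NULL, LOAD_LIBRARY_AS_DATAFILE);
    if (!mod) return 1;

    wchar_t* msg = NULL;
    DWORD id = 7036;  // example ID: "The %1 service entered the %2 state."
    if (FormatMessageW(FORMAT_MESSAGE_FROM_HMODULE |
                       FORMAT_MESSAGE_ALLOCATE_BUFFER |
                       FORMAT_MESSAGE_IGNORE_INSERTS,
                       mod, id, 0, (LPWSTR)&msg, 0, NULL))
        wprintf(L"%s\n", msg);

    if (msg) LocalFree(msg);
    FreeLibrary(mod);
    return 0;
}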

ACCESS_DENIED_ERROR when using NetFileClose API

I have a Windows network in which many files are shared across many users with full control. I have a folder shared for everyone on my system, so whenever I access it using the machine name (Run -> \\Servername) from another system, I can see the shared folder and open/write files in it.
But my requirement is to close any files on my system that are open over the network. So I used NetFileEnum to list the IDs of all open files, intending to close them with the NetFileClose API.
The problem is that NetFileEnum returns invalid junk IDs like 111092900 and -1100100090, so I can't close the files from another machine. So I listed the network-opened files with the net file command and, noting an ID (say 43), hard-coded it in the call NetFileClose("Servername", 43); But when I executed it, I got ACCESS_DENIED_ERROR. If the same code runs on the server, it closes the files successfully. I had given all users full permission on the share.
But why ACCESS_DENIED_ERROR, and why does NetFileEnum return invalid IDs? Is there anything else that must be done for these APIs to work? How can I use them properly to close network-opened files?
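For reference, a minimal sketch of the documented NetFileEnum/NetFileClose pattern is below (\\Servername is a placeholder; link with netapi32.lib). Note that per MSDN only members of the Administrators or Server Operators group on the target server can call these functions, so full control on the share is not enough by itself; that would explain ACCESS_DENIED when running remotely under a non-privileged account.

#include <windows.h>
#include <lm.h>
#include <stdio.h>

int main(void)
{
    LPWSTR server = (LPWSTR)L"\\\\Servername";  // hypothetical server name
    FILE_INFO_3* buf = NULL;
    DWORD read = 0, total = 0;
    DWORD_PTR resume = 0;

    NET_API_STATUS rc = NetFileEnum(server, NULL, NULL, 3, (LPBYTE*)&buf,
                                    MAX_PREFERRED_LENGTH, &read, &total,
                                    &resume);
    if (rc != NERR_Success) {
        printf("NetFileEnum failed: %lu\n", rc);
        return 1;
    }
    for (DWORD i = 0; i < read; i++) {
        // fi3_id is an opaque handle; it is only meaningful when passed
        // back to the same server it was enumerated from.
        wprintf(L"id=%lu user=%s path=%s\n",
                buf[i].fi3_id, buf[i].fi3_username, buf[i].fi3_pathname);
        NET_API_STATUS cl = NetFileClose(server, buf[i].fi3_id);
        if (cl != NERR_Success)
            printf("NetFileClose(%lu) failed: %lu\n", buf[i].fi3_id, cl);
    }
    NetApiBufferFree(buf);
    return 0;
}

Establishing the connection under an account that is in the server's Administrators group (for example via net use) before calling should make both calls succeed.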

Read application log written on Windows Azure

I have 10 applications that all use the same logic to write a log to a text file located in the application root folder.
I have an application which reads the log files of all the applications and shows the details on a web page.
Can the same be achieved on Windows Azure? I don't want to use the 'DiagnosticMonitor' APIs, as I cannot change the applications' logging logic.
Thanks,
Aman
Even if this is technically possible, it is not advisable, because the Fabric Controller can re-create any role at a whim (well, with good reason, but unpredictably nonetheless), and whenever that happens you will lose any files stored locally on a role.
So primarily you should be looking for a different place to store those logs. There are many options, but all of them require that you change the logging logic of the application.
You could do this, but aside from the issue Yossi pointed out (the log would be ephemeral; it could get deleted at any time), you'd have a different log file on each role instance (VM). That means when you hit your web page to view the log, you'd see whatever happened to be on the log on that particular VM, instead of what you presumably want (a roll-up of the log files across all VMs).
Windows Azure Diagnostics could help, since you can configure it to copy log files off to blob storage (so no need to change the logging). But honestly I find Diagnostics a bit cumbersome for this. It will end up creating a lot of different blobs, and you'll have to change the log viewer to read all those blobs and combine them.
I personally would suggest writing a separate piece of code that monitors the log file and, for each new line, stores the line as an entity (row) in table storage. This bit of code could be launched as a startup task and just run continuously as a separate process (leaving everything else unchanged). Then modify the log viewer to read the last n entities from table storage and display them.
(I'm assuming you can modify the log viewer even if you can't modify the apps that log to the file.)
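To make that concrete, here is a minimal sketch of the monitoring process, with the table-storage insert left as a stub; app.log, uploadLine, and the one-second polling interval are all placeholders.

#include <windows.h>
#include <stdio.h>

static void uploadLine(const char* line)
{
    // Placeholder: a real implementation would insert one entity into
    // table storage here (e.g. via an authenticated REST call or an SDK).
    printf("would upload: %s", line);
}

int main(void)
{
    FILE* f = fopen("app.log", "r");  // hypothetical log path
    if (!f) return 1;
    fseek(f, 0, SEEK_END);            // start tailing from the current end

    char line[4096];
    for (;;) {
        if (fgets(line, sizeof(line), f)) {
            uploadLine(line);         // one table entity per log line
        } else {
            clearerr(f);              // clear EOF so the next poll can read
            Sleep(1000);
        }
    }
}

Launched as a startup task running alongside the role, this leaves the applications and their logging completely unchanged.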
What about writing logs to something like an Azure storage table? You just need to define a unique PartitionKey/RowKey, and then you can easily retrieve the log for the web page.

What's the best way to (programmatically) determine a file's network origin?

For an application I'm writing, I want to programmatically find out which computer on the network a file came from. How can I best accomplish this?
Do I need to monitor network transactions or is this data stored somewhere in Windows?
When a file is copied to the local system, Windows does not keep any record of where it was copied from. So unless the application that created it saved such information in the file, it is lost.
With file auditing, file and directory operations can be tracked, but I don't think that will include the source path for file copies (just who created the file and when).
Yes, it seems like you would either need to detect the file transfer by intercepting network traffic or, if you have the ability to alter the file in some way, use public-key cryptography to sign files with a machine-specific key before they are transferred.
Create a service on either the destination computer or on the file-hosting computers which adds records to an Alternate Data Stream attached to each file, much the way Windows handles the Zone.Identifier stream for files downloaded from the internet.
You can have a background process on machine A which "tags" each file as having been tagged by machine A on such-and-such a date and time. Then when machine B downloads the file, assuming both use NTFS filesystems, it can see the tag from A. Or, if you can't run a process on the server, you can apply NTFS streams on the "client" side via the packet-sniffing methods others have described. The bonus here is that future file copies will retain the data as long as they stay between NTFS systems.
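Here is a minimal sketch of that tagging idea, assuming NTFS on both ends; the stream name origin.machine and the tag contents are invented for the example.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    // "file.txt:origin.machine" addresses a named stream on file.txt.
    HANDLE h = CreateFileW(L"file.txt:origin.machine", GENERIC_WRITE, 0,
                           NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h == INVALID_HANDLE_VALUE) return 1;
    const char tag[] = "copied-from=MACHINE-A";
    DWORD n = 0;
    WriteFile(h, tag, sizeof(tag) - 1, &n, NULL);
    CloseHandle(h);

    // Reading it back works the same way; Explorer and most tools never
    // show the stream, just like Zone.Identifier.
    h = CreateFileW(L"file.txt:origin.machine", GENERIC_READ,
                    FILE_SHARE_READ, NULL, OPEN_EXISTING, 0, NULL);
    if (h == INVALID_HANDLE_VALUE) return 1;
    char buf[256] = {0};
    ReadFile(h, buf, sizeof(buf) - 1, &n, NULL);
    CloseHandle(h);
    printf("origin tag: %s\n", buf);
    return 0;
}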
Alternative: require that all file transfers go through a web portal (as opposed to network drag-and-drop), which gives you built-in logging, or use some other type of file-retrieval proxy. Do you have control over procedures such as this?
