I am working on a cloud file sync application, and an issue that comes up for users is the local time on the user's computer being wrong. I use timestamps to track changes on each file and to check which copy is newer. But when someone's computer clock is set to, say, 2005, or even just an hour behind or ahead of the correct time, and they make changes while disconnected from the internet, then when they sync later the timestamp on each file is wrong and overwrites the very data they are trying to save.
Getting a timestamp from a time server like nist.gov is only possible while the user is connected to the internet, so if changes are made offline there is no way to get the right time. I was thinking about adding a counter to track changes to files in addition to the timestamp, but that doesn't work because multiple changes throw the counters off.
Related
I have designed and built a database in MS Access 2013. There is a back end file on a shared drive and multiple users use a front end located on their individual PCs.
Occasionally, seemingly randomly, the back end will lock out in read-only mode, meaning that none of the front ends can update the back end, and when I open the back end I get the yellow bar saying:
"READ-ONLY This database has been opened read-only. You can only change data in linked tables. To make design changes, save a copy of the database".
It won't let you save a copy of the database by the way. Other databases in the same location open correctly.
I have some spreadsheets that access the database read-only; occasionally these change to Share Deny Write, and when they do you get the 1 KB lock file for the database, which lets me see who has it locked out. In my case, this file doesn't appear until I open the database read-only, and then it shows as locked out to me.
Eventually, it tends to just fix itself, but it can take hours and I have no idea what happened to fix it!
I've searched Google and tried everything that's been suggested, and of course restarted my PC a couple of times, none of which fixed it. However, the last time this happened, it worked again after restarting my PC, although that could have been a coincidence.
Any advice about how I can see who's locked it, or how I can get it to unlock, would be greatly appreciated so we don't have hours of downtime!
I have a script which monitors a folder's subfolders, checks their creation dates, and if the DateDiff between the creation time and Now is more than 730 days (2 years) it deletes those folders. The problem is that if the current time on the PC is set to 5/16/2015, the script will delete folders it shouldn't, and that's not cool. I thought about getting the time from some internet service, but there is no guarantee that the PC will be connected to the internet. So I tried to get the BIOS time (I assume users can't set that to a wrong value) and compare the folders' creation times with it, but unfortunately I didn't find a way to do that. Maybe you have an idea how to implement this?
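For reference, here is a minimal C# sketch of the check described above (the original script's language isn't shown; the root path and the two-year threshold are placeholders):

```csharp
using System;
using System.IO;

class FolderCleanup
{
    static void Main()
    {
        string root = @"C:\Data\Archive";               // placeholder path

        foreach (string dir in Directory.GetDirectories(root))
        {
            // Equivalent of DateDiff(created, Now): this relies entirely on the PC
            // clock, which is exactly the weakness described in the question.
            TimeSpan age = DateTime.Now - Directory.GetCreationTime(dir);

            if (age.TotalDays > 730)                    // older than two years
                Directory.Delete(dir, recursive: true);
        }
    }
}
```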
Having an incorrect date and time on a production-level machine is not an option nowadays. It will break a lot of things, like HTTPS sites (they will fail to validate because certificates appear expired or not yet valid), and so on.
The system clock (BIOS time) is changed by the operating system whenever the time is changed in the OS, so you only have one clock in the computer. In other words, the OS does not keep a separate clock of its own to track time with.
Nowadays most computers use the Network Time Protocol to keep their clocks in sync, so you should specify a correctly set clock as a prerequisite for your software. (Or you can specify that the computer must be connected to an external atomic clock, but that's going to be expensive.)
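If you do want to cross-check the local clock whenever a connection happens to be available, a minimal SNTP query looks roughly like the sketch below. This isn't part of the original answer; pool.ntp.org and the 48-byte packet layout come from the standard SNTP protocol (RFC 4330):

```csharp
using System;
using System.Net.Sockets;

static class NetworkClock
{
    // Returns the current UTC time reported by an NTP server.
    public static DateTime GetNetworkTimeUtc(string server = "pool.ntp.org")
    {
        byte[] packet = new byte[48];
        packet[0] = 0x1B;                               // LI = 0, VN = 3, Mode = 3 (client)

        using (var socket = new Socket(AddressFamily.InterNetwork,
                                       SocketType.Dgram, ProtocolType.Udp))
        {
            socket.ReceiveTimeout = 3000;
            socket.Connect(server, 123);                // NTP uses UDP port 123
            socket.Send(packet);
            socket.Receive(packet);
        }

        // Transmit timestamp starts at byte 40: seconds since 1900-01-01, big-endian.
        ulong seconds  = SwapEndianness(BitConverter.ToUInt32(packet, 40));
        ulong fraction = SwapEndianness(BitConverter.ToUInt32(packet, 44));
        ulong ms = seconds * 1000 + (fraction * 1000) / 0x100000000UL;

        return new DateTime(1900, 1, 1, 0, 0, 0, DateTimeKind.Utc).AddMilliseconds(ms);
    }

    static uint SwapEndianness(uint x)
    {
        return ((x & 0x000000FF) << 24) | ((x & 0x0000FF00) << 8) |
               ((x & 0x00FF0000) >> 8)  | ((x & 0xFF000000) >> 24);
    }
}
```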
I have a file which can be edited from several different places over an intranet, but needs to be kept up to date on one specific machine.
The way things work is like this.
We have a local intranet where changes to the file are made and viewed on the intranet copy of the live website. If everything looks good, the file may then be uploaded to the remote server, overwriting the previous version.
It isn't ideal, I know, but that's the way it has to be.
What I would like to do is keep track of any changes to that remote version of the file so I can then reflect that change on my local machine.
The idea I've had so far is to use the Task Scheduler on the remote server to send an email to me whenever a change takes place. Changes aren't foreseen to happen often, but when they do, I need to know about it.
My problem is, I'm not sure what events to look for in Task Scheduler. As it stands, the file could be changed by someone FTPing into the server and changing it, or by someone remoting in and uploading it that way. As I said, not ideal, but it's what I have to work with.
To keep things specific, I'm looking to use Task Scheduler here, working off a trigger. From there, I'm a little lost.
As it turns out, I found a better / more useful / "good enough" solution: creating a FileSystemWatcher and starting it when the application starts.
That's where the "good enough" comes in, since it won't catch any changes if the application is stopped for some reason. However, since I'm the only one likely to stop the application, things will be a bit more serious than a broken FileSystemWatcher if it comes to that.
Specifically, I created a class called "Utilities" and created the FileSystemWatcher in there.
Then in the Global.asax.cs Application_Start() method, I initialised the FileSystemWatcher and set it going.
If a change takes place, the event handler of the watcher is set to fire off an email to me, with the new file attached.
Simples.
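For anyone who wants to see the shape of it, here is a minimal sketch of that setup. Only the Utilities class name and the Global.asax.cs hookup come from the description above; the method name, watched path, addresses and SMTP host are placeholders:

```csharp
using System.IO;
using System.Net.Mail;

public static class Utilities
{
    private static FileSystemWatcher watcher;

    public static void StartWatcher()
    {
        watcher = new FileSystemWatcher(@"C:\inetpub\wwwroot\site", "live-file.xml")  // placeholder path/file
        {
            NotifyFilter = NotifyFilters.LastWrite | NotifyFilters.Size
        };
        watcher.Changed += OnFileChanged;
        watcher.EnableRaisingEvents = true;
    }

    private static void OnFileChanged(object sender, FileSystemEventArgs e)
    {
        // Send a notification email with the changed file attached.
        // (In practice the file may still be locked the instant Changed fires,
        // so a short delay or retry can help.)
        using (var message = new MailMessage("noreply@example.com", "me@example.com"))
        {
            message.Subject = "File changed: " + e.Name;
            message.Body = "Change detected at " + e.FullPath;
            message.Attachments.Add(new Attachment(e.FullPath));

            using (var smtp = new SmtpClient("mail.example.com"))   // placeholder SMTP host
            {
                smtp.Send(message);
            }
        }
    }
}
```

And the hookup in Global.asax.cs:

```csharp
protected void Application_Start()
{
    Utilities.StartWatcher();
}
```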
We're building a Windows-based application that traverses a directory structure recursively, looking for files that meet certain criteria and then doing some processing on them. In order to decide whether or not to process a particular file, we have to open that file and read some of its contents.
This approach seems great in principle, but some customers testing an early version of the application have reported that it's changing the last-accessed time of large numbers of their files (not surprisingly, as it is in fact accessing the files). This is a problem for these customers because they have archive policies based on the last-accessed times of files (e.g. they archive files that have not been accessed in the past 12 months). Because our application is scheduled to run more frequently than the archive "window", we're effectively preventing any of these files from ever being archived.
We tried adding some code to save each file's last-accessed time before reading it, then write it back afterwards (hideous, I know) but that caused problems for another customer who was doing incremental backups based on a file system transaction log. Our explicit setting of the last-accessed time on files was causing those files to be included in every incremental backup, even though they hadn't actually changed.
So here's the question: is there any way whatsoever in a Windows environment that we can read a file without the last-accessed time being updated?
Thanks in advance!
EDIT: Despite the "ntfs" tag, we actually can't rely on the filesystem being NTFS. Many of our customers run our application over a network, so it could be just about anything on the other end.
The documentation indicates you can do this, though I've never tried it myself.
To preserve the existing last access time for a file even after accessing a file, call SetFileTime immediately after opening the file handle with this parameter's FILETIME structure members initialized to 0xFFFFFFFF.
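In case it helps, here is a rough C# P/Invoke sketch of what that would look like; as I said, I haven't tried it, and the method name and scaffolding are mine rather than anything from the documentation:

```csharp
using System;
using System.IO;
using System.Runtime.InteropServices;
using Microsoft.Win32.SafeHandles;
using FILETIME = System.Runtime.InteropServices.ComTypes.FILETIME;

static class LastAccessPreservingReader
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetFileTime(SafeFileHandle hFile,
        IntPtr lpCreationTime,          // NULL: leave unchanged
        ref FILETIME lpLastAccessTime,
        IntPtr lpLastWriteTime);        // NULL: leave unchanged

    public static byte[] ReadAllBytesPreservingAccessTime(string path)
    {
        using (var stream = new FileStream(path, FileMode.Open, FileAccess.Read, FileShare.Read))
        {
            // All members set to 0xFFFFFFFF tells the file system not to update
            // the last access time for operations performed on this handle.
            var noUpdate = new FILETIME { dwLowDateTime = -1, dwHighDateTime = -1 };
            SetFileTime(stream.SafeFileHandle, IntPtr.Zero, ref noUpdate, IntPtr.Zero);

            var buffer = new byte[(int)stream.Length];
            int offset = 0;
            while (offset < buffer.Length)
            {
                int read = stream.Read(buffer, offset, buffer.Length - offset);
                if (read == 0) break;
                offset += read;
            }
            return buffer;
        }
    }
}
```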
From Vista onwards, NTFS does not update the last access time by default. To enable the same behaviour on older versions, see http://technet.microsoft.com/en-us/library/cc959914.aspx
Starting an NTFS transaction and rolling it back is a very bad idea, and the performance will be terrible.
You can also disable last-access updates system-wide with:
FSUTIL behavior set disablelastaccess 1
I don't know what your client's minimum requirements are, but have you tried NTFS transactions? On the desktop, the first OS to support them was Vista, and on the server it was Windows Server 2008. It may be worth a look.
Start an NTFS transaction, read your file, roll back the transaction. Simple! :-) I actually don't know if it will roll back the last access date, though. You will have to test that for yourself.
Here is a link to an MSDN Magazine article on NTFS transactions, which includes other links: http://msdn.microsoft.com/en-us/magazine/cc163388.aspx
Hope it helps.
I saw this question in a forum about how an application can be developed that keeps track of the installation date and shows a "trial period expired" message after 30 days of usage. The only constraint is not to use external storage of any kind.
Question: How to achieve this?
Thanks
Bala
--Edit
I think it's easy to figure out where the question mark should go. Anyway, I will state the question clearly: "external storage" means don't use any kind of storage, like a file, the registry, the network or anything else. You only have your program.
Use the file-modified date of the file containing the program as the installation date.
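A minimal C# sketch of that idea (the class and method names are just for illustration, and it assumes the executable's last-write time survives installation unchanged, which is the premise of this approach):

```csharp
using System;
using System.IO;
using System.Reflection;

static class TrialCheck
{
    public static bool IsTrialExpired()
    {
        // Treat the last-write time of the program file itself as the install date.
        string exePath = Assembly.GetEntryAssembly().Location;
        DateTime installedUtc = File.GetLastWriteTimeUtc(exePath);

        return (DateTime.UtcNow - installedUtc).TotalDays > 30;
    }
}
```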
I like Doug Currie's idea of using the file-modification date. But if the application is downloaded from the web, you could relink it every night at midnight with new initialized data containing the new expiration date. Then any binary downloaded that day expires on the date given.
If you like, sign the date with a private key so it can't be hacked. Include the public key in the app and verify the signed date. If it isn't correctly signed, hasta la vista, baby.
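A rough sketch of the verification side, assuming the expiration date string and its signature were baked into the binary at link time; the class name, method name and date format are illustrative, not prescriptive:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

static class SignedExpiration
{
    // expirationDate (e.g. "2015-06-15") and signature are assumed to be linked into
    // the binary; publicKeyXml is the public half of the key that signed them.
    public static bool IsTrialStillValid(string expirationDate, byte[] signature,
                                         string publicKeyXml)
    {
        // Provider type 24 (PROV_RSA_AES) is used so SHA-256 verification is supported.
        using (var rsa = new RSACryptoServiceProvider(new CspParameters(24)))
        {
            rsa.PersistKeyInCsp = false;
            rsa.FromXmlString(publicKeyXml);

            byte[] data = Encoding.UTF8.GetBytes(expirationDate);
            bool signatureOk = rsa.VerifyData(data, SHA256.Create(), signature);

            // Hasta la vista if the signature is bad or the date has passed.
            return signatureOk && DateTime.UtcNow <= DateTime.Parse(expirationDate);
        }
    }
}
```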
I don't know if this is possible, as most work I've done has been with embedded systems in which I don't even need to touch the operating system. But would the following be possible?
When compiling your program, leave some extra space at the end (say, 8 bytes), all set to 0. When your application is run, it fetches those bytes; if they're all 0, it replaces them with the current time (that's the part I'm not sure about: does the OS let you modify your own executable? If not, there might be some workarounds using multiple processes). Otherwise, if the time difference is greater than 30 days, it notifies the user that the trial period has ended.
Of course, that method would be vulnerable to resetting the system clock.
If you can't use any external storage at all (not even config files or anything like that), you would need to code it into the app itself, so the app's main method (or some method) checks whether the current date is past some expiration date. Part of your installer could actually compile that code on the fly, so the expiration would be set from the installation date. This could be easily defeated by reinstalling the app, but then again, it's not realistic to have no external storage either.
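For illustration, the check the installer would generate might look something like this (the date literal is the placeholder the installer would fill in at install time):

```csharp
using System;

static class GeneratedTrialCheck
{
    // The installer writes the real expiration date here when it compiles this file.
    static readonly DateTime Expiration = new DateTime(2015, 6, 15);

    public static bool TrialExpired()
    {
        return DateTime.Now > Expiration;
    }
}
```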
I think the only way to do this generally would be to have your application spawn something off in a separate process that would continue to run and keep track of the date/time even if the main application were closed. When it was restarted, it would then connect to the running process to see if the trial period had expired.
Of course, this would only work if the computer was never restarted and the user never hunted down your spawned process and killed it, which is pretty unlikely. If your application does not do anything IO-related (file system, registry, something on the network, etc.), then a simple restart will wipe away anything that you've done.
So, to summarize: it's not really possible.