File system storage/access to IE cookies - debugging

Is there any way to list/decode cookies stored by Internet Explorer without running the browser?
I am currently tinkering with the SWT Browser component, which provides get/set methods for cookies, but is there a way to see what is actually stored by IE itself?
UPD: I am using IE7. This location:
C:\Documents and Settings\UserName\Local Settings\Temporary Internet Files
does not contain cookies when I look at it with Far Manager.
Apparently Explorer mixes them in from some other location or database. Any ideas?
UPD2: Okay, here is my actual problem. A certain web app stops working if I restart IE7 and load the same page again (lots of scripting, embedded Flash, and video streaming). Deleting the cookies seems to help, if we do it from within IE itself. If we delete the cookies programmatically (by name), it does not help, and the page gets stuck on the restart/reload sequence.
If I manually wipe the cache folder, that does help, and subsequent page loads work fine. So the question is: do I actually wipe the cookies when wiping the Temporary Internet Files folder, or not? I just need to know the cause: is it the resource cache, or some cookie that we don't list in the cleanup sequence?

Doh... quite a simple question; why was no one willing to answer?
It looks like Explorer shows files from C:\Documents and Settings\USER\Cookies mixed into the Temporary Internet Files view. This was on XP; I am not sure how other versions behave.
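For reference, the WinInet URL cache API can list IE's cookie entries directly, with no browser running, which also answers whether a given cleanup actually removed them. A minimal C sketch (the fixed buffer size is a simplification; a robust version would retry on ERROR_INSUFFICIENT_BUFFER):

```c
/* Minimal sketch: enumerate IE's cookie entries through the WinInet URL
 * cache API, without launching the browser.
 * Compile with: cl list_cookies.c wininet.lib */
#include <windows.h>
#include <wininet.h>
#include <stdio.h>

int main(void)
{
    BYTE buf[8192];
    DWORD size = sizeof(buf);
    INTERNET_CACHE_ENTRY_INFOA *info = (INTERNET_CACHE_ENTRY_INFOA *)buf;

    /* The "cookie:" search pattern restricts enumeration to cookie entries. */
    HANDLE h = FindFirstUrlCacheEntryA("cookie:", info, &size);
    if (h == NULL) {
        printf("no cookie entries found (error %lu)\n", GetLastError());
        return 1;
    }
    do {
        /* lpszSourceUrlName looks like "Cookie:user@site/"; the local file
         * is the .txt that holds the actual name=value pairs. */
        printf("%s -> %s\n", info->lpszSourceUrlName,
               info->lpszLocalFileName ? info->lpszLocalFileName : "(no file)");
        size = sizeof(buf);
    } while (FindNextUrlCacheEntryA(h, info, &size));
    FindCloseUrlCache(h);
    return 0;
}
```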

Temporary Internet Files is a virtual folder. The view you see in Explorer is a combination of the actual files on disk, which live in obscure, unpredictably named subfolders, and the WinInet URL cache containers that store the metadata about those files (what URL they came from, expiry date, etc.). If you use file system APIs, you'll get what's actually there. If you use shell APIs, however, you can enumerate the entries the way Explorer presents them. Try starting with SHParseDisplayName() and go from there.
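To make the SHParseDisplayName() suggestion concrete, here is a rough C sketch that walks the Temporary Internet Files view through the shell namespace rather than the raw file system. The profile path is the XP-style one from the question; adjust it for your machine (compile with ole32.lib, shell32.lib, shlwapi.lib, and uuid.lib):

```c
#define COBJMACROS
#include <windows.h>
#include <shlobj.h>
#include <shlwapi.h>
#include <stdio.h>

int main(void)
{
    PIDLIST_ABSOLUTE pidl = NULL;
    IShellFolder *desktop = NULL, *tif = NULL;
    IEnumIDList *items = NULL;
    PITEMID_CHILD child = NULL;

    CoInitialize(NULL);
    SHGetDesktopFolder(&desktop);

    /* Parse the folder's display name into a shell item ID list. */
    if (SUCCEEDED(SHParseDisplayName(
            L"C:\\Documents and Settings\\UserName\\Local Settings\\Temporary Internet Files",
            NULL, &pidl, 0, NULL)) &&
        SUCCEEDED(IShellFolder_BindToObject(desktop, pidl, NULL,
                                            &IID_IShellFolder, (void **)&tif)) &&
        SUCCEEDED(IShellFolder_EnumObjects(tif, NULL,
                                           SHCONTF_NONFOLDERS | SHCONTF_FOLDERS,
                                           &items)))
    {
        /* This walks the merged view Explorer shows, i.e. cache entries
         * plus the cookies mixed in, not the raw directory contents. */
        while (IEnumIDList_Next(items, 1, &child, NULL) == S_OK) {
            STRRET sr;
            char name[MAX_PATH];
            if (SUCCEEDED(IShellFolder_GetDisplayNameOf(tif, child,
                                                        SHGDN_NORMAL, &sr)) &&
                SUCCEEDED(StrRetToBufA(&sr, child, name, MAX_PATH)))
                printf("%s\n", name);
            CoTaskMemFree(child);
        }
        IEnumIDList_Release(items);
    }

    if (tif) IShellFolder_Release(tif);
    if (pidl) CoTaskMemFree(pidl);
    if (desktop) IShellFolder_Release(desktop);
    CoUninitialize();
    return 0;
}
```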

Related

Can't Clear or Stop File Caching While Building a Site

I'm working on a site, using Chrome Incognito to review my changes, but files are still being cached and I can't review what I've done. Incognito isn't supposed to cache files. I just had a session with my hosting provider to be sure nothing is being cached there, and it's not. To be certain, I deleted the file I'm working on from the server, then logged into a remote computer where I tried accessing the file and got a 404 error, so I know it's not cached on the server. I then completely shut down Chrome on this computer, restarted it, opened an Incognito window, went to the URL, and the file still shows up in a state from before the most recent changes I'm trying to verify. I've repeatedly cleared the Chrome cache, but that doesn't affect Incognito. How do I clear whatever is caching these files on my system? Or maybe a better question: how do developers avoid cache issues while building?

Removal of JS:Miner-AI[PUP]

Whenever I try to access any site without using HTTPS, I get a popup from Avast telling me it has blocked the page.
It says 'JS:Miner-AI[PUP]'. This happens even when I try to access google.com (without HTTPS).
I have tried the following:
Scanning with Avast, Adaware, and Malwarebytes (which I used to remove previous versions of JS:Miner)
Does anyone have any suggestions on what I can do to remove it?
I'm not sure what is going on, but google.com is certainly not malicious, so my best guess is that your computer is infected with malware that tries to inject JavaScript/HTML code into every page you visit, probably by hooking WinInet functions.
Most malware injects code into Chrome, Firefox, and Internet Explorer. One way to check is to install another web browser and see whether everything is fine in that browser. Another way is to restart the computer in safe mode (disable all startup programs except the AV) and run your favorite browser again to see if the problem is solved. If the browser's exe file (chrome.exe or firefox.exe) or any of its DLLs are infected, you will have to reinstall the browser.
Hope it helps :-)
It turns out the router was infected. Once the router was replaced, the problem was solved.

Performing a move (mv) on a folder which has a process running from it

I have a web server (Play Framework/Netty based) running from a particular app folder. As a quick test I moved the folder while the server was running, and everything worked fine: the server handled all web requests as normal. But I'm hesitant to call it safe.
Could someone explain why it might or might not be safe to do this?
It is not safe to do this. Processes that are already running will have their files open and can keep accessing them on disk, but if any process is HUP'd or restarted, it won't be able to find the files at the folder's old path.
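A minimal POSIX sketch of both effects: a descriptor opened before the move keeps working, but opening by the old path fails afterwards. The directory and file names here are just for the demo.

```c
#include <stdio.h>
#include <fcntl.h>
#include <unistd.h>
#include <sys/stat.h>

int main(void)
{
    char buf[64];
    ssize_t n;

    /* Set up an "app folder" with one file in it. */
    mkdir("appdir", 0755);
    int fd = open("appdir/config.txt", O_CREAT | O_RDWR | O_TRUNC, 0644);
    write(fd, "hello\n", 6);

    /* Move the folder while the fd is still open (this is the mv). */
    rename("appdir", "appdir.moved");

    /* The already-open descriptor still reads fine... */
    lseek(fd, 0, SEEK_SET);
    n = read(fd, buf, sizeof(buf) - 1);
    buf[n > 0 ? n : 0] = '\0';
    printf("read via old fd: %s", buf);

    /* ...but a fresh open by the old path fails, which is what bites a
     * process that is restarted or re-opens its files on SIGHUP. */
    if (open("appdir/config.txt", O_RDONLY) < 0)
        perror("open by old path");

    close(fd);
    return 0;
}
```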

File sharing over the internet - WebDAV / SMB / FTP

We are developing a web-based application which provides a repository of users' case files. We would like users to be able to access these from their web browser with full read/write capability.
For an earlier generation of our system, hosted on a local Linux server with Windows clients, we were able to share out a folder and access it with \\server\share_name\file.doc style links. If links of this type were included in web pages and clicked in Internet Explorer, the file opened in MS Word and could be saved directly back into the shared folder. These links, however, only worked in IE, not in Firefox or Chrome.
Moving now to an internet-based solution for the next generation of the system, we require similar functionality.
We are toying with the idea of having a WebDAV (or FTP/SFTP) share and mapping a local drive to it on each client machine. This, though, will probably not work well in Firefox or Chrome with \\server\share_name... style links. In brief testing, file:// links did not provide write capability once the file was opened.
As a last resort we could use manual file upload dialogs, but this is not ideal and would entail additional end-user training.
Does anyone have similar experience in this area, and any possible solutions or best practices?
When you map a remote resource as a local drive, it becomes a local drive as far as the browser is concerned, and browsers have only limited access to the local file system. When you give the browser a link, its default behavior is to download the resource behind the link and let a local application process it. The browser simply doesn't know how to open the remote resource locally in any other way.
The solution would be to let the browser download something (some kind of link file) and have a local helper module (an external application or browser plugin) open this link file and then open the location specified in it locally. As a client-side helper module, it can interact with the client system and knows how to open the provided link. Given that the virtual drive letter can differ from system to system (if you mount the disk to a drive letter), the helper module would need to resolve the link to point to the correct local drive. If you create a hidden virtual drive (our virtual storage products let you do this), then a link would look like \\SomeFancyNameUniqueToYourApp\Path\To\File.ext and no resolving would be necessary. Most applications handle this type of path fine.
I don't know for sure, but it's possible that browsers will open Windows .lnk files without needing a helper module, in which case, with a hidden virtual drive, you could generate an LNK file on the server and have the browser open it locally. But this is just a guess. My bet is that you will need a helper module anyway.
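To make the helper-module idea concrete, here is a toy C sketch of the client-side piece: it reads a downloaded link file containing a share-relative path, rewrites it onto whatever drive letter the share happens to be mounted as on this machine, and hands the result to the shell. The ".applink" format and the mount-point lookup are entirely hypothetical, just to show the shape of the design:

```c
/* Toy sketch of a client-side helper: resolve a downloaded link file to a
 * local path on the mapped drive and open it with the associated app.
 * The ".applink" file format and the drive letter are made up for the demo. */
#include <windows.h>
#include <stdio.h>
#include <string.h>

/* Hypothetical: find where the WebDAV share is mounted on this machine.
 * A real helper might read this from its own config or scan mapped drives. */
static const char *mounted_drive(void) { return "W:"; }

int main(int argc, char **argv)
{
    char rel[MAX_PATH], full[MAX_PATH];
    FILE *f;

    if (argc < 2) { fprintf(stderr, "usage: helper file.applink\n"); return 1; }
    if ((f = fopen(argv[1], "r")) == NULL) { perror("fopen"); return 1; }

    /* The link file holds one line: a path relative to the share root,
     * e.g. "\Cases\Smith\notes.doc". */
    if (!fgets(rel, sizeof(rel), f)) { fclose(f); return 1; }
    fclose(f);
    rel[strcspn(rel, "\r\n")] = '\0';

    /* Rewrite onto whatever drive letter this client mapped the share to. */
    snprintf(full, sizeof(full), "%s%s", mounted_drive(), rel);

    /* Let the shell open it with the associated application (Word, etc.);
     * the app then reads and writes through the mapped drive. */
    ShellExecuteA(NULL, "open", full, NULL, NULL, SW_SHOWNORMAL);
    return 0;
}
```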
ftp://username:password@hostname/ style links should work, and MS apps are getting better at handling them. Still not 100%, though.
Try SMEStorage.com. It lets you map local WebDAV and FTP servers and access files using a Cloud Drive on Linux, Mac, or Windows, and also from mobile devices (iOS, Android, BlackBerry, and Windows Phone 7). You can get unique file links for each file, as well as secure file sharing in which the links expire.

Does Windows 7 restrict folder access as Vista does?

I noticed that in my application, most compatibility problems were caused by 'access denied' for some folders, such as:
Application Data [C:\ProgramData]
Desktop [C:\Users\Public\Desktop]
Documents [C:\Users\Public\Documents]
Favorites [C:\Users\Public\Favorites]
Start Menu [C:\ProgramData\Microsoft\Windows\Start Menu]
Templates [C:\ProgramData\Microsoft\Windows\Templates]
Does Windows 7 have the same problem as Vista?
With help from the members of Stack Overflow, I know that on Vista I can use CSIDL_APPDATA to enable file access without UAC problems or 'access denied' errors.
Is this also valid for Windows 7?
It's not a "problem"; it's a feature. It's called User Account Control (UAC), and it's one of the ways that system security was tightened in Windows Vista. Windows 7 indeed retains a similar security model.
There's absolutely no reason that your application should need to mess with system folders in the first place. As you've already learned, Windows provides a number of locations for applications to store data, both temporarily and permanently. Microsoft has been recommending for a long time that you take advantage of these folders: they were the preferred location for storing data even under previous versions of Windows. The fact that you ignored this advice, yet your application continued to work, was actually the bug. The fact that later versions of Windows finally closed that security vulnerability, thus breaking your application, should be neither unexpected nor unappreciated.
You can find more information about where to store your data on this page. Also see this blog article, which attempts to summarize the array of technical documentation into a handy table. And as always, Raymond Chen provides a simple, yet instructive, overview of the differences between the locations:
The most important difference between My Documents and Application Data is that My Documents is where users store their files, whereas Application Data is where programs store their files.
In other words, if you put something in CSIDL_MYDOCUMENTS (My Documents), you should expect the user to be renaming it, moving it, deleting it, emailing it to their friends, all the sorts of things users do with their files. Therefore, files that go there should be things that users will recognize as "their stuff". Documents they've created, music they've downloaded, that sort of thing.
On the other hand, if you put something in CSIDL_APPDATA (Application Data), the user is less likely to be messing with it. This is where you put your program's supporting data that isn't really something you want the user messing with, but which should still be associated with the user. High score tables, program settings, customizations, spell check exceptions...
There is another directory called CSIDL_LOCAL_APPDATA (Local Settings\Application Data) which acts like CSIDL_APPDATA, except that it does not get copied if the user profile roams. (The "Local Settings" branch is not copied as part of the roaming user profile.) Think of it as a per-user-per-machine storage location. Caches and similar non-essential data should be kept here, especially if they are large. Other examples of non-roaming per-user data are your %TEMP% and Temporary Internet Files directories.
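To answer the question concretely: yes, resolving these locations through the CSIDL constants keeps working on Windows 7, and doing it at run time is the whole point. A minimal C sketch (on Vista and later there is also the newer SHGetKnownFolderPath, but SHGetFolderPath still works):

```c
/* Minimal sketch: resolve the per-user data folders by CSIDL at run time
 * instead of hard-coding paths. Works the same way on Vista and Windows 7.
 * Compile with: cl csidl_demo.c shell32.lib */
#include <windows.h>
#include <shlobj.h>
#include <stdio.h>

int main(void)
{
    char roaming[MAX_PATH], local[MAX_PATH];

    /* Roaming Application Data: per-user settings that follow the profile. */
    if (SUCCEEDED(SHGetFolderPathA(NULL, CSIDL_APPDATA, NULL,
                                   SHGFP_TYPE_CURRENT, roaming)))
        printf("CSIDL_APPDATA:       %s\n", roaming);

    /* Local Application Data: per-user-per-machine data such as caches,
     * which should not roam with the profile. */
    if (SUCCEEDED(SHGetFolderPathA(NULL, CSIDL_LOCAL_APPDATA, NULL,
                                   SHGFP_TYPE_CURRENT, local)))
        printf("CSIDL_LOCAL_APPDATA: %s\n", local);

    return 0;
}
```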
