Setting up a public (or private) symbol server over HTTP

Every piece of documentation I've found (references 1 through 5) talks about setting up a symbol server by using a shared UNC path and then making the correct settings available to the local debugger (whether via _NT_SYMBOL_PATH or the Visual Studio IDE debugging settings).
Microsoft provides a symbol server (reference 6) available via http for their public symbol stores.
I want to create, for my own code, a symbol server accessible over http transport, instead of over UNC file sharing. The Mozilla folks appear to have done so (reference 7), but it is no longer functional.
Are there better references available for performing this task than I have found so far?
References
1. https://msdn.microsoft.com/en-us/library/b8ttk8zy(v=vs.80).aspx
2. http://msdn.microsoft.com/en-us/library/ms680693(v=vs.85).aspx
3. http://stackhash.com/blog/post/Setting-up-a-Symbol-Server.aspx
4. http://entland.homelinux.com/blog/2006/07/06/…
5. http://msdn.microsoft.com/en-us/windows/hardware/gg462988
6. http://support.microsoft.com/kb/311503
7. http://developer.mozilla.org/en/Using_the_Mozilla_symbol_server

I believe the answer is very simple: "Just share the directory via some sort of HTTP path." According to Chad Austin's entry on "Creating Your Very Own Symbol Server", this will just work.
In other words, the directory which symstore.exe uses to store the symbols, when served up as http://symbols.example.com/public_symbols/, will be usable as the symbol server target for the Windows Debugging Tools.

Be careful when having multiple users run symstore.exe directly against the same symbol store. Microsoft's white papers on this subject make it sound as though you simply create a share and have everyone update through the SYMSTORE.EXE program delivered as part of Debugging Tools for Windows, running it as part of each build.
That works great for a single user, or when all updates are funneled through one person who maintains the symbol store for a team.
Unfortunately, the fine print at the bottom of some of the white papers says that only one instance of symstore.exe can update the shared symbol store at a time without breaking its contents.
(Example: At http://msdn.microsoft.com/en-us/library/ms681417(VS.85).aspx, Microsoft says: "Note SymStore does not support simultaneous transactions from multiple users. It is recommended that one user be designated "administrator" of the symbol store and be responsible for all add and del transactions.")
So there is no inherent mechanism to serialize updates to the symbol store. It appears that multiple, simultaneous attempts to update the symbol store can break the symbol store and/or its index.
We cannot make builds for our entire multi-thousand-person, international corporation, spread across all time zones, dependent upon coordination through one person in one location.
Based on those white papers, I raised this issue with Microsoft in March 2009, and they confirmed it was a possible problem. After that discussion, we chose to implement a symbol update service which serializes the updates via direct calls to the SymSrvStoreFile() API from the Debugging Tools for Windows SDK, so there is never a possibility of two simultaneous updates against the same area of the store. Users have a build action that queues their symbols through the service instead of updating the symbol store directly; the service then serializes the updates to make sure truly concurrent update attempts never happen.
The limited documentation available on SymSrvStoreFile was not very clear at the time, though I did get it working; hopefully it has been improved since then. If not, the most crucial issue was that the input path must be specified in a format similar to _NT_SYMBOL_PATH. So instead of, for example, using "C:\Data\MyProject\bin" as the input path, you would instead specify "srv*C:\Data\MyProject\bin".
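That path-shape gotcha can be captured in a tiny helper. This is purely an illustrative sketch (the function name is made up, not part of any SDK):

```python
def symsrv_input_path(directory):
    """SymSrvStoreFile expects its path argument in _NT_SYMBOL_PATH
    syntax, so a plain directory must be wrapped in the srv* prefix."""
    if directory.startswith("srv*"):
        return directory  # already in symbol-path syntax
    return "srv*" + directory
```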
Our service now also logs the updates through a database. The database both serves as a backup to the symbol store (in case it is ever corrupted and must be rebuilt) and provides a reporting point so that managers and support staff know who is actually storing their symbols and who is not. We generate a weekly "symbol check-in" report which is automatically emailed to stakeholders.
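The core of the serialization idea can be sketched in a few lines. This is a minimal illustration, not the service described above: `store_transaction` is a hypothetical callable standing in for whatever actually writes to the store (a SymSrvStoreFile call, a symstore.exe invocation, etc.):

```python
import queue
import threading

class SymbolStoreService:
    """Serializes symbol-store updates so only one transaction runs
    at a time. Build actions call submit() and return immediately;
    a single worker thread drains the queue, so two updates can
    never touch the store concurrently."""

    def __init__(self, store_transaction):
        self._store = store_transaction
        self._queue = queue.Queue()
        self._worker = threading.Thread(target=self._drain, daemon=True)
        self._worker.start()

    def submit(self, symbol_path):
        """Queue one update; called from build actions."""
        self._queue.put(symbol_path)

    def _drain(self):
        # Single consumer: updates are applied strictly one at a time,
        # in submission order.
        while True:
            path = self._queue.get()
            if path is None:  # shutdown sentinel
                break
            self._store(path)
            self._queue.task_done()

    def shutdown(self):
        self._queue.put(None)
        self._worker.join()
```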

A symbol server served via HTTP has the same structure as a symbol server served via a UNC file path, so the simplest thing to do is use symstore.exe to store the files in a folder somewhere and then use a simple HTTP server which exposes that folder (even running python -m http.server, or SimpleHTTPServer on Python 2, in the symbols directory would work).
A small gotcha is that when a symbol file does not exist, the HTTP server must return a 404 status code (tested under Visual Studio 2013, at least). I ran into an issue where an HTTP server returning 403 for missing files caused Visual Studio to stop making requests after the first failed one.
symstore.exe also creates a number of auxiliary files and folders (the 000Admin/ folder, and the refs.ptr and files.ptr files). None of these are needed for the symbol server to work.
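A minimal sketch of such a server in Python's standard library follows. Note that SimpleHTTPRequestHandler already returns 404 for missing files; the subclass below only makes that requirement explicit:

```python
import functools
import http.server
import os

class SymbolHandler(http.server.SimpleHTTPRequestHandler):
    """Serves a symstore-managed directory over HTTP.

    The behavior the debugger cares about is that a missing file
    yields 404, not 403 (which can make Visual Studio stop asking)."""

    def send_head(self):
        if not os.path.exists(self.translate_path(self.path)):
            self.send_error(404, "Symbol not found")
            return None
        return super().send_head()

def serve(symbol_dir, port=8000):
    """Return an HTTPServer exposing symbol_dir; call serve_forever() on it."""
    handler = functools.partial(SymbolHandler, directory=symbol_dir)
    return http.server.HTTPServer(("", port), handler)
```

Pointing the debugger at something like srv*C:\localcache*http://yourhost:8000/ should then behave the same as a UNC store.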
If you want to create a symbol store without using symstore.exe, you can upload the files with this structure:
BinaryName.pdb/$BUILD_ID/BinaryName.pdb
BinaryName.exe/$LINK_ID/BinaryName.exe
Where BUILD_ID is a GUID embedded in both the PDB file and the executable, and LINK_ID is a combination of the build timestamp and file size taken from the executable. Both can be obtained by reading the output of the dump_syms.exe tool from the Breakpad library. See http://www.chromium.org/developers/decoding-crash-dumps
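Assuming the layout described above, building the relative store paths is just string formatting. Exact casing conventions vary between tools (as the Mozilla answer below notes), so treat this as a sketch:

```python
def pdb_store_path(filename, guid_hex, age):
    """Relative store path for a PDB: <name>/<GUID><age>/<name>.
    guid_hex is the 32-hex-digit signature GUID from the PDB;
    age is the small integer appended after it."""
    return f"{filename}/{guid_hex.upper()}{age:X}/{filename}"

def pe_store_path(filename, timestamp, size_of_image):
    """Relative store path for an EXE/DLL: <name>/<timestamp><size>/<name>.
    timestamp is the COFF header TimeDateStamp (8 hex digits),
    size_of_image the optional header SizeOfImage (unpadded hex)."""
    return f"{filename}/{timestamp:08X}{size_of_image:X}/{filename}"
```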

Our (Mozilla's) symbol server works fine, AFAICT. We're not doing anything particularly complicated: we just put the PDB files into the right directory structure (we have a script for that, but you could use symstore.exe) and serve it up via Apache. I think the only special thing we have is some rewrite rules to allow accessing the files in a case-insensitive manner, because Microsoft's tools are really inconsistent about filename/GUID case.

There is also Electron's variant of this, which sits in front of S3.
It has the additional helpers of converting 403s to 404s (so as not to upset the debugger) and converting all paths to lowercase, so that incoming requests are case-insensitive.
https://github.com/electron/symbol-server
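Those two fixups can be illustrated with a small stand-in function. This is not Electron's actual code, just a sketch of the behavior it layers over S3:

```python
def normalize_symbol_request(path, upstream_status):
    """Apply the two fixups an S3-fronting symbol proxy needs:
    lowercase the incoming path so lookups are case-insensitive,
    and report the backend's 403 for missing keys as 404 so the
    debugger keeps issuing further requests."""
    status = 404 if upstream_status == 403 else upstream_status
    return path.lower(), status
```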

Related

Can I prevent DAO from trying to open nonexistent "system.mdb"?

Summary: DAO is trying to automatically open a nonexistent file - can this be prevented?
I maintain a VB6 program which uses the DAO 3.6 library for accessing MDB/Jet databases. (Yes, pretty old stuff.)
We noticed that during application startup it has been looking for a file named system.mdb which does not exist. This is not anything our own code initiates.
Using Process Monitor I was able to determine that the program looks for this file around the time that the DAO library is loaded or initialized. After a bunch of DAO/Jet DLLs are loaded, one of them looks in the registry here:
HKLM\SOFTWARE\WOW6432Node\Microsoft\Jet\4.0\Engines\SystemDB
the value of which is indeed system.mdb. I can see the same value in RegEdit.
Having located the desired filename in the registry, it tries to open that file in the working folder of the program, i.e., I can see it try to find a file called:
C:\Program Files (x86)\<appname>\system.mdb
which of course does not actually exist.
I looked on Microsoft's website and also through older MSFT KB articles (1, 2), as well as here on SO. It seems that system.mdb is (or used to be) tied into how DHCP and WINS networking was configured in Windows NT. (e.g., refs 1, 2, 3 ... and many other KB articles indicate this).
This is the kind of 'red flag' that sometimes it is good to be proactive about before it leads to some (often rare and unreproducible) problem for an end user. So if there is any way to properly disable or configure this behavior I'd like to find out about it.

How to detect OneDrive online-only files

Starting from Windows 10 Fall Creators Update (version 16299.15) and OneDrive build 17.3.7064.1005 the On-Demand Files are available for users (https://support.office.com/en-us/article/learn-about-onedrive-files-on-demand-0e6860d3-d9f3-4971-b321-7092438fb38e)
Any OneDrive file can now have one of the following types: online-only, locally available, or always available.
Using WinAPI, how can I know that a file (e.g. "C:\Users\Username\OneDrive\Getting started with OneDrive.pdf") is an online-only file?
After years, I'm still using the FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS attribute described here to determine whether a file or directory is completely present locally.
Microsoft's docs say the following about FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS:
When this attribute is set, it means that the file or directory is not fully present locally. For a file that means that not all of its data is on local storage (e.g. it may be sparse with some data still in remote storage). For a directory it means that some of the directory contents are being virtualized from another location. Reading the file / enumerating the directory will be more expensive than normal, e.g. it will cause at least some of the file/directory content to be fetched from a remote store. Only kernel-mode callers can set this bit.
There are some advantages to FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS:
It can be used for both files and directories.
It can be set in kernel mode only, so there is no chance of anyone setting the attribute arbitrarily.
And as described in this answer, there are still some interesting undocumented attributes which can provide additional information about cloud files.
Note: I didn't accept Jonathan Potter's answer because I mentioned FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS attribute in comments and started using it a year earlier than he updated his answer.
To check for "online only" all you need is to call GetFileAttributes() and see if the FILE_ATTRIBUTE_OFFLINE attribute is set.
In fact this isn't new for OneDrive, that attribute has existed for a long time.
There are other OneDrive attributes available via the shell (although the property you need is PKEY_StorageProviderState rather than PKEY_FilePlaceholderStatus) but "online only" is easy to check for.
Edit: Another filesystem attribute, FILE_ATTRIBUTE_PINNED is new for Windows 10, and is used by OneDrive to indicate a file that's "always available".
Edit: As of 2019 it appears that OneDrive now uses FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS rather than FILE_ATTRIBUTE_OFFLINE, as suggested in the other answer.
Edit: PKEY_StorageProviderState was broken in Windows 10 1903, and still not fixed in 1909. It returns 4 ("uploading") for all files in any apps other than Explorer.
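The attribute checks from the two answers above can be sketched with ctypes. The constants are the winnt.h values; `file_attributes()` will only work on Windows, while the bitmask logic itself is portable:

```python
import ctypes

# Attribute bits from the Windows SDK (winnt.h).
FILE_ATTRIBUTE_OFFLINE = 0x00001000
FILE_ATTRIBUTE_PINNED = 0x00080000
FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS = 0x00400000

def is_online_only(attributes):
    """A file is not fully present locally when the recall-on-data-access
    bit is set (newer OneDrive builds) or the offline bit is set (older)."""
    return bool(attributes & (FILE_ATTRIBUTE_RECALL_ON_DATA_ACCESS |
                              FILE_ATTRIBUTE_OFFLINE))

def file_attributes(path):
    """Windows only: raw attribute mask via GetFileAttributesW."""
    attrs = ctypes.windll.kernel32.GetFileAttributesW(str(path))
    if attrs == 0xFFFFFFFF:  # INVALID_FILE_ATTRIBUTES
        raise OSError(f"cannot query attributes of {path}")
    return attrs
```

Usage on Windows would be `is_online_only(file_attributes(r"C:\Users\Me\OneDrive\doc.pdf"))`.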
Take a look at the PKEY_FilePlaceholderStatus property for the file (at the shell level, not the file-system level). This blog post has an example program you can test. This question also hints at some undocumented properties you might want to take a look at.
Microsoft has a UWP example on MSDN.

How to set up Visual Studio to analyze crash dumps

I have a program which is instrumented to generate mini-dumps on exceptions. I have archived copies of the .exe, .pdb and the source files. The only way that I have found to get Visual Studio to find the .pdb file and analyze a dump when I receive one from a client is to place the archived files in exactly the same location on disk where the original build took place.
I have tried adding the path to the .pdb file to Visual Studio's debug symbol directories, but the path is always ignored. The path in the .exe file seems to be used instead.
This is terribly inconvenient, since it means moving the code which is currently under development to some temporary location, while the archived code takes its place for crash dump analysis.
Is there some simple way (i.e. without setting up symbol and source servers) to direct Visual Studio to access the debugging context in some location other than the original build location?
What you need is a symbol server, or at least a directory that has the same structure. If you have TFS, you might just need to configure it correctly.
If not, you have the following options:
a) add symbols manually for each delivered version using symstore
b) add symbols automatically for each build using symstore in a post-build step
c) do either a) or b) and publish the result onto a web server that acts as an HTTP symbol server
You can do a) or b) if you're working alone. You should really consider c) if you're working in a team.
Things are not quite so simple, and Stack Overflow is not meant for fully-fledged tutorials, so I'll give you the following hints:
You need to understand that a symbol store can have several tiers. You are currently using a tier-0 symbol store, which is a flat directory; this is the worst option. The good news: if you have the symbols, you can still set up the other tier types.
Once you understand the tiers and want to go for option c) without TFS, build an HTTP server.
IMHO you should find all the necessary information in How to get a symbol server set up. If you don't want it on the network, you can also put it on a local disk.

How to obtain more information from SRCSRV?

I'm working on a custom symbols/source server.
I've been able to produce PDB files which reference our sources. Most of our sources can be retrieved by Visual Studio, but sometimes SRCSRV fails to retrieve them.
If I inspect the Visual Studio output window, I see the following message:
SRCSRV: Source server cannot retrieve the source code for file 'e:\SoftwareFactory\Projects\Product.Net Trunk\WorkingDirectory\Services\ErpWebServices\ErpServiceLegacyHost\Threading\ErpTransactionsSynchronizationContext.cs' in module 'C:\Program Files (x86)\Product\ProductCommon\ePgiStarterCS\server\Product.Erp.Services.LegacyHost.dll'. Données non valides.
The web server hosting the sources hasn't received any request for this file, so the problem must be in the record concerning this particular file.
Is there any way to get more information out of SRCSRV?
Apparently the answer is no: there is no way to obtain more information from srcsrv.dll.
The Microsoft forum moderators told me that the message "Données non valides" ("Invalid data") is possibly related to the length of the path of the file to be downloaded.
This path is combined with the temporary symbol path you've specified in the Visual Studio/WinDbg settings.
e.g. If you've specified
%APPDATA%\Symbols
as local symbol storage, and you're downloading a source file hosted by an HTTP server at the address
http://nightlybuilds.int/sources/get.svc/path/file.cs
the path
%APPDATA%\Symbols\sources\get.svc\path\file.cs
must not be longer than 255 characters.
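A hypothetical helper mirroring the example above, which flags combined cache paths that cross the reported limit:

```python
import os
from urllib.parse import urlparse

def cached_source_path(symbol_cache, source_url):
    """Where the debugger caches a SRCSRV-fetched source file:
    the URL's path, minus the host, appended beneath the local
    symbol cache directory."""
    rel = urlparse(source_url).path.lstrip("/").replace("/", os.sep)
    return os.path.join(symbol_cache, rel)

def exceeds_srcsrv_limit(symbol_cache, source_url, limit=255):
    """True when the combined cache path crosses the 255-character
    limit the forum moderators mentioned."""
    return len(cached_source_path(symbol_cache, source_url)) > limit
```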
Other factors that can affect the behavior of SRCSRV:
The presence of characters that are invalid in a classic DOS path (i.e., anything other than [0-9 A-Za-z\.])
The client debugger settings. (e.g., in Native mode, symbols for managed code won't be downloaded. The Modules window will give you some hints about the symbols currently loaded.)
This is where SymChk can help. Using the /v switch, you get detailed output on how the symbols are resolved and what the symbol server responds with. Use this in conjunction with a tool like Fiddler which captures HTTP traffic, and you can analyze where your server is not responding with the expected protocol.

Web folder options: are all webdav servers created equal?

I have implemented a WebDAV server using PHP on Apache. However, I'm having some issues when testing it with Windows XP Web Folders.
I notice that when I right-click in any folder, the 'New' option only offers 'Folder', i.e., I can only create new folders, not files. Also, when I right-click a file, I only see an 'Open' option, which presumably opens the file with its associated program; the 'Open With' option is not available. Furthermore, even opening the file usually brings up my browser, not the associated program. Finally, even when I get a program like MS Word to open a file, I am unable to save it in place.
I believe Web Folders on Windows supports all these features, just like Windows Explorer. Interestingly, when I access the test WebDAV server at www.ajaxfilebrowser.com with Web Folders, I get all these features, which leads me to suspect the issue is in my implementation. However, if all required WebDAV methods have been implemented, what differentiates one WebDAV server from another? Are there some properties Web Folders uses to determine which options to enable?
Things to look for:
OPTIONS response "Allow" and "DAV" headers
support for LOCK (may be required for writing)
media types
In doubt, capture HTTP traces and compare.
Most likely it's your LOCK management. LOCK support is required by most clients, or they will operate in read-only mode.
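The OPTIONS checks from the list above can be sketched as a small parser; `supports_lock` here corresponds to DAV compliance class 2 plus LOCK appearing in the Allow header (the interpretation of `writable` as PUT-in-Allow is my assumption, for illustration):

```python
def parse_dav_options(headers):
    """Interpret the OPTIONS response headers a WebDAV client looks at.
    `headers` is a dict of header name -> value (case-insensitive)."""
    lowered = {k.lower(): v for k, v in headers.items()}
    dav = [t.strip() for t in lowered.get("dav", "").split(",") if t.strip()]
    allow = [m.strip().upper() for m in lowered.get("allow", "").split(",") if m.strip()]
    return {
        "compliance_classes": dav,          # e.g. ["1", "2"]
        "supports_lock": "2" in dav and "LOCK" in allow,
        "writable": "PUT" in allow,
    }
```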
