ClickOnce Error "different computed hash than specified in manifest" when transferring published files - visual-studio

I am in an interesting situation where I maintain the code for a program that is used and distributed primarily by our sister company. We are ready to distribute the program to all of the 3rd party users, and since it is technically our sister company's program, we want to host it on their website. (In the interest of anonymity, I'll use 'program' everywhere instead of the actual application name, and 'www.SisterCompany.com' instead of their actual URL.)
So I get everything ready to go, set up the Publish settings to check for updates at program start and the minimum required version, and I set the Installation Folder URL and Update Location to "http://www.SisterCompany.com/apps/program/", with the actual Publishing Folder Location as "C:\LocalProjects\Program\Publish\". Everything else is pretty standard.
After publish, I confirm that everything installs and works correctly when running directly from the publish location on my C: drive. So I put everything on our FTP server, and the guy at our sister company pulls it down and places everything in the '/apps/program/' directory on their webserver.
This is where it goes bad. When I try to install it from their site, I get the "File, Program.exe.config, has a different computed hash than specified in manifest" error. I tested it a bit, and I even get that error trying to install from any network location on our network other than my local C: drive.
After doing the initial publish in Visual Studio, I have changed no files (changed files being the cause I've found suggested when searching about this error).
What could be causing this? Is it because I set the Installation Folder URL to a location that it isn't initially published to?
Let me know if any additional info is needed.
Thanks.

After bashing my head against this all weekend, I have finally found the answer. After unsigning the project and removing the hash on the offending file (an XML file), I got the program to install, but it was giving me 'Windows Side by Side' errors. I drilled down into the app cache where the file was, and instead of a config .xml file, it was one of the HTML files from the website the ClickOnce installer was hosted on. It turns out the web server didn't seem to like serving up .xml (or, as it also turned out, .mdb) files.
This MSDN article ended up giving me the final solution:
I had to make sure that the 'Use ".deploy" file extension' option was selected so that the web server wouldn't mangle files with extensions it didn't like.
I couldn't figure out why that one file's hash would be different. Turns out it wasn't even the same file at all.
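For anyone wondering what that option actually changes: with 'Use ".deploy" file extension' enabled, ClickOnce renames every payload file in the publish output, so the web server only ever has to serve the manifests, setup.exe and the single .deploy extension. A rough sketch of the resulting layout (the version folder name just follows the usual ClickOnce convention and is shown here purely as an illustration):
Program.application
setup.exe
Application Files/Program_1_0_0_0/
    Program.exe.manifest
    Program.exe.deploy
    Program.exe.config.deploy
    Data.mdb.deploy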

Is it possible that one of the FTP transfers is happening in text mode rather than binary?
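If one of the hops is a manual command-line FTP session, it may be worth forcing the transfer type before uploading (standard ftp client commands; the file name is just an example):
ftp> binary
ftp> put Program.exe.config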

For me the problem was that .config transformations were done after generating the manifest.
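If that's the case, the manifests have to be updated and re-signed after the transform has been applied to the published files. A rough sketch with mage.exe (file names, certificate and password are placeholders):
mage -Update Program.exe.manifest -CertFile cert.pfx -Password secret
mage -Update Program.application -AppManifest Program.exe.manifest -CertFile cert.pfx -Password secret
The intent is to re-hash and re-sign the application manifest, then point the deployment manifest at it and re-sign that as well; check the mage.exe documentation for the exact options your layout needs.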

To anyone else who's still having trouble, five years later:
The first problem was configuring the MIME type, which on nginx (/etc/nginx/mime.types) should look like this:
application/x-ms-manifest application
See Click Once Server and Client Configuration.
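A fuller sketch of what the entries can look like inside the existing types { } block (the .application and .deploy lines are my addition of the other ClickOnce-related types usually listed, not something from the original answer):
types {
    application/x-ms-application    application;
    application/x-ms-manifest       manifest;
    application/octet-stream        deploy;
}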
The weirder problem to me was that I was using git to handle the push to the server, i.e.
git remote add live ssh://user@mybox/path/to/publish
git commit -am "committing..."; git push live master
Works great for most things, but it was probably being registered as a "change," which prevented the app from installing locally. Once I started using scp instead:
scp -r * user@mybox:/path/to/dir/
It worked without a hitch.
It is unfortunate that there is not a lot of helpful information out there about this.

VSCode: how to set up for local edit and ftp-deployment

I used to use Dreamweaver. I have a huge Classic ASP website. I edit the files on my local system, and when done, I can upload the file(s) via FTP to the remote webserver. Now I'm trying to switch to VSCode. I've installed ftp-simple, ftp-sync and deploy, but I can't find the setup to get Dreamweaver-like behaviour. E.g., for each file I want to upload/deploy, I have to locate the exact location in the remote file tree.
I really feel like deploy deserves more attention. I spent the past 4 days or so trying to find an extension that does just that: auto-upload to an FTP folder from a local folder. I wanted to make git work for my website, but couldn't get that to work on the server with ftp-simple or ftp-sync, because those extensions only download the opened files or open in a different temporary folder each time. I set up deploy now and got exactly what I wanted thanks to your tiny comment, thank you!
(I'm sorry if this post is too old to comment on, but I browsed Stack Overflow for days to find this, so I thought it might help others in the future to point this out.)
It sounds like you're just missing your mapping configuration. Most text editor FTP packages include a configuration file where you specify the server, your credentials, and the root folder of your FTP server. Have you specified this?
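As a generic illustration only (every field name here is a placeholder, since each extension uses its own schema; check the extension's README for the real keys), the mapping usually boils down to something like:
{
    "host": "ftp.example.com",
    "username": "me",
    "password": "secret",
    "remotePath": "/wwwroot/mysite",
    "localPath": "."
}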

Logging into TFS on a Mac

I got Team Explorer Everywhere so we can use TFS on the Mac Mini we got to test iPhone apps. Since we're using Xcode for PhoneGap, we need to use the command-line program, and it is giving me a lot of grief.
What I've done so far (Listing out for anyone who stumbles on this so they can use it):
-Downloaded the trial (free)
-Set the path using PATH=$PATH\:/FOLDERLOCATION
-Accepted EULA and got trial product key... for command line program (tf eula/tf productkey -trial)
-Set up workspace:
tf workspace -new WORKSPACENAME -server:http://SERVERNAME:PORT/FILEPATH -comment:"WORKSPACENAME" && prompted for username -> domain -> password
-Trying to setup the folder path (Fixed):
tf workfold -map SERVERFOLDERPATH LOCALFOLDERPATH -collection:http://SERVERNAME:PORT/FILEPATH -workspace:WORKSPACENAME && prompted for username -> domain -> password
-Make sure I can check out/check in (On hold):...
The error I'm getting right now is "An argument error occurred: First free argument must be a server path." This is what I've been following ever since I got the path set, but I think the versions are different because mine doesn't seem to be set up the same. Any help at all would be appreciated, and I'll keep up with the post as I figure parts out, because there doesn't seem to be much online that I can find on TFS on Macs.
Update: As usual, I'm an idiot. You have to put the options at the end of the command and have the server folder path as the first thing after -map. Now I just need to figure out how to use the damn thing. I'll post any other questions I have and try to get all the correct commands up, for the selfish reason of having them somewhere in case I forget them later.
Update 2: The mapping hasn't worked out as well as I'd hoped; it seems a combination of my unfamiliarity with Unix/Mac file systems and some missing settings is keeping me from using 'tf get' to load all of the test data I was trying to get. I'm planning on trying again after I find out where my boss wants the data saved, and after I can look into something that would save the workspace so it won't say that it can't find the mapped path every time...
It looks like you're setting up your workspace and some working folder mappings just fine, after the edit. If you're having problems doing a tf get after this, then there are some common problems that might be occurring. TFS workspaces can be a little bit opaque and having a better understanding of them can sometimes help you understand where the problem is:
Team Foundation Server requires a workspace to be configured before you can get files out of source control, edit them or check them back in. A workspace basically just contains working folder mappings that map your local path(s) to server path(s).
Workspaces are stored on the server and are uniquely identified by your computer's hostname, your username and the workspace's name. A cache of this information for the local host is saved on the client. This implies:
If you remove a workspace on the server, your workstation will be unable to connect.
If you remove the cache, your local computer will not be able to identify the workspace based on working folder mappings until the cache is rebuilt (which happens every time you connect to the server.)
If you change your username or local workstation's name, you cannot access those workspaces.
(Note that very early versions of the Teamprise command line client had certain issues on Mac OS that made identifying the local workstation name difficult. This is fixed, however, in Team Explorer Everywhere.)
Because you can have multiple workspaces for a single server on a single workstation, you can't always simply provide server paths to tf commands, since server paths are ambiguous. ($/ exists in every workspace, for example.) So the command line client resolves paths based on the current working directory and/or the arguments provided. Meaning that you can run tf get foo.txt if you're in a working folder, or you can run tf get /tmp/foo.txt if /tmp is mapped.
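For example, a minimal sketch of a get from inside a mapped folder (the local path is purely illustrative):
cd /Users/me/project
tf get -recursive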
One more point - the configuration data for Team Explorer Everywhere is shared between the TFS plug-in for Eclipse and the command line client. So if you're more comfortable using a GUI to set up your workspace(s), you can do that and then use the CLC as you see fit. You don't need to be a Java programmer to use Eclipse - simply download Eclipse and install the TFS plug-in for Eclipse into it, and select Window > Open Perspective > Team Foundation Server Exploring. After that, you'll have the full GUI Team Explorer experience and this perspective will be restored when you open Eclipse, so you won't even need to worry about the Java IDE bits if you don't want to.

I've been asked to deploy, but I cant make the magic happen

I've added a couple of lines to a file; let's say it builds to foo.dll. It's part of more than one DLL file, but it's the core DLL. What I did was add a couple of lines so it would add some log data to the database. It should not affect any other files whatsoever.
So I tried to deploy it. We don't have the magical one-click deploy; we are just copying the right files to the right place.
So now, since I have a change in foo.dll, I thought to myself that I could just copy foo.dll over and the server would be happy.
I was wrong. Browsing the website I now get "Generic Errors"; I don't know what that is. I've also tried copying all the new DLL files (4 in total), but that did not solve the problem either.
The error it gives is:
Http Error 404.0 not found
Module: IIS Web Core
Notification: MapRequestHandler
Handler: Static File
Error Code: 0x80070002
Replacing the new foo.dll with the old one solves the problem, and I've tried restarting the web server. :-(
I assume you have "published" and not just "compiled" your web project?
You also need to take care of the "Solution Configurations": Debug and Release.
In a normal publish process you would change the configuration to release and publish your project into another folder.
After you have done that you just need to collect the desired files and upload them.
Keep in mind that you need the newest version of your web project. Maybe there are some changes online that your local project doesn't have. This would cause such problems.
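For reference, a minimal sketch of doing the Release build from the command line (the project file name is a placeholder; publishing from Visual Studio in the Release configuration performs the same build before collecting the output):
msbuild MyWebProject.csproj /p:Configuration=Release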
We don't have the magical one click deploy
Why not? It's not magic, and it's pretty easy to set up. Get any continuous integration software (I would recommend BuildMaster since I am a developer for it and it's free now) and you'll never have this problem again.

Visual Source Safe - Removing files from web projects

I'll try to make this as straightforward as possible.
Currently our team has a VSS database where our projects are stored.
Developers grab the code, place it on their local machines, and develop locally.
Designated developer grabs latest version and pushes to development server.
The problem is, when a file is removed from the project (by deleting it in VS2008) then the next time another developer (not the one who deleted it) checks in, it prompts them to check in those deleted files because they still have a copy on their local machine.
Is there a way around this? To have VSS instruct the client machine to remove these files and not prompt them to check back in? What is the preferred approach for this?
Edit Note(s):
I agree SVN is better than VSS
I agree Web Application project is better than Web Site project
Problem: This same thing happens with files which are removed from class libraries.
Your number one way around this is to stop using Web Site Projects. Web Site Projects cause Visual Studio to automatically add anything it finds in the project path to the project.
Instead, move to Web Application Projects which don't have this behavior problem.
Web Site Projects are good for single-person development.
UPDATE:
VB shops from days gone past had similar issues, in that whatever they had installed affected the build process. You might take a page from their playbook and have a "clean" build machine. Prior to doing a deployment, you would delete all of the project folders, then do a get latest. This way you would be sure that the only thing deployed is what you have in source control.
Incidentally, this is also how the TFS Build server works. It deletes the workspace, then creates a new one and downloads the necessary project files.
Further, you might consider using something like Cruise Control to handle builds.
Maybe the dev should take care to only check in or add things that they have been working on. It's kind of sloppy if they are adding things that they were not even using.
Your best solution would be to switch to a better version control system, like SVN.
At my job we recently acquired a project from an outsourcing company who did use VSS as their version control. We were able to import all of the change history into SVN from VSS, and get up and running pretty quickly with SVN at that point.
And with SVN, you can set up ignores for files and folders, so the files in your web projects don't get put into SVN, and the ignore attributes are checked out onto each developer's machine.
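For example, a quick sketch from the working copy root (the pattern is only an illustration; multiple patterns go one per line in the property value):
svn propset svn:ignore "bin" .
svn commit -m "Ignore the bin folder"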
I believe we used VSSMigrate to do the migration to SVN http://www.poweradmin.com/sourcecode/vssmigrate.aspx
VSS is an awful versioning system and you should switch to SVN, but that's got nothing to do with the crux of the problem. The project file contains references to what files are actually part of the project. If the Visual Studio project isn't checked in along with the changes to it, there's no way for any other developer to be fully updated, hence the prompts about deleted files when they grab the latest from VSS. From there you've got multiple choices...
Make the vbproj part of the repository. Any project-level changes will be part of the commit and other developers can be notified. The problem here is that it's also going to be on the dev server. Ideally you could use nearly the same process to deploy to dev as you would to deploy a release. This leads into the other way...
SVN gives you hooks for almost all major events, where hooks are literally just a properly named batch file / exe. For your purposes, you could use a post-commit hook to push the appropriate files, say via FTP, to the server on every commit. File problems solved, and, more importantly, a step closer to the concept of continuous integration.
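A rough sketch of what such a hook could look like on a Windows SVN server (hooks\post-commit.bat; the repository path, staging folder and dev server share are all hypothetical, and the copy could just as well be an FTP client call):
@echo off
rem Subversion passes the repository path and the new revision as %1 and %2
set REPOS=%1
set REV=%2
rem Export a clean copy of the tree and push it to the dev server share
svn export --force file:///C:/Repositories/MyRepo/trunk C:\staging\site
xcopy C:\staging\site \\devserver\wwwroot\mysite /E /Y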
Something you may want to consider doing:
Get Latest (Recursive)
Check In ...
It's a manual process, but it may give you the desired result; plus, if VS talks about deleted files, you know they should be deleted from the local machine in step 1.

How can I publish a subversion repository to a local IIS?

At work, we have a Windows Server 2003 machine with IIS and Subversion installed. We use it to publish and test our ASP.NET websites locally. Every programmer has Tortoise installed on his PC and can update/commit content to the server. Hosting the repositories is working fine.
But the files kept in those repositories then need to be copied to our local IIS (virtual directories).
What is an easy way to publish those subversion repositories to our local IIS?
Edit:
Thanks to puetzk I added a simple bat file that gets executed every time a commit occurs (check the subversion documentation about hooks). My bat file only contains:
@echo off
setlocal
:: Change to the working copy that IIS points to
pushd E:\wwwroot\yourapp\trunk
:: Update your working copy
svn update
endlocal
exit
Just keep the web server's file area as a working copy, and perform an svn up in it whenever you want to "publish". Configure it to hide the contents of the .svn folders if they seem untidy to you (I don't specifically know how to do this, but I assume it can be done). They will already have the filesystem hidden bit, which may take care of this.
If you want it really automatic (updates as soon as someone commits), use a post-commit hook script on the SVN server to kick off the first process.
Others in the comments have suggested using export instead of checkout. That can work too, and avoids the .svn clutter, but has two drawbacks. One, it has to re-download the entire contents every time, not just the modified files (since it didn't keep the .svn dir to remember what it has). If you have a lot of files, this will be much slower. Two, update replaces the file atomically (writes the new version in .svn/tmp, then moves it into place). Export writes the file gradually into its destination as it downloads. That means export could deliver an incomplete file to someone who browsed it at just the wrong time.
SVN doesn't support IIS; you can, however, run the standalone svnserve server as a Windows service.
There's the SVN FAQ entry about it, and this blog post on Vertigo Software blog may be helpful too.
UPDATE:
After your clarification, I see that what you are looking for is a way to automatically update the code on the server after it's checked in. Look into CruiseControl.NET; after looking at the Subversion integration tutorial, it looks like it should do what you want.
UPDATE 2: This tutorial describes integrating Subversion, CruiseControl.NET and Nant.
Maybe SVNIsapi can solve the problem (http://www.svnisapi.com). Since it only requires an IIS installation, you don't need an Apache server or an svnserve service. Secondly, it should be possible to stack the ASP.NET ISAPI plugin onto SVNIsapi's processing, so that an ASP.NET (.aspx) page will be interpreted after being read from the repository.
Cheers
Paolo
You can use the free VisualSVN Server to quickly install Subversion with an Apache front end. It also has a nice MMC snap-in for managing the server and repositories.
You will then be able to access Subversion over HTTP or HTTPS, but the port number must be different from the one your local IIS uses (the default port for VisualSVN Server is 8080).
If you really need to access the repositories using your local IIS port 80, you can try SVN-IIS which acts as a bridge between your IIS and Apache. I haven't tried this one myself though.
