Publishing my local site to an Azure profile with the Web Deploy method. If I delete a file locally and publish, the file remains on the server. I've seen similar versions of this question with answers suggesting either checking the 'delete all files when publishing' option (not an option in VS 2017) or deleting the bin and obj folders and re-publishing (tried several times).
Any other suggestions on getting files to sync when publishing?
You could right-click your project -> Publish, then open the publish profile's settings and check the option Remove additional files at destination.
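For reference, checking that box is equivalent to setting a property in the publish profile itself. A minimal sketch, assuming a Web Deploy .pubxml profile under Properties\PublishProfiles (to the best of my knowledge this is the property the checkbox maps to):

<PropertyGroup>
  <!-- False = remove files on the destination that are not part of the deployment -->
  <SkipExtraFilesOnServer>False</SkipExtraFilesOnServer>
</PropertyGroup>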
We are experiencing bad slowdowns in Visual Studio 2019 that appear to be partly due to a large folder of content (~12,000 files) in our wwwroot folder. This content rarely changes, but it ends up getting searched when we do "Find in Files...", etc., which we don't need, and it slows down normal operations such as adding/renaming classes.
Is there any way to keep the content in source control, exclude it from Visual Studio, yet still have it get deployed when we publish?
I haven't been able to figure out whether this is possible by editing the csproj file, using settings such as DefaultItemExcludes or various options on the ItemGroup element.
Here is what we have done to (hopefully) resolve this issue for us:
Added the following line to our WebApp.csproj file within the PropertyGroup:
<PropertyGroup>
  <DefaultItemExcludes>$(DefaultItemExcludes);wwwroot\hugecontentfolder\**</DefaultItemExcludes>
</PropertyGroup>
The huge content folder stays where it is in source control; this simplifies things, as we need those files there to be served locally for development.
We updated our Azure DevOps pipelines with new tasks to copy the contents of that folder from source control into the build artifact staging directory
Updated the Dotnet Publish Azure DevOps task to no longer zip its output
Added another Azure DevOps task to archive the build artifact staging directory (which now contains the non-zipped dotnet publish output, as well as the huge content folder in the correct location) into a zip file for publishing.
We have been using the FTP publishing method to upload our site to an external server, but have recently discovered that some of the JavaScript files that have been changed are not being updated when published. We have checked on the server to verify whether the files have been uploaded and only found the old versions, so it's not a cache issue.
Additional Information:
We are using Visual Studio Team Services
We are using ASP.NET MVC.
We have the 'Exclude files from the App_Data folder' File Publish Option checked.
We are not sure what is causing this to happen, but we suspect that the issue might be caused by source control: files that have been worked on and checked in on one machine are not seen as having changed by Visual Studio on the machine doing the publication.
We've found this question, Content files not updating with Visual Studio 2010 FTP Publish, which seems to relate to the issue we're having, but we would prefer not to have to use the workarounds provided.
Is there another way to fix or avoid this issue?
I've been experiencing this off-and-on with file system publishing. Out of pure desperation I created a new publish profile and...it worked.
It appears to be related to an issue with the .user file that's created for the publish profile. While not a solution, a workaround at least is to remove the .user file if a problem is encountered.
Related question: Visual Studio 2012 Web Publish doesn't copy files
I am using Team Foundation Server 2013 and have the nightly build configured to deploy a web application. The web application is making use of the web API help pages which depend on the built in XML documentation files.
I currently have these XML files being output to the App_Data folder. These are not being copied to the server during deployment. I tried checking in the documentation files but when the build process tried to regenerate them it caused an access error as the files are read-only.
I currently have a placeholder text file inside the App_Data folder included in the project to ensure that the folder gets created but I have to manually copy across the documentation files in order for the help pages to work as intended.
What is the correct/best way of forcing these files to be copied?
Thanks
Ensure the project build order has the docs being built first: right-click on the solution and choose Project Build Order.
Then add some MSBuild logic in your pubxml or wpp.targets file to add the generated help files to the FilesForPackagingFromProject ItemGroup.
http://sedodream.com/2012/10/09/VSWebPublishHowToIncludeFilesOutsideOfTheProjectToBePublished.aspx
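A minimal sketch of that wpp.targets logic, following the pattern from the linked post. The file name (WebApp.wpp.targets, placed next to the project file) and the App_Data source path are assumptions; adjust the Include path to wherever your build writes the XML documentation files:

<Project xmlns="http://schemas.microsoft.com/developer/msbuild/2003">
  <Target Name="IncludeHelpXmlFiles">
    <ItemGroup>
      <!-- Collect the generated XML documentation files (path is a placeholder) -->
      <_HelpXmlFiles Include="App_Data\**\*.xml" />
      <FilesForPackagingFromProject Include="%(_HelpXmlFiles.Identity)">
        <DestinationRelativePath>App_Data\%(RecursiveDir)%(Filename)%(Extension)</DestinationRelativePath>
      </FilesForPackagingFromProject>
    </ItemGroup>
  </Target>
  <PropertyGroup>
    <!-- Hook the target into both the package and MSDeploy publish pipelines -->
    <CopyAllFilesToSingleFolderForPackageDependsOn>
      IncludeHelpXmlFiles;
      $(CopyAllFilesToSingleFolderForPackageDependsOn);
    </CopyAllFilesToSingleFolderForPackageDependsOn>
    <CopyAllFilesToSingleFolderForMsdeployDependsOn>
      IncludeHelpXmlFiles;
      $(CopyAllFilesToSingleFolderForMsdeployDependsOn);
    </CopyAllFilesToSingleFolderForMsdeployDependsOn>
  </PropertyGroup>
</Project>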
Open the solution, right-click the .xml file, click Properties, and for 'Copy to Output Directory' select 'Copy Always'.
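For reference, setting 'Copy Always' in the Properties window just writes an entry along these lines into the csproj (the file name here is hypothetical, and the item type may be Content or None depending on how the file was added to the project):

<Content Include="App_Data\XmlDocumentation.xml">
  <CopyToOutputDirectory>Always</CopyToOutputDirectory>
</Content>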
I am using a TeamCity build that has been working for the past several years on IIS 7.5 installations. I have recently upgraded to IIS 8 and I am finding that the App_Data directory is not being deployed when I execute my MsBuild script.
Our TeamCity build deletes the entire IIS site directory contents during each publish to ensure that we start with a clean slate and don't have any lingering files. I need to continue to do this, but now all of a sudden I can no longer automatically push out the App_Data folder during the publish step. I have even tried adding a dummy Placeholder.txt file to the folder (and set the Build Action to Content in Visual Studio) but the App_Data folder still does not appear on my web server. Any help would be greatly appreciated.
So... it turns out that all I had to do was add some content to the Placeholder.txt files. Seems like the deployment tools were skipping the files since they were empty.
I am working on an extension for Visual Studio to update a project.
The situation is as follows:
We create a new project from a template.
We put the solution in TFS.
We change the project which was used to create the template. The project on the TFS server still needs to be updated.
We publish the project to a folder. I now want to update the files in the TFS repo with the current files in the publish folder.
I thought of the following approach:
In the application we have all the projects that are on the TFS server stored. When I want to update a specific project, I select the project from the database. Next, I load this project from the TFS server and create a new checkout for it. So far so good. The problem arises when I copy the files from the published folder to the new checkout: files that are already registered in the workspace get marked as changed, but files that are new are not added.
The only solution I can think of is to add all the files through the TFS SDK. But this seems pretty heavy to me.
So I got a couple of questions:
Is this the right approach to update the project?
Is there any other way to add the files to the workspace instead of adding all files through the TFS SDK?
Thanks!
Don't do a blind copy of the files. Instead, have your program iterate through them one by one.
First, update your workspace with the latest from TFS.
Then, for each file in the source directory:
If the file exists in the target directory, "pend edit" the file, then copy it.
If the file does not exist, copy the file, then "pend add" the file.
When you've finished, check in all pending changes in the workspace.