Every time I want to add new code to my site, I have to modify the file somewhere out of users' view so I can debug it before updating the real file users see.
I usually create a copy of the file I want to change and test all my changes on it, but sometimes these files are only included from another file, so I have to create two copies, and sometimes it gets even more complicated.
How is this normally done? Are there any tools to simplify the process, for example an environment to test my site on my own PC so I don't have to upload files to the server each time I update something? Any info about beta testing new features would be appreciated.
Most people have a 2nd server (potentially a virtual machine) configured exactly the same as their live (production) website. Where this 2nd server is located is completely up to you, but it should match your live site by using the same versions of software and same file structure.
I also like the idea of a staging server suggested by Sean. Again, your post doesn't say much about your production web server and the features you're using (are you running scripts on the server? PHP? some version of SQL?). But for a simple setup, you can run a copy of the Apache web server on your own PC, or use an all-in-one package like XAMPP, which bundles Apache, MySQL, and PHP.
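For a concrete picture, here is a minimal sketch of that local setup with XAMPP on Linux (the paths and the "mysite" folder name are assumptions; on Windows the XAMPP control panel does the same job):

# start the bundled Apache/MySQL/PHP stack
sudo /opt/lampp/lampp start
# drop a copy of the live site into the local document root (folder name is hypothetical)
cp -r ~/copy-of-live-site/* /opt/lampp/htdocs/mysite/
# then test at http://localhost/mysite before uploading anything to the real server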
We've recently launched a new website in Azure (i.e. Azure Websites) and as is typical with new launches we've had to deploy a few tweaks to fix minor issues shortly after launch.
We want to use Slots in the long run but this is not possible at the moment. Hence we are deploying to the live site. It's a fairly busy site with a good amount of traffic, and we obviously want to keep downtime to a minimum.
We are using Visual Studio to publish file changes to Azure, but we have noticed that even if we publish a relatively insignificant single file, the whole site goes down and struggles to come back up. I assumed that publishing a single file would literally just replace that file on the file system, but it behaves more like it recycles the application pool (or the Azure equivalent) for the site. The files I've been publishing are Razor views, which would not typically cause a recycle.
Does anyone know what actually happens under the hood of VS Publish and if there is a way to avoid this happening?
Thanks.
I just tried this using a basically clean new MVC app (https://github.com/KuduApps/Dev14_Net46_Mvc5), and I did not see this behavior. The Index.cshtml view has a hit count based on a static field, which tells us whether the app or the page got restarted (or whether that specific page got recompiled).
The test is then to publish it, make a change to some other view (about.cshtml), and publish again. When doing this and hitting Index.cshtml, the count keeps going up, and there is minimal slowdown.
If you see it getting restarted after a view change, I suggest using Kudu Console to look at the files in site\wwwroot before/after the publish, and check what has a newer timestamp (e.g. check web.config, bin folder, ...).
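If you'd rather script that check than click through the console, the Kudu REST API exposes the same file tree; a rough sketch with curl, where the site name and deployment credentials are placeholders:

# list site\wwwroot with modification times (returns JSON; prompts for the deployment password)
curl -u $DEPLOY_USER https://mysite.scm.azurewebsites.net/api/vfs/site/wwwroot/
# fetch a single file to diff against the pre-publish copy
curl -u $DEPLOY_USER https://mysite.scm.azurewebsites.net/api/vfs/site/wwwroot/web.config -o web.config.after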
We are new to Windows Azure and are developing a Sitefinity web application. At the beginning of the project we deployed the complete code base to different environments using Sitefinity Thunder, which publishes everything. But now that we are in the middle of development, we often only need to upload a few new files (one or two, maybe a handful). If we deploy with Thunder, it publishes all files and then deploys the complete code base, which takes a good amount of time. Is there a way to deploy only changed or new code files via Sitefinity Thunder, or some other way to upload only the changed files?
Please help.
I use Beyond Compare 3 from Scooter Software to move files to our different Sitefinity environments. I haven't used Sitefinity Thunder to deploy my sites before. Also, you might want to post your question in the Sitefinity Devs group on Google+. Below is the link.
https://plus.google.com/communities/101682685148530961591
This is not easy to do, and Azure is not designed for it, although many people have requested this feature. One way to achieve it is to enable Remote Desktop for the cloud service; after logging onto the server, you can make some kind of connection to where your files are stored and copy them into the cloud service. However, it is always possible that the instance will be rebooted, or even re-provisioned from scratch, so I don't know if there are any guarantees that this is a safe way to do it.
I am in an interesting situation where I maintain the code for a program that is used and distributed primarily by our sister company. We are ready to distribute the program to all of the 3rd party users, and since it is technically our sister company's program, we want to host it on their website. (In the interest of anonymity, I'll use 'program' everywhere instead of the actual application name, and 'www.SisterCompany.com' instead of their actual URL.)
So I get everything ready to go, set up the Publish settings to check for updates at program start, set the minimum required version, and set the Installation Folder URL and Update Location to "http://www.SisterCompany.com/apps/program/", with the actual Publishing Folder Location as "C:\LocalProjects\Program\Publish\". Everything else is pretty standard.
After publish, I confirm that everything installs and works correctly when running directly from the publish location on my C: drive. So I put everything on our FTP server, and the guy at our sister company pulls it down and places everything in the '/apps/program/' directory on their webserver.
This is where it goes bad. When I try to install it from their site, I get the "File, Program.exe.config, has a different computed hash than specified in manifest" error. I tested it a bit, and I even get that error trying to install from any network location on our network other than my local C: drive.
After doing the initial publish in Visual Studio, I have changed no files (which is the usual cause I found when searching for this error).
What could be causing this? Is it because I set the Installation Folder URL to a location that it isn't initially published to?
Let me know if any additional info is needed.
Thanks.
After bashing my head against this all weekend, I have finally found the answer. After unsigning the project and removing the hash on the offending file (an XML file), I got the program to install, but it was giving me 'Windows Side by Side' errors. I drilled down into the app cache where the file was, and instead of a config .xml file, it was one of the HTML files from the website the ClickOnce installer was hosted on. It turns out the web server didn't like serving up .xml (or, as it also turned out, .mdb) files.
This MSDN article ended up giving me the final solution:
I had to make sure that the 'Use ".deploy" file extension' was selected so that the web server wouldn't mangle files with extensions it didn't like.
I couldn't figure out why that one file's hash would be different. Turns out it wasn't even the same file at all.
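For anyone who wants to apply the server-side half of that fix by hand on IIS, a sketch with appcmd (run from an elevated prompt; treat the exact mapping as an assumption to verify against the ClickOnce docs, and skip any extension IIS already maps):

rem serve ClickOnce ".deploy" payload files instead of rejecting the extension
%windir%\system32\inetsrv\appcmd set config /section:staticContent /+"[fileExtension='.deploy',mimeType='application/octet-stream']"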
Is it possible that one of the FTP transfers is happening in text mode rather than binary?
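If that's the suspicion, forcing binary mode in the classic command-line FTP client before the transfer is an easy way to rule it out:

ftp> binary
ftp> put Program.exe.config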
For me, the problem was that the .config transformations were applied after the manifest was generated.
To anyone else who's still having trouble, five years later:
The first problem was configuring the MIME type, which on nginx (/etc/nginx/mime.types) should look like this:
application/x-ms-application    application;
application/x-ms-manifest       manifest;
See ClickOnce Server and Client Configuration.
The weirder problem to me was that I was using git to handle the push to the server, i.e.
git remote add live ssh://user@mybox/path/to/publish
git commit -am "committing..."; git push live master
Works great for most things, but it was probably being registered as a "change," which prevented the app from installing locally. Once I started using scp instead:
scp -r * user@mybox:/path/to/dir/
It worked without a hitch.
It is unfortunate that there is not a lot of helpful information out there about this.
I'm using Publish/Web Deploy to deploy an ASP.NET application from Visual Studio 2010. It works perfectly, but there is a problem: if the new release is not working as expected, the old version has already been replaced by the new one and there is no easy way to roll back to the working version. How is this best solved? I wish it were possible to keep the old version on the server so I could just switch back if needed.
With WebDeploy there is no built in rollback feature, so once you've deployed that's it.
There are a number of hand-rolled strategies you could put in place, for example:
Limited Access (e.g. Shared Hosting):
Where you don't have full access to the machine:
Back up the live site beforehand by downloading it (a sketch of this follows below).
Keep copies of what you deployed so you can push the previous version should something break.
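As a concrete sketch of that backup step, msdeploy itself can package the current live content into a zip before you overwrite it (both paths here are hypothetical):

rem snapshot the live folder so the previous version can be restored later
msdeploy -verb:sync -source:contentPath="D:\sites\live" -dest:package="C:\backups\site-before-deploy.zip"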
Full Access:
Maintain two sets of folders for the application and map your site to one or the other of these folders. When you come to deploy, switch the IIS site's physical path to the other folder, then deploy. If the site fails, just point the site back at the original folder (see the sketch below). Each successful deploy then alternates between these two folders.
For stuff like user uploaded content you'd need to map virtual directories to a place on the file system that's always the same place because you don't want to be copying these around each time.
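A rough sketch of that switch with appcmd (the site name and the two folders are hypothetical):

rem point the site's root at the folder holding the new build
%windir%\system32\inetsrv\appcmd set vdir "Default Web Site/" -physicalPath:"D:\sites\slotB"
rem if the new build misbehaves, point it straight back
%windir%\system32\inetsrv\appcmd set vdir "Default Web Site/" -physicalPath:"D:\sites\slotA"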
You're not the only one who has encountered these issues. Have a look at these articles by Rob Conery and his observations about the state of affairs regarding ASP.NET deployment:
ASP.NET Deployment Needs To Be Fixed
Getting Constructive On ASP.NET Deployment
Using some form of source control would be another alternative. We use Subversion, so if the publish goes bad, we can just update back to the last good revision and publish that. Even if you're the only developer, using source control can be very useful.
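That rollback is a one-liner in Subversion; a minimal sketch, with a hypothetical revision number:

# bring the working copy back to the last known-good revision, then publish that
svn update -r 1234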
I want to create a component that will allow me to install the other components, modules, and plugins that I personally use all the time. I need to be able to change this set at any time, updating the extensions I use and adding more plugins as needed. I would like this component because, as a web designer, it takes too much time to install them all individually and on multiple sites. I would also need some instruction on how to add and remove plugins, modules, components, and so on. I am OK with less than total integration; I would just like to host the install file on my server, with a link to where the file is located.
If anyone can help with this please do.
This is not a direct answer, more of a personal workaround (I do this on localhost).
I create a site, for example Joomlabase; when it asks for the DB name I call it Joomlabase, then add my extensions.
Then, when I need a copy:
1) Copy and paste the folder named Joomlabase in Windows Explorer to a new name.
2) Go into phpMyAdmin and copy the Joomlabase DB to the same name as the new site.
3) Do a search and replace of Joomlabase to the new site name in the configuration.php file (there should be 5 changes) and you're done.
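If you have a unix-style shell available, steps 1 and 3 can be scripted; a sketch assuming the new site is called "newsite" and an htdocs-style layout:

# step 1: clone the site folder
cp -r htdocs/Joomlabase htdocs/newsite
# step 3: swap the old name in the copied Joomla config (the ~5 manual changes above)
sed -i 's/Joomlabase/newsite/g' htdocs/newsite/configuration.php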
For me it saves a lot of time, because in the admin alone I use at least 12 different extensions.
There is a Joomla admin component called "Akeeba". It creates a snapshot of your files and database which you can easily deploy to another server. I use it often when pushing a new site to production from a QA server.
http://www.akeebabackup.com/download/akeeba-backup-core-for-joomla/index.html
Your question is way too broad, and the simple answer is that it would take much, much more work to maintain this 'super component' than you currently spend simply installing the extensions separately when you need them.
The other answers here don't answer your question, but they provide some decent solutions to your actual problem.