I use WebMatrix for deployment of files and databases during development. I want to continue to use it after development for updating files on the host. I also want to synchronize my local database to the host database, but never the other way around. I am terrified of accidentally overwriting the database on the host.
How can I do what I want while safeguarding against accidental updates of the database on the host? Essentially, I am looking for a way to tell WebMatrix that the host database is read-only and must not be updated.
You can link the development project to the live database. That way you can still develop the files offline while using the actual data. If you link to the database, WebMatrix won't attempt to update it, since it is already working against it. Yes, if you make a change for development purposes, it will change on the live site; however, if you are only developing the web pages, this should not be a concern. To link to your database, go to the Databases workspace and click the New Connection icon. Just remember that any change made to the database in WebMatrix after that point is immediately applied to your live database.
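For reference, the details you enter in that dialog amount to an ordinary SQL Server connection string, along these lines (server, database, and credentials are placeholders for whatever your host gave you):

    Server=sql.yourhost.example.com;Database=MySiteDb;User Id=dbuser;Password=secret;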
I am trying to move from SSMS to Azure Data Studio, primarily to take advantage of the easy Git integration. In Azure Data Studio, all my queries (the .sql files) show up and everything works as expected, and it was easy to add GitHub integration. But unlike SSMS, each query is disconnected at startup. I've saved the collection of files as a workspace, and the server I want is shown as an active connection in the server list. Is there a way to have the queries default to being connected at startup?
Unfortunately, that is not possible at the moment. There is already an issue in the GitHub repo asking for a default connection for files in a folder/workspace; you can find it here.
I would like to create a new installation of TFS 2013 on a new server.
I did my research and learned that the migration process described at the link below carries some risks:
TFS Migration Manual:
https://msdn.microsoft.com/en-us/library/ms404869.aspx
Risks:
http://blogs.msmvps.com/p3net/2014/04/12/tfs-upgrade-nightmares/
My plan is to avoid the TFS migration manual above; instead, I would check all of my projects out (about 20), re-create them on the new TFS, and check them in again.
However, we have work items, users, workspaces and other agile information that I created for my projects and still need on the new installation.
I was wondering whether the following works (again without risks and hassle, as time is scarce):
Back up the TFS databases from the old installation and restore them into the new installation, or simply import the data from old to new using SQL Server's data import tool.
I am particularly referring to these TFS databases:
Tfs_Configuration; Tfs_DefaultCollection; Tfs_Warehouse.
I found these databases on the SQL Server instance which TFS uses.
Also, this approach is easier because it doesn't obstruct the team, as the database restoration can occur after hours.
Now, will this plan work?
No, your plan will not work and will leave your TFS in an unsupported state.
You need to follow a combination of the upgrade and "changing environment" workflows.
1) Restore all TFS databases (tfs_*) to the new environment
2) Install TFS 2015
3) Configure and select the Upgrade Wizard - when running it, make sure you have all the new server names
4) (optional) ChangeServerID - if this is a practice run, you should then immediately (see the command sketch after this list):
4.1) unconfigure the application tier with "tfsconfig.exe setup /uninstall:ALL"
4.2) run the ChangeServerID command
4.3) reconfigure TFS and run the "application-tier only" wizard
Simples....
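For reference, the practice-run reset in step 4 looks roughly like this from a command prompt in the TFS Tools directory (the server and database names are placeholders, and exact switches vary between TFS versions, so treat it as a sketch):

    tfsconfig setup /uninstall:ALL
    tfsconfig changeserverid /sqlinstance:NewSqlServer /databasename:Tfs_Configuration

After that, re-run the configuration wizard and pick the application-tier-only option.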
Note: You need to change the server ID if this is a test/practice instance, as each server gets a unique ID. When clients first connect to the new server they will "upgrade/migrate" the users' data across. You don't want that happening for a trial... so change the ID...
WARNING: If you manipulate the data in the TFS server in any way that is not done by the TFS Product Team tools you will turn your instance to crap. Do not ever edit, or cause to edit, the data in the operational store.
Here's the scenario: I have multiple developers on an asp.net mvc 4 project. Each developer has a local database. The source control system is TFS at http://tfs.visualstudio.com. We're using Azure websites to host the site. We have also configured Azure websites for continuous deployment.
The source control system could be git, mercurial, TFS, etc. Honestly, I don't think it matters.
My question is how to accomplish these three things:
Each developer has his/her own connection string(s) locally (without them being in source control)
Azure has its own connection string(s) (without it being in source control)
Source Control doesn't show any connection information
The ability for each developer to F5 and run/debug/test the app locally.
We accomplished #1 by adding our individual connection strings to our machine.config so that there's no conflict between developer workstation setups.
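In case it helps anyone, that just means adding an entry to the existing connectionStrings section of each developer's machine.config (the name and values here are illustrative; the name must match what the app expects):

    <connectionStrings>
      <add name="DefaultConnection"
           connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=MyAppDev;Integrated Security=True"
           providerName="System.Data.SqlClient" />
    </connectionStrings>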
I originally removed the connectionStrings section from web.config. In the Azure website (using the management portal, under Configure), I configured the connection strings, and after watching a Scott Hanselman video I was under the impression that these would be dynamagically merged into my web.config upon deployment, but that doesn't appear to happen. Whenever I go to any page that hits the db, I get an error saying it can't find the connection string (or some other db error related to the connection).
If I put the Azure connection string directly in web.config, things work on Azure, but then the connection details are in source control, visible to everybody.
After reading some more posts from Scott and David Ebbo, it seems that I could put a blank connection string in web.config (with the correct name) and then Azure will overwrite the values correctly. I would then have to have the developers put their connection strings in their web.debug.config and install the Slow Cheetah plugin so that they could F5 and test locally. They would also have to avoid checking web.debug.config into source control (not that easy with TFS). This seems like a seriously burdensome kludge that's bound to fail somewhere along the line.
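For what it's worth, the blank placeholder would be just a stub with the right name, something like this (assuming the entry in the portal is also named DefaultConnection):

    <connectionStrings>
      <add name="DefaultConnection" connectionString="" providerName="System.Data.SqlClient" />
    </connectionStrings>

Azure substitutes the portal value at runtime for any connection string whose name matches.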
I have to believe that this isn't that rare of a problem. How do other teams accomplish this?
After looking around, it appears that what I was asking for isn't actually supported without a bunch of command-line hacks in the pre/post-build process. What we ended up doing was having every developer create their own local database, use trusted authentication, and reference a shared SQL alias in web.config. That way it works locally for everybody, no user names/passwords are exposed in source control, and Azure can still overwrite it when the site is automatically deployed from source control.
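The checked-in web.config then references only the alias, roughly like this (the alias and database names here are made up; each developer points the alias at their own local instance, e.g. via SQL Server Configuration Manager):

    <connectionStrings>
      <add name="DefaultConnection"
           connectionString="Data Source=MyAppSqlAlias;Initial Catalog=MyAppDb;Integrated Security=True"
           providerName="System.Data.SqlClient" />
    </connectionStrings>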
Slow Cheetah is actually a nice solution. It's an extension built on web.config transformations. Those transformations let you keep one web.config file and, for each deployment scenario, specify the changes you want applied to it. For example, your Release configuration will probably remove the debug attribute.
This can also be used to change connection strings. The transformations are applied during the deployment of your project to Azure.
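A minimal transform sketch, assuming a connection string named DefaultConnection in web.config (the Azure values here are placeholders):

    <!-- web.Release.config -->
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <connectionStrings>
        <add name="DefaultConnection"
             connectionString="Server=tcp:myserver.database.windows.net;Database=MyAppDb;User ID=myuser;Password=secret;"
             xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
      </connectionStrings>
    </configuration>

The xdt:Locator matches the entry by name and xdt:Transform rewrites its attributes during deployment.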
What I've done in the past to make this also work on local development machines is use a web.config with an externalized connections.config file. Each developer created a connections.machinename.config file that was copied to connections.config in a post-build step. Those files don't have to be checked in, and they can never cause conflicts because each machine name is unique.
The release/staging/.. configurations used a web.config transformation to replace the connection-string element with one specific to that deployment (thereby removing the dependency on the external config file).
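Concretely, the checked-in web.config points the section at the external file:

    <connectionStrings configSource="connections.config" />

and a post-build event copies the per-machine file into place (the $(COMPUTERNAME) convention here mirrors the per-machine files described above):

    copy /Y "$(ProjectDir)connections.$(COMPUTERNAME).config" "$(ProjectDir)connections.config"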
Slow Cheetah offers some nice helpers for checking the result of these transformations at design time.
Every time I want to add new code to my site, I have to modify the file outside of users' view to debug it before updating the real file users see.
I usually create a copy of the file I want to change and test all changes on it, but sometimes these files are only included from another file, so I have to create two copies, and sometimes it becomes even more complicated.
How is this normally done? Are there any tools to simplify the process, for example an environment to test my site on my PC so I don't have to upload files to the server each time I update something? Any info about beta testing new features would be appreciated.
Most people have a 2nd server (potentially a virtual machine) configured exactly the same as their live (production) website. Where this 2nd server is located is completely up to you, but it should match your live site by using the same versions of software and same file structure.
I also like the idea of a staging server suggested by Sean. Again, your post doesn't say too much about your production web server and all of the features that you're using (are you running scripts on the server? PHP? some version of SQL?). But for a simple setup, you can run a copy of the Apache web server on your own PC, or something a little more lightweight like the XAMPP server.
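If you go the XAMPP route, a local copy of the site can be as simple as one extra virtual host entry (the paths and names here are illustrative):

    # httpd-vhosts.conf
    <VirtualHost *:80>
        ServerName mysite.local
        DocumentRoot "C:/xampp/htdocs/mysite"
    </VirtualHost>

Point mysite.local at 127.0.0.1 in your hosts file and you can test changes locally before uploading them.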
I have been asked to create an MVC web application in VS 2010, and was instructed to use a SQL Express database for my data. I am using EF Code-First to create and manage my data. The database was created in VS2010 and is attached via "AttachDBFilename" in the web.config.
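The connection string is the usual AttachDBFilename form, roughly like this (names simplified; User Instance was the common setting for SQL Express at the time):

    <connectionStrings>
      <add name="MyAppContext"
           connectionString="Data Source=.\SQLEXPRESS;AttachDBFilename=|DataDirectory|\MyApp.mdf;Integrated Security=True;User Instance=True"
           providerName="System.Data.SqlClient" />
    </connectionStrings>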
I have used SQL CE with MVC before with no problems; however, the attached SQL Express DB is causing weird issues.
For one thing, when I try to deploy the app, it fails and tells me that it cannot copy the database.mdf because it is in use by another process. I have NOT opened the database in either VS2010 or SSMS. Of course the program code accesses it - is there some reason that connection would remain open? I am using boilerplate code from the scaffolding.
I should mention that I use a ProjectInitializer.cs to create the sample data. It runs at every launch for the moment, since I am testing quite a bit.
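For reference, it's a standard EF Code-First initializer, roughly this shape (the context and entity names here are placeholders, not my real ones):

    using System.Data.Entity;

    // Drops and re-creates the database on every app start, then seeds it.
    public class ProjectInitializer : DropCreateDatabaseAlways<MyAppContext>
    {
        protected override void Seed(MyAppContext context)
        {
            context.Widgets.Add(new Widget { Name = "Sample" });
            context.SaveChanges();
        }
    }

    // Registered in Global.asax.cs, Application_Start:
    // Database.SetInitializer(new ProjectInitializer());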
The other problem I have is that if I delete the database, it fails to recreate it. It says that my Windows account does not have access to the (now non-existent) database that it is trying to create. I literally have to create a new database with a new name, as anything that was previously created with that DB name fails.
I assume there is some sort of residual info being left somewhere that is out of sync, but I don't know what it is. I've closed all connections to the file in VS2010 and deleted the files - both any found via VS2010 and any physical files I see in the App_Data directory.
Any help or suggestions would be appreciated.
Shut down the web server (Cassini, IIS, IIS Express) and try again. The file can remain locked if the web process is still referencing it. In addition, the loaded EF context will retain the db name. Also make sure the Visual Studio development server isn't still running in the system tray.
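If it still stays locked after that, ADO.NET connection pooling is a likely culprit, since pooled connections keep the attached .mdf open. A quick way to rule that out (temporarily, e.g. in a test or at shutdown) is to clear the pools:

    using System.Data.SqlClient;

    // Closes all pooled connections, releasing any handle on the attached .mdf.
    SqlConnection.ClearAllPools();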