In Azure Data Studio, queries are all disconnected

I am trying to move from SSMS to Azure Data Studio, primarily to take advantage of the easy Git integration. In Azure Data Studio, all my queries (the .sql files) show up and everything works as expected. It was easy to add GitHub integration. But unlike in SSMS, each query is disconnected at startup. I've saved the collection of files as a workspace, and the server I want is shown as an active connection in the server list. Is there a way to make the queries default to being connected at startup?

Unfortunately, that is not possible at the moment. There is already an open issue in the Azure Data Studio GitHub repo requesting a default connection for files in a folder/workspace. You can find it here.

Related

TFS Migration Risks

I would like to create a new installation of TFS 2013 on a new server.
I did my research and learned that the migration process, as described in the link below, carries some risks:
TFS Migration Manual:
https://msdn.microsoft.com/en-us/library/ms404869.aspx
Risks:
http://blogs.msmvps.com/p3net/2014/04/12/tfs-upgrade-nightmares/
I plan to avoid using the TFS migration manual above; instead, I would check all of my projects out (about 20), re-create them on the new TFS, and check them in again.
However, we have work items, users, workspaces and other agile information which I have created for my projects, and which I still need on the new installation.
I was wondering whether the following works (again without risks and hassle, as time is scarce):
Back up the TFS databases from the old installation and restore them into the new installation, or simply import the data from old to new using SQL Server's data import tool.
I am particularly referring to these databases, which TFS has:
Tfs_Configuration; Tfs_DefaultCollection; Tfs_Warehouse.
I found these databases on the SQL Server instance which TFS uses.
Also, this approach is easier and avoids obstructing the team, as the database restoration can occur after hours.
Now, will this plan work?
No, your plan will not work and will leave your TFS in an unsupported state.
You need to follow a combination of the Upgrade and "changing environment" workflows.
1) Restore all TFS databases (tfs_*) to the new environment
2) Install TFS 2015
3) Configure and select the Upgrade Wizard - when running it, make sure you have all the new server names
4) (optional) ChangeServerID - if this is a practice run you should then immediately:
4.1) unconfigure the application tier with "tfsconfig.exe setup /uninstall:all"
4.2) run the ChangeServerID command
4.3) reconfigure TFS and run the "app tier only" wizard
Simples....
Note: You need to change the server ID if this is a test/practice instance, as each server gets a unique ID. When clients first connect to the new server they will "upgrade/migrate" the user's data across. You don't want that happening for a trial... so change the ID...
WARNING: If you manipulate the data in the TFS server in any way that is not done by the TFS Product Team tools you will turn your instance to crap. Do not ever edit, or cause to edit, the data in the operational store.

Managing connection strings in source controlled application that is continuously deployed to Azure websites

Here's the scenario: I have multiple developers on an ASP.NET MVC 4 project. Each developer has a local database. The source control system is TFS at http://tfs.visualstudio.com. We're using Azure websites to host the site. We have also configured Azure websites for continuous deployment.
The source control system could be git, mercurial, TFS, etc. Honestly, I don't think it matters.
My question is how to accomplish these four things:
Each developer has his/her own connection string(s) locally (without them being in source control)
Azure has its own connection string(s) (without it being in source control)
Source Control doesn't show any connection information
The ability for each developer to F5 and run/debug/test the app locally.
We accomplished #1 by adding our individual connection strings to our machine.config so that there's no conflict between developer workstation setups.
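For illustration, each developer's machine.config entry might look something like the following sketch; the connection name, catalog, and local instance here are invented placeholders, and the name just has to match what the application expects. (machine.config already contains a connectionStrings section, so only the add element inside it is new.)
    <connectionStrings>
      <!-- hypothetical per-developer entry kept out of source control -->
      <add name="DefaultConnection"
           connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=MyAppDev;Integrated Security=True"
           providerName="System.Data.SqlClient" />
    </connectionStrings>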
I originally removed the connectionStrings section from web.config. In the Azure website (using the management portal, under Configure), I configured the connection strings, and after watching a Scott Hanselman video I was under the impression that these would be dynamagically merged into my web.config upon deployment, but that doesn't appear to happen. Whenever I go to any page that hits the database, I get an error saying it can't find the connection string (or some other db error related to the connection).
If I put the Azure connection string directly in web.config, things work on Azure, but then the connection details are in source control, visible to everybody.
After reading some more posts from Scott and David Ebbo, it seems that I could put a blank connection string in web.config (with the correct name) and Azure would then overwrite the values correctly. I would then have to have the developers put their connection strings in their web.debug.config and install the Slow Cheetah plugin so that they could F5 and test locally. They would also have to not check the web.debug.config into source control (not that easy with TFS). This seems like a seriously burdensome kludge that's bound to fail somewhere along the line.
I have to believe that this isn't that rare of a problem. How do other teams accomplish this?
After looking around, it appears that what I was asking isn't actually supported without a bunch of command-line hacks in the pre/post-build process. What we ended up doing is forcing developers to all create their own local databases, use trusted authentication, and establish a SQL alias that is used by all developers in the web.config. That way it works locally for everybody, it doesn't expose any user names/passwords in source control, and Azure can still overwrite it when the site is automatically deployed from source control.
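A minimal sketch of that arrangement, with invented alias and database names: each developer defines a local SQL Server alias (for example via SQL Server Configuration Manager) that points at their own instance, and the shared web.config only ever references the alias.
    <connectionStrings>
      <!-- "AppDbAlias" is a hypothetical SQL Server alias each developer maps to their own local instance -->
      <add name="DefaultConnection"
           connectionString="Data Source=AppDbAlias;Initial Catalog=MyAppDb;Integrated Security=True"
           providerName="System.Data.SqlClient" />
    </connectionStrings>
Because the connection string configured in the Azure portal is matched by name ("DefaultConnection" here), it still replaces this value at runtime on the deployed site.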
Slow Cheetah is actually a nice solution. It's an extension to web.config transformations. Those transformations let you keep one web.config file and then for each deployment scenario you specify which changes you want to it. For example, your Release configuration will probably remove the debug attribute.
This can also be used to change connection strings. The transformations are applied during the deployment of your project to Azure.
What I've done in the past to make this also work with local development machines is use a web.config with an externalized connection.config file. Each developer created a connection.machinename.config file that was copied to connection.config in a post-build step. Those files don't have to be checked in, and they can never cause conflicts because each machine name is unique.
The release/staging/... configurations used a web.config transformation to replace the connection string element with a specific connection string for that deployment (thereby removing the dependency on the external config file).
Slow Cheetah offers some nice helpers for checking the result of these transformations at design time.
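A rough sketch of that setup, with illustrative file and server names: web.config points at the external file that each developer's post-build step produces, and a Web.Release.config transform replaces the whole section for deployment builds. connection.config itself just contains a plain connectionStrings element with the developer's local entry.
In web.config:
    <connectionStrings configSource="connection.config" />
In Web.Release.config:
    <configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
      <!-- replaces the entire connectionStrings element, including the configSource reference -->
      <connectionStrings xdt:Transform="Replace">
        <add name="DefaultConnection"
             connectionString="Data Source=release-db-server;Initial Catalog=MyAppDb;Integrated Security=True"
             providerName="System.Data.SqlClient" />
      </connectionStrings>
    </configuration>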

Prevent accidental update of host database WebMatrix

I use WebMatrix for deployment of files and databases during development. I want to continue to use it after development for updating files on the host. I also want to synchronize my local database to the host database, but never the other way around. I am terrified of accidentally overwriting the database on the host.
How can I do what I want while safeguarding against accidental updates of the database on the host? Essentially, I am looking for a way to tell WebMatrix that the host database is read only and not to be updated.
You can link the development project to the live database. That way you can still develop the files offline while using the actual data. If you link to the database, WebMatrix won't attempt to update it, as it will already be working with it. Yes, if you make a change for development purposes, it will change it on the live site. However, if you are only developing the web pages, this should not be a concern. To link to your database, just go to the Databases workspace and click on the New Connection icon. Just remember that any changes to the database in WebMatrix after that point will be applied immediately to your live database.

How do you force the deletion of a TFS 2010 workspace on a client when the TFS Server no longer exists?

I currently have a TFS 2010 Server running on SERVER-1. On my client (MY-CLIENT) I have VS2010 running and have a workspace associating SERVER-1 with \MY-CLIENT\Development. All is good.
I was playing around with setting up a different instance of TFS on SERVER-2. On my client, I deleted the original SERVER-1 workspace and created a new workspace associating SERVER-2 with \MY-CLIENT\Development. All is good.
Having finished my experiments with TFS on SERVER-2, I re-imaged the machine (deleting the TFS Server on SERVER-2).
I then went back to my client machine, reconnected to TFS on SERVER-1, and attempted to remap source control to my Development folder. However, I am now receiving the error "The path \MY-CLIENT\Development is already mapped in workspace MY-CLIENT;SERVER-2\Steve." Now I have a problem.
So, I gather from this that I should have first deleted the SERVER-2 workspace BEFORE re-imaging the machine. Unfortunately, I did not do that.
Poking around in some forums, I realized that I could perhaps use a command-line tool to delete it:
tf workspace /delete MY-CLIENT;SERVER-2\Steve
However, when I run this, I get a message indicating that "Team Foundation services are not available from server http://SERVER-2:8080/tfs/development."
So the question, then, is how do I force deletion of the SERVER-2 workspace on my client so that I can re-create my old SERVER-1 workspace?
The working folder mappings for all the local workspaces are stored in the version control cache file. This allows TFS clients to bootstrap, locating the server information for a given local folder. In addition, it provides the information for the check you're seeing, which prevents a local folder from being mapped to two different servers.
In order to clean this up (without trying to connect to the server), you can use the tf workspaces command (note the pluralization - the workspaces command operates on the list of workspaces, while the workspace command operates on a single workspace and generally requires connectivity to the server that workspace is located on).
To delete all workspaces for your deleted project collection, you can do:
tf workspaces /remove:* /collection:http://server-2:8080/tfs/DefaultCollection
(Obviously replacing the project collection URI with the URI for your deleted server.)
I had exactly the same issue: after moving the TFS server to another machine, I couldn't map to a local folder in VS2012 on the old machine because it was still associated with an old workspace that TFS denied all existence of. After many hours (and days) of searching Google and trying different things, none of which worked (including all the "tf" commands, deleting the local cache, etc.), this is how I eventually solved it:
Edit the actual TFS collection database on the TFS server using SQL Management Studio Express (e.g. "Tfs_DefaultCollection")
Look for the "dbo.tbl_Workspace" table and edit it
You should see your "ghost" workspace(s) in here
Delete the rows
All is right in the world
The workspaceowner parameter on the delete command is optional. Can you issue the delete without that parameter, or will that damage another MY-CLIENT workspace?

Attached SQL Express DB is causing problems

I have been asked to create an MVC web application in VS 2010, and was instructed to use a SQL Express database for my data. I am using EF Code First for creating and managing my data. The database was created in VS2010 and is attached via "AttachDBFilename" in the web.config.
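(For reference, that kind of attach entry in web.config typically looks roughly like the following; the connection name and file name here are illustrative.)
    <connectionStrings>
      <add name="DefaultConnection"
           connectionString="Data Source=.\SQLEXPRESS;AttachDBFilename=|DataDirectory|\MyDatabase.mdf;Integrated Security=True;User Instance=True"
           providerName="System.Data.SqlClient" />
    </connectionStrings>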
I have used SQL CE before with MVC with no problems, however the attached SQL Express DB is causing weird issues.
For one thing, when I try to deploy the app, it fails and tells me that it cannot copy the database.mdf because it is in use by another process. I have NOT opened the database in VS2010 nor SSMS. Of course the program code accesses it - is there some reason that connection would remain open? I am using boilerplate code from the scaffolding.
I should mention that I use a ProjectInitializer.cs to create the sample data. It runs at every launch for the moment, since I am testing quite a bit.
The other problem I have is that if I delete the database, it fails to recreate it. It says that my Windows account does not have access to the (now non-existent) database that it is trying to create. I literally have to create a new database with a new name, as anything that was created previously (with that DB name) fails.
I assume there is some sort of residual info left somewhere that is out of sync, but I don't know what it is. I've closed all connections to the file in VS 2010 and deleted the files, both any found via VS2010 and any physical files I see in the App_Data directory.
Any help or suggestions would be appreciated.
Shut down the web server (Cassini, IIS, IIS Express) and try again. The file can remain locked if the web process is still referencing it. In addition, the loaded EF context will retain the database name. Also ensure the Visual Studio development web server isn't still running in the system tray.