Is it possible to recover an HSQLDB from the data file alone - teamcity

A delete was executed on the folder containing an HSQLDB. The only file that was locked by the system (and thus not deleted) was the database's .data file. Is it possible to recover the database from this file alone?

If the delete was done within the BuildServer directory itself, and not just within BuildServer/system, you are out of luck, since all the builds and their build-step configurations are stored within BuildServer/config/projects.
The database only stores build logs, changes, users and so on, but not the actual configuration; that all lives in XML config files on the file system.
If the delete was done within BuildServer/system, you may be able to start up a clean TeamCity instance to rebuild the BuildServer/system directory and then shut it down. Once it's down, swap in the surviving buildserver.data file and bring the instance up again. (I'm trying this now, but it's taking forever to start up. If I find out more, I'll edit.)
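The swap step above can be sketched roughly as follows. This is only an illustration: the directory names are stand-ins created by the script itself, and in reality you would stop TeamCity before touching the files.

```shell
#!/bin/sh
set -e
# --- demo setup: stand-ins for the real directories ---
TC_SYSTEM="./BuildServer/system"   # system dir of the freshly-initialised TC instance
RECOVERED="./recovered"            # wherever the surviving data file was saved
mkdir -p "$TC_SYSTEM" "$RECOVERED"
echo "fresh"    > "$TC_SYSTEM/buildserver.data"
echo "survivor" > "$RECOVERED/buildserver.data"

# --- the swap itself: keep the freshly created file, drop in the survivor ---
mv "$TC_SYSTEM/buildserver.data" "$TC_SYSTEM/buildserver.data.fresh"
cp "$RECOVERED/buildserver.data" "$TC_SYSTEM/buildserver.data"
cat "$TC_SYSTEM/buildserver.data"
```

Keeping the fresh file around as `.fresh` means the clean instance can be restored if the recovered data turns out to be unusable.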

Related

Safe to clean out C:\ProgramData\firebird folder when FB offline?

Is it safe to clean out the contents of the C:\ProgramData\firebird
folder, i.e. wipe it, when the Firebird service (superserver, v3.0) is not
running?
I understand that it contains lock tables etc. so should not be touched
while FB is running. But it's not clear to me if it can be wiped safely
when FB is not running, or if it contains data that can be vital when FB
starts up again.
My situation is that I'm migrating a VM with an FB installation.
Migration has been done like this, due to practical reasons (uptime vs.
file transfer & VM conversion time):
1. Snapshot of the source VM, i.e. the nightly backup, is copied to the new location. The source stays up and running. The copy process takes about one day. (We have the databases locked with nbackup when the nightly snapshot is taken.)
2. The snapshot is unpacked at the target location, converted from VMware to Hyper-V, and brought online for additional reconfiguration and system testing.
3. A few days pass.
4. Both source and target Firebird services are stopped, so no database activity is going on anywhere.
5. Files are synced from source to target, including database files. This file transfer is much smaller than in step 1, so it can be done during the offline window.
In step 5 I find diffs in the C:\ProgramData\firebird folder, and I'm
wondering what would be the best approach:
A) Wipe the folder at target.
B) Sync so target has the same content as source.
C) Leave target as is.
Please note that when FB service is started again at target, the
database files will be identical with those at the source at the time of
FB shutdown, and probably won't "match" the contents of
C:\ProgramData\firebird at target. I would assume that this fact rules
out option C).
The files in C:\ProgramData\firebird are only used during runtime of the Firebird server and contain transient data. It should be safe to delete these files when Firebird is not running.
In other words, when migrating from one server to another, you do not need to migrate the contents of C:\ProgramData\Firebird.

Get Octopus Deployment Folder at beginning of release process

I'm using an Octopus custom step template with a PowerShell script to delete old deployments by age or by count. However, it relies on running after an actual deployment step, so that it can access that step's output variables. For example,
DeploymentFolder:
#{Octopus.Action[Deployment Step Name].Output.Package.InstallationDirectoryPath}
This works great when, for example, you haven't run out of disk space. It doesn't work so well when disk space is low (since this step wasn't running before) and Octopus can't deploy a new version due to the low-disk-space condition, so you can't run the delete-old-files step at all. (I know one can change the Octopus parameter for what's considered too little disk space, but I'd rather not do that.) There is a further problem if you want to enter a hardcoded path (say you're experiencing a temporary problem with your script and just need to delete a bunch of old deployments, but don't currently have the permissions needed to delete them manually from the server): it's too easy to accidentally leave off the final version-number-bearing folder and confuse the underlying script.
Is there some way that I can get the deployment folder for the current environment, and combine it with my NuGet package name (and whatever other parts are in the deployment folder name), so I can work out the expected deployment folder in advance?
The actual deployment folder being used is E:\Octopus\Applications\LifeCycle\NugetPackageName.
I think I can get the NugetPackageName with $OctopusParameters['Octopus.Action[Deployment Step Name].Package.NuGetPackageId'] (though I am not sure, as Octopus.Action.Package.NuGetPackageId is listed as an "Action-Level Variable", and thus may not be available in advance of the step running). I'd be willing to hard-code the NuGet package name if I had to.
About the LifeCycle part, I don't know if that is actually a lifecycle name; it may be a coincidence that it matches. For the life of me I can't figure out where it comes from.
I also can't figure out where E:\Octopus\Applications\ comes from, so that I can get this value automatically instead of hard-coding it.
This variable will be available at the start of the deployment:
$OctopusParameters['Octopus.Action[Deployment Step Name].Package.NuGetPackageId']
You can get E:\Octopus\Applications\ from an environment variable called TentacleApplications:
$myRootDir = $env:TentacleApplications
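Putting the two pieces together, the expected deployment folder can be reconstructed before any step has run. The sketch below uses POSIX shell for illustration (a real Octopus script would be PowerShell, e.g. `$env:TentacleApplications`); the assumption that the middle path segment is the environment name, and the fallback values, are mine, not confirmed by the question.

```shell
#!/bin/sh
set -e
# Hedged sketch: build the expected deployment folder up front.
# All defaults are illustrative placeholders.
ROOT="${TentacleApplications:-E:/Octopus/Applications}"   # Tentacle application root
ENVIRONMENT="${OctopusEnvironmentName:-LifeCycle}"        # assumed: environment name segment
PACKAGE="${NuGetPackageId:-NugetPackageName}"             # hard-code if the variable is unavailable
DEPLOY_DIR="$ROOT/$ENVIRONMENT/$PACKAGE"
echo "$DEPLOY_DIR"
```

If the middle segment really is the environment name, the PowerShell equivalent would combine $env:TentacleApplications with $OctopusParameters['Octopus.Environment.Name'] and the package id.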

Staging database version changes in development with RoundhousE

EDITED: from the original, as frankly it was a poor question the first time around....
We have a batch script called DEV.DBDeployment.DropCustomCreate.bat; as the name suggests, it drops and recreates our database from scratch. That's a useful tool in dev, but we don't always want to drop the database, sometimes we just want the latest changes.
It's worth noting that currently every CI check-in triggers a build in TeamCity, which pumps the current Major.Minor.BuildNumber.Revision number (e.g. 1.0.123.1568) into all AssemblyInfo.cs files within all Visual Studio projects. This allows us to stamp the resulting DLLs with the build number, pretty standard stuff. We also overwrite a BuildInfo.txt file in a similar way. Most importantly, this BuildInfo.txt file is included within every deployment package, sits within the RoundhousE\deployment folder, and is referenced by /vf=%version.file% when we run rh.exe from the .bat file. So we're sorted for deploying to existing databases in Test and Prod.
However, in dev the AssemblyVersion is always 0.0.0.0 in AssemblyInfo.cs, as is the version number in BuildInfo.txt, so how do devs stage their changes locally against their own database? With this setup, when we run rh.exe, all changes will be stamped with version 0.0.0.0. Is the expectation that in dev you will always drop and create? If that's the case, I assume we need TeamCity to check in the BuildInfo.txt file so RoundhousE can reference it from source control when executed in dev?
Is there something I'm missing here?
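The version-file flow described above can be sketched as follows. The variable name BUILD_NUMBER and the folder layout are illustrative; the point is that CI writes a real number into the file, while on a dev machine nothing does, so RoundhousE stamps everything 0.0.0.0.

```shell
#!/bin/sh
set -e
# In CI, TeamCity supplies the build number (e.g. 1.0.123.1568);
# on a dev machine it is unset, so the version stays 0.0.0.0.
VERSION="${BUILD_NUMBER:-0.0.0.0}"
mkdir -p RoundhousE/deployment
printf '%s\n' "$VERSION" > RoundhousE/deployment/BuildInfo.txt
cat RoundhousE/deployment/BuildInfo.txt
# rh.exe ... /vf=RoundhousE\deployment\BuildInfo.txt   (as in the question)
```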
I think we discussed this over here - https://github.com/chucknorris/roundhouse/issues/113
As you say, the .bat file is just a wrapper around RoundhousE: you have to run it again and again whenever you want to apply your scripts. If you want the scripts to run when you build the RoundhousE database project, you have to configure that with additional build steps. If you reply, I can explain how.

How to tackle machine-dependent configuration with SVN and VS2010?

To start with some background, I am a member of a small team developing an ASP.NET application. In addition to us, there are 2 other teams working on it, all from different countries. Source code is hosted on a shared SVN server but there is no central testing environment. Each developer runs the app on their own machine and data services are set up per team.
Unfortunately our SVN workflow has some gaps in it: annoyances arise every time there is an SVN update.
It is mainly because each developer and team has a slightly different environment in terms of disk directory structure and configuration (both IIS and the app itself). Hence we get conflicts in configuration files and elsewhere that in essence are not conflicts at all: in runtime (XML) configuration and in *.suo files.
How should we handle this if our objective is to keep checkout, app setup and update as painless as possible?
One option would obviously be master copies. Another one establishing uniformity in developer environments and keeping it. But what about a third alternative?
One thing to do is to not put the .suo files into SVN; there's no reason to do that.
For IIS configuration there should be no argument - uniform environment across the build team.
For app.config files and the like, I tend to keep them in a separate "cfg" directory in the root of the project and use pre-build events to copy in the relevant ones I need depending on the project and environment I'm working on.
You could have a separate build task to copy user-specific config into your output directory. Add a new directory in your project root called "user.config" or something, and leave it empty. Then configure your project build to check this directory for entries and copy them to the output directory. This is easy to do, and each dev can then have their own config without affecting the master copies. Just make sure you have an ignore pattern on that folder so you don't commit user-specific configuration. If you have svnadmin access to your source code repo, you could set a hook to prevent it from ever happening.
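The copy step could look roughly like this as a pre-build script. The file names and directory layout are illustrative (the script creates its own demo files); the idea is simply that a developer-specific file wins over the master copy.

```shell
#!/bin/sh
set -e
# --- demo setup: stand-ins for the real project layout ---
mkdir -p cfg user.config bin
echo "master" > cfg/app.config          # master copy, committed to SVN
echo "mine"   > user.config/app.config  # developer-specific, svn:ignore'd

# --- the pre-build copy step itself ---
if [ -f user.config/app.config ]; then
  cp user.config/app.config bin/app.config   # dev override wins
else
  cp cfg/app.config bin/app.config           # fall back to the master copy
fi
cat bin/app.config
```

Wired into a Visual Studio pre-build event (or an MSBuild Copy task), this keeps the committed master copies untouched while each developer's build uses their own settings.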
Also set ignore patterns on your root directory (recursively) for .suo, .user, _Resharper or any other extensions you think are pertinent. There are some SO questions already on exactly this topic:
Best general SVN Ignore Pattern?
Ignoring *.suo and *.user files in SVN is easy. After that, create two types of config files in Subversion, Development and Server (add Test as well if you use it). See the example below.
ConnectionStringDevelopment.config
ConnectionStringServer.config
AppSettingsDevelopment.config
AppSettingsServer.config
The Server files contain the server configuration. The Development files are not stored in SVN and are ignored there. Every new developer starts by copying the Server files and changing them to match his environment.
See the following example site:
http://code.google.com/p/karkas/source/browse/trunk/Karkas.Ornek/WebSite/web.config
The following lines are of interest:
<appSettings configSource="appSettingsDevelopment.config"/>
<connectionStrings configSource="ConnectionStringsDevelopment.config" />
configSource can be used almost everywhere in web.config, so every piece of configuration can vary per developer. Just follow the naming convention and ignore *Development.config in Subversion; this way no developer-specific config will ever be committed.
It's not a perfect solution (and should only be used if there are not many of those special files), but what I do is add fake files for each case and switch the real file locally to one of them.
In detail: I have a file foo that creates the problem. I also create foo_1 and foo_2, and then locally switch foo to foo_1 (I use TortoiseSVN, so I can't really give you the command line to do that). Then I am working on foo on my machine, but actually committing to foo_1. Other parties could then switch to foo_2...
(I admit this is basically a variant of the master-file approach you suggested yourself, but if there are not many actual changes to those files, it at least reduces the number of conflicts you have to think about.)

Moving a .sdf file into isolated storage

How do I move a .sdf file into my isolated storage, and after I have moved it, is there a way to delete the original, as it is of no further use? I have added my .sdf file as content in my project.
Your question is not very clear, but let me see if I get this. You created a database and added it as content to your project, so that all the data is present when the user installs your app. Then you are copying the data from the read-only .sdf file into a database that you are creating on first run, so that you can read/write to it. Correct?
If so, I do not believe there is a way to delete the read-only file that you included with the install.
If your database is large enough that you are concerned about the space it will take by having two copies of it on the phone, I would suggest placing your data on a server, creating a web service, and access that web service on first run. Place a notice on the screen that lets your user know that it is downloading information that will only be downloaded once, and that subsequent launches will not take as long. Be sure you include code to prevent a problem should the download be interrupted by a phone call, text message, back key press, start button, or other event. Make it be able to continue the download if it was interrupted in a prior run.
To answer your question, .SDF is a format of Microsoft SQL Server Compact (SQL CE) databases. The link you have pasted talks about SQLite databases.
This is the way to download the entire isolated storage from your device onto your machine.
Open cmd and go to the following directory
C:\Program Files\Microsoft SDKs\Windows Phone\v7.1\Tools\IsolatedStorageExplorerTool
then use the isetool.exe to download the Isolated Storage along with the .sdf file onto your machine.
isetool.exe ts xd [Product_id_here_see_WMAppManifest.xml] "D:\Sandbox"
You should get a message like "download successful into D:\Sandbox".
You can also upload the .sdf by replacing the ts argument with rs.
