I am using a SQL 2008 database project (in Visual Studio) to manage the schema and initial test data for my project. The database project uses a post-deployment script which includes a number of other scripts using SQLCMD's ":r" syntax.
I would like to be able to conditionally include certain files based on a SQLCMD variable. This will allow me to run the project several times with our nightly build to set up various versions of the database with different configurations of the data (for a multi-tenant system).
I have tried the following:
IF ('$(ConfigSetting)' = 'Configuration1')
BEGIN
print 'inserting specific configuration'
:r .\Configuration1\Data.sql
END
ELSE
BEGIN
print 'inserting generic data'
:r .\GenericConfiguration\Data.sql
END
But I get a compilation error:
SQL01260: A fatal parser error occurred: Script.PostDeployment.sql
Has anyone seen this error or managed to configure their post-deployment script to be flexible in this way? Or am I going about this in the wrong way completely?
Thanks,
Rob
P.S. I've also tried changing this around so that the path to the file is a variable, similar to this post. But this gives me an error saying that the path is incorrect.
UPDATE
I've now discovered that the if/else syntax above doesn't work for me because some of my linked scripts require a GO statement. Essentially the :r just imports the scripts inline, so this becomes invalid syntax.
If you need a GO statement in the linked scripts (as I do), there isn't an easy way around this. I ended up creating several post-deployment scripts and then changing my project to overwrite the main post-deployment script at build time depending on the build configuration. This is now doing what I need, but it seems like there should be an easier way!
For anyone needing the same thing - I found this post useful
So in my project I have the following post deployment files:
Script.PostDeployment.sql (empty file which will be replaced)
Default.Script.PostDeployment.sql (links to scripts needed for standard data config)
Configuration1.Script.PostDeployment.sql (links to scripts needed for a specific data config)
I then added the following to the end of the project file (right-click the project to unload it, then right-click again to edit it):
<Target Name="BeforeBuild">
<Message Text="Copy files task running for configuration: $(Configuration)" Importance="high" />
<Copy Condition=" '$(Configuration)' == 'Release' " SourceFiles="Scripts\Post-Deployment\Default.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
<Copy Condition=" '$(Configuration)' == 'Debug' " SourceFiles="Scripts\Post-Deployment\Default.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
<Copy Condition=" '$(Configuration)' == 'Configuration1' " SourceFiles="Scripts\Post-Deployment\Configuration1.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
</Target>
Finally, you will need to set up matching build configurations in the solution.
Also, for anyone trying other workarounds, I also tried the following without any luck:
Creating a post-build event to copy the files instead of having to hack the project file XML. I couldn't get this to work because I couldn't form the correct path to the post-deployment script file. This Connect issue describes the problem.
Using variables for the script path to pass to the :r command. But I came across several errors with this approach.
I managed to work around the problem using the noexec method.
So, instead of this:
IF ('$(ConfigSetting)' = 'Configuration1')
BEGIN
print 'inserting specific configuration'
:r .\Configuration1\Data.sql
END
I reversed the conditional and set NOEXEC ON to skip over the imported statement(s) thusly:
IF ('$(ConfigSetting)' <> 'Configuration1')
SET NOEXEC ON
:r .\Configuration1\Data.sql
SET NOEXEC OFF
Make sure you turn it back off if you want to execute any subsequent statements.
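For completeness, here is a sketch of the same trick covering both data sets (paths as in the original scripts). Note that while NOEXEC is ON the skipped statements are still compiled, so the referenced scripts must at least be valid T-SQL:

IF ('$(ConfigSetting)' <> 'Configuration1')
SET NOEXEC ON
:r .\Configuration1\Data.sql
SET NOEXEC OFF

IF ('$(ConfigSetting)' = 'Configuration1')
SET NOEXEC ON
:r .\GenericConfiguration\Data.sql
SET NOEXEC OFF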
Here's how I am handling conditional deployment within the post deployment process to deploy test data for the Debug but not Release configuration.
First, in solution explorer, open the project properties folder, and right-click to add a new SqlCmd.variables file.
Name the file Debug.sqlcmdvars.
Within the file, add your custom variables, and then add a final variable called $(BuildConfiguration), and set the value to Debug.
Repeat the process to create a Release.sqlcmdvars, setting the $(BuildConfiguration) to Release.
Now, configure your configurations:
Open up the project properties page to the Deploy tab.
On the top dropdown, set the configuration to be Debug.
On the bottom dropdown, (Sql command variables), set the file to Properties\Debug.sqlcmdvars.
Repeat for Release as:
On the top dropdown, set the configuration to be Release.
On the bottom dropdown, (Sql command variables), set the file to Properties\Release.sqlcmdvars.
Now, within your Script.PostDeployment.sql file, you can specify conditional logic such as:
IF 'Debug' = '$(BuildConfiguration)'
BEGIN
PRINT '***** Creating Test Data for Debug configuration *****';
:r .\TestData\TestData.sql
END
In solution explorer, right click on the top level solution and open Configuration Manager. You can specify which configuration is active for your build.
You can also specify the configuration on the MSBUILD.EXE command line.
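For example, something along these lines (a sketch; the solution and project names are placeholders):

msbuild MySolution.sln /p:Configuration=Debug

For the database project itself you can also invoke the Deploy target directly, e.g. msbuild MyDatabase.dbproj /t:Build;Deploy /p:Configuration=Debug.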
There you go: now your developer builds have test data, but your release builds don't!
As Rob worked out, GO statements aren't allowed in the linked SQL scripts, as they would end up nested inside the BEGIN/END block.
However, I have a different solution to his - if possible, remove any GO statements from the referenced scripts, and put a single one after the END statement:
IF '$(DeployTestData)' = 'True'
BEGIN
:r .\TestData\Data.sql
END
GO -- moved from Data.sql
Note that I've also created a new variable in my sqlcmdvars file called $(DeployTestData), which allows me to turn test script deployment on and off.
I found a hack from an MSDN blog which worked fairly well. The trick is to write the commands to a temp script file and then execute that script instead. Basically the equivalent of dynamic SQL for SQLCMD.
-- Helper newline variable
:setvar CRLF "CHAR(13) + CHAR(10)"
GO
-- Redirect output to the TempScript.sql file
:OUT $(TEMP)\TempScript.sql
IF ('$(ConfigSetting)' = 'Configuration1')
BEGIN
PRINT 'print ''inserting specific configuration'';' + $(CRLF)
PRINT ':r .\Configuration1\Data.sql' + $(CRLF)
END
ELSE
BEGIN
PRINT 'print ''inserting generic data'';' + $(CRLF)
PRINT ':r .\GenericConfiguration\Data.sql' + $(CRLF)
END
GO
-- Change output to stdout
:OUT stdout
-- Now execute the generated script
:r $(TEMP)\TempScript.sql
GO
The TempScript.sql file will then contain either:
print 'inserting specific configuration';
:r .\Configuration1\Data.sql
or
print 'inserting generic data';
:r .\GenericConfiguration\Data.sql
depending on the value of $(ConfigSetting) and there will be no problems with GO statements etc. when it is executed.
I was inspired by Rob Bird's solution. However, I am simply using the Build Events to replace the post-deployment scripts based on the selected build configuration.
I have one empty "dummy" post-deployment script.
I set up a pre-build event to replace this "dummy" file based on the selected build configuration (see attached picture).
I set up a post-build event to put the "dummy" file back after the build has finished (see attached picture). The reason is that I do not want the build to generate changes in change control.
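The two events boil down to simple copy commands along these lines (a sketch only; the folder layout and the Dummy/configuration-specific file names are assumptions, so adjust them to your project):

rem Pre-build event command line: swap in the script for the selected configuration
copy /Y "$(ProjectDir)Scripts\Post-Deployment\$(ConfigurationName).Script.PostDeployment.sql" "$(ProjectDir)Scripts\Post-Deployment\Script.PostDeployment.sql"

rem Post-build event command line: restore the empty dummy so nothing shows up in change control
copy /Y "$(ProjectDir)Scripts\Post-Deployment\Dummy.Script.PostDeployment.sql" "$(ProjectDir)Scripts\Post-Deployment\Script.PostDeployment.sql"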
Related
After a few mistaken builds with the Release configuration pushed stuff to other environments, I'd like to have a warning or prompt of some sort to stop me from doing such madness when I don't really mean to.
Is there a way to make this happen? :)
The simplest way that I can see of doing this is to leverage the Build Events dialog in the Project Settings.
First add a file called usermessage.vbs to the solution. It should contain the following:
a = MsgBox("Continue with Debug Build",1,"Build Configuration Warning")
if a=1 then WScript.Quit(0) Else WScript.Quit(1) End If
This will present an OK/Cancel dialog that returns an error exit code unless you click OK.
Add this code to the Pre-build event command line:
if $(ConfigurationName) == Debug WSCRIPT.EXE "$(SolutionDir)usermessage.vbs"
This will run the script if you build in debug configuration.
The script will error and the build will halt unless you click OK in the dialog.
I'm using CodeSmart 3.4.6.297 to get code metrics from my VB6 projects.
Now I want to include CodeSmart in my build process using TeamCity.
I know that CodeSmart can be used from the command line; this works for me. I have my command-line XML file:
<sourcemonitor_commands>
<write_log>true</write_log>
<command>
<project_file>Path to xyz.smproj</project_file>
<project_language>VB</project_language>
<modified_complexity>true</modified_complexity>
<file_extensions>*.cs,*.Designer.cs,*.frm</file_extensions>
<source_directory>Path to Project</source_directory>
<include_subdirectories>true</include_subdirectories>
<checkpoint_name>Baseline</checkpoint_name>
</command>
</sourcemonitor_commands>
which generates a projectXYZ.smproj file.
My question now is:
How do I export data (code metrics like lines of code) from this .smproj file using a console command?
I can get code metrics when I open my projectXYZ.smproj file in the CodeSmart IDE and export the data via the menu "File... Export Checkpoint Details", but I need to do this from a console command.
Any ideas?
Greetings,
SLimke
Got it myself. I had to add an export section to the command block:
<export>
<export_file>PathToProject\Checkpoint1.xml</export_file>
<export_type>2</export_type> <!-- checkpoint details as XML -->
<export_option>1</export_option> <!-- do not use any of the options set in the Options dialog -->
</export>
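For reference, the complete command file with the export section added to the command block then looks like this:

<sourcemonitor_commands>
<write_log>true</write_log>
<command>
<project_file>Path to xyz.smproj</project_file>
<project_language>VB</project_language>
<modified_complexity>true</modified_complexity>
<file_extensions>*.cs,*.Designer.cs,*.frm</file_extensions>
<source_directory>Path to Project</source_directory>
<include_subdirectories>true</include_subdirectories>
<checkpoint_name>Baseline</checkpoint_name>
<export>
<export_file>PathToProject\Checkpoint1.xml</export_file>
<export_type>2</export_type>
<export_option>1</export_option>
</export>
</command>
</sourcemonitor_commands>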
I am using WiX 3.5 and making an installer. I have used heat.exe to bundle all the files.
It produced a WiX fragment file, which I referenced in the main .wxs file as a ComponentGroup ref. When I build my installer, it throws the following error:
light.exe : error LGHT0103 : The system cannot find the file
'..........\target\tmp-release\jboss-eap-5.0\jboss-as\server\all\deploy\httpha-invoker.sar\invoker.war\WEB-INF\classes\org\jboss\invocation\http\servlet\ReadOnlyAccessFilter.class'
with type ''.
It is able to load many files from this location, except the one above, even though the file is present.
Looks like you've hit the linker bug. As far as I can see, it was already reported to the WiX team and was scheduled for v4.0. The comment on the issue states the path is more than 255 characters, so a possible workaround for you is to rework the file/folder layout to avoid paths of that length.
Hope this helps.
Ravz1234's answer works! I used it with an environment variable, e.g. env.SourcePath.
1) Set an environment variable pointing to your source dir, e.g. C:\SourceDir
2) On heat.exe, add the argument -var env.SourcePath along with the other arguments, as sketched below.
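For example, the heat call might look roughly like this (the component group, directory reference, and output file names are placeholders; the important part is -var env.SourcePath):

heat.exe dir "C:\SourceDir" -cg MyComponentGroup -dr INSTALLLOCATION -var env.SourcePath -gg -srd -sfrag -out HarvestedFiles.wxs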
I used the variable for the directory, sys.SOURCEFILEDIR, and it worked well.
I just installed mogenerator+xmo'd on my development machine and would like to start playing with it. The only instructions I could really find online were from a previous SO post, and those don't work with XCode 4 (or at least ⌘I doesn't pull up metadata any more and I don't know how).
So to get things up and running, is it just a matter of adding xmod to the .xcdatamodeld's comments (wherever those are), after which the classes will be generated/updated on save?
While trying to find this answer myself, I found a MOGenerator and Xcode 4 integration guide on esenciadev.com. This solution is not a push-button integration, but it works. The link has detailed instructions, but generally you:
Copy the shell scripts into your project
Add build rules to your target to run the two shell scripts
When you build your project, the script runs MOGenerator on all .xcdatamodel files in your project directory. After the build, if the script generates new class files, you must manually add them to your project. Subsequent builds will remember existing MO-Generated files.
Caveats:
The example's build rule assumes you put the scripts into a /scripts/ folder on disk within your project directory. When I ignored this detail (creating a group in the project but not a folder on disk) I got a build error. Make sure the build rule points to the script's actual file location.
The script uses the --base-class argument. Unless your model classes are subclasses of a custom class (not NSManagedObject), you must delete this argument from the script. E.g.,
mogenerator --model "${INPUT_FILE_PATH}/$curVer" --output-dir "${INPUT_FILE_DIR}/" --base-class $baseClass
Now that Xcode 4 is released, take a look at the Issues page for mogenerator.
After I make changes to my model file, I just run mogenerator manually from the terminal. Using Xcode 4 and ARC, this does the trick:
cd <directory of model file>
mogenerator --model <your model>.xcdatamodeld/<current version>.xcdatamodel --template-var arc=YES
Maybe I'll use build scripts at some point, but the terminal approach is too simple to screw up.
I've found a script in "Build Phases" to be more reliable than "Build Rules".
Under "Build Phases" for your Target, choose the button at the bottom to "Add Run Script". Drag the run script to the top so that it executes before compiling sources.
Remember that the actual data model files (.xcdatamodel) are contained within a package (.xcdatamodeld), and that you only need to compile the latest data model for your project.
Add the following to the script (replacing text in angle-brackets as appropriate)
MODELS_DIR="${PROJECT_DIR}/<path to your models without trailing slash>"
DATA_MODEL_PACKAGE="$MODELS_DIR/<your model name>.xcdatamodeld"
CURRENT_VERSION=`/usr/libexec/PlistBuddy "$DATA_MODEL_PACKAGE/.xccurrentversion" -c 'print _XCCurrentVersionName'`
# Mogenerator Location
if [ -x /usr/local/bin/mogenerator ]; then
echo "mogenerator exists in /usr/local/bin path";
MOGENERATOR_DIR="/usr/local/bin";
elif [ -x /usr/bin/mogenerator ]; then
echo "mogenerator exists in /usr/bin path";
MOGENERATOR_DIR="/usr/bin";
else
echo "mogenerator not found"; exit 1;
fi
$MOGENERATOR_DIR/mogenerator --model "$DATA_MODEL_PACKAGE/$CURRENT_VERSION" --output-dir "$MODELS_DIR/"
Add options to mogenerator as appropriate. --base-class <your base class> and --template-var arc=true are common.
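For example, the last line of the script might become something like this (MyBaseManagedObject is a placeholder for your own base class):

$MOGENERATOR_DIR/mogenerator --model "$DATA_MODEL_PACKAGE/$CURRENT_VERSION" --output-dir "$MODELS_DIR/" --base-class MyBaseManagedObject --template-var arc=true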
Random tip: if you get Illegal Instruction: 4 when you run mogenerator, install it from the command line:
$ brew update && brew upgrade mogenerator
In the build log of my project, I can see these properties:
<integrationProperties>
<CCNetProject>Gdet_T</CCNetProject>
...
<LastModificationDate>4/6/2010 1:29:04 PM</LastModificationDate>
<LastChangeNumber>10841</LastChangeNumber>
</integrationProperties>
I want to pass the properties CCNetProject and LastChangeNumber to a batch file. It works well with CCNetProject, as it can be used in the batch file as the environment variable %CCNetProject%.
But it doesn't work with the other properties (those not starting with the CCNet prefix), such as LastChangeNumber or LastModificationDate.
I tried to pass it as an argument, but it fails:
<exec>
<executable>$(WorkingFolderBase)\MyBatch.bat</executable>
<baseDirectory>$(WorkingFolderBase)\</baseDirectory>
<buildArgs>$(LastModificationDate)</buildArgs>
</exec>
I tried to pass it as an environment variable, but that fails too:
<exec>
<executable>$(WorkingFolderBase)\MyBatch.bat</executable>
<baseDirectory>$(WorkingFolderBase)\</baseDirectory>
<environment>
<variable>
<name>svn_label</name>
<value>"${LastModificationDate}"</value>
</variable>
</environment>
</exec>
The result is always the same when I display the parameter or variable: an empty string or the literal variable name $(svn_label).
I'm sure it is simple, but... I can't find it! Any ideas?
CCNET passes the following parameters to external programs:
CCNetArtifactDirectory
CCNetBuildCondition
CCNetBuildDate
CCNetBuildTime
CCNetFailureUsers
CCNetIntegrationStatus
CCNetLabel
CCNetLastIntegrationStatus
CCNetListenerFile
CCNetModifyingUsers
CCNetNumericLabel
CCNetProject
CCNetProjectUrl
CCNetRequestSource
CCNetUser
CCNetWorkingDirectory
As you can see, LastIntegrationStatus, for example, is available through CCNetLastIntegrationStatus, but LastModificationDate has no equivalent.
You can pass additional arguments via <buildArgs> or <environment>, but inside the CCNET configuration you have no access to the integration properties mentioned above. Most people starting with CCNET (including myself) try something like <buildArgs>$(CCNetProject)</buildArgs> and fail.
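The CCNet* values listed above do, however, arrive as environment variables inside the external program, so the batch file itself can read them, for example (a minimal sketch):

rem inside MyBatch.bat - CCNET passes these as environment variables
echo Project: %CCNetProject%
echo Label: %CCNetLabel%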
Have a look at my answer to a similar question.
Sorry I can't provide a better solution.
Update (regarding Thinker's suggestion):
Using $[$CCNetLabel] inside CCNET configuration does not seem to work.
Frankly, I would have been rather surprised if it had. The configuration is something static, whereas CCNetLabel is something dynamic that potentially changes with every integration build. If you had access to these dynamic properties inside the configuration, the configuration might change with every build. Since changing the configuration means automatically restarting the CCNET server, you would cause a server restart with every build. Not exactly desirable behavior, is it?
OK, found the solution.
You need to use a specific labeller called SvnRevisionLabeller to retrieve the SVN revision.
It is then available via the CCNetLabel environment variable.
http://code.google.com/p/svnrevisionlabeller/
<labeller type="svnRevisionLabeller">
<url>http://mysvnrootproject/trunk</url>
</labeller>