Is it possible to use conditional compilation symbols in VS build events? - visual-studio-2010

Say for instance I have a Visual Studio project with a configuration called "MyConfig" and I have the compilation symbol MY_CONFIG_SYMBOL defined.
Is there a macro or command to see if MY_CONFIG_SYMBOL is defined in the pre/post build events? Something like #if MY_CONFIG_SYMBOL, but for the build event?

I finally found an answer. The following works perfectly:
if "$(DefineConstants.Contains('DEBUG'))" == "True" <command>
This works for any constants defined in the build, but note that the constant is case-sensitive ('DEBUG' != 'Debug').
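For example, a post-build event along these lines runs an extra copy step only when the DEBUG constant is defined (the echo and xcopy commands are just illustrative):
if "$(DefineConstants.Contains('DEBUG'))" == "True" (
    echo DEBUG is defined, copying debug-only files...
    xcopy /Y "$(ProjectDir)DebugTools\*.*" "$(TargetDir)"
)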

If you mean conditional builds based on the build types (Debug or Release) then yes. Check out these threads:
Conditional Post-build event in Visual Studio 2008
How to run Visual Studio post-build events for debug build only
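In short, the usual pattern from those threads is a plain batch check on the $(ConfigurationName) macro inside the post-build event; a minimal sketch (the copy destination is illustrative):
if "$(ConfigurationName)" == "Debug" (
    copy /Y "$(TargetPath)" "$(SolutionDir)DebugDrop\"
)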

Well, this is not a solution, just an attempt to move things forward by sharing some experiments.
(I've yet to find a way to test conditional compilation symbols.)
This is a way to consolidate switching debug on and off:
<#@ include file="debug.incl" #>
some text1
<# if ( xdebug ) { #>
foo = bas;
<# } #>
more text
Where debug.incl contains:
<#
bool xdebug = true;
#>
The conditional (if) in the first file is able to see the value of xdebug, so output is altered based on the setting of xdebug in debug.incl.
Sadly, however, the output files are not rebuilt on changes to debug.incl, despite the obvious include of it. Even a clean & rebuild doesn't seem to trigger generation, so some separate build construct is needed for that...
(I did try debug.tt instead of debug.incl, to no avail; I switched to .incl so that debug.cs wasn't created from debug.tt.)
The following didn't work very well, as it doesn't see conditional compilation symbols, though it does actually switch on the template debug attribute!
<#
#if DEBUG
bool xdebug = true;
#else
bool xdebug = false;
#endif
#>
some text1
<# if ( xdebug ) { #>
foo = bas;
<# } #>
more text
With <#@ template debug="true" #> vs. <#@ template debug="false" #> you get the conditional output or not, respectively.

Related

'Gulpfile.js - failed to load See output' but output does not show any error information

I am trying to use gulp to copy some JS/CSS from node_modules to wwwroot in an ASP.NET Core app.
I have what I thought was a fairly simple gulpfile.js
var gulp = require('gulp');
gulp.task('copy-files', function () {
    var assets = {
        js: [
            './node_modules/bootstrap/dist/js/bootstrap.js'
        ],
        css: [
            './node_modules/bootstrap/dist/css/bootstrap.css'
        ]
    };
    _(assets).forEach(function (assets, type) {
        gulp.src(assets).pipe(gulp.dest('./wwwroot/' + type));
    });
});
However, when I look at the VS Task Runner, it just shows an error:
But the output window is empty:
How can I get more information about the error?
The answer here worked for me.
I moved the $(PATH) entry above everything else, since I did not have the $(DevEnvDir)\Extensions\Microsoft\Web Tools\External location mentioned in that answer.
For VS 2015
Tools > Options > Projects and Solutions > External Web Tools
For VS 2017
Tools > Options > Projects and Solutions > Web Package Management > External Web Tools
The problem is not related to the path. There must be some problem with the gulp file itself, either a syntax error or a missing package, and unfortunately Visual Studio does not show that specific error, only the generic "failed to load" you see in Task Runner. The right way to see the errors is:
Open a command prompt (preferably in admin mode, which is what I did).
Go to the folder where the gulp file is located.
Run the task from the command line, for example: gulp default
If there is any error, such as a missing package, it will be shown; fix those issues.
Once all errors are fixed, you will see that the gulp task runs successfully.
Go back to Visual Studio's Task Runner Explorer and click Refresh (top-left button); it will show you all the tasks.
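In the gulpfile quoted above, for example, the underscore variable _ is used without ever being required, so running gulp from the command line would most likely report that _ is not defined. A minimal corrected sketch, assuming lodash is installed as a dev dependency:
var gulp = require('gulp');
var _ = require('lodash'); // this require was missing in the original gulpfile

gulp.task('copy-files', function () {
    var assets = {
        js: ['./node_modules/bootstrap/dist/js/bootstrap.js'],
        css: ['./node_modules/bootstrap/dist/css/bootstrap.css']
    };
    // lodash's forEach over an object passes (value, key) to the callback
    _.forEach(assets, function (files, type) {
        gulp.src(files).pipe(gulp.dest('./wwwroot/' + type));
    });
});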
I'm not sure why but opening a cmd prompt at the directory containing gulpfile.js and running npm install has fixed it.
Perhaps someone wiser than I can explain why.
In the Output window, make sure you select Task Runner Explorer in the "Show output from" dropdown. This was why I didn't see the error logs from the gulpfile. A rookie mistake.
I'm using Visual Studio Community 2019 Version 16.5.4.
I had the same problem and found the answer at the following link:
https://developercommunity.visualstudio.com/content/problem/961170/gulpfile-fails-to-load-after-upgrading-to-vs2019-1.html
Gulp uses Node.js, but it is important that the version be compatible. I tried a few versions and in the end version 0.12.7 worked for me. I also had to put the absolute path to the folder where that Node version is installed into VS under
Tools > Options > Projects and Solutions > External Web Tools
and move that path to the top. Placing the path in the environment variables and moving $(PATH) to the top did not help in my case.

Compile error but js gets generated

This is a new computer setup. All the code compiled and ran on my old computer.
I have 2 projects in my solution that use TypeScript.
The first compiles without problems.
The second one shows errors on compile but generates the js on save.
I installed VS 2013 Update 5 and then installed TypeScript 1.8.5.
I also have VS 2015 installed.
After VS 2015 was installed I repaired the TypeScript SDK.
They must have 2 different compiler setups and the one in my project is not set correctly.
I have dug into the csproj and compared the 2 projects but did not find a missing/different parameter for TypeScript.
It's like it doesn't pick up the new version. I get syntax errors in code I know compiles.
If someone has had this problem, please help me.
P.S. Excuse my bad English, I'm French.
UPDATE: Example of code that doesn't compile:
public doSomething(errorCallBack?: (failCallback1?: JQueryPromiseCallback<any> | JQueryPromiseCallback<any>[], ...failCallbacksN: Array<JQueryPromiseCallback<any> | JQueryPromiseCallback<any>[]>) => void)
{}
Error :
Error 218 Build: ',' expected.
A reduced sample code that fails to compile:
class Foo {
    public doSomething(errorCallBack?: (failCallback1?: any, ...failCallbacksN: Array<any>[]>) => void)
    { }
}
You have major syntax errors in ...failCallbacksN: Array<any>[]>. You need something like ...failCallbacksN: Array<any> e.g.:
class Foo {
    public doSomething(errorCallBack?: (failCallback1?: any, ...failCallbacksN: Array<any>) => void)
    { }
}
but js gets generated
This is by design. TypeScript will still emit JavaScript even in the presence of errors. The types are considered invisible to the emitter, so even with type errors TypeScript tries to do graceful recovery and generate JavaScript.
More
See Why TypeScript : https://basarat.gitbooks.io/typescript/content/docs/why-typescript.html
I found that the Visual Studio IDE was using 1.8.9 while the compiler was 1.0. This is why I had such compilation errors.
When I used the "tsc -v" command in the VS2013 command prompt, it showed 1.0.
When I used "where tsc", it showed only the directory for 1.0 and not the one for 1.8.9.
So I replaced the 1.0 content with the 1.8.9 content; I don't use 1.0.
Now I have a 1.0 directory which has 1.8.9 in it.
I think this is not the way it's supposed to be done, but I need it to work.
There is a .targets file I could modify, but I don't want to do that.
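For reference, a less invasive alternative would be to point the project at the newer compiler through the TypeScriptToolsVersion property in the .csproj; a minimal sketch, assuming the 1.8 MSBuild targets are installed:
<PropertyGroup>
  <!-- Tells the TypeScript MSBuild targets which compiler version to use -->
  <TypeScriptToolsVersion>1.8</TypeScriptToolsVersion>
</PropertyGroup>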

WebEssentials tslint custom rules

I have a tslint.json file in my solution directory and I'm trying to create a custom rule following the guidelines on https://www.npmjs.com/package/tslint
I have created a "nonImportsRule.ts", copied the code from the link and added "no-imports": true to my tslint.json file; however, the rule is not being picked up.
The guide says that a rulesDirectory needs to be specified, but I have no idea where this should be configured?
Also - is it possible to setup Web Essentials to break the build if tslint rules are violated?
I had the same kind of problem. I wanted to use the TSLint extensions tslint-microsoft-contrib and codelyzer together with Web Analyzer. This did not work. The first step to figure out why was to make an adaptation in server.js, which can be found in C:\Users\[User]\AppData\Local\Temp\WebAnalyzer1.7.75. I changed the tslint function into:
tslint: function (configFile, files) {
    // Try catch tslint errors
    try {
        var tslint = require("tslint");
        var options = {
            formatter: "json",
            configuration: JSON.parse(fs.readFileSync(configFile, "utf8").trim())
        };
        var results = [];
        for (var i = 0; i < files.length; i++) {
            var file = files[i];
            var ll = new tslint(file, fs.readFileSync(file, "utf8"), options);
            results = results.concat(JSON.parse(ll.lint().output));
        }
    } catch (error) {
        // Return tslint error to visual studio so we can get some ideas for counter measures.
        var result = JSON.parse('[{"endPosition": {"character": 0,"line": 0,"position": 0},"failure": "INSTALL ERROR","name": "/","ruleName": "INSTALL ERROR","startPosition": {"character": 0,"line": 0,"position": 0}}]');
        result[0].failure = error.message;
        return result;
    }
    return results;
},
The alteration resulted in error feedback in the Visual Studio error list when I ran Web Analyzer. Do not forget to force a new instance of node.exe with the task manager after you have applied the alteration. The feedback led, in my particular situation, to the installation of the following npm packages in the following directories:
Packages:
"codelyzer": "0.0.12"
"tslint": "^3.7.3"
"tslint-microsoft-contrib": "^2.0.2"
"typescript": "^1.8.9"
Directories:
C:\Users\[User]\AppData\Local\Temp\WebAnalyzer1.7.75
C:\Program Files (x86)\Microsoft Visual Studio 14.0\Common7\IDE
After this, Web Analyzer was able to use the same tslint rules as my grunt task. Hopefully a newer version of Web Analyzer will solve my problems more elegantly.
Okay, I'm not using the Web Essentials extension but Web Analyzer: https://visualstudiogallery.msdn.microsoft.com/6edc26d4-47d8-4987-82ee-7c820d79be1d
So I won't be able to answer this question 100%, but I want to summarize my experience with custom tslint rules here. First of all, what is not completely clear from the documentation is that the whole thing depends on Node.js.
1. So first of all you need to install Node.js. This will give you the npm command on your command line.
2. After that, install tslint and typescript with npm; https://github.com/palantir/tslint has examples. These will create files in "c:\Users\[Username]\AppData\Roaming\npm\node_modules".
3. Go into "c:\Users\[Username]\AppData\Roaming\npm\node_modules\tslint\lib\rules\". Create noImportsRule.ts there and copy in the following content:
import * as ts from "typescript";
import * as Lint from "../lint";

export class Rule extends Lint.Rules.AbstractRule {
    public static FAILURE_STRING = "import statement forbidden EDE";

    public apply(sourceFile: ts.SourceFile): Lint.RuleFailure[] {
        return this.applyWithWalker(new NoImportsWalker(sourceFile, this.getOptions()));
    }
}

// The walker takes care of all the work.
class NoImportsWalker extends Lint.RuleWalker {
    public visitImportDeclaration(node: ts.ImportDeclaration) {
        // create a failure at the current position
        this.addFailure(this.createFailure(node.getStart(), node.getWidth(), Rule.FAILURE_STRING));
        // call the base version of this visitor to actually parse this node
        super.visitImportDeclaration(node);
    }
}
Note that in the documentation example the Lint import is not given with a relative path; that won't work with this approach (hence the "../lint" above).
4. Fire the command "tsc -m commonjs --noImplicitAny .\noImportsRule.ts". This will compile your custom rule's ts. You will get a bunch of compilation errors, such as: ../enableDisableRules.d.ts(1,21): error TS2307: Cannot find module 'typescript'. It's a good question why these are thrown, but ignore them; the js file will be generated anyway.
5. Add "no-imports": true to your tslint.json (for now this should be a custom one). With this command from the command line:
tslint -c 'sample.tslint.json' test.ts
you will get:
test.ts[1, 1]: import statement forbidden. So you have made the custom rule work! :)
That's all for working from the command line. In addition I made custom rules work with Web Analyzer, at least temporarily.
I needed to copy my custom rule's files to:
c:\Users\[Username]\AppData\Local\Temp\WebAnalyzer1.6.65\node_modules\tslint\lib\rules\ and of course configure the Web Analyzer tslint.json to include the custom rules.
I have no idea how the Web Essentials extension makes this whole thing work with tslint, but I guess it is somewhat similar :). Somewhere there should be a folder (node_modules\tslint\lib\rules) with the rules that tslint uses; you need to copy your custom ones there.
Of course the most elegant solution would be to modify the Web Essentials extension itself and make tslint's custom rules directory configurable from Visual Studio (so my solution is just a workaround).
Here is my custom rule example in the Visual Studio warning list:
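As a closing note on the rulesDirectory question: tslint can also be pointed at a folder of compiled custom rules directly from tslint.json, which avoids copying files into node_modules. A minimal sketch, where ./tslint-rules is an illustrative folder containing the compiled noImportsRule.js:
{
  "rulesDirectory": ["./tslint-rules"],
  "rules": {
    "no-imports": true
  }
}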

T4 include Directive broken after installing VS2013 Update 3

After updating to Visual Studio 2013 Update 3, the following code in a T4 template breaks.
<#@ include file="../File1.tt" #>
It worked as expected in all previous versions of VS.
With this Update, all the variants:
<#@ include file="../File1.tt" #>
<#@ include file="..\File1.tt" #>
<#@ include file="..\\File1.tt" #>
fail with the following error:
There was an error loading the include file '..\\File1.tt'. The transformation will not be run. The following Exception was thrown:
System.ArgumentException: The path is not of a legal form.
at System.IO.Path.NormalizePath(String path, Boolean fullCheck, Int32 maxPathLength, Boolean expandShortPaths)
at System.IO.Path.GetFullPathInternal(String path)
at System.IO.Path.GetFullPath(String path)
at Microsoft.VisualStudio.TextTemplating.Engine.VisitedFiles.Visit(String fileLocation)
at Microsoft.VisualStudio.TextTemplating.Engine.ProcessIncludeDirective(Directive directive, ITextTemplatingEngineHost host, VisitedFiles includedFiles) in line: 234 in file: C:\...\mytemplate.tt
Any known workaround or fix for this issue?
Root cause found.
When implementing the interface ITextTemplatingEngineHost, one of the methods to implement is:
public bool LoadIncludeText(
string requestFileName, out string content, out string location)
In all previous versions before VS2013 Update 3, returning a valid filename in the location output parameter was not required; in fact, we sent string.Empty.
According to the documentation the value can be empty if the template is not file-system based. See the doc reference:
location Type: String
A String that contains the location of the acquired text. If the host searches the registry for the location of include files, or if the host searches multiple locations by default, the host can return the final path of the include file in this parameter. The host can set the location to Empty if the file could not be found or if the host is not file-system based.
However, with Update 3 something has changed and the T4 engine now checks and expects the location parameter to be a non-empty file path. A String.Empty value results in the exception quoted above.
As a temporary workaround: passing a valid file name to the location parameter of LoadIncludeText when implementing ITextTemplatingEngineHost avoids the exception.
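A minimal sketch of that workaround, assuming System.IO is imported and a hypothetical _templateDirectory field holds the folder of the template being processed:
public bool LoadIncludeText(string requestFileName, out string content, out string location)
{
    // Resolve the include relative to the current template's folder and always
    // return a non-empty path, since Update 3 no longer accepts string.Empty here.
    location = Path.GetFullPath(Path.Combine(_templateDirectory, requestFileName));

    if (File.Exists(location))
    {
        content = File.ReadAllText(location);
        return true;
    }

    content = string.Empty;
    return false;
}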
Thanks to @rubenjmarrufo for the help hunting down the bug.
Update: Bug confirmed by Microsoft. It will be fixed in Update 4. The workaround described above remains valid.

Conditional compilation in database project [duplicate]

I am using a SQL 2008 database project (in Visual Studio) to manage the schema and initial test data for my project. The database project uses a post-deployment script which includes a number of other scripts using SQLCMD's ":r" syntax.
I would like to be able to conditionally include certain files based on a SQLCMD variable. This will allow me to run the project several times with our nightly build to set up various versions of the database with different configurations of the data (for a multi-tenant system).
I have tried the following:
IF ('$(ConfigSetting)' = 'Configuration1')
BEGIN
    print 'inserting specific configuration'
    :r .\Configuration1\Data.sql
END
ELSE
BEGIN
    print 'inserting generic data'
    :r .\GenericConfiguration\Data.sql
END
But I get a compilation error:
SQL01260: A fatal parser error occurred: Script.PostDeployment.sql
Has anyone seen this error or managed to configure their postdeployment script to be flexible in this way? Or am I going about this in the wrong way completely?
Thanks,
Rob
P.S. I've also tried changing this around so that the path to the file is a variable, similar to this post. But this gives me an error saying that the path is incorrect.
UPDATE
I've now discovered that the if/else syntax above doesn't work for me because some of my linked scripts require a GO statement. Essentially the :r just imports the scripts inline, so this becomes invalid syntax.
If you need a GO statement in the linked scripts (as I do) then there isn't any easy way around this. I ended up creating several post-deployment scripts and then changing my project to overwrite the main post-deployment script at build time depending on the build configuration. This is now doing what I need, but it seems like there should be an easier way!
For anyone needing the same thing - I found this post useful
So in my project I have the following post deployment files:
Script.PostDeployment.sql (empty file which will be replaced)
Default.Script.PostDeployment.sql (links to scripts needed for standard data config)
Configuration1.Script.PostDeployment.sql (links to scripts needed for a specific data config)
I then added the following to the end of the project file (right click to unload and then right click edit):
<Target Name="BeforeBuild">
<Message Text="Copy files task running for configuration: $(Configuration)" Importance="high" />
<Copy Condition=" '$(Configuration)' == 'Release' " SourceFiles="Scripts\Post-Deployment\Default.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
<Copy Condition=" '$(Configuration)' == 'Debug' " SourceFiles="Scripts\Post-Deployment\Default.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
<Copy Condition=" '$(Configuration)' == 'Configuration1' " SourceFiles="Scripts\Post-Deployment\Configuration1.Script.PostDeployment.sql" DestinationFiles="Scripts\Post-Deployment\Script.PostDeployment.sql" OverwriteReadOnlyFiles="true" />
</Target>
Finally, you will need to setup matching build configurations in the solution.
Also, for anyone trying other work arounds, I also tried the following without any luck:
Creating a post-build event to copy the files instead of having to hack the project file XML. I couldn't get this to work because I couldn't form the correct path to the post-deployment script file. This Connect issue describes the problem.
Using variables for the script path to pass to the :r command. But I came across several errors with this approach.
I managed to work around the problem using the noexec method.
So, instead of this:
IF ('$(ConfigSetting)' = 'Configuration1')
BEGIN
    print 'inserting specific configuration'
    :r .\Configuration1\Data.sql
END
I reversed the conditional and set NOEXEC ON to skip over the imported statement(s) thusly:
IF ('$(ConfigSetting)' <> 'Configuration1')
    SET NOEXEC ON
:r .\Configuration1\Data.sql
SET NOEXEC OFF
Make sure you turn it back off if you want to execute any subsequent statements.
Here's how I am handling conditional deployment within the post deployment process to deploy test data for the Debug but not Release configuration.
First, in solution explorer, open the project properties folder, and right-click to add a new SqlCmd.variables file.
Name the file Debug.sqlcmdvars.
Within the file, add your custom variables, and then add a final variable called $(BuildConfiguration), and set the value to Debug.
Repeat the process to create a Release.sqlcmdvars, setting the $(BuildConfiguration) to Release.
Now, configure your configurations:
Open up the project properties page to the Deploy tab.
On the top dropdown, set the configuration to be Debug.
On the bottom dropdown, (Sql command variables), set the file to Properties\Debug.sqlcmdvars.
Repeat for Release as:
On the top dropdown, set the configuration to be Release.
On the bottom dropdown, (Sql command variables), set the file to Properties\Release.sqlcmdvars.
Now, within your Script.PostDeployment.sql file, you can specify conditional logic such as:
IF 'Debug' = '$(BuildConfiguration)'
BEGIN
    PRINT '***** Creating Test Data for Debug configuration *****';
    :r .\TestData\TestData.sql
END
In solution explorer, right click on the top level solution and open Configuration Manager. You can specify which configuration is active for your build.
You can also specify the configuration on the MSBUILD.EXE command line.
There you go- now your developer builds have test data, but not your release build!
As Rob worked out, GO statements aren't allowed in the linked SQL scripts, as that would nest them within the BEGIN/END statements.
However, I have a different solution to his: if possible, remove any GO statements from the referenced scripts, and put a single one after the END statement:
IF '$(DeployTestData)' = 'True'
BEGIN
    :r .\TestData\Data.sql
END
GO -- moved from Data.sql
Note that I've also created a new variable in my sqlcmdvars file called $(DeployTestData) which allows me to turn on/off test script deployment.
I found a hack from an MSDN blog which worked fairly well. The trick is to write the commands to a temp script file and then execute that script instead. Basically the equivalent of dynamic SQL for SQLCMD.
-- Helper newline variable
:setvar CRLF "CHAR(13) + CHAR(10)"
GO
-- Redirect output to the TempScript.sql file
:OUT $(TEMP)\TempScript.sql
IF ('$(ConfigSetting)' = 'Configuration1')
BEGIN
    PRINT 'print ''inserting specific configuration'';' + $(CRLF)
    PRINT ':r .\Configuration1\Data.sql' + $(CRLF)
END
ELSE
BEGIN
    PRINT 'print ''inserting generic data'';' + $(CRLF)
    PRINT ':r .\GenericConfiguration\Data.sql' + $(CRLF)
END
GO
-- Change output to stdout
:OUT stdout
-- Now execute the generated script
:r $(TEMP)\TempScript.sql
GO
The TempScript.sql file will then contain either:
print 'inserting specific configuration';
:r .\Configuration1\Data.sql
or
print 'inserting generic data';
:r .\GenericConfiguration\Data.sql
depending on the value of $(ConfigSetting) and there will be no problems with GO statements etc. when it is executed.
I was inspired by Rob Bird's solution. However, I am simply using the Build Events to replace the post deployment scripts based on the selected build configuration.
I have one empty "dummy" post deployment script.
I set up a pre-build event to replace this "dummy" file based on the selected build configuration (see attached picture).
I set up a post-build event to place the "dummy" file back after the build has finished (see attached picture). The reason is that I do not want to generate changes in the change control after the build.
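A minimal sketch of what such build events might look like (the file names are illustrative, and the $(ConfigurationName) and $(ProjectDir) macros are assumed to be available in the database project):
Pre-build event:
copy /Y "$(ProjectDir)Scripts\Post-Deployment\$(ConfigurationName).Script.PostDeployment.sql" "$(ProjectDir)Scripts\Post-Deployment\Script.PostDeployment.sql"
Post-build event:
copy /Y "$(ProjectDir)Scripts\Post-Deployment\Dummy.Script.PostDeployment.sql" "$(ProjectDir)Scripts\Post-Deployment\Script.PostDeployment.sql"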
