Ant XSLT 2.0 with saxon9 BUILD FAILED - debugging

I have gotten this Ant script to work for XSLT 1.0 using the standard transform engine. However, I want to use XSLT 2.0 as well, and I am running into a brick wall.
I have included the Saxon jar and defined the factory class. When I run the script, I get a long pause (maybe 10 seconds, far too long for my XSLT, which is about 10 lines), and then I get BUILD FAILED: Fatal error during transformation.
Any help would be much appreciated:
<project name="TranformXml" default="TransformFile">
<target name="TransformFile">
<!-- Transform one file into an HTML file -->
<xslt in="input.xml"
out="student.html"
style="transform.xsl"
processor="trax" force="true" classpath="./lib/saxon/saxon9he.jar">
<factory name="net.sf.saxon.TransformerFactoryImpl"/>
</xslt>
</target>
</project>
I have tried a number of variations of this, including putting the <classpath> directly in the <xslt> element, plus toggling the processor and force options. Same problem every time.
(Note: I have tested my XSLT and XML in Oxygen, and they work well there.)
Thanks,
Casey

Sorry guys, I should have been more verbose. I did find out why it was happening, though: I was not using resolve-uri() correctly.
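For anyone who hits the same error: the fix was to resolve the relative path against an explicit base URI before handing it to document(), rather than assuming the processor's working directory. A minimal sketch of the pattern (the file name is illustrative, not my actual stylesheet):
<!-- Resolve 'lookup.xml' against the stylesheet's own base URI -->
<xsl:variable name="lookup"
              select="document(resolve-uri('lookup.xml', static-base-uri()))"/>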

Related

Debugging Acceleo M2T Transformation within MTL file

I'm currently setting up an Acceleo project to generate Java source code from an Eclipse UML2 model. I have experience with the Xtend code generator, but not with the Acceleo one; however, I wanted to try Acceleo as an alternative to Xtend.
Unfortunately, I do have problems with debugging the M2T transformation from within the MTL template file. All breakpoints I set in the MTL are just ignored and the debugger runs the transformation without stopping.
I've searched Stack Overflow and other sources for help, but none of the responses helped me get the debugger to work.
I'm running the transformation as an Acceleo Application and am using the Java Application runner since the Acceleo Plug-in Application is no longer supported with Eclipse Oxygen.
Did any of you have similar trouble and/or find a solution?
Thanks for any hints.
Best regards
Timo Rohrberg
I have the same requirement, but apparently there is no way to debug it at runtime, as far as I know.
https://www.eclipse.org/forums/index.php?t=msg&th=1080008&goto=1740153&#msg_1740153
Create a wrapper service for Java printing/logging.
This is not the best solution, but I was able to make do.
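A minimal sketch of what such a wrapper can look like (class and method names are made up; the class is exposed to the .mtl template through an Acceleo query that uses invoke()):
// Illustrative logging service callable from an Acceleo template
public class GeneratorLog {
    // Returns an empty string so the call does not affect the generated output
    public String log(String message) {
        System.out.println("[acceleo] " + message);
        return "";
    }
}
In the template, the matching query would look something like [query public log(msg : String) : String = invoke('some.pkg.GeneratorLog', 'log(java.lang.String)', Sequence{msg}) /].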
You could also use the Acceleo interpreter (via the Acceleo Eclipse perspective) to try to evaluate queries before changing your code.
https://wiki.eclipse.org/Acceleo/Interpreter
I use this with Papyrus and it works nicely.
I found a solution (or workaround) to debug .mtl files.
In the manifest editor window of your Eclipse plug-in:
- on the Runtime tab, add your bin folder (the folder containing the compiled .class and .emtl files) to the classpath.
After this, your breakpoints in the .mtl files will work, provided you run your Acceleo launcher in "Acceleo Plug-in Application" mode (see the Configuration section in the Acceleo launcher window).
(Tested with Eclipse 2019-03, Acceleo 3.7.)
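For reference, adding the bin folder on the Runtime tab amounts to an extra Bundle-ClassPath entry in the plug-in's MANIFEST.MF, roughly like this (your existing entries may differ):
Bundle-ClassPath: ., bin/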

NUnit Not Utilizing Parameters and Settings in .runsettings File

I'm trying to switch an existing Selenium solution over to NUnit due to its support for parallelization of tests within the same class. The problem I'm having is that it doesn't seem to be using the selected .runsettings file, despite the fact that it's supported by NUnit. By this, I mean that TestContext.Parameters.Count is always 0, and it doesn't store results in the path specified in the <ResultDirectory> node of my .runsettings file.
I've looked over the documentation and the reference to AdapterSettings (which has logic for parsing the .runsettings file), and I can't figure out why my TestContext.Parameters are never populated in my test when I use a .runsettings file.
I created a bare-bones test solution just as a sanity check and POC and it's still not populating.
I have:
- NUnit 3.7.1
- NUnit3TestAdapter 3.7.0
- Visual Studio Professional 2017
My .runsettings file is:
<?xml version="1.0" encoding="utf-8"?>
<RunSettings>
<!-- Parameters used by tests at runtime -->
<TestRunParameters>
<Parameter name ="testParameter1" value="value1" />
<Parameter name ="testParameter2" value="value2" />
</TestRunParameters>
</RunSettings>
All I'm trying to do in the test:
using System;
using NUnit.Framework;

namespace ClassLibrary1
{
    [TestFixture]
    public class UnitTestClass1
    {
        [Test]
        [Parallelizable(ParallelScope.All)]
        public void Class1TestMethod1()
        {
            // Below always returns 0
            Console.WriteLine(TestContext.Parameters.Count);
            // Below returns null
            Console.WriteLine(TestContext.Parameters["testParameter1"]);
        }
    }
}
I could really use some other suggestions on how to troubleshoot. We use .runsettings files extensively for our test suite with the built-in Visual Studio Unit Testing tools (and with vstest.console.exe), so I'm pretty confident I'm using it correctly.
Update:
I replaced all references to Visual Studio's test attributes in my actual test solution with NUnit's attributes. Now I'm getting the parameters! However, that leaves me still baffled as to why it doesn't work in my other bare-bones solution (with the files shown above).
I really only care about my real solution, obviously, but I still want to understand why it's not working in my dummy solution.
Also, I am still not sure where to find the TestResultDirectory path in NUnit.
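(For reference, the results directory isn't something NUnit itself configures; it is a test-platform setting that lives under RunConfiguration in the .runsettings file. A sketch, with an example path:)
<RunSettings>
  <RunConfiguration>
    <!-- Where the test platform writes result files such as .trx -->
    <ResultsDirectory>.\TestResults</ResultsDirectory>
  </RunConfiguration>
</RunSettings>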

Problems getting Maven docbkx plugin to work

I have been experimenting with generating DocBook output using Maven running within Eclipse. I found the Maven plugin called docbkx, and it almost gets me there.
In the DocBook world, the way that you change some of the default behavior of the XSL transformations is by defining your own XSL file as a "customization layer." This file imports the standard XSL file and then any templates, etc. that you want to change are put in this file. Thereby, they are parsed after the standard templates, and your altered version is what is used.
To make this happen using the docbkx plugin, you have to tell it that you are using a custom XSL file in the configuration, with lines like
<foCustomization>src/docbkx/xsl/custom_print.xsl</foCustomization>
Then, in the custom XSL file, instead of needing to specify the location of the standard XSL file in the import statement at the top, you put in a symbolic path that the plug-in resolves:
<xsl:import href="urn:docbkx:stylesheet"/>
This all works very well. But I have been banging my head against the wall trying to understand how the plugin can be told to look for other things you might want to import. Two examples:
- My customization layers import not just the regular stylesheet, but also a custom titlepages XSL file I generated by the usual process.
- I have a template to write a chunk of additional code into the HTML output's head element. Specifically, it's the code to hook up to Google Analytics. The code is in an external file.
In both cases the files being imported are sitting in the same directory as the customization layer, but as best I can tell the plugin can't find them. I don't know how to get the plugin to include these as it does its work.
Any ideas?
Thanks!
Alan
If you have something like:
src
|
+--docbkx
|
+--xsl
|
+--custom_print.xsl
+--custom_titlepages.xsl
and a pom like:
...
<plugin>
<groupId>com.agilejava.docbkx</groupId>
<artifactId>docbkx-maven-plugin</artifactId>
...
<foCustomization>src/docbkx/xsl/custom_print.xsl</foCustomization>
...
include the custom_titlepages.xsl in your custom_print.xsl:
<?xml version="1.0" encoding="UTF-8"?>
<xsl:stylesheet
xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
xmlns:fo="http://www.w3.org/1999/XSL/Format"
version="1.0">
<xsl:import href="urn:docbkx:stylesheet"/>
<xsl:include href="custom_titlepages.xsl"/>
</xsl:stylesheet>
You might find these documents enlightening:
Advanced Customization
Customizing both HTML and FO

Use attribute to omit code from coverage analysis in Visual Studio

I have some classes that, for one reason or another, cannot be or need not be unit tested. I'd like to exclude these classes from my coverage metrics so that I can get a better feel for the coverage on the classes I actually care about. Right now I have to exclude the results after the fact. What I would like to do is use an attribute to mark those classes as excluded so that they aren't included to begin with. Is there any way to decorate a class with an attribute that will automatically exclude it from coverage analysis? Either VS coverage analysis or nCover will work.
FWIW, these are classes that I can assure myself by inspection that the code is correct. Mostly they are wrapper classes around existing framework classes that I've introduced in order to be able to mock the framework class out. Since the wrappers get mocked out, they don't get tested. That's OK because all they do is wrap the framework class's methods that I care about.
Starting with VS2010 we have ExcludeFromCodeCoverageAttribute. Commenters have noted this to work in NCover as well as DotCover + NUnit. Example usage:
[ExcludeFromCodeCoverage]
public class myUntestableClass
{ }
Also see this link. They suggest using VSInstr as a command-line tool; it has an /EXCLUDE option (it's not as handy).
I've found some information on a couple of Diagnostics attributes, DebuggerNonUserCodeAttribute and DebuggerHiddenAttribute, which indicates that using these attributes will cause the coverage analyzer in VS to leave these classes out of its results. I've tried it with DebuggerNonUserCodeAttribute and it seems to work. I can probably live with this for most of the classes that I'm thinking of, though I don't like the side effect of not being able to step into these classes. That shouldn't be a problem for the wrapper classes, but it may end up being a problem with classes that are inherently hard to test and that I need debugger access to.
I'm still looking for alternatives, though.
With NCover you can create an attribute and then tell NCover to ignore that attribute.
In our projects, we have defined this attribute (no namespace, so it is easy to use):
public class CoverageExcludeAttribute : Attribute { }
We use NAnt, so we have a target that looks like this:
<target name="unittests" description="run nunit tests" >
<ncover
....
excludeAttributes="CoverageExcludeAttribute"
/>
</target>
Question 9 in the NCover FAQ describes this method. We based our solution on this.
Alternatively, you can use the exclude feature of the NCoverExplorer to exclude namespaces and assemblies from the final report. This merely removes the data from the report, but the end result is the same.
We use both techniques.
This works for me! 👍
Use the SonarQube keys in your .csproj:
<ItemGroup>
  <SonarQubeSetting Include="sonar.issue.ignore.allfile">
    <Value>ExcludeFromCodeCoverage</Value>
  </SonarQubeSetting>
</ItemGroup>
or
<ItemGroup>
  <SonarQubeSetting Include="sonar.coverage.exclusions">
    <Value>**/FileClassToExclude.cs</Value>
  </SonarQubeSetting>
</ItemGroup>
And then use Microsoft's ExcludeFromCodeCoverage attribute in your .cs class file:
[ExcludeFromCodeCoverage]
public class ExcludeMeFromSonarCoverage
{
    public ExcludeMeFromSonarCoverage()
    {
    }
}

Using rake with a non-ruby project

A workmate floated the idea of using Rake as a build system for a non-Ruby project. Is it possible to extend Rake to complement other languages where the autoconf toolset would usually be used?
There are examples of this, like Buildr, the drop-in replacement for Maven (for Java) that is built on top of Rake. There's also Raven for Java.
Tools like waf and SCons are Python-based build systems that are developed specifically for broad language support.
You can find out how to use Rake as an easy replacement for a Makefile in the manual...
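The basic idea is the same as make: file tasks with prerequisites and a shell command to rebuild the target. A tiny sketch for a C program (file names are made up):
# Rakefile - illustrative only
CC = "gcc"

# Rebuild the binary whenever the source is newer, like a make rule
file "hello" => ["hello.c"] do
  sh "#{CC} -o hello hello.c"
end

task :default => ["hello"]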
I use it almost exclusively for builds that I write myself... If you use Java, a better choice would be Ant or Maven, since they have a lot of code behind them... But, as for me, you have to be a little brainf*ed to program in XML, so I often use Rake for many tasks and invoke it from Ant/Maven, like this:
<target name="custom_task">
<exec executable="/usr/bin/env">
<arg value="rake"/>
<arg value="some-task"/>
<arg value="param" />
</exec>
</target>
It may not be super efficient, especially if you have to run anything on the JVM, since it can't reuse Ant's JVM, so it is not the best idea... I haven't tried JRuby; maybe it would be worth trying...
But for other tasks - file handling, doing something with text files, etc. - it works really nicely for me :-)
I use it to deploy (Capistrano) on several non-Rails projects. One Java (servlet) and several static HTML sites.
Very handy.
I use it to compile Flex applications. I've written wrappers around the Flex SDK command line tools -- it's easy to do for any tool chain that can be called from the command line.
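The wrappers are really just Rake tasks that shell out to the SDK binaries; a simplified sketch of the idea (paths and options here are made up, not my actual build):
# Rakefile - simplified sketch of wrapping the Flex compiler
MXMLC = "mxmlc"  # assumes the Flex SDK bin directory is on the PATH

file "bin/Main.swf" => FileList["src/**/*.mxml", "src/**/*.as"] do
  sh "#{MXMLC} -output bin/Main.swf src/Main.mxml"
end

task :default => "bin/Main.swf"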
