I'm using fragments like this:
@RequestMapping(value = "/fragment/nodeListWithStatus", method = RequestMethod.GET)
public String nodeListWithStatus(Model model) {
    // add the nodes' online status to the model
    model.addAttribute("nodeList", nodeService.getNodeListWithOnlineStatus());
    return "/fragments :: nodeList";
}
The templates are in /src/main/resources/templates. This works fine when starting the application from IntelliJ.
As soon as I create a .jar and start it, the code above no longer works. Error:
[2014-10-21 20:37:09.191] log4j - 7941 ERROR [http-nio-666-exec-2] --- TemplateEngine: [THYMELEAF][http-nio-666-exec-2] Exception processing template "/fragments": Error resolving template "/fragments", template might not exist or might not be accessible by any of the configured Template Resolvers
When I open the .jar with WinRAR, I see /templates/fragments.html, so the template seems to be there.
My pom.xml has this part for building the jar (mvn clean install):
<build>
    <plugins>
        <plugin>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-maven-plugin</artifactId>
            <configuration>
                <mainClass>de.filth.Application</mainClass>
                <layout>JAR</layout>
            </configuration>
            <executions>
                <execution>
                    <goals>
                        <goal>repackage</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
Can anyone tell me what I'm doing wrong here?
Thanks!
You don't need the leading / on the view name, i.e. you should return fragments :: nodeList rather than /fragments :: nodeList. Having made this change, Thymeleaf should be able to find the template both when run from your IDE and when run from a jar file.
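Applied to the controller method from the question, only the return value changes:

@RequestMapping(value = "/fragment/nodeListWithStatus", method = RequestMethod.GET)
public String nodeListWithStatus(Model model) {
    model.addAttribute("nodeList", nodeService.getNodeListWithOnlineStatus());
    // no leading slash: resolves to classpath:/templates/fragments.html
    return "fragments :: nodeList";
}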
If you're interested, here's what's happening under the hood:
The view name is used to search for a resource on the classpath. fragments :: nodeList means that the resource name is /templates/fragments.html and /fragments :: nodeList means that the resource name is /templates//fragments.html (note the double slash). When you're running in your IDE the resource is available straight off the filesystem and the double slash doesn't cause a problem. When you're running from a jar file the resource is nested within that jar and the double slash prevents it from being found. I don't fully understand why there's this difference in behaviour and it is rather unfortunate. I've opened an issue so that we (the Spring Boot team) can see if there's anything we can do to make the behaviour consistent.
It's an old topic, but I stumbled upon it while having a problem with similar symptoms and a different root cause. I wanted to share the solution that helped me, in case it helps somebody else...
Apparently the name of the messages.properties file is case-sensitive, but not everywhere. I had mine called "Messages.properties" (with a capital M) and it worked just fine from inside the IDE (IntelliJ), but once I tried to run the app from the jar, all messages were replaced with ??parameter.name??. Replacing the M with a lowercase m resolved the problem.
I have written a plugin and want to treat the goal vrs (check versions) differently, depending on whether it is invoked from an execution or from the command line.
In my code I added

@Parameter(name = "versionsWarnOnly", defaultValue = "true")
private boolean versionsWarnOnly;

public boolean getVersionsWarnOnly() {
    System.out.println("invoked get");
    return this.versionsWarnOnly;
}

public void setVersionsWarnOnly(boolean versionsWarnOnly) {
    System.out.println("invoked set");
    this.versionsWarnOnly = versionsWarnOnly;
}
I would expect that, without specifying versionsWarnOnly in the plugin configuration in the pom, I would just get the default value specified.
The problem is that this does not happen; it is always false in that case.
If I configure
<versionsWarnOnly>true</versionsWarnOnly>
in the plugin's configuration in the pom, then it is picked up (well, some success) when I build the phase mvn validate.
It is even true if I invoke the goal from the command line via mvn latex:vrs.
But if I specify it in an execution like this,
<execution>
    <id>validate_converters</id>
    <phase>validate</phase>
    <configuration>
        <versionsWarnOnly>true</versionsWarnOnly>
    </configuration>
    <goals>
        <goal>vrs</goal>
    </goals>
</execution>
again it has no effect.
I have no idea what I am doing wrong, or what kind of information you need to help me.
The problem is resolved and the question could be deleted altogether.
The problem was that I put almost all of the config into a class Settings, but I added the new parameter outside of it, and so it had no effect.
That's all. Now everything works fine.
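In code, the fix amounts to something like the following sketch (the Mojo class name and wiring are assumptions; the point is that a parameter carried by the Settings object must be declared there, not as a loose field on the Mojo):

import org.apache.maven.plugin.AbstractMojo;
import org.apache.maven.plugins.annotations.Mojo;
import org.apache.maven.plugins.annotations.Parameter;

@Mojo(name = "vrs")
public class VersionsMojo extends AbstractMojo {

    // all user configuration is funneled through this single object
    @Parameter
    private Settings settings = new Settings();

    @Override
    public void execute() {
        getLog().info("versionsWarnOnly=" + settings.isVersionsWarnOnly());
    }
}

// in its own file: a plain bean, populated by Maven from <settings> in the pom
public class Settings {
    // the field initializer provides the default when the pom sets no value
    private boolean versionsWarnOnly = true;

    public boolean isVersionsWarnOnly() { return versionsWarnOnly; }
    public void setVersionsWarnOnly(boolean versionsWarnOnly) {
        this.versionsWarnOnly = versionsWarnOnly;
    }
}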
My program in XQuery has a few variables that are based on the environment where the function is being run. For example, dev points to "devserver", test to "testserver", prod to "server", etc.
How do I set this up in my application.xml file and how do I reference these in my .xqy functions?
"workaround" solution 1
use "switch" to determine host:
switch (xdmp:host-name(xdmp:host()))
    case <dev environment as string> return "devserver"
    case <test environment as string> return "testserver"
    (: ... one case per environment ... :)
    default return fn:error(xs:QName("ERROR"), "Unknown host: " || xdmp:host-name(xdmp:host()))
"workaround" solution 2
create an xml file in your project for each host, update your application.xml to place the xml file in a directory depending on the environment name, then refer to the document now built on installation.
application.xml:
<db dbName="!mydata-database">
    <dir name="/config/" permissionsMode="set">
        <uriPrivilege roles="*_uberuser"/>
        <permissions roles="*_uberuser" access="riu"/>
        <load env="DEV" root="/config/DEV/" merge="true" include="*.xml"/>
        <load env="TEST" root="/config/TEST/" merge="true" include="*.xml"/>
    </dir>
</db>
Documents are located in the project directory /config/<ENV>/environment.xml:
<environment>
    <services>
        <damApi>https://stage.mydam.org</damApi>
        <dimeApi>https://stage.mydime.org/api/Services</dimeApi>
    </services>
</environment>
Usage when I need to get the value:
fn:doc("/config/environment.xml")/environment/services/damApi/fn:string()
Neither solution seems ideal to me.
If you use ml-gradle to deploy your project, it can do substitutions in your code. That means you can set up an XQuery library with code like this:
declare variable $ENV := "%%environmentName%%";
You can then import that library wherever you need.
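Importing and using it then looks something like this (the module namespace and path are assumptions):

import module namespace cfg = "http://example.com/config" at "/lib/config.xqy";

(: $cfg:ENV was fixed at deployment time by ml-gradle's %%...%% substitution :)
if ($cfg:ENV = "prod") then "server" else "devserver"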
Don't know if MarkLogic supports it, but XQuery 3.1 has functions available-environment-variables() and environment-variable().
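If your processor does support them, a lookup would look like this (the variable name APP_ENV is just an example):

(: returns the variable's value as xs:string, or the empty sequence if it is unset :)
fn:environment-variable("APP_ENV")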
You can consider using the tiny “properties” library at https://github.com/marklogic-community/commons/tree/master/properties
We wrote it long, long ago for MarkMail.org with the belief we didn’t want to put the configuration into a database document because configuration should be separate from data. Data get backed up, restored somewhere else, and the new location may not be the same environment as the old.
So instead we did a little hack and put config into the static namespace context (which each group and app server has). The configured prefix is the property name. The configured value is the property value (including type information). Here's a screen shot from a MarkMail deployment showing that it's a production server, that errors should be emailed, what static file version to serve, and what domain to output as its base.
This approach lets you configure properties administratively (via the Red GUI or REST) and they’re kept separate from the data. They’re statically available to the execution context without extra cost. You can configure them at the Group level or App Server level or both. The library is a convenience wrapper to pull the typed values.
Maybe by now there’s a better way like the XQuery 3.1 functions, but this one has been working well for 10+ years.
Since we're not yet using Gradle in our project, I worked out how to use Maven profiles to find/replace the values I needed based on the environment being deployed to. I just add to the proper profile the plugin, the files to update, and what to replace.
pom.xml:
<plugin>
    <groupId>com.google.code.maven-replacer-plugin</groupId>
    <artifactId>replacer</artifactId>
    <version>1.5.2</version>
    <executions>
        <execution>
            <phase>prepare-package</phase>
            <goals>
                <goal>replace</goal>
            </goals>
        </execution>
    </executions>
    <configuration>
        <includes>
            <include>**/modules/runTasks.xqy</include>
            <include>**/imports/resetKey.xqy</include>
        </includes>
        <replacements>
            <replacement>
                <token>https://stage.mydime.org/api/Services</token>
                <value>https://www.mydime.org/api/Services</value>
            </replacement>
        </replacements>
    </configuration>
</plugin>
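For completeness, the plugin sits inside the profile for the target environment, roughly like this (the profile id is just an example):

<profiles>
    <profile>
        <id>prod</id>
        <build>
            <plugins>
                <!-- the replacer plugin configuration from above goes here -->
            </plugins>
        </build>
    </profile>
</profiles>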
I am new to AsciiDoc and just wanted to know WHERE the following problem comes from.
Setup:
IntelliJ with the newest AsciiDoc plugin
newest asciidoctor-maven-plugin with preserveDirectories = true
I organized my asciidocs like this:
footer.adoc
header.adoc
index.adoc
subfolder/
    index.adoc
generated-docs looks like this:
footer.html
header.html
index.html
subfolder/
    index.html
Now, if I want subfolder/index.html to include the header & footer too, I thought I would need to write include::../header.adoc[] into the adoc file, which is no problem for the IntelliJ plugin. But in the generated html you will find the following error:
<p>Unresolved directive in index.adoc - include::../header.adoc[]</p>
When I instead write include::header.adoc[] into the adoc file, the generated html is happy, but the IntelliJ AsciiDoc plugin shows an error:
Unresolved directive in <stdin> - include::header.adoc[]
I am just wondering whether this is a bug for the IntelliJ plugin team or for the Maven plugin team. Or maybe someone has a workaround for this problem?
And a little bonus question: is it possible to configure the maven plugin not to generate header-/footer.html files, since they are already included in the actual htmls?
I have no experience with the maven plugin, but I do have lots of experience with AsciiDoc, the IntelliJ Plugin and the Gradle plugin.
The IntelliJ Plugin behaviour is correct. When you convert /subfolder/index.adoc, the includes are resolved relative to this file, so the include include::../header.adoc is correct.
You describe that you don't specify which file to render for the maven plugin (header.adoc is converted as well). This might be the problem with the maven plugin:
you just specify the source path, all documents are rendered relative to that source path, and hence /subfolder/index.adoc gets the wrong source path.
With the Gradle plugin, you can specify all documents to be converted, which would also avoid getting header.adoc converted. From the maven plugin docs I see that you can otherwise specify only a single file.
With this in mind, I would suggest changing your file structure so that all the files to be converted live in one folder. You can then specify this folder, and the other files will not be converted. This should also resolve your problem with the relative path names (see the plugin sketch after the tree below):
/src/docs/
|
+-common/
|  |
|  +-header.adoc
|  +-footer.adoc
|
+-chapters/
|
+-main/
   |
   +-index1.adoc
   +-index2.adoc
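The maven plugin is then pointed at the folder holding only the renderable documents; a minimal sketch (the directory name follows the layout above):

<configuration>
    <!-- only files below this directory are converted; common/ is left alone -->
    <sourceDirectory>src/docs/main</sourceDirectory>
</configuration>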
I know this is late; the answer is found in the REST Docs documentation from spring.io, in the "Using the Snippets" section of https://spring.io/guides/gs/testing-restdocs/. There is also some mention of this in the sample Gradle project.
The maven plugin configuration should be something like this:
<plugin>
    <groupId>org.asciidoctor</groupId>
    <artifactId>asciidoctor-maven-plugin</artifactId>
    <version>1.5.8</version>
    <executions>
        <execution>
            <id>output-html</id>
            <phase>prepare-package</phase>
            <goals>
                <goal>process-asciidoc</goal>
            </goals>
            <configuration>
                <sourceHighlighter>coderay</sourceHighlighter>
                <backend>html</backend>
                <attributes>
                    <toc/>
                    <linkcss>false</linkcss>
                    <snippets>${project.build.directory}/generated-snippets</snippets>
                </attributes>
            </configuration>
        </execution>
    </executions>
</plugin>
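With the snippets attribute set, the generated snippets can then be pulled into your .adoc files like this (the snippet directory name is just an example):

include::{snippets}/home/http-response.adoc[]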
I'm trying to generate documentation via Spring REST Docs using Asciidoctor. The user manual says that for highlighting source code in the documentation I should use the :source-highlighter: highlightjs attribute in the header of the .adoc file.
Here's an example of my index.adoc:
:source-highlighter: highlightjs
= Source code listing
Code listings look cool with Asciidoctor and highlight.js with {highlightjs-theme} theme.
[source,groovy]
----
// File: User.groovy
class User {
String username
}
----
[source,sql]
----
CREATE TABLE USER (
ID INT NOT NULL,
USERNAME VARCHAR(40) NOT NULL
);
----
After that I build and run the application, and here's the generated doc, without highlighting of the source code:
My maven plugin configuration:
<plugin>
    <groupId>org.asciidoctor</groupId>
    <artifactId>asciidoctor-maven-plugin</artifactId>
    <version>1.5.3</version>
    <executions>
        <execution>
            <id>generate-docs</id>
            <phase>prepare-package</phase>
            <goals>
                <goal>process-asciidoc</goal>
            </goals>
            <configuration>
                <backend>html</backend>
                <doctype>book</doctype>
            </configuration>
        </execution>
    </executions>
    <dependencies>
        <dependency>
            <groupId>org.springframework.restdocs</groupId>
            <artifactId>spring-restdocs-asciidoctor</artifactId>
            <version>2.0.2.RELEASE</version>
        </dependency>
    </dependencies>
</plugin>
What am I doing wrong?
P.S. I have also tried to install highlight.js locally, renaming highlight/highlight.pack.js to highlight/highlight.min.js and highlight/styles/github.css to highlight/styles/github.min.css and so on, as the user manual says, but that had no effect either.
Unfortunately, as you probably figured out, Groovy is not included in the standard highlight.js language pack. It only includes the ones in the "common" section. SQL would work though. As you can see in this picture, the SQL part works with my setup out of the box, but not Groovy.
To fix the Groovy code, you can either use Java as the language (fine for a lot of Groovy code examples) or download the custom HighlightJS pack with Groovy checked. I'm guessing that's where you got to.
If you are using the custom HighlightJS pack, I ran into a similar issue at first. When I went into developer tools in the browser, it showed that the highlight.js files were not found. Another hint that this was the problem was that all the Spring REST Docs examples lost their highlighting too. Although the Asciidoctor manual says to put the files into the same folder so that they are copied over automatically, with Gradle I still had to tell it to include the highlight files using the resources config option. I'm not a Maven user, but maybe the Maven plugin has a similar setup?
After I fixed the config, it worked for both Groovy and SQL, so I hope that will work for you too.
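Newer versions of the asciidoctor-maven-plugin appear to have a similar setup: a resources block in the plugin configuration controls which extra files are copied to the output. A hedged sketch, assuming the local highlight/ folder sits next to your .adoc sources:

<configuration>
    <sourceHighlighter>highlightjs</sourceHighlighter>
    <resources>
        <resource>
            <!-- copy the local highlight.js files next to the generated HTML -->
            <directory>src/docs/asciidoc</directory>
            <includes>
                <include>highlight/**</include>
            </includes>
        </resource>
    </resources>
</configuration>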
I'm trying to add custom values to the pom.properties file that Maven generates in the META-INF/maven/${groupId}/${artifactId} location:
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-jar-plugin</artifactId>
    <version>2.4</version>
    <configuration>
        <archive>
            <manifestEntries>
                <build>${BUILD_TAG}</build>
            </manifestEntries>
            <addMavenDescriptor>true</addMavenDescriptor>
            <pomPropertiesFile>${project.build.directory}/interface.properties</pomPropertiesFile>
        </archive>
    </configuration>
</plugin>
The content of the interface.properties file is:
# Build Properties
buildId=746
Using the documentation, I have pointed the pomPropertiesFile element at an external properties file, but the generated pom.properties file still has the default content after running mvn install.
What's the correct usage of the pomPropertiesFile element?
EDIT
I believe that the problem lies in org.apache.maven.archiver.PomPropertiesUtil. If you look at the method sameContents in the source, it returns true if the properties in the external file are the same as the defaults and false if they differ. If the result of sameContents is false, the contents of the external file are ignored.
Sure enough, this has already been logged as a bug.
I think you need to place a file under src/main/resources/META-INF/maven/${groupId}/${artifactId}/interface.properties and let Maven do the filtering job (configure the filtering). The file will automatically be copied to the target/META-INF/maven/${groupId}/${artifactId}/ location.
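A minimal sketch of the filtering setup (the placeholder follows the question's build property):

<build>
    <resources>
        <resource>
            <directory>src/main/resources</directory>
            <!-- substitute ${...} placeholders while copying resources -->
            <filtering>true</filtering>
        </resource>
    </resources>
</build>

with the properties file (using the literal group and artifact ids in its path) containing, for example, buildId=${BUILD_TAG}.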
See https://issues.apache.org/jira/browse/MNG-4998
Maven 3 resolves property placeholders in pom.xml eagerly when reading the file, for all property values that are available at that time. Modifying these properties later will not affect the values that were already resolved in pom.xml.
However, if a property value is not available (and there is no default), the placeholder is not replaced and can still be processed later as a placeholder, for example if a plugin generates the property during the build, or if the placeholder is read and processed by a plugin during some build step.
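A small illustration (both property names here are hypothetical):

<properties>
    <!-- java.version is known when the pom is read, so this resolves eagerly -->
    <jdk.used>${java.version}</jdk.used>
    <!-- build.id has no value yet; the literal placeholder survives and can
         be filled in later by a plugin during the build -->
    <build.stamp>${build.id}</build.stamp>
</properties>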