I have a basic sbt project. I want to package two jars from the same source files, but compiled with different options.
So: one project, two compilations with different options (scalacOptions), and two jars as output. I don't want to run sbt twice, changing the options in between.
Does anybody have an idea?
With something like this in build.sbt, you can run sbt compile2:package and produce a jar from both the compile config and the compile2 config:
val Compile2 = config("compile2") extend Compile

inConfig(Compile2)(Defaults.compileSettings ++ Seq(
  // these options can be set as "in Compile2" outside of inConfig as well
  scalacOptions := SECOND-OPTIONS-LIST,
  // otherwise it will be "src/compile2", you want it to be "src/main"
  sourceDirectory <<= sourceDirectory in Compile,
  sbt.Keys.`package` <<= sbt.Keys.`package` dependsOn (sbt.Keys.`package` in Compile)
))

scalacOptions in Compile := BASIC-OPTIONS-LIST
I guess this is relatively simple in terms of lines of code, but not quite so straightforward if one isn't intimately familiar with sbt.
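Note that the <<= operator used above is sbt 0.13 syntax and was removed in sbt 1.x. For newer sbt versions, a roughly equivalent, untested sketch (keeping the same option-list placeholders, and registering the custom config on the project via .configs) might look like this:

lazy val Compile2 = config("compile2").extend(Compile)

lazy val root = (project in file("."))
  .configs(Compile2)
  .settings(
    inConfig(Compile2)(Defaults.compileSettings ++ Seq(
      scalacOptions := SECOND-OPTIONS-LIST,            // placeholder, as above
      // reuse src/main instead of src/compile2
      sourceDirectory := (Compile / sourceDirectory).value,
      // make compile2's package also build the regular jar
      sbt.Keys.`package` := sbt.Keys.`package`.dependsOn(Compile / sbt.Keys.`package`).value
    )),
    Compile / scalacOptions := BASIC-OPTIONS-LIST      // placeholder, as above
  )

You would then run sbt compile2:package (or sbt Compile2/package with the slash syntax) to get both jars, exactly as with the original snippet.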
Does anyone know if it is possible to compile a Scala.js project from Gradle, without using sbt? I have run into the following potential solutions:
There is a shell script that can act as 'sbt' and can be invoked from Gradle using an 'exec' task, but it is limited to Linux. I would ideally like an OS-independent solution.
There is a Gradle plugin for Scala.js, but it is relatively old (and seemingly no longer maintained), supporting only up to Scala.js 0.6, whereas Scala.js is already at version 1.3+.
Scala.js has a 'scalajs-compiler' jar, and I am wondering whether it can be used to compile a Scala.js project rather than relying on sbt. If there is any documentation covering this, a reference would be greatly appreciated. Thank you all for your help.
Scala.js CLI
The Scala.js CLI (GitHub / Download) should work on *NIX systems and Windows. However, there is another problem: it doesn't automatically use new versions of Scala.js, so currently it will only give you Scala.js 1.0.0 functionality. We have not yet figured out how to solve this problem.
Compiling Scala.js yourself
The Scala.js compiler is simply a Scala compiler plugin, so you only need to invoke the Scala compiler with additional arguments:
scalac \
-classpath $CLASSPATH:scalajs-library_2.13-1.4.0.jar \
-Xplugin:scalajs-compiler_2.13.4-1.4.0.jar \
$FILES
This will produce .class and .sjsir files for the provided .scala files.
The versions of scalajs-library / scalajs-compiler must match the version of Scala you are compiling with. Note that the compiler version must match the full Scala version exactly, while the library only needs to match the Scala minor (binary) version.
scalajs-library on maven
scalajs-compiler on maven
example in the Scala.js CLI
Linking Scala.js yourself
Unlike Scala for the JVM, Scala.js requires a linking step. The Scala.js linker comes as a standalone library. (artifacts on maven, Interface API).
Both the Scala.js CLI and the sbt plugin use this library to link Scala.js code. However, the CLI is outdated and the sbt plugin is complicated, so instead of linking to an example, I'll just provide one here:
import java.nio.file.Path

import scala.concurrent.Await
import scala.concurrent.duration.Duration
import scala.concurrent.ExecutionContext.Implicits.global

import org.scalajs.logging._
import org.scalajs.linker.{PathIRContainer, PathOutputDirectory, StandardImpl}
import org.scalajs.linker.interface._

def link(classpath: Seq[Path], outputDir: Path): Unit = {
  val logger = new ScalaConsoleLogger(Level.Warn)
  val linkerConfig = StandardConfig() // look at the API of this, lots of options.
  val linker = StandardImpl.linker(linkerConfig)

  // Same as scalaJSModuleInitializers in sbt, add if needed.
  val moduleInitializers = Seq()

  val cache = StandardImpl.irFileCache().newCache

  val result = PathIRContainer
    .fromClasspath(classpath)
    .map(_._1)
    .flatMap(cache.cached _)
    .flatMap(linker.link(_, moduleInitializers, PathOutputDirectory(outputDir), logger))

  Await.result(result, Duration.Inf)
}
This will link all the Scala.js code on the classpath and put the resulting file(s) into outputDir.
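To make this concrete, here is a minimal, hypothetical driver for the link function above; the classpath entries and output directory are placeholders you would have to adapt (the compiled .sjsir output plus the same scalajs-library jar used during compilation):

import java.nio.file.{Files, Paths}

// Placeholder paths: adapt to where your compiled output and the
// scalajs-library jar actually live.
val classpath = Seq(
  Paths.get("target/classes"),                    // .sjsir/.class files from the scalac step above
  Paths.get("scalajs-library_2.13-1.4.0.jar")     // same library jar as on the compile classpath
)

val outputDir = Paths.get("js-output")
Files.createDirectories(outputDir)                // PathOutputDirectory expects an existing directory

link(classpath, outputDir)                        // with the default config this emits main.js (plus a source map) into js-output/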
How would I add a single source set to multiple subprojects?
First of all ... yes I know how ridiculous this is. This is just something I have to do.
The setup
The project uses the Groovy DSL.
There are 3 subprojects (A, B, C), each with their own unique main source set.
There are 5 additional source sets (1, 2, 3, 4, 5) external to these projects.
None of the external source sets can be compiled alone.
All of the source sets depend on an interface that is defined 3 different times, once in each subproject.
The subprojects' main sources cannot depend on any of the external sources.
1 and 2 need to be compiled with A, B, and C.
3 needs to be compiled with A and also B.
4 needs to be compiled with B and also C.
5 needs to be compiled with C only.
4 and 5 need to depend on a class defined in 2.
5 must be a standalone sourceset so that it can be included as a sourceset inside of any future subprojects that might be added.
None of the external sources are allowed to include sources from any other source set.
None of the external sources are allowed to be compiled alone.
None of the external sources are allowed to be included as a jar or project dependency; they MUST be included as a source set and they MUST be compiled separately for each subproject that includes them.
A
sourceSets {
    main {
        java {
            srcDirs = ["src",
                       "$rootDir/source_sets/1/src",
                       "$rootDir/source_sets/2/src",
                       "$rootDir/source_sets/3/src"
            ]
        }
    }
}
B
sourceSets {
    main {
        java {
            srcDirs = ["src",
                       "$rootDir/source_sets/1/src",
                       "$rootDir/source_sets/2/src",
                       "$rootDir/source_sets/3/src",
                       "$rootDir/source_sets/4/src"
            ]
        }
    }
}
C
sourceSets {
    main {
        java {
            srcDirs = ["src",
                       "$rootDir/interfaces/source_sets/1/src",
                       "$rootDir/interfaces/source_sets/2/src",
                       "$rootDir/interfaces/source_sets/4/src",
                       "$rootDir/interfaces/source_sets/5/src"
            ]
        }
    }
}
settings.gradle
include(":interfaces/A")
project(":interfaces/A").name = "A"
include(":interfaces/A")
project(":interfaces/A").name = "A"
include(":interfaces/A")
project(":interfaces/A").name = "A"
The problem is that 4 and 5 are not able to find the class in 2, and my IDE (IntelliJ) cannot resolve the correct classpath.
Really, what I need is for the external source sets to act as if there were 3 separate copies of them, without there actually being 3 separate copies, and I need to do it without the use of symbolic/soft links.
The solution needs to use only Gradle, though it can use JetBrains' "idea" plugin for Gradle as long as it doesn't involve committing any files under the ".idea" folder; it can include inline XML or files in a resource folder outside of the .idea folder.
So yeah ... this is overly complicated and just .. ugh! But that's just how it is.
Ugh indeed.
I don't have an answer, but this is too long to put into a comment. So here goes.
I assume this is a problem with IntelliJ only, and not when compiling with Gradle, right? If that is the case, you should try to model your project in IntelliJ as you want it, and once you have found a way to do it, then figure out how to customize the idea plugin to do it for you.
However, I am pretty sure you can't have multiple modules in IntelliJ share the same "content root". So I only see the options left that you don't want: either copy (synchronize) the sources into a new folder used only for IntelliJ (which won't allow for modifications), create symlinks (which aren't always portable), or restructure your external sources so they can be compiled independently (which may not be easily possible).
:-(
I have been given a project A that needs access to class files from another project B. More precisely, A only needs classes compiled from the B/ejb/C/src portion of the B/ tree:
B/ejb/C/src/com/company/admin/Foo.java
B/ejb/C/src/com/company/admin/FooHome.java
B/ejb/C/src/com/company/admin/FooBean.java
B/ejb/NOTNEEDED/src/com/company/data/...
The person who had this A project before used JBuilder and, in the source definition, included pointers to the parallel project's B/ejb/C/src. The A project builds a jar which includes classes compiled from that other tree. I'm trying to figure out how to do this using Gradle. I want to make a B/build.gradle in the B project that will create a B-C-version.jar of .class files compiled from these sources:
B/ejb/C/src/com/company/admin/Foo.java
B/ejb/C/src/com/company/admin/FooHome.java
B/ejb/C/src/com/company/admin/FooBean.java
that I would then publish to Maven and access from the A project.
i.e., the B-C-version.jar would ideally only have these classes:
com/company/admin/Foo.class
com/company/admin/FooHome.class
but if B-C-version.jar had these classes:
com/company/admin/*.class
that would also be OK. How can I make such a thing using a build.gradle in the B project?
You can simply declare a custom Jar task like
task cJar(type: Jar) {
    baseName = project.name + '-C'
    from sourceSets.main.output
    include 'com/company/admin/Foo.class', 'com/company/admin/FooHome.class'
}
or you can make a dedicated source set for your API that you then use from your other B code and from your A code. Then you don't need to work with includes and update the include list whenever you add files; you just place them in the source folder of the source set and you are done. Something like
sourceSets { c }
task cJar(type: Jar) {
    baseName = project.name + '-C'
    from sourceSets.c.output
}
Then you could also declare dependencies separately and get the correct ones drawn in transitively and so on. But it might be overkill in your situation.
I want to implement a Gradle build script which compiles some Java classes and copies them to a Tomcat directory. I don't want to use the Gradle Java plugin, since it does many things that are not relevant here. I want to define my own compile and deploy tasks to do this. I have implemented it as below:
task compile (type: JavaCompile) {
    source = fileTree('$srcdir')
    destinationDir = file('$builddir')
    classpath = files('lib')
    sourceCompatibility = '1.8'
}

task deploy (type: Copy) {
    dependsOn compile
    from fileTree('build') {
        include fileTree('classes')
    }
    from fileTree('lib') {
        include '*'
    }
    into '${tomcathome}//${projectname}'
}
I have not touched the deploy task yet. When I run the compile task it completes successfully but does not generate any class files. I am expecting them to be generated under the /build directory.
Please suggest.
Thanks
To summarise the comments in the answer: you need to use a GString (double quotes), as #lu.koerfer stated. With single quotes, '$srcdir' will always be interpreted as the literal location (a subfolder called $srcdir in this case), because single-quoted strings in Groovy are not interpolated.
Interpolation is only needed when using a variable inside a string; if you don't need a string at all, pass the variable directly (then you don't need a dollar sign either).
Not sure how your variables are defined, but for the build and source directories you should ideally use the Gradle-provided properties:
buildDir to point to the build directory
sourceSets.main.java.srcDirs (or sourceSets.main.java.getSrcDirs()) to get the source directories; note this returns the collection of your source directories, depending on how you specified your sourceSets, or, if you haven't specified any at all, the default Maven-convention structure src/main/java
For global variables, please read about extra ("ext") properties.
Summary
I am having difficulty running and debugging a CLI application built with sbt-native-packager. Browsing similar questions yielded little insight, as they either refer directly to the JDK packager (via JavaFX) or are simply incomplete/unanswered. Running the application throws an error message whose root cause is hard to reason about and track down (there are no logs).
Stack
JDK 8u60
SBT 0.13.9
sbt-native-packager 1.0.6
Inno Setup 5.5.8
The intention is to build a Windows installation package, which seems to be built correctly (I am able to install it on another machine).
Error details
After installing the packaged app, one of two happens:
running the app without any command parameters does nothing (no errors printed to console, dead silent)
running the app with the --help parameter (which hits a code path in the main method that prints out the help file) yields an "Error invoking method" error followed by "Failed to launch JVM". I could not find any error logs or further precise hints. The application's bootstrap is practically a single class (containing main()) contained in its own SBT module, which in turn produces its own JAR.
Investigation
In the absence of any idea where to start, I investigated with the following results:
there are two bootstrap JARs produced, let's call them bootstrap.jar and bootstrap-launcher.jar
both contain MANIFEST.MF with the Main-Class element correctly populated
the former doesn't contain a Class-Path element but contains elements like Implementation-Title and so on. It contains the compiled bootstrap class in the expected package, and the Main-Class in its manifest file points to that package.
the latter is the direct opposite of the former: its manifest file contains the Class-Path and Main-Class elements and nothing else. Additionally, it doesn't contain any compiled code whatsoever.
Experiments
I performed the following experiments with bootstrap JARs:
deleting bootstrap.jar or bootstrap-launcher.jar
renaming and overwriting bootstrap-launcher.jar -> bootstrap.jar
renaming and overwriting bootstrap.jar -> bootstrap-launcher.jar
manually adding Class-Path entries from bootstrap.jar manifest file to bootstrap-launcher.jar's manifest file (a.k.a. desperation mode)
The end result of these experiments is always the same: a 'Class com...Bootstrap not found' exception shown in a GUI window, with no further explanation or stacktrace.
At this point in time I have no other ideas so I would appreciate any insight.
Additionally, I see the library dependencies correctly materialized in the lib directory, so at least that part seems to be working correctly.
build.sbt
Lastly, for completeness, I am attaching my current build.sbt file. The reason it's structured as it is, is that I used to use sbt-assembly to produce a fat JAR; it worked quite nicely, but new packaging was requested. While tracking down the issue, I removed all traces of sbt-assembly from the build, except from plugins.sbt (simply as a convenience for an easier fallback, should the need arise).
lazy val root: Project = (project in file("."))
  .aggregate(common, commonTest, core, bootstrapCli)

lazy val common: Project = (project in file("common"))
  .settings(
    libraryDependencies := ...
  )

lazy val commonTest: Project = (project in file("commonTest"))
  .dependsOn(common % "compile -> test")
  .settings(
    libraryDependencies := ...
  )

lazy val core: Project = (project in file("core"))
  .dependsOn(common, commonTest % "test -> test")
  .settings(
    libraryDependencies := ...,
    javacOptions in (Compile, compile) ++= Seq("-parameters"),
    javacOptions in doc ++= Seq.empty
  )

lazy val bootstrapCli: Project = (project in file("bootstrapCli"))
  .dependsOn(core % "compile -> compile;test -> test")
  .enablePlugins(JDKPackagerPlugin)
  .settings(
    jdkPackagerType := "exe",
    mainClass in Compile := Some("com._3esi.load.bootstrap.cli.Bootstrap")
  )
Again, I'd greatly appreciate any insight.