gsoap: force WS-Discovery 1.0 instead of 1.1

I am using gsoap and its plugin wsddapi to implement WS-Discovery.
I need to implement WS-Discovery v1.0, but the plugin outputs only v1.1 messages. The plugin's source code says it is valid for both v1.1 and v1.0, but I cannot work out how to force gsoap to use v1.0 messages.
Do you have any hints?

To generate the soapClient.cpp code referenced by the wsddapi plugin for WS-Discovery 1.0, you can use a command like:
soapcpp2 -xa /usr/share/gsoap/WS/wsdd10.h -I /usr/share/gsoap/import
On the other hand, code for WS-Discovery 1.1 can be generated with:
soapcpp2 -xa /usr/share/gsoap/WS/wsdd.h -I /usr/share/gsoap/import
Your paths may differ depending on packaging, operating system, and so on.
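As a minimal sketch of the rest of the build, assuming a C++ project, the default generated file names, and the usual install locations for the gsoap runtime and the wsddapi plugin sources (all paths here are assumptions to adjust to your system):
# generate WS-Discovery 1.0 stubs, then compile them together with the
# wsddapi plugin and the gsoap runtime; paths are assumptions
soapcpp2 -xa /usr/share/gsoap/WS/wsdd10.h -I /usr/share/gsoap/import
g++ -o wsdd_client soapClient.cpp soapC.cpp \
    /usr/share/gsoap/plugin/wsddapi.c /usr/share/gsoap/stdsoap2.cpp \
    -I. -I/usr/share/gsoap -I/usr/share/gsoap/plugin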

Does go modules support upgrading transitive dependencies?

Say I have a Go module whose dependencies point to different versions of the same transitive dependency.
example.com/foo v1.1 --> example3.com/baz v0.1
example2.com/bar v2.1 --> example3.com/baz v0.2
Say I find a bug in v0.2, which is the version resolved by Go modules' minimal version selection algorithm, and I would like to upgrade all dependencies that point at this transitive dependency. Let's say the fix lands in example3.com/baz v0.4.
Is there a command that I can run that upgrades example.com/foo and example2.com/bar so that their transitive dependencies are satisfied by example3.com/baz v0.4, if such versions exist?
Ideally I would call go get <some flag> example3.com/baz v0.4 and the result would look something like:
example.com/foo v1.x --> example3.com/baz v0.4
example2.com/bar v2.x --> example3.com/baz v0.4
Go dependencies only specify minimum versions: they don't pin exact or maximum versions, and they assume that dependencies generally remain compatible as they evolve. So go get example3.com/baz@v0.4 will upgrade example3.com/baz to v0.4, and will downgrade anything that requires a version higher than v0.4, but it will assume that anything written against v0.1 or v0.2 is more or less compatible. You could perhaps verify that using go test all.
So there isn't a built-in command that directly does what you're trying to do.
That said, you could use go mod graph and grep to figure out which external modules depend on any version of example3.com/baz. Then you could use sed to chop those lines down to just the module path, and upgrade those modules using go get:
MODS=$(go mod graph | grep '@.* example3.com/baz@.*' | sed 's/@.*//')
go get -d $MODS
You could do that even more precisely using go list -json all, which would give you structured information about the packages imported by the main module. The Deps, ImportPath, and Module fields are probably sufficient to identify which packages need to be updated. (There may be an elegant way to filter and transform that using jq, but I don't have the bandwidth to figure that out today.)
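For what it's worth, a rough sketch of that jq idea might look like the following. Treat it as untested: the field names come from go list -json, and example3.com/baz is the module from the question.
# print the distinct modules whose packages transitively depend on
# example3.com/baz (untested sketch; requires jq)
go list -json all \
  | jq -r 'select(.Module != null and .Deps != null)
           | select(any(.Deps[]; startswith("example3.com/baz")))
           | .Module.Path' \
  | sort -u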

Build / Compile latest ScalaJS (1.3+) using Gradle on a Windows machine?

Does anyone know if this is possible? I have run into the following solutions:
There is a shell script that can act as 'sbt' and can be invoked in Gradle using an 'exec' task, but it is limited to Linux. I would ideally like an OS-independent solution.
There is a Gradle plugin for Scala.js, but it is relatively old (and seems no longer maintained), supporting up to version 0.6, whereas Scala.js is already on version 1.3+.
Scala.js has a 'scalajs-compiler' jar, and I am wondering if this can be used to compile a Scala.js project rather than relying on sbt. If there is any documentation covering this, a reference would be greatly appreciated. Thank you all for your help.
Scala.js CLI
The Scala.js CLI (GitHub / Download) should work on *NIX systems and Windows. However, there is another problem: it doesn't automatically use new versions of Scala.js, so currently it will only give you Scala.js 1.0.0 functionality. We have not yet figured out how to solve this problem.
Compiling Scala.js yourself
The Scala.js compiler is simply a Scala compiler plugin. You need to invoke the Scala compiler with additional arguments:
scalac \
-classpath $CLASSPATH:scalajs-library_2.13-1.4.0.jar \
-Xplugin:scalajs-compiler_2.13.4-1.4.0.jar \
$FILES
This will produce .class and .sjsir files for the provided .scala files.
The versions of scalajs-library / scalajs-compiler must match the version of Scala you are compiling with. Note that the compiler plugin version must match the full Scala version exactly, while the library only needs to match the Scala binary (minor) version.
scalajs-library on maven
scalajs-compiler on maven
example in the Scala.js CLI
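For example, one way to fetch a matching pair of artifacts is coursier's cs launcher (using coursier here is my assumption; the coordinates mirror the scalac example above):
# the library is suffixed with the Scala binary version, the compiler
# plugin with the full Scala version (coursier's cs launcher assumed)
cs fetch org.scala-js:scalajs-library_2.13:1.4.0
cs fetch org.scala-js:scalajs-compiler_2.13.4:1.4.0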
Linking Scala.js yourself
Unlike Scala for the JVM, Scala.js requires a linking step. The Scala.js linker comes as a standalone library. (artifacts on maven, Interface API).
Both the Scala.js CLI and the sbt plugin use this library to link Scala.js code. However, the CLI is outdated and the sbt plugin is complicated. So instead of linking to an example, I'll just provide one here:
import java.nio.file.Path
import org.scalajs.logging._
import org.scalajs.linker.{PathIRContainer, PathOutputDirectory, StandardImpl}
import org.scalajs.linker.interface._
import scala.concurrent.Await
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration.Duration
def link(classpath: Seq[Path], outputDir: Path): Unit = {
val logger = new ScalaConsoleLogger(Level.Warn)
val linkerConfig = StandardConfig() // look at the API of this, lots of options.
val linker = StandardImpl.linker(linkerConfig)
// Same as scalaJSModuleInitializers in sbt, add if needed.
val moduleInitializers = Seq()
val cache = StandardImpl.irFileCache().newCache
val result = PathIRContainer
.fromClasspath(classpath)
.map(_._1)
.flatMap(cache.cached _)
.flatMap(linker.link(_, moduleInitializers, PathOutputDirectory(outputDir), logger))
Await.result(result, Duration.Inf)
}
This will link all the Scala.js code on the classpath and put the resulting file(s) into outputDir.
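To actually run a snippet like this outside sbt, the linker library has to be on the classpath. One possibility, as an untested sketch (assuming you wrap the function in a small main, save it as link.scala, and use scala-cli, which is my assumption; the coordinate is the standalone linker library mentioned above):
# run the linker snippet with scala-cli (assumed tooling); the dependency
# is the standalone Scala.js linker for Scala 2.13
scala-cli run link.scala --scala 2.13 \
  --dep org.scala-js:scalajs-linker_2.13:1.4.0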

Use Groovy app and test code in combination with jlink solution for bundling JavaFX

This follows on from this excellent solution to the question of how to get Gradle to bundle up JavaFX with your distributions.
NB specs: Linux Mint 18.3, Java 11, JavaFX 13.
That stuff, involving jlink and a module-info.java, is beyond my pay grade (although I'm trying to read up on these things).
I want to move to using Groovy in my app and test code (i.e. Spock) rather than Java. The trouble is, the minute I include the "normal" dependency in my build.gradle, i.e.
implementation 'org.codehaus.groovy:groovy-all:2.5.9'
and try to build, I get multiple errors:
mike#M17A ~/IdeaProjects/TestProj $ ./gradlew build
> Configure project :
Found module name 'javafx.jlink.example.main'
> Task :compileTestJava FAILED
error: the unnamed module reads package org.codehaus.groovy.tools.shell.util from both org.codehaus.groovy.groovysh and org.codehaus.groovy
[...]
error: the unnamed module reads package groovy.xml from both org.codehaus.groovy and org.codehaus.groovy.xml
[...]
error: module org.codehaus.groovy.ant reads package groovy.lang from both org.codehaus.groovy and org.codehaus.groovy.test
error: module org.codehaus.groovy.ant reads package groovy.util from both org.codehaus.groovy.xml and org.codehaus.groovy.ant
100 errors
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':compileTestJava'.
Yes, 100 errors... probably more! By commenting out various things I think I've come to the conclusion that some Groovy dependency is being injected by the jlink stuff. Fine, I can live with that (although it'd be nice to know what version of Groovy it is).
The trouble is, even if I omit the Groovy dependency line, the same errors occur when I try to introduce the Spock dependency:
testImplementation 'org.spockframework:spock-core:1.2-groovy-2.5'
Has anyone got any idea what's going on here and what to do about it?
I searched for an answer and didn't find a good solution.
According to this, it seems that Groovy is currently not really compatible with Java modules. This is because some packages are contained in multiple jars of the library (which is not compatible with modules). You will have to wait for Groovy 4 for a compatible version.
I discovered that the JavaFX plugin uses this plugin internally. That plugin seems to treat all dependencies as modules (which is not the default Gradle behaviour).
To make your application work, it seems that you have to:
force Gradle to put Groovy on the classpath instead of the module path (so it will not be considered as a module; this seems impossible if you use the javafx plugin)
use the "patch-module" system: it allows Gradle to merge the library's jars into a single module, avoiding the problem of packages that are split across different jars
I searched for the Groovy jars with IDEA (Project Structure/Libraries), and I tried the syntax offered by the plugin for "patch-module":
patchModules.config = [
"org.codehaus.groovy=groovy-ant-3.0.1.jar",
"org.codehaus.groovy=groovy-cli-picocli-3.0.1.jar",
"org.codehaus.groovy=groovy-console-3.0.1.jar",
"org.codehaus.groovy=groovy-datetime-3.0.1.jar",
"org.codehaus.groovy=groovy-docgenerator-3.0.1.jar",
"org.codehaus.groovy=groovy-groovydoc-3.0.1.jar",
"org.codehaus.groovy=groovy-groovysh-3.0.1.jar",
"org.codehaus.groovy=groovy-jmx-3.0.1.jar",
"org.codehaus.groovy=groovy-json-3.0.1.jar",
"org.codehaus.groovy=groovy-jsr-3.0.1.jar",
"org.codehaus.groovy=groovy-macro-3.0.1.jar",
"org.codehaus.groovy=groovy-nio-3.0.1.jar",
"org.codehaus.groovy=groovy-servlet-3.0.1.jar",
"org.codehaus.groovy=groovy-sql-3.0.1.jar",
"org.codehaus.groovy=groovy-swing-3.0.1.jar",
"org.codehaus.groovy=groovy-templates-3.0.1.jar",
"org.codehaus.groovy=groovy-test-junit-3.0.1.jar",
"org.codehaus.groovy=groovy-test-3.0.1.jar",
"org.codehaus.groovy=groovy-testng-3.0.1.jar",
"org.codehaus.groovy=groovy-xml-3.0.1.jar"
]
It only works with a single "org.codehaus.groovy=X.jar" line; a bug prevents it from working with all of the library's jars (look at this issue on GitHub).
So you have multiple choices:
Use Java instead of Groovy
Wait for a new Groovy release, or for new releases of the plugins (modules-plugin, and a version of javafx-plugin that uses it internally)
Use the old javafx configuration: dependencies are not modules by default, and you have to specify manually in build.gradle that the JavaFX dependencies should be treated as modules (check my "obsolete" answer to this question)

How to get rid of "V677: Custom declaration of a standard type" warning

We are using PVS-Studio (the free variant for open-source projects) in conjunction with Travis CI, and for some reason it emits warnings for files located in /usr/local/clang-3.5.0/lib/clang/3.5.0/include:
/usr/local/clang-3.5.0/lib/clang/3.5.0/include/stddef.h:58:1: warning: V677 Custom declaration of a standard 'size_t' type. The declaration from system header files should be used instead.
/usr/local/clang-3.5.0/lib/clang/3.5.0/include/stddef.h:86:1: warning: V677 Custom declaration of a standard 'wchar_t' type. The declaration from system header files should be used instead.
/usr/local/clang-3.5.0/lib/clang/3.5.0/include/stdarg.h:30:1: warning: V677 Custom declaration of a standard 'va_list' type. The declaration from system header files should be used instead.
/usr/local/clang-3.5.0/lib/clang/3.5.0/include/stddef.h:47:1: warning: V677 Custom declaration of a standard 'ptrdiff_t' type. The declaration from system header files should be used instead.
This location looks like the "system" headers of a non-standard compiler and is far away from the project root (which AFAIR is somewhere in /home, the standard Travis location). The script uses the latest version from https://www.viva64.com/en/pvs-studio-download-linux/; the latest run was at "Mon Jul 3 20:13:42 UTC 2017" (unfortunately, the version used was not saved).
If the compiler is located in some unusual place, it is recommended to add that path to the analyzer's exceptions, so that the report shows only warnings for your own project's code.
pvs-studio-analyzer analyze ... -e /path/to/exclude-path ...
or
pvs-studio ... --exclude-path /path/to/exclude-path ...
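For the path from the warnings above, that would presumably be:
# exclude the compiler's bundled headers from analysis; the path is taken
# from the warnings quoted in the question
pvs-studio-analyzer analyze ... -e /usr/local/clang-3.5.0 ...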
It appears that PVS-Studio no longer reports these warnings once clang is updated to clang-4.0.

JNI call involving OpenCOBOL DLLs

I am trying to invoke an existing COBOL application using JNI. The COBOL application's structure is as follows:
c-wrapper(main) --> COBOLProgram --> several dynamic (.so) and statically called modules
The existing application involves several statically called COBOL subprograms and many dynamically called ones.
JNI invocation of the application works, but it cannot locate and invoke the dynamic COBOL submodules.
The modified application structure (for JNI) is as follows:
java class --> libjni.so --> appl.so
I verified the COB_LIBRARY_PATH and LD_LIBRARY_PATH environment variables before the CALL; those seem to be fine.
The following error message appears in the case of dynamic modules:
libcob: ....<module>.so: undefined symbol: cob_save_call_params
I use 64-bit OpenCOBOL 1.1.0 on Linux. gcc is used to create the binary from the C output of the cobc command.
This issue can be resolved by properly specifying the -lcob linker option (when using gcc). The gcc command used to create the binary already contained the option, but it was mistakenly placed between the target and the source file, so it had no effect. Running the DLL without JNI somehow does not require -lcob, but invocation through JNI does.
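As an illustration of the ordering issue (the file names here are hypothetical; the flags follow the usual gcc conventions for building a shared object):
# -lcob placed before the sources: the linker may discard it, reproducing
# the undefined-symbol error
gcc -shared -fPIC -lcob -o appl.so appl.c
# libraries belong after the objects that reference them: this resolves
# cob_save_call_params correctly
gcc -shared -fPIC -o appl.so appl.c -lcob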
