SonarQube 5.1 PAM - no jpam in java.library.path

I'm unable to use the PAM plugin on SonarQube 5.1 on Debian 8 (64-bit).
I set it up according to https://github.com/SonarCommunity/sonar-pam and still get the following error during login:
Java::JavaLang::UnsatisfiedLinkError (no jpam in java.library.path):
java.lang.ClassLoader.loadLibrary(ClassLoader.java:1886)
java.lang.Runtime.loadLibrary0(Runtime.java:849)
java.lang.System.loadLibrary(System.java:1088)
net.sf.jpam.Pam.<clinit>(Pam.java:51)
org.sonar.plugins.pam.PamConfiguration.newInstance(PamConfiguration.java:61)
org.sonar.plugins.pam.PamConfiguration.getPAM(PamConfiguration.java:49)
org.sonar.plugins.pam.PamAuthenticator.authenticate(PamAuthenticator.java:45)
org.sonar.api.security.SecurityRealm$1.doAuthenticate(SecurityRealm.java:60)
Here's the setup (SonarQube is located at /var/lib/sonarqube-5.1):
/var/lib/sonarqube-5.1/lib/JPam-1.1.jar
The native libraries (64-bit and 32-bit) have been put at /var/lib/sonarqube-5.1/bin/linux-x86-64/lib/libjpam.so and /var/lib/sonarqube-5.1/bin/linux-x86-32/lib/libjpam.so (the 32-bit copy just in case SonarQube runs as 32-bit).
All directories leading to the native libraries, and the libraries themselves, have +rx access.
Any idea what could be causing the problem?

I'd print the java.library.path variable. The only thing I can think of is that the jpam lib is in the wrong place or there is an issue with permissions. (Did you check that the SonarQube user can actually read that file?)
UPDATE
Check java.library.path on the Settings -> System Info page.
Move the jpam lib to one of those paths.
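If you want to print the path yourself, a minimal sketch (plain Java, no SonarQube dependencies; PrintLibPath is just an illustrative name) is:
public class PrintLibPath {
    public static void main(String[] args) {
        // These are the directories the JVM searches when System.loadLibrary("jpam") runs
        System.out.println(System.getProperty("java.library.path"));
    }
}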

Related

Unable to run SparkR in Rstudio

I can't use SparkR in RStudio because I'm getting this error: Error in sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap, :
JVM is not ready after 10 seconds
I have tried to search for a solution but can't find one. Here is how I have tried to set up SparkR:
Sys.setenv(SPARK_HOME="C:/Users/alibaba555/Downloads/spark") # The path to your Spark installation
.libPaths(c(file.path(Sys.getenv("SPARK_HOME"), "R", "lib"), .libPaths()))
library("SparkR", lib.loc="C:/Users/alibaba555/Downloads/spark/R") # The path to the lib folder in the Spark location
library(SparkR)
sparkR.session(master="local[*]", sparkConfig=list(spark.driver.memory="2g"))
Now execution starts with a message:
Launching java with spark-submit command
C:/Users/alibaba555/Downloads/spark/bin/spark-submit2.cmd
sparkr-shell
C:\Users\ALIBAB~1\AppData\Local\Temp\Rtmp00FFkx\backend_port1b90491e4622
And finally, after a few minutes, it returns an error message:
Error in sparkR.sparkContext(master, appName, sparkHome,
sparkConfigMap, : JVM is not ready after 10 seconds
Thanks!
It looks like the path to your Spark library is wrong. It should be something like: library("SparkR", lib.loc="C:/Users/alibaba555/Downloads/spark/R/lib")
I'm not sure if that will fix your problem, but it could help. Also, what versions of Spark/SparkR and Scala are you using? Did you build from source?
What was causing my issue boiled down to our users' working directory being a network-mapped drive.
Changing the working directory fixed the issue.
If by chance you are also using databricks-connect, make sure the .databricks-connect file is copied into the %HOME% of each user who will be running RStudio, or set up databricks-connect for each of them.

Building RXTX with --disable-locks

I need to build RXTX (http://rxtx.qbang.org/wiki/index.php/Main_Page) for a 64-bit platform with --disable-locks. (The target platform is Ubuntu Snappy, so there is a problem with permissions and lock files.)
./configure --disable-locks and make seemed to run OK.
I got a new .jar file and librxtxSerial.so as a result.
However, when I installed them, I got the following error:
java.lang.UnsatisfiedLinkError: gnu.io.RXTXCommDriver.nativeGetVersion()Ljava/lang/String; thrown while loading gnu.io.RXTXCommDriver
java.lang.NoClassDefFoundError: Could not initialize class gnu.io.RXTXCommDriver thrown while loading gnu.io.RXTXCommDriver
Is there something else I need to do?
Are there other object files I need to copy over?
Thanks in advance.
I solved the problem myself.
The problem was that the configure script was not expecting a Java version higher than 1.5, as shown in this extract:
case $JAVA_VERSION in
1.2*|1.3*|1.4*|1.5*)
#fix_parameters $JPATH/jre/lib/javax.comm.properties
CLASSPATH=".:\$(TOP):\$(TOP)/src:"`find $JPATH/ -name RXTXcomm.jar | head -n1`
RXTX_PATH="\$(JPATH)/jre/lib/\$(OS_ARCH)"
JHOME=$JPATH/"jre/lib/ext"
So the paths were not being set up correctly for me.
I changed it to
case $JAVA_VERSION in
1.2*|1.3*|1.4*|1.5*|1.7*)
Then it worked ok.
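As a quick sanity check that the rebuilt native library can be found at all, a minimal sketch (assuming RXTX's standard native library name, rxtxSerial) is:
public class RxtxLoadCheck {
    public static void main(String[] args) {
        // Throws UnsatisfiedLinkError if librxtxSerial.so is not on java.library.path
        System.loadLibrary("rxtxSerial");
        System.out.println("Native RXTX serial library loaded OK");
    }
}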

Failed to load native-lzo library when setting up Lzo on Cloudera Hadoop

I just followed the steps in the Cloudera documentation, and had the GPL Extras parcel installed on the cluster as well as the HDFS service configured via Cloudera Manager. But an error occurred when trying to read .lzo files on HDFS:
$hadoop fs -text /tmp/Lzo/log.txt.lzo
INFO lzo.GPLNativeCodecLoader: Loaded native gpl library
WARN lzo.LzoCompressor: java.lang.NoSuchFieldError: lzoCompressLevelFunc
ERROR lzo.LzoCodec: Failed to load/initialize native-lzo library
-text: Fatal internal error
java.lang.RuntimeException: native-lzo library not available
I've read a dozen posts and know that it's caused by a failure of JNI to load the lzo library, but none of them could properly solve my problem. Here are the efforts I've made:
1. All datanodes have lzop installed.
2. JAVA_LIBRARY_PATH in mapred-site.xml is set to /opt/cloudera/parcels/CDH/lib/hadoop/lib/native, which contains the liblzo2.* files.
3. HADOOP_CLASSPATH is set to /usr/local/lib, which contains the hadoop-lzo.jar files.
What else can I do? Any suggestions would be appreciated!
Problem solved! It was caused by the java.lang.NoSuchFieldError. hadoop-lzo-0.4.15 does not have the field lzoCompressLevelFunc, and when I switched to hadoop-lzo-0.4.20 the WARN and the ERROR were gone.
I got this issue as well today and spent quite a long time figuring out the actual root cause.
So I want to summarize the issue here: in short, the jar and the native library need to be compatible, and the best way to ensure this is to build them from the same version of the source code.
What I suffered from is that I was using hadoop-gpl-compression.jar, but the native library I was using was built from hadoop-lzo, so there was a compatibility issue. What I did afterwards was build the native library from the hadoop-gpl-compression project instead of hadoop-lzo, and then it worked.
If you are using the jar built from hadoop-lzo, then you should also use the native library built from hadoop-lzo, ideally from the same version of the source.
Likewise, if you are using the jar built from the hadoop-gpl-compression project, then you should also use the native library built from it, ideally from the same version of the source.
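As a quick compatibility check, something like this (a sketch assuming the hadoop-lzo jar is on the classpath; GPLNativeCodeLoader is its loader class) reports whether the native side loaded at all:
import com.hadoop.compression.lzo.GPLNativeCodeLoader;

public class LzoNativeCheck {
    public static void main(String[] args) {
        // true only if the native-lzo library was found and loaded via JNI
        System.out.println("native-lzo loaded: " + GPLNativeCodeLoader.isNativeCodeLoaded());
        System.out.println("java.library.path = " + System.getProperty("java.library.path"));
    }
}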
Build the binaries and export HADOOP_CLASSPATH and JAVA_LIBRARY_PATH env variables:
https://gist.github.com/mmiliaus/5644460

Running Typesafe Activator 1.2.10 on OS X Mavericks error

I have just downloaded Typesafe Activator 1.2.10-minimal on Mac OS X Mavericks. When I try to run it using any command, I get the following error:
java.lang.UnsatisfiedLinkError: /Library/Java/JavaVirtualMachines/jdk1.7.0_67.jdk/Contents/Home/jre/lib/lwawt/liblwawt.dylib: dlopen(/Library/Java/JavaVirtualMachines/jdk1.7.0_67.jdk/Contents/Home/jre/lib/lwawt/liblwawt.dylib, 1): Library not loaded: #rpath/libosxapp.dylib
Referenced from: /Library/Java/JavaVirtualMachines/jdk1.7.0_67.jdk/Contents/Home/jre/lib/lwawt/liblwawt.dylib
Reason: image not found
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary1(ClassLoader.java:1965)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1890)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1851)
at java.lang.Runtime.load0(Runtime.java:795)
at java.lang.System.load(System.java:1062)
at java.lang.ClassLoader$NativeLibrary.load(Native Method)
at java.lang.ClassLoader.loadLibrary1(ClassLoader.java:1965)
at java.lang.ClassLoader.loadLibrary0(ClassLoader.java:1890)
at java.lang.ClassLoader.loadLibrary(ClassLoader.java:1872)
at java.lang.Runtime.loadLibrary0(Runtime.java:849)
at java.lang.System.loadLibrary(System.java:1088)
at sun.security.action.LoadLibraryAction.run(LoadLibraryAction.java:67)
at sun.security.action.LoadLibraryAction.run(LoadLibraryAction.java:47)
at java.security.AccessController.doPrivileged(Native Method)
at java.awt.Toolkit.loadLibraries(Toolkit.java:1653)
at java.awt.Toolkit.<clinit>(Toolkit.java:1682)
at java.awt.Desktop.isDesktopSupported(Desktop.java:169)
at activator.ActivatorLauncher.openDocs(ActivatorLauncher.scala:55)
at activator.ActivatorLauncher.displayHelp(ActivatorLauncher.scala:72)
at activator.ActivatorLauncher.run(ActivatorLauncher.scala:32)
at xsbt.boot.Launch$$anonfun$run$1.apply(Launch.scala:109)
at xsbt.boot.Launch$.withContextLoader(Launch.scala:129)
at xsbt.boot.Launch$.run(Launch.scala:109)
at xsbt.boot.Launch$$anonfun$apply$1.apply(Launch.scala:36)
at xsbt.boot.Launch$.launch(Launch.scala:117)
at xsbt.boot.Launch$.apply(Launch.scala:19)
at xsbt.boot.Boot$.runImpl(Boot.scala:44)
at xsbt.boot.Boot$.main(Boot.scala:20)
at xsbt.boot.Boot.main(Boot.scala)
Error during sbt execution: java.lang.UnsatisfiedLinkError: /Library/Java/JavaVirtualMachines/jdk1.7.0_67.jdk/Contents/Home/jre/lib/lwawt/liblwawt.dylib: dlopen(/Library/Java/JavaVirtualMachines/jdk1.7.0_67.jdk/Contents/Home/jre/lib/lwawt/liblwawt.dylib, 1): Library not loaded: #rpath/libosxapp.dylib
Referenced from: /Library/Java/JavaVirtualMachines/jdk1.7.0_67.jdk/Contents/Home/jre/lib/lwawt/liblwawt.dylib
Reason: image not found
What could be wrong?
Thanks,
Suriyanto
Copying libosxapp.dylib from /Library/Java/JavaVirtualMachines/jdk1.7.0_67.jdk/Contents/Home/jre/lib to /Library/Java/JavaVirtualMachines/jdk1.7.0_67.jdk/Contents/Home/jre/lib/lwawt solved the issue for me. It seems #rpath looks only in the lwawt folder, not in the lib folder. The issue happened to me after installing JDK 7 after JDK 8.
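To verify the fix without launching Activator, a one-class probe (a sketch; it trips the same java.awt.Toolkit initialization as the stack trace above) can be run on the affected JDK:
public class AwtLoadCheck {
    public static void main(String[] args) {
        // Forces Toolkit's static initializer to load the native AWT libraries
        // (liblwawt.dylib on OS X), which in turn depends on libosxapp.dylib
        System.out.println("Desktop supported: " + java.awt.Desktop.isDesktopSupported());
    }
}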
This does not look like a Scala- or Activator-specific problem, but rather like a problem with your JDK. It is looking for libosxapp.dylib on the run-path search paths but doesn't find it.
That most likely means that either your JDK installation is broken or there is some problem with your environment settings (I'm just guessing here, but I think something like DYLD_LIBRARY_PATH might have an impact).
Does /Library/Java/JavaVirtualMachines/jdk1.7.0_67.jdk/Contents/Home/jre/lib/libosxapp.dylib exist? If not, that's the problem, and you might want to reinstall your JDK.
Does everything else run fine on that JDK?
Have you tried another JDK, e.g. Java 8? (Not that Activator requires Java 8, but it would be an easy way to try it on another JDK.)
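If you want to answer the existence question from Java itself, a quick sketch (DylibCheck is just an illustrative name; the path is taken from the error message above):
import java.io.File;

public class DylibCheck {
    public static void main(String[] args) {
        // Prints whether the dylib the linker is asking for is actually on disk
        File f = new File("/Library/Java/JavaVirtualMachines/jdk1.7.0_67.jdk"
                + "/Contents/Home/jre/lib/libosxapp.dylib");
        System.out.println(f + " exists: " + f.exists());
    }
}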

Eclipse plug-ins disappeared after update

I updated Eclipse PDT using the Window->Check for Updates feature.
After restarting, all third-party plug-ins seem to be switched off.
Starting with the -clean command-line option doesn't help.
Eclipse's Installation Details still lists all my plug-ins correctly.
Error log:
eclipse.buildId=M20090917-0800
java.version=1.6.0_05
java.vendor=Sun Microsystems Inc.
BootLoader constants: OS=win32, ARCH=x86, WS=win32, NL=ru_RU
Framework arguments: -product org.eclipse.epp.package.php.product
Command-line arguments: -os win32 -ws win32 -arch x86 -product org.eclipse.epp.package.php.product
!ENTRY org.eclipse.team.core 4 0 2009-11-24 12:52:00.804
!MESSAGE Could not instantiate provider org.eclipse.team.svn.core.svnnature for project Search.
!STACK 1
org.eclipse.team.core.TeamException: Could not instantiate provider org.eclipse.team.svn.core.svnnature for project Search.
at org.eclipse.team.core.RepositoryProvider.mapNewProvider(RepositoryProvider.java:165)
at org.eclipse.team.core.RepositoryProvider.mapExistingProvider(RepositoryProvider.java:235)
at org.eclipse.team.core.RepositoryProvider.getProvider(RepositoryProvider.java:507)
at org.eclipse.team.internal.ccvs.ui.CVSLightweightDecorator.isMappedToCVS(CVSLightweightDecorator.java:192)
at org.eclipse.team.internal.ccvs.ui.CVSLightweightDecorator.decorate(CVSLightweightDecorator.java:147)
at org.eclipse.ui.internal.decorators.LightweightDecoratorDefinition.decorate(LightweightDecoratorDefinition.java:263)
at org.eclipse.ui.internal.decorators.LightweightDecoratorManager$LightweightRunnable.run(LightweightDecoratorManager.java:81)
at org.eclipse.core.runtime.SafeRunner.run(SafeRunner.java:42)
at org.eclipse.ui.internal.decorators.LightweightDecoratorManager.decorate(LightweightDecoratorManager.java:365)
at org.eclipse.ui.internal.decorators.LightweightDecoratorManager.getDecorations(LightweightDecoratorManager.java:347)
at org.eclipse.ui.internal.decorators.DecorationScheduler$1.ensureResultCached(DecorationScheduler.java:371)
at org.eclipse.ui.internal.decorators.DecorationScheduler$1.run(DecorationScheduler.java:331)
at org.eclipse.core.internal.jobs.Worker.run(Worker.java:55)
!SUBENTRY 1 org.eclipse.team.core 4 0 2009-11-24 12:52:00.804
!MESSAGE Could not instantiate provider org.eclipse.team.svn.core.svnnature for project Search.
Mar Cel was right.
I have written it up in my German-language wiki (link below).
You need to chown the Eclipse program folder to give full access permissions to the workspace owner.
Start Eclipse, stop it, change the owner back to root, and restart Eclipse again. After that, everything worked for me (Eclipse 3.7.2).
Wiki:
http://wiki.xstable.de/doku.php/entwicklungsumgebung:eclipse:faq?&#keine_plugins_mehr_in_eclipse_nach_update
The solution is to use the Equinox p2 Installer!
There is no other offline way to install/reinstall plug-ins or features.
This seems to be an issue with the write permissions of the user executing Eclipse. My guess is that the user can write metadata to the workspace (but not to the Eclipse installation itself), so Eclipse shows that the plug-ins were installed successfully, but they are obviously not available in the GUI because none of the features were really installed into Eclipse itself.
Just alter the Eclipse program folders to give full permissions to the user actually executing Eclipse. Eclipse will then recognize that the metadata is wrong, repair it, and let you install the plug-ins once again. After that, all features will be available.
