java.security.AccessControlException in Spring Application

I have a properties file referenced in src/resources/database-context.xml:
<context:property-placeholder
location="file:#{systemProperties['CONF_DIR']}/environment.properties"
ignore-unresolvable="true"/>
For some reason it's not being read, and I get the following error...
14:15:32.454 [main] WARN org.springframework.context.support.ClassPathXmlApplicationContext - Exception encountered during context initialization - cancelling refresh attempt
org.springframework.beans.factory.BeanInitializationException: Could not load properties; nested exception is java.io.FileNotFoundException: \environment.properties (The system cannot find the file specified)
at org.springframework.beans.factory.config.PropertyResourceConfigurer.postProcessBeanFactory(PropertyResourceConfigurer.java:89)
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:265)
at org.springframework.context.support.PostProcessorRegistrationDelegate.invokeBeanFactoryPostProcessors(PostProcessorRegistrationDelegate.java:162)
at org.springframework.context.support.AbstractApplicationContext.invokeBeanFactoryPostProcessors(AbstractApplicationContext.java:606)
at org.springframework.context.support.AbstractApplicationContext.refresh(AbstractApplicationContext.java:462)
at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:139)
at org.springframework.context.support.ClassPathXmlApplicationContext.<init>(ClassPathXmlApplicationContext.java:83)
at com.mizuho.ats.main.LoanAdjust.main(LoanAdjust.java:80)
Caused by: java.io.FileNotFoundException: \environment.properties (The system cannot find the file specified)
But the real reason might be an access control issue...
14:15:32.282 [main] DEBUG org.springframework.core.SpringProperties - Could not retrieve system property 'spring.beaninfo.ignore': java.security.AccessControlException: access denied ("java.util.PropertyPermission" "spring.beaninfo.ignore" "read")
14:15:32.298 [main] DEBUG org.springframework.beans.factory.support.DefaultListableBeanFactory - Invoking afterPropertiesSet() on bean with name 'org.mybatis.spring.mapper.MapperScannerConfigurer#0'
14:15:32.298 [main] DEBUG org.springframework.beans.factory.support.DefaultListableBeanFactory - Finished creating instance of bean 'org.mybatis.spring.mapper.MapperScannerConfigurer#0'
14:15:32.298 [main] DEBUG org.springframework.core.env.StandardEnvironment - Adding [systemProperties] PropertySource with lowest search precedence
14:15:32.298 [main] DEBUG org.springframework.core.SpringProperties - Could not retrieve system property 'spring.getenv.ignore': java.security.AccessControlException: access denied ("java.util.PropertyPermission" "spring.getenv.ignore" "read")
And it's failing to read the entries...
14:43:45.274 [main] INFO org.springframework.core.env.StandardEnvironment - Caught AccessControlException when accessing system property [CONF_DIR];
its value will be returned [null]. Reason: access denied ("java.util.PropertyPermission" "CONF_DIR" "read")
Of course, the file exists in the path. And there is a security policy file with the following specification... (maybe it's not even loading.)
grant {
// Allow everything
permission java.security.AllPermission;
};
The original application was developed in an environment having only JDK 1.7, and I am working in a JDK 1.8 environment, though I have explicitly specified the 1.7 JDK in the compile settings and the 1.7 JRE in the Run Configurations JRE settings.
What might be causing this? Any resolution?

The answer is to create an entry in the lib/security policy file of your JRE.
Usually for Spring applications the src/resources policy file will suffice, but for some reason it did not here, especially since the logs and other files were being read from locations outside the project folders. It's a runtime exception.
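For illustration, a narrower grant than AllPermission based on the permissions named in the logs above (a sketch; the exact property names and path are assumptions for this setup), placed in the JRE's lib/security/java.policy:
grant {
    // Spring probes several of its own switches (spring.beaninfo.ignore,
    // spring.getenv.ignore) plus CONF_DIR, so allow property reads
    permission java.util.PropertyPermission "*", "read";
    // ...and allow reading the properties file that CONF_DIR points to
    // (policy files expand ${propname}; ${/} is the platform file separator)
    permission java.io.FilePermission "${CONF_DIR}${/}environment.properties", "read";
};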

Related

No pools configured yet in tomcat for apex

Tomcat is not coming up for Oracle APEX because it's pointing to a wrong path that does not exist. Is there a way to fix this? Below is the error message.
INFO [main] . No pools configured yet
SEVERE [main] oracle.dbtools.common.logging.LegacyLoggingAdaptor.severe Error writing to:
D:\oracle\product\old_ords\defaults.xml, D:\oracle\product\old_ords\defaults.xml (The system cannot find the path specified)

Driver claims to not accept jdbcUrl, ${JDBC_DATABASE_URL} when setting externalized config

I'm using the instructions in this Heroku article to attempt to externalize database strings for a hosted Postgres DB. When I use the literal string, I can connect with no issues. However, when I attempt to use the externalized variable ${JDBC_DATABASE_URL}, I'm getting the error message:
2022-06-17 17:30:17.063 WARN 7712 --- [ main] ConfigServletWebServerApplicationContext : Exception encountered during context initialization - cancelling refresh attempt: org.springframework.beans.factory.BeanCreationException: Error creating bean with name 'liquibase' defined in class path resource [org/springframework/boot/autoconfigure/liquibase/LiquibaseAutoConfiguration$LiquibaseConfiguration.class]: Invocation of init method failed; nested exception is java.lang.RuntimeException: Driver org.postgresql.Driver claims to not accept jdbcUrl, ${JDBC_DATABASE_URL}
It looks like it's interpreting the externalized variable itself as a literal string, but I have no idea why. I haven't included any other config files because the app currently works with the literal database string, but I can attach them if need be. Any ideas?
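For reference, the externalized form the Heroku article describes looks like this in application.properties (my reconstruction; the poster's actual config files aren't shown):
spring.datasource.url=${JDBC_DATABASE_URL}
Since the liquibase bean is receiving the literal ${JDBC_DATABASE_URL}, whichever file carries that line evidently isn't going through Spring's placeholder resolution, or the environment variable isn't visible to the process.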

NiFi fails to launch due to java.lang.IllegalArgumentException

I have been trying to launch NiFi, but every time I do so I get the following error:
2019-03-06 18:53:46,935 ERROR [main] org.apache.nifi.NiFi Failure to launch NiFi due to java.lang.IllegalArgumentException: java.security.NoSuchAlgorithmException: md5 MessageDigest not available
java.lang.IllegalArgumentException: java.security.NoSuchAlgorithmException: md5 MessageDigest not available
at org.apache.nifi.nar.NarUnpacker.calculateMd5sum(NarUnpacker.java:419)
at org.apache.nifi.nar.NarUnpacker.unpackNar(NarUnpacker.java:228)
at org.apache.nifi.nar.NarUnpacker.unpackNars(NarUnpacker.java:123)
at org.apache.nifi.NiFi.<init>(NiFi.java:128)
at org.apache.nifi.NiFi.<init>(NiFi.java:71)
at org.apache.nifi.NiFi.main(NiFi.java:296)
Caused by: java.security.NoSuchAlgorithmException: md5 MessageDigest not available
at sun.security.jca.GetInstance.getInstance(GetInstance.java:159)
at java.security.Security.getImpl(Security.java:695)
at java.security.MessageDigest.getInstance(MessageDigest.java:167)
at org.apache.nifi.nar.NarUnpacker.calculateMd5sum(NarUnpacker.java:407)
... 5 common frames omitted
2019-03-06 18:53:46,939 INFO [Thread-1] org.apache.nifi.NiFi Initiating shutdown of Jetty web server...
2019-03-06 18:53:46,940 INFO [Thread-1] org.apache.nifi.NiFi Jetty web server shutdown completed (nicely or otherwise).
I understand this is coming from the "calculateMd5sum" function, which calculates the MD5 sum of a specified file. However, I have made no changes to any of the NARs, nor have I added any custom NARs. The same instance did launch before.
I have also tried to start afresh by extracting the setup again, but I face the same error. I fail to understand why the issue has come up all of a sudden. Please help!
I got it.
My JAVA_HOME pointed to "C:\Program Files\Java\jdk1.8.0_65".
I changed the path to "C:\Program Files (x86)\Java\jre1.8.0_121".
It works fine now.
Thanks @BryanBende
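If you want to confirm which JRE is at fault, here is a minimal check (my own illustration, not part of the original answer) that asks the running JVM for the same algorithm NiFi's NarUnpacker requests:
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class Md5Check {
    public static void main(String[] args) {
        try {
            // NarUnpacker.calculateMd5sum requests this same algorithm
            MessageDigest md = MessageDigest.getInstance("MD5");
            System.out.println("MD5 available from provider: " + md.getProvider().getName());
        } catch (NoSuchAlgorithmException e) {
            System.out.println("MD5 missing - check which JRE JAVA_HOME points to");
        }
    }
}
Run it with the java executable from the JAVA_HOME that NiFi uses; if it prints "MD5 missing", that installation's security providers are broken or restricted.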

Unable to start Hive CLI Hadoop(MapR)

I am trying to access the Hive CLI. However, it fails to start with the following AccessControl issue.
Strangely enough, I am able to query Hive data from Hue without the AccessControl issue. However, the Hive CLI is not working.
I am on a MapR cluster.
Any help is much appreciated.
[<user_name>@<edge_node> ~]$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/mapr/hive/hive-2.1/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/mapr/lib/slf4j-log4j12-1.7.12.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in file:/opt/mapr/hive/hive-2.1/conf/hive-log4j2.properties Async: true
2017-09-23 23:52:08,988 WARN [main] DataNucleus.General: Plugin (Bundle) "org.datanucleus.api.jdo" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/mapr/spark/spark-2.1.0/jars/datanucleus-api-jdo-4.2.4.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/mapr/hive/hive-2.1/lib/datanucleus-api-jdo-4.2.1.jar."
2017-09-23 23:52:08,993 WARN [main] DataNucleus.General: Plugin (Bundle) "org.datanucleus" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/mapr/spark/spark-2.1.0/jars/datanucleus-core-4.1.6.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/mapr/hive/hive-2.1/lib/datanucleus-core-4.1.6.jar."
2017-09-23 23:52:09,004 WARN [main] DataNucleus.General: Plugin (Bundle) "org.datanucleus.store.rdbms" is already registered. Ensure you dont have multiple JAR versions of the same plugin in the classpath. The URL "file:/opt/mapr/spark/spark-2.1.0/jars/datanucleus-rdbms-4.1.19.jar" is already registered, and you are trying to register an identical plugin located at URL "file:/opt/mapr/hive/hive-2.1/lib/datanucleus-rdbms-4.1.7.jar."
2017-09-23 23:52:09,038 INFO [main] DataNucleus.Persistence: Property datanucleus.cache.level2 unknown - will be ignored
2017-09-23 23:52:09,039 INFO [main] DataNucleus.Persistence: Property hive.metastore.integral.jdo.pushdown unknown - will be ignored
2017-09-23 23:52:14,2251 ERROR JniCommon fs/client/fileclient/cc/jni_MapRClient.cc:2172 Thread: 20235 mkdirs failed for /user/<user_name>, error 13
Exception in thread "main" java.lang.RuntimeException: org.apache.hadoop.security.AccessControlException: User <user_name>(user id 50005586) has been denied access to create <user_name>
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:617)
at org.apache.hadoop.hive.ql.session.SessionState.beginStart(SessionState.java:531)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:714)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:646)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:641)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
Caused by: org.apache.hadoop.security.AccessControlException: User <user_name>(user id 50005586) has been denied access to create <user_name>
at com.mapr.fs.MapRFileSystem.makeDir(MapRFileSystem.java:1256)
at com.mapr.fs.MapRFileSystem.mkdirs(MapRFileSystem.java:1276)
at org.apache.hadoop.fs.FileSystem.mkdirs(FileSystem.java:1913)
at org.apache.hadoop.hive.ql.exec.tez.DagUtils.getDefaultDestDir(DagUtils.java:823)
at org.apache.hadoop.hive.ql.exec.tez.DagUtils.getHiveJarDirectory(DagUtils.java:917)
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.createJarLocalResource(TezSessionState.java:616)
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.openInternal(TezSessionState.java:256)
at org.apache.hadoop.hive.ql.exec.tez.TezSessionState.beginOpen(TezSessionState.java:220)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:614)
... 10 more
The error is saying you're denied access to create a directory in the file system. This is likely /user/<user_name>, which will need to be created by the HDFS / MapR FS superuser.
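For example (a sketch assuming the standard Hadoop home-directory layout; run as the superuser and substitute the real user name):
hadoop fs -mkdir /user/<user_name>
hadoop fs -chown <user_name> /user/<user_name>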
I am able to query hive data from Hue without the AccessControl issue
Hue communicates via Thrift and HiveServer2.
Hive CLI bypasses HiveServer2 and is deprecated.
You should use Beeline instead.
beeline -n $(whoami) -u jdbc:hive2://hiveserver:10000/default
And if you're in a kerberized cluster, then you'll need some extra options there.
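For example, with Kerberos the JDBC URL carries the Hive service principal (the principal and realm below are placeholders for your cluster's values):
beeline -u "jdbc:hive2://hiveserver:10000/default;principal=hive/_HOST@EXAMPLE.COM"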

FileNotFound exception when launching an empty WebSphere server

There was no project deployed to the WebSphere server, and I have yet to enable the security feature on WebSphere Server 7.0.
Could it be from previous attempts to create a server? I've not been successful. The directory C:\myworkspace\.metadata\.plugins\org.eclipse.wst.server.core\tmp1\ does not exist because I deleted the tmp1 folder in an attempt to clean my server environment. Repeatedly deleting and reinstalling the server does not help.
Here's the stacktrace:
[11/24/14 11:57:00:930 EST] 0000000e ApplicationMg E WSVR0100W: An error occurred initializing, MyApplication
org.eclipse.jst.j2ee.commonarchivecore.internal.exception.NestedJarException: IWAE0008E An error occurred reading MyApplication.war from C:\myWorkspace
Stack trace of nested exception:
java.io.FileNotFoundException: C:\myworkspace\.metadata\.plugins\org.eclipse.wst.server.core\tmp1\MyApplication (The system cannot find the path specified.)
at java.io.FileInputStream.open(Native Method)
at java.io.FileInputStream.<init>(FileInputStream.java:112)
at java.io.FileInputStream.<init>(FileInputStream.java:72)
at
