How could I fix this error when installing "processing-java"? - processing

I got an error when trying to install "processing-java" from the menu item under Tools in the Processing 4.0b8 IDE, but couldn't figure out the cause. I am running macOS Monterey.
java.lang.RuntimeException: Exception while attempting osascript -e do shell script "/bin/rm -f /usr/bin/processing-java && /bin/mkdir -p /usr/local/bin && /bin/mv /var/folders/k1/_tckl8qn3p1g4kvllk1sbft80000gn/T/processing15365159986239700480commander /usr/local/bin/processing-java" with administrator privileges
at processing.core.PApplet.exec(PApplet.java:3327)
at processing.app.tools.InstallCommander.run(InstallCommander.java:131)
at processing.app.Base.lambda$createToolItem$6(Base.java:889)
at java.desktop/javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:1972)
at java.desktop/javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2313)
at java.desktop/javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:405)
at java.desktop/javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:262)
at java.desktop/javax.swing.AbstractButton.doClick(AbstractButton.java:374)
at java.desktop/com.apple.laf.ScreenMenuItem.actionPerformed(ScreenMenuItem.java:129)
at java.desktop/java.awt.MenuItem.processActionEvent(MenuItem.java:692)
at java.desktop/java.awt.MenuItem.processEvent(MenuItem.java:651)
at java.desktop/java.awt.MenuComponent.dispatchEventImpl(MenuComponent.java:379)
at java.desktop/java.awt.MenuComponent.dispatchEvent(MenuComponent.java:367)
at java.desktop/java.awt.EventQueue.dispatchEventImpl(EventQueue.java:776)
at java.desktop/java.awt.EventQueue$4.run(EventQueue.java:722)
at java.desktop/java.awt.EventQueue$4.run(EventQueue.java:716)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:399)
at java.base/java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:86)
at java.base/java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:97)
at java.desktop/java.awt.EventQueue$5.run(EventQueue.java:746)
at java.desktop/java.awt.EventQueue$5.run(EventQueue.java:744)
at java.base/java.security.AccessController.doPrivileged(AccessController.java:399)
at java.base/java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:86)
at java.desktop/java.awt.EventQueue.dispatchEvent(EventQueue.java:743)
at java.desktop/java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:203)
at java.desktop/java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:124)
at java.desktop/java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:113)
at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:109)
at java.desktop/java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
at java.desktop/java.awt.EventDispatchThread.run(EventDispatchThread.java:90)
Caused by: java.io.IOException: Cannot run program "osascript": error=2, No such file or directory
at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1143)
at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1073)
at java.base/java.lang.Runtime.exec(Runtime.java:594)
at java.base/java.lang.Runtime.exec(Runtime.java:453)
at processing.core.PApplet.exec(PApplet.java:3325)
... 29 more
Caused by: java.io.IOException: error=2, No such file or directory
at java.base/java.lang.ProcessImpl.forkAndExec(Native Method)
at java.base/java.lang.ProcessImpl.<init>(ProcessImpl.java:314)
at java.base/java.lang.ProcessImpl.start(ProcessImpl.java:244)
at java.base/java.lang.ProcessBuilder.start(ProcessBuilder.java:1110)
... 33 more
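From the nested Caused by entries, the install script itself never ran: the JVM could not even launch osascript (error=2, No such file or directory), which is the tool Processing uses to run the commands shown above with administrator privileges. On a standard macOS install osascript ships in /usr/bin, so a first sanity check from a terminal (a hedged suggestion, nothing Processing-specific) is:
# osascript is part of macOS and normally lives in /usr/bin; if it is missing or
# not on the PATH that the Processing app sees, any shell-out to it fails with error=2
which osascript
ls -l /usr/bin/osascript
echo "$PATH"
If those look normal, the problem is more likely the environment the Processing app was launched with than the osascript binary itself.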

Related

Sqoop import data to Hive throws ERROR org.apache.sqoop.hive.HiveConfig?

I have installed HUE 3.10 on Ambari HDP 2.5.0 and fully configured hue.ini.
My problem is that when Sqoop syncs data from MySQL to Hive, it throws an exception:
[main] ERROR org.apache.sqoop.hive.HiveConfig – Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
[main] ERROR org.apache.sqoop.hive.HiveConfig – Could not load org.apache.hadoop.hive.conf.HiveConf. Make sure HIVE_CONF_DIR is set correctly.
[main] ERROR org.apache.sqoop.tool.ImportTool – Encountered IOException running import job: java.io.IOException: java.lang.ClassNotFoundException: org.apache.hadoop.hive.conf.HiveConf
at org.apache.sqoop.hive.HiveConfig.getHiveConf(HiveConfig.java:50)
at org.apache.sqoop.hive.HiveImport.getHiveArgs(HiveImport.java:397)
at org.apache.sqoop.hive.HiveImport.executeExternalHiveScript(HiveImport.java:384)
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:342)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:246)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:524)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:615)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
at org.apache.oozie.action.hadoop.SqoopMain.runSqoopJob(SqoopMain.java:202)
at org.apache.oozie.action.hadoop.SqoopMain.run(SqoopMain.java:182)
at org.apache.oozie.action.hadoop.LauncherMain.run(LauncherMain.java:51)
at org.apache.oozie.action.hadoop.SqoopMain.main(SqoopMain.java:48)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:242)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:54)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:453)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1724)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:162)
But if I execute the same Sqoop script on the command line, it works!
I added the environment variable HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/hdp/current/hive-client/lib to /etc/profile, but it still does not work. I have tried several times to solve this issue myself, but failed.
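Side note: a bare directory on a Java classpath only picks up loose .class files, not the jars inside it, so the entry above most likely needs a /* glob to actually expose the Hive jars. A sketch of what that line in /etc/profile could look like (the glob is the only change from what was tried):
# put the Hive client jars themselves on the classpath, not just their directory
export HADOOP_CLASSPATH=$HADOOP_CLASSPATH:/usr/hdp/current/hive-client/lib/*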
The script is /usr/hdp/2.5.0.0-1245/hive/bin/hive. It seems like ${HADOOP_CLASSPATH} points to /usr/hdp/2.5.0.0-1245/atlas/hook/hive/*?
#!/bin/bash
if [ -d "/usr/hdp/2.5.0.0-1245/atlas/hook/hive" ]; then
  if [ -z "${HADOOP_CLASSPATH}" ]; then
    export HADOOP_CLASSPATH=/usr/hdp/2.5.0.0-1245/atlas/hook/hive/*
  else
    export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:/usr/hdp/2.5.0.0-1245/atlas/hook/hive/*
  fi
fi
BIGTOP_DEFAULTS_DIR=${BIGTOP_DEFAULTS_DIR-/etc/default}
[ -n "${BIGTOP_DEFAULTS_DIR}" -a -r ${BIGTOP_DEFAULTS_DIR}/hbase ] && . ${BIGTOP_DEFAULTS_DIR}/hbase
export HIVE_HOME=${HIVE_HOME:-/usr/hdp/2.5.0.0-1245/hive}
export HADOOP_HOME=${HADOOP_HOME:-/usr/hdp/2.5.0.0-1245/hadoop}
export ATLAS_HOME=${ATLAS_HOME:-/usr/hdp/2.5.0.0-1245/atlas}
HCATALOG_JAR_PATH=/usr/hdp/2.5.0.0-1245/hive-hcatalog/share/hcatalog/hive-hcatalog-core-1.2.1000.2.5.0.0-1245.jar:/usr/hdp/2.5.0.0-1245/hive-hcatalog/share/hcatalog/hive-hcatalog-server-extensions-1.2.1000.2.5.0.0-1245.jar:/usr/hdp/2.5.0.0-1245/hive-hcatalog/share/webhcat/java-client/hive-webhcat-java-client-1.2.1000.2.5.0.0-1245.jar
if [ -z "${HADOOP_CLASSPATH}" ]; then
  export HADOOP_CLASSPATH=${HCATALOG_JAR_PATH}
else
  export HADOOP_CLASSPATH=${HADOOP_CLASSPATH}:${HCATALOG_JAR_PATH}
fi
exec "${HIVE_HOME}/bin/hive.distro" "$@"
How to solve this issue?
For me this issue was showing up in the Ambari Workflow Editor. To solve it, create a symbolic link on each Sqoop client node to the Hive lib directory where hive-exec.jar is located. Then put hive-exec.jar into the HDFS Oozie share lib folders.
# on each Sqoop client node, link hive-exec.jar from the Hive client into the Sqoop client
su root
cd /usr/hdp/current/sqoop-client/
ln -s /usr/hdp/current/hive-client/lib/hive-exec.jar hive-exec.jar
cp hive-exec.jar lib/
# then upload the jar into the Oozie share lib folders on HDFS
su -l hdfs
hdfs dfs -put hive-exec.jar /user/oozie/share/lib/sqoop
hdfs dfs -put hive-exec.jar /user/oozie/share/lib/lib_20161117191926/sqoop

Unable to run org.h2.tools.Script from the command line in AUTO_SERVER mode

I am running H2 in auto-server mode so that multiple processes can access it. But I am unable to run org.h2.tools.Script from the command line when TomEE is already using the database. If I shut down TomEE, org.h2.tools.Script works fine.
Here is the command I am using:
java -cp h2-1.4.188.jar org.h2.tools.Script -url 'jdbc:h2:~/test;FILE_LOCK=FILE;AUTO_SERVER=TRUE' -user sa -password sa -script test.sql
Here is the exception I get when I issue the command while TomEE is up:
Exception in thread "main" org.h2.jdbc.JdbcSQLException: IO Exception: "java.io.FileNotFoundException: /var/lib/test.sql (Permission denied)"; SQL statement:
SCRIPT TO '/var/lib/test.sql' [90028-188]
at org.h2.message.DbException.getJdbcSQLException(DbException.java:345)
at org.h2.message.DbException.get(DbException.java:168)
at org.h2.message.DbException.convertIOException(DbException.java:328)
at org.h2.command.dml.ScriptBase.openOutput(ScriptBase.java:146)
at org.h2.command.dml.ScriptCommand.query(ScriptCommand.java:159)
at org.h2.command.CommandContainer.query(CommandContainer.java:90)
at org.h2.command.Command.executeQuery(Command.java:197)
at org.h2.server.TcpServerThread.process(TcpServerThread.java:320)
at org.h2.server.TcpServerThread.run(TcpServerThread.java:159)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.FileNotFoundException: /var/lib/test.sql (Permission denied)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at java.io.FileOutputStream.<init>(FileOutputStream.java:101)
at org.h2.store.fs.FilePathDisk.newOutputStream(FilePathDisk.java:290)
at org.h2.store.fs.FileUtils.newOutputStream(FileUtils.java:233)
at org.h2.command.dml.ScriptBase.openOutput(ScriptBase.java:144)
... 6 more
at org.h2.engine.SessionRemote.done(SessionRemote.java:624)
at org.h2.command.CommandRemote.executeQuery(CommandRemote.java:158)
at org.h2.jdbc.JdbcStatement.executeInternal(JdbcStatement.java:179)
at org.h2.jdbc.JdbcStatement.execute(JdbcStatement.java:158)
at org.h2.tools.Script.process(Script.java:141)
at org.h2.tools.Script.process(Script.java:120)
at org.h2.tools.Script.runTool(Script.java:101)
at org.h2.tools.Script.main(Script.java:46)
I am not sure why I see a FileNotFoundException. Remember, this goes away when TomEE isn't running.
Any ideas? I am in a Docker container. I made sure that the URL matches between TomEE and Script.
I found out why this is happening. The TomEE process is running as the 'test' user while the Script command is running as root. Since TomEE gets to the H2 database first, the 'test' user needs permission to write to /var/lib/. But the /var/lib folder is owned by root, so the 'test' user can't write to it.
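With AUTO_SERVER=TRUE the SCRIPT TO statement runs inside whichever process currently owns the database (the stack trace shows org.h2.server.TcpServerThread, i.e. the TomEE side), so the output path has to be writable by that process's user. A minimal workaround sketch, assuming a dump in /tmp (or any directory the 'test' user can write to) is acceptable:
# write the dump somewhere the 'test' user running TomEE can create files
java -cp h2-1.4.188.jar org.h2.tools.Script \
  -url 'jdbc:h2:~/test;FILE_LOCK=FILE;AUTO_SERVER=TRUE' \
  -user sa -password sa \
  -script /tmp/test.sql
Adjusting the ownership or permissions of the target directory so the 'test' user can write there would work as well.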

Pig permission denied while hdfs file is readable

Running the following commands is successful:
hadoop fs -ls /path/
hadoop fs -cat /path/.pig_schema
And all the files in that dir have -rwxr-xr-x permissions.
However, in the Pig console, when running:
A = LOAD '/path/' USING PigStorage();
B = LIMIT A 5;
DUMP B;
I get a permission error:
2015-08-27 08:47:59,734 [main] ERROR org.apache.pig.tools.grunt.Grunt - You don't have permission to perform the operation. Error from the server: Permission denied
2015-08-27 08:47:59,735 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 2017: Internal error creating job configuration.
Any idea why?
EDIT 1: Added the error log:
================================================================================
Pig Stack Trace
---------------
ERROR 2017: Internal error creating job configuration.

org.apache.pig.impl.logicalLayer.FrontendException: ERROR 1066: Unable to open iterator for alias B
at org.apache.pig.PigServer.openIterator(PigServer.java:857)
at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:746)
at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:320)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:196)
at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:171)
at org.apache.pig.tools.grunt.Grunt.run(Grunt.java:69)
at org.apache.pig.Main.run(Main.java:543)
at org.apache.pig.Main.main(Main.java:157)
Caused by: org.apache.pig.PigException: ERROR 1002: Unable to store alias B
at org.apache.pig.PigServer.storeEx(PigServer.java:956)
at org.apache.pig.PigServer.store(PigServer.java:919)
at org.apache.pig.PigServer.openIterator(PigServer.java:832)
... 7 more
Caused by: org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobCreationException: ERROR 2017: Internal error creating job configuration.
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:874)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.compile(JobControlCompiler.java:297)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher.launchPig(MapReduceLauncher.java:177)
at org.apache.pig.PigServer.launchPlan(PigServer.java:1285)
at org.apache.pig.PigServer.executeCompiledLogicalPlan(PigServer.java:1270)
at org.apache.pig.PigServer.storeEx(PigServer.java:952)
... 9 more
Caused by: java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.createTempFile(File.java:1879)
at java.io.File.createTempFile(File.java:1923)
at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.JobControlCompiler.getJob(JobControlCompiler.java:538)
... 14 more
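The innermost Caused by is a local java.io.File.createTempFile call, not an HDFS operation, so the -rwxr-xr-x permissions on /path/ in HDFS are a red herring: the Pig client itself cannot create a temp file (the job jar) in its local temp directory, normally java.io.tmpdir. A hedged sketch of what to check, plus one possible workaround via PIG_OPTS (the chosen directory is an assumption):
# is the local temp dir writable by the user running pig?
ls -ld /tmp
touch /tmp/pig_write_test && rm /tmp/pig_write_test
# if not, point the Pig client JVM at a directory you can write to
mkdir -p "$HOME/tmp"
export PIG_OPTS="$PIG_OPTS -Djava.io.tmpdir=$HOME/tmp"
pig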

MapReduce ERROR UserGroupInformation - PriviledgedActionException

I am trying to run the following MapReduce code on my local machine:
https://github.com/Jeffyrao/warcbase/blob/extract-links/src/main/java/org/warcbase/data/ExtractLinks.java
However, I hit this exception:
[main] ERROR UserGroupInformation - PriviledgedActionException as:jeffy (auth:SIMPLE) cause:java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: Resource file:/Users/jeffy/Documents/Eclipse/warcbase/map_backup.txt is not publicly accessable and as such cannot be part of the public cache.
Exception in thread "main" java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: Resource file:/Users/jeffy/Documents/Eclipse/warcbase/map_backup.txt is not publicly accessable and as such cannot be part of the public cache.
at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:144)
at org.apache.hadoop.mapred.LocalJobRunner$Job.<init>(LocalJobRunner.java:155)
at org.apache.hadoop.mapred.LocalJobRunner.submitJob(LocalJobRunner.java:625)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:391)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1269)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1266)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:394)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1266)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1287)
at org.warcbase.data.ExtractLinks.run(ExtractLinks.java:254)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:84)
at org.warcbase.data.ExtractLinks.main(ExtractLinks.java:270)
Caused by: java.util.concurrent.ExecutionException: java.io.IOException: Resource file:/Users/jeffy/Documents/Eclipse/warcbase/map_backup.txt is not publicly accessable and as such cannot be part of the public cache.
at java.util.concurrent.FutureTask$Sync.innerGet(FutureTask.java:222)
at java.util.concurrent.FutureTask.get(FutureTask.java:83)
at org.apache.hadoop.mapred.LocalDistributedCacheManager.setup(LocalDistributedCacheManager.java:140)
... 14 more
I think this problem is because I am trying to add a file to the DistributedCache (look at my code at lines 81-86 and line 235). Any suggestion is welcome. Thanks!
I've run into a similar problem while running a Hadoop 2 job with the DistributedCache in a local environment. The cause of my problem was that Hadoop 2 not only verifies that the path itself has public execute and read permissions, it also verifies that all of its ancestor directories have execute permission. In this case, if "/" or "/Users" does not have 755 permissions, the file will still fail to be added to the public cache.
See the method static boolean ancestorsHaveExecutePermissions(FileSystem fs, Path path, LoadingCache<Path, Future<FileStatus>> statCache) in the Hadoop class FSDownload.java.
One solution could be granting permission to all directories (sounds unsafe).
A better solution is to make sure all resource files to be cached are in the /tmp folder, or in any other folder that has at least 755 permissions by default.
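A quick way to check that ancestor chain (the file path is taken from the error message above; the loop is just a sketch):
# walk from the cached file up to / and print each ancestor's permissions;
# every directory in this chain needs execute permission for "other" users
p=/Users/jeffy/Documents/Eclipse/warcbase/map_backup.txt
while [ "$p" != "/" ]; do
  p=$(dirname "$p")
  ls -ld "$p"
done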
I've run into a similar problem.
I ran mahout seq2sparse with tf-idf in local mode, and it raised this error:
Exception in thread "main" java.io.IOException: java.util.concurrent.ExecutionException: java.io.IOException: Resource file:/root/title.tfidf/dictionary.file-0 is not publicly accessable and as such cannot be part of the public cache.
I found that the permission of /root is 750 by default:
drwxr-x---. 12 root root 4096 16:04 root
So I changed the permissions of /root:
chmod 755 /root
Then it worked. So thanks, Yitong.
I had to change the permissions of only my home directory, as follows:
chmod go+rx /home/hadoop
to fix the problem, since / and /home already have rx permissions for group and other users on my system. Here 'hadoop' is my Linux login/user name.

"hudson.util.IOException2: Failed to create a temp file"

I came to work today and found my Hudson with this problem! I've tried to research it, but I didn't find anything that helped me.
Here is the full stack trace:
hudson.util.IOException2: Failed to create a temp file on /home/cpcaserver5/.hudson/jobs/SVN/workspace
at hudson.FilePath.createTextTempFile(FilePath.java:966)
at hudson.tasks.CommandInterpreter.createScriptFile(CommandInterpreter.java:124)
at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:68)
at hudson.tasks.CommandInterpreter.perform(CommandInterpreter.java:60)
at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:19)
at hudson.model.AbstractBuild$AbstractRunner.perform(AbstractBuild.java:630)
at hudson.model.Build$RunnerImpl.build(Build.java:175)
at hudson.model.Build$RunnerImpl.doRun(Build.java:137)
at hudson.model.AbstractBuild$AbstractRunner.run(AbstractBuild.java:429)
at hudson.model.Run.run(Run.java:1366)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:46)
at hudson.model.ResourceController.execute(ResourceController.java:88)
at hudson.model.Executor.run(Executor.java:145)
Caused by: hudson.util.IOException2: Failed to create a temporary directory in /etc/tomcat6/apache-tomcat-6.0.35/temp
at hudson.FilePath$12.invoke(FilePath.java:955)
at hudson.FilePath$12.invoke(FilePath.java:944)
at hudson.FilePath.act(FilePath.java:758)
at hudson.FilePath.act(FilePath.java:740)
at hudson.FilePath.createTextTempFile(FilePath.java:944)
... 12 more
Caused by: java.io.IOException: Permission denied
at java.io.UnixFileSystem.createFileExclusively(Native Method)
at java.io.File.checkAndCreate(File.java:1716)
at java.io.File.createTempFile(File.java:1804)
at hudson.FilePath$12.invoke(FilePath.java:953)
... 16 more
Email was triggered for: Failure
Sending email for trigger: Failure
It looks like you have a permissions problem. Make sure you run Jenkins/Tomcat with appropriate user permissions. Ditto if this happens on a slave - check that the slave process runs as a user that has the appropriate permissions.
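Both directories named in the trace matter here: the Tomcat temp directory /etc/tomcat6/apache-tomcat-6.0.35/temp and the job workspace under /home/cpcaserver5/.hudson. A hedged first step is to check who owns them and which account the Tomcat/Hudson process actually runs as (the user and group in the last command are placeholders, not values from the question):
# who owns the directories mentioned in the stack trace?
ls -ld /etc/tomcat6/apache-tomcat-6.0.35/temp
ls -ld /home/cpcaserver5/.hudson/jobs/SVN/workspace
# which user is the Tomcat/Hudson process running as?
ps -eo user,cmd | grep -i tomcat | grep -v grep
# if they don't match, hand the temp directory over to that user (placeholder names)
chown -R TOMCAT_USER:TOMCAT_GROUP /etc/tomcat6/apache-tomcat-6.0.35/temp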
