Android Studio error: java.io.FileNotFoundException (Access is denied)

If I create a new project from scratch or open one of my previous projects I get:
java.io.FileNotFoundException: C:\Users\Terry\.AndroidStudio2.2\system\android-palette\v1\layout_palette.xml (Access is denied)
I didn't have any problems previously; this error appeared all of a sudden.
I can open this file in the IDE without any problem, so I don't understand why Android Studio has trouble opening it.
Does anyone have any suggestions?
I'm using version 2.2.3.
Here is the sequence of events:
java.io.FileNotFoundException: C:\Users\Terry\.AndroidStudio2.2\system\android-palette\v1\layout_palette.xml (Access is denied)
java.lang.RuntimeException: java.io.FileNotFoundException: C:\Users\Terry\.AndroidStudio2.2\system\android-palette\v1\layout_palette.xml (Access is denied)
at com.android.tools.idea.uibuilder.palette.NlPaletteModel.loadPalette(NlPaletteModel.java:82)
at com.android.tools.idea.uibuilder.palette.NlPaletteModel.getPalette(NlPaletteModel.java:60)
at com.android.tools.idea.uibuilder.palette.NlPalettePanel.checkForNewMissingDependencies(NlPalettePanel.java:542)
at com.android.tools.idea.uibuilder.palette.NlPalettePanel.setDesignSurface(NlPalettePanel.java:195)
at com.android.tools.idea.uibuilder.palette.NlPalettePanel.<init>(NlPalettePanel.java:140)
at com.android.tools.idea.uibuilder.editor.NlPaletteManager.createContent(NlPaletteManager.java:88)
at com.intellij.designer.LightToolWindowManager$4.run(LightToolWindowManager.java:261)
at com.intellij.designer.LightToolWindowManager$4.run(LightToolWindowManager.java:258)
at com.intellij.designer.LightToolWindowManager.bind(LightToolWindowManager.java:208)
at com.android.tools.idea.uibuilder.editor.NlPreviewForm.lambda$attachPalette$178(NlPreviewForm.java:336)
at com.intellij.openapi.project.DumbServiceImpl.runWhenSmart(DumbServiceImpl.java:163)
at com.android.tools.idea.uibuilder.editor.NlPreviewForm.attachPalette(NlPreviewForm.java:332)
at com.android.tools.idea.uibuilder.editor.NlPreviewForm.setActiveModel(NlPreviewForm.java:326)
at com.android.tools.idea.uibuilder.editor.NlPreviewForm$Pending.run(NlPreviewForm.java:264)
at com.intellij.openapi.application.impl.LaterInvocator$FlushQueue.runNextEvent(LaterInvocator.java:345)
at com.intellij.openapi.application.impl.LaterInvocator$FlushQueue.run(LaterInvocator.java:329)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:311)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:756)
at java.awt.EventQueue.access$500(EventQueue.java:97)
at java.awt.EventQueue$3.run(EventQueue.java:709)
at java.awt.EventQueue$3.run(EventQueue.java:703)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:726)
at com.intellij.ide.IdeEventQueue.defaultDispatchEvent(IdeEventQueue.java:857)
at com.intellij.ide.IdeEventQueue._dispatchEvent(IdeEventQueue.java:658)
at com.intellij.ide.IdeEventQueue.dispatchEvent(IdeEventQueue.java:386)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:201)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:82)
Caused by: java.io.FileNotFoundException: C:\Users\Terry\.AndroidStudio2.2\system\android-palette\v1\layout_palette.xml (Access is denied)
at java.io.FileOutputStream.open0(Native Method)
at java.io.FileOutputStream.open(FileOutputStream.java:270)
at java.io.FileOutputStream.<init>(FileOutputStream.java:213)
at java.io.FileOutputStream.<init>(FileOutputStream.java:162)
at com.android.tools.idea.uibuilder.palette.NlPaletteModel.copyPredefinedPalette(NlPaletteModel.java:99)
at com.android.tools.idea.uibuilder.palette.NlPaletteModel.loadPalette(NlPaletteModel.java:74)
... 32 more

It was a simple fix. I made the files layout_palette.xml and menu_palette.xml writable and it worked.
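Worth noting why read access isn't enough here: the Caused by frame is java.io.FileOutputStream, i.e. Android Studio opens layout_palette.xml for writing, so a read-only file fails even though you can view it fine. If you would rather script the fix than click through Explorer, here is a minimal Python sketch; the user profile path and the .AndroidStudio2.2 folder name are assumptions, so adjust them to your installation:

import os
import stat

# The palette files live under the Android Studio config directory;
# ".AndroidStudio2.2" below is an assumption -- match it to your version.
palette_dir = os.path.expanduser(r"~\.AndroidStudio2.2\system\android-palette\v1")

for name in ("layout_palette.xml", "menu_palette.xml"):
    path = os.path.join(palette_dir, name)
    # stat.S_IWRITE clears the Windows read-only attribute on the file.
    os.chmod(path, stat.S_IWRITE | stat.S_IREAD)
    print("made writable:", path)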

For me, all the directories under .AndroidStudio were hidden and in read-only mode. I changed the permissions and after that it started working.
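If the whole .AndroidStudio tree is affected like this, a recursive variant is easier than fixing files one by one. A sketch, assuming Windows and the stock attrib command, that clears both the hidden and read-only attributes in one pass:

import os
import subprocess

# The config directory is an assumption -- match it to your Android Studio version.
config_dir = os.path.expanduser(r"~\.AndroidStudio2.2")
# attrib: -h clears hidden, -r clears read-only; /s recurses, /d includes directories.
subprocess.run(
    ["attrib", "-h", "-r", os.path.join(config_dir, "*"), "/s", "/d"],
    check=True,
)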

Related

"Export to windows64 failed!" message when trying to export application

I just wanted to test exporting applications using Processing, and I keep getting an error when I try to embed Java alongside it. The code simply consists of:
def setup():
    size(100,100)
I get the error message "Export to windows64 failed!" but not when I don't embed Java. I have the latest version of Java installed and am using 64-bit Windows. The full message is:
java.io.IOException: Launch4j seems to have failed.
at jycessing.mode.export.WindowsExport.runLaunch4j(WindowsExport.java:184)
at jycessing.mode.export.WindowsExport.export(WindowsExport.java:82)
at jycessing.mode.export.Exporter.export(Exporter.java:76)
at jycessing.mode.export.ExportDialog.go(ExportDialog.java:125)
at jycessing.mode.PyEditor.handleExportApplication(PyEditor.java:284)
at jycessing.mode.PyEditor$3.actionPerformed(PyEditor.java:168)
at javax.swing.AbstractButton.fireActionPerformed(AbstractButton.java:2022)
at javax.swing.AbstractButton$Handler.actionPerformed(AbstractButton.java:2348)
at javax.swing.DefaultButtonModel.fireActionPerformed(DefaultButtonModel.java:402)
at javax.swing.DefaultButtonModel.setPressed(DefaultButtonModel.java:259)
at javax.swing.AbstractButton.doClick(AbstractButton.java:376)
at javax.swing.plaf.basic.BasicMenuItemUI.doClick(BasicMenuItemUI.java:842)
at javax.swing.plaf.basic.BasicMenuItemUI$Handler.mouseReleased(BasicMenuItemUI.java:886)
at java.awt.Component.processMouseEvent(Component.java:6539)
at javax.swing.JComponent.processMouseEvent(JComponent.java:3324)
at java.awt.Component.processEvent(Component.java:6304)
at java.awt.Container.processEvent(Container.java:2239)
at java.awt.Component.dispatchEventImpl(Component.java:4889)
at java.awt.Container.dispatchEventImpl(Container.java:2297)
at java.awt.Component.dispatchEvent(Component.java:4711)
at java.awt.LightweightDispatcher.retargetMouseEvent(Container.java:4904)
at java.awt.LightweightDispatcher.processMouseEvent(Container.java:4535)
at java.awt.LightweightDispatcher.dispatchEvent(Container.java:4476)
at java.awt.Container.dispatchEventImpl(Container.java:2283)
at java.awt.Window.dispatchEventImpl(Window.java:2746)
at java.awt.Component.dispatchEvent(Component.java:4711)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:760)
at java.awt.EventQueue.access$500(EventQueue.java:97)
at java.awt.EventQueue$3.run(EventQueue.java:709)
at java.awt.EventQueue$3.run(EventQueue.java:703)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:74)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:84)
at java.awt.EventQueue$4.run(EventQueue.java:733)
at java.awt.EventQueue$4.run(EventQueue.java:731)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:74)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:730)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:205)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:82)
Thanks in advance!

Spark SQL query fails in Windows 10 due to tmp/hive write error. Current permissions are: rw-rw-rw-;

I went through quite a few Q&As on SO on the same topic, but none of the solutions helped me solve this, hence this question. Mine is a 64-bit Windows 10 machine.
This is my error stacktrace:
py4j.protocol.Py4JJavaError: An error occurred while calling o25.sql.
: org.apache.spark.sql.AnalysisException: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-;
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:106)
at org.apache.spark.sql.hive.HiveExternalCatalog.databaseExists(HiveExternalCatalog.scala:214)
at org.apache.spark.sql.internal.SharedState.externalCatalog$lzycompute(SharedState.scala:114)
at org.apache.spark.sql.internal.SharedState.externalCatalog(SharedState.scala:102)
at org.apache.spark.sql.internal.SharedState.globalTempViewManager$lzycompute(SharedState.scala:141)
at org.apache.spark.sql.internal.SharedState.globalTempViewManager(SharedState.scala:136)
at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$2.apply(HiveSessionStateBuilder.scala:55)
at org.apache.spark.sql.hive.HiveSessionStateBuilder$$anonfun$2.apply(HiveSessionStateBuilder.scala:55)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager$lzycompute(SessionCatalog.scala:91)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.globalTempViewManager(SessionCatalog.scala:91)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.lookupRelation(SessionCatalog.scala:701)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.org$apache$spark$sql$catalyst$analysis$Analyzer$ResolveRelations$$lookupTableFromCatalog(Analyzer.scala:730)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.resolveRelation(Analyzer.scala:685)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$8.applyOrElse(Analyzer.scala:715)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$$anonfun$apply$8.applyOrElse(Analyzer.scala:708)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$apply$1.apply(AnalysisHelper.scala:90)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$apply$1.apply(AnalysisHelper.scala:90)
at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:70)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:89)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:86)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:194)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$class.resolveOperatorsUp(AnalysisHelper.scala:86)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:29)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:326)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:324)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:87)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:86)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:194)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$class.resolveOperatorsUp(AnalysisHelper.scala:86)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:29)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:326)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:324)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:87)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:86)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:194)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$class.resolveOperatorsUp(AnalysisHelper.scala:86)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:29)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1$$anonfun$1.apply(AnalysisHelper.scala:87)
at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:326)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:187)
at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:324)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:87)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$$anonfun$resolveOperatorsUp$1.apply(AnalysisHelper.scala:86)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.allowInvokingTransformsInAnalyzer(AnalysisHelper.scala:194)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$class.resolveOperatorsUp(AnalysisHelper.scala:86)
at org.apache.spark.sql.catalyst.plans.logical.LogicalPlan.resolveOperatorsUp(LogicalPlan.scala:29)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:708)
at org.apache.spark.sql.catalyst.analysis.Analyzer$ResolveRelations$.apply(Analyzer.scala:654)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:87)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1$$anonfun$apply$1.apply(RuleExecutor.scala:84)
at scala.collection.LinearSeqOptimized$class.foldLeft(LinearSeqOptimized.scala:124)
at scala.collection.immutable.List.foldLeft(List.scala:84)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:84)
at org.apache.spark.sql.catalyst.rules.RuleExecutor$$anonfun$execute$1.apply(RuleExecutor.scala:76)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.apache.spark.sql.catalyst.rules.RuleExecutor.execute(RuleExecutor.scala:76)
at org.apache.spark.sql.catalyst.analysis.Analyzer.org$apache$spark$sql$catalyst$analysis$Analyzer$$executeSameContext(Analyzer.scala:127)
at org.apache.spark.sql.catalyst.analysis.Analyzer.execute(Analyzer.scala:121)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:106)
at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:105)
at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201)
at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:105)
at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:57)
at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:55)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:47)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:78)
at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:642)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:238)
at java.lang.Thread.run(Unknown Source)
Caused by: java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:522)
at org.apache.spark.sql.hive.client.HiveClientImpl.newState(HiveClientImpl.scala:183)
at org.apache.spark.sql.hive.client.HiveClientImpl.<init>(HiveClientImpl.scala:117)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(Unknown Source)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(Unknown Source)
at java.lang.reflect.Constructor.newInstance(Unknown Source)
at org.apache.spark.sql.hive.client.IsolatedClientLoader.createClient(IsolatedClientLoader.scala:271)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:384)
at org.apache.spark.sql.hive.HiveUtils$.newClientForMetadata(HiveUtils.scala:286)
at org.apache.spark.sql.hive.HiveExternalCatalog.client$lzycompute(HiveExternalCatalog.scala:66)
at org.apache.spark.sql.hive.HiveExternalCatalog.client(HiveExternalCatalog.scala:65)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply$mcZ$sp(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog$$anonfun$databaseExists$1.apply(HiveExternalCatalog.scala:215)
at org.apache.spark.sql.hive.HiveExternalCatalog.withClient(HiveExternalCatalog.scala:97)
... 84 more
Caused by: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-
at org.apache.hadoop.hive.ql.session.SessionState.createRootHDFSDir(SessionState.java:612)
at org.apache.hadoop.hive.ql.session.SessionState.createSessionDirs(SessionState.java:554)
at org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:508)
... 99 more
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 39, in <module>
File "C:\Users\aakash.basu\Documents\spark-2.4.3-bin-hadoop2.7\python\pyspark\sql\session.py", line 767, in sql
return DataFrame(self._jsparkSession.sql(sqlQuery), self._wrapped)
File "C:\Users\aakash.basu\Documents\spark-2.4.3-bin-hadoop2.7\python\lib\py4j-0.10.7-src.zip\py4j\java_gateway.py", line 1257, in __call__
File "C:\Users\aakash.basu\Documents\spark-2.4.3-bin-hadoop2.7\python\pyspark\sql\utils.py", line 69, in deco
raise AnalysisException(s.split(': ', 1)[1], stackTrace)
pyspark.sql.utils.AnalysisException: 'java.lang.RuntimeException: java.lang.RuntimeException: The root scratch dir: /tmp/hive on HDFS should be writable. Current permissions are: rw-rw-rw-;'
Whenever I try to join two DataFrames, Spark tries to save the intermediate output to a temp location and throws the above error.
I tried:
To give full access permission to the tmp folder: cacls C:\tmp /p everyone:f
To give full access from the tmp folder's Properties > Security tab; everything is ticked.
To give chmod 777 to C:\tmp\hive (the solution from the first answer of this)
And so on...
But nothing is working. On the contrary, PyCharm works like a charm and runs Spark without any issue. I still want to fix my pyspark shell, because I want to run ad hoc Spark in REPL mode; running full batch jobs is not always convenient while debugging.
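One detail the attempts above don't cover, offered as a hedged suggestion rather than a confirmed fix: on Windows, Hive checks and sets the \tmp\hive permissions through winutils.exe, so changes made with cacls or Explorer may not show up, and the winutils binary has to match your Hadoop build (2.7 for spark-2.4.3-bin-hadoop2.7). A minimal sketch, assuming HADOOP_HOME is set and its bin folder contains a matching winutils.exe:

import os
import subprocess

winutils = os.path.join(os.environ["HADOOP_HOME"], "bin", "winutils.exe")
# Set the permissions through winutils, since that is the layer Hive reads them from.
subprocess.run([winutils, "chmod", "-R", "777", r"\tmp\hive"], check=True)
# Print the permissions Hive will actually see; expect drwxrwxrwx, not rw-rw-rw-.
subprocess.run([winutils, "ls", r"\tmp\hive"], check=True)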

Running Hadoop on Windows, getting java.lang.UnsatisfiedLinkError: NativeIO$Windows.createDirectoryWithMode0

I am trying to run Hadoop on Windows and getting this error:
Exception in thread "main" java.lang.UnsatisfiedLinkError: org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Ljava/lang/String;I)V
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode0(Native Method)
at org.apache.hadoop.io.nativeio.NativeIO$Windows.createDirectoryWithMode(NativeIO.java:521)
at org.apache.hadoop.fs.RawLocalFileSystem.mkOneDirWithMode(RawLocalFileSystem.java:502)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirsWithOptionalPermission(RawLocalFileSystem.java:555)
at org.apache.hadoop.fs.RawLocalFileSystem.mkdirs(RawLocalFileSystem.java:533)
at org.apache.hadoop.fs.FilterFileSystem.mkdirs(FilterFileSystem.java:313)
at org.apache.hadoop.mapreduce.JobSubmissionFiles.getStagingDir(JobSubmissionFiles.java:133)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:146)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1341)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1338)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1807)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1338)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1359)
at dat.MaxTemperatureDriver.run(MaxTemperatureDriver.java:30)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:90)
at dat.MaxTemperatureDriver.main(MaxTemperatureDriver.java:37)
I already have winutils.exe, hadoop.dll, vcredist_x64.dll, vc_redist.x64.exe, vcredist_arm.exe, hdfs.dll in the Hadoop home/bin directory, and I have that directory in PATH and LD_LIBRARY_PATH. What am I still missing?
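Two things that commonly still go wrong here, offered as guesses rather than a confirmed fix: on Windows the JVM locates native libraries via java.library.path (normally derived from PATH, not LD_LIBRARY_PATH), and a hadoop.dll built for a different Hadoop version can load yet still lack newer symbols like createDirectoryWithMode0. A quick diagnostic sketch to see which hadoop.dll the loader would pick up:

import os

# List every hadoop.dll visible on PATH, in loader order; a stale copy earlier
# on PATH (for example in C:\Windows\System32) shadows the one in %HADOOP_HOME%\bin.
for entry in os.environ.get("PATH", "").split(os.pathsep):
    candidate = os.path.join(entry, "hadoop.dll")
    if os.path.isfile(candidate):
        print(candidate, os.path.getsize(candidate), "bytes")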

After installing the BIRT 4.4.2 all-in-one package I am not getting reports

I recently installed the all-in-one package of BIRT 4.4.2. After that I prepared a demo report with it, but when I run or debug it, I get an error. Do some plugins, such as the Report Engine, need to be installed after installing the Report Designer?
If yes, could someone please give me a step-by-step procedure for adding the plugins.
I am getting the following error after running:
Jun 04, 2015 11:34:49 AM org.eclipse.birt.report.debug.internal.core.launcher.ReportLauncher renderReport
SEVERE: Engine exception
org.eclipse.birt.report.engine.api.EngineException: Failed to initialize emitter.
at org.eclipse.birt.report.engine.emitter.EmitterUtil.getOuputStream(EmitterUtil.java:82)
at org.eclipse.birt.report.engine.emitter.html.HTMLReportEmitter.initialize(HTMLReportEmitter.java:358)
at org.eclipse.birt.report.engine.api.impl.EngineTask.initializeContentEmitter(EngineTask.java:2320)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.doRun(RunAndRenderTask.java:118)
at org.eclipse.birt.report.engine.api.impl.RunAndRenderTask.run(RunAndRenderTask.java:77)
at org.eclipse.birt.report.debug.internal.core.launcher.ReportLauncher.createReport(ReportLauncher.java:620)
at org.eclipse.birt.report.debug.internal.core.launcher.ReportLauncher.renderReport(ReportLauncher.java:566)
at org.eclipse.birt.report.debug.internal.core.launcher.ReportLauncher.run(ReportLauncher.java:480)
at org.eclipse.birt.report.debug.internal.core.launcher.ReportLauncher.main(ReportLauncher.java:124)
at org.eclipse.birt.report.debug.internal.core.ReportDebugger.start(ReportDebugger.java:39)
at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:196)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:134)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:104)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:380)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:235)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:648)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:603)
at org.eclipse.equinox.launcher.Main.run(Main.java:1465)
at org.eclipse.equinox.launcher.Main.main(Main.java:1438)
Caused by: java.io.FileNotFoundException: \Customers.rptdesign.html (Access is denied)
at java.io.FileOutputStream.open(Native Method)
at java.io.FileOutputStream.<init>(Unknown Source)
at java.io.FileOutputStream.<init>(Unknown Source)
at org.eclipse.birt.report.engine.emitter.EmitterUtil.getOuputStream(EmitterUtil.java:77)
... 22 more
Jun 04, 2015 11:34:49 AM org.eclipse.birt.report.debug.internal.core.vm.ReportVMServer$1 run
WARNING: [Server] client disconnected
I would be grateful if someone could help me with this, as I am completely new to this software.
Thanks.
Hmm, doesn't the exception already say it all?
java.io.FileNotFoundException: \Customers.rptdesign.html (Access is denied) at java.io.FileOutputStream.open(Native Method) at java.io.FileOutputStream.<init>(Unknown Source) at java.io.FileOutputStream.<init>(Unknown Source) ...
You are trying to create an output file \Customers.rptdesign.html at the root of your current disk and obviously you don't have write permission there.
If it is simply a permissions issue, run the BIRT Designer or BIRT runtime as Administrator.
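A quick way to convince yourself where that path points (Python used here purely for illustration; the Windows path rules are the same for the JVM): a path with a leading backslash and no drive letter resolves against the root of the current drive.

import os

# On Windows, a drive-relative path such as "\Customers.rptdesign.html"
# resolves to the root of the current drive, e.g. C:\Customers.rptdesign.html.
print(os.path.abspath("\\Customers.rptdesign.html"))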

Aptana 3 is not starting and only says "An error has occurred. See the log file."

I have checked the log file and see the following (any ideas how to resolve this?):
!ENTRY org.eclipse.osgi 4 0 2012-10-22 09:44:40.920
!MESSAGE Application error
!STACK 1
java.lang.NoClassDefFoundError: org/eclipse/core/resources/IContainer
at com.aptana.rcp.IDEApplication.start(IDEApplication.java:125)
at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:196)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:110)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:79)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:344)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:179)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
at java.lang.reflect.Method.invoke(Unknown Source)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:622)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:577)
at org.eclipse.equinox.launcher.Main.run(Main.java:1410)
Caused by: org.eclipse.core.runtime.internal.adaptor.EclipseLazyStarter$TerminatingClassNotFoundException: An error occurred while automatically activating bundle org.eclipse.core.resources (427).
at org.eclipse.core.runtime.internal.adaptor.EclipseLazyStarter.postFindLocalClass(EclipseLazyStarter.java:122)
at org.eclipse.osgi.baseadaptor.loader.ClasspathManager.findLocalClass(ClasspathManager.java:463)
at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.findLocalClass(DefaultClassLoader.java:216)
at org.eclipse.osgi.internal.loader.BundleLoader.findLocalClass(BundleLoader.java:400)
at org.eclipse.osgi.internal.loader.SingleSourcePackage.loadClass(SingleSourcePackage.java:35)
at org.eclipse.osgi.internal.loader.BundleLoader.findClassInternal(BundleLoader.java:473)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:429)
at org.eclipse.osgi.internal.loader.BundleLoader.findClass(BundleLoader.java:417)
at org.eclipse.osgi.internal.baseadaptor.DefaultClassLoader.loadClass(DefaultClassLoader.java:107)
at java.lang.ClassLoader.loadClass(Unknown Source)
... 13 more
Caused by: org.osgi.framework.BundleException: Exception in org.eclipse.core.resources.ResourcesPlugin.start() of bundle org.eclipse.core.resources.
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:734)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:683)
at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:381)
at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:299)
at org.eclipse.osgi.framework.util.SecureAction.start(SecureAction.java:440)
at org.eclipse.osgi.internal.loader.BundleLoader.setLazyTrigger(BundleLoader.java:268)
at org.eclipse.core.runtime.internal.adaptor.EclipseLazyStarter.postFindLocalClass(EclipseLazyStarter.java:107)
... 22 more
Caused by: org.eclipse.core.internal.dtree.ObjectNotFoundException: Tree element '/dlpage-css/database/setup/countries.txt' not found.
at org.eclipse.core.internal.dtree.AbstractDataTree.handleNotFound(AbstractDataTree.java:257)
at org.eclipse.core.internal.dtree.DeltaDataTree.getData(DeltaDataTree.java:585)
at org.eclipse.core.internal.dtree.DataDeltaNode.asBackwardDelta(DataDeltaNode.java:50)
at org.eclipse.core.internal.dtree.NoDataDeltaNode.asBackwardDelta(NoDataDeltaNode.java:59)
at org.eclipse.core.internal.dtree.NoDataDeltaNode.asBackwardDelta(NoDataDeltaNode.java:59)
at org.eclipse.core.internal.dtree.NoDataDeltaNode.asBackwardDelta(NoDataDeltaNode.java:59)
at org.eclipse.core.internal.dtree.DataDeltaNode.asBackwardDelta(DataDeltaNode.java:47)
at org.eclipse.core.internal.dtree.DeltaDataTree.asBackwardDelta(DeltaDataTree.java:88)
at org.eclipse.core.internal.dtree.DeltaDataTree.reroot(DeltaDataTree.java:816)
at org.eclipse.core.internal.dtree.DeltaDataTree.reroot(DeltaDataTree.java:815)
at org.eclipse.core.internal.dtree.DeltaDataTree.reroot(DeltaDataTree.java:815)
at org.eclipse.core.internal.dtree.DeltaDataTree.reroot(DeltaDataTree.java:792)
at org.eclipse.core.internal.watson.ElementTree.immutable(ElementTree.java:518)
at org.eclipse.core.internal.resources.SaveManager.restore(SaveManager.java:710)
at org.eclipse.core.internal.resources.SaveManager.startup(SaveManager.java:1528)
at org.eclipse.core.internal.resources.Workspace.startup(Workspace.java:2503)
at org.eclipse.core.internal.resources.Workspace.open(Workspace.java:2251)
at org.eclipse.core.resources.ResourcesPlugin.start(ResourcesPlugin.java:439)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:711)
at java.security.AccessController.doPrivileged(Native Method)
at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:702)
... 28 more
This just happened to me again, at the worst possible time.
I really didn't want to have to rebuild the whole workspace, but I found this answer and it worked!
For anyone interested, I had the exact same problem and deleting the file .metadata/.plugins/org.eclipse.core.resources/.snap did the trick for me. This file is located in your broken workspace's folder.
From: http://www.eclipse.org/forums/index.php/mv/msg/156204/734920/#msg_734920
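If you hit this often, the same cleanup can be scripted. A minimal sketch, assuming the workspace location below is yours (it renames rather than deletes, so you keep a backup in case the .snap file isn't the culprit):

from pathlib import Path

# The workspace path is an assumption -- point it at your broken workspace.
workspace = Path.home() / "Documents" / "Aptana 3 Studio Workspace"
snap = workspace / ".metadata" / ".plugins" / "org.eclipse.core.resources" / ".snap"
if snap.exists():
    # Renaming keeps a backup; Aptana rebuilds this state on the next start.
    backup = snap.with_name(".snap.bak")
    snap.rename(backup)
    print("renamed", snap.name, "to", backup.name)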
Try deleting/renaming the workspace folder (e.g. Documents\Aptana 3 Studio Workspace); that worked for me.
I increased the -Xms and -Xmx values in STS.ini and it solved the problem.
New Values:
-Xms256m
-Xmx2048m
