When I drag the Split component into the mapping, the messages log shows me this error:
"java.lang.IllegalStateException: Unable to replace columns in splitter table", signalled when dragging and dropping the SPLIT component in ODI 12.2.1 Studio Designer:
at oracle.odi.ui.mapping.logical.inspector.components.SplitterConditionsInspector$1.createCustomColumns(SplitterConditionsInspector.java:142)
at oracle.cef.inspector.impl.DefaultPropertyTableCustomComponent.createTableContent(DefaultPropertyTableCustomComponent.java:107)
at oracle.cef.inspector.table.PropertyTableCustomComponent.createComponent(PropertyTableCustomComponent.java:195)
at oracle.cef.inspector.CustomGUIComponent.onInitialize(CustomGUIComponent.java:115)
at oracle.odi.ui.mapping.logical.inspector.components.ConnectorPointsInspector.createComponent(ConnectorPointsInspector.java:59)
at oracle.cef.inspector.CustomGUIComponent.onInitialize(CustomGUIComponent.java:115)
at oracle.ide.inspector.DisplayGroupPanel.componentRendered(DisplayGroupPanel.java:342)
at oracle.ide.inspector.DisplayGroupPanel.render(DisplayGroupPanel.java:297)
at oracle.ide.inspector.DisplayGroupPanel.render(DisplayGroupPanel.java:112)
at oracle.ide.inspector.DisplayGroupPanel.<init>(DisplayGroupPanel.java:73)
at oracle.ide.inspector.VerticalDisplayGroupPanel.<init>(VerticalDisplayGroupPanel.java:29)
at oracle.ide.inspector.DisplayGroupPanelFactory.createDisplayGroupPanel(DisplayGroupPanelFactory.java:20)
at oracle.ide.inspector.PropertyCategoryLayoutPanel.renderDisplayGroup(PropertyCategoryLayoutPanel.java:136)
at oracle.ide.inspector.PropertyCategoryLayoutPanel.displayGroupRendered(PropertyCategoryLayoutPanel.java:124)
at oracle.ide.inspector.PropertyCategoryLayoutPanel.populateRows(PropertyCategoryLayoutPanel.java:91)
at oracle.ide.inspector.PropertyCategoryLayoutPanel.render(PropertyCategoryLayoutPanel.java:76)
at oracle.ide.inspector.VerticalCategoryCollection.expandIfNecessary(VerticalCategoryCollection.java:128)
at oracle.ide.inspector.VerticalCategoryCollection.addCategory(VerticalCategoryCollection.java:108)
at oracle.ide.inspector.PropertiesLayoutRenderer.touchCategoriesWithoutRendering(PropertiesLayoutRenderer.java:75)
at oracle.ide.inspector.PropertiesLayoutRenderer.render(PropertiesLayoutRenderer.java:38)
at oracle.ide.inspector.PropertyInspector.renderPropertiesFrom(PropertyInspector.java:605)
at oracle.ide.inspector.PropertyInspector.render(PropertyInspector.java:475)
at oracle.ide.inspector.PropertyInspector.refresh(PropertyInspector.java:456)
at oracle.ide.inspector.PropertyInspector.updatePropertyModel(PropertyInspector.java:429)
at oracle.ide.inspector.PropertyInspector.setPropertyModel(PropertyInspector.java:377)
at oracle.ideimpl.inspector.InspectorWindowImpl.setModelInNewPropertyInspector(InspectorWindowImpl.java:1447)
at oracle.ideimpl.inspector.InspectorWindowImpl.refresh(InspectorWindowImpl.java:1285)
at oracle.ideimpl.inspector.InspectorWindowImpl$1.actionPerformed(InspectorWindowImpl.java:279)
at javax.swing.Timer.fireActionPerformed(Timer.java:313)
at javax.swing.Timer$DoPostEvent.run(Timer.java:245)
at java.awt.event.InvocationEvent.dispatch(InvocationEvent.java:311)
at java.awt.EventQueue.dispatchEventImpl(EventQueue.java:756)
at java.awt.EventQueue.access$500(EventQueue.java:97)
at java.awt.EventQueue$3.run(EventQueue.java:709)
at java.awt.EventQueue$3.run(EventQueue.java:703)
at java.security.AccessController.doPrivileged(Native Method)
at java.security.ProtectionDomain$JavaSecurityAccessImpl.doIntersectionPrivilege(ProtectionDomain.java:76)
at java.awt.EventQueue.dispatchEvent(EventQueue.java:726)
at oracle.javatools.internal.ui.EventQueueWrapper._dispatchEvent(EventQueueWrapper.java:169)
at oracle.javatools.internal.ui.EventQueueWrapper.dispatchEvent(EventQueueWrapper.java:151)
at java.awt.EventDispatchThread.pumpOneEventForFilters(EventDispatchThread.java:201)
at java.awt.EventDispatchThread.pumpEventsForFilter(EventDispatchThread.java:116)
at java.awt.EventDispatchThread.pumpEventsForHierarchy(EventDispatchThread.java:105)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:101)
at java.awt.EventDispatchThread.pumpEvents(EventDispatchThread.java:93)
at java.awt.EventDispatchThread.run(EventDispatchThread.java:82)
1/ How can I correct this?
2/ What is meant by "IllegalStateException"?
3/ How can I get the state value? In which class/component?
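For context (not specific to ODI): java.lang.IllegalStateException is the standard JDK runtime exception a method throws when its object is not in a valid state for that call. There is no separate "state value" to read out; you only get the message and the stack trace, and the invalid state itself lives in the private fields of the throwing class (here, whatever internal table model SplitterConditionsInspector.createCustomColumns was updating). A minimal sketch with a hypothetical class, not the ODI component:

// Hypothetical illustration of how IllegalStateException is typically used.
public class Splitter {
    private boolean initialized = false;

    void init() { initialized = true; }

    void replaceColumns() {
        if (!initialized) {
            // thrown because the method was called while the object is in the wrong state
            throw new IllegalStateException("Unable to replace columns in splitter table");
        }
        // ... actual work ...
    }

    public static void main(String[] args) {
        Splitter s = new Splitter();
        try {
            s.replaceColumns(); // init() was never called, so this throws
        } catch (IllegalStateException e) {
            System.out.println("Caught: " + e.getMessage()); // the only "state" exposed to callers
        }
    }
}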
Related
Saving changes in javascript file after comparing projects and editing fails with IllegalStateException
Steps to reproduce
STS4 4.17.2, macOS arm_64 (fresh install)
Create projects comp1 and comp2
Create file comp1/hello.js with
console.log("Hello Foo");
Create file comp2/hello.js with
console.log("Hello Bar");
Select the comp1 and comp2 projects and select Compare With -> Each Other
In the compare editor, double-click on hello.js and make a change to either side
Try to save
Result:
Changes are not saved and an exception appears in the error log
!MESSAGE Unhandled event loop exception
!STACK 0
java.lang.IllegalStateException
at org.eclipse.jface.text.TextViewer.setHyperlinkPresenter(TextViewer.java:5639)
at org.eclipse.jface.text.source.SourceViewer.configure(SourceViewer.java:529)
at org.eclipse.ui.internal.genericeditor.compare.GenericEditorMergeViewer.configureTextViewer(GenericEditorMergeViewer.java:68)
at org.eclipse.ui.internal.genericeditor.compare.GenericEditorMergeViewer$1.inputDocumentChanged(GenericEditorMergeViewer.java:50)
at org.eclipse.jface.text.TextViewer.fireInputDocumentChanged(TextViewer.java:2849)
at org.eclipse.jface.text.TextViewer.setDocument(TextViewer.java:2890)
at org.eclipse.jface.text.source.SourceViewer.setDocument(SourceViewer.java:663)
at org.eclipse.jface.text.source.SourceViewer.setDocument(SourceViewer.java:603)
at org.eclipse.compare.contentmergeviewer.TextMergeViewer$ContributorInfo.updateViewerDocument(TextMergeViewer.java:807)
at org.eclipse.compare.contentmergeviewer.TextMergeViewer$ContributorInfo.internalSetDocument(TextMergeViewer.java:762)
at org.eclipse.compare.contentmergeviewer.TextMergeViewer$ContributorInfo.setDocument(TextMergeViewer.java:679)
at org.eclipse.compare.contentmergeviewer.TextMergeViewer.updateContent(TextMergeViewer.java:3029)
at org.eclipse.compare.contentmergeviewer.ContentMergeViewer.internalRefresh(ContentMergeViewer.java:793)
at org.eclipse.compare.contentmergeviewer.ContentMergeViewer.refresh(ContentMergeViewer.java:769)
at org.eclipse.compare.contentmergeviewer.ContentMergeViewer.handleCompareInputChange(ContentMergeViewer.java:1389)
at org.eclipse.compare.contentmergeviewer.TextMergeViewer.handleCompareInputChange(TextMergeViewer.java:5345)
at org.eclipse.compare.contentmergeviewer.ContentMergeViewer.lambda$1(ContentMergeViewer.java:390)
at org.eclipse.compare.structuremergeviewer.DiffNode.fireChange(DiffNode.java:140)
at org.eclipse.compare.internal.ResourceCompareInput$MyDiffNode.fireChange(ResourceCompareInput.java:90)
at org.eclipse.compare.internal.MergeViewerContentProvider.saveLeftContent(MergeViewerContentProvider.java:152)
at org.eclipse.compare.contentmergeviewer.ContentMergeViewer.flushLeftSide(ContentMergeViewer.java:1272)
at org.eclipse.compare.contentmergeviewer.TextMergeViewer.flushLeftSide(TextMergeViewer.java:5235)
at org.eclipse.compare.contentmergeviewer.TextMergeViewer.flushContent(TextMergeViewer.java:5258)
at org.eclipse.compare.contentmergeviewer.ContentMergeViewer.flush(ContentMergeViewer.java:1243)
at org.eclipse.compare.CompareEditorInput.flushViewer(CompareEditorInput.java:1220)
at org.eclipse.compare.CompareEditorInput.flushViewers(CompareEditorInput.java:1189)
at org.eclipse.compare.internal.ResourceCompareInput.getAdapter(ResourceCompareInput.java:508)
at org.eclipse.core.runtime.Adapters.adapt(Adapters.java:67)
at org.eclipse.core.runtime.Adapters.adapt(Adapters.java:116)
at org.eclipse.ui.ide.ResourceUtil.getFile(ResourceUtil.java:61)
at org.eclipse.ui.internal.ide.actions.BuildUtilities.findSelectedProjects(BuildUtilities.java:108)
at org.eclipse.ui.actions.BuildAction.isEnabled(BuildAction.java:231)
at org.eclipse.ui.actions.RetargetAction.setActionHandler(RetargetAction.java:268)
at org.eclipse.ui.internal.ide.actions.RetargetActionWithDefault.setActionHandler(RetargetActionWithDefault.java:51)
at org.eclipse.ui.actions.RetargetAction.propagateChange(RetargetAction.java:204)
at org.eclipse.ui.SubActionBars.firePropertyChange(SubActionBars.java:289)
at org.eclipse.ui.SubActionBars.fireActionHandlersChanged(SubActionBars.java:273)
at org.eclipse.ui.SubActionBars.updateActionBars(SubActionBars.java:603)
at org.eclipse.compare.internal.CompareHandlerService.updateActionBars(CompareHandlerService.java:131)
at org.eclipse.compare.internal.CompareHandlerService.updatePaneActionHandlers(CompareHandlerService.java:161)
at org.eclipse.compare.contentmergeviewer.TextMergeViewer.connectGlobalActions(TextMergeViewer.java:2784)
at org.eclipse.compare.contentmergeviewer.TextMergeViewer.setActiveViewer(TextMergeViewer.java:2719)
at org.eclipse.compare.contentmergeviewer.TextMergeViewer$22.focusLost(TextMergeViewer.java:2682)
at org.eclipse.swt.widgets.TypedListener.handleEvent(TypedListener.java:147)
at org.eclipse.swt.widgets.EventTable.sendEvent(EventTable.java:89)
at org.eclipse.swt.widgets.Display.sendEvent(Display.java:4646)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1524)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1547)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1528)
at org.eclipse.swt.widgets.Control.sendFocusEvent(Control.java:3433)
at org.eclipse.swt.widgets.Canvas.sendFocusEvent(Canvas.java:80)
at org.eclipse.swt.widgets.Display.checkFocus(Display.java:744)
at org.eclipse.swt.widgets.Shell.makeFirstResponder(Shell.java:1317)
at org.eclipse.swt.widgets.Display.windowProc(Display.java:6522)
at org.eclipse.swt.internal.cocoa.OS.objc_msgSend_bool(Native Method)
at org.eclipse.swt.internal.cocoa.NSWindow.makeFirstResponder(NSWindow.java:197)
at org.eclipse.swt.widgets.Control.forceFocus(Control.java:1472)
at org.eclipse.swt.widgets.Control.forceFocus(Control.java:1452)
at org.eclipse.swt.widgets.Control.setFocus(Control.java:3880)
at org.eclipse.swt.widgets.Composite.setFocus(Composite.java:1129)
at org.eclipse.swt.widgets.Composite.setFocus(Composite.java:1127)
at org.eclipse.swt.widgets.Composite.setFocus(Composite.java:1127)
at org.eclipse.swt.widgets.Composite.setFocus(Composite.java:1127)
at org.eclipse.swt.widgets.Composite.setFocus(Composite.java:1127)
at org.eclipse.swt.widgets.Control.fixFocus(Control.java:1377)
at org.eclipse.swt.widgets.Control.setEnabled(Control.java:3860)
at org.eclipse.ui.internal.WorkbenchWindow.disableControl(WorkbenchWindow.java:2278)
at org.eclipse.ui.internal.WorkbenchWindow.run(WorkbenchWindow.java:2324)
at org.eclipse.ui.internal.SaveableHelper.runProgressMonitorOperation(SaveableHelper.java:278)
at org.eclipse.ui.internal.SaveableHelper.runProgressMonitorOperation(SaveableHelper.java:260)
at org.eclipse.ui.internal.SaveableHelper.saveModels(SaveableHelper.java:207)
at org.eclipse.ui.internal.SaveableHelper.savePart(SaveableHelper.java:150)
at org.eclipse.ui.internal.WorkbenchPage.saveSaveable(WorkbenchPage.java:3802)
at org.eclipse.ui.internal.WorkbenchPage.saveEditor(WorkbenchPage.java:3815)
at org.eclipse.ui.internal.handlers.SaveHandler.execute(SaveHandler.java:82)
at org.eclipse.ui.internal.handlers.HandlerProxy.execute(HandlerProxy.java:283)
at org.eclipse.ui.internal.handlers.E4HandlerProxy.execute(E4HandlerProxy.java:97)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at org.eclipse.e4.core.internal.di.MethodRequestor.execute(MethodRequestor.java:58)
at org.eclipse.e4.core.internal.di.InjectorImpl.invokeUsingClass(InjectorImpl.java:317)
at org.eclipse.e4.core.internal.di.InjectorImpl.invoke(InjectorImpl.java:251)
at org.eclipse.e4.core.contexts.ContextInjectionFactory.invoke(ContextInjectionFactory.java:173)
at org.eclipse.e4.core.commands.internal.HandlerServiceHandler.execute(HandlerServiceHandler.java:156)
at org.eclipse.core.commands.Command.executeWithChecks(Command.java:488)
at org.eclipse.core.commands.ParameterizedCommand.executeWithChecks(ParameterizedCommand.java:485)
at org.eclipse.e4.core.commands.internal.HandlerServiceImpl.executeHandler(HandlerServiceImpl.java:213)
at org.eclipse.e4.ui.bindings.keys.KeyBindingDispatcher.executeCommand(KeyBindingDispatcher.java:308)
at org.eclipse.e4.ui.bindings.keys.KeyBindingDispatcher.press(KeyBindingDispatcher.java:580)
at org.eclipse.e4.ui.bindings.keys.KeyBindingDispatcher.processKeyEvent(KeyBindingDispatcher.java:647)
at org.eclipse.e4.ui.bindings.keys.KeyBindingDispatcher.filterKeySequenceBindings(KeyBindingDispatcher.java:439)
at org.eclipse.e4.ui.bindings.keys.KeyBindingDispatcher$KeyDownFilter.handleEvent(KeyBindingDispatcher.java:96)
at org.eclipse.swt.widgets.EventTable.sendEvent(EventTable.java:89)
at org.eclipse.swt.widgets.Display.filterEvent(Display.java:1217)
at org.eclipse.swt.widgets.Display.sendEvent(Display.java:4641)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1524)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1547)
at org.eclipse.swt.widgets.Widget.sendEvent(Widget.java:1532)
at org.eclipse.swt.widgets.Widget.sendKeyEvent(Widget.java:1561)
at org.eclipse.swt.widgets.Widget.sendKeyEvent(Widget.java:1557)
at org.eclipse.swt.widgets.Canvas.sendKeyEvent(Canvas.java:522)
at org.eclipse.swt.widgets.Control.doCommandBySelector(Control.java:1085)
at org.eclipse.swt.widgets.Display.windowProc(Display.java:6492)
at org.eclipse.swt.internal.cocoa.OS.objc_msgSend(Native Method)
at org.eclipse.swt.internal.cocoa.NSResponder.interpretKeyEvents(NSResponder.java:59)
at org.eclipse.swt.widgets.Composite.keyDown(Composite.java:607)
at org.eclipse.swt.widgets.Display.windowProc(Display.java:6324)
at org.eclipse.swt.internal.cocoa.OS.objc_msgSendSuper(Native Method)
at org.eclipse.swt.widgets.Widget.callSuper(Widget.java:236)
at org.eclipse.swt.widgets.Widget.windowSendEvent(Widget.java:2264)
at org.eclipse.swt.widgets.Shell.windowSendEvent(Shell.java:2511)
at org.eclipse.swt.widgets.Display.windowProc(Display.java:6444)
at org.eclipse.swt.internal.cocoa.OS.objc_msgSendSuper(Native Method)
at org.eclipse.swt.widgets.Display.applicationSendEvent(Display.java:5692)
at org.eclipse.swt.widgets.Display.applicationProc(Display.java:5831)
at org.eclipse.swt.internal.cocoa.OS.objc_msgSend(Native Method)
at org.eclipse.swt.internal.cocoa.NSApplication.sendEvent(NSApplication.java:117)
at org.eclipse.swt.widgets.Display.readAndDispatch(Display.java:3986)
at org.eclipse.e4.ui.internal.workbench.swt.PartRenderingEngine$5.run(PartRenderingEngine.java:1155)
at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:338)
at org.eclipse.e4.ui.internal.workbench.swt.PartRenderingEngine.run(PartRenderingEngine.java:1046)
at org.eclipse.e4.ui.internal.workbench.E4Workbench.createAndRunUI(E4Workbench.java:155)
at org.eclipse.ui.internal.Workbench.lambda$3(Workbench.java:643)
at org.eclipse.core.databinding.observable.Realm.runWithDefault(Realm.java:338)
at org.eclipse.ui.internal.Workbench.createAndRunWorkbench(Workbench.java:550)
at org.eclipse.ui.PlatformUI.createAndRunWorkbench(PlatformUI.java:171)
at org.eclipse.ui.internal.ide.application.IDEApplication.start(IDEApplication.java:152)
at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:203)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:136)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:104)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:402)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:255)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:659)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:596)
at org.eclipse.equinox.launcher.Main.run(Main.java:1467)
Workaround:
Uninstalling the Wild Web Developer plugin allows saving to work.
Changing the file extension to .txt also avoids the problem.
I have configured Hive (1.13.1) with Spark (1.4.0), and I am able to access all the databases and tables from Hive; my warehouse directory is hdfs://192.168.1.17:8020/user/hive/warehouse.
But when I try to save a DataFrame into Hive through the spark-shell (using the master) with the df.saveAsTable("df") function, I get this error:
15/07/03 14:48:59 INFO audit: ugi=user ip=unknown-ip-addr cmd=get_database: default
15/07/03 14:48:59 INFO HiveMetaStore: 0: get_table : db=default tbl=df
15/07/03 14:48:59 INFO audit: ugi=user ip=unknown-ip-addr cmd=get_table : db=default tbl=df
java.net.ConnectException: Call From bdiuser-Vostro-3800/127.0.1.1 to 192.168.1.19:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see: http://wiki.apache.org/hadoop/ConnectionRefused
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.net.NetUtils.wrapWithMessage(NetUtils.java:783)
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:730)
at org.apache.hadoop.ipc.Client.call(Client.java:1414)
at org.apache.hadoop.ipc.Client.call(Client.java:1363)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:190)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:103)
at com.sun.proxy.$Proxy14.getFileInfo(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getFileInfo(ClientNamenodeProtocolTranslatorPB.java:699)
at org.apache.hadoop.hdfs.DFSClient.getFileInfo(DFSClient.java:1762)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1124)
at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1120)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1120)
at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1398)
at org.apache.spark.sql.sources.InsertIntoHadoopFsRelation.run(commands.scala:78)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:68)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:87)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:939)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:939)
at org.apache.spark.sql.sources.ResolvedDataSource$.apply(ddl.scala:332)
at org.apache.spark.sql.hive.execution.CreateMetastoreDataSourceAsSelect.run(commands.scala:239)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:57)
at org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:68)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:88)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:148)
at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:87)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd$lzycompute(SQLContext.scala:939)
at org.apache.spark.sql.SQLContext$QueryExecution.toRdd(SQLContext.scala:939)
at org.apache.spark.sql.DataFrameWriter.saveAsTable(DataFrameWriter.scala:211)
at org.apache.spark.sql.DataFrame.saveAsTable(DataFrame.scala:1517)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:22)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:27)
at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:31)
at $iwC$$iwC$$iwC$$iwC.<init>(<console>:33)
at $iwC$$iwC$$iwC.<init>(<console>:35)
at $iwC$$iwC.<init>(<console>:37)
at $iwC.<init>(<console>:39)
at <init>(<console>:41)
at .<init>(<console>:45)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1338)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoop.processLine$1(SparkILoop.scala:657)
at org.apache.spark.repl.SparkILoop.innerLoop$1(SparkILoop.scala:665)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$loop(SparkILoop.scala:670)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:997)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:664)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:169)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:192)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:111)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.net.ConnectException: Connection refused
at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
at sun.nio.ch.SocketChannelImpl.finishConnect(SocketChannelImpl.java:744)
at org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:529)
at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:493)
at org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:604)
at org.apache.hadoop.ipc.Client$Connection.setupIOstreams(Client.java:699)
at org.apache.hadoop.ipc.Client$Connection.access$2800(Client.java:367)
at org.apache.hadoop.ipc.Client.getConnection(Client.java:1462)
at org.apache.hadoop.ipc.Client.call(Client.java:1381)
... 86 more
Going through this error, I found that the program tried a different host for the HDFS connection when saving the table (192.168.1.19 instead of the configured 192.168.1.17).
I also tried the spark-shell on different workers and got the same error.
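One quick check (a hedged diagnostic sketch, not from the original post): the Hive metastore stores fully qualified table locations, so an old NameNode address can linger in the metastore even after core-site.xml is corrected. This snippet prints the filesystem URI the Hadoop client actually resolves:

// Print the filesystem URI resolved from the client configuration; if this
// (or the location recorded in the Hive metastore) still points at
// 192.168.1.19 instead of 192.168.1.17, that would explain the ConnectException.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class CheckDefaultFs {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        System.out.println("fs.defaultFS = " + conf.get("fs.defaultFS"));
        System.out.println("resolved URI = " + FileSystem.get(conf).getUri());
    }
}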
Please find the example below:
// assumes result is an existing DataFrame and hiveTablePath / hiveTable are defined
import org.apache.spark.sql.SaveMode
val options = Map("path" -> hiveTablePath)
result.write.format("orc").partitionBy("partitiondate").options(options).mode(SaveMode.Append).saveAsTable(hiveTable)
I have explained this a little bit more in my blog.
With saveAsTable, the default location Spark saves to is controlled by the HiveMetastore (based on the docs). Another option is to use saveAsParquetFile and specify the path, then later register that path with your Hive metastore, or to use the new DataFrameWriter interface and specify the path option: write.format(source).mode(mode).options(options).saveAsTable(tableName).
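A minimal sketch of that DataFrameWriter route, using the Spark 1.4 Java API; the source table name and the exact warehouse path are hypothetical placeholders:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.DataFrame;
import org.apache.spark.sql.SaveMode;
import org.apache.spark.sql.hive.HiveContext;

public class SaveWithExplicitPath {
    public static void main(String[] args) {
        JavaSparkContext sc = new JavaSparkContext(new SparkConf().setAppName("SaveWithExplicitPath"));
        HiveContext hive = new HiveContext(sc.sc());
        DataFrame df = hive.table("some_source_table"); // hypothetical source table
        df.write()
          .format("parquet")
          .mode(SaveMode.Append)
          // pin the table location to the NameNode that actually serves the warehouse
          .option("path", "hdfs://192.168.1.17:8020/user/hive/warehouse/df")
          .saveAsTable("df");
        sc.stop();
    }
}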
You can write a Spark DataFrame to an existing Spark table.
Please find the example below:
df.write.mode("overwrite").saveAsTable("database.tableName")
I am trying to load data from Redshift onto HDFS (in Parquet format) using Sqoop (--as-parquetfile).
Has anyone else encountered this same error (see below)? If so, how did you go about fixing the problem?
Error: org.kitesdk.data.DatasetIOException: Cannot decode Avro value
at org.kitesdk.data.spi.SchemaUtil.fromString(SchemaUtil.java:419)
at org.kitesdk.data.spi.predicates.In.fromString(In.java:47)
at org.kitesdk.data.spi.predicates.Predicates.fromString(Predicates.java:85)
at org.kitesdk.data.spi.Constraints.fromQueryMap(Constraints.java:468)
at org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.loadOrCreateTaskAttemptView(DatasetKeyOutputFormat.java:577)
at org.kitesdk.data.mapreduce.DatasetKeyOutputFormat.getRecordWriter(DatasetKeyOutputFormat.java:426)
at org.apache.hadoop.mapred.MapTask$NewDirectOutputCollector.<init>(MapTask.java:644)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:341)
at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1628)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.io.EOFException
at org.apache.avro.io.BinaryDecoder.readInt(BinaryDecoder.java:153)
at org.apache.avro.io.BinaryDecoder.readIndex(BinaryDecoder.java:423)
at org.apache.avro.io.ResolvingDecoder.doAction(ResolvingDecoder.java:229)
at org.apache.avro.io.parsing.Parser.advance(Parser.java:88)
at org.apache.avro.io.ResolvingDecoder.readIndex(ResolvingDecoder.java:206)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:152)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:139)
at org.kitesdk.data.spi.SchemaUtil.fromString(SchemaUtil.java:417)
... 13 more
Thanks for any suggestions you may have.
I get the following exception:
java.lang.NullPointerException: java.lang.NullPointerException
at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.validateResourceRequest(SchedulerUtils.java:196)
at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.submitApplication(RMAppManager.java:253)
at org.apache.hadoop.yarn.server.resourcemanager.ClientRMService.submitApplication(ClientRMService.java:319)
at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationClientProtocolPBServiceImpl.submitApplication(ApplicationClientProtocolPBServiceImpl.java:163)
at org.apache.hadoop.yarn.proto.ApplicationClientProtocol$ApplicationClientProtocolService$2.callBlockingMethod(ApplicationClientProtocol.java:243)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.hadoop.yarn.ipc.RPCUtil.instantiateException(RPCUtil.java:53)
at org.apache.hadoop.yarn.ipc.RPCUtil.unwrapAndThrowException(RPCUtil.java:107)
at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.submitApplication(ApplicationClientProtocolPBClientImpl.java:185)
at myHandler.handle(myHandler.java:191)
at org.eclipse.jetty.server.handler.HandlerWrapper.handle(HandlerWrapper.java:97)
at org.eclipse.jetty.server.Server.handle(Server.java:459)
at org.eclipse.jetty.server.HttpChannel.handle(HttpChannel.java:280)
at org.eclipse.jetty.server.HttpConnection.onFillable(HttpConnection.java:229)
at org.eclipse.jetty.io.AbstractConnection$1.run(AbstractConnection.java:505)
at org.eclipse.jetty.util.thread.QueuedThreadPool.runJob(QueuedThreadPool.java:607)
at org.eclipse.jetty.util.thread.QueuedThreadPool$3.run(QueuedThreadPool.java:536)
at java.lang.Thread.run(Thread.java:724)
Caused by: org.apache.hadoop.ipc.RemoteException(java.lang.NullPointerException): java.lang.NullPointerException
at org.apache.hadoop.yarn.server.resourcemanager.scheduler.SchedulerUtils.validateResourceRequest(SchedulerUtils.java:196)
at org.apache.hadoop.yarn.server.resourcemanager.RMAppManager.submitApplication(RMAppManager.java:253)
at org.apache.hadoop.yarn.server.resourcemanager.ClientRMService.submitApplication(ClientRMService.java:319)
at org.apache.hadoop.yarn.api.impl.pb.service.ApplicationClientProtocolPBServiceImpl.submitApplication(ApplicationClientProtocolPBServiceImpl.java:163)
at org.apache.hadoop.yarn.proto.ApplicationClientProtocol$ApplicationClientProtocolService$2.callBlockingMethod(ApplicationClientProtocol.java:243)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:585)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:928)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2048)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2044)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1491)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2042)
at org.apache.hadoop.ipc.Client.call(Client.java:1347)
at org.apache.hadoop.ipc.Client.call(Client.java:1300)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:206)
at com.sun.proxy.$Proxy7.submitApplication(Unknown Source)
at org.apache.hadoop.yarn.api.impl.pb.client.ApplicationClientProtocolPBClientImpl.submitApplication(ApplicationClientProtocolPBClientImpl.java:182)
... 9 more
Did I set invalid resources? I'm trying to get this code working.
I've found out that the NullPointerException is thrown from SchedulerUtils.java at line 196, which is:
if (resReq.getCapability().getMemory() < 0 ||
resReq.getCapability().getMemory() > maximumResource.getMemory()) {
throw new InvalidResourceRequestException("Invalid resource request"
+ ", requested memory < 0"
+ ", or requested memory > max configured"
+ ", requestedMemory=" + resReq.getCapability().getMemory()
+ ", maxMemory=" + maximumResource.getMemory());
}
So probably I did not set the capability (memory) for the container. How should I do it?
I've tried with:
Resource capability = Records.newRecord(Resource.class);
capability.setMemory(amMemory);
amContainer.setResource(capability);
But ContainerLaunchContext amContainer has no method setResource.
I'm running Hadoop 2.2.0.
I had to give the capability to ApplicationSubmissionContext, not to ContainerLaunchContext, as the YARN example states.
Also, I've updated my Hadoop to 2.3.0.
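A minimal sketch of that fix against the Hadoop 2.x YARN client API (the amMemory value and the surrounding client setup are illustrative, and the ContainerLaunchContext wiring is elided):

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.yarn.api.records.ApplicationSubmissionContext;
import org.apache.hadoop.yarn.api.records.Resource;
import org.apache.hadoop.yarn.client.api.YarnClient;
import org.apache.hadoop.yarn.client.api.YarnClientApplication;
import org.apache.hadoop.yarn.util.Records;

public class SubmitWithCapability {
    public static void main(String[] args) throws Exception {
        YarnClient yarnClient = YarnClient.createYarnClient();
        yarnClient.init(new Configuration());
        yarnClient.start();

        YarnClientApplication app = yarnClient.createApplication();
        ApplicationSubmissionContext appContext = app.getApplicationSubmissionContext();

        int amMemory = 512; // MB; must be >= 0 and <= the scheduler's configured maximum
        Resource capability = Records.newRecord(Resource.class);
        capability.setMemory(amMemory);
        // the capability goes on the submission context, not on the ContainerLaunchContext
        appContext.setResource(capability);

        // ... set the ContainerLaunchContext, application name, queue, etc., then:
        yarnClient.submitApplication(appContext);
    }
}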
I have a lot of different files (*.doc, *.pdf and so on) that I wanted to process with MapReduce.
I put them in HDFS and then started a Java MapReduce program using Hue.
If the files are well formed and don't have brackets "(){}[]" in their names, all goes fine.
But if there is a file OPN_last_[age.PDF
I get these errors:
Failing Oozie Launcher, Main class [distr.fors.ru.Index], main() threw exception, Illegal file pattern: Unclosed character class near index 17
OPN_last_[age.PDF
^
java.io.IOException: Illegal file pattern: Unclosed character class near index 17
OPN_last_[age.PDF
^
at org.apache.hadoop.fs.GlobFilter.init(GlobFilter.java:70)
at org.apache.hadoop.fs.GlobFilter.<init>(GlobFilter.java:49)
at org.apache.hadoop.fs.FileSystem.globStatusInternal(FileSystem.java:1670)
at org.apache.hadoop.fs.FileSystem.globStatus(FileSystem.java:1627)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:211)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:248)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1063)
at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1080)
at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:992)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:945)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:945)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
at distr.fors.ru.Index.run(Index.java:78)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at distr.fors.ru.Index.main(Index.java:39)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:495)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:417)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
Caused by: java.util.regex.PatternSyntaxException: Unclosed character class near index 17
OPN_last_[age.PDF
^
at org.apache.hadoop.fs.GlobPattern.error(GlobPattern.java:167)
at org.apache.hadoop.fs.GlobPattern.set(GlobPattern.java:151)
at org.apache.hadoop.fs.GlobPattern.<init>(GlobPattern.java:42)
at org.apache.hadoop.fs.GlobFilter.init(GlobFilter.java:66)
... 32 more
If there is a file like this: {2011-01-27} (3769330).pdf
I get this error:
Input Pattern hdfs://fd-bigdata.distr.fors.ru:8020/{2011-01-27} (3769330).pdf matches 0 files
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:231)
at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:248)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:1063)
at org.apache.hadoop.mapred.JobClient.writeSplits(JobClient.java:1080)
at org.apache.hadoop.mapred.JobClient.access$600(JobClient.java:174)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:992)
at org.apache.hadoop.mapred.JobClient$2.run(JobClient.java:945)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:945)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:566)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:596)
at distr.fors.ru.Index.run(Index.java:76)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at distr.fors.ru.Index.main(Index.java:37)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
at java.lang.reflect.Method.invoke(Method.java:597)
at org.apache.oozie.action.hadoop.LauncherMapper.map(LauncherMapper.java:495)
at org.apache.hadoop.mapred.MapRunner.run(MapRunner.java:50)
at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:417)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:332)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapred.Child.main(Child.java:262)
I really need to process such files. What can I do to solve these problems?
P.S. I am using the latest CDH 4.4.0.
To deal with special characters in Java you should escape them with a double backslash '\\':
'[' => '\\['
'}' => '\\}'
This works for me in Java, in Pig and in Oozie. Hope it will also solve your problem.
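As a sketch of that escaping applied programmatically (the helper class and input path below are hypothetical; Hadoop's glob syntax treats \ as an escape character, so prefixing each metacharacter makes it literal):

import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;

public class GlobEscape {
    // Prefix every glob special character with a backslash so it is read literally.
    static String escapeGlob(String name) {
        StringBuilder sb = new StringBuilder();
        for (char c : name.toCharArray()) {
            if ("[]{}()*?\\".indexOf(c) >= 0) {
                sb.append('\\');
            }
            sb.append(c);
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance();
        // "OPN_last_[age.PDF" becomes "OPN_last_\[age.PDF" and no longer breaks the glob parser
        FileInputFormat.addInputPath(job, new Path("/input/" + escapeGlob("OPN_last_[age.PDF")));
    }
}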