Configuration Assistant (biee.py) Failed with Exit Value 1 in OBIEE 12c - oracle

During installation and configuration of OBIEE 12c (12.2.1.4), the configuration assistant failed with an error.
The following is the relevant portion of the installation log.
[2021-09-19T16:31:35.218+09:00] [bi] [NOTIFICATION] [] [oracle.bi.install.config.basesteps] [tid: 42] [ecid: 0000MTUM^f91Vctsoc7EDm1RztE2000004,0] Failed single shot step: BIEE with: Execution of [/u01/app/oracle/middleware/oracle_common/common/bin/wlst.sh, /u01/app/oracle/middleware/bi/modules/oracle.bi.configassistant/biee.py, /u01/app/oracle/middleware, /u01/app/oracle/middleware/user_projects/domains/bi, weblogic, Expanded, hxfpr371, 9502, 9503, ORACLE, oracle.jdbc.OracleDriver, jdbc:oracle:thin:@//:1521/, TST1, jdbc:oracle:thin:@//:1521/, ] failed with exit value 1
[2021-09-19T16:31:35.219+09:00] [bi] [ERROR] [] [oracle.bi.install.config.actions] [tid: 42] [ecid: 0000MTUM^f91Vctsoc7EDm1RztE2000004,0] Non-skipped failure during configuration action: Execution of [/u01/app/oracle/middleware/oracle_common/common/bin/wlst.sh, /u01/app/oracle/middleware/bi/modules/oracle.bi.configassistant/biee.py, /u01/app/oracle/middleware, /u01/app/oracle/middleware/user_projects/domains/bi, weblogic, Expanded, hxfpr371, 9502, 9503, ORACLE, oracle.jdbc.OracleDriver, jdbc:oracle:thin:@//:1521/, TST1, jdbc:oracle:thin:@//:1521/, ] failed with exit value 1[[
oracle.bi.exec.ExecutionStatusException: Execution of [/u01/app/oracle/middleware/oracle_common/common/bin/wlst.sh, /u01/app/oracle/middleware/bi/modules/oracle.bi.configassistant/biee.py, /u01/app/oracle/middleware, /u01/app/oracle/middleware/user_projects/domains/bi, weblogic, Expanded, hxfpr371, 9502, 9503, ORACLE, oracle.jdbc.OracleDriver, jdbc:oracle:thin:@//:1521/, TST1, jdbc:oracle:thin:@//:1521/, ] failed with exit value 1
at oracle.bi.exec.StdinProcess.runProcess(StdinProcess.java:106)
at oracle.bi.exec.ExecScript.executeScript(ExecScript.java:191)
at oracle.bi.exec.ExecScript.executeSynchronousScript(ExecScript.java:95)
at oracle.bi.exec.ExecWLST.executeWLSTScript(ExecWLST.java:62)
at oracle.bi.install.config.steps.WLSTStep.executeSingleShot(WLSTStep.java:55)
at oracle.bi.install.config.basesteps.SingleShotActionStep.execute(SingleShotActionStep.java:31)
at oracle.bi.install.config.basesteps.StepList.execute(StepList.java:85)
at oracle.bi.install.config.actions.BIConfigAction.doExecute(BIConfigAction.java:127)
at oracle.as.install.engine.modules.configuration.client.ConfigAction.execute(ConfigAction.java:405)
at oracle.as.install.engine.modules.configuration.action.TaskPerformer.run(TaskPerformer.java:88)
at oracle.as.install.engine.modules.configuration.action.TaskPerformer.startConfigAction(TaskPerformer.java:108)
at oracle.as.install.engine.modules.configuration.action.ActionRequest.perform(ActionRequest.java:15)
at oracle.as.install.engine.modules.configuration.action.RequestQueue.performSequentialExecution(RequestQueue.java:284)
at oracle.as.install.engine.modules.configuration.action.RequestQueue.perform(RequestQueue.java:260)
at oracle.as.install.engine.modules.configuration.standard.StandardConfigActionManager.start(StandardConfigActionManager.java:185)
at oracle.as.install.engine.modules.configuration.boot.ConfigurationExtension.kickstart(ConfigurationExtension.java:82)
at oracle.as.install.engine.modules.configuration.ConfigurationModule.run(ConfigurationModule.java:87)
at java.lang.Thread.run(Thread.java:748)

The issue is most likely due to a Java security configuration parameter. Try this before installation:
go to « C:\Java\jdk1.8.0_192\jre\lib\security »
** replace this path with your own JDK path
edit the file « java.security » and replace this line:
securerandom.source=file:/dev/random
with this line:
securerandom.source=file:/dev/urandom
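On a Linux host like the one in the log above, a minimal sketch of the same edit (the JDK path here is an assumption; adjust it to the JDK your installer uses):
# Back up java.security, then point the JVM's entropy source at /dev/urandom
# so configuration steps do not block waiting for /dev/random entropy.
JAVA_HOME=/u01/app/java/jdk1.8.0_192    # assumption: adjust to your JDK path
cp "$JAVA_HOME/jre/lib/security/java.security" \
   "$JAVA_HOME/jre/lib/security/java.security.bak"
sed -i 's|^securerandom.source=file:/dev/random$|securerandom.source=file:/dev/urandom|' \
    "$JAVA_HOME/jre/lib/security/java.security"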
For more details, see Doc ID 2221805.1.

Related

Hadoop Spark SQL insert failing

I'm trying to insert around 13M rows into a new table, but I'm getting the following error:
22/12/09 19:33:56 ERROR Utils: Aborting task
java.lang.AssertionError: assertion failed: Created file counter 11 is beyond max value 10
at scala.Predef$.assert(Predef.scala:223)
at org.apache.spark.sql.execution.datasources.DynamicPartitionDataWriter.$anonfun$increaseCreatedFileAndCheck$1(FileFormatDataWriter.scala:191)
at scala.runtime.java8.JFunction1$mcVI$sp.apply(JFunction1$mcVI$sp.java:23)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.sql.execution.datasources.DynamicPartitionDataWriter.increaseCreatedFileAndCheck(FileFormatDataWriter.scala:188)
at org.apache.spark.sql.execution.datasources.DynamicPartitionDataWriter.write(FileFormatDataWriter.scala:277)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$executeTask$1(FileFormatWriter.scala:280)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1473)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:288)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$write$15(FileFormatWriter.scala:211)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:131)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:498)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:501)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
22/12/09 19:33:57 ERROR FileFormatWriter: Job job_202212091917352650741377131539872_0020 aborted.
22/12/09 19:33:57 ERROR Executor: Exception in task 0.1 in stage 20.0 (TID 26337)
org.apache.spark.SparkException: Task failed while writing rows.
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:298)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$write$15(FileFormatWriter.scala:211)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)
at org.apache.spark.scheduler.Task.run(Task.scala:131)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:498)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:501)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.AssertionError: assertion failed: Created file counter 11 is beyond max value 10
at scala.Predef$.assert(Predef.scala:223)
at org.apache.spark.sql.execution.datasources.DynamicPartitionDataWriter.$anonfun$increaseCreatedFileAndCheck$1(FileFormatDataWriter.scala:191)
at scala.runtime.java8.JFunction1$mcVI$sp.apply(JFunction1$mcVI$sp.java:23)
at scala.Option.foreach(Option.scala:407)
at org.apache.spark.sql.execution.datasources.DynamicPartitionDataWriter.increaseCreatedFileAndCheck(FileFormatDataWriter.scala:188)
at org.apache.spark.sql.execution.datasources.DynamicPartitionDataWriter.write(FileFormatDataWriter.scala:277)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.$anonfun$executeTask$1(FileFormatWriter.scala:280)
at org.apache.spark.util.Utils$.tryWithSafeFinallyAndFailureCallbacks(Utils.scala:1473)
at org.apache.spark.sql.execution.datasources.FileFormatWriter$.executeTask(FileFormatWriter.scala:288)
The insert operation looks like the following:
insert overwrite table fake_table_txt partition(partition_name)
select id, name, type, description from ( inner query )
I'm a Hadoop beginner and I have no idea what may be causing this.
Could anybody please give me some direction?
After struggling a little, I was told that increasing the "files per task" property would do the trick:
set spark.sql.maxCreatedFilesPerTask = 15;
It previously defaulted to 10.
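As a minimal sketch, assuming the job is launched with spark-submit, the same property can be passed at submit time; note that spark.sql.maxCreatedFilesPerTask is taken verbatim from the answer above and does not appear in stock Apache Spark, so it may be distribution-specific:
# Apply the property for the whole job instead of via SET in the session.
spark-submit \
  --conf spark.sql.maxCreatedFilesPerTask=15 \
  insert_job.py    # hypothetical script that runs the INSERT OVERWRITE above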

Apache NiFi Stateless - not able to set param of controller service (DBCPConnectionPool 1.10.0)

I am following the NiFi 1.10 stateless guideline to create a simple process group that executes a SQL statement against a MySQL DB. I have put the necessary parameters of the DB controller service into a parameter context.
It works well on the NiFi canvas. Then I added the flow to the registry and prepared a JSON parameter file, stateless-simpledb.json:
{
  "registryUrl": "http://localhost:18080",
  "bucketId": "cac8f127-e328-45c1-a4cb-0e03dc837ceb",
  "flowId": "cc2753f2-78f3-4449-a2fd-343dfeaafe15",
  "flowVersion": "3",
  "parameters": {
    "lastIngestId": "20000",
    "mysql-jdbc-driver-name": "com.mysql.jdbc.Driver",
    "db-user": "root",
    "db-password": "password",
    "db-con-url": "jdbc:mysql://localhost:3306/mms",
    "jdbc-jar-path": "/program/jdbc/mysql-connector-java.jar"
  }
}
and run the one-off command:
/program/nifi/bin/nifi.sh stateless RunFromRegistry Once --file /app/poc/nifi-stateless/conf/stateless-simpledb.json
It raises an error:
=== FlowFileRepository Type ===
org.apache.nifi.controller.repository.RocksDBFlowFileRepository
org.apache.nifi:nifi-framework-nar:1.10.0 || /program/nifi-1.10.0/work/stateless-nars/nifi-framework-nar-1.10.0.nar-unpacked
org.apache.nifi.controller.repository.WriteAheadFlowFileRepository
org.apache.nifi:nifi-framework-nar:1.10.0 || /program/nifi-1.10.0/work/stateless-nars/nifi-framework-nar-1.10.0.nar-unpacked
org.apache.nifi.controller.repository.VolatileFlowFileRepository
org.apache.nifi:nifi-framework-nar:1.10.0 || /program/nifi-1.10.0/work/stateless-nars/nifi-framework-nar-1.10.0.nar-unpacked
=== End FlowFileRepository types ===
23:32:32.626 [main] INFO org.apache.nifi.stateless.bootstrap.ExtensionDiscovery - Successfully discovered extensions in 4411 milliseconds
23:32:32.633 [main] DEBUG org.apache.nifi.stateless.core.ComponentFactory - Setting context class loader to org.apache.nifi.nar.InstanceClassLoader@50fa5938 (parent = org.apache.nifi.nar.NarClassLoader[/program/nifi-1.10.0/work/stateless-nars/nifi-dbcp-service-nar-1.10.0.nar-unpacked]) to create org.apache.nifi.dbcp.DBCPConnectionPool
23:32:32.647 [main] DEBUG org.apache.nifi.parameter.ExpressionLanguageAwareParameterParser - For input #{jdbc-jar-path} found 1 Parameter references: [org.apache.nifi.parameter.StandardParameterReference@2d3eecda]
23:32:32.650 [main] DEBUG org.apache.nifi.parameter.ExpressionLanguageAwareParameterParser - For input /program/jdbc/mysql-connector-java.jar found 0 Parameter references: []
23:32:32.651 [main] DEBUG org.apache.nifi.parameter.ExpressionLanguageAwareParameterParser - For input 500 millis found 0 Parameter references: []
23:32:32.651 [main] DEBUG org.apache.nifi.parameter.ExpressionLanguageAwareParameterParser - For input 8 found 0 Parameter references: []
23:32:32.651 [main] DEBUG org.apache.nifi.parameter.ExpressionLanguageAwareParameterParser - For input 0 found 0 Parameter references: []
23:32:32.651 [main] DEBUG org.apache.nifi.parameter.ExpressionLanguageAwareParameterParser - For input 8 found 0 Parameter references: []
23:32:32.651 [main] DEBUG org.apache.nifi.parameter.ExpressionLanguageAwareParameterParser - For input -1 found 0 Parameter references: []
23:32:32.651 [main] DEBUG org.apache.nifi.parameter.ExpressionLanguageAwareParameterParser - For input -1 found 0 Parameter references: []
23:32:32.651 [main] DEBUG org.apache.nifi.parameter.ExpressionLanguageAwareParameterParser - For input 30 mins found 0 Parameter references: []
23:32:32.651 [main] DEBUG org.apache.nifi.parameter.ExpressionLanguageAwareParameterParser - For input -1 found 0 Parameter references: []
Exception in thread "main" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.nifi.bootstrap.RunStatelessNiFi.main(RunStatelessNiFi.java:69)
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.nifi.StatelessNiFi.main(StatelessNiFi.java:103)
... 5 more
Caused by: java.lang.RuntimeException: Failed to enable Controller Service {id=691ecc97-ff46-3a5e-8aad-37dc568bc247, name=MYSQL-MMS-stateless-test, type=class org.apache.nifi.dbcp.DBCPConnectionPool} because validation failed: ['Database Connection URL' is invalid because Database Connection URL is required, 'Database Driver Class Name' is invalid because Database Driver Class Name is required]
at org.apache.nifi.stateless.core.StatelessControllerServiceLookup.enableControllerServices(StatelessControllerServiceLookup.java:133)
at org.apache.nifi.stateless.core.StatelessFlow.<init>(StatelessFlow.java:153)
at org.apache.nifi.stateless.core.StatelessFlow.createAndEnqueueFromJSON(StatelessFlow.java:469)
at org.apache.nifi.stateless.runtimes.Program.runLocal(Program.java:133)
at org.apache.nifi.stateless.runtimes.Program.launch(Program.java:67)
... 10 more
It seems the Apache NiFi stateless function failed to set the controller service's properties even though the service is in "process group" scope.
Would anyone have any advice?
As mentioned in the comments, this appears to be a known problem with the validation of controller services.
It can be avoided by using NiFi 1.12 and above, as it was fixed in the following Jira: https://issues.apache.org/jira/plugins/servlet/mobile#issue/NIFI-7380
Though I am not entirely sure of this, it may also simply indicate that your controller service is not configured correctly. That would be worth double-checking.
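As a minimal sketch of the upgrade route, assuming the 1.10-style stateless entry point and the paths from the question carry over unchanged to the newer release (both assumptions), the same one-off command would simply be rerun under NiFi 1.12+:
# Rerun the flow under a NiFi release that includes the NIFI-7380 fix.
/program/nifi-1.12.1/bin/nifi.sh stateless RunFromRegistry Once \
  --file /app/poc/nifi-stateless/conf/stateless-simpledb.json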

Failed to execute goal org.scalatest:scalatest-maven-plugin:2.0.0:test (test)

I am trying to build my Scala code using a Maven POM, and I have the below plugin in my POM:
<plugin>
  <groupId>org.scalatest</groupId>
  <artifactId>scalatest-maven-plugin</artifactId>
  <version>2.0.0</version>
</plugin>
When I try to do a Maven build from Scala IDE (clean install), I am facing the below error:
Failed to execute goal org.scalatest:scalatest-maven-plugin:2.0.0:test (test)
But when I checked my local .m2 folder, I could see the relevant jar, pom, and sha1 files available there:
scalatest-maven-plugin-2.0.0.jar
scalatest-maven-plugin-2.0.0.pom
If anyone has encountered a similar issue, could you please help me fix it? Thanks.
Detailed error log:
--- scalatest-maven-plugin:2.0.0:test (test) @ streaming-base ---
Discovery starting.
Discovery completed in 6 seconds, 175 milliseconds.
Run starting. Expected test count is: 12
ZookeeperLockSpec:
ZookeeperLock
- should create lock path
[NIOServerCxn.Factory:0.0.0.0/0.0.0.0:21810] WARN org.apache.zookeeper.server.NIOServerCnxn - caught end of stream exception
EndOfStreamException: Unable to read additional data from client sessionid 0x1698ba52c8e0000, likely client has closed socket
at org.apache.zookeeper.server.NIOServerCnxn.doIO(NIOServerCnxn.java:228)
at org.apache.zookeeper.server.NIOServerCnxnFactory.run(NIOServerCnxnFactory.java:208)
at java.lang.Thread.run(Unknown Source)
- should not fail if lock path exists !!! IGNORED !!!
- should correctly release lock if job fails
ResourceUtilSpec:
Resource file reader
- correctly read and parse text file in resources
- fail with FileNotFoundException if there's no file in resources
String array extractor
- return expected value if index within bounds (full version)
- return default value if index is out of bounds
- return expected value if index within bounds (shortened version, empty string default)
- return empty string if index is out of bounds
JdbcToBqExportToolSpec:
[ScalaTest-main-running-DiscoverySuite] WARN org.apache.hadoop.util.Shell - Did not find winutils.exe: {}
java.io.FileNotFoundException: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset. -see https://wiki.apache.org/hadoop/WindowsProblems
at org.apache.hadoop.util.Shell.fileNotFoundException(Shell.java:528)
at org.apache.hadoop.util.Shell.getHadoopHomeDir(Shell.java:549)
at org.apache.hadoop.util.Shell.getQualifiedBin(Shell.java:572)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:669)
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.findHadoopBinary(HiveConf.java:2327)
at org.apache.hadoop.hive.conf.HiveConf$ConfVars.<clinit>(HiveConf.java:365)
at org.apache.hadoop.hive.conf.HiveConf.<clinit>(HiveConf.java:105)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Unknown Source)
at org.apache.spark.util.Utils$.classForName(Utils.scala:230)
at org.apache.spark.sql.SparkSession$.hiveClassesArePresent(SparkSession.scala:1063)
at org.apache.spark.sql.SparkSession$Builder.enableHiveSupport(SparkSession.scala:842)
at com.kohls.bigdata.util.SparkSpecBase$class.beforeAll(SparkSpecBase.scala:29)
at com.kohls.bigdata.dp.streaming.jdbc.JdbcToBqExportToolSpec.beforeAll(JdbcToBqExportToolSpec.scala:11)
at org.scalatest.BeforeAndAfterAll$class.liftedTree1$1(BeforeAndAfterAll.scala:212)
at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:210)
at com.kohls.bigdata.dp.streaming.jdbc.JdbcToBqExportToolSpec.run(JdbcToBqExportToolSpec.scala:11)
at org.scalatest.Suite$class.callExecuteOnSuite$1(Suite.scala:1210)
at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1257)
at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1255)
at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
at org.scalatest.Suite$class.runNestedSuites(Suite.scala:1255)
at org.scalatest.tools.DiscoverySuite.runNestedSuites(DiscoverySuite.scala:30)
at org.scalatest.Suite$class.run(Suite.scala:1144)
at org.scalatest.tools.DiscoverySuite.run(DiscoverySuite.scala:30)
at org.scalatest.tools.SuiteRunner.run(SuiteRunner.scala:45)
at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$1.apply(Runner.scala:1346)
at org.scalatest.tools.Runner$$anonfun$doRunRunRunDaDoRunRun$1.apply(Runner.scala:1340)
at scala.collection.immutable.List.foreach(List.scala:392)
at org.scalatest.tools.Runner$.doRunRunRunDaDoRunRun(Runner.scala:1340)
at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1011)
at org.scalatest.tools.Runner$$anonfun$runOptionallyWithPassFailReporter$2.apply(Runner.scala:1010)
at org.scalatest.tools.Runner$.withClassLoaderAndDispatchReporter(Runner.scala:1506)
at org.scalatest.tools.Runner$.runOptionallyWithPassFailReporter(Runner.scala:1010)
at org.scalatest.tools.Runner$.main(Runner.scala:827)
at org.scalatest.tools.Runner.main(Runner.scala)
Caused by: java.io.FileNotFoundException: HADOOP_HOME and hadoop.home.dir are unset.
at org.apache.hadoop.util.Shell.checkHadoopHomeInner(Shell.java:448)
at org.apache.hadoop.util.Shell.checkHadoopHome(Shell.java:419)
at org.apache.hadoop.util.Shell.<clinit>(Shell.java:496)
... 33 more
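A likely contributor, per the FileNotFoundException above, is that the suites start a Hive-enabled SparkSession on Windows, where Hadoop's Shell class requires HADOOP_HOME and winutils.exe. A minimal sketch of the usual workaround, assuming a Git Bash shell and an example install path:
# Point HADOOP_HOME at a directory containing bin/winutils.exe for the
# Hadoop version on the test classpath, then rerun the build.
export HADOOP_HOME=/c/hadoop            # assumption: adjust to your layout
export PATH="$HADOOP_HOME/bin:$PATH"
mvn clean install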

How to get the jms quickstart packaged with JBoss-Fuse-6.1.0.redhat-355 to work?

The Karaf output is as follows:
JBossFuse:karaf@root> features:addurl mvn:org.jboss.quickstarts.fuse/jms/6.1.0.redhat-355/xml/features
JBossFuse:karaf@root> features:install quickstart-jms
[Fatal Error] :6:3: The element type "hr" must be terminated by the matching end-tag "</hr>".
Error executing command: The element type "hr" must be terminated by the matching end-tag "</hr>".
I am not sure where this HTML element is being used.
log:display-exception reveals:
2016-02-12 07:36:24,571 | INFO | l Console Thread | Console | ? ? | 17 - org.apache.karaf.shell.console - 2.3.0.redhat-610355 | Exception caught while executing command
org.xml.sax.SAXParseException: The element type "hr" must be terminated by the matching end-tag "</hr>".
JBossFuse:karaf@root> log:display-exception
org.xml.sax.SAXParseException: The element type "hr" must be terminated by the matching end-tag "</hr>".
at org.apache.xerces.parsers.DOMParser.parse(Unknown Source)[:]
at org.apache.xerces.jaxp.DocumentBuilderImpl.parse(Unknown Source)[:]
at javax.xml.parsers.DocumentBuilder.parse(Unknown Source)[:2.3.0.redhat-610355]
at org.apache.karaf.features.internal.FeatureValidationUtil.validate(FeatureValidationUtil.java:52)[23:org.apache.karaf.features.core:2.3.0.redhat-610355]
at org.apache.karaf.features.internal.FeaturesServiceImpl.validateRepository(FeaturesServiceImpl.java:215)[23:org.apache.karaf.features.core:2.3.0.redhat-610355]
at org.apache.karaf.features.internal.FeaturesServiceImpl.internalAddRepository(FeaturesServiceImpl.java:256)[23:org.apache.karaf.features.core:2.3.0.redhat-610355]
at org.apache.karaf.features.internal.FeaturesServiceImpl.getFeatures(FeaturesServiceImpl.java:1015)[23:org.apache.karaf.features.core:2.3.0.redhat-610355]
at org.apache.karaf.features.internal.FeaturesServiceImpl.getFeature(FeaturesServiceImpl.java:973)[23:org.apache.karaf.features.core:2.3.0.redhat-610355]
at org.apache.karaf.features.internal.FeaturesServiceImpl.installFeature(FeaturesServiceImpl.java:395)[23:org.apache.karaf.features.core:2.3.0.redhat-610355]
at org.apache.karaf.features.command.InstallFeatureCommand.doExecute(InstallFeatureCommand.java:62)[37:org.apache.karaf.features.command:2.3.0.redhat-610355]
at org.apache.karaf.features.command.FeaturesCommandSupport.doExecute(FeaturesCommandSupport.java:41)[37:org.apache.karaf.features.command:2.3.0.redhat-610355]
at org.apache.karaf.shell.console.OsgiCommandSupport.execute(OsgiCommandSupport.java:39)[17:org.apache.karaf.shell.console:2.3.0.redhat-610355]
at org.apache.felix.gogo.commands.basic.AbstractCommand.execute(AbstractCommand.java:35)[17:org.apache.karaf.shell.console:2.3.0.redhat-610355]
at org.apache.felix.gogo.runtime.CommandProxy.execute(CommandProxy.java:78)[18:org.apache.felix.gogo.runtime:0.11.0.redhat-610355]
at org.apache.felix.gogo.runtime.Closure.executeCmd(Closure.java:477)[18:org.apache.felix.gogo.runtime:0.11.0.redhat-610355]
at org.apache.felix.gogo.runtime.Closure.executeStatement(Closure.java:403)[18:org.apache.felix.gogo.runtime:0.11.0.redhat-610355]
at org.apache.felix.gogo.runtime.Pipe.run(Pipe.java:108)[18:org.apache.felix.gogo.runtime:0.11.0.redhat-610355]
at org.apache.felix.gogo.runtime.Closure.execute(Closure.java:183)[18:org.apache.felix.gogo.runtime:0.11.0.redhat-610355]
at org.apache.felix.gogo.runtime.Closure.execute(Closure.java:120)[18:org.apache.felix.gogo.runtime:0.11.0.redhat-610355]
at org.apache.felix.gogo.runtime.CommandSessionImpl.execute(CommandSessionImpl.java:89)[18:org.apache.felix.gogo.runtime:0.11.0.redhat-610355]
at org.apache.karaf.shell.console.jline.Console.run(Console.java:189)[17:org.apache.karaf.shell.console:2.3.0.redhat-610355]
at org.apache.karaf.shell.console.jline.DelayedStarted.run(DelayedStarted.java:61)[17:org.apache.karaf.shell.console:2.3.0.redhat-610355]
JBossFuse:karaf@root>
If anyone can help me get this working, it would be much appreciated.
TIA
I got most of the quickstarts (including the JMS one) to work on this version: jboss-fuse-6.0.0.redhat-024. My JDK was 1.7.0_60 and was set as JAVA_HOME.
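On the parse error itself: a SAXParseException about an unterminated "hr" element usually means the features URL resolved to an HTML (error) page rather than the features XML. A minimal sketch for inspecting what was actually cached for those Maven coordinates (the local-repository path is an assumption for a default setup):
# If the cached features file starts with HTML rather than <features>,
# the repository returned an error page and the URL/version should be checked.
find ~/.m2/repository/org/jboss/quickstarts/fuse/jms/6.1.0.redhat-355 \
  -type f -name "*features*" -exec head -n 5 {} \;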

gDebugger not working

I downloaded the latest version of gDebugger (currently 5.8.1), but I can't make it work. I tried to run the provided examples, but nothing happens.
I tried to load the teapot example, but when I run it (F5), I just see a command-line prompt which disappears immediately. (I also tried the same actions that "Timothy Smith" described in the first post of http://devgurus.amd.com/thread/158943, but the result was the same.)
I'm on Win7 64-bit, with a GTX 470.
Edit: This is the error I get in the log:
[20:10:36] [2495245290] [ERROR] [4448] [gaAPIToSpyConnector::initialize] [src\gaAPIToSpyConnector.cpp] [217] [Assertion failure (rcAPISocket)]
## [20:10:36] [2495245327] [ERROR] [4448] [gaLaunchDebuggedProcess] [src\gaGRApiFunctions.cpp] [388] [Assertion failure (rcAPIConnection)]
## [20:10:36] [2495821910] [ERROR] [4448] [osPipeSocket::close] [src\win32\osPipeSocket.cpp] [124] [Cannot close pipe (pipe type: osPipeSocketServer)]
## [20:10:36] [2495822046] [ERROR] [4448] [osPipeSocket::close] [src\win32\osPipeSocket.cpp] [137] [Cannot close pipe (pipe type: osPipeSocketServer)]
## [20:10:36] [2495822236] [ERROR] [4448] [osPipeSocketServer::open] [src\win32\osPipeSocketServer.cpp] [76] [Assertion failure (_incomingPipe != ((HANDLE)(LONG_PTR)-1))]
## [20:10:36] [2495822377] [ERROR] [4448] [osPipeSocketServer::open] [src\win32\osPipeSocketServer.cpp] [97] [Assertion failure (_outgoingPipe != ((HANDLE)(LONG_PTR)-1))]
## [20:10:36] [2495822453] [ERROR] [4448] [gaAPIToSpyConnector::initialize] [src\gaAPIToSpyConnector.cpp] [217] [Assertion failure (rcAPISocket)]
## [20:10:36] [2495822520] [ERROR] [4448] [gaLaunchDebuggedProcess] [src\gaGRApiFunctions.cpp] [388] [Assertion failure (rcAPIConnection)]
gDebugger is now an AMD product, and all new versions are available on their developer site. The current version is 6.2.
Edit:
The tool is now called CodeXL and is available here
