Could not find bundle: org.eclipse.equinox.console - osgi

I'm trying to deploy Equinox v3.9.1, but I hit the following exception:
!ENTRY org.eclipse.osgi 4 0 2013-11-11 12:19:29.027
!MESSAGE Could not find bundle: org.eclipse.equinox.console
!STACK 0
org.osgi.framework.BundleException: Could not find bundle: org.eclipse.equinox.console
at org.eclipse.osgi.framework.internal.core.ConsoleManager.checkForConsoleBundle(ConsoleManager.java:211)
at org.eclipse.core.runtime.adaptor.EclipseStarter.startup(EclipseStarter.java:298)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:177)
at org.eclipse.core.runtime.adaptor.EclipseStarter.main(EclipseStarter.java:152)
My config.ini file is as follows:
osgi.bundles=org.eclipse.osgi@-1:start ,\
org.apache.felix.gogo.command@:start, \
org.apache.felix.gogo.runtime@:start, \
org.apache.felix.gogo.shell@:start, \
org.eclipse.equinox.console@:start, \
org.eclipse.equinox.cm@:start, \
org.eclipse.equinox.common@:start, \
I am sure the jar file is in the folder. Could anyone help me solve this error?
Thank you very much
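For reference, here is the same list with the stray spaces and the trailing comma removed. I have not verified this, but Equinox may fail to match a bundle's symbolic name when the osgi.bundles entries carry extra whitespace, so this is the first thing I would try:
osgi.bundles=org.eclipse.osgi@-1:start,\
org.apache.felix.gogo.command@:start,\
org.apache.felix.gogo.runtime@:start,\
org.apache.felix.gogo.shell@:start,\
org.eclipse.equinox.console@:start,\
org.eclipse.equinox.cm@:start,\
org.eclipse.equinox.common@:start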

Related

Running spark2 job through Oozie shell action?

As mentioned in the title, I'm trying to run a shell action that kicks off a Spark job, but unfortunately I'm consistently getting the following error:
19/05/10 14:03:39 ERROR AbstractRpcClient: SASL authentication failed.
The most likely cause is missing or invalid credentials. Consider
'kinit'.
javax.security.sasl.SaslException: GSS initiate failed [Caused by
GSSException: No valid credentials provided (Mechanism level: Failed
to find any Kerberos tgt)]
java.io.IOException: Could not set up IO Streams to
<hbaseregionserver>
Fri May 10 14:03:39 BST 2019,
RpcRetryingCaller{globalStartTime=1557493419339, pause=100,
retries=2}, org.apache.hadoop.hbase.ipc.FailedServerException: This
server is in the failed servers list: <hbaseregionserver>
I've been playing around trying to get the script to take in the Kerberos ticket, but I'm having no luck. As far as I can tell, it's related to the Oozie job not being able to pass the Kerberos ticket on. Any ideas why it's not picking it up? I'm at a loss. Related code is below.
Oozie workflow action
<action name="sparkJ" cred="hive2Cred">
<shell xmlns="uri:oozie:shell-action:0.1">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<configuration>
<property>
<name>mapred.job.queue.name</name>
<value>${oozieQueueName}</value>
</property>
</configuration>
<exec>run.sh</exec>
<file>/thePathToTheScript/run.sh#run.sh</file>
<file>/thePathToTheProperties/myp.properties#myp.properties</file>
<capture-output />
</shell>
<ok to="end" />
<error to="fail" />
</action>
Shell script
#!/bin/sh
export job_name=SPARK_JOB
export configuration=myp.properties
export num_executors=10
export executor_memory=1G
export queue=YARNQ
export max_executors=50
kinit -kt KEYTAB KPRINCIPAL
echo "[[[[[[[[[[[[[ Starting Job - name:${job_name},
configuration:${configuration} ]]]]]]]]]]]]]]"
/usr/hdp/current/spark2-client/bin/spark-submit \
--name ${job_name} \
--driver-java-options "-Dlog4j.configuration=file:./log4j.properties" \
--num-executors ${num_executors} \
--executor-memory ${executor_memory} \
--master yarn \
--keytab KEYTAB \
--principal KPRINCIPAL \
--supervise \
--deploy-mode cluster \
--queue ${queue} \
--files "./${configuration},./hbase-site.xml,./log4j.properties" \
--conf spark.driver.extraClassPath="/usr/hdp/current/hive-client/lib/datanucleus-*.jar:/usr/hdp/current/tez-client/*.jar" \
--conf spark.executor.extraJavaOptions="-Djava.security.auth.login.config=./jaas.conf -Dlog4j.configuration=file:./log4j.properties" \
--conf spark.executor.extraClassPath="/usr/hdp/current/hive-client/lib/datanucleus-*.jar:/usr/hdp/current/tez-client/*.jar" \
--conf spark.streaming.stopGracefullyOnShutdown=true \
--conf spark.dynamicAllocation.enabled=true \
--conf spark.shuffle.service.enabled=true \
--conf spark.dynamicAllocation.maxExecutors=${max_executors} \
--conf spark.streaming.concurrentJobs=2 \
--conf spark.streaming.backpressure.enabled=true \
--conf spark.yarn.security.tokens.hive.enabled=true \
--conf spark.yarn.security.tokens.hbase.enabled=true \
--conf spark.streaming.kafka.maxRatePerPartition=5000 \
--conf spark.streaming.backpressure.pid.maxRate=3000 \
--conf spark.streaming.backpressure.pid.minRate=200 \
--conf spark.streaming.backpressure.initialRate=5000 \
--jars /usr/hdp/current/hbase-client/lib/guava-12.0.1.jar,/usr/hdp/current/hbase-client/lib/hbase-common.jar,/usr/hdp/current/hbase-client/lib/hbase-client.jar,/usr/hdp/current/hbase-client/lib/hbase-protocol.jar,/usr/hdp/current/spark-client/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/current/spark-client/lib/datanucleus-rdbms-3.2.9.jar,/usr/hdp/current/spark-client/lib/datanucleus-core-3.2.10.jar \
--class myclass myjar.jar ./${configuration}
Many thanks for any help you can provide.
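One thing I suspect but haven't confirmed: the shell action runs on an arbitrary NodeManager node, so unless the keytab is shipped with the action (via an extra <file> element, the same way run.sh and myp.properties are), kinit has nothing to read. A small sketch of how I'd instrument the top of run.sh to check, using only standard Kerberos tools (KEYTAB and KPRINCIPAL are the placeholders from the script above):
# Hypothetical debugging lines: fail fast if no TGT could be acquired.
if ! kinit -kt KEYTAB KPRINCIPAL; then
  echo "kinit failed - is the keytab localized in this container?" >&2
  ls -l . >&2   # list the files Oozie actually shipped with the action
  exit 1
fi
klist           # print the credential cache spark-submit will pick up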

Problems running WebSphere Application Server for Developers

I've just installed WebSphere Application Server for Developers version 8.5.5.9, using IBM Installation Manager. When I try to create a profile I get an error that points me to a log file. Here's the beginning of it:
java.fullversion=JRE 1.6.0 IBM J9 2.6 Windows 7 amd64-64 Compressed References 20151222_283040 (JIT enabled, AOT enabled)
J9VM - R26_Java626_SR8_20151222_1616_B283040
JIT - tr.r11_20151209_107111.01
GC - R26_Java626_SR8_20151222_1616_B283040_CMPRSS
J9CL - 20151222_283040
BootLoader constants: OS=win32, ARCH=x86_64, WS=win32, NL=en_GB
!ENTRY org.eclipse.equinox.app 0 0 2016-07-27 11:53:16.101
!MESSAGE Product com.ibm.ws.pmt.views.standalone.standAloneWasTools could not be found.
!ENTRY org.eclipse.osgi 2 0 2016-07-27 11:53:16.146
!MESSAGE One or more bundles are not resolved because the following root constraints are not resolved:
!SUBENTRY 1 org.eclipse.osgi 2 0 2016-07-27 11:53:16.146
!MESSAGE Bundle websphere@plugins\com.ibm.ws.pmt.views_8.5.1.jar was not resolved.
!SUBENTRY 2 com.ibm.ws.pmt.views 2 0 2016-07-27 11:53:16.146
!MESSAGE Missing imported package com.ibm.ws.profile.utils_0.0.0.
!SUBENTRY 2 com.ibm.ws.pmt.views 2 0 2016-07-27 11:53:16.146
!MESSAGE Missing imported package com.ibm.ws.profile_0.0.0.
!SUBENTRY 2 com.ibm.ws.pmt.views 2 0 2016-07-27 11:53:16.146
!MESSAGE Missing imported package com.ibm.ws.wct.config.definitionLocations_0.0.0.
!SUBENTRY 2 com.ibm.ws.pmt.views 2 0 2016-07-27 11:53:16.146
!MESSAGE Missing imported package com.ibm.ws.install.configmanager.logging_0.0.0.
!SUBENTRY 2 com.ibm.ws.pmt.views 2 0 2016-07-27 11:53:16.146
!MESSAGE Missing imported package com.ibm.ws.wct.config.definitions_0.0.0.
!SUBENTRY 2 com.ibm.ws.pmt.views 2 0 2016-07-27 11:53:16.146
!MESSAGE Missing imported package com.ibm.wsspi.profile.registry_0.0.0.
!SUBENTRY 2 com.ibm.ws.pmt.views 2 0 2016-07-27 11:53:16.146
!MESSAGE Missing imported package com.ibm.wsspi.profile_0.0.0.
!SUBENTRY 1 org.eclipse.osgi 2 0 2016-07-27 11:53:16.147
!MESSAGE Bundle websphere@plugins\com.ibm.ws.pmt.tools_8.0.0.jar was not resolved.
!SUBENTRY 2 com.ibm.ws.pmt.tools 2 0 2016-07-27 11:53:16.147
!MESSAGE Missing imported package com.ibm.ws.install.configmanager.logging_0.0.0.
!SUBENTRY 1 org.eclipse.osgi 2 0 2016-07-27 11:53:16.147
!MESSAGE Bundle websphere@plugins\com.ibm.ws.pmt.views.standalone_8.0.0.jar was not resolved.
Clearly there are some missing jar files, but I have no idea why they weren't installed or where to get them from.
I should also mention that I'm completely new to WebSphere.
I was having this same issue, but found the solution (at least for me).
What I found was that my computer was imaged with @user.home pointing to a UNC path, which WebSphere does not like (see http://www-01.ibm.com/support/docview.wss?uid=swg21584343). When I got the error, a dialog was even displayed showing the location of the error log with a UNC name.
As shown in the IBM link in the previous paragraph, to resolve it I simply located my config.ini (in my case, C:\Program Files (x86)\IBM\WebSphere\AppServer\bin\ProfileManagement\eclipse64\configuration) and changed @user.home to c\:/temp in the following 2 keys:
• osgi.instance.area.default=@user.home/AppData/Lo . . .
• osgi.configuration.area=@user.home/AppData/Lo . . .
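To make the change concrete, the edit looks like this (the truncated values are exactly as in my file; only the @user.home prefix changes):
# before
osgi.instance.area.default=@user.home/AppData/Lo . . .
osgi.configuration.area=@user.home/AppData/Lo . . .
# after
osgi.instance.area.default=c\:/temp/AppData/Lo . . .
osgi.configuration.area=c\:/temp/AppData/Lo . . .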
Hope this helps!
G

Apache Spark's deployment issue (cluster-mode) with Hive

EDIT:
I'm developing a Spark application that reads data from multiple structured schemas, and I'm trying to aggregate the information from those schemas. My application runs well when I run it locally, but on a cluster I'm having trouble with the configuration (most probably with hive-site.xml) or with the submit-command arguments. I've looked at other related posts, but couldn't find a solution specific to my scenario. I've described in detail below which commands I tried and which errors I got. I'm new to Spark and I might be missing something trivial, but I can provide more information to support my question.
Original Question:
I've been trying to run my spark application in a 6-node Hadoop cluster bundled with HDP2.3 components.
Here is some component information that might be useful for you guys in suggesting solutions:
Cluster information: 6-node cluster:
128GB RAM
24 core
8TB HDD
Components used in the application
HDP - 2.3
Spark - 1.3.1
$ hadoop version:
Hadoop 2.7.1.2.3.0.0-2557
Subversion git@github.com:hortonworks/hadoop.git -r 9f17d40a0f2046d217b2bff90ad6e2fc7e41f5e1
Compiled by jenkins on 2015-07-14T13:08Z
Compiled with protoc 2.5.0
From source with checksum 54f9bbb4492f92975e84e390599b881d
Scenario:
I'm trying to use SparkContext and HiveContext to take full advantage of Spark's real-time querying of its data structures, such as DataFrames. The dependencies used in my application are:
<dependency> <!-- Spark dependency -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_2.10</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-sql_2.10</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>org.apache.spark</groupId>
<artifactId>spark-hive_2.10</artifactId>
<version>1.3.1</version>
</dependency>
<dependency>
<groupId>com.databricks</groupId>
<artifactId>spark-csv_2.10</artifactId>
<version>1.4.0</version>
</dependency>
Below are the submit commands and the corresponding error logs that I'm getting:
Submit Command1:
spark-submit --class working.path.to.Main \
--master yarn \
--deploy-mode cluster \
--num-executors 17 \
--executor-cores 8 \
--executor-memory 25g \
--driver-memory 25g \
--num-executors 5 \
application-with-all-dependencies.jar
Error Log1:
User class threw exception: java.lang.RuntimeException: Unable to instantiate org.apache.hadoop.hive.metastore.HiveMetaStoreClient
Submit Command2:
spark-submit --class working.path.to.Main \
--master yarn \
--deploy-mode cluster \
--num-executors 17 \
--executor-cores 8 \
--executor-memory 25g \
--driver-memory 25g \
--num-executors 5 \
--files /etc/hive/conf/hive-site.xml \
application-with-all-dependencies.jar
Error Log2:
User class threw exception: java.lang.NumberFormatException: For input string: "5s"
Since I don't have administrative permissions, I cannot modify the configuration. Well, I could contact the IT engineers and have changes made, but I'm looking for a solution that involves fewer changes to the configuration files, if possible!
Configuration changes were suggested here.
Then I tried passing various jar files as arguments as suggested in other discussion forums.
Submit Command3:
spark-submit --class working.path.to.Main \
--master yarn \
--deploy-mode cluster \
--num-executors 17 \
--executor-cores 8 \
--executor-memory 25g \
--driver-memory 25g \
--num-executors 5 \
--jars /usr/hdp/2.3.0.0-2557/spark/lib/datanucleus-api-jdo-3.2.6.jar,/usr/hdp/2.3.0.0-2557/spark/lib/datanucleus-core-3.2.10.jar,/usr/hdp/2.3.0.0-2557/spark/lib/datanucleus-rdbms-3.2.9.jar \
--files /etc/hive/conf/hive-site.xml \
application-with-all-dependencies.jar
Error Log3:
User class threw exception: java.lang.NumberFormatException: For input string: "5s"
I didn't understand what happened with the following command and couldn't analyze the error log.
Submit Command4:
spark-submit --class working.path.to.Main \
--master yarn \
--deploy-mode cluster \
--num-executors 17 \
--executor-cores 8 \
--executor-memory 25g \
--driver-memory 25g \
--num-executors 5 \
--jars /usr/hdp/2.3.0.0-2557/spark/lib/*.jar \
--files /etc/hive/conf/hive-site.xml \
application-with-all-dependencies.jar
Submit Log4:
Application application_1461686223085_0014 failed 2 times due to AM Container for appattempt_1461686223085_0014_000002 exited with exitCode: 10
For more detailed output, check application tracking page:http://cluster-host:XXXX/cluster/app/application_1461686223085_0014Then, click on links to logs of each attempt.
Diagnostics: Exception from container-launch.
Container id: container_e10_1461686223085_0014_02_000001
Exit code: 10
Stack trace: ExitCodeException exitCode=10:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:545)
at org.apache.hadoop.util.Shell.run(Shell.java:456)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:722)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:211)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
Container exited with a non-zero exit code 10
Failing this attempt. Failing the application.
Any other possible options? Any kind of help will be highly appreciated. Please let me know if you need any other information.
Thank you.
The solution explained here worked for my case. hive-site.xml resides in two locations, which can be confusing. Use --files /usr/hdp/current/spark-client/conf/hive-site.xml instead of --files /etc/hive/conf/hive-site.xml. I didn't have to add the jars for my configuration. Hope this helps someone struggling with a similar problem. Thanks.
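If it helps, my understanding (not verified) is that the "5s" NumberFormatException comes from the older Hive client bundled with Spark 1.3 choking on time-suffixed values such as 5s in the cluster's /etc/hive/conf/hive-site.xml, which would explain why pointing --files at Spark's own copy fixes it. For completeness, a sketch of the working submit command with only that change applied (I also dropped the duplicated --num-executors flag from the question):
spark-submit --class working.path.to.Main \
--master yarn \
--deploy-mode cluster \
--num-executors 5 \
--executor-cores 8 \
--executor-memory 25g \
--driver-memory 25g \
--files /usr/hdp/current/spark-client/conf/hive-site.xml \
application-with-all-dependencies.jar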

MyEclipse 10.7 fails to start when the -vm argument is changed to another JDK

Why does MyEclipse 10.7 64-bit fail to start when I change the "-vm" argument in its configuration file, myeclipse.ini? By the way, my operating system is Windows 8 64-bit, and my JDK works without any problem.
Could someone tell me the reason and help me solve the problem? Thanks a lot!
A good configuration looks like this:
#utf8 (do not remove)
#utf8 (do not remove)
-startup
../Common/plugins/org.eclipse.equinox.launcher_1.2.0.v20110502.jar
--launcher.library
../Common/plugins/org.eclipse.equinox.launcher.i18n.win32.win32.x86_64_4.2.0.v201201111650
-install
G:/Program Files/MyEclipse/MyEclipse 10
-vm
G:/Program Files/MyEclipse/Common/binary/com.sun.java.jdk.win32.x86_64_1.6.0.013/bin/javaw.exe
-vmargs
-Xmn512m
-Xms1024m
-Xmx1024m
-XX:SurvivorRatio=4
-XX:PermSize=256m
-XX:ReservedCodeCacheSize=128m
-Xverify:none
-Xnoclassgc
-XX:+UseParNewGC
-XX:+UseConcMarkSweepGC
-XX:MaxTenuringThreshold=10
-XX:ParallelGCThreads=4
-XX:CMSInitiatingOccupancyFraction=85
-XX:+DisableExplicitGC
-Dosgi.nls.warnings=ignore
A bad configuration:
#utf8 (do not remove)
#utf8 (do not remove)
-startup
../Common/plugins/org.eclipse.equinox.launcher_1.2.0.v20110502.jar
--launcher.library
../Common/plugins/org.eclipse.equinox.launcher.i18n.win32.win32.x86_64_4.2.0.v201201111650
-install
G:/Program Files/MyEclipse/MyEclipse 10
-vm
G:/Program Files/Java/jdk1.7.0_09/bin/javaw.exe
-vmargs
-Xmn512m
-Xms1024m
-Xmx1024m
-XX:SurvivorRatio=4
-XX:PermSize=256m
-XX:ReservedCodeCacheSize=128m
-Xverify:none
-Xnoclassgc
-XX:+UseParNewGC
-XX:+UseConcMarkSweepGC
-XX:MaxTenuringThreshold=10
-XX:ParallelGCThreads=4
-XX:CMSInitiatingOccupancyFraction=85
-XX:+DisableExplicitGC
-Dosgi.nls.warnings=ignore
Error log:
!SESSION 2013-01-26 23:51:39.452 -----------------------------------------------
eclipse.buildId=unknown
java.version=1.7.0_09
java.vendor=Oracle Corporation
BootLoader constants: OS=win32, ARCH=x86_64, WS=win32, NL=zh_CN
Command-line arguments: -os win32 -ws win32 -arch x86_64
!ENTRY org.eclipse.equinox.app 0 0 2013-01-26 23:51:41.347
!MESSAGE Product com.genuitec.myeclipse.product.ide could not be found.
!ENTRY org.eclipse.osgi 2 0 2013-01-26 23:51:50.086
!MESSAGE One or more bundles are not resolved because the following root constraints are not resolved:
!SUBENTRY 1 org.eclipse.osgi 2 0 2013-01-26 23:51:50.086
!MESSAGE Bundle reference:file:../Common/plugins/com.genuitec.eclipse.core_10.7.0.me201211011550.jar was not resolved.
!SUBENTRY 2 com.genuitec.eclipse.core 2 0 2013-01-26 23:51:50.087
!MESSAGE Missing host null_0.0.0.
!SUBENTRY 1 org.eclipse.osgi 2 0 2013-01-26 23:51:50.087
!MESSAGE Bundle reference:file:../Common/plugins/com.genuitec.eclipse.jniwrapper_9.0.0.me201105051700.jar was not resolved.
!SUBENTRY 2 com.genuitec.eclipse.jniwrapper 2 0 2013-01-26 23:51:50.087
!MESSAGE Missing host null_0.0.0.
!ENTRY org.eclipse.osgi 2 0 2013-01-26 23:51:52.696
!MESSAGE The following is a complete list of bundles which are not resolved, see the prior log entry for the root cause if it exists:
!SUBENTRY 1 org.eclipse.osgi 2 0 2013-01-26 23:51:52.696
!MESSAGE Bundle com.genuitec.eclipse.aspphp.core_9.0.0.me201108091322 [1224] was not resolved.
!SUBENTRY 2 com.genuitec.eclipse.aspphp.core 2 0 2013-01-26 23:51:52.696
!MESSAGE Missing required bundle com.genuitec.eclipse.core_0.0.0.
!SUBENTRY 1 org.eclipse.osgi 2 0 2013-01-26 23:51:52.696
!MESSAGE Bundle com.genuitec.eclipse.aspphp.ui_9.0.0.me201108091322 [1225] was not resolved.
!SUBENTRY 2 com.genuitec.eclipse.aspphp.ui 2 0 2013-01-26 23:51:52.696
!MESSAGE Missing required bundle com.genuitec.eclipse.core_0.0.0.
!SUBENTRY 2 com.genuitec.eclipse.aspphp.ui 2 0 2013-01-26 23:51:52.696
!MESSAGE Missing required bundle com.genuitec.eclipse.aspphp.core_0.0.0.
!SUBENTRY 2 com.genuitec.eclipse.aspphp.ui 2 0 2013-01-26 23:51:52.696
!MESSAGE Missing required bundle com.genuitec.eclipse.webdesigner3_0.0.0.
!SUBENTRY 2 com.genuitec.eclipse.aspphp.ui 2 0 2013-01-26 23:51:52.696
!MESSAGE Missing required bundle com.genuitec.eclipse.core.common_0.0.0.
!SUBENTRY 1 org.eclipse.osgi 2 0 2013-01-26 23:51:52.696
!MESSAGE Bundle com.genuitec.eclipse.ast.deploy.core_10.1.0.me201211011550 [1226] was not resolved.
!SUBENTRY 2 com.genuitec.eclipse.ast.deploy.core 2 0 2013-01-26 23:51:52.696
!MESSAGE Missing required bundle com.genuitec.eclipse.j2eedt.core_0.0.0.
(... many more errors ...)
MyEclipse 10.7 will work just fine if started with a 1.7 JDK. I can see nothing wrong with your configuration. A quick sanity check: are you sure the 1.7 VM you are pointing to is a 64-bit VM? You could confirm this by going to its bin folder on the command line and typing "java -version" to see if it reports a 64-bit VM.
Is there any specific reason you want to start with a 1.7 VM? Even if you leave the default VM argument in the ini file (1.6), you can still configure a 1.7 JRE/JDK on the JRE preference page and use that in your applications and/or set it as the workspace default JRE.
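For example, a quick check from a Windows command prompt, using the path from your ini file (the output shown is illustrative, not verbatim):
cd /d "G:\Program Files\Java\jdk1.7.0_09\bin"
java -version
A 64-bit VM reports a line like:
Java HotSpot(TM) 64-Bit Server VM (build ..., mixed mode)
If that line does not say "64-Bit", the JDK is 32-bit and will not load the 64-bit launcher library referenced in myeclipse.ini.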

How to setup a proper equinox installation

I googled this for hours but couldn't find anything useful.
I have developed some OSGi bundles and now I want to run them outside the Eclipse IDE in the Equinox container, but it always throws an exception:
!SESSION 2011-01-03 14:26:58.958 -----------------------------------------------
eclipse.buildId=unknown
java.version=1.6.0_20
java.vendor=Sun Microsystems Inc.
BootLoader constants: OS=win32, ARCH=x86_64, WS=win32, NL=de_CH
Framework arguments: -Dosgi.clean=true -Declipse.ignoreApp=true -Dosgi.noShutdown=true -console;
Command-line arguments: -consoleLog -Dosgi.clean=true -Declipse.ignoreApp=true -Dosgi.noShutdown=true -console;
!ENTRY org.eclipse.osgi 4 0 2011-01-03 14:26:59.567
!MESSAGE Error starting bundle: initial@reference:file:javax.transaction_1.1.1.v201006150915.jar/
!STACK 0
org.osgi.framework.BundleException: A fragment bundle cannot be started: javax.transaction_1.1.1.v201006150915 [49]
at org.eclipse.osgi.framework.internal.core.BundleFragment.startWorker(BundleFragment.java:228)
at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:284)
at org.eclipse.core.runtime.adaptor.EclipseStarter.startBundle(EclipseStarter.java:1133)
at org.eclipse.core.runtime.adaptor.EclipseStarter.startBundles(EclipseStarter.java:1126)
at org.eclipse.core.runtime.adaptor.EclipseStarter.loadBasicBundles(EclipseStarter.java:646)
at org.eclipse.core.runtime.adaptor.EclipseStarter.startup(EclipseStarter.java:301)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:175)
at org.eclipse.core.runtime.adaptor.EclipseStarter.main(EclipseStarter.java:150)
init dd core...
!ENTRY org.eclipse.osgi 4 0 2011-01-03 14:26:59.773
!MESSAGE Bundle javax.transaction_1.1.1.v201006150915 [49] is not active.
(the "init dd core..." text comes properly from my project.)
I have the following file structure:
+configuration
+config.ini
+ch.thobens.dd.commands_1.0.0.jar
+ch.thobens.dd.common.items_1.0.0.jar
+ch.thobens.dd.core_1.0.0.jar
+ch.thobens.dd.game_1.0.0.jar
+javax.transaction_1.1.1.v201006150915.jar
+org.eclipse.core.contenttype_3.4.100.v20100505-1235.jar
+org.eclipse.core.jobs_3.5.0.v20100515.jar
+org.eclipse.core.runtime.compatibility.auth_3.2.200.v20100517.jar
+org.eclipse.core.runtime.compatibility.registry_3.3.0.v20100520/runtime_registry_compatibility.jar
+org.eclipse.core.runtime_3.6.0.v20100505.jar
+org.eclipse.equinox.app_1.3.0.v20100512.jar
+org.eclipse.equinox.common_3.6.0.v20100503.jar
+org.eclipse.equinox.preferences_3.3.0.v20100503.jar
+org.eclipse.equinox.registry_3.5.0.v20100503.jar
+org.eclipse.osgi.services_3.2.100.v20100503.jar
+org.eclipse.osgi_3.6.0.v20100517.jar
and my config.ini file has the following contents:
osgi.bundles=javax.transaction_1.1.1.v201006150915.jar@start, org.eclipse.core.contenttype_3.4.100.v20100505-1235.jar@start, org.eclipse.core.jobs_3.5.0.v20100515.jar@start, org.eclipse.core.runtime.compatibility.auth_3.2.200.v20100517.jar@start,org.eclipse.core.runtime.compatibility.registry_3.3.0.v20100520/runtime_registry_compatibility.jar@start, org.eclipse.core.runtime_3.6.0.v20100505.jar@start, org.eclipse.equinox.app_1.3.0.v20100512.jar@start, org.eclipse.equinox.common_3.6.0.v20100503.jar@2:start, org.eclipse.equinox.preferences_3.3.0.v20100503.jar@start, org.eclipse.equinox.registry_3.5.0.v20100503.jar@start, org.eclipse.osgi.services_3.2.100.v20100503.jar@start, ch.thobens.dd.commands_1.0.0.jar@start, ch.thobens.dd.common.items_1.0.0.jar@start, ch.thobens.dd.core_1.0.0.jar@start, ch.thobens.dd.game_1.0.0.jar@1:start
eclipse.ignoreApp=true
osgi.noShutdown=true
The bundles that are listed here are the same bundles that are selected if I select "Add required Plug-ins" in the run configuration. If I run these bundles from the Eclipse PDE, it works fine.
Additionally, there is no difference between running the command
java -jar org.eclipse.osgi_3.6.0.v20100517.jar -consoleLog -Dosgi.clean=true -Declipse.ignoreApp=true -Dosgi.noShutdown=true
and when I use the equinox launcher (via eclipse.exe).
Thanks for any help
I found the solution (through the Eclipse product exporter):
javax.transaction_1.1.1.v201006150915.jar is not an ordinary OSGi bundle; it's an OSGi framework extension.
I had to change the config.ini file:
• Remove the entry for the javax.transaction bundle from the osgi.bundles property.
• Add the following line:
osgi.framework.extensions=javax.transaction_1.1.1.v201006150915.jar
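Putting it together, the relevant part of the resulting config.ini looks roughly like this (bundle list shortened with "..." for readability; the remaining entries stay exactly as above):
osgi.framework.extensions=javax.transaction_1.1.1.v201006150915.jar
osgi.bundles=org.eclipse.core.contenttype_3.4.100.v20100505-1235.jar@start, org.eclipse.core.jobs_3.5.0.v20100515.jar@start, ..., ch.thobens.dd.game_1.0.0.jar@1:start
eclipse.ignoreApp=true
osgi.noShutdown=true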
