OSGi: waiting for a timeout during installation of a bundle (tinybundles)

I have a problem running a Pax Exam test under Karaf.
During the installation of a bundle:
2015-05-20 14:16:42,644 | DEBUG | FelixStartLevel | BundleManager | 1 - org.apache.karaf.features.core - 3.0.0 | Checking file:/tmp/ops4j-store-anonymous-568188393605841387/tinybundles_c3beb06a2df3da75e08a1ae1a44c832cfbd034c0.bin
it hangs for some time in
org.apache.karaf.features.internal.BundleManager#waitForUrlHandler
looking for a
(&(objectClass=org.osgi.service.url.URLStreamHandlerService)(url.handler.protocol=file))
Here is the thread dump:
at java.lang.Object.wait(Object.java:-1)
at org.osgi.util.tracker.ServiceTracker.waitForService(ServiceTracker.java:499)
at org.apache.karaf.features.internal.BundleManager.waitForUrlHandler(BundleManager.java:229)
at org.apache.karaf.features.internal.BundleManager.getInputStreamForBundle(BundleManager.java:186)
at org.apache.karaf.features.internal.BundleManager.doInstallBundleIfNeeded(BundleManager.java:99)
at org.apache.karaf.features.internal.BundleManager.installBundleIfNeeded(BundleManager.java:90)
at org.apache.karaf.features.internal.FeaturesServiceImpl.doInstallFeature(FeaturesServiceImpl.java:523)
at org.apache.karaf.features.internal.FeaturesServiceImpl.installFeatures(FeaturesServiceImpl.java:383)
at org.apache.karaf.features.internal.BootFeaturesInstaller.installBootFeatures(BootFeaturesInstaller.java:92)
at org.apache.karaf.features.internal.BootFeaturesInstaller.start(BootFeaturesInstaller.java:76)
at sun.reflect.NativeMethodAccessorImpl.invoke0(NativeMethodAccessorImpl.java:-1)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.aries.blueprint.utils.ReflectionUtils.invoke(ReflectionUtils.java:297)
at org.apache.aries.blueprint.container.BeanRecipe.invoke(BeanRecipe.java:958)
at org.apache.aries.blueprint.container.BeanRecipe.runBeanProcInit(BeanRecipe.java:712)
at org.apache.aries.blueprint.container.BeanRecipe.internalCreate2(BeanRecipe.java:824)
at org.apache.aries.blueprint.container.BeanRecipe.internalCreate(BeanRecipe.java:787)
at org.apache.aries.blueprint.di.AbstractRecipe$1.call(AbstractRecipe.java:79)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at org.apache.aries.blueprint.di.AbstractRecipe.create(AbstractRecipe.java:88)
at org.apache.aries.blueprint.container.BlueprintRepository.createInstances(BlueprintRepository.java:245)
at org.apache.aries.blueprint.container.BlueprintRepository.createAll(BlueprintRepository.java:183)
at org.apache.aries.blueprint.container.BlueprintContainerImpl.instantiateEagerComponents(BlueprintContainerImpl.java:681)
at org.apache.aries.blueprint.container.BlueprintContainerImpl.doRun(BlueprintContainerImpl.java:378)
at org.apache.aries.blueprint.container.BlueprintContainerImpl.run(BlueprintContainerImpl.java:269)
- locked <0xbb5> (a java.util.concurrent.atomic.AtomicBoolean)
at org.apache.aries.blueprint.container.BlueprintExtender.createContainer(BlueprintExtender.java:276)
at org.apache.aries.blueprint.container.BlueprintExtender.createContainer(BlueprintExtender.java:245)
at org.apache.aries.blueprint.container.BlueprintExtender.modifiedBundle(BlueprintExtender.java:235)
at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.customizerModified(BundleHookBundleTracker.java:500)
at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.customizerModified(BundleHookBundleTracker.java:433)
at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$AbstractTracked.track(BundleHookBundleTracker.java:725)
at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$Tracked.bundleChanged(BundleHookBundleTracker.java:463)
at org.apache.aries.util.tracker.hook.BundleHookBundleTracker$BundleEventHook.event(BundleHookBundleTracker.java:422)
at org.apache.felix.framework.util.SecureAction.invokeBundleEventHook(SecureAction.java:1103)
at org.apache.felix.framework.util.EventDispatcher.createWhitelistFromHooks(EventDispatcher.java:695)
at org.apache.felix.framework.util.EventDispatcher.fireBundleEvent(EventDispatcher.java:483)
at org.apache.felix.framework.Felix.fireBundleEvent(Felix.java:4403)
at org.apache.felix.framework.Felix.startBundle(Felix.java:2092)
at org.apache.felix.framework.Felix.setActiveStartLevel(Felix.java:1291)
at org.apache.felix.framework.FrameworkStartLevelImpl.run(FrameworkStartLevelImpl.java:304)
at java.lang.Thread.run(Thread.java:745)
I didn't find any bundle that provides that service, so my test hangs for 30 seconds each time a tinybundles bundle is installed.
I started Pax Exam's Karaf manually from the target folder and ran service:list to see which bundles provide the URLStreamHandlerService service, but there was none with url.handler.protocol=file:
[org.osgi.service.url.URLStreamHandlerService]
service.id = 2
Provided by :
System Bundle (0)
Used by:
Apache Felix File Install (17)
Apache Aries JMX Core (225)
[org.osgi.service.url.URLStreamHandlerService]
service.id = 51
url.handler.protocol = feature
Provided by :
Apache Karaf :: Deployer :: Features (12)
[org.osgi.service.url.URLStreamHandlerService]
service.id = 15
url.handler.protocol = jardir
Provided by :
Apache Felix File Install (17)
[org.osgi.service.url.URLStreamHandlerService]
service.id = 42
url.handler.protocol = spring
Provided by :
Apache Karaf :: Deployer :: Spring (27)
[org.osgi.service.url.URLStreamHandlerService]
service.id = 45
url.handler.protocol = blueprint
Provided by :
Apache Karaf :: Deployer :: Blueprint (30)
[org.osgi.service.url.URLStreamHandlerService]
service.id = 4
url.handler.protocol = [wrap]
Provided by :
OPS4J Pax Url - wrap: (37)
[org.osgi.service.url.URLStreamHandlerService]
So the question is: how can I avoid waiting for the timeout? Do we really need the URLStreamHandlerService service, and if so, how can I provide it?
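For what it's worth, the standard OSGi pattern for providing a URLStreamHandlerService (which is what would satisfy the filter Karaf is tracking) is to register one with the url.handler.protocol service property. The sketch below assumes that registering a handler for the file protocol is acceptable in your setup; normally file is served by the JRE itself, so needing this at all suggests a Karaf-side issue rather than a genuinely missing bundle:

```java
import java.io.IOException;
import java.io.InputStream;
import java.net.URL;
import java.net.URLConnection;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.Hashtable;

import org.osgi.framework.BundleActivator;
import org.osgi.framework.BundleContext;
import org.osgi.service.url.AbstractURLStreamHandlerService;
import org.osgi.service.url.URLConstants;
import org.osgi.service.url.URLStreamHandlerService;

// Sketch: registers a URLStreamHandlerService for the "file" protocol,
// matching the filter
// (&(objectClass=org.osgi.service.url.URLStreamHandlerService)(url.handler.protocol=file))
public class FileUrlHandlerActivator implements BundleActivator {

    @Override
    public void start(BundleContext context) {
        Hashtable<String, Object> props = new Hashtable<>();
        // URLConstants.URL_HANDLER_PROTOCOL == "url.handler.protocol"
        props.put(URLConstants.URL_HANDLER_PROTOCOL, new String[] { "file" });

        context.registerService(URLStreamHandlerService.class,
                new AbstractURLStreamHandlerService() {
                    @Override
                    public URLConnection openConnection(URL url) throws IOException {
                        // Delegate to NIO directly so we don't recurse back
                        // into the OSGi URL handler for the same protocol.
                        return new URLConnection(url) {
                            @Override
                            public void connect() { /* nothing to do */ }

                            @Override
                            public InputStream getInputStream() throws IOException {
                                return Files.newInputStream(Paths.get(getURL().getPath()));
                            }
                        };
                    }
                }, props);
    }

    @Override
    public void stop(BundleContext context) {
        // The framework unregisters the service when the bundle stops.
    }
}
```

You would provision this as an extra bundle in the Pax Exam configuration so it is active before the features service starts resolving bundles.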

Related

Corda 4.0: How to retrieve transactions performed by old CorDapp?

We are building a POC using Corda 4.0 and a Spring Boot web server.
The following are the versions of the Corda platform, Spring Boot server, and other essential dependencies used to build the POC:
cordaReleaseGroup=net.corda
cordaVersion=4.0
gradlePluginsVersion=4.0.45
kotlinVersion=1.2.71
junitVersion=4.12
quasarVersion=0.7.10
spring_version = '4.3.11.RELEASE'
spring_boot_version = '2.0.2.RELEASE'
spring_boot_gradle_plugin_version = '2.1.1.RELEASE'
jvmTarget = "1.8"
log4jVersion =2.11.2
platformVersion=4
slf4jVersion=1.7.25
nettyVersion=4.1.22.Final
The CorDapp developed for the POC has four nodes:
Notary Node
Provider Company Node
Consumer Company 1 Node
Consumer Company 1 Sub Contact Node
There are 10 transactions, from flows between the above nodes, in the node databases.
Later, due to a new requirement, new fields were added to the transaction schema, and the CorDapp was rebuilt and redistributed to the nodes.
When the web application is run, I cannot view the 10 transactions.
The node throws the following errors:
Sun May 09 07:32:28 IST 2021>>> [ERROR] 07:50:17+0530 [Node thread-1] proxies.ExceptionSerialisingRpcOpsProxy.log - Error during RPC invocation [errorCode=1dl0tb4, moreInformationAt=https://errors.corda.net/OS/4.0/1dl0tb4] {actor_id=user1, actor_owning_identity=OU=Company, O=MyComp, L=Houston, C=US, actor_store_id=NODE_CONFIG, fiber-id=10000001, flow-id=3a590f6a-9920-4381-ab53-1e4b5465b1c8, invocation_id=f309aad5-487a-4b04-be6d-f90f83c7def7, invocation_timestamp=2021-05-09T02:20:16.497Z, origin=user1, session_id=909b6b61-f9fc-4a0f-b521-a0c2eed39516, session_timestamp=2021-05-09T02:11:19.347Z, thread-id=197}
[ERROR] 07:50:17+0530 [Node thread-1] proxies.ExceptionSerialisingRpcOpsProxy.log - Error during RPC invocation [errorCode=1uso5y8, moreInformationAt=https://errors.corda.net/OS/4.0/1uso5y8] {actor_id=user1, actor_owning_identity=OU=Company, O=MyComp, L=Houston, C=US, actor_store_id=NODE_CONFIG, fiber-id=10000001, flow-id=3a590f6a-9920-4381-ab53-1e4b5465b1c8, invocation_id=f309aad5-487a-4b04-be6d-f90f83c7def7, invocation_timestamp=2021-05-09T02:20:16.497Z, origin=user1, session_id=909b6b61-f9fc-4a0f-b521-a0c2eed39516, session_timestamp=2021-05-09T02:11:19.347Z, thread-id=197}
The node log shows the following error:
Mandatory property bndArea of local type is not present in remote type - did someone remove a property from the schema without considering old clients?
W 07:50:19 119 ThrowableSerializer.fromProxy - Unexpected exception de-serializing throwable: net.corda.core.node.services.VaultQueryException. Converting to CordaRuntimeException.
java.lang.reflect.InvocationTargetException: null
at sun.reflect.GeneratedConstructorAccessor63.newInstance(Unknown Source) ~[?:?]
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) ~[?:1.8.0_271]
at java.lang.reflect.Constructor.newInstance(Constructor.java:423) ~[?:1.8.0_271]
at net.corda.serialization.internal.amqp.custom.ThrowableSerializer.fromProxy(ThrowableSerializer.kt:71) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.custom.ThrowableSerializer.fromProxy(ThrowableSerializer.kt:14) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.CustomSerializer$Proxy.readObject(CustomSerializer.kt:179) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.DeserializationInput.readObject$serialization(DeserializationInput.kt:182) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.DeserializationInput.readObjectOrNull$serialization(DeserializationInput.kt:147) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.DescribedTypeReadStrategy.readProperty(ComposableTypePropertySerializer.kt:202) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.ComposableTypePropertySerializer.readProperty(ComposableTypePropertySerializer.kt) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.ComposableObjectReader$readObject$$inlined$ifThrowsAppend$lambda$1.invoke(ObjectSerializer.kt:140) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.ComposableObjectReader$readObject$$inlined$ifThrowsAppend$lambda$1.invoke(ObjectSerializer.kt:122) ~[corda-serialization-4.0.jar:?]
at kotlin.sequences.TransformingSequence$iterator$1.next(Sequences.kt:149) ~[kotlin-stdlib-1.2.71.jar:1.2.71-release-64 (1.2.71)]
at net.corda.serialization.internal.amqp.ComposableObjectReader.readObject(ObjectSerializer.kt:219) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.ComposableObjectSerializer.readObject(ObjectSerializer.kt:91) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.CustomSerializer$Proxy.readObject(CustomSerializer.kt:178) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.DeserializationInput.readObject$serialization(DeserializationInput.kt:182) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.DeserializationInput.readObjectOrNull$serialization(DeserializationInput.kt:147) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.DeserializationInput$deserialize$1.invoke(DeserializationInput.kt:124) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.DeserializationInput.des(DeserializationInput.kt:99) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.DeserializationInput.deserialize(DeserializationInput.kt:119) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.amqp.AbstractAMQPSerializationScheme.deserialize(AMQPSerializationScheme.kt:225) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.SerializationFactoryImpl$deserialize$1$1.invoke(SerializationScheme.kt:102) ~[corda-serialization-4.0.jar:?]
at net.corda.core.serialization.SerializationFactory.withCurrentContext(SerializationAPI.kt:72) ~[corda-core-4.0.jar:?]
at net.corda.serialization.internal.SerializationFactoryImpl$deserialize$1.invoke(SerializationScheme.kt:102) ~[corda-serialization-4.0.jar:?]
at net.corda.serialization.internal.SerializationFactoryImpl$deserialize$1.invoke(SerializationScheme.kt:70) ~[corda-serialization-4.0.jar:?]
at net.corda.core.serialization.SerializationFactory.asCurrent(SerializationAPI.kt:86) ~[corda-core-4.0.jar:?]
at net.corda.serialization.internal.SerializationFactoryImpl.deserialize(SerializationScheme.kt:102) ~[corda-serialization-4.0.jar:?]
at net.corda.nodeapi.RPCApi$ServerToClient$Companion.fromClientMessage(RPCApi.kt:378) ~[corda-node-api-4.0.jar:?]
at net.corda.client.rpc.internal.RPCClientProxyHandler.artemisMessageHandler(RPCClientProxyHandler.kt:309) ~[corda-rpc-4.0.jar:?]
at net.corda.client.rpc.internal.RPCClientProxyHandler.access$artemisMessageHandler(RPCClientProxyHandler.kt:75) ~[corda-rpc-4.0.jar:?]
at net.corda.client.rpc.internal.RPCClientProxyHandler$initSessions$1.invoke(RPCClientProxyHandler.kt:519) ~[corda-rpc-4.0.jar:?]
at net.corda.client.rpc.internal.RPCClientProxyHandlerKt$sam$org_apache_activemq_artemis_api_core_client_MessageHandler$0.onMessage(RPCClientProxyHandler.kt) ~[corda-rpc-4.0.jar:?]
at org.apache.activemq.artemis.core.client.impl.ClientConsumerImpl.callOnMessage(ClientConsumerImpl.java:1002) ~[artemis-core-client-2.6.2.jar:2.6.2]
at org.apache.activemq.artemis.core.client.impl.ClientConsumerImpl.access$400(ClientConsumerImpl.java:50) ~[artemis-core-client-2.6.2.jar:2.6.2]
at org.apache.activemq.artemis.core.client.impl.ClientConsumerImpl$Runner.run(ClientConsumerImpl.java:1125) ~[artemis-core-client-2.6.2.jar:2.6.2]
at org.apache.activemq.artemis.utils.actors.OrderedExecutor.doTask(OrderedExecutor.java:42) ~[artemis-commons-2.6.2.jar:2.6.2]
at org.apache.activemq.artemis.utils.actors.OrderedExecutor.doTask(OrderedExecutor.java:31) ~[artemis-commons-2.6.2.jar:2.6.2]
at org.apache.activemq.artemis.utils.actors.ProcessorBase.executePendingTasks(ProcessorBase.java:66) ~[artemis-commons-2.6.2.jar:2.6.2]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) ~[?:1.8.0_271]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) ~[?:1.8.0_271]
at org.apache.activemq.artemis.utils.ActiveMQThreadFactory$1.run(ActiveMQThreadFactory.java:118) ~[artemis-commons-2.6.2.jar:2.6.2]
Caused by: java.lang.IllegalArgumentException: Parameter specified as non-null is null: method net.corda.core.node.services.VaultQueryException.<init>, parameter description
at net.corda.core.node.services.VaultQueryException.<init>(VaultService.kt) ~[corda-core-4.0.jar:?]
... 43 more
E 08:07:59 74 [dispatcherServlet].log - Servlet.service() for servlet [dispatcherServlet] in context with path [] threw exception [Request processing failed; nested exception is net.corda.core.CordaRuntimeException] with root cause
net.corda.core.CordaRuntimeException: null
Question:
Is it possible to retrieve the transactions that have the old schema? Please guide.
UPDATED
I am using the following code to retrieve the records from the vault:
val vaultStatesAndRefs = services.proxy.vaultQueryBy<State>().states
val vaultStates = vaultStatesAndRefs.map { it.state.data }
resultset = vaultStates.map { it.toJson() }.toString()
Some recommendations:
How are you running the schema queries? I don't see that here.
If you're still building on Corda 4, be aware that Corda 5 comes out later this year and we're currently on 4.8, so it may be worth upgrading.
If you're looking for guidance on structuring the app and updating your dependencies, take a look at the CorDapp template: https://github.com/corda/cordapp-template-kotlin
As for how to query: you might want to query your schema within the flow to see whether it's being stored the way you expect.
You should be able to put something like this in your flow (assuming it's a linear state):
// Query the vault for the state with a specific linear ID
List<UUID> listOfLinearIds = Arrays.asList(stateLinearId.getId());
QueryCriteria queryCriteria = new QueryCriteria.LinearStateQueryCriteria(null, listOfLinearIds);
Vault.Page<IOUState> results = getServiceHub().getVaultService().queryBy(IOUState.class, queryCriteria);
StateAndRef<IOUState> inputStateAndRefToSettle = results.getStates().get(0);
IOUState inputStateToSettle = results.getStates().get(0).getState().getData();
Some resources:
https://docs.corda.net/docs/corda-os/4.7/api-vault-query.html
(example vault query) https://github.com/corda/samples-java/blob/master/Advanced/obligation-cordapp/workflows/src/main/java/net/corda/samples/obligation/flows/IOUSettleFlow.java#L56-L61
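On the original error itself ("Mandatory property bndArea of local type is not present in remote type"): that is Corda's AMQP evolution machinery complaining that the new schema added a mandatory field that the already-recorded states don't carry. One documented way to make an added field backward compatible is to either make it nullable, or keep a constructor matching the old shape annotated with @DeprecatedConstructorForDeserialization that supplies a default. A hedged sketch, with a hypothetical class name and a made-up default value:

```java
import net.corda.core.serialization.CordaSerializable;
import net.corda.core.serialization.DeprecatedConstructorForDeserialization;

// Hypothetical serializable class illustrating Corda default class evolution.
@CordaSerializable
public class ExampleSchemaData {
    private final String name;
    private final String bndArea; // field added in the new schema

    // Current constructor, used for newly recorded states.
    public ExampleSchemaData(String name, String bndArea) {
        this.name = name;
        this.bndArea = bndArea;
    }

    // Constructor matching the OLD schema; used when deserializing states
    // recorded before bndArea existed. The version number orders this
    // class's evolution history.
    @DeprecatedConstructorForDeserialization(version = 1)
    public ExampleSchemaData(String name) {
        this(name, "UNKNOWN"); // default for the missing mandatory field
    }

    public String getName() { return name; }
    public String getBndArea() { return bndArea; }
}
```

With a shape like this in place, old states should deserialize through the annotated constructor instead of failing the vault query.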

Ebean and H2 configuration issue with Play Framework 2.5

I am trying to develop with Play Framework 2.5, and I cannot get the database connection right. I am using an H2 database with the Ebean plugin 3.0.2 as an ORM. I tried several options for the entries in application.conf, based on the information found on the Play Framework website and many posts here. Please find below my configuration files, followed by the error trace:
**Application.conf**
play.db {
# The combination of these two settings results in "db.default" as the
#default JDBC pool:
config = "db"
default = "default"
prototype {
# Sets a fixed JDBC connection pool size of 50
#hikaricp.minimumIdle = 50
#hikaricp.maximumPoolSize = 50
}
}
db {
default.hikaricp.dataSourceClassName = org.h2.jdbcx.JdbcDataSource
default.driver = org.h2.Driver
default.url = "jdbc:h2:mem:play"
default.username = sa
default.password = ""
ebean.default = ["models.*"]
#play.ebean.default.dataSource = default
default.logSql=true
}
**Plugins.sbt**
addSbtPlugin("com.typesafe.play" % "sbt-plugin" % "2.5.10")
// Web plugins
addSbtPlugin("com.typesafe.sbt" % "sbt-coffeescript" % "1.0.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-less" % "1.1.0")
addSbtPlugin("com.typesafe.sbt" % "sbt-jshint" % "1.0.4")
addSbtPlugin("com.typesafe.sbt" % "sbt-rjs" % "1.0.8")
addSbtPlugin("com.typesafe.sbt" % "sbt-digest" % "1.1.1")
addSbtPlugin("com.typesafe.sbt" % "sbt-mocha" % "1.1.0")
addSbtPlugin("org.irundaia.sbt" % "sbt-sassify" % "1.4.6")
addSbtPlugin("com.typesafe.sbt" % "sbt-play-enhancer" % "1.1.0")
enablePlugins(PlayEbean).
addSbtPlugin("com.typesafe.sbt" % "sbt-play-ebean" % "3.0.2")
addSbtPlugin("com.typesafe.sbteclipse" % "sbteclipse-plugin" % "5.0.1")
**Build.sbt**
name := """Institut"""
version := "1.0-SNAPSHOT"
lazy val Institut = (project in file(".")).enablePlugins(PlayJava, PlayEbean)
scalaVersion := "2.11.7"
libraryDependencies ++= Seq(
javaJdbc,
cache,
javaWs,
evolutions
)
**Console output**
[info] application - Creating Pool for datasource 'ebean'
[error] c.z.h.HikariConfig - HikariPool-1 - dataSource or dataSourceClassName or jdbcUrl is required.
[info] application - Creating Pool for datasource 'ebean'
[error] c.z.h.HikariConfig - HikariPool-2 - dataSource or dataSourceClassName or jdbcUrl is required.
[info] application - Creating Pool for datasource 'ebean'
[error] c.z.h.HikariConfig - HikariPool-3 - dataSource or dataSourceClassName or jdbcUrl is required.
[info] application - Creating Pool for datasource 'ebean'
[error] c.z.h.HikariConfig - HikariPool-4 - dataSource or dataSourceClassName or jdbcUrl is required.
[error] application -
! #736eodoo7 - Internal server error, for (GET) [/] ->
play.api.Configuration$$anon$1: Configuration error[Cannot connect to database [ebean]]
at play.api.Configuration$.configError(Configuration.scala:154)
at play.api.Configuration.reportError(Configuration.scala:806)
at play.api.db.DefaultDBApi$$anonfun$connect$1.apply(DefaultDBApi.scala:48)
at play.api.db.DefaultDBApi$$anonfun$connect$1.apply(DefaultDBApi.scala:42)
at scala.collection.immutable.List.foreach(List.scala:381)
at play.api.db.DefaultDBApi.connect(DefaultDBApi.scala:42)
at play.api.db.DBApiProvider.get$lzycompute(DBModule.scala:72)
at play.api.db.DBApiProvider.get(DBModule.scala:62)
at play.api.db.DBApiProvider.get(DBModule.scala:58)
at
Caused by: play.api.Configuration$$anon$1: Configuration error[dataSource or dataSourceClassName or jdbcUrl is required.]
at play.api.Configuration$.configError(Configuration.scala:154)
at play.api.PlayConfig.reportError(Configuration.scala:996)
at play.api.db.HikariCPConnectionPool.create(HikariCPModule.scala:70)
at play.api.db.PooledDatabase.createDataSource(Databases.scala:199)
at play.api.db.DefaultDatabase.dataSource$lzycompute(Databases.scala:123)
at play.api.db.DefaultDatabase.dataSource(Databases.scala:121)
at play.api.db.DefaultDatabase.getConnection(Databases.scala:142)
at play.api.db.DefaultDatabase.getConnection(Databases.scala:138)
at play.api.db.DefaultDBApi$$anonfun$connect$1.apply(DefaultDBApi.scala:44)
at play.api.db.DefaultDBApi$$anonfun$connect$1.apply(DefaultDBApi.scala:42)
Caused by: java.lang.IllegalArgumentException: dataSource or dataSourceClassName or jdbcUrl is required.
at com.zaxxer.hikari.HikariConfig.validate(HikariConfig.java:786)
at play.api.db.HikariCPConfig.toHikariConfig(HikariCPModule.scala:141)
at
at scala.util.Try$.apply(Try.scala:192)
at play.api.db.HikariCPConnectionPool.create(HikariCPModule.scala:54)
at play.api.db.PooledDatabase.createDataSource(Databases.scala:199)
at play.api.db.DefaultDatabase.dataSource$lzycompute(Databases.scala:123)
at play.api.db.DefaultDatabase.dataSource(Databases.scala:121)
at play.api.db.DefaultDatabase.getConnection(Databases.scala:142)
I had the same error message with the H2 configuration, which was resolved by adding the dataSourceClassName; however, I have tried to use default as the datasource for Ebean without success.
Thank you!
ebean.default = ["models.*"] is not a member of the key db.default; you have to place it outside of db.
According to your log, Ebean thinks that this key is another database:
play.api.Configuration$$anon$1: Configuration error[Cannot connect to database [ebean]]
More info here:
https://www.playframework.com/documentation/2.5.x/JavaDatabase#H2-database-engine-connection-properties
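Concretely, based on the configuration above, moving the ebean key out of the db block would look something like this (a sketch; keep or drop the hikaricp line as your pool setup requires):

```conf
db {
  default.driver = org.h2.Driver
  default.url = "jdbc:h2:mem:play"
  default.username = sa
  default.password = ""
  default.logSql = true
}

# Ebean model mapping is a top-level key, not part of db
ebean.default = ["models.*"]
```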

Issues in creating J2C Java beans using the Batch Import Utility in WAS 8.5

I'm facing issues creating J2C Java beans using the Batch Import Utility.
In my project I have a custom Ant build file which invokes the importBatch.bat file of the WSAD 5.1 plugin. In WAS 5.1 it works fine, but in WAS 8.5, using Rational Application Developer 9.5, the same utility throws a NullPointerException.
As per my analysis, WAS 5.1 has the "com.ibm.etools.ctc.import.batch_5.1.1" plugin, which is used to perform the above task. I searched for this plugin in WAS 8.5 and found that it has been changed to "com.ibm.adapter.j2c.command_6.2.700.v20150320_0049". It has the same importBatch.bat file.
I also had to change importBatch.bat for the current RAD 9.5 Equinox launcher jar, because RAD 9.5 has no startup.jar.
The original entry in the RAD 9.5 importBatch.bat file:
"%eclipse_root%\jre\bin\java" -Xmx256M -verify -Dimport.batch.cp="%currentCodepage%" -cp "%eclipse_root%\startup.jar" org.eclipse.core.launcher.Main -clean -data "%workspace%" -application com.ibm.adapter.j2c.command.BatchImport -file=%import_file% -style=%generation_style%
The changes I've made:
"%eclipse_root%\jdk\jre\bin\java" -Xmx256M -verify -Dimport.batch.cp="%currentCodepage%" -cp "%eclipse_root%\plugins\org.eclipse.equinox.launcher_1.3.0.v20140415-2008.jar" org.eclipse.equinox.launcher.Main -clean -data "%workspace%" -application com.ibm.adapter.j2c.command.BatchImport -file=%import_file% -style=%generation_style%
I've gone through the IBM Knowledge Center on this topic, but still no success:
http://www.ibm.com/support/knowledgecenter/SS4JCV_7.5.5/com.ibm.etools.j2c.doc/topics/tusingbatchimporter.html
Please have a look at the exception I'm getting in the workspace logs.
!SESSION 2016-08-20 14:07:55.714 -----------------------------------------------
eclipse.buildId=unknown
java.fullversion=JRE 1.8.0 IBM J9 2.8 Windows 7 amd64-64 Compressed References 20150630_255633 (JIT enabled, AOT enabled)
J9VM - R28_jvm.28_20150630_1742_B255633
JIT - tr.r14.java_20150625_95081.01
GC - R28_jvm.28_20150630_1742_B255633_CMPRSS
J9CL - 20150630_255633
BootLoader constants: OS=win32, ARCH=x86_64, WS=win32, NL=en_GB
!ENTRY org.eclipse.osgi 4 0 2016-08-20 14:07:57.403
!MESSAGE The -clean (osgi.clean) option was not successful. Unable to clean the storage area: C:\Program Files\IBM\SDP\configuration\org.eclipse.osgi
!ENTRY org.eclipse.equinox.registry 2 0 2016-08-20 14:10:05.082
!MESSAGE The extensions and extension-points from the bundle "org.eclipse.emf.commonj.sdo" are ignored. The bundle is not marked as singleton.
!ENTRY org.eclipse.core.resources 2 10035 2016-08-20 14:10:54.180
!MESSAGE The workspace exited with unsaved changes in the previous session; refreshing workspace to recover changes.
!ENTRY org.eclipse.osgi 4 0 2016-08-20 14:10:55.789
!MESSAGE Application error
!STACK 1
java.lang.NullPointerException
at com.ibm.adapter.j2c.command.internal.batchimport.BatchImportApplication.copyTestBucket(BatchImportApplication.java:140)
at com.ibm.adapter.j2c.command.internal.batchimport.BatchImportApplication.run(BatchImportApplication.java:118)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
at java.lang.reflect.Method.invoke(Method.java:507)
at org.eclipse.equinox.internal.app.EclipseAppContainer.callMethodWithException(EclipseAppContainer.java:587)
at org.eclipse.equinox.internal.app.EclipseAppHandle.run(EclipseAppHandle.java:198)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.runApplication(EclipseAppLauncher.java:134)
at org.eclipse.core.runtime.internal.adaptor.EclipseAppLauncher.start(EclipseAppLauncher.java:104)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:380)
at org.eclipse.core.runtime.adaptor.EclipseStarter.run(EclipseStarter.java:235)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
at java.lang.reflect.Method.invoke(Method.java:507)
at org.eclipse.equinox.launcher.Main.invokeFramework(Main.java:648)
at org.eclipse.equinox.launcher.Main.basicRun(Main.java:603)
at org.eclipse.equinox.launcher.Main.run(Main.java:1465)
at org.eclipse.equinox.launcher.Main.main(Main.java:1438)
!ENTRY com.ibm.etools.references 4 0 2016-08-20 14:10:56.147
!MESSAGE Framework stop [EXCEPTION] Exception during shutdown. Indexer will be rebuilt on next startup.
!STACK 0
java.lang.NullPointerException
at com.ibm.etools.references.internal.management.InternalReferenceManager.hasStar(InternalReferenceManager.java:1394)
at com.ibm.etools.references.internal.management.InternalReferenceManager$ShutdownCode.call(InternalReferenceManager.java:175)
at com.ibm.etools.references.internal.management.InternalReferenceManager$ShutdownCode.call(InternalReferenceManager.java:1)
at java.util.concurrent.FutureTask.run(FutureTask.java:267)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1143)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:618)
at java.lang.Thread.run(Thread.java:785)
at com.ibm.etools.references.internal.ReferenceThreadFactrory$ReferencesThread.run(ReferenceThreadFactrory.java:37)
!ENTRY com.ibm.etools.references 4 0 2016-08-20 14:10:56.153
!MESSAGE [SCR] Error while attempting to deactivate instance of component Component[
name = Reference Manager
activate = activate
deactivate = deactivate
modified =
configuration-policy = optional
factory = null
autoenable = true
immediate = true
implementation = com.ibm.etools.references.management.ReferenceManager
state = Disabled
properties =
serviceFactory = false
serviceInterface = [com.ibm.etools.references.management.ReferenceManager]
references = {
Reference[name = IWorkspace, interface = org.eclipse.core.resources.IWorkspace, policy = static, cardinality = 1..1, target = null, bind = null, unbind = null]
Reference[name = IPreferencesService, interface = org.eclipse.core.runtime.preferences.IPreferencesService, policy = static, cardinality = 1..1, target = null, bind = null, unbind = null]
Reference[name = ThreadSupport, interface = com.ibm.etools.references.internal.ThreadSupport, policy = static, cardinality = 1..1, target = null, bind = addThreadSupport, unbind = removeThreadSupport]
Reference[name = EventAdmin, interface = org.osgi.service.event.EventAdmin, policy = static, cardinality = 1..1, target = null, bind = addEventAdmin, unbind = removeEventAdmin]
}
located in bundle = com.ibm.etools.references_1.2.400.v20150320_0049 [732]
]
!STACK 0
java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
at java.lang.reflect.Method.invoke(Method.java:507)
at org.eclipse.equinox.internal.ds.model.ServiceComponent.deactivate(ServiceComponent.java:363)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.deactivate(ServiceComponentProp.java:161)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.dispose(ServiceComponentProp.java:387)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.dispose(ServiceComponentProp.java:102)
at org.eclipse.equinox.internal.ds.InstanceProcess.disposeInstances(InstanceProcess.java:366)
at org.eclipse.equinox.internal.ds.InstanceProcess.disposeInstances(InstanceProcess.java:306)
at org.eclipse.equinox.internal.ds.Resolver.disposeComponentConfigs(Resolver.java:724)
at org.eclipse.equinox.internal.ds.Resolver.disableComponents(Resolver.java:700)
at org.eclipse.equinox.internal.ds.SCRManager.stoppingBundle(SCRManager.java:554)
at org.eclipse.equinox.internal.ds.SCRManager.bundleChanged(SCRManager.java:233)
at org.eclipse.osgi.internal.framework.BundleContextImpl.dispatchEvent(BundleContextImpl.java:902)
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230)
at org.eclipse.osgi.framework.eventmgr.ListenerQueue.dispatchEventSynchronous(ListenerQueue.java:148)
at org.eclipse.osgi.internal.framework.EquinoxEventPublisher.publishBundleEventPrivileged(EquinoxEventPublisher.java:165)
at org.eclipse.osgi.internal.framework.EquinoxEventPublisher.publishBundleEvent(EquinoxEventPublisher.java:75)
at org.eclipse.osgi.internal.framework.EquinoxEventPublisher.publishBundleEvent(EquinoxEventPublisher.java:67)
at org.eclipse.osgi.internal.framework.EquinoxContainerAdaptor.publishModuleEvent(EquinoxContainerAdaptor.java:102)
at org.eclipse.osgi.container.Module.publishEvent(Module.java:466)
at org.eclipse.osgi.container.Module.doStop(Module.java:624)
at org.eclipse.osgi.container.Module.stop(Module.java:488)
at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.decStartLevel(ModuleContainer.java:1623)
at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.doContainerStartLevel(ModuleContainer.java:1542)
at org.eclipse.osgi.container.SystemModule.stopWorker(SystemModule.java:248)
at org.eclipse.osgi.internal.framework.EquinoxBundle$SystemBundle$EquinoxSystemModule.stopWorker(EquinoxBundle.java:145)
at org.eclipse.osgi.container.Module.doStop(Module.java:626)
at org.eclipse.osgi.container.Module.stop(Module.java:488)
at org.eclipse.osgi.container.SystemModule.stop(SystemModule.java:186)
at org.eclipse.osgi.internal.framework.EquinoxBundle$SystemBundle$EquinoxSystemModule$1.run(EquinoxBundle.java:160)
at java.lang.Thread.run(Thread.java:785)
Caused by: java.lang.NullPointerException
at com.ibm.etools.references.internal.management.InternalReferenceManager.deactivate(InternalReferenceManager.java:1153)
at com.ibm.etools.references.management.ReferenceManager.deactivate(ReferenceManager.java:321)
... 33 more
Root exception:
java.lang.NullPointerException
at com.ibm.etools.references.internal.management.InternalReferenceManager.deactivate(InternalReferenceManager.java:1153)
at com.ibm.etools.references.management.ReferenceManager.deactivate(ReferenceManager.java:321)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:95)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:55)
at java.lang.reflect.Method.invoke(Method.java:507)
at org.eclipse.equinox.internal.ds.model.ServiceComponent.deactivate(ServiceComponent.java:363)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.deactivate(ServiceComponentProp.java:161)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.dispose(ServiceComponentProp.java:387)
at org.eclipse.equinox.internal.ds.model.ServiceComponentProp.dispose(ServiceComponentProp.java:102)
at org.eclipse.equinox.internal.ds.InstanceProcess.disposeInstances(InstanceProcess.java:366)
at org.eclipse.equinox.internal.ds.InstanceProcess.disposeInstances(InstanceProcess.java:306)
at org.eclipse.equinox.internal.ds.Resolver.disposeComponentConfigs(Resolver.java:724)
at org.eclipse.equinox.internal.ds.Resolver.disableComponents(Resolver.java:700)
at org.eclipse.equinox.internal.ds.SCRManager.stoppingBundle(SCRManager.java:554)
at org.eclipse.equinox.internal.ds.SCRManager.bundleChanged(SCRManager.java:233)
at org.eclipse.osgi.internal.framework.BundleContextImpl.dispatchEvent(BundleContextImpl.java:902)
at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:230)
at org.eclipse.osgi.framework.eventmgr.ListenerQueue.dispatchEventSynchronous(ListenerQueue.java:148)
at org.eclipse.osgi.internal.framework.EquinoxEventPublisher.publishBundleEventPrivileged(EquinoxEventPublisher.java:165)
at org.eclipse.osgi.internal.framework.EquinoxEventPublisher.publishBundleEvent(EquinoxEventPublisher.java:75)
at org.eclipse.osgi.internal.framework.EquinoxEventPublisher.publishBundleEvent(EquinoxEventPublisher.java:67)
at org.eclipse.osgi.internal.framework.EquinoxContainerAdaptor.publishModuleEvent(EquinoxContainerAdaptor.java:102)
at org.eclipse.osgi.container.Module.publishEvent(Module.java:466)
at org.eclipse.osgi.container.Module.doStop(Module.java:624)
at org.eclipse.osgi.container.Module.stop(Module.java:488)
at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.decStartLevel(ModuleContainer.java:1623)
at org.eclipse.osgi.container.ModuleContainer$ContainerStartLevel.doContainerStartLevel(ModuleContainer.java:1542)
at org.eclipse.osgi.container.SystemModule.stopWorker(SystemModule.java:248)
at org.eclipse.osgi.internal.framework.EquinoxBundle$SystemBundle$EquinoxSystemModule.stopWorker(EquinoxBundle.java:145)
at org.eclipse.osgi.container.Module.doStop(Module.java:626)
at org.eclipse.osgi.container.Module.stop(Module.java:488)
at org.eclipse.osgi.container.SystemModule.stop(SystemModule.java:186)
at org.eclipse.osgi.internal.framework.EquinoxBundle$SystemBundle$EquinoxSystemModule$1.run(EquinoxBundle.java:160)
at java.lang.Thread.run(Thread.java:785)
This looks like a problem for IBM support; I suggest you raise a PMR via the IBM support process to get a solution to this issue.

How to enlarge the 'data' column in SonarQube?

I'm trying to check my source code with cppcheck and SonarQube.
When I run sonar-runner, I get the error below:
SonarQube Runner 2.4
Java 1.6.0_33 Sun Microsystems Inc. (64-bit)
Linux 3.11.0-26-generic amd64
INFO: Error stacktraces are turned on.
INFO: Runner configuration file: /var/lib/jenkins/sonarqube/sonar-runner-2.4/conf/sonar-runner.properties
INFO: Project configuration file: /var/lib/jenkins/jobs/MIP35.KT.Centrex.Branch/workspace/hudson_mvmw/sonar-project.properties
INFO: Default locale: "ko_KR", source code encoding: "UTF-8"
INFO: Work directory: /data/jenkins/jobs/MIP35.KT.Centrex.Branch/workspace/hudson_mvmw/./.sonar
INFO: SonarQube Server 4.5
16:23:56.070 INFO - Load global referentials...
16:23:56.152 INFO - Load global referentials done: 84 ms
16:23:56.158 INFO - User cache: /var/lib/jenkins/.sonar/cache
16:23:56.164 INFO - Install plugins
16:23:56.273 INFO - Install JDBC driver
16:23:56.278 INFO - Create JDBC datasource for jdbc:mysql://localhost:3306/sonar?useUnicode=true&characterEncoding=utf8
16:23:57.156 INFO - Initializing Hibernate
16:23:57.990 INFO - Load project referentials...
16:23:58.522 INFO - Load project referentials done: 532 ms
16:23:58.522 INFO - Load project settings
16:23:58.788 INFO - Loading technical debt model...
16:23:58.809 INFO - Loading technical debt model done: 21 ms
16:23:58.811 INFO - Apply project exclusions
16:23:58.962 INFO - ------------- Scan mvmw for KT centrex at branch
16:23:58.968 INFO - Load module settings
16:23:59.939 INFO - Language is forced to c++
16:23:59.940 INFO - Loading rules...
16:24:00.558 INFO - Loading rules done: 618 ms
16:24:00.576 INFO - Configure Maven plugins
16:24:00.660 INFO - No quality gate is configured.
16:24:00.759 INFO - Base dir: /data/jenkins/jobs/MIP35.KT.Centrex.Branch/workspace/hudson_mvmw/.
16:24:00.759 INFO - Working dir: /data/jenkins/jobs/MIP35.KT.Centrex.Branch/workspace/hudson_mvmw/./.sonar
16:24:00.760 INFO - Source paths: moimstone
16:24:00.760 INFO - Source encoding: UTF-8, default locale: ko_KR
16:24:00.760 INFO - Index files
16:24:20.825 INFO - 13185 files indexed
16:26:35.895 WARN - SQL Error: 1406, SQLState: 22001
16:26:35.895 ERROR - Data truncation: Data too long for column 'data' at row 1
INFO: ------------------------------------------------------------------------
INFO: EXECUTION FAILURE
INFO: ------------------------------------------------------------------------
Total time: 2:40.236s
Final Memory: 27M/1765M
INFO: ------------------------------------------------------------------------
ERROR: Error during Sonar runner execution
org.sonar.runner.impl.RunnerException: Unable to execute Sonar
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:91)
at org.sonar.runner.impl.BatchLauncher$1.run(BatchLauncher.java:75)
at java.security.AccessController.doPrivileged(Native Method)
at org.sonar.runner.impl.BatchLauncher.doExecute(BatchLauncher.java:69)
at org.sonar.runner.impl.BatchLauncher.execute(BatchLauncher.java:50)
at org.sonar.runner.api.EmbeddedRunner.doExecute(EmbeddedRunner.java:102)
at org.sonar.runner.api.Runner.execute(Runner.java:100)
at org.sonar.runner.Main.executeTask(Main.java:70)
at org.sonar.runner.Main.execute(Main.java:59)
at org.sonar.runner.Main.main(Main.java:53)
Caused by: org.sonar.api.utils.SonarException: Unable to read and import the source file : '/data/jenkins/jobs/MIP35.KT.Centrex.Branch/workspace/hudson_mvmw/moimstone/mgrs/mUIMgr/gui/resource/wideBasicStyle/320Wx240H/imageMerged.c' with the charset : 'UTF-8'.
at org.sonar.batch.scan.filesystem.ComponentIndexer.importSources(ComponentIndexer.java:96)
at org.sonar.batch.scan.filesystem.ComponentIndexer.execute(ComponentIndexer.java:79)
at org.sonar.batch.scan.filesystem.DefaultModuleFileSystem.index(DefaultModuleFileSystem.java:245)
at org.sonar.batch.phases.PhaseExecutor.execute(PhaseExecutor.java:111)
at org.sonar.batch.scan.ModuleScanContainer.doAfterStart(ModuleScanContainer.java:194)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:92)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:77)
at org.sonar.batch.scan.ProjectScanContainer.scan(ProjectScanContainer.java:233)
at org.sonar.batch.scan.ProjectScanContainer.scanRecursively(ProjectScanContainer.java:228)
at org.sonar.batch.scan.ProjectScanContainer.doAfterStart(ProjectScanContainer.java:221)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:92)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:77)
at org.sonar.batch.scan.ScanTask.scan(ScanTask.java:64)
at org.sonar.batch.scan.ScanTask.execute(ScanTask.java:51)
at org.sonar.batch.bootstrap.TaskContainer.doAfterStart(TaskContainer.java:125)
at org.sonar.api.platform.ComponentContainer.startComponents(ComponentContainer.java:92)
at org.sonar.api.platform.ComponentContainer.execute(ComponentContainer.java:77)
at org.sonar.batch.bootstrap.BootstrapContainer.executeTask(BootstrapContainer.java:173)
at org.sonar.batch.bootstrapper.Batch.executeTask(Batch.java:95)
at org.sonar.batch.bootstrapper.Batch.execute(Batch.java:67)
at org.sonar.runner.batch.IsolatedLauncher.execute(IsolatedLauncher.java:48)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:622)
at org.sonar.runner.impl.BatchLauncher$1.delegateExecution(BatchLauncher.java:87)
... 9 more
Caused by: javax.persistence.PersistenceException: Unable to persist : SnapshotSource[snapshot_id=53035,data=#if defined(__cplusplus)
#pragma hdrstop
#endif
#include "Prj_pcx2_resource.h"
#if defined(__cplusplus)
#pragma package(smart_init)
#endif
const rgb24_type Prj_Bg_call_ColorTable[59] PCX2_SEGMENT =
{
{0xFF,0xFF,0xFF}, {0xFE,0xFE,0xFE}, {0xE7,0xE7,0xE7}, {0xC7,0xC7,0xC7}, {0x9B,0x9B,0x9B}, {0xFD,0xFD,0xFD}, {0xCF,0xCF,0xCF}, {0xA8,0xA8,0xA8}, {0xBC,0xBC,0xBC}, {0xD6,0xD6,0xD6},
{0xDC,0xDC,0xDC}, {0xCE,0xCE,0xCE}, {0xB5,0xB5,0xB5}, {0xD0,0xD0,0xD0}, {0xE1,0xE1,0xE1}, {0xA7,0xA7,0xA7}, {0xFA,0xFA,0xFA}, {0xBE,0xBE,0xBE}, {0xBB,0xBB,0xBB}, {0xF3,0xF3,0xF3},
{0x9A,0x9A,0x9A}, {0xEC,0xEC,0xEC}, {0xE9,0xE9,0xE9}, {0x99,0x99,0x99}, {0x98,0x98,0x98}, {0x97,0x97,0x97}, {0x96,0x96,0x96}, {0x95,0x95,0x95}, {0x94,0x94,0x94}, {0x93,0x93,0x93},
{0x92,0x92,0x92}, {0x91,0x91,0x91}, {0x90,0x90,0x90}, {0x8F,0x8F,0x8F}, {0x8E,0x8E,0x8E}, {0x8D,0x8D,0x8D}, {0x8C,0x8C,0x8C}, {0x8B,0x8B,0x8B}, {0x8A,0x8A,0x8A}, {0x89,0x89,0x89},
{0x88,0x88,0x88}, {0x87,0x87,0x87...]
at org.sonar.jpa.session.JpaDatabaseSession.internalSave(JpaDatabaseSession.java:136)
at org.sonar.jpa.session.JpaDatabaseSession.save(JpaDatabaseSession.java:103)
at org.sonar.batch.index.SourcePersister.saveSource(SourcePersister.java:47)
at org.sonar.batch.index.DefaultPersistenceManager.setSource(DefaultPersistenceManager.java:68)
at org.sonar.batch.index.DefaultIndex.setSource(DefaultIndex.java:467)
at org.sonar.batch.scan.filesystem.ComponentIndexer.importSources(ComponentIndexer.java:93)
... 34 more
Caused by: javax.persistence.PersistenceException: org.hibernate.exception.DataException: could not insert: [org.sonar.api.database.model.SnapshotSource]
at org.hibernate.ejb.AbstractEntityManagerImpl.throwPersistenceException(AbstractEntityManagerImpl.java:614)
at org.hibernate.ejb.AbstractEntityManagerImpl.persist(AbstractEntityManagerImpl.java:226)
at org.sonar.jpa.session.JpaDatabaseSession.internalSave(JpaDatabaseSession.java:130)
... 39 more
Caused by: org.hibernate.exception.DataException: could not insert: [org.sonar.api.database.model.SnapshotSource]
at org.hibernate.exception.SQLStateConverter.convert(SQLStateConverter.java:100)
at org.hibernate.exception.JDBCExceptionHelper.convert(JDBCExceptionHelper.java:66)
at org.hibernate.id.insert.AbstractReturningDelegate.performInsert(AbstractReturningDelegate.java:64)
at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java:2176)
at org.hibernate.persister.entity.AbstractEntityPersister.insert(AbstractEntityPersister.java:2656)
at org.hibernate.action.EntityIdentityInsertAction.execute(EntityIdentityInsertAction.java:71)
at org.hibernate.engine.ActionQueue.execute(ActionQueue.java:279)
at org.hibernate.event.def.AbstractSaveEventListener.performSaveOrReplicate(AbstractSaveEventListener.java:321)
at org.hibernate.event.def.AbstractSaveEventListener.performSave(AbstractSaveEventListener.java:204)
at org.hibernate.event.def.AbstractSaveEventListener.saveWithGeneratedId(AbstractSaveEventListener.java:130)
at org.hibernate.ejb.event.EJB3PersistEventListener.saveWithGeneratedId(EJB3PersistEventListener.java:49)
at org.hibernate.event.def.DefaultPersistEventListener.entityIsTransient(DefaultPersistEventListener.java:154)
at org.hibernate.event.def.DefaultPersistEventListener.onPersist(DefaultPersistEventListener.java:110)
at org.hibernate.event.def.DefaultPersistEventListener.onPersist(DefaultPersistEventListener.java:61)
at org.hibernate.impl.SessionImpl.firePersist(SessionImpl.java:646)
at org.hibernate.impl.SessionImpl.persist(SessionImpl.java:620)
at org.hibernate.impl.SessionImpl.persist(SessionImpl.java:624)
at org.hibernate.ejb.AbstractEntityManagerImpl.persist(AbstractEntityManagerImpl.java:220)
... 40 more
Caused by: com.mysql.jdbc.MysqlDataTruncation: Data truncation: Data too long for column 'data' at row 1
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4235)
at com.mysql.jdbc.MysqlIO.checkErrorPacket(MysqlIO.java:4169)
at com.mysql.jdbc.MysqlIO.sendCommand(MysqlIO.java:2617)
at com.mysql.jdbc.MysqlIO.sqlQueryDirect(MysqlIO.java:2778)
at com.mysql.jdbc.ConnectionImpl.execSQL(ConnectionImpl.java:2825)
at com.mysql.jdbc.PreparedStatement.executeInternal(PreparedStatement.java:2156)
at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2459)
at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2376)
at com.mysql.jdbc.PreparedStatement.executeUpdate(PreparedStatement.java:2360)
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
at org.apache.commons.dbcp.DelegatingPreparedStatement.executeUpdate(DelegatingPreparedStatement.java:105)
at org.hibernate.id.IdentityGenerator$GetGeneratedKeysDelegate.executeAndExtract(IdentityGenerator.java:94)
at org.hibernate.id.insert.AbstractReturningDelegate.performInsert(AbstractReturningDelegate.java:57)
... 55 more
ERROR:
ERROR: Re-run SonarQube Runner using the -X switch to enable full debug logging.
I have a huge source file that contains image data; it's over 100 megabytes.
How can I enlarge the data column? Is there a setting for it?
There's no point in analyzing such a file: SonarQube won't give you any useful information about it, and the same is true for any other file like it.
The solution is to exclude those image data files using the standard exclusion mechanism provided by SonarQube.
For instance, I would do something like:
sonar.exclusions=**/*imageMerged*
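The exclusion can be set either server-side in the project settings or in the project's sonar-project.properties file. A minimal sketch of the latter, assuming the source path and encoding shown in the log above (the project key and name are hypothetical):

```properties
# Hypothetical project identifiers -- replace with your own
sonar.projectKey=mvmw
sonar.projectName=mvmw
sonar.projectVersion=1.0

# Values taken from the scan log above
sonar.sources=moimstone
sonar.sourceEncoding=UTF-8

# Skip the generated image-data sources anywhere in the tree
sonar.exclusions=**/*imageMerged*
```

Excluded files are never read or indexed, so both the charset error and the oversized INSERT into the 'data' column disappear without touching the database schema.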

Unable to resolve dummy/0.0.0: missing requirement

I have a problem when trying to deploy a profile on a fabric8 CR5 cluster.
Here is the relevant part of karaf.log:
2014-09-09 12:33:43,142 | INFO | o.fabric8.agent) | DeploymentAgent | 66 - io.fabric8.fabric-agent - 1.1.0.CR5 | DeploymentAgent updated with {repository.karaf-enterprise=mvn:org.apache.karaf.assemblies.features/enterprise/2.3.0.redhat-610379/xml/features, optional.mvn:org.apache.activemq/activemq-osgi/5.10.0=mvn:org.apache.activemq/activemq-osgi/5.10.0, attribute.parents=feature-camel, feature.insight-log=insight-log, org.ops4j.pax.url.mvn.defaultrepositories=file:/opt/fabric8-karaf-1.1.0.CR5/system#snapshots#id=karaf-default,file:/opt/fabric8-karaf-1.1.0.CR5/local-repo#snapshots#id=karaf-local, feature.camel=camel, feature.fabric-jaas=fabric-jaas, attribute.abstract=false, fabric.zookeeper.pid=io.fabric8.agent, bundle.mvn:anagrafiche_bloomberg/mortgages/0.1=mvn:anagrafiche_bloomberg/Mortgages/0.1, feature.camel-blueprint=camel-blueprint, bundle.mvn:anagrafiche_bloomberg/varcomty/0.1=mvn:anagrafiche_bloomberg/Varcomty/0.1, feature.fabric-core=fabric-core, optional.ops4j-base-lang=mvn:org.ops4j.base/ops4j-base-lang/1.4.0, service.pid=io.fabric8.agent, org.ops4j.pax.url.mvn.repositories=http://repo1.maven.org/maven2#id=central, https://repo.fusesource.com/nexus/content/groups/public#id=fusepublic, https://repository.jboss.org/nexus/content/repositories/public#id=jbosspublic, https://repo.fusesource.com/nexus/content/repositories/releases#id=jbossreleases, https://repo.fusesource.com/nexus/content/groups/ea#id=jbossearlyaccess, http://repository.springsource.com/maven/bundles/release#id=ebrreleases, http://repository.springsource.com/maven/bundles/external#id=ebrexternal, http://10.0.0.124:8081/artifactory/libs-release-local#id=pippo, feature.fabric-web=fabric-web, feature.fabric-git-server=fabric-git-server, repository.apache-camel=mvn:org.apache.camel.karaf/apache-camel/2.13.2/xml/features, bundle.mvn:org.talend.esb.job/org.talend.esb.job.api/5.4.1=mvn:org.talend.esb.job/org.talend.esb.job.api/5.4.1, 
bundle.mvn:anagrafiche_bloomberg/equities/0.1=mvn:anagrafiche_bloomberg/Equities/0.1, resolve.optional.imports=false, repository.fabric8=mvn:io.fabric8/fabric8-karaf/1.1.0.CR5/xml/features, feature.camel-amq=camel-amq, optional.mvn:org.talend.esb.job/org.talend.esb.job.api/5.4.1=mvn:org.talend.esb.job/org.talend.esb.job.api/5.4.1, bundle.mvn:anagrafiche_bloomberg/equity_option_warrants/0.1=mvn:anagrafiche_bloomberg/Equity_Option_Warrants/0.1, feature.karaf=karaf, feature.camel-ftp=camel-ftp, feature.fabric-agent=fabric-agent, feature.jolokia=jolokia, feature.fabric-git=fabric-git, feature.camel-core=camel-core, feature.fabric-camel=fabric-camel, hash=ProfileImpl[id='Anagrafica_Bloomberg', version='1.0']-328b4e0-9859682-9859682-cbe9c83, bundle.mvn:anagrafiche_bloomberg/commodities/0.1=mvn:anagrafiche_bloomberg/Commodities/0.1, repository.karaf-spring=mvn:org.apache.karaf.assemblies.features/spring/2.3.0.redhat-610379/xml/features, lastrefresh.anagrafica_bloomberg=1410258607347, patch.repositories=https://repo.fusesource.com/nexus/content/repositories/releases, https://repo.fusesource.com/nexus/content/groups/ea, repository.karaf-standard=mvn:org.apache.karaf.assemblies.features/standard/2.3.0.redhat-610379/xml/features, bundle.mvn:anagrafiche_bloomberg/vargov_corp_muni/0.1=mvn:anagrafiche_bloomberg/Vargov_Corp_Muni/0.1, bundle.mvn:it.gestielle/anagrafica_bloomberg/1.0=mvn:it.gestielle/Anagrafica_Bloomberg/1.0}
2014-09-09 12:33:44,157 | ERROR | agent-1-thread-1 | DeploymentAgent | 66 - io.fabric8.fabric-agent - 1.1.0.CR5 | Unable to update agent
org.osgi.service.resolver.ResolutionException: Unable to resolve dummy/0.0.0: missing requirement [dummy/0.0.0] osgi.identity; osgi.identity=anagrafiche_bloomberg.Commodities; type=osgi.bundle; version="[0.1.0,0.1.0]" [caused by: Unable to resolve anagrafiche_bloomberg.Commodities/0.1.0: missing requirement [anagrafiche_bloomberg.Commodities/0.1.0] osgi.wiring.package; filter:="(osgi.wiring.package=org.apache.cxf.management.counters)"]
at org.apache.felix.resolver.Candidates.populateResource(Candidates.java:285)[66:io.fabric8.fabric-agent:1.1.0.CR5]
at org.apache.felix.resolver.Candidates.populate(Candidates.java:153)[66:io.fabric8.fabric-agent:1.1.0.CR5]
at org.apache.felix.resolver.ResolverImpl.resolve(ResolverImpl.java:148)[66:io.fabric8.fabric-agent:1.1.0.CR5]
at io.fabric8.agent.DeploymentBuilder.resolve(DeploymentBuilder.java:224)[66:io.fabric8.fabric-agent:1.1.0.CR5]
at io.fabric8.agent.DeploymentAgent.doUpdate(DeploymentAgent.java:575)[66:io.fabric8.fabric-agent:1.1.0.CR5]
at io.fabric8.agent.DeploymentAgent$2.run(DeploymentAgent.java:301)[66:io.fabric8.fabric-agent:1.1.0.CR5]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)[:1.7.0_65]
at java.util.concurrent.FutureTask.run(FutureTask.java:262)[:1.7.0_65]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)[:1.7.0_65]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)[:1.7.0_65]
at java.lang.Thread.run(Thread.java:745)[:1.7.0_65]
This is my local repository:
http://10.0.0.124:8081/artifactory/libs-release-local#id=pippo
It is set in the org.ops4j.pax.url.mvn.repositories property, and the JARs are deployed there.
Can someone help me? Why are the dependencies not resolved?
Thanks,
Mirko
dummy/0.0.0 is just the entry point -- read the rest of the error, where it walks the dependency chain.
You'll see it requires anagrafiche_bloomberg.Commodities, which in turn imports the package org.apache.cxf.management.counters. Make sure a bundle exporting that package (i.e. CXF) is installed.
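The "missing requirement ... osgi.wiring.package" clause maps directly to an Import-Package header in the failing bundle's manifest. A sketch of what the Commodities bundle's MANIFEST.MF likely contains, reconstructed from the error (the exact headers are assumptions):

```
Bundle-SymbolicName: anagrafiche_bloomberg.Commodities
Bundle-Version: 0.1.0
Import-Package: org.apache.cxf.management.counters
```

Since no deployed bundle exports org.apache.cxf.management.counters, the resolver rejects the whole deployment. One way to fix it in fabric8 is to add the feature that provides CXF to the profile, along the lines of (command sketch; the feature name may differ in your repositories):

```
fabric:profile-edit --features cxf Anagrafica_Bloomberg
```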

Resources