I'm trying to implement distributed tracing using Spring, RabbitMQ, Sleuth, and Zipkin. I added these dependencies:
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-sleuth</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-sleuth-zipkin</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-actuator</artifactId>
</dependency>
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-bus-amqp</artifactId>
</dependency>
And I configured Sleuth and Zipkin in my bootstrap.yml:
spring:
  sleuth:
    sampler:
      probability: 1.0
  zipkin:
    rabbitmq:
      addresses: ${RABBIT_MQ_HOST:localhost}:${RABBIT_MQ_PORT:5672}
    sender:
      type: rabbit
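As a side note, the same always-on sampling can also be set up in code instead of YAML. This is a minimal sketch, assuming Spring Cloud Sleuth 2.x (which is backed by Brave); the class name `SamplingConfig` is just for illustration:

```java
import brave.sampler.Sampler;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class SamplingConfig {

    // Equivalent to spring.sleuth.sampler.probability: 1.0,
    // i.e. every trace is sampled and exported to Zipkin.
    @Bean
    public Sampler defaultSampler() {
        return Sampler.ALWAYS_SAMPLE;
    }
}
```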
After starting my services and making some REST calls, I see this in the logs:
zuul-gateway | 2020-04-01 01:49:25.453 INFO [zuul-gateway,384ce1318abef3f3,74ad64f70e120f4e,true] 1 --- [nio-8765-exec-4] d.f.z.ZuulLoggingFilter : request -> org.springframework.cloud.netflix.zuul.filters.pre.Servlet30RequestWrapper@70eb5e67 request uri -> /user-service/users
user-service-3 | 2020-04-01 01:49:25.462 INFO [user-service,384ce1318abef3f3,4522ae801a21c30e,true] 1 --- [nio-8000-exec-2] de.fronetic.userservice.UserController : Order by: age, Users: [User(id=4, lastName=Savona, firstName=Albert, age=101), User(id=3, lastName=Esposito, firstName=Derryl, age=12), User(id=1, lastName=Belmonte, firstName=Maeleth, age=14), User(id=6, lastName=Grillo, firstName=Madhu, age=21), User(id=2, lastName=Colt, firstName=Tychon, age=28), User(id=8, lastName=Causer, firstName=Stana, age=44), User(id=7, lastName=Seelen, firstName=Bellatrix, age=52), User(id=5, lastName=Hakobyan, firstName=Zinoviy, age=57)]
user-transformation-service | 2020-04-01 01:49:25.475 INFO [user-transformation-service,384ce1318abef3f3,47a61185e3cca375,true] 1 --- [nio-8100-exec-7] d.f.u.UserTransformationController : Users: [User(id=4, lastName=Savona, firstName=Albert, age=101), User(id=3, lastName=Esposito, firstName=Derryl, age=12), User(id=1, lastName=Belmonte, firstName=Maeleth, age=14), User(id=6, lastName=Grillo, firstName=Madhu, age=21), User(id=2, lastName=Colt, firstName=Tychon, age=28), User(id=8, lastName=Causer, firstName=Stana, age=44), User(id=7, lastName=Seelen, firstName=Bellatrix, age=52), User(id=5, lastName=Hakobyan, firstName=Zinoviy, age=57)]
So far it looks good: Sleuth added the trace IDs.
Opening the Zipkin UI, I can see that the service names were added:
But there is no trace information at all:
So I'm wondering what I'm missing in my configuration.
EDIT
It turned out that trace information is arriving in Zipkin. I can use the search bar in the top-right corner to look up trace IDs directly:
I will then get:
So the question is: why does nothing appear in the overview, and why is nothing queryable via the trace lookup?
Related
I have a REST endpoint that returns the response below:
{
  "incentives": [
    {
      "incentiveId": "271230",
      "effectiveDate": "2022-06-01T07:00:00.000+0000",
      "expiryDate": "2022-10-01T03:00:00.000+0000",
      "incentiveName": "$500 MAZDA LOYALTY REWARD PROGRAM",
      "incentiveAmount": 500,
      "incentiveType": "discount",
      "disclaimer": ""
    },
    {
      "incentiveId": "271231",
      "effectiveDate": "2022-06-01T07:00:00.000+0000",
      "expiryDate": "2022-10-01T03:00:00.000+0000",
      "incentiveName": "$500 MAZDA MILITARY APPRECIATION BONUS CASH",
      "incentiveAmount": 500,
      "incentiveType": "discount",
      "disclaimer": ""
    },
    {
      "incentiveId": "271229",
      "effectiveDate": "2022-06-01T07:00:00.000+0000",
      "expiryDate": "2022-07-01T03:00:00.000+0000",
      "incentiveName": "MAZDA MOBILITY PROGRAM CASH BONUS",
      "incentiveAmount": 1000,
      "incentiveType": "discount",
      "disclaimer": "Program Period is 6/1/2022 - 6/30/2022."
    },
    {
      "incentiveId": "271242",
      "effectiveDate": "2022-06-01T07:00:00.000+0000",
      "expiryDate": "2022-07-06T06:59:00.000+0000",
      "incentiveName": "$500 MAZDA LEASE TO LEASE LOYALTY REWARD PROGRAM",
      "incentiveAmount": 500,
      "incentiveType": "discount",
      "disclaimer": ""
    }
  ]
}
I am using REST Assured's JsonPath to deserialize the list of incentives with the following code:
JsonPath jsonPathEvaluator = response.jsonPath();
List<Incentive> incentivesList = jsonPathEvaluator.getList("incentives", Incentive.class);
But this code throws the following exception:
java.util.ServiceConfigurationError: com.fasterxml.jackson.databind.Module: Provider com.fasterxml.jackson.datatype.jdk8.Jdk8Module not found
at java.base/java.util.ServiceLoader.fail(ServiceLoader.java:593)
at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.nextProviderClass(ServiceLoader.java:1219)
at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNextService(ServiceLoader.java:1228)
at java.base/java.util.ServiceLoader$LazyClassPathLookupIterator.hasNext(ServiceLoader.java:1273)
at java.base/java.util.ServiceLoader$2.hasNext(ServiceLoader.java:1309)
at java.base/java.util.ServiceLoader$3.hasNext(ServiceLoader.java:1393)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:929)
at com.fasterxml.jackson.databind.ObjectMapper.findModules(ObjectMapper.java:912)
at com.fasterxml.jackson.databind.ObjectMapper.findAndRegisterModules(ObjectMapper.java:948)
at io.restassured.path.json.mapper.factory.DefaultJackson2ObjectMapperFactory.create(DefaultJackson2ObjectMapperFactory.java:29)
at io.restassured.path.json.mapper.factory.DefaultJackson2ObjectMapperFactory.create(DefaultJackson2ObjectMapperFactory.java:27)
at io.restassured.common.mapper.factory.ObjectMapperFactory$create.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:148)
at io.restassured.internal.path.json.mapping.JsonPathJackson2ObjectDeserializer.createJackson2ObjectMapper(JsonPathJackson2ObjectDeserializer.groovy:37)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:77)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:568)
at org.codehaus.groovy.runtime.callsite.PlainObjectMetaMethodSite.doInvoke(PlainObjectMetaMethodSite.java:43)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite$PogoCachedMethodSiteNoUnwrapNoCoerce.invoke(PogoMetaMethodSite.java:193)
at org.codehaus.groovy.runtime.callsite.PogoMetaMethodSite.callCurrent(PogoMetaMethodSite.java:61)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallCurrent(CallSiteArray.java:51)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:171)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callCurrent(AbstractCallSite.java:194)
at io.restassured.internal.path.json.mapping.JsonPathJackson2ObjectDeserializer.deserialize(JsonPathJackson2ObjectDeserializer.groovy:44)
at io.restassured.path.json.mapping.JsonPathObjectDeserializer$deserialize.call(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:47)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:125)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:139)
at io.restassured.internal.path.json.mapping.JsonObjectDeserializer.deserializeWithJackson2(JsonObjectDeserializer.groovy:109)
at io.restassured.internal.path.json.mapping.JsonObjectDeserializer$deserializeWithJackson2.callStatic(Unknown Source)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCallStatic(CallSiteArray.java:55)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callStatic(AbstractCallSite.java:217)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.callStatic(AbstractCallSite.java:240)
at io.restassured.internal.path.json.mapping.JsonObjectDeserializer.deserialize(JsonObjectDeserializer.groovy:70)
at io.restassured.path.json.JsonPath.jsonStringToObject(JsonPath.java:1093)
at io.restassured.path.json.JsonPath.getList(JsonPath.java:400)
These are the dependencies in my pom.xml:
<!-- https://mvnrepository.com/artifact/io.rest-assured/rest-assured -->
<dependency>
    <groupId>io.rest-assured</groupId>
    <artifactId>rest-assured</artifactId>
    <version>5.1.0</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.datatype</groupId>
    <artifactId>jackson-datatype-jdk8</artifactId>
    <version>2.6.3</version>
</dependency>
<dependency>
    <groupId>com.fasterxml.jackson.datatype</groupId>
    <artifactId>jackson-datatype-jsr310</artifactId>
    <version>2.6.0</version>
</dependency>
This is the Java version on my machine:
java version "1.8.0_333"
Java(TM) SE Runtime Environment (build 1.8.0_333-b02)
Java HotSpot(TM) Client VM (build 25.333-b02, mixed mode, sharing)
Could you please help me understand what is wrong and how to resolve this issue?
My issue was solved by adding the dependency below to the pom.xml file (I'm using Java 11). This dependency adds support for JDK 8 data types.
<!-- https://mvnrepository.com/artifact/com.fasterxml.jackson.datatype/jackson-datatype-jdk8 -->
<dependency>
    <groupId>com.fasterxml.jackson.datatype</groupId>
    <artifactId>jackson-datatype-jdk8</artifactId>
    <version>2.13.3</version>
</dependency>
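If pulling in jackson-datatype-jdk8 is not an option, another way around the ServiceLoader scan (a sketch, not from the original post) is to hand REST Assured a plain ObjectMapper, so that findAndRegisterModules() is never called:

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import io.restassured.RestAssured;
import io.restassured.config.ObjectMapperConfig;

// Configure REST Assured once, e.g. in test setup. The factory returns a
// plain ObjectMapper instead of letting REST Assured's default factory call
// findAndRegisterModules(), which triggers the failing ServiceLoader lookup.
RestAssured.config = RestAssured.config().objectMapperConfig(
        ObjectMapperConfig.objectMapperConfig()
                .jackson2ObjectMapperFactory((type, charset) -> new ObjectMapper()));
```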
I'm using Spring Boot and would like to have this code:
LOG.info("Scheduled appointment for user 12345 [appointment ID 100]");
produce the following log message in GELF JSON format:
{
  "version": "1.1",
  "host": "hostname.ec2.internal",
  "short_message": "Scheduled appointment for user 12345 [appointment ID 100]",
  "timestamp": 1318280136,
  "level": 1,
  "_user": "user@acme.com",
  "_clientip": "127.0.0.1",
  "_env": "prod",
  "_app": "scheduler"
}
Do I need to create my own logger for this or can I customize Logback/Log4j2 to behave this way?
From a Log4j 2.x perspective, you can use the JSON Template Layout, which has a built-in event template for GELF.
Your appender configuration in the log4j2-spring.xml file would look like:
<Console name="CONSOLE">
    <JsonTemplateLayout eventTemplateUri="classpath:GelfLayout.json" />
</Console>
Remark: since Spring Boot uses Logback as its default logging system, you'll have to exclude spring-boot-starter-logging and replace it with spring-boot-starter-log4j2.
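That swap would look roughly like this in the pom.xml (a sketch, assuming the standard Spring Boot web starter):

```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-web</artifactId>
    <exclusions>
        <exclusion>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-logging</artifactId>
        </exclusion>
    </exclusions>
</dependency>
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-log4j2</artifactId>
</dependency>
```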
Moreover, the JSON Template Layout requires an additional dependency:
<dependency>
    <groupId>org.apache.logging.log4j</groupId>
    <artifactId>log4j-layout-template-json</artifactId>
</dependency>
I am trying to fetch records from Elasticsearch, and I get the error below:
ElasticsearchStatusException[Elasticsearch exception [type=exception, reason=SearchPhaseExecutionException[Failed to execute phase [query_fetch], all shards failed;
shardFailures {[-kDbP0fmTUa5B8v1gpgoZQ][dataintelindex_ra][0]: SearchParseException[[dataintelindex_ra]
[0]: from[-1],size[-1]: Parse Failure [Failed to parse source [{"query":{"geo_bounding_box":{"loc":
{"top_left":[-74.1,40.73],"bottom_right":
[-73.99,40.717]},"validation_method":"STRICT","type":"MEMORY","ignore_unmapped":false,"boost":1.0}}}]]];
nested: QueryParsingException[[dataintelindex_ra] No query registered for [geo_bounding_box]]; }]]]
My Java code is as follows:
SearchRequest searchRequest = new SearchRequest("dataintelindex_ra").types("station_info");
SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
searchSourceBuilder.query(QueryBuilders.geoBoundingBoxQuery("loc").setCorners(40.73,-74.1,40.717,-73.99));
searchRequest.source(searchSourceBuilder);
SearchResponse response = elasticSearchClient.search(searchRequest, RequestOptions.DEFAULT);
for (SearchHit searchHit : response.getHits().getHits()) {
System.out.println("~~~~~~~~SearchHit[] searchHits~~~~~~~~~~~~~~ "+searchHit.getSourceAsString());
}
Please let me know if I missed something while indexing; I am new to Elasticsearch.
Also, in case I want to include one more criterion in my query, like below:
searchSourceBuilder.query(QueryBuilders.termsQuery("zoneType", "test", "oms"));
Below is the result of that query, which works fine:
~~~~~~~~SearchHit[] searchHits~~~~~~~~~~~~~~ {"tag_datatype":"sensor","loc":[{"lat":"0","lon":"0"}],"level":1,"kml_path":"","created":"Mon Aug 10 16:02:51 IST 2020","latitude":"0","station_id":"5f312253b4c93c1d20bbbb39","longtitude":"0","tag_owner":"","description":"","zoneType":"oms","tag_network_name":"chak_network","display_name":"506020200236117-O1","supply_zone":"506020200236117-O1","outflow":null,"tag_sector":"dmameter","name":"506020200236117-O1","tag_category":"sensorstation","inflow":null,"_id":"5f312253b4c93c1d20bbbb39","tag_location":"NA","lastmod":"Mon Aug 10 16:02:51 IST 2020","status":"ACTIVE"}
~~~~~~~~SearchHit[] searchHits~~~~~~~~~~~~~~ {"tag_datatype":"sensor","loc":[{"lat":"0","lon":"0"}],"level":1,"kml_path":"","created":"Tue Aug 11 11:36:51 IST 2020","latitude":"0","station_id":"5f32357b3ccb8f51e003587e","longtitude":"0","tag_owner":"","description":"","zoneType":"village","display_name":"testvillage1","supply_zone":"testvillage1","outflow":null,"tag_sector":"dmameter","name":"testvillage1","tag_category":"sensorstation","inflow":null,"_id":"5f32357b3ccb8f51e003587e","tag_location":"NA","lastmod":"Tue Aug 11 11:36:51 IST 2020","status":"ACTIVE"}
How do I combine it with the geo bounding box query above? Do I need to add it as a filter?
Update: Dependencies
<dependency>
    <groupId>org.elasticsearch.client</groupId>
    <artifactId>transport</artifactId>
    <version>6.4.0</version>
</dependency>
<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch</artifactId>
    <version>6.4.0</version>
</dependency>
<dependency>
    <groupId>org.elasticsearch.client</groupId>
    <artifactId>elasticsearch-rest-client</artifactId>
    <version>6.4.0</version>
</dependency>
<dependency>
    <groupId>org.elasticsearch.client</groupId>
    <artifactId>elasticsearch-rest-high-level-client</artifactId>
    <version>6.4.0</version>
</dependency>
And this is what the Elasticsearch server itself reports:
{
  "status" : 200,
  "name" : "test1",
  "version" : {
    "number" : "1.2.2",
    "build_hash" : "243243432feaga",
    "build_timestamp" : "2014-07-09T12:02:32Z",
    "build_snapshot" : false,
    "lucene_version" : "4.8"
  },
  "tagline" : "You Know, for Search"
}
Thanks in advance
Rakesh
The problem is that you're running ES server v1.2.2 (an extremely old version) with a 6.4.0 client.
The 6.4.0 client has the geoBoundingBoxQuery() method, whereas the 1.2.2 client provided the geoBoundingBoxFilter() method; the two are incompatible, because there was a big query/filter refactoring in ES 2.x.
As a rule of thumb, you should always run the same version of ES and the client library. In your case, there is a delta of several major versions between your server and your client.
You should definitely consider upgrading your ES cluster to at least 6.4.0, or downgrading your client to 1.x.
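To answer the follow-up about combining the two criteria: once server and client versions are aligned (say, both on 6.4.0), the usual approach is a bool query with the geo clause in the filter context. A sketch, reusing the field names from the question:

```java
import org.elasticsearch.index.query.BoolQueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.search.builder.SearchSourceBuilder;

// The terms clause contributes to scoring; the geo bounding box runs as a
// non-scoring filter, which is the idiomatic (and cacheable) place for it.
BoolQueryBuilder query = QueryBuilders.boolQuery()
        .must(QueryBuilders.termsQuery("zoneType", "test", "oms"))
        .filter(QueryBuilders.geoBoundingBoxQuery("loc")
                .setCorners(40.73, -74.1, 40.717, -73.99));

SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
searchSourceBuilder.query(query);
```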
I have a Maven-based interface, and I need to connect to an MSBI database and fetch the data into Mule.
I have added the required jar, sqljdbc42.jar, to the build path.
Below is the MSBI connection configuration:
<poll doc:name="Poll">
<schedulers:cron-scheduler expression="${msbi.poll.schedule.cdo}"/>
<db:select config-ref="MSBI_Database_Configuration" streaming="true" fetchSize="1000" doc:name="MSBI Select Contact CDO data">
<db:parameterized-query><![CDATA[SELECT [Cstmr_Acct_Id] AS SAP_Account_ID1
,[Accnt_Nm] AS CRM_Account_Name1
,[Accnt_Type] AS APL_Account_Attributes___Account_Type1
,[Cstmr_Sgmnt] AS CRM_Customer_Segment1
,[Trnprttn_role] AS CRM_Transportation_Role1
,CASE WHEN [Accnt_Stts]='A' THEN 'Active' WHEN [Accnt_Stts]='I' THEN 'Inactive' ELSE NULL END AS CRM_Account_Status1
,[Rgn] AS Region1
,[Rgn_desc] AS Region_Text1
,[Clster] AS Cluster1
,[Clster_desc] AS Cluster_Text1
,[Dstrct] AS District1
,[Dstrct_desc] AS District_Text1
,[Cntry] AS Country1
,[Cntry_desc] AS Country_Text1
,[Brnch] AS Branch1
,[Brnch_desc] AS Branch_Text1
,[Trrtry] AS Territory1
,[Trrtry_desc] AS Territory_Desc1
,[New_BT_Cd] AS BT_Code1
,[Lgcy_BT_Cd] AS Legacy_BTCode1
,[Last_Update_Dt] AS LAST_UPDATE_Date1
FROM [dbo].[vw_elqa_CRM_Accnt_Sales_Hierarchy]
WHERE
(
CAST(Last_Update_Dt AS date) >= CAST(GETDATE() AS date) OR
CAST(Last_Update_Dt AS date) >= CAST(#[server.systemProperties['mule.env']=='dev'?server.systemProperties['msbi.debug.csr.query.filterDate']:'2100-01-01'] AS date)
) AND Cstmr_Acct_Id IS NOT NULL]]></db:parameterized-query>
</db:select>
</poll>
When I run the interface, it deploys successfully, but when the poll triggers, MSBI throws the error below:
ERROR 2017-10-30 20:58:26,792 [nol-integration-v1.3-polling://MSBItoEloquaContactCDODataUpdate/541182371_Worker-1] org.mule.exception.DefaultSystemExceptionStrategy:
********************************************************************************
Message : org.mule.module.db.internal.domain.connection.ConnectionCreationException: Error trying to load driver: com.microsoft.sqlserver.jdbc.SQLServerDriver : Cannot load class 'com.microsoft.sqlserver.jdbc.SQLServerDriver' (java.sql.SQLException)
Element : /MSBI_Database_Configuration @ app:bulk-integration.xml:26 (Generic Database Configuration)
--------------------------------------------------------------------------------
Root Exception stack trace:
java.sql.SQLException: Error trying to load driver: com.microsoft.sqlserver.jdbc.SQLServerDriver : Cannot load class 'com.microsoft.sqlserver.jdbc.SQLServerDriver'
at org.enhydra.jdbc.standard.StandardDataSource.getConnection(StandardDataSource.java:184)
at org.enhydra.jdbc.standard.StandardDataSource.getConnection(StandardDataSource.java:144)
at org.mule.module.db.internal.domain.connection.SimpleConnectionFactory.doCreateConnection(SimpleConnectionFactory.java:30)
at org.mule.module.db.internal.domain.connection.AbstractConnectionFactory.create(AbstractConnectionFactory.java:23)
at org.mule.module.db.internal.domain.connection.TransactionalDbConnectionFactory.createDataSourceConnection(TransactionalDbConnectionFactory.java:84)
at org.mule.module.db.internal.domain.connection.TransactionalDbConnectionFactory.createConnection(TransactionalDbConnectionFactory.java:53)
at org.mule.module.db.internal.processor.AbstractDbMessageProcessor.process(AbstractDbMessageProcessor.java:72)
at org.mule.transport.polling.MessageProcessorPollingMessageReceiver$1.process(MessageProcessorPollingMessageReceiver.java:165)
at org.mule.transport.polling.MessageProcessorPollingMessageReceiver$1.process(MessageProcessorPollingMessageReceiver.java:149)
at org.mule.execution.ExecuteCallbackInterceptor.execute(ExecuteCallbackInterceptor.java:16)
at org.mule.execution.CommitTransactionInterceptor.execute(CommitTransactionInterceptor.java:35)
at org.mule.execution.CommitTransactionInterceptor.execute(CommitTransactionInterceptor.java:22)
at org.mule.execution.HandleExceptionInterceptor.execute(HandleExceptionInterceptor.java:30)
at org.mule.execution.HandleExceptionInterceptor.execute(HandleExceptionInterceptor.java:14)
at org.mule.execution.BeginAndResolveTransactionInterceptor.execute(BeginAndResolveTransactionInterceptor.java:67)
at org.mule.execution.ResolvePreviousTransactionInterceptor.execute(ResolvePreviousTransactionInterceptor.java:44)
at org.mule.execution.SuspendXaTransactionInterceptor.execute(SuspendXaTransactionInterceptor.java:50)
at org.mule.execution.ValidateTransactionalStateInterceptor.execute(ValidateTransactionalStateInterceptor.java:40)
at org.mule.execution.IsolateCurrentTransactionInterceptor.execute(IsolateCurrentTransactionInterceptor.java:41)
at org.mule.execution.ExternalTransactionInterceptor.execute(ExternalTransactionInterceptor.java:48)
at org.mule.execution.RethrowExceptionInterceptor.execute(RethrowExceptionInterceptor.java:28)
at org.mule.execution.RethrowExceptionInterceptor.execute(RethrowExceptionInterceptor.java:13)
at org.mule.execution.TransactionalErrorHandlingExecutionTemplate.execute(TransactionalErrorHandlingExecutionTemplate.java:110)
at org.mule.execution.TransactionalErrorHandlingExecutionTemplate.execute(TransactionalErrorHandlingExecutionTemplate.java:30)
at org.mule.transport.polling.MessageProcessorPollingMessageReceiver.pollWith(MessageProcessorPollingMessageReceiver.java:148)
at org.mule.transport.polling.MessageProcessorPollingMessageReceiver.poll(MessageProcessorPollingMessageReceiver.java:139)
at org.mule.transport.AbstractPollingMessageReceiver.performPoll(AbstractPollingMessageReceiver.java:216)
at org.mule.transport.PollingReceiverWorker.poll(PollingReceiverWorker.java:84)
at org.mule.transport.PollingReceiverWorker.run(PollingReceiverWorker.java:48)
at org.mule.modules.schedulers.cron.CronJob.execute(CronJob.java:33)
at org.quartz.core.JobRunShell.run(JobRunShell.java:202)
at org.quartz.simpl.SimpleThreadPool$WorkerThread.run(SimpleThreadPool.java:573)
I am getting the error below after adding the dependency:
ERROR
********************************************************************************
Message : Response code 500 mapped as failure.
Payload : org.glassfish.grizzly.utils.BufferInputStream@cb5d5af
Payload Type : org.mule.module.db.internal.result.resultset.ResultSetIterator
Element : /MSBItoEloquaContactCDODataUpdate/input/0/0/EloquaLookupContactsCDOBulk/subprocessors/1/EloquaLookupFields/subprocessors/0/0/1/2 @ nol-integration-v1:bulk-integration.xml:92 (Eloqua Get CDO fields)
Element XML : <http:request config-ref="Eloqua_Bulk_API" path="/customObjects/{customObjectId}/fields" method="GET" doc:name="Eloqua Get CDO fields">
<http:request-builder>
<http:uri-param paramName="customObjectId" value="#[flowVars.cdo.id]"></http:uri-param>
</http:request-builder>
</http:request>
--------------------------------------------------------------------------------
Root Exception stack trace:
org.mule.module.http.internal.request.ResponseValidatorException: Response code 500 mapped as failure.
at org.mule.module.http.internal.request.SuccessStatusCodeValidator.validate(SuccessStatusCodeValidator.java:37)
at org.mule.module.http.internal.request.DefaultHttpRequester.validateResponse(DefaultHttpRequester.java:413)
at org.mule.module.http.internal.request.DefaultHttpRequester.innerProcess(DefaultHttpRequester.java:401)
at org.mule.module.http.internal.request.DefaultHttpRequester.processBlocking(DefaultHttpRequester.java:221)
at org.mule.processor.AbstractNonBlockingMessageProcessor.process(AbstractNonBlockingMessageProcessor.java:43)
It seems sqljdbc4.jar is not found at runtime.
If you are using a Maven build (pom.xml), add the dependency to the pom.xml:
<dependency>
    <groupId>com.microsoft.sqlserver</groupId>
    <artifactId>sqljdbc4</artifactId>
    <version>4.0</version>
</dependency>
Note, just for information: if a jar is not in the Maven repository your pom.xml refers to, you need to add it yourself to your local or company repository. To add it to your local repository:
1. Check that you have the correct version of the driver jar.
2. Execute the command below to add it to the local Maven repository:
mvn install:install-file -Dfile=<jar_name>.jar -DgroupId=<group_id_of_jar> -DartifactId=<artifact_id_of_jar> -Dversion=<version_of_jar> -Dpackaging=jar
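For this particular driver, using the jar name from the question and the coordinates from the dependency above, the filled-in command would look like:

```shell
# Install the local SQL Server JDBC driver jar into the local Maven repository
# so the com.microsoft.sqlserver:sqljdbc4:4.0 dependency can resolve.
mvn install:install-file \
  -Dfile=sqljdbc42.jar \
  -DgroupId=com.microsoft.sqlserver \
  -DartifactId=sqljdbc4 \
  -Dversion=4.0 \
  -Dpackaging=jar
```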
I'm using MuleSoft, and using the answer above and @Mahesh_Loya's comment, I was able to get this working.
Add the dependency to pom.xml, just before the closing </dependencies> tag:
<dependency>
    <groupId>com.microsoft.sqlserver</groupId>
    <artifactId>sqljdbc4</artifactId>
    <version>4.0</version>
</dependency>
Add the Clojars repo to pom.xml, just before the closing </repositories> tag:
<repository>
    <id>Clojars</id>
    <name>Clojars</name>
    <url>http://clojars.org/repo/</url>
    <layout>default</layout>
</repository>
Save pom.xml and rerun Maven, and all should be working. Happy times.
I want to run a simple word-count example with MapReduce, but I get this error and have no idea how to solve it:
Exception in thread "main" java.lang.VerifyError: Bad type on operand stack
Exception Details:
Location:
org/apache/hadoop/mapred/JobTrackerInstrumentation.create(Lorg/apache/hadoop/mapred/JobTracker;Lorg/apache/hadoop/mapred/JobConf;)Lorg/apache/hadoop/mapred/JobTrackerInstrumentation; @5: invokestatic
Reason:
Type 'org/apache/hadoop/metrics2/lib/DefaultMetricsSystem' (current frame, stack[2]) is not assignable to 'org/apache/hadoop/metrics2/MetricsSystem'
Current Frame:
bci: @5
flags: { }
locals: { 'org/apache/hadoop/mapred/JobTracker', 'org/apache/hadoop/mapred/JobConf' }
stack: { 'org/apache/hadoop/mapred/JobTracker', 'org/apache/hadoop/mapred/JobConf', 'org/apache/hadoop/metrics2/lib/DefaultMetricsSystem' }
Bytecode:
0000000: 2a2b b200 03b8 0004 b0
I had the same problem and it got solved by removing some unneeded references in Maven (hadoop-common and hadoop-hdfs).
I'm using Hadoop 2.2.0 on Windows, connecting to a single-node Linux Hadoop cluster.
The following dependency order solved the problem for me:
hadoop-core 1.2.1
hadoop-common 2.6.0
The dependencies below worked for me:
<dependencies>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-core</artifactId>
        <version>1.2.1</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <version>2.6.0</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-client</artifactId>
        <version>2.6.0</version>
    </dependency>
</dependencies>