Spring XD properties-location attribute

I tried to create a stream with this definition:
stream create --name test3 --definition "jms --destination=test3 --pubSub=true --subscriptionName=test3 --durableSubscription=true --clientId=cbda83 --outputType=text/plain | filter --script=file:/opt/bidpwr/app_config/spring_xd/fleet/startstop/ready1.groovy --properties-location=file:/opt/bidpwr/app_config/spring_xd/fleet/common/fleet_config.properties --inputType=application/json --outputType=application/json | log"
And it threw this error:
Command failed org.springframework.xd.rest.client.impl.SpringXDException: Error with option(s) for module filter of type processor:
properties-location: option named 'properties-location' is not supported
I went through the documentation, but couldn't figure out why it's failing.
When I run module info, the --properties-location option isn't listed either:
xd:>module info processor:filter
Information about processor module 'filter':
Option Name  Description                                                        Default  Type
-----------  -----------------------------------------------------------------  -------  --------
expression   a SpEL expression to evaluate as a predicate                       <none>   String
script       location of a groovy script to use as a predicate to the filter    <none>   String
outputType   how this module should emit messages it produces                   <none>   MimeType
inputType    how this module should interpret messages it consumes              <none>   MimeType
Any idea how to solve this?
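For what it's worth, filter only advertises the four options shown above, so dropping the unsupported option should at least get the definition past this particular error (a minimal sketch reusing the original paths; the Groovy script would then have to obtain whatever it needs from that properties file by some other means):
stream create --name test3 --definition "jms --destination=test3 --pubSub=true --subscriptionName=test3 --durableSubscription=true --clientId=cbda83 --outputType=text/plain | filter --script=file:/opt/bidpwr/app_config/spring_xd/fleet/startstop/ready1.groovy --inputType=application/json --outputType=application/json | log"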

Related

Docker-compose string interpolation causes type conversion issue in spring boot project

I am running a stack of services/applications and would like to deploy them using a single docker-compose file. I tested this and it works flawlessly. But when I try to make the docker-compose file more configurable by having an .env file control the configuration, the Spring Boot project being deployed gives me the following error:
api | Failed to bind properties under 'spring.data.mongodb.port' to java.lang.Integer:
api |
api | Property: spring.data.mongodb.port
api | Value: '27017'
api | Origin: System Environment Property "spring.data.mongodb.port"
api | Reason: failed to convert java.lang.String to java.lang.Integer (caused by java.lang.NumberFormatException: For input string: "'27017'")
The env file I am using:
SPRING_DATA_MONGODB_PORT=27017
How the back end picks up the configuration in Spring Boot's application.properties file:
spring.data.mongodb.port=27017
The problem is clear, but how do I make sure that when the environment variable comes in, the '27017' arriving as a String is converted to an Integer?
Thanks!
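The Value: '27017' in the error shows that the quotes have become part of the environment variable itself, so the fix is to keep literal quotes out of the value rather than to cast anything in Spring Boot. A quick way to see what actually reaches the container (a sketch; api is the service name taken from the log prefix above):
# show the compose file after .env interpolation
docker-compose config
# inspect the environment inside the running container
docker-compose exec api env | grep SPRING_DATA_MONGODB_PORT
# expected: SPRING_DATA_MONGODB_PORT=27017 (no surrounding quotes)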

Spring XD error creating streams with a composite module containing a shell processor

I have created a composite module:
module compose common-module --definition "kafka --topic=topic1 --outputType=text/plain | shell --command='script1.sh' "
I then created a stream using this module:
stream create stream1 --definition "common-module > queue:job:job1"
And I got the following error:
Command failed org.springframework.xd.rest.client.impl.SpringXDException:
Error with option(s) for module common-module of type source:
command: may not be null
command: may not be empty
Does anyone know what's going on? Thanks!
It's a bug; I opened a JIRA issue.
The only workaround I can think of (short of creating a custom shell module; see the JIRA) is to pass in the script again:
stream create stream1 --definition "common-module --shell.script=script1.sh > queue:job:job1"

WARN Error while fetching metadata with correlation id 1 : {MY_TOPIC?=INVALID_TOPIC_EXCEPTION} (org.apache.kafka.clients.NetworkClient)

When I run the following command against Kafka 0.9.0.1, I get the warnings below[1]. Can you please tell me what is wrong with my topic? (I'm talking to a Kafka broker running in EC2.)
./kafka-console-consumer.sh --new-consumer --bootstrap-server kafka.xx.com:9092 --topic MY_TOPIC?
[1]
[2016-04-06 10:57:45,839] WARN Error while fetching metadata with correlation id 1 : {MY_TOPIC?=INVALID_TOPIC_EXCEPTION} (org.apache.kafka.clients.NetworkClient)
[2016-04-06 10:57:46,066] WARN Error while fetching metadata with correlation id 3 : {MY_TOPIC?=INVALID_TOPIC_EXCEPTION} (org.apache.kafka.clients.NetworkClient)
[2016-04-06 10:57:46,188] WARN Error while fetching metadata with correlation id 5 : {MY_TOPIC?=INVALID_TOPIC_EXCEPTION} (org.apache.kafka.clients.NetworkClient)
[2016-04-06 10:57:46,311] WARN Error while fetching metadata with correlation id 7 : {MY_TOPIC?=INVALID_TOPIC_EXCEPTION} (org.apache.kafka.clients.NetworkClient)
Your topic name is not valid because it contains the character '?', which is not a legal character for topic names.
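With the '?' removed from the topic name, the same command should stop producing the warning (assuming a topic named MY_TOPIC actually exists on the broker):
./kafka-console-consumer.sh --new-consumer --bootstrap-server kafka.xx.com:9092 --topic MY_TOPIC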
I got the same error. In my case, the problem was the spaces between the comma-separated topics in my code:
#source(type='kafka',
topic.list="p1, p2, p3",
partition.no.list='0',
threading.option='single.thread',
group.id="group",
bootstrap.servers='kafka:9092',
#map(type='json')
)
I finally found the solution (no spaces between the topic names):
#source(type='kafka',
topic.list="p1,p2,p3",
partition.no.list='0',
threading.option='single.thread',
group.id="group",
bootstrap.servers='kafka:9092',
#map(type='json')
)
This happens when the producer is not able to produce to the advertised address. Check the value of advertised.listeners in kafka/config/server.properties.
If it is commented out, there are other issues; if it is not, put your IP address in place of localhost and then restart both ZooKeeper and Kafka.
Then try starting the console producer again; hopefully it will work.
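For reference, the relevant line in kafka/config/server.properties would look something like this (the address is a placeholder; use the broker's actual IP or hostname instead of localhost):
# config/server.properties
advertised.listeners=PLAINTEXT://<your.broker.ip>:9092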
Just in case anyone is having this issue related to a comma (",") with a Logstash output to Kafka or a calculated topic name:
In the topic_id of the Logstash output to Kafka, we tried to build the topic name by appending a field we calculated in the filter.
The problem was that this field was already present in the source document, and we later added it "again" in the Logstash filter, converting the string field into a hash (array/list).
So, because the Logstash output was configured as
topic_id => ["topicName_%{field}"]
we ended up with:
topic_id : "topicName_fieldItem1,FieldItem2"
which caused this exception in the Logstash logs:
[WARN ][org.apache.kafka.clients.NetworkClient] [Producer clientId=logstash] Error while fetching metadata with correlation id 3605264 : {topicName_fieldItem1,FieldItem2=INVALID_TOPIC_EXCEPTION}
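Since the root cause was the field being added a second time (which turns a string field into an array), one hedged fix is to overwrite the field instead of adding it, for example with a mutate/replace filter, so the interpolated topic name stays a single string (the field name and value here are placeholders):
filter {
  mutate {
    # replace overwrites an existing field; add_field on an existing field turns it into an array
    replace => { "field" => "calculatedValue" }
  }
}
output {
  kafka {
    # now interpolates to a single string such as topicName_calculatedValue
    topic_id => "topicName_%{field}"
  }
}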

Spring XD missing modules

I installed spring-xd-1.2.1.RELEASE and started Spring XD in single-node mode. When I type the following command
xd:>stream create --definition "time | log" --name ticktock --deploy
I get the following result:
Command failed org.springframework.xd.rest.client.impl.SpringXDException: Could not find module with name 'log' and type 'sink'
When I type the following command:
xd:> module list
I get the following result:
Source         Processor  Sink                 Job
-------------  ---------  -------------------  --------
gemfire                   gemfire-json-server  filejdbc
gemfire-cq                gemfire-server       hdfsjdbc
jdbc                      jdbc                 jdbchdfs
kafka                     rabbit               sqoop
rabbit                    redis
twittersearch
twitterstream
Some default modules seem to be missing. What happened? Is there any other configuration to set before starting Spring XD?
Check XD_HOME/modules/sink/log: does this folder exist?
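A quick way to check from the command line (a sketch; XD_HOME is assumed to point at the directory that contains the out-of-the-box modules folder):
# does the log sink module exist?
ls $XD_HOME/modules/sink/log
# and which sinks are present overall?
ls $XD_HOME/modules/sink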

ICC Error 1072896749

I am getting this error (in the task route log) while trying to ingest documents from ICC (IBM Content Collector v2.1.x) with an XML metadata file. Can anyone shed more light on the "Whitespace is not allowed at this location." error?
2011-09-12T18:39:37Z Error An error occurred while evaluating the task route 'M1 TR Docs': Task Method 'ibm.ctms.filesystem.metadata' failed for entity with id 'd:\icc_migration\conventional_pm\test.xml': Status=error; Message='Error -1072896749 at 22:22 - "Whitespace is not allowed at this location." - D:\ICC_Migration\Conventional_PM\test.xml'Reason: Task Method 'ibm.ctms.filesystem.metadata' failed for entity with id 'd:\icc_migration\conventional_pm\test.xml': Status=error; Message='Error -1072896749 at 22:22 - "Whitespace is not allowed at this location." - D:\ICC_Migration\Conventional_PM\test.xml' ibm::ctms::taskrouting::TaskRouteEvaluator::SubmitRoute (taskrouteevaluator.cpp:427) 0x820 Stack Trace: (class ibm::ctms::taskrouting::TaskStatusException) at ibm::ctms::taskrouting::TaskRoutingException::TaskRoutingException (taskrouting.cpp:11), at ibm::ctms::taskrouting::TaskStatusException::TaskStatusException (taskrouting.cpp:88), at ibm::ctms::taskrouting::TaskMethodManager::checkTaskStatus (taskmethodmanager.cpp:610), at ibm::ctms::taskrouting::TaskMethodManager::InvokeTaskMethod (taskmethodmanager.cpp:730), at ibm::ctms::taskrouting::TaskRouteEvaluator::invokeTaskMethod (taskrouteevaluator.cpp:255), at ibm::ctms::taskrouting::TaskRouteEvaluator::SubmitRoute (taskrouteevaluator.cpp:375), at ibm::ctms::taskrouting::TaskRouteEngine::SubmitEntity (taskrouteengine.cpp:475), at ibm::ctms::taskrouting::SubmissionTask::Execute (submissiontask.cpp:44), at ibm::ctms::core::threads::ThreadPoolImplementation::TaskThread::Execute (threadpool.cpp:214), at ATL::CThreadPool<ibm::ctms::core::threads::ThreadPoolImplementation::TaskThread,ATL::CRTThreadTraits,ATL::Win32WaitTraits>::ThreadProc (atlutil.h:1386), at ATL::CThreadPool<ibm::ctms::core::threads::ThreadPoolImplementation::TaskThread,ATL::CRTThreadTraits,ATL::Win32WaitTraits>::WorkerThreadProc (atlutil.h:1404), at MSVCR80.dll:0x29ba, at MSVCR80.dll:0x2a46,
I figured it out. The XML I was trying to use had an element like this:
<Company name>Test & Company</Company Name>
And of course, it did not like the bare "&" there. I enclosed the value in a CDATA section and it was fixed. What was mysterious, though, was the disconnect between the type of error and the error message ICC was throwing.
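For anyone hitting the same thing, the two usual fixes look like this (the element name is illustrative; note that an element name may not contain a space either):
<!-- escape the ampersand -->
<CompanyName>Test &amp; Company</CompanyName>
<!-- or wrap the value in a CDATA section, as described above -->
<CompanyName><![CDATA[Test & Company]]></CompanyName>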
