what would cause this debezium kafka connector error? - apache-kafka-connect

I'm trying to connect MySQL --> Debezium --> Kafka (Confluent Cloud). I was wondering if someone can help me with this error message:
Failed to find any class that implements Connector and which name matches io.debezium.connector.mysql.MySqlConnector.
I have the Debezium jar files in the plugin path configured in my Kafka worker's properties file:
plugin.path=/home/ec2-user/kafka/plugins
The plugin folder contains the following:
-antlr4-runtime-4.7.2.jar
-debezium-api-1.5.3.Final.jar
-debezium-connector-mysql-1.5.3.Final.jar
-debezium-core-1.5.3.Final.jar
-debezium-ddl-parser-1.5.3.Final.jar
-failureaccess-1.0.1.jar
-guava-30.0-jre.jar
-mysql-binlog-connector-java-0.25.1.jar
-mysql-connector-java-8.0.21.jar
When I start up the distributed worker, I see that the plugins are added:
[2021-06-24 23:01:54,680] INFO Registered loader: PluginClassLoader{pluginLocation=file:/home/ec2-user/kafka/plugins/debezium-api-1.5.3.Final.jar}
[2021-06-24 23:01:54,684] INFO Loading plugin from: /home/ec2-user/kafka/plugins/debezium-connector-mysql-1.5.3.Final.jar
[2021-06-24 23:01:54,744] INFO Registered loader: PluginClassLoader{pluginLocation=file:/home/ec2-user/kafka/plugins/debezium-connector-mysql-
[2021-06-24 23:01:54,745] INFO Added plugin 'io.debezium.connector.mysql.transforms.ReadToInsertEvent'
[2021-06-24 23:01:54,745] INFO Loading plugin from: /home/ec2-user/kafka/plugins/debezium-core-1.5.3.Final.jar
[2021-06-24 23:01:54,866] INFO Registered loader: PluginClassLoader{pluginLocation=file:/home/ec2-user/kafka/plugins/debezium-core-1.5.3.Final.jar}
[2021-06-24 23:01:54,867] INFO Added plugin 'io.debezium.converters.CloudEventsConverter'
[2021-06-24 23:01:54,871] INFO Added plugin 'io.debezium.transforms.outbox.EventRouter'
[2021-06-24 23:01:54,872] INFO Added plugin 'io.debezium.transforms.ExtractNewRecordState'
[2021-06-24 23:01:54,872] INFO Added plugin 'io.debezium.transforms.ByLogicalTableRouter'
[2021-06-24 23:01:54,873] INFO Added plugin 'io.debezium.transforms.tracing.ActivateTracingSpan'
[2021-06-24 23:01:54,873] INFO Loading plugin from: /home/ec2-user/kafka/plugins/debezium-ddl-parser-1.5.3.Final.jar
[2021-06-24 23:01:55,060] INFO Registered loader: PluginClassLoader{pluginLocation=file:/home/ec2-user/kafka/plugins/debezium-ddl-parser-1.5.3.Final.jar}
[2021-06-24 23:01:55,061] INFO Loading plugin from: /home/ec2-user/kafka/plugins/failureaccess-1.0.1.jar
[2021-06-24 23:01:55,069] INFO Registered loader: PluginClassLoader{pluginLocation=file:/home/ec2-user/kafka/plugins/failureaccess-1.0.1.jar}
[2021-06-24 23:01:55,070] INFO Loading plugin from: /home/ec2-user/kafka/plugins/guava-30.0-jre.jar
[2021-06-24 23:01:55,307] INFO Registered loader: PluginClassLoader{pluginLocation=file:/home/ec2-user/kafka/plugins/guava-30.0-jre.jar}
However, when I run my curl command:
curl -s -X POST -H 'Content-Type: application/json' --data @debe-connector.json http://localhost:8083/connectors -v
I get the error:
Failed to find any class that implements Connector and which name matches io.debezium.connector.mysql.MySqlConnector.
Any idea what I might be missing?
Thanks,

You need to put all the jar files of the Debezium MySQL connector inside their own subdirectory, e.g.
/home/ec2-user/kafka/plugins/debezium-connector-mysql
instead of directly under
/home/ec2-user/kafka/plugins
When the jars sit directly on plugin.path, each jar gets its own isolated plugin class loader, so the connector jar cannot see its dependencies (debezium-core, the binlog client, and so on); grouping them in one subdirectory puts them all on the same class loader.
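For example, a minimal sketch using the paths from the question:
mkdir -p /home/ec2-user/kafka/plugins/debezium-connector-mysql
mv /home/ec2-user/kafka/plugins/*.jar /home/ec2-user/kafka/plugins/debezium-connector-mysql/
# restart the Connect worker afterwards so it rescans plugin.path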

You can use the curl command
curl -X GET http://localhost:8083/connector-plugins
to see which plugins are installed correctly. If Debezium is not listed there, that means there is a problem with the installation. You may also want to check file permissions.
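If jq is available (an extra tool, not required), a convenient way to scan just the class names:
curl -s http://localhost:8083/connector-plugins | jq '.[].class'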

Related

HBase - Jmeter Load test gives - check the value configured in 'zookeeper.znode.parent'

I am trying to load test HBase using JMeter.
In the jmeter.log file I can see that the hbase-site.xml file is getting picked up. Following is the log snippet:
2021-02-10 10:27:05,791 INFO o.a.j.JMeter: Loading user properties from: user.properties
2021-02-10 10:27:05,791 INFO o.a.j.JMeter: Loading system properties from: system.properties
2021-02-10 10:27:05,792 INFO o.a.j.JMeter: Setting System property: javax.security.auth.useSubjectCredsOnly=false
2021-02-10 10:27:05,792 INFO o.a.j.JMeter: Setting System property: java.security.krb5.conf=/etc/krb5.conf
2021-02-10 10:27:05,792 INFO o.a.j.JMeter: Setting System property: java.security.auth.login.config=/home/svctranhist/gss-jaas.conf
2021-02-10 10:27:05,792 INFO o.a.j.JMeter: Setting System properties from file: /etc/hbase/conf/hbase-site.xml
2021-02-10 10:27:05,804 INFO o.a.j.JMeter: Copyright (c) 1998-2021 The Apache Software Foundation
2021-02-10 10:27:05,805 INFO o.a.j.JMeter: Version 5.4.1
And hbase-site.xml has the zookeeper.znode.parent property. But on running the jmx file I get the following error:
2021-02-10 09:16:08,969 ERROR o.a.h.h.z.ZooKeeperNodeTracker: Check the value configured in 'zookeeper.znode.parent'. There could be a mismatch with the one configured in the master.
I am using the following command to run the JMX file:
./jmeter -Djavax.security.auth.useSubjectCredsOnly=false -Djava.security.krb5.conf=/etc/krb5.conf -Djava.security.auth.login.config=/home/svctranhist/gss-jaas.conf -S /etc/hbase/conf/hbase-site.xml -n -t ../HBase_Load_Test_Plan.jmx -l ./hbase_50.csv
Am I missing something? Do I need to explicitly set zookeeper.znode.parent? I am stuck on this. Can someone please help? Thanks in advance.
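One thing worth checking (not from the original thread) is which parent znode HBase actually registered in ZooKeeper. A sketch, assuming you can reach the cluster's ZooKeeper quorum (<zk_host> is a placeholder):
zkCli.sh -server <zk_host>:2181 ls /
# look for /hbase (the default) vs. /hbase-unsecure (common on HDP);
# zookeeper.znode.parent in hbase-site.xml must match what the master uses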

Debezium Kafka connector mongodb : Error connecting kafka connector to mongodb

Below is my MongoDB config in /etc/kafka/connect-mongodb-source.properties:
name=mongodb-source-connector
connector.class=io.debezium.connector.mongodb.MongoDbConnector
mongodb.hosts=/remoteserveraddress:27017
mongodb.name=mongo_conn
initial.sync.max.threads=1
tasks.max=1
but I am getting the below error:
ERROR Plugin class loader for connector: 'io.debezium.connector.mongodb.MongoDbConnector' was not found. Returning: org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader@5a058be5 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:165)
Running connector in standalone mode.
I have the MongoDbConnector class under debezium-debezium-connector-mongodb-1.0.0/debezium-connector-mongodb-1.0.0.Final.jar,
and the classpath is set as follows:
#for CLASSPATH
CLASSPATH=/Users/111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/*
export CLASSPATH
PATH=$PATH:/usr/local/sbin
export PATH
Using the plugin path, I see it's able to register and load all required plugins:
[2020-01-10 08:14:07,916] INFO Scanning for plugin classes. This might take a moment ... (org.apache.kafka.connect.cli.ConnectStandalone:78)
[2020-01-10 08:14:07,942] INFO Loading plugin from: /Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/mongodb-driver-3.11.1.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2020-01-10 08:14:08,082] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/mongodb-driver-3.11.1.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:264)
[2020-01-10 08:14:08,083] INFO Added plugin 'org.apache.kafka.connect.connector.policy.AllConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,083] INFO Added plugin 'org.apache.kafka.connect.connector.policy.PrincipalConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,083] INFO Added plugin 'org.apache.kafka.connect.connector.policy.NoneConnectorClientConfigOverridePolicy' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,085] INFO Loading plugin from: /Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/debezium-connector-mongodb-1.0.0.Final.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2020-01-10 08:14:08,120] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/debezium-connector-mongodb-1.0.0.Final.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:264)
[2020-01-10 08:14:08,121] INFO Added plugin 'io.debezium.connector.mongodb.MongoDbConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,121] INFO Added plugin 'io.debezium.connector.mongodb.transforms.ExtractNewDocumentState' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,121] INFO Added plugin 'io.debezium.connector.mongodb.transforms.UnwrapFromMongoDbEnvelope' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,122] INFO Loading plugin from: /Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/debezium-core-1.0.0.Final.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2020-01-10 08:14:08,198] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/debezium-core-1.0.0.Final.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:264)
[2020-01-10 08:14:08,198] INFO Added plugin 'io.debezium.converters.ByteBufferConverter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,199] INFO Added plugin 'io.debezium.transforms.UnwrapFromEnvelope' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,199] INFO Added plugin 'io.debezium.transforms.ExtractNewRecordState' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,199] INFO Added plugin 'io.debezium.transforms.outbox.EventRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,199] INFO Added plugin 'io.debezium.transforms.ByLogicalTableRouter' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,200] INFO Loading plugin from: /Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/mongo-kafka-0.2-all.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2020-01-10 08:14:08,340] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/mongo-kafka-0.2-all.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:264)
[2020-01-10 08:14:08,340] INFO Added plugin 'com.mongodb.kafka.connect.MongoSourceConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,340] INFO Added plugin 'com.mongodb.kafka.connect.MongoSinkConnector' (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:193)
[2020-01-10 08:14:08,341] INFO Loading plugin from: /Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/bson-3.11.1.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2020-01-10 08:14:08,373] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/bson-3.11.1.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:264)
[2020-01-10 08:14:08,373] INFO Loading plugin from: /Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/mongodb-driver-core-3.11.1.jar (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:241)
[2020-01-10 08:14:08,465] INFO Registered loader: PluginClassLoader{pluginLocation=file:/Users/1111111/workspace/KafkaConnect/confluent-5.3.2/debezium-debezium-connector-mongodb-1.0.0/mongodb-driver-core-3.11.1.jar} (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:264)
but at the end I hit the same issue again:
[2020-01-10 08:40:43,613] ERROR Plugin class loader for connector: 'io.debezium.connector.mongodb.MongoDbConnector' was not found. Returning: org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader@33f2df51 (org.apache.kafka.connect.runtime.isolation.DelegatingClassLoader:165)
[2020-01-10 08:40:43,809] ERROR Stopping after connector error (org.apache.kafka.connect.cli.ConnectStandalone:121)
java.lang.IllegalArgumentException: username can not be null
at com.mongodb.MongoCredential.<init>(MongoCredential.java:350)
at com.mongodb.MongoCredential.<init>(MongoCredential.java:344)
at com.mongodb.MongoCredential.createCredential(MongoCredential.java:169)
at io.debezium.connector.mongodb.ConnectionContext.<init>(ConnectionContext.java:69)
at io.debezium.connector.mongodb.MongoDbConnector.validate(MongoDbConnector.java:222)
at org.apache.kafka.connect.runtime.AbstractHerder.validateConnectorConfig(AbstractHerder.java:313)
at org.apache.kafka.connect.runtime.standalone.StandaloneHerder.putConnectorConfig(StandaloneHerder.java:192)
at org.apache.kafka.connect.cli.ConnectStandalone.main(ConnectStandalone.java:115)
[2020-01-10 08:40:43,810] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:66)
Output of curl localhost:8083/connector-plugins:
[{"class":"com.mongodb.kafka.connect.MongoSinkConnector","type":"sink","version":"0.2"},{"class":"com.mongodb.kafka.connect.MongoSourceConnector","type":"source","version":"0.2"},{"class":"io.confluent.connect.activemq.ActiveMQSourceConnector","type":"source","version":"5.3.2"},{"class":"io.confluent.connect.elasticsearch.ElasticsearchSinkConnector","type":"sink","version":"5.3.2"},{"class":"io.confluent.connect.ibm.mq.IbmMQSourceConnector","type":"source","version":"5.3.2"},{"class":"io.confluent.connect.jdbc.JdbcSinkConnector","type":"sink","version":"5.3.2"},{"class":"io.confluent.connect.jdbc.JdbcSourceConnector","type":"source","version":"5.3.2"},{"class":"io.confluent.connect.jms.JmsSourceConnector","type":"source","version":"5.3.2"},{"class":"io.confluent.connect.replicator.ReplicatorSourceConnector","type":"source","version":"5.3.2"},{"class":"io.confluent.connect.s3.S3SinkConnector","type":"sink","version":"5.3.2"},{"class":"io.confluent.connect.storage.tools.SchemaSourceConnector","type":"source","version":"5.3.2-ce"},{"class":"io.debezium.connector.mongodb.MongoDbConnector","type":"source","version":"1.0.0.Final"},{"class":"org.apache.kafka.connect.file.FileStr
thanks in advance
First of all, please check the installation of your plugin using the Kafka Connect REST interface (the connector-plugins endpoint shown above).
Try to install Kafka Connect plugins using the plugin.path mechanism instead of CLASSPATH (see the Kafka Connect documentation for details).
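For example, a minimal sketch of the worker properties, assuming the connector directory is moved under a dedicated plugins directory (the path below is hypothetical):
# in connect-standalone.properties
plugin.path=/Users/111111/workspace/KafkaConnect/plugins
# move debezium-debezium-connector-mongodb-1.0.0/ under that directory,
# and unset CLASSPATH before starting the worker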
There were two issues:
1. The Debezium connector has a hardcoded source='admin' and there is no way to override it, and the DB server I was trying to connect to didn't have admin authentication set up.
2. The test server I was using was a standalone server with no replica set, so I had to create a replica set and add this server to it. The Debezium connector doesn't support standalone servers.
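A sketch of converting a standalone MongoDB server into a single-node replica set (assuming shell access to the server; rs0 is an arbitrary name):
# restart mongod with a replica set name, e.g. mongod --replSet rs0 ...
# then initiate the replica set from the mongo shell:
mongo --host remoteserveraddress:27017 --eval 'rs.initiate()'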

Cannot run SonarQube as Service

If I just run StartSonar.bat, everything starts and works fine.
If I set up a service to run using NSSM with:
Path: ..\blah\wrapper.exe
Startup directory: ..\blah
Arguments: -c ..\blah\conf\wrapper.conf
I get an error in the logs that says:
2019.06.24 16:03:49 INFO web[][o.s.p.ProcessEntryPoint] Starting web
2019.06.24 16:03:50 INFO web[][o.a.t.u.n.NioSelectorPool] Using a shared selector for servlet write/read
2019.06.24 16:03:51 INFO web[][o.e.p.PluginsService] no modules loaded
2019.06.24 16:03:51 INFO web[][o.e.p.PluginsService] loaded plugin [org.elasticsearch.join.ParentJoinPlugin]
2019.06.24 16:03:51 INFO web[][o.e.p.PluginsService] loaded plugin [org.elasticsearch.percolator.PercolatorPlugin]
2019.06.24 16:03:51 INFO web[][o.e.p.PluginsService] loaded plugin [org.elasticsearch.transport.Netty4Plugin]
2019.06.24 16:03:52 INFO web[][o.s.s.e.EsClientProvider] Connected to local Elasticsearch: [127.0.0.1:9001]
2019.06.24 16:03:52 INFO web[][o.s.s.p.LogServerVersion] SonarQube Server / 7.7.0.23042 / 1dcac8b8de36b377a1810cc8f1c4c31744e12729
2019.06.24 16:03:52 INFO web[][o.sonar.db.Database] Create JDBC data source for jdbc:sqlserver://BRKPRCSQUBE1;databaseName=sonar;integratedSecurity=true
2019.06.24 16:03:53 ERROR web[][o.s.s.p.Platform] Web server startup failed
java.lang.IllegalStateException: Can not connect to database. Please check connectivity and settings (see the properties prefixed by 'sonar.jdbc.').
That leads me to believe there is a problem with setting up the database connection, but once again, if I run StartSonar.bat everything works fine.
The only thing I can think of is that I am setting up the service under the "Local System account", whereas when I run the bat file it runs under my own account.
That said, I cannot use my own account for the service, because I would have to log in and update the service's stored password every time my AD password changes.
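If the problem is indeed the service account (integratedSecurity=true means SQL Server authenticates the Windows account the service runs as, and Local System likely has no database access), one common approach is a dedicated AD service account. A sketch, assuming the service is registered as "SonarQube" and DOMAIN\svc_sonar is a hypothetical account with access to the sonar database:
nssm set SonarQube ObjectName DOMAIN\svc_sonar <password>
nssm restart SonarQube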

Apache Nifi windows unable to load NAR library bundles

I'm only attempting to launch the NiFi UI as a local instance to start playing with it. I've unzipped the package and made sure to set the JAVA_HOME variable to my Java 1.8. When I try to run bin/run-nifi, the error message in my nifi-app log is:
2018-05-03 15:03:50,585 INFO [main] org.apache.nifi.NiFi Launching NiFi...
2018-05-03 15:03:52,330 INFO [main] o.a.nifi.properties.NiFiPropertiesLoader Determined default nifi.properties path to be 'Z:\DoE\LOCAL-~1\NIFI-1~1.0\.\conf\nifi.properties'
2018-05-03 15:03:52,363 INFO [main] o.a.nifi.properties.NiFiPropertiesLoader Loaded 146 properties from Z:\DoE\LOCAL-~1\NIFI-1~1.0\.\conf\nifi.properties
2018-05-03 15:03:52,423 INFO [main] org.apache.nifi.NiFi Loaded 146 properties
2018-05-03 15:03:52,779 INFO [main] org.apache.nifi.BootstrapListener Started Bootstrap Listener, Listening for incoming requests on port 64802
2018-05-03 15:03:53,071 INFO [main] org.apache.nifi.BootstrapListener Successfully initiated communication with Bootstrap
2018-05-03 15:03:53,181 WARN [main] org.apache.nifi.nar.NarUnpacker Unable to load NAR library bundles due to java.io.IOException: Z:\DoE\LOCAL-~1\NIFI-1~1.0\.\work\nar\framework directory does not have read/write privilege Will proceed without loading any further Nar bundles
2018-05-03 15:03:53,242 ERROR [main] org.apache.nifi.NiFi Failure to launch NiFi due to java.io.IOException: Z:\DoE\LOCAL-~1\NIFI-1~1.0\.\work\nar\framework could not be created
java.io.IOException: Z:\DoE\LOCAL-~1\NIFI-1~1.0\.\work\nar\framework could not be created
at org.apache.nifi.util.FileUtils.ensureDirectoryExistAndCanReadAndWrite(FileUtils.java:48)
at org.apache.nifi.nar.NarClassLoaders.load(NarClassLoaders.java:155)
at org.apache.nifi.nar.NarClassLoaders.init(NarClassLoaders.java:131)
at org.apache.nifi.NiFi.<init>(NiFi.java:133)
at org.apache.nifi.NiFi.<init>(NiFi.java:71)
at org.apache.nifi.NiFi.main(NiFi.java:292)
2018-05-03 15:03:53,383 INFO [Thread-1] org.apache.nifi.NiFi Initiating shutdown of Jetty web server...
2018-05-03 15:03:53,387 INFO [Thread-1] org.apache.nifi.NiFi Jetty web server shutdown completed (nicely or otherwise).
I've followed the installation instructions and haven't been able to troubleshoot the issue. How do I load these NAR files when running NiFi?
Thanks
I believe the underlying error in your output is java.io.IOException: Z:\DoE\LOCAL-~1\NIFI-1~1.0\.\work\nar\framework could not be created.
NiFi requires file permissions to create and write to several directories; there is a list in the NiFi Admin Guide: How to install and start NiFi. NiFi needs these to unpack the NAR files, write logs, and maintain the various data repositories that comprise your data flow.
You have a few options:
Modify the permissions of the directory to allow NiFi read/write access. This can be done for each individual child directory.
Copy the entire NiFi distribution to a read/write location and run it from there.
Edit the conf/nifi.properties file to change the locations of these directories to read/write locations (see the sketch after this list). See NiFi Admin Guide: System Properties for help on the properties.
Symlinks are a great solution for systems that support symlinks.
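For the nifi.properties option, a sketch of the relevant properties (the property names are from the NiFi Admin Guide; /path/to/writable is a placeholder for a read/write location):
nifi.nar.working.directory=/path/to/writable/work/nar/
nifi.flowfile.repository.directory=/path/to/writable/flowfile_repository
nifi.content.repository.directory.default=/path/to/writable/content_repository
nifi.provenance.repository.directory.default=/path/to/writable/provenance_repository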
Two things you can try:
Run NiFi with administrator privileges (not a good practice) by going to ~\<NIFI_INSTALLATION_DIR>\bin, right-clicking run-nifi.bat, and clicking Run as Administrator.
Move the NiFi directory to a location where the logged-in user has full access, e.g. C:\Users\<YOUR_USER>\Documents\. Then try to execute bin\run-nifi.bat again.
Similar to the resolution James proposed, I had to do the below 3-step process.
My scenario: I'm using Docker containers and had the same problem. Even changing the user of my container to root didn't work, so I did the following:
1 - Download MiNiFi: https://nifi.apache.org/minifi/download.html
2 - Untar and execute the MiNiFi agent on my own laptop (I'm using a Mac) so that the necessary folders and files get created.
3 - Tar it up again and add it to the Dockerfile of my container build
Done! Everything worked fine after that.

Flume NG not writing to HDFS

I'm new to using Flume and Hadoop, so I'm trying to set up the simplest (but somewhat helpful/realistic) example I can. I'm using the Hortonworks Sandbox in a VM client. After following a tutorial (which involves setting up and using Flume), everything seems to be working correctly.
So I set up my own flume.conf that should:
Read from an apache access log
Use a memory channel
Write to the HDFS
Simple enough, right? Here's my conf file:
agent.sources=exec-source
agent.sinks=hdfs-sink
agent.channels=ch1
agent.sources.exec-source.type=exec
agent.sources.exec-source.command=tail -F /var/log/httpd/access_log
agent.sinks.hdfs-sink.type=hdfs
agent.sinks.hdfs-sink.hdfs.path=/flume/events
agent.sinks.hdfs-sink.hdfs.filePrefix=apacheaccess
agent.sinks.hdfs-sink.hdfs.rollInterval=10
agent.sinks.hdfs-sink.hdfs.rollSize=0
agent.channels.ch1.type=memory
agent.channels.ch1.capacity=1000
agent.sources.exec-source.channels=ch1
agent.sinks.hdfs-sink.channel=ch1
I've seen several people have problems writing to HDFS, and in most cases the cause was that there weren't enough logs to fill the HDFS block. However, rollInterval=10 should generate a new file every 10 seconds, as long as at least one line is written to it. I can run "tail -F /var/log/httpd/access_log" in another window and see lines being written to the log fairly consistently, so I don't think that's it.
And here's the command/output from trying to start this agent:
[root@sandbox ~]# flume-ng agent -f /etc/flume/conf/flume.conf -n apache-agent
Warning: No configuration directory set! Use --conf <dir> to override.
Info: Including Hadoop libraries found via (/usr/bin/hadoop) for HDFS access
Info: Excluding /usr/lib/hadoop/libexec/../lib/slf4j-api-1.4.3.jar from classpath
Info: Excluding /usr/lib/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar from classpath
Info: Including HBASE libraries found via (/usr/bin/hbase) for HBASE access
Info: Excluding /usr/lib/hbase/bin/../lib/slf4j-api-1.6.1.jar from classpath
Info: Excluding /usr/lib/hbase/bin/../lib/slf4j-log4j12-1.6.1.jar from classpath
Info: Excluding /usr/lib/hadoop/lib/slf4j-api-1.4.3.jar from classpath
Info: Excluding /usr/lib/hadoop/lib/slf4j-log4j12-1.4.3.jar from classpath
Info: Excluding /usr/lib/zookeeper/lib/slf4j-api-1.6.1.jar from classpath
Info: Excluding /usr/lib/zookeeper/lib/slf4j-log4j12-1.6.1.jar from classpath
Info: Excluding /usr/lib/hadoop/libexec/../lib/slf4j-api-1.4.3.jar from classpath
Info: Excluding /usr/lib/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar from classpath
+ exec /usr/jdk/jdk1.6.0_31//bin/java -Xmx20m -cp '/usr/lib/flume/lib/*:/usr/lib/hadoop/libexec/../conf:/usr/jdk/jdk1.6.0_31/lib/tools.jar:/usr/lib/hadoop/libexec/..:/usr/lib/hadoop/libexec/../hadoop-core-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/libexec/../lib/ambari-log4j-1.2.3.7.jar:/usr/lib/hadoop/libexec/../lib/asm-3.2.jar:/usr/lib/hadoop/libexec/../lib/aspectjrt-1.6.11.jar:/usr/lib/hadoop/libexec/../lib/aspectjtools-1.6.11.jar:/usr/lib/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/lib/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/lib/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/lib/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/lib/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/lib/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/lib/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/lib/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/lib/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/lib/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/lib/hadoop/libexec/../lib/commons-net-3.1.jar:/usr/lib/hadoop/libexec/../lib/core-3.1.1.jar:/usr/lib/hadoop/libexec/../lib/guava-11.0.2.jar:/usr/lib/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/libexec/../lib/hadoop-fairscheduler-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/libexec/../lib/hadoop-lzo-0.5.0.jar:/usr/lib/hadoop/libexec/../lib/hadoop-thriftfs-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/libexec/../lib/hadoop-tools.jar:/usr/lib/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/libexec/../lib/hue-plugins-2.2.0-SNAPSHOT.jar:/usr/lib/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/lib/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/lib/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/lib/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/lib/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/lib/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/lib/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/lib/hadoop/libexec/../lib/junit-4.5.jar:/usr/lib/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/lib/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/lib/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/libexec/../lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/lib/hadoop/libexec/../lib/postgresql-9.1-901-1.jdbc4.jar:/usr/lib/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/lib/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/usr/lib/hbase/bin/../conf:/usr/jdk/jdk1.6.0_31/lib/tools.jar:/usr/lib/hbase/bin/..:/usr/lib/hbase/bin/../hbase-0.94.6.1.3.0.0-107-security.jar:/usr/lib/hbase/bin/../hbase-0.94.6.1.3.0.0-107-security-tests.jar:/usr/lib/hbase/bin/../lib/activation-1.1.jar:/usr/lib/hbase/bin/../lib/asm-3.1.jar:/usr/lib/hbase/bin/../lib/avro-1.5.3.jar:/usr/lib/hbase/bin/../lib/avro-ipc-1.5.3.jar:/usr/lib/hbase/bin/../lib/commons-beanutils-1.7.0.jar:/usr/lib/hbase/bin/../lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hbase/
bin/../lib/commons-cli-1.2.jar:/usr/lib/hbase/bin/../lib/commons-codec-1.4.jar:/usr/lib/hbase/bin/../lib/commons-collections-3.2.1.jar:/usr/lib/hbase/bin/../lib/commons-configuration-1.6.jar:/usr/lib/hbase/bin/../lib/commons-digester-1.8.jar:/usr/lib/hbase/bin/../lib/commons-el-1.0.jar:/usr/lib/hbase/bin/../lib/commons-httpclient-3.1.jar:/usr/lib/hbase/bin/../lib/commons-io-2.1.jar:/usr/lib/hbase/bin/../lib/commons-lang-2.5.jar:/usr/lib/hbase/bin/../lib/commons-logging-1.1.1.jar:/usr/lib/hbase/bin/../lib/commons-math-2.1.jar:/usr/lib/hbase/bin/../lib/commons-net-1.4.1.jar:/usr/lib/hbase/bin/../lib/core-3.1.1.jar:/usr/lib/hbase/bin/../lib/guava-11.0.2.jar:/usr/lib/hbase/bin/../lib/hadoop-core.jar:/usr/lib/hbase/bin/../lib/high-scale-lib-1.1.1.jar:/usr/lib/hbase/bin/../lib/httpclient-4.1.2.jar:/usr/lib/hbase/bin/../lib/httpcore-4.1.3.jar:/usr/lib/hbase/bin/../lib/jackson-core-asl-1.8.8.jar:/usr/lib/hbase/bin/../lib/jackson-jaxrs-1.8.8.jar:/usr/lib/hbase/bin/../lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hbase/bin/../lib/jackson-xc-1.8.8.jar:/usr/lib/hbase/bin/../lib/jamon-runtime-2.3.1.jar:/usr/lib/hbase/bin/../lib/jasper-compiler-5.5.23.jar:/usr/lib/hbase/bin/../lib/jasper-runtime-5.5.23.jar:/usr/lib/hbase/bin/../lib/jaxb-api-2.1.jar:/usr/lib/hbase/bin/../lib/jaxb-impl-2.2.3-1.jar:/usr/lib/hbase/bin/../lib/jersey-core-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-json-1.8.jar:/usr/lib/hbase/bin/../lib/jersey-server-1.8.jar:/usr/lib/hbase/bin/../lib/jettison-1.1.jar:/usr/lib/hbase/bin/../lib/jetty-6.1.26.jar:/usr/lib/hbase/bin/../lib/jetty-util-6.1.26.jar:/usr/lib/hbase/bin/../lib/jruby-complete-1.6.5.jar:/usr/lib/hbase/bin/../lib/jsp-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsp-api-2.1-6.1.14.jar:/usr/lib/hbase/bin/../lib/jsr305-1.3.9.jar:/usr/lib/hbase/bin/../lib/junit-4.10-HBASE-1.jar:/usr/lib/hbase/bin/../lib/libthrift-0.8.0.jar:/usr/lib/hbase/bin/../lib/log4j-1.2.16.jar:/usr/lib/hbase/bin/../lib/metrics-core-2.1.2.jar:/usr/lib/hbase/bin/../lib/netty-3.2.4.Final.jar:/usr/lib/hbase/bin/../lib/protobuf-java-2.4.0a.jar:/usr/lib/hbase/bin/../lib/servlet-api-2.5-6.1.14.jar:/usr/lib/hbase/bin/../lib/snappy-java-1.0.3.2.jar:/usr/lib/hbase/bin/../lib/stax-api-1.0.1.jar:/usr/lib/hbase/bin/../lib/velocity-1.7.jar:/usr/lib/hbase/bin/../lib/xmlenc-0.52.jar:/usr/lib/hbase/bin/../lib/zookeeper.jar:/etc/hadoop/conf:/usr/lib/hadoop/bin:/usr/lib/hadoop/build.xml:/usr/lib/hadoop/CHANGES.txt:/usr/lib/hadoop/conf:/usr/lib/hadoop/contrib:/usr/lib/hadoop/hadoop-ant-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/hadoop-ant.jar:/usr/lib/hadoop/hadoop-client-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/hadoop-client.jar:/usr/lib/hadoop/hadoop-core-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/hadoop-core.jar:/usr/lib/hadoop/hadoop-examples-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/hadoop-examples.jar:/usr/lib/hadoop/hadoop-minicluster-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/hadoop-minicluster.jar:/usr/lib/hadoop/hadoop-test-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/hadoop-test.jar:/usr/lib/hadoop/hadoop-tools-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/hadoop-tools.jar:/usr/lib/hadoop/HDP-CHANGES.txt:/usr/lib/hadoop/ivy:/usr/lib/hadoop/ivy.xml:/usr/lib/hadoop/lib:/usr/lib/hadoop/libexec:/usr/lib/hadoop/LICENSE.txt:/usr/lib/hadoop/logs:/usr/lib/hadoop/LONGWING-CHANGES.txt:/usr/lib/hadoop/NOTICE.txt:/usr/lib/hadoop/pids:/usr/lib/hadoop/README.txt:/usr/lib/hadoop/sbin:/usr/lib/hadoop/webapps:/usr/lib/hadoop/lib/ambari-log4j-1.2.3.7.jar:/usr/lib/hadoop/lib/asm-3.2.jar:/usr/lib/hadoop/lib/aspectjrt-1.6.11.jar:/usr/lib/hadoop/lib/aspectjtools-1.6.11.jar:/usr/lib/ha
doop/lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/lib/commons-cli-1.2.jar:/usr/lib/hadoop/lib/commons-codec-1.4.jar:/usr/lib/hadoop/lib/commons-collections-3.2.1.jar:/usr/lib/hadoop/lib/commons-configuration-1.6.jar:/usr/lib/hadoop/lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/lib/commons-digester-1.8.jar:/usr/lib/hadoop/lib/commons-el-1.0.jar:/usr/lib/hadoop/lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/lib/commons-io-2.1.jar:/usr/lib/hadoop/lib/commons-lang-2.4.jar:/usr/lib/hadoop/lib/commons-logging-1.1.1.jar:/usr/lib/hadoop/lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/lib/commons-math-2.1.jar:/usr/lib/hadoop/lib/commons-net-3.1.jar:/usr/lib/hadoop/lib/core-3.1.1.jar:/usr/lib/hadoop/lib/guava-11.0.2.jar:/usr/lib/hadoop/lib/hadoop-capacity-scheduler-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/lib/hadoop-fairscheduler-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/lib/hadoop-lzo-0.5.0.jar:/usr/lib/hadoop/lib/hadoop-thriftfs-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/lib/hadoop-tools.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/lib/hsqldb-1.8.0.10.LICENSE.txt:/usr/lib/hadoop/lib/hue-plugins-2.2.0-SNAPSHOT.jar:/usr/lib/hadoop/lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop/lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop/lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/lib/jdeb-0.8.jar:/usr/lib/hadoop/lib/jdiff:/usr/lib/hadoop/lib/jersey-core-1.8.jar:/usr/lib/hadoop/lib/jersey-json-1.8.jar:/usr/lib/hadoop/lib/jersey-server-1.8.jar:/usr/lib/hadoop/lib/jets3t-0.6.1.jar:/usr/lib/hadoop/lib/jetty-6.1.26.jar:/usr/lib/hadoop/lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/lib/jsch-0.1.42.jar:/usr/lib/hadoop/lib/jsp-2.1:/usr/lib/hadoop/lib/junit-4.5.jar:/usr/lib/hadoop/lib/kfs-0.2.2.jar:/usr/lib/hadoop/lib/kfs-0.2.LICENSE.txt:/usr/lib/hadoop/lib/log4j-1.2.15.jar:/usr/lib/hadoop/lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/lib/native:/usr/lib/hadoop/lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/lib/oro-2.0.8.jar:/usr/lib/hadoop/lib/hue-plugins-2.2.0-SNAPSHOT.jar:/usr/lib/hadoop/lib/hue-plugins-2.2.0-SNAPSHOT.jar:/usr/lib/hadoop/lib/*plugin*jar:/usr/lib/hadoop/lib/postgresql-9.1-901-1.jdbc4.jar:/usr/lib/hadoop/lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/lib/xmlenc-0.52.jar:/usr/lib/zookeeper/bin:/usr/lib/zookeeper/conf:/usr/lib/zookeeper/lib:/usr/lib/zookeeper/zookeeper-3.4.5.1.3.0.0-107.jar:/usr/lib/zookeeper/zookeeper.jar:/usr/lib/zookeeper/lib/ant-1.8.0.jar:/usr/lib/zookeeper/lib/ant-launcher-1.8.0.jar:/usr/lib/zookeeper/lib/backport-util-concurrent-3.1.jar:/usr/lib/zookeeper/lib/classworlds-1.1-alpha-2.jar:/usr/lib/zookeeper/lib/commons-codec-1.6.jar:/usr/lib/zookeeper/lib/commons-io-2.2.jar:/usr/lib/zookeeper/lib/commons-logging-1.1.1.jar:/usr/lib/zookeeper/lib/httpclient-4.2.3.jar:/usr/lib/zookeeper/lib/httpcore-4.2.3.jar:/usr/lib/zookeeper/lib/jline-0.9.94.jar:/usr/lib/zookeeper/lib/jsoup-1.7.1.jar:/usr/lib/zookeeper/lib/log4j-1.2.15.jar:/usr/lib/zookeeper/lib/maven-ant-tasks-2.1.3.jar:/usr/lib/zookeeper/lib/maven-artifact-2.2.1.jar:/usr/lib/zookeeper/lib/maven-artifact-manager-2.2.1.jar:/usr/lib/zookeeper/lib/maven-error-diagnostics-2.2.1.jar:/usr/lib/zookeeper/lib/maven-model-2.2.1.jar:/usr/lib/zookeeper/lib/maven-plugin-registry-2.2.1.jar:/usr/lib/zookeeper/lib/maven-profile-2.2.1.jar:/usr/lib/zookeeper/lib/maven-project-2.2.1.jar:/usr/lib/zookeeper/lib/maven-repository-metadata-2.2.1.jar:/usr/lib/zookeeper/lib/maven-settings-2.2.1.jar:/usr/lib/zookeeper/lib/nekohtml-1.9.6.2.jar:/usr/lib/zookeeper/l
ib/netty-3.2.2.Final.jar:/usr/lib/zookeeper/lib/plexus-container-default-1.0-alpha-9-stable-1.jar:/usr/lib/zookeeper/lib/plexus-interpolation-1.11.jar:/usr/lib/zookeeper/lib/plexus-utils-3.0.8.jar:/usr/lib/zookeeper/lib/wagon-file-1.0-beta-6.jar:/usr/lib/zookeeper/lib/wagon-http-2.4.jar:/usr/lib/zookeeper/lib/wagon-http-lightweight-1.0-beta-6.jar:/usr/lib/zookeeper/lib/wagon-http-shared-1.0-beta-6.jar:/usr/lib/zookeeper/lib/wagon-http-shared4-2.4.jar:/usr/lib/zookeeper/lib/wagon-provider-api-2.4.jar:/usr/lib/zookeeper/lib/xercesMinimal-1.9.6.2.jar:/usr/lib/hadoop/libexec/../conf:/usr/jdk/jdk1.6.0_31/lib/tools.jar:/usr/lib/hadoop/libexec/..:/usr/lib/hadoop/libexec/../hadoop-core-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/libexec/../lib/ambari-log4j-1.2.3.7.jar:/usr/lib/hadoop/libexec/../lib/asm-3.2.jar:/usr/lib/hadoop/libexec/../lib/aspectjrt-1.6.11.jar:/usr/lib/hadoop/libexec/../lib/aspectjtools-1.6.11.jar:/usr/lib/hadoop/libexec/../lib/commons-beanutils-1.7.0.jar:/usr/lib/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/lib/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/lib/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/lib/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/lib/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/lib/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/lib/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/lib/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/lib/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/lib/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/lib/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/lib/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/lib/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/lib/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/lib/hadoop/libexec/../lib/commons-net-3.1.jar:/usr/lib/hadoop/libexec/../lib/core-3.1.1.jar:/usr/lib/hadoop/libexec/../lib/guava-11.0.2.jar:/usr/lib/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/libexec/../lib/hadoop-fairscheduler-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/libexec/../lib/hadoop-lzo-0.5.0.jar:/usr/lib/hadoop/libexec/../lib/hadoop-thriftfs-1.2.0.1.3.0.0-107.jar:/usr/lib/hadoop/libexec/../lib/hadoop-tools.jar:/usr/lib/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/lib/hadoop/libexec/../lib/hue-plugins-2.2.0-SNAPSHOT.jar:/usr/lib/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/lib/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/lib/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/lib/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/lib/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/lib/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/lib/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/lib/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/lib/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/lib/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/lib/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/lib/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/lib/hadoop/libexec/../lib/junit-4.5.jar:/usr/lib/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/lib/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/lib/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/lib/hadoop/libexec/../lib/netty-3.6.2.Final.jar:/usr/lib/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/lib/hadoop/libexec/../lib/postgresql-9.1-901-1.jdbc4.jar:/usr/lib/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/lib/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/lib/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/lib/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar:/conf' 
-Djava.library.path=:/usr/lib/hadoop/libexec/../lib/native/Linux-amd64-64:/usr/lib/hadoop/libexec/../lib/native/Linux-amd64-64:/usr/lib/hbase/bin/../lib/native/Linux-amd64-64 org.apache.flume.node.Application -f /etc/flume/conf/flume.conf -n apache-agent
13/09/03 12:35:11 INFO node.PollingPropertiesFileConfigurationProvider: Configuration provider starting
13/09/03 12:35:11 INFO node.PollingPropertiesFileConfigurationProvider: Reloading configuration file:/etc/flume/conf/flume.conf
13/09/03 12:35:11 INFO conf.FlumeConfiguration: Added sinks: hdfs-sink Agent: agent
13/09/03 12:35:11 INFO conf.FlumeConfiguration: Processing:hdfs-sink
13/09/03 12:35:11 INFO conf.FlumeConfiguration: Processing:hdfs-sink
13/09/03 12:35:11 INFO conf.FlumeConfiguration: Processing:hdfs-sink
13/09/03 12:35:11 INFO conf.FlumeConfiguration: Processing:hdfs-sink
13/09/03 12:35:11 INFO conf.FlumeConfiguration: Processing:hdfs-sink
13/09/03 12:35:11 INFO conf.FlumeConfiguration: Processing:hdfs-sink
13/09/03 12:35:11 INFO conf.FlumeConfiguration: Post-validation flume configuration contains configuration for agents: [agent]
13/09/03 12:35:11 WARN node.AbstractConfigurationProvider: No configuration found for this host:apache-agent
13/09/03 12:35:11 INFO node.Application: Starting new configuration:{ sourceRunners:{} sinkRunners:{} channels:{} }
Now at this point I realize I'm missing several things.
1) I expect to see something along the lines of "INFO instrumentation.MonitoredCounterGroup: Component type: SINK, name: hdfs-sink started" as my last line, which I don't.
2) If I use the command "hadoop fs -lsr /flume" I should see new logs in my HDFS, but I don't. The last logs are from 8/28/2013, when I did the tutorial.
I also don't expect to see that WARN line in there, but I'm not sure why it's there, so maybe that's my problem and someone can tell me why.
So my questions are:
1) Can anyone tell me what might be going wrong here?
2) When I get this problem sorted out, is there anything else I should be looking at to verify that Flume is working correctly: reading what it should, and writing to where it should and when?
The answer is, of course, to name the agent when you start Flume the same as the agent name in the config file. So my command line should have ended with "-n agent" and NOT "-n apache-agent", since my flume.conf file specifies "agent.X".
After that everything appears to work.
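That is, the launch command from the question becomes:
flume-ng agent -f /etc/flume/conf/flume.conf -n agent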
In the config file you specified
agent.sources=exec-source
agent.sinks=hdfs-sink
agent.channels=ch1
so the agent name is 'agent'. Flume expects that, when running the agent, you use the same name as specified in the config file, so the command should be:
/usr/lib/flume/bin/flume-ng agent -n agent
Did you set the agent in step #3?
Check out the original blog post, the Hadoop UI Hue, and its Hadoop tutorials.