Sqoop running into local job runner mode - hadoop

When I run Sqoop, I am not sure why it drops into local job runner mode and then says that I have provided an invalid jobtracker URL for LocalJobRunner. Can anyone tell me what's going on?
$ bin/sqoop import -jt myjobtracker:50070 --connect jdbc:mysql://mydbhost.com/mydata --username foo --password bar --as-parquetfile --table campaigns --target-dir hdfs://myhdfs:8020/user/myself/campaigns
14/08/20 21:04:50 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-SNAPSHOT
14/08/20 21:04:50 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/08/20 21:04:51 INFO manager.SqlManager: Using default fetchSize of 1000
14/08/20 21:04:51 INFO tool.CodeGenTool: Beginning code generation
14/08/20 21:04:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `campaigns` AS t LIMIT 1
14/08/20 21:04:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `campaigns` AS t LIMIT 1
14/08/20 21:04:51 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `campaigns` AS t LIMIT 1
14/08/20 21:04:51 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-myself/compile/6acdb40688239f19ddf86a1290ad6c64/campaigns.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
14/08/20 21:04:54 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-myself/compile/6acdb40688239f19ddf86a1290ad6c64/campaigns.jar
14/08/20 21:04:54 WARN manager.MySQLManager: It looks like you are importing from mysql.
14/08/20 21:04:54 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
14/08/20 21:04:54 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
14/08/20 21:04:54 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
14/08/20 21:04:54 INFO mapreduce.ImportJobBase: Beginning import of campaigns
14/08/20 21:04:54 WARN conf.Configuration: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
14/08/20 21:04:54 WARN mapred.JobConf: The variable mapred.child.ulimit is no longer used.
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/zookeeper/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/share/hbase/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
14/08/20 21:04:54 WARN conf.Configuration: mapred.jar is deprecated. Instead, use mapreduce.job.jar
14/08/20 21:04:56 WARN conf.Configuration: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
14/08/20 21:04:56 INFO mapreduce.Cluster: Failed to use org.apache.hadoop.mapred.LocalClientProtocolProvider due to error: Invalid "mapreduce.jobtracker.address" configuration value for LocalJobRunner : "myjobtracker:50070"
14/08/20 21:04:56 ERROR security.UserGroupInformation: PriviledgedActionException as:myself (auth:SIMPLE) cause:java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
14/08/20 21:04:56 ERROR tool.ImportTool: Encountered IOException running import job: java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:121)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:83)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:76)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1239)
at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1235)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:396)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1408)
at org.apache.hadoop.mapreduce.Job.connect(Job.java:1234)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1263)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1287)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:247)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:665)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:102)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:497)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:601)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)

Figured out the problem: I was running Sqoop 1.4.5 and pointing it at the latest Hadoop 2.0.0-cdh4.4.0, which also ships the YARN components, and that is why it was complaining.
When I pointed Sqoop at hadoop-0.20/2.0.0-cdh4.4.0 (MR1, I believe) it worked.
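For anyone hitting the same mismatch, a minimal sketch of the fix, assuming a CDH4-style layout where the MR1 client libraries live under /usr/lib/hadoop-0.20-mapreduce and the JobTracker RPC port is 8021 (50070 is normally the NameNode web UI port, not the JobTracker):
export HADOOP_MAPRED_HOME=/usr/lib/hadoop-0.20-mapreduce   # assumed MR1 client path; adjust to your install
bin/sqoop import -jt myjobtracker:8021 --connect jdbc:mysql://mydbhost.com/mydata --username foo -P --as-parquetfile --table campaigns --target-dir hdfs://myhdfs:8020/user/myself/campaigns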

Related

Error while importing tables from MySQL using Sqoop

I am trying to import a table from a MySQL database using Sqoop. MySQL is installed on the same box where Sqoop, Hadoop, and Hive are installed, and I can access the database from the terminal. While trying to import, I get the error below. Please help me resolve this.
sqoop/bin$ ./sqoop import --connect jdbc:mysql://localhost/sqoop_test --username **** --password ***** --table emp2 --delete-target-dir -m 1;
Warning: /home/skd799/Downloads/sqoop/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /home/skd799/Downloads/sqoop/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /home/skd799/Downloads/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /home/skd799/Downloads/sqoop/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
18/05/18 15:24:09 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5
18/05/18 15:24:09 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/05/18 15:24:09 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/05/18 15:24:09 INFO tool.CodeGenTool: Beginning code generation
18/05/18 15:24:10 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp2` AS t LIMIT 1
18/05/18 15:24:10 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `emp2` AS t LIMIT 1
18/05/18 15:24:10 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /home/skd799/Downloads/hadoop
Note: /tmp/sqoop-hduser/compile/d58481969b312338b764bc550c174b3a/emp2.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/05/18 15:24:11 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-hduser/compile/d58481969b312338b764bc550c174b3a/emp2.jar
18/05/18 15:24:12 INFO tool.ImportTool: Destination directory emp2 deleted.
18/05/18 15:24:12 WARN manager.MySQLManager: It looks like you are importing from mysql.
18/05/18 15:24:12 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
18/05/18 15:24:12 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
18/05/18 15:24:12 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
18/05/18 15:24:12 INFO mapreduce.ImportJobBase: Beginning import of emp2
18/05/18 15:24:13 INFO db.DBInputFormat: Using read commited transaction isolation
18/05/18 15:24:14 INFO db.DBInputFormat: Using read commited transaction isolation
18/05/18 15:24:15 WARN mapreduce.Counters: Group FileSystemCounters is deprecated. Use org.apache.hadoop.mapreduce.FileSystemCounter instead
18/05/18 15:24:15 INFO mapreduce.ImportJobBase: Transferred 0 bytes in 2.9424 seconds (0 bytes/sec)
18/05/18 15:24:15 WARN mapreduce.Counters: Group org.apache.hadoop.mapred.Task$Counter is deprecated. Use org.apache.hadoop.mapreduce.TaskCounter instead
18/05/18 15:24:15 INFO mapreduce.ImportJobBase: Retrieved 0 records.
18/05/18 15:24:15 ERROR tool.ImportTool: Error during import: Import job failed!
Can you post the exact Sqoop command you are using to import the tables?

Error message when sqooping an Oracle table into Hive

I was looking to find out how I can fix the following error message that I keep getting when sqooping a table from Oracle into Hive. I was able to sqoop another table this morning, but every attempt since then has failed. The following is the error within the log file:
Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
I am not sure why I keep getting this error message: without a Kerberos ticket I would be unable to log in through PuTTY to run the script that sqoops the table in the first place. So I am confused as to (a) why I am getting this error and (b) what I can do to fix it.
Would appreciate it if somebody could advise where I am going wrong.
Thanks in advance.
Part of the Error Message from Log File:
17/01/05 15:08:52 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.7.1
17/01/05 15:08:52 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
17/01/05 15:08:52 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
17/01/05 15:08:52 INFO manager.SqlManager: Using default fetchSize of 1000
17/01/05 15:08:52 INFO tool.CodeGenTool: Beginning code generation
17/01/05 15:08:53 INFO manager.OracleManager: Time zone has been set to GMT
17/01/05 15:08:53 INFO manager.SqlManager: Executing SQL statement: SELECT * FROM TABLE_NAME where SNAPSHOT_DATE_TIME >= '01-APR-16' and (1 = 0)
17/01/05 15:08:53 INFO manager.SqlManager: Executing SQL statement: SELECT * FROM TABLE_NAME where SNAPSHOT_DATE_TIME >= '01-APR-16' and (1 = 0)
17/01/05 15:08:53 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /opt/cloudera/parcels/CDH/lib/hadoop-mapreduce
Note: /tmp/sqoop-username/compile/ba4df230b0d18377522bbfe053ed3661/QueryResult.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
17/01/05 15:08:55 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-username/compile/ba4df230b0d18377522bbfe053ed3661/QueryResult.jar
17/01/05 15:08:55 INFO mapreduce.ImportJobBase: Beginning query import.
17/01/05 15:08:55 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
17/01/05 15:08:55 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
17/01/05 15:08:56 WARN security.UserGroupInformation: PriviledgedActionException as:username (auth:KERBEROS) cause:javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
17/01/05 15:08:56 WARN ipc.Client: Exception encountered while connecting to the server : javax.security.sasl.SaslException: GSS initiate failed [Caused by GSSException: No valid credentials provided (Mechanism level: Failed to find any Kerberos tgt)]
....
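The GSSException above usually means that the shell or scheduler running the Sqoop job has no valid ticket-granting ticket in its credential cache. A minimal check with the standard MIT Kerberos tools (the principal below is a placeholder):
klist                        # lists tickets in the current credential cache; empty or expired output means no TGT
kinit myuser@EXAMPLE.REALM   # obtains a fresh TGT; re-run the sqoop command afterwards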

Sqoop import getting halted

I am trying to import a table from MySQL to HDFS, but it gets stuck at the point shown below:
sqoop import --connect jdbc:mysql://localhost/movielens --username root --table tutorials_tbl --m 1
16/11/26 10:47:33 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
16/11/26 10:47:33 INFO tool.CodeGenTool: Beginning code generation
16/11/26 10:47:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `tutorials_tbl` AS t LIMIT 1
16/11/26 10:47:34 INFO orm.CompilationManager: HADOOP_HOME is /usr/lib/hadoop
16/11/26 10:47:34 INFO orm.CompilationManager: Found hadoop core jar at: /usr/lib/hadoop/hadoop-core.jar
16/11/26 10:47:36 ERROR orm.CompilationManager: Could not rename /tmp/sqoop-training/compile/f150b283edf7b39ed9facc57a781542e/tutorials_tbl.java to /home/training/./tutorials_tbl.java
java.io.IOException: Destination '/home/training/./tutorials_tbl.java' already exists
at org.apache.commons.io.FileUtils.moveFile(FileUtils.java:1811)
at com.cloudera.sqoop.orm.CompilationManager.compile(CompilationManager.java:229)
at com.cloudera.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:85)
at com.cloudera.sqoop.tool.ImportTool.importTable(ImportTool.java:369)
at com.cloudera.sqoop.tool.ImportTool.run(ImportTool.java:455)
at com.cloudera.sqoop.Sqoop.run(Sqoop.java:146)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at com.cloudera.sqoop.Sqoop.runSqoop(Sqoop.java:182)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:221)
at com.cloudera.sqoop.Sqoop.runTool(Sqoop.java:230)
at com.cloudera.sqoop.Sqoop.main(Sqoop.java:239)
16/11/26 10:47:36 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-training/compile/f150b283edf7b39ed9facc57a781542e/tutorials_tbl.jar
16/11/26 10:47:36 WARN manager.MySQLManager: It looks like you are importing from mysql.
16/11/26 10:47:36 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
16/11/26 10:47:36 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
16/11/26 10:47:36 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
16/11/26 10:47:36 INFO mapreduce.ImportJobBase: Beginning import of tutorials_tbl
This is the last line of the job.
Thanks in advance!
Sqoop runs as a map-only job, so it follows all the usual MapReduce semantics.
Please delete "/home/training/./tutorials_tbl.java" and rerun your job.
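A quick sketch of that cleanup on the local file system, using the path from the stack trace above:
rm /home/training/tutorials_tbl.java   # remove the stale generated source that blocks the rename
sqoop import --connect jdbc:mysql://localhost/movielens --username root --table tutorials_tbl --m 1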

Big Data: Sqoop-Export Error

I am very new to this world. While running an export command using Sqoop, I am getting the error "Input path does not exist: hdfs://quickstart.cloudera:8020/home/cloudera/Test5". I have checked the path /home/cloudera/Test5 and the file exists there. The HDFS path details come from the core-site.xml file of the Sqoop configuration; when I tested it by just opening IE and typing hdfs://quickstart.cloudera:8020/home/cloudera/Test5, the message was "Unable to connect". I do not know the correct parameter values for the property. Please help me solve this issue.
Please find the property file parameter and error details below.
Parameter file
<name>fs.defaultFS</name>
<value>hdfs://quickstart.cloudera:8020</value>
Error
[cloudera@quickstart hadoop-conf]$ sqoop export --connect jdbc:sqlserver://10.34.83.177:54815 --username hadoop --password xxxxxx --table hadoop_sanjeeb3 --export-dir /home/cloudera/Test5 -mapreduce-job-name sqoop_export_job -m 1
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
15/10/01 08:42:47 INFO sqoop.Sqoop: Running Sqoop version: 1.4.5-cdh5.4.2
15/10/01 08:42:47 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
15/10/01 08:42:48 INFO manager.SqlManager: Using default fetchSize of 1000
15/10/01 08:42:48 INFO tool.CodeGenTool: Beginning code generation
15/10/01 08:42:49 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [hadoop_sanjeeb3] AS t WHERE 1=0
15/10/01 08:42:49 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-cloudera/compile/aa9c9fd9f69b76202be29508561f22ff/hadoop_sanjeeb3.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
15/10/01 08:42:51 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-cloudera/compile/aa9c9fd9f69b76202be29508561f22ff/hadoop_sanjeeb3.jar
15/10/01 08:42:51 INFO mapreduce.ExportJobBase: Beginning export of hadoop_sanjeeb3
15/10/01 08:42:51 INFO Configuration.deprecation: mapred.job.tracker is deprecated. Instead, use mapreduce.jobtracker.address
15/10/01 08:42:51 INFO Configuration.deprecation: mapred.jar is deprecated. Instead, use mapreduce.job.jar
15/10/01 08:42:54 WARN mapreduce.ExportJobBase: Input path hdfs://quickstart.cloudera:8020/home/cloudera/Test5 does not exist
15/10/01 08:42:54 INFO Configuration.deprecation: mapred.reduce.tasks.speculative.execution is deprecated. Instead, use mapreduce.reduce.speculative
15/10/01 08:42:54 INFO Configuration.deprecation: mapred.map.tasks.speculative.execution is deprecated. Instead, use mapreduce.map.speculative
15/10/01 08:42:54 INFO Configuration.deprecation: mapred.map.tasks is deprecated. Instead, use mapreduce.job.maps
15/10/01 08:42:54 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
15/10/01 08:42:58 INFO mapreduce.JobSubmitter: Cleaning up the staging area /tmp/hadoop-yarn/staging/cloudera/.staging/job_1443557935828_0011
15/10/01 08:42:58 WARN security.UserGroupInformation: PriviledgedActionException as:cloudera (auth:SIMPLE) cause:org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://quickstart.cloudera:8020/home/cloudera/Test5
15/10/01 08:42:58 ERROR tool.ExportTool: Encountered IOException running export job: org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://quickstart.cloudera:8020/home/cloudera/Test5
Regards - Sanjeeb
It seems like you have some confusion between the local file system and the Hadoop file system. The file that you are trying to export using Sqoop should be present in HDFS; the directory /home/cloudera/Test5 appears to be on the local file system.
Execute the command given below and confirm that the location you mentioned exists in HDFS:
hadoop fs -ls /home/cloudera/Test5
If this gives an error, the directory does not exist in HDFS. You can't browse HDFS with a plain ls command; you have to use the hadoop commands. If you want to browse HDFS directories from a browser, open the NameNode web UI (http://namenode-host:50070), which has an option to browse the files and directories.
You cannot browse the HDFS file system with a URL like hdfs://quickstart.cloudera:8020/home/cloudera/Test5 in a browser; WebHDFS provides a similar operation over HTTP.
Ensure that the file is present in HDFS and trigger the command again. It will work.
NB: Usually we never keep user directories like /home/cloudera in HDFS. The structure will be something like /user/{username}; by default, HDFS treats /user/{username} as the home directory, where {username} is the currently logged-in Linux user.
That file may be in the local file system, but not in the Hadoop distributed file system (HDFS). You can copy files from the local file system into HDFS with the
hadoop fs -put <local_file_path> <HDFS_directory>
command. You should do it as an HDFS user.
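Putting that together for this case, a minimal sketch; the HDFS destination /user/cloudera/Test5 is an assumption that follows the /user/{username} convention described above:
hadoop fs -mkdir -p /user/cloudera                         # make sure the HDFS home directory exists
hadoop fs -put /home/cloudera/Test5 /user/cloudera/Test5   # copy the local data into HDFS
sqoop export --connect jdbc:sqlserver://10.34.83.177:54815 --username hadoop -P --table hadoop_sanjeeb3 --export-dir /user/cloudera/Test5 -m 1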

Sqoop error while importing

I am new to Sqoop and trying to import the MySQL table widgets from the hadoopguide database.
I am using Hadoop version 0.20,
and my Sqoop is sqoop-1.4.4.bin__hadoop-0.20.
I am running the command:
sqoop import --connect jdbc:mysql://localhost/hadoopguide --table widgets -m 1
This is the error log I am getting:
Warning: /usr/lib/hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
13/09/25 15:29:41 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
13/09/25 15:29:41 INFO tool.CodeGenTool: Beginning code generation
13/09/25 15:29:41 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `widgets` AS t LIMIT 1
13/09/25 15:29:41 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `widgets` AS t LIMIT 1
13/09/25 15:29:41 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
13/09/25 15:29:41 INFO orm.CompilationManager: Found hadoop core jar at: /usr/local/hadoop/hadoop-0.20.2-core.jar
Note: /tmp/sqoop-ubuntu/compile/348861f092b25aac3fae4089da9abdf0/widgets.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
13/09/25 15:29:42 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-ubuntu/compile/348861f092b25aac3fae4089da9abdf0/widgets.jar
13/09/25 15:29:42 WARN manager.MySQLManager: It looks like you are importing from mysql.
13/09/25 15:29:42 WARN manager.MySQLManager: This transfer can be faster! Use the --direct
13/09/25 15:29:42 WARN manager.MySQLManager: option to exercise a MySQL-specific fast path.
13/09/25 15:29:42 INFO manager.MySQLManager: Setting zero DATETIME behavior to convertToNull (mysql)
13/09/25 15:29:42 INFO mapreduce.ImportJobBase: Beginning import of widgets
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.mapred.JobConf.getCredentials()Lorg/apache/hadoop/security/Credentials;
at org.apache.sqoop.mapreduce.db.DBConfiguration.getPassword(DBConfiguration.java:304)
at org.apache.sqoop.mapreduce.db.DBConfiguration.getConnection(DBConfiguration.java:272)
at org.apache.sqoop.mapreduce.db.DBInputFormat.getConnection(DBInputFormat.java:187)
at org.apache.sqoop.mapreduce.db.DBInputFormat.setConf(DBInputFormat.java:162)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:62)
at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:117)
at org.apache.hadoop.mapred.JobClient.writeNewSplits(JobClient.java:882)
at org.apache.hadoop.mapred.JobClient.submitJobInternal(JobClient.java:779)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:432)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:447)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:186)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:159)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:239)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:600)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:118)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:413)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:502)
at org.apache.sqoop.Sqoop.run(Sqoop.java:145)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:181)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:220)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:229)
at org.apache.sqoop.Sqoop.main(Sqoop.java:238)
Does anyone have any idea about this?
If you have installed Hive, HCatalog is installed with it. Now set HCAT_HOME in your .bashrc as below:
cd ~
gedit .bashrc
export HCAT_HOME=${HIVE_HOME}/hcatalog/
export PATH=$HCAT_HOME/bin:$PATH
source .bashrc   # to refresh the .bashrc file
Otherwise, install HCatalog separately and set its home path.
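A quick check that the variables took effect in the current shell (assuming a standard Hive layout with an hcatalog/bin directory):
echo $HCAT_HOME   # should print the hcatalog directory under your Hive installation
which hcat        # should resolve to a script under $HCAT_HOME/bin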
Hadoop release 0.20 is a very old release that lacks a lot of features. One feature that Sqoop requires is the security additions that were added in 1.x (the missing JobConf.getCredentials() in the stack trace above is part of them). As a result, Sqoop won't work on bare 0.20; at least CDH3u1 or Hadoop 1.x is required. I would strongly suggest upgrading your Hadoop cluster.
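To confirm which Hadoop build Sqoop is picking up, a quick check based on the paths in the log above:
hadoop version                           # prints the Hadoop release on the PATH
ls /usr/local/hadoop/hadoop-*core*.jar   # the core jar Sqoop found; the 0.20.2 jar predates JobConf.getCredentials()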
