Why can't I find the file when importing from sqoop? - hadoop

I am trying to import a SQL Server table into Hive through Sqoop, using the command below.
sqoop import --username testuser --password test123 --num-mappers 1 --connect "jdbc:sqlserver://testdomain.com:2031;database=test_db" --target-dir /sqoop_batch/ --append --as-textfile --table test_table
However, the import failed with the error shown below.
...
2022-03-29 09:47:23,322 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1648452113129_0011
2022-03-29 09:47:23,324 INFO mapreduce.JobSubmitter: Executing with tokens: []
2022-03-29 09:47:23,376 INFO mapreduce.JobSubmitter: Cleaning up the staging area /user/root/.staging/job_1648452113129_0011
2022-03-29 09:47:23,387 ERROR tool.ImportTool: Import failed: java.io.FileNotFoundException: File does not exist: hdfs://mycluster:8020/user/root/.staging/job_1648452113129_0011
at org.apache.hadoop.fs.Hdfs.getFileStatus(Hdfs.java:145)
at org.apache.hadoop.fs.AbstractFileSystem.resolvePath(AbstractFileSystem.java:488)
at org.apache.hadoop.mapred.YARNRunner.setupLocalResources(YARNRunner.java:394)
at org.apache.hadoop.mapred.YARNRunner.createApplicationSubmissionContext(YARNRunner.java:573)
at org.apache.hadoop.mapred.YARNRunner.submitJob(YARNRunner.java:325)
at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:251)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1570)
at org.apache.hadoop.mapreduce.Job$11.run(Job.java:1567)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1730)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1567)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1588)
at org.apache.sqoop.mapreduce.ImportJobBase.doSubmitJob(ImportJobBase.java:200)
at org.apache.sqoop.mapreduce.ImportJobBase.runJob(ImportJobBase.java:173)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:270)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
at org.apache.sqoop.manager.SQLServerManager.importTable(SQLServerManager.java:163)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:520)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Note: I also changed the permissions on the /user/root/.staging directory to 777, but that did not resolve the issue.
Note: I also modified 'yarn.app.mapreduce.am.staging-dir' in yarn-site.xml, but that did not resolve it either.
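For reference, the permission change was applied roughly like this (path as reported in the error above), in addition to the yarn-site.xml edit:
hadoop fs -chmod 777 /user/root/.staging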

Related

java.lang.NoSuchMethodError: org.apache.hadoop.io.retry.RetryUtils.getDefaultRetryPolicy

I am trying to import MySQL data into Hadoop using Sqoop.
Environment:
Ubuntu 16.04
Hadoop 2.6.0-cdh5.16.1
Sqoop 1.4.7
On executing:
sqoop import --connect jdbc:mysql://127.0.0.1/crawl_data_stats --username root --password password --table auth_group
I get the following exception:
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.io.retry.RetryUtils.getDefaultRetryPolicy(Lorg/apache/hadoop/conf/Configuration;Ljava/lang/String;ZLjava/lang/String;Ljava/lang/String;Ljava/lang/Class;)Lorg/apache/hadoop/io/retry/RetryPolicy;
at org.apache.hadoop.hdfs.NameNodeProxies.createNNProxyWithClientProtocol(NameNodeProxies.java:408)
at org.apache.hadoop.hdfs.NameNodeProxies.createNonHAProxy(NameNodeProxies.java:314)
at org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:176)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:763)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:694)
at org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:159)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3303)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:124)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3352)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3320)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:479)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:227)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:463)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:361)
at org.apache.hadoop.mapreduce.lib.output.FileOutputFormat.setOutputPath(FileOutputFormat.java:160)
at org.apache.sqoop.mapreduce.ImportJobBase.configureOutputFormat(ImportJobBase.java:160)
at org.apache.sqoop.mapreduce.ImportJobBase.runImport(ImportJobBase.java:263)
at org.apache.sqoop.manager.SqlManager.importTable(SqlManager.java:692)
at org.apache.sqoop.manager.MySQLManager.importTable(MySQLManager.java:127)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:520)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

Sqoop Import Error: org.apache.hadoop.security.AccessControlException: Permission denied by sticky bit

I have a single-node Cloudera cluster (CDH 5.16) on a RHEL 7 remote server.
I installed CDH using packages.
When I run a Sqoop import job, I get the following error:
Warning: /usr/lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
19/06/04 15:49:31 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6-cdh5.16.1
19/06/04 15:49:31 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
19/06/04 15:49:32 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
19/06/04 15:49:32 INFO tool.CodeGenTool: Beginning code generation
Loading class `com.mysql.jdbc.Driver'. This is deprecated. The new driver class is `com.mysql.cj.jdbc.Driver'. The driver is automatically registered via the SPI and manual loading of the driver class is generally unnecessary.
19/06/04 15:49:34 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
19/06/04 15:49:35 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `categories` AS t LIMIT 1
19/06/04 15:49:35 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/lib/hadoop-mapreduce
Note: /tmp/sqoop-ak_bng/compile/d07f2f60a7ecbf9411c79687daa024c9/categories.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
19/06/04 15:49:37 INFO orm.CompilationManager: Writing jar file: /tmp/sqoop-ak_bng/compile/d07f2f60a7ecbf9411c79687daa024c9/categories.jar
19/06/04 15:49:38 ERROR tool.ImportTool: Import failed: org.apache.hadoop.security.AccessControlException: Permission denied by sticky bit: user=ak_bng, path="/user/hive/warehouse/sales.db/categories":hive:hive:drwxr-xr-t, parent="/user/hive/warehouse/sales.db":hive:hive:drwxr-xr-t
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkStickyBit(DefaultAuthorizationProvider.java:387)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:159)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3885)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6861)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:4290)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:4245)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:4229)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:856)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.delete(AuthorizationProviderProxyClientProtocol.java:313)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:626)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2278)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2274)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2272)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
at org.apache.hadoop.ipc.RemoteException.instantiateException(RemoteException.java:106)
at org.apache.hadoop.ipc.RemoteException.unwrapRemoteException(RemoteException.java:73)
at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:2106)
at org.apache.hadoop.hdfs.DistributedFileSystem$13.doCall(DistributedFileSystem.java:688)
at org.apache.hadoop.hdfs.DistributedFileSystem$13.doCall(DistributedFileSystem.java:684)
at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
at org.apache.hadoop.hdfs.DistributedFileSystem.delete(DistributedFileSystem.java:684)
at org.apache.sqoop.tool.ImportTool.deleteTargetDir(ImportTool.java:546)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:509)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
Caused by: org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.AccessControlException): Permission denied by sticky bit: user=ak_bng, path="/user/hive/warehouse/sales.db/categories":hive:hive:drwxr-xr-t, parent="/user/hive/warehouse/sales.db":hive:hive:drwxr-xr-t
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkStickyBit(DefaultAuthorizationProvider.java:387)
at org.apache.hadoop.hdfs.server.namenode.DefaultAuthorizationProvider.checkPermission(DefaultAuthorizationProvider.java:159)
at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.checkPermission(FSPermissionChecker.java:152)
at org.apache.hadoop.hdfs.server.namenode.FSDirectory.checkPermission(FSDirectory.java:3885)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:6861)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInternal(FSNamesystem.java:4290)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.deleteInt(FSNamesystem.java:4245)
at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.delete(FSNamesystem.java:4229)
at org.apache.hadoop.hdfs.server.namenode.NameNodeRpcServer.delete(NameNodeRpcServer.java:856)
at org.apache.hadoop.hdfs.server.namenode.AuthorizationProviderProxyClientProtocol.delete(AuthorizationProviderProxyClientProtocol.java:313)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolServerSideTranslatorPB.delete(ClientNamenodeProtocolServerSideTranslatorPB.java:626)
at org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$ClientNamenodeProtocol$2.callBlockingMethod(ClientNamenodeProtocolProtos.java)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Server$ProtoBufRpcInvoker.call(ProtobufRpcEngine.java:617)
at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:1073)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2278)
at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:2274)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1924)
at org.apache.hadoop.ipc.Server$Handler.run(Server.java:2272)
at org.apache.hadoop.ipc.Client.call(Client.java:1504)
at org.apache.hadoop.ipc.Client.call(Client.java:1441)
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:231)
at com.sun.proxy.$Proxy10.delete(Unknown Source)
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.delete(ClientNamenodeProtocolTranslatorPB.java:552)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:258)
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:104)
at com.sun.proxy.$Proxy11.delete(Unknown Source)
at org.apache.hadoop.hdfs.DFSClient.delete(DFSClient.java:2104)
... 13 more
The Sqoop command is:
sqoop import --connect jdbc:mysql://10.188.177.228:3306/sales --username vaishak --password root_123 --table categories --m 1 --delete-target-dir --target-dir /user/hive/warehouse/sales.db/categories
As per the documentation below, I tried changing fs.defaultFS in core-site.xml:
https://community.cloudera.com/t5/CDH-Manual-Installation/Permission-denied-user-root-access-WRITE-inode-quot-user/td-p/4943
It didn't work.
I also tried this Stack Overflow question:
Permission exception for Sqoop
That didn't work for me either.
I created a new folder for ak_bng and added it to the hive group as below:
sudo -u hdfs hadoop fs -mkdir /user/ak_bng
sudo -u hdfs hadoop fs -chown ak_bng:hive /user/ak_bng
I was still getting the same error.
In a few links I saw suggestions to add the user (ak_bng in my case) to the supergroup, but I am not aware of how to do it.
A few suggested running the Sqoop script as a different user; I am not aware of how to do that either.
I am very new to Unix and CDH.
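For context, my rough understanding of those two suggestions, which I have not verified (the group name supergroup is the HDFS default for dfs.permissions.superusergroup, and hdfs is just an example of another user), is:
sudo groupadd supergroup
sudo usermod -aG supergroup ak_bng
sudo -u hdfs sqoop import --connect jdbc:mysql://10.188.177.228:3306/sales --username vaishak --password root_123 --table categories --m 1 --delete-target-dir --target-dir /user/hive/warehouse/sales.db/categories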
I had a similar permission issue when I tried running Sqoop scripts from the Hue editor.
The following is the error I got at that time:
Failed to create deployment directory: AccessControlException: Permission denied: user=hive, access=WRITE, inode="/user/hue/oozie/deployments":hue:hue:drwxr-xr-x (error 500)
Before CDH, I had set up Hadoop 3.1 and Sqoop separately (both outside CDH), and I was able to successfully import data into HDFS.
But while using CDH I am getting these errors.
Can someone please shed some light on what the issue is here and how to resolve it?
Output of hadoop fs -ls /user
drwx------ - hdfs supergroup 0 2019-06-04 12:47 /user/hdfs
drwxr-xr-x - mapred hadoop 0 2019-05-27 20:06 /user/history
drwxr-xr-t - hive hive 0 2019-06-03 18:01 /user/hive
drwxr-xr-x - hue hue 0 2019-06-03 18:01 /user/hue
drwxr-xr-x - impala impala 0 2019-05-27 20:08 /user/impala
drwxr-xr-x - oozie oozie 0 2019-05-27 20:12 /user/oozie
drwxr-xr-x - spark spark 0 2019-05-27 20:06 /user/spark
Group details:
root:x:0:
bin:x:1:
daemon:x:2:
sys:x:3:
adm:x:4:
tty:x:5:
disk:x:6:
lp:x:7:
mem:x:8:
kmem:x:9:
wheel:x:10:ak_bng
cdrom:x:11:
mail:x:12:postfix
man:x:15:
dialout:x:18:
floppy:x:19:
games:x:20:
tape:x:33:
video:x:39:
ftp:x:50:
lock:x:54:
audio:x:63:
nobody:x:99:
users:x:100:
utmp:x:22:
utempter:x:35:
input:x:999:
systemd-journal:x:190:
systemd-network:x:192:
dbus:x:81:
polkitd:x:998:
ssh_keys:x:997:
sshd:x:74:
postdrop:x:90:
postfix:x:89:
printadmin:x:996:
dip:x:40:
cgred:x:995:
rpc:x:32:
libstoragemgmt:x:994:
unbound:x:993:
kvm:x:36:qemu
qemu:x:107:
chrony:x:992:
gluster:x:991:
rtkit:x:172:
radvd:x:75:
tss:x:59:
usbmuxd:x:113:
colord:x:990:
abrt:x:173:
geoclue:x:989:
saslauth:x:76:
libvirt:x:988:
pulse-access:x:987:
pulse-rt:x:986:
pulse:x:171:
gdm:x:42:
setroubleshoot:x:985:
gnome-initial-setup:x:984:
stapusr:x:156:
stapsys:x:157:
stapdev:x:158:
tcpdump:x:72:
avahi:x:70:
slocate:x:21:
ntp:x:38:
ak_bng:x:1000:
localadmin:x:1001:
am_bng:x:1002:
localuser:x:1003:
apache:x:48:
cassandra:x:983:
mysql:x:27:
cloudera-scm:x:982:
hadoop:x:1011:yarn,hdfs,mapred
postgres:x:26:
zookeeper:x:981:
yarn:x:980:
hdfs:x:979:
mapred:x:978:
kms:x:977:kms
httpfs:x:976:httpfs
hbase:x:975:
hive:x:974:impala
sentry:x:973:
solr:x:972:
sqoop:x:971:
flume:x:970:
spark:x:969:
oozie:x:968:
hue:x:967:
impala:x:966:
llama:x:965:
kudu:x:964:
I need to run Sqoop scripts from the command line as user ak_bng.
Sqoop is running here as the ak_bng user and, because of --delete-target-dir, it first tries to delete /user/hive/warehouse/sales.db/categories. That directory and its parent are owned by hive:hive and have the sticky bit set (drwxr-xr-t), so ak_bng is not allowed to delete it, and Sqoop throws
Permission denied by sticky bit: user=ak_bng, path="/user/hive/warehouse/sales.db/categories":hive:hive:drwxr-xr-t, parent="/user/hive/warehouse/sales.db":hive:hive:drwxr-xr-t
Try specifying a target directory that ak_bng owns and rerun. For example: /user/ak_bng/sales.db/categories
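For example, the original command with only the target directory changed (the path is just an illustration; any HDFS directory writable by ak_bng should do):
sqoop import --connect jdbc:mysql://10.188.177.228:3306/sales --username vaishak --password root_123 --table categories --m 1 --delete-target-dir --target-dir /user/ak_bng/sales.db/categories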

Table from sqoop 1.4.7 importing into HDFS is fine; importing into hive 3.1.1 fails

sqoop import --connect jdbc:mysql://localhost:3306/sqoopdb --username dsa -P \
--split-by id --columns id,name --table employee --target-dir /test1 \
--fields-terminated-by "," --hive-import --create-hive-table \
--hive-table employee_sqoop
Importing the table with Sqoop 1.4.7 into HDFS works fine, but when importing into Hive 3.1.1 I get:
ERROR [main] tool.ImportTool: Import failed: java.io.IOException: Hive CliDriver exited with status=1
This is a pseudo-distributed Hadoop 3.1.1 cluster with HBase, Sqoop, and Hive at their latest versions.
I copied the libthrift*.jar files from hive/lib into the sqoop/lib directory.
I also set HBASE_HOME to a non-existent path.
I copied jackson-annotations-2.9.5.jar, jackson-core-2.9.5.jar, and jackson-databind-2.9.5.jar into the sqoop/lib folder.
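Roughly, those steps looked like this (HIVE_HOME, SQOOP_HOME, and the location of the jackson jars are assumptions about my layout):
cp $HIVE_HOME/lib/libthrift*.jar $SQOOP_HOME/lib/
export HBASE_HOME=/nonexistent
cp jackson-annotations-2.9.5.jar jackson-core-2.9.5.jar jackson-databind-2.9.5.jar $SQOOP_HOME/lib/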
The full stack trace is:
ERROR [main] tool.ImportTool: Import failed: java.io.IOException: Hive CliDriver exited with status=1
at org.apache.sqoop.hive.HiveImport.executeScript(HiveImport.java:355)
at org.apache.sqoop.hive.HiveImport.importTable(HiveImport.java:241)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:537)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

Unable to import data from Informix non-transactional database to HDFS

I'm trying to import a table from an Informix DB to HDFS. I'm using the Cloudera distribution. I have made sure that the jars are correctly placed under the /var/lib/sqoop folder on the host where I'm running the command.
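Roughly (the driver jar name here is a guess; the actual file name may differ in your installation):
cp /path/to/ifxjdbc.jar /var/lib/sqoop/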
Below are the command and the error that I'm receiving.
Sqoop Command:
sqoop import --connect jdbc:informix-sqli://mgomtcmsp1:50001/cms:INFORMIXSERVER=fmg_pgomtp1 \
--username AL667RE \
--target-dir "/data/raw/manage_agex" \
--password Bdhuern#17892 \
--table agex \
--driver com.informix.jdbc.IfxDriver
Error:
18/09/03 10:54:49 ERROR manager.SqlManager: Error executing statement: java.sql.SQLException: No Transaction Isolation on non-logging db's
java.sql.SQLException: No Transaction Isolation on non-logging db's
at com.informix.util.IfxErrMsg.getSQLException(IfxErrMsg.java:348)
at com.informix.jdbc.IfxSqliConnect.setTransactionIsolation(IfxSqliConnect.java:2012)
at org.apache.sqoop.manager.SqlManager.makeConnection(SqlManager.java:910)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:763)
at org.apache.sqoop.manager.SqlManager.execute(SqlManager.java:786)
at org.apache.sqoop.manager.SqlManager.getColumnInfoForRawQuery(SqlManager.java:289)
at org.apache.sqoop.manager.SqlManager.getColumnTypesForRawQuery(SqlManager.java:260)
at org.apache.sqoop.manager.SqlManager.getColumnTypes(SqlManager.java:246)
at org.apache.sqoop.manager.ConnManager.getColumnTypes(ConnManager.java:327)
at org.apache.sqoop.orm.ClassWriter.getColumnTypes(ClassWriter.java:1858)
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1657)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:494)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
18/09/03 10:54:49 ERROR tool.ImportTool: Import failed: java.io.IOException: No columns to generate for ClassWriter
at org.apache.sqoop.orm.ClassWriter.generate(ClassWriter.java:1663)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:106)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:494)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:621)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)

sqoop | getting NullPointerException while running sqoop import tool

I am trying to import a MySQL table into Hive using Sqoop 1.4.7. The command I have executed is:
sqoop import --connect jdbc:mysql://localhost:3306/sb --username root --password xxxx --table sb_user --hive-import --create-hive-table --hive-table sqoop_import.test_from_sqoop --fields-terminated-by "," --bindir ./
But I am getting the exception below:
Warning: /usr/local/sqoop/sqoop-1.4.7.bin__hadoop-2.6.0/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
18/04/03 22:44:56 INFO sqoop.Sqoop: Running Sqoop version: 1.4.7
18/04/03 22:44:56 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/04/03 22:44:56 INFO manager.MySQLManager: Preparing to use a MySQL streaming resultset.
18/04/03 22:44:56 INFO tool.CodeGenTool: Beginning code generation
18/04/03 22:44:57 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `sb_user` AS t LIMIT 1
18/04/03 22:44:57 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM `sb_user` AS t LIMIT 1
18/04/03 22:44:57 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop
Note: ./sb_user.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
18/04/03 22:44:58 INFO orm.CompilationManager: Writing jar file: ./sb_user.jar
18/04/03 22:44:58 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
java.lang.NullPointerException
at java.util.Objects.requireNonNull(Objects.java:203)
at java.util.Arrays$ArrayList.<init>(Arrays.java:3813)
at java.util.Arrays.asList(Arrays.java:3800)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:76)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListing(FileListing.java:67)
at com.cloudera.sqoop.util.FileListing.getFileListing(FileListing.java:39)
at org.apache.sqoop.orm.CompilationManager.addClassFilesFromDir(CompilationManager.java:293)
at org.apache.sqoop.orm.CompilationManager.jar(CompilationManager.java:378)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:108)
at org.apache.sqoop.tool.ImportTool.importTable(ImportTool.java:501)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:628)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:243)
at org.apache.sqoop.Sqoop.main(Sqoop.java:252)
But I can execute the command below successfully:
sqoop list-tables --connect jdbc:mysql://localhost:3306/information_schema --username root -password xxxx
Please help me here. Thanks
The problem is the --bindir ./ option. Sqoop tries to recursively add all contents inside ./ (https://github.com/apache/sqoop/blob/trunk/src/java/org/apache/sqoop/util/FileListing.java#L72). Check this directory, or just change the option.
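For example, the same import with --bindir pointed at a dedicated, empty directory (the path is just an illustration):
mkdir -p /tmp/sqoop-bindir
sqoop import --connect jdbc:mysql://localhost:3306/sb --username root --password xxxx --table sb_user --hive-import --create-hive-table --hive-table sqoop_import.test_from_sqoop --fields-terminated-by "," --bindir /tmp/sqoop-bindir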
