Hi, I am trying to import all tables from all schemas from an Oracle DB to HDFS.
This is my script:
sqoop-import-all-tables -Dmapreduce.job.user.classpath.first=true -Dhadoop.security.credential.provider.path=jceks://x.jceks --connect jdbc:oracle:thin:@x.x.x.x:1521/yyyy --username xxxx --password xxxx --warehouse-dir /data-warehouse/xxxx --as-avrodatafile --compression-codec snappy --autoreset-to-one-mapper
When I run this script, I don't get any error, but no job starts either.
Output:
Warning: /usr/hdp/2.6.2.0-205/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
find: failed to restore initial working directory: Permission denied
18/08/11 08:32:51 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.6.2.0-205
18/08/11 08:32:51 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
18/08/11 08:32:51 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
18/08/11 08:32:51 INFO manager.SqlManager: Using default fetchSize of 1000
18/08/11 08:32:53 INFO manager.OracleManager: Time zone has been set to IST
It seems that the user configured in Sqoop does not have enough privileges to query and export the data from Oracle. Please check that you can connect to and query the Oracle database from the command line.
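For example, a minimal check with sqoop eval (x.x.x.x, yyyy, and xxxx are the placeholders from the command above; ALL_TABLES is a standard Oracle dictionary view that any user can query):

sqoop eval --connect jdbc:oracle:thin:@//x.x.x.x:1521/yyyy --username xxxx -P --query 'SELECT COUNT(*) FROM ALL_TABLES'

If this fails with an ORA- error, the problem is connectivity or privileges rather than the import itself.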
Regards !!!
I'm trying to connect to an Oracle database to check the number of records.
Scenario-1:
[user@hostname ingestion]$ sqoop eval --connect jdbc:oracle:thin:@//hostname_1:PORT_1/Service_1 --username USER --password PASSWORD --query 'select count(*) from SCHEMA_1.TABLE_1'
class path is /usr/hdp/current/hive-client/lib/libthrift-0.9.3.jar:
Warning: /usr/hdp/2.5.3.0-37/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/05/19 14:50:23 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.3.0-37
17/05/19 14:50:25 INFO hdfs.PeerCache: SocketCache disabled.
17/05/19 14:50:26 INFO manager.SqlManager: Using default fetchSize of 1000
17/05/19 14:50:26 WARN tool.EvalSqlTool: SQL exception executing statement: java.sql.SQLException: Io exception: Oracle Error ORA-12650
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:112)
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:146)
at oracle.jdbc.driver.DatabaseError.throwSqlException(DatabaseError.java:255)
at oracle.jdbc.driver.T4CConnection.logon(T4CConnection.java:387)
at oracle.jdbc.driver.PhysicalConnection.<init>(PhysicalConnection.java:441)
at oracle.jdbc.driver.T4CConnection.<init>(T4CConnection.java:165)
at oracle.jdbc.driver.T4CDriverExtension.getConnection(T4CDriverExtension.java:35)
at oracle.jdbc.driver.OracleDriver.connect(OracleDriver.java:801)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:215)
at org.apache.sqoop.manager.OracleManager.makeConnection(OracleManager.java:327)
at org.apache.sqoop.manager.GenericJdbcManager.getConnection(GenericJdbcManager.java:52)
at org.apache.sqoop.tool.EvalSqlTool.run(EvalSqlTool.java:64)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:76)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:225)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:234)
at org.apache.sqoop.Sqoop.main(Sqoop.java:243)
Scenario-2:
[user@hostname ingestion]$ sqoop eval --connect jdbc:oracle:thin:@//hostname_2:PORT_2/Service_2 --username USER --password PASSWORD --query 'select count(*) from SCHEMA_2.TABLE_2'
class path is /usr/hdp/current/hive-client/lib/libthrift-0.9.3.jar:
Warning: /usr/hdp/2.5.3.0-37/accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
17/05/19 15:02:21 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6.2.5.3.0-37
17/05/19 15:02:23 INFO hdfs.PeerCache: SocketCache disabled.
17/05/19 15:02:23 INFO manager.SqlManager: Using default fetchSize of 1000
17/05/19 15:02:24 INFO manager.OracleManager: Time zone has been set to GMT
------------------------
| COUNT(*) |
------------------------
| 43 |
------------------------
The first one gives me an error, whereas the second one gives the expected result.
Can anyone help?
JAR used: ojdbc14-10.2.0.4.0.jar
After spending quite some time, I found the issue. The edge node where Sqoop runs could not telnet to the database host because of closed ports (OS patching had been done the week before, and the Linux admins closed the ports and did not reopen them).
Once the ports were opened on the edge node where Sqoop runs, I was able to execute the sqoop eval.
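For anyone hitting the same thing, a quick reachability check from the edge node (hostname_1 and PORT_1 being the placeholders from Scenario-1) is:

telnet hostname_1 PORT_1

or, where telnet is not installed:

nc -zv hostname_1 PORT_1

A refused or timed-out connection points to a firewall/network problem rather than to Sqoop or Oracle.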
I want to export an HDFS file to SQL Server. I'm using Sqoop for that:
sqoop export --bindir . --connect "jdbc:sqlserver://server;database=db" --username sa --password pwd --table sqoop_test -m 1 --export-dir /user/sqooptest
but I get the following error:
Warning: /usr/local/sqoop/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /usr/local/sqoop/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
Warning: /usr/local/sqoop/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../zookeeper does not exist! Accumulo imports will fail.
Please set $ZOOKEEPER_HOME to the root of your Zookeeper installation.
16/07/30 03:59:06 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/07/30 03:59:07 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/07/30 03:59:07 INFO manager.SqlManager: Using default fetchSize of 1000
16/07/30 03:59:07 INFO tool.CodeGenTool: Beginning code generation
16/07/30 03:59:07 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM [sqoop_test] AS t WHERE 1=0
16/07/30 03:59:07 INFO orm.CompilationManager: HADOOP_MAPRED_HOME is /usr/local/hadoop/hadoop-2.6.0
Note: ./sqoop_test.java uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
16/07/30 03:59:10 INFO orm.CompilationManager: Writing jar file: ./sqoop_test.jar
16/07/30 03:59:10 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
java.lang.NullPointerException
at java.util.Objects.requireNonNull(Objects.java:203)
at java.util.Arrays$ArrayList.<init>(Arrays.java:3813)
at java.util.Arrays.asList(Arrays.java:3800)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:76)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListingNoSort(FileListing.java:82)
at org.apache.sqoop.util.FileListing.getFileListing(FileListing.java:67)
at com.cloudera.sqoop.util.FileListing.getFileListing(FileListing.java:39)
at org.apache.sqoop.orm.CompilationManager.addClassFilesFromDir(CompilationManager.java:284)
at org.apache.sqoop.orm.CompilationManager.jar(CompilationManager.java:346)
at org.apache.sqoop.tool.CodeGenTool.generateORM(CodeGenTool.java:109)
at org.apache.sqoop.tool.ExportTool.exportTable(ExportTool.java:64)
at org.apache.sqoop.tool.ExportTool.run(ExportTool.java:100)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
The file has only 3 rows with three columns each; it has no null values. I tried using --input-null-string as well.
My SQL Server table:
create table sqoop_test
(id int,
name nvarchar(200),
title nvarchar(200))
And the file content in HDFS is:
5,X,analyst
6,Y,architect
7,Z,lead
I have used Sqoop to transfer data between HDFS and Oracle as shown below:
hadoop@jiogis-cluster-jiogis-master-001:~$ sqoop import --connect jdbc:oracle:gis-scan.ril.com/SAT --username=r4g_viewer --password=viewer_123 --table=R4G_OSP.ENODEB --hive-import --hive-table=ENODEB --target-dir=user/hive/warehouse/proddb/JioCenterBoundary -- direct
And I get the error shown below when I run Sqoop as shown above:
Warning: /volumes/disk1/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Warning: /volumes/disk1/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../hcatalog does not exist! HCatalog jobs will fail.
Please set $HCAT_HOME to the root of your HCatalog installation.
Warning: /volumes/disk1/sqoop-1.4.6.bin__hadoop-2.0.4-alpha/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
16/05/09 11:11:19 INFO sqoop.Sqoop: Running Sqoop version: 1.4.6
16/05/09 11:11:19 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
16/05/09 11:11:19 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
16/05/09 11:11:19 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
16/05/09 11:11:19 INFO oracle.OraOopManagerFactory: Data Connector for Oracle and Hadoop is disabled.
16/05/09 11:11:19 ERROR tool.BaseSqoopTool: Got error creating database manager: java.io.IOException: No manager for connect string: jdbc:oracle:gis-scan.ril.com/SAT
at org.apache.sqoop.ConnFactory.getManager(ConnFactory.java:191)
at org.apache.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:256)
at org.apache.sqoop.tool.ImportTool.init(ImportTool.java:89)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:593)
at org.apache.sqoop.Sqoop.run(Sqoop.java:143)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:179)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:218)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:227)
at org.apache.sqoop.Sqoop.main(Sqoop.java:236)
Your JDBC connection string does not look correct. Can you try it in this format:
--connect jdbc:oracle:thin:@//hostname:port/servicename
In your case, this is probably:
--connect jdbc:oracle:thin:@//gis-scan.ril.com:1521/SAT
You may want to double-check that the port number is correct, as the scan listener may not be on the default 1521 port.
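Putting that into your original command, a sketch (1521 is an assumption per the note above; -P replaces the inline password, as the log's own warning suggests; --direct is left out, since the stray space in "-- direct" likely means it was never parsed as the --direct flag anyway):

sqoop import --connect jdbc:oracle:thin:@//gis-scan.ril.com:1521/SAT --username r4g_viewer -P --table R4G_OSP.ENODEB --hive-import --hive-table ENODEB --target-dir user/hive/warehouse/proddb/JioCenterBoundary

You may also want to check whether --target-dir should start with a leading /; as written it is relative to your HDFS home directory.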
I am using Sqoop to transfer data to HDFS from Teradata and am getting the error below:
-bash-4.1$ sqoop import --connection-manager com.cloudera.sqoop.manager.DefaultManagerFactory --driver com.teradata.jdbc.TeraDriver \
--connect jdbc:teradata://dwsoat.dws.company.co.uk/DATABASE=TS_72258_BASELDB \
--username userid -P --table ADDRESS --num-mappers 3 \
--target-dir /user/nathalok/ADDRESS
Warning: /apps/cloudera/parcels/CDH-5.1.3-1.cdh5.1.3.p0.12/bin/../lib/sqoop/../accumulo does not exist! Accumulo imports will fail.
Please set $ACCUMULO_HOME to the root of your Accumulo installation.
14/10/29 14:00:14 INFO sqoop.Sqoop: Running Sqoop version: 1.4.4-cdh5.1.3
14/10/29 14:00:14 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
14/10/29 14:00:14 ERROR sqoop.ConnFactory: Sqoop wasn't able to create connnection manager properly. Some of the connectors supports explicit --driver and some do not. Please try to either specify --driver or leave it out.
14/10/29 14:00:14 ERROR tool.BaseSqoopTool: Got error creating database manager: java.io.IOException: java.lang.NoSuchMethodException: com.cloudera.sqoop.manager.DefaultManagerFactory.<init>(java.lang.String, com.cloudera.sqoop.SqoopOptions)
at org.apache.sqoop.ConnFactory.getManager(ConnFactory.java:165)
at org.apache.sqoop.tool.BaseSqoopTool.init(BaseSqoopTool.java:243)
at org.apache.sqoop.tool.ImportTool.init(ImportTool.java:84)
at org.apache.sqoop.tool.ImportTool.run(ImportTool.java:494)
at org.apache.sqoop.Sqoop.run(Sqoop.java:147)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
at org.apache.sqoop.Sqoop.runSqoop(Sqoop.java:183)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:222)
at org.apache.sqoop.Sqoop.runTool(Sqoop.java:231)
at org.apache.sqoop.Sqoop.main(Sqoop.java:240)
Caused by: java.lang.NoSuchMethodException: com.cloudera.sqoop.manager.DefaultManagerFactory.<init>(java.lang.String, com.cloudera.sqoop.SqoopOptions)
at java.lang.Class.getConstructor0(Class.java:2810)
at java.lang.Class.getDeclaredConstructor(Class.java:2053)
at org.apache.sqoop.ConnFactory.getManager(ConnFactory.java:151)
... 9 more
-bash-4.1$
Any help will be appreciated.
To get Teradata working properly using a Cloudera distribution, you need to do the following:
Install the Teradata JDBC jars in /var/lib/sqoop. For me these were terajdbc4.jar and tdgssconfig.jar.
Install either the Cloudera Connector Powered by Teradata or the Cloudera Connector for Teradata somewhere on your filesystem (I prefer /var/lib/sqoop).
In /etc/sqoop/conf/managers.d/, create a file (of any name) and add com.cloudera.connector.teradata.TeradataManagerFactory=<location of connector jar>. For example, I have /etc/sqoop/conf/managers.d/teradata => com.cloudera.connector.teradata.TeradataManagerFactory=/var/lib/sqoop/sqoop-connector-teradata-1.2c5.jar.
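With the connector registered this way, the explicit --driver and --connection-manager flags from your command should no longer be needed. A sketch of the resulting import, reusing the values from the question:

sqoop import \
  --connect jdbc:teradata://dwsoat.dws.company.co.uk/DATABASE=TS_72258_BASELDB \
  --username userid -P --table ADDRESS --num-mappers 3 \
  --target-dir /user/nathalok/ADDRESS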
There are different ways to install the Teradata connector as well. For example, it may be easier to use Cloudera Manager.
If you're still having trouble, try reaching out to the sqoop mailing list.
I have Vectorwise 2.0.2 and Sqoop 1.4.1 installed.
When I'm trying to use sqoop-export:
sudo -u hdfs ./sqoop-export --driver com.ingres.jdbc.IngresDriver --connect jdbc:ingres://172.16.63.157:VW7/amit --username ingres -P -m 1 --table book_call_log --export-dir /user/hive/warehouse/book_call_log --input-fields-terminated-by '\001' --verbose
Warning: /usr/lib/hbase does not exist! HBase imports will fail.
Please set $HBASE_HOME to the root of your HBase installation.
Enter password:
12/06/08 19:00:52 INFO manager.SqlManager: Using default fetchSize of 1000
12/06/08 19:00:52 INFO tool.CodeGenTool: Beginning code generation
12/06/08 19:00:53 INFO manager.SqlManager: Executing SQL statement: SELECT t.* FROM book_call_log AS t WHERE 1=0
The operation gets stuck here; no error is indicated and the prompt does not return.
Any help related to this is appreciated.