SAP HANA Sqoop Import - hadoop

I am trying to run a Sqoop import from a HANA view. I have tried many approaches and the error still persists. Has anyone had a similar experience? Please help me figure out whether I am missing something.
Sqoop job:
sqoop import --driver com.sap.db.jdbc.Driver --connect 'jdbc:sap://hostname:30015?currentschema="_SYS_BIC"' --username HDP_READ --password-file /path/vangaphx/clrhanapwd --query 'SELECT * from ZFI_LOS_SUMMARY_NET_GROSS WHERE $CONDITIONS' --delete-target-dir --target-dir /user/vangaphx/well2/SAPdatazone --num-mappers 1
Error:

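One variant worth trying, as a sketch: reference the view with its fully qualified, quoted name inside the query instead of relying on the currentschema URL parameter, and make sure the HANA JDBC driver jar (ngdbc.jar) is on Sqoop's classpath. Everything else below is kept from the original command; the quoted "_SYS_BIC" prefix is an assumption based on the schema named in the connect string.
sqoop import --driver com.sap.db.jdbc.Driver --connect 'jdbc:sap://hostname:30015' --username HDP_READ --password-file /path/vangaphx/clrhanapwd --query 'SELECT * FROM "_SYS_BIC"."ZFI_LOS_SUMMARY_NET_GROSS" WHERE $CONDITIONS' --delete-target-dir --target-dir /user/vangaphx/well2/SAPdatazone --num-mappers 1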
Related

Getting Protocol violation error while sqoop import from oracle to hive

I am trying to import data from Oracle to Hive through a sqoop import command, but I am getting a java.sql.SQLException: protocol violation error. I checked and found one text column with length 4000.
When I removed that column and ran the sqoop command, it worked.
So I found that it is because of that column alone that I get the protocol violation error.
Is this because of the length, or something else?
Can someone help me solve this? Below is the sqoop command I am using:
sqoop import --connect jdbc:oracle:thin:#:port/servicename --username --password --query "select * from table_name where $CONDITIONS" --hive-drop-import-delims --target-dir /user/test --map-column-java -m 1
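If that 4000-character column really is the trigger, a common workaround is to map it explicitly to a Java String rather than dropping it. A sketch, where LARGE_TEXT_COL is a placeholder for the actual column name and the host, service name, username, and password are left generic as in the original post:
sqoop import --connect jdbc:oracle:thin:@host:port/servicename --username <user> --password <password> --query 'select * from table_name where $CONDITIONS' --hive-drop-import-delims --map-column-java LARGE_TEXT_COL=String --target-dir /user/test -m 1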

Hive import not taking place using Sqoop

I am trying to import from MySQL into Hive but it is not happening with the query below:
sqoop import --connect jdbc:mysql://localhost/cars --username root --query 'Select carnum,carname from carsinfo where $CONDITIONS' --hive-import --hive-table exams.examresults --target-dir /hive_table1_data --m 1
I am getting the following error while importing:
Encountered IOException running import job: java.io.IOException.
I really don't understand what mistake I am making. I have spent hours on this, but nothing seems to work.
Thanks!
Looks like you forgot to specify the port.
Try this: 'jdbc:mysql://localhost:3306/cars'. This, of course, assumes that you are running MySQL on the default port.
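Putting that together with the original command, the full invocation would look roughly like this (assuming MySQL really is listening on the default port 3306):
sqoop import --connect jdbc:mysql://localhost:3306/cars --username root --query 'Select carnum,carname from carsinfo where $CONDITIONS' --hive-import --hive-table exams.examresults --target-dir /hive_table1_data --m 1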

Sqoop the data from oracle using HDP 2.4 on windows environment

I am trying to sqoop the data from Oracle using HDP 2.4 on a Windows environment.
Error message:
The system cannot find the file specified.
sqoop import --connect jdbc:oracle:thin:hud_reader/hud_reader_n1le@huiprd_nile:1527 --username hud_reader --password hud_reader_n1le --table <DATAAGGRUN> --target-dir C:\hadoop\hdp\temp --m 1
Help appreciated. Thank you.
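As a sketch of a cleaned-up command: the thin connect string normally needs the SID or service name after the host and port, the angle brackets around the table name should be dropped, and --target-dir should point at an HDFS path rather than a local Windows path. Here <SID> and the target directory are placeholders, not values from the original post:
sqoop import --connect jdbc:oracle:thin:@huiprd_nile:1527:<SID> --username hud_reader --password hud_reader_n1le --table DATAAGGRUN --target-dir /user/hud_reader/dataaggrun -m 1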

sqoop import is showing error

I'm using Hadoop 2.5.1 and Sqoop 1.4.6.
I am using sqoop import to import a table from a MySQL database for use with Hadoop. It is showing the following error.
Sqoop Command
sqoop import --connect jdbc:mysql://localhost/<dbname> --username hadoopsqoop --password hadoop#123 --table tablename -m 1
Exception
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.hadoop.fs.FSOutputSummer
Is there any way to figure out the issue?
I figured out the issue: I set HADOOP_HOME correctly and that solved my problem.
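For reference, a minimal sketch of setting HADOOP_HOME in the shell profile; the install path here is only an assumption, so point it at your actual Hadoop directory:
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:$HADOOP_HOME/bin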
How can you import without mentioning where to store the file? Try this:
sqoop import --connect jdbc:mysql://localhost/dbname --username hadoopsqoop --password hadoop#123 --table tablename --target-dir 'hdfspath' -m 1

How to import tables from sql server through sqoop to hdfs

I have Hadoop, Hive, and Sqoop installed. I imported the table from my database to HDFS but couldn't import it into Hive. Do I need to configure any file in Hive? Also, when I browsed the web, the configuration shown is for MySQL, but I am using the jdbc:sqlserver driver.
Can anyone please help me? I have been stuck on this for many days.
jdbc:mysql is for MySQL and it won't work for SQL Server; I tried using it and it gave errors. I have tried the command below and it worked wonderfully.
Command – import
Copy data from Database Table to HDFS File System
In the example below, our database & hdfs configuration is:
server name :- labDB
database name :- demo
SQL user name :- sqoop
SQL password :- simp1e
Driver Class Name :- com.microsoft.sqlserver.jdbc.SQLServerDriver
Table :- dbo.customers
Target Directory : /tmp/dbo-customer (HDFS Folder name)
Syntax:
sqoop import --connect jdbc:sqlserver://sqlserver-name \
--username <username> \
--password <password> \
--driver <driver-manager-class> \
--table <table-name> \
--target-dir <target-folder-name>
Sample:
sqoop import --connect "jdbc:sqlserver://labDB;database=demo" \
--username sqoop \
--password simp1e \
--driver com.microsoft.sqlserver.jdbc.SQLServerDriver \
--table "dbo.customer" \
--target-dir "/tmp/dbo-customer"
https://danieladeniji.wordpress.com/2013/05/06/hadoop-sqoop-importing-data-from-microsoft-sql-server/
You should be able to import a table and see it in Hive using the --hive-import flag.
Check that you have defined all the environment variables: HADOOP_HOME, SQOOP_HOME, and HIVE_HOME.
If that doesn't work for you, in the meantime you can always use the CREATE EXTERNAL TABLE syntax to make use of your imported data in Hive, as in the sketch below.
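For instance, a minimal external-table sketch over the target directory used in the sample above; the column list is an assumption, and the comma delimiter matches Sqoop's default delimited text output:
hive -e "CREATE EXTERNAL TABLE customers (id INT, name STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ',' LOCATION '/tmp/dbo-customer';"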
Have you used the specific --hive-import switch in the sqoop command line?
sqoop import --connect 'jdbc:sqlserver://sqlservername.mycompany.com;username=hadoop;password=hadoop;database=hadoop' --table dataforhive --hive-import
Just create an external Hive table on the path in HDFS, or use --hive-import.
Either of the two should work for you.
I also had the same problem: I could store my MySQL table in HDFS but couldn't store it in Hive. I simply imported the table into Hive using the following command, without first storing it in HDFS, and it worked for me.
sqoop import --connect jdbc:mysql://ipAddress:portNo/mysqldatabase --table mysqltablename --username mysqlusername --password mysqlpassword --hive-import --hive-table hivedatabase.hivetablename
