Sqoop: error when trying to import a table from a database

When I used the command below to import data from the database "MyData", an error occurred:
sqoop import --connect jdbc:sqlserver://localhost/MyData --table My_Practice --username sa -P --target-dir /userPavan/table -m 1
But when the database name is not given, as below, no error occurs:
sqoop import --connect jdbc:sqlserver://localhost --table My_Practice --username sa -P --target-dir /userPavan/table -m 1
However, I need to run the command with the database name. Can anyone suggest a fix?

Try using:
sqoop import --connect jdbc:sqlserver://localhost:port/databaseName --table My_Practice --username sa -P --target-dir /userPavan/table -m 1
SQL Server's default port is 1433.
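If the slash form still fails, note that the Microsoft JDBC driver normally takes the database as a connection property rather than a path segment (the slash form is jTDS syntax). A minimal sketch using the names from the question:
sqoop import --connect "jdbc:sqlserver://localhost:1433;databaseName=MyData" --table My_Practice --username sa -P --target-dir /userPavan/table -m 1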

Related

User does not belong to supergroup

I am getting an error while running the Sqoop import-all-tables command:
chgrp: changing ownership of 'hdfs://quickstart.cloudera:8020/user/hive/warehouse/retail_db.db/categories/part-m-00000': User does not belong to supergroup.
The command is sqoop import-all-tables --connect "jdbc:mysql://localhost:3306/retail_db" --username retail_dba -P --hive-import --hive-database retail_db --create-hive-table --hive-overwrite -m 2
I have checked the following files, but dfs.permissions is set to false.
[cloudera@quickstart ~]$ locate hdfs-site.xml
/etc/hadoop/conf.empty/hdfs-site.xml
/etc/hadoop/conf.impala/hdfs-site.xml
/etc/hadoop/conf.pseudo/hdfs-site.xml
/etc/impala/conf.dist/hdfs-site.xml
supergroup is not present in /etc/group.
Please suggest how to resolve this.
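For reference, this is what that switch looks like in hdfs-site.xml (the property is dfs.permissions on older releases; newer Hadoop versions call it dfs.permissions.enabled):
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>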
After a reboot of the machine I was somehow able to run the query below; it printed the same INFO message, but all the data was imported into all the tables.
sqoop import-all-tables --connect "jdbc:mysql://localhost:3306/retail_db" --username retail_dba -P --hive-import --hive-database retail_db --create-hive-table --hive-overwrite
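For what it's worth, a common workaround for the chgrp warning on the QuickStart VM is to create the group HDFS expects and add the local user to it. The user name cloudera is taken from the shell prompt above and may differ on your setup:
# create the missing group and add the current user to it
sudo groupadd supergroup
sudo usermod -a -G supergroup cloudera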

Importing Data In Sqoop 1.99.6

I've been able to import data from Oracle in Sqoop 1.99.6 by creating links and jobs. However, I was wondering if the following syntax can be used to import data:
sqoop import \
--connect jdbc:mysql://mysql.example.com/sqoop \
--username sqoop \
--password sqoop \
--table cities
I could only find a sqoop.sh file, and not a sqoop file, in the /<sqoop_home>/bin directory.
Thanks.
The syntax below can be used to import data from an Oracle database into HDFS using Sqoop:
/usr/bin/sqoop import --connect jdbc:oracle:thin:system/system@<host>:1521:xe --username <user> -P --table <schema>.<table> --columns "<columns>" --target-dir <target-dir> -m 1
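For instance, a minimal sketch with made-up host, schema, table, and column names:
/usr/bin/sqoop import \
--connect jdbc:oracle:thin:@dbhost.example.com:1521:xe \
--username system -P \
--table HR.EMPLOYEES \
--columns "EMPLOYEE_ID,FIRST_NAME" \
--target-dir /user/pavan/employees \
-m 1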

Sqoop error when connecting through teradata

I am unable to execute the sqoop command with Teradata.
I am getting this error:
[Error 8017] [SQLState 28000] The UserId, Password or Account is invalid.
Sqoop Command:
sqoop import --connect jdbc:teradata://TDPRODC/LOGMECH=LDAP \
--driver com.teradata.jdbc.TeraDriver \
--username svddssas \
--password ' ' \
--table ADW.GST_STST_V \
--hive-import \
--hive-table wins.gst_stst_v1 -m 1
Please make sure you have permissions with the given userId and password in Teradata.
You can check with the query below:
select * from dbc.logonoff where logdate >= date '2015-10-07'
Note: change the date to the execution date.
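If you only want this user's logon events, a narrower query along these lines should work; the columns logdate, logtime, username, and event are standard in DBC.LogOnOff, but verify them on your release:
select logdate, logtime, username, event
from dbc.logonoff
where logdate >= date '2015-10-07'
and username = 'SVDDSSAS';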

SQOOP connection-param-file format

In Sqoop for Hadoop you can use a parameters file for connection string information.
--connection-param-file filename Optional properties file that provides connection parameters
What is the format of that file?
Say for example I have:
jdbc:oracle:thin:@//myhost:1521/mydb
How should that be in a parameters file?
If you want to supply your database connection string and credentials, create a file with those details and use --options-file in your sqoop command.
Create a file database.props with the following contents:
import
--connect
jdbc:mysql://localhost:3306/test_db
--username
root
--password
password
Then your sqoop import command will look like:
sqoop --options-file database.props \
--table test_table \
--target-dir /user/test_data
And regarding --connection-param-file itself, see the note below on its usage.
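--connection-param-file takes a standard Java properties file, and each key=value entry is passed through to the JDBC driver as a connection property. A sketch with illustrative Oracle driver properties (the property values here are assumptions, not taken from the question):
# connection.properties -- passed to the JDBC driver, not parsed by Sqoop
oracle.jdbc.timezoneAsRegion=false
oracle.net.CONNECT_TIMEOUT=10000
Then reference it from the command:
sqoop import --connect jdbc:oracle:thin:@//myhost:1521/mydb --connection-param-file connection.properties --username foo -P --table test_table --target-dir /user/test_data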
It should be the same as on the command line.
Example
import
--connect
jdbc:oracle:thin:@//myhost:1521/mydb
--username
foo
Below is a sample command connecting to a MySQL server:
sqoop list-databases --connect jdbc:mysql://192.168.256.156/test --username root --password root
It will give you the list of databases available on your MySQL server.

hadoop sqoop export table to sql server error

I am new to Hadoop and have recently started working with Sqoop. While trying to export a table from Hadoop to SQL Server, I am getting the following error:
input path does not exist hdfs://sandbox:8020/user/root/
The command I am using is:
sqoop export --connect "jdbc:sqlserver://<server>;username=<user>;password=xxxxx;database=<database>" --table <table> --export-dir /user/root/ --input-fields-terminated-by " "
Could you please guide me on what I am missing here?
Also, could you please tell me the command to navigate to the Hadoop directory where the tables are stored?
For a proper Sqoop export, Sqoop requires the complete data file location; you can't just specify the root folder.
Try specifying the complete source path:
sqoop export --connect jdbc:oracle:thin:<>:1521/<> --username <> --password <> --table <> --export-dir hdfs://<>/user/<>/<> -m 1 --input-fields-terminated-by '|' --input-null-string '\\N' --input-null-non-string '\\N'
Hope this helps.
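As for browsing HDFS to see where the data files actually live, the standard listing command is hdfs dfs -ls; the path below is the one from the question:
hdfs dfs -ls /user/root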
