User does not belong to supergroup - sqoop

I am getting error while running Sqoop import-all-table command.
chgrp: changing ownership of 'hdfs://quickstart.cloudera:8020/user/hive/warehouse/retail_db.db/categories/part-m-00000': User does not belong to supergroup.
The command is:
sqoop import-all-tables --connect "jdbc:mysql://localhost:3306/retail_db" --username retail_dba -P --hive-import --hive-database retail_db --create-hive-table --hive-overwrite -m 2
I have checked the following files, but dfs.permissions is set to false in each of them:
[cloudera@quickstart ~]$ locate hdfs-site.xml
/etc/hadoop/conf.empty/hdfs-site.xml
/etc/hadoop/conf.impala/hdfs-site.xml
/etc/hadoop/conf.pseudo/hdfs-site.xml
/etc/impala/conf.dist/hdfs-site.xml
The supergroup group is not present in /etc/group.
Please suggest how to resolve this.
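A quick way to confirm which value the client configuration actually resolves, rather than grepping every copy of hdfs-site.xml, is hdfs getconf (assuming the hdfs command is on the PATH; both the older and newer property names are shown):
hdfs getconf -confKey dfs.permissions
hdfs getconf -confKey dfs.permissions.enabled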

After rebooting the machine, I was somehow able to run the command below; the same INFO message appeared, but all data was present in all tables.
sqoop import-all-tables --connect "jdbc:mysql://localhost:3306/retail_db" --username retail_dba -P --hive-import --hive-database retail_db --create-hive-table --hive-overwrite
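The chgrp line is only a warning from the Hive import step, which is why the data still shows up. On a single-node setup like the QuickStart VM, one common way to make it go away is to create the group HDFS expects (dfs.permissions.superusergroup defaults to supergroup) and add the importing user to it; a minimal sketch, assuming the cloudera user from the prompt above:
sudo groupadd supergroup
sudo usermod -a -G supergroup cloudera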

Related

I want to know why the tables I imported from SQL Server into the Hive database using Sqoop are disappearing.

So I'm trying to import-all-tables into the Hive database, i.e., /user/hive/warehouse/... on HDFS, using the command below:
sqoop import-all-tables --connect "jdbc:sqlserver://<servername>;database=<dbname>" \
--username "<username>" \
--password "<password>" \
--warehouse-dir "/user/hive/warehouse/" \
--hive-import \
-m 1
In the test database I have 3 tables. When the MapReduce job runs, the output is success, i.e., the job is 100% complete, but the files are not found in the Hive database.
It's basically getting overwritten by the last table. Try removing the forward slash at the end of the directory path. For the tests I would suggest not using the warehouse directory; use something like '/tmp/sqoop/allTables'.
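For example, keeping the connection placeholders from the question, the test run would look something like:
sqoop import-all-tables --connect "jdbc:sqlserver://<servername>;database=<dbname>" \
--username "<username>" \
--password "<password>" \
--warehouse-dir "/tmp/sqoop/allTables" \
--hive-import \
-m 1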
There is another way:
1. Create a Hive database pointing to a location, say "targetLocation" (a sketch of this step follows below).
2. Create an HCatalog table in your Sqoop import using the previously created database.
3. Use the target-dir import option to point to that targetLocation.
You don't need to define the warehouse directory; just define the Hive database and it will automatically find the working directory.
sqoop import-all-tables --connect "jdbc:sqlserver://xxx.xxx.x.xxx:xxxx;databaseName=master" --username xxxxxx --password xxxxxxx --hive-import --create-hive-table --hive-database test -m 1
It will just run like a rocket. Hope it works for you.
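A minimal sketch of step 1, assuming the database is called test and the target location is /user/hive/warehouse/test.db (both names are illustrative), run through the hive CLI:
hive -e "CREATE DATABASE IF NOT EXISTS test LOCATION '/user/hive/warehouse/test.db'"
After that, the import-all-tables command above with --hive-database test writes every table under that location.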

Importing Data In Sqoop 1.99.6

I've been able to import data from Oracle in Sqoop 1.99.6 by creating links and jobs. However, I was wondering if the following syntax can be used to import data:
sqoop import \
--connect jdbc:mysql://mysql.example.com/sqoop \
--username sqoop \
--password sqoop \
--table cities
I could only find the sqoop.sh file, not a sqoop file, in the /<sqoop_home>/bin directory.
Thanks.
The syntax below can be used to import data from an Oracle database to HDFS using Sqoop (the angle-bracket values are placeholders):
/usr/bin/sqoop import --connect jdbc:oracle:thin:system/system@<host>:1521:xe --username <username> -P --table <schema>.<table> --columns "<columns>" --target-dir <target_dir> -m 1
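For what it's worth, the sqoop import one-liner asked about is Sqoop 1 syntax; the 1.99.x (Sqoop 2) releases ship only the client shell, which is why there is no plain sqoop script next to sqoop.sh. A rough sketch of the Sqoop 2 flow, assuming a server on localhost:12000 (exact option names differ between 1.99.x releases, so use help inside the shell):
/<sqoop_home>/bin/sqoop.sh client
# inside the sqoop:000> prompt:
#   set server --host localhost --port 12000 --webapp sqoop
#   show connector
#   create link ...   (one link for the JDBC side, one for HDFS)
#   create job ...    (from one link to the other)
#   start job ...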

Sqoop error when connecting through Teradata

I am unable to execute the sqoop command with Teradata.
I am getting this error:
[Error 8017] [SQLState 28000] The UserId, Password or Account is invalid.
Sqoop Command:
sqoop import --connect jdbc:teradata://TDPRODC/LOGMECH=LDAP \
--driver com.teradata.jdbc.TeraDriver \
--username svddssas \
--password ' ' \
--table ADW.GST_STST_V \
--hive-import \
--hive-table wins.gst_stst_v1 -m 1
Please make sure you have permissions for that user ID and password in Teradata. You can check the logon history with the query below:
select * from dbc.logonoff where logdate >= date '2015-10-07'
Note: change the date to the execution date.
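Also, --password ' ' in the command above passes a single space as the password, which is worth double-checking. One way to confirm the LDAP credentials outside of Sqoop, assuming the BTEQ client is installed (TDPRODC and svddssas are taken from the question):
bteq <<'EOF'
.LOGMECH LDAP
.LOGON TDPRODC/svddssas,<password>
.LOGOFF
.QUIT
EOF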

Hadoop Sqoop export table to SQL Server error

I am new to Hadoop and have recently started working with Sqoop. While trying to export a table from Hadoop to SQL Server, I am getting the following error:
input path does not exist hdfs://sandbox:8020/user/root/
The command I am using is:
sqoop export --connect "jdbc:sqlserver://<servername>;username=<username>;password=xxxxx;database=<dbname>" --table <tablename> --export-dir /user/root/ --input-fields-terminated-by " "
Could you please guide me on what I am missing here?
Also, could you please let me know the command to navigate to the Hadoop directory where the tables are stored?
For a proper Sqoop export, Sqoop requires the complete data file location; you can't just specify the root folder.
Try specifying the complete source path:
sqoop export --connect jdbc:oracle:thin:<>:1521/<> --username <> --password <> --table <> --export-dir hdfs://<>/user/<>/<> -m 1 --input-fields-terminated-by '|' --input-null-string '\\N' --input-null-non-string '\\N'
Hope this helps
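As for browsing the directories, HDFS is navigated with hdfs dfs (or the older hadoop fs) rather than cd; for example, assuming the paths from the error message:
hdfs dfs -ls /user/root/
hdfs dfs -ls /user/root/<tablename>
The second listing should show the part-m-* files, and that full path is what --export-dir needs.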

Sqoop: error when trying to import a table from a database

When I used the command below to import data from the database "MyData", an error occurred.
sqoop import --connect jdbc:sqlserver://localhost/MyData --table My_Practice --username sa -P --target-dir /userPavan/table -m 1
But when the database name is not given, as below, no error occurred:
sqoop import --connect jdbc:sqlserver://localhost --table My_Practice --username sa -P --target-dir /userPavan/table -m 1
But I need to run the command with the database name. Can anyone suggest a solution?
Try using:
sqoop import --connect "jdbc:sqlserver://localhost:<port>;databaseName=<databaseName>" --table My_Practice --username sa -P --target-dir /userPavan/table -m 1
SQL Server's default port is 1433.
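Filled in with the values from the question and the default port, that becomes (assuming the Microsoft JDBC driver):
sqoop import --connect "jdbc:sqlserver://localhost:1433;databaseName=MyData" --table My_Practice --username sa -P --target-dir /userPavan/table -m 1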
