Sqoop Import Using Greenplum - sqoop

I want to import data into Hive from Greenplum using Sqoop.
I am able to successfully import data from the default schema of my Greenplum user.
But I am not able to fetch data from tables present in other schemas of Greenplum.
I have tried various options.
Can you please help?
Thanks in advance.

Which Sqoop version do you use?
With v1.4.3 you can set the schema parameter.
With v1.4.2 you can use a free-form query (--query) with the schema.
I tried it and it works fine.

Sqoop itself doesn't have a notion of "schema". Some specialized connectors (PostgreSQL, Microsoft SQL Server) expose the ability to specify a schema, but as Sqoop doesn't have a specialized connector for Greenplum, that won't help you here.
You should be able to use a query-based import instead of a table-based one and specify the schema name in the query, e.g. something like:
sqoop import --query 'SELECT * FROM schema.tablename WHERE $CONDITIONS' --target-dir /tmp/tablename -m 1
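Since the original goal is to land the data in Hive, the free-form query can be combined with --hive-import. A minimal sketch, assuming Greenplum accepts a PostgreSQL-style JDBC URL; the host, database, schema, table, and Hive table names are placeholders:

```shell
# Free-form query import into Hive; the schema is qualified in the SQL itself.
# Single quotes keep the shell from expanding $CONDITIONS (Sqoop substitutes it).
sqoop import \
  --connect jdbc:postgresql://gphost:5432/mydb \
  --username gpuser -P \
  --query 'SELECT * FROM myschema.mytable WHERE $CONDITIONS' \
  --target-dir /tmp/myschema_mytable \
  --hive-import --hive-table mytable \
  -m 1
```

Using -m 1 avoids the need for --split-by when the query has no obvious split column.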

You can take advantage of the custom schema option.
Try with:
-- --schema <schema_name>
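For connectors that support it (the PostgreSQL direct connector, for example), the schema is passed as an extra argument after a lone `--` at the end of the command; whether this applies to Greenplum depends on Sqoop treating the JDBC URL as PostgreSQL. A hedged sketch with placeholder host, database, table, and schema names:

```shell
# Connector-specific arguments go after a bare "--" at the end of the command.
# mydb, mytable, and myschema are placeholders -- substitute your own values.
sqoop import \
  --connect jdbc:postgresql://host:5432/mydb \
  --username user -P \
  --table mytable \
  -- --schema myschema
```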

Related

sqoop import-all-tables - with SQL Server imports system tables

I am trying to use sqoop import-all-tables to get the data from SQL Server into HDFS from a particular database.
After it imports all the expected tables from the DB successfully, it also tries to import the system tables in the DB. Is there a way to force Sqoop to import only non-system tables?
Thanks.
It looks like a couple of system tables are listed as user tables; hence the issue.
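One hedged workaround, assuming the offending tables show up under predictable names: sqoop import-all-tables accepts an --exclude-tables option taking a comma-separated list. A sketch with placeholder connection details and example table names:

```shell
# Import every table in the database except the listed ones.
# The connection string, credentials, and table names are placeholders.
sqoop import-all-tables \
  --connect 'jdbc:sqlserver://host:1433;database=MyDB' \
  --username user -P \
  --exclude-tables trace_xe_action_map,trace_xe_event_map
```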

Migrate oracle data (.dmp file) to mongodb

I have installed MongoDB on my Linux machine. Now I want to migrate my Oracle DB to MongoDB. I have a .dmp file of my Oracle user which needs to be imported into MongoDB.
I am not sure whether I can directly use mongoimport to import it, or whether there is another tool available for this migration.
Any help will be appreciated.
Thank you!
We can import the Oracle data with mongoimport, but not from a .dmp file directly; the table first needs to be exported in .csv format.
We can export an Oracle table in .csv format with the help of Toad or SQL Developer.
Once we have the .csv file ready, we can import it into our MongoDB with the mongoimport command.
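Once the table has been exported to CSV, the import itself is a one-liner. A minimal sketch with placeholder database, collection, and file names:

```shell
# --headerline tells mongoimport to use the first CSV row as the field names.
mongoimport --db mydb --collection mytable \
  --type csv --headerline --file mytable.csv
```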

What is the actual use case of using eval in production? If we need to query the DB, we can access it directly. Why would someone go through Sqoop?

I would like to know the importance of eval in Sqoop. With this command, we can query the remote database through Sqoop. But I would like to know the real use case of it, especially in production, as I don't see any.
First of all, the sqoop eval tool is for evaluation purposes only.
As per the Sqoop documentation:
Warning
The eval tool is provided for evaluation purpose only. You can use it to verify a database connection from within Sqoop or to test simple queries. It's not supposed to be used in production workflows.
Regarding the use case of eval:
You can preview the result of SQL queries on the console. This helps the developer preview sqoop import queries.
Sqoop eval is used to verify an established database connection and preview query results, e.g.:
sqoop eval --connect "connection_url" --username <username> -P --query "select count(*) from table_name"
Don't use it in a production environment; it's not a good practice.
It's just to verify your connection.

Import of data in Oracle using impdp fails because of missing dependencies

I need to export a subset of an Oracle table and import it in another Oracle instance. The export using expdp works pretty well but when I try to import the data in the other instance using impdp tool it fails because there are dependencies (foreign keys) missing. Is there any option to force expdp tool to export all required dependencies as well?
No.
You should make sure your dump set is complete.
What you could try is using impdp to generate a SQL file, then applying the generated SQL to the other database to create only the table(s). For this you might need to do some editing on the generated SQL until it fits your task.
Next, use impdp with content=data_only to import the rows into the pre-created table(s).
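The two-step approach above can be sketched as follows; the credentials, directory object, and dump file name are placeholders:

```shell
# Step 1: write the DDL to a SQL file instead of executing the import.
impdp user/password@target directory=DUMP_DIR dumpfile=export.dmp sqlfile=precreate.sql
# (edit precreate.sql as needed, then run it in the target database)

# Step 2: load only the rows into the pre-created tables.
impdp user/password@target directory=DUMP_DIR dumpfile=export.dmp content=data_only
```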

Import and Export Data plus schema using SQLDeveloper 3.0.04

I am a newbie to Oracle and I would like to export a database from a remote database and import it on my local machine. On both machines I have Oracle 10.2.
I need to know how to export/import the schema and data from Oracle 10.2 using SQL Developer 3.0.04.
To export from the remote database, I used the export tool: Tools -> Database Export -> export wizard.
At the end I got only a SQL file with DDL and DML statements, but somewhere in the file it is written:
"Cannot render Table DDL for object XXX.TABLE_NAME with DBMS_METADATA attempting internal generator error."
I ignored the previously mentioned message and tried to run those DDL and DML statements, but it all ended up with errors.
Is it possible that all this is tied to a read-only database user? Moreover, in SQL Developer I don't find any tables under Tables, only tables under Other Users.
Thanks in advance
As a test, can you select one object in the tree and navigate to the Script panel? SQL Developer also uses DBMS_METADATA to generate those scripts.
Also, as a workaround, try using Data Pump to export and import your data. It will be much more efficient for moving around larger schemas.
Your note about not seeing tables under Tables indicates your schema doesn't actually own any tables. You may be working with synonyms that allow you to query objects as if they are in your account. You could be running into a privilege issue, but your error message doesn't indicate that. Error messages often come in bunches, and the very first one is usually the most important.
If you could try using the EXPORT feature against a very simple schema like SCOTT as a test, this should indicate whether there is a problem with your account settings or with the software.
I'm not sure about SQL Developer 3.0, but with version 3.1 you can follow this:
SQL Developer 3.1 Data Pump Wizards (expdp, impdp)