MySQL dump for all tables with LIMIT 1

Is it possible to take a dump of all the tables in a MySQL database with a single record each (LIMIT 1)? Any help is appreciated.
Thanks

Use this:
mysqldump db_name --where="1 limit 1" > backup-file.sql
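For completeness, a sketch of the same command with explicit connection options (user and db_name are placeholders; the --where value is appended to the SELECT that mysqldump runs for every table, so each table contributes at most one row):

# Dump one row per table; -p prompts for the password.
mysqldump -u user -p --where="1 limit 1" db_name > backup-file.sql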

Related

Oracle database in data source as {e5e86-......}

Data Source: check this image. Instead of the Oracle database name I am getting these {......} variables. Why so? Thanks in advance.

Impala command to know DB table size

Is there any way we can check a table's size and other properties? I tried COMPUTE STATS, but it gives the details of the table except the size.
Any link to this information and other details is much appreciated.
SHOW TABLE STATS tablename
Works as I wanted, thanks a lot.
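For reference, a minimal sketch of running this non-interactively (mydb and tablename are placeholder names; SHOW TABLE STATS prints row count, file count, size and format per partition):

# Run the statement from the shell instead of the interactive impala-shell prompt.
impala-shell -q "SHOW TABLE STATS mydb.tablename"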

Find a table across databases in hadoop

I would like to find a specific table across multiple databases in Hadoop. I'm looking for an automated solution, since there are dozens of databases involved.
Is there a hive command that could help me to do this?
Or I have to write something in bash instead?
Thanks
You can simply query your metastore. In my case MySQL is the metastore, so I did it this way:
Connect to your metastore, e.g. mysql -uUser -hHost -pPassword
Switch to your metastore DB, e.g. use metastoredb;
select * from TBLS where TBL_NAME='table_name';
I queried three columns, and here is the output that I got:
select TBL_ID,DB_ID,TBL_NAME from TBLS where TBL_NAME='ri_reg_datamodels_tmp';
Ask me if you run into any issue with it.
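If you also want to know which database each match lives in, a join with the DBS table should work. This is only a sketch assuming a standard Hive metastore schema; metastoredb, the credentials and table_name are placeholders:

# DBS holds the database name; TBLS references it via DB_ID.
mysql -uUser -hHost -pPassword metastoredb -e "
  SELECT d.NAME AS db_name, t.TBL_ID, t.TBL_NAME
  FROM TBLS t JOIN DBS d ON t.DB_ID = d.DB_ID
  WHERE t.TBL_NAME = 'table_name';"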

How to take a database backup in Hive? I mean a Hive database backup

Could anyone please suggest how to take a Hive database backup? We are using MapR.
Regards,
Sunilkumar
Currently I have taken backups of the Hive DB by using the EXPORT/IMPORT utilities that Hive provides. They back up both the metadata (Hive structure info) and the actual data.
EXPORT TABLE tablename [PARTITION (part_column="value"[, ...])]
  TO 'export_target_path' [ FOR replication('eventid') ]

IMPORT [[EXTERNAL] TABLE new_or_original_tablename
  [PARTITION (part_column="value"[, ...])]]
  FROM 'source_path'
  [LOCATION 'import_target_path']
But the problem with the above method is that you need to issue this statement for every individual table (a small loop can generate the statements, as sketched below).
The other method is to get a list of all the available tables in the Hive DB by querying the MySQL database that holds the metadata for all the Hive tables. Refer to the TBLS table in MySQL for the list of tables.
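To avoid writing the EXPORT statement by hand for each table, a small shell loop along these lines could generate it from the table list. This is only a sketch; mydb and the HDFS path /tmp/hive_backup are placeholder names:

#!/bin/bash
# Export every table of one Hive database to an HDFS backup directory.
DB=mydb
TARGET=/tmp/hive_backup
for t in $(hive -S -e "SHOW TABLES IN $DB;"); do
  hive -e "EXPORT TABLE $DB.$t TO '$TARGET/$t';"
done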

How to copy bytea from Postgres to Oracle

I have a Postgres 9.1 table plines with a bytea field shape.
The number of records is about 500,000.
What is the best way to copy bytea data plines.shape from Postgres to a field shape of an Oracle 10g table olines?
Thank you in advance, ysa
I'd create a program in Java that connects to PostgreSQL (using the PostgreSQL JDBC driver) and Oracle (using the Oracle Instant Client) simultaneously, then reads a row from Postgres, puts that row into the Oracle table, and repeats.
This would be much easier the other way around... ;-)
