Is it possible to take a dump of all tables in a MySQL database with a single record each (LIMIT 1)? Any help is appreciated.
Thanks
Use this:
mysqldump db_name --where="1 limit 1" > backup-file.sql
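The --where clause is appended to the query mysqldump runs for every table, which is what makes the trick work. If you need per-table control (for example, one output file per table), a minimal shell sketch looks like this; the database and table names are placeholders, and in practice you would fill the list from SHOW TABLES:

```shell
# Per-table variant of the trick above (dry run: prints the commands).
# Placeholders: db_name and the table list. Get the real list with:
#   mysql -N -e "SHOW TABLES FROM db_name"
db=db_name
tables="users orders"                    # placeholder list
for t in $tables; do
  cmd="mysqldump $db $t --where=\"1 limit 1\" > $t.sql"
  echo "$cmd"                            # replace echo with: eval "$cmd"
done
```

Leaving the `echo` in place lets you inspect the commands before actually running them.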
Is there any way to check a DB table's size and other properties? I tried COMPUTE STATS, but it gives the table's details except the size.
Any link with more information would be much appreciated.
SHOW TABLE STATS tablename;
Works as I wanted, thanks a lot!
I would like to find a specific table across multiple databases in Hadoop. I'm looking for an automatic solution, since there are dozens of databases involved.
Is there a hive command that could help me to do this?
Or I have to write something in bash instead?
Thanks
You can simply query your metastore. In my case the metastore is MySQL, so I did it this way:
Connect to your metastore, e.g. mysql -uUser -hHost -pPassword
Switch to your metastore DB, e.g. use metastoredb;
select * from TBLS where TBL_NAME='table_name';
For example, querying three columns:
select TBL_ID,DB_ID,TBL_NAME from TBLS where TBL_NAME='ri_reg_datamodels_tmp';
Let me know if you run into any issues.
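To see which database each match belongs to (rather than just a numeric DB_ID), you can join the DBS table, which holds one row per Hive database in the standard metastore schema. A sketch, with placeholder connection details:

```shell
# Cross-database search in a MySQL-backed Hive metastore.
# DBS.NAME is the Hive database name; TBLS.DB_ID references DBS.DB_ID.
QUERY="SELECT d.NAME AS db_name, t.TBL_NAME
FROM TBLS t JOIN DBS d ON t.DB_ID = d.DB_ID
WHERE t.TBL_NAME = 'table_name';"
echo "$QUERY"   # pipe into: mysql -uUser -hHost -pPassword metastoredb
```

This answers the "which of the dozens of databases has this table" question in a single query, with no per-database looping.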
Can anyone suggest how to take a Hive database backup? We are using MapR.
Regards
Sunilkumar
I have taken backups of a Hive DB using Hive's built-in EXPORT/IMPORT utilities. They back up both the metadata (the Hive structure info) and the actual data.
EXPORT TABLE tablename [PARTITION (part_column="value"[, ...])]
TO 'export_target_path' [ FOR replication('eventid') ]
IMPORT [[EXTERNAL] TABLE new_or_original_tablename
[PARTITION (part_column="value"[, ...])]]
FROM 'source_path'
[LOCATION 'import_target_path']
The problem with this method is that you need to issue the statement for every individual table.
The other method is to get a list of all the tables in the Hive DB by querying the MySQL database that holds the metadata for all the Hive tables. Refer to the TBLS table in MySQL for the list of tables.
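Combining the two ideas, you can script the per-table EXPORT statements. A sketch with a hardcoded placeholder table list; in practice you would fill it from the TBLS query, or from `hive -e 'USE mydb; SHOW TABLES;'`:

```shell
# Generate one EXPORT statement per table (dry run: prints the statements).
# Placeholders: mydb, the table list, and the /backup target path.
db=mydb
tables="sales customers"                 # placeholder; use SHOW TABLES in practice
for t in $tables; do
  stmt="EXPORT TABLE $db.$t TO '/backup/$db/$t';"
  echo "$stmt"                           # collect into a .hql file, then: hive -f backup.hql
done
```

Collecting the statements into one .hql file and running it once avoids paying Hive's session startup cost per table.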
I have a Postgres 9.1 table plines with a bytea field shape.
Number of records is about 500000.
What is the best way to copy bytea data plines.shape from Postgres to a field shape of an Oracle 10g table olines?
Thank you in advance, ysa
I'd create a program in Java that connects to PostgreSQL (using the PostgreSQL JDBC driver) and Oracle (using Oracle Instant Client) simultaneously, then reads a row from Postgres, inserts it into the Oracle table, and repeats.
This would be much easier the other way around... ;-)