Greenplum's pg_dump cannot support lock statement - greenplum

I'm using pg_dump to back up and recreate a database's structure. To that end I'm calling pg_dump -scf backup.sql for the backup, but it fails with the following error:
pg_dump: [archiver (db)] query failed: ERROR: Cannot support lock statement yet
pg_dump: [archiver (db)] query was: LOCK TABLE my_schema.my_table IN ACCESS SHARE MODE
I couldn't find any reference to this particular error anywhere. Is it possible to get around this?
Edit: for a little more context, I ran it in verbose mode, and this is what is displayed before the errors:
pg_dump: last built-in OID is 16383
pg_dump: reading extensions
pg_dump: identifying extension members
pg_dump: reading schemas
pg_dump: reading user-defined tables
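For reference, the combined -scf flags in the command above, written out separately, look like the following (the database name mydb is a placeholder, not taken from the question):
pg_dump -s -c -f backup.sql mydb   # -s = schema only, -c = emit DROP statements before CREATE, -f = output file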

Related

Dumping db schema using DBIx::Class::Schema::Loader

I want to dump the db schema so I can later use it with DBIx::Class.
The connection itself is apparently OK; however, there are complaints about moniker clashes that I've tried to resolve by using naming => {ALL=>'v8', force_ascii=>1}:
perl -MDBIx::Class::Schema::Loader=make_schema_at,dump_to_dir:./lib -E "$|++;make_schema_at('EC::Schema', { debug => 1, naming => {ALL=>'v8', force_ascii=>1} }, [ 'dbi:Oracle:', 'XX/PP#TNS' ])"
Output (ending up with no content in ./lib):
Bad table or view 'V_TRANSACTIONS', ignoring: DBIx::Class::Schema::Loader::DBI::_sth_for(): DBI Exception: DBD::Oracle::db prepare failed: ORA-04063: view "ss.vv" has errors (DBD ERROR: error possibly near <*> indicator at char 22 in 'SELECT * FROM ..
Unable to load schema - chosen moniker/class naming style results in moniker clashes. Change the naming style, or supply an explicit moniker_map: tables "ss"."AQ$_PRARAC_ASY_QUEUE_TABLE_S", "ss"."AQ$PRARAC_ASY_QUEUE_TABLE_S" reduced to the same source moniker 'AqDollarSignPraracAsyQueueTableS';
Any suggestions on how to resolve the clashes, or on some other way to dump the schema, are welcome.
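As the error itself suggests, one way forward is to supply an explicit moniker_map for the two clashing tables. A minimal sketch of the same one-liner, assuming a plain hashref keyed by table name is accepted here (the moniker names are only illustrative, keys may need schema qualification depending on the loader version, and the quoting follows the original command, so the $ characters would need escaping in a POSIX shell):
perl -MDBIx::Class::Schema::Loader=make_schema_at,dump_to_dir:./lib -E "$|++;make_schema_at('EC::Schema', { debug => 1, naming => {ALL=>'v8', force_ascii=>1}, moniker_map => { 'AQ$_PRARAC_ASY_QUEUE_TABLE_S' => 'AqUnderscorePraracAsyQueueTableS', 'AQ$PRARAC_ASY_QUEUE_TABLE_S' => 'AqPraracAsyQueueTableS' } }, [ 'dbi:Oracle:', 'XX/PP#TNS' ])"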

How to delete an external table in Hive when the hdfs path has been deleted?

I removed my HDFS path /user/abc with an rm -R command; some Hive tables were stored in /user/abc/data/abc.db.
My regular tables were correctly deleted with Hive SQL, but my external tables wouldn't drop, failing with the following error:
[Code: 1, SQL State: 08S01] Error while processing statement: FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. MetaException(message:Failed to load storage handler: Error in loading storage handler.org.apache.phoenix.hive.PhoenixStorageHandler)
How can I safely delete the tables?
I tried using:
delete from TBL_COL_PRIVS where TBL_ID=[myexternaltableID];
delete from TBL_PRIVS where TBL_ID=[myexternaltableID];
delete from TBLS where TBL_ID=[myexternaltableID];
But it didn't work, and produced the following error message:
[Code: 10297, SQL State: 42000] Error while compiling statement: FAILED: SemanticException [Error 10297]: Attempt to do update or delete on table sys.TBLS that is not transactional
Thank you,
NB: I know a schema is supposed to be deleted more safely with HiveQL, but in this particular case that was not done.
The solution is to delete the tables from the Hive Metastore (PostgreSQL) with:
delete from "TABLE_PARAMS" where "TBL_ID"='[myexternaltableID]';
delete from "TBL_COL_PRIVS" where "TBL_ID"='[myexternaltableID]';
delete from "TBL_PRIVS" where "TBL_ID"='[myexternaltableID]';
delete from "TBLS" where "TBL_ID"='[myexternaltableID]';
NB: Order is important.
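To find the TBL_ID to plug into these statements, you can query the metastore directly; a sketch, assuming the metastore database is named metastore and the external table is called abc_table (both names are placeholders):
psql -d metastore -c "SELECT \"TBL_ID\", \"TBL_NAME\" FROM \"TBLS\" WHERE \"TBL_NAME\" = 'abc_table';"   # quoted identifiers, since the metastore schema uses upper-case names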

How to resolve thousands of errors in an Oracle import using impdp when I don't know what parameters were used during expdp?

I am trying to import an Oracle 11g dump file using the impdp utility, but while doing so I am facing, among other things, two major errors.
First, it shows the following error:
Processing object type DATABASE_EXPORT/TABLESPACE
ORA-39083: Object type TABLESPACE:"HIS_USER" failed to create with error:
ORA-01119: error in creating database file '/oracle/app/oracle/oradata/dwhrajdr1/his_user13.dbf'
ORA-27040: file create error, unable to create file
OSD-04002: unable to open file
O/S-Error: (OS 3) The system cannot find the path specified.
To solve this, I created a tablespace with the same name, but now it says that the 'HIS_USER' tablespace already exists.
Second, I am getting thousands of errors saying that a user or role does not exist:
Failing sql is:
GRANT EXECUTE ANY ASSEMBLY TO "DSS"
ORA-39083: Object type SYSTEM_GRANT failed to create with error:
ORA-01917: user or role 'DSS' does not exist
Please suggest how to solve these errors.
How can I import the dump file without creating hundreds of users/roles or tablespaces?
You can generate the SQL statements from the dump file using impdp as described here:
http://www.dba-oracle.com/t_convert_expdp_dmp_file_sql.htm
Then adjust the parameters accordingly.
scott
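For reference, a sketch of that approach, with the credentials, directory object, and file names as placeholders: the SQLFILE parameter writes the DDL contained in the dump to a file instead of executing it, and on the real import EXCLUDE can skip object types you don't want recreated (whether these particular exclusions fit depends on the dump):
# write the dump's DDL to a file for review instead of running it
impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=export.dmp SQLFILE=ddl_preview.sql
# then import while skipping tablespace creation and grants, for example
impdp system/password DIRECTORY=DATA_PUMP_DIR DUMPFILE=export.dmp EXCLUDE=TABLESPACE EXCLUDE=GRANT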

Hive Index Creation failed

I am using Hive version 3.1.0 in my project. I have created an external table using the command below.
CREATE EXTERNAL TABLE IF NOT EXISTS testing(ID int,DEPT int,NAME string)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE;
I am trying to create an index on the same external table using the command below.
CREATE INDEX index_test ON TABLE testing(ID)
AS 'org.apache.hadoop.hive.ql.index.compact.CompactIndexHandler'
WITH DEFERRED REBUILD ;
But I am getting the error below.
Error: Error while compiling statement: FAILED: ParseException line 1:7 cannot recognize input near 'create' 'index' 'user_id_user' in ddl statement (state=42000,code=40000)
According to the Hive documentation, indexing has been removed since version 3.0:
https://cwiki.apache.org/confluence/display/Hive/LanguageManual+Indexing#LanguageManualIndexing-IndexingIsRemovedsince3.0

heroku pg:transfer - receiving error during upload of postgresql db

I am trying to upload my local database to production and I keep running into the following error while processing:
$ heroku pg:transfer --from postgres://username:password#localhost/blog_development --to olive --confirm appname
Source database: blog_development on localhost:5432
Target database: HEROKU_POSTGRESQL_OLIVE_URL on afternoon-taiga-2766.herokuapp.com
pg_dump: reading schemas
pg_dump: reading user-defined tables
pg_dump: reading extensions
pg_dump: reading user-defined functions
pg_dump: reading user-defined types
pg_dump: reading procedural languages
pg_dump: reading user-defined aggregate functions
pg_dump: reading user-defined operators
pg_dump: reading user-defined operator classes
pg_dump: reading user-defined operator families
pg_dump: reading user-defined text search parsers
pg_dump: reading user-defined text search templates
pg_dump: reading user-defined text search dictionaries
pg_dump: reading user-defined text search configurations
pg_dump: reading user-defined foreign-data wrappers
pg_dump: reading user-defined foreign servers
pg_dump: reading default privileges
pg_dump: reading user-defined collations
pg_dump: reading user-defined conversions
pg_dump: reading type casts
pg_dump: reading table inheritance information
pg_dump: reading rewrite rules
pg_dump: finding extension members
pg_dump: finding inheritance relationships
pg_dump: reading column info for interesting tables
pg_dump: finding the columns and types of table "schema_migrations"
pg_dump: finding the columns and types of table "articles"
pg_dump: finding default expressions of table "articles"
pg_dump: flagging inherited columns in subtables
pg_dump: reading indexes
pg_dump: reading indexes for table "schema_migrations"
pg_dump: reading indexes for table "articles"
pg_dump: reading constraints
pg_dump: reading triggers
pg_dump: reading large objects
pg_dump: reading dependency data
pg_dump: saving encoding = WIN1252
pg_dump: saving standard_conforming_strings = on
pg_dump: saving database definition
pg_dump: [custom archiver] WARNING: ftell mismatch with expected position -- ftell used
pg_dump: [custom archiver] WARNING: ftell mismatch with expected position -- ftell used
pg_dump: dumping contents of table articles
pg_dump: [custom archiver] WARNING: ftell mismatch with expected position -- ftell used
pg_dump: dumping contents of table schema_migrations
pg_restore: [archiver] did not find magic string in file header
This is a very simple app that I've just created in order to practice using a PostgreSQL db (it only has two tables, articles and schema_migrations). Has anyone seen this error before? I'm trying to use pg:transfer to upload to the database. Thanks for the help.
EDIT
Database.yml
development:
  adapter: postgresql
  encoding: unicode
  database: blog_development
  pool: 5
  username: benjaminw
  password:
test:
  adapter: postgresql
  encoding: unicode
  database: blog_test
  pool: 5
  username: benjaminw
  password:
production:
  adapter: postgresql
  encoding: unicode
  database: blog_production
  pool: 5
  username: blog
  password:
And here is the Gemfile.
source 'https://rubygems.org'
gem 'rails', '3.2.13'
# Bundle edge Rails instead:
# gem 'rails', :git => 'git://github.com/rails/rails.git'
gem 'pg'
#gem 'activerecord-postgresql-adapter'
#gem 'sequel'
# Gems used only for assets and not required
# in production environments by default.
group :assets do
  gem 'sass-rails', '~> 3.2.3'
  gem 'coffee-rails', '~> 3.2.1'
  # See https://github.com/sstephenson/execjs#readme for more supported runtimes
  # gem 'therubyracer', :platforms => :ruby
  gem 'uglifier', '>= 1.0.3'
end
gem 'jquery-rails'
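As an aside, newer versions of the Heroku CLI include a built-in pg:push command that performs the same local-to-remote transfer as pg:transfer; a sketch, reusing the database, config var, and app names from the question (note that pg:push expects the target database to be empty):
heroku pg:push blog_development HEROKU_POSTGRESQL_OLIVE_URL --app appname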
