Locking entire database while running a delayed job - ruby-on-rails-3.1

My delayed job has something to do with exporting a slightly edited version of most of the tables in the app's database, and while it runs it is critical that none of the current data is being edited.
Is it possible to lock the entire database while running this delayed job?
More Information:
The database to be exported is PostgreSQL; Heroku's PostgreSQL offering, to be more specific.
The flow is something like the following (all of the below should be done automatically by the code; a rough command-line sketch follows the list):
the site would be put into maintenance mode,
the database would be frozen and then exported, and
when the export is complete, the site would be re-activated.
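A minimal sketch of that flow from the command line, assuming the standard Heroku toolbelt and a placeholder app name of myapp:

    # Put the site into maintenance mode so no web requests reach the app.
    heroku maintenance:on --app myapp

    # Export the database while nothing is writing to it.
    # (Assumes a pg_dump client new enough to accept the connection URL.)
    pg_dump "$(heroku config:get DATABASE_URL --app myapp)" > export.sql

    # Re-activate the site once the export has finished.
    heroku maintenance:off --app myapp

Note that Heroku's maintenance mode only stops web traffic; the worker dyno running the delayed job itself (and any other background workers) keeps running, so other write paths would need to be paused separately.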

Given that there is not a lot of information in your question, I am going to answer you as best I can.
1) What is the database type and model? Is it a standalone DB like MS Access or Informix SE?
2) If not a standalone engine, does this database support replication? I used to work a lot with MS SQL Server, and replication had implications while the database was live and being edited; that is, the question was whether edited data would be replicated. In this case, consult the docs. Is it an option to use replication to preserve the current database?
3) What kind of task is this? It sounds like maintenance. Our Informix SE databases lock when being imported or exported. On the production server, it is my job to make sure no local server applications are trying to access the locked DB, and that our external payments web site cannot interfere while the db is locked.
4) If this is a production site that is not in maintenance mode, then I suggest you probably do not want to lock an entire database.
I am sorry for not answering your question directly, but more information is needed, such as whether you are asking if this can be done from the Ruby DB interface on some particular database engine.

Related

Which is the fastest way to create a test database (with all data) from a production database that is quite big (400 GB)?

I am a Java person and not very familiar with the features Oracle offers. Please help me.
The requirement is that we want a virtual (replica/mirror/view) database created from the production database, just for testing purposes. Once we are done executing all the automation test cases, we would delete that virtual database. Are there any such concepts in Oracle?
We are on Oracle 12c.
Many apps use the same DB (it's huge).
PS: We also use Docker for deployment, as well as AWS.
Use RMAN DUPLICATE to create the test database from production.
https://oracle-base.com/articles/11g/duplicate-database-using-rman-11gr2
You can duplicate from backups or duplicate from the active database.
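For illustration, a minimal active-database duplication could look something like this (the connect strings, the auxiliary instance name testdb, and the pre-created password file and TNS entries are all assumptions):

    # Connect to the source (TARGET) and to the pre-created auxiliary instance.
    rman TARGET sys@prod AUXILIARY sys@testdb

    # At the RMAN prompt: copy the running database over the network.
    DUPLICATE TARGET DATABASE TO testdb
      FROM ACTIVE DATABASE
      SPFILE
      NOFILENAMECHECK;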
You can probably ask your database admin to export the tablespace to a new test machine which has the same Oracle version installed. If there are only very few tables, you can instead spool your tables out and use SQL*Loader to load them into a test database (you will need to create the structure of the tables in the test environment by hand beforehand); a rough sketch follows below.
In both cases, you might want to scrub out the sensitive information as per your requirements and standards.
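A minimal sketch of the spool-and-reload route for one hypothetical table; real extracts need more care with delimiters, NULLs and date formats. First, spool the rows out of production from SQL*Plus:

    SET HEADING OFF FEEDBACK OFF PAGESIZE 0 TRIMSPOOL ON LINESIZE 1000
    SPOOL employees.csv
    SELECT employee_id || ',' || last_name || ',' || salary FROM employees;
    SPOOL OFF

Then load the file into the test database with a SQL*Loader control file (employees.ctl), run as sqlldr test_user@testdb control=employees.ctl; the target table must already exist:

    LOAD DATA
    INFILE 'employees.csv'
    INTO TABLE employees
    FIELDS TERMINATED BY ','
    (employee_id, last_name, salary)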

Oracle Application Express - Backups

I have been tasked with creating some backups for some Oracle Apex apps (Application Express v4.1.1.00.23). The request is to back up both the applications & referenced db objects (not sure if this means just structure or structure & data).
On the one hand, I would have expected standard db backups to handle most or all of this but I'm very new to Apex so it's all a learning curve.
I'm currently exporting the application from Apex and then exporting (using SQL Developer) all the database object dependencies that Apex gives me - although I see that the list doesn't include the functions that are used for authentication.
This seems a really clunky process that's very prone to mistakes (miss an object, save something to the wrong place, no guarantees of consistency etc).
Does Apex (my version!) offer something to do the job or is there something else I could be doing? I've had a good google but nothing has stood out.
UPDATE: I realise now that I should have included some extra info. I'm currently at a large organisation & I believe our db backups (which I guess/hope are done using rman) are done by a different department. I think the motivation for the request is so that we have some local, easily accessible backup so that if one of the developers messes something up we don't have to go through multiple layers of organisation (& undoubtedly a lot of time) to sort ourselves out. I suspect that some kind of source control would be a great starting point but I'm not sure how far I'll get with that idea - especially as we seem to have little in the way of autonomy over things like servers.
RMAN is the way to go to back up an Oracle database:
https://docs.oracle.com/database/121/BRADV/toc.htm
There is a ton of material on the hows and whys online; just google "oracle rman" and you'll find what you need (the documentation should cover you as well, of course).
cheers
Standard DB backups will include everything you need.
The Apex applications I develop are static, meaning the end users make no changes to the Apex application, and there is no need to make a specific backup other than to store the original apex application .sql installation files in a safe place.
If you must, you can make an export of the database schemas the application uses, for example with the expdp utility.
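A minimal sketch of such a schema-level export (the schema name APEX_APP_OWNER, the connect string and the directory object are placeholders):

    expdp system@orcl schemas=APEX_APP_OWNER directory=DATA_PUMP_DIR \
          dumpfile=apex_app_owner.dmp logfile=apex_app_owner_exp.log

The corresponding impdp call restores the dump into another database, with REMAP_SCHEMA if the target schema name differs.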
In Apex you need to take two backups: one of the workspace and a second of your application.
Also, export/import from the database tends to lose the & character in procedures, so it is better to use RMAN and take a complete backup.
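If you also want to script the application export rather than click through the builder, Apex ships a Java command-line utility under apex/utilities; a rough sketch follows (the connect string, credentials, application id and JDBC driver path are placeholders, and the exact flags vary between Apex versions):

    # Run from the apex/utilities directory of the Apex distribution.
    cd apex/utilities
    java -cp .:/path/to/ojdbc6.jar oracle.apex.APEXExport \
         -db dbhost:1521:orcl -user system -password secret -applicationid 101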
I have found this Oracle white paper Life Cycle Management with Oracle Application Express (Revision 2) which does what it says on the tin - including various strategies for exporting, backing up & managing 'lost application development'. It's a really good read and I'll be using it as a template for suggestions of how we can manage our process in future.

Oracle database, moving changes between databases

We have an application where all the logic is implemented in the Oracle database using PL/SQL.
We have different Oracle databases for development and production.
When a developer makes changes in the development database, after testing we move the changes from the development database to the production database using Toad's schema compare tool. The problem here is that the developer must have the password for the production database, and we want only the admin to know this password.
Can somebody advise me of a better way of moving changes between databases without needing the production database password? What is the best practice for this?
I posted this question on the Oracle OTN forums and got some advice there. Maybe it will be interesting for somebody.
Here is a link
I do not recommend using comparison tools to generate database migration scripts.
Development and production databases (and also test databases) must be identical except for the current changes made by developers in the development databases. Generally speaking this assertion is not correct, because there are many kinds of differences between development and production databases, e.g. partitioned objects, additional objects for audit (triggers, tables), replication-based objects (snapshots), different tablespaces etc.
Every developer must know what changes were made by him and applied to the development database.
If a developer was able to change the schema and data in the development database, then he/she must be able to create programs for those DDL and DML changes.
Delegating to that same developer the ability to run these migration programs on the production database is a bad idea. But if you don't have a better way of handling database migration, then you can use one of the following:
1. Configure Oracle OS authentication. OS authentication allows Oracle to pass control of user authentication to the operating system (a sketch follows below this list).
2. Toad can save passwords without disclosing them. The DBA can enter the required password into the local Toad installation on the developer's PC (if the developers use PCs).
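A minimal sketch of option 1, assuming the default os_authent_prefix of ops$ and a hypothetical OS account jsmith:

    -- Run as a DBA. os_authent_prefix already defaults to 'ops$' on most platforms;
    -- it is a static parameter, so changing it needs SCOPE=SPFILE and a restart.
    ALTER SYSTEM SET os_authent_prefix = 'ops$' SCOPE = SPFILE;

    -- Database account tied to the OS account "jsmith"; no password is stored or shared.
    CREATE USER ops$jsmith IDENTIFIED EXTERNALLY;
    GRANT CREATE SESSION TO ops$jsmith;

    -- jsmith can then connect from the OS without supplying a password:
    --   sqlplus /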

Storing DDL scripts in a CLOB

Using Toad for Oracle 10.6 with the DB Admin Add-in. For our migration process from Dev to QA to Prod, we are starting to use the Schema Compare Module to generate the sync script DDL. After execution, we want to store the sync scripts for historical purposes. Due to policy, there are very few places we are allowed to copy these; even the Windows server where Toad runs is write-restricted.
I am thinking I could create a table with a CLOB column and store the scripts there, unless you folks tell me that this is a Really Bad Idea. I am looking for any tips on things like handling embedded special characters, or any other pitfalls that I may encounter.
Thanks,
JimR
Doing the best with what you have is a good thing. If management won't let you have version control, then jamming it in the database is better than nothing at all. The important thing is to create a process, document it, and follow it for deployments.
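A minimal sketch of such a history table and an insert, with placeholder names. Binding the script text as a CLOB (rather than splicing it into a literal by hand) sidesteps most quoting problems, and SET DEFINE OFF in SQL*Plus/Toad stops & from being treated as a substitution variable:

    CREATE TABLE ddl_sync_history (
      script_id    NUMBER        NOT NULL PRIMARY KEY,
      applied_on   DATE          DEFAULT SYSDATE NOT NULL,
      applied_by   VARCHAR2(30)  DEFAULT USER,
      description  VARCHAR2(200),
      script_text  CLOB
    );

    CREATE SEQUENCE ddl_sync_history_seq;

    -- Example insert; the q'[...]' literal copes with embedded single quotes.
    DECLARE
      l_script CLOB := q'[ALTER TABLE some_table ADD (new_col VARCHAR2(50));]';
    BEGIN
      INSERT INTO ddl_sync_history (script_id, description, script_text)
      VALUES (ddl_sync_history_seq.NEXTVAL, 'Dev to QA sync', l_script);
      COMMIT;
    END;
    /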
As for using schema differences to deploy, take a look at:
http://thedailywtf.com/Articles/Database-Changes-Done-Right.aspx

Selective tables/objects Oracle Backup

I need to automate a selective table / user object backup that I currently do via PL/SQL Developer.
The way I currently do it is via Tools/Export Tables and Tools/Export User Objects: manually select the tables/objects, set the options, choose the destination and export. I do this from a Windows laptop, and the database is located on a SUSE Linux server; both are on the same LAN. The DB runs 24/7 and cannot be shut down. Also, my Oracle programming skills are currently very basic, as I only do maintenance on this solution. I would like to keep doing the backup process on the Windows laptop, but I would also consider a server-side script solution and then retrieving the .sql files from the server.
Thanks in advance
I wouldn't really call it a backup, but look at exp/imp and expdp/impdp (Data Pump) in the Utilities manual.
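A table-level Data Pump export that could be scripted and scheduled on the server might look like this (the connect string, table names and directory object are placeholders); note that the dump file is written on the database server, in the directory the DATA_PUMP_DIR object points to, so it would then need to be copied off the server:

    expdp app_owner@orcl tables=CUSTOMERS,ORDERS directory=DATA_PUMP_DIR \
          dumpfile=selected_tables.dmp logfile=selected_tables_exp.log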
As Gary implies, exp/imp really isn't a backup solution. If this database is important to you or others, figure out how to use RMAN, which is usually configured to run in a mode that doesn't require the database to be shut down. Although it executes on the database host and, for non-tape destinations, must write its files to a filesystem attached to the host, it can be launched remotely.
RMAN is aimed at restoring/recovering the entire database, so if what you're looking for is only the ability to recover isolated objects it may not be for you.

Resources