Using Toad for Oracle 10.6 with the DB Admin Add-in. For our Dev to QA to Prod migration process, we are starting to use the Schema Compare module to generate the Sync Script DDL. After execution, we want to store the sync scripts for historical purposes. Due to policy, we are unable to copy these anywhere else; even the Windows server where Toad runs is write-restricted.
I am thinking I could create a table with a CLOB column and store the scripts there, unless you folks tell me that this is a Really Bad Idea. I am looking for any tips on things like handling embedded special characters, or any other pitfalls that I may encounter.
Thanks,
JimR
Doing the best with what you have is a good thing. If management won't let you have version control, then jamming it in the database is better than nothing at all. The important thing is to create a process, document it, and follow it for deployments.
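If you do go that route, here is a minimal sketch of what it could look like (all names hypothetical). Inserting the script through a bind variable, rather than pasting it into the SQL text, sidesteps most of the special-character worries (embedded quotes, ampersands being treated as SQL*Plus substitution variables, length limits on string literals):

    -- Illustrative history table for the generated sync scripts
    CREATE TABLE sync_script_history (
        script_id    NUMBER         NOT NULL,
        environment  VARCHAR2(30),
        run_date     DATE           DEFAULT SYSDATE,
        run_by       VARCHAR2(128)  DEFAULT USER,
        script_text  CLOB,
        CONSTRAINT sync_script_history_pk PRIMARY KEY (script_id)
    );

    -- Inserting from PL/SQL (or any client that binds the CLOB) keeps the
    -- script text out of the SQL statement itself, so embedded quotes and
    -- ampersands need no special handling.
    DECLARE
        l_script CLOB := 'ALTER TABLE ... -- the generated sync script';
    BEGIN
        INSERT INTO sync_script_history (script_id, environment, script_text)
        VALUES (1, 'QA', l_script);
        COMMIT;
    END;
    /

If you ever load a script through SQL*Plus instead, SET DEFINE OFF is the usual way to stop & from being picked up as a substitution variable.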
As for using schema differences to deploy, take a look at:
http://thedailywtf.com/Articles/Database-Changes-Done-Right.aspx
I am a Java person and not very familiar with Oracle's available features. Please help me.
The requirement is that we need some kind of virtual (replica/mirror/view) database created from the production database, just for testing purposes. Once we are done executing all the automation test cases, the virtual database would be deleted. So, are there any such concepts in Oracle?
We are on Oracle 12c.
Many apps use the same DB (it's huge).
PS: We also use Docker for deployment, as well as AWS.
Use RMAN DUPLICATE to create the test database from production:
https://oracle-base.com/articles/11g/duplicate-database-using-rman-11gr2
You can duplicate from backups or from an active database.
You can probably ask your database admin to export the tablespace to a new test machine that has the same Oracle version installed. If there are only a few tables, you can spool them out and use SQL*Loader to load them into a test database (you will need to manually create the table structures in the test environment beforehand).
In both cases, you might want to scrub out the sensitive information as per your requirements and standards.
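If you go the spool route, here is a rough sketch of the SQL*Plus side, assuming a small illustrative table (the SQL*Loader control file on the test side is not shown):

    -- SQL*Plus: spool one table out as comma-separated data (names are illustrative)
    SET HEADING OFF
    SET FEEDBACK OFF
    SET PAGESIZE 0
    SET LINESIZE 32767
    SET TRIMSPOOL ON
    SPOOL employees.dat
    SELECT employee_id || ',' || last_name || ',' || TO_CHAR(hire_date, 'YYYY-MM-DD')
    FROM   employees;
    SPOOL OFF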
I have been tasked with creating some backups for some Oracle Apex apps (Application Express v4.1.1.00.23). The request is to back up both the applications & referenced db objects (not sure if this means just structure or structure & data).
On the one hand, I would have expected standard db backups to handle most or all of this but I'm very new to Apex so it's all a learning curve.
I'm currently exporting the application from Apex and then exporting (using SQL Developer) all the database object dependencies that Apex gives me, although I see that the list doesn't include the functions used for authentication.
This seems a really clunky process that's very prone to mistakes (miss an object, save something to the wrong place, no guarantees of consistency etc).
Does Apex (my version!) offer something to do the job or is there something else I could be doing? I've had a good google but nothing has stood out.
UPDATE: I realise now that I should have included some extra info. I'm currently at a large organisation & I believe our db backups (which I guess/hope are done using rman) are done by a different department. I think the motivation for the request is so that we have some local, easily accessible backup so that if one of the developers messes something up we don't have to go through multiple layers of organisation (& undoubtedly a lot of time) to sort ourselves out. I suspect that some kind of source control would be a great starting point but I'm not sure how far I'll get with that idea - especially as we seem to have little in the way of autonomy over things like servers.
RMAN is the way to go to back up an Oracle database:
https://docs.oracle.com/database/121/BRADV/toc.htm
There is a ton of material on the hows and whys online; just google "oracle rman" and you'll find what you need (the documentation should cover you as well, of course).
cheers
Standard DB backups will include everything you need.
The Apex applications I develop are static, meaning the end users make no changes to the Apex application, and there is no need to make a specific backup other than to store the original Apex application .sql installation files in a safe place.
If you must, you can make an export of the database schemas the application uses. For example with the expdp utility.
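expdp is a command-line client; if it's more convenient to run the export from inside the database (for example from a scheduled job), the same Data Pump engine can also be driven from PL/SQL with DBMS_DATAPUMP. A rough sketch, with the directory, file and schema names as placeholders:

    -- Rough sketch: schema-level Data Pump export driven from PL/SQL.
    -- DATA_PUMP_DIR, the file names and MY_APP_SCHEMA are placeholders.
    DECLARE
        l_handle  NUMBER;
        l_state   VARCHAR2(30);
    BEGIN
        l_handle := DBMS_DATAPUMP.open(operation => 'EXPORT', job_mode => 'SCHEMA');

        DBMS_DATAPUMP.add_file(handle    => l_handle,
                               filename  => 'apex_app_schema.dmp',
                               directory => 'DATA_PUMP_DIR',
                               filetype  => DBMS_DATAPUMP.ku$_file_type_dump_file);

        DBMS_DATAPUMP.add_file(handle    => l_handle,
                               filename  => 'apex_app_schema.log',
                               directory => 'DATA_PUMP_DIR',
                               filetype  => DBMS_DATAPUMP.ku$_file_type_log_file);

        -- Limit the job to the schema(s) the application uses
        DBMS_DATAPUMP.metadata_filter(handle => l_handle,
                                      name   => 'SCHEMA_EXPR',
                                      value  => 'IN (''MY_APP_SCHEMA'')');

        DBMS_DATAPUMP.start_job(l_handle);
        DBMS_DATAPUMP.wait_for_job(handle => l_handle, job_state => l_state);
        DBMS_OUTPUT.put_line('Export job finished with state: ' || l_state);
    END;
    /

Either way the dump file lands in a directory object on the database server, so you still need somewhere safe to copy it to.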
In Apex you need to take two backups: one of the workspace and a second of your application.
Also, when using export/import from the database, it tends to lose the & character in procedures, so it's better to use RMAN and take a complete backup.
I have found this Oracle white paper Life Cycle Management with Oracle Application Express (Revision 2) which does what it says on the tin - including various strategies for exporting, backing up & managing 'lost application development'. It's a really good read and I'll be using it as a template for suggestions of how we can manage our process in future.
We are developing a large data migration from Oracle DB (12c) to another system with SSIS. The developers are using a production copy database but the problem is that, due to the complexity of the data transformation, we have to do things in stages by preprocessing data into intermediate helper tables which are then used further downstream. The problem is that all developers are using the same database and screw each other up by running things simultaneously. Does Oracle DB offer anything in terms of developer sandboxing? We could build a mechanism to handle this (e.g. have dev ID in the helper tables, then query views that map to the dev), but I'd much rather use built-in functionality. Could I use Oracle Multitenant for this?
We ended up producing a master subset database of select schemas/tables through some fairly elaborate PL/SQL, then made several copies of this master schema so each dev has his/her own sandbox (as Alex suggested). We could have used Oracle Data Masking and Subsetting, but it's too expensive. Another option for creating the subset database would have been to use Jailer. I should note that we didn't have a need to mask any sensitive data.
Note: I would think this is a fairly common problem, so if new tools and solutions arise, please post them here as answers.
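For anyone taking the same route, here is a rough sketch of cloning the helper tables from the master subset schema into a per-developer schema (MASTER_SUBSET and DEV_SANDBOX_1 are illustrative names; the sandbox schema must already exist and the copying user needs the necessary privileges):

    -- Rough sketch: copy every table from the master subset schema into a dev sandbox
    BEGIN
        FOR t IN (SELECT table_name
                  FROM   all_tables
                  WHERE  owner = 'MASTER_SUBSET')
        LOOP
            EXECUTE IMMEDIATE
                'CREATE TABLE DEV_SANDBOX_1.' || t.table_name ||
                ' AS SELECT * FROM MASTER_SUBSET.' || t.table_name;
        END LOOP;
    END;
    /

Note that CREATE TABLE ... AS SELECT does not bring over indexes, constraints or grants; a Data Pump schema export/import with a remap is an alternative if you need those.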
Hey guys,
I need to figure out a way to back up and also migrate our Oracle database from our production schema to the dev schema and the other way around.
We have a bunch of config tables that drive how systems on our platform run, and when setting up new systems or doing maintenance, we need to update those config tables. We want to be able to work on the dev schemas and, after setting up a system/feature, migrate all those configs to the production schemas.
I thought of writing a procedure where we give it the ID of the system (from the main table) and it would go through all the tables; for each row, if it doesn't exist in the target I would insert it, and if it does exist I would just run an update on that row.
This code will get very messy and complicated, especially since the whole config schema is very complex and it might be hard to handle all the keys properly.
Another option I was looking at was triggers: when setting up a new system, there would be a log of all the statements we ran while setting up/editing it, and then we would run that log against our production schema.
I'm on a co-op term and have only been working with databases for 6 months, so I don't know that much, and any information/advice would be greatly appreciated.
(We use PL/SQL.)
What about using export / import (or datapump) to bring over the config tables?
Check out data comparison tools like this
I think Toad has one built in. I'm sure there are others out there too.
It is common to have tables in a schema that are what we call "static data", i.e. the users don't change it because it controls how the application works.
Each change to config data should not be run ad-hoc in the target environment. Instead, you design and code your DML carefully in one or more scripts, which get tested in a dev environment, checked into change control, and can be re-run in any environment when required.
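For the "insert if missing, update if present" part specifically, a single MERGE statement per config table keeps those scripts fairly readable. A rough sketch against a hypothetical config table:

    -- Rough sketch: upsert one config row keyed by system_id + param_name.
    -- Table and column names are hypothetical.
    MERGE INTO config_params tgt
    USING (SELECT 42        AS system_id,
                  'TIMEOUT' AS param_name,
                  '30'      AS param_value
           FROM dual) src
    ON (tgt.system_id = src.system_id AND tgt.param_name = src.param_name)
    WHEN MATCHED THEN
        UPDATE SET tgt.param_value = src.param_value
    WHEN NOT MATCHED THEN
        INSERT (system_id, param_name, param_value)
        VALUES (src.system_id, src.param_name, src.param_value);

The USING clause can just as easily select from a staging table populated in dev instead of hard-coded literals.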
I have an Access application with a SQL Server back-end, mixed with quite a few DB objects local to the Access app. I've tried running SQL Profiler, but I got very little except a cryptic sp_execute 2,4288,4289,4290,4291,4292,4293,4294,4295,4296,4297.
I would like a trace tool that is local to the Access DB, so I also pick up any activity that doesn't go back to the SQL server.
As far as I know there is no such facility within Access but, depending on your case, you could try these few things:
Write a wrapper around the SQL execution calls: that would mean replacing all calls to Execute, OpenRecordset etc. within your VBA with an alternative version that logs the query.
This isn't going to catch everything obviously but it could help.
Move your local tables to another database and use ODBC to relink them to your original Access application. You can then use ODBC's logging facilities.
This could be the best alternative as it's fairly easy to set up for debugging.
It's not the best solution for a production environment though as all your calls to local tables will in fact go through ODBC, but again, it's a temporary solution for debugging.
Use ShowPlan and ISAMStats to view how Jet/ACE interprets your queries and get other database activity stats.
It's easy to set up by writing a key to the registry, and you'll end up with a log describing how your queries are analysed.
It's more useful for optimisation than logging but again, it could help.
Use Flextracer, a shareware tool that's free for 30 days or so. My colleague here just found this for us as we were going through a similar situation. Problem solved.
http://www.geardownload.com/development/flextracer-download.html
[]s,
Pedro Carneiro Jr.
pedrokarneiro#hotmail.com