AIX 5.3
DB 10.2.0.4
My requirement is to create a script that exports a big table from the source server to a remote destination server, and a second script on the destination database server that imports that exported table.
The export script also needs to purge the exported data, and the whole thing should be scheduled to run weekly.
Thanks,
There's a lot to this question. I would recommend you break it down and investigate how to accomplish each piece. For example:
Copy files between computers: see scp
Execute tasks on remote server: see ssh
Schedule on weekly basis: see cron
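Putting those pieces together, a minimal sketch might look like the following. Everything here is an assumption to adapt: the tool names (Data Pump's expdp/impdp), hosts, credentials, and directories; DRY_RUN=1 just prints each command so you can review the pipeline before running it for real.

```shell
#!/bin/sh
# Sketch of the weekly pipeline: export, ship, import, purge.
# All hosts, credentials, and directories below are placeholders.
DRY_RUN=${DRY_RUN:-1}   # 1 = print commands instead of executing them
TABLE=BIG_TABLE
DEST=oracle@destserver
DUMPFILE=big_table_$(date +%Y%m%d).dmp

run() {
  if [ "$DRY_RUN" = "1" ]; then echo "$@"; else "$@"; fi
}

# 1. Export the table on the source server with Data Pump
run expdp system/password tables="$TABLE" directory=DATA_PUMP_DIR dumpfile="$DUMPFILE"
# 2. Copy the dump file to the destination server
run scp "/u01/dpdump/$DUMPFILE" "$DEST:/u01/dpdump/"
# 3. Kick off the import on the destination server
run ssh "$DEST" impdp system/password tables="$TABLE" directory=DATA_PUMP_DIR dumpfile="$DUMPFILE"
# 4. Purge the exported rows only after the import succeeds; the
#    DELETE criteria depend on your table, so they are left out here.
```

Schedule it weekly with a crontab entry such as `0 2 * * 0 /home/oracle/export_table.sh` (Sundays at 02:00).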
Your question may be better served on ServerFault, as well.
I need to move all the jobs from an old server to a new server.
Is there any way to move all the jobs from the old server to the new server with a script, in one shot?
Currently I only know how to import the jobs into the server one by one, typing the account name and password each time. For example, in this screenshot, how can I import all 3 jobs into the new server at once? In production I have more than 1000 jobs across all the servers.
Is there any way to do this, such as with PowerShell or a bat script?
This is the example I have tried, and it shows me this error message
This is possible using the PowerShell cmdlets Get-ScheduledTask, Export-ScheduledTask, and Register-ScheduledTask: get your tasks and loop through them to export, then do the reverse to import. Writing this all out for you is beyond the scope of SO, though. You might want to have a look at this site; it goes into some detail about how to do this.
We have a number of jobs configured in Control-M, and now we have a new environment where the same set of jobs needs to be configured.
Is there any way to create Control-M jobs from the backend server by writing shell or other scripts?
Are there other possibilities, so we can avoid spending time creating the same jobs again for each environment?
Control-M provides functionality to export/import jobs across environments: https://communities.bmc.com/docs/DOC-50416
There are a couple of ways:
From the Planning domain, load the job folders you want to migrate into a blank workspace.
From the top bar, click Export and save. This will save an .xml file you can edit in a text editor; then use Import Workspace to load it back. You can import the .xml into the same Control-M if both environments are on one CTM EM Server, or into a different CTM EM Server.
OR
From the Planning domain, load the job folders you want to migrate, right-click the folder, and select Duplicate, then update the new folder through the GUI (if doing this, I'd unload the source folder afterwards just to ensure nothing is overwritten there). You can use the Find & Update option. This will only work if your environments are on the same CTM EM server.
I currently have a VPS, and I fear something happening to it, either through my own doing or the hosting company's, so I need a daily backup sent to servers unrelated to the hosting company.
Essentially, I need my server to automatically export my database into an SQL file and then send it to a third-party server (Google or whatever) on a daily basis, or even a few times a day, so that if something happens to the server, the SQL files remain accessible.
How can I achieve that?
We are not supposed to write a solution for you, only help you with coding errors etc.
Here's what you can do:
Create a shell script on the remote machine where you want to save the database; this can be a Mac or a Linux box. We need cron and a shell.
Create a cron job that runs daily.
Shell Script Example [dbBackup.sh]
#!/bin/bash
today=$(date '+%Y-%m-%d')
ssh root@remoteServer.com "mysqldump -u root --password=SomeDiffPassword databaseName" > /home/user/DailyDatabaseBackups/database_$today.sql
Cron Example (runs daily at 02:00; the original * * * * * would run every minute)
0 2 * * * /home/user/dbBackup.sh
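A small optional extension, assuming the same backup directory as the script above: compress each dump and prune old ones so the directory does not grow without bound. The 30-day retention is just an example.

```shell
# Optional: compress dumps and delete backups older than 30 days.
# The directory matches the script above; the guard makes this a
# no-op if it does not exist.
BACKUP_DIR=${BACKUP_DIR:-/home/user/DailyDatabaseBackups}
if [ -d "$BACKUP_DIR" ]; then
  gzip -f "$BACKUP_DIR"/database_*.sql
  find "$BACKUP_DIR" -name 'database_*.sql.gz' -mtime +30 -delete
fi
```

These two lines can simply be appended to dbBackup.sh so the same cron job handles rotation.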
I'm using a "COPY SELECT ... INTO file" statement from within application code. After the file is ready, the next step is to move it to a different location. The only problem is that the file created by MonetDB is owned by root, so my application code can't touch it. Is there a way to configure MonetDB so dumps are saved as a specified user? Or is my only option to iterate over the results in batches in the application and write the file that way? Dumps can range from several MB to 1 GB.
You could run MonetDB as the same user your application server is configured for. Also, neither your application server nor MonetDB should really run as 'root'.
There is no direct support for exporting files with different permissions. You could try configuring the umask for the user that starts the DB.
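As a sketch of that suggestion (the dbfarm path is an assumption): start the MonetDB daemon under the application's own account with a permissive umask, so files written by COPY ... INTO come out readable by the application.

```shell
# Run monetdbd as the application user, not root, with umask 022 so
# newly created files come out rw-r--r--. Guarded so the sketch is
# harmless on machines without MonetDB installed.
umask 022
DBFARM=$HOME/dbfarm
if command -v monetdbd >/dev/null 2>&1; then
  monetdbd create "$DBFARM" 2>/dev/null || true   # no-op if it already exists
  monetdbd start "$DBFARM"
fi
```

Note that the umask only affects files the server creates from now on; existing root-owned dumps still need a one-time chown.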
So, MSSQL is nice enough to give us a nifty little bit of SQL for creating a database backup from the command line:
BACKUP DATABASE [db_name] TO DISK = N'D:\backups\back.bak' WITH NOFORMAT, NOINIT, NAME = N'db_name', SKIP, NOREWIND, NOUNLOAD, STATS = 10
GO
However, I am looking to run this command from a PHP or even shell script on a remote Mac server.
The problem I run into is that when I try to change the DISK to, say, my admin home directory, it keeps complaining:
Cannot open backup device 'D:\PATH\ON\SERVER\/Users/admin/back.bak'. Operating system error 3(The system cannot find the path specified.).
Anyone know what I am missing here? I would be very appreciative.
SQL Server's BACKUP command writes to the database server's local disk. That means setting the path to a directory on the client machine makes no sense.
If you want a database backup stored on your client machine, I can basically see 3 options:
Back up to a temporary location accessible from the database server, and copy it from there to your client.
Mount a disk shared from your client machine on the database server (as, for example, X:\) and back up to that disk.
Find another backup solution that does backups in a different way (sorry, I have no recommendations).
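The first option can be scripted from the Mac, for example with sqlcmd from Microsoft's mssql-tools. The host name, credentials, and paths below are all assumptions, and the actual commands are left commented out since they depend on your environment:

```shell
# The BACKUP path is local to the SERVER, not to the Mac running this.
BACKUP_SQL="BACKUP DATABASE [db_name] TO DISK = N'D:\backups\back.bak' WITH NOFORMAT, NOINIT, STATS = 10"

# 1. Run the backup on the server (sqlcmd from mssql-tools):
#      sqlcmd -S dbserver -U sa -P 'secret' -Q "$BACKUP_SQL"
# 2. Pull the finished .bak to the Mac, e.g. if the Windows host runs
#    an SSH server, or copy it from a network share instead:
#      scp admin@dbserver:'D:/backups/back.bak' /Users/admin/back.bak
echo "$BACKUP_SQL"
```

This keeps the BACKUP path on the server, which avoids the "Operating system error 3" above, and moves the file in a second, separate step.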
You can use RazorSQL; it's a client for Mac and Windows.
https://razorsql.com/