MonetDB - issue with dumping results of a SELECT statement into a file

I'm using "COPY SELECT ... INTO file" statement from within application code. After file is ready, next step is to move the file to different location. The only problem is that file created by MonetDB has only root permissions so my application code can't touch it. Is there a way to configure MonetDB so dumps are saved as specified user? or my only solution is to iterate results in batches in application and save to file that way. Dumps can range from several MB to 1GB.

You could run MonetDB as the same user that your application server runs as. Also, neither your application server nor MonetDB should normally run as 'root'.
There is no direct support for exporting files with different permissions. You could try configuring the umask for the user that starts the DB.
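If changing the service user or umask is not practical, the fallback mentioned in the question (iterating over the results in batches from the application) is simple to script. Below is a minimal sketch using the pymonetdb driver; the connection settings, table name, batch size, and output path are all illustrative.

import csv

import pymonetdb  # MonetDB's Python DB-API driver

# All connection details below are placeholders for your environment.
conn = pymonetdb.connect(username="monetdb", password="monetdb",
                         hostname="localhost", database="demo")
cur = conn.cursor()
cur.execute("SELECT * FROM my_table")  # hypothetical table

with open("/var/tmp/dump.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(col[0] for col in cur.description)  # header row
    while True:
        rows = cur.fetchmany(10000)  # stream in batches to bound memory use
        if not rows:
            break
        writer.writerows(rows)

cur.close()
conn.close()

Because the file is created by the application process itself, it inherits that user's ownership and can be moved freely afterwards.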

Related

How to migrate a PostgreSQL 10 database from Windows C drive to another drive

I have an almost identical problem to the one in this post:
How to migrate a Windows 10 installation of PostgreSQL 9.5.7 to a larger disk
I have a PostgreSQL database on my C drive which is running out of space. I want to move my database to my larger F drive. I'm running into the same issue as the user in the post I mentioned:
The "Path to executable" under the service that starts my server is:
C:\PostgreSQL\pg10\pgservice.exe "//RS//PostgreSQL 10 Server"
There's no specific path to the data directory explicitly written there. I'm not sure how to change where PostgreSQL looks for its data, since there's no -D argument defined.
I think if I just copy my data over to the larger drive and pass the new data directory as an argument on startup, my issue would be solved. Any ideas on how to do this given my current configuration?
I wouldn't call it a migration, rather just transferring files from one location to another.
It can be done by:
Stopping the database server
Cutting/pasting the data to the new drive location
Reconfiguring the database server to use the new location
Starting the server again, or restarting the system if needed
(A quick way to confirm the current data directory before copying anything is shown below.)
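Since the service command line doesn't reveal the data directory, it helps to ask the server itself before moving files. A minimal sketch using psycopg2; the connection details are placeholders, and the same query can of course be run from psql or pgAdmin instead.

import psycopg2  # PostgreSQL driver for Python

# Connection parameters are placeholders; adjust for your installation.
conn = psycopg2.connect(host="localhost", port=5432, dbname="postgres",
                        user="postgres", password="secret")
with conn.cursor() as cur:
    cur.execute("SHOW data_directory;")
    print(cur.fetchone()[0])  # e.g. C:/PostgreSQL/pg10/data
conn.close()

Once the files have been copied, that new location is what you point the server at, whether via a -D argument or the service configuration.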

Programmatic solution for reading through share folders

I have a quick project I am working on for one of our VPs.
We have a few thousand CAD jobs stored on a network file share. The file structure is such that there is a parent folder for each CAD job, and part of the folder name contains the job number. Inside the folder there are one to many .ini text files that contain the connection information I need.
What I need is a programmatic way to search through all the folders and extract the job number from the folder name, and all the connection values from the .ini files.
For example, for a folder named CM8252390-3, the job number is 8252390-3. Inside this folder are 3 .ini files, with entries that look like this:
[Connection]
Name=IMP_Acme_3.5
[Origin]
X=-15.044784
Y=19.620095
Z=44.621395
So my program needs to give me the following result
Job Connection
8252390-3 IMP_Acme1_3.5
8252390-3 IMP_Acme2_3.5
8252390-3 IMP_Acme3_3.5
8254260-1 IMP_Acme3_2.4
8254260-1 IMP_Acme3_4.1
...continued for all folders in the network share
Any suggestions on the best way to do this? I am primarily an Oracle PL/SQL developer, but I have some basic Windows batch and Unix shell experience. If I can get the data loaded into Oracle tables, I can search using PL/SQL tools, but is there a better way using shell, batch, or other tools?
Thank you.
I think this is a job for PowerShell or VBScript. It would be easy to use these tools to write the information you need to one file (a sketch of the extraction step follows this list).
Write the file to an Oracle directory.
Grant read permission on this directory to a database user.
Use UTL_FILE to read the file, or treat it as an external table and expose it as a view.
Schedule a regular OS job to refresh or rebuild the list.
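As an alternative to PowerShell or VBScript, here is a minimal Python sketch of the extraction step. The share path, output file name, and folder-name pattern are assumptions based on the examples in the question; the result is a flat CSV that can then be loaded into Oracle (for example via SQL*Loader or an external table) and queried with PL/SQL.

import configparser
import csv
import re
from pathlib import Path

SHARE_ROOT = Path(r"\\fileserver\cad_jobs")  # hypothetical UNC path to the share
OUTPUT = Path("connections.csv")             # file to load into Oracle later

# e.g. CM8252390-3 -> job number 8252390-3 (pattern assumed from the example)
job_pattern = re.compile(r"^[A-Z]+(\d+-\d+)$")

with OUTPUT.open("w", newline="") as out:
    writer = csv.writer(out)
    writer.writerow(["JOB", "CONNECTION"])
    for folder in SHARE_ROOT.iterdir():
        if not folder.is_dir():
            continue
        m = job_pattern.match(folder.name)
        if not m:
            continue
        job = m.group(1)
        for ini_path in folder.glob("*.ini"):
            cfg = configparser.ConfigParser()
            cfg.read(ini_path)
            name = cfg.get("Connection", "Name", fallback=None)
            if name:
                writer.writerow([job, name])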

Can I delete .dmp and .phd files of Liberty Profile server?

In the folder <WAS Liberty Profile root>\<profile>\usr\servers\defaultServer there are many files named core.*.dmp and heapdump.*.phd. These files range from 130 MB to 1.3 GB in size, even though my deployed app is only 4 MB.
Can I delete these *.dmp and *.phd files?
What are these files for?
Short answer: yes, it's safe to delete them, but you should find out why they're appearing, as it could indicate that your application is not running correctly.
If your dump files were created a long time ago, or you know you were debugging an OutOfMemoryException, or you have been running server javadump --include=heap,system, then go ahead and delete the files. If, however, you keep getting new dump files and don't know why, read on.
The core and heapdump files contain a snapshot of the memory of the application from a specific point in time. Usually you do this to capture the state of your application at the point where something goes wrong so that you can examine it with analysis tools and try to work out what went wrong.
For example, by default the IBM JVM will perform a dump when an OutOfMemoryException is thrown. This allows you to look at the dump file and see what's using up all the memory.
If you have a corresponding javacore file, the fourth line or so should say why the memory dump was made.
e.g. 1TISIGINFO Dump Requested By User (00100000) Through com.ibm.jvm.Dump.javaDumpToFile (caused by running server javadump)
or 1TISIGINFO Dump Event "user" (00004000) received (caused by running kill -3)
If it's a "user" event, then something's asking the JVM to create a dump. If not, and you're still not sure what's causing it, check your jvm.options file for any -Xdump options which can be used to cause the JVM to create a dump in response to certain events. More information on that in the Knowledge Center.
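If the dumps turn out to be stale and you just need the disk space back, a small cleanup script is enough. A minimal sketch; the server directory and age threshold are placeholders for your installation.

import time
from pathlib import Path

# Placeholder path for <WAS Liberty Profile root>\<profile>\usr\servers\defaultServer
SERVER_DIR = Path("/opt/wlp/usr/servers/defaultServer")
MAX_AGE_DAYS = 14  # only remove dumps older than this

cutoff = time.time() - MAX_AGE_DAYS * 24 * 3600
for pattern in ("core.*.dmp", "heapdump.*.phd"):
    for dump in SERVER_DIR.glob(pattern):
        if dump.stat().st_mtime < cutoff:
            print(f"removing {dump}")
            dump.unlink()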

Creating an MSSQL database backup using ODBC on Mac

So, MSSQL is nice enough to have given us a nifty little SQL command for creating a database backup from the command line:
BACKUP DATABASE [db_name] TO DISK = N'D:\backups\back.bak' WITH NOFORMAT, NOINIT, NAME = N'db_name', SKIP, NOREWIND, NOUNLOAD, STATS = 10
GO
However, I am looking to run this command from a PHP or even shell script on a remote Mac server.
The problem I am running into is that when I try to change the DISK path to, say, my admin home directory, it keeps complaining:
Cannot open backup device 'D:\PATH\ON\SERVER\/Users/admin/back.bak'. Operating system error 3 (The system cannot find the path specified.).
Does anyone know what I am missing here? I would be very appreciative.
SQL Server's BACKUP command writes the backup to the database server's local disk, so pointing the path at a directory on the client machine makes no sense.
If you want a database backup stored on your client machine, I can basically see three options:
Back up to a temporary location accessible from the database server, and copy it from there to your client (see the sketch below).
Mount a disk shared from your client machine on the database server, for example as X:\, and do the backup to that disk.
Find another backup solution that does backups in a different way (sorry, I have no recommendations).
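As a concrete illustration of the first option, the same BACKUP statement can be issued from the Mac over ODBC, as long as the DISK path stays on the database server; the resulting file is then copied back over a share or scp. A minimal sketch using pyodbc; the driver name, host, credentials, and paths are placeholders.

import pyodbc  # requires a SQL Server ODBC driver installed on the Mac

# Connection string values are placeholders for your environment.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlhost;DATABASE=master;UID=backup_user;PWD=secret",
    autocommit=True,  # BACKUP cannot run inside a transaction
)
cur = conn.cursor()
# Note: the DISK path is on the *database server's* disk, not the Mac client.
cur.execute(
    "BACKUP DATABASE [db_name] TO DISK = N'D:\\backups\\back.bak' "
    "WITH NOFORMAT, NOINIT, NAME = N'db_name', SKIP, NOREWIND, NOUNLOAD, STATS = 10"
)
# BACKUP reports progress via informational messages; drain any pending
# result sets so the statement runs to completion before the connection closes.
while cur.nextset():
    pass
conn.close()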
You can also use RazorSQL, a database client for Mac and Windows:
https://razorsql.com/

Check a directory of Oracle logs

I'm using the check_logfiles Nagios plugin to monitor Oracle alert logs. It works wonderfully for that purpose.
However, I also need to monitor an entire directory of Oracle trace logs for errors, because the Oracle database is always creating new log files with different names.
What I need to know is the best way to scan an entire directory of Oracle trace logs to find the ones that match patterns indicating Oracle alerts.
Using check_logfiles I tried specifying these options:
--criticalpattern='ORA-00600|ORA-00060|ORA-07445|ORA-04031|Shutting
down instance'
and, to specify the directory of logs:
--logfile='/global/cms/u01/app/orahb/admin/opbhb/udump/'
and
--logfile="/global/cms/u01/app/orahb/admin/opbhb/udump/*"
Neither of these has any effect. The check runs but returns OK. Does anyone know if this Nagios plugin, check_logfiles, can monitor a directory of files rather than just a single file? Or perhaps there is another, better way to achieve the same goal of monitoring a bunch of files that can't be specified ahead of time?
Use a script (sketched below) which:
Opens each file
Copies entries which match the pattern
Outputs the matches to a file
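A minimal sketch of such a script; the trace directory comes from the question, while the *.trc file extension and the output path are assumptions. The resulting single file can then be monitored with check_logfiles as a normal --logfile.

import re
from pathlib import Path

TRACE_DIR = Path("/global/cms/u01/app/orahb/admin/opbhb/udump")
OUTPUT = Path("/var/tmp/oracle_alerts.log")  # single file for check_logfiles to watch

# Same critical patterns as the --criticalpattern option above.
critical = re.compile(r"ORA-00600|ORA-00060|ORA-07445|ORA-04031|Shutting down instance")

with OUTPUT.open("a") as out:
    for trace in sorted(TRACE_DIR.glob("*.trc")):
        with trace.open(errors="replace") as f:
            for line in f:
                if critical.search(line):
                    out.write(f"{trace.name}: {line}")

To avoid re-reporting old matches, a real version would remember file offsets or only scan files modified since the last run.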
