Programmatic solution for reading through share folders - Oracle

I have a quick project I am working on for one of our VPs.
We have a few thousand CAD jobs stored on a network file share. The file structure is such that there is a parent folder for each CAD job. Part of the folder name contains the job number. Inside the folder there are one to many .ini text files that contain the connection information I need.
What I need is a programmatic way to search through all the folders, extract the job number from the folder name, and pull all the connection values from the ini files.
For example, for a folder named CM8252390-3, the job number is 8252390-3. Inside this folder are 3 ini files. The ini files contain entries that look like this:
[Connection]
Name=IMP_Acme_3.5
[Origin]
X=-15.044784
Y=19.620095
Z=44.621395
So my program needs to give me the following result:
Job Connection
8252390-3 IMP_Acme1_3.5
8252390-3 IMP_Acme2_3.5
8252390-3 IMP_Acme3_3.5
8254260-1 IMP_Acme3_2.4
8254260-1 IMP_Acme3_4.1
...continued for all folders in the network share
Any suggestions on the best way to do this? I am primarily an Oracle PL/SQL developer, but I have some basic Windows batch and Unix shell experience. If I can get the data loaded into Oracle tables, I can search it using PL/SQL tools, but is there a better way using shell, batch, or other tools?
Thank you.

I think this is a job for PowerShell or VBScript. It would be easy to use these tools to write the information you need to one file.
The file should be written to an Oracle directory. Then:
grant read permission to a database user on this directory;
use UTL_FILE to read the file, or treat the file as an external table and expose it as a view;
schedule a regular OS job to refresh or rebuild the list.
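The answer above suggests PowerShell or VBScript; purely as an illustration of the same idea, here is a minimal sketch in Python. The share path, output path, and folder-name pattern are assumptions, not details from the question:
# Sketch: walk the CAD share, pull the job number out of each folder name,
# read every .ini file inside, and write job/connection pairs to a CSV file
# that Oracle can pick up. Paths and the folder-name pattern are placeholders.
import configparser
import csv
import re
from pathlib import Path
SHARE = Path(r"\\server\cadshare")                      # assumed location of the CAD job folders
OUTPUT = Path(r"\\server\oradir\job_connections.csv")   # assumed Oracle directory target
JOB_PATTERN = re.compile(r"^[A-Za-z]+(\d+-\d+)$")       # e.g. CM8252390-3 -> 8252390-3
with OUTPUT.open("w", newline="") as out:
    writer = csv.writer(out)
    for folder in SHARE.iterdir():
        match = JOB_PATTERN.match(folder.name)
        if not (folder.is_dir() and match):
            continue
        job = match.group(1)
        for ini_file in folder.glob("*.ini"):
            parser = configparser.ConfigParser()
            parser.read(ini_file)
            name = parser.get("Connection", "Name", fallback=None)
            if name:
                writer.writerow([job, name])
The resulting file can then be exposed to the database as an external table or read with UTL_FILE, as described above.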

Related

How to temporarily store uploaded files using Flask

I'm creating a web application using Flask that takes 3 inputs from the user: name, picture, and grades.
I want to store this information temporarily, depending on the user's session.
As a beginner, I have read that sessions are not for storing files. What other secure way would you recommend?
I would recommend writing the files to disk.
If this is really temporary, e.g. you have a two-step sign-up form, you could write the files to temporary files or into a temporary directory.
Please see the excellent documentation at https://docs.python.org/3/library/tempfile.html
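For instance, a minimal sketch of the temporary approach, assuming a form field named picture and a standard Flask session:
# Sketch: stash an uploaded file in a temporary directory and keep only its
# path in the session. The field name "picture" is an assumption.
import tempfile
from pathlib import Path
from flask import Flask, request, session
from werkzeug.utils import secure_filename
app = Flask(__name__)
app.secret_key = "change-me"  # required for sessions
@app.route("/upload", methods=["POST"])
def upload():
    uploaded = request.files["picture"]           # werkzeug FileStorage object
    tmp_dir = tempfile.mkdtemp(prefix="upload_")  # stays until you remove it
    tmp_path = Path(tmp_dir) / secure_filename(uploaded.filename)
    uploaded.save(tmp_path)
    session["picture_path"] = str(tmp_path)       # store the path, not the file
    return "stored temporarily"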
Maybe this should not be that temporary? It sounds like a user picture is something more permanent.
In that case I would recommend creating a directory for each user and storing the files there.
This is done with standard Python I/O, e.g. with the open function.
More info about reading and writing files can be found in the official Python documentation:
https://docs.python.org/3/tutorial/inputoutput.html#reading-and-writing-files
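A sketch of that more permanent layout, where the uploads/ base directory and the user_id parameter are assumptions:
# Sketch: keep each user's files in their own directory using plain Python I/O.
from pathlib import Path
def save_user_file(user_id: str, filename: str, data: bytes) -> Path:
    user_dir = Path("uploads") / user_id          # e.g. uploads/42/
    user_dir.mkdir(parents=True, exist_ok=True)
    target = user_dir / filename
    with open(target, "wb") as fh:                # the standard open() function
        fh.write(data)
    return target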

Transferring multiple files though Informatica FTP connection

I have a requirement to generate the target file in Informatica with the date/time appended to its name. How will the Informatica FTP connection identify such a dynamic file name with the date appended to it?
I would also like to know whether it is possible to FTP multiple files at a time via an Informatica FTP connection. Could someone please help me with this?
It's actually pretty simple: you just have to use the part of the file name that is constant and then place a *.
For example:
Myfile_20190607.txt
Myfile_20190507.txt
If I specify Myfile_2019*, this is good enough to pick up the files specified above. You may have to play with the * and the criteria to fit the files that you need.
Note: if you are sending files to a third party, try to use SFTP instead of plain old FTP; most organizations block FTP to outside IPs.
As far as I know, up to Informatica 9.x it is neither possible to generate a dynamic filename nor to create multiple files using an FTP connection. The only option was to create the files on the Informatica server and then run a script to FTP them over to the destination server (see the sketch after these steps).
Here is how:
Edit your workflow, choose the Variables tab, and create a workflow variable with datatype NSTRING; assume the variable name is $wf_timestamp.
Create an Assignment task and assign TO_CHAR(SYSDATE,'YYYYMMDD') to the variable in the Assignment task.
Edit the session: choose the Mapping tab, choose your target, then Connections, then edit the FTP Value; in the Remote Filename attribute, enter your filename with the timestamp, e.g. myfile_$$$wf_timestamp.csv.
Put your Assignment task before your session in your workflow.
That's it.
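As for the script that FTPs the generated files over, here is a minimal sketch in Python; the host, credentials, directories, and file pattern are placeholders, not values from the question:
# Sketch: push every file matching a dated pattern from the Informatica server
# to a remote FTP server. Host, credentials, and paths are placeholders.
import glob
import os
from ftplib import FTP  # for third parties, prefer SFTP (e.g. via paramiko)
with FTP("ftp.example.com") as ftp:
    ftp.login(user="infa_user", passwd="secret")
    ftp.cwd("/inbound")
    for path in glob.glob("/informatica/tgtfiles/Myfile_2019*.txt"):
        with open(path, "rb") as fh:
            ftp.storbinary("STOR " + os.path.basename(path), fh)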

MonetDB - issue with dumping results of select statement into file

I'm using "COPY SELECT ... INTO file" statement from within application code. After file is ready, next step is to move the file to different location. The only problem is that file created by MonetDB has only root permissions so my application code can't touch it. Is there a way to configure MonetDB so dumps are saved as specified user? or my only solution is to iterate results in batches in application and save to file that way. Dumps can range from several MB to 1GB.
You could run MonetDB as the same user that your application server is configured for. Also, both your application server and MonetDB probably should not run as 'root'.
There is no direct support for exporting files with different permissions. You could try configuring the umask for the user that starts the DB.
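If neither option works out, the batching fallback mentioned in the question might look roughly like this with the pymonetdb client; connection settings, the query, and the output path are assumptions:
# Sketch: stream a query result out of MonetDB in batches and write it to a
# file owned by the application user, instead of a server-side COPY INTO.
import csv
import pymonetdb
conn = pymonetdb.connect(username="monetdb", password="monetdb",
                         hostname="localhost", database="mydb")
cursor = conn.cursor()
cursor.arraysize = 10000                  # batch size used by fetchmany()
cursor.execute("SELECT * FROM my_table")
with open("/tmp/dump.csv", "w", newline="") as out:
    writer = csv.writer(out)
    while True:
        rows = cursor.fetchmany()         # returns at most arraysize rows
        if not rows:
            break
        writer.writerows(rows)
conn.close()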

check directory of oracle logs

I'm using the check_logfiles Nagios plugin to monitor Oracle alert logs. It works wonderfully for that purpose.
However, I also need to monitor an entire directory of Oracle trace logs for errors, because the Oracle database is always creating new log files with different names.
What I need to know is the best way to scan an entire directory of Oracle trace logs to find out which ones match patterns that indicate Oracle alerts.
Using check_logfiles I tried specifying these options -
--criticalpattern='ORA-00600|ORA-00060|ORA-07445|ORA-04031|Shutting down instance'
and to specify the directory of logs -
--logfile='/global/cms/u01/app/orahb/admin/opbhb/udump/'
and
--logfile="/global/cms/u01/app/orahb/admin/opbhb/udump/*"
Neither of these has any effect: the check runs but returns OK. Does anyone know whether this Nagios plugin, check_logfiles, can monitor a directory of files rather than just a single file? Or perhaps there is another, better way to achieve the same goal of monitoring a bunch of files that can't be specified ahead of time?
Use a script which:
Opens each file
Copies entries which match the pattern
Outputs the matches to a file
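A sketch of such a script in Python, using the patterns and directory from the question; the report path and the .trc extension are assumptions, and a real check would also need to remember how far it has already read each file:
# Sketch: scan every Oracle trace file in a directory for alert patterns and
# write the matching lines to a report, exiting with a Nagios-style status.
import glob
import re
import sys
PATTERN = re.compile(r"ORA-00600|ORA-00060|ORA-07445|ORA-04031|Shutting down instance")
LOG_DIR = "/global/cms/u01/app/orahb/admin/opbhb/udump"
matches = 0
with open("/tmp/oracle_alert_matches.txt", "w") as report:
    for path in glob.glob(LOG_DIR + "/*.trc"):    # assumed .trc extension
        with open(path, errors="replace") as log:
            for line in log:
                if PATTERN.search(line):
                    report.write(path + ": " + line)
                    matches += 1
if matches:
    print("CRITICAL - %d alert lines found" % matches)
    sys.exit(2)
print("OK - no alerts found")
sys.exit(0)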

What's the best way to (programmatically) determine a file's network origin?

For an application I'm writing, I want to programmatically find out what computer on the network a file came from. How can I best accomplish this?
Do I need to monitor network transactions or is this data stored somewhere in Windows?
When a file is copied to the local system, Windows does not keep any record of where it was copied from. So unless the application that created it saved such information in the file, it will be lost.
With file auditing, file and directory operations can be tracked, but I don't think that will include the source path for file copies (just who created the file and when).
Yes, it seems like you would either need to detect the file transfer based on interception of network traffic, or if you have the ability to alter the file in some way, use public key cryptography to sign files using a machine-specific key before they are transferred.
Create a service on either the destination computer or on the file-hosting computers that will add records to an Alternate Data Stream attached to each file, much the way Windows handles ZoneInfo for files downloaded from the internet.
You can have a background process on machine A which "tags" each file as having been tagged by machine A on such-and-such a date and time. Then when machine B downloads the file, assuming we are using NTFS filesystems, it can see the tag from A. Or, if you can't have a process at the server, you can use NTFS streams on the "client" side via packet sniffing methods as others have described. The bonus here is that future file-copies will retain the data as long as it is between NTFS systems.
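As a minimal illustration of the alternate-data-stream tagging idea in Python on Windows (the stream name origin, the tag format, and the example path are assumptions; this works only on NTFS volumes):
# Sketch: write and read an origin tag in an NTFS alternate data stream.
# The stream name "origin" and the example path are arbitrary choices.
import datetime
import socket
def tag_origin(path):
    tag = socket.gethostname() + " " + datetime.datetime.now().isoformat()
    with open(path + ":origin", "w") as stream:   # "file.ext:origin" names the ADS
        stream.write(tag)
def read_origin(path):
    with open(path + ":origin") as stream:
        return stream.read()
tag_origin(r"C:\share\drawing.dwg")               # assumed example file
print(read_origin(r"C:\share\drawing.dwg"))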
Alternative: require that all file transfers be done through a web portal (as opposed to network drag-and-drop), which gives you built-in logging, or through some other type of file-retrieval proxy. Do you have control over procedures such as this?
