Search the File Pattern from File Name - shell

I have a file PATTERN_FILE.txt which stores lines like the ones below -
ABC|ABC_[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9].dat|8|,|70|NAME
ABC|ABC_[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9].dat|9|,|70|PLACE
XYZ|XYZ_[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9].dat|23|,|70|SSN
XYZ|XYZ_[0-9][0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9].dat|33|,|70|DOB
MNO|MNO_SUMMIT.dat|40|,|70|ADDRESS
MNO|MNO_SUMMIT.dat|5|,|70|COUNTRY
So this PATTERN_FILE.txt stores some information about the actual file, but the file name is stored as a pattern (when the name contains a date) rather than as the actual name.
My requirement is a command to which I can pass the actual file name, like "ABC_20200408.dat", and which returns all the related lines from this file. Can someone please help.
The command below works, but with it I have to pass each pattern one by one to check which one matches.
echo "ABC_20200408.dat"|grep ABC_[0-9][0-9][0-9][0-9][0-9][0-9][0-9][0-9].dat

Related

Shell script to read a metadata file line by line and use that text as a search pattern to find files placed in another directory

I have a directory (say myDir) with many files dynamically populated inside it.
I have another metadata file with some text inside it, e.g.:
cust_order
cust_mgmt
...
...
I need to write a script to read that metadata file line by line and use each line's text as a search pattern to find files with the same name placed in the directory (myDir). Kindly help me write a shell script for this scenario.
Thanks,
vivek
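A minimal sketch of such a loop (untested), assuming the metadata file is called metadata.txt and that a prefix match on the file name is what is wanted:
while IFS= read -r name; do
    # list regular files directly inside myDir whose name starts with the current entry
    find myDir -maxdepth 1 -type f -name "${name}*"
done < metadata.txt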

Shell script - replace a string in all the files in a directory based on file name

I want to replace a constant string in multiple files based on the name of the file.
Example:
In a directory I have many files named like 'X-A01', 'X-B01', 'X-C01'.
In each file there is a string 'SS-S01'.
I want to replace string 'SS-S01' in the first file with 'X-A01', second file with 'X-B01' and third file with 'X-C01'.
Please help me figure out how to do this, as I have hundreds of files like this and do not want to edit them all manually one by one.
Remember to back up your files(!) before running this command, since I have not actually tried it myself:
You could do something like:
for file in <DIR>/*; do sed -i "s/SS-S01/${file##*/}/" "$file"; done
This will loop over each file in <DIR> and, for each iteration, assign the file name to $file. For each file, sed will replace the first occurrence of SS-S01 on each line of that file with the file name.
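If you also want the backups mentioned above, or need every occurrence replaced rather than just the first one on each line, an untested variant that keeps a .bak copy of each file would be:
for file in <DIR>/*; do sed -i.bak "s/SS-S01/${file##*/}/g" "$file"; done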

Laravel file upload name is incorrect

If I do the following with the file example.jpg, the stored file gets a random name. How can I give it the name example.jpg?
Storage::disk('local')->put('test', $request->file('file'));
This works, but how can I specify a disk?
$file = request()->file('files');
$file->storeAs('test',$request->file('files')->getClientOriginalName());
From the docs,
If you would not like a file name to be automatically assigned to your stored file, you may use the storeAs method, which receives the path, the file name, and the (optional) disk as its arguments:
$file->storeAs('test', $request->file('files')->getClientOriginalName(), 'local');

How to get the full name of a folder if I only know the beginning

I am receiving an input from the user which looks as follows:
echo +++Your input:+++
read USER_INPUT
The way I should use it is to retrieve the full name of a folder which starts with that input but contains other text right after it. All I know is that the folder is unique.
For example:
User input
123456
Target folder
/somepath/someotherpath/123456-111-222
What I need
MYNEED=123456-111-222
I was thinking of retrieving this with MYNEED=$(ls /somepath/someotherpath/$USER_INPUT*), but if I do this I instead get all the contents of /somepath/someotherpath/123456-111-222, because that is the only folder matching the name, so ls lists what is inside it.
How can I retrieve the value 123456-111-222 into a variable that I can use afterwards?
basename extracts the filename from the whole path so this will do it:
MYNEED=$(basename /somepath/someotherpath/123456*)
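If you want it driven by the user input rather than a hard-coded value, a bash sketch (same assumed parent path) that also guards against the glob matching more than one entry:
matches=(/somepath/someotherpath/"$USER_INPUT"*)   # expand the glob into an array
MYNEED=$(basename "${matches[0]}")                 # keep only the last path component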

SQLLDR file path argument

I have more than 30 files from which to load data.
The path in those files changes at every run, so the path becomes
INFILE "/home/dmf/Cycle7Data/ITEM_IMAGE.csv"
INFILE "/home/dmf/Cycle8Data/ITEM_IMAGE.csv"
The file names change in every control file (SUPPLIER.csv).
Is there any way to pass the file path in a variable, or set an environment variable,
so that the control file does not have to be edited every time?
You can pass the data file name on the command line; from the documentation:
DATA specifies the name of the data file containing the data to be loaded. If you do not specify a file extension or file type, then the default is .dat.
If you specify a data file on the command line and also specify data files in the control file with INFILE, then the data specified on the command line is processed first. The first data file specified in the control file is ignored. All other data files specified in the control file are processed.
So pass the relevant file name with each invocation, e.g.
sqlldr user/passwd control=myfile.ctl data=/home/dmf/Cycle7Data/ITEM_IMAGE.csv
If you have lots of files to load from a directory you could have a shell script that loops over the directory contents and passes each file name in turn to an SQL*Loader session.
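A sketch of such a wrapper (untested), assuming the control file is myfile.ctl and that one directory per cycle holds the data files:
for f in /home/dmf/Cycle7Data/*.csv; do
    # one SQL*Loader run per data file, with a matching log file
    sqlldr user/passwd control=myfile.ctl data="$f" log="${f%.csv}.log"
done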
