permission denied on .gz files on linux - bash

I have some .gz files that I want to run a script on. I need these files to remain in .gz format. When I run my script:
#!#bin#bash
for f1 in $(~/Desktop/hawkfiles17/NG*.fq.gz);
do
echo "${f1}"
done
I want to check the location of the files. The script returns:
bash: /home/amyhouseman/Desktop/hawkfiles1/NG0921017_EKDN210018957-1A_HN2MGDSX2_L2_1.fq.gz: Permission denied
I have tried using:
chmod u+x /home/amyhouseman/Desktop/hawkfiles17/NG0921017_EKDN210018957-1A_HN2MGDSX2_L2_1.fq.gz
but bash returns:
bash: /home/amyhouseman/Desktop/hawkfiles17/NG0921017_EKDN210018957-1A_HN2MGDSX2_L2_1.fq.gz: cannot execute binary file: Exec format error
I'd be grateful if someone could help. I know that you can't execute .gz files, but I'm not sure what else I can do.
I did look through other posts before.

I want to check the location of the files.
Your shebang is incorrect, and there are a few other small mistakes.
Here is your solution:
$ cat my_script.sh
#!/bin/bash
for item in ~/Desktop/hawkfiles17/NG*.fq.gz ; do echo "$item" ; done
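As a hedged follow-up sketch, assuming the eventual goal is to run something on each compressed file while keeping it in .gz format: the glob can be looped over directly (no command substitution, which is what caused the "Permission denied"), and zcat reads a .gz file without unpacking it on disk. The head -n 4 step is only a placeholder for whatever per-file command is actually needed:
#!/bin/bash
for f1 in ~/Desktop/hawkfiles17/NG*.fq.gz; do
[ -e "$f1" ] || continue      # skip the literal pattern if nothing matched
echo "Processing ${f1}"
zcat "$f1" | head -n 4        # placeholder action: peek at the first FASTQ record
done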

Related

.sh file returned file path instead of file name

I am writing a .sh file to print the file names one by one. I have installed Ubuntu on Windows 10 and am using the Windows command prompt to execute the code below. It returns "E:/Official/Backups/GGG/*" instead of the file names inside. I have also changed the EOL conversion to Unix (LF) using Notepad++. Please help.
#!/bin/bash
folder="E:/Official/Backups/GGG"
for entry in "$folder"/*
do
echo "$entry"
done
Running the script outputs:
$ bash test1.sh
E:/Official/Backups/GGG/*
Output of echo $-
himBHs
Output of ls -ld E:/Official/Backups/GGG
ls: cannot access 'E:/Official/Backups/GGG': No such file or directory
My bash in WSL does not recognize Windows paths. If I want to access E:\Official\Backups\GGG I have to use /mnt/e/Official/Backups/GGG.
I assume the same goes for your WSL bash. Therefore the "path" E:/Official/Backups/GGG is just a non-existent directory, and your observed behavior is to be expected. With bash's default settings, a * just stays there as a literal if the directory does not exist or is empty. Example:
$ echo /dir/that/doesnt/exist/*
/dir/that/doesnt/exist/*
$ echo /dir/that/exists/but/is/empty/*
/dir/that/exists/but/is/empty/*
$ echo /dir/*
/dir/file1 /dir/file2 /dir/file3 ...
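For completeness, a minimal sketch of the corrected script using the WSL mount path (this assumes the E: drive is mounted at the default location /mnt/e):
#!/bin/bash
folder="/mnt/e/Official/Backups/GGG"   # WSL equivalent of E:\Official\Backups\GGG
for entry in "$folder"/*
do
echo "$entry"
done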
The GGG folder does not exist. Please check, update the script with a valid folder, and try again.
#!/bin/bash
folder="E:"
for entry in "$folder"/*
do
echo "$entry"
done

shell scripting no such file or directory

I wrote a shell script that calls the ffmpeg tool, but when I run it, it says "No such file or directory" even though the file does exist!
Here is my script:
#!/bin/bash
MAIN_DIR="/media/sf_data/pipeline"
FFMPEG_DIR="/media/sf_data/livraison_transcripts/ffmpeg-git-20180208-64bit-static"
for file in MAIN_DIR/audio_mp3/*.mp3;
do
cp -p file FFMPEG_DIR;
done
for file in FFMPEG_DIR/*.mp3;
do
./ffmpeg -i ${file%.mp3}.ogg
sox $file -t raw --channels=1 --bits=16 --rate=16000 --encoding=signed-integer --endian=little ${file%.ogg}.raw;
done
for file in FFMPEG_DIR/*.raw;
do
cp -p file MAIN_DIR/pipeline/audio_raw/;
done
and here is the debug response:
cp: cannot stat ‘file’: No such file or directory
./essai.sh: line 14: ./ffmpeg: No such file or directory
sox FAIL formats: can't open input file `FFMPEG_DIR/*.mp3': No such file or directory
cp: cannot stat ‘file’: No such file or directory
FYI I'm running CentOS7 on VirtualBox
Thank you
Here's a Minimal, Complete, and Verifiable example (MCVE), a version of your script that removes everything not required to show the problem:
#!/bin/bash
MAIN_DIR="/media/sf_data/pipeline"
echo MAIN_DIR
Expected output:
/media/sf_data/pipeline
Actual output:
MAIN_DIR
This is because bash requires a $ when expanding variables:
#!/bin/bash
MAIN_DIR="/media/sf_data/pipeline"
echo "$MAIN_DIR"
The quotes are not required to fix the issue, but they prevent problems with whitespace.
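A tiny demonstration of the whitespace point, using a hypothetical helper function that just counts its arguments:
count_args() { echo "$# argument(s)"; }
DIR="/media/sf data/pipeline"   # note the space in the name
count_args $DIR                 # word splitting: prints "2 argument(s)"
count_args "$DIR"               # quoted: prints "1 argument(s)"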
Hi, you need a couple of corrections in your shell script; see below. To get the actual value assigned to a variable, you need to add $ in front of the variable name.
for file in $"MAIN_DIR"/audio_mp3/*.mp3;
do
cp -p "$file" "$FFMPEG_DIR";
done
for file in "$FFMPEG_DIR"/*.mp3;
./ffmpeg -i ${file%.mp3}.ogg
#provide full path like /usr/bin/ffmpeg
for file in "$FFMPEG_DIR"/*.raw;
do
cp -p "$file" "$MAIN_DIR"/pipeline/audio_raw/;
done
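Separately, the "./ffmpeg: No such file or directory" line means the script looked for ffmpeg in the current working directory. A small hedged check before the loops makes that failure explicit (this assumes the static build really is unpacked in $FFMPEG_DIR):
#!/bin/bash
FFMPEG_DIR="/media/sf_data/livraison_transcripts/ffmpeg-git-20180208-64bit-static"
if [ ! -x "$FFMPEG_DIR/ffmpeg" ]
then
echo "ffmpeg not found or not executable in $FFMPEG_DIR" >&2
exit 1
fi
"$FFMPEG_DIR/ffmpeg" -version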

How to export multiple files to database using shell script?

Hi everyone, I have some files to be copied to the database, but every time I have to write something like "cp <file>.csv /tmp", so please suggest a shell script command so that I can copy all the files at once. Thank you.
#!/bin/sh
file=/pathtoyour/textfile.txt
while read -r line
do
cp "$line" /tmp/
done < "$file"
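If the goal is simply to copy every CSV in one go rather than listing them in a text file, a plain glob is enough; a sketch, assuming the files all live in one directory (the path here is a placeholder):
#!/bin/sh
cp /path/to/your/csvs/*.csv /tmp/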

Unix Bash Alias Command

I am trying to simplify my work with the help of Alias commands in my bash shell.
Problem Statement:
I want to copy different files from different directories to one single folder. The syntax I am using here is as below:
cp <folder>/<file> <path>/file.dir
Here I want to save the destination file with filename.directory for easy identification. To achieve the same, I have written the below alias.
Alias Script
cp $Folder/$fileName ~/<path>/$fileName.$Folder
OR
cp $1/$2 ~/<path>/$2.$1
Expected output:
cp bin/file1 ~/Desktop/personal/file1.bin
cp etc/file2 ~/Desktop/personal/file2.etc
However, it is failing to parse the source file; i.e., $Folder is not replaced with my first argument:
cp: cannot stat `/file1': No such file or directory
I am writing the above script only to reduce my command lengths. As I am not an expert in the above, I am seeking help in resolving the issue.
Rather than using an alias, you could use a function which you define in some suitable location such as .profile or .bashrc.
For example:
mycp()
{
folder=$1
filename=$2
if [ $# -ne 2 ]
then
echo "Two parameters not entered"
return
fi
if [ -d "$folder" ] && [ -r "$folder/$filename" ]
then
cp "$folder/$filename" ~/playpen/"$filename.$folder"
else
echo "Invalid parameter"
fi
}
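A quick usage sketch, assuming the function has been sourced (e.g. from .bashrc) and that ~/playpen exists:
$ mycp bin file1        # copies bin/file1 to ~/playpen/file1.bin
$ mycp etc file2        # copies etc/file2 to ~/playpen/file2.etc
$ mycp onlyonearg
Two parameters not entered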
There is no way a bash alias can use arguments as you are trying to do. However, perl based rename can probably help you here. Note that it will effectively mv the files, not cp them.
rename 's|([^/]*)/(.*)|/home/user/path/$2.$1|' */*
Limitations: You can only process the files in 1 sub-directory level.
So, the below alias can work (with the above limitation):
$ alias backupfiles="rename 's|([^/]*)/(.*)|/home/user/path/\$2.\$1|'"
$ backupfiles */*
You can write a more sophisticated Perl expression if you want to work with a multi-level directory structure.
Say a directory contains some files, e.g. ~/Documents/file1.d contains newfile.txt:
joe@indiana:~/Documents$ ls -l $file
total 1
-rw-r--r-- 1 joe staff 0 May 5 11:39 newfile.txt
Add the variable 'file' in .bashrc; for example, my .bashrc is shown here:
alias ll='ls -la'
file=~/Documents/file1.d
Now whenever you copy to '$file', it will copy to the file1.d directory under ~/Documents :)
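Usage sketch, assuming the updated .bashrc has been re-sourced and report.txt is a hypothetical file in the current directory:
$ source ~/.bashrc
$ cp report.txt "$file"   # ends up in ~/Documents/file1.d/report.txt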

Why aren't the BASH commands in for loop working

I have a simple script, which is:
#!/bin/bash
#LaTex code generator for figures.
ls *.pdf > pdfs.file
ls *.ps > ps.file
pwd=$(pwd)
for i in {1..2}
do
# var=$(awk 'NR==$i' 'pdfs.file')
echo $pwd
echo $pwd > testfile
done
Why aren't the commands in the for loop working?
The $pwd isn't echoed, and the testfile isn't created.
I tried these commands without the for loop in a terminal and they work fine.
My bash file is made executable by chmod +x bashfile.sh
What I am trying to do is this:
Find PDF or EPS files and populate pdfs.file and eps.file with their file names.
Step through row by row, grab these file names, and append them to $pwd.
Then append $pwd$var to the \includegraphics command in LaTeX.
I'm not sure what you're doing wrong, but this works fine for me:
for i in {1..2}; do
echo $PWD
echo $PWD > /tmp/testfile
done
echo "File contents: $(cat /tmp/testfile)"
This successfully returns the following:
/tmp
/tmp
File contents: /tmp
Did you write the bash file using a Windows editor? Maybe you have a problem with line terminators. Try dos2unix bashfile.sh.
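A quick way to confirm the line-terminator theory before running dos2unix; both commands are standard, and on a script saved with Windows line endings, file typically reports "with CRLF line terminators" while cat -A shows a ^M before the $ at the end of each line:
$ file bashfile.sh
$ cat -A bashfile.sh | head -n 3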
