Copy the content of a file to multiple files in Linux - bash

I want to copy the content of a file into multiple files that share the same extension. How can I do that with a Linux command?
I tried running the command:
cat t1.txt > /etc/apache2/site-available/*le-ssl.conf
and
echo "hello" > /etc/apache2/site-available/*le-ssl.conf
but both give me the error "ambiguous redirect".
Any ideas?

A redirect will not duplicate a data stream. If you want multiple copies, use tee. For example:
< t1.txt tee /etc/apa.../*le-ssl.conf
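Under the hood, tee writes its standard input to every file named as an argument, so the shell's glob expansion does the fan-out. A minimal sketch, using a temporary directory and made-up file names rather than the real /etc/apache2 path:

```shell
# Duplicate one file's content into every file matching a glob.
# The temp dir and file names here are illustrative, not from the question.
dir=$(mktemp -d)
echo "SSLEngine on" > "$dir/src.txt"
touch "$dir/a-le-ssl.conf" "$dir/b-le-ssl.conf"

# tee writes its stdin to every file given as an argument;
# the shell expands the glob into that argument list.
tee "$dir"/*le-ssl.conf < "$dir/src.txt" > /dev/null

cat "$dir/a-le-ssl.conf"   # SSLEngine on
cat "$dir/b-le-ssl.conf"   # SSLEngine on
```

Redirecting with `>` fails here because the shell cannot pick one file among several glob matches for a single output stream, which is exactly what "ambiguous redirect" means.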

Related

How to clean all contents of a file from Terminal on Mac

How to clean all contents of a file from Terminal on Mac? I've been searching for this for about a day (I don't want to delete the file).
Assuming it's a file with text content, there are many ways to do it.
You can do either > filename.ext or cat /dev/null > filename.ext
You just need to write empty content to the file, e.g.:
> your.file
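The truncation idioms above can be sketched side by side; the `:` variant is the POSIX-portable spelling (`:` is a no-op builtin), and the file name here is a throwaway temp file:

```shell
# Empty a file without deleting it; the file keeps its inode and permissions.
f=$(mktemp)
printf 'some old content\n' > "$f"

> "$f"          # plain redirection with no command (works in bash)
[ -s "$f" ] || echo "file is now empty"

printf 'more content\n' > "$f"
: > "$f"        # ':' is a no-op builtin; portable to plain POSIX sh
```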

How to write NUL to all log files in a folder using windows command line

I have multiple log files named like ABC_*.log in a Windows environment. I want to empty each file (like writing /dev/null to a file in Linux). I need to do it through the command line.
What I tried:
cmd:$ break > ABC_*.log
and
cmd:$ type NUL > ABC_*.log
Error:
The filename, directory name, or volume label syntax is incorrect
This can't be done via a wildcard (it's not possible to redirect to more than one file at a time). Use a for loop to process each file on its own:
for %%a in (ABC_*.log) do (
break>"%%a"
)
or directly on command line:
for %a in (ABC_*.log) do break>"%a"
The easiest way to empty a file in UNIX/Linux:
rm <filename>
touch <filename>
Note that this recreates the file with default ownership and permissions and detaches any hard links or open file handles; truncating in place with > <filename> avoids those side effects.

how to keep curl from deleteing log.txt when adding new entry?

I am trying to create a bash script that downloads the Ubuntu ISO and a file from my own FTP server, then does a traceroute to my server and saves the route/date and the average speed of the two downloads to log.txt.
Where I am stuck:
This seems to do okay
curl -o test.avi http://hostve.com/neobuntu/pics/Ubu1.avi 2> test.log
Sadly it removes the previous content of test.log.
With >, you are removing the previous data. If you want to append data, use >>:
curl -o test.avi http://hostve.com/neobuntu/pics/Ubu1.avi 2>> test.log
From Bash Reference Manual #3.6 Redirections:
3.6.2 Redirecting Output
Redirection of output causes the file whose name results from the
expansion of word to be opened for writing on file descriptor n, or
the standard output (file descriptor 1) if n is not specified. If
the file does not exist it is created; if it does exist it is
truncated to zero size.
The general format for redirecting output is:
[n]>[|]word
3.6.3 Appending Redirected Output
Redirection of output in this fashion causes the file whose name
results from the expansion of word to be opened for appending on file
descriptor n, or the standard output (file descriptor 1) if n is not
specified. If the file does not exist it is created.
The general format for appending output is:
[n]>>word
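The two redirection modes quoted above can be seen side by side in a small sketch, with echo standing in for curl's stderr output and a temp file standing in for test.log:

```shell
# '>' truncates the target before writing; '>>' appends to it.
log=$(mktemp)

echo "first run"  > "$log"   # truncates, then writes
echo "second run" > "$log"   # truncates again: "first run" is gone
grep -c . "$log"             # prints 1

echo "third run" >> "$log"   # appends: previous line survives
grep -c . "$log"             # prints 2
```

For the curl case this means `2>> test.log` accumulates the progress/error output of every run, while `2> test.log` keeps only the latest run.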

open a folder from a name in txt file BASH

I have a text file fold.txt that contains one line fold_nam:
$ cat fold.txt
fold_nam
This name in the text file is the name of a folder, written out during a program run, that now contains other files I need to work with.
I am writing a big script and now I need to enter this folder, so I need to get the name from the text file. I tried several things but cannot really work it out.
There's no need to use cat:
cd "$(<fold.txt)"
If you want to read the line into a variable: read -r folder_name < fold.txt
You should be able to do this:
cd "$(cat fold.txt)"
or
cd "`cat fold.txt`"
(the double quotes protect folder names that contain spaces).
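Putting the pieces together, a self-contained sketch of the whole flow, with a temporary directory and the fold_nam name recreated for illustration:

```shell
# Recreate the situation: a folder, and a text file holding its name.
dir=$(mktemp -d)
mkdir "$dir/fold_nam"
echo "fold_nam" > "$dir/fold.txt"
cd "$dir"

# Quote the command substitution so a name with spaces
# would survive word splitting intact.
cd "$(<fold.txt)"
pwd
```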

Can I execute multiple commands store in a text file through bat file

I have a file input.txt which contains multiple commands. I need to use that file as the input of my batch file. How can I do this?
cmd < input.txt
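The bash equivalent of this trick works the same way: a shell given a file on standard input reads and runs the commands from it. A small sketch with a made-up command file:

```shell
# Feed a file of commands to a shell's stdin; each line runs in order.
cmds=$(mktemp)
printf 'echo one\necho two\n' > "$cmds"

bash < "$cmds"
```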
