I have a long list of PDF files which I am trying to merge into a single file. If I run the command
gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/default -dNOPAUSE -dQUIET -dBATCH -dDetectDuplicateImages -dCompressFonts=true -r150 -sOutputFile=blender_manual.pdf $(ls -v)
gs exits with the error
Error: /undefinedfilename in (1.1)
Operand stack:
Execution stack:
%interp_exit .runexec2 --nostringval-- --nostringval-- --nostringval-- 2 %stopped_push --nostringval-- --nostringval-- --nostringval-- false 1 %stopped_push
Dictionary stack:
--dict:1173/1684(ro)(G)-- --dict:0/20(G)-- --dict:77/200(L)--
Current allocation mode is local
Last OS error: No such file or directory
GPL Ghostscript 9.06: Unrecoverable error, exit code 1
The command works if I put *.pdf instead of $(ls -v), but then it merges the files in the wrong order. How can I fix this?
Use the @file syntax (see the Ghostscript documentation): list the files you want processed, in the order you want them processed, in a text file, and pass that file to gs prefixed with @.
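A minimal sketch of that approach (assuming the files sort correctly under ls -v; the list file name blender_list.txt is arbitrary):
ls -v *.pdf > blender_list.txt   # list file name is arbitrary; one name per line, in version-sort order
gs -sDEVICE=pdfwrite -dCompatibilityLevel=1.4 -dPDFSETTINGS=/default -dNOPAUSE -dQUIET -dBATCH -dDetectDuplicateImages -dCompressFonts=true -r150 -sOutputFile=blender_manual.pdf @blender_list.txt
Ghostscript reads @blender_list.txt and treats its contents as additional command-line arguments, so the files are processed in exactly the order they appear in the list; file names containing spaces may still need extra care there.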
I have a directory with several files, such as:
OK_file1.txt
OK_file2.txt
OK_file3.txt
OK_file4.txt
and within these files I have content such as:
OK_file1.txt
error: 89 DUE TO TIME LIMIT ***
OK_file2.txt
Job_done
OK_file3.txt
Job_done
OK_file4.txt
error: 34 DUE TO TIME LIMIT ***
and I would like to parse each of these files and list only those containing the string error in a new file called Jobs_error.txt.
For this example, the file should be
OK_file1.txt
OK_file4.txt
Does anyone have an idea?
You can use the following option of the grep command:
$ man grep
[…]
-l, --files-with-matches print only names of FILEs containing matches
and a redirection (>):
$ grep -l "error" OK_file*.txt > Jobs_error.txt
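With the sample files above, the resulting list then contains exactly the two files whose jobs hit the time limit:
$ cat Jobs_error.txt
OK_file1.txt
OK_file4.txt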
When I submit jobs with --immediate-submit and --dependency=afterok:{dependencies}, the temp files are getting deleted even before the rules that depend on them have started. It works perfectly when run the normal way. Has anyone else come across this issue?
Snakemake script
rule all:
    input: 'file2', 'file3'

rule one:
    input: 'file1'
    output: temp('file2')
    shell: "touch {output}"

rule two:
    input: 'file2'
    output: 'file3'
    shell: "touch {output}"
Submission command
snakemake -s sample.snake --cluster '../mybatch.py {dependencies}' -j 4 --immediate-submit
Snakemake output message
Provided cluster nodes: 4
Job counts:
count jobs
1 all
1 one
1 two
3
rule one:
    input: file1
    output: file2
    jobid: 1

rule two:
    input: file2
    output: file3
    jobid: 2

localrule all:
    input: file2, file3
    jobid: 0
Removing temporary output file file2.
Finished job 0.
1 of 3 steps (33%) done
Removing temporary output file file2.
Finished job 1.
2 of 3 steps (67%) done
localrule all:
    input: file2, file3
    jobid: 0
Removing temporary output file file2.
Finished job 0.
3 of 3 steps (100%) done
Error in job two while creating output file file3.
ClusterJobException in line 9 of /faststorage/home/veera/pipelines/ipsych-GATK/test/sample.snake:
Error executing rule two on cluster (jobid: 2, jobscript: /faststorage/home/veera/pipelines/ipsych-GATK/test/.snakemake/tmp.cmvmr3lz/snakejob.two.2.sh). For
detailed error see the cluster log.
Will exit after finishing currently running jobs.
Error message
Missing files after 5 seconds:
file2
/faststorage/home/veera/pipelines/ipsych-GATK/test/.snakemake/tmp.jqh2qz0n
touch: cannot touch `/faststorage/home/veera/pipelines/ipsych-GATK/test/.snakemake/tmp.jqh2qz0n/1.jobfailed': No such file or directory
Missing files after 5 seconds:
file2
The jobs are being submitted with the proper dependencies. ../mybatch.py is a custom wrapper script for sbatch. Is this a bug, or an error in my code? Thanks in advance for the help.
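For reference, a minimal bash sketch of what such a wrapper usually has to do under --immediate-submit (this is not the asker's mybatch.py, and the exact flags are assumptions): collect the job IDs that Snakemake substitutes for {dependencies}, turn them into a --dependency=afterok:... option, and print only the new job ID on stdout so Snakemake can track it.

#!/bin/bash
# hypothetical wrapper: Snakemake passes the {dependencies} job IDs first
# and the generated jobscript as the last argument
jobscript="${@: -1}"
deps=("${@:1:$#-1}")

dep_flag=""
if [ "${#deps[@]}" -gt 0 ]; then
    # join the IDs with ':' for sbatch's afterok specification
    dep_flag="--dependency=afterok:$(IFS=:; echo "${deps[*]}")"
fi

# --parsable makes sbatch print just the job ID, which Snakemake reads from stdout
sbatch --parsable $dep_flag "$jobscript"

The --parsable flag is used here only because it keeps stdout to the bare job ID; an equivalent grep or awk over the normal sbatch output works just as well.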
I'm trying to concatenate multiple PDF files which are basically the pages of a photobook containing JPEG images. For my output PDF file I wish to set the image resolution to 300 dpi while keeping the best possible quality. The command I'm using is:
gswin64c.exe -dNOPAUSE -dBATCH ^
 -dDownsampleColorImages=true -dColorImageResolution=300 ^
 -dDownsampleGrayImages=true -dGrayImageResolution=300 ^
 -dDownsampleMonoImages=true -dMonoImageResolution=300 ^
 -sDEVICE=pdfwrite -dJPEGQ=100 -sOutputFile=out.pdf in1.pdf in2.pdf
However, it seems that -dJPEGQ=100 has no effect on the output: changing this parameter leads to the same file size, and artifacts are visible in the images for all values. Running the command with the option -dPDFSETTINGS=/printer I get better results without artifacts, yet that option should also produce 300 dpi. So what is the correct way to specify the quality of the JPEG images in the output file?
The solution is to adjust the DCTEncode filter with the following command:
gswin64c.exe -sOutputFile=out.pdf -dNOPAUSE -dBATCH ^
 -sDEVICE=pdfwrite -dPDFSETTINGS=/prepress ^
 -c "<< /ColorACSImageDict << /VSamples [ 1 1 1 1 ] /HSamples [ 1 1 1 1 ] /QFactor 0.08 /Blend 1 >> /ColorImageDownsampleType /Bicubic /ColorConversionStrategy /LeaveColorUnchanged >> setdistillerparams" ^
 -f in1.pdf
which leads to a compressed file with satisfactory quality for me and can be adjusted to individual needs.
Edit:
The .setpdfwrite argument is deprecated in recent Ghostscript releases (> 9.50), so I removed it from the answer.
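If the pages also contain grayscale images, the same idea can presumably be extended with the analogous gray-image distiller parameters; a sketch, not tested against the asker's files, reusing the same quality settings:

REM hypothetical variant of the answer's command for grayscale images
gswin64c.exe -sOutputFile=out.pdf -dNOPAUSE -dBATCH ^
 -sDEVICE=pdfwrite -dPDFSETTINGS=/prepress ^
 -c "<< /GrayACSImageDict << /VSamples [ 1 1 1 1 ] /HSamples [ 1 1 1 1 ] /QFactor 0.08 /Blend 1 >> /GrayImageDownsampleType /Bicubic >> setdistillerparams" ^
 -f in1.pdf

A lower QFactor means higher quality and larger files, so the 0.08 above can be raised if the output is still too big.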
I want to display the line numbers associated with the files [not generate them] in a recursive side-by-side diff between two directories. To display line numbers of files in a diff, the command I use is:
diff -y <(cat -n abc1.txt) <(cat -n abc2.txt)
But how do I do it in the case of directories?
diff -y folder1 folder2
The expected output is:
folder1/file1a.txt folder2/file2a.txt
> 1
1 This is original content | 2 This is changed content
folder1/file1b.txt folder2/file2b.txt
> 1
> 2
1 This is another original content | 3 This is another changed content 2
Is there any solution, perhaps using xargs or something similar? What I have observed is that the recursive diff actually runs diff on each file in the directories, so is there any way to add the line numbers at that per-file level before diff compares a particular pair of files? Any ideas?
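One possible workaround (a sketch, assuming both folders contain files with matching names, which is not exactly the layout shown above) is to skip the recursive diff and loop over the file pairs yourself, applying the same cat -n trick to each:

for f in folder1/*; do
    name=$(basename "$f")    # assumes folder2 holds a file with the same name
    echo "folder1/$name folder2/$name"
    diff -y <(cat -n "folder1/$name") <(cat -n "folder2/$name")
done

Each pair then gets a header line followed by a numbered side-by-side diff, similar to the expected output above.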
Hey, I am trying to pipe the output of the convert command.
The line is
convert -geometry 384x \
../../../Flickr/balance/balance_3/balance_399.png \
ppm:- | ./segment 0.5 2000 100 - balance_399_out.ppm
and it throws the error
terminate called after throwing an instance of 'pnm_error'
Aborted (core dumped)
Is there any way to pipe the output of convert to this executable without saving a PPM file?
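One thing sometimes worth trying (a sketch, not a confirmed fix; it assumes the segment binary can open an ordinary path but does not treat - as meaning stdin) is bash process substitution, which hands it a readable /dev/fd path instead of the - argument:

./segment 0.5 2000 100 \
    <(convert -geometry 384x \
        ../../../Flickr/balance/balance_3/balance_399.png ppm:-) \
    balance_399_out.ppm   # <(...) is a guess, assuming segment cannot read '-' as stdin

If segment's PNM reader needs to seek in the file rather than read it sequentially, this will still fail, and writing a temporary file remains the fallback.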