I'm currently trying to preprocess data and need to rename all the image files in my folder sequentially. However, when I try to do it with the code below, it ends up producing more numbered files than expected, which appear to be previous transformations of the images (previously inverted/cropped copies).
ls | cat -n | while read n f; do mv "$f" "$n.png"; done
I want the files in my folder to just be labeled sequentially, without any additional files being created.
The modifications to the images were made using ImageMagick on Linux.
The output of ls is as below:
000001560000.png~ 000001900000.png~~~~ 000002260000.png~
000001560000.png~~ 000001900000.png~~~~~ 000002260000.png~~
000001560000.png~~~ 000001910000.png~ 000002260000.png~~~
000001560000.png~~~~ 000001910000.png~~ 000002260000.png~~~~
000001560000.png~~~~~ 000001910000.png~~~ 000002260000.png~~~~~
000001570000.png~ 000001910000.png~~~~ 000002270000.png~
000001570000.png~~ 000001910000.png~~~~~ 000002270000.png~~
000001570000.png~~~ 000001920000.png~ 000002270000.png~~~
000001570000.png~~~~ 000001920000.png~~ 000002270000.png~~~~
000001570000.png~~~~~ 000001920000.png~~~ 000002270000.png~~~~~
000001580000.png~ 000001920000.png~~~~ 000002280000.png~
000001580000.png~~ 000001920000.png~~~~~ 000002280000.png~~
000001580000.png~~~ 000001930000.png~ 000002280000.png~~~
000001580000.png~~~~ 000001930000.png~~ 000002280000.png~~~~
000001580000.png~~~~~ 000001930000.png~~~ 000002280000.png~~~~~
000001590000.png~ 000001930000.png~~~~ 000002290000.png~
000001590000.png~~ 000001930000.png~~~~~ 000002290000.png~~
000001590000.png~~~ 000001940000.png~ 000002290000.png~~~
000001590000.png~~~~ 000001940000.png~~ 000002290000.png~~~~
000001590000.png~~~~~ 000001940000.png~~~ 000002290000.png~~~~~
000001600000.png~ 000001940000.png~~~~ 000002300000.png~
000001600000.png~~ 000001940000.png~~~~~ 000002300000.png~~
000001600000.png~~~ 000001950000.png~ 000002300000.png~~~
000001600000.png~~~~ 000001950000.png~~ 000002300000.png~~~~
000001600000.png~~~~~ 000001950000.png~~~ 000002300000.png~~~~~
000001610000.png~ 000001950000.png~~~~ 000002310000.png~
000001610000.png~~ 000001950000.png~~~~~ 000002310000.png~~
000001610000.png~~~ 000001960000.png~ 000002310000.png~~~
000001610000.png~~~~ 000001960000.png~~ 000002310000.png~~~~
000001610000.png~~~~~ 000001960000.png~~~ 000002310000.png~~~~~
000001620000.png~ 000001960000.png~~~~ 000002320000.png~
000001620000.png~~ 000001960000.png~~~~~ 000002320000.png~~
000001620000.png~~~ 000001970000.png~ 000002320000.png~~~
000001620000.png~~~~ 000001970000.png~~ 000002320000.png~~~~
000001620000.png~~~~~ 000001970000.png~~~ 000002320000.png~~~~~
000001630000.png~ 000001970000.png~~~~ 000002330000.png~
000001630000.png~~ 000001970000.png~~~~~ 000002330000.png~~
000001630000.png~~~ 000001980000.png~ 000002330000.png~~~
000001630000.png~~~~ 000001980000.png~~ 000002330000.png~~~~
000001630000.png~~~~~ 000001980000.png~~~ 000002330000.png~~~~~
000001640000.png~ 000001980000.png~~~~ 000002340000.png~
000001640000.png~~ 000001980000.png~~~~~ 000002340000.png~~
000001640000.png~~~ 000001990000.png~ 000002340000.png~~~
000001640000.png~~~~ 000001990000.png~~ 000002340000.png~~~~
000001640000.png~~~~~ 000001990000.png~~~ 000002340000.png~~~~~
000001650000.png~ 000001990000.png~~~~ 000002350000.png~
000001650000.png~~ 000001990000.png~~~~~ 000002350000.png~~
000001650000.png~~~ 000002000000.png~ 000002350000.png~~~
000001650000.png~~~~ 000002000000.png~~ 000002350000.png~~~~
000001650000.png~~~~~ 000002000000.png~~~ 000002350000.png~~~~~
000001660000.png~ 000002000000.png~~~~ 000002360000.png~
000001660000.png~~ 000002000000.png~~~~~ 000002360000.png~~
000001660000.png~~~ 000002010000.png~ 000002360000.png~~~
000001660000.png~~~~ 000002010000.png~~ 000002360000.png~~~~
000001660000.png~~~~~ 000002010000.png~~~ 000002360000.png~~~~~
000001670000.png~ 000002010000.png~~~~ 000002370000.png~
000001670000.png~~ 000002010000.png~~~~~ 000002370000.png~~
000001670000.png~~~ 000002020000.png~ 000002370000.png~~~
000001670000.png~~~~ 000002020000.png~~ 000002370000.png~~~~
000001670000.png~~~~~ 000002020000.png~~~ 000002370000.png~~~~~
000001680000.png~ 000002020000.png~~~~ 000002380000.png~
000001680000.png~~ 000002020000.png~~~~~ 000002380000.png~~
000001680000.png~~~ 000002030000.png~ 000002380000.png~~~
000001680000.png~~~~ 000002030000.png~~ 000002380000.png~~~~
000001680000.png~~~~~ 000002030000.png~~~ 000002380000.png~~~~~
000001690000.png~ 000002030000.png~~~~ 000002390000.png~
000001690000.png~~ 000002030000.png~~~~~ 000002390000.png~~
000001690000.png~~~ 000002040000.png~ 000002390000.png~~~
000001690000.png~~~~ 000002040000.png~~ 000002390000.png~~~~
000001690000.png~~~~~ 000002040000.png~~~ 000002390000.png~~~~~
000001700000.png~ 000002040000.png~~~~ 000002400000.png~
000001700000.png~~ 000002040000.png~~~~~ 000002400000.png~~
000001700000.png~~~ 000002050000.png~ 000002400000.png~~~
000001700000.png~~~~ 000002050000.png~~ 000002400000.png~~~~
000001700000.png~~~~~ 000002050000.png~~~ 000002400000.png~~~~~
000001710000.png~ 000002050000.png~~~~ 000002410000.png~
000001710000.png~~ 000002050000.png~~~~~ 000002410000.png~~
000001710000.png~~~ 000002060000.png~ 000002410000.png~~~
000001710000.png~~~~ 000002060000.png~~ 000002410000.png~~~~
000001710000.png~~~~~ 000002060000.png~~~ 000002410000.png~~~~~
000001720000.png~ 000002060000.png~~~~ 000002420000.png~
000001720000.png~~ 000002060000.png~~~~~ 000002420000.png~~
000001720000.png~~~ 000002070000.png~ 000002420000.png~~~
000001720000.png~~~~ 000002070000.png~~ 000002420000.png~~~~
000001720000.png~~~~~ 000002070000.png~~~ 000002420000.png~~~~~
000001730000.png~ 000002070000.png~~~~ 10.png
000001730000.png~~ 000002070000.png~~~~~ 11.png
000001730000.png~~~ 000002080000.png~ 12.png
000001730000.png~~~~ 000002080000.png~~ 13.png
000001730000.png~~~~~ 000002080000.png~~~ 14.png
000001740000.png~ 000002080000.png~~~~ 15.png
000001740000.png~~ 000002080000.png~~~~~ 16.png
000001740000.png~~~ 000002090000.png~ 17.png
000001740000.png~~~~ 000002090000.png~~ 18.png
000001740000.png~~~~~ 000002090000.png~~~ 19.png
000001750000.png~ 000002090000.png~~~~ 1.png
000001750000.png~~ 000002090000.png~~~~~ 20.png
000001750000.png~~~ 000002100000.png~ 21.png
000001750000.png~~~~ 000002100000.png~~ 22.png
000001750000.png~~~~~ 000002100000.png~~~ 23.png
000001760000.png~ 000002100000.png~~~~ 24.png
000001760000.png~~ 000002100000.png~~~~~ 25.png
000001760000.png~~~ 000002110000.png~ 26.png
000001760000.png~~~~ 000002110000.png~~ 27.png
000001760000.png~~~~~ 000002110000.png~~~ 28.png
000001770000.png~ 000002110000.png~~~~ 29.png
000001770000.png~~ 000002110000.png~~~~~ 2.png
000001770000.png~~~ 000002120000.png~ 30.png
000001770000.png~~~~ 000002120000.png~~ 31.png
000001770000.png~~~~~ 000002120000.png~~~ 32.png
000001780000.png~ 000002120000.png~~~~ 33.png
000001780000.png~~ 000002120000.png~~~~~ 34.png
000001780000.png~~~ 000002130000.png~ 35.png
000001780000.png~~~~ 000002130000.png~~ 36.png
000001780000.png~~~~~ 000002130000.png~~~ 37.png
000001790000.png~ 000002130000.png~~~~ 38.png
000001790000.png~~ 000002130000.png~~~~~ 39.png
000001790000.png~~~ 000002140000.png~ 3.png
000001790000.png~~~~ 000002140000.png~~ 40.png
000001790000.png~~~~~ 000002140000.png~~~ 41.png
000001800000.png~ 000002140000.png~~~~ 42.png
000001800000.png~~ 000002140000.png~~~~~ 43.png
000001800000.png~~~ 000002150000.png~ 44.png
000001800000.png~~~~ 000002150000.png~~ 45.png
000001800000.png~~~~~ 000002150000.png~~~ 46.png
000001810000.png~ 000002150000.png~~~~ 47.png
000001810000.png~~ 000002150000.png~~~~~ 48.png
000001810000.png~~~ 000002160000.png~ 49.png
000001810000.png~~~~ 000002160000.png~~ 4.png
000001810000.png~~~~~ 000002160000.png~~~ 50.png
000001820000.png~ 000002160000.png~~~~ 51.png
000001820000.png~~ 000002160000.png~~~~~ 52.png
000001820000.png~~~ 000002170000.png~ 53.png
000001820000.png~~~~ 000002170000.png~~ 54.png
000001820000.png~~~~~ 000002170000.png~~~ 55.png
000001830000.png~ 000002170000.png~~~~ 56.png
000001830000.png~~ 000002170000.png~~~~~ 57.png
000001830000.png~~~ 000002180000.png~ 58.png
000001830000.png~~~~ 000002180000.png~~ 59.png
000001830000.png~~~~~ 000002180000.png~~~ 5.png
000001840000.png~ 000002180000.png~~~~ 60.png
000001840000.png~~ 000002180000.png~~~~~ 61.png
000001840000.png~~~ 000002190000.png~ 62.png
000001840000.png~~~~ 000002190000.png~~ 63.png
000001840000.png~~~~~ 000002190000.png~~~ 64.png
000001850000.png~ 000002190000.png~~~~ 65.png
000001850000.png~~ 000002190000.png~~~~~ 66.png
000001850000.png~~~ 000002200000.png~ 67.png
000001850000.png~~~~ 000002200000.png~~ 68.png
000001850000.png~~~~~ 000002200000.png~~~ 69.png
000001860000.png~ 000002200000.png~~~~ 6.png
000001860000.png~~ 000002200000.png~~~~~ 70.png
000001860000.png~~~ 000002210000.png~ 71.png
000001860000.png~~~~ 000002220000.png~ 72.png
000001860000.png~~~~~ 000002220000.png~~ 73.png
000001870000.png~ 000002220000.png~~~ 74.png
000001870000.png~~ 000002220000.png~~~~ 75.png
000001870000.png~~~ 000002220000.png~~~~~ 76.png
000001870000.png~~~~ 000002230000.png~ 77.png
000001870000.png~~~~~ 000002230000.png~~ 78.png
000001880000.png~ 000002230000.png~~~ 79.png
000001880000.png~~ 000002230000.png~~~~ 7.png
000001880000.png~~~ 000002230000.png~~~~~ 80.png
000001880000.png~~~~ 000002240000.png~ 81.png
000001880000.png~~~~~ 000002240000.png~~ 82.png
000001890000.png~ 000002240000.png~~~ 83.png
000001890000.png~~ 000002240000.png~~~~ 84.png
000001890000.png~~~ 000002240000.png~~~~~ 85.png
000001890000.png~~~~ 000002250000.png~ 86.png
000001890000.png~~~~~ 000002250000.png~~ 8.png
000001900000.png~ 000002250000.png~~~ 9.png
000001900000.png~~ 000002250000.png~~~~
000001900000.png~~~ 000002250000.png~~~~~
The output of the second ls is:
100.png 148.png 196.png 244.png 292.png 340.png 388.png 436.png 484.png
101.png 149.png 197.png 245.png 293.png 341.png 389.png 437.png 485.png
102.png 150.png 198.png 246.png 294.png 342.png 390.png 438.png 486.png
103.png 151.png 199.png 247.png 295.png 343.png 391.png 439.png 487.png
104.png 152.png 200.png 248.png 296.png 344.png 392.png 440.png 488.png
105.png 153.png 201.png 249.png 297.png 345.png 393.png 441.png 489.png
106.png 154.png 202.png 250.png 298.png 346.png 394.png 442.png 490.png
107.png 155.png 203.png 251.png 299.png 347.png 395.png 443.png 491.png
108.png 156.png 204.png 252.png 300.png 348.png 396.png 444.png 492.png
109.png 157.png 205.png 253.png 301.png 349.png 397.png 445.png 493.png
110.png 158.png 206.png 254.png 302.png 350.png 398.png 446.png 494.png
111.png 159.png 207.png 255.png 303.png 351.png 399.png 447.png 495.png
112.png 160.png 208.png 256.png 304.png 352.png 400.png 448.png 496.png
113.png 161.png 209.png 257.png 305.png 353.png 401.png 449.png 497.png
114.png 162.png 210.png 258.png 306.png 354.png 402.png 450.png 498.png
115.png 163.png 211.png 259.png 307.png 355.png 403.png 451.png 499.png
116.png 164.png 212.png 260.png 308.png 356.png 404.png 452.png 500.png
117.png 165.png 213.png 261.png 309.png 357.png 405.png 453.png 501.png
118.png 166.png 214.png 262.png 310.png 358.png 406.png 454.png 502.png
119.png 167.png 215.png 263.png 311.png 359.png 407.png 455.png 503.png
120.png 168.png 216.png 264.png 312.png 360.png 408.png 456.png 504.png
121.png 169.png 217.png 265.png 313.png 361.png 409.png 457.png 505.png
122.png 170.png 218.png 266.png 314.png 362.png 410.png 458.png 506.png
123.png 171.png 219.png 267.png 315.png 363.png 411.png 459.png 507.png
124.png 172.png 220.png 268.png 316.png 364.png 412.png 460.png 508.png
125.png 173.png 221.png 269.png 317.png 365.png 413.png 461.png 509.png
126.png 174.png 222.png 270.png 318.png 366.png 414.png 462.png 510.png
127.png 175.png 223.png 271.png 319.png 367.png 415.png 463.png 511.png
128.png 176.png 224.png 272.png 320.png 368.png 416.png 464.png 512.png
129.png 177.png 225.png 273.png 321.png 369.png 417.png 465.png 513.png
130.png 178.png 226.png 274.png 322.png 370.png 418.png 466.png 514.png
131.png 179.png 227.png 275.png 323.png 371.png 419.png 467.png 515.png
132.png 180.png 228.png 276.png 324.png 372.png 420.png 468.png 516.png
133.png 181.png 229.png 277.png 325.png 373.png 421.png 469.png 517.png
134.png 182.png 230.png 278.png 326.png 374.png 422.png 470.png 87.png
135.png 183.png 231.png 279.png 327.png 375.png 423.png 471.png 88.png
136.png 184.png 232.png 280.png 328.png 376.png 424.png 472.png 89.png
137.png 185.png 233.png 281.png 329.png 377.png 425.png 473.png 90.png
138.png 186.png 234.png 282.png 330.png 378.png 426.png 474.png 91.png
139.png 187.png 235.png 283.png 331.png 379.png 427.png 475.png 92.png
140.png 188.png 236.png 284.png 332.png 380.png 428.png 476.png 93.png
141.png 189.png 237.png 285.png 333.png 381.png 429.png 477.png 94.png
142.png 190.png 238.png 286.png 334.png 382.png 430.png 478.png 95.png
143.png 191.png 239.png 287.png 335.png 383.png 431.png 479.png 96.png
144.png 192.png 240.png 288.png 336.png 384.png 432.png 480.png 97.png
145.png 193.png 241.png 289.png 337.png 385.png 433.png 481.png 98.png
146.png 194.png 242.png 290.png 338.png 386.png 434.png 482.png 99.png
147.png 195.png 243.png 291.png 339.png 387.png 435.png 483.png
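A safer version of the rename, sketched under the assumption that the tilde-suffixed files are backup copies from the earlier edits that you want to leave alone: glob instead of parsing ls, and move the results into a fresh directory so the new names cannot collide with files that are already numbered.
#!/bin/bash
# *.png matches only the real images (backup names like *.png~ end in "~",
# so the glob skips them); renaming into a fresh directory avoids clobbering
# files that were already renamed to 1.png, 2.png, ...
mkdir -p renamed
n=1
for f in *.png; do
    mv -- "$f" "renamed/$n.png"
    n=$((n + 1))
done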
Related
I am trying to output the contents of each folder into its own column of a CSV, e.g.
ls -R is great, but it outputs everything into one column:
/Folder1:
1a.jpg
2a.jpg
3a.jpg
4a.jpg
5a.jpg
/Folder2:
1b.jpg
2b.jpg
3b.jpg
4b.jpg
5b.jpg
/Folder3:
1c.jpg
2c.jpg
3c.jpg
4c.jpg
5c.jpg
/Folder4:
1d.jpg
2d.jpg
3d.jpg
4d.jpg
5d.jpg
/Folder5:
1e.jpg
2e.jpg
3e.jpg
4e.jpg
5e.jpg
But I am trying to output every folder into a new column.
/Folder1: ,/Folder2: ,/Folder3: ,/Folder4: ,/Folder5:
1a.jpg ,1b.jpg ,1c.jpg ,1d.jpg ,1e.jpg
2a.jpg ,2b.jpg ,2c.jpg ,2d.jpg ,2e.jpg
3a.jpg ,3b.jpg ,3c.jpg ,3d.jpg ,3e.jpg
4a.jpg ,4b.jpg ,4c.jpg ,4d.jpg ,4e.jpg
5a.jpg ,5b.jpg ,5c.jpg ,5d.jpg ,5e.jpg
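One way to build that, sketched with paste; it assumes the folders are literally named Folder1 through Folder5 as above and that no filename contains a comma:
#!/bin/bash
tmp=$(mktemp -d)
for d in Folder1 Folder2 Folder3 Folder4 Folder5; do
    # One column per folder: a header line, then the folder's listing.
    { echo "/$d:"; ls "$d"; } > "$tmp/$d"
done
# Glue the columns together with commas; paste pads shorter columns with blanks.
paste -d, "$tmp"/Folder? > columns.csv
rm -r "$tmp"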
I would like to merge specific files (XXXXXXX_Abstract_TOC.txt, XXXXXXX_Chapter1.txt, XXXXXXX_Chapter2.txt, XXXXXXX_Chapter3.txt, XXXXXXX_Chapter4.txt, XXXXXXX_Conclusion.txt) into one file, based on specific numbers that come from a text file (/util_files/list_NRPs.txt).
Note: each X is a digit [0-9].
The list_NRPs.txt contains as follows:
0030001
0030002
0030004
...
In /All_Files folder, I have files as follows:
0030001_Abstract_TOC.txt
0030001_Chapter1.txt
0030001_Chapter2.txt
0030001_Chapter3.txt
0030001_Chapter4.txt
0030001_Conclusion.txt
0030002_Abstract_TOC.txt
0030002_Chapter1.txt
0030002_Chapter2.txt
0030002_Chapter3.txt
0030002_Chapter4.txt
0030002_Conclusion.txt
0030004_Abstract_TOC.txt
0030004_Chapter1.txt
0030004_Chapter2.txt
0030004_Chapter3.txt
0030004_Chapter4.txt
0030004_Conclusion.txt
...
For each XXXXXXX from list_NRPs.txt I would like to merge XXXXXXX_Abstract_TOC.txt, XXXXXXX_Chapter1.txt, XXXXXXX_Chapter2.txt, XXXXXXX_Chapter3.txt, XXXXXXX_Chapter4.txt, XXXXXXX_Conclusion.txt into XXXXXXX_All.txt.
The final result in the /All_Files folder would be:
0030001_Abstract_TOC.txt
0030001_Chapter1.txt
0030001_Chapter2.txt
0030001_Chapter3.txt
0030001_Chapter4.txt
0030001_Conclusion.txt
0030001_All.txt
0030002_Abstract_TOC.txt
0030002_Chapter1.txt
0030002_Chapter2.txt
0030002_Chapter3.txt
0030002_Chapter4.txt
0030002_Conclusion.txt
0030002_All.txt
0030004_Abstract_TOC.txt
0030004_Chapter1.txt
0030004_Chapter2.txt
0030004_Chapter3.txt
0030004_Chapter4.txt
0030004_Conclusion.txt
0030004_All.txt
...
I would like to start with cat ../util_files/list_NRPs.txt | xargs, but I do not know how to proceed.
How can I do that?
You can use globbing to concatenate the multiple files matching each line of the list_NRPs.txt file:
while read -r ch; do
    # The glob picks up all six per-ID files; note that a rerun would also
    # match an existing ${ch}_All.txt, so run this on a clean folder.
    cat "/All_Files/$ch"* > "/All_Files/${ch}_All.txt"
done < /util_files/list_NRPs.txt
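Since the question reaches for xargs, here is a rough equivalent of the same loop as a one-liner; just a sketch (the lone _ fills sh's $0 slot so the ID lands in $1):
xargs -I{} sh -c 'cat "/All_Files/${1}"* > "/All_Files/${1}_All.txt"' _ {} \
    < /util_files/list_NRPs.txt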
I am writing a bash script that renames JPG files based on their EXIF tags. My original files are named like this:
IMG_2110.JPG
IMG_2112.JPG
IMG_2113.JPG
IMG_2114.JPG
I need to rename them like this:
2015-06-07_11-21-38_iPhone6Plus_USA-CA-Los_Angeles_IMG_2110.JPG
2015-06-07_11-22-41_iPhone6Plus_USA-CA-Los_Angeles_IMG_2112.JPG
2015-06-13_19-05-10_iPhone6Plus_Morocco-Fez_IMG_2113.JPG
2015-06-13_19-12-55_iPhone6Plus_Morocco-Fez_IMG_2114.JPG
My bash script uses exiftool to parse the EXIF header and rename the files. For those files that do not contain an EXIF create date, I am using the file modification time.
#!/bin/bash
IFS=$'\n'
for i in *.*; do
    # File modification time (BSD stat syntax), kept as a fallback for
    # files that have no EXIF create date
    MOD=$(stat -f %Sm -t %Y-%m-%d_%H-%M-%S "$i")
    model=$( exiftool -f -s3 -"Model" "${i}" )
    datetime=$( exiftool -f -s3 -"DateTimeOriginal" "${i}" )
    stamp=${datetime//:/-}"_"${model// /}
    echo "${stamp// /_}$i"
done
I am stuck on the location. I need to determine the country and city using the GPS information from the EXIF tag. exiftool provides a field called "GPS Position." Of all the fields, this seems the most useful to determine location.
GPS Position : 40 deg 44' 49.36" N, 73 deg 56' 28.18" W
Google provides a public API for geolocation, but it requires latitude/longitude coordinates in this format:
40.7470444°, -073.9411611°
The API returns quite a bit of information (click the link to see the results):
https://maps.googleapis.com/maps/api/geocode/json?latlng=40.7470444,-073.9411611
My question is:
How do I format the GPS Position to a latitude/longitude value that will provide acceptable input to a service such as Google geolocation?
How do I parse the JSON results to extract just the country and city, in a way that is consistent with many different kinds of locations? Curl, and then? Ideally, I’d like to handle USA locations one way, and non-USA locations, another. USA locations would be formatted USA-STATE-City, whereas non-USA locations would be formatted COUNTRY-City.
I need to do this all in a bash script. I've looked at pygeocoder and gpsbabel but they do not seem to do the trick. There are a few free web tools available but they don't provide an API (http://www.earthpoint.us/Convert.aspx).
Better late than never, right?
So, I just came across the same issue and I've managed to make the conversion using ExifTool itself. Try this:
exiftool -n -p '$GPSLatitude,$GPSLongitude' image_name.jpg
The converted coordinates are slightly longer than the format Google proposes, but the API accepted them fine.
Cheers.
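To take it the rest of the way, here is a sketch of the lookup and the JSON parsing with curl and jq, assuming jq is available and $API_KEY holds a Google API key (current versions of the geocoding API require one):
#!/bin/bash
coords=$(exiftool -n -p '$GPSLatitude,$GPSLongitude' image_name.jpg)
curl -s "https://maps.googleapis.com/maps/api/geocode/json?latlng=$coords&key=$API_KEY" |
    jq -r '.results[0].address_components[]
           | select(.types | index("country") or index("locality"))
           | .long_name'
# prints the city and the country, one per line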
For #1, the awk should not be that complicated:
awk '/GPS Position/{
    lat=$4; lat+=strtonum($6)/60; lat+=strtonum($7)/3600; if($8!="N,")lat=-lat;
    lon=$9; lon+=strtonum($11)/60; lon+=strtonum($12)/3600; if($13!="E")lon=-lon;
    printf "%.7f %.7f\n",lat,lon
}'
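Piping exiftool's human-readable output through it gives the decimal pair directly; a sketch, assuming the script above is saved as dms2dec.awk (a hypothetical name) and noting that strtonum is gawk-specific:
exiftool IMG_2110.JPG | gawk -f dms2dec.awk
# 40.7470444 -73.9411611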
I ended up doing it in PHP, but thanks for the tip, Marco, I'll check it out!
function get_gps($gps_pos) {
    // Strip "deg", commas and quote characters, then split on spaces:
    // 40 deg 44' 49.36" N, 73 deg 56' 28.18" W  ->  8 parts
    $parts = explode(" ", str_replace(array("deg ", ",", "'", "\""), "", $gps_pos));
    $lat_deg = $parts[0];
    $lat_min = $parts[1];
    $lat_sec = $parts[2];
    $lat_dir = $parts[3];
    $lon_deg = $parts[4];
    $lon_min = $parts[5];
    $lon_sec = $parts[6];
    $lon_dir = $parts[7];
    $lat_sign = ($lat_dir == "N") ? "+" : "-";
    $lon_sign = ($lon_dir == "E") ? "+" : "-";
    $latitude  = $lat_sign . ($lat_deg + ($lat_min / 60) + ($lat_sec / 3600));
    $longitude = $lon_sign . ($lon_deg + ($lon_min / 60) + ($lon_sec / 3600));
    return $latitude . "," . $longitude;
}
From man exiftool (note the last line):
-c FMT (-coordFormat)
Set the print format for GPS coordinates. FMT uses the same syntax
as a "printf" format string. The specifiers correspond to degrees,
minutes and seconds in that order, but minutes and seconds are
optional. For example, the following table gives the output for
the same coordinate using various formats:
FMT                     Output
-------------------     ------------------
"%d deg %d' %.2f"\"     54 deg 59' 22.80"     (default for reading)
"%d %d %.8f"            54 59 22.80000000     (default for copying)
"%d deg %.4f min"       54 deg 59.3800 min
"%.6f degrees"          54.989667 degrees
And regarding "There are a few free web tools available but they don't provide an API"—geoapify.com offers a free web tool but also an API. Their API is free for up to three thousand requests per day. Their web service does five hundred at a time.
I have a Makefile which looks roughly like this:
FIGURES = A1_B1_C1.eps A2_B2_C2.eps A3_B3_C3.eps
NUMBERS = 1 2 3
all : $(FIGURES)
%.eps : $(foreach num, $(NUMBERS), $(subst B, $(num), %).out)
	# my_program($+, $@);
%.out :
The point is that the file names of my figures contain certain information (A, B, C) and that each figure is created by my_program from several files (three in the example).
While the filename of each figure has the format Ax_Bx_Cx.eps, the names of the data files to create the figures from look like this:
Ax_1x_Cx.out
Ax_2x_Cx.out
Ax_3x_Cx.out
So for each figure, I need a dynamically created dependency list with several file names. In other words, my desired output for the example above would be:
# my_program(A1_11_C1.out A1_21_C1.out A1_31_C1.out, A1_B1_C1.eps);
# my_program(A2_12_C2.out A2_22_C2.out A2_32_C2.out, A2_B2_C2.eps);
# my_program(A3_13_C3.out A3_23_C3.out A3_33_C3.out, A3_B3_C3.eps);
Unfortunately, the subst command seems to be ignored, for the output looks like this:
# my_program(A1_B1_C1.out A1_B1_C1.out A1_B1_C1.out, A1_B1_C1.eps);
# my_program(A2_B2_C2.out A2_B2_C2.out A2_B2_C2.out, A2_B2_C2.eps);
# my_program(A3_B3_C3.out A3_B3_C3.out A3_B3_C3.out, A3_B3_C3.eps);
I had a look at this possible duplicate but figured that the answer cannot help me, since I am using % and not $@, which should be OK in the prerequisites.
Clearly I am getting something wrong here. Any help is greatly appreciated.
To do fancy prerequisite manipulations you need at least make-3.82, which supports the secondary expansion feature:
FIGURES = A1_B1_C1.eps A2_B2_C2.eps A3_B3_C3.eps
NUMBERS = 1 2 3
all : $(FIGURES)
.SECONDEXPANSION:
# Prerequisite lists are expanded a second time after the stem is matched,
# so the $$-escaped foreach/subst run with $$* (the stem) available.
$(FIGURES) : %.eps : $$(foreach num,$$(NUMBERS),$$(subst B,$$(num),$$*).out)
	@echo "my_program($+, $@)"

%.out :
	touch $@
Output:
$ make
touch A1_11_C1.out
touch A1_21_C1.out
touch A1_31_C1.out
my_program(A1_11_C1.out A1_21_C1.out A1_31_C1.out, A1_B1_C1.eps)
touch A2_12_C2.out
touch A2_22_C2.out
touch A2_32_C2.out
my_program(A2_12_C2.out A2_22_C2.out A2_32_C2.out, A2_B2_C2.eps)
touch A3_13_C3.out
touch A3_23_C3.out
touch A3_33_C3.out
my_program(A3_13_C3.out A3_23_C3.out A3_33_C3.out, A3_B3_C3.eps)
I have many references in Referencer, and I'm trying to include filenames in my bibtex file when exporting from Referencer. Since the software doesn't do this by default, I'm trying to use a sed command to add the filename as an extra bibtex field in the XML file before I export, so that the exported file includes it.
Input
<doc>
<filename>file:///home/dwickrama/Desktop/stevenJonesLab/papers/Transcription%20Factor%20Binding/A%20Common%20Nuclear%20Signal%20Transduction%20Pathway%20Activated%20by%20Growth%20Factor%20and%20Cytokine.pdf</filename>
<relative_filename>A%20Common%20Nuclear%20Signal%20Transduction%20Pathway%20Activated%20by%20Growth%20Factor%20and%20Cytokine.pdf</relative_filename>
<key>Sadowski93</key>
<notes></notes>
<bib_type>article</bib_type>
<bib_doi></bib_doi>
<bib_title>A common nuclear signal transduction pathway activated by growth factor and cytokine receptors.</bib_title>
<bib_authors>Sadowski, H B and Shuai, K and Darnell, J E and Gilman, M Z</bib_authors>
<bib_journal>Science</bib_journal>
<bib_volume>261</bib_volume>
<bib_number>5129</bib_number>
<bib_pages>1739-44</bib_pages>
<bib_year>1993</bib_year>
<bib_extra key="pmid">8397445</bib_extra>
Output
<doc>
<filename>file:///home/dwickrama/Desktop/stevenJonesLab/papers/Transcription%20Factor%20Binding/A%20Common%20Nuclear%20Signal%20Transduction%20Pathway%20Activated%20by%20Growth%20Factor%20and%20Cytokine.pdf</filename>
<bib_extra key="File">article:../Transcription\ Factor\ Binding/A\ Common\ Nuclear\ Signal\ Transduction\ Pathway\ Activated\ by\ Growth\ Factor\ and\ Cytokine.pdf:pdf</bib_extra>
<relative_filename>A%20Common%20Nuclear%20Signal%20Transduction%20Pathway%20Activated%20by%20Growth%20Factor%20and%20Cytokine.pdf</relative_filename>
<key>Sadowski93</key>
<notes></notes>
<bib_type>article</bib_type>
<bib_doi></bib_doi>
<bib_title>A common nuclear signal transduction pathway activated by growth factor and cytokine receptors.</bib_title>
<bib_authors>Sadowski, H B and Shuai, K and Darnell, J E and Gilman, M Z</bib_authors>
<bib_journal>Science</bib_journal>
<bib_volume>261</bib_volume>
<bib_number>5129</bib_number>
<bib_pages>1739-44</bib_pages>
<bib_year>1993</bib_year>
<bib_extra key="pmid">8397445</bib_extra>
I can use the following sed command to partially do what I want, but the URL-encoded "%20" remains. How do I get rid of it in only the bibtex tag?
sed -e 's/\(\ \ \ \ <filename>file:\/\/\/home\/dwickrama\/Desktop\/stevenJonesLab\/papers\)\([^.]*\)\(\.\?\)\(.*\)\(<\/filename>\)/\1\2\3\4\5\n\ \ \ \ <bib_extra\ key=\"File\">article:\.\.\2\3\4:\4<\/bib_extra>/g' NewPapers.reflib > NewPapers.new.reflib
Regex and sed are not very good tools for processing XML or for URL-decoding.
A quick script in a more complete scripting language would be able to do it more clearly and reliably. For example, in Python:
# Python 2 (urllib.unquote and the urlparse module moved in Python 3)
import urllib, urlparse
from xml.dom import minidom

doc = minidom.parse('NewPapers.reflib')

# Pull the URL path out of the <filename> element and URL-decode the last
# two path components (folder name and file name).
el = doc.getElementsByTagName('filename')[0]
path = urlparse.urlparse(el.firstChild.data)[2]
foldername, filename = map(urllib.unquote, path.split('/')[-2:])

# Build <bib_extra key="File">...</bib_extra> and insert it after <filename>.
extra = doc.createElement('bib_extra')
extra.setAttribute('key', 'File')
extra.appendChild(doc.createTextNode('article:../%s/%s:pdf' % (foldername, filename)))
el.parentNode.insertBefore(extra, el.nextSibling)

doc.writexml(open('NewPapers.new.reflib', 'w'))
(I haven't included a function to reproduce the backslash-escaping in the given example output, as it's not entirely clear exactly what format that is. The simplest approach would be filename = filename.replace(' ', '\\ '), but I'm not sure that would be correct.)
All you need is to add a line after it, right? So just print it out after the match:
#!/bin/bash
# The backslashes are doubled because awk -v processes escape sequences
# in the assigned value, turning \\ back into \.
s='<bib_extra key="File">article:../Transcription\\ Factor\\ Binding/A\\ Common\\ Nuclear\\ Signal\\ Transduction\\ Pathway\\ Activated\\ by\\ Growth\\ Factor\\ and\\ Cytokine.pdf:pdf</bib_extra>'

awk -v str="$s" '
    /<filename>/ {
        print
        print str
        next
    }
    { print }
' file
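If the goal is only to strip the %20s inside the new tag, a plain sed pass over the result is enough; a sketch, assuming %20 is the only escape present (as in the sample):
# Decode %20 to a backslash-escaped space, only on the bib_extra "File" lines.
sed '/<bib_extra key="File">/ s/%20/\\ /g' NewPapers.new.reflib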