I’ve got a Pandoc (v1.19.2.1) HTML5 template that I’m loading from the default --data-dir. Within the template I need to load external resources, such as stylesheets and JavaScript. I’d like to load those resources relative to the path of the template, not the working directory or the source file. For example, on macOS, in ~/.pandoc/templates/hierarchical/hierarchical.html:
…
<link rel="stylesheet" href="hierarchical.css">
…
where hierarchical.css is located at ~/.pandoc/templates/hierarchical/hierarchical.css, in the same directory as the template itself.
Then invoked from the command line:
pandoc \
--from=markdown_strict+header_attributes+yaml_metadata_block+pipe_tables \
--to=html5 \
--self-contained \
--template="hierarchical/template.html" \
--section-divs \
--output="$1.html" \
--toc \
--toc-depth=6 \
"$1.md"
I get the error:
pandoc: Could not fetch hierarchical.css
hierarchical.css: openBinaryFile: does not exist (No such file or directory)
I’ve tried various other relative paths to the CSS file. The only thing that works is the absolute path /Users/jmakeig/.pandoc/templates/hierarchical/hierarchical.css, which, of course, will only work on my laptop.
Is there any way to resolve external resources in Pandoc templates relative to the template itself, so that the templates are portable? I don’t see an obvious external variable that I could use in my template or a command line option.
I'm pasting the workaround I gave on the GitHub issue I created a while ago.
Coming back to the topic more than two years after I created that issue, I've found a not-so-bad workaround.
Apparently, LaTeX uses the environment variable TEXINPUTS as a sort of PATH for resources. So you can configure an environment variable once on your system (Linux, Windows, wherever) and refer to resources relative to that path.
This link provides some explanation about how to use it:
https://tex.stackexchange.com/questions/93712/definition-of-the-texinputs-variable
For example I have the following files:
SOME_PATH_TO/templates/my_latex_template.tex
SOME_PATH_TO/templates/img/my_img.png
In my system I set the environment variable (example with Windows, although I actually just save it under the system config):
set TEXINPUTS=SOME_PATH_TO/templates/
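On Linux or macOS the equivalent would be (a sketch, assuming a bash/zsh shell; the trailing colon keeps TeX's default search path and the double slash makes the search recursive):
export TEXINPUTS="SOME_PATH_TO/templates//:"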
In the template my_latex_template.tex I have something like:
%...
\includegraphics{img/my_img.png}
%...
And I call the template like so:
pandoc file.txt -t pdf --template=SOME_PATH_TO/templates/my_latex_template.tex --output=output.pdf
I am using the moderncv class to create a CV in Rmarkdown. In order to make the cv reproducible out of the box I have included the .cls and .sty files in the root directory. However, in an effort to keep the root directory uncluttered I would prefer to keep all the moderncv related files in a subdirectory (assets/tex/). I am able to access the .cls file using a relative path in the yaml front matter, but I am not able to access the .sty files unless they are in the root directory.
Searching previous questions on Stack Overflow I learned the following: (1) keeping .cls and .sty files in nested directories is not recommended. I understand this and would like to do it anyway, so that other people can fork my project and knit the CV without having to find their texmf folder. (2) The solution to my problem seems to involve setting TEXINPUTS using a Makefile (see this thread and another thread).
I am not very good with Makefiles, but I have managed to get one working that will knit my .Rmd file to pdf without problems, so long as the .sty files are still in root. This is what it looks like currently:
PDF_FILE=my_cv.pdf

all : $(PDF_FILE)
	echo All files are now up to date

clean :
	rm -f $(PDF_FILE)

%.pdf : %.Rmd
	Rscript -e 'rmarkdown::render("$<")'
My understanding is that I can set the TEXINPUTS using:
export TEXINPUTS=".:./assets/tex:"
Where "assets/tex" represents the subdirectory where the .sty files are located. I do not know how to incorporate the above code into my makefile so that the .sty files are recognized in the subdirectories and my .Rmd is knit to PDF. In its current state, I get the following error if I remove the .sty files from root and put then in the aforementioned subdirectory:
! LaTeX Error: Command \fax already defined.
Or name \end... illegal, see p.192 of the manual.
which I assume is occurring because the moderncv class needs---and cannot locate---the relevant .sty files.
You could try to define the environment variable in the make rule (note that each recipe line runs in its own shell, so the export and the render command have to share a line):
%.pdf : %.Rmd
	export TEXINPUTS=".:./assets/tex:"; Rscript -e 'rmarkdown::render("$<")'
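An untested variant is to prefix the variable to the command, so it applies only to that one invocation, whether typed in a shell or used as the recipe line (my_cv.Rmd here is just a stand-in for the file make passes in via $<):
TEXINPUTS=".:./assets/tex:" Rscript -e 'rmarkdown::render("my_cv.Rmd")'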
Or you could set the environment variable in a set-up chunk in your Rmd file:
```{r setup, include = FALSE}
Sys.setenv(TEXINPUTS=".:./assets/tex:")
```
Note: Not tested due to lack of minimal example.
My goal is to have an empty Hugo application so that, using scripts, I can store the directories of md files (or just the md files) in an external directory one level above.
Yes - you can use the contentDir option in your config file, or pass the -c or --contentDir flags to Hugo on the command line.
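For example (a sketch; ../content is a stand-in for your external directory one level above the project):
hugo --contentDir ../content
The same setting can live in the site configuration instead, e.g. contentDir = "../content" in config.toml.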
I'm having trouble understanding Laravel 5's elixir pathing. In my project, I have multiple css files (bootstrap, plugins, theme etc) and javascript files stored under:
resources/assets/css/<my css files>
resources/assets/js/<my javascript files>
I simply want to combine and version all styles and scripts and place them in my public directory. I believe the default output directory is:
public/build/css/app-xxxxxxxx.css
public/build/js/app-xxxxxxxx.js
(Where xxxxxxxx is the checksum using the version method)
What should my gulpfile.js look like to achieve this? Thanks!
You can use the full path in the name, or set the third parameter as the default path. Examples (works with scripts or CSS):
mix.stylesIn('resources/assets/css', 'public/css/all.css');
Since there is a bug where you can't use the output of a somethingAll to concatenate with something else, I use this instead (note the wildcard):
mix.scripts(['maskedinput.js',
'blockui.js',
'user/*.js'],
'public/js/user.js', 'resources/assets/js/');
First parameter is the input files, second is the output file, third is the input's default path.
To version, just call it on the file path:
mix.version("public/js/user.js");
I recently discovered Jade and want to give it a try for a new static website. I like the terse syntax and the templating capabilities, so much better than raw HTML. I'm editing in Webstorm 6, which has support for file watchers, and can run e.g. Sass out of the box. I've been able to run Jade via the command line to watch my Jade files:
jade --watch --out public jade
I'm now trying to configure my project in Webstorm to handle this automatically, and I'm running into problems.
To keep the source files separate from the generated ones, I'm aiming for a layout like this:
root
  jade
    index.jade
    subdir
      subdir.jade
  public
    index.html
    subdir
      subdir.html
With the Arguments field set as:
--out $ProjectFileDir$\public\$FileNameWithoutExtension$.html $FileDir$\$FileName$
To start with, I have the following within my jade folder:
index.jade
subdir
  subdir.jade
The result in my public folder is:
index.html (folder)
  index.html (file)
subdir.html (folder)
  subdir.html (file)
This is the first time I've tried to use the file watcher feature, and the available macros are confusing me. Has anyone with experience in a similar situation any suggestions?
The jade --out option specifies the output directory, not the file:
-O, --out <dir> output the compiled html to <dir>
To retain the directory structure you will have to use the $FileDirPathFromParent$ macro, which takes a parameter.
For example, for the C:\project\jade\subdir\subdir.jade file we need it to return the path below the jade directory, so that directory name is the parameter for the macro: $FileDirPathFromParent(jade)$, and the result would be subdir.
Now if you set the Working directory to $FileDir$, the Arguments would be:
$FileName$ --out $ProjectFileDir$\public\$FileDirPathFromParent(jade)$
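As a concrete example (a sketch assuming the project root is C:\project), for root\jade\subdir\subdir.jade the watcher would run:
jade subdir.jade --out C:\project\public\subdir
which writes C:\project\public\subdir\subdir.html and so preserves the directory structure.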
The complete Jade File Watcher for this specific project layout combines the settings above: program jade, Working directory $FileDir$, and the Arguments shown.
How do I export and import images from and into a MediaWiki?
Terminal solutions
A MediaWiki administrator can perform maintenance tasks at the server's terminal using the Maintenance scripts framework. Recent MediaWiki versions ship all the standard scripts used in the tasks described below, but old versions have some bugs or lack some of the modern scripts; check the version number with grep wgVersion includes/DefaultSettings.php.
Note: all the scripts cited below also have a --help option, for instance php maintenance/importImages.php --help
Original image folder
Users upload files through the Special:Upload page; administrators can configure the allowed file types through an extension whitelist. Once uploaded, files are stored in a folder on the file system, and thumbnails in a dedicated thumb directory.
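For orientation, the resulting layout looks roughly like this (a sketch; the hashed subdirectory names vary per file):
images/
  a/ab/MyFig.png
  archive/a/ab/20160627184943!MyFig.png
  thumb/a/ab/MyFig.png/120px-MyFig.png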
MediaWiki's images folder can be zipped with the zip -r ~/Mediafiles.zip images command, but this zip is not so good:
- there are a lot of spurious files: "deleted files" and "old files" (not the current ones) with filenames like 20160627184943!MyFig.png, and thumbnails like MyFig.png/120px-MyFig.jpg.
- for data-interchange or long-term preservation purposes it is invalid... The ugly images/?/??/* folder layout is not suitable, as opposed to the usual "all image files in one folder".
Images export/import
For "Exporting and Importing" all current images in one folder at MediaWiki server's terminal, there are a step-by-step single procedure.
Step-1: generate the image dumps using dumpUploads (with the --local or --shared option when needed for preservation), which creates a text list of all image filenames in use.
mkdir /tmp/workingBackupMediaFiles
php maintenance/dumpUploads.php \
| sed 's~mwstore://local-backend/local-public~./images~' \
| xargs cp -t /tmp/workingBackupMediaFiles
zip -r ~/Mediafiles.zip /tmp/workingBackupMediaFiles
rm -r /tmp/workingBackupMediaFiles
The command results in a standard zip file of your image backup folder, Mediafiles.zip, in your home directory (~/).
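As an optional sanity check, you can list what was archived:
unzip -l ~/Mediafiles.zip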
NOTE: if you are not worried about the ugly folder structure, a more direct way is
php maintenance/dumpUploads.php \
| sed 's~mwstore://local-backend/local-public~./images~' \
| zip ~/Mediafiles.zip -@
Depending on the MediaWiki version, the --base=./ option will work fine and you can remove the sed command from the pipe.
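With such a version the pipe becomes (a sketch, untested):
php maintenance/dumpUploads.php --base=./ \
| zip ~/Mediafiles.zip -@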
Step-2: need a backup? installing a copy of the images? ... you only need Mediafiles.zip and a MediaWiki installation with no contents... If the wiki has contents, check for filename conflicts (!). Another issue is the configuration of file formats and permissions, which must be the same or broader in the new wiki; see Manual:Configuring file uploads.
Step-3: restore the dumps (into the new wiki) with the maintenance tools. Supposing that you used step-1 to export and preserve everything in a zip file:
unzip ~/Mediafiles.zip -d /tmp/workingBackupMediaFiles
php maintenance/importImages.php /tmp/workingBackupMediaFiles
rm -r /tmp/workingBackupMediaFiles
php maintenance/update.php
php maintenance/rebuildall.php
That is all. Check by navigating to your new wiki's Special:NewFiles.
The full export or preservation
For exporting "ALL images and ALL articles" of your old MediaWiki, for full backup or content preservation. Add some procedures at each step:
Step-1: ... see step-1 above... and, to generate the text-content dump from the old wiki:
php maintenance/dumpBackup.php --full | gzip > ~/dumpContent.xml.gz
Note: instead of --full you can use the --current option.
Step-2: ... you need dumpContent.xml.gz and Mediafiles.zip from the old wiki. Suppose both files are in your ~ folder.
Step-3: run in your new Wiki
unzip ~/Mediafiles.zip -d /tmp/workingBackupMediaFiles
gunzip -c ~/dumpContent.xml.gz \
| php maintenance/importDump.php --no-updates \
--image-base-path=/tmp/workingBackupMediaFiles
rm -r /tmp/workingBackupMediaFiles
php maintenance/update.php
php maintenance/rebuildall.php
That is all. Check also Special:AllPages of the new Wiki.
There is no automatic way to export images the way you export pages; you have to right-click on each one and choose "Save image". To get the history of the image page, use the Special:Export page.
To import images use the Special:Upload page on your wiki. If you have lots of them, you can use the Import Images script. Note: you generally have to be in the sysop group to upload images.
- Export ALL:
You can get all pages and all images from a MediaWiki site using the [API], even if you are not the owner of the site (of course only when the owner hasn't disabled this function):
Step 1: Use the API to get all page titles and all image URLs. You can write some code to do it automatically (a sketch follows below).
Step 2: Next, use [Special:Export] to export all pages with the titles you got, and use wget to download the images whose links you collected (like this: wget -i img-list.txt).
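A minimal sketch of Step 1 with curl (example.org stands in for the target wiki; large wikis require paging through the results with the API continue parameters):
curl "https://example.org/w/api.php?action=query&list=allpages&aplimit=500&format=json"
curl "https://example.org/w/api.php?action=query&list=allimages&aiprop=url&ailimit=500&format=json"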
- Import ALL:
Step 1: Import pages using [Special:Import]
Step 2: Import images using [Manual:ImportImages.php].
There are a few mass upload tools available.
Commonist - www.djini.de/software/commonist/
Commonplace - commons.wikimedia.org/wiki/Commons:Tools/Commonplace (used to be available, but it was deprecated as of Jan. 13, 2010)
Both run on the desktop and can be configured to upload to your local wiki (they are configured for Wikipedia and Wikimedia Commons by default). If you are afraid to edit the content of a .jar file, I suggest you start with Commonplace.
Another useful extension exists for MediaWiki itself.
MultiUpload - http://www.mediawiki.org/wiki/Extension:MultiUpload
This extension allows you to drop images in a folder and load them all at once. It supports annotations for each file if necessary and cleans up the folder once it is done. On the downside, it requires opening a shared folder on the server side.
Hope this helps a bit: http://www.mediawiki.org/wiki/Manual:ImportImages.php
As a committer of MediaWiki-Japi I'd like to point out:
For the use case of pushing pages, including images, from one wiki to another, MediaWiki-Japi now has a command-line mode; see
Issue 49 - Enable commandline interface with page transfer option
Otherwise you can use the MediaWiki API with the language of your choice and use the functions as you find them in PushPages.java, e.g. download and upload.