How to avoid "testem ci" hanging in the CI server - jasmine

I am testing my vanilla JavaScript application locally with an HTML test page that loads the standalone Jasmine library, and in the CI I am using testem to run the same HTML test page in a headless browser.
However, when I run testem ci, it hangs and I have to cancel the build manually in the CI. It hangs when I run it locally, too.
I've gone through the testem repo's issues and several answers on SO and tried everything I found. I must have something wrong in my configuration, but I can't spot it.
Relevant versions:
node -v
v16.1.0
npm -v
8.5.4
node node_modules/testem/testem.js -V
3.6.0
Jasmine version is 4.0.1.
Contents of testem.json:
{
  "test_page": "./app/js/index.html",
  "launch_in_ci": [
    "Headless Chrome",
    "Headless Firefox"
  ],
  "launchers": {
    "Headless Chrome": {
      "exe": "/usr/bin/google-chrome",
      "args": [
        "--headless",
        "--no-sandbox",
        "--disable-gpu",
        "--remote-debugging-port=9222",
        "--remote-debugging-address=0.0.0.0",
        "--no-sandbox"
      ]
    },
    "Headless Firefox": {
      "exe": "/usr/bin/firefox",
      "args": [
        "--headless",
        "--no-sandbox",
        "--disable-gpu",
        "--remote-debugging-port=9222",
        "--remote-debugging-address=0.0.0.0",
        "--no-sandbox"
      ]
    }
  },
  "browser_args": {
    "Headless Chrome": [
      "--headless",
      "--disable-gpu",
      "--remote-debugging-port=9222",
      "--remote-debugging-address=0.0.0.0",
      "--no-sandbox"
    ],
    "Headless Firefox": [
      "--headless",
      "--disable-gpu",
      "--remote-debugging-port=9222",
      "--remote-debugging-address=0.0.0.0",
      "--no-sandbox"
    ]
  },
  "browser_disconnect_timeout": 60
}
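For comparison, a leaner configuration using testem's built-in launchers is sketched below. It assumes google-chrome and firefox are resolvable on the PATH, so the custom launchers block (and the flag duplication between launchers.args and browser_args) can be dropped:
{
  "test_page": "./app/js/index.html",
  "launch_in_ci": ["Chrome", "Firefox"],
  "browser_args": {
    "Chrome": ["--headless", "--disable-gpu", "--no-sandbox", "--remote-debugging-port=9222"],
    "Firefox": ["--headless"]
  }
}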
Contents of .gitlab-ci.yml:
image: node:latest

cache:
  paths:
    - node_modules/

before_script:
  # Download Chrome
  - sh -c 'echo "deb [arch=amd64] http://dl.google.com/linux/chrome/deb/ stable main" > /etc/apt/sources.list.d/google.list'
  - wget -q -O - https://dl-ssl.google.com/linux/linux_signing_key.pub | apt-key add -
  # Download Firefox
  - apt update -y && apt install software-properties-common -y
  - apt-key adv --keyserver keyserver.ubuntu.com --recv-keys A6DCF7707EBC211F
  - apt-add-repository "deb http://ppa.launchpad.net/ubuntu-mozilla-security/ppa/ubuntu bionic main"
  # Install both
  - apt update -y && apt install google-chrome-stable firefox unzip -y
  # Download the Jasmine lib folder
  - mkdir tmp && cd tmp
  - curl -L 'https://github.com/jasmine/jasmine/releases/download/v4.0.1/jasmine-standalone-4.0.1.zip' --output jasmine.zip
  - unzip jasmine.zip && cp -r lib ../app/js && cd - && rm -rf tmp

test_async:
  script:
    - npm install
    - node node_modules/testem/testem.js ci
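As a stop-gap while debugging, the test command can be wrapped in coreutils timeout so a hung run fails the job instead of blocking it forever (the 300-second limit below is an arbitrary choice):
    - timeout 300 node node_modules/testem/testem.js ci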
Relevant parts of my app structure:
tree
.
├── app
│   └── js
│       ├── lib
│       │   └── jasmine-4.0.1
│       │       ├── boot0.js
│       │       ├── boot1.js
│       │       ├── jasmine.css
│       │       ├── jasmine_favicon.png
│       │       ├── jasmine-html.js
│       │       └── jasmine.js
│       ├── spec
│       │   └── example-spec.js
│       ├── src
│       │   ├── app.js
│       │   └── example.js
│       └── index.html
├── node_modules
├── .gitlab-ci.yml
├── package.json
├── package-lock.json
└── testem.json
Contents of app/js/index.html:
<!DOCTYPE html>
<html>
  <head>
    <meta charset="utf-8">
    <title>Jasmine Spec Runner v4.0.1</title>
    <link rel="shortcut icon" type="image/png" href="lib/jasmine-4.0.1/jasmine_favicon.png">
    <link rel="stylesheet" href="lib/jasmine-4.0.1/jasmine.css">
    <script src="lib/jasmine-4.0.1/jasmine.js"></script>
    <script src="lib/jasmine-4.0.1/jasmine-html.js"></script>
    <script src="lib/jasmine-4.0.1/boot0.js"></script>
    <script src="lib/jasmine-4.0.1/boot1.js"></script>
    <script src="src/example.js"></script>
    <script src="spec/example-spec.js"></script>
  </head>
  <body>
  </body>
</html>
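One detail worth checking, since it is a classic cause of exactly this hang: testem only learns that a run has finished when the test page loads its adapter script, which the testem server serves at /testem.js. A plain standalone Jasmine runner page never reports results back, so testem ci waits indefinitely. A minimal sketch of the change, assuming the page is served by testem itself:
    <script src="spec/example-spec.js"></script>
    <script src="/testem.js"></script>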
What can I do to stop testem ci from hanging in the CI server?

Related

Ansible - Importing group_vars all.yml from a playbook

I've got a directory structure like this:
.
├── inventories
│   └── production
│       ├── ansible.cfg
│       ├── group_vars
│       │   ├── all.yml
│       │   └── gitlab.yml
│       ├── hosts
│       └── host_vars
├── playbooks
│   └── gitlab.yml
└── roles
    └── gitlab
        ├── defaults
        │   └── main.yml
        ├── handlers
        │   ├── gitlab.yml
        │   └── main.yml
        ├── meta
        │   └── main.yml
        ├── README.md
        ├── tasks
        │   ├── configure.yml
        │   ├── conf_integrity.yml
        │   ├── hook.yml
        │   ├── install.yml
        │   └── main.yml
        ├── templates
        │   ├── certificates
        │   ├── gitlab.rb.j2
        │   ├── post-receive-gdys.sh.j2
        │   ├── post-receive-mys.sh.j2
        │   ├── post-receive-no-backup-gdys.sh.j2
        │   ├── post-receive-no-backup-mys.sh.j2
        │   ├── post-receive.sh.j2
        │   ├── ssl-crt.j2
        │   └── ssl-key.j2
        └── vars
            ├── conf_list.yml
            ├── hook.yml
            ├── main.yml
            └── package.yml
I want to import the all.yml vars to all of my hosts (in any role) for the production inventory, and import group/role-specific vars (like gitlab.yml) only into the relevant roles. How can I do this? What should the content of my gitlab.yml playbook be? In my current setup, Ansible can't import group_vars/all.yml into my playbook's tasks.
inventories/production/hosts:
[ansible]
[gitlab]
gitlab.zek.local
inventories/production/group_vars/all.yml:
gitlab_settings:
  Fqdn: "git.zek.local"
  Rails_shell_ssh_port: "22"
  Use_self_signed_certs: "yes"
  Backup:
    enabled: "no"
    Server: git02.zek.local
    Port: 22
ssh_settings:
  Port: "22"
  PasswordAuthentication: "no"
roles/gitlab/tasks/configure.yml:
- name: Generating ssl cert for GitLab
  command: >
    openssl req -x509 -nodes -subj '/CN={{ gitlab_settings["Fqdn"] }}' -days 365
    -newkey rsa:4096 -sha256 -keyout /etc/gitlab/{{ gitlab_settings["Fqdn"] }}.key -out /etc/gitlab/ssl/{{ gitlab_settings["Fqdn"] }}.crt
    creates=/etc/gitlab/ssl/{{ gitlab_settings["Fqdn"] }}.crt
  when: gitlab_settings['Use_self_signed_certs'] == "yes"
  notify:
    - GitLab servisi yeniden baslatiliyor
  sudo: yes
  tags: ssl
playbooks/gitlab.yml:
---
- hosts: gitlab
  remote_user: zek
  sudo: yes
  vars_files:
    - ../roles/gitlab/vars/package.yml
    - ../roles/gitlab/vars/hook.yml
    - ../roles/gitlab/vars/conf_list.yml
  roles:
    - { role: gitlab }
The command I use to run the playbook:
ansible-playbook -i inventories/production playbooks/gitlab.yml --flush-cache
Use a separate inventory:
inventories/production
inventories/global
In global, put your global vars: inventories/global/group_vars/all.yml
Execute your playbook with the two inventories:
ansible-playbook -i inventories/production -i inventories/global playbooks/gitlab.yml
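A minimal sketch of the added inventory layout (the empty hosts file is there so Ansible accepts the directory as an inventory source; the names are illustrative):
inventories/global
├── hosts            # can be empty
└── group_vars
    └── all.yml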

Use drush-patchfile in DDEV environment

In Drupal 7 I use
drush-patchfile
to automatically apply patches when installing/updating modules via drush. But in DDEV I don't know how to extend the existing drush with drush-patchfile.
As described in the Installation section at https://bitbucket.org/davereid/drush-patchfile, I need to clone the repository into the
~/.drush
directory, and that will add it to the existing drush.
On another project without DDEV, I've already done that with creating new docker image file
FROM wodby/drupal-php:7.1
USER root
RUN mkdir -p /home/www-data/.drush && chown -R www-data:www-data /home/www-data/;
RUN cd /home/www-data/.drush && git clone https://bitbucket.org/davereid/drush-patchfile.git \
&& echo "<?php \$options['patch-file'] = '/home/www-data/patches/patches.make';" \
> /home/www-data/.drush/drushrc.php;
USER wodby
But I'm not sure how to do that in DDEV container.
Do I need to create a new service based on drud/ddev-webserver or something else?
I've read the documentation but I'm not sure which direction to go in.
Based on @rfay's comment, here is the solution that works for me (and which, with little modification, can work for other projects).
I've cloned the repo outside of the docker container; for example, into
$PROJECT_ROOT/docker/drush-patchfile
Create a custom drushrc.php in the $PROJECT_ROOT/.esenca/patches folder (you can choose a different folder):
<?php
# Location of the patches.make file. This should be the location within the docker container.
$options['patch-file'] = '/var/www/html/.esenca/patches/patches.make';
Add the following hooks to $PROJECT_ROOT/.ddev/config.yaml:
hooks:
  post-start:
    # Symlink the drush-patchfile directory into /home/.drush
    - exec: "ln -s -t /home/.drush/ /var/www/html/docker/drush-patchfile"
    # Symlink the custom drushrc file.
    - exec: "ln -s -t /home/.drush/ /var/www/html/.esenca/patches/drushrc.php"
The final project structure should look like this:
.
├── .ddev
│   ├── config.yaml
│   ├── docker-compose.yaml
│   ├── .gitignore
│   └── import-db
├── docker
│   └── drush-patchfile
│       ├── composer.json
│       ├── patchfile.drush.inc
│       ├── README.md
│       └── src
├── .esenca
│   └── patches
│       ├── drushrc.php
│       └── patches.make
├── public_html
│   ├── authorize.php
│   ├── CHANGELOG.txt
│   ├── COPYRIGHT.txt
│   ├── cron.php
│   ├── includes
│   ├── index.html
│   ├── index.php
│   ├── INSTALL.mysql.txt
│   ├── INSTALL.pgsql.txt
│   ├── install.php
│   ├── INSTALL.sqlite.txt
│   ├── INSTALL.txt
│   ├── LICENSE.txt
│   ├── MAINTAINERS.txt
│   ├── misc
│   ├── modules
│   ├── profiles
│   ├── README.txt
│   ├── robots.txt
│   ├── scripts
│   ├── sites
│   │   ├── all
│   │   ├── default
│   │   ├── example.sites.php
│   │   └── README.txt
│   ├── themes
│   ├── Under-Construction.gif
│   ├── update.php
│   ├── UPGRADE.txt
│   ├── web.config
│   └── xmlrpc.php
└── README.md
Finally, start the ddev environment:
ddev start
and now you can use the drush-patchfile commands within the web docker container.
You can ddev ssh and then sudo chown -R $(id -u) ~/.drush/ and then do whatever you want in that directory (~/.drush is /home/.drush).
When you get it going and you want to do it repetitively for every start, you can encode the instructions you need using post-start hooks: https://ddev.readthedocs.io/en/latest/users/extending-commands/
Please follow up with the exact recipe you use, as it may help others. Thanks!
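For example, a minimal sketch of a post-start hook that re-applies that ownership fix on every start (the exec runs inside the web container, where sudo is available in ddev's stock image):
hooks:
  post-start:
    - exec: "sudo chown -R $(id -u) /home/.drush/"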

How to rebuild plugins automatically in NativeScript project?

I'm trying to develop a plugin for a sample NativeScript app, created with:
tns create MyTnsApp --tsc
I created a plugin for it with nativescript-plugin-seed (directory tree below).
I put a console.log('Hello World!') into the plugin and called it from the main app. It works when I run the app.
While the application is running, if I change something in the plugin, the CLI detects it and starts an incremental compilation. However, when I terminate the run and run the app again, the changes are not visible.
I have to remove the plugin with tns and add it again.
Is there a way to improve this process without removing/re-adding the plugin?
Regards,
roncsak
├── app
│   ├── App_Resources
│   ├── LICENSE
│   ├── README.md
│   ├── app.css
│   ├── app.js
│   ├── app.ts
│   ├── bundle-config.js
│   ├── bundle-config.ts
│   ├── main-page.js
│   ├── main-page.ts
│   ├── main-page.xml
│   ├── main-view-model.js
│   ├── main-view-model.ts
│   └── package.json
├── hooks
│   └── ...
├── node_modules
│   └── ...
├── package.json
├── platforms
│   └── ios
├── plugins
│   └── nativescript-myplugin
│       └── src
│           ├── index.js
│           ├── index.ts
│           ├── myplugin.js
│           ├── myplugin.ts
│           ├── package.json
│           ├── platforms
│           │   └── ios
│           └── tsconfig.json
└── tsconfig.json
./package.json
{
  "description": "NativeScript Application",
  "license": "SEE LICENSE IN <your-license-filename>",
  "readme": "NativeScript Application",
  "repository": "<fill-your-repository-here>",
  "nativescript": {
    "id": "org.nativescript.MyTnsApp",
    "tns-ios": {
      "version": "3.4.1"
    }
  },
  "dependencies": {
    "nativescript-myplugin": "file:plugins/nativescript-myplugin/src",
    "nativescript-theme-core": "~1.0.4",
    "tns-core-modules": "~3.4.0"
  },
  "devDependencies": {
    "nativescript-dev-typescript": "~0.6.0",
    "typescript": "~2.4.2"
  }
}
./plugins/nativescript-myplugin/src/package.json
{
  "name": "nativescript-myplugin",
  "version": "0.0.1",
  "description": "Your awesome NativeScript plugin.",
  "main": "index.js",
  "nativescript": {
    "platforms": {
      "ios": "3.4.1"
    }
  },
  "devDependencies": {
    "tns-core-modules": "~3.4.0",
    "tns-platform-declarations": "^3.4.1"
  }
}
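Since the working workaround is a remove/add cycle, one option is to script it so it becomes a single command. A sketch (the script name refresh-plugin is made up; the paths match the tree above) to add to the root package.json:
"scripts": {
  "refresh-plugin": "tns plugin remove nativescript-myplugin && tns plugin add plugins/nativescript-myplugin/src"
}
With that in place, npm run refresh-plugin replaces typing both tns commands between app runs.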

No errors in log, but app-error in browser

I'm learning Heroku. I tried to deploy a Node.js app on Heroku. The build seems to succeed, but the browser shows an application error.
https://warm-coast-28316.herokuapp.com/
Why?
Here is the log. Is there anything suspicious?
-----> Node.js app detected
-----> Creating runtime environment
NPM_CONFIG_LOGLEVEL=error
NPM_CONFIG_PRODUCTION=true
NODE_ENV=production
NODE_MODULES_CACHE=true
-----> Installing binaries
engines.node (package.json): >=0.12.7 <0.13
engines.npm (package.json): ~2.11.3
Resolving node version >=0.12.7 <0.13 via semver.io...
Downloading and installing node 0.12.18...
Resolving npm version ~2.11.3 via semver.io...
Downloading and installing npm 2.11.3 (replacing version 2.15.11)...
-----> Restoring cache
Skipping cache restore (new runtime signature)
-----> Building dependencies
Installing node modules (package.json)
> ws@0.4.32 install /tmp/build_beb218743747080f27aa5afd85781f8f/vheinitz-reatha-7a0950a/node_modules/node-hl7/node_modules/socket.io/node_modules/socket.io-client/node_modules/ws
> (node-gyp rebuild 2> builderror.log) || (exit 0)
make: Entering directory `/tmp/build_beb218743747080f27aa5afd85781f8f/vheinitz-reatha-7a0950a/node_modules/node-hl7/node_modules/socket.io/node_modules/socket.io-client/node_modules/ws/build'
CXX(target) Release/obj.target/bufferutil/src/bufferutil.o
SOLINK_MODULE(target) Release/obj.target/bufferutil.node
COPY Release/bufferutil.node
CXX(target) Release/obj.target/validation/src/validation.o
SOLINK_MODULE(target) Release/obj.target/validation.node
COPY Release/validation.node
make: Leaving directory `/tmp/build_beb218743747080f27aa5afd85781f8f/vheinitz-reatha-7a0950a/node_modules/node-hl7/node_modules/socket.io/node_modules/socket.io-client/node_modules/ws/build'
module@0.0.1 node_modules/module
periodic-task@0.1.2 node_modules/periodic-task
cookie-parser@1.3.5 node_modules/cookie-parser
├── cookie@0.1.3
└── cookie-signature@1.0.6
debug@2.2.0 node_modules/debug
└── ms@0.7.1
jsonfile@2.4.0 node_modules/jsonfile
└── graceful-fs@4.1.11
morgan@1.5.3 node_modules/morgan
├── basic-auth@1.0.4
├── depd@1.0.1
└── on-finished@2.2.1 (ee-first@1.1.0)
string@3.3.3 node_modules/string
node-mv@0.1.3 node_modules/node-mv
├── progress@1.1.8
├── async@0.9.2
└── commander@2.9.0 (graceful-readlink@1.0.1)
serve-favicon@2.2.1 node_modules/serve-favicon
├── fresh@0.2.4
├── parseurl@1.3.1
├── ms@0.7.1
└── etag@1.6.0 (crc@3.2.1)
gm@1.23.0 node_modules/gm
├── array-series@0.1.5
├── array-parallel@0.1.3
└── cross-spawn@4.0.2 (lru-cache@4.0.2, which@1.2.12)
express@4.12.4 node_modules/express
├── cookie-signature@1.0.6
├── fresh@0.2.4
├── merge-descriptors@1.0.0
├── parseurl@1.3.1
├── content-type@1.0.2
├── utils-merge@1.0.0
├── cookie@0.1.2
├── methods@1.1.2
├── escape-html@1.0.1
├── range-parser@1.0.3
├── vary@1.0.1
├── finalhandler@0.3.6
├── serve-static@1.9.3
├── content-disposition@0.5.0
├── path-to-regexp@0.1.3
├── depd@1.0.1
├── on-finished@2.2.1 (ee-first@1.1.0)
├── qs@2.4.2
├── etag@1.6.0 (crc@3.2.1)
├── send@0.12.3 (destroy@1.0.3, ms@0.7.1, mime@1.3.4)
├── proxy-addr@1.0.10 (forwarded@0.1.0, ipaddr.js@1.0.5)
├── type-is@1.6.14 (media-typer@0.3.0, mime-types@2.1.14)
└── accepts@1.2.13 (negotiator@0.5.3, mime-types@2.1.14)
body-parser@1.16.0 node_modules/body-parser
├── bytes@2.4.0
├── content-type@1.0.2
├── depd@1.1.0
├── on-finished@2.3.0 (ee-first@1.1.1)
├── raw-body@2.2.0 (unpipe@1.0.0)
├── http-errors@1.5.1 (setprototypeof@1.0.2, inherits@2.0.3, statuses@1.3.1)
├── qs@6.2.1
├── debug@2.6.0 (ms@0.7.2)
├── type-is@1.6.14 (media-typer@0.3.0, mime-types@2.1.14)
└── iconv-lite@0.4.15
request@2.79.0 node_modules/request
├── is-typedarray@1.0.0
├── aws-sign2@0.6.0
├── oauth-sign@0.8.2
├── forever-agent@0.6.1
├── tunnel-agent@0.4.3
├── caseless@0.11.0
├── stringstream@0.0.5
├── isstream@0.1.2
├── json-stringify-safe@5.0.1
├── extend@3.0.0
├── aws4@1.5.0
├── uuid@3.0.1
├── combined-stream@1.0.5 (delayed-stream@1.0.0)
├── qs@6.3.0
├── form-data@2.1.2 (asynckit@0.4.0)
├── mime-types@2.1.14 (mime-db@1.26.0)
├── tough-cookie@2.3.2 (punycode@1.4.1)
├── har-validator@2.0.6 (pinkie-promise@2.0.1, commander@2.9.0, chalk@1.1.3, is-my-json-valid@2.15.0)
├── hawk@3.1.3 (cryptiles@2.0.5, sntp@1.0.9, boom@2.10.1, hoek@2.16.3)
└── http-signature@1.1.1 (assert-plus@0.2.0, jsprim@1.3.1, sshpk@1.10.2)
multer@0.1.6 node_modules/multer
├── qs@1.2.2
├── mkdirp@0.3.5
└── busboy@0.2.14 (readable-stream@1.1.14, dicer@0.2.5)
connect-busboy@0.0.2 node_modules/connect-busboy
└── busboy@0.2.14 (readable-stream@1.1.14, dicer@0.2.5)
nodemon@1.11.0 node_modules/nodemon
├── ignore-by-default@1.0.1
├── undefsafe@0.0.3
├── es6-promise@3.3.1
├── minimatch@3.0.3 (brace-expansion@1.1.6)
├── touch@1.0.0 (nopt@1.0.10)
├── lodash.defaults@3.1.2 (lodash.restparam@3.6.1, lodash.assign@3.2.0)
├── ps-tree@1.1.0 (event-stream@3.3.4)
├── update-notifier@0.5.0 (is-npm@1.0.0, semver-diff@2.1.0, string-length@1.0.1, chalk@1.1.3, repeating@1.1.3, configstore@1.4.0, latest-version@1.0.1)
└── chokidar@1.6.1 (inherits@2.0.3, path-is-absolute@1.0.1, async-each@1.0.1, glob-parent@2.0.0, is-binary-path@1.0.1, is-glob@2.0.1, readdirp@2.1.0, anymatch@1.3.0)
jade@1.11.0 node_modules/jade
├── character-parser@1.2.1
├── void-elements@2.0.1
├── commander@2.6.0
├── mkdirp@0.5.1 (minimist@0.0.8)
├── jstransformer@0.0.2 (is-promise@2.1.0, promise@6.1.0)
├── constantinople@3.0.2 (acorn@2.7.0)
├── with@4.0.3 (acorn-globals@1.0.9, acorn@1.2.2)
├── clean-css@3.4.24 (commander@2.8.1, source-map@0.4.4)
├── transformers@2.1.0 (promise@2.0.0, css@1.0.8, uglify-js@2.2.5)
└── uglify-js@2.7.5 (uglify-to-browserify@1.0.2, async@0.2.10, yargs@3.10.0, source-map@0.5.6)
moment@2.17.1 node_modules/moment
execsql@0.0.3 node_modules/execsql
├── underscore@1.5.2
├── optimist@0.6.0 (wordwrap@0.0.3, minimist@0.0.10)
└── mysql@2.0.0-rc2 (require-all@0.0.3, bignumber.js@1.0.1)
node-hl7@0.1.3 node_modules/node-hl7
├── xmlbuilder@0.4.3
├── chai@1.4.2
├── xml2js@0.2.8 (sax@0.5.8)
└── socket.io@0.9.17 (base64id@0.1.0, policyfile@0.0.4, redis@0.7.3, socket.io-client@0.9.16)
-----> Caching build
Clearing previous node cache
Saving 2 cacheDirectories (default):
- node_modules
- bower_components (nothing to cache)
-----> Build succeeded!
├── body-parser@1.16.0
├── connect-busboy@0.0.2
├── cookie-parser@1.3.5
├── debug@2.2.0
├── execsql@0.0.3
├── express@4.12.4
├── gm@1.23.0
├── jade@1.11.0
├── jsonfile@2.4.0
├── module@0.0.1
├── moment@2.17.1
├── morgan@1.5.3
├── multer@0.1.6
├── node-hl7@0.1.3
├── node-mv@0.1.3
├── nodemon@1.11.0
├── periodic-task@0.1.2
├── request@2.79.0
├── serve-favicon@2.2.1
└── string@3.3.3
-----> Discovering process types
Procfile declares types -> web
-----> Compressing...
Done: 20.2M
-----> Launching...
Released v3
https://warm-coast-28316.herokuapp.com/ deployed to Heroku
Your log contents only show your most recent deploy of your application to Heroku. You need to retrieve the more recent log messages to see what happened.
I know this because every single Heroku web request generates a log entry, and your logs show none.
What you can do is this:
Visit your Heroku app. Get the error.
Run the heroku logs command to view the most recent logs (and errors).
If you want to keep your logs running while you are testing your app so you don't have to constantly repeat the heroku logs command, you can say:
$ heroku logs --tail
This will open a 'stream' of logs that you can continue viewing in real-time.
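The stream can also be filtered. For instance, to look only at the application's own output rather than router or dyno entries (flags as documented by the Heroku CLI; the line count is arbitrary):
$ heroku logs --num 200 --source app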

Node.js module won't reinstall using puppet / vagrant

Previously I had a similar configuration to this working, but as soon as I added hiera to my puppet build I started having problems. The error I currently get after running vagrant provision is as follows:
==> default: [vagrant-hostsupdater] Checking for host entries
==> default: [vagrant-hostsupdater] found entry for: 192.168.33.10 local.mysite
==> default: Configuring cache buckets...
==> default: Running provisioner: puppet...
==> default: Running Puppet with app.pp...
==> default: stdin: is not a tty
==> default: Error: Could not find class nodejs for local.mysite on node local.mysite
==> default: Error: Could not find class nodejs for local.mysite on node local.mysite
The SSH command responded with a non-zero exit status. Vagrant
assumes that this means the command failed. The output for this command
should be in the log above. Please read the output to determine what
went wrong.
My vagrant config is:
# -*- mode: ruby -*-
# vi: set ft=ruby :

require "yaml"

# Load yaml configuration
config_file = "#{File.dirname(__FILE__)}/config/vm_config.yml"
default_config_file = "#{File.dirname(__FILE__)}/config/.vm_config_default.yml"
vm_external_config = YAML.load_file(config_file)

# Configure Vagrant
Vagrant.configure("2") do |config|
  config.vm.box = "ubuntu/trusty64"
  config.vm.box_url = "http://cloud-images.ubuntu.com/vagrant/trusty/current/trusty-server-cloudimg-amd64-vagrant-disk1.box"

  config.vm.network :private_network, ip: vm_external_config["ip"]
  config.vm.hostname = vm_external_config["hostname"]
  config.vm.network "forwarded_port", guest: vm_external_config["port"], host: 2368
  config.vm.synced_folder vm_external_config["ghost_path"], "/var/www/mysite.com", :nfs => true

  config.vm.provider :virtualbox do |vb|
    vb.customize ["modifyvm", :id, "--memory", vm_external_config["memory"]]
  end

  config.cache.scope = :box
  config.librarian_puppet.placeholder_filename = ".gitkeep"

  config.vm.provision :puppet do |puppet|
    puppet.hiera_config_path = "puppet/hiera/hiera.yaml"
    puppet.manifests_path = "puppet/manifests"
    puppet.manifest_file = "app.pp"
    puppet.module_path = "puppet/modules"
    puppet.facter = {
      "environment" => ENV['ENV'] ? ENV['ENV'] : 'local'
    }
  end
end
My source tree looks like this (much of it isn't relevant, aside from the folder structure for the custom blog module and the hiera config):
├── Vagrantfile
├── config
│   └── vm_config.yml
└── puppet
    ├── Puppetfile
    ├── hiera
    │   ├── common.yaml
    │   ├── hiera.yaml
    │   ├── local
    │   │   └── site.yaml
    │   └── production
    │       └── site.yaml
    ├── manifests
    │   └── app.pp
    └── modules
        ├── blog
        │   └── manifests
        │       └── app.pp
        ├── ghost
        │   └── manifests
        │       └── app.pp
        ├── init.d
        │   └── files
        │       ├── WebhookServer
        │       └── ghost
        ├── mailgunner
        ├── nginx
        │   ├── files
        │   │   ├── local
        │   │   │   ├── mysite.com
        │   │   │   └── mail.mysite.com
        │   │   └── production
        │   │       ├── mysite.com
        │   │       └── mail.mysite.com
        │   └── manifests
        │       └── server.pp
        ├── tools
        │   ├── files
        │   │   ├── local
        │   │   │   ├── backup.sh
        │   │   │   ├── ghostsitemap.sh
        │   │   │   └── init-mysite.sh
        │   │   └── production
        │   │       ├── backup.sh
        │   │       ├── ghostsitemap.sh
        │   │       └── init-mysite.sh
        │   └── manifests
        │       └── install.pp
        └── webhooks
            ├── files
            │   ├── local
            │   │   └── init-webhook.sh
            │   ├── production
            │   │   └── init-webhook.sh
            │   ├── webhook.sh
            │   └── webhooks.rb
            └── manifests
                └── install.pp
hiera.yaml:
---
:backends:
  - yaml
:yaml:
  :datadir: /vagrant/hieradata
:hierarchy:
  - "%{::environment}/site"
  - common
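As an aside: the :datadir: above points at /vagrant/hieradata, while the source tree keeps the YAML files under puppet/hiera. If the files really live there, the datadir needs to match, for example:
:datadir: /vagrant/puppet/hiera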
common.yaml
---
classes:
  - site
local/site.yaml
---
:site:
  environment: local
  name: local.mysite
  mailserver: local.mail.mysite
blog/manifests/app.pp
class blog::app {
  class { 'nodejs':
    version => 'v0.10.25',
  } ->
  package { 'pm2':
    ensure   => present,
    provider => 'npm',
    require  => Class['nodejs'],
  }
}
Puppetfile
forge 'https://forgeapi.puppetlabs.com'
mod 'willdurand/nodejs', '1.9.4'
Basically, my problem is that my puppet setup is not reinstalling nodejs (I'd removed it previously using rm -rf puppet/modules/nodejs).
Does anyone have any ideas how or why puppet is now refusing to install the nodejs puppet module in the puppet/modules directory?
FYI - I've installed the willdurand/nodejs module using puppet module install willdurand/nodejs
Any help is much appreciated - I've been banging my head against a brick wall on this for a few days now!
The Puppetfile is used by the vagrant-librarian-puppet plugin to install your puppet modules, so that plugin is what should be doing the install.
Make sure the plugin is installed
$ vagrant plugin list
vagrant-librarian-puppet (0.9.2)
....
If you don't see the plugin, make sure to install it:
$ vagrant plugin install vagrant-librarian-puppet
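After installing it, re-running the provisioner should make librarian-puppet fetch the nodejs module declared in the Puppetfile again:
$ vagrant provision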
