Martini templates and tests - Go

I'm trying to split my application files from my testing files. It looks something like this:
main.go
views/
  layouts/
    layout.html
spec/
  main_test.go
main.go creates a Martini app and tells Martini.render where to look for the views:
func CreateApplication() {
    m := martini.Classic()
    m.Use(render.Renderer(render.Options{
        Directory:  "views",
        Layout:     "layouts/layout",
        Extensions: []string{".html"},
    }))
}
That all works really well when I'm using go run from the root folder. However, when I call CreateApplication() from spec/main_test.go, it looks for the views in spec/views, because that's the directory the tests run from.
I went down the route of trying to use runtime.Caller() to get the absolute path, but that falls apart when compiling a binary.
I guess my question is: how can I make this work? I want CreateApplication() to behave the same no matter where it is called from.

I often run into this problem. What I do in such cases is create a symlink from the child directory to the folder in the root directory that holds the templates. So far I haven't had any problems using this approach, but when an app goes to production I delete those symlinks. I actually have a script that creates the symlinks before I start testing and then deletes them after I'm done.
In your case, it would look like this (I'm on Ubuntu or Cygwin):
main.go
views/
  layouts/
    layout.html
spec/
  main_test.go
$ cd spec
$ ln -s ../views views
main.go
views/
  layouts/
    layout.html
spec/
  main_test.go
  views  <- this is the symlink
Now, when running your tests from spec/, the views directory is found. I hope this helps, and if my approach is somehow flawed, I'm eager to know!
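If you prefer to keep that setup and teardown inside the tests themselves, here is a minimal sketch of the same idea in Go; the package name spec and the relative path are assumptions based on the layout above. It creates the symlink before the tests run and removes it afterwards, mirroring the script described above.

// spec/main_test.go -- create the views symlink before the tests, remove it after.
package spec

import (
    "os"
    "testing"
)

func TestMain(m *testing.M) {
    // go test uses spec/ as the working directory, so link ../views here.
    if err := os.Symlink("../views", "views"); err != nil && !os.IsExist(err) {
        panic(err)
    }
    code := m.Run()
    os.Remove("views")
    os.Exit(code)
}

This is only an illustration; a shell script run around go test works just as well.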

Related

How to share code between Netlify lambda functions

I have 3 separate functions, each in their own folders. All of them make use of a Twilio client and an Apollo Client for dealing with SMS and a GraphQL server, respectively.
Rather than having all the code to instantiate each client (get keys from env etc.) in each file, can it be put somewhere and required in?
I've tried putting the code into a .js file in the top-level functions/ folder and requiring it in the function code as below. This works fine locally with netlify dev, but it errors with Module not found '../twilioClient' when the function is called in the live environment.
/functions
  apolloClient.js
  twilioClient.js
  package.json - specifying deps used by above files
  /auth
    auth.js - require('../apolloClient')
    ...
  /trails
    trails.js - require('../twilioClient') etc.
    ...
I had success with this approach.
Short answer:
Create a utils file in your functions folder and require it in your function files.
Long answer:
My netlify.toml file looks like this:
[build]
functions = "./functions"
And functions folder:
/functions
  function-1.js
  function-2.js
  utils.js
And utils.js:
exports.helloWorld = () => {
  console.log('hello world')
}
And function-1.js:
const { helloWorld } = require('./utils')

exports.handler = async (event) => {
  helloWorld()
}
To test it:
Run netlify dev or deploy it.
Your Functions logs or terminal should say 'hello world'.
My netlify site deploys from GitHub.
I did get some success (locally & live) in putting shared modules in a local npm package:
/functions
  /utils
    package.json
    index.js
  /src
    /auth
      auth.js
    /trails
      trails.js
  package.json
Export all common modules in functions/utils/index.js and set property "main": "index.js" in functions/utils/package.json.
In functions/package.json install the module:
{
  "dependencies": {
    ...
    "utils": "file:utils"
  }
}
And import it in your functions (in functions/src/auth/auth.js): import { apolloClient, twilioClient } from "utils"
Please take a look at this repository for reference.
Posting my interim solution in case it helps before I get a chance to try out #nomadoda's answer.
My functions folder looks like this:
/functions
  /utils
    apolloClient.js
    twilioClient.js
  /receive-sms
  /auth
  /stripe
  /scripts
I have "prebuild": "sh scripts/prebuild.sh" in my root package.json which looks like this:
cp -rf functions/utils functions/receive-sms
cp -rf functions/utils functions/auth
cp -rf functions/utils functions/scripts
cp -rf functions/utils functions/stripe
This is also where I cd into each lambda function folder and run yarn to install their dependencies.
Then in my function folders I can make use of the utils code by simply importing from the local /utils folder i.e. const apolloClient = require('./utils/apolloClient');
I also gitignore the copied /utils folders so only the master version of utils is tracked by git.
As I said, it's less than ideal, but it does work, though I hope the answer above works for me instead.

Does FileServer handler only serve what is inside the directory you specify?

For example, can a user craft your URL with something like ../ to go up a folder/directory?
Let's say my server consists of:
bin/
  serverfile.go
  ...
public/
  index.html
  style.css
"www.example.com/../bin/etc"
with serverfile.go consisting of:
package main

import "net/http"

func main() {
    http.ListenAndServe(":8000", http.FileServer(http.Dir("public")))
}
The http.FileServer handler prevents breaking out of the root directory you specify.
In contrast, you could build your own file server with http.ServeFile, which could potentially be a dangerous undertaking.
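To make the contrast concrete, here is a hedged sketch of the kind of hand-rolled serving that warning is about. The /files route and the name query parameter are made up for illustration; the point is that the file name comes straight from user input, so a request like ?name=../bin/serverfile.go escapes the public directory, whereas http.FileServer(http.Dir("public")) cleans the request path and cannot break out of its root.

package main

import (
    "net/http"
    "path/filepath"
)

func main() {
    // Dangerous: "name" is user-controlled, and
    // filepath.Join("public", "../bin/serverfile.go") resolves to
    // "bin/serverfile.go", which is outside the public directory.
    http.HandleFunc("/files", func(w http.ResponseWriter, r *http.Request) {
        name := r.URL.Query().Get("name")
        http.ServeFile(w, r, filepath.Join("public", name))
    })
    http.ListenAndServe(":8000", nil)
}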
See also Golang. What to use? http.ServeFile(..) or http.FileServer(..)?

Using Heroku Scheduler add-on with Golang app

I can't figure out how to use the Heroku Scheduler add-on with Go.
I would like to run a separate Go file periodically, but I can't find the command to achieve that.
From Heroku's docs (https://devcenter.heroku.com/articles/scheduler), if the app is not a Rails app, I should use a Ruby script. But I don't know how to run a Go file from there.
I ended up forking the same main function used by my web dyno:
func main() {
    if len(os.Args) >= 2 && os.Args[1] == "my_job_param" {
        // Execute job.
    } else {
        // Set up my web server with port, router, etc.
    }
}
Then, in the Scheduler add-on, I just call: my-app-name "my_job_param". It's pretty hacky, but I wanted to find a solution using the Scheduler add-on.
The typical pattern is to do something like:
.
└── cmd
    ├── processX
    │   └── main.go
    └── web
        └── main.go
You set heroku.install to ["./cmd/..."], and Heroku compiles and installs both commands into bin, so you get a bin/web and a bin/processX (processX is just a placeholder; the name of each directory that contains a main package becomes the name of the resulting executable). In the above case your Procfile would say web: web, the first web being the process type, the second being the name of the executable. The job you would tell the Scheduler to run would be processX.
That cleanly separates responsibilities.
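For illustration, the scheduled command can be as small as its own main package. This is just a sketch; processX and the log line are placeholders.

// cmd/processX/main.go -- compiled by Heroku into bin/processX,
// which is the command you give the Scheduler add-on.
package main

import "log"

func main() {
    // Do the periodic work here.
    log.Println("running scheduled job")
}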
The Heroku Scheduler will basically run any terminal command. So, if you compile your script into an executable called myscript, for example, you could simply put:
myscript
In the Scheduler, and it will execute that command, thereby running your script =) No Ruby is required.

Go: embed JS files with bindata

This question is a follow-up to an earlier question of mine. I've closed that question, so I hope it's okay that I ask a fresh but related one here: Go: embed static files in binary
How do I serve JS files with go-bindata? Do I pass it into the HTML like this?
hi.html
<script>{{.Bindata}}></script>
Doesn't seem to work even though I have no compile or JS errors.
Using https://github.com/elazarl/go-bindata-assetfs
Assuming you have the following structure:
myprojectdirectory
├───api
├───cmd
├───datastores
└───ui
├───css
└───js
Where ui is the directory structure you'd like to wrap up and pack into your app...
Generate a source file
The go-bindata-assetfs tool is pretty simple. It will look at the directories you pass to it and generate a source file with variables that can contain the binary data in those files. So make sure your static files are there, and then run the following command from myprojectdirectory:
go-bindata-assetfs ./ui/...
Now, by default, this will create a source file in the package main. Sometimes, this is ok. In my case, it isn't. You can generate a file with a different package name if you'd like:
go-bindata-assetfs.exe -pkg cmd ./ui/...
Put the source file in the correct location
In this case, the generated file bindata_assetfs.go is created in the myprojectdirectory directory (which is incorrect). In my case, I just manually move the file to the cmd directory.
Update your application code
In my app, I already had some code that served files from a directory:
import (
    "net/http"

    "github.com/gorilla/mux"
)

// Create a router and setup routes
var Router = mux.NewRouter()
Router.PathPrefix("/ui").Handler(http.StripPrefix("/ui", http.FileServer(http.Dir("./ui"))))

// Start listening
http.ListenAndServe("127.0.0.1:3000", Router)
Make sure something like this works properly, first. Then it's trivial to change the FileServer line to:
Router.PathPrefix("/ui").Handler(http.StripPrefix("/ui", http.FileServer(assetFS())))
Compile the app
Now you have a generated source file with your static assets in it. You can now safely remove the 'ui' subdirectory structure. Compile with
go install ./...
And you should have a binary that serves your static assets properly.
Use https://github.com/elazarl/go-bindata-assetfs
From the readme:
go-bindata-assetfs data/...
In your code, set up a route with a file server:
http.Handle("/", http.FileServer(assetFS()))
Got my answer here: Unescape css input in HTML
var safeCss = template.CSS(`body {background-image: url("paper.gif");}`)
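Applied to the original question, the JS counterpart of that trick is template.JS: read the embedded script through go-bindata's generated Asset() helper and pass it to the template as template.JS so html/template does not escape it. A rough sketch, where the asset name js/main.js is only an assumption:

package main

import (
    "html/template"
    "os"
)

var page = template.Must(template.New("hi").Parse(`<script>{{.Bindata}}</script>`))

func main() {
    js, err := Asset("js/main.js") // Asset is generated by go-bindata
    if err != nil {
        panic(err)
    }
    // template.JS marks the script as safe, so it is embedded verbatim.
    if err := page.Execute(os.Stdout, map[string]interface{}{"Bindata": template.JS(string(js))}); err != nil {
        panic(err)
    }
}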

How do you use node-qunit?

The info on this page seems less than forthcoming -- https://github.com/kof/node-qunit. I've got a setup where I installed Node.js and the node-qunit module. I have a test runner and executed the command node /path/to/runner.js. Below is an example of my setup. Any ideas or examples on how to do this, or maybe I'm using it wrong? I previously ran QUnit tests using Rhino and EnvJs without any issues, but I figured I'd try Node.js since I'm using it for other things and the packaging system can be scripted into my build. Maybe I'm missing an option to node to include QUnit, or some environment variable isn't set -- that would make sense.
File Structure
node/
public/
  js/
    main.js
tests/
  js/
    testrunner.js
    tests.js
Installation
cd node
npm install qunit
This will now update the file structure.
node/
  node_modules/
    qunit/
tests/js/testrunner.js
var runner = require("../../node/node_modules/qunit");

runner.run({
    code:  "/full/path/to/public/js/main.js",
    tests: "/full/path/to/tests/js/tests.js"
});
tests/js/tests.js
test("Hello World", function() {
ok(true);
});
Command
node tests/js/testrunner.js
It appears that you need to use full paths to the main.js and tests.js files and also include a relative path to the qunit module. I updated the code above as an example for others.
