What is the correct way to use Chef from Ruby (Rails)?

I'm very new to Chef. Maybe I'm searching wrong, but Google shows a lot of quick starts and deployment options, mostly about deploying an app from a developer's console. What I need is to run recipes from the Rails app.
I have a stack where Rails+Resque acts as the master and Chef as the slave. Chef is added via the chef gem, and chef/shef/ext is used inside the app to run queries.
It should do several things, like creating SSH users (which works) and deploying new app stacks (which doesn't).
Since the chef gem doesn't have much documentation and ext doesn't feel user (or developer) oriented either, I suspect there is some other way to work with the Chef server (knife?), or some gem documentation I'm definitely missing that would let me work with this efficiently.

We got stuck on something similar and ended up using the ridley gem, as per this SO question.
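A minimal sketch of what that looks like from inside a Rails/Resque worker, assuming the ridley gem and your own Chef server URL, client name, and key path (the values below are placeholders):

require 'ridley'

# Connect to the Chef server with an API client's credentials.
ridley = Ridley.new(
  server_url:  'https://chef.example.com/organizations/myorg',
  client_name: 'rails-app',
  client_key:  '/etc/chef/rails-app.pem'
)

# Query Chef objects through the resource collections, e.g. list all nodes.
ridley.node.all.each { |node| puts node.name }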

Related

Automatically set up new Digital Ocean server for Laravel app

I know that https://forge.laravel.com/auth/register is available for $12/month*, but I'd like to understand how to accomplish the same thing myself.
What I assume is possible (and what I'm looking for): I create a server that has only Ubuntu 18.04.3 installed and nothing else, and I upload a script that installs all the appropriate software and sets up MySQL with the correct passwords, etc (without manual intervention).
I've tried Laradock and had tons of problems with Docker and don't want to do that anymore.
I see that https://cloud.digitalocean.com/droplets/new lets me create a LEMP droplet (Ubuntu, Nginx, MySQL, PHP-FPM) with one click. But it lacks Redis, and its versions are outdated (e.g. PHP 7.2).
I've heard people mention Chef (maybe this?), but that seems to be more complicated than what I'm imagining.
Unfortunately I'm not even sure how to search for what I'm trying to do (or how to tag this question); is this called "server provisioning"? I've been searching phrases like "automatic install script redis mysql server for laravel".
Thanks in advance for pointing me in the right direction.
* I also just found https://getcleaver.com/ and https://runcloud.io/server-management, which each look like Forge + Envoyer (and RunCloud offers a free plan).
It is called server provisioning, and Chef would be a good fit for this; check out Ansible too. Another thing you could do is set up the server yourself, create an image from that server, and then base your new servers on that image. That way you'll have all your services installed from the start.
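For a rough idea of what the Chef route looks like, here is a minimal recipe sketch for Ubuntu. The package names are assumptions, and in practice you would usually lean on community cookbooks (nginx, mysql, php, redis) rather than raw package resources:

# Install the basic LEMP + Redis packages on Ubuntu.
%w[nginx mysql-server php-fpm php-mysql redis-server].each do |pkg|
  package pkg do
    action :install
  end
end

# Make sure nginx is enabled and running.
service 'nginx' do
  action [:enable, :start]
end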
This sounds like a job for something like Puppet (or Chef/Ansible); however, Laravel Envoy may be another tool to look at for the second part of your problem, if you haven't already.
I highly recommend Heroku (or a similar service), as this is all done out of the box, and it has a ton of other great features that make developing a pipeline a breeze.

Scheduling/scripting with Play Framework (or non-rails project) in Heroku

Well, my head is spinning a bit here. I started with what I thought would be a simple task: taking regular database dumps on Heroku and pushing them to a personal S3 account for backup.
I am not sure of the best approach to do this. Accessing S3 from Java is crystal clear; getting the database dump from Heroku is clear as mud right now...
Disclaimer: I don't know Ruby, and I don't really want to learn Ruby if I don't have to. I really want to use Java (that is why I chose Play), and I want to have it hosted, which is why I chose Heroku :-)
So, I could use Heroku Scheduler, but I don't understand what scripts are being executed here. Is it all scripts in /bin? What kind of scripts are these, are they Ruby scripts? How do I add them as 'tasks' when they aren't rake tasks?
Can I use pgbackups via a URL somehow? It looks like the rake examples do pg_dump instead, write to a tmp file, and then move it around from there. I'm pretty unclear on how to access the Heroku database from a script; the examples I have seen so far are rake tasks, so any insight there would be helpful...
Or, coming at it from inside my Java app, what is the status of the Heroku Java API? Is there a way to get to the Heroku runtime from my Java code, or somehow use heroku.jar?
It would be great to get some overall guidance and best practices in this area - thanks!!!
From the Google group I found this tidbit:
http://groups.google.com/group/heroku/browse_thread/thread/7fe984c3d2d01f21/9474f31138636332?lnk=gst&q=scheduler+#9474f31138636332
"Sorry for the delayed response. We updated the docs to mention running Procfile entries via heroku run:
http://devcenter.heroku.com/articles/oneoff-admin-ps
Anything that works via heroku run works via Heroku Scheduler. Just put the name of the process type as the 'task' in Scheduler. No special syntax required. And you can even pass it arguments."
From this and James Ward's last example above, I am considering this answered.
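For reference, here is a sketch of the rake-task approach mentioned above (dump the database, then push the file to S3), assuming the aws-sdk-s3 gem, Heroku's DATABASE_URL, AWS credentials in the environment, and a placeholder bucket name:

require 'aws-sdk-s3'

namespace :db do
  desc 'Dump the database and upload the dump to S3'
  task :backup do
    dump_path = "/tmp/backup-#{Time.now.to_i}.dump"

    # pg_dump accepts the connection URL that Heroku puts in DATABASE_URL.
    system("pg_dump #{ENV['DATABASE_URL']} -Fc -f #{dump_path}") || abort('pg_dump failed')

    # Upload the dump; bucket name and region are placeholders.
    s3 = Aws::S3::Resource.new(region: ENV.fetch('AWS_REGION', 'us-east-1'))
    s3.bucket('my-backup-bucket').object(File.basename(dump_path)).upload_file(dump_path)
  end
end

With that in place, heroku run rake db:backup works, and the same command can be scheduled in Heroku Scheduler.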

Create stand-alone system services in Ruby

I want to build an application that runs as a stand-alone system service, always running in the background, and serves a front-end with a web interface.
Just like we do in Linux with /etc/init.d/apache2 start, I want to start my application with /etc/init.d/myapp start.
My main target is to deliver on Linux, especially Ubuntu; the whole app would be in plain Ruby and the front-end would be in Sinatra.
I want installation to be as simple as gem install my_app, with command-line features available to start the service. The application would do heavy processing and database insertion. I also want its configuration to be set in pure Linux fashion, like /etc/apache2/apache2.conf.
Can anyone guide me on this? Also, if possible, I want to protect the code; is there any way to do that?
I am using the Daemon-Kit gem for the same requirements. Works very well in production. The only thing it does not include is the configuration with a .conf file, but it's easy to do it yourself with Ruby code. You can deploy with Capistrano, no need to install.
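As a rough sketch of the "do it yourself" configuration part, here is one way to read a simple key = value .conf file in plain Ruby (the path and keys are placeholders):

# Parse /etc/myapp/myapp.conf into a hash, ignoring blank lines and comments.
def load_conf(path = '/etc/myapp/myapp.conf')
  File.readlines(path).each_with_object({}) do |line, conf|
    line = line.strip
    next if line.empty? || line.start_with?('#')
    key, value = line.split('=', 2).map(&:strip)
    conf[key] = value
  end
end

settings = load_conf
puts settings['db_host']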

How can I tag an EC2 instance using Ruby in Chef?

I'm playing around with Chef to launch EC2 instances. Everything is working pretty well, but Chef doesn't seem to have the ability to tag the instances. Am I missing something?
Otherwise, what's the preferred Ruby library for achieving this? Can I do it without requiring additional gems?
Thanks
Version 0.5.12 of the knife-ec2 Gem supports tagging EC2 instances on creation with the --tags option.
knife ec2 server create [... your options...] --tags Tag=Value
I know this is old, but I was browsing around and spotted it. Another alternative is to use the AWS community cookbook (assuming you have key credentials) if you want to do things programmatically as part of the recipe.
# Pull AWS credentials from a data bag and tag this instance using the
# aws cookbook's aws_resource_tag resource.
aws = data_bag_item('mydatabag', 'creds')

aws_resource_tag node['ec2']['instance_id'] do
  aws_access_key        aws['access_key']
  aws_secret_access_key aws['secret_key']
  tags({
    "foo" => "bar"
  })
  action :update
end
Usually Chef is used to install things on the instance. I'm not exactly sure how you start a node with Chef, but maybe you can share that and I'll extend my answer?
Otherwise, fog is a great library to do these things. I just skimmed over the source and it seems to support tagging as well.
To get fog: gem install fog.
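A sketch of tagging with fog, assuming the AWS provider and credentials in the environment (the instance id is a placeholder):

require 'fog'

# Connect to EC2 through fog.
compute = Fog::Compute.new(
  provider:              'AWS',
  aws_access_key_id:     ENV['AWS_ACCESS_KEY_ID'],
  aws_secret_access_key: ENV['AWS_SECRET_ACCESS_KEY'],
  region:                'us-east-1'
)

# Create a tag on an existing instance.
compute.tags.create(
  resource_id: 'i-0123456789abcdef0',
  key:         'Name',
  value:       'chef-node'
)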

Running gem server in passenger

I'm running a few Rails/Rake apps in Apache/Passenger, and I want to add the documentation app served by gem server alongside these apps so I can give it its own (sub)domain, like docs.example.org. That way it's easily available to all members of our team, and nobody has to start the server themselves or remember port numbers (like 8808, the default gem server port).
I would recommend looking into bdoc instead of gem server; it allows the user to access all their gem docs without a server running at all. It would also be trivial to modify bdoc to output to a specific directory, and then you could easily add a step to regenerate the docs.
The nice thing about having them as static files is that the Apache config is dead simple.
If you do want to make bdoc output to a specific directory, look at this line.
Edit:
I actually went ahead and branched this on GitHub and made the change. Now you can supply the output directory on the command line and it will generate the static RDoc pages for you.
I'm running http://gems.local on my machine in case I want to do some Ruby cracking offline (plane journeys, trains, etc.).
This is really easy; you can actually run Passenger with all the Ruby gems' documentation locally without having to access the net.
I followed Jason's tips and got everything working. See the following article and you should be ready to go:
http://jasonseifer.com/2009/02/22/offline-gem-server-rdocs
Attila
I wrote a blog post on how I have my gems, Ruby, Rails, and jQuery docs locally using the yard server and nginx for proxying on Mac OS X. The steps for Linux are almost the same; the only thing that changes is how you configure the daemons.
https://makarius.posterous.com/offline-rails-ruby-jquery-and-gems-docs-with
