Change cookbook attribute file content on Chef server through Bash

I have a scenario where I need to replace certain strings in an attribute file within a cookbook with user input from within a Bash script.
In the current Puppet setup this is done simply by using sed on the module files, since the modules are stored in the file structure as files and folders.
How can I replicate this in the Chef ecosystem? Is there a known shortcut?
Or would I have to download the cookbook as a file using knife, modify the content, and then re-upload it again to apply the changes?

Not sure this is the best approach. You can definitely use knife download, sed, and knife upload as you mentioned, but a better way would be to make it data-driven. Either store the values in a data bag or a role, and manipulate those using knife or another API client. Then in your recipe code you can read the values out and use them.
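As a minimal sketch of the data bag route from Bash (the bag and item names, app_config and settings, and the field name are assumptions for illustration):

    # Capture the user input and write it into a data bag item as JSON.
    read -r -p "Enter the new value: " NEW_VALUE
    cat > settings.json <<EOF
    {
      "id": "settings",
      "my_setting": "${NEW_VALUE}"
    }
    EOF

    # Create the bag (only needed once), then upload/update the item.
    knife data bag create app_config
    knife data bag from file app_config settings.json

In the recipe, data_bag_item('app_config', 'settings')['my_setting'] then returns the current value on every Chef run, so no cookbook re-upload is needed.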

Related

How to include config files for the Google Ops-agent

I want to do some configuration of the Google Cloud Ops Agent in order to deploy it via Ansible.
For example, /etc/google-cloud-ops-agent/kafka.yaml.
How can I include *.yaml configs?
If I use /etc/google-cloud-ops-agent/config.yaml, I'm worried the configuration will be overwritten.
There are two ways I can think of to do this.
The easiest (and least precise): use the copy module to recursively copy the directory content to the target. Of course, if there are files other than *.yaml, you'll get those as well.
The more complex way (and I have not tested this): use the find module, executed locally on the control node, to get a list of the .yaml files, register their locations, and then copy them up; a rough ad-hoc sketch of that idea follows below. There's probably a simpler way.
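For instance, driven from the shell rather than a playbook (the host group ops_agents and the local directory are placeholders), you can glob the *.yaml files locally and push each one with the copy module ad hoc:

    # Push only the *.yaml files to the agent's config directory.
    for f in ops-agent-configs/*.yaml; do
      ansible ops_agents -m copy \
        -a "src=${f} dest=/etc/google-cloud-ops-agent/ owner=root mode=0644" \
        --become
    done

The shell glob does the filtering that the find module would otherwise do, so non-YAML files in the directory are never copied.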

NiFi: How to sync two directories in NiFi

I have to write my response flowfiles to one directory, then get data from them, change it, and put it into another directory. I want to keep these two directories in sync (I mean that whenever I delete or change a flowfile in one directory, it should change in the other directory too). I have more than 10,000 flowfiles, so a checklist wouldn't be a good solution. Can you recommend:
any controller service which can help me do this?
any better way to accomplish this task without a controller service?
You can use a combination of ListFile, FetchFile, and PutFile processors to detect individual file write changes within a file system directory and copy their contents to another directory. This will not detect file deletions, however, so I believe a better solution is to use rsync within an ExecuteProcess processor.
To the best of my knowledge, rsync does not work on HDFS file systems, so in that case I would recommend using a tool like Helix or DistCp (I have not evaluated these tools in particular). You can either invoke them from the command line via ExecuteProcess or wrap a client library in an ExecuteScript or custom processor.
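A minimal sketch of the rsync call you would wire into ExecuteProcess (the two paths are placeholders); the --delete flag is what propagates removals, which the ListFile/FetchFile chain cannot do:

    # Mirror the source directory into the target, removing files on the
    # target that no longer exist on the source.
    rsync -av --delete /data/flowfiles/source/ /data/flowfiles/target/

In ExecuteProcess you would set rsync as the Command, pass the flags and paths as Command Arguments, and schedule the processor on a timer to get periodic syncs.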

Automate checking whether JSON file data on a server machine is getting updated

I have some n number of files in a server directory. Is there a way to write a Bash script that will check whether the data in a JSON file is getting updated when I access it through the portal, and that, if I make any changes manually, updates the JSON file accordingly?
Yes, you can manipulate JSON with jq and a Bash script around it.
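A minimal sketch of that pattern (the file path and the .status field are assumptions for illustration): read a value with jq, compare it to what you expect, and rewrite the file if they differ:

    FILE=/srv/data/config.json
    EXPECTED="$1"

    # Read the current value out of the JSON file.
    CURRENT=$(jq -r '.status' "$FILE")

    if [ "$CURRENT" != "$EXPECTED" ]; then
      # Rewrite the field via a temp file (jq cannot edit a file in place).
      TMP=$(mktemp)
      jq --arg v "$EXPECTED" '.status = $v' "$FILE" > "$TMP" && mv "$TMP" "$FILE"
    fi

Run the script from cron or a loop to get the "automated check" part; the same jq filter syntax extends to nested fields and arrays.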

Can Puppet .pp files themselves be .erb templates?

I want the sites defined in nodes.pp to come from a .yml file. I'm thinking that if the .pp file were itself processed first from an .erb file, then this would be easy. But as far as I can tell, the .pp files cannot be templates themselves, e.g. nodes.pp.erb.
I want to keep the nodes definition in YAML rather than in .pp because I want to use the same definition for things like a Vagrant test of the deployment. I find it easier to consume a common .yml than to parse nodes.pp to extract the info.
The obvious solution is to generate nodes.pp on demand from a nodes.pp.erb, e.g. in a Rake task, but I wonder if Puppet itself has a solution to my conundrum.
I think Puppet Hiera would work well for you; check out:
https://github.com/puppetlabs/hiera#readme
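For instance, with the site definitions moved into Hiera's YAML backend, a quick smoke test from the shell might look like this (the key name, fact, and config path are all assumptions for illustration):

    # Resolve the "sites" key for a given node from the command line.
    hiera sites ::clientcert=web01.example.com -c /etc/puppet/hiera.yaml

Inside nodes.pp you would then call hiera('sites') instead of hard-coding the data, which keeps a single YAML source of truth that Vagrant and other tooling can consume directly.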

Simple way of copying a file to a remote box via SCP using a Rake Task?

I've used Python Fabric in the past and I'm trying to do something similar with Ruby.
Basically I have created a Rake script which will run as a particular user that has SSH keys set up for passwordless access to the boxes in question.
I've managed to use https://github.com/seattlerb/rake-remote_task in order to run a command remotely, and expected the "put" method to "just work". However, it seems to be an rsync wrapper which does not take advantage of the key-based passwordless authentication.
It also seems to expect the file to be generated from a template, which is not what I want; I want to SCP an actual .tgz binary file.
Am I missing something in the Ruby/Rake ecosystem? I expected this to be easy, but I feel like I'm going to need to go back to searching for gems.
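One low-tech sketch, assuming the Rake task simply shells out (host, user, and paths are placeholders): plain scp honors the existing SSH keys, so no password prompt and no templating step.

    # The command a Rake task could run via Ruby's sh/system helpers;
    # it reuses the user's SSH keys for passwordless authentication.
    scp ./build/release.tgz deploy@app01.example.com:/opt/releases/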
