Chef: Read variable from file and use it in one converge - ruby

I have the following code, which downloads a file and then reads its contents into a variable. Using that variable, it executes a command. This recipe will not converge because /root/foo does not exist during the compile phase.
I can work around the issue with multiple converges and an if File.exist? guard, but I would like to do it in one converge. Any ideas on how?
execute 'download_joiner' do
  command "aws s3 cp s3://bucket/foo /root/foo"
  not_if { ::File.exist?('/root/foo') }
end

password = ::File.read('/root/foo').chomp

execute 'join_domain' do
  command "net ads join -U joiner%#{password}"
end

The correct solution is to use a lazy property:
execute 'download_joiner' do
  command "aws s3 cp s3://bucket/foo /root/foo"
  creates '/root/foo'
  sensitive true
end

execute 'join_domain' do
  command lazy {
    password = IO.read('/root/foo').strip
    "net ads join -U joiner%#{password}"
  }
  sensitive true
end
That delays the file read until after the file has been written. I also included the sensitive property so the password is not displayed in the logs.
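As a side note for readers new to Chef's two-phase model: bare Ruby in a recipe runs at compile time, while lazy blocks run at converge time. A minimal sketch, reusing the paths above purely for illustration:

# This bare Ruby line would run at compile time, before any resource
# has converged -- which is exactly why the original recipe failed:
# password = ::File.read('/root/foo').chomp

execute 'report_size' do
  # A lazy block is evaluated at converge time, after the download
  # resource above has already run.
  command lazy { "logger 'joiner file is #{::File.size('/root/foo')} bytes'" }
end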

You can download the file at compile time by using run_action, and wrap the second part in a conditional block. Since the download has already run by the time the rest of the recipe is compiled, the File.exist? check passes and the join resource is defined with the password available:
execute 'download_joiner' do
  command "aws s3 cp s3://bucket/foo /root/foo"
  not_if { ::File.exist?('/root/foo') }
end.run_action(:run)

if File.exist?('/root/foo')
  password = ::File.read('/root/foo').chomp
  execute 'join_domain' do
    command "net ads join -U joiner%#{password}"
  end
end

You can read the file from within the second command itself, so that the read happens during convergence too:
execute 'download_joiner' do
  command "aws s3 cp s3://bucket/foo /root/foo"
  not_if { ::File.exist?('/root/foo') }
end

execute 'join_domain' do
  command "net ads join -U joiner%`cat /root/foo`"
end
Note that both your approach and mine will break if the password contains special characters. If net lets you provide the password on stdin or in an environment variable, I'd do that instead.
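For instance, if your version of Samba's net honors the PASSWD environment variable (an assumption — verify against your net(8) man page), the password can stay off the command line entirely; Chef's environment property accepts a lazy block too:

execute 'join_domain' do
  command 'net ads join -U joiner'
  # PASSWD is consumed by Samba tools in place of a password argument
  # (assumption -- check your man page); lazy defers the read to converge time
  environment lazy { { 'PASSWD' => ::File.read('/root/foo').chomp } }
  sensitive true
end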

Related

Bash commands as variables failing when joining to form a single command

ssh="ssh user#host"
dumpstructure="mysqldump --compress --default-character-set=utf8 --no-data --quick -u user -p database"
mysql=$ssh "$dumpstructure"
$mysql | gzip -c9 | cat > db_structure.sql.gz
This is failing on the third line with:
mysqldump --compress --default-character-set=utf8 --no-data --quick -u user -p database: command not found
I've simplified my actual script for the purpose of debugging this specific error. $ssh and $dumpstructure aren't always being joined together in the real script.
Variables are meant to hold data, not commands. Use a function.
mysql () {
  ssh user@host mysqldump --compress --default-character-set=utf8 --no-data --quick -u user -p database
}
mysql | gzip -c9 > db_structure.sql.gz
Arguments to a command can be stored in an array.
# Although mysqldump is the name of a command, it is used here as an
# argument to ssh, indicating the command to run on a remote host
args=(mysqldump --compress --default-character-set=utf8 --no-data --quick -u user -p database)
ssh user@host "${args[@]}" | gzip -c9 > db_structure.sql.gz
Chepner's answer is correct about the best way to do things like this, but the reason you're getting that error is actually even more basic. The line:
mysql=$ssh "$dumpstructure"
doesn't do anything like what you want. Because of the space between $ssh and "$dumpstructure", it'll parse this as environmentvar=value command, which means it should execute the "mysqldump..." part with the environment variable mysql set to ssh user@host. But it's worse than that, since the double-quotes around "$dumpstructure" mean that it won't be split into words, and so the entire string gets treated as the command name (rather than mysqldump being the command name, and the rest being arguments to it).
If this had been the right way to go about building the command, the right way to stick the parts together would be:
mysql="$ssh $dumpstructure"
...so that the whole combined string gets treated as part of the value to assign to mysql. But as I said, you really should use Chepner's approach instead.
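To see that parsing rule in action, here is a short demo that any POSIX-ish shell will reproduce (variable names made up for illustration):

# VAR=value cmd runs cmd with VAR set in its environment:
greeting=hello bash -c 'echo "$greeting"'   # prints: hello

# Double-quoting makes the expansion a single word, so the shell
# looks for a command with that entire string as its name:
cmd="echo hi"
"$cmd"   # bash: echo hi: command not found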
Actually, commands stored in variables can also work, invoked in the form `$var` or $($var). If it says "command not found", it could be because the command is not in your PATH, or you should give the full path of your command.
So let's set the downvotes aside and talk about this question.
The real problem is mysql=$ssh "$dumpstructure". This means you execute $dumpstructure with the additional environment variable mysql=$ssh, and so we get the "command not found" error. Note also that mysqldump is located on the remote server, not on this host, so it is reasonable that the command is not found locally.
With that understood, let's see how to fix it.
The OP wants to dump MySQL data from a remote server, which means $dumpstructure should be executed remotely. Look again at the third line, mysql=$ssh "$dumpstructure"; we have figured out that this is what causes the problem. So what should the correct command be? The simplest form is mysql="$ssh $dumpstructure", which joins $ssh and $dumpstructure into a single command line stored in the variable $mysql.
Finally, let's talk about the last point. I do not agree that "variables are meant to hold data, not commands", because a command is also a kind of data. The real problem is how to use it correctly.
The OP's style of invocation is also supported; at least, it is supported on bash 4.2.46.
So the real problem is how to use a variable to hold commands correctly, not to bring in a new mechanism to do it, such as wrapping the commands in a bash function.
So can anyone tell me why this answer did not come to readers' notice, but was voted down instead?
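To make the disagreement concrete, a small demo of when a command held in a variable works and when it does not (standard bash behavior):

cmd='ls -l /tmp'

$cmd            # unquoted: word-split into `ls`, `-l`, `/tmp` and executed -- works
"$cmd"          # quoted: one word, bash looks for a command named 'ls -l /tmp' -- fails
echo "$($cmd)"  # command substitution: runs the command and captures its output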

Using a script that requires user input with chef cookbook with vagrant

I'm writing a custom Asterisk Chef cookbook where I need to run this script:
bash 'create asterisk keys' do
  user 'root'
  cwd File.dirname(source_path)
  code <<-EOH
    cd asterisk-#{node.version}*
    ./contrib/scripts/ast_tls_cert -C #{node.host} -O "#{node.box_name}" -d #{node.keys_dir}
  EOH
  action :nothing
end
This ast_tls_cert script will ask for several password inputs, but when I run this through Vagrant the keys never get generated, since the passwords never get entered. Is there a way to tell Chef that if the script requires user input, it should just use some ENV variable as the value? I don't really need it to stop and ask the user for input. Actually, I'd rather it didn't. I just want to specify some value and tell it to use that value.
In general you need to assume Chef is running unattended. You can use tools like expect or pexpect (the Python version) to drive scripts that absolutely require interactive input, but first check whether you can provide the passwords via environment variables or similar.
There's a gem called ruby_expect which can be added into a cookbook to handle this.
At the top of your cookbook's default.rb file you'll want to add chef_gem 'ruby_expect'. Next, I created a ruby_block to handle this:
ruby_block 'create asterisk keys' do
  block do
    require 'ruby_expect'
    Dir.chdir(File.join(File.dirname(tarball_path), "asterisk-#{node.version}"))
    exp = RubyExpect::Expect.spawn(%{./contrib/scripts/ast_tls_cert -C #{node.host} -O "#{node.box_name}" -d #{node.keys_dir}}, debug: true)
    exp.procedure do
      each do
        expect %r{Enter pass phrase for /etc/asterisk/keys/ca.key:} do
          send 'somepassword'
        end
      end
    end
  end
  action :nothing
end
Where tarball_path is where you downloaded the asterisk tar.
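Rather than hardcoding 'somepassword', you could pull the passphrase from a node attribute; a hypothetical variant of the expect block (the attribute name is an assumption, not from the question):

expect %r{Enter pass phrase for /etc/asterisk/keys/ca.key:} do
  # node['asterisk']['key_passphrase'] is a hypothetical attribute;
  # populate it from an encrypted data bag or the environment as you prefer
  send node['asterisk']['key_passphrase']
end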

Executing 'tail -f' with Capistrano 3 pipes nothing into output

I have a Capistrano task that looks like this:
desc "tail log file"
task :tail do
on roles(:app) do
execute "tail -f #{shared_path}/log/#{fetch(:log_file)}.log"
end
end
When I run the task, it starts the blocking tail -f request, but nothing shows up. I am one hundred percent sure that the data simply is not being piped through somehow (I've verified that the log file does get updated on the remote), and thus nothing is displayed. Did I miss something?
The app role is included in the stage config.
Hmmm... check the permissions in the file system. The user that runs the task needs permission to read the file.
You can try chmod o+r logfile.log
This gives everybody read access to the file (useful for debugging purposes).
I have found a workaround/solution to my problem. I don't remember where I found it, but executing the command directly, by opening an ssh connection with pseudo-tty allocation forced (that's the -t), got it working. That gets blocking commands like tail -f working. As the man page says about the -t option:
This can be used to execute arbitrary screen-based programs on a remote machine,
which can be very useful...
def execute_interactively(command)
  user = fetch(:user)
  port = fetch(:port)
  cmd = "ssh -l #{user} #{host} -p #{port} -t 'cd #{deploy_to}/current && #{command}'"
  exec cmd
end
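One caveat: the helper references host, which has to come from somewhere. A hedged usage sketch that passes it in explicitly (task name and log path reused from the question; the extra parameter is my adaptation):

def execute_interactively(host, command)
  user = fetch(:user)
  port = fetch(:port)
  # exec replaces the current process, effectively handing the
  # terminal over to the remote tail -f
  exec "ssh -l #{user} #{host} -p #{port} -t 'cd #{deploy_to}/current && #{command}'"
end

desc "tail log file interactively"
task :tail_interactive do
  on roles(:app) do |host|
    execute_interactively host, "tail -f #{shared_path}/log/#{fetch(:log_file)}.log"
  end
end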
You'll need to set the Capistrano verbosity level to DEBUG to see any streaming output. I found the solution in the capistrano3-taillog gem; see taillog.cap.
desc "tail log file"
task :tail do
on roles(:app) do
with_verbosity Logger::DEBUG do
execute "tail -f #{shared_path}/log/#{fetch(:log_file)}.log"
end
end
end
def with_verbosity(output_verbosity)
old_verbosity = SSHKit.config.output_verbosity
begin
SSHKit.config.output_verbosity = output_verbosity
yield
ensure
SSHKit.config.output_verbosity = old_verbosity
end
end

Want to read variable value from remote file

In one of my bash scripts I want to read and use a variable's value from another script that is on a remote machine.
How should I go about resolving this? Any related info would be helpful.
Thanks in advance!
How about this (which is code I cannot currently test myself):
text=$(ssh yourname@yourmachine 'grep uploadRate= /root/yourscript')
It assumes that the value of the variable is contained in one line. The variable text now contains your variable assignment, presumably something like
uploadRate=1MB/s
There are several ways to convert the text/code into a real variable assignment in your current script, like evaluating the string or using grep. I would recommend
uploadRate=${text#*=}
to just remove the part up to and including the =.
Edit: One more caveat to mention is that this only works if the original assignment does not contain variable references itself like in
uploadRate=1000*${kB}/s
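For completeness, the parameter expansion on its own, no remote call needed (values taken from the example above):

text='uploadRate=1MB/s'
uploadRate=${text#*=}   # remove the shortest prefix matching '*=', i.e. everything up to the first '='
echo "$uploadRate"      # -> 1MB/s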
ssh user@machine 'command'
will print the standard output of the remote command.
I can suggest at least two ways:
1) You can simply copy the output from the remote server to your system as a file with the scp command, and have the script on your machine read that file as its argument.
Script on your machine:
read -t 50 -p "Waiting for argument: " value
It waits for the output from the remote machine. Then you can run:
sshpass -p<password> scp user@host:/Path/to/file /path/to/script/
What you need to do: tell the script on your machine that the file copied over by scp is its argument ($1).
2) Run the script from your machine:
#!/bin/bash
script='
# Your commands
'
sshpass -p<password> ssh user@host "$script"
There are other ways to run scripts against a remote machine as well.
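A concrete sketch of approach 1, with placeholder names; it copies the remote file and then extracts the assignment the same way as the answer above:

# copy the remote file that contains the assignment (paths are placeholders)
sshpass -p "$PASS" scp user@host:/root/yourscript /tmp/yourscript

# pull the value out locally
text=$(grep 'uploadRate=' /tmp/yourscript)
uploadRate=${text#*=}
echo "$uploadRate"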

Chef shell script not being run

I am using Chef on Scalarium to download an agent and run various commands on it. What I'm attempting is to write a shell script in the recipe to perform this.
file "/etc/profile.d/blah.sh" do
content <<-EOH
sudo -sH
<Retrieve file and run some commands>
EOH
end
When I run the recipe in Scalarium, no errors occur, but the commands aren't run either. There are no errors in the commands themselves, as I've run them on my computer.
The recipe is definitely read, as the Chef logs contain Processing file[/etc/profile.d/blah.sh] on blah.localdomain.
I've never used Chef before, do I need to do something else to tell it to execute the shell script?
A file resource only writes the script to disk; nothing ever executes it. Perhaps you want something like:
file "/etc/profile.d/blah.sh" do
mode 0500
content <<-EOH
sudo -sH
<Retrieve file and run some commands>
EOH
end
execute "/etc/profile.d/blah.sh"
Or, you can put the file retrieval and running of commands directly into your chef recipe:
remote_file "/path/to/where/the/file/should/be/saved" do
source "https://example.com/path/to/where/the/file/comes/from"
end
execute "first command"
execute "second command"
