I want to add an SSH key generated in a shell script to my GCP project metadata. The problem is that I don't really know how to reformat the generated key into the format that the project metadata needs. The SSH key I have looks like this:
ssh-rsa AAAAB3.... username
The format that is stated in the documentation is this:
username:ssh-rsa AAAAB3....
Is there a way to reformat the key within my shell script using echo and cat?
My best attempt is echo $USERNAME:$(cat ~/.ssh/id_rsa.pub), but this still leaves the trailing username at the end.
Assuming you are using bash, this should do the trick:
# Use the following line to read the key from a file
# KEY_WITH_USERNAME=$(cat ~/.ssh/id_rsa.pub)
KEY_WITH_USERNAME="ssh-rsa AAAAB3.... username"
# Strip everything up to the last space, leaving the username
USERNAME=${KEY_WITH_USERNAME##* }
# Strip the trailing " username" (including the space), leaving the bare key
KEY_WITHOUT_USERNAME=${KEY_WITH_USERNAME% "$USERNAME"}
echo "$USERNAME:$KEY_WITHOUT_USERNAME"
Outputs:
username:ssh-rsa AAAAB3....
See related questions about how to remove a prefix or suffix from a string in Bash and how to split a string and get the final part.
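If the next step is to push that line into the project metadata from the same script, a minimal sketch using gcloud could look like the following; the flag names are my reading of the gcloud docs, so verify them against your installed version:

# Assumption: gcloud is installed and authenticated. Note that add-metadata
# overwrites the existing ssh-keys entry, so in real use append to a dump of
# the keys that are already there.
echo "$USERNAME:$KEY_WITHOUT_USERNAME" > /tmp/ssh_keys.txt
gcloud compute project-info add-metadata --metadata-from-file ssh-keys=/tmp/ssh_keys.txt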
I'd like to read a yaml file and find a specific variable, then store that variable's value.
So it's something like:
- variable1:
    variable2:
      - "value"
    variable3:
      - ...
and I want to grab variable2's value, which also happens to be a file path. I'd also like to follow that path, i.e. use the value to locate the file it points to. Is that possible in a shell script without a 3rd-party library/plugin?
If not, what's the best approach to act on a yaml config with this objective?
Any help would be highly appreciated. Cheers!
Without a third-party tool, the only way to parse YAML in shell is to implement your own parsing script. The following blog post is helpful:
https://linuxhint.com/parse-yaml-file-bash/
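For a structure as flat as the one in the question, plain grep and sed may be enough before reaching for a real parser. A minimal sketch, assuming the file is named config.yaml (a hypothetical name) and variable2 holds exactly one quoted list entry:

# Take the line after "variable2:", strip the leading "- " and the quotes
path=$(grep -A1 'variable2:' config.yaml | tail -n1 | sed -E 's/^[[:space:]]*-[[:space:]]*"?([^"]*)"?.*/\1/')
cd "$path"   # follow the extracted file path

This breaks as soon as the YAML grows nesting or multi-line values, which is exactly why the linked post builds a dedicated parsing function.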
Working on a bash script. I'm reading a line from a properties file using grep and cut, and have fetched a value into a variable role_portions, something like this:
role_portions=role_1:10,role_2:25,role_3:75,role_4:50,role_5:75,role_6:25,role_7:50
Now, I get a few roles as a CSV input parameter in my bash script, and I want to change those roles' values to 0.
For example, when I run modify_script.sh role_2,role_4,role_7, after reading the above value from the file, the script should output role_1:10,role_2:0,role_3:75,role_4:0,role_5:75,role_6:25,role_7:0. Can someone help with this?
When the role names contain no characters that are special to sed (like & and /), you can use sed:
for role in role_2 role_4 role_7; do
    # Match the role name at the start of the string or after a comma,
    # and replace whatever follows the colon (up to the next comma) with 0
    role_portions=$(sed -r "s/(^|,)(${role}):[^,]*/\1\2:0/" <<< "${role_portions}")
done
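To wire this up to the CSV parameter from the question, a minimal sketch of modify_script.sh could look like this (the grep/cut read of the properties file is stubbed with the literal value, since those details aren't shown):

#!/bin/bash
# Stub: in the real script this value comes from grep/cut on the properties file
role_portions=role_1:10,role_2:25,role_3:75,role_4:50,role_5:75,role_6:25,role_7:50
# Split the first argument ("role_2,role_4,role_7") on commas into an array
IFS=',' read -r -a roles <<< "$1"
for role in "${roles[@]}"; do
    role_portions=$(sed -r "s/(^|,)(${role}):[^,]*/\1\2:0/" <<< "${role_portions}")
done
echo "${role_portions}"

Running ./modify_script.sh role_2,role_4,role_7 then prints role_1:10,role_2:0,role_3:75,role_4:0,role_5:75,role_6:25,role_7:0.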
Since you are already using grep and cut, you might be able to combine commands (perhaps with awk); a single-pass awk version is sketched below.
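A hedged awk alternative that zeroes all the listed roles in one pass, assuming the same comma-separated key:value layout:

awk -v roles="role_2,role_4,role_7" '
BEGIN {
    n = split(roles, r, ",")
    for (i = 1; i <= n; i++) zero[r[i]] = 1   # set of roles to reset
}
{
    m = split($0, parts, ",")
    for (i = 1; i <= m; i++) {
        split(parts[i], kv, ":")
        if (kv[1] in zero) parts[i] = kv[1] ":0"
        printf "%s%s", parts[i], (i < m ? "," : "\n")
    }
}' <<< "${role_portions}"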
I'm trying to provision a Windows virtual machine in VMware using Salt Cloud wrapped in a bash script so that I can parameterise it, but I'm having a problem with the escaping of the map_data.
My command is:
#!/bin/bash
salt salt-cloud cloud.map_run map_data='{"PROFILE":[{"HOSTNAME":{"folder":"FOLDER","devices":{"network":{"Network adapter 1":{"ip":"MYIP"}}}}}]}'
This works fine; however, I would like HOSTNAME, FOLDER and MYIP to be variables ($hostname, $folder and $ip), and I'm struggling a bit with the escaping so that the variables are expanded and passed correctly to salt.
I have tried putting the variable inline in the command:
salt salt-cloud cloud.map_run map_data='{"PROFILE":[{"$hostname":{"folder":"$folder,"devices":{"network":{"Network adapter 1":{"ip":"$ip"}}}}}]}'
This gets as far as copying the template in the profile before bombing out with a VMware error about the variablised elements being incorrect.
I have also tried to encapsulate the whole map data in a variable, escaping the double quotes and passing that, e.g.:
data="'{\"PROFILE\":[{\"$hostname\":{\"folder\":\"$folder\",\"devices\":{\"network\":{\"Network adapter 1\":{\"ip\":\"$ip\"}}}}}]}'"
This appears to expand correctly if I echo it out but when I add it to my command:
salt salt-cloud cloud.map_run map_data=$data
I get the following error:
Passed invalid arguments to cloud.map_run: map_run() takes at most 1 argument (10 given)
I know that this is probably not strictly Salt's problem but I wondered if anyone out there could give me some pointers on how to proceed?
Did you try concatenating the strings, like this:
salt salt-cloud cloud.map_run map_data='{"PROFILE":[{"'"$hostname"'":{"folder":"'"$folder"'","devices":{"network":{"Network adapter 1":{"ip":"'"$ip"'"}}}}}]}'
I don't use the cloud app myself, so I can't test it, but looking at the first command you give:
salt salt-cloud cloud.map_run map_data='{"PROFILE":[{"$hostname":{"folder":"$folder,"devices":{"network":{"Network adapter 1":{"ip":"$ip"}}}}}]}'
Because the variables are in single quotes, they won't expand. So that won't work.
The second command you gave:
data="'{\"PROFILE\":[{\"$hostname\":{\"folder\":\"$folder\",\"devices\":{\"network\":{\"Network adapter 1\":{\"ip\":\"$ip\"}}}}}]}'"
Looks correct, it will expand the variables, but compared to the first command it will also embed literal single quotes in the string (I think you forgot to remove those?). In addition, because $data is unquoted in map_data=$data, the shell word-splits the expanded value on spaces, which is exactly why map_run reports 10 arguments instead of 1; quoting it as map_data="$data" would keep it as a single argument.
Also in your first command a " seems to be missing after $folder.
Fixing those mistakes gives me the command:
salt salt-cloud cloud.map_run map_data="{\"PROFILE\":[{\"$hostname\":{\"folder\":\"$folder\",\"devices\":{\"network\":{\"Network adapter 1\":{\"ip\":\"$ip\"}}}}}]}"
which I think would work. If you put an echo in front of your command and copy the resulting JSON, you can paste it into a JSON formatter like https://jsonformatter.curiousconcept.com/ and it will tell you whether the JSON is valid. This will help you find things like missing quotes.
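If jq happens to be available on the machine running the command, letting it build the payload sidesteps the hand-escaping entirely; a sketch under that assumption:

# Build the JSON with jq so quoting and escaping are handled for us
map_data=$(jq -cn --arg host "$hostname" --arg folder "$folder" --arg ip "$ip" \
    '{PROFILE: [{($host): {folder: $folder, devices: {network: {"Network adapter 1": {ip: $ip}}}}}]}')
salt salt-cloud cloud.map_run map_data="$map_data"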
I'm dealing with a pipeline of predominantly shell and Perl files, all of which pass parameters (paths) to the next. I decided it would be better to use a single file to store all the paths and have every script read that file. The issue is that I am using awk to grab the values at the beginning of each file, and it's turning out to be a lot of repetition.
My question is: is there a way to store key-value pairs in a file so the shell can natively look up a key and return its value? It needs to be an external file, because the pipeline uses many scripts, and a map defined in one specific file would result in parameters being passed everywhere. Is there some little quirk I don't know of that performs a map lookup on an external file?
You can make a file of env var assignments and source that file as needed, i.e.:
$ cat myEnvFile
path1=/x/y/z
path2=/w/xy
path3=/r/s/t
otherOpt1="-x"
Inside your script you can source it with either . myEnvFile or the more verbose version of the same feature, source myEnvFile (assuming the bash shell), i.e.:
$ cat myScript
#!/bin/bash
. /path/to/myEnvFile
# main logic below
....
# references to defined var
if [[ -d $path2 ]] ; then
    cd "$path2"
else
    echo "no path2=$path2 found, can't continue" 1>&2
    exit 1
fi
Based on how you've described your problem, this should work well and provide a one-stop shop for all of your variable settings.
IHTH
In bash, there's mapfile, but that reads the lines of a file into a numerically-indexed array. To read a whitespace-separated file into an associative array, I would:
declare -A map
while read -r key value; do
    map[$key]=$value
done < filename
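A lookup afterwards is a plain array access; for example, if the file contains a line like path2 /w/xy:

echo "path2 is ${map[path2]}"   # prints: path2 is /w/xy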
However, this sounds like an XY problem. Can you give us an example (in code) of what you're actually doing? When I see long pipelines of grep|awk|sed, there's usually a way to simplify. For example, is passing data by parameters better than passing via stdout|stdin?
In other words, I'm questioning your statement "I decided it would be better..."
I am new to MongoDB and excited about using it at my workplace. However, I have come across a situation where one of our clients has sent the data in a .bson file. I have got everything working on my machine. I want to use the mongoexport facility to export my data in CSV format. When I use the following query
./mongoexport --db <dbname> --collection <collectionname> --csv --fields _id,field1,field2
I am getting the result in the following format:
ObjectID(4f6b42eb5e724242f60002ce),"[ { ""$oid"" : ""4f6b31295e72422cc5000001"" } ]",369008
However, I just want the values of the fields as comma-separated output, like below:
4f6b42eb5e724242f60002ce,4f6b31295e72422cc5000001,369008
My question is: is there anything I can do in mongoexport to ignore certain characters?
Any pointers will be helpful.
No, mongoexport has no features like this. You'll need to use tools like sed and awk to post-process the file, or read the file and munge it in a scripting language like Python.
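As a starting point, a hedged sed sketch that unwraps the ObjectId noise, assuming the export landed in out.csv (a hypothetical name) and the wrappers look exactly like the sample line above:

# Unwrap ObjectID(...) and the quoted [ { ""$oid"" : ""..."" } ] blobs,
# keeping only the hex ids themselves
sed -E -e 's/ObjectID\(([^)]*)\)/\1/g' \
       -e 's/"\[ [{] ""\$oid"" : ""([^"]*)"" [}] \]"/\1/g' out.csv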
You should be able to add the following to your list of arguments:
--csv
You may also want to supply a path:
-o something.csv
...Though I don't think you could do this in 2012 when you first posted your question :-)