gcloud beta logging for trailing logs - shell

I just found out about Google's new "gcloud beta logging" service.
The classic sample they show is something like this:
gcloud beta logging write my-test-log "A simple entry"
But I would like to log every new entry in a specific log file, for example:
tail -F My_Log_File.txt | gcloud beta logging write my-test-log
What is the best practice for this operation?

You can do this:
tail -F My_Log_File.txt | xargs gcloud beta logging write my-test-log
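Note that xargs passes each whitespace-separated word as a separate argument, so multi-word lines may not arrive as a single entry (and quotes in the log can trip it up). If you want exactly one log entry per line, a while-read loop is a safer, if slower, sketch:
tail -F My_Log_File.txt | while IFS= read -r line; do
    gcloud beta logging write my-test-log "$line"
done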
Or you can use a logging agent to watch certain files and log them to the logging service:
https://cloud.google.com/logging/docs/agent/installation
http://docs.fluentd.org/articles/in_tail
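For reference, a minimal in_tail source for the agent might look roughly like this (the path, pos_file, and tag are illustrative assumptions; see the linked docs for where the agent loads its config):
<source>
  type tail
  format none
  path /var/log/My_Log_File.txt
  pos_file /var/lib/google-fluentd/pos/my-test-log.pos
  tag my-test-log
</source>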

Bash Scripting with LastPass CLI

Edit: As of 01/31/2023 the scripts I am using below ARE working. Any patterns of inconsistency I find, I will report here. I would like to leave this open in case others have findings or advice they are interested in sharing in relation to bash scripting, the LastPass CLI, or WSL.
I am looking to use the LastPass CLI to make some changes to Shared Sites within our LastPass enterprise. I was able to write the scripts (fortunately with some help from others on here); however, I am unable to get the commands to work properly within a script.
One of the commands that I WAS having trouble with was lpass share create. This command worked directly from the command line, but I was unable to run it within a script successfully. I have a very simple script, similar to the one below:
#!/bin/bash
folderpath=$1
lpassCreateStoreFolder(){
    lpass share create "$folderpath"
}
lpassLogin(){
    echo 'testPWD' | LPASS_DISABLE_PINENTRY=1 lpass login --trust --force tester@test.com
}
lpassLogin
lpassCreateStoreFolder
I've been invoking my script through the PowerShell command line like so:
wsl "path/to/script" "Shared-00 Test LastPass CLI"
Sometimes this command works within the script and other times it does not. When I tried running the script around mid-December, I had no success at all. The script would run all the way through, and the CLI would even give me a response:
Folder Shared-00 Test LastPass CLI created.
and the LastPass Admin Console logs show a "Create Shared Folder" report. The problem is that when I go to my LastPass Vault, the Shared Folder was rarely, if ever, actually created. Running the command directly from the command line, without a script, worked almost 100% of the time. I initially chalked this up to inconsistencies on their end, but now I am experiencing the same problems with a different command.
Similarly, I have been using the lpass edit command to make edits to sites within our LastPass vault. Once again, I have a relatively simple script to edit the site:
#!/bin/bash
lpassId=$1
lpassSetNotes(){
    printf "Notes:\n What are your notes?\nThese are my notes" | lpass edit --non-interactive --sync=now "$lpassId"
}
lpassLogin(){
    echo 'testPWD' | LPASS_DISABLE_PINENTRY=1 lpass login --trust --force test@test.com
}
lpassLogin
lpassSetNotes
and have been invoking this script through PowerShell like so:
wsl "path/to/script" "000LastPassID000"
Like the lpass share create command, running the script does not produce the desired output. The script runs all the way through and my changes are reflected in the logs, but when I go to the vault the site itself is never changed. The command DOES, however, work when I run it directly from the command line within WSL.
I am relatively new to writing Bash scripts and to the Linux operating system, so I'm not entirely sure whether this is something wrong on my end or just inconsistencies in the vendor's tool. Any help would be appreciated; I know this issue might be hard to replicate without a LastPass account.
Example LastPass CLI calls that work directly from the command line in WSL:
lpass share create "Shared-00 Testing LastPass CLI"
printf "Notes:\n What are your notes?\nThese are my notes" | lpass edit --non-interactive --sync=now "$lpassId"
References
LastPass CLI
CLI Manual
CLI GitHub

Best way to run bash script on Google Cloud to bulk download to Bucket

I am very new to using Google Cloud and cloud servers, and I am stuck on a very basic question.
I would like to bulk download ~60,000 csv.gz files from an internet server (with permission). I compiled a bunch of curl commands, each piping into a gsutil upload to my bucket, into an .sh file that looks like the following:
curl http://internet.address/csvs/file1.csv.gz | gsutil cp - gs://my_bucket/file1.csv.gz
curl http://internet.address/csvs/file2.csv.gz | gsutil cp - gs://my_bucket/file2.csv.gz
...
curl http://internet.address/csvs/file60000.csv.gz | gsutil cp - gs://my_bucket/file60000.csv.gz
However, this will take ~10 days if I run it from my machine, so I'd like to run it from the cloud directly. I do not know the best way to do this. The process is too long to run in Cloud Shell directly, and I'm not sure which other Google Cloud service is the best place to run an .sh script that downloads to a Cloud Storage bucket, or whether this kind of .sh script is even the most efficient way to bulk download files from the internet on Google Cloud.
I've seen some advice to use the SDK, which I've installed on my local machine, but I don't even know where to start with that.
Any help with this is greatly appreciated!
gcloud and Cloud Storage don't offer a way to grab objects from the internet and copy them directly into a bucket without an intermediary (a computer, server, or cloud application).
As for which Cloud service can run a bash script for you, you can use a GCE always-free f1-micro VM instance (one free instance per billing account).
To speed up the uploads to your bucket, you can use GNU parallel to run multiple curl commands at the same time and shorten the total runtime.
To install parallel on Ubuntu/Debian, run this command:
sudo apt-get install parallel
For example, you can create a file called downloads containing all the curl commands you want to parallelize, one per line.
The downloads file:
curl http://internet.address/csvs/file1.csv.gz | gsutil cp - gs://my_bucket/file1.csv.gz
curl http://internet.address/csvs/file2.csv.gz | gsutil cp - gs://my_bucket/file2.csv.gz
curl http://internet.address/csvs/file3.csv.gz | gsutil cp - gs://my_bucket/file3.csv.gz
curl http://internet.address/csvs/file4.csv.gz | gsutil cp - gs://my_bucket/file4.csv.gz
curl http://internet.address/csvs/file5.csv.gz | gsutil cp - gs://my_bucket/file5.csv.gz
curl http://internet.address/csvs/file6.csv.gz | gsutil cp - gs://my_bucket/file6.csv.gz
After that, you simply need to run the following command:
parallel --jobs 2 < downloads
This command will run up to 2 curl commands in parallel until all the commands in the file have been executed.
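Since hand-writing 60,000 such lines isn't practical, a short loop can generate the downloads file, assuming the files really are numbered file1 through file60000:
for i in $(seq 1 60000); do
    echo "curl -s http://internet.address/csvs/file$i.csv.gz | gsutil cp - gs://my_bucket/file$i.csv.gz"
done > downloads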
Another improvement you can apply is to use gsutil mv instead of gsutil cp: mv deletes the local file after a successful upload, which can save space on your hard drive. Note this only helps if you download the files to disk first; the piped commands above never touch the disk.
If you have the MD5 hashes of each CSV file, you could use the Storage Transfer Service, which supports copying a list of files (that must be publicly accessible via HTTP[S] URLs) to your desired GCS bucket. See the Transfer Service docs on URL lists.
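For reference, a URL list for the Transfer Service is a tab-separated file whose first line is the literal header TsvHttpData-1.0, with each following line giving a URL plus, per the docs, the file's size in bytes and its base64-encoded MD5. The sizes and hashes below are made-up placeholders:
TsvHttpData-1.0
http://internet.address/csvs/file1.csv.gz	1357	wHENa08V36iPYAsOa2JAdw==
http://internet.address/csvs/file2.csv.gz	2468	R9acAaveoPd2y8nniLUV7w==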

heroku redis mass insertion

When working on my local machine, I use this command for mass insertion:
cat fixtures.txt | redis-cli --pipe
but Heroku gives only limited access to redis-cli, so I don't know how I should do it.
I tried:
heroku run "cat fixtures.txt | redis-cli --pipe"
resulting:
bash: redis-cli: command not found
I tried:
cat fixtures.txt | heroku redis:cli --pipe
resulting:
▸ No Redis instances found.
Does anybody know how to do this correctly?
I really need to initialize my Redis instance with a lot of data.
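One approach worth sketching (an untested assumption, not something from Heroku's docs): Heroku Redis is reachable from outside the dyno, so you could point your local redis-cli at the add-on's connection URL. This assumes a redis-cli new enough to support -u (4.0+), plus TLS flags if the URL uses rediss://:
# your-app is a placeholder for your actual Heroku app name
REDIS_URL=$(heroku config:get REDIS_URL -a your-app)
cat fixtures.txt | redis-cli -u "$REDIS_URL" --pipe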

TeamCity: Disable build trigger for all TeamCity projects

Is there any way to disable the build triggers for all TeamCity projects by running a script?
I have a scenario where I need to disable all build triggers to prevent builds from running. This is because I sometimes need to perform upgrade work on the build agent machines that takes more than a day.
I do not wish to manually click the Disable button for every build trigger in every TeamCity project. Is there a way to automate this process?
Thanks in advance.
Use the TeamCity REST API.
Given that your TeamCity is deployed at http://dummyhost.com and you have enabled guest access with the system admin role (otherwise switch from guestAuth to httpAuth in the URLs and supply a username and password with each request; details are in the documentation), you can do the following:
Get all build configurations:
GET http://dummyhost.com/guestAuth/app/rest/buildTypes/
For each build configuration, get its triggers:
GET http://dummyhost.com/guestAuth/app/rest/buildTypes/id:***YOUR_BUILD_CONFIGID***/triggers/
Disable each trigger:
PUT http://dummyhost.com/guestAuth/app/rest/buildTypes/id:***YOUR_BUILD_CONFIGID***/triggers/***YOUR_TRIGGER_ID***/disabled
See the full documentation here.
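Putting those three calls together, a rough bash sketch (same dummyhost.com and guest auth as above; scraping the XML responses with grep is a shortcut, and a real script might use a proper XML parser):
HOST=http://dummyhost.com
for bt in $(curl -s "$HOST/guestAuth/app/rest/buildTypes/" | grep -Po '(?<=buildType id=")[^"]*'); do
    for trig in $(curl -s "$HOST/guestAuth/app/rest/buildTypes/id:$bt/triggers/" | grep -Po '(?<=trigger id=")[^"]*'); do
        # Disable each trigger by PUTting "true" to its /disabled endpoint
        curl -s --request PUT "$HOST/guestAuth/app/rest/buildTypes/id:$bt/triggers/$trig/disabled" \
            --header "Content-Type: text/plain" --data "true"
    done
done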
You can pause the build queue. See this video. This way you needn't touch the build configurations at all; you're just bringing TeamCity to a halt.
For agent-specific upgrades, it's best to disable only the agent you're working on. See here.
Neither of these is "by running a script" as you asked, but I take it you were only asking for a scripted solution to avoid a lot of GUI clicking.
Another solution might be to simply disable the agent, so no more builds will run.
Here is a bash script that bulk-pauses all (not yet paused) build configurations matching a project name pattern via the TeamCity REST API:
TEAMCITY_HOST=http://teamcity.company.com
CREDS="-u domain\user:password"
curl $CREDS --request GET "$TEAMCITY_HOST/app/rest/buildTypes/" \
| sed -r "s/(<buildType)/\n\\1/g" | grep "Project Name Regex" \
| grep -v 'paused="true"' | grep -Po '(?<=buildType id=")[^"]*' \
| xargs -I {} curl -v $CREDS --request PUT "$TEAMCITY_HOST/app/rest/buildTypes/id:{}/paused" --header "Content-Type: text/plain" --data "true"
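To resume the builds later, the same pipeline should work in reverse: filter with grep 'paused="true"' instead of grep -v, and send --data "false" in the final PUT.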

Run Xcode Integration from command line

I have set up OS X Server and a bot which does the build/test process. I need to run this bot from the command line.
Is it possible to run an integration (CI on OS X Server) from the command line?
It's quite a late response; nevertheless, I hope it helps.
As you might know, Xcode Server exposes a web API.
So in order to launch an integration, just issue the following, replacing YOUR_BOT_NAME with the actual bot name (the first command looks up the bot's ID, the second starts the integration):
BOT_ID=$(curl -sk https://localhost:20343/api/bots | jq '.results[] | select(.name == "YOUR_BOT_NAME") | ._id' | tr -d '"')
curl -sk -X POST -d '{ shouldClean: false }' "https://localhost:20343/api/bots/$BOT_ID/integrations"
Note this command utilises the jq command-line JSON processor, which is available via Homebrew:
brew install jq
Builds can be done with xcodebuild. For tests, I don't know.
