I wrote code using Google Cloud Functions and Google Dataflow that reads from a CSV file and then inserts into BigQuery.
I used config files to automate this.
Now I want to create a shell script that automatically creates the Dataflow template (using mvn) and deploys the Cloud Function.
However, when I run the shell script, the template is created successfully but the Cloud Function fails to deploy with:
google-cloud-sdk/bin/gcloud: line 129: exec: python: not found
I need a way to get gcloud working from inside my bash shell script.
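For reference, a minimal sketch of a common workaround for this error, assuming python3 is installed on the machine (gcloud honors the CLOUDSDK_PYTHON environment variable; the SDK install location below is an assumption):

#!/bin/bash
# gcloud's bin/gcloud wrapper execs "python"; point it at an interpreter
# that actually exists on this machine.
export CLOUDSDK_PYTHON="$(command -v python3)"   # assumes python3 is installed
export PATH="$PATH:$HOME/google-cloud-sdk/bin"   # assumed SDK install location

gcloud --version   # should now run without "exec: python: not found"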
Related
I'm using the Run an AWS CLI Script step with a referenced package. But that step does not allow adding the .NET Configuration Variables feature, whereas the plain Run a Script step does. I could be missing something, but is it possible to somehow enable that feature for Run an AWS CLI Script?
https://github.com/aws-samples/aws-step-functions-kendra-web-crawler-search-engine
I was referring to the link above and implementing web crawling on a particular website.
I deployed the stack using the command deploy --profile <YOUR_AWS_PROFILE> --with-kendra,
but when I run
crawl --profile <YOUR_AWS_PROFILE> --name lambda-docs --base-url https://docs.aws.amazon.com/ --start-paths /lambda --keywords lambda/latest/dg
it gives me the error:
'/crawl' is not recognized as an internal or external command,
operable program or batch file.
The link says: "When the infrastructure has been deployed, you can trigger a run of the crawler with the included utility script."
Is there something I need to install to get the crawl command?
That should be ./crawl according to the README of the project.
Your error message also sounds like it's coming from Windows, but the crawl script is written in Bash, so you may run into issues unless you switch to Linux/macOS/BSD (or WSL).
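In other words, from a Bash shell (Linux/macOS or WSL), invoke the script with its relative path:

./crawl --profile <YOUR_AWS_PROFILE> --name lambda-docs --base-url https://docs.aws.amazon.com/ --start-paths /lambda --keywords lambda/latest/dg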
I am running a simple YAML pipeline in Azure DevOps that runs a command-line batch script. The script calls a PowerShell script that converts ctest output to JUnit formatting for publishing test results through Azure DevOps. When I run the build pipeline, the task fails with the following error:
'build\collect_results.cmd' is not recognized as an internal or external command, operable program or batch file.
My first hunch is that it has something to do with the script not sitting in the same folder as the .vsts.yml file, since the preceding script in the pipeline works with its call to windows_c.cmd in the jenkins folder:
call jenkins\windows_c.cmd
Has anyone else seen this error? What is the root cause of it? Is it simply a bug in Azure DevOps?
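For context, a rough sketch of how the two steps might look in the YAML (the step layout here is an assumption based on the description above):

steps:
- script: call jenkins\windows_c.cmd       # this earlier step succeeds
- script: call build\collect_results.cmd   # this step fails with the error above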
I am using AWS CodeBuild along with Terraform for automated deployment of a Lambda-based service. I have a very simple buildspec.yml that accomplishes the following:
Get dependencies
Run Tests
Get AWS credentials and save to file (detailed below)
Source the creds file
Run Terraform
The step "source the creds file" is where I am having my difficulty. I have a simply bash one-liner that grabs the AWS container creds off of curl 169.254.170.2$AWS_CONTAINER_CREDENTIALS_RELATIVE_URI and then saves them to a file in the following format:
export AWS_ACCESS_KEY_ID=SOMEACCESSKEY
export AWS_SECRET_ACCESS_KEY=MYSECRETKEY
export AWS_SESSION_TOKEN=MYSESSIONTOKEN
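For reference, one way such a one-liner might look, assuming jq is available in the build image (the credentials endpoint returns JSON with AccessKeyId, SecretAccessKey, and Token fields; the output path is a placeholder):

curl -s 169.254.170.2$AWS_CONTAINER_CREDENTIALS_RELATIVE_URI \
  | jq -r '"export AWS_ACCESS_KEY_ID=\(.AccessKeyId)\nexport AWS_SECRET_ACCESS_KEY=\(.SecretAccessKey)\nexport AWS_SESSION_TOKEN=\(.Token)"' \
  > /path/to/creds_file.txt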
Of course, the obvious step is to simply source this file so that these variables can be added to my environment for Terraform to use. However, when I do source /path/to/creds_file.txt, CodeBuild returns:
[Container] 2017/06/28 18:28:26 Running command source /path/to/creds_file.txt
/codebuild/output/tmp/script.sh: 4: /codebuild/output/tmp/script.sh: source: not found
I have tried to install source through apt, but then I get an error saying that source cannot be found (yes, I've run apt update etc.). I am using a standard Ubuntu image with the Python 2.7 environment for CodeBuild. What can I do to either get Terraform working credentials or to source this credentials file in CodeBuild?
Thanks!
Try using . instead of source. source is not POSIX compliant. ss64.com/bash/source.html
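In other words, replace the failing command with:

. /path/to/creds_file.txt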
CodeBuild now supports bash as your default shell. You just need to specify it in your buildspec.yml.
env:
  shell: bash
Reference: https://docs.aws.amazon.com/codebuild/latest/userguide/build-spec-ref.html#build-spec-ref-syntax
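With that in place, the source command from the question should work directly in the buildspec commands. A minimal sketch (the creds path and the Terraform step are assumptions carried over from the question):

version: 0.2
env:
  shell: bash
phases:
  build:
    commands:
      - source /path/to/creds_file.txt
      - terraform apply -auto-approve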
The AWS CodeBuild images ship with a POSIX compliant shell. You can see what's inside the images here: https://github.com/aws/aws-codebuild-docker-images.
If you're using specific shell features (such as source), it is best to wrap your commands in a script file with a shebang specifying the shell you'd like the commands to execute with, and then execute this script from buildspec.yml.
build-script.sh
#!/bin/bash
<commands>
...
buildspec.yml (snippet)
build:
  commands:
    - path/to/script/build-script.sh
I had a similar issue. I solved it by calling the script directly via /bin/bash <script>.sh
I don't have enough reputation to comment, so here goes an extension of Jeffrey's answer, which is spot on.
In case your filename starts with a dot (.), the following will fail:
. .filename
You will need to qualify the filename with a directory prefix, like:
. ./.filename
I am executing JMeter on AWS EC2, and the result is returned as a CSV file.
I need to upload this CSV file to an AWS S3 bucket.
Since I am creating a number of EC2 instances dynamically and executing JMeter on those instances, it's better to automate this process.
So I want to write a shell script (as user data) that executes JMeter and uploads the resulting CSV file to the S3 bucket from each EC2 instance.
How can I write a script for this?
Consider using command line s3 clients.
S3 command line tools
Also go through some of these sites:
Shell Script To Transfer Files From Amazon S3 Bucket.
aws command line tools
python script to upload file to s3
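For example, a minimal user-data sketch along those lines, assuming the AWS CLI is installed on the AMI and the instance profile grants s3:PutObject on the bucket (the bucket name, JMeter paths, and test plan below are placeholders):

#!/bin/bash
# Run the JMeter test plan in non-GUI mode and write results to a CSV file.
/opt/jmeter/bin/jmeter -n -t /opt/tests/test-plan.jmx -l /tmp/results.csv

# Upload the results, naming the object after this instance's ID.
INSTANCE_ID=$(curl -s http://169.254.169.254/latest/meta-data/instance-id)
aws s3 cp /tmp/results.csv "s3://my-results-bucket/results-${INSTANCE_ID}.csv"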
You can use this library for managing objects on AWS S3 from shell scripts.
Universal Docs Manager is a pure shell-script-based object manager that currently supports Local Disk, MySQL, and AWS S3.