As my school project, I've created an AWS EC2 instance and am trying to install various "Dev Tools" and software per the AWS doc: http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/compile-software.html
I'm trying to get the Mahout tarball using get_reference_source -p mahout while connected to the AWS instance via PuTTY.
But this command fails with the error "get_reference_source: command not found". I've been searching since this morning and have found nothing.
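For reference, a hedged way to check whether the command exists on the instance at all (this assumes an Amazon Linux AMI, which normally ships the script):
# Is the script present on this AMI?
which get_reference_source
# If not, ask yum which package would provide it:
sudo yum provides "*/get_reference_source"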
I am working on a new web app in the Azure cloud.
The challenge is that I am working with a new Python module that I don't know very well, PySpice. PySpice is an interface to the program Ngspice.
On my Windows PC it works fine, but not in the cloud. So I would like to be able to debug without pushing and then waiting 25 minutes for each build.
Right now I am using SSH to connect to the web app. Then I can create a simple Python script to see if I can get the connection between PySpice and Ngspice to work. The challenge I have is that when I run Python over SSH, it uses a different environment than the web app, i.e. the modules from requirements.txt are not available. So how can I change environments to be able to debug?
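One thing I am experimenting with, assuming the app was built by Oryx (Azure's default build system for Python apps), is to activate the virtual environment the build created; it is conventionally named antenv, but the exact path varies per build, and debug_pyspice.py below is just a placeholder for my test script:
# Find the antenv virtual environment the build produced:
find / -name activate -path "*antenv*" 2>/dev/null
# Activate it so the requirements.txt modules are importable, then run the test:
source <path-to-activate-from-above>
python debug_pyspice.py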
I have created an Azure App Service with Python version 3.8, but when I check the version in Azure SSH it shows me a different version.
To install the latest version in the Azure SSH session, run the below commands:
apt update
apt install python3.9
python3.9 --version
Run the below command to change the Python version of the Azure App Service in Azure Cloud Shell (Bash):
az webapp config set --resource-group MyRGName --name WebAppName --linux-fx-version "PYTHON|3.9"
To check the updated version, run the below command:
az webapp config show --resource-group MyRGName --name Python4Nov --query linuxFxVersion
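If the change took effect, that query should return something like:
"PYTHON|3.9"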
Debug an Azure Web App using SSH
To remote debug an Azure Linux App Service, we need to open a TCP tunnel from the development machine to the App Service.
Configure for SSH and Remote Debugging
In the Azure CLI, run the below command:
az webapp create-remote-connection --resource-group MyRG -n WebAppName
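The command keeps a tunnel open and prints a local port; you can then SSH through it, for example (use whatever port the command printed; for the built-in App Service container image the documented password is Docker!):
ssh root@127.0.0.1 -p <local-port>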
References taken from the Microsoft documentation.
I am running TeamCity on a Windows VM and have installed the AWS CLI.
I am trying to pull a zip from AWS S3, but I get this error:
"aws : The term 'aws' is not recognized as the name of a cmdlet, function, script file"
When I run the command manually in both cmd and PowerShell, it works just fine.
I have also checked that the AWS CLI path is in both the user and system PATH variables.
Any ideas?
I figured it out.
The build agent was not running as a service; it was running under a user account that didn't have the correct permissions. I installed a new agent and ran it as a Windows service under a service account.
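If someone hits the same symptom, a hedged diagnostic is a throwaway PowerShell build step that shows what the agent process actually sees (the step itself is illustrative):
# Can the agent's environment resolve the aws executable?
where.exe aws
# What PATH does the agent process actually have?
echo $env:PATH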
I hope this helps someone who faces this frustrating issue in the future.
I am trying to install an ArangoDB cluster on Google Cloud. I run the install script, and every attempt runs up against an SSH issue with the server. The script uses an SSH user "core" and asks for a password for this user so it can SSH onto the server to set it up. I have no idea what that password could be and have searched everywhere.
I am running the script from my local machine using the gcloud SDK.
Any thoughts welcome!
I am creating an EC2 instance through knife. I gave the following command to create it:
knife ec2 server create -r "role[webserver]" -I ami-b84e04ea --flavor t1.micro --region ap-southeast-1 -G default -x ubuntu -N server01 -S ec2keypair
but I am getting the error Fog::Compute::AWS::Error: InvalidBlockDeviceMapping => iops must be specified with the volumeType of device '/dev/sda1'. I am unable to solve this issue; any help will be appreciated.
It's possible that the AMI you are trying to launch requires an EBS volume. With EBS you can set the IOPS value, which seems to be unset and is causing the issue.
Having a look at the documentation, it seems you might need to add --ebs-size SIZE as an option, e.g. --ebs-size 10.
I got that from the knife documentation: http://docs.opscode.com/plugin_knife_ec2.html
Also, taking a look at the source code for the knife ec2 plugin, it looks like you can add --ebs-optimized to enable optimized EBS I/O.
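Putting the size flag together with the command from the question gives a hedged sketch like this (10 GB is just an example value):
knife ec2 server create -r "role[webserver]" -I ami-b84e04ea --flavor t1.micro --region ap-southeast-1 -G default -x ubuntu -N server01 -S ec2keypair --ebs-size 10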
I am an AWS newbie, and I'm trying to run Hadoop on EC2 via Cloudera's AMI. I installed the AMI, downloaded the cloudera-hadoop-for-ec2-tools, and now I'm trying to configure
hadoop-ec2-env.sh
It is asking for the following:
AWS_ACCOUNT_ID
AWS_ACCESS_KEY_ID
AWS_SECRET_ACCESS_KEY
EC2_KEYDIR
PRIVATE_KEY_PATH
When running:
./hadoop-ec2 launch-cluster my-cluster 10
I'm getting:
AWS was not able to validate the provided access credentials
Firstly, I have the first 3 attributes for my own account. This is a corporate account, and I received an email with the access key ID and secret access key. Is it possible that my account doesn't have the proper permissions to do what is needed here? Exactly why does this script need my credentials? What does it need to do?
Secondly, where is the EC2 key dir? I've uploaded the key.pem file that Amazon created for me, hard-coded its path into PRIVATE_KEY_PATH, and ran chmod 400 on the .pem file. Is that the correct key that this script needs?
Any help is appreciated.
Sam
The Cloudera EC2 tools rely heavily on the Amazon EC2 API tools. Therefore, you must do the following:
1) Download the Amazon EC2 API tools from http://aws.amazon.com/developertools/351
2) Download the Cloudera EC2 tools from http://cloudera-packages.s3.amazonaws.com/cloudera-for-hadoop-on-ec2-0.3.0.tar.gz
3) Set the following env variables (I am only giving Unix-based examples):
export EC2_HOME=<path-to-tools-from-step-1>
export PATH=$PATH:$EC2_HOME/bin
export PATH=$PATH:<path-to-cloudera-ec2-tools>/bin
export EC2_PRIVATE_KEY=<path-to-private-key.pem>
export EC2_CERT=<path-to-cert.pem>
4) In cloudera-ec2-tools/bin, set the following variables:
AWS_ACCOUNT_ID=<amazon-acct-id>
AWS_ACCESS_KEY_ID=<amazon-access-key>
AWS_SECRET_ACCESS_KEY=<amazon-secret-key>
EC2_KEYDIR=<dir-where-the-ec2-private-key-and-ec2-cert-are>
KEY_NAME=<name-of-ec2-private-key>
And then run
$ hadoop-ec2 launch-cluster my-hadoop-cluster 10
which will create a Hadoop cluster called "my-hadoop-cluster" with 10 nodes spread across EC2 machines.
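If the launch still fails with the credential error from the question, a hedged sanity check is to exercise the underlying EC2 API tools directly (this assumes the step 3 variables are set; these older tools authenticate with EC2_PRIVATE_KEY and EC2_CERT):
# Should print the EC2 regions if the X.509 credentials are accepted:
ec2-describe-regions
If that fails too, the problem is in the credential setup rather than in the Cloudera scripts.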