ssh-keyscan fails on github-actions - amazon-ec2

I'm trying to automate deploys to an EC2 instance with GitHub Actions, but ssh-keyscan seems to fail for no apparent reason. On my local machine it works fine.
Here is my workflow file:
name: Deploy
on:
  push:
    branches:
      - 'main'
env:
  SERVER_HOST: x.x.x.xx
  SERVER_USER: username
  SERVER_PATH: ~/folder-name/
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Install SSH Key
        uses: shimataro/ssh-key-action@v2.3.1
        with:
          key: "${{ secrets.SSH_PRIVATE_KEY }}"
          known_hosts: "just-a-placeholder-so-we-dont-get-errors"
      - name: Generate auth hosts
        run: ssh-keyscan -H ${{ env.SERVER_HOST }} >> ~/.ssh/known_hosts
      # Deploy
      - run: rsync -rv --delete . ${{ env.SERVER_USER }}@${{ env.SERVER_HOST }}:${{ env.SERVER_PATH }}
Notes:
secrets.SSH_PRIVATE_KEY contains my private OpenSSH key, generated with ssh-keygen -t rsa -b 4096 -C "dummyemail@host.com", where dummyemail@host.com is the actual email of the GitHub account that triggers the workflow.
Yes, I have added the .pub key to ~/.ssh/authorized_keys on my server machine.

The problem was that I had mistakenly added inbound rules only for the IP addresses listed here, so the GitHub runner's IP was blocked by the security group and ssh-keyscan never got a response.
The solution was to add an inbound SSH rule for 0.0.0.0/0 and to use the private key created along with the EC2 instance.
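If you script that change, a minimal AWS CLI sketch (the security group ID below is a placeholder for your own):

# allow inbound SSH from any IPv4 address; replace the group ID with yours
aws ec2 authorize-security-group-ingress \
  --group-id sg-0123456789abcdef0 \
  --protocol tcp \
  --port 22 \
  --cidr 0.0.0.0/0

Keep in mind that 0.0.0.0/0 opens port 22 to the whole internet; the stricter alternative is allowing only GitHub's published runner IP ranges, but those change frequently.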

Related

CI/CD using Github Actions and AWS EC2 instance

I have a dockerised FastAPI app which depends on MySQL and Redis, all configured in docker-compose.yml. I want to implement CI/CD using GitHub Actions and an AWS EC2 instance. My EC2 instance has docker and docker-compose installed. Here are my questions:
What do I do to run the tests that depend on the test DB? (See the service-container sketch after the workflow below.)
How do I implement CD from GitHub Actions to the AWS EC2 instance?
I might not be clear, so please ask questions for clarification. Thank you.
name: backend-api
on:
  push:
    branches: ["main"]
  pull_request:
    branches: ["main"]
jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Set up Python 3.10
        uses: actions/setup-python@v3
        with:
          python-version: "3.10"
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install flake8 pytest
          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
      - name: Lint with flake8
        run: |
          # stop the build if there are Python syntax errors or undefined names
          flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
          # exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
          flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics
      - name: Create .env file for configuration settings.
        uses: SpicyPizza/create-envfile@v1.3
        with:
          envkey_APP_ENV: ${{secrets.APP_ENV}}
          envkey_APP_HOST: ${{secrets.APP_HOST}}
          envkey_MYSQL_USER: ${{secrets.MYSQL_USER}}
          envkey_PROD_BASE_URL: ${{secrets.PROD_BASE_URL}}
          envkey_DEV_BASE_URL: ${{secrets.DEV_BASE_URL}}
          envkey_MYSQL_ROOT_PASSWORD: ${{secrets.MYSQL_ROOT_PASSWORD}}
          envkey_MYSQL_DATABASE: ${{secrets.MYSQL_DATABASE}}
          envkey_PRODUCTION_DB_URI: ${{secrets.PRODUCTION_DB_URI}}
          envkey_TEST_DB_URI: ${{secrets.TEST_DB_URI}}
          envkey_BASE_URL: ${{secrets.BASE_URL}}
          envkey_WALLET_PROVIDER_ACCESS_TOKEN: ${{secrets.WALLET_PROVIDER_ACCESS_TOKEN}}
          envkey_S3_BUCKET_NAME: ${{secrets.S3_BUCKET_NAME}}
          envkey_S3_ACCESS_SECRET: ${{secrets.S3_ACCESS_SECRET}}
          envkey_S3_ACCESS_KEY: ${{secrets.S3_ACCESS_KEY}}
          envkey_S3_BUCKET_REGION: ${{secrets.S3_BUCKET_REGION}}
          envkey_JWT_SECRET_KEY: ${{secrets.JWT_SECRET_KEY}}
          envkey_ETHERSCAN_API_URL: ${{secrets.ETHERSCAN_API_URL}}
          envkey_BLOCKCHAIN_API_URL: ${{secrets.BLOCKCHAIN_API_URL}}
          envkey_WALLET_PROVIDER_BASE_URL: ${{secrets.WALLET_PROVIDER_BASE_URL}}
          envkey_STRATEGY_PROVIDER_BASE_URL: ${{secrets.STRATEGY_PROVIDER_BASE_URL}}
          envkey_INDEX_PROVIDER_BASE_URL: ${{secrets.INDEX_PROVIDER_BASE_URL}}
      - name: Running Tests with pytest
        run: |
          pytest
  Deploy:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Git pull
        env:
          AWS_EC2_PEM: ${{ secrets.AWS_EC2_PEM }}
          AWS_EC2_PUBLIC_IP: ${{ secrets.AWS_EC2_PUBLIC_IP }}
          AWS_EC2_USERNAME: ${{ secrets.AWS_EC2_USERNAME }}
        run: |
          pwd
          echo "$AWS_EC2_PEM" > private_key && chmod 600 private_key
          # feed the commands to ssh via a heredoc so they run on the EC2 host;
          # a bare `ssh host` followed by more lines runs those lines on the runner
          ssh -o StrictHostKeyChecking=no -i private_key ${AWS_EC2_USERNAME}@${AWS_EC2_PUBLIC_IP} << 'EOF'
          git checkout main &&
          git fetch --all &&
          git reset --hard origin/main &&
          git pull origin main &&
          touch .env
          docker-compose up -d --build
          EOF
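Regarding the first question (running tests that depend on a test DB): one common approach, sketched here on the assumption that the tests read TEST_DB_URI, is to start MySQL and Redis as service containers on the build job so pytest has real backing services to talk to:

build:
  runs-on: ubuntu-latest
  services:
    mysql:
      image: mysql:8.0
      env:
        MYSQL_ROOT_PASSWORD: testpassword # throwaway, CI-only credentials
        MYSQL_DATABASE: test_db
      ports:
        - 3306:3306
      # wait until the server actually accepts connections
      options: >-
        --health-cmd="mysqladmin ping"
        --health-interval=10s
        --health-timeout=5s
        --health-retries=5
    redis:
      image: redis:7
      ports:
        - 6379:6379

TEST_DB_URI can then point at 127.0.0.1:3306 instead of at a container from docker-compose.yml.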

Confused with github action and ec2

This is my GitHub Actions workflow:
---
name: build and push image to aws ecr
on:
  push:
    branches: [ main ]
jobs:
  build-and-push:
    name: Build and push to ecr
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      ...
  deploy:
    needs: build-and-push
    name: deploy
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Login ec2
        env:
          PRIVATE_KEY: ${{ secrets.EC2_SSH_KEY }}
          HOSTNAME: ${{ secrets.HOST_DNS }}
          USER_NAME: ${{ secrets.USERNAME }}
        run: |
          echo "$PRIVATE_KEY" > private_key && chmod 600 private_key
          ssh -o StrictHostKeyChecking=no -i private_key ${USER_NAME}@${HOSTNAME}
          ls
          touch helloworld.txt
I wonder whether I'm connected to the EC2 instance or not.
This is my result after the action runs; it lists all the files of my project. I think I didn't SSH to EC2 successfully, because I've only just created the EC2 instance and it should be empty:
Run echo "$PRIVATE_KEY" > private_key && chmod 600 private_key
Pseudo-terminal will not be allocated because stdin is not a terminal.
Warning: Permanently added '***,18.181.220.91' (ECDSA) to the list of known hosts.
Dockerfile
README.md
docker-compose.yml
nest-cli.json
package-lock.json
package.json
private_key
src
test
tsconfig.build.json
tsconfig.json
This is my reference.
This is my command to SSH into the EC2 instance from my client: ssh -i "home-key.pem" ec2-user@ec2-18-181-220-91.ap-northeast-1.compute.amazonaws.com
If I didn't connect to the EC2 instance, how do I do it with GitHub Actions?
Thanks for your attention.
I wonder whether I'm connected to the EC2 instance or not.
In your SSH commands, add a hostname -a: it will display the name of the machine you are on.
That being said, you could use an action such as JimCronqvist/action-ssh to execute those same commands on the remote server:
- name: Execute SSH commands on remote server
  uses: JimCronqvist/action-ssh@master
  with:
    hosts: '${{ secrets.USERNAME }}@${{ secrets.HOST_DNS }}'
    privateKey: ${{ secrets.EC2_SSH_KEY }}
    debug: false
    command: |
      ls -lah
      hostname -a
      whoami
      touch helloworld.txt
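Alternatively, with plain ssh, the commands have to be passed to ssh itself so that they execute on the EC2 host; as originally written, everything after the ssh line (ls, touch helloworld.txt) ran on the GitHub runner, which is why the run listed your project files. A minimal sketch reusing the same env variables:

run: |
  echo "$PRIVATE_KEY" > private_key && chmod 600 private_key
  # everything inside the quotes runs on the remote machine
  ssh -o StrictHostKeyChecking=no -i private_key ${USER_NAME}@${HOSTNAME} '
    hostname
    whoami
    ls -lah
    touch helloworld.txt
  '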

Using github secrets in another non-workflow yaml file

Is it possible to access a GitHub secret in a YAML file that's not a workflow or an action YAML file?
For example, I've saved the environment secret INFURA_RINKEBY_WSS in GitHub, and I attempt to access it in the following YAML config file for my program:
type: EndpointList
endpoints:
  - type: RPCEndpoint
    chain_id: 1
    network: rinkeby
    provider: Infura
    url: ${{ secrets.INFURA_RINKEBY_WSS}}
    explorer: https://etherscan.io
However, the INFURA_RINKEBY_WSS value I've set in GitHub never reaches my YAML config file.
The following is my main.yaml GitHub workflow:
name: Report to eth/usd on rinkeby w/ pytelliot
on: push
jobs:
  build:
    runs-on: ubuntu-latest
    strategy:
      fail-fast: false
      matrix:
        python-version: ["3.9"]
    steps:
      - uses: actions/checkout@v2
      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          python -m pip install telliot-feed-examples
          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi
      - name: Move pre-made pytelliot configs to home directory
        run: |
          cp -r ./config ~/
      - name: report :)
        run: telliot-examples --legacy-id 1 report --submit-once
        env:
          PK: ${{ secrets.PK }}
          INFURA_RINKEBY_WSS: ${{ secrets.INFURA_RINKEBY_WSS }}
Thanks!
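One way to get a secret into a non-workflow file, sketched here on the assumption that the config can carry a shell-style placeholder such as $INFURA_RINKEBY_WSS and that envsubst is available on the runner, is to render the file in a step before the reporter runs (endpoints.yaml.tpl is a hypothetical template name):

- name: Render endpoint config with secrets
  env:
    INFURA_RINKEBY_WSS: ${{ secrets.INFURA_RINKEBY_WSS }}
  run: |
    # replaces $INFURA_RINKEBY_WSS in the template with the env value
    envsubst < ./config/endpoints.yaml.tpl > ./config/endpoints.yaml

The ${{ secrets.* }} syntax is only expanded inside workflow and action files, which is why the config file above never sees the value on its own.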

appleboy/ssh-action No such device or address error

I'm trying to create an auto-deploy file for a Laravel project.
Here is the code of my yml file:
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: Deployment
        uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.SSH_HOST }}
          key: ${{ secrets.SSH_PRIVATE_KEY }}
          username: ${{ secrets.SSH_USERNAME }}
          script: |
            cd path-to-my-directory
            git pull
            php artisan migrate
git pull returns a "could not read Username for 'https://github.com': No such device or address" error. But I'm sure that the username, key, and host are set properly.
I can't solve the problem or find a similar issue with a solution.
If you set an SSH key while your URL is an HTTPS one, said key would not be used. At all.
Instead, Git, in the GitHub Action, would try to get the credentials (username/password) for the private repository through HTTPS.
Plus, an SSH URL would use the username git anyway, alongside the private SSH key, which allows the remote server to authenticate the actual user.
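Concretely, switching the remote on the server to its SSH form makes git pull use the deployed key; a minimal sketch for the script block (you/your-repo is a placeholder):

cd path-to-my-directory
# point origin at the SSH URL so the SSH key is actually used
git remote set-url origin git@github.com:you/your-repo.git
git pull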
You can see here an example of a GitHub Action actually using an SSH URL:
on: [push]
jobs:
  try-ssh-commands:
    runs-on: ubuntu-latest
    name: SSH MY_TEST
    steps:
      - name: Checkout
        uses: actions/checkout@v2
      - name: test_ssh
        uses: ./
        with:
          ssh_key: ${{secrets.SSH_PRIVATE_KEY}}
          known_hosts: ${{secrets.SSH_KNOWN_HOSTS}}
If you are using HTTPS however, and as noted in the comments by the OP:
you need a PAT (Personal Access Token)
you need to store it: git config credential.helper store + git pull
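A sketch of that HTTPS route inside the deploy script, assuming the token is exposed to the step as a GH_PAT environment variable (both the variable name and the username below are placeholders):

# cache the PAT so the next git pull over HTTPS can authenticate
git config --global credential.helper store
echo "https://your-username:${GH_PAT}@github.com" > ~/.git-credentials
git pull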

Github Action - AWS CodeDeploy Succeeded but in EC2 created the modified file

I'm setting up GitHub Actions with AWS EC2 and CodeDeploy. All the configuration seems to be working well, except for one thing that I can't understand or solve. If someone has experience with this, please help me.
I'm using:
EC2 Rhel 8
Node project (VueJs framework)
This is my cicd.yml file
on:
  push:
    branches:
      - paymentV2
name: Deploy VueJS to Amazon ECS
#on: [push]
jobs:
  deploy:
    name: Deploy
    runs-on: ubuntu-latest
    strategy:
      matrix:
        node-version: ['12.x']
        appname: ['staging-aws-codedeploy']
        deploy-group: ['staging']
        repo: ['project/MyProject']
    steps:
      - uses: actions/checkout@v2
      # Configure AWS credentials
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: ap-southeast-1
      # Deploy to AWS
      - name: Deploy to AWS
        run: |
          aws deploy create-deployment \
            --application-name ${{ matrix.appname }} \
            --deployment-config-name CodeDeployDefault.OneAtATime \
            --deployment-group-name ${{ matrix.deploy-group }} \
            --file-exists-behavior OVERWRITE \
            --description "GitHub Deployment for the ${{ matrix.appname }}-${{ github.sha }}" \
            --github-location repository=${{ matrix.repo }},commitId=${{ github.sha }}
This is my appspec.yml
version: 0.0
os: linux
files:
  - source: /
    destination: /var/www/MyProject
hooks:
  ApplicationStart:
    - location: scripts/application_start.sh
      timeout: 300
      runas: root

And scripts/application_start.sh contains:

cd /var/www/MyProject
npm run build
This is the log from GitHub Actions & AWS CodeDeploy.
I've tried editing the Vision.vue file and created a pull request on GitHub. Everything was working well. But the one thing I'm confused about is why the modified file still exists. Please refer to the image below.
=> What I expected is that the modified file shouldn't have existed; I thought GitHub would automatically run git pull to get all the new source code.
I've done some more research and found --file-exists-behavior with OVERWRITE, but it does not seem to work the way I want:
https://docs.aws.amazon.com/cli/latest/reference/deploy/create-deployment.html
==> Once again, I have no experience with CD using GitHub Actions & CodeDeploy. Please advise me on the right approach. Thank you so much.
After a period of learning, I understood that appspec and buildspec.yaml are just the way to build and deploy; for pulling the code, I used a webhook (AWS CodeBuild, AWS CodePipeline, GitHub webhook) or a schedule (crontab). I've decided to use crontab for my project, scheduling a pull of the new source code every hour. Hope this sharing can help anyone. Thanks.
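For reference, a minimal sketch of such a crontab entry (edit with crontab -e as the deploy user; the project path and branch mirror the appspec and workflow above, and the log path is a placeholder):

# pull and rebuild at the top of every hour
0 * * * * cd /var/www/MyProject && git pull origin paymentV2 && npm run build >> /var/log/myproject-deploy.log 2>&1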
