I am using Relay (and Hasura) and am therefore required to compile my code ahead of time using the relay-compiler. I can compile the code fine on my local machine, however it always fails in GitHub Actions.
Here is the section of my yml file where it breaks:
runs-on: ubuntu-latest
# other steps
- name: Download GraphQL Schema
  run: SECRET=$SECRET ENDPOINT=$ENDPOINT yarn run get-schema
  env:
    SECRET: ${{ secrets.hasura_admin_secret }}
    ENDPOINT: ${{ secrets.graphql_endpoint }}
- name: Test Compile Relay
  run: yarn run relay   # <<< this fails
- name: Test build
  run: yarn run build
And here are those scripts in my package.json.
"start": "yarn run compile-css && react-scripts start",
"build": "yarn run build-compile-css && react-scripts build",
"get-schema": "yarn run get-graphql-schema -h \"x-hasura-admin-secret=$SECRET\" $ENDPOINT > schema.graphql",
"relay": "yarn run relay-compiler --schema schema.graphql --src src",
It fails with the error:
$ /home/runner/work/<company-name>/<app-name>/node_modules/.bin/relay-compiler --schema schema.graphql --src src
Writing js
ERROR:
Syntax Error: Unexpected "$".
I have verified that the schema is downloaded correctly and that the paths to the schema and the src folder are correct.
Is there a specific config or arguments I need to pass to get this working in a CI environment?
Update
After more testing, I have found that the file downloaded by get-graphql-schema is somehow not correct. The issue goes away if I commit the schema and use that instead of downloading it.
My understanding is that it is bad practice to commit GraphQL schema files, is this the case? If so, are there any special arguments or setup required to get schema files working correctly in GitHub Actions?
I have managed to find that when running get-graphql-schema in GitHub Actions, it adds the following line as the first line of the file. I can remove this via an additional script.
schema.graphql
$ /home/runner/work/<company-name>/<app-name>/node_modules/.bin/relay-compiler --schema schema.graphql --src src
schema {
  query: query_root
  mutation: mutation_root
  subscription: subscription_root
}
...
I am unsure though why running this command in GitHub Actions writes the echoed command into the first line of the file.
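If the extra line really is just yarn echoing the command it runs before the output is redirected into the file, one workaround (only a sketch, not something from the original setup) is to strip that echo right after downloading; alternatively, the get-schema script could call yarn with its --silent flag so the echo never reaches the redirect.

- name: Download GraphQL Schema
  run: |
    SECRET=$SECRET ENDPOINT=$ENDPOINT yarn run get-schema
    # Drop the first line only if it is the echoed "$ ..." command
    sed -i '1{/^\$ /d;}' schema.graphql
  env:
    SECRET: ${{ secrets.hasura_admin_secret }}
    ENDPOINT: ${{ secrets.graphql_endpoint }}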
Related
I was creating my yml file to create a pipeline and compile my code in GitHub.
This is my file and I cannot understand why it is not parsing correctly. I already checked all the spaces. If someone can help me I will really appreciate it.
.regression
  image: node-16-alpine
  default:
    tags:
      - mycompany-services-internal-running
  script:
    - git clone https://mycode.com
    - npm install
    - npm run myCodeExample
    - node myReport.js
  artifact:
    -reports/cucumber_report.html
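For comparison, a GitLab CI job of roughly this shape would normally be written as in the sketch below (job name, image, and paths are taken from the snippet above, so treat them as placeholders). Note that a job whose name starts with a dot is a hidden template and never runs, the artifacts keyword takes a paths: list, and every YAML list item needs a space after its dash.

regression:
  image: node-16-alpine
  tags:
    - mycompany-services-internal-running
  script:
    - git clone https://mycode.com
    - npm install
    - npm run myCodeExample
    - node myReport.js
  artifacts:
    paths:
      - reports/cucumber_report.html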
I have a .env file with my Firebase web API key saved in my local working directory.
I use it in my project as:
const firebaseConfig = {
  apiKey: process.env.REACT_APP_API_KEY,
}
Now I set up Firebase Hosting and use GitHub Actions to automatically deploy when changes are pushed to GitHub.
However, the app deployed by GitHub gives me an error in the console saying Your API key is invalid, please check you have copied it correctly, and my app won't work as it is missing the API key.
It seems like GitHub Actions is unable to access process.env.REACT_APP_API_KEY.
I feel the problem is that, because the .env file is not pushed to the GitHub repo, the build server is unable to resolve process.env.REACT_APP_API_KEY.
How can I tackle this issue?
The GitHub workflow was automatically set up while setting up Firebase Hosting.
Is this the error, or is there something else to take care of?
Below is my firebase-hosting-merge.yml
# This file was auto-generated by the Firebase CLI
# https://github.com/firebase/firebase-tools
name: Deploy to Firebase Hosting on merge
'on':
  push:
    branches:
      - master
jobs:
  build_and_deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - run: npm ci && npm run build --prod
      - uses: FirebaseExtended/action-hosting-deploy@v0
        with:
          repoToken: '${{ secrets.GITHUB_TOKEN }}'
          firebaseServiceAccount: '${{ secrets.FIREBASE_SERVICE_ACCOUNT_EVENTS_EASY }}'
          channelId: live
          projectId: myprojectname
        env:
          FIREBASE_CLI_PREVIEWS: hostingchannels
How can I make my .env file variables accessible to the GitHub build server?
Do I need to change my firebaseConfig? Or is there any way to make my .env file available to the build server and then delete it once the build finishes?
const firebaseConfig = {
  apiKey: process.env.REACT_APP_API_KEY,
}
A quick solution here could be having a step in GitHub Actions manually create the .env file before you need it.
- name: Create env file
  run: |
    touch .env
    echo API_ENDPOINT="https://xxx.execute-api.us-west-2.amazonaws.com" >> .env
    echo API_KEY=${{ secrets.API_KEY }} >> .env
    cat .env
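Since Create React App also reads REACT_APP_* variables straight from the build environment (not only from .env), another option is to pass the key to the existing build step instead of recreating the file. This is only a sketch: it assumes the key has been added to the repository's Actions secrets under the (hypothetical) name REACT_APP_API_KEY.

- run: npm ci && npm run build --prod
  env:
    REACT_APP_API_KEY: ${{ secrets.REACT_APP_API_KEY }}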
I set up a sample GitHub Action in my repository. The snippet is here.
jobs:
  build:
    name: Build
    runs-on: ubuntu-latest
    steps:
      - name: Set up Go 1.x
        uses: actions/setup-go@v2
        with:
          go-version: ^1.13
        id: go
      - name: Check out code into the Go module directory
        uses: actions/checkout@v2
      - name: Get dependencies
        run: |
          go get -v -t -d ./...
          if [ -f Gopkg.toml ]; then
            curl https://raw.githubusercontent.com/golang/dep/master/install.sh | sh
            dep ensure
          fi
but the job fails at Get dependencies. The error is here:
package github.com/<organization-account>/<repo-name>/api/domain/repo: cannot find package "github.com/<organization-account>/<repo-name>/api/domain/repo" in any of:
/opt/hostedtoolcache/go/1.14.4/x64/src/github.com/<organization-account>/<repo-name>/api/domain/repo (from $GOROOT)
/home/runner/go/src/github.com/<organization-account>/<repo-name>/api/domain/repo (from $GOPATH)
Of course, my code works locally when I go run main.go. I have go.mod and go.sum.
This is not a correct answer for the OP, but it may help someone who arrives here from a search.
In the root of your go project:
go mod init
go mod tidy
Commit and push to github, and it should work now.
You need to set up a token for go get or go mod to download private repos; that's why you're getting a 404.
First, you need to add the private repos to the GOPRIVATE environment variable.
env:
  GOPRIVATE: github.com/<organization-account>/*
Then you need to configure git to use that token.
- name: Setup Git
  run: git config --global url."https://${{ secrets.TOKEN }}:@github.com/".insteadOf "https://github.com"
- name: Get dependencies
  run: go mod download
Add the token to the actions secrets in github.com
I don't use dep, but you should use go mod download instead of go get or dep since you have a mod file. I don't know why you're using dep AND go modules AND using go get at the same time. Weird.
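Putting those pieces together, the dependency steps of the workflow above might look like the following sketch. The GOPRIVATE pattern and the secrets.TOKEN name are assumptions carried over from the snippets above; any personal access token with read access to the private repos should work.

- name: Setup Git
  run: git config --global url."https://${{ secrets.TOKEN }}:@github.com/".insteadOf "https://github.com"
- name: Get dependencies
  run: go mod download
  env:
    GOPRIVATE: github.com/<organization-account>/*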
I have a Lambda function that has somewhat non-standard packaging. I am using a Makefile to help me package what I need, and I use it as my build method with the sam build command. However, I don't see this Makefile being executed, and I can't figure out why not.
Here is what I have:
sam_template.yaml:
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: >
  subscriptions_functions

  Sample SAM Template for subscriptions_functions

Globals:
  Function:
    Timeout: 3

Resources:
  GetSubscriptionsFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: .
      Handler: app.lambda_handler_individual_methods
      Runtime: python3.7
      Events:
        GetSubscriptions:
          Type: Api
          Properties:
            Path: /subscriptions
            Method: get
    Metadata:
      BuildMethod: makefile
    Environment:
      Variables:
        SERVICE_METHOD_NAME: 'xyz'
        REQ_CLASS_NAME: 'xyz'
        RES_CLASS_NAME: 'xyz'
Makefile: (the name is based on some AWS examples)
build-GetSubscriptionsFunction:
	@echo "Building artifacts with sls. Destination dir " $(ARTIFACTS_DIR)
	sls package --env aws
	mkdir -p $(ARTIFACTS_DIR)
	unzip .serverless/subscriptions.zip -d $(ARTIFACTS_DIR)
	cp requirements.txt $(ARTIFACTS_DIR)
	python -m pip install -r requirements.txt -t $(ARTIFACTS_DIR)
	rm -rf $(ARTIFACTS_DIR)/bin
The build succeeds when I run sam build -t sam_template.yaml, but I can tell the Makefile didn't run (no messages are printed, and it would have created a .serverless directory, but it didn't).
Does anyone have an idea what is wrong with this setup?
So I figured it out, and it wasn't anything to do with the syntax.
I was running from the IntelliJ terminal. Since I was hitting a wall with this one, I started poking around and running a few other SAM commands. Running sam validate also kept failing, but with an error pointing to an unset default region.
My region was properly set in .aws/config, and I even tried to export the AWS_DEFAULT_REGION environment variable, but nothing worked. It kept failing with an unset region.
So I started looking at my plugins in IntelliJ, and it turns out I had both AWS Toolkit and Debugger for AWS Lambda (by Thundera) installed.
I uninstalled the latter and I'm back in business. I'm not clear on why this plugin was interfering with my console and CLI, but it did. Getting rid of it did the trick.
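As a side note for anyone else whose Makefile is silently skipped even though sam build succeeds: per the AWS SAM docs, the custom build method is only picked up when the Metadata block sits at the resource level (next to Type and Properties) and the Makefile contains a build-<LogicalId> target whose recipe lines are indented with tabs. A minimal sketch, reusing the resource name from the template above:

Resources:
  GetSubscriptionsFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: .
      Handler: app.lambda_handler_individual_methods
      Runtime: python3.7
    Metadata:
      BuildMethod: makefile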
I run e2e tests in a CI environment, but I cannot see the artifacts in the pipelines.
bitbucket-pipelines.yml:
image: cypress/base:10
options:
  max-time: 20
pipelines:
  default:
    - step:
        script:
          - npm install
          - npm run test
        artifacts:
          - /opt/atlassian/pipelines/agent/build/cypress/screenshots/*
          - screenshots/*.png
Maybe I typed the path in the wrong way, but I am not sure.
Does anyone have any ideas what I am doing wrong?
I don't think it's documented anywhere, but artifacts only accepts paths relative to $BITBUCKET_CLONE_DIR. When I run my pipeline it says Cloning into '/opt/atlassian/pipelines/agent/build'..., so I think artifacts paths are resolved relative to that directory. My guess is that if you change it to something like this, it will work:
image: cypress/base:10
options:
  max-time: 20
pipelines:
  default:
    - step:
        script:
          - npm install
          - npm run test
        artifacts:
          - cypress/screenshots/*.png
Edit
From your comment I now understand what the real problem is: Bitbucket Pipelines are configured to stop at any non-zero exit code. That means the pipeline execution stops when Cypress fails the tests. Because artifacts are stored after the last step of the pipeline, you won't have any artifacts.
To work around this behavior you have to make sure the pipeline doesn't stop until the images are saved. One way to do this is to prefix the npm run test part with set +e (for more details about this solution, see this answer: https://community.atlassian.com/t5/Bitbucket-questions/Pipeline-script-continue-even-if-a-script-fails/qaq-p/79469). This prevents the pipeline from stopping, but it also means your pipeline always passes, which of course is not what you want. Therefore I recommend that you run the Cypress tests separately and create a second step in your pipeline to check the output of Cypress. Something like this:
# package.json
...
"scripts": {
  "test": "<your test command>",
  "testcypress": "cypress run ..."
...
# bitbucket-pipelines.yml
image: cypress/base:10
options:
  max-time: 20
pipelines:
  default:
    - step:
        name: Run tests
        script:
          - npm install
          - npm run test
          - set +e
          - npm run testcypress
        artifacts:
          - cypress/screenshots/*.png
    - step:
        name: Evaluate Cypress
        script:
          - chmod +x check_cypress_output.sh
          - ./check_cypress_output.sh
#!/bin/bash
# check_cypress_output.sh
# Check if the directory exists
if [ -d "./usertest" ]; then
  # If it does, check if it's empty
  if [ -z "$(ls -A ./usertest)" ]; then
    # Return the "all good" signal to Bitbucket if the directory is empty
    exit 0
  else
    # Return a fault code to Bitbucket if there are any images in the directory
    exit 1
  fi
else
  # Return the "all good" signal to Bitbucket
  exit 0
fi
This script will check whether Cypress created any artifacts, and will fail the pipeline if it did. I'm not sure this is exactly what you need, but it's probably a step in the right direction.
Recursive search (/**) worked for me with Cypress 3.1.0, due to the additional folder inside videos and screenshots:
# [...]
pipelines:
  default:
    - step:
        name: Run tests
        # [...]
        artifacts:
          - cypress/videos/**       # Double star provides recursive search.
          - cypress/screenshots/**
This is my working solution. cypress:pipeline is the custom npm script in my package.json that runs Cypress, and the healthCheck custom pipeline in my Bitbucket config file calls it. Please try cypress/screenshots/**/*.png in the artifacts section.
"cypress:pipeline": "cypress run --env user=${E2E_USER_EMAIL},pass=${E2E_USER_PASSWORD} --browser chrome --spec cypress/integration/src/**/*.spec.ts"
pipelines:
  custom:
    healthCheck:
      - step:
          name: Integration and E2E Test
          script:
            - npm install
            - npm run cypress:pipeline
          artifacts:
            # store any generated images as artifacts
            - cypress/screenshots/**/*.png