Nightwatchjs folder structure - continuous-integration

What's the simplest folder structure I can use with Nightwatchjs? It will be used locally and for continuous integration. Currently I can't even get the demo to work. I have six errors:
module.js:469:15
module.js:417:25
bootstrap_node.js:604:10
bootstrap_node.js:394:7
bootstrap_node.js:149:9
bootstrap_node.js:509:3
I realize this is a beginner question. I've been using Telerik and TestComplete for a few years, and now we want to do CI properly, so Selenium is the way to go. I'm comfortable with JavaScript but kind of bad at file-path stuff.

The simplest Nightwatch.js folder structure is to have two files (a configuration file and a file that contains your tests):
nightwatch.json
app.js (you can rename it as you like)
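With the configuration below, the resulting project layout looks roughly like this (the test/tests_output/ folder is created by Nightwatch from the tests_output path in nightwatch.json):
./
├── nightwatch.json
├── app.js
└── test/
    └── tests_output/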
Example
1) nightwatch.json
{
  "src_folders": [
    "app.js"
  ],
  "live_output": false,
  "tests_output": "test/tests_output/",
  "detailed_output": true,
  "selenium": {
    "start_process": false,
    "host": "hub.browserstack.com",
    "port": 80
  },
  "test_workers": {
    "enabled": false,
    "workers": "auto"
  },
  "test_settings": {
    "chrome": {
      "selenium_port": 80,
      "selenium_host": "hub.browserstack.com",
      "silent": true,
      "desiredCapabilities": {
        "os": "Windows",
        "os_version": "10",
        "browserName": "chrome",
        "resolution": "1024x768",
        "javascriptEnabled": true,
        "acceptSslCerts": true,
        "browserstack.video": "true",
        "browserstack.debug": "true",
        "browserstack.user": "<yourUsername>",
        "browserstack.key": "<yourPassword>"
      }
    }
  }
}
2) app.js
module.exports = {
  'Does-stackoverflow-works': function (browser) {
    browser
      .url("http://stackoverflow.com/questions")
      .waitForElementPresent('body', 2000, "Display latest Stackoverflow questions")
      .end()
  }
};
Run
$> nightwatch --env chrome
Output

Related

How to run a .json compiler whenever I save in Visual Studio

I'm new to ASP.NET and I'm using WebCompiler to compile my SCSS, so it compiles whenever main.scss is saved, using the JSON below that is generated by WebCompiler. My problem is that whenever I save any other .scss file, it does not compile, and I have to go back to main.scss and save that specific file in order to compile my code.
Is there any way to run my compiler automatically whenever I save my project?
[
  {
    "outputFile": "wwwroot/css/Styles/stylescompiled.scss",
    "inputFile": "wwwroot/css/Styles/main.scss",
    "minify": { "enabled": true },
    "includeInProject": true
  },
  {
    "outputFile": "wwwroot/css/style.css",
    "inputFile": "wwwroot/css/Styles/stylescompiled.scss",
    "options": { "sourceMap": true },
    "includeInProject": true
  },
  {
    "outputFile": "wwwroot/css/style.css",
    "inputFile": "wwwroot/css/Styles/main.scss",
    "options": { "sourceMap": true },
    "minify": { "enabled": true },
    "includeInProject": true
  }
]

Cypress multi reporters: using mochawesome with autoset-status-cypress-testrail-reporter

I need to use two reporters with my Cypress tests: mochawesome to generate html reports, and autoset-status-cypress-testrail-reporter to publish test results to Testrail.
The main tool I could find that would enable me to use multiple reporters is cypress-multi-reporters.
However, if I try to use cypress-multi-reporters with autoset-status-cypress-testrail-reporter, alone or in conjunction with mochawesome as below (in cypress.json), it does not work. It does not print out any errors, but it just will not publish the results to Testrail, and it will not generate the mochawesome reports.
{
  "reporterEnabled": "mochawesome, autoset-status-cypress-testrail-reporter",
  "mochawesomeReporterOptions": {
    "reportDir": "cypress/reports",
    "overwrite": false,
    "html": true,
    "json": false
  },
  "autosetStatusCypressTestrailReporterReporterOptions": {
    "host": "https://xxxxxx/",
    "username": "xxxxx",
    "password": "xxxx",
    "projectId": 1,
    "runId": 1234
  }
}
Can anyone tell me why the above is not working, or suggest a similar tool that would work with both mochawesome and autoset-status-cypress-testrail-reporter?
Got this to work in the end. The solution was:
OPTION 1 - to include only the below in cypress.json:
"reporter": "cypress-multi-reporters",
"reporterOptions": {
"configFile": "reporter-config.json"
}
Then create a new file called reporter-config.json and add the config for each reporter in there:
{
  "reporterEnabled": "mochawesome, autoset-status-cypress-testrail-reporter",
  "mochawesomeReporterOptions": {
    "reportDir": "cypress/reports",
    "overwrite": false,
    "html": true,
    "json": false
  },
  "autosetStatusCypressTestrailReporterReporterOptions": {
    "host": "https://xxxxxx/",
    "username": "xxxxx",
    "password": "xxxx",
    "projectId": 1,
    "runId": 1234
  }
}
OPTION 2 - to have everything inside cypress.json, like so:
"reporter": "cypress-multi-reporters",
"reporterOptions": {
"reporterEnabled": "mochawesome, autoset-status-cypress-testrail-reporter",
"mochawesomeReporterOptions": {
"reportDir": "cypress/reports",
"overwrite": false,
"html": true,
"json": false
},
"autosetStatusCypressTestrailReporterReporterOptions": {
"host": "https://xxxxxx/",
"username": "xxxxx",
"password": "xxxx",
"projectId": 1,
"runId": 1234
}
}
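For either option, the reporter packages referenced in the config need to be installed as dev dependencies alongside Cypress, for example:
npm install --save-dev cypress-multi-reporters mochawesome autoset-status-cypress-testrail-reporter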

CodeClimate Not reading configuration

In my React project, I want to use CodeClimate's advanced configuration to stop some silly code quality checks/thresholds, like the 50-lines-of-code one:
Function `AutocompleteCombobox` has 50 lines of code (exceeds 25 allowed). Consider refactoring.
What I did was create a .codeclimate.yml beside my package.json and upload it to the repo (connected with CodeClimate on the DEV branch), following the documentation.
This is the example .yml file:
version: "2" # required to adjust maintainability checks
checks:
argument-count:
enabled: true
config:
threshold: 4
complex-logic:
enabled: true
config:
threshold: 4
The question is: CodeClimate does not change the checks and metrics based on my configuration file! I changed the settings via the .yml file, but they are still not updated on the CodeClimate website; the metrics are the same as the defaults.
TIP: Changing things from the CodeClimate website settings does nothing either; we disabled every condition, yet nothing applies except the defaults. And I don't want to delete and re-add the repos because I'll lose my tracking records of the improvements.
The problem is simple: the server had created a file called .codeclimate.json because I had edited the configuration via the website, while in my repo I had created a file called .codeclimate.yml. When I converted my configuration from .yml to .json, it overrode the one on the server and worked perfectly.
Example of my configuration .codeclimate.json:
{
  "version": "2",
  "checks": {
    "argument-count": {
      "enabled": false,
      "config": {
        "threshold": 4
      }
    },
    "complex-logic": {
      "enabled": true,
      "config": {
        "threshold": 15
      }
    },
    "file-lines": {
      "enabled": false,
      "config": {
        "threshold": 250
      }
    },
    "method-complexity": {
      "enabled": true,
      "config": {
        "threshold": 15
      }
    },
    "method-count": {
      "enabled": false,
      "config": {
        "threshold": 20
      }
    },
    "method-lines": {
      "enabled": false,
      "config": {
        "threshold": 25
      }
    },
    "nested-control-flow": {
      "enabled": true,
      "config": {
        "threshold": 4
      }
    },
    "return-statements": {
      "enabled": true,
      "config": {
        "threshold": 4
      }
    },
    "similar-code": {
      "enabled": false,
      "config": {
        "threshold": null
      }
    },
    "identical-code": {
      "enabled": true,
      "config": {
        "threshold": null
      }
    }
  },
  "exclude_patterns": [
    "config/",
    "db/",
    "dist/",
    "features/",
    "**/node_modules/",
    "script/",
    "**/spec/",
    "**/test/",
    "**/tests/",
    "Tests/",
    "**/vendor/",
    "**/*_test.go",
    "**/*.d.ts"
  ]
}
If you face the same issue, the configuration is probably duplicated on CodeClimate; you need to use one file only.

web app works locally and on app engine, but not on cloud run

So I've run into this issue with a web app I've made:
it gets a file path as input
if the file exists in a bucket, it uses the Python client API to create a Compute Engine instance
it passes the file path to the instance in the startup script
When I ran it locally, I created a Python virtual environment and then ran the app. When I submit the input in the web browser, the virtual machine is created by the API call. I assumed it used my personal account. I switched to the service account on the command line with 'gcloud config set account', and it ran fine once more.
When I simply go to the source code directory and deploy it as is to App Engine, the application can create the virtual machine instances as well.
When I use Google Cloud Build and deploy to Cloud Run, it doesn't create the VM instance.
The web app itself is not throwing any errors, but when I check Compute Engine's logs, there is an error:
{
  "protoPayload": {
    "#type": "type.googleapis.com/google.cloud.audit.AuditLog",
    "status": {
      "code": 3,
      "message": "INVALID_PARAMETER"
    },
    "authenticationInfo": {
      "principalEmail": "####"
    },
    "requestMetadata": {
      "callerIp": "#####",
      "callerSuppliedUserAgent": "(gzip),gzip(gfe)"
    },
    "serviceName": "compute.googleapis.com",
    "methodName": "v1.compute.instances.insert",
    "resourceName": "projects/someproject/zones/somezone/instances/nameofinstance",
    "request": {
      "#type": "type.googleapis.com/compute.instances.insert"
    }
  },
  "insertId": "######",
  "resource": {
    "type": "gce_instance",
    "labels": {
      "instance_id": "#####",
      "project_id": "someproject",
      "zone": "somezone"
    }
  },
  "timestamp": "2021-06-16T12:18:21.253551Z",
  "severity": "ERROR",
  "logName": "projects/someproject/logs/cloudaudit.googleapis.com%2Factivity",
  "operation": {
    "id": "operation-#####",
    "producer": "compute.googleapis.com",
    "last": true
  },
  "receiveTimestamp": "2021-06-16T12:18:21.253551Z"
}
In theory, it is the exact same code that worked from my laptop and on App Engine. I'm baffled why it only does this for Cloud Run.
App Engine's default service account was stripped of all its roles and given a custom role tailored to the web app's function.
The Cloud Run service is using a different service account, but it was given that exact same custom role.
Here is the method I use to call the API:
import googleapiclient.discovery
from datetime import date

def create_instance(path):
    # Build the Compute Engine API client
    compute = googleapiclient.discovery.build('compute', 'v1')
    vmname = "piinnuclei" + date.today().strftime("%Y%m%d%H%M%S")
    # Startup script: copies the input from the bucket, runs the analysis, then deletes the VM
    startup_script = "#! /bin/bash\napt update\npip3 install pg8000\nexport BUCKET_PATH=my-bucket/{}\ngsutil -m cp -r gs://$BUCKET_PATH /home/connor\ncd /home/connor\n./cloud_sql_proxy -dir=cloudsql -instances=sql-connection-name=unix:sql-connection-name &\npython3 run_analysis_upload.py\nexport ZONE=$(curl -X GET http://metadata.google.internal/computeMetadata/v1/instance/zone -H 'Metadata-Flavor: Google')\nexport NAME=$(curl -X GET http://metadata.google.internal/computeMetadata/v1/instance/name -H 'Metadata-Flavor: Google')\ngcloud --quiet compute instances delete $NAME --zone=$ZONE".format(path)
    config = {
        "kind": "compute#instance",
        "name": vmname,
        "zone": "projects/my-project/zones/northamerica-northeast1-a",
        "machineType": "projects/my-project/zones/northamerica-northeast1-a/machineTypes/e2-standard-4",
        "displayDevice": {
            "enableDisplay": False
        },
        "metadata": {
            "kind": "compute#metadata",
            "items": [
                {
                    "key": "startup-script",
                    "value": startup_script
                }
            ]
        },
        "tags": {
            "items": []
        },
        "disks": [
            {
                "kind": "compute#attachedDisk",
                "type": "PERSISTENT",
                "boot": True,
                "mode": "READ_WRITE",
                "autoDelete": True,
                "deviceName": vmname,
                "initializeParams": {
                    "sourceImage": "projects/my-project/global/images/my-image",
                    "diskType": "projects/my-project/zones/northamerica-northeast1-a/diskTypes/pd-balanced",
                    "diskSizeGb": "100"
                },
                "diskEncryptionKey": {}
            }
        ],
        "canIpForward": False,
        "networkInterfaces": [
            {
                "kind": "compute#networkInterface",
                "subnetwork": "projects/my-project/regions/northamerica-northeast1/subnetworks/default",
                "accessConfigs": [
                    {
                        "kind": "compute#accessConfig",
                        "name": "External NAT",
                        "type": "ONE_TO_ONE_NAT",
                        "networkTier": "PREMIUM"
                    }
                ],
                "aliasIpRanges": []
            }
        ],
        "description": "",
        "labels": {},
        "scheduling": {
            "preemptible": False,
            "onHostMaintenance": "MIGRATE",
            "automaticRestart": True,
            "nodeAffinities": []
        },
        "deletionProtection": False,
        "reservationAffinity": {
            "consumeReservationType": "ANY_RESERVATION"
        },
        "serviceAccounts": [
            {
                "email": "batch-service-accountg#my-project.iam.gserviceaccount.com",
                "scopes": [
                    "https://www.googleapis.com/auth/cloud-platform"
                ]
            }
        ],
        "shieldedInstanceConfig": {
            "enableSecureBoot": False,
            "enableVtpm": True,
            "enableIntegrityMonitoring": True
        },
        "confidentialInstanceConfig": {
            "enableConfidentialCompute": False
        }
    }
    return compute.instances().insert(
        project="my-project",
        zone="northamerica-northeast1",
        body=config).execute()
The issue was with the zone. For some reason, when it was run on Cloud Run, the code below was the culprit.
return compute.instances().insert(
    project="my-project",
    zone="northamerica-northeast1",
    body=config).execute()
"northamerica-northeast1" should have been "northamerica-northeast1-a"
EDIT:
I made a new virtual machine image and quickly ran into the same problem: it would work locally and break down in the Cloud Run environment. After letting it sit for some time, it began to work again. This leads me to the conclusion that there is also some sort of delay before a new image can be used from Cloud Run.

Nightwatch parallel for two browser run only default for browserstack

I have configured several test settings and I want to run Chrome and IE in parallel.
When I run
./node_modules/.bin/nightwatch --env chrome
the test for Chrome runs.
./node_modules/.bin/nightwatch --env ie
the test for IE runs.
./node_modules/.bin/nightwatch --env ie,chrome
the test for Firefox runs (as I guess this runs only the default one and nothing more).
Nightwatch 1.0.18
{
  "src_folders": [
    "test"
  ],
  "output_folder": "reports",
  "live_output": true,
  "custom_commands_path": "config/commands",
  "test_runner": {
    "type": "mocha",
    "options": {
      "grep": "#acc"
    }
  },
  "test_settings": {
    "default": {
      "selenium_host": "hub-cloud.browserstack.com",
      "selenium_port": 80,
      "launch_url": "localhost",
      "browserstack.key": "KEY",
      "browserstack.user": "USER",
      "browserstack.local": "true"
    },
    "chrome-local": {
      "default_path_prefix": "",
      "launch_url": "localhost",
      "screenshots": {
        "enabled": true,
        "on_failure": true,
        "on_success": true,
        "on_error": true,
        "path": "./screenshots"
      },
      "desiredCapabilities": {
        "browserName": "chrome",
        "javascriptEnabled": true,
        "chromeOptions": {
          "args": [
            "--no-sandbox",
            "headless",
            "window-size=1920,1200"
          ]
        },
        "acceptSslCerts": true,
        "acceptInsecureCerts": true,
        "elementScrollBehavior": 1
      }
    },
    "chrome": {
      "desiredCapabilities": {
        "browser": "chrome"
      }
    },
    "ie": {
      "desiredCapabilities": {
        "browserName": "ie"
      }
    }
  }
}
For running parallel tests using Nightwatch, refer to https://github.com/browserstack/nightwatch-browserstack
As you need to run parallel tests on the Chrome and IE browsers, you need to change the line https://github.com/browserstack/nightwatch-browserstack/blob/master/package.json#L10 as follows:
"parallel": "./node_modules/.bin/nightwatch -c conf/parallel.conf.js
-e ie,chrome"
