I have an AWS Lambda function that is scheduled to run once an hour (as described here http://docs.aws.amazon.com/lambda/latest/dg/getting-started-scheduled-events.html).
The function ftps files from a data provider and copies them to S3.
I have a test environment, and a production environment. For each environment, the ftp address and credentials are different.
How can I configure the lambda function so it can be aware of which environment it's running in, and get the ftp config accordingly?
PS: I am aware of this question: runtime configuration for AWS Lambda function, but it did not help me because I am using a scheduled Lambda (the scheduled Lambda functions feature introduced on Oct 8th 2015), and I cannot see a way to get configuration into the event.
I found a way. For the test version of the function, I am calling it TEST-CopyFtpFilesToS3, and for the production version I am naming it PRODUCTION-CopyFtpFilesToS3. This allows me to pull out the environment name using a regular expression.
Then I store config/test.json and config/production.json in the zip file that I upload as the function's code. The zip is extracted into the directory process.env.LAMBDA_TASK_ROOT when the function runs, so I can load the matching file and get the config I need.
Some people don't like storing the config in the code zip file, which is fine - you can just load a file from S3 or use whatever strategy you like.
Code for reading the file from the zip:
const fs = require('fs');

// Derive the environment from the function name, e.g. "TEST-CopyFtpFilesToS3" -> "test",
// then load the matching JSON config file that was bundled in the deployment zip.
const readConfiguration = () => {
  return new Promise((resolve, reject) => {
    const environment = /^(.*?)-.*/.exec(process.env.AWS_LAMBDA_FUNCTION_NAME)[1].toLowerCase();
    console.log(`environment is ${environment}`);

    fs.readFile(`${process.env.LAMBDA_TASK_ROOT}/config/${environment}.json`, 'utf8', (err, data) => {
      if (err) {
        reject(err);
      } else {
        const config = JSON.parse(data);
        console.log(`configuration is ${data}`);
        resolve(config);
      }
    });
  });
};
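If you go with the S3 approach instead, a minimal sketch might look like this (the bucket name and key layout are hypothetical; adjust to your setup):

const AWS = require('aws-sdk');
const s3 = new AWS.S3();

// Sketch: load <environment>.json from a config bucket instead of the deployment zip.
const readConfigurationFromS3 = (environment) => {
  return s3.getObject({
    Bucket: 'my-app-config',              // hypothetical bucket name
    Key: `config/${environment}.json`     // hypothetical key layout
  }).promise()
    .then((response) => JSON.parse(response.Body.toString('utf8')));
};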
I'm new to Cypress and JavaScript.
I'm trying to send system commands through Cypress. I've been through several examples, but even the simplest does not work.
It always fails with the following message.
Information about the failure:
Code: 127
Stderr:
/c/Program: Files\Git\usr\bin\bash.exe: No such file or directory
I'm trying cy.exec('pwd') or 'ls' to see where it is launched from, but it does not work.
Is there a particular include I am missing? Some particular configuration?
EDIT :
Indeed, I wasn't clear about the context in which I'm trying to use the command. However, I don't set any path explicitly.
I send requests to a Linux server, but I would also like to send system commands.
My cypress project is in /c/Cypress/test_integration/cypress
I work with a .feature file located in /c/Cypress/test_integration/cypress/features/System and my scenario calls a function in a file system.js located in /c/Cypress/test_integration/cypress/step_definitions/generic.
System_operations.features:
Scenario: [0004] - Restore HBox configuration
Given I am logging with "Administrator" account from API
And I store the actual configuration
...
Then, in my .js file, I want to send a system command.
system.js:
Given('I store the actual configuration', () => {
let nb_elem = 0
cy.exec('ls -l')
...
})
I did not do any particular path configuration in VS Code for the use of bash commands (I just configured the terminal to use bash instead of PowerShell).
Finally, with some help, I managed to call system functions by using tasks.
In my function I call:
cy.task('send_system_cmd', 'pwd').then((output) => {
console.log("output = ", output)
})
with a task created as follows:
on('task', {
  send_system_cmd(cmd) {
    console.log("task test command system")
    // Run the command synchronously in the Node process and return its output
    const execSync = require('child_process').execSync;
    const output = execSync(cmd, { encoding: 'utf-8' });
    return output
  }
})
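For reference, this on('task', ...) registration lives in the Cypress plugins file; a sketch assuming the pre-Cypress-10 layout (cypress/plugins/index.js):

// cypress/plugins/index.js (sketch) - register the task so cy.task('send_system_cmd', ...) can find it
module.exports = (on, config) => {
  on('task', {
    send_system_cmd(cmd) {
      const execSync = require('child_process').execSync;
      return execSync(cmd, { encoding: 'utf-8' });
    }
  });
  return config;
};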
This works, at least for simple commands; I haven't tried much further for the moment.
UPDATE for Linux system commands, since the previous method works for Windows.
(Sorry, I can't remember where I found this method, so the credit isn't mine, though it fulfills my needs.)
This case requires node-ssh.
Still using tasks, the function call is done like this:
cy.task('send_system_cmd', { cmd: "<my_command>", endpoint: <address>, user: <ssh_login>, pwd: <ssh_password> }).then((output) => {
  <process output.stdout or output.stderr>
})
with the task being built like this:
// send system command - remote
on('task', {
  send_system_cmd({ cmd, endpoint, user, pwd }) {
    return new Promise((resolve, reject) => {
      const { NodeSSH } = require('node-ssh')
      const ssh = new NodeSSH()
      let ssh_output = {}
      ssh.connect({
        host: endpoint,
        username: user,
        password: pwd
      })
      .then(() => {
        if (!ssh.isConnected()) {
          reject("ssh connection not set")
          return
        }
        //console.log("ssh connection OK, send command")
        ssh.execCommand(cmd).then(function (result) {
          ssh_output["stderr"] = result.stderr
          ssh_output["stdout"] = result.stdout
          resolve(ssh_output)
        });
      })
      .catch((err) => {
        console.log(err)
        reject(err)
      })
    })
  }
})
Currently getting the following error with MongoDB:
no saslprep library specified. Passwords will not be sanitized
We are using Webpack, so simply installing the module doesn't work (Webpack just ignores it). I found this thread which talks about how to exclude it from Webpack compilations, but then I have to manually load it into every Lambda function, which led me to Lambda Layers.
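For context, excluding a module from the bundle usually means declaring it as an external so webpack leaves the require() in place and it is resolved at runtime; a rough sketch of a plain webpack config (serverless-bundle may not expose this directly):

// webpack.config.js (sketch) - leave saslprep out of the bundle so it is
// resolved from node_modules or a Lambda layer at runtime
module.exports = {
  target: 'node',
  externals: {
    saslprep: 'commonjs saslprep'
  }
};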
Following the Serverless guide on using Lambda layers allowed me to get my layer published to AWS and included in all of my functions, but for some reason, it doesn't install the modules. If I download the layer using the AWS GUI, I get a folder with just the package.json and package-lock.json files.
My file structure is:
my-project
|_ layers
|_ saslprep
|_ package.json
and my serverless.yml is:
layers:
saslprep:
path: layers/saslprep
compatibleRuntimes:
- nodejs14.x
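For reference, the functions then attach the layer with a CloudFormation Ref (a sketch following the Serverless layers guide; the handler path is hypothetical):

functions:
  myFunction:
    handler: src/handler.main          # hypothetical handler
    layers:
      - { Ref: SaslprepLambdaLayer }   # "<layerName>LambdaLayer", layer name TitleCased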
This is not my preferred solution, as I'd like to use SCRAM-SHA-256, but the way I got around this error/warning was by changing the authMechanism from SCRAM-SHA-256 to SCRAM-SHA-1 in the connection string. serverless-bundle most likely needs to add this dependency to its package to enable support for MongoDB 4.0's SHA-256 (my best guess!).
You can specify this authentication mechanism by setting the authMechanism parameter to the value SCRAM-SHA-1 in the connection string as shown in the following sample code.
const { MongoClient } = require("mongodb");

// Replace the following with values for your environment.
const username = encodeURIComponent("<username>");
const password = encodeURIComponent("<password>");
const clusterUrl = "<MongoDB cluster url>";
const authMechanism = "SCRAM-SHA-1";

// Replace the following with your MongoDB deployment's connection string.
const uri =
  `mongodb+srv://${username}:${password}@${clusterUrl}/?authMechanism=${authMechanism}`;

// Create a new MongoClient
const client = new MongoClient(uri);

// Function to connect to the server
async function run() {
  try {
    // Connect the client to the server
    await client.connect();
    // Establish and verify connection
    await client.db("admin").command({ ping: 1 });
    console.log("Connected successfully to server");
  } finally {
    // Ensures that the client will close when you finish/error
    await client.close();
  }
}

run().catch(console.dir);
I'm working on a small serverless-offline assignment, and I got an Unsupported Media Type error when I tried to invoke one Lambda function from another.
I found a solution, but when I tried to apply it to my project it did not work. All the details are in the link below. Could anyone help me with that?
https://github.com/dherault/serverless-offline/issues/1005#issue-632401297
There are three possible solutions.
Make sure that Lambda_A uses the same host and port that Lambda_B is running on.
Lambda_A:
const { Lambda } = require('aws-sdk');

const lambda = new Lambda({
  region: 'us-east-1',
  endpoint: 'http://localhost:3000',
});

module.exports.main = async (event, context) => {
  // invoke
}
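For completeness, the invoke call itself typically looks something like this inside the async handler above (a sketch; the function name and payload are hypothetical):

// Sketch (inside module.exports.main): invoke Lambda_B via the aws-sdk v2 Lambda client.
const params = {
  FunctionName: 'my-service-local-lambda_B',   // hypothetical deployed/offline function name
  InvocationType: 'RequestResponse',
  Payload: JSON.stringify({ foo: 'bar' }),     // hypothetical payload
};
const response = await lambda.invoke(params).promise();
console.log(response.StatusCode, response.Payload && response.Payload.toString());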
Lambda_B is running on http://localhost:3000.
Check that you have configured serverless-offline for both functions.
https://www.serverless.com/plugins/serverless-offline#usage-with-invoke
Do Lambda_A and Lambda_B have the correct stage? Remember to use sls offline --stage local for both functions.
Has anyone used Excel to store test data while running Cypress tests? I am trying to make this work, but since I can't access the file system with browserify, I can't read the Excel test data file. Has anyone got this working? Do you have any links/code on how to get it working?
Thanks.
I realised that normal Excel processing packages won't work with Cypress and TypeScript, since you are using browserify, and browserify prevents you from performing file read/write operations.
As a workaround, I read the Excel file before browserifying (in plugins/index.js), convert it into a JSON file, and save it in the fixtures folder.
Then, in support/index.js, in a before hook, I read the JSON file as a fixture and save it to an aliased variable. Now I can parse data from the aliased variable and use it where required in the Cypress context throughout the test. This is what worked for me.
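A minimal sketch of that support-file hook (assuming the generated fixture is named testData.json; Cypress resets aliases between tests, so a beforeEach is used here):

// cypress/support/index.js (sketch) - load the generated fixture and expose it via an alias
beforeEach(() => {
  cy.fixture('testData').as('testData');
});

// In a test, the aliased data can then be read with:
// cy.get('@testData').then((rows) => { /* use rows */ });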
Here is a guide on how to use Excel as a source for Cypress tests: https://medium.com/@you54f/dynamically-generate-data-in-cypress-from-csv-xlsx-7805961eff55
First you need to convert your xlsx file to JSON with the xlsx package:
import { writeFileSync } from "fs";
import * as XLSX from "xlsx";

try {
  const workBook = XLSX.readFile("./testData/testData.xlsx");
  const jsonData = XLSX.utils.sheet_to_json(workBook.Sheets.testData);
  writeFileSync(
    "./cypress/fixtures/testData.json",
    JSON.stringify(jsonData, null, 4),
    "utf-8"
  );
} catch (e) {
  throw Error(e);
}
Then import the JSON file, loop over each row, and use the data however you want. In this example it tries to log in to a system.
import { login } from "../support/pageObjects/login.page";

const testData = require("../fixtures/testData.json");

describe("Dynamically Generated Tests", () => {
  testData.forEach((testDataRow: any) => {
    const data = {
      username: testDataRow.username,
      password: testDataRow.password
    };

    context(`Generating a test for ${data.username}`, () => {
      it("should fail to login for the specified details", () => {
        login.visit();
        login.username.type(data.username);
        login.password.type(`${data.password}{enter}`);
        login.errorMsg.contains("Your username is invalid!");
        login.logOutButton.should("not.exist");
      });
    });
  });
});
My Parse app has a GiftCode collection which disallows the find operation at the class-level.
I am writing a beforeSave cloud function that prevents duplicate codes from being entered by our team from Parse's dashboard:
Parse.Cloud.beforeSave('GiftCode', function (req, res) {
  Parse.Cloud.useMasterKey();

  const code = req.object.get('code');
  if (!code) {
    res.success();
  } else {
    const finalCode = code.toUpperCase().trim();
    req.object.set('code', finalCode);

    (new Parse.Query('GiftCode'))
      .equalTo('code', finalCode)
      .first()
      .then((gift) => {
        if (!gift) {
          res.success();
        } else {
          res.error(`GiftCode with code=${finalCode} already exists (objectId=${gift.id})`);
        }
      }, (err) => {
        console.error(err);
        res.error(err);
      });
  }
});
As you can see, I am calling Parse.Cloud.useMasterKey() (and this is running in the Parse cloud), but I am still getting the following error:
This user is not allowed to perform the find operation on GiftCode.
I use useMasterKey() in other normal cloud functions and am able to perform find operations as needed.
Is useMasterKey() not applicable to beforeSave functions?
I've never tried to use the master key in a beforeSave function, but I wouldn't be surprised if there are some extra safeguards in place to prevent it. From a security standpoint, it seems like it could make all write-based CLPs and ACLs worthless for that class.
Try selectively using the master key by passing it as an option to the query, like so:
(new Parse.Query('GiftCode'))
  .equalTo('code', finalCode)
  .first({ useMasterKey: true })
  .then((gift) => {
    ...
Parse.Cloud.useMasterKey(); has been deprecated since Parse Server version 2.3.0 (Dec 7, 2016). From that version on, it is a no-op (it does nothing). You should now pass the { useMasterKey: true } option to each of the methods that need to override the ACL or CLP in your code.
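Putting that together, the question's beforeSave could look roughly like this with the option applied (a sketch keeping the legacy res.success/res.error style from the question, not an official Parse example):

Parse.Cloud.beforeSave('GiftCode', function (req, res) {
  const code = req.object.get('code');
  if (!code) {
    return res.success();
  }

  const finalCode = code.toUpperCase().trim();
  req.object.set('code', finalCode);

  // Bypass the class-level "find" restriction for this one query only.
  (new Parse.Query('GiftCode'))
    .equalTo('code', finalCode)
    .first({ useMasterKey: true })
    .then((gift) => {
      if (!gift) {
        res.success();
      } else {
        res.error(`GiftCode with code=${finalCode} already exists (objectId=${gift.id})`);
      }
    }, (err) => {
      console.error(err);
      res.error(err);
    });
});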