I saw that the Heroku CLI can stream logs in real time via heroku logs --tail, which is cool.
But how do I send that real-time logging to Discord via webhooks (for example with curl from the CLI, or node-fetch in code)?
So far, the only way I have been able to do this is to create a Heroku webhook through their notifications, send it to Zapier, and have Zapier forward it to a Discord webhook.
You'll need to create a custom log drain and connect it to your Heroku app's Logplex. You don't need to build it from scratch; this repository has a very good explanation and implementation in Python:
https://github.com/abhi887/heroku2discord-logdrain
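If you'd rather roll your own drain in Node instead of using the Python implementation above, the drain is just an HTTPS endpoint that receives Logplex POSTs and forwards each frame. Here is a minimal sketch, assuming Express, node-fetch v2, and a DISCORD_WEBHOOK_URL config var; the endpoint path and variable names are placeholders, not something from the linked repo:
const express = require("express");
const fetch = require("node-fetch"); // assuming node-fetch v2 (CommonJS)

const app = express();
// Logplex delivers log lines as plain text (application/logplex-1), so read the raw body as text.
app.use(express.text({ type: "*/*" }));

app.post("/logs", (req, res) => {
  // Forward the raw log frame to Discord; trim to stay under Discord's 2000-character content limit.
  fetch(process.env.DISCORD_WEBHOOK_URL, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ content: String(req.body).slice(0, 1900) }),
  }).catch(console.error);
  res.sendStatus(200);
});

app.listen(process.env.PORT || 3000);
Once deployed, you would attach it to the app with something like heroku drains:add https://your-drain.example.com/logs -a your-app (the drain URL is a placeholder).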
PS: in my view the best way is the one below, and it works!
Module file:
// loggy.js — sends every log line to a Discord webhook, then falls back to the original console methods.
// process.env.CONSOLE_LOG holds the Discord webhook URL.
const { fetch } = require("rovel.js");

function log(text) {
  fetch(process.env.CONSOLE_LOG, {
    method: "POST",
    headers: {
      "Content-Type": "application/json"
    },
    body: JSON.stringify({
      "username": "RDL console.log",
      "content": text
    })
  });
  globalThis.logg(text); // logg is the original console.log (see the import snippet below)
}

function error(text) {
  fetch(process.env.CONSOLE_LOG, {
    method: "POST",
    headers: {
      "Content-Type": "application/json"
    },
    body: JSON.stringify({
      "username": "RDL console.error",
      "content": text
    })
  });
  globalThis.logerr(text); // original console.error
}

function warn(text) {
  fetch(process.env.CONSOLE_LOG, {
    method: "POST",
    headers: {
      "Content-Type": "application/json"
    },
    body: JSON.stringify({
      "username": "RDL console.warn",
      "content": text
    })
  });
  globalThis.warnn(text); // original console.warn
}

module.exports = { log, error, warn };
And import it:
const loggy = require("#utils/loggy.js");
// Save the original console methods first, then swap them for the webhook-backed versions.
globalThis.logg = console.log;
globalThis.console.log = loggy.log;
globalThis.logerr = console.error;
globalThis.console.error = loggy.error;
globalThis.warnn = console.warn;
globalThis.console.warn = loggy.warn;
globalThis.fetch = rovel.fetch;
(rovel.fetch is essentially a copy of node-fetch.)
process.env.CONSOLE_LOG is the Discord webhook URL you want to log to.
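(If the app runs on Heroku, you can supply that value as a config var, e.g. heroku config:set CONSOLE_LOG=<your Discord webhook URL>.)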
I have a case in my application where I need to send data as form data to a server. The data includes a message and an optional list of files. The problem I'm facing is that when sending the request it's not being formed properly.
Request payload — expected (a sample of the same request made in the browser) vs. actual (the resulting request when running in NativeScript); screenshots omitted.
The actual result is that the payload is somehow being URL-encoded.
Code example
sendData({ id, message, files }) {
  const config = {
    headers: {
      'Content-Type': 'multipart/form-data'
    }
  };

  const payload = new FormData();
  payload.append('message', message);
  if (files && files.length > 0) {
    files.forEach((file) => {
      payload.append('files', file, file.name);
    });
  }

  return AXIOS_INSTANCE.post(
    `/api/save/${id}`,
    payload,
    config
  );
}
As you can see above, I'm using axios and trying to use FormData to format the data. From my research it seems that NativeScript used to not support binary data via XHR; however, looking at this merge request on GitHub, it appears to have been fixed about a year ago.
So my suspicion is that I'm doing something wrong. Maybe there's an alternative to using FormData, or perhaps I shouldn't use axios for this particular request?
Version Numbers
nativescript 6.8.0
tns-android 6.5.3
tns-ios 6.5.3
NativeScript's background-http plugin supports multipart form data.
See below for how it's configured to do a multipart upload:
var bghttp = require("nativescript-background-http");
var session = bghttp.session("image-upload");

var request = {
  url: url,
  method: "POST",
  headers: {
    "Content-Type": "application/octet-stream"
  },
  description: "Uploading "
};

var params = [
  { name: "test", value: "value" },
  { name: "fileToUpload", filename: file, mimeType: "image/jpeg" }
];

var task = session.multipartUpload(params, request);
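For completeness, the task returned by multipartUpload also emits events you can subscribe to. The snippet below follows the plugin's event-based API as documented in its README, but treat the exact payload fields as something to verify against the version you use:
task.on("progress", function (e) {
  console.log("uploaded " + e.currentBytes + " / " + e.totalBytes);
});
task.on("responded", function (e) {
  console.log("received " + e.responseCode + " code, server sent: " + e.data);
});
task.on("error", function (e) {
  console.log("upload error, response code: " + e.responseCode);
});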
I have the following Node.js code, per this and this:
const request = require('request');
const WebSocket = require('ws');

const ws = new WebSocket('wss://ws.tradier.com/v1/markets/events');

request({
  method: 'post',
  url: 'https://api.tradier.com/v1/markets/events/session',
  form: {
  },
  headers: {
    'Authorization': 'Bearer MY_API_KEY_NOT_SHOWN',
    'Accept': 'application/json'
  }
}, (error, response, body) => {
  console.log(response.statusCode);
  console.log(body);
  let data = JSON.parse(body);
  let sessionId = data.stream.sessionid;
  streamPrice(sessionId);
});

function streamPrice(sessionId) {
  console.log(sessionId);
  ws.on('open', function open() {
    console.log('Connected, sending subscription commands...');
    ws.send(`{"symbols": ["TSLA"], "sessionid": "${sessionId}", "linebreak": true}`);
  });
  ws.on('message', function incoming(data) {
    console.log(data);
  });
  ws.on('error', function error(data) {
    console.log(data);
  });
}
I get a 200 OK back from the API request that creates the WebSocket session, and I have a valid session ID:
200
{"stream":{"url":"https:\/\/stream.tradier.com\/v1\/markets\/events","sessionid":"6ba4158d-8ff8-46c3-b005-***********"}}
6ba4158d-8ff8-46c3-b005-***********
However, the ws.on() events never fire. I am not getting any errors. The session does close after a period of time, presumably due to inactivity. But it's not inactivity on my code's part...
Is there something wrong in my code / something I'm missing to make this work?
I was able to identify the issue myself: I was opening the WebSocket too early.
Moving the following line inside the streamPrice function, instead of the global scope, resolved it:
const ws = new WebSocket('wss://ws.tradier.com/v1/markets/events');
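That way the connection is only opened once a valid session ID exists. A minimal sketch of the reworked streamPrice (same handlers as before) looks like this:
function streamPrice(sessionId) {
  // Create the WebSocket here, after the session has been created, so the
  // handlers are attached before the connection opens.
  const ws = new WebSocket('wss://ws.tradier.com/v1/markets/events');

  ws.on('open', function open() {
    console.log('Connected, sending subscription commands...');
    ws.send(`{"symbols": ["TSLA"], "sessionid": "${sessionId}", "linebreak": true}`);
  });
  ws.on('message', function incoming(data) {
    console.log(data);
  });
  ws.on('error', function error(data) {
    console.log(data);
  });
}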
I am trying to set up an E2E Cypress test.
For that, I am trying (but failing) to receive events from a mocked SSE connection; I emit the push events before the SSE connection is set up.
Q: Can you please help me sort this out? I might be doing the whole thing wrong or missing something.
Note: as per this PR, Cypress supports SSE - https://github.com/cypress-io/cypress/pull/2054
However, I was not able to find any reference to SSE support in the Cypress docs - https://docs.cypress.io/api/commands/route.html
const { EventEmitter } = require('events'); // Node's EventEmitter, used to fake the server-side pushes
const EEmitter = new EventEmitter();

cy.route({
  method: 'GET',
  url: `**/documentprocessing/startprocess`,
  status: 200,
  response: {
    "uniqueId": "abcd12345677",
  },
})
  .as(`startprocess`)
  .route({
    method: 'GET',
    status: 200,
    url: '**/documentprocessing/getSSEStatus/**',
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache',
      Connection: 'keep-alive',
    },
    onResponse: (response) => {
      EEmitter.on('push', function (event, data) {
        response.write(
          'event: ' + String(event) + '\n' +
          'data: ' + String(data) + '\n\n'
        );
      });
    }
  })
  .as(`sseStatus`);
In the code below, after the 1st API call (#startprocess), I emit the push events,
and then try to receive them in the response (in the #sseStatus route above).
cy.route(`#startprocess`);

setTimeout(function () {
  EEmitter.emit('push', 'message', { 'uniqueId': 'abcd12345677', 'uploadStatus': 'Started' });
}, 1000);

setTimeout(function () {
  EEmitter.emit('push', 'message', { 'uniqueId': 'abcd12345677', 'uploadStatus': 'Complete' });
}, 3000);

cy.wait(3000);
cy.wait(`#sseStatus`);
I had a similar issue where I needed to mock an API call with text/event-stream as the content type.
I have mocked API calls using JSON fixtures in the past; that's easy stuff, you just do something like:
cy.intercept('GET', '/endpoint?*', { fixture: 'folder/my-json.json' }).as('my-api-call');
but with text/event-stream I had to do it a little differently (keep in mind this is just what I did; it doesn't mean it's the best way, since I couldn't find anything about it in the official documentation):
import json from 'path/to/myjson.json'

cy.intercept('GET', '/endpoint*', (req) =>
  req.reply(`data: ${JSON.stringify(json)} \n\n`, {
    'content-type': 'text/event-stream'
  })
).as(`my-api-call`);
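You can then wait on the alias like any other intercept; a short example (the visited URL is just a placeholder for whatever page opens the event stream):
cy.visit('/page-that-opens-the-event-stream'); // placeholder route
cy.wait('@my-api-call'); // resolves once the stubbed text/event-stream response is served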
So I have been working on a serverless configuration that calls a Lambda function through Ajax. I enabled CORS through API Gateway, and I have double-checked the domain I specified. This domain works when calling other Lambda functions within the same API.
Now for the weird stuff.
I send a POST request to my API (I am trying to upload a file through Ajax, Lambda, and S3). If I configure the Access-Control-Allow-Origin header so that it points to the domain WITHOUT the http:// in front of it (e.g. example.com), I get:
Failed to load https://m562ogkc1l.execute-api.us-east-1.amazonaws.com/test/upload: Response to preflight request doesn't pass access control check: The 'Access-Control-Allow-Origin' header has a value 'example.com' that is not equal to the supplied origin. Origin 'http://example.com' is therefore not allowed access.
OK, fine, that is expected, since it's not the proper origin. But when I add the http:// (http://example.com) to the API's CORS configuration, I get:
Failed to load https://m562ogkc1l.execute-api.us-east-1.amazonaws.com/test/upload: No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://example.com' is therefore not allowed access. The response had HTTP status code 400.
This suggests to me that the issue lies elsewhere, but I don't know where.
I have made sure the data I pass as parameters of the Ajax call is stringified (JSON.stringify()), and I am NOT using Lambda proxy integration, which means I shouldn't have to configure responses on the Lambda side.
This all really confuses me and I wish AWS had better documentation and examples since they really want to push these serverless services.
Further code is here:
Ajax:
$('#submitButton').on('click', function () {
  //console.log(document.getElementById('fileUpload').value.substring(12)); // C:\fakepath\ in front of filename (size = 12)
  $.ajax({
    type: 'POST',
    url: 'https://m562ogkc1l.execute-api.us-east-1.amazonaws.com/test/upload',
    data: JSON.stringify({
      "id": id,
      "name": document.getElementById('fileUpload').value.substring(12),
      "body": document.getElementById('fileUpload').files[0]
    }),
    contentType: "application/json",
    success: function (data) {
      console.log(data);
      //location.reload();
    }
  });
  return false;
});
Lambda:
const AWS = require('aws-sdk');
var s3 = new AWS.S3();

exports.handler = async (event) => {
  let encodedImage = JSON.parse(event.body);
  let decodedImage = Buffer.from(encodedImage, 'base64');
  var filePath = event.id + '/' + event.name;

  var params = {
    "Body": decodedImage,
    "Bucket": "repository.example.com",
    "Key": filePath
  };

  return await new Promise((resolve, reject) => {
    s3.upload(params, function (err, data) {
      if (err) {
        let response = {
          "statusCode": 200,
          "headers": {
            "Access-Control-Allow-Origin": "http://example.com"
          },
          "body": JSON.stringify(err),
          "isBase64Encoded": false
        };
        resolve(response);
      } else {
        let response = {
          "statusCode": 200,
          "headers": {
            "Access-Control-Allow-Origin": "http://example.com"
          },
          "body": JSON.stringify(data),
          "isBase64Encoded": false
        };
        resolve(response);
      }
    });
  });
};
(Yes I threw in some response configuration for the function, I just wanted to see if it would work)
I am trying to get an access token from Spotify with the following code.
var encoded = btoa(client_id + ':' + client_secret);

function myOnClick() {
  console.log('clicked!');
  $.ajax({
    url: 'https://accounts.spotify.com/api/token',
    type: 'POST',
    data: {
      grant_type: "client_credentials",
      'Content-Type': 'application/x-www-form-urlencoded'
    },
    headers: {
      Authorization: 'Basic ' + encoded
    },
    dataType: 'json'
  }).always((data) => console.log(data));
}
However, I keep getting these errors:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading
the remote resource at https://accounts.spotify.com/api/token.
(Reason: CORS header ‘Access-Control-Allow-Origin’ missing).
and
readyState: 0, status: 0
Arielle from Spotify here.
Looks like you're using the Client Credentials Flow, which is one of 3 Authentication flows you can use with the Spotify API. (You can check out all 3 here)
Client Credentials is meant for server-side use only, and should not be used on the front-end, as it requires a client secret which you shouldn't be exposing!
You should use the Implicit Grant flow, which is made for use in the browser, instead. It's easy to get up and running, too!
// Get the hash of the url
const hash = window.location.hash
  .substring(1)
  .split('&')
  .reduce(function (initial, item) {
    if (item) {
      var parts = item.split('=');
      initial[parts[0]] = decodeURIComponent(parts[1]);
    }
    return initial;
  }, {});
window.location.hash = '';

// Set token
let _token = hash.access_token;

const authEndpoint = 'https://accounts.spotify.com/authorize';

// Replace with your app's client ID, redirect URI and desired scopes
const clientId = 'xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx';
const redirectUri = 'http://localhost:8888';
const scopes = [
  'user-read-birthdate',
  'user-read-email',
  'user-read-private'
];

// If there is no token, redirect to Spotify authorization
if (!_token) {
  window.location = `${authEndpoint}?client_id=${clientId}&redirect_uri=${redirectUri}&scope=${scopes.join('%20')}&response_type=token`;
}
Gist: https://gist.github.com/arirawr/f08a1e17db3a1f65ada2c17592757049
And here's an example on Glitch, that you can "Remix" to make a copy and start making your app: https://glitch.com/edit/#!/spotify-implicit-grant
Hope that helps - happy hacking! 👩🏼💻
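For the server side, the same Client Credentials flow can be done from Node with axios along the lines of the snippet below (this.apiLoginUrl, this.clientId, and this.clientSecret are assumed to be properties of the surrounding class):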
const result = await axios({
  url: this.apiLoginUrl,
  method: 'post',
  data: "grant_type=client_credentials",
  headers: {
    'Authorization': `Basic ${Buffer.from(this.clientId + ":" + this.clientSecret).toString('base64')}`,
  },
});