Azure Event Grid Creating subscription: "Does not have authorization to perform action" - azure-eventgrid

I am currently using Node.js to publish topics to Event Grid and subscribe to topics through Event Grid. Using the Event Grid API documented at https://learn.microsoft.com/en-us/rest/api/eventgrid/, I get an error saying I do not have authorization to perform the action when creating a subscription. I have created a topic and have permission to access my Azure account, so I am confused about why I get this REST error.
My code:
const { ClientSecretCredential } = require("@azure/identity");
const { SystemTopicEventSubscriptions, EventGridManagementClientContext, DomainTopics, EventSubscriptions } = require("@azure/arm-eventgrid");

const subscriptionId = "idea number";
const resourceGroupName = "eventgrid-dev";
const domainName = "test-domain";
let tenantId = "idea number";
let clientSecret = "idea number";
let clientId = "idea number";

const firstCredential = new ClientSecretCredential(tenantId, clientId, clientSecret);
//const client = new EventGridManagementClient(firstCredential, subscriptionId);
const clientContext = new EventGridManagementClientContext(firstCredential, subscriptionId);

// Topics
let domainTopics = new DomainTopics(clientContext);
domainTopics.beginCreateOrUpdate(resourceGroupName, domainName, "test-topic")
    .then(result => {
        console.log("result");
        console.log(result);
    })
    .catch(error => {
        console.log("Error");
        console.log(error);
    })

let subscription = new EventSubscriptions(clientContext);
subscription.beginCreateOrUpdate("/subscriptions/subscriptionId/resourceGroups/eventgrid-dev", "test-subscription", { topic: "test-topic" })
    .then(result => {
        console.log("result");
        console.log(result);
    })
    .catch(error => {
        console.log("Error");
        console.log(error);
    })
Output:
Error
RestError: The client 'subscriptionID' with object id 'subscriptionID' does not have authorization to perform action 'Microsoft.EventGrid/eventSubscriptions/Microsoft.EventGrid/test-subscription/write' over scope '/subscriptions/subscriptionID/resourceGroups/eventgrid-dev/providers/Microsoft.EventGrid/eventSubscriptions/providers/Microsoft.EventGrid/eventSubscriptions' or the scope is invalid. If access was recently granted, please refresh your credentials.
Thank you for the help!

If an Event Grid domain already exists, use the following code to create an Event Grid subscription on the requested domain topic. Note that the topic is created automatically when its first subscription is created:
let subscription = new EventSubscriptions(clientContext);
const scope = '/subscriptions/' + subscriptionId + '/resourceGroups/' + resourceGroupName + '/providers/Microsoft.EventGrid/domains/' + domainName + '/topics/test-topic';
const test_webhookEndpointUrl = ' ... ';

subscription.beginCreateOrUpdate(scope, "test-subscription",
    {
        destination: {
            endpointType: "WebHook",
            endpointUrl: test_webhookEndpointUrl
        }
    })
    .then(result => {
        console.log("result");
        console.log(result);
    })
    .catch(error => {
        console.log("Error");
        console.log(error);
    })
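To double-check that the subscription landed on the intended domain-topic scope, a quick read-back can help. This is only a sketch reusing the clientContext, subscription, and scope variables from the snippet above, and it assumes the EventSubscriptions.get(scope, eventSubscriptionName) signature of this SDK version, so verify it against your installed @azure/arm-eventgrid:
// Sketch: read the event subscription back on the same scope.
subscription.get(scope, "test-subscription")
    .then(sub => {
        // provisioningState should eventually report "Succeeded"
        console.log("provisioningState:", sub.provisioningState);
    })
    .catch(error => {
        console.log("Error fetching subscription");
        console.log(error);
    });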

Related

Apollo Client GraphQL: When getting FORBIDDEN error, automatically get new JWT AccessToken and RefreshToken. How does the logic work?

In the following code, you can see that I am creating an errorLink. It makes use of an observable, a subscriber, and the forward() function.
Can someone explain to me what exactly is happening here? I am somewhat familiar with observables, but I cannot understand what's going on in this case.
When the observable is created, where does the observer argument come from?
I would love to dive a bit deeper.
Also, why is bind used when creating the subscriber?
const errorLink = onError(
  ({ graphQLErrors, networkError, operation, forward }) => {
    if (graphQLErrors) {
      for (let err of graphQLErrors) {
        switch (err.extensions.code) {
          case "FORBIDDEN":
            console.log("errs!")
            // ignore 401 error for a refresh request
            if (operation.operationName === "RehydrateTokens") return
            const observable = new Observable<FetchResult<Record<string, any>>>(
              (observer) => {
                console.log(observer)
                // used an anonymous function for using an async function
                ;(async () => {
                  try {
                    console.log("yop bin hier")
                    const accessToken = await refreshToken()
                    console.log("AT!", accessToken)
                    if (!accessToken) {
                      throw new GraphQLError("Empty AccessToken")
                    }
                    // Retry the failed request
                    const subscriber = {
                      next: observer.next.bind(observer),
                      error: observer.error.bind(observer),
                      complete: observer.complete.bind(observer),
                    }
                    forward(operation).subscribe(subscriber)
                  } catch (err) {
                    observer.error(err)
                  }
                })()
              }
            )
            return observable
        }
      }
    }
    if (networkError) console.log(`[Network error]: ${networkError}`)
  }
)
Just so that you understand the context: I am combining multiple Apollo links.
const httpLink = createHttpLink({
  uri: "http://localhost:3000/graphql",
})

// Returns the access token if the operation is not a refresh token request
function returnTokenDependingOnOperation(operation: GraphQLRequest) {
  if (isRefreshRequest(operation)) {
    return localStorage.getItem("refreshToken")
  } else return localStorage.getItem("accessToken")
}

const authLink = setContext((operation, { headers }) => {
  let token = returnTokenDependingOnOperation(operation)
  console.log("tk!!!", token)
  return {
    headers: {
      ...headers,
      authorization: token ? `Bearer ${token}` : "",
    },
  }
})

const client = new ApolloClient({
  link: ApolloLink.from([errorLink, authLink, httpLink]),
  cache: new InMemoryCache(),
})
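To make the two sub-questions concrete, here is a minimal standalone sketch (not taken from the code above) of how this kind of Observable behaves. It assumes zen-observable, the implementation Apollo Client's Observable is based on: the function passed to the constructor runs once per subscribe() call, and the library builds the observer argument from the handlers the subscriber supplies. bind is needed because observer.next/error/complete rely on this being the observer when they are later called as detached function references.
// Sketch only; zen-observable here is an assumption standing in for Apollo's Observable.
const Observable = require("zen-observable")

const source = new Observable(observer => {
  // `observer` is created by the library for the subscribe() call below and
  // wraps the next/complete handlers that call passes in.
  observer.next("first value")
  observer.complete()
})

// Nothing runs until someone subscribes; subscribing is what invokes the
// constructor function and supplies its `observer` argument.
source.subscribe({
  next: value => console.log("next:", value),
  complete: () => console.log("done"),
})

// In the errorLink, forward(operation).subscribe(subscriber) later calls
// subscriber.next(...) as a plain function reference. observer.next uses
// `this` internally, so .bind(observer) keeps it pointing at the right
// observer; otherwise the retried result would never reach whoever
// subscribed to the outer observable.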

`next.js` API resolves before the promise fulfills?

I want to achieve something like this:
1. Call my website URL https://mywebsite/api/something
2. My Next.js website API then calls an external API
3. Get the external API data
4. Update the external API data in the MongoDB database one by one
5. Then return a response with its status
The code below is working; the data does get updated in MongoDB, but when I request my API URL it responds very quickly and only then updates the data in the database.
But I want it to first update the data in the database and then respond to me,
no matter how much time it takes.
Below is my code
export default async function handler(req, res) {
  async function updateServer() {
    return new Promise(async function (resolve, reject) {
      const statusArray = [];
      const apiUrl = `https://example.com/api`;
      const response = await fetch(apiUrl, { headers: { "Content-Type": "application/json" } });
      const newsResults = await response.json();
      const articles = await newsResults["articles"];
      for (let i = 0; i < articles.length; i++) {
        const article = articles[i];
        try {
          insertionData["title"] = article["title"];
          insertionData["description"] = article["description"];
          MongoClient.connect(mongoUri, async function (error, db) {
            if (error) throw error;
            const articlesCollection = db.db("database").collection("collectionname");
            const customQuery = { url: article["url"] };
            const customUpdate = { $set: insertionData };
            const customOptions = { upsert: true };
            const status = await articlesCollection.updateOne(customQuery, customUpdate, customOptions);
            statusArray.push(status);
            db.close();
          });
        } catch (error) { console.log(error); }
      }
      if (statusArray) {
        console.log("success", statusArray.length);
        resolve(statusArray);
      } else {
        console.log("error");
        reject("reject because no statusArray");
      }
    });
  }
  updateServer().then(
    function (statusArray) {
      return res.status(200).json({ "response": "success", "statusArray": statusArray }).end();
    }
  ).catch(
    function (error) {
      return res.status(500).json({ "response": "error", }).end();
    }
  );
}
How can I achieve that?
Any suggestions are always welcome!
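One way the handler could be restructured (a sketch, not the original poster's code) is to await every database write before sending the response: connect once, use the driver's promise API inside the loop, and only call res.status(...) after the loop finishes. It assumes mongoUri is defined in the surrounding module, as in the question.
// Sketch: respond only after all updates have completed.
import { MongoClient } from "mongodb";

export default async function handler(req, res) {
  try {
    const response = await fetch("https://example.com/api", {
      headers: { "Content-Type": "application/json" },
    });
    const { articles } = await response.json();

    const client = await MongoClient.connect(mongoUri); // connect once, not per article
    const collection = client.db("database").collection("collectionname");

    const statusArray = [];
    for (const article of articles) {
      const insertionData = { title: article.title, description: article.description };
      // awaiting here keeps the handler from responding until this write is done
      const status = await collection.updateOne(
        { url: article.url },
        { $set: insertionData },
        { upsert: true }
      );
      statusArray.push(status);
    }
    await client.close();

    // reached only after every update has completed
    res.status(200).json({ response: "success", statusArray });
  } catch (error) {
    console.log(error);
    res.status(500).json({ response: "error" });
  }
}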

AWS API gateway websocket receives messages inconsistently

I have a WebSocket API in API Gateway connected to a Lambda that looks like this:
const AWS = require('aws-sdk');
const amqp = require('amqplib');

const api = new AWS.ApiGatewayManagementApi({
  endpoint: 'MY_ENDPOINT',
});

async function sendMsgToApp(response, connectionId) {
  console.log('=========== posting reply');
  const params = {
    ConnectionId: connectionId,
    Data: Buffer.from(response),
  };
  return api.postToConnection(params).promise();
}

let rmqServerUrl = 'MY_RMQ_SERVER_URL';
let rmqServerConn = null;

exports.handler = async event => {
  console.log('websocket event:', event);
  const { routeKey: route, connectionId } = event.requestContext;
  switch (route) {
    case '$connect':
      console.log('user connected');
      const creds = event.queryStringParameters.x;
      console.log('============ x.length:', creds.length);
      const decodedCreds = Buffer.from(creds, 'base64').toString('utf-8');
      try {
        const conn = await amqp.connect(
          `amqps://${decodedCreds}@${rmqServerUrl}`
        );
        const channel = await conn.createChannel();
        console.log('============ created channel successfully:');
        rmqServerConn = conn;
        const [userId] = decodedCreds.split(':');
        const { queue } = await channel.assertQueue(userId, {
          durable: true,
          autoDelete: false,
        });
        console.log('============ userId:', userId, 'queue:', queue);
        channel.consume(queue, msg => {
          console.log('========== msg:', msg);
          const { content } = msg;
          const msgString = content.toString('utf-8');
          console.log('========== msgString:', msgString);
          sendMsgToApp(msgString, connectionId)
            .then(res => {
              console.log(
                '================= sent queued message to the app, will ack, outcome:',
                res
              );
              try {
                channel.ack(msg);
              } catch (e) {
                console.log(
                  '================= error acking message:',
                  e
                );
              }
            })
            .catch(e => {
              console.log(
                '================= error sending queued message to the app, will not ack, error:',
                e
              );
            });
        });
      } catch (e) {
        console.log(
          '=========== error initializing amqp connection',
          e
        );
        if (rmqServerConn) {
          await rmqServerConn.close();
        }
        const response = {
          statusCode: 401,
          body: JSON.stringify('failed auth!'),
        };
        return response;
      }
      break;
    case '$disconnect':
      console.log('user disconnected');
      if (rmqServerConn) {
        await rmqServerConn.close();
      }
      break;
    case 'message':
      console.log('message route');
      await sendMsgToApp('test', connectionId);
      break;
    default:
      console.log('unknown route', route);
      break;
  }
  const response = {
    statusCode: 200,
    body: JSON.stringify('Hello from websocket Lambda!'),
  };
  return response;
};
The amqp connection is for a RabbitMQ server provisioned by Amazon MQ. The problem I have is that messages published to the queue either do not show up at all in the .consume callback, or they only show up after the websocket is disconnected and reconnected. Essentially they are missing until a much later point, at which they show up unexpectedly. That is within the websocket Lambda. Even when they do show up, they don't get sent to the client (the app in this case) that is connected to the websocket. What could be the problem here?
The problem here is that I had the wrong idea about how API Gateway's websockets work. API Gateway maintains the websocket connection, but not the Lambda itself. I put my .consume subscription logic inside the Lambda, which doesn't work because the Lambda runs and terminates instead of being kept alive. A better method would be to make the queue an event source for the Lambda, as sketched below. However, that also didn't work for me because it requires you to know your queues when setting up the Lambda, and my queues are created dynamically, which violates that requirement. I ended up standing up a RabbitMQ server on a VPS.
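For reference, a minimal sketch (not from the original setup) of what the handler looks like when a RabbitMQ queue is wired up as a Lambda event source mapping instead. The field names follow the event shape AWS documents for Amazon MQ sources (messages grouped per queue under rmqMessagesByQueue, with base64-encoded payloads); treat them as an assumption to verify against the current docs.
// Sketch: Lambda invoked by an Amazon MQ (RabbitMQ) event source mapping.
exports.handler = async event => {
  // Assumed shape: { rmqMessagesByQueue: { "queueName::vhost": [ { data, basicProperties, ... } ] } }
  for (const [queueKey, messages] of Object.entries(event.rmqMessagesByQueue || {})) {
    for (const message of messages) {
      const body = Buffer.from(message.data, 'base64').toString('utf-8');
      console.log(`message from ${queueKey}:`, body);
      // forward to the connected client here, e.g. via
      // new AWS.ApiGatewayManagementApi(...).postToConnection(...)
    }
  }
  // No explicit ack: with an event source mapping, Lambda acknowledges the batch
  // when the handler returns successfully and redelivers on error.
};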

ECONNREFUSED 127.0.0.1:5432 RDS, Postgres

I've been stuck on this error for quite a long time, and am reaching out to see if anyone knows what to do!
I'm making an endpoint using Lambda that, when hit, should first insert a contract into a Postgres DB running on RDS and then insert the contents of a message into the same database (and send an email using SES, but I've already figured that part out). What I want to do is pass a new Postgres pool into the send-message function below, but every time I do and then try to reach the database, I get an Error: connect ECONNREFUSED 127.0.0.1:5432
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1141:16). This happens when I run serverless offline. The initial SQL query runs but the second one doesn't.
I would really appreciate any guidance; I've pasted the relevant code snippets below. This is my first time doing anything with JavaScript, so this might be a very dumb question.
contracts.js
const db = require('../db_connect');
const messages = require('./messages');

// POST endpoint to create a generic contract, it's worth keeping in mind that whenever we create a contract, it will always default to false for field isFinished
module.exports.createContract = (event, context, callback) => {
  context.callbackWaitsForEmptyEventLoop = false;
  const reqBody = JSON.parse(event.body);
  const influencerID = reqBody.influencerID;
  const buyerID = reqBody.buyerID;
  const monetaryValue = reqBody.monetaryValue;
  const charityID = reqBody.charityID;
  db.query('INSERT INTO contracts(influencer_id, buyer_id, monetary_value, charity_id, is_finished) VALUES($1, $2, $3, $4, false)', [influencerID, buyerID, monetaryValue, charityID])
    .then(res => {
      callback(null, {
        statusCode: 200,
        body: 'successfully inserted contract'
      })
      messages.sendMessage(db, reqBody, buyerID, influencerID)
    })
    .catch(e => {
      console.log(e);
      callback(null, {
        statusCode: e.statusCode || 500,
        body: 'Error:' + e
      });
    });
};
messages.js
var dotenv = require('dotenv').config();
var aws = require('aws-sdk')
const db = require('../db_connect');

function sendMessage(reqBody, senderID, receiverID) {
  const messageBody = reqBody.messageBody;
  const timestamp = Date.now();
  var receiverEmail;
  db.query("SELECT e_mail FROM users WHERE user_id = $1", [receiverID])
    .then(res => {
      console.log(res)
      receiverEmail = res.rows[0]
      console.log("this is the new email --> ", receiverEmail)
    })
    .catch(e => {
      throw console.error(e);
    })
}
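Since both files require '../db_connect', it is worth noting where the connection settings come from: ECONNREFUSED 127.0.0.1:5432 is what the pg client reports when it falls back to its localhost defaults, i.e. when no host was supplied at the time the pool was created. The sketch below is an assumption about what db_connect.js might look like, with illustrative environment variable names rather than the real ones:
// db_connect.js sketch (assumption: the real file builds a pg Pool).
require('dotenv').config();
const { Pool } = require('pg');

const pool = new Pool({
  host: process.env.DB_HOST,                 // the RDS endpoint; if undefined, pg defaults to 127.0.0.1
  port: Number(process.env.DB_PORT || 5432),
  user: process.env.DB_USER,
  password: process.env.DB_PASSWORD,
  database: process.env.DB_NAME,
  ssl: { rejectUnauthorized: false },        // often needed for RDS; verify for your setup
});

module.exports = {
  query: (text, params) => pool.query(text, params),
};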

Dialogflow v2 Actions on Google response timeout

Hi, I have a timeout problem getting a JSON response; I am using the Google Places API to look for the closest location.
Could anyone help me with this? Thanks.
const PlaceSearch = require("./node_modules/googleplaces/lib/NearBySearch.js");
const PlaceDetailsRequest = require("./node_modules/googleplaces/lib/PlaceDetailsRequest.js");

app.intent('Ask Location', conv => {
  conv.ask(new Permission({ context: 'To start', permissions: 'DEVICE_PRECISE_LOCATION' }));
});

app.intent('geolocation.intent', (conv, params, granted) => {
  if (granted) {
    var coordinates = conv.device.location.coordinates;
    var location = [coordinates.latitude, coordinates.longitude];
    var searchParameters = {
      location: location,
      name: 'Store Name',
      radius: 10000
    };
    var config = {
      apiKey: '#####',
      outputFormat: 'json'
    };
    var placeSearch = new PlaceSearch(config.apiKey, config.outputFormat);
    var placeDetailsRequest = new PlaceDetailsRequest(config.apiKey, config.outputFormat);
    placeSearch(searchParameters, function (error, search_response) {
      if (search_response.status === 'OK') {
        placeDetailsRequest({ reference: search_response.results[0].reference }, function (error, details_response) {
          conv.ask(`Your closest store is at ${details_response.result.formatted_address}.`);
        });
      }
    });
  }
});
I solved the issue by making the request to the Google API via its URL and using a promise.
const request = require("request");

app.intent("geolocation.intent", conv => {
  return new Promise((resolve, reject) => {
    ...
    request(options, (error, response, body) => {
      ...
      if (error) {
        ...
        reject(...);
      } else {
        ...
        resolve(...);
      }
    });
  }).then(result => {
    const address = result.address;
    conv.ask('Your closest store is...');
  }).catch(error => {
    conv.close('Error in Promise');
  });
});
What I learned is that in Dialogflow API v2 you need to use (and return) a promise when you make an asynchronous request.
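Distilled down, the pattern looks like the sketch below; returning the promise from the intent handler is what makes the library wait for the asynchronous work before sending the response. lookUpClosestStore is a hypothetical helper standing in for the Places calls above.
// Sketch of the general pattern; lookUpClosestStore is a hypothetical placeholder.
const { dialogflow } = require('actions-on-google');
const app = dialogflow();

function lookUpClosestStore(coordinates) {
  // stands in for the NearBySearch + PlaceDetails requests
  return Promise.resolve('123 Example St');
}

app.intent('geolocation.intent', conv => {
  const coordinates = conv.device.location.coordinates;
  // returning the promise is the important part
  return lookUpClosestStore(coordinates)
    .then(address => conv.ask(`Your closest store is at ${address}.`))
    .catch(() => conv.close('Sorry, something went wrong.'));
});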
