DoubleClick Search API Explorer: 'Login Required' for reports.getFile

Quick question about a 401 error I'm running into when executing reports.getFile in the DoubleClick Search API Explorer (full error message below).
I am currently logged into the Google account that is used for DoubleClick Search, and I've tried logging in from a different browser as well.
The reports.request and reports.get executions complete without issue; I only get this error at this step. "isReportReady" is also true before I execute reports.getFile. The full list of requests/responses is below this initial message.
My understanding is that with the API Explorer I should just be able to turn on OAuth 2.0 and execute, but correct me if I'm wrong!
{
  "error": {
    "errors": [
      {
        "domain": "global",
        "reason": "required",
        "message": "Login Required",
        "locationType": "header",
        "location": "Authorization"
      }
    ],
    "code": 401,
    "message": "Login Required"
  }
}
If any additional details are needed, definitely let me know. The only similar questions I've seen here are based around hand-written code or other API Explorer services.
Thanks in advance for any insight!
reports.Request (Response)
POST https://www.googleapis.com/doubleclicksearch/v2/reports?fields=isReportReady%2Crequest%2Fcolumns%2FcolumnName&key={YOUR_API_KEY}
{
"downloadFormat": "csv",
"maxRowsPerFile": 6000000,
"reportType": "conversion",
"statisticsCurrency": "agency",
"reportScope": {
"agencyId": "xxxxxxxxxxxxxxxxxxxx",
"advertiserId": "xxxxxxxxxxxxxxxxxxx"
},
"timeRange": {
"startDate": "2014-01-1",
"endDate": "2016-11-3"
},
"columns": [
{
"columnName": "conversionVisitExternalClickId"
},
{
"columnName": "conversionTimestamp"
},
{
"columnName": "keywordid"
},
{
"columnName": "adGroup"
},
{
"columnName": "campaign"
},
{
"columnName": "account"
},
{
"columnName": "floodlightActivity"
},
{
"columnName": "conversionID"
}
]
}
reports.Get (Response)
{
"kind": "doubleclicksearch#report",
"id": "xxxxxxxxxxxxx",
"isReportReady": true,
"request": {
"reportScope": {
"agencyId": "xxxxxxxxxxxxxxx",
"advertiserId": "xxxxxxxxxxxxx"
},
"reportType": "conversion",
"columns": [
{
"columnName": "conversionVisitExternalClickId"
},
{
"columnName": "conversionTimestamp"
},
{
"columnName": "keywordId"
},
{
"columnName": "adGroup"
},
{
"columnName": "campaign"
},
{
"columnName": "account"
},
{
"columnName": "floodlightActivity"
},
{
"columnName": "conversionId"
}
],
"timeRange": {
"startDate": "2014-01-1",
"endDate": "2016-11-3"
},
"includeRemovedEntities": false,
"downloadFormat": "csv",
"statisticsCurrency": "agency",
"maxRowsPerFile": 6000000
},
"statisticsCurrencyCode": "USD",
"statisticsTimeZone": "America/New_York",
"rowCount": 5225201,
"files": [
{
"url": "https://www.googleapis.com/doubleclicksearch/v2/reports/AAAncMCcKrSJ9MS_/files/0",
"byteCount": "1271002762"
}
]
}
reports.getFile (Response)
GET https://www.googleapis.com/doubleclicksearch/v2/reports/AAAncMCcKrSJ9MS_/files/0?key={YOUR_API_KEY}
200
cache-control: private, max-age=0, must-revalidate, no-transform
content-length: 1694670352
Content-Type: text/csv
date: Tue, 21 Mar 2017 14:24:03 GMT
etag: W/"u6ybyji1w_vhYZMTUlOUhEWemRI/nWLUgbdAKD9EU6srfscDcZyGxz8"
expires: Tue, 21 Mar 2017 14:24:03 GMT
server: UploadServer
vary: Origin, X-Origin
x-guploader-uploadid: AEnB2UoZzz4AafhGUKg-1edKZWLCmiML_a6HnP7lr11aD7tXZAnEIgyEno9ZhHOCbJ02XTlAkgtYcD4AOGbhmar6dpSx-uzh-n_6SCQH-iShQma-k6GLsMk

Related

CloudFront doesn't fetch from custom origin for root URL using Lambda@Edge

I have a Lambda@Edge function that decides which origin to use on the origin request, depending on a header value.
It doesn't work for the root URL (mysite.com), but it works for subroutes (mysite.com/sth/abc). I am trying to figure out why it doesn't work for the root URL.
It looks like this:
exports.handler = (event, context, callback) => {
  const request = event.Records[0].cf.request;
  const headers = request.headers; // was referenced below without being defined
  const isBot = headers['formaviva-agent'] && headers['formaviva-agent'][0].value === 'bot';
  console.log('request before: ' + JSON.stringify(request));
  console.log('isBot: ' + isBot);
  const shouldPrerender = isBot;
  if (shouldPrerender) {
    request.origin = {
      custom: {
        domainName: 'fast.formaviva.com',
        port: 80,
        protocol: 'http',
        path: '',
        sslProtocols: ['TLSv1', 'TLSv1.1'],
        readTimeout: 5,
        keepaliveTimeout: 5,
        customHeaders: {}
      }
    };
    request.headers['host'] = [{ key: 'host', value: 'fast.formaviva.com' }];
    console.log('request after: ' + JSON.stringify(request));
  }
  callback(null, request);
};
The default origin is S3 static-page hosting. formaviva-agent is set in a viewer-request Lambda function that checks whether the User-Agent belongs to a bot, and formaviva-agent is whitelisted.
Judging by the logs, everything goes well: the bot is recognized and the origin is changed to the custom origin:
2019-05-07T09:56:45.627Z 3a2831f4-2a6b-46ea-8850-d89651a9b19e request after
{
"clientIp": "92.37.21.9",
"headers": {
"if-modified-since": [
{
"key": "If-Modified-Since",
"value": "Mon, 06 May 2019 10:50:39 GMT"
}
],
"if-none-match": [
{
"key": "If-None-Match",
"value": "W/\"41cc-16a8cc45892\""
}
],
"user-agent": [
{
"key": "User-Agent",
"value": "Amazon CloudFront"
}
],
"via": [
{
"key": "Via",
"value": "1.1 b38e161751a953866db739b688c09996.cloudfront.net (CloudFront)"
}
],
"formaviva-agent": [
{
"key": "formaviva-agent",
"value": "bot"
}
],
"x-forwarded-for": [
{
"key": "X-Forwarded-For",
"value": "92.37.21.9"
}
],
"host": [
{
"key": "host",
"value": "fast.formaviva.com"
}
]
},
"method": "GET",
"origin": {
"custom": {
"domainName": "fast.formaviva.com",
"port": 80,
"protocol": "http",
"path": "",
"sslProtocols": [
"TLSv1",
"TLSv1.1"
],
"readTimeout": 5,
"keepaliveTimeout": 5,
"customHeaders": {}
}
},
"querystring": "",
"uri": "/index.html"
}
In this case, CloudFront still serves the content from the S3 origin; the custom origin server does not get hit (confirmed), even though the custom origin is specified in the request. Why?

Multiple labels for some emails

When we trash a mail after sending it, the UI displays it with only the "Trash" label; however, the API shows both "Sent" and "Trash":
{
"id": "16169c0c3d212e74",
"threadId": "16169c0c3d212e74",
"labelIds": [
"TRASH",
"SENT"
],
"snippet": "#Testing ",
"historyId": "1893418",
"internalDate": "1517897696000",
"payload": {
"partId": "",
"mimeType": "multipart/alternative",
"filename": "",
"headers": [
{
"name": "MIME-Version",
"value": "1.0"
},
{
"name": "Received",
"value": "by xx.xx.xx.xx with HTTP; Mon, 5 Feb 2018 22:14:56 -0800 (PST)"
},
{
"name": "Date",
"value": "Tue, 6 Feb 2018 11:44:56 +0530"
},
{
"name": "Delivered-To",
"value": "xxx#xxx"
},
{
"name": "Message-ID",
"value": "xxx"
},
{
"name": "Subject",
"value": "TEST2"
},
{
"name": "From",
"value": "xxx"
},
{
"name": "To",
"value": "xxxx"
},
{
"name": "Content-Type",
"value": "multipart/alternative; boundary=\"f403045c3c98fab46e05648518a7\""
}
],
"body": {
"size": 0
},
"parts": [
{
"partId": "0",
"mimeType": "text/plain",
"filename": "",
"headers": [
{
"name": "Content-Type",
"value": "text/plain; charset=\"UTF-8\""
}
],
"body": {
"size": 423,
"data": "----"
}
}
]
},
"sizeEstimate": 1810
}
Some mails also have label combinations like [SENT, INBOX]. Is there any way to get the latest or most relevant label? I would like to categorize mails based on their labels, and multiple labels create contradictions.
After you send an email it gets the SENT label. When you trash the email you add the TRASH label; this does not remove any labels that had already been added.
I would suspect that the Gmail UI has a filter that does not display trashed mails in the Sent mailbox.
Solution: when you trash your email, make sure to also remove the SENT label, or filter out all other labels in your application whenever a message carries the TRASH label.
The Gmail API returns the data it has; it's up to you either to ensure a message carries only the correct labels (by deleting other labels after you trash an email) or to filter out the labels you are not interested in.
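Following that advice, the client-side filtering can be as simple as treating TRASH as overriding everything else (a sketch; the label IDs are as returned by the API above). Alternatively, calling users.messages.modify with removeLabelIds: ["SENT"] right after trashing keeps the stored labels clean at the source.

```javascript
// Treat TRASH as overriding all other labels when categorizing messages.
function effectiveLabels(labelIds) {
  return labelIds.includes('TRASH') ? ['TRASH'] : labelIds;
}

console.log(effectiveLabels(['TRASH', 'SENT']));  // → [ 'TRASH' ]
console.log(effectiveLabels(['SENT', 'INBOX'])); // → [ 'SENT', 'INBOX' ]
```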

Unable to parse my custom log in Logstash

Given below is my service log. I want to parse this log in Logstash; please suggest a plugin or method for parsing it.
"msgs": [{
"ts": "2017-07-17T12:22:00.2657422Z",
"tid": 4,
"eid": 1,
"lvl": "Information",
"cat": "Microsoft.AspNetCore.Hosting.Internal.WebHost",
"msg": {
"cnt": "Request starting HTTP/1.1 POST http://localhost:20001/Processor text/xml; charset=utf-8 601",
"Protocol": "HTTP/1.1",
"Method": "POST",
"ContentType": "text/xml; charset=utf-8",
"ContentLength": 601,
"Scheme": "http",
"Host": "localhost:20001",
"PathBase": "",
"Path": "/Processor",
"QueryString": ""
}
},
{
"ts": "2017-07-17T12:22:00.4617773Z",
"tid": 4,
"lvl": "Information",
"cat": "NCR.CP.Service.ServiceHostMiddleware",
"msg": {
"cnt": "REQ"
},
"data": {
"Headers": {
"Connection": "Keep-Alive",
"Content-Length": "601",
"Content-Type": "text/xml; charset=utf-8",
"Accept-Encoding": "gzip, deflate",
"Expect": "100-continue",
"Host": "localhost:20001",
"SOAPAction": "\"http://servereps.mtxeps.com/TransactionService/SendTransaction\""
}
}
}]
Please suggest a way to apply a filter to this type of log so that I can extract fields from it and visualize them in Kibana.
I have heard of the grok filter, but what pattern would I use here?
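Since each entry in msgs is already JSON, grok may be unnecessary; a sketch of a Logstash filter block using the json and date plugins (this assumes each event line reaching Logstash is one JSON object with the fields shown above, such as ts and cat):

```
filter {
  # Parse the JSON event into top-level fields (ts, tid, lvl, cat, msg, ...)
  json {
    source => "message"
  }
  # Use the log's own timestamp as the event's @timestamp
  # (ts values above are ISO 8601, e.g. 2017-07-17T12:22:00.2657422Z)
  date {
    match => ["ts", "ISO8601"]
  }
}
```

If the whole msgs array arrives as a single document instead of line-by-line events, a split filter on the msgs field would be needed first.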

Gmail is blocking my Elasticsearch Watcher email

I am using ES 5.2 and have implemented a watcher. Each time the watcher is triggered it generates an email, but Google blocks that email due to a security concern. What can be the solution for that?
My YML file is as below:
cluster.name: elasticsearch-logging
node.name: "elasticsearch-logging-0"
path.data: /var/lib/elasticsearch/data
xpack.notification.email.account:
  gmail_account:
    profile: gmail
    smtp:
      auth: true
      starttls.enable: true
      host: smtp.gmail.com
      port: 587
      user: ******.**@gmail.com
      password: ******
Doing a curl on the watcher:
curl -XGET localhost:9200/_xpack/watcher/watch/last_watch
returns the response below:
{
"found": true,
"id": "lastwatch",
"status": {
"version": 5,
"state": {
"active": true,
"timestamp": "2017-06-16T00:39:16.654Z"
},
"lastchecked": "2017-06-16T00:43:00.229Z",
"last_met_condition": "2017-06-16T00:43:00.229Z",
"actions": {
"email_admin": {
"ack": {
"timestamp": "2017-06-16T00:39:16.654Z",
"state": "awaits_successful_execution"
},
"last_execution": {
"timestamp": "2017-06-16T00:43:00.229Z",
"successful": false,
"reason": "MessagingException[failed to send email with subject [404 recently encountered] via account [gmail_account]]; nested: AuthenticationFailedException[534-5.7.14 https://accounts.google.com/signin/continue?sarp=1&scc=1&pltn534-5.7.14 q0WEdpll7GFx7wL5ZoIKlaHy0JIWKkJEAaiNf5hWY11ZPPsJb6u7h9z0Xe\n534-5.7.14 kWiT264a1EJgbKW5ESeccxI0uUZ_3X4klQS4jBjB7dDw6pRU490p-yKtXkL2-Ik\n534-5.7.14 vMoQFBgYsmH2WbbGFC3Z63GBpWVH0O9LmpVsB89ZsSreIXN_bb0AX3UWwoX4dTb4UiXtmi\nQI Please log in via your web browser and\n534-5.7.14 then try again.\n534-5.7.14 Learn more at\n534 5.7.14 https://support.google.com/mail/answer/78754 a22sm752699pfc.115 - gsmtp\n]; "
}
}
}
},
"watch": {
"trigger": {
"schedule": {
"cron": "0 0/1 * * * ?"
}
},
"input": {
"search": {
"request": {
"search_type": "query_then_fetch",
"indices": [
"logstash*"
],
"types": [],
"body": {
"query": {
"bool": {
"must": {
"match": {
"methodName": "getSSLConnectionSocketFactory"
}
}
}
}
}
}
}
},
"condition": {
"compare": {
"ctx.payload.hits.total": {
"gt": 0
}
}
},
"actions": {
"email_admin": {
"email": {
"profile": "standard",
"to": [
"****.*****#gmail.com"
],
"subject": "404 recently encountered"
}
}
}
}
}
This looks like a javax.mail authentication issue: Gmail is rejecting the SMTP login (note the 534 response asking you to log in via your web browser). Turning on "Allow less secure apps" for that Google account should resolve it.

LogicApp CRM connector: create record gives a 500 error

I've built a fairly simple Logic App with the most recent version, which came out about a week ago. It runs every hour and tries to create a record in CRM Online. I've created a Logic App in the same manner that retrieves records, and that one works.
The failed input and output look like this:
{
"host": {
"api": {
"runtimeUrl": "https://logic-apis-westeurope.azure-apim.net/apim/dynamicscrmonline"
},
"connection": {
"name": "subscriptions/6e779c81-1d0c-4df9-92b8-b287ba919b51/resourceGroups/spdev-eiramar/providers/Microsoft.Web/connections/EE2AF043-71F0-4780-A8D1-25438A7746C0"
}
},
"method": "post",
"path": "/datasets/xxxx.crm4/tables/accounts/items",
"body": {
"name": "Test 1234"
}
}
Output:
{
"statusCode": 500,
"headers": {
"pragma": "no-cache",
"x-ms-request-id": "d121f98c-3dd5-4e6d-899b-5150b17795a3",
"cache-Control": "no-cache",
"date": "Tue, 01 Mar 2016 12:27:09 GMT",
"set-Cookie": "ARRAffinity=xxxxx082f2bca2639f8a68e283db0eba612ddd71036cf5a5cf2597f99;Path=/;Domain=127.0.0.1",
"server": "Microsoft-HTTPAPI/2.0",
"x-AspNet-Version": "4.0.30319",
"x-Powered-By": "ASP.NET"
},
"body": {
"status": 500,
"message": "Unknown error.",
"source": "127.0.0.1"
}
}
Does anybody know how to solve this issue?
Do you have security roles set up to access the CRM org, i.e. can you access the URL from a browser? https://xxxx.crm4.dynamics.com
Update: a fix for this issue has been rolled out. Please retry your scenario.
