Not able to get a successful response from a JMeter request

I am new to JMeter.
The steps I am following are:
1. Login
2. Navigation
3. Posting values in the request
I am able to complete the first two steps (Login and Navigation) successfully by posting parameters from "Send Parameters With the Request".
When I try to post parameters from "Send Parameters With the Request" in the third step, I get a response saying the values are null.
Request from Result Tree:
POST https://www.test.net/Databases/Employee/Add
POST data:
MyNo=EMP004&Badge=EMP004&FirstName=EMP004&LastName=EMP004
Cookie Data:
.ASPXAUTH=7A01E1CDB1496B775ABB4D1A3E811E6DE6556141ED9F7EF3CF16FB6BE9EEC88B7446286A9249810B91C3D735BCFB6C8533FFA70E37E9161AC62D42C36CC3B2AF55B71BB261B5C77D536A3164BF024D36BED56196E84D575D333F529E56E4770C7E0BF5CFC06B4C5D04E40644B715D502; __RequestVerificationToken=2YhJX3pX3zYamRsdW_6htKtskarUoVbGZ_swGTT4AAtUMgIpYd-gfUtOQuRfUm3Il6oXE-6DhIz2AZDG5Z6R8BxI751RMw1Bj95WxGySRhs1; ASP.NET_SessionId=4nqsuzqt3jj3ey3ovvfy4waq
Request Headers:
Connection: keep-alive
Origin: https://www.test.net
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/56.0.2924.87 Safari/537.36
Content-Type: multipart/form-data; boundary=----WebKitFormBoundarynUCTUgtyzyFTH1Xx
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,/;q=0.8
Referer: https://www.test.net/Databases/Employee/Add
Accept-Encoding: gzip, deflate, br
Accept-Language: en-US,en;q=0.8
Content-Length: 382
Host: www.test.net
Response:
Employee: *
Employee can not be empty.
First Name: *
First Name can not be empty.
Last Name: *
Last Name can not be empty.
Badge: *
Badge can not be empty.
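One detail in the captured request stands out: the Content-Type header declares multipart/form-data with a WebKit boundary, while the POST data shown is URL-encoded. A minimal Python sketch (using the field names and boundary from the capture above) shows how differently the two encodings serialize the same parameters; if the server expects one encoding and receives the other, it sees all fields as empty:

```python
from urllib.parse import urlencode

fields = {"MyNo": "EMP004", "Badge": "EMP004",
          "FirstName": "EMP004", "LastName": "EMP004"}

# What the result tree shows as POST data: URL-encoded pairs
urlencoded_body = urlencode(fields)

# What a multipart/form-data body for the same fields would look like
boundary = "----WebKitFormBoundarynUCTUgtyzyFTH1Xx"
parts = []
for name, value in fields.items():
    parts.append(f'--{boundary}\r\n'
                 f'Content-Disposition: form-data; name="{name}"\r\n'
                 f'\r\n{value}\r\n')
multipart_body = "".join(parts) + f"--{boundary}--\r\n"

print(urlencoded_body)
print(multipart_body)
```

In JMeter this is controlled by the "Use multipart/form-data" checkbox on the HTTP Request sampler; it has to match what the application expects.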


{"type":"mapper_parsing_exception","reason":"failed to parse field [user_agent.version]

I'm sending Nginx logs via Filebeat -> Elasticsearch -> Kibana,
but I already have an issue with some logs.
This type of log is parsed without any problem:
66.249.76.123 - - [24/Apr/2020:17:24:51 +0200] "GET / HTTP/1.1" 200 5249 "-"
"Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P)
AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36
(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
but on the other hand, a similar log:
62.197.243.55 - - [24/Apr/2020:17:29:22 +0200] "GET / HTTP/1.1" 200 5252 "-"
"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_3)
AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36"
throws an error to syslog:
Apr 24 17:29:31 prodserver filebeat[12562]: 2020-04-24T17:29:31.497+0200#011WARN#011elasticsearch/client.go:517#011Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0xbfa0df569acc421c, ext:321343689727, loc:(*time.Location)(0x5003080)}, Meta:{"pipeline":"filebeat-7.6.2-nginx-access-default"}, Fields:{"agent":{"ephemeral_id":"3d9ae7ae-c460-4e7b-b994-f10a681cc10b","hostname":"prodserver","id":"58d1eb1d-9c09-485d-ad7f-28b0066a0054","type":"filebeat","version":"7.6.2"},"ecs":{"version":"1.4.0"},"event":{"dataset":"nginx.access","module":"nginx","timezone":"+02:00"},"fileset":{"name":"access"},"host":{"name":"prodserver"},"input":{"type":"log"},"log":{"file":{"path":"/var/log/nginx/denevy.access.log"},"offset":483636},"message":"62.197.243.55 - - [24/Apr/2020:17:29:22 +0200] \"GET / HTTP/1.1\" 200 5252 \"-\" \"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.163 Safari/537.36\"","service":{"type":"nginx"}}, Private:file.State{Id:"", Finished:false, Fileinfo:(*os.fileStat)(0xc0008c4d00), Source:"/var/log/nginx/denevy.access.log", Offset:483837, Timestamp:time.Time{wall:0xbfa0df0e51ae7c7f, ext:32190743674, loc:(*time.Location)(0x5003080)}, TTL:-1, Type:"log", Meta:map[string]string(nil), FileStateOS:file.StateOS{Inode:0x2466a, Device:0xfc00}}, TimeSeries:false}, Flags:0x1, Cache:publisher.EventCache{m:common.MapStr(nil)}} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse field [user_agent.version] of type [date] in document with id 'ZJ_OrHEBZWkJKYxN4WlY'. Preview of field's value: '80.0.3987.163'","caused_by":{"type":"illegal_argument_exception","reason":"failed to parse date field [80.0.3987.163] with format [strict_date_optional_time||epoch_millis]","caused_by":{"type":"date_time_parse_exception","reason":"Failed to parse with all enclosed parsers"}}}
The problem is this part:
{"type":"mapper_parsing_exception","reason":"failed to parse field [user_agent.version] of type [date] in document with id 'ZJ_OrHEBZWkJKYxN4WlY'. Preview of field's value: '80.0.3987.163'","caused_by":{"type":"illegal_argument_exception","reason":"failed to parse date field [80.0.3987.163] with format [strict_date_optional_time||epoch_millis]","caused_by":{"type":"date_time_parse_exception","reason":"Failed to parse with all enclosed parsers"}}}
Any idea why 99% of my logs parse fine,
but logs with this kind of Chrome user agent ("(KHTML, like Gecko) Chrome/") fail with failed to parse field [user_agent.version]?
Using Filebeat/Elasticsearch/Kibana version 7.6.2.
Thanks in advance.
Are you using the standard Filebeat template, or your own template?
I had the identical problem because I was using my own template, which did not have the "user_agent" fields defined. I added them by copying them from the Filebeat template, and so far it has not given me any problems.
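A quick way to see why this works: the ECS schema that the stock Filebeat template follows maps every user_agent sub-field as keyword, so a Chrome version string like "80.0.3987.163" is never date-parsed. Below is a minimal sketch of the fragment a custom template needs (field names assumed from ECS conventions; copy the authoritative definitions out of the bundled Filebeat template):

```python
import json

# Hypothetical mapping fragment following ECS conventions; the key point
# is that user_agent.version is a keyword, not a date.
user_agent_mapping = {
    "user_agent": {
        "properties": {
            "original": {"type": "keyword", "ignore_above": 1024},
            "name": {"type": "keyword", "ignore_above": 1024},
            "version": {"type": "keyword", "ignore_above": 1024},
            "device": {
                "properties": {
                    "name": {"type": "keyword", "ignore_above": 1024}
                }
            },
            "os": {
                "properties": {
                    "name": {"type": "keyword", "ignore_above": 1024},
                    "version": {"type": "keyword", "ignore_above": 1024},
                    "full": {"type": "keyword", "ignore_above": 1024},
                }
            },
        }
    }
}

print(json.dumps(user_agent_mapping, indent=2))
```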

processingFailure error (400) while retrieving CommentThreads list

I am trying to retrieve all the comments of a video via Python iteration/paging. I am logged in correctly with a developer key.
import googleapiclient.discovery as gg
import googleapiclient.errors as gge

yt = gg.build(api_service_name='youtube', api_version='v3', developerKey=M_KEY)

comments = []
page = ''
while True:
    request = yt.commentThreads().list(
        part="snippet,replies",
        order="relevance",
        maxResults=100,
        pageToken=page,
        textFormat="plainText",
        videoId=video['id']
        # video is a static dictionary i've saved outside the script
    )
    try:
        response = request.execute()
        page = response['nextPageToken']
        comments.extend(response['items'])
        print('Comments extended')
    except KeyError:
        # there are no more pages
        print('Iteration ended')
        break
    except gge.HttpError as error:
        print('HTTP error:', error.__dict__['resp']['status'])
What I'm expecting it to do is iterate the pages of comments until response['nextPageToken'] throws a KeyError, meaning that there are no more pages of comments. Instead, what happens is that the execution goes flawlessly for a dozen iterations (at best), then it starts to throw the said processingFailure error, whose content looks like this:
{
"error": {
"errors": [
{
"domain": "youtube.commentThread",
"reason": "processingFailure",
"message": "The API server failed to successfully process the request. While this can be a transient error, it usually indicates that the requests input is invalid. Check the structure of the <code>commentThread</code> resource in the request body to ensure that it is valid.",
"locationType": "other",
"location": "body"
}
],
"code": 400,
"message": "The API server failed to successfully process the request. While this can be a transient error, it usually indicates that the requests input is invalid. Check the structure of the <code>commentThread</code> resource in the request body to ensure that it is valid."
}
}
I have tried logging both page and videoId to ensure nothing went wrong with them, but they're both valid. I've also tried to time.sleep() for up to 15 minutes when that error occurs, but nothing changes.
This is the request in JSON format at the time of the error, captured using request.to_json(), thanks to #stvar for suggesting it:
{
"uri": "https://www.googleapis.com/youtube/v3/commentThreads",
"method": "POST",
"body": "part=snippet%2Creplies&order=relevance&maxResults=100&pageToken=QURTSl9pM2xlemsyWjAzYlBhWkNRMTFMMWIyMjFsVVNnS2U0WE8zTkwxRzdRdkpsdlVqWHlwWEV2SmxkeUxjdkR1UWk3eVU1OTI1cmJEeUtJZHRGQWVmY21PUGxVOXBER3YtckE2NlhSWlRwQzR0Y2VyY0JDbC1uNVRaSU56RklzejJERmRCc2lLUjV2Rm1LV0Njc3ByMjliRXRMZmNjRFJucFgwNFNBYVhkSFJzYXkzNVpKTXNNSzNfWmVGd3dSRWxYQmwyQmxnWGJwNFZidVpiYjlOWjBabFFsalZFZkdqZFV0SHlrVEJqclppVnRtMjZCTnYtQm9WWjFrQ0dELUlLTnYwWG50cU5BQXJ3Ukh3VE9PZnNaZ0tZaWN1ZTdBakJkWFp5Ymo5M2R5Y0g1aWVsWUUzUVg0TU83Q2JZQ1IxWnRTMXUyTFhpSDdmMU9GTmtiQUE0UjdyVUVBelNnSjhTTDVsLU1TaERwVHdvSVhkX1ktNVBTc2xkX09zcjBOT3E3Z2lVWWRPRFhkVF9NN1JaQTEyUEJmU1hNbUtvM2JzU1NzOFRid29wTEo3Q0hucmJnNHcwNUJzaGtqSE8wa2g2U0FUY3pQbDJ1bGNnaFRKNEJCRm90TVNyWXNSREgyQVFqMU9PNnY3elBGSEhrYXJSMUJYS09yQ0tOVE5Oa3l5V00tdGY3TTlwY3o0VXJsaWRua1BrNXVhWmVLMzV1T2NmOEhqaUNucEdheTRfZjNiM1JkYUJuaGZqQjFMV1c0NFRJNXlzR2trdFpLemV1SU12V0tCTW11b1RMU05PXzA0eVdHM3lRclpZaC1BN3k4RDdhTW1uTHZtbDVsRzBVTVFHdkdkMTN4VHQzNW1tZ3BoY0F6VDJVWTFhUWpxdW8td1M0bnkzQTRtVGc5bGxQNV81ejV1dm1JX1ZDRVZIXzI3eXVnbHJBcC1Lb0NULUhHOGp3ZGNGeVFKbFNXbVh0Y2NQei1UbjBFLVhuZWp0eXh5NzVjOEtjS1FqTUppQWdOSDRmWWtSOUZPRHQtSEpsTnJtNWZVX2t4VDlVTDB6WmxWTHN5dlZzZllNQkFBOEJNMWZkOEtoTk5jMnQ4Y0hydXVScTNILXZLXzJodGFUNmxhQnEtay1PVV9yYzJFNEhKaDZjcUszV3ZGM2VLaUxJZjlwRmViYXRfVGRSOFZ6OF9vU2h6WjVqNkhVU0tqZHduLTNlaFhuTHFXSG1WSk1HUVQ4dkdIdDZvUFdKNkxOeFlhTmJzd1J4dGQzLXBHUmsxaHYtdFc2cTI1VDZsMWJGdE5Pb1RmR2hlRGM3cjZPcDJ2eTljQk1GcTJXaTFtNFhndzlWbDBOby1kNzZhLU5WNTI1VUlzRmpQSkRvSlFFMUQzNllzbi0tU01OYTg1a2poS2ZrWHpQMjQyd1hDb3h4blE5ZlJmN2xIMEstRFR6cUFWcTNDRDFfbjNubXY4Z2ZseGdVY2NjTWk0NzQ3SDFZcWs2eWxZWlB0Vl9iSldlbktOMjVFWUp6UnVRb3dfOXFQdmhBZEN5clJpX1g4aVhmdERnbE5XX1FjVWNXODRtSm1LSUpDVnJHVGlEeUtGb3BPMVYyWU5TbnQzY29NLUY5c3Y2WmpNVTNlVjIxQ2RwSzlKTUZwY0RxY2FlMGFtd2tucFpjeUtDN2xwOERYcDJwSU1RY0dIdXdCTmJIcWdjbTh1Q04wVTh1dktzeVdob09wX25uU19BMFNlRlBrNG1wZFRKVVJFVzVfdGQxbGFYemFqZjJOQTd1R0NCZ0RrMWlTS3BMMy1hY0FMd25KTGFYelJPQjZvRnlYMnBFelhCREgyRDJ3TnJWNldWUllqOVVvdHV2cVRXRXlBbkJpaFJpd3RIc2RaamVaUERldXItT1pkTVVFczBzNi1hZmhDYTFzWVM3SEl
sYkxtMkoybC03YlZVRkt1NEVSWV8tWHRJTko4d2hqWllWVU04UXlkQV84ZjFzVm01bW83cTd4R3ZOSVNabGRSaXVlTU91MXR6RVFYeTNwNHd3bzNVY1RncHdzY1VKQWw2eWNvcmdER0N5RjZiQkRmNnh0S256MzhreldFTm9XMDhlY1VUeEhnNTM2bHNYVlpKdGJrdHd1Y3VCc0hYOHlEc1EyZXJLTUlMTlVQb0FmU0hpdy1WdS1iT19fTTlMQUVWa3BnWloxSXdUZHotMW5zWWVnTVBzelE3VmQtRlBOajNfcmJJNnlZYkpDdmxKWXoxcjBZYkV4Q2duNGx1MTlrYWVVOXktT0lVX2dfVnc2cW9nSEVHSHZCUzFHRFFPaW1ydUxlY012bVQtaVBnTXQ1VWxOZVI3YW9nTkhFdHlwUGlneFdOM2Jkcm1iWEcxN25pQVI0TUpvVW1hemlrYWl6M0dnSTQ0VWhVWVMxaHEzeS05cnJRSkJ6TEVEZTB4anYySzR6WWhyMEtkZ0ZVMkxDZXlkcHFCSTBfU2Z3ZEVTYkE1YlE1SXI0M3lhUnJGZ1l0QVZFU2ZqRDE1MUhSLU1lU2dxWFpUem04RHVqOVBTUkZhbkFLWUJ2aGZsX0w2SzBabTBoUUxiYVZxT25ydk56U01YdklJZmtPemxtT0Nrc1JGOWVRQnMydk5lZkRjRExEUzRaWXFfTlNRdVNOUTFacFdLcklqRXc1NDg2eGU2NXlid3Y0SnRVeWplVE5CVVF4Qm01ajJfOWY2U2NWcWlVajYwYXJ3eXZ6RGt4cldCMndES0wzZ0xERl91bmlQaDVtUmNXSERXekJCVDAtd2ZnVFBadERnQlJWUEl3cWxCb3FfOEh2NkJCUlZqUThCMUk0OXM3Sjk3Sng3WFBpdUlEUFRnLV9kMnhoa2Z5QVpLRFNLSWl0ck1WUnhKRWFaZ0J0VDZmTjY3MG5SMkZQYUx0YTQ3dmgzYzhpa0Nua0dIS0VzSGYzOERiYkN4ZXM1ZkpkYV9nMEJrMnA2aHgycWFfZ04ySmRGellBaUphdXA2X213WXNxVW1QbUpfa2xZZTUwbVA2azMyRjV0Q2dRcWJVajFuVTFjRHB5QUZUcTZ1X3RwNGVBSmVoNWt6ajFZNTkwS1J2TzZreHhfQ1EzTTdDUGpCb0V1SXdFWmh6YjF0Q1NHUnoxTy11MURZak03ZnNEX0NCSUxPbkVuZGZ5VDlCMkkwNl9lLUw2bk05MUVfU29NTmg4WFBSSGNibnZ4T3h4T25Yal81a3NDMG5veDBERVdLRVBEV0pqRi1CbEpnV01HajhjR29jbXFEdXpIcFdCblcyZ2dsX1ZUNHJuMkxHUFV6ekZiaklFMXpob0w4MXJNOXVhZ2dMX1oxdzNkRi1sV3JDaGpING10b21qYTI0QW00dWFOOVZHSWJlT3lZVy1qSzNnUGxPU0hTZnFTS2VVLW80cHFqWTAtYV9lbHI5WHdnZV9nM25oX2Iwd2lUeXgxQndDcENrczUxb3RlZlZkSnREU20xSjdCM0VBZkZ4X1pkQ2YwQUszcWJxRVdwM25wM0w2WmpNTG1WM0RyWlU3TkpGeElBeGRkUTZlWEttazJuLW1mdFZtMTNVamRKZEthMkFfbS1HSnRtMFRTbFlVYnBqQ2puaUVrY2xieTB0TDF4UDNUWGNMUFo3enFtajB5T0dZbGRYQkxDQzlxdGNXN0d4V093RXUxd2Z4ek9oVEZzVUpabHpxeHNFVlVUbF9ORGk5U1BhVHg3QlJEWjdqM2g1VlFhb184SU1ZMERFLWJnY0htaXk2aXFoTjBMaUg2eTdjVmhHekFCbmdjNXJ2WkpQVE9jY09RRXE1SVJYenhvdVJxVGRDbnd5WGdrZnlyWG96T2FXcVRVRXBiWHBGRGFnZTUybmFJam1HanpPelNSZHBJLW5yaDgybm5BdldNNTRtcGNuaXRGbzJmODhyc2IzYmNUdDNoekN
aRG5Oa2k1OUNXU2tuSnd2OG1GVjBGM2xid0lWd3QwWDR3Ukc4TlZZcXdkY2FOdS0xaWlsajAwQTBLVnJTVzljLS1WZF9WWGl0Y21naDdpX2c1djJwRFpuU3pWY1VNRWNOTy1GaVdSTUxDTHZ6VDFNYl9HRmsxVEZlTG5fVFdrNzRYdlpoSVRwMHJjSjdJT21lU09ObEtqTlNiX0FCNGtXRldpSHByY0htSGxOYS1JbWZWbWRCZUlLaFhxUG5haGZCMm1PbklDMGFJS0pmZ2RjUUtVSWpLeVBrTk5DMXo5VUVkeGZKRFRtZDh4OHJkV3BEbi0wUnJXY2x1Rk9XU04xeXpkbnA5U1FnV2huOTFYVlBGRG5Rem9CODdfRUU2d3liNy1LQ0JHbjFsVVRQc3pMS3JudjhDVG1Bb2xjYU1MaUprajZGT2dkS0ktd3IzNXZMSWhFQm9oamdGZW5KcWNOQ0Q5NDZGWXFzTE5peEVyenJHWG9ZaWtNRjVoUXJNVFVxdjhSYUgyRHZKcXpmOWtMRHJmV1dMRklLUTFvLVJZbDY5dzRFcXJxUGJoenF4SFpLYUFoOWpFcnNNWVZmMmsycy1YVXllMkhwNzJrOGl5eUpTTXdQVFhWTjJ4MVA1OC1FbnVlbGZSRDQwOWxkcUwwSkRWTFVHMUdhUi1jVnlBZVJKZmZNY3Z2OTQ4MVdqWDV5WFBvSjJJY25VZDdOc2JyUkRKQ1IybXY2NG5uX3NJUmJubXFvTHQ3dHVoV0pMQmRlb2tQOUU2cU1xN3gzVnFsMzVqVDhjSFNuSmNYZWczSG1veEhGWDlkTDNBRmtXVFhBaVQwSDR2dTJBaU1nRUJVZndMS3gzUjNXNlRRcGJqNnJkbldzT3FjdU1NSkwtOVZ2WEIzMEYtQlFFNW84dUthQjFvZFg4OVota255cWt5Mlo1cXpoWTJKel9vTmpLbHRlQVN0WkVURDh3WG8wSTY3OE0yTTF4OF9KQ3ctc1Uzd3hxejdyRlB0NzVXZkxKU1RvOUw1TWtEMjNCaWVaLTkzaExwTHlwaG9DaC1qcjE4cHgxRHRtMmc3TnZXaTJleHNHVW1WTzJlMkVlWFd6QUxfOENSUEJRU0E5d08xa2hjSkxBSG5kY1BlTzJxUkR0U3ppTG0zR0xTOTRLUU9YSmxGSWFoR0JjRmp4b1pUTmNCTFoxaTdBd0RpYmhJLVJyX19qOFkyLW5UUW4wYjBsbTFpa3MzSThqd3pFa0Q0aGZrVWFBWXhMdS1LaUg0R0xRNlB4YUdxb0xVdEtaMm1ld0ZVbi1uMTlGLVN5bDhNakN3czlnQ2ZYZHhFS2hlQ1ByZzNhNHljdFhiOHpJdDg3ZVdyeGFBVW1jdEkxbnJMMUw2WGpQcUFDc0N6WWQtMzhCNTZ0VWtXWnBlRmRIWnl5VkpkeF9XMzZnZG5MWnpoVFNRcGJhTjAxOUNPWkpZeTh6Zk9QQW9paU5IOXVESHgybkRHeUREY1ZCMFBIaVZ2aFN5dFNuWWJZbWJKX1N5dG9LUXl6LVViMzFIWHZURkVVVU1iTl9NdUNiTEwwem9CQ0EtODNyMGJLaGh1bGpXRkFBRWxXR1dTYlNIa3NEenN1NlBid2gxZTUyUFNhem5yVWN4Y0tF&textFormat=plainText&videoId=CJ_GCPaKywg&key=m_developer_key&alt=json",
"headers": {
"accept": "application/json",
"accept-encoding": "gzip, deflate",
"user-agent": "google-api-python-client/1.7.9 (gzip)",
"content-length": "5730",
"x-http-method-override": "GET",
"content-type": "application/x-www-form-urlencoded"
},
"methodId": "youtube.commentThreads.list",
"resumable": null,
"response_callbacks": [],
"_in_error_state": false,
"body_size": 0,
"resumable_uri": null,
"resumable_progress": 0
}
NOTE: I need order= "relevance" in my request because I primarily need the most voted comments.
An answer is nowhere to be found; I hope you can help me.
The issue is, we can't really retrieve all the comments of every video.
https://issuetracker.google.com/issues/134912604
We currently don't support paging through the whole stream. So there's no way to retrieve all the 1000+ commentThreads that you have for that video
This is not a solution to your problem. It just shows that querying the endpoint via a GET
request method succeeds in obtaining the needed page response from the API.
# comments-wget [-d] VIDEO_ID [PAGE_TOKEN]
$ comments-wget() {
    local x='eval'
    [ "$1" == '-d' ] && {
        x='echo'
        shift
    }
    local v="$1"
    quote2 -i v
    local p="$2"
    quote2 -i p
    local O="/tmp/$v-comments%d.json"
    local o
    local k=0
    while :; do
        printf -v o "$O" "$k"
        [ ! -f "$o" ] && break
        (( k++ ))
    done
    quote o
    k="$APP_KEY"
    quote2 -i k
    local a="$AGENT"
    quote2 a
local c="\
wget \
--debug \
--verbose \
--no-check-certif \
--output-document=$o \
--user-agent=$a \
'https://www.googleapis.com/youtube/v3/commentThreads?key=$k&videoId=$v&part=replies,snippet&order=relevance&maxResults=100&textFormat=plainText&alt=json${p:+&pageToken=$p}'"
$x "$c"
}
$ PAGE_TOKEN=...
$ AGENT=... APP_KEY=... comments-wget CJ_GCPaKywg "$PAGE_TOKEN"
Setting --verbose (verbose) to 1
Setting --check-certificate (checkcertificate) to 0
Setting --output-document (outputdocument) to /tmp/CJ_GCPaKywg-comments0.json
Setting --user-agent (useragent) to ...
DEBUG output created by Wget 1.14 on linux-gnu.
--2019-06-10 17:41:11-- https://www.googleapis.com/youtube/v3/commentThreads?...
Resolving www.googleapis.com... 172.217.19.106, 216.58.214.202, 216.58.214.234, ...
Caching www.googleapis.com => 172.217.19.106 216.58.214.202 216.58.214.234 172.217.16.106 172.217.20.10 2a00:1450:400d:808::200a
Connecting to www.googleapis.com|172.217.19.106|:443... connected.
Created socket 5.
Releasing 0x0000000000ae57c0 (new refcount 1).
---request begin---
GET /youtube/v3/commentThreads?.../1.1
User-Agent: ...
Accept: */*
Host: www.googleapis.com
Connection: Keep-Alive
---request end---
HTTP request sent, awaiting response...
---response begin---
HTTP/1.1 200 OK
Expires: Mon, 10 Jun 2019 14:43:39 GMT
Date: Mon, 10 Jun 2019 14:43:39 GMT
Cache-Control: private, max-age=0, must-revalidate, no-transform
ETag: "XpPGQXPnxQJhLgs6enD_n8JR4Qk/OUAqOrEpA9YYqmVx0wqn9en_OrE"
Vary: Origin
Vary: X-Origin
Content-Type: application/json; charset=UTF-8
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-XSS-Protection: 1; mode=block
Content-Length: 205965
Server: GSE
Alt-Svc: quic=":443"; ma=2592000; v="46,44,43,39"
---response end---
200 OK
Registered socket 5 for persistent reuse.
Length: 205965 (201K) [application/json]
Saving to: ‘/tmp/CJ_GCPaKywg-comments0.json’
100%[==========================================>] 205,965 580KB/s in 0.3s
2019-06-10 17:41:18 (580 KB/s) - ‘/tmp/CJ_GCPaKywg-comments0.json’ saved [205965/205965]
Note that the shell functions quote and quote2 above are those from youtube-data.sh (they are not really needed). $PAGE_TOKEN is extracted from the body string of the JSON request object posted above.
The next question is: why does your Python code use a POST request method?
Could that be the cause of your problem?
According to Google's Python Client Library sample code and Google's YouTube API sample code, you should have been coding your pagination loop as shown below:
request = yt.commentThreads().list(...)
while request:
    response = request.execute()
    # your processing code goes here ...
    request = yt.commentThreads().list_next(request, response)
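The termination logic of that loop can be verified without touching the API at all: list_next() returns None once the response carries no nextPageToken. A self-contained sketch against a stub client (StubRequest, StubCommentThreads, and the page contents are made up for illustration; only the list()/list_next() shape mimics the real library):

```python
class StubRequest:
    """Stands in for the request object returned by list()/list_next()."""
    def __init__(self, pages, i):
        self.pages, self.i = pages, i

    def execute(self):
        return self.pages[self.i]


class StubCommentThreads:
    """Mimics the pagination contract of commentThreads()."""
    def __init__(self, pages):
        self.pages = pages

    def list(self, **kwargs):
        return StubRequest(self.pages, 0)

    def list_next(self, request, response):
        # The real client also returns None when there is no next page.
        if 'nextPageToken' not in response:
            return None
        return StubRequest(request.pages, request.i + 1)


pages = [
    {"items": [1, 2], "nextPageToken": "A"},
    {"items": [3], "nextPageToken": "B"},
    {"items": [4]},  # last page: no nextPageToken
]

ct = StubCommentThreads(pages)
comments = []
request = ct.list(part="snippet,replies")
while request:
    response = request.execute()
    comments.extend(response["items"])
    request = ct.list_next(request, response)

print(comments)  # [1, 2, 3, 4]
```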

Print only specific headers using Curb gem [duplicate]

This question already has an answer here:
Get response headers from Curb
(1 answer)
Closed 8 years ago.
I have a question about the Ruby gem Curb. I'm playing around with this gem and have this piece of code:
require 'curb'
require 'colorize'

def err(msg)
  puts
  puts msg.red
  puts 'HOWTO: '.white + './script.rb <domain>'.red
  puts
end

target = ARGV[0] || err("You forgot something....")

Curl::Easy.perform(target) do |curl|
  curl.headers["User-Agent"] = "Mozilla/5.0 (X11; U; SunOS sun4u; en-US; rv:1.7.7) Gecko/20050421"
  curl.verbose = true
end
For example, when I try it on google.com, I get these headers (I'm not including the whole output of the script):
Host: google.com
Accept: */*
User-Agent: Mozilla/5.0 (X11; U; SunOS sun4u; en-US; rv:1.7.7) Gecko/20050421
* STATE: DO => DO_DONE handle 0x1c8dd80; (connection #0)
* STATE: DO_DONE => WAITPERFORM handle 0x1c8dd80; (connection #0)
* STATE: WAITPERFORM => PERFORM handle 0x1c8dd80; (connection #0)
* additional stuff not fine transfer.c:1037: 0 0
* HTTP 1.1 or later with persistent connection, pipelining supported
< HTTP/1.1 302 Found
< Cache-Control: private
< Content-Type: text/html; charset=UTF-8
< Location: https://www.google.cz/?gfe_rd=cr&ei=2stTVO2eJumg8we6woGoCg
< Content-Length: 259
< Date: Fri, 31 Oct 2014 17:50:18 GMT
< Server: GFE/2.0
< Alternate-Protocol: 443:quic,p=0.01
My question: is there any way to print only specific headers via Curb? For example, I'd like only these headers in the output, like this:
Content-Type: text/html; charset=UTF-8
Location: https://www.google.cz/?gfe_rd=cr&ei=2stTVO2eJumg8we6woGoCg
Server: GFE/2.0
And nothing more. Is there any way to do this via this gem? Or if you have any ideas how to do it using another gem, let me know.
It's not the most difficult thing to just parse it yourself.
That's exactly what "Get response headers from Curb" proposes.
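The parsing itself is short in any language. Here is a sketch in Python for brevity (the raw header block is the one from the transcript above; in Curb the linked answer obtains the same text from the easy handle's header string):

```python
# Raw response header block, as captured in the transcript above
raw = ("HTTP/1.1 302 Found\r\n"
       "Cache-Control: private\r\n"
       "Content-Type: text/html; charset=UTF-8\r\n"
       "Location: https://www.google.cz/?gfe_rd=cr&ei=2stTVO2eJumg8we6woGoCg\r\n"
       "Content-Length: 259\r\n"
       "Date: Fri, 31 Oct 2014 17:50:18 GMT\r\n"
       "Server: GFE/2.0\r\n")

wanted = ("content-type", "location", "server")

headers = {}
for line in raw.split("\r\n")[1:]:  # skip the status line
    if ":" in line:
        name, _, value = line.partition(":")
        headers[name.strip().lower()] = value.strip()

for name in wanted:
    print(f"{name}: {headers[name]}")
```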

redirect to magento admin module from payapl

I have created a custom module to pay vendors' commission in the Magento admin. I am using a PayPal form with return URL hostname/magento/index.php/vender/adminhtml_commision/pay/id/4/key/6qaGSxQ1ICrEtZXkVCw.
After a successful payment, the page redirects to the Magento admin dashboard, while it should redirect to my module page.
The page redirects to my custom module page with status 302 and then opens the Magento dashboard.
The response headers of the return URL:
Cache-Control no-store, no-cache, must-revalidate, post-check=0, pre-check=0
Connection Keep-Alive
Content-Length 0
Content-Type text/html; charset=UTF-8
Date Thu, 15 May 2014 13:28:22 GMT
Expires Thu, 19 Nov 1981 08:52:00 GMT
Keep-Alive timeout=5, max=100
Location http://localhost/topplefiable/index.php/admin/dashboard/
Pragma no-cache
Server Apache/2.4.4 (Win32) OpenSSL/0.9.8y PHP/5.4.19
Set-Cookie adminhtml=foitk9bk5p7qd25sh9ead2c9e7; expires=Thu, 15-May-2014 14:28:25 GMT; path=/topplefiable; domain=localhost; httponly
X-Powered-By PHP/5.4.19
Request headers:
Accept text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Encoding gzip, deflate
Accept-Language en-US,en;q=0.5
Connection keep-alive
Cookie store=s_1396871408storeview; adminhtml=foitk9bk5p7qd25sh9ead2c9e7; __ar_v4=%7C44OQLIKYJZGCJOP7YJWKBT%3A20140326%3A1%7CU4JSAS7TBVBWZGNKUMAMLP%3A20140326%3A1%7CWULQ7VIZHNEFTE7AHVGV7R%3A20140326%3A1; __zlcmid=OhdjUktCqwpH3c
Host localhost
User-Agent Mozilla/5.0 (Windows NT 6.1; rv:30.0) Gecko/20100101 Firefox/30.0
Request Headers From Upload Stream
Content-Length 1037
Content-Type application/x-www-form-urlencoded
POST response:
address_city WESLEY CHAPEL
address_country United States
address_country_code US
address_name RAYOMOND CHINOY
address_state FL
address_status confirmed
address_street 30310 HATZ WAY
address_zip 33543
auth AVkSV2ZSf5ipfxTbeKzKz86hJLe00dVdfxCNrqCuR2pWDjbyC6eIPYoZM3ibYe85vdBQXvoikxWM96Fsdwls8oQ
business stripa_1307688220_biz#domain.com
charset windows-1252
custom
first_name skumar
handling_amount 0.00
item_name
item_number
last_name kumar
mc_currency USD
mc_fee 0.20
mc_gross 0.20
notify_version 3.8
payer_email asah_1314106743_per#domain.com
payer_id Z95WWBMAGSL6Y
payer_status verified
payment_date 06:27:47 May 15, 2014 PDT
payment_fee 0.20
payment_gross 0.20
payment_status Completed
payment_type instant
protection_eligibility Eligible
quantity 1
receiver_email stripa_1307688220_biz#domain.com
receiver_id SNQXFDAY5XY4G
residence_country US
shipping 0.00
tax 0.00
test_ipn 1
transaction_subject
txn_id 3T8615219R228771W
txn_type web_accept
verify_sign A5aQCVrF8.8eOdu1dA6dqFof.9f4AfUbaoQLdRI9ETV8EbisVo3-1RdB
Please suggest where I am going wrong.
I found the solution after debugging the errors: there was a problem with the Magento secret key.
Magento creates a key for each URL; when we pass "key" in the PayPal request, PayPal does not return the "key" parameter.
Solution: rename "key" to "magentokey" in the PayPal request. After a successful payment, read "magentokey" back, save the PayPal response in your database record, and then redirect to your custom form with "magentokey".
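The round trip of the renamed parameter can be sketched generically (Python here just to illustrate the URL handling; the real module does this in PHP, and the key value is the one from the question):

```python
from urllib.parse import urlencode, urlparse, parse_qs

magento_key = "6qaGSxQ1ICrEtZXkVCw"  # the per-URL key from the question

# Send the key to PayPal under a name it will echo back unchanged
# ("magentokey" instead of Magento's reserved "key"):
return_url = ("http://hostname/magento/index.php/vender/adminhtml_commision/pay/id/4/?"
              + urlencode({"magentokey": magento_key}))

# After the successful payment, recover it from the return URL:
echoed = parse_qs(urlparse(return_url).query)["magentokey"][0]
print(echoed)
```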

How to disable READ_XBUF caching?

I am using this simple code.
#include "gwan.h"

int main(int argc, char *argv[])
{
    xbuf_t *reply = get_reply(argv), *read_buff;
    read_buff = (xbuf_t*)get_env(argv, READ_XBUF);

    xbuf_cat(reply, "START\n");
    xbuf_ncat(reply, read_buff->ptr, read_buff->len);
    xbuf_cat(reply, "END\n");

    // this line is important: if I don't use read_buff everything seems OK
    // but I need to parse read_buff :(
    printf("%s\n", read_buff->ptr); // this line is most important

    return 200;
}
At first everything seems OK:
shell:~$ for I in $(seq 0 1); do curl -A "" -H "TST: ${I}" 'http://test.com:8080/?read_buf.c&scp=3'; done
START
GET /?read_buf.c&scp=3 HTTP/1.1
Host: test.com:8080
Accept: */*
TST: 0
END
START
GET /?read_buf.c&scp=3 HTTP/1.1
Host: test.com:8080
Accept: */*
TST: 1
END
execute my loop again
START
GET /?read_buf.c&scp=3 HTTP/1.1
Host: test.com:8080
Accept: */*
TST: 0
END
START
GET /?read_buf.c&scp=3 HTTP/1.1
Host: test.com:8080
Accept: */*
TST: 1
END
execute my loop again
START
GET /?read_buf.c&scp=3 HTTP/1.1
Host: test.com:8080
Accept: */*
TST: 0
END
START
GET /?read_buf.c&scp=3 HTTP/1.1
Host: test.com:8080
Accept: */*
TST: 1
END
But here is my problem: from now on, TST is stuck at 0.
execute my loop again
START
GET /?read_buf.c&scp=3 HTTP/1.1
Host: test.com:8080
Accept: */*
TST: 0
END
START
GET /?read_buf.c&scp=3 HTTP/1.1
Host: test.com:8080
Accept: */*
TST: 0
END
execute my loop again
START
GET /?read_buf.c&scp=3 HTTP/1.1
Host: test.com:8080
Accept: */*
TST: 0
END
START
GET /?read_buf.c&scp=3 HTTP/1.1
Host: test.com:8080
Accept: */*
TST: 0
END
Why? Is it due to caching? How can I disable it?
PS: The servlet was executed on G-WAN 4.3.14.
Since this question was posted after we had already replied to it by email, there is little doubt about its true objectives, which are very far from the alleged technical pretext.
To let people judge for themselves, here is the reply we emailed:
All users will see what your script is displaying.
Your script is not using any user session. Under concurrency, this script displays the same information in the same manner for all clients.
G-WAN detects that, and this triggers its cache, because your script is slow (probably due to the print-to-console).
No such application would exist in the real world: you would use personalized URI parameters, POST entities, or even cookies (something that your test carefully avoids), hence its irrelevance.
Besides, you might benefit from reading the G-WAN FAQs:
http://gwan.ch/faq#cache
Finally, G-WAN was not created to compete with any existing Web framework. The goal was merely to satisfy the needs of our own projects:
http://twd-industries.com/
And here G-WAN fits this task very well, because we wrote it with the intent to use it properly.
