How to read received SMS messages from the Twilio Programmable SMS API

I'd like to ask how to read all received SMS messages from the Twilio Programmable SMS API (filtered by a certain date).
I managed to figure out how to read all sent SMS messages, but I can't find much material on how to fetch all received messages.
Below is how you can read sent SMS messages (sent after a certain date), not received ones.
Thanks in advance.
TwilioClient.Init(accountSid, authToken);
var messages = MessageResource.Read(
    dateSentAfter: new DateTime(2018, 12, 6, 0, 0, 0)
);
foreach (var record in messages)
{
    Response.Write(record.DateCreated + ", From: " + record.From + ", To: " + record.To + "<br/>" + " Body: " + record.Body + "<br/><br/>");
}

JavaScript uses months 0 - 11 rather than 1 - 12.
So take this, using the date you currently have as the filter:
let a = new Date(2018, 12, 6, 0, 0, 0)
console.log(a)
Result: 2019-01-06T00:00:00.000Z
What you want is new Date(2018, 11, 6, 0, 0, 0)
Result: 2018-12-06T00:00:00.000Z
See if that fixes the issue.
The dateSent field is in both sent and received messages. You can set the To to your Twilio phone number to further reduce the dataset down to received SMS messages for that date.
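Along the lines of the snippet above, here is a sketch that filters on the To number so only messages received by your Twilio number come back (the phone number below is just a placeholder):
// Requires: using Twilio; using Twilio.Rest.Api.V2010.Account; using Twilio.Types;
TwilioClient.Init(accountSid, authToken);

var received = MessageResource.Read(
    to: new PhoneNumber("+15005550006"),               // placeholder: your Twilio number
    dateSentAfter: new DateTime(2018, 12, 6, 0, 0, 0)  // only messages after this date
);

foreach (var record in received)
{
    Response.Write(record.DateCreated + ", From: " + record.From + ", To: " + record.To +
                   "<br/> Body: " + record.Body + "<br/><br/>");
}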

Related

cURL or wget for measuring page load time?

What's the best/most accurate way to measure a page's load time that would include the time to load all of the page's resources? (Basically trying to get a load time that a real end user might have.)
Is it better to use Wget or cURL for this type of task? (The operating system in use will be Windows due to other dependencies)
You can download all the resources requested by a page with wget, using the -p option:
wget -p https://www.example.com/
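If you just want a rough number out of wget itself, one option is to time that same command with the shell's time keyword (in a Unix-style shell, e.g. Git Bash on Windows); it only covers resources wget discovers by parsing the HTML:
time wget -p -q --delete-after https://www.example.com/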
curl doesn't parse HTML, so it can't be used for this. It will just fetch the initial page's HTML; if that HTML references images, CSS or JS files, curl won't know about them (because it doesn't parse HTML), so it won't download any of them.
wget won't be a very accurate measure of the page's load time if the page requests resources through JavaScript, because wget doesn't parse or execute JavaScript. A more accurate way to get user perceived load times is to open the page in Chrome and then look at how long it took. You can see an accurate breakdown by looking at the Network tab in the Dev Tools. If you're trying to automate it, you can use Chrome Headless through Puppeteer, something like this:
const puppeteer = require('puppeteer');
(async () => {
    const browser = await puppeteer.launch();
    // Get the first tab
    const page = (await browser.pages())[0];
    await page.goto('https://example.com/');
    const loadTime = await page.evaluate(() => window.performance.timing.loadEventEnd - window.performance.timing.navigationStart);
    console.log(loadTime);
    await browser.close();
})();
This on its own will also not be accurate due to caching of DNS results, the TLS handshake or resources by the browser.
UPDATE
If you only get the page load time, you cannot accomplish what you want to do. At a minimum, you need "Time to First Byte" to rule out server delays when looking at performance issues.
WebPageTest has a LOT of data, but what I use is minimal. In the example below, the JSON file has 162,260 lines of JSON (5.5 MB); I use 125 lines (4 KB) of it.
To show you how easy it is, I wrote this PHP app in a couple of hours.
I entered a URL at https://www.webpagetest.org
I copied the "Download JSON" link
Pasted it into my PHP code
The code,
Retrieved the JSON
Got the substring of the JSON I needed.
Then formatted the data
And made a table
I just used a few parameters that would be needed to assess performance.
I included the Number of DOM elements and JS & CSS Blocking because those are very common issues with many pages, especially WordPress pages.
I find the layout shift to be important. Layout shift is when, after the first paint, JS makes a change that forces the browser to do another render.
<?php
header("Content-Type: text/html; charset=UTF-8");
$data = file_get_contents('https://www.webpagetest.org/jsonResult.php?test=<result id>&pretty=1');
$start = strpos($data,'chromeUserTiming.CumulativeLayoutShift": ') + 41;
$LayoutShift = substr($data,$start,6);
$start = strpos($data,'"firstView": {') + 13;
$end = strpos($data,'}',$start) + 1;
$json = substr($data,$start,$end - $start);
unset($data);
$json = json_decode($json,1);
$TTFB = number_format($json['TTFB'] / 1000,3);
$loadTime = number_format($json['loadTime'] / 1000,3);
$fullyLoaded = number_format($json['fullyLoaded'] / 1000,3);
$loadEventStart = number_format($json['loadEventStart'] / 1000,3);
$firstPaint = number_format($json['firstPaint'] / 1000,3);
$firstContentfulPaint = number_format($json['firstContentfulPaint'] / 1000,3);
$renderBlockingCSS = number_format($json['renderBlockingCSS'] / 1000,3);
$renderBlockingJS = number_format($json['renderBlockingJS'] / 1000,3);
$TotalBlockingTime = number_format($json['TotalBlockingTime'] / 1000,3);
$FirstInteractive = number_format($json['FirstInteractive'] / 1000,3);
$requests = $json['requests'] ;
$domElements = $json['domElements'] ;
$domComplete = number_format($json['domComplete'] / 1000,3);
$LayoutShift = number_format($json['chromeUserTiming.TotalLayoutShift'] / 1000,3);
echo <<<EOT
<table>
<tr><td>First Byte</td><td>$TTFB</td></tr>
<tr><td>DOM Complete</td><td>$domComplete</td></tr>
<tr><td>Load Time</td><td>$loadTime</td></tr>
<tr><td>First Paint</td><td>$firstPaint </td></tr>
<tr><td>Fully Loaded</td><td>$fullyLoaded</td></tr>
<tr><td></td> <td></td> </tr>
<tr><td></td> <td></td> </tr>
<tr><td>Requests</td><td>$requests</td></tr>
<tr><td>DOM Elements</td><td>$domElements</td></tr>
<tr><td>CSS Render Blocking</td><td>$renderBlockingCSS</td></tr>
<tr><td>JS Render Blocking</td><td>$renderBlockingJS</td></tr>
<tr><td>Total Blocking Time</td><td>$TotalBlockingTime </td></tr>
<tr><td>Layout Shift</td><td>$LayoutShift </td></tr>
</table>
EOT;
?>
The results
These results are for a mobile page with lots of pictures.
For mobile I like to have as few requests as possible.
I embed the images in the page as base64.
This is the sub string of the JSON I actually use:
{
"loadTime": 580.33333333333337,
"docTime": 580.33333333333337,
"fullyLoaded": 640.33333333333337,
"bytesOut": 3480,
"bytesOutDoc": 3114.6666666666665,
"bytesIn": 117909,
"bytesInDoc": 117862.66666666667,
"requests": 3,
"requestsFull": 3,
"requestsDoc": 2.6666666666666665,
"responses_200": 3,
"responses_404": 0,
"responses_other": 0,
"result": 0,
"testStartOffset": 0,
"cached": 0,
"optimization_checked": 1,
"loadEventStart": 575.33333333333337,
"loadEventEnd": 575.66666666666663,
"domContentLoadedEventStart": 510.33333333333331,
"domContentLoadedEventEnd": 510.33333333333331,
"connections": 1,
"final_base_page_request": 0,
"firstPaint": 304.63333333283333,
"firstContentfulPaint": 309.66666666666669,
"firstImagePaint": 309.66666666666669,
"firstMeaningfulPaint": 309.66666666666669,
"domInteractive": 510.33333333333331,
"renderBlockingCSS": 0,
"renderBlockingJS": 0,
"TTFB": 264.66666666666669,
"score_cache": 0,
"score_cdn": 0,
"score_gzip": 100,
"score_cookies": -1,
"score_keep-alive": 100,
"score_minify": -1,
"score_combine": -1,
"score_compress": 100,
"score_etags": -1,
"score_progressive_jpeg": -1,
"gzip_total": 118336,
"gzip_savings": 0,
"minify_total": -1,
"minify_savings": -1,
"image_total": 1667,
"image_savings": 0,
"cpu.PrePaint": 1,
"cpu.Paint": 2.6666666666666665,
"cpu.FireAnimationFrame": 0,
"cpu.FunctionCall": 5.333333333333333,
"cpu.EventDispatch": 0.66666666666666663,
"cpu.CommitLoad": 0,
"cpu.EvaluateScript": 1.3333333333333333,
"cpu.v8.compile": 0.33333333333333331,
"cpu.ParseHTML": 7,
"cpu.ResourceFetcher::requestResource": 8.3333333333333339,
"cpu.UpdateLayoutTree": 9.6666666666666661,
"cpu.Layout": 17.333333333333332,
"cpu.largestContentfulPaint::Candidate": 0,
"cpu.HitTest": 0,
"cpu.MarkDOMContent": 0,
"cpu.MarkLoad": 0,
"cpu.Idle": 586.33333333333337,
"start_epoch": 1666370740.1789024,
"date": 1666370741.9859715,
"fullyLoadedCPUms": 670,
"fullyLoadedCPUpct": 9.1036962354000011,
"domElements": 75,
"domComplete": 575.33333333333337,
"PerformancePaintTiming.first-paint": 304.63333333283333,
"PerformancePaintTiming.first-contentful-paint": 304.63333333283333,
"test_run_time_ms": 5673.333333333333,
"Colordepth": 24,
"generated-content-percent": -0.029999999999999999,
"generated-content-size": -0.040000000000000001,
"lastVisualChange": 400,
"render": 300,
"visualComplete85": 300,
"visualComplete90": 300,
"visualComplete95": 300,
"visualComplete99": 400,
"visualComplete": 400,
"SpeedIndex": 303,
"chromeUserTiming.navigationStart": 5.333333333333333,
"chromeUserTiming.fetchStart": 8.3333333333333339,
"chromeUserTiming.unloadEventStart": 275.66666666666669,
"chromeUserTiming.unloadEventEnd": 275.66666666666669,
"chromeUserTiming.commitNavigationEnd": 276,
"chromeUserTiming.domLoading": 276.33333333333331,
"chromeUserTiming.firstPaint": 309,
"chromeUserTiming.firstContentfulPaint": 309,
"chromeUserTiming.firstImagePaint": 309.33333333333331,
"chromeUserTiming.firstMeaningfulPaintCandidate": 309.33333333333331,
"chromeUserTiming.firstMeaningfulPaint": 309.33333333333331,
"chromeUserTiming.responseEnd": 510.33333333333331,
"chromeUserTiming.domInteractive": 515.66666666666663,
"chromeUserTiming.domContentLoadedEventStart": 515.66666666666663,
"chromeUserTiming.domContentLoadedEventEnd": 515.66666666666663,
"chromeUserTiming.domComplete": 580,
"chromeUserTiming.loadEventStart": 580.33333333333337,
"chromeUserTiming.loadEventEnd": 580.66666666666663,
"chromeUserTiming.LargestTextPaint": 309.66666666666669,
"chromeUserTiming.LargestImagePaint": 309.66666666666669,
"chromeUserTiming.LargestContentfulPaint": 309.66666666666669,
"chromeUserTiming.TotalLayoutShift": 0.0011286408333333333,
"chromeUserTiming.CumulativeLayoutShift": 0.0011286408333333333,
"TTIMeasurementEnd": 3657.3333333333335,
"LastInteractive": 300,
"run": 2,
"step": 1,
"effectiveBps": 313879.33333333331,
"domTime": 0,
"aft": 0,
"titleTime": 293.33333333333331,
"domLoading": 0,
"server_rtt": 0,
"maxFID": 0,
"TotalBlockingTime": 0,
"effectiveBpsDoc": 373405.33333333331,
"chromeUserTiming.LayoutShift": 109,
"avgRun": 3
}
I also use GT Metrix
And PageSpeed Insights
And W3C Validators, both HTML and CSS
End of Update
You can use curl to get the response time for the HTML. It will not give you much more.
curl will give you:
CURLINFO_SIZE_UPLOAD - Total number of bytes uploaded
CURLINFO_SIZE_DOWNLOAD - Total number of bytes downloaded
CURLINFO_SPEED_DOWNLOAD - Average download speed
CURLINFO_SPEED_UPLOAD - Average upload speed
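On the command line, curl's --write-out option prints similar timing figures for the base HTML request; a minimal sketch:
curl -o /dev/null -s -w "DNS: %{time_namelookup}s  Connect: %{time_connect}s  TTFB: %{time_starttransfer}s  Total: %{time_total}s\n" https://www.example.com/
That still only measures the HTML document itself, not the rest of the page's resources.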
The best way I know of to get EVERY detail of an HTTP request is to use https://www.webpagetest.org
You can use their API to get 300 performance test results per month for Free.
LINK to webpagetest.org API
Example
You curl the API and it returns the links to the details. Within those links there is a plethora of information and images.
Then curl the links you want.
LINK to HAR details of loading this page. 110,000 lines of info in 6,795,929 bytes
curl "https://www.webpagetest.org/runtest.php?url=https://www.webpagetest.org&k={YOUR_API_KEY}&f=json"
{
"statusCode": 200,
"statusText": "Ok",
"data": {
"testId": "210328_XiVQ_b694021b2a24ca1912dae50fb58b5861",
"jsonUrl": "https://www.webpagetest.org/jsonResult.php?test=210328_XiVQ_b694021b2a24ca1912dae50fb58b5861",
"xmlUrl": "https://www.webpagetest.org/xmlResult/210328_XiVQ_b694021b2a24ca1912dae50fb58b5861/",
"userUrl": "https://www.webpagetest.org/result/210328_XiVQ_b694021b2a24ca1912dae50fb58b5861/",
"summaryCSV": "https://www.webpagetest.org/result/210328_XiVQ_b694021b2a24ca1912dae50fb58b5861/page_data.csv",
"detailCSV": "https://www.webpagetest.org/result/210328_XiVQ_b694021b2a24ca1912dae50fb58b5861/requests.csv"
}
}
The waterfall view on the result page shows some of the details you can get.
You can get the nitty gritty details in various formats:
JSON
CSV
XML
HAR
Sample of CSV info
"type","id","request_id","ip_addr","full_url","is_secure","method","host","url","raw_id","frame_id","documentURL","responseCode","request_type","load_ms","ttfb_ms","load_start","load_start_float","bytesIn","objectSize","objectSizeUncompressed","expires","cacheControl","contentType","contentEncoding","socket","protocol","dns_start","dns_end","connect_start","connect_end","ssl_start","ssl_end","initiator","initiator_line","initiator_column","initiator_type","priority","initial_priority","server_rtt","bytesOut","score_cache","score_cdn","score_gzip","score_cookies","score_keep-alive","score_minify","score_combine","score_compress","score_etags","dns_ms","connect_ms","ssl_ms","gzip_total","gzip_save","minify_total","minify_save","image_total","image_save","cache_time","cdn_provider","server_count","created","http2_stream_id","http2_stream_dependency","http2_stream_weight","http2_stream_exclusive","tls_version","tls_resumed","tls_next_proto","tls_cipher_suite","netlog_id","server_port","final_base_page","is_base_page","load_end","ttfb_start","ttfb_end","download_start","download_end","download_ms","all_start","all_end","all_ms","index","number","cpu.EvaluateScript","cpu.v8.compile","cpu.FunctionCall","cpuTime","run","cached","renderBlocking","initiator_function",
"3","221020_AiDc05_4BQ","BE8FE33DABBBA0CF2B3AC5D78083C66E","151.101.193.69","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","1","GET","stackoverflow.com","/questions/74122109/curl-or-wget-for-measuring-page-load-time","BE8FE33DABBBA0CF2B3AC5D78083C66E","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Document","346","198","533","533.000041","40562","40562","145896","","private","text/html","gzip","50","HTTP/2","0","172","173","344","344","533","","","","script","Highest","Highest","","2317","-1","100","100","-1","100","-1","-1","-1","-1","-1","171","189","41099","0","","","","","","Fastly","","4","1","0","256","1","TLS 1.2","False","h2","49199","41","443","1","1","879","533","731","731","879","148","173","879","706","0","1","135","26","2","164","1","0","","",
"3","221020_AiDc05_4BQ","6487.6","151.101.193.69","https://cdn.sstatic.net/Js/stub.en.js?v=0e3ada576039","1","GET","cdn.sstatic.net","/Js/stub.en.js?v=0e3ada576039","6487.6","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Script","585","568","967","967.000067","18152","18152","53336","","max-age=604800","application/javascript","gzip","50","HTTP/2","787","967","-1","-1","-1","-1","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","25","","parser","High","High","","1831","50","100","100","-1","100","-1","-1","-1","-1","180","-1","-1","18152","0","","","","","34541","Fastly","","788","9","","","","","","","","67","","","","1552","967","1535","1535","1552","17","787","1552","765","1","2","34","8","443","485","1","0","blocking","",
"3","221020_AiDc05_4BQ","6487.5","151.101.193.69","https://cdn.sstatic.net/Js/third-party/npm/#stackoverflow/stacks/dist/js/stacks.min.js?v=facbc6b2f3b6","1","GET","cdn.sstatic.net","/Js/third-party/npm/#stackoverflow/stacks/dist/js/stacks.min.js?v=facbc6b2f3b6","6487.5","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Script","599","568","968","968.000061","23746","23746","100193","","max-age=604800","application/javascript","gzip","50","HTTP/2","-1","-1","-1","-1","-1","-1","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","24","","parser","Low","Low","","1978","50","100","100","-1","100","-1","-1","-1","-1","-1","-1","-1","23746","0","","","","","567498","Fastly","","786","11","","","","","","","","61","","","","1567","968","1536","1536","1567","31","968","1567","599","2","3","81","2","38","121","1","0","potentially_blocking","",
"3","221020_AiDc05_4BQ","6487.7","151.101.193.69","https://cdn.sstatic.net/Shared/stacks.css?v=5ad0f45f4799","1","GET","cdn.sstatic.net","/Shared/stacks.css?v=5ad0f45f4799","6487.7","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Stylesheet","363","188","968","968.000072","68692","68692","654577","","max-age=604800","text/css","gzip","50","HTTP/2","-1","-1","-1","-1","-1","-1","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","28","","parser","Highest","Highest","","1885","50","100","100","-1","100","-1","-1","-1","-1","-1","-1","-1","68692","0","","","","","542107","Fastly","","790","3","","","","","","","","72","","","","1331","968","1156","1156","1331","175","968","1331","363","3","4","","","","","1","0","blocking","",
"3","221020_AiDc05_4BQ","6487.8","151.101.193.69","https://cdn.sstatic.net/Sites/stackoverflow/primary.css?v=0cc30ee01b86","1","GET","cdn.sstatic.net","/Sites/stackoverflow/primary.css?v=0cc30ee01b86","6487.8","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Stylesheet","550","374","968","968.000077","60467","60467","339815","","max-age=604800","text/css","gzip","50","HTTP/2","-1","-1","-1","-1","-1","-1","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","29","","parser","Highest","Highest","","1927","50","100","100","-1","100","-1","-1","-1","-1","-1","-1","-1","60467","0","","","","","134559","Fastly","","793","5","","","","","","","","77","","","","1518","968","1342","1342","1518","176","968","1518","550","4","5","","","","","1","0","blocking","",
"3","221020_AiDc05_4BQ","6487.9","151.101.193.69","https://cdn.sstatic.net/Shared/Channels/channels.css?v=d098999fc478","1","GET","cdn.sstatic.net","/Shared/Channels/channels.css?v=d098999fc478","6487.9","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Stylesheet","551","393","968","968.000082","4168","4168","18913","","max-age=604800","text/css","gzip","50","HTTP/2","-1","-1","-1","-1","-1","-1","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","70","","parser","Highest","Highest","","1918","50","100","100","-1","100","-1","-1","-1","-1","-1","-1","-1","4168","0","","","","","477490","Fastly","","797","7","","","","","","","","82","","","","1519","968","1361","1361","1519","158","968","1519","551","5","6","","","","","1","0","blocking","",
"3","221020_AiDc05_4BQ","6487.4","142.251.163.95","https://ajax.googleapis.com/ajax/libs/jquery/1.12.4/jquery.min.js","1","GET","ajax.googleapis.com","/ajax/libs/jquery/1.12.4/jquery.min.js","6487.4","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Script","370","175","1309","1309.000055","33951","33951","97163","Thu, 19 Oct 2023 17:18:45 GMT","public, max-age=31536000, stale-while-revalidate=2592000","text/javascript","gzip","87","HTTP/2","784","955","955","1128","1128","1309","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","23","","parser","High","High","","1870","100","100","100","-1","100","-1","-1","-1","-1","171","173","181","33951","0","","","","","31493759","Google","","783","1","0","220","1","TLS 1.3","False","h2","4865","55","443","","","1679","1309","1484","1484","1679","195","784","1679","895","6","7","64","12","31","107","1","0","blocking","",
"3","221020_AiDc05_4BQ","6487.10","151.101.193.69","https://cdn.sstatic.net/Img/teams/teams-illo-free-sidebar-promo.svg?v=47faa659a05e","1","GET","cdn.sstatic.net","/Img/teams/teams-illo-free-sidebar-promo.svg?v=47faa659a05e","6487.10","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Image","176","175","1580","1580.000099","2368","2368","5950","","max-age=604800","image/svg+xml","gzip","50","HTTP/2","-1","-1","-1","-1","-1","-1","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","485","","parser","Low","Low","","2101","50","100","100","-1","100","-1","-1","-1","-1","-1","-1","-1","2368","0","","","","","93767","Fastly","","1579","13","0","147","1","","","","","99","443","","","1756","1580","1755","1755","1756","1","1580","1756","176","7","8","","","","","1","0","","",
"3","221020_AiDc05_4BQ","6487.20","151.101.193.69","https://cdn.sstatic.net/Img/unified/sprites.svg?v=fcc0ea44ba27","1","GET","cdn.sstatic.net","/Img/unified/sprites.svg?v=fcc0ea44ba27","6487.20","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Image","176","174","2163","2163.000126","2852","2852","7542","","max-age=604800","image/svg+xml","gzip","50","HTTP/2","-1","-1","-1","-1","-1","-1","https://cdn.sstatic.net/Sites/stackoverflow/primary.css?v=0cc30ee01b86","","","parser","High","Low","","2173","50","100","100","-1","100","-1","-1","-1","-1","-1","-1","-1","2852","0","","","","","383944","Fastly","","2162","15","0","147","1","","","","","126","443","","","2339","2163","2337","2337","2339","2","2163","2339","176","8","9","","","","","1","0","","",
The CSV contains the following details on every request made to render the page.
type
id
request_id
ip_addr
full_url
is_secure
method
host
url
raw_id
frame_id
documentURL
responseCode
request_type
load_ms
ttfb_ms
load_start
load_start_float
bytesIn
objectSize
objectSizeUncompressed
expires
cacheControl
contentType
contentEncoding
socket
protocol
dns_start
dns_end
connect_start
connect_end
ssl_start
ssl_end
initiator
initiator_line
initiator_column
initiator_type
priority
initial_priority
server_rtt
bytesOut
score_cache
score_cdn
score_gzip
score_cookies
score_keep-alive
score_minify
score_combine
score_compress
score_etags
dns_ms
connect_ms
ssl_ms
gzip_total
gzip_save
minify_total
minify_save
image_total
image_save
cache_time
cdn_provider
server_count
created
http2_stream_id
http2_stream_dependency
http2_stream_weight
http2_stream_exclusive
tls_version
tls_resumed
tls_next_proto
tls_cipher_suite
netlog_id
server_port
final_base_page
is_base_page
load_end
ttfb_start
ttfb_end
download_start
download_end
download_ms
all_start
all_end
all_ms
index
number
cpu.EvaluateScript
cpu.v8.compile
cpu.FunctionCall
cpuTime
run
cached
renderBlocking
initiator_function

How to calculate average for each tumbling window?

I'm new to Kafka Streams and I'm really going crazy. I have a stream of counter values represented by <counter-name, counter-value, timestamp>. I want to calculate the average value for each day, like this:
counterValues topic content:
“cpu”, 10, “2022-06-03 17:00”
“cpu”, 20, “2022-06-03 18:00”
“cpu”, 30, “2022-06-04 10:00”
“memory”, 40, “2022-06-04 10:00”
and I want to obtain this output:
“cpu”, “2022-06-03”, 15
“cpu”, “2022-06-04”, 30
“memory”, “2022-06-04”, 40
This is a snippet of my code that doesn't work (it seems to calculate a count)…
Duration windowSize = Duration.ofDays(1);
TimeWindows tumblingWindow = TimeWindows.of(windowSize);
counterValueStream
    .groupByKey().windowedBy(tumblingWindow)
    .aggregate(StatisticValue::new, (k, counterValue, statisticValue) -> {
        statisticValue.setSamplesNumber(statisticValue.getSamplesNumber() + 1);
        statisticValue.setSum(statisticValue.getSum() + counterValue.getValue());
        return statisticValue;
    }, Materialized.with(Serdes.String(), statisticValueSerde))
    .toStream().map((Windowed<String> key, StatisticValue sv) -> {
        double avgNoFormat = sv.getSum() / (double) sv.getSamplesNumber();
        double formattedAvg = Double.parseDouble(String.format("%.2f", avgNoFormat));
        return new KeyValue<>(key.key(), formattedAvg);
    }).to("average", Produced.with(Serdes.String(), Serdes.Double()));
But the aggregation result is:
“cpu”, 1, “2022-06-03 17:00”
“cpu”, 1, “2022-06-03 18:00”
“cpu”, 1, “2022-06-04 10:00”
“memory”, 1, “2022-06-04 10:00”
Note that I use a TimestampExtractor that uses the counter timestamp instead of the Kafka record timestamp. What am I doing wrong?

Decoding the sender ID in an SMS header

I am writing a small SMS receive utility. I have an SMS message whose sender ID I cannot figure out how to decode. Here is the output of reading the message in PDU mode:
+CMGL: 0,1,,86 0791021197003899440ED0657A7A1E6687E93408610192016390004205000365030106440642062F002006270633062A064706440643062A0020064306440020062706440648062D062F0627062A0020062706440645062C06270646064A
and in text mode:
+CMGL: 0,"REC READ","1011161051159710897116",,"16/10/29,10:36:09+00" 06440642062F002006270633062A064706440643062A0020064306440020062706440648062D062F0627062A0020062706440645062C06270646064A
I also read this message on a mobile phone and found that the alphanumeric sender code "1011161051159710897116" is equal to "etisalat", which is the name of the service provider. I want to understand what encoding they use and how to decode it.
It's encoded as ASCII as decimal semi-octets:
1011161051159710897116 =
101 = &65 = e
116 = &74 = t
105 = &69 = i
115 = &73 = s
97 = &61 = a
108 = &6C = l
97 = &61 = a
116 = &74 = t
To read this from PDU data, you have to swap the semi-octets and if the length is odd you have to add an extra 'F' to make it even to get the proper octet string.
The specs for SMS PDUs can be found here: GSM 03.40
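As a rough illustration of the decimal-ASCII mapping above, here is a small sketch (a hypothetical helper, assuming only printable ASCII 32-127, which is all that appears in alphanumeric sender IDs) that turns the text-mode string back into characters:
using System;
using System.Text;

class SenderIdDecoder
{
    // Each character's decimal ASCII code is simply concatenated:
    // codes 100-127 take three digits (and start with '1'), codes 32-99 take two.
    // This heuristic breaks on control characters, which don't occur in sender IDs.
    static string DecodeDecimalAscii(string digits)
    {
        var sb = new StringBuilder();
        int i = 0;
        while (i < digits.Length)
        {
            int take = (digits[i] == '1' && i + 3 <= digits.Length) ? 3 : 2;
            sb.Append((char)int.Parse(digits.Substring(i, take)));
            i += take;
        }
        return sb.ToString();
    }

    static void Main()
    {
        Console.WriteLine(DecodeDecimalAscii("1011161051159710897116")); // prints: etisalat
    }
}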

How to fetch mail with JavaMail IMAP ordered by received date, descending

How can JavaMail over IMAP fetch mail ordered by received date, descending? Folder.getMessages() doesn't take a date argument.
I want to sort by date when fetching mail over IMAP.
Thanks in advance!
Normally, messages are stored in the INBOX in the order they're received, so message number order is received date order. But note that this can be wrong if messages are moved between folders.
In general, if you want messages in a particular order, you'll need to sort them. If your IMAP server supports the SORT extension, you can ask the server to do the sorting by using the com.sun.mail.imap.IMAPFolder.getSortedMessages method.
@DefaultValue("REVERSE,ARRIVAL") MailSortTerms sortTerms
/**/
if (imapStore.hasCapability("SORT*")) {
    Message[] messages = ((IMAPFolder) inbox).getSortedMessages(sortTerms.getTerms());
    for (int i = skip; i < Math.min(skip + size, inbox.getMessageCount()); i++) {
        resultList.add(messages[i]);
    }
} else {
    Message[] messages = inbox.getMessages();
    for (int i = inbox.getMessageCount() - skip - 1;
         i >= Math.max(inbox.getMessageCount() - skip - size - 1, 0);
         i--) {
        resultList.add(messages[i]);
    }
}

How to return ALL events in a Google Calendar without knowing whether it is a timed or all day event

Now, I'm working on making a program in Python that can pull events from all the calendars in my Google account; however, I'm trying to make the program potentially as commercial as possible. With that said, it's quite simple to customize the code for myself, when I know that all the US Holidays events attached to my calendar are all day events, so I can set up a simple if statement that checks if it's a Holiday calendar and specify the events request as such:
def get_main_events(pageToken=None):
    events = gc_source.service.events().list(
        calendarId=calendarId,
        singleEvents=True,
        maxResults=1000,
        orderBy='startTime',
        pageToken=pageToken,
    ).execute()
    return events
So, that works for all day events. After which I'd append the results to a list and filter it to get only the events I want. Now getting events from my primary calendar is a bit easier to specify the events I want because they're generally not all day events, just my work schedule so I can use:
now = datetime.now()
now_plus_thirtydays = now + timedelta(days=30)

def get_main_events(pageToken=None):
    events = gc_source.service.events().list(
        calendarId=calendarId,
        singleEvents=True,
        maxResults=1000,
        orderBy='startTime',
        timeMin=now.strftime('%Y-%m-%dT%H:%M:%S-00:00'),
        timeMax=now_plus_thirtydays.strftime('%Y-%m-%dT%H:%M:%S-00:00'),
        pageToken=pageToken,
    ).execute()
    return events
Now, the problem I run into with making the program available for commercial use, as well as for myself, is that the above will ONLY return NON-all-day events from my primary calendar. I'd like to find out if there's a way - if so, how - to run the get-events request and return ALL results, whether they're all-day events or timed events that take place during part of the day. In addition, part of this issue is that in another part of the code where I print the results I would need to use:
print event['start']['date']
for an all day event, and:
print event['start']['dateTime']
for a non all day event.
So, since 'dateTime' won't work on an all-day event, I'd like to figure out a way to evaluate whether an event is all-day or not, i.e. "if the event is an all-day event, use event['start']['date'], else use event['start']['dateTime']".
So, through much testing, and finding a way to use a log feature to see what error was happening with:
print event['start']['date']
vs:
print event['start']['dateTime']
I found that I could use the error result to my advantage using 'try' and 'except'.
Here is the resulting fix:
First the initial part as earlier with the actual query to the calendar:
now = datetime.now()
now_plus_thirtydays = now + timedelta(days=30)

def get_calendar_events(pageToken=None):
    events = gc_source.service.events().list(
        calendarId=cal_id[cal_count],
        singleEvents=True,
        orderBy='startTime',
        timeMin=now.strftime('%Y-%m-%dT%H:%M:%S-00:00'),
        timeMax=now_plus_thirtydays.strftime('%Y-%m-%dT%H:%M:%S-00:00'),
        pageToken=pageToken,
    ).execute()
    return events
Then the event handling portion:
# Events Portion
print "Calendar: ", cal_summary[cal_count]
events = get_calendar_events()
while True:
    for event in events['items']:
        try:
            if event['start']['dateTime']:
                dstime = dateutil.parser.parse(event['start']['dateTime'])
                detime = dateutil.parser.parse(event['end']['dateTime'])
                if dstime.strftime('%d/%m/%Y') == detime.strftime('%d/%m/%Y'):
                    print event['summary'] + ": " + dstime.strftime('%d/%m/%Y') + " " + dstime.strftime('%H%M') + "-" + detime.strftime('%H%M')
                    # Making a list for the respective items so they can be iterated through easier for time comparison and TTS messages
                    if cal_count == 0:
                        us_holiday_list.append((dstime, event['summary']))
                    elif cal_count == 1:
                        birthday_list.append((dstime, event['summary']))
                    else:
                        life_list.append((dstime, event['summary']))
                else:
                    print event['summary'] + ": " + dstime.strftime('%d/%m/%Y') + " # " + dstime.strftime('%H%M') + " to " + detime.strftime('%H%M') + " on " + detime.strftime('%d/%m/%Y')
                    # Making a list for the respective items so they can be iterated through easier for time comparison and TTS messages
                    if cal_count == 0:
                        us_holiday_list.append((dstime, event['summary']))
                    elif cal_count == 1:
                        birthday_list.append((dstime, event['summary']))
                    else:
                        life_list.append((dstime, event['summary']))
            else:
                return
        except KeyError:
            dstime = dateutil.parser.parse(event['start']['date'])
            detime = dateutil.parser.parse(event['end']['date'])
            print event['summary'] + ": " + dstime.strftime('%d/%m/%Y')
            # Making a list for the respective items so they can be iterated through easier for time comparison and TTS messages
            if cal_count == 0:
                us_holiday_list.append((dstime, event['summary']))
            elif cal_count == 1:
                birthday_list.append((dstime, event['summary']))
            else:
                life_list.append((dstime, event['summary']))
    page_token = events.get('nextPageToken')
    if page_token:
        events = get_calendar_events(page_token)
    else:
        if cal_count == (len(cal_id) - 1):  # If there are no more calendars to process
            break
        else:  # Continue to next calendar
            print "-----"
            cal_count += 1
            print "Retrieving From Calendar: ", cal_summary[cal_count]
            events = get_calendar_events()
