cURL or wget for measuring page load time? - Windows

What's the best/most accurate way to measure a page's load time, including the time to load all of the page's resources? (Basically, I'm trying to get a load time a real end user might see.)
Is it better to use wget or cURL for this type of task? (The operating system in use will be Windows, due to other dependencies.)

You can download all the resources requested by a page with wget, using the -p option:
wget -p https://www.example.com/
curl doesn't parse HTML, so it can't be used for this. It will only fetch the initial page's HTML; if that HTML references images, CSS, or JS files, curl won't know about them and won't download any of them.
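That said, curl can time the initial document fetch on its own. A minimal example using curl's standard -w write-out variables (timings in seconds; -o NUL discards the body on Windows):
curl -s -o NUL -w "DNS: %{time_namelookup}  TTFB: %{time_starttransfer}  Total: %{time_total}\n" https://www.example.com/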
wget won't be a very accurate measure of the page's load time if the page requests resources through JavaScript, because wget doesn't parse or execute JavaScript. A more accurate way to get user-perceived load times is to open the page in Chrome and look at the Network tab in the DevTools, which gives an accurate breakdown. If you're trying to automate it, you can use headless Chrome through Puppeteer, something like this:
const puppeteer = require('puppeteer');

(async () => {
  const browser = await puppeteer.launch();
  // Use the tab that launch() opens by default
  const page = (await browser.pages())[0];
  await page.goto('https://example.com/');
  // evaluate() returns a Promise, so it must be awaited
  const loadTime = await page.evaluate(
    () => window.performance.timing.loadEventEnd - window.performance.timing.navigationStart
  );
  console.log(loadTime);
  await browser.close();
})();
Even this won't be fully accurate on its own, because the browser may cache DNS results, reuse TLS sessions, or serve resources from its cache.
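If you want repeat runs to behave more like a first visit, one mitigation is to disable the HTTP cache before navigating (setCacheEnabled is a standard Puppeteer API; these lines replace the plain page.goto call in the script above):
  // Disable the HTTP cache so repeat runs re-fetch every resource
  await page.setCacheEnabled(false);
  await page.goto('https://example.com/');
Launching a fresh browser for each run also avoids reused DNS lookups and TLS sessions.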

UPDATE
If you only get the page load time, you cannot accomplish what you want to do. At a minimum, you need Time to First Byte (TTFB) so you can separate server delays from front-end performance issues.
WebPageTest has a LOT of data, but what I use is minimal. In the example below, the JSON result has 162,260 lines of JSON (5.5 MB); I use 125 lines (4 KB) of it.
To show you how easy it is, I wrote this PHP app in a couple of hours.
I entered a URL at https://www.webpagetest.org
I copied the "Download JSON" link
Pasted it into my PHP code
The code:
Retrieved the JSON
Extracted the substring of the JSON I needed
Formatted the data
And made a table
I just used a few parameters that would be needed to assess performance.
I included the number of DOM elements and JS & CSS render blocking because those are very common issues with many pages, especially WordPress pages.
I find the layout shift to be important. Layout shift is when, after the first paint, JavaScript makes a change that forces the browser to do another rendering pass.
<?php
header("Content-Type: text/html; charset=UTF-8");

// Retrieve the full WebPageTest JSON result
$data = file_get_contents('https://www.webpagetest.org/jsonResult.php?test=<result id>&pretty=1');

// Grab the layout shift score straight from the raw JSON
// (the search string is 41 characters long)
$start = strpos($data, 'chromeUserTiming.CumulativeLayoutShift": ') + 41;
$LayoutShift = substr($data, $start, 6);

// Extract just the "firstView" object from the JSON
$start = strpos($data, '"firstView": {') + 13;
$end = strpos($data, '}', $start) + 1;
$json = substr($data, $start, $end - $start);
unset($data);

$json = json_decode($json, true);

// Convert times from milliseconds to seconds, formatted to 3 decimals
$TTFB = number_format($json['TTFB'] / 1000, 3);
$loadTime = number_format($json['loadTime'] / 1000, 3);
$fullyLoaded = number_format($json['fullyLoaded'] / 1000, 3);
$loadEventStart = number_format($json['loadEventStart'] / 1000, 3);
$firstPaint = number_format($json['firstPaint'] / 1000, 3);
$firstContentfulPaint = number_format($json['firstContentfulPaint'] / 1000, 3);
$renderBlockingCSS = number_format($json['renderBlockingCSS'] / 1000, 3);
$renderBlockingJS = number_format($json['renderBlockingJS'] / 1000, 3);
$TotalBlockingTime = number_format($json['TotalBlockingTime'] / 1000, 3);
$FirstInteractive = number_format($json['FirstInteractive'] / 1000, 3);
$domComplete = number_format($json['domComplete'] / 1000, 3);

// Plain counts, no conversion
$requests = $json['requests'];
$domElements = $json['domElements'];

echo <<<EOT
<table>
<tr><td>First Byte</td><td>$TTFB</td></tr>
<tr><td>DOM Complete</td><td>$domComplete</td></tr>
<tr><td>Load Time</td><td>$loadTime</td></tr>
<tr><td>First Paint</td><td>$firstPaint</td></tr>
<tr><td>Fully Loaded</td><td>$fullyLoaded</td></tr>
<tr><td></td> <td></td> </tr>
<tr><td></td> <td></td> </tr>
<tr><td>Requests</td><td>$requests</td></tr>
<tr><td>DOM Elements</td><td>$domElements</td></tr>
<tr><td>CSS Render Blocking</td><td>$renderBlockingCSS</td></tr>
<tr><td>JS Render Blocking</td><td>$renderBlockingJS</td></tr>
<tr><td>Total Blocking Time</td><td>$TotalBlockingTime</td></tr>
<tr><td>Layout Shift</td><td>$LayoutShift</td></tr>
</table>
EOT;
?>
The results
These results are for a mobile page with lots of pictures.
For mobile I like to have as few requests as possible.
I embed the images in the page as base64.
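For reference, that embedding is a standard data URI in the img tag; a minimal PHP sketch (photo.jpg is a placeholder name):
<?php
// Inline an image so it costs no extra HTTP request
$img = base64_encode(file_get_contents('photo.jpg'));
echo '<img src="data:image/jpeg;base64,' . $img . '" alt="photo">';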
This is the sub string of the JSON I actually use:
{
"loadTime": 580.33333333333337,
"docTime": 580.33333333333337,
"fullyLoaded": 640.33333333333337,
"bytesOut": 3480,
"bytesOutDoc": 3114.6666666666665,
"bytesIn": 117909,
"bytesInDoc": 117862.66666666667,
"requests": 3,
"requestsFull": 3,
"requestsDoc": 2.6666666666666665,
"responses_200": 3,
"responses_404": 0,
"responses_other": 0,
"result": 0,
"testStartOffset": 0,
"cached": 0,
"optimization_checked": 1,
"loadEventStart": 575.33333333333337,
"loadEventEnd": 575.66666666666663,
"domContentLoadedEventStart": 510.33333333333331,
"domContentLoadedEventEnd": 510.33333333333331,
"connections": 1,
"final_base_page_request": 0,
"firstPaint": 304.63333333283333,
"firstContentfulPaint": 309.66666666666669,
"firstImagePaint": 309.66666666666669,
"firstMeaningfulPaint": 309.66666666666669,
"domInteractive": 510.33333333333331,
"renderBlockingCSS": 0,
"renderBlockingJS": 0,
"TTFB": 264.66666666666669,
"score_cache": 0,
"score_cdn": 0,
"score_gzip": 100,
"score_cookies": -1,
"score_keep-alive": 100,
"score_minify": -1,
"score_combine": -1,
"score_compress": 100,
"score_etags": -1,
"score_progressive_jpeg": -1,
"gzip_total": 118336,
"gzip_savings": 0,
"minify_total": -1,
"minify_savings": -1,
"image_total": 1667,
"image_savings": 0,
"cpu.PrePaint": 1,
"cpu.Paint": 2.6666666666666665,
"cpu.FireAnimationFrame": 0,
"cpu.FunctionCall": 5.333333333333333,
"cpu.EventDispatch": 0.66666666666666663,
"cpu.CommitLoad": 0,
"cpu.EvaluateScript": 1.3333333333333333,
"cpu.v8.compile": 0.33333333333333331,
"cpu.ParseHTML": 7,
"cpu.ResourceFetcher::requestResource": 8.3333333333333339,
"cpu.UpdateLayoutTree": 9.6666666666666661,
"cpu.Layout": 17.333333333333332,
"cpu.largestContentfulPaint::Candidate": 0,
"cpu.HitTest": 0,
"cpu.MarkDOMContent": 0,
"cpu.MarkLoad": 0,
"cpu.Idle": 586.33333333333337,
"start_epoch": 1666370740.1789024,
"date": 1666370741.9859715,
"fullyLoadedCPUms": 670,
"fullyLoadedCPUpct": 9.1036962354000011,
"domElements": 75,
"domComplete": 575.33333333333337,
"PerformancePaintTiming.first-paint": 304.63333333283333,
"PerformancePaintTiming.first-contentful-paint": 304.63333333283333,
"test_run_time_ms": 5673.333333333333,
"Colordepth": 24,
"generated-content-percent": -0.029999999999999999,
"generated-content-size": -0.040000000000000001,
"lastVisualChange": 400,
"render": 300,
"visualComplete85": 300,
"visualComplete90": 300,
"visualComplete95": 300,
"visualComplete99": 400,
"visualComplete": 400,
"SpeedIndex": 303,
"chromeUserTiming.navigationStart": 5.333333333333333,
"chromeUserTiming.fetchStart": 8.3333333333333339,
"chromeUserTiming.unloadEventStart": 275.66666666666669,
"chromeUserTiming.unloadEventEnd": 275.66666666666669,
"chromeUserTiming.commitNavigationEnd": 276,
"chromeUserTiming.domLoading": 276.33333333333331,
"chromeUserTiming.firstPaint": 309,
"chromeUserTiming.firstContentfulPaint": 309,
"chromeUserTiming.firstImagePaint": 309.33333333333331,
"chromeUserTiming.firstMeaningfulPaintCandidate": 309.33333333333331,
"chromeUserTiming.firstMeaningfulPaint": 309.33333333333331,
"chromeUserTiming.responseEnd": 510.33333333333331,
"chromeUserTiming.domInteractive": 515.66666666666663,
"chromeUserTiming.domContentLoadedEventStart": 515.66666666666663,
"chromeUserTiming.domContentLoadedEventEnd": 515.66666666666663,
"chromeUserTiming.domComplete": 580,
"chromeUserTiming.loadEventStart": 580.33333333333337,
"chromeUserTiming.loadEventEnd": 580.66666666666663,
"chromeUserTiming.LargestTextPaint": 309.66666666666669,
"chromeUserTiming.LargestImagePaint": 309.66666666666669,
"chromeUserTiming.LargestContentfulPaint": 309.66666666666669,
"chromeUserTiming.TotalLayoutShift": 0.0011286408333333333,
"chromeUserTiming.CumulativeLayoutShift": 0.0011286408333333333,
"TTIMeasurementEnd": 3657.3333333333335,
"LastInteractive": 300,
"run": 2,
"step": 1,
"effectiveBps": 313879.33333333331,
"domTime": 0,
"aft": 0,
"titleTime": 293.33333333333331,
"domLoading": 0,
"server_rtt": 0,
"maxFID": 0,
"TotalBlockingTime": 0,
"effectiveBpsDoc": 373405.33333333331,
"chromeUserTiming.LayoutShift": 109,
"avgRun": 3
}
I also use GTmetrix
And PageSpeed Insights
And W3C Validators, both HTML and CSS
End of Update
You can use curl to get the response time for the HTML. It will not give you much more.
curl will give you:
CURLINFO_SIZE_UPLOAD - Total number of bytes uploaded
CURLINFO_SIZE_DOWNLOAD - Total number of bytes downloaded
CURLINFO_SPEED_DOWNLOAD - Average download speed
CURLINFO_SPEED_UPLOAD - Average upload speed
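In PHP, those CURLINFO_* values come back from curl_getinfo() after the request runs; a minimal sketch:
<?php
$ch = curl_init('https://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // capture the body instead of printing it
curl_exec($ch);
$info = curl_getinfo($ch); // timings are in seconds
curl_close($ch);
echo "TTFB: {$info['starttransfer_time']}s, total: {$info['total_time']}s, "
   . "downloaded: {$info['size_download']} bytes\n";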
The best way I know of to get EVERY detail of an HTTP request is to use https://www.webpagetest.org
You can use their API to get 300 performance test results per month for Free.
LINK to webpagetest.org API
Example
You curl the API and it returns the links to the details. Within those links there is a plethora of information and images.
Then curl the links you want.
LINK to HAR details of loading this page. 110,000 lines of info in 6,795,929 bytes
curl "https://www.webpagetest.org/runtest.php?url=https://www.webpagetest.org&k={YOUR_API_KEY}&f=json"
{
"statusCode": 200,
"statusText": "Ok",
"data": {
"testId": "210328_XiVQ_b694021b2a24ca1912dae50fb58b5861",
"jsonUrl": "https://www.webpagetest.org/jsonResult.php?test=210328_XiVQ_b694021b2a24ca1912dae50fb58b5861",
"xmlUrl": "https://www.webpagetest.org/xmlResult/210328_XiVQ_b694021b2a24ca1912dae50fb58b5861/",
"userUrl": "https://www.webpagetest.org/result/210328_XiVQ_b694021b2a24ca1912dae50fb58b5861/",
"summaryCSV": "https://www.webpagetest.org/result/210328_XiVQ_b694021b2a24ca1912dae50fb58b5861/page_data.csv",
"detailCSV": "https://www.webpagetest.org/result/210328_XiVQ_b694021b2a24ca1912dae50fb58b5861/requests.csv"
}
}
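Once the test finishes, fetching the full result is just another curl call, using the jsonUrl from the response above (the test runs asynchronously, so the JSON may report a pending status until the run completes):
curl "https://www.webpagetest.org/jsonResult.php?test=210328_XiVQ_b694021b2a24ca1912dae50fb58b5861"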
The waterfall view on the result page shows some of the details you can get.
You can get the nitty gritty details in various formats:
JSON
CSV
XML
HAR
Sample of CSV info
"type","id","request_id","ip_addr","full_url","is_secure","method","host","url","raw_id","frame_id","documentURL","responseCode","request_type","load_ms","ttfb_ms","load_start","load_start_float","bytesIn","objectSize","objectSizeUncompressed","expires","cacheControl","contentType","contentEncoding","socket","protocol","dns_start","dns_end","connect_start","connect_end","ssl_start","ssl_end","initiator","initiator_line","initiator_column","initiator_type","priority","initial_priority","server_rtt","bytesOut","score_cache","score_cdn","score_gzip","score_cookies","score_keep-alive","score_minify","score_combine","score_compress","score_etags","dns_ms","connect_ms","ssl_ms","gzip_total","gzip_save","minify_total","minify_save","image_total","image_save","cache_time","cdn_provider","server_count","created","http2_stream_id","http2_stream_dependency","http2_stream_weight","http2_stream_exclusive","tls_version","tls_resumed","tls_next_proto","tls_cipher_suite","netlog_id","server_port","final_base_page","is_base_page","load_end","ttfb_start","ttfb_end","download_start","download_end","download_ms","all_start","all_end","all_ms","index","number","cpu.EvaluateScript","cpu.v8.compile","cpu.FunctionCall","cpuTime","run","cached","renderBlocking","initiator_function",
"3","221020_AiDc05_4BQ","BE8FE33DABBBA0CF2B3AC5D78083C66E","151.101.193.69","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","1","GET","stackoverflow.com","/questions/74122109/curl-or-wget-for-measuring-page-load-time","BE8FE33DABBBA0CF2B3AC5D78083C66E","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Document","346","198","533","533.000041","40562","40562","145896","","private","text/html","gzip","50","HTTP/2","0","172","173","344","344","533","","","","script","Highest","Highest","","2317","-1","100","100","-1","100","-1","-1","-1","-1","-1","171","189","41099","0","","","","","","Fastly","","4","1","0","256","1","TLS 1.2","False","h2","49199","41","443","1","1","879","533","731","731","879","148","173","879","706","0","1","135","26","2","164","1","0","","",
"3","221020_AiDc05_4BQ","6487.6","151.101.193.69","https://cdn.sstatic.net/Js/stub.en.js?v=0e3ada576039","1","GET","cdn.sstatic.net","/Js/stub.en.js?v=0e3ada576039","6487.6","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Script","585","568","967","967.000067","18152","18152","53336","","max-age=604800","application/javascript","gzip","50","HTTP/2","787","967","-1","-1","-1","-1","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","25","","parser","High","High","","1831","50","100","100","-1","100","-1","-1","-1","-1","180","-1","-1","18152","0","","","","","34541","Fastly","","788","9","","","","","","","","67","","","","1552","967","1535","1535","1552","17","787","1552","765","1","2","34","8","443","485","1","0","blocking","",
"3","221020_AiDc05_4BQ","6487.5","151.101.193.69","https://cdn.sstatic.net/Js/third-party/npm/#stackoverflow/stacks/dist/js/stacks.min.js?v=facbc6b2f3b6","1","GET","cdn.sstatic.net","/Js/third-party/npm/#stackoverflow/stacks/dist/js/stacks.min.js?v=facbc6b2f3b6","6487.5","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Script","599","568","968","968.000061","23746","23746","100193","","max-age=604800","application/javascript","gzip","50","HTTP/2","-1","-1","-1","-1","-1","-1","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","24","","parser","Low","Low","","1978","50","100","100","-1","100","-1","-1","-1","-1","-1","-1","-1","23746","0","","","","","567498","Fastly","","786","11","","","","","","","","61","","","","1567","968","1536","1536","1567","31","968","1567","599","2","3","81","2","38","121","1","0","potentially_blocking","",
"3","221020_AiDc05_4BQ","6487.7","151.101.193.69","https://cdn.sstatic.net/Shared/stacks.css?v=5ad0f45f4799","1","GET","cdn.sstatic.net","/Shared/stacks.css?v=5ad0f45f4799","6487.7","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Stylesheet","363","188","968","968.000072","68692","68692","654577","","max-age=604800","text/css","gzip","50","HTTP/2","-1","-1","-1","-1","-1","-1","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","28","","parser","Highest","Highest","","1885","50","100","100","-1","100","-1","-1","-1","-1","-1","-1","-1","68692","0","","","","","542107","Fastly","","790","3","","","","","","","","72","","","","1331","968","1156","1156","1331","175","968","1331","363","3","4","","","","","1","0","blocking","",
"3","221020_AiDc05_4BQ","6487.8","151.101.193.69","https://cdn.sstatic.net/Sites/stackoverflow/primary.css?v=0cc30ee01b86","1","GET","cdn.sstatic.net","/Sites/stackoverflow/primary.css?v=0cc30ee01b86","6487.8","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Stylesheet","550","374","968","968.000077","60467","60467","339815","","max-age=604800","text/css","gzip","50","HTTP/2","-1","-1","-1","-1","-1","-1","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","29","","parser","Highest","Highest","","1927","50","100","100","-1","100","-1","-1","-1","-1","-1","-1","-1","60467","0","","","","","134559","Fastly","","793","5","","","","","","","","77","","","","1518","968","1342","1342","1518","176","968","1518","550","4","5","","","","","1","0","blocking","",
"3","221020_AiDc05_4BQ","6487.9","151.101.193.69","https://cdn.sstatic.net/Shared/Channels/channels.css?v=d098999fc478","1","GET","cdn.sstatic.net","/Shared/Channels/channels.css?v=d098999fc478","6487.9","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Stylesheet","551","393","968","968.000082","4168","4168","18913","","max-age=604800","text/css","gzip","50","HTTP/2","-1","-1","-1","-1","-1","-1","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","70","","parser","Highest","Highest","","1918","50","100","100","-1","100","-1","-1","-1","-1","-1","-1","-1","4168","0","","","","","477490","Fastly","","797","7","","","","","","","","82","","","","1519","968","1361","1361","1519","158","968","1519","551","5","6","","","","","1","0","blocking","",
"3","221020_AiDc05_4BQ","6487.4","142.251.163.95","https://ajax.googleapis.com/ajax/libs/jquery/1.12.4/jquery.min.js","1","GET","ajax.googleapis.com","/ajax/libs/jquery/1.12.4/jquery.min.js","6487.4","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Script","370","175","1309","1309.000055","33951","33951","97163","Thu, 19 Oct 2023 17:18:45 GMT","public, max-age=31536000, stale-while-revalidate=2592000","text/javascript","gzip","87","HTTP/2","784","955","955","1128","1128","1309","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","23","","parser","High","High","","1870","100","100","100","-1","100","-1","-1","-1","-1","171","173","181","33951","0","","","","","31493759","Google","","783","1","0","220","1","TLS 1.3","False","h2","4865","55","443","","","1679","1309","1484","1484","1679","195","784","1679","895","6","7","64","12","31","107","1","0","blocking","",
"3","221020_AiDc05_4BQ","6487.10","151.101.193.69","https://cdn.sstatic.net/Img/teams/teams-illo-free-sidebar-promo.svg?v=47faa659a05e","1","GET","cdn.sstatic.net","/Img/teams/teams-illo-free-sidebar-promo.svg?v=47faa659a05e","6487.10","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Image","176","175","1580","1580.000099","2368","2368","5950","","max-age=604800","image/svg+xml","gzip","50","HTTP/2","-1","-1","-1","-1","-1","-1","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","485","","parser","Low","Low","","2101","50","100","100","-1","100","-1","-1","-1","-1","-1","-1","-1","2368","0","","","","","93767","Fastly","","1579","13","0","147","1","","","","","99","443","","","1756","1580","1755","1755","1756","1","1580","1756","176","7","8","","","","","1","0","","",
"3","221020_AiDc05_4BQ","6487.20","151.101.193.69","https://cdn.sstatic.net/Img/unified/sprites.svg?v=fcc0ea44ba27","1","GET","cdn.sstatic.net","/Img/unified/sprites.svg?v=fcc0ea44ba27","6487.20","4476133D5C5FF9920B0E2BB49084B133","https://stackoverflow.com/questions/74122109/curl-or-wget-for-measuring-page-load-time","200","Image","176","174","2163","2163.000126","2852","2852","7542","","max-age=604800","image/svg+xml","gzip","50","HTTP/2","-1","-1","-1","-1","-1","-1","https://cdn.sstatic.net/Sites/stackoverflow/primary.css?v=0cc30ee01b86","","","parser","High","Low","","2173","50","100","100","-1","100","-1","-1","-1","-1","-1","-1","-1","2852","0","","","","","383944","Fastly","","2162","15","0","147","1","","","","","126","443","","","2339","2163","2337","2337","2339","2","2163","2339","176","8","9","","","","","1","0","","",
The CSV contains the following details on every request made to render the page.
type
id
request_id
ip_addr
full_url
is_secure
method
host
url
raw_id
frame_id
documentURL
responseCode
request_type
load_ms
ttfb_ms
load_start
load_start_float
bytesIn
objectSize
objectSizeUncompressed
expires
cacheControl
contentType
contentEncoding
socket
protocol
dns_start
dns_end
connect_start
connect_end
ssl_start
ssl_end
initiator
initiator_line
initiator_column
initiator_type
priority
initial_priority
server_rtt
bytesOut
score_cache
score_cdn
score_gzip
score_cookies
score_keep-alive
score_minify
score_combine
score_compress
score_etags
dns_ms
connect_ms
ssl_ms
gzip_total
gzip_save
minify_total
minify_save
image_total
image_save
cache_time
cdn_provider
server_count
created
http2_stream_id
http2_stream_dependency
http2_stream_weight
http2_stream_exclusive
tls_version
tls_resumed
tls_next_proto
tls_cipher_suite
netlog_id
server_port
final_base_page
is_base_page
load_end
ttfb_start
ttfb_end
download_start
download_end
download_ms
all_start
all_end
all_ms
index
number
cpu.EvaluateScript
cpu.v8.compile
cpu.FunctionCall
cpuTime
run
cached
renderBlocking
initiator_function

Related

Can the GA4 API fetch data for a combination of minute, region, and sessions?

Problem
With UA, I was able to get the number of sessions per region per minute (a combination of minute, region, and sessions), but is this not possible with GA4?
If not, is there any plan to support this in the future?
Detail
I ran GA4 Query Explorer with date, hour, minute, region in Dimensions and sessions in Metrics.
But I got an incompatibility error.
What I tried
I checked with the GA4 Dimensions & Metrics Explorer and confirmed that the combination of minute and region is not possible.
(Updated 2022/05/16 15:35) Checked by code execution
I ran it with Ruby:
require "google/analytics/data/v1beta/analytics_data"
require 'pp'
require 'json'
ENV['GOOGLE_APPLICATION_CREDENTIALS'] = '' # service acount file path
client = ::Google::Analytics::Data::V1beta::AnalyticsData::Client.new
LIMIT_SIZE = 1000
offset = 0
loop do
request = Google::Analytics::Data::V1beta::RunReportRequest.new(
property: "properties/xxxxxxxxx",
date_ranges: [
{ start_date: '2022-04-01', end_date: '2022-04-30'}
],
dimensions: %w(date hour minute region).map { |d| { name: d } },
metrics: %w(sessions).map { |m| { name: m } },
keep_empty_rows: false,
offset: offset,
limit: LIMIT_SIZE
)
ret = client.run_report(request)
dimension_headers = ret.dimension_headers.map(&:name)
metric_headers = ret.metric_headers.map(&:name)
puts (dimension_headers + metric_headers).join(',')
ret.rows.each do |row|
puts (row.dimension_values.map(&:value) + row.metric_values.map(&:value)).join(',')
end
offset += LIMIT_SIZE
break if ret.row_count <= offset
end
The result was an error.
3:The dimensions and metrics are incompatible.. debug_error_string:{"created":"#1652681913.393028000","description":"Error received from peer ipv4:172.217.175.234:443","file":"src/core/lib/surface/call.cc","file_line":953,"grpc_message":"The dimensions and metrics are incompatible.","grpc_status":3}
There is an error in your code: make sure you use the actual API name of the dimension and not the UI name. The correct name of that dimension is dateHourMinute, not "Date hour and minute":
dimensions: %w(dateHourMinute).map { |d| { name: d } },
The Query Explorer returns this request just fine.
Limited use for the region dimension
As for region: as the error message states, the dimensions and metrics are incompatible. The issue is that dateHourMinute cannot be used with region; switch to date or dateHour, as in the sketch below.
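A sketch of a compatible request for the code above, pairing region with dateHour (same property placeholder as before; dateHourMinute would be rejected):
request = Google::Analytics::Data::V1beta::RunReportRequest.new(
  property: "properties/xxxxxxxxx",
  date_ranges: [{ start_date: '2022-04-01', end_date: '2022-04-30' }],
  # dateHour is compatible with region; dateHourMinute is not
  dimensions: %w(dateHour region).map { |d| { name: d } },
  metrics: [{ name: 'sessions' }]
)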
At the time of writing this is a beta API. I have sent a message off to Google to find out if this is working as intended or if it may be changed.

Technical Analysis (MACD) for crypto trading

Background:
I have been writing a crypto trading bot for fun and profit.
So far, it connects to an exchange and gets streaming price data.
I am using this price data to build a technical indicator (MACD).
Generally, MACD is computed from closing prices over 26, 12, and 9 days.
However, for my trading strategy, I plan to use data over 26, 12, and 9 minutes.
Question:
I am getting multiple (say 10) price ticks in a minute.
Do I simply average them and round the time to the next minute (so they all fall in the same minute bucket)? Or is there a better way to handle this?
Many Thanks!
This is how I handled it. Streaming data comes in at sub-second intervals. The code checks for a new low and high during the streaming period and builds the candle. It's probably ugly, since I'm not a trained developer, but it works.
Adjust "...round('20s')" and "if dur > 15:" for whatever candle period you want.
def on_message(self, msg):
    # tkr, cnx, ctime, and self.st come from surrounding class/module state (not shown)
    df = pd.json_normalize(msg, record_prefix=msg['type'])
    df['date'] = df['time']
    df['price'] = df['price'].astype(float)
    df['low'] = df['low'].astype(float)
    # Track the latest tick and round its time to the candle period
    for i in range(0, len(self.df)):
        if i == (len(self.df) - 1):
            self.rounded_time = self.df['date'][i]
            self.rounded_time = pd.to_datetime(self.rounded_time).round('20s')
            self.lhigh = self.df['price'][i]
            self.lhighcandle = self.candle['high'][i]
            self.llow = self.df['price'][i]
            self.lowcandle = self.candle['low'][i]
            self.close = self.df['price'][i]
    # Keep the running high/low for the current candle
    if self.lhigh > self.lhighcandle:
        nhigh = self.lhigh
    else:
        nhigh = self.lhighcandle
    if self.llow < self.lowcandle:
        nlow = self.llow
    else:
        nlow = self.lowcandle
    newdata = pd.DataFrame.from_dict({
        'date': self.df['date'],
        'tkr': tkr,
        'open': self.df.price.iloc[0],
        'high': nhigh,
        'low': nlow,
        'close': self.close,
        'vol': self.df['last_size']})
    self.candle = self.candle.append(newdata, ignore_index=True).fillna(0)
    # Close out the candle once the period has elapsed
    if ctime > self.rounded_time:
        closeit = True
        self.en = time.time()
    if closeit:
        dur = (self.en - self.st)
        if dur > 15:
            self.st = time.time()
            out = self.candle[-1:]
            out.to_sql(tkr, cnx, if_exists='append')
            dat = ['tkr', 0, 0, 100000, 0, 0]
            self.candle = pd.DataFrame([dat], columns=['tkr', 'open', 'high', 'low', 'close', 'vol'])
As far as I know, most or all technical indicator formulas rely on same-sized bars to produce accurate and meaningful results. You'll have to do some data transformation. Here's an example of an aggregation technique that uses quantization to get all your bars into uniform sizes. It will convert small bar sizes to larger bar sizes; e.g. second to minute bars.
// C#, see the Stock.Indicators link below for more info
quoteHistory
    .OrderBy(x => x.Date)
    .GroupBy(x => x.Date.RoundDown(newPeriod))
    .Select(x => new Quote
    {
        Date = x.Key,
        Open = x.First().Open,
        High = x.Max(t => t.High),
        Low = x.Min(t => t.Low),
        Close = x.Last().Close,
        Volume = x.Sum(t => t.Volume)
    });
See Stock.Indicators for .NET for indicators and related tools.
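If you're working in Python with pandas (as in the earlier answer), the same quantization idea is built in via resample; a minimal sketch, assuming a tick DataFrame with a DatetimeIndex and a price column:
import pandas as pd

# Sample ticks; in practice these come from the exchange stream
ticks = pd.DataFrame(
    {'price': [100.0, 100.5, 99.8, 100.2]},
    index=pd.to_datetime([
        '2022-01-01 09:00:05', '2022-01-01 09:00:20',
        '2022-01-01 09:00:45', '2022-01-01 09:01:10',
    ]),
)

# Quantize ticks into fixed 1-minute OHLC bars; MACD is then computed on 'close'
bars = ticks['price'].resample('1min').ohlc()
print(bars)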

Factory Girl: Use same object several times

I'm building a factory that saves some information as JSON for later analysis in an app that cuts boards.
These boards have plating around them, so the JSON contains the id of the plating on each of the board's borders.
For example, we can have platings 1 and 2, and I need to generate boards that may look like this:
width: 500
height: 500
platings: {"up": 1, "left": 2, "right": 1}
So a single factory can need the same id more than once, but I have no way to do that, since every time I create a plating it gets another ID.
This is an example of what I have tried to do:
factory :medium_board do
  width 500
  height 500
  platings {{
    "up": create(:plating).id,            # id: 1, OK
    "left": create(:another_plating).id,  # id: 2, OK
    "right": create(:plating).id          # id: 3, NOT OK, should have been 1
  }}
end
Is there something fundamental that I am missing?
What is the best way to do this?
Best
I would use an after hook here. Something like this:
factory :medium_board do
  after :build do |board|
    plating1 = create(:plating)
    plating2 = create(:another_plating)
    board.platings = {
      up: plating1.id,
      left: plating2.id,
      right: plating1.id,
    }
  end
end
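Usage is then an ordinary factory call; because both the up and right keys reuse plating1, they share the same id (a sketch, assuming FactoryGirl/FactoryBot syntax methods are included):
board = build(:medium_board)                   # triggers the after :build hook
board.platings[:up] == board.platings[:right]  # => true, same plating id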

Ruby, spreadsheet by key

I am using Ruby to call spreadsheet_by_key on a Google document. The first sheet that I call works great; however, when I try to duplicate the code and use the second tab on the document, it does not work. Let me explain with some examples.
I am using:
data = session.spreadsheet_by_key("spreadsheetkeygoeshere").worksheets[0]

# Get Graph-Data
(2..data.num_rows).each do |column|
  key = data[column, 10]
  title = data[column, 2]
  current = data[column, 3]
  goal = data[column, 4]
  send_event(key, title: title, min: 0, max: goal, value: current)
end
This works great and returns all of the expected values. Here is the problem: this is on the first sheet, the one that loads when you open the Google doc. Now let's say I want to make a new sheet in the same doc, under a new tab with a different name, and display that data as well.
Here is how i change the code:
data1 = session.spreadsheet_by_key("spreadsheetkeygoeshere").worksheets[1]

# Get Graph-Data
(2..data1.num_rows).each do |column|
  key = data[column, 10]
  puts key
  title = data[column, 1]
  current = data[column, 5]
  goal = data[column, 6]
  send_event(key, title: title, min: 0, max: goal, value: current)
end
So I changed .worksheets[0] to .worksheets[1], changed (2..data.num_rows) to (2..data1.num_rows), and changed data = to data1 =.
Any ideas on what I am doing wrong that causes the second sheet to not get pulled? Any help is greatly appreciated.
What worked was Cameron's suggestion: I went in and changed everything to just data = instead of data1 =, and that fixed the problem.
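In other words, the second tab works once every reference uses the same variable name; a sketch combining the two snippets above:
data = session.spreadsheet_by_key("spreadsheetkeygoeshere").worksheets[1]

# Get Graph-Data from the second tab
(2..data.num_rows).each do |column|
  key = data[column, 10]
  title = data[column, 1]
  current = data[column, 5]
  goal = data[column, 6]
  send_event(key, title: title, min: 0, max: goal, value: current)
end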

Getting the dimensions of an image in Ruby

To get the image dimensions in Ruby, I tried using identify, and I wanted to capture the output of this system call as a string:
str = system('identify -format "%[fx:w]x%[fx:h]" image.png')
output = `ls`
print output
But I'm getting the last lines of shell output, not the output of this particular system call.
Also, if there is a simpler way to get the image dimensions without external gems or libraries, please suggest it; that would be great!
Since you already use an external library (ImageMagick), you could use its Ruby wrapper RMagick:
require 'RMagick'
img = Magick::Image::read('image.png').first
arr = [img.columns, img.rows]
Here's an example of a very simple PNG parser:
data = File.binread('image.png', 100) # read first 100 bytes

if data[0, 8] == [137, 80, 78, 71, 13, 10, 26, 10].pack("C*")
  # file has a PNG file signature, let's get the image header chunk
  length, chunk_type = data[8, 8].unpack("l>a4")
  raise "unknown format, expecting image header" unless chunk_type == "IHDR"
  chunk_data = data[16, length].unpack("l>l>CCCCC")
  width              = chunk_data[0]
  height             = chunk_data[1]
  bit_depth          = chunk_data[2]
  color_type         = chunk_data[3]
  compression_method = chunk_data[4]
  filter_method      = chunk_data[5]
  interlace_method   = chunk_data[6]
  puts "image size: #{width}x#{height}"
else
  # handle other formats
end
Okay, I finally found a solution after some experiments.
str = `identify -format "%[fx:w]x%[fx:h]" image.png`
arr = str.split('x')
The array arr now contains the dimensions as [width, height].
This worked for me! Please suggest other approaches that might be easier or simpler.
