I have made a GitHub Action whose Dockerfile is:
FROM alpine:latest
RUN apk update
RUN apk --no-cache add curl git python3 py3-pip
COPY entrypoint.sh /entrypoint.sh
RUN ["chmod", "+x", "/entrypoint.sh"]
ENTRYPOINT ["/entrypoint.sh"]
In entrypoint.sh I use curl to call the GitHub API and change one repo's default branch. The command I am using is:
curl -i -u "$USER_NAME:$GITHUB_KEY" --request PATCH -d '{"default_branch": "master"}' https://api.github.com/repos/$USER_NAME/$REPO_NAME
Everything is well defined. But in the response I am getting:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
100    74    0    38  100    36    255    241 --:--:-- --:--:-- --:--:--   496
HTTP/2 403
server: GitHub.com
date: Wed, 23 Feb 2022 07:20:13 GMT
content-type: text/plain; charset=utf-8
vary: X-PJAX, X-PJAX-Container
permissions-policy: interest-cohort=()
cache-control: no-cache
set-cookie: _gh_sess=%2BEichIQiS6mPGOQezauQXt5LDjh8PQrowD3A5%2FzFXiDuXxvgFxJViurljc6uElUDKarOEnWSLlxeDz0RmL4MCusqQMitCn4v2UFd4IQ%3D%3D--QEwYz3UHkEiL1NGR--jjbCBNRKQEQOA%3D%3D; path=/; secure; HttpOnly; SameSite=Lax
strict-transport-security: max-age=31536000; includeSubdomains; preload
x-frame-options: deny
x-content-type-options: nosniff
x-xss-protection: 0
referrer-policy: origin-when-cross-origin, strict-origin-when-cross-origin
expect-ct: max-age=2592000, report-uri="https://api.github.com/_private/browser/errors"
content-security-policy: default-src 'none'; base-uri 'self'; block-all-mixed-content; child-src github.com/assets-cdn/worker/ gist.github.com/assets-cdn/worker/; connect-src 'self' uploads.github.com objects-origin.githubusercontent.com www.githubstatus.com collector.githubapp.com collector.github.com api.github.com github-cloud.s3.amazonaws.com github-production-repository-file-5c1aeb.s3.amazonaws.com github-production-upload-manifest-file-7fdce7.s3.amazonaws.com github-production-user-asset-6210df.s3.amazonaws.com cdn.optimizely.com logx.optimizely.com/v1/events translator.github.com wss://alive.github.com; font-src github.githubassets.com; form-action 'self' github.com gist.github.com objects-origin.githubusercontent.com; frame-ancestors 'none'; frame-src render.githubusercontent.com viewscreen.githubusercontent.com notebooks.githubusercontent.com; img-src 'self' data: github.githubassets.com identicons.github.com collector.githubapp.com collector.github.com github-cloud.s3.amazonaws.com secured-user-images.githubusercontent.com/ *.githubusercontent.com; manifest-src 'self'; media-src github.com user-images.githubusercontent.com/; script-src github.githubassets.com; style-src 'unsafe-inline' github.githubassets.com; worker-src github.com/assets-cdn/worker/ gist.github.com/assets-cdn/worker/
vary: Accept-Encoding, Accept, X-Requested-With
x-github-request-id: 07C7:12E1:C19D08:14B9C02:6215E02D
Cookies must be enabled to use GitHub.
I cannot figure out how to get past this.
Please help me resolve it, or tell me another way (for example using git) to change the default branch of a repo.
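For reference, here is a minimal sketch of the same request using token authentication in the Authorization header. It assumes GITHUB_KEY is a personal access token with the repo scope; the plain-text "Cookies must be enabled" page suggests the request was handled by GitHub's web login flow rather than the API, and an explicit token header avoids that flow:
# Sketch only: GITHUB_KEY is assumed to be a personal access token (PAT)
# with the "repo" scope.
curl -i \
  -H "Authorization: token $GITHUB_KEY" \
  -H "Accept: application/vnd.github.v3+json" \
  -X PATCH \
  -d '{"default_branch": "master"}' \
  "https://api.github.com/repos/$USER_NAME/$REPO_NAME"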
I make a request to my Squid through CloudFront and receive a miss:
< content-type: text/html
< date: Thu, 17 Mar 2022 08:22:46 GMT
< server: Apache/2.4.48 (Ubuntu)
< cache-control: max-age=3153600000, public, only-if-cached, immutable
< last-modified: Thu, 01 Jan 1970 00:00:00 GMT
< x-cache-lookup: MISS from ip-172-31-68-17:3128
< via: 1.1 ip-172-31-68-17 (squid/4.13), 1.1 c929a0b0be95dbd556dd38270accc062.cloudfront.net (CloudFront)
< content-security-policy: default-src * 'unsafe-eval' 'unsafe-inline' 'unsafe-dynamic' data: blob: ws: wss:; img-src http: https: data: mediastream: 'unsafe-eval' 'unsafe-hashes' 'unsafe-inline' blob: ws: wss:; frame-ancestors 'self'; frame-ancestors https://allfilesnews.com https://*.allfilesnews.com
< x-frame-options: SAMEORIGIN
< link: <https://Reb1lBkV3d9kbe7qlaaC7nZDFGNEErQKHQBk51bdA6c.allfilesnews.com/Reb1lBkV3d9kbe7qlaaC7nZDFGNEErQKHQBk51bdA6c/VvaYNP0fJiVQfefYmC>; rel="canonical"
< expires: Thu, 15 Apr 2110 20:00:00 GMT
< vary: Origin
< x-cache: Miss from cloudfront
< x-amz-cf-pop: TLV50-C1
< x-amz-cf-id: t6P2Rr1HqwCmSDmzC4oVh2fzq25ooHzPY4OZPwWras-9tBbMqinr3Q==
You can see that max-age is very long and Last-Modified is in the deep past. Why, then, does this repeated query (I already queried this URL no more than a few days ago) to Squid give a MISS?
Here is my Squid config:
read_timeout 60 minutes
http_port 3128 accel vhost allow-direct
cache_dir ufs /Files/main/cache 120000 16 256
cache_peer 127.0.0.1 parent 9003 0 no-query default
acl all src 0.0.0.0/0.0.0.0
http_access allow all
never_direct allow all
forward_max_tries 5
Note that the 120 GB of cache space is almost entirely unused:
$ df -h /Files/main/
Filesystem      Size  Used Avail Use% Mounted on
Files/main      125G  187M  125G   1% /Files/main
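One way to narrow this down is to query Squid directly and watch its verdict on a repeat request. This is only a sketch: /some/object and the Host value are hypothetical placeholders for a real cached URL, and it assumes the box can reach Squid on port 3128:
# Run this twice: if the second request is still a MISS, Squid itself
# considers the response uncacheable, and CloudFront is not the problem.
# (/some/object and the Host value are placeholders.)
curl -s -o /dev/null -D - -H 'Host: example.allfilesnews.com' \
  http://127.0.0.1:3128/some/object | grep -iE '^(x-cache|x-cache-lookup|age)'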
I am new to Kong, so bear with me. :)
I am hosting my APIs on a Windows server at http://supermarket.xxxx.com:5000.
1. Added the service as follows on an Ubuntu box (http://supermarket.xxxx.com is added to the hosts file):
curl -i -X POST \
  --url http://localhost:8001/services/ \
  --data 'name=SupermarketService' \
  --data 'url=http://supermarket.xxx.com:5000'
HTTP/1.1 201 Created
Date: Thu, 17 Dec 2020 07:11:50 GMT
Content-Type: application/json; charset=utf-8
Connection: keep-alive
Access-Control-Allow-Origin: *
Server: kong/2.1.3
Content-Length: 379
X-Kong-Admin-Latency: 204
2. Added the routes:
curl -i -X POST \
  --url http://localhost:8001/services/SupermarketService/routes \
  --data 'hosts[]=supermarket.xxx.com' \
  --data 'paths[]=/api/categories' \
  --data 'strip_path=false' \
  --data 'methods[]=GET'
HTTP/1.1 201 Created
Date: Thu, 17 Dec 2020 09:01:17 GMT
Content-Type: application/json; charset=utf-8
Connection: keep-alive
Access-Control-Allow-Origin: *
Server: kong/2.1.3
Content-Length: 463
X-Kong-Admin-Latency: 11
3. Tested the setup on the Ubuntu box:
curl -i -X GET \
  --url http://localhost:8000/api/categories \
  --header 'Host: supermarket.xxx.com'
HTTP/1.1 200 OK
Content-Type: application/json; charset=utf-8
Transfer-Encoding: chunked
Connection: keep-alive
Server: Microsoft-IIS/8.5
Strict-Transport-Security: max-age=31536000; includeSubDomains
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-UA-Compatible: IE=Edge,chrome=1
X-Xss-Protection: 1; mode=block
Content-Security-Policy: default-src https: data: 'unsafe-inline' 'unsafe-eval' 'self' ; connect-src 'self' data: *;
Date: Thu, 17 Dec 2020 23:10:21 GMT
X-Kong-Upstream-Latency: 1586
X-Kong-Proxy-Latency: 2
Via: kong/2.1.3
[{"id":100,"name":"Fruits and Vegetables"},{"id":101,"name":"Dairy"}]
When I try to access the same API from another box using a web browser:
http://192.168.44.67:8000/api/categories (where 192.168.44.67 is the IP address of my Ubuntu box)
I get this:
{"message":"no Route matched with those values"}
Please let me know what is wrong.
When accessing the API from the other box using a web browser, you didn't set the Host header. The browser automatically sets that header based on the URL you type in the address bar, so Kong receives Host: 192.168.44.67, which matches no route.
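For illustration, the browser's request is roughly equivalent to this curl call, which fails with the same "no Route matched" error because the Host value doesn't satisfy the route's hosts[] filter:
curl -i http://192.168.44.67:8000/api/categories -H 'Host: 192.168.44.67'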
Remove the following element from the curl request that creates the route:
--data 'hosts[]=supermarket.xxx.com'
Then the route will match on the path segment alone, and it should work.
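For example, here is a sketch of recreating the route without the host filter (this assumes the old route is deleted first; otherwise both routes will exist):
curl -i -X POST \
  --url http://localhost:8001/services/SupermarketService/routes \
  --data 'paths[]=/api/categories' \
  --data 'strip_path=false' \
  --data 'methods[]=GET'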
I have installed Varnish and configured it per the guidelines.
When I try curl -I https://d-o-m-a-i-n.com, I get the following, which suggests it is working correctly:
HTTP/1.1 200 OK
Date: Sat, 28 Mar 2020 03:17:02 GMT
Server: Apache/2.4.18 (Ubuntu)
Expires: Sun, 29 Mar 2020 03:17:03 GMT
Cache-Control: max-age=86400, public, s-maxage=86400
Pragma: cache
X-Magento-Tags: cms_b_porto_homeslider_3,store,cms_b,cms_p_91,cms_b_porto_custom_notice_new,cat_p,cat_c_p_30,cat_p_22,cat_p_1,cat_p_34,cat_p_21,cat_p_41,cat_p_11,cat_p_39,cat_p_35,cat_p_33,cms_b_porto_footer_top_1_for_5,cms_b_porto_footer_middle_1_for_5,cms_b_porto_footer_middle_2_for_5
X-Content-Type-Options: nosniff
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN
Vary: Accept-Encoding
X-UA-Compatible: IE=edge
Content-Type: text/html; charset=UTF-8
X-Varnish: 33268 3
Age: 13957
Via: 1.1 varnish (Varnish/5.2)
X-Cache: cached
Accept-Ranges: bytes
But when I check the response headers in Google Chrome, I get the following, which shows Varnish is not caching and the age is 0:
Accept-Ranges: bytes
Age: 0
Cache-Control: max-age=86400, public, s-maxage=86400
Connection: Keep-Alive
Content-Encoding: gzip
Content-Length: 20027
Content-Type: text/html; charset=UTF-8
Date: Sat, 28 Mar 2020 07:13:35 GMT
Expires: Sun, 29 Mar 2020 07:13:35 GMT
Keep-Alive: timeout=5, max=100
Pragma: cache
Server: Apache/2.4.18 (Ubuntu)
Vary: Accept-Encoding
Via: 1.1 varnish (Varnish/5.2)
X-Cache: uncached
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-Magento-Tags: cms_b_porto_homeslider_3,store,cms_b,cms_p_91,cms_b_porto_custom_notice_new,cat_p,cat_c_p_30,cat_p_1,cat_p_22,cat_p_21,cat_p_11,cat_p_34,cat_p_41,cat_p_39,cat_p_35,cat_p_33,cms_b_porto_footer_top_1_for_5,cms_b_porto_footer_middle_1_for_5,cms_b_porto_footer_middle_2_for_5
X-UA-Compatible: IE=edge
X-Varnish: 1704417
X-XSS-Protection: 1; mode=block
Do you have any ideas why that is?
How are you generating the cache key (vcl_hash)?
Check it and make sure the User-Agent is not involved.
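A quick way to test that from the shell (a sketch; substitute your real URL): fetch the page twice with different User-Agent values and compare the cache headers. If switching the agent flips the result between cached and uncached, the hash (or a Vary header) depends on it:
curl -sI -A 'curl-test'   https://d-o-m-a-i-n.com | grep -iE '^(age|x-cache):'
curl -sI -A 'Mozilla/5.0' https://d-o-m-a-i-n.com | grep -iE '^(age|x-cache):'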
I've been looking around for solutions before asking this question, but unfortunately none of them yielded good results.
I get an OpenURI::HTTPError: 405 Not Allowed when accessing this specific URL:
require 'open-uri'
require 'nokogiri'
doc = Nokogiri::HTML(open("http://streeteasy.com"))
#=> OpenURI::HTTPError: 405 Not Allowed
from /Users/cyrusghazanfar/.rvm/rubies/ruby-2.2.0/lib/ruby/2.2.0/open-uri.rb:358:in `open_http'
I also tried:
$ curl -I http://streeteasy.com
which returned:
HTTP/1.1 405 Not Allowed
Date: Fri, 22 Sep 2017 20:03:59 GMT
Content-Type: text/html
Connection: keep-alive
Server: nginx
X-DZ: 24.193.31.96
Vary: Accept-Encoding
X-DZ: 127.0.0.1
Expires: Thu, 01 Jan 1970 00:00:01 GMT
Cache-Control: private, no-cache, no-store, must-revalidate
Edge-Control: no-store, bypass-cache
Surrogate-Control: no-store, bypass-cache
The problem is that the server needs a User-Agent header to work, so in curl it would be:
curl --header "User-Agent: Mozilla/5.0" http://streeteasy.com
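The same fix works in Ruby, since open-uri accepts request headers as options; a minimal sketch matching the question's snippet:
require 'open-uri'
require 'nokogiri'

# Passing a User-Agent keeps the server from rejecting the request with 405.
doc = Nokogiri::HTML(open("http://streeteasy.com", "User-Agent" => "Mozilla/5.0"))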
I'm developing a bash script that can detect a web application firewall from response header tags, but I can't find an example like my idea.
To fetch the response headers from bash you can simply use curl. If you're on Windows, you'll want the Windows bash shell or Cygwin to run it.
There are dozens more tricks you can play with curl to get whatever you want in whatever format you want; plenty of SO questions cover them. A minimal detection sketch follows the two examples below.
curl --head www.google.com
HTTP/1.1 200 OK
Date: Thu, 06 Apr 2017 02:07:00 GMT
Expires: -1
Cache-Control: private, max-age=0
Content-Type: text/html; charset=ISO-8859-1
P3P: CP="This is not a P3P policy! See https://www.google.com/support/accounts/answer/151657?hl=en for more info."
Server: gws
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN
Set-Cookie: NID=100=IoNzfnVsz_oaEwIQE182ysgVSHoZYRVKjTqSQ5GqKrz1ewxwav2ae5GPo_bx0apr39Pnn4yvM5RfsmQnJ_QFmllVwS34ts-bNrvkzDFIfaokkDTo1BXHDDI69duBn1f9kx4sXJ_rcCK28og6; expires=Fri, 06-Oct-2017 02:07:00 GMT; path=/; domain=.google.com; HttpOnly
Transfer-Encoding: chunked
Accept-Ranges: none
Vary: Accept-Encoding
Here's an example of getting response headers using curl:
curl -D - www.google.com
HTTP/1.1 200 OK
Date: Thu, 06 Apr 2017 02:11:26 GMT
Expires: -1
Cache-Control: private, max-age=0
Content-Type: text/html; charset=ISO-8859-1
P3P: CP="This is not a P3P policy! See https://www.google.com/support/accounts/answer/151657?hl=en for more info."
Server: gws
X-XSS-Protection: 1; mode=block
X-Frame-Options: SAMEORIGIN
Set-Cookie: NID=100=DrUalBDiHKiZkX0yETtowdWhEfjJy7ioPU0Fe7Wch9pbbYI8MeSbg8M42dHmwu-hKZmYUlnE7VIgLhJ_Zi6byG_PYpTu5s2KYUv9XjPeH-GfSOTSq22I2GnEqXZwhJv-Bdn0aYzCUugF9FHb3Q; expires=Fri, 06-Oct-2017 02:11:26 GMT; path=/; domain=.google.com; HttpOnly
Accept-Ranges: none
Vary: Accept-Encoding
Transfer-Encoding: chunked
<!doctype html><html itemscope="" itemtype="http://schema.org/WebPage" lang="en"><head><meta content="Search the world's information, including webpages, images, videos and more. Google has many special features to help you find exactly what you're looking for." name="description"><meta content="noodp" name="robots"><meta content="text/html; charset=UTF-8" http-equiv="Content-Type"><meta content="/images/branding/googleg/1x/googleg_standard_color_128dp.png" itemprop=<cut the rest of the HTTP response>
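Building on those curl calls, here is a minimal bash sketch of the header-based WAF check the question describes. The fingerprints are illustrative examples only (Cloudflare advertises Server: cloudflare, Akamai uses AkamaiGHost, Sucuri sets X-Sucuri-ID), not a complete signature list:
#!/usr/bin/env bash
# Sketch: look for well-known WAF fingerprints in the response headers.
# The patterns below are examples; extend the list for other vendors.
url="$1"
headers=$(curl -sI "$url")

if   grep -qi '^server: *cloudflare' <<< "$headers"; then echo "WAF: Cloudflare"
elif grep -qi 'AkamaiGHost'          <<< "$headers"; then echo "WAF: Akamai"
elif grep -qi '^x-sucuri-id:'        <<< "$headers"; then echo "WAF: Sucuri"
else echo "No known WAF signature found in headers"
fi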