Intercepting responses to a process in Go

I have a process that launches a browser, which makes a request to a local server.
The server responds, but I do not know how to see the response on the client side.
The request has to come from the browser; I do not want to build it myself with http.NewRequest.
client.go:
func openChrome() {
	page := "https://localhost:1333/"
	program := "C:\\Program Files (x86)\\Google\\Chrome\\Application\\chrome.exe"
	// os.StartProcess takes the full argv, so the program path is repeated as args[0].
	args := []string{program, page}
	attr := &os.ProcAttr{
		Files: []*os.File{os.Stdin, os.Stdout, os.Stderr},
	}
	proc, err := os.StartProcess(program, args, attr)
	if err != nil {
		log.Fatal(err)
	}
	proc.Wait()
}

To see the response and have easier control over requests, you could use a tool like chromedp, chromedriver (with WebDriver), or Selenium to load pages and inspect the response in the browser. All of these are accessible to varying degrees from Go, and can drive the browser through the standard request cycle as if a human were doing it (loading content and querying what was loaded).
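For example, a minimal sketch using chromedp (this assumes the github.com/chromedp/chromedp package and the URL from your snippet; note that a self-signed certificate on localhost may need extra browser flags):

package main

import (
	"context"
	"log"

	"github.com/chromedp/chromedp"
)

func main() {
	// chromedp launches and drives a locally installed Chrome (headless by
	// default) over the DevTools protocol.
	ctx, cancel := chromedp.NewContext(context.Background())
	defer cancel()

	var body string
	err := chromedp.Run(ctx,
		chromedp.Navigate("https://localhost:1333/"),
		// Grab the rendered document so the client side can inspect the response.
		chromedp.OuterHTML("html", &body),
	)
	if err != nil {
		log.Fatal(err)
	}
	log.Println(body)
}

Because chromedp drives a real Chrome instance, the request still comes from the browser rather than from http.NewRequest.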
You can see the stdout and stderr of a process you launch, but that is unlikely to help for this particular task, so I'm ignoring it.
You don't give a reason for excluding a direct request, but for full control that would be another possibility; unless you need to render HTML, your code can do everything a browser can (change the user agent, parse HTML, etc.).

Related

How could I redirect HTTP requests to localhost within my own computer through a script or program?

Similar to how one could use a Fiddler script to redirect outgoing requests to a different URL (example below), I would like to redirect outgoing requests to certain URLs to localhost or another URL.
var list = [ "https://onetoredirect.com", "https://twotoredirect.com" ];

static function OnBeforeRequest(oS: Session) {
    if (oS.uriContains("http://URLIWantToFullyBlock.com/")) {
        oS.oRequest.FailSession(404, "Blocked", "");
    }
    for (var i = 0; i < 2; i++) {
        if (oS.uriContains(list[i])) {
            oS.fullUrl = oS.fullUrl.Replace("http://", "https://");
            oS.host = "localhost"; // This can also be replaced with another IP address.
            break;
        }
    }
}
The problem is that I need to do this for a program that I do not have access to, so I cannot just edit the program to send requests to these new URLs. The two vague ideas I had were:
A script/program that runs system-wide and redirects the requests
A script/program that watches my specific process (I have the ability to launch the process programmatically if need be) for these requests and redirects them
Either is viable; obviously I would prefer whichever is easier or more versatile.
I want to write this as part of a launcher for a game: you can either use my launcher, which launches the game with the redirection on, or launch the game normally and have the redirection off (to play normally), essentially removing any need for user modification. It is also okay for the solution to be Windows-only, since the game is Windows-only at the moment!
I ended up setting up a proxy with mitmproxy and a custom script, then setting the Windows proxy settings to go through localhost automatically!
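If you would rather keep everything in Go instead of depending on mitmproxy, the same idea can be sketched as a tiny local forward proxy that rewrites matching hosts; you then point the Windows proxy settings at it, just as above. This is a plain-HTTP sketch only (HTTPS interception needs CONNECT/certificate handling, which is what mitmproxy provides), and the host list, ports, and localhost target are illustrative assumptions:

package main

import (
	"io"
	"log"
	"net/http"
)

// Hosts to redirect to the local server (hypothetical examples, as in the
// Fiddler script above).
var redirectHosts = map[string]bool{
	"onetoredirect.com": true,
	"twotoredirect.com": true,
}

func proxy(w http.ResponseWriter, r *http.Request) {
	// When acting as a forward proxy, the full target URL arrives in r.URL.
	if redirectHosts[r.URL.Hostname()] {
		r.URL.Host = "localhost:8080" // hypothetical local target
	}
	out, err := http.NewRequest(r.Method, r.URL.String(), r.Body)
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadGateway)
		return
	}
	out.Header = r.Header.Clone()

	resp, err := http.DefaultTransport.RoundTrip(out)
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadGateway)
		return
	}
	defer resp.Body.Close()

	// Copy the upstream response back to the original client.
	for k, vals := range resp.Header {
		for _, v := range vals {
			w.Header().Add(k, v)
		}
	}
	w.WriteHeader(resp.StatusCode)
	io.Copy(w, resp.Body)
}

func main() {
	// Set the Windows proxy to 127.0.0.1:8888 (e.g. from the launcher) so the
	// game's plain-HTTP traffic flows through this handler.
	log.Fatal(http.ListenAndServe("127.0.0.1:8888", http.HandlerFunc(proxy)))
}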

Firefox web extension - read local file (last downloaded file)

I'm creating a web extension, porting it from XUL. I used to be able to easily read files with:
var dJsm = Components.utils.import("resource://gre/modules/Downloads.jsm").Downloads;
var tJsm = Components.utils.import("resource://gre/modules/Task.jsm").Task;
var fuJsm = Components.utils.import("resource://gre/modules/FileUtils.jsm").FileUtils;
var nsiPromptService = Components.classes["@mozilla.org/embedcomp/prompt-service;1"].getService(Components.interfaces.nsIPromptService);
....
NetUtil.asyncFetch(file, function(inputStream, status) {
    if (!Components.isSuccessCode(status)) {
        return;
    }
    var data = NetUtil.readInputStreamToString(inputStream, inputStream.available());
    data = window.btoa(data);
    var encoded_data_to_send_via_xmlhttp = encodeURIComponent(data);
    ...
});
The above is deprecated.
I can use downloads.download() to know what the last download was, but I cannot read the file and produce the equivalent of encoded_data_to_send_via_xmlhttp.
Also, Firefox 57 onwards means I have to try to fake a user action with a button click or something, or upload a file:
"Access to file:// URLs or reading files without any explicit user input"
Isn't there an easy way to read the last downloaded file?
The WebExtension API won't allow extensions to read local files anymore. You could give the extension cross-origin (CORS) privileges and read the content directly from the URL via fetch() or XMLHttpRequest() as a blob, store it in IndexedDB or memory, then encode it and send it to the server. This comes with many restrictions and limitations, such as which origins you can read from, and so forth.
It would also add potentially many unneeded steps. If the purpose is, as it seems to be in the question at the moment, to share the downloaded file with a server, I would instead suggest that you obtain the last DownloadItem object, extract the URL (.url) from it, and send that URL back to the server.
This way the server can load the data directly from that URL (and encode it server-side if needed). The network load will be about the same (a little less, actually, since there is no Base64 encoding involved, which adds 33% to the size), and there is much less load on the client. The server reads the data as a binary/byte stream, about the same as if the data were sent directly from the extension.
To obtain the last downloaded file you would do the following from a privileged script:
browser.downloads.search({
    limit: 1,
    orderBy: ["-startTime"]
}).then(getLastDownload);

function getLastDownload(downloads) {
    if (downloads.length) {
        var url = downloads[0].url;
        // ... send url to the server and let the server fetch the data from it directly
    }
}
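If the server side happens to be written in Go, the receiving end of that URL could be a handler roughly like the following sketch (the route, parameter name, and port are assumptions):

package main

import (
	"io"
	"log"
	"net/http"
)

// Hypothetical endpoint: the extension POSTs the download URL here and the
// server fetches the file itself, so no Base64 round-trip is needed.
func receiveURL(w http.ResponseWriter, r *http.Request) {
	fileURL := r.FormValue("url")
	resp, err := http.Get(fileURL)
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadGateway)
		return
	}
	defer resp.Body.Close()

	// Stream the byte data wherever it is needed; here it is just discarded.
	n, _ := io.Copy(io.Discard, resp.Body)
	log.Printf("fetched %d bytes from %s", n, fileURL)
	w.WriteHeader(http.StatusNoContent)
}

func main() {
	http.HandleFunc("/last-download", receiveURL)
	log.Fatal(http.ListenAndServe(":8080", nil))
}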
According to this support.mozilla.org question:
(2) Local file security
Firefox limits access from pages on web servers to pages on local disk or UNC paths. [...]
Which solution?
Use the local-filesystem-links Firefox add-on (not tested)
and/or
run a small local web server on the client side; assuming the server runs with sufficient privileges, you can then access any local content via http:// (but still not via file:///). A minimal sketch follows.
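For illustration, such a local web server can be a few lines of Go (the port and the served directory are assumptions; any small HTTP server would do):

package main

import (
	"log"
	"net/http"
	"os"
	"path/filepath"
)

func main() {
	// Serve the user's Downloads folder (an assumption; point it wherever the
	// files you need actually live) over plain HTTP on a local-only port.
	home, err := os.UserHomeDir()
	if err != nil {
		log.Fatal(err)
	}
	dir := filepath.Join(home, "Downloads")

	http.Handle("/", http.FileServer(http.Dir(dir)))
	log.Fatal(http.ListenAndServe("127.0.0.1:8090", nil))
}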

Mocking request headers with PhantomJS and agouti

I'm using agouti in a project of mine to test my web page. Everything works fine, but I'm having trouble finding a way to mock request headers. I'm currently using platform.js through require.js, and I would like to "fake" the OS header sent by the browser so the system will think the requests are coming from a mobile platform. I want to do this to test that my home-screen message only appears on mobile platforms.
Here is the current test.
/*
Test that add to homescreen notification is shown first time visiting the app.
*/
func (t *AppTest) TestHomescreenNotification() {
	SetDefaultEventuallyTimeout(time.Second * 7)
	RegisterTestingT(t.unit)
	page, err := agoutiDriver.NewPage()
	Expect(err).NotTo(HaveOccurred()) // check the error before using the page
	Expect(page.DeleteCookie("visited")).To(Succeed())
	Expect(page.Navigate(indexPath)).To(Succeed())
	Eventually(page.Find("#message")).Should(BeFound())
	Eventually(page.Find(".growInTop")).Should(BeFound())
	/* Resetting the default timeout. */
	SetDefaultEventuallyTimeout(time.Second)
}

How to handle every request in a Firefox extension?

I'm trying to capture and handle every single request that a web page, or a plugin in it, is about to make.
For example, if you open the console and enable Net logging, when an HTTP request is about to be sent, the console shows it there.
I want to capture every link and call my function, even when a video is loaded by the Flash player (which is also logged in the console, if it is HTTP).
Can anyone guide me on what I should do, or where I should get started?
Edit: I want to be able to cancel the request and handle it my way if needed.
You can use the Jetpack SDK to get most of what you need, I believe. If you register for system events and listen for http-on-modify-request, you can use the nsIHttpChannel methods to modify the request and response:
let { Ci } = require('chrome');
let { on } = require('sdk/system/events');
let { newURI } = require('sdk/url/utils');

on('http-on-modify-request', function ({subject, type, data}) {
    if (/google/.test(subject.URI.spec)) {
        subject.QueryInterface(Ci.nsIHttpChannel);
        subject.redirectTo(newURI('http://mozilla.org'));
    }
});
Additional info: "Intercepting Page Loads".
Non-SDK version, with much more control and detail:
This lets you look at the load flags, so you can watch only LOAD_DOCUMENT_URI, which covers frames and the main window. The main window always has LOAD_INITIAL_DOCUMENT_URI.
https://github.com/Noitidart/demo-on-http-examine
https://github.com/Noitidart/demo-nsITraceableChannel - in this one you can see the source before it is parsed by the browser
In these examples you also see how to get the contentWindow and browserWindow from the subject; you can apply this to the SDK example, just use the "subject".
Also, I prefer to use http-on-examine-response, even in the SDK version, because otherwise you will see all the pages it redirects FROM, not the final redirect TO. Say a URL blah.com redirects you to blah.com/1 and then blah.com/2.
Only blah.com/2 has a document, so on modify-request you see blah.com and blah.com/1; they will have the LOAD_REPLACE flag. Typically they redirect right away, so the document never shows; if it is a timed redirect, you will see the document and also the LOAD_INITIAL_DOCUMENT_URI flag (I'm guessing here, I haven't tested it myself).

Detect the URL the user is viewing in Chrome/Firefox/Safari

How can you detect the URL that the user is browsing in Chrome/Safari/Firefox via Cocoa (in a desktop app)?
As a side but related note, are there any security restrictions when developing a desktop app that the user will be alerted about and asked to allow, e.g. if the app accesses their contact information?
Looking for a Cocoa-based solution, not JavaScript.
I would do this as an extension, and because you would like to target Chrome, Safari, and Firefox, I'd use a cross-browser extension framework like Crossrider.
So go to crossrider.com, set up an account and create a new extension. Then open the background.js file and paste in code like this:
appAPI.ready(function($) {
    appAPI.message.addListener({channel: "notifyPageUrl"}, function(msg) {
        // Do something, like send an XHR post somewhere
        // notifying you of the pageUrl that the user visited.
        // The url is contained within msg.pageUrl
    });

    var opts = { listen: true };

    // Note: When defining the callback function, the first parameter is an object that
    // contains the page URL, and the second parameter contains the data passed
    // to the context of the callback function.
    appAPI.webRequest.onBeforeNavigate.addListener(function(details, opaqueData) {
        // Where:
        // * details.pageUrl is the URL of the tab requesting the page
        // * opaqueData is the data passed to the context of the callback function
        if (opaqueData.listen) {
            appAPI.message.toBackground({
                msg: details.pageUrl
            }, {channel: "notifyPageUrl"});
        }
    }, opts); // opts is the opaque parameter that is passed to the callback function
});
Then install the extension! In the example above, nothing is done with the detected pageUrl, but you can do whatever you like here: send a message to the user, restrict access using the cancel or redirectTo return parameters, log it locally with the Crossrider appAPI.db API, or send the notification elsewhere, cross-domain, via an XHR request directly from the background.
Hope that helps!
To answer the question on security issues desktop-side: desktop applications have the permissions of the user under which they run. So if you are thinking of providing a desktop app that your users will run locally, say something that detects the URLs they access by tapping into the network stream with something like WinPcap on Windows or libpcap on *nix varieties, be aware of that, and also that libpcap and friends need access to a network card that can be placed in promiscuous mode in the first place, by the user in question.
The pcap/installed desktop app solutions are pretty invasive: most folks don't want you listening in on literally everything, and it may actually violate some security policies depending on where your users work; their network administrators may not appreciate you "sniffing", whether or not that is the actual purpose. Security people can get quite spooked, so to speak, by these kinds of topics.
The extension via Crossrider is probably the easiest and least intrusive way of accomplishing your goal if I understand the goal correctly.
One last note: you can get the current URLs of all tabs using Crossrider's tabs API:
// retrieves the array of tabs
appAPI.tabs.getAllTabs(function(allTabInfo) {
// Display the array
for (var i=0; i<allTabInfo.length; i++) {
console.log(
'tabId: ' + allTabInfo[i].tabId +
' tabUrl: ' + allTabInfo[i].tabUrl
);
}
});
For the tab API, refer to:
http://docs.crossrider.com/#!/api/appAPI.tabs
For the background navigation API:
http://docs.crossrider.com/#!/api/appAPI.webRequest.onBeforeNavigate
And for the messaging:
http://docs.crossrider.com/#!/api/appAPI.message
And for the appAPI.db stuff:
http://docs.crossrider.com/#!/api/appAPI.db
Have you looked into the Scripting Bridge? You could have an app that launches, say, an AppleScript which checks whether any of the well-known browsers is open and asks them which documents (URLs) they are viewing.
Note: it doesn't necessarily need to be an AppleScript; you can access the Scripting Bridge through Cocoa.
It would, however, require the browser to support it. I know Safari supports it, but I don't know whether the others do.
Just as a quick note:
There are ways to do it via AppleScript, and you can easily wrap this code into NSAppleScript calls.
Here's a gist with AppleScript commands for Safari and Chrome. Firefox does not seem to support Apple Events.
Well, obviously this is what I had come across on Google:
chrome.tabs.getSelected(null, function(tab) {
    alert(tab.url);
});
In pure JavaScript you can use
alert(document.URL);
alert(window.location.href);
to get the current URL.
