I use a third-party application that requests a config file from their Internet site. That file is out of date, but I can create my own file with the updated information.
How can I redirect any requests coming from my computer for a specific URL to a different file? For example, if any application requests http://www.theirsite.com/path/to/file.html, cause it instead to receive http://www.example.com/blah.html or C:\My Documents\blah.html?
Why not change your hosts file to point lookups for their website to a web server that you control? Then serve your own config file from it. You may have to proxy other requests on to their real server, though, if the app uses the website for other resources.
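For example, on Windows the hosts file lives at C:\Windows\System32\drivers\etc\hosts. Assuming you run a local web server that serves your replacement config file, an entry like this (using the example hostname from the question) would send the app's lookups to it:

```
# C:\Windows\System32\drivers\etc\hosts
# Point the app's hostname at a web server you control (here, the local machine)
127.0.0.1    www.theirsite.com
```

You would then place your updated file at the same path the app requests (/path/to/file.html) under your local server's document root.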
I run a web server (RHEL 7) that uses HAProxy v1.5 as the front end for remote client requests, hosting a farm of web applications that each run in a specific directory off the web-doc root (i.e. /App001/, /App002/, etc.).
I recently renamed one of the application folders and reconfigured the app to use the new location, and it works fine. However, we have old documentation that points to the old app subfolder, and I'd like HAProxy to redirect remote client requests seeking the old location (i.e. /OldAppFolder/) to the new location (/NewAppFolder/) to avoid a page-not-found error.
The redirect must be conditional such that it only redirects client requests seeking /OldAppFolder/*
Is this easy? Thank you!
This will rewrite the URL transparently, so the user still sees the old path in the browser:
reqrep ^([^\ :]*)\ /OldAppFolder/(.*) \1\ /NewAppFolder/\2 if { path_beg /OldAppFolder/ }
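If you would rather issue a visible redirect (so the client's address bar and old bookmarks get corrected), a sketch assuming a newer HAProxy (1.7+, where the regsub converter is available) would be:

```
# External 301 redirect preserving the sub-path (HAProxy 1.7+; untested sketch)
http-request redirect code 301 location %[path,regsub(^/OldAppFolder/,/NewAppFolder/)] if { path_beg /OldAppFolder/ }
```

On 1.5 itself the transparent reqrep rewrite above is the simpler option, since 1.5 has no regsub converter for rewriting the path inside a redirect.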
I'm developing a site that's virtually entirely static. I use a generator to create all the HTML.
However, my site is a front end to a store embedded in its pages. I have a little Node.js server proxying requests on behalf of the browser to the back-end store. All it does is provide the number of items in the shopping cart so I can keep that number updated on all pages of my site. The browser's same-origin policy doesn't allow cross-domain scripting, so my server has to act as a proxy between the client and the store.
(The embedded store is loaded from the store's web site and so itself does not require proxying.)
I was hoping to eventually deploy to Netlify or some similar JAMstack provider. But I don't see how I'd proxy on Netlify.
What is the standard solution to this problem? Or is proxying unavailable to JAMstack solutions? Are there JAMstack providers that solve this problem?
Netlify does allow for proxy rewrites using redirect paths with status code 200.
You can store your proxy redirects in a _redirects file at the root of your deployed site. In other words, the file needs to exist at the root of the site directory that is deployed after a build.
_redirects
/api/* https://api.example.com/:splat 200
So a call to:
/api/v1/gifs/random?tag=cat&api_key=your_api_key
will be proxied to:
https://api.example.com/v1/gifs/random?tag=cat&api_key=your_api_key
If the API supports standard HTTP caching mechanisms like ETags or Last-Modified headers, the responses will even get cached by CDN nodes.
NOTE: you can also set up your redirects in your netlify.toml.
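The equivalent of the _redirects rule above in netlify.toml would look roughly like this:

```toml
# netlify.toml — same proxy rewrite as the _redirects rule
[[redirects]]
  from   = "/api/*"
  to     = "https://api.example.com/:splat"
  status = 200
  force  = true
```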
I have a website on which I tried to exploit an XXE vulnerability, but the file contents are not shown on the website (some filters are in place), so now I am trying an out-of-band (OOB) XXE attack. For that I need a web server on which I can capture the request carrying the file contents.
http://localhost/%myentity
Through that payload in the DOCTYPE I can make the XML parser issue a request to my localhost server with the file contents, successfully exploiting the OOB XXE attack.
Is it possible for me to capture the live request on the XAMPP/localhost server so that I can read out the file contents? I hope you know what I mean.
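If running XAMPP just for this feels heavy, a minimal capture-server sketch in Python works too (the host and port below are illustrative, not anything from your setup):

```python
import http.server

class LoggingHandler(http.server.BaseHTTPRequestHandler):
    """Prints every request line; in an OOB XXE attack the exfiltrated
    file contents arrive in the path or query string of the callback."""

    def do_GET(self):
        print("Received:", self.path)  # leaked data shows up here
        self.send_response(200)
        self.end_headers()

def serve(host="0.0.0.0", port=8000):
    # Listen on all interfaces so the target's XML parser can reach us.
    http.server.HTTPServer((host, port), LoggingHandler).serve_forever()

# serve()  # uncomment to start listening on port 8000
```

XAMPP works as well, of course: every request (including its query string) is recorded in Apache's access.log, so you can read the exfiltrated contents from there.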
First, I'm not on the web side of our world, so be nice to the backend guy.
A quick background: for a personal need I've developed a Google Chrome extension. Extensions are basically a web page loaded in a Chrome window, and that's it. Everything is on the client side (scripts, styles, images, etc.); only the data comes from a server through AJAX calls. A cron job calls a PHP script every hour to generate two files: data.json contains the "latest" data in JSON format, and hash.json contains the hash of that data. The client Chrome application uses local storage; if the remote hash differs from the local one, it simply retrieves the data file from the remote server.
As I have a BizSpark account with Azure, my first idea was: an Azure Web Site with PHP for the script, a simple homepage, the generated files, and the Azure Scheduler for the jobs.
I developed everything locally and everything ran fine... but once on the Azure platform I get this error:
XMLHttpRequest cannot load http://tso-mc-ws.azurewebsites.net/Core/hash.json. No 'Access-Control-Allow-Origin' header is present on the requested resource. Origin 'http://localhost:23415' is therefore not allowed access.
But what I really can't understand is that I'm able (and you'll be too) to get the file with my browser... so I just don't get it. Based on some posts I found on SO and other sites, I've also tried to manipulate the config and add extra headers; nothing seems to be working.
Any idea?
But what I really can't understand is that I'm able (and you'll be too) to get the file with my browser... So I just don't get it
So when you type in http://tso-mc-ws.azurewebsites.net/Core/hash.json in your browser's address bar, it is not a cross-domain request. However when you make an AJAX request from an application which is running in a different domain (http://localhost:23415 in your case), that's a cross-domain request and because CORS is not enabled on your website, you get the error.
As far as enabling CORS is concerned, please take a look at this thread: HTTP OPTIONS request on Azure Websites fails due to CORS. I've never worked with PHP/Azure Websites so I may be wrong with this link but hopefully it should point you in the right direction.
OK, this will perhaps sound like a bit of a troll answer, but that's not my point (I'm a .NET consultant, so nothing against MS).
I picked a Linux Azure virtual machine, installed Apache and PHP, configured Apache, set some permissions, defined the header for CORS, and configured a cron job, all in +/- 30 minutes... As my goal was to get it running, the problem is solved: it's running.
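For reference, the Apache side of that setup amounts to a couple of lines, assuming mod_headers is enabled (the wide-open origin is a reasonable choice here since the JSON files are public and read-only):

```apache
# In the vhost config or an .htaccess file; requires mod_headers
<IfModule mod_headers.c>
    Header set Access-Control-Allow-Origin "*"
</IfModule>
```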
I have an application that makes web requests to a set of URLs with the same host name. For testing purposes, I need to have this application make the same requests to URLs with a different host name. I don't have access to the source code, so building a debug version with the modified URLs is not possible.
Is there a [lightweight] proxy application that can intercept web requests and transform their URL?
For example, if it detects a web request to https://some.production.server/path, have it transform and send the request to https://some.development.server/path
Sure, use Fiddler. Click Tools > HOSTS to remap one hostname to another.
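The HOSTS tool takes hosts-file-style lines, but unlike the system hosts file it also accepts a hostname (not just an IP) as the target, so an entry along these lines should work (using the example names from the question):

```
# Fiddler > Tools > HOSTS — send requests for the production host to the dev server
some.development.server some.production.server
```

One caveat for HTTPS: the development server still needs to present a certificate the application will accept for the production hostname, or you'll need Fiddler's HTTPS decryption enabled.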