I am building an extension with Firefox's Add-on SDK (v1.9).
My application is supposed to remove cookies as they are added (or changed) based on a database of matching URIs.
I accomplish this task by adding an observer to 'cookie-changed' and using nsICookie to identify matching cookies and nsICookieManager to remove the cookie if a match is found.
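For reference, this is roughly what that part looks like (a minimal sketch; isBlockedCookie() is a stand-in for the database lookup, not the real extension code):
var { Cc, Ci } = require("chrome");
var observer = require("observer-service");
var cookieManager = Cc["@mozilla.org/cookiemanager;1"]
                      .getService(Ci.nsICookieManager);

observer.add("cookie-changed", function(subject, data)
{
  if (data != "added" && data != "changed")
    return;
  var cookie = subject.QueryInterface(Ci.nsICookie2);
  // isBlockedCookie() stands in for the URI database lookup
  if (isBlockedCookie(cookie.host, cookie.name, cookie.path))
    cookieManager.remove(cookie.host, cookie.name, cookie.path, false);
});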
Problem
I need to know what website (URL) each cookie was added or changed from.
Unfortunately, by the time the cookie manager sends the cookie-changed notification, that information is already lost - the cookie manager only knows which host the cookie is set for (and that might not be the host of the page setting the cookie if the domain parameter was used). It might even be that there was no URL in the first place, e.g. if the cookie is set by an extension.
What you could do is register an observer for the http-on-examine-response notification. You can look at the Set-Cookie header of the channel as well as the channel URL, so when the cookie-changed notification is sent later you will know which website is responsible. Something like this:
var { Ci } = require("chrome");
var observer = require("observer-service");
observer.add("http-on-examine-response", function(subject, data)
{
  var channel = subject.QueryInterface(Ci.nsIHttpChannel);
  var cookieNames = [];
  // There can be more than one Set-Cookie header, cannot use getResponseHeader
  channel.visitResponseHeaders(function(header, value)
  {
    if (header.toLowerCase() == "set-cookie")
    {
      var match = /^([^\s=]+)=/.exec(value);
      if (match)
        cookieNames.push(match[1]);
    }
  });
  if (cookieNames.length)
  {
    var url = channel.URI.spec;
    // Remember that this url set the cookies or just clear the header
    if (!isAllowedToSetCookies(url, cookieNames))
      channel.setResponseHeader("Set-Cookie", "", false);
  }
});
Note: This code hasn't been tested.
Documentation: observer notifications, nsIHttpChannel
Related
I am building a Remix app and wanted to record some user analytics in my database based on what page the user was viewing. I also wanted to do so on a route-by-route basis, rather than just the raw URL.
For example: I wanted to know "user viewed URL /emails/123" as well as "user viewed route /emails/$emailId"
This problem could be generalized as "I want to run a piece of server code once per user navigation"
For my tracking I'm assuming users have javascript enabled in their browser.
Solutions I tried:
Record in the loader
This would be something like:
export const loader: LoaderFunction = async ({ request, params }): Promise<LoaderData> => {
  myDb.recordPageVisit(request.url);
  // ...load and return the page data as usual
};
This doesn't work because the loader can be called multiple times per page visit (e.g. after an action is run).
It's possible that there's some value hidden in the request parameter that tells us whether this is an initial page load or if it's a later visit, but if so I couldn't find it, including when I examined the raw HTTP requests.
It's also annoying to have to put this code inside of every loader.
Record the URL in the node code
I'm using @remix-run/node as my base Remix server, so I have the escape hatch of setting up node middleware, and I thought that might be a good answer:
const app = express();
app.use((req, res, next) => {
if (req.url.indexOf("_data") == -1) {
myDb.recordPageVisit(req.url);
}
next();
});
I tried ignoring routes with _data in them, but that didn't work because Remix is being efficient: when the user navigates, it makes an AJAX-y call that only fetches the loader data rather than the full rendered page from the server, so those _data requests are the only server requests for client-side navigations. I know this is the behavior of Remix, but I had not remembered it before I went down this path :facepalm:
As far as I can tell, it's impossible to statelessly track unique pageviews (i.e. based purely on the current request URL) - you need to see the user's previous page as well.
I wondered if referer would allow this to work statelessly, but it appears that the referer is not behaving how I'd hoped: the referer header is already set to the current page in the first loader request for the data for the page. So initial load and load-after-mutation appear identical based on referer. I don't know if this is technically a bug, but it's certainly not the behavior I'd expect.
I ended up solving this by doing the pageview tracking in the client. To support recording this in the DB, I implemented a route that just received the POSTs when the location changed.
The documentation for react-router's useLocation actually includes this exact scenario as an example.
From https://reactrouter.com/docs/en/v6/api#uselocation:
function App() {
let location = useLocation();
React.useEffect(() => {
ga('send', 'pageview');
}, [location]);
return (
// ...
);
}
However, that doesn't quite work in remix - the location value is changed after actions (same text value, but presumably different ref value). So I started saving the last location string seen, and then only report a new pageview when the location string value has changed.
So after adding that stateful tracking of the current location, I landed on:
export default function App() {
  // ...other setup omitted...
  const [lastLocation, setLastLocation] = useState("");
  let location = useLocation();
  const matches = useMatches();

  useEffect(() => {
    if (lastLocation == location.pathname) {
      return;
    }
    // there are multiple matches for parent route + root route, this
    // will give us the leaf route
    const routeMatch = matches.find((m) => m.pathname == location.pathname);
    setLastLocation(location.pathname);
    fetch("/api/pageview", {
      body: JSON.stringify({
        url: location.pathname,
        // routeMatch.id looks like: "/routes/email/$emailId"
        route: routeMatch?.id,
      }),
      method: "POST",
    }).then((res) => {
      if (res.status != 200) {
        console.error("could not report pageview:", res);
      }
    });
  }, [location]);

  // ...render omitted...
}
The matches code is not necessary for tracking just raw URLs, but I wanted to extract the route form (e.g. /emails/$emailId), and routeMatch.id is a close match to that value - I strip "routes/" server-side. useMatches docs: https://remix.run/docs/en/v1/api/remix#usematches
Client side pageview tracking is a bit annoying since clients are flaky, but for the current remix behavior I believe this is the only real option.
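For completeness, here is a minimal sketch of the resource route that receives those POSTs (the file path, myDb, and the payload handling are assumptions based on the client code above, not my exact implementation):
// app/routes/api/pageview.ts - hypothetical path backing the /api/pageview endpoint
import type { ActionFunction } from "@remix-run/node";
import { json } from "@remix-run/node";
import { myDb } from "~/db.server"; // hypothetical db module

export const action: ActionFunction = async ({ request }) => {
  const { url, route } = await request.json();
  // strip the "routes/" prefix from the route id, as described above
  await myDb.recordPageVisit(url, route?.replace(/^\/?routes\//, ""));
  return json({ ok: true });
};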
I did it a different way for routes; Remix is funny because of the whole parent-route thing, so I use a route logger:
Beans.io
https://www.npmjs.com/package/beansio
In my Web API, when a user logs in successfully, I set the session with some values like
HttpContext.Session.SetObject("CurrentUserID", user.Id);
HttpContext.Session.SetObject("CurrentUserRoles",user.Roles);
and just return the token and some values to save in a cookie:
return Ok(new
{
Id = user.Id,
Username = user.UserName,
FirstName = user.FirstName,
LastName = user.LastName,
Token = tokenString,
role = user.Roles
});
But when the client hits an API action which has this line
List<string> userRolesList = HttpContext.Session.GetObject<List<string>>("CurrentUserRoles");
it always gets a null value, even though I have added session inside Startup > Configure, like this:
app.UseSession();
app.UseMvc(routes =>
{
routes.MapRoute(
name: "default",
template: "{controller}/{action=Index}/{id?}");
});
and in ConfigureServices also:
services.AddSession(options =>
{
// Set a short timeout for easy testing.
options.IdleTimeout = TimeSpan.FromSeconds( 60 * 60);
options.Cookie.HttpOnly = true;
});
services.AddMvc().SetCompatibilityVersion(CompatibilityVersion.Version_2_1);
but it still doesn't work... Please help.
HTTP is a stateless protocol. Sessions are fake state, enabled by both a server-side and client-side component. The client-side component is a cookie: specifically a Set-Cookie response header. In order for the session to be restored on the next request, the value of this Set-Cookie response header must be sent back via the Cookie request header with each request. A web browser (the client) will do all this automatically, including persisting the cookie locally. However, a thin client like HttpClient, Postman, etc. will not. You would need to independently persist the cookie from the response header and then attach it to each request via the Cookie header in order to maintain the session between requests.
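To make that concrete, here is a minimal fetch-based sketch (made-up URLs, not C#) of what a thin client has to do by hand; it assumes the runtime exposes the Set-Cookie header, which browsers deliberately do not:
const login = await fetch("https://example.com/api/auth/login", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ username: "u", password: "p" }),
});

// Persist the session cookie (for ASP.NET Core it is typically ".AspNetCore.Session=...").
const sessionCookie = (login.headers.get("set-cookie") ?? "").split(";")[0];

// Replay it on the next request; without this Cookie header the server
// starts a brand-new, empty session and the stored values come back null.
const result = await fetch("https://example.com/api/users/roles", {
  headers: { Cookie: sessionCookie },
});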
That said, this is a major reason why APIs typically do not, and honestly should not make use of sessions. It's simply a pattern that doesn't make much sense in an API context, and only adds a potential point of failure, since clients must pay attention to the cookie headers, and take manual actions to handle the cookies.
I am using Sustainsys Saml2 with Identity Server 4. A customer has asked me if we support SAML Single Logout.
They have asked for:
Single Logout Request URL
Single Logout Response URL
From what I can see this is probably supported by Sustainsys given the following properties exist.
var idp = new Sustainsys.Saml2.IdentityProvider(new EntityId("https://sso.acme.com"), opt.SPOptions)
{
MetadataLocation = "/metadata/sso-meta.xml",
LoadMetadata = true,
AllowUnsolicitedAuthnResponse = true,
SingleLogoutServiceResponseUrl = "INSERT",
SingleLogoutServiceBinding = Saml2BindingType.HttpRedirect
};
I have two questions:
I can only see one property which matches their request - the SingleLogoutServiceResponseUrl (I don't see a property for a SingleLogoutServiceRequestUrl). How do I configure the single logout request URL?
How do I determine what the values for these URLs are?
Thanks
Outbound logout requests are sent to the SingleLogoutServiceUrl configured on the IdP. The SingleLogoutServiceResponseUrl is a special one - it's only used when responses should be sent to a different endpoint on the IdP than requests. Normally they are the same, and if SingleLogoutServiceResponseUrl is not set, the SingleLogoutServiceUrl is used for both responses and requests.
Ask the IdP people for those values.
And as an additional note: you're loading metadata, so everything should already be in the metadata and you can shorten your code to:
var idp = new Sustainsys.Saml2.IdentityProvider(new EntityId("https://sso.acme.com"), opt.SPOptions)
{
    MetadataLocation = "/metadata/sso-meta.xml",
    LoadMetadata = true,
    AllowUnsolicitedAuthnResponse = true,
};
I have a static website, being served from a CDN, that communicates with an API via AJAX. How do I protect against CSRF?
Since I do not have control over how the static website is served, I cannot generate a CSRF token when someone loads my static website (and insert the token into forms or send it with my AJAX requests). I could create a GET endpoint to retrieve the token, but it seems like an attacker could simply access that endpoint and use the token it provides?
Is there an effective way to protect against CSRF with this stack?
Additional details: authentication is completely separate here. Some of the API requests for which I want CSRF protection are authenticated endpoints, and some are public POST requests (but I want to confirm that they are coming from my site, not someone else's)
I could create a GET endpoint to retrieve the token, but it seems like an attacker could simply access that endpoint and use the token it provides?
Correct. But CSRF tokens are not meant to be secret. They only exist to confirm an action is performed in the order expected by one user (e.g. a form POST only follows a GET request for the form). Even on a dynamic website an attacker could submit their own GET request to a page and parse out the CSRF token embedded in a form.
From OWASP:
CSRF is an attack that tricks the victim into submitting a malicious request. It inherits the identity and privileges of the victim to perform an undesired function on the victim's behalf.
It's perfectly valid to make an initial GET request on page load to get a fresh token and then submit it with the request performing an action.
If you want to confirm the identity of the person making the request you'll need authentication, which is a separate concern from CSRF.
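For the AJAX case in the question, that flow could look like the sketch below (the endpoint names and the header used are assumptions, not something any particular framework mandates):
async function submitComment(text) {
  // 1. Fetch a fresh token; the server ties it to the caller's session cookie.
  const { token } = await fetch("https://api.example.com/csrf-token", {
    credentials: "include" // send the session cookie to the cross-origin API
  }).then((res) => res.json());

  // 2. Send it back with the state-changing request; the server compares it
  //    with the value stored for that session.
  return fetch("https://api.example.com/comments", {
    method: "POST",
    credentials: "include",
    headers: { "Content-Type": "application/json", "X-CSRF-Token": token },
    body: JSON.stringify({ text })
  });
}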
My solution is as follows
Client [static html]
<script>
// Call script to GET Token and add to the form
fetch('https://mysite/csrf.php')
.then(resp => resp.json())
.then(resp => {
if (resp.token) {
const csrf = document.createElement('input');
csrf.name = "csrf";
csrf.type = "hidden";
csrf.value = resp.token;
document.forms[0].appendChild(csrf);
}
});
</script>
The above can be modified to target a pre-existing csrf field. I use this to add it to many pages with forms. The script assumes the first form on the page is the target, so this would also need to be changed if required.
On the server, to generate the CSRF token (using PHP; assumes PHP 7+):
[CSRFTOKEN is defined in a config file. Example]
define('CSRFTOKEN','__csrftoken');
Server:
session_start();

$root_domain = $_SERVER['HTTP_HOST'] ?? false;
$referrer = $_SERVER['HTTP_REFERER'] ?? false;

// Check that the script was called by a page from the same origin
// and generate a token if valid. Save the token in the SESSION and
// return it to the client.
$token = false;
if ($root_domain &&
    $referrer &&
    parse_url($referrer, PHP_URL_HOST) == $root_domain) {
    $token = bin2hex(random_bytes(16));
    $_SESSION[CSRFTOKEN] = $token;
}

header('Content-Type: application/json');
die(json_encode(['token' => $token]));
Finally in the code that processes the form
session_start();
// Included for clarity - this would typically be in a config
define('CSRFTOKEN', '__csrftoken');
$root_domain = $_SERVER['HTTP_HOST'] ?? false;
$referrer = parse_url($_SERVER['HTTP_REFERER'] ?? '', PHP_URL_HOST);
// Check submission was from same origin
if ($root_domain !== $referrer) {
// Invalid attempt
die();
}
// Extract and validate token
$token = $_POST[CSRFTOKEN] ?? false;
$sessionToken = $_SESSION[CSRFTOKEN] ?? false;
if (!empty($token) && $token === $sessionToken) {
// Request is valid so process it
}
// Invalidate the token
$_SESSION[CSRFTOKEN] = false;
unset($_SESSION[CSRFTOKEN]);
There is a very good explanation of this here:
https://cloudunder.io/blog/csrf-token/
From my understanding, a static site shouldn't face any issue with CSRF thanks to CORS restrictions, provided we add the X-Requested-With flag.
There is one more issue I would like to highlight here: how do you protect an API that is called from a mobile app as well as from a static site?
As the API is publicly exposed, you want to make sure that only allowed callers can use it.
There are some checks we can add at the API service layer for this:
1) For AJAX requests (from the static site), check the requesting domain, so only allowed sites can access it.
2) For mobile requests, use an HMAC token (a rough sketch follows below); read more here:
http://www.9bitstudios.com/2013/07/hmac-rest-api-security/
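A rough sketch of the HMAC idea (the header names and signed fields are assumptions; the shared secret ships with the app build and the server recomputes the same signature to verify it):
import { createHmac } from "crypto";

function signRequest(method, path, body, secret) {
  const timestamp = Date.now().toString();
  const payload = [method, path, body, timestamp].join("\n");
  const signature = createHmac("sha256", secret).update(payload).digest("hex");
  // The server recomputes the HMAC from the same fields and rejects
  // mismatches or stale timestamps (replay protection).
  return { "X-Timestamp": timestamp, "X-Signature": signature };
}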
I'm working on an Ember.js app. I have an update function, part of an ObjectController.
The function should save my updated model; however, when I call save() it sends a POST request, not a PUT request. (Tested in Chrome.)
Why would that happen? How can I make sure a PUT request is sent for updates?
Here is my code:
customer = this.get('model');
customer.set('name', 'New name');
customer.save();
For extra reference, when I log the "dirtyType" with console.log( customer.get('dirtyType') ); it says "updated".
Any help very much appreciated!
UPDATE
I've adjusted the sample code above to make it clearer, I am NOT creating a new model and wanting to use PUT. I have an existing model that I need to update.
I'm not sure if your workaround is correct in the land of PUT vs POST.
TL;DR: PUT should define the resource (by Request-URI), but we don't do that during creation, so creation shouldn't use PUT. Override createRecord in the adapter if you need this for your server, instead of hacking the isNew property, which may come back to bite you.
Put
9.6 PUT
The PUT method requests that the enclosed entity be stored under the
supplied Request-URI. If the Request-URI refers to an already
existing resource, the enclosed entity SHOULD be considered as a
modified version of the one residing on the origin server. If the
Request-URI does not point to an existing resource, and that URI is
capable of being defined as a new resource by the requesting user
agent, the origin server can create the resource with that URI. If a
new resource is created, the origin server MUST inform the user agent
via the 201 (Created) response. If an existing resource is modified,
either the 200 (OK) or 204 (No Content) response codes SHOULD be sent
to indicate successful completion of the request. If the resource
could not be created or modified with the Request-URI, an appropriate
error response SHOULD be given that reflects the nature of the
problem. The recipient of the entity MUST NOT ignore any Content-*
(e.g. Content-Range) headers that it does not understand or implement
and MUST return a 501 (Not Implemented) response in such cases.
If the request passes through a cache and the Request-URI identifies
one or more currently cached entities, those entries SHOULD be
treated as stale. Responses to this method are not cacheable.
The fundamental difference between the POST and PUT requests is
reflected in the different meaning of the Request-URI. The URI in a
POST request identifies the resource that will handle the enclosed
entity. That resource might be a data-accepting process, a gateway to
some other protocol, or a separate entity that accepts annotations.
In contrast, the URI in a PUT request identifies the entity enclosed
with the request -- the user agent knows what URI is intended and the
server MUST NOT attempt to apply the request to some other resource.
If the server desires that the request be applied to a different URI, it MUST send a 301 (Moved Permanently) response; the user agent MAY then make its own decision regarding whether or not to redirect the request.
Custom Adapter
App.ApplicationAdapter = DS.RESTAdapter.extend({
createRecord: function(store, type, record) {
var data = {};
var serializer = store.serializerFor(type.typeKey);
serializer.serializeIntoHash(data, type, record, { includeId: true });
//return this.ajax(this.buildURL(type.typeKey), "POST", { data: data });
return this.ajax(this.buildURL(type.typeKey), "PUT", { data: data });
},
updateRecord: function(store, type, record) {
var data = {};
var serializer = store.serializerFor(type.typeKey);
serializer.serializeIntoHash(data, type, record);
var id = record.get('id');
// you could do the same here, but it's even more incorrect
return this.ajax(this.buildURL(type.typeKey, id), "PUT", { data: data });
},
});
http://www.ietf.org/rfc/rfc2616.txt
Thank you for all of your help guys, however I have found the issue and it is ridiculously silly.
The API I have been using had a new flag, "is_new", which had been added to the model and was overwriting the "isNew" property, causing Ember (and me) to get very confused.
I've tweaked the API and all is good in the world!
If the model was created with createRecord, and thus has isNew == true, and you call save(), the expected behavior is a POST. Once the record has been persisted and is then changed, so that isDirty == true but isNew == false, the save() will be a PUT.
This is described in the Models Guide.
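A small illustration of that behaviour (the model and attribute names are made up):
var customer = this.store.createRecord('customer', { name: 'Acme' });
customer.get('isNew');    // true
customer.save();          // => POST /customers

this.store.find('customer', 1).then(function(existing) {
  existing.set('name', 'New name');
  existing.get('isNew');  // false (isDirty is true)
  existing.save();        // => PUT /customers/1
});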