Working on a project in CodeIgniter.
I've got clients sending requests from an app (I have no control over) to a router app (I have no control over). The router gives me a unique identifier for each person talking to my app (in a POST variable). How would you suggest tracking sessions for users without using cookies between requests? I would use CI sessions for tracking, but they key off the IP address, and most of the requests from the router app to mine will come from the same IP address.
You can try storing sessions in a database, or use the Apache/IIS logs. Many free web apps are available to read these logs; I've used AWStats to read IIS logs to get traffic info.
You could go the old-fashioned route and put the tracking data into the URL. Append the unique ID from the routing app to every link on the page in the form of a GET parameter. Read it back in on subsequent pages and regurgitate it on all subsequent URLs.
Certainly not sexy, but in the world of dealing with applications you can't control (a.k.a. the "real" world), it might not be the worst option.
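As a minimal sketch of that approach in CodeIgniter 3 (the uid field name, the app_sessions table, and the orders URL are hypothetical; the database library and url helper are assumed to be loaded):

// In a controller: read the identifier the router app POSTs,
// falling back to the copy appended to the previous page's links.
$uid = $this->input->post('uid') ?: $this->input->get('uid');

// Keep your own session row keyed on that identifier instead of
// relying on CodeIgniter's cookie/IP-based session handling.
$query = $this->db->get_where('app_sessions', array('uid' => $uid));
if ($query->num_rows() === 0) {
    $this->db->insert('app_sessions', array('uid' => $uid, 'data' => '{}'));
}

// Regurgitate the identifier on every link you output so the next
// request carries it back without any cookie.
$next_url = site_url('orders') . '?uid=' . urlencode($uid);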
I'm developing an app that summarizes text, but I want non-account users to have only limited access to the tool. Right now I have accomplished this partially, since these users can only use it 5 times a day, but I realized that deleting the browser history resets the non-account user's session.
Can someone please guide me to a solution?
I'm basically using Flask, JavaScript, and AJAX.
I have implemented the usage limit for non-account users, but I'm trying to preserve this data (how many times the user has used the tool) even after the browser history and cache are deleted.
You can log anonymous users' access using a more static factor, such as the IP address, and use that to block more than 5 requests per day.
Flask has a library that implements exactly this functionality, called Flask-Limiter:
https://flask-limiter.readthedocs.io/en/stable
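A minimal sketch with a recent Flask-Limiter release (3.x-style constructor), assuming a hypothetical /summarize endpoint and keying the limit off the client IP. Note that the default in-memory storage is reset when the process restarts, so a shared backend such as Redis is preferable in production:

from flask import Flask, jsonify
from flask_limiter import Limiter
from flask_limiter.util import get_remote_address

app = Flask(__name__)

# Key the limits off the client IP rather than anything stored in the
# browser, so clearing history or cookies does not reset the count.
limiter = Limiter(get_remote_address, app=app)

@app.route("/summarize", methods=["POST"])
@limiter.limit("5 per day")  # anonymous users: five calls per day per IP
def summarize():
    # ... run the summarizer here (placeholder) ...
    return jsonify({"summary": "..."})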
I have a Laravel application on the main domain, and I want to share its session with subdomain.com, which runs Magento. There will be a button on the Laravel dashboard of the main website that goes to the Magento store, and when the user lands there a session should be created automatically, like single sign-on. How can I achieve this?
There are many approaches; for example, in a microservice architecture there is often a dedicated service (app) to manage authentication. But for now:
To keep your sessions going across subdomains, you need to use
session_set_cookie_params(). With that, you can specify the cookie domain. For example...
session_set_cookie_params(10000, "/", ".main.com");
That sets the session cookie's lifetime to 10,000 seconds, for all documents under the site root, and for all subdomains of main.com.
You should call session_set_cookie_params() before you do session_start().
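Note that for the session data itself to be readable on both sides, both applications also need to use the same session cookie name and the same session storage (files on the same server, a shared database, Redis, etc.); the cookie alone only identifies the session. A minimal usage sketch, with a hypothetical cookie name:

// Run on BOTH applications, before session_start(), with the same
// cookie name and pointing at the same session storage backend.
session_set_cookie_params(10000, "/", ".main.com");
session_name("SHARED_SESSION");
session_start();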
Note: there are a bunch of methods to achieve this functionality, so keep track of your own requirements and dig deep into all the approaches.
Hi guys, I have to create a mobile app that needs to make requests to a Laravel endpoint. The app doesn't require registration or login. What is the best way to protect my API and make sure that only my application can call it?
Thanks!
There's no foolproof method of securing your API, because with the right tools and by following some tutorials on the web, anyone could view your whole API request: headers, tokens, etc.
Anything you do or store in the app is already compromised, so signatures, SSL, encryption, tokens, etc. are not that helpful if malicious users have access to the app. They can make things more troublesome for malicious users, but a dedicated one could overcome them.
Using authentication at least forces users to register before they can use your API, and you can block a user when needed. Along with requiring email verification, users who wish to misuse your API would then at least need valid email addresses. But since you mention securing without authentication, this goes out of scope.
You can secure your API somewhat by using rate limiting. Laravel has built-in rate limiting via the throttle middleware. You can use it to restrict the number of times the API can be called by an IP address in a particular time interval.
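A minimal sketch in routes/api.php (the route and controller are hypothetical); throttle:60,1 allows at most 60 requests per minute from one client:

use Illuminate\Support\Facades\Route;
use App\Http\Controllers\ItemController; // hypothetical controller

// Further requests beyond the limit are rejected with HTTP 429.
Route::middleware('throttle:60,1')->group(function () {
    Route::get('/items', [ItemController::class, 'index']);
});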
Next would be IP blocking. If any malicious activity is found, you could block the IP address. But this can be overcome with a VPN, and a malicious user could also get someone else's IP blocked this way.
Captcha can help against bots, but would also annoy regular users.
Another method would be restriction with CORS; those who have faced CORS issues know exactly how annoying it can be, but it won't work against native apps (though you could try a PWA).
And in a worst-case scenario, you could fall back on some terms and conditions and legal action.
A simple solution: you can create a devices table with an API key that is generated for each device/app install, and always send that key with requests to the API endpoint, then use it to fetch data from the REST API. It's the same process as logging in, but you use the API key instead, and the key is fixed rather than refreshed every time. A sketch is shown below.
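A minimal sketch of that idea as a Laravel middleware, assuming a hypothetical devices table with an api_key column and that the app sends the key in an X-Device-Key header (keep in mind the caveat above: a key shipped inside the app can still be extracted):

<?php
// app/Http/Middleware/VerifyDeviceKey.php

namespace App\Http\Middleware;

use Closure;
use Illuminate\Http\Request;
use Illuminate\Support\Facades\DB;

class VerifyDeviceKey
{
    public function handle(Request $request, Closure $next)
    {
        $key = $request->header('X-Device-Key');

        // Reject requests with a missing or unknown key.
        if (!$key || !DB::table('devices')->where('api_key', $key)->exists()) {
            return response()->json(['error' => 'Unauthorized'], 401);
        }

        return $next($request);
    }
}

Register it as route middleware and attach it to your API routes so every call must carry a known key.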
I have a simple app I want to create, which allows you to place any website within your Facebook page on a tab.
Previously, I could just do this without a secure canvas URL, but now it is telling me that I must have this to create the app.
Is there a way around this? The app does not take any info from anybody; it just shows a site from my server on the page.
Short answer: no. You do not need to provide an encrypted connection if the app runs in sandbox mode, but otherwise it is mandatory.
Well, actually, people using secure browsing will just see an error message at the moment, but judging from recent announcements, apps without an encrypted connection will be blocked a bit further down the road.
This isn't easy to explain, but I'll try my best.
The issue started happening on a site that was built some years ago using classic ASP. The symptom is that the administrators log in using a form and a session variable is set, but then, suddenly, when they request a new page they are prompted to log in again.
The problem isn't specific to any browser; I've reproduced it with Firefox and IE8.
Using Fiddler I can see that suddenly the server sends a new Set-Cookie header, despite a previous session cookie being sent in the request.
From that moment on, the server switches between the two sessions randomly. Neither session seems to have expired, and each preserves its own variables, but for the user it's useless, because he might be asked to log in while the form data gets processed in the already-logged-in session.
What can I try in order to track down the problem?
The server is shared hosting with IIS 6. The hosting company isn't very helpful, but the cost of moving everything elsewhere means things stay as they are.
Thanks.
Some further info:
Showing the machine name, as suggested by Aaron D., always shows the same name, but I had stored the start time of the application in global.asa:
Sub Application_OnStart()
Application("Start") = now()
End Sub
And it turns out that when I show that value on a test page, it changes as the detected session changes. So either there are two servers (with the same name) or the application is somehow running twice.
Is it possible?
I have a couple ideas but nothing definitive.
Are some requests over HTTPS and others over HTTP? Are the cookies set to only transfer over secure connections?
Are your requests alternating between a subdomain and the primary domain? For example, do some requests go to www.foo.com and others to foo.com? The cookies may not be shared between the two unless you set the domain inside the cookie. This could also happen with multiple subdomains.
This one is less common, but is the company hosting your site on multiple servers that distribute the load? You can tell by creating a page as described here:
http://mentaljetsam.wordpress.com/2008/01/29/classic-asp-code-to-print-current-server-name/
If this turns out to be the case, the solution will be to change your session state model from "InProc" to use a shared resource such as a database.
Are you sure that it switches you between sessions and doesn't just expire your session away? It could be that your app is restarting (based on your edit) and this is killing your sessions, but the cached result makes it look like it's still valid. Can you try doing hard refreshes and/or check the results with an HTTP traffic watcher like Fiddler? That might give you a better clue about what requests are actually going across the wire.