I have a database full of links used to validate the source of data, and the links often expire and become inaccessible.
Can link caching be used to access links that are no longer functional?
A few months ago I deployed this Laravel app on cPanel. Now when I access the site it shows me this message; I think the app has been hacked or malware has been added to it. Here is a screenshot of it.
Can anyone tell me what security steps I should follow to deploy the project on cPanel securely?
How can I protect my app from attackers on cPanel?
One basic thing I missed in my app: APP_DEBUG was false, and I have to set it to true.
Apart from APP_DEBUG, is there any security practice I should follow?
Or should I move from shared hosting to dedicated hosting?
This has nothing to do with your hosting or your APP_DEBUG setting. In fact, never set APP_DEBUG to true on a live website: it can leak all your environment variables, including database credentials, to the world.
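For reference, a production Laravel .env should keep debugging off; a minimal sketch (only these two values shown, the rest of the file left out):

    APP_ENV=production
    APP_DEBUG=false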
Here's an explanation from Google's documentation:
Social engineering is content that tricks visitors into doing something dangerous, such as revealing confidential information or downloading software. If Google detects that your website contains social engineering content, the Chrome browser may display a "Deceptive site ahead" warning when visitors view your site. You can check if any pages on your site are suspected of containing social engineering attacks by visiting the Security Issues report.
In your case it may be because either (as you said) the website was hacked and this content was injected into it, or (less likely) there is content you added to the site yourself that Google is interpreting as misleading (either because it is, or because it looks like it is even though it is not).
The remedy is explained on the Google page linked above (I won't include the entire text, just the gist, but do read the whole page):
Check in with Search Console
Remove deceptive content
Check the third-party resources included in your site
Request a review
If your page has been hacked, then you should probably just delete everything, change your passwords (all of them), reset the app key, and re-deploy. Also contact your shared hosting provider to tell them what happened, in case they need to be aware of any vulnerabilities or credential leaks.
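For the app key specifically, Laravel can regenerate it for you from the project root (note that this invalidates anything encrypted with the old key, such as existing sessions):

    php artisan key:generate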
Recently a person asked me why our website doesn't work without cookies. My explanation was that we need to save tokens and some references in cookies so that we can use them later to make requests, and that there are only limited options for saving data in the browser. But he wasn't satisfied with my answer, and I also think there are a few options that could make it work without using cookies/localStorage/sessionStorage.
My question is: why can't most websites work without cookies? Can we make a website work without any storage in the browser?
Using cookies allows your website to remember the user (e.g. their last login, so they don't have to log in again) and offers corresponding benefits to them and to you (e.g. tracking usage/interest, advertising). If you don't want these benefits, then of course you can deliver a website that doesn't use cookies. If the website needs a login, though, the user will have to log in on every page they view.
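To make that concrete, here is a minimal sketch in PHP (hypothetical field names, no validation): session_start() issues a session cookie, and without that cookie the server has no way to connect a later request to the earlier login.

    <?php
    // session_start() sends a session-ID cookie to the browser; on the next
    // request PHP reads that cookie back to find the same session data again.
    session_start();

    if (isset($_POST['username'])) {
        // "Log in" by remembering the user in the server-side session.
        $_SESSION['user'] = $_POST['username'];
    }

    if (isset($_SESSION['user'])) {
        echo "Welcome back, " . htmlspecialchars($_SESSION['user']);
    } else {
        echo "You are not logged in.";
    }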
I'm currently working on a rather interesting... project. I have a client who wants to allow form uploads (from a page presented on their server) specifically to their own google drive account. The platform being used is essentially LAMP.
Single (pre-authenticated) google drive account. Multiple otherwise anonymous upload sources (users).
They do not want users to be required to have their own google accounts (rules out simply using Picker on the user's own drive files).
They want some degree of backwards browser compatibility, such as IE8 (which rules out XHR forming the POST using HTML5's File API to read the file data). They don't want to use Flash etc. due to potential compatibility issues with certain mobile browsers.
What is working:
Authenticating (getting a refresh token, storing it, using it to get access tokens as needed; see the sketch after this list)
Uploading a file to the account without metadata
Result of file upload being sent to hidden iframe
Catching the iframe load event via jquery to at least know something has happened
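For reference, the refresh-token exchange in the first bullet looks roughly like this in PHP. This is only a sketch: the client ID/secret/refresh token values are placeholders, there is no error handling, and the token endpoint shown is Google's current OAuth2 one.

    <?php
    // Exchange a stored refresh token for a short-lived access token.
    function getAccessToken(string $clientId, string $clientSecret, string $refreshToken): string {
        $ch = curl_init('https://oauth2.googleapis.com/token');
        curl_setopt_array($ch, [
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_POST => true,
            CURLOPT_POSTFIELDS => http_build_query([
                'client_id'     => $clientId,
                'client_secret' => $clientSecret,
                'refresh_token' => $refreshToken,
                'grant_type'    => 'refresh_token',
            ]),
        ]);
        $response = json_decode(curl_exec($ch), true);
        curl_close($ch);
        return $response['access_token'];
    }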
Problems:
The REST API upload endpoint does not support CORS: there is no way to access the result iframe directly. (see: Authorization of Google Drive using JavaScript)
The return from a successful upload is only raw JSON, not JSONP.
There is seemingly no way to host anything with proper headers to open via browser on the googleapis.com domain, so easyXDM and similar multi-iframe with cross origin workaround communication javascript approaches are ruled out.
There is no way to embed a callback URL in the POST from the submit, the API does not allow for it.
The Picker displays errors on trying to upload if you pass it an OAuth2 token that is not for a user who is also authenticated in their browser (presumably via cookie). Strangely enough, you can show files from the OAuth2 token's matching account, but unless the browser is already logged in as that same account, any file upload fails with an ambiguous "Server rejected" message. This happens with all files and file types, including files that work in an authenticated browser instance. I assume it's an authentication flow/scope issue of some sort; I haven't tried diving into the Picker source.
All of the javascript Google Drive API upload examples seem to rely on using HTML 5 to get the file data, so anything of that nature seems to be ruled out.
While files are uploaded, there's no way other than guesstimating which file came from which user, since we can't get the file object ID from the result in our inaccessible iframe. At best we could make a very rough time based guess, but this is a terrible idea in case of concurrency issues.
We can't set the file name or any other identifier for the file (not even a unique folder) because the REST API relies on that metadata being sent via JSON in the post request body, not via form fields. So we end up with file objects in the drive with no names/etc.
We can't create the file with metadata populated server side (or via jquery/XHR, or the google javascript API client) and then update it with a form based upload because the update API endpoint exclusively works with PUT (tested).
We can't upload the files to our local server and then send them to Google (proxy them) because php.ini is locked down to prevent larger file uploads (and, per the HTML5/Flash restrictions above, we can't chunk files either).
All of this has been both researched and to varying degrees tried.
At the moment this is going on hold (at least it was a useful way to learn the API and gain a sense of its limitations) and I'm just going to implement something similar on dropbox, but if anyone has any useful input it would be lovely!
e.g. is there any way to get this working with Drive? Have I overlooked something?
I also realize that this is probably essentially a less than intended use-case, so I'm not expecting miracles. I realize that the ideal flow would be to simply allow users to upload if necessary to their own google drives and then have them grant file access to our web app (via Picker or API+our own UI), but this becomes a problem when not all of our own users are necessarily already google account users. I know that google would OBVIOUSLY prefer we get even more people to sign up with them in order to have this happen, but making people sign up for a google account to use our app was ruled out (not out of any prejudice on our part, it was just too many added steps and potential user hurdles). Even simply having them sign in to google if they did have accounts was deemed unwanted for the basic LCD feature functionality, although it's likely to be added as an additional option on top of whatever becomes the base solution.
The biggest problem with the approach you described is you're introducing a big security issue. Allowing an anonymous user to directly upload to Drive from the client requires leaking a shared access token to anyone who comes by. Even with the limited drive.file scope, a malicious or even slightly curious user would be able to list, access (read/update/delete!) any file that was uploaded by that app.
Of course a public drop box feature is still useful, but you really need to proxy those requests to avoid revealing the access token. If your PHP environment is too restrictive, why not run the proxy elsewhere? You can host a simple proxy to handle the uploading just about anywhere -- app engine, heroku, etc. and support whatever features you need to ensure the metadata is set correctly for your app.
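To make the proxy idea concrete, here is a rough server-side sketch in PHP (assumptions: Drive API v3, a getAccessToken() helper like the one sketched earlier, a form field named "upload", and no error handling or size checks). The browser posts the file to this script with an ordinary form, the access token never leaves the server, and the metadata (the file name here) can be set properly, which also addresses the "no names" problem from the question.

    <?php
    // upload.php -- receives a normal multipart/form-data POST from the browser
    // and relays the file to Google Drive, keeping the access token server-side.
    $accessToken = getAccessToken($clientId, $clientSecret, $refreshToken);

    $file = $_FILES['upload'];                   // <input type="file" name="upload">
    $metadata = json_encode([
        'name' => $file['name'],                 // Drive v3 uses "name"; v2 used "title"
    ]);

    // Build a multipart/related body: a JSON metadata part plus the raw file content.
    $boundary = uniqid('part_');
    $body  = "--{$boundary}\r\n";
    $body .= "Content-Type: application/json; charset=UTF-8\r\n\r\n";
    $body .= $metadata . "\r\n";
    $body .= "--{$boundary}\r\n";
    $body .= "Content-Type: " . $file['type'] . "\r\n\r\n";
    $body .= file_get_contents($file['tmp_name']) . "\r\n";
    $body .= "--{$boundary}--";

    $ch = curl_init('https://www.googleapis.com/upload/drive/v3/files?uploadType=multipart');
    curl_setopt_array($ch, [
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_POST => true,
        CURLOPT_HTTPHEADER => [
            "Authorization: Bearer {$accessToken}",
            "Content-Type: multipart/related; boundary={$boundary}",
        ],
        CURLOPT_POSTFIELDS => $body,
    ]);
    $result = json_decode(curl_exec($ch), true);  // contains the new file's id, name, etc.
    curl_close($ch);

    echo json_encode(['id' => $result['id'] ?? null]);

The same script could run on App Engine, Heroku, or any other host if the original PHP environment is too locked down; the browser form just needs to point at wherever the proxy lives.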
(Correct me if I'm wrong.) A server host can detect the pages that a visitor goes to before and after visiting the host's site.
To what extent can a server host receive information on what sites their client visits before and after the visit to the present page?
There are two main ways of doing this, each serving a different purpose:
If a user clicks a link on another page to go to your page, the page they came from (the referrer) will be sent in the Referer (sic) HTTP header. (See the HTTP specification.)
Most web frameworks and web-oriented languages provide an easy way to access this value, and most web analytics tools will process it out of the box. (Specifically how you go about getting at it depends on what tools you use; a PHP example follows the caveats below.) There are three caveats, though:
This header can be turned off in the settings on most browsers. (Most users don't do this, but a few tech-savvy and privacy-conscious users might.)
This only works if the user clicks a link. If the user types in the web address manually, the header won't be there.
You can only see one page immediately before the visit to your site.
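As for actually reading the header, here is what it looks like in plain PHP (a sketch; where you log or store the value is up to you):

    <?php
    // The referring page, if the browser chose to send the header; it may be absent.
    $referrer = $_SERVER['HTTP_REFERER'] ?? null;

    if ($referrer !== null) {
        // Record it however your analytics setup expects; this just logs it.
        error_log('Visitor arrived from: ' . $referrer);
    }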
If you want to see where a user travels across pages which you control, you can do this by setting a cookie with a unique value per visit, and storing each page load.
Like the above, how you go about doing this depends on what tools you use (a PHP sketch follows the caveats), and there are a few caveats:
Like the Referer header, some tech-savvy and privacy-conscious users like to browse with cookies switched off.
Obviously, you can only see visits to pages that you control yourself (and that you can set cookies on).
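A minimal sketch of that cookie-based approach in PHP (the cookie name and log file are made up for illustration; a real setup would store page loads in a database):

    <?php
    // Give each visit a random identifier, stored in a cookie.
    if (isset($_COOKIE['visit_id'])) {
        $visitId = $_COOKIE['visit_id'];
    } else {
        $visitId = bin2hex(random_bytes(16));
        setcookie('visit_id', $visitId, 0, '/');   // expires when the browser closes
    }

    // Record this page load against the visit so the path can be reconstructed later.
    file_put_contents(
        __DIR__ . '/pageviews.log',
        date('c') . "\t" . $visitId . "\t" . $_SERVER['REQUEST_URI'] . "\n",
        FILE_APPEND
    );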
Hi, we've been getting an issue in which WCM references are being generated to protected content for anonymous/unauthenticated users. In WCM this basically means that the links have a "/myconnect" (protected) context instead of a "/connect" (unprotected) context in their URL.
Now, WAS has a feature whereby, if an unauthenticated user requests a protected resource, WAS places the URL of that resource in a cookie called WASReqURL and, once the user authenticates, redirects them to that URL. In our case, users are getting redirected to an image resource or a file resource, etc. This changes depending on what the last requested protected item was, what's in the cache, and so on.
I've checked permissions on the libraries, sites, content items, components, etc., but that doesn't help. The worst part is that the issue is intermittent, almost as if it were tied to some sort of performance problem.
Thank you
Lotus WCM really does do that when it thinks the anonymous user does not have enough access rights. If you are absolutely sure that your access rights are properly defined, congratulations: you have found something worth opening a PMR for. Attach the PUMA MustGather to it.