Ubuntu Nginx 403 error when posting form data including <iframe> - laravel

I have a Ubuntu Nginx server (using laravel forge to set it up)
I am now getting 403 errors when posting form data that includes <iframe> tags, which I was not getting previously.
The form is submitted from JavaScript with $('#my-form').submit(), if that is relevant.
Other forms are working fine as long as I remove the <iframe> tags (used for YouTube embedding).

Open up the developer console and look at the details of the POST request in the Network tab or in the console itself. 4XX is the group for client errors, not server or runtime errors, so expect the issue to be in your implementation. Maybe you use some package that is supposed to "automagically" authenticate or check user permissions when accepting this specific request, and it fails because you are not passing some header or custom field? Hard to tell without more details.
Add relevant code (at the very least your form HTML) if you want more specific tips.

In this case I also had a WordPress blog installed with the Wordfence plugin running. The Wordfence configuration was enforcing security settings that prevented any website form from posting <iframe> tags.
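If the firewall rule can't be relaxed, one common workaround (a sketch, not what was actually done here) is to encode the embed markup before submitting so a rule matching a raw <iframe> in the POST body never fires; the server must then decode and sanitize the value. The field names below are made up:

```javascript
// Hypothetical workaround: base64-encode embed markup before submitting
// so a WAF rule matching raw "<iframe>" in the POST body does not fire.
// The server must decode AND sanitize the decoded value.
function encodeEmbed(html) {
  // btoa() in browsers; Buffer fallback for older Node environments.
  return typeof btoa === 'function'
    ? btoa(html)
    : Buffer.from(html, 'utf8').toString('base64');
}

function decodeEmbed(encoded) {
  return typeof atob === 'function'
    ? atob(encoded)
    : Buffer.from(encoded, 'base64').toString('utf8');
}

// Before $('#my-form').submit(), copy the visible field into a hidden
// field holding the encoded value (field ids are made-up names):
// $('#embed_encoded').val(encodeEmbed($('#embed_html').val()));
```

Server-side, decode the hidden field and run it through an HTML sanitizer before storing it; encoding alone does not make the markup safe.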

ajax not working with laravel deployment

I have a Laravel app that can be found at https://github.com/maximus1127/drive ...the file in question is drive/resources/views/auditor_pages/application_review.blade.php. When I run this on my local WAMP server, everything works fine. When I upload it to HostGator (paid hosting, not trying to go the free route), everything in the app works except for the AJAX requests. The AJAX requests even go to the same controller as other CRUD operations that are not AJAX based, and those other operations work fine, so I know the files are all connecting and talking to each other. Can someone help me please? You can log into my app by going to driveportal.net with user email "aa@aa.com" and password "password". Click "Instructors" on the left, then "Instructor Applications", then "View Details". This is all dummy data seeded from Composer. Click the second row, as the first one has altered database info which doesn't display all features.
The background check submitted/received and the "save notes" button are all AJAX, but they all produce 404 errors. Can someone please help me figure out what I'm doing wrong? I recently added some middleware to my routes and maybe that's interfering, but it still works well on my local server. I'm so confused.
Edit: I should also note that my headers are stored in the auditorDefault layout file.
For anyone interested: my AJAX "url1/2/3" variables did not account for my new route-group prefixes with the auditor middleware. When I changed the URLs to account for that, the AJAX started working.
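A sketch of that kind of fix, with made-up names: keep the group prefix in one helper so the AJAX URLs can't drift out of sync with the routes.

```javascript
// Sketch (names are assumptions): keep the route-group prefix in one
// place so every AJAX URL picks it up automatically.
const ROUTE_PREFIX = '/auditor'; // matches Route::group(['prefix' => 'auditor', ...])

function route(path) {
  // Join prefix and path without doubling slashes.
  return ROUTE_PREFIX.replace(/\/$/, '') + '/' + path.replace(/^\//, '');
}

// Instead of hardcoding url1 = '/background/submit', write:
const url1 = route('background/submit'); // '/auditor/background/submit'
```

With Blade, the same idea can be pushed server-side by emitting the URL from a named route (e.g. `url: "{{ route('...') }}"`) so moving the route group never breaks the script.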

NiFi - Issue authenticating to a page to download a file

I'm currently trying to get a .zip file from a webpage using NiFi. I am able to generate a direct download link for the file, but the application needs to log in to the page before opening the direct link. I've tried using the InvokeHTTP, ListWebDAV and FetchWebDAV processors and was not able to do this properly.
I even tried adding the login and the password as attributes, using the same IDs used by the page (logon, temp_password).
I also tried Python code, but was not able to get any good results with it.
Every time I tried any of these methods, InvokeHTTP returned a small file saying that authorization is required, and it downloaded the source code of the login page instead of the zip.
I've looked almost everywhere on the internet without much success :/
I'm now trying to get a processor to actually log into the page and keep the session alive so the invoke processor can download the zip file using the direct link.
If somebody has another idea of how I can resolve this, I will be very grateful.
I can provide more info if needed. At the moment I am using NiFi 1.1.2.
Thanks in advance.
Depending on the authentication mechanism used by the page, you'll likely need to chain two InvokeHTTP processors together. Assuming the first page has a form you fill out with the username and password, you'll make one InvokeHTTP which uses the POST method to submit the form with the provided credentials and receives a response that contains some kind of token (session ID, etc.). You then extract this value (either from a response header or from the page content) and provide it to the second InvokeHTTP as a request header. Using your browser's Developer Tools, as daggett suggested, to observe the authentication process will allow you to determine exactly where these values are provided.
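Outside NiFi, the same two-step exchange looks roughly like the following sketch. The logon and temp_password field names come from the question; everything else (URLs, cookie name) is an assumption:

```javascript
// Illustration of what the two chained InvokeHTTP processors do.
// Step 1 posts the login form; step 2 replays the session token.

function extractSessionCookie(setCookieHeader) {
  // Keep only "name=value" from e.g. "JSESSIONID=abc123; Path=/; HttpOnly".
  return setCookieHeader.split(';')[0].trim();
}

async function downloadWithLogin(loginUrl, fileUrl, user, pass) {
  // Step 1: POST the login form, as the first InvokeHTTP would.
  const login = await fetch(loginUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/x-www-form-urlencoded' },
    body: new URLSearchParams({ logon: user, temp_password: pass }),
    redirect: 'manual', // login pages often answer with a redirect
  });
  // Step 2: send the token back as a header, as the second InvokeHTTP would.
  const cookie = extractSessionCookie(login.headers.get('set-cookie') || '');
  return fetch(fileUrl, { headers: { Cookie: cookie } });
}
```

In NiFi terms, the `extractSessionCookie` step corresponds to pulling the value out with an EvaluateJsonPath/ExtractText processor (or an attribute expression on the response headers) between the two InvokeHTTP processors.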

TYPO3 workspace preview not working with forms and HTTP POST data

The following question has been asked in the #typo3-cms Slack channel:
One of our customers wants to use the workspaces feature. That's working fine, but he cannot test his forms because workspaces do not support POST requests ("POST requests are incompatible with keyword preview"). Does anyone have an idea how to make plugins that work with the POST method testable in workspaces, or any other workaround?
Explanation of the scenario and behavior
The check that prevents HTTP POST requests from being executed dates back to TYPO3 CMS 4.0 in 2006, when the workspaces feature was introduced into TYPO3 (see the accordant Git revision from back then).
Since the workspaces preview link initializes a backend user in an untrusted application context, the check was used to prevent administrative actions from being executed - today one would do that differently and use CSRF protection tokens instead.
The handling of these workspace preview links was also part of a security fix in 2016 that aimed to further remove possible security side effects in that regard (see TYPO3-CORE-SA-2016-012 for details).
There are several possibilities to preview workspace changes:
Preview link from workspace module
In the top bar of the workspace module in the TYPO3 backend, a preview link can be generated and sent to other parties that don't have credentials to access the TYPO3 backend. This mechanism is what leads to the problems with HTTP POST mentioned above.
This behavior is implemented in the class PreviewHook, in either the system extension version (up to and including TYPO3 CMS 7) or workspaces (since and including TYPO3 CMS 8). There's currently no easy way to bypass the HTTP POST check, other than granting potential previewers real but limited access to the TYPO3 backend with a valid user account.
Preview contents directly from page module
Editors that have access to the TYPO3 backend should use the regular preview mechanism of TYPO3 in the page module - this is the same for live versions and workspace changes. The only difference when working in a workspace is that the website frontend shows additional workspace-related widgets to compare changes.
Using this mechanism, the HTTP POST problems mentioned at the beginning of this answer don't occur, and forms, for example, can be used without any limitations.
If the preview-link method described above has already been used in the same browser, a cookie named ADMCMD_prev has been created. It still triggers the preview-link behavior and thus still leads to problems with HTTP POST - even if the regular preview mechanism described in this section is used. To circumvent that, the cookie has to be cleared manually in the browser.
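Clearing the cookie from the browser's developer console is usually the quickest way. A minimal sketch (the helper name is made up; ADMCMD_prev is the cookie named above):

```javascript
// Sketch: expire the ADMCMD_prev preview cookie from the browser
// console instead of digging through browser settings.
function expireCookieString(name) {
  // Assigning an already-past expiry date deletes the cookie.
  return name + '=; Path=/; Expires=Thu, 01 Jan 1970 00:00:00 GMT';
}

// In the browser console on the affected site:
// document.cookie = expireCookieString('ADMCMD_prev');
if (typeof document !== 'undefined') {
  document.cookie = expireCookieString('ADMCMD_prev');
}
```

Note that the Path must match the one the cookie was set with; if it was set for a sub-path, adjust `Path=/` accordingly.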

FTP deployed Microsoft MVC3 website. POST protocol ISN'T working. GET protocol IS working

I am developing a website using Microsoft MVC3, and have built it upon the default MVC3 Application template. It accesses an external database and works on localhost.
I have deployed it to a shared server I rent from storminternet via the publish tool using ftp method (storminternet do not yet support web deploy), and it runs well. It accesses the database okay and get requests work fine.
However, any form that submits via POST returns a 404 page-not-found error (this happens on actions where I have applied [HttpPost]).
Storm Internet assure me that POST and GET are allowed by default, and since the helpdesk are not developers, I'm unsure who to turn to. I don't have an excellent understanding of web.config, although I can read and understand XML and see what's going on by reading through and googling. I have tried adding the protocols to the root web.config, but I think I might be barking up the wrong tree.
Has anyone else had this problem, or might anyone know how to help me?
To replicate my error, my site is here... 213.229.125.117/$sitepreview/ase-limited.com/Dev (sorry it isn't blue. The dollar gets parsed to % something)
and the quickest route to a POST request is to click 'Add Building' at the top of the left-hand side and then click 'Save' at the top of the dialogue box.
Any help will be gratefully received. I've been stuck on this for days without luck.
Best Regards
Nick
STOP-PRESS-STOP-PRESS-STOP-PRESS-STOP-PRESS-STOP-PRESS-STOP-PRESS-STOP-PRESS-
It turned out to be a known issue with sitepreview. Switching to the proper domain sorted everything.
I have noticed that you have some 404 javascript errors when performing your AJAX requests. For example you have a request to:
http://213.229.125.117/$sitepreview/ase-limited.com/BuildingManager/Employees/2
instead of:
http://213.229.125.117/$sitepreview/ase-limited.com/Dev/BuildingManager/Employees/2
Notice how /Dev is missing. That's because in your javascripts you have hardcoded your urls instead of using url helpers to generate them. For example you wrote something like this:
$.ajax({
    url: '/BuildingManager/Employees/2',
    // ...
});
which works fine on localhost because you don't have a virtual directory name but doesn't work when you deploy on your server because now the correct path is:
$.ajax({
    url: '/Dev/BuildingManager/Employees/2',
    // ...
});
For this reason you should absolutely never hardcode urls like that.
And when I try to POST the form, it tries to post to http://213.229.125.117/Dev/BuildingManager/SaveBuilding, which seems a very weird url as it is missing the whole beginning. Once again: never hardcode urls. Always use url helpers.
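One way to follow that advice from plain .js files is to emit the application root once from the server (e.g. via @Url.Content("~/") in the Razor layout) and join every AJAX path against it. The data attribute and helper below are made-up names, a sketch rather than the canonical MVC approach:

```javascript
// Sketch: the layout emits the app root once, e.g.
//   <body data-base-url="@Url.Content("~/")">
// and the script joins paths against it (helper name is made up).
function appUrl(base, path) {
  // Join "/Dev/" + "BuildingManager/Employees/2" without double slashes.
  return base.replace(/\/+$/, '') + '/' + path.replace(/^\/+/, '');
}

// const base = $('body').data('base-url'); // "/Dev/" on the server, "/" locally
// $.ajax({ url: appUrl(base, 'BuildingManager/Employees/2'), ... });
```

For URLs used inside a Razor view itself, `@Url.Action("Employees", "BuildingManager")` generates the correct virtual-directory-aware path directly, with no client-side joining needed.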

get feedburner feed on httpS

We are grabbing our feed from FeedBurner using the jQuery jGFeed plugin.
This works great until the moment our users are on an https:// page.
When we try to load the feed on that page, the user gets a message that there is mixed content, secure and insecure, on the page.
A solution would be to load the feed over https, but Google doesn't allow that; the certificate doesn't work:
$.jGFeed('https://feeds.feedburner.com/xxx')
Does anyone know a workaround for this? The way it functions now, we simply cannot serve the feed in our pages when on https.
At this time Feedburner does not offer feeds over SSL (https scheme). The message that you're getting regarding mixed content is by design; in fact, any and all content that is not being loaded from a secured connection will trigger that message, so making sure that all content is loaded over SSL is really your only alternative to avoid that popup.
As I mentioned, Feedburner doesn't offer feeds over SSL, so realistically you'll need to look into porting your feed to another service that DOES offer feeds over SSL. Keep in mind what I said above, however, with respect to your feed's content as well. If you have any embedded content that is not delivered via SSL then that content will also trigger the popup that you're trying to avoid.
This comes up from time to time with other services that don't have an SSL cert (Twitter's API is a bit of a mess that way too.) Brian's comment is correct about the nature of the message, so you've got a few options:
If this is on your server, and the core data is on your server too, then you've got end-to-end SSL capabilities; just point jGFeed to the local RSS feed that FeedBurner's already importing.
Code up a proxy on your server to marshal the call to FeedBurner and return the response over SSL.
Find another feed service that supports SSL, and pass it either the original feed or the FeedBurner one.
I have started using the paid WordPress theme Schema for several of my blogs. In general, it is a nice theme, fast and SEO friendly. However, since my blogs are all on HTTPS, I noticed that if I had a Google FeedBurner widget in the sidebar, Chrome would show a security error for any secure page with an insecure form call on it.
To fix this, it is really simple:
edit the file widget-subscribe.php located at /wp-content/themes/schema/functions/ and replace all occurrences of "http://feedburner.google.com" with "https://feedburner.google.com".
Save the file and clear the cache, and your browser will show a green padlock.
I fixed this on my blog, www.androidloud.com.
