I load a local HTML file from my Windows 7 filesystem:
file:///C:/Users/...etc.../myfile.html
Inside it, I load an existing file relative to the directory of myfile.html:
....load("../common/events.json");
Firefox refuses it, with this error in the console:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote
resource at file:///C:/Users/...etc.../common/events.json?timeshift=-60. (Reason: CORS request not http).
With a link to https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS/Errors/CORSRequestNotHttp
So I set privacy.file_unique_origin to false in about:config and restarted Firefox: same issue.
NB: all is OK with... IE 11!
You could start your own local server:
python3 -m http.server
which tells you the port (e.g. Serving HTTP on 0.0.0.0 port 8000 (http://0.0.0.0:8000/)).
Then enter something like the following in the browser address bar:
http://0.0.0.0:8000/C:/Users/...etc.../myfile.html.
The path is relative to the location where the server was started.
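For example, a minimal sketch (the folder names are placeholders, and on Windows the launcher may be python or py rather than python3): start the server in the parent directory that contains both the folder holding myfile.html and the common folder, so that the ../common/events.json request stays inside the served tree:

cd C:\Users\...etc...
py -m http.server 8000

Then browse to http://localhost:8000/, click through to myfile.html, and the JSON is fetched over HTTP instead of file://.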
The security feature you disabled only controls access to files in the same directory as the HTML document or below it.
Accessing files in other directories (i.e. if your relative path starts with ../ or you use an absolute path) is always forbidden.
I have worked on a few Laravel projects. I use PhpStorm as my IDE, which I have set up to work with Xdebug.
This is the first time I have used a Laravel symbolic link to link the public/storage directory to the storage/app/public directory.
After this, Xdebug stopped hitting breakpoints, even though it seems to be configured the same as always.
I have tested Xdebug in another Laravel project in PhpStorm and it was working, so it is not related to PhpStorm. I also opened my current project (the one with the symbolic link) in IntelliJ IDEA, but Xdebug was not working there either, so there is something fishy in my project only.
After reading the JetBrains Xdebug troubleshooting guide, I came to know that I have to use path mappings in
Settings | Languages & Frameworks | PHP | Servers.
Now the problem is that I really don't know which path should be mapped to which folder, and I found no related solution on the internet (maybe I'm bad at googling).
Here is my failed attempt:
I just tried to map the storage directory (in public) to the public directory (in storage/app).
Please guide me regarding this.
xdebug.log file
[3172] Log opened at 2020-08-25 17:39:58
[3172] I: Checking remote connect back address.
[3172] I: Checking header 'HTTP_X_FORWARDED_FOR'.
[3172] I: Checking header 'REMOTE_ADDR'.
[3172] W: Remote address not found, connecting to configured address/port: http://127.0.0.1:8000/:9000. :-|
[3172] W: Creating socket for 'http://127.0.0.1:8000/:9000', getaddrinfo: 0.
[3172] E: Could not connect to client. :-(
[3172] Log closed at 2020-08-25 17:39:58
[7416] Log opened at 2020-08-25 17:43:53
[7416] I: Checking remote connect back address.
[7416] I: Checking header 'HTTP_X_FORWARDED_FOR'.
[7416] I: Checking header 'REMOTE_ADDR'.
[7416] I: Remote address found, connecting to ::1:9000.
[7416] E: Time-out connecting to client (Waited: 200 ms). :-(
[7416] Log closed at 2020-08-25 17:43:53
I had this same problem. Path mappings are indeed the answer.
Instead of putting a mapping on the location of the symlink (public/storage), you need to put the mapping on the target of the symlink (storage/app/public). This needs to be mapped to the absolute path on the server (which you identified as C:/xampp/htdocs/HRMS/storage/app/public).
This is made somewhat easier if you can put a breakpoint on a line which calls a script within the symlinked directory. When you hit that breakpoint and then "step into" the called function, PhpStorm will complain that:
remote path <something> is not mapped to any file path in project
Click to set up path mappings
Click on that, find the storage/app/public directory in your project files and type in the <something> from the debugger message. Hit OK, and all should be good.
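In other words, the mapping in Settings | Languages & Frameworks | PHP | Servers ends up looking roughly like this (a sketch; <project root> is a placeholder, and the server-side path is the C:/xampp/htdocs/HRMS one mentioned above):

File/Directory (project)                    Absolute path on the server
<project root>/storage/app/public          C:/xampp/htdocs/HRMS/storage/app/public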
I have a website template that I uploaded to my FTP server using FileZilla. However, when I visit the domain I get the error:
Directory Listing Denied
This Virtual Directory does not allow contents to be listed.
In the console it displays a 403 Forbidden error. After researching, I realize that the default web page for the root directory is not set. All my attempts to set the default page have failed. Here is what I tried:
1) Logged into the FTP server with FileZilla. Clicked File > Site Manager > Advanced and set the root directory. The root directory contains the file index.html.
2) Created a .htaccess file in the root that contains the text "DirectoryIndex index.html"
Solutions involving IIS are welcomed as well.
Any advice on how I can get this fixed?
Unfortunately I did not have permission to access the actual server files, so I contacted the hosting company.
They resolved my issue, I believe using the approach #alvits recommended; however, they have not responded to my request to confirm how they solved it.
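For anyone who does have file access and hits the same thing: on IIS the default document is usually set with a web.config in the site root, roughly like this sketch (not necessarily what the host actually did):

<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <defaultDocument enabled="true">
      <files>
        <clear />
        <add value="index.html" />
      </files>
    </defaultDocument>
  </system.webServer>
</configuration>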
Thanks for the support
I use Let's Encrypt and get this error:
urn:acme:error:unauthorized :: The client lacks sufficient authorization :: Error parsing key authorization file: Invalid key authorization: malformed token
I try: sudo service nginx stop
but get the error: nginx service not loaded
So I had a lot of trouble with this stuff. Basically, the error means that certbot was unable to find the file it was looking for when testing that you own the site. This has a number of potential causes, so I'll try to summarize, because I encountered most of them when I set this up. For more reference material, I found the GitHub readme much more useful than the docs.
First thing to note is that the nginx service needs to be running for the ACME authorization to work. It looks like you're saying it's not, so start by spinning that up:
sudo service nginx start
With that going, everything here is based on the file location of the website you're trying to create a certificate for. If you don't know where that is, it will be in the relevant configuration file under /etc/nginx; the exact location depends largely on your version of NGINX, but it is usually /etc/nginx/nginx.conf, /etc/nginx/sites-enabled/[site-name], or /etc/nginx/conf/[something].conf. Note that the configuration file (or at least its directory) should be listed in /etc/nginx/nginx.conf, so you might start there.
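If you are not sure which file the active server block lives in, one quick way (on reasonably recent nginx) is to dump the whole effective configuration and search it:

sudo nginx -T | grep -E 'server_name|root'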
That site directory is an important folder, because it is the folder that certbot needs to modify. It creates some files in a nested folder structure, and the URL it then tries to read from must return the data from those files. The folder it tries to create will be under the root directory you give it, in the subfolder:
/.well-known/acme-challenge
It will then try to create a file with an obscure name (I think it's a GUID) and read that file back from the URL. Something like:
http://example.com/.well-known/acme-challenge/abcdefgh12345678
This is important, because if your root directory is poorly configured, the URL will not match the folder and the authorization will fail. And if certbot does not have write permission to the folders when you run it, the file will not be created, so the authorization will also fail. I encountered both of these issues.
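A quick way to check both of those before running certbot is to create a throwaway file by hand and fetch it the same way the CA would (a sketch: /home/example matches the root in the example config below, and test.txt is just a made-up name):

sudo mkdir -p /home/example/.well-known/acme-challenge
echo test | sudo tee /home/example/.well-known/acme-challenge/test.txt
curl http://example.com/.well-known/acme-challenge/test.txt

If the curl does not print test, the root or the port 80 configuration still needs fixing.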
Additionally, you may have noticed that the above URL is http, not https. This is also important. I was using an existing encryption tool, so I had to configure NGINX to let me serve the /.well-known folder tree over port 80 instead of 443, while still keeping most of my data behind the secure https URL. These two things make for a somewhat complicated NGINX file, so here is an example configuration to reference.
server {
    listen 80;
    server_name example.com;

    location '/.well-known/acme-challenge' {
        default_type "text/plain";
        root /home/example;
    }

    location '/' {
        return 301 https://$server_name$request_uri;
    }
}
This allows port 80 for everything related to the certbot challenges, while retaining security for the rest of my website. You can modify the directory permissions to ensure that certbot has access to write the files, or simply run it as root:
sudo ./certbot-auto certonly
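If you would rather tell certbot exactly where that webroot is instead of letting it work it out, the webroot plugin takes the path and domain explicitly (the values here are illustrative):

sudo ./certbot-auto certonly --webroot -w /home/example -d example.com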
After you get the certificate, you'll have to set it up in your config as well, but that's outside the scope of this question, so here's a link.
I have installed Elasticsearch and Logstash 1.4 from the Debian repository. They are working and collecting logs from another device that forwards syslog.
I followed the Kibana install guide, but I am getting an error message: Connection Failed, with a hint to check that ES is running or to ensure that http.cors.enabled: true is set.
In the console I am getting this error:
Cross-Origin Request Blocked: The Same Origin Policy disallows reading the remote resource at http://'127.0.0.1':9200/_nodes. This can be fixed by moving the resource to the same domain or enabling CORS.
I have added this to my elasticsearch.yml:
http.cors.allow-origin: "http://192.168.1.1"
http.cors.enabled: true
That IP is the host's own IP, since all three ELK apps run on the same host.
Any suggestions?
EDIT:
I got it working by adding Header set Access-Control-Allow-Origin "*" right before the closing </VirtualHost> tag in sites-enabled.
I also had to link to the module:
ln -s /etc/apache2/mods-available/headers.load /etc/apache2/mods-enabled/
For these configs, you'll need to sudo or be root.
First, make sure you have the following lines in elasticsearch.yml (usually at /etc/elasticsearch/elasticsearch.yml):
http.cors.allow-origin: "http://192.168.1.1"
http.cors.enabled: true
(don't worry if the rest of the file is all commented out--the defaults should be fine)
The rest of the configs are for Apache, so go to the apache directory. For example:
cd /etc/apache2
In your enabled sites folder, add a "Header set" option. On a simple system, this may be the file at /etc/apache2/sites-enabled/000-default.conf. Inside the VirtualHost directive (perhaps after the line that sets DocumentRoot), add:
Header set Access-Control-Allow-Origin "*"
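For context, the end result looks something like this sketch (ServerName and DocumentRoot are placeholders; only the Header line matters here):

<VirtualHost *:80>
    ServerName kibana.example.com
    DocumentRoot /var/www/html
    Header set Access-Control-Allow-Origin "*"
</VirtualHost>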
For this to work, you also need to enable the headers module. Do:
cd /etc/apache2/mods-enabled
ln -s ../mods-available/headers.load
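On Debian-based systems the same module can usually be enabled with the helper script instead of a manual symlink:

sudo a2enmod headers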
Finally, don't forget to reload or restart the Apache server (reload if you can't stand a 1 second downtime). For example, on a sysvinit-style system:
service apache2 reload
or
service apache2 restart
Then don't forget to refresh the page in your browser.
I'm having the following problem when I try to access my projects on localhost from the Sites directory:
Not Found
The requested localhost/~barrymcmahon/php_5_advanced/ was not found on this server.
I know Apache is working because I get a response from localhost [It works!], and I can curl it with headers using [curl --head localhost] and get a 200, but when I try to use the projects in my Sites folder I get the above error.
I have amended the httpd.conf file and the users/username.conf file to point to the correct directory.
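For context, the kind of per-user config I mean (/etc/apache2/users/barrymcmahon.conf) looks roughly like this on Apache 2.4, which is what Yosemite ships; I am not certain every directive in mine is right:

<Directory "/Users/barrymcmahon/Sites/">
    Options Indexes MultiViews FollowSymLinks
    AllowOverride All
    Require all granted
</Directory>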
I noticed this in the browser headers when I get the 404 (don't know if it will help):
Remote Address:[::1]:80
Is it possible that there is a different format for the localhost path on Yosemite OS?