I have a .json file and I'm using AJAX to get its data, but I don't want anyone to be able to see or use the file directly.
The following doesn't work: with it in place, not even my own AJAX requests can reach the file.
<Files "*.json">
Order Deny,Allow
Deny from all
Allow from localhost
</Files>
I want the file to be usable only by my website itself; nobody should be able to see or use it, and even visitors to my website shouldn't be able to view it directly.
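One thing worth understanding here: there is no reliable way to let your own AJAX through while blocking everyone else, because the AJAX request is sent from the visitor's browser and looks to the server like any other request. If the goal is to block direct HTTP access to the .json file and serve its data through a server-side script instead, a minimal sketch in Apache 2.4's `Require` syntax (which replaces the older `Order`/`Deny` directives) might look like this:

```apache
# Block all direct HTTP access to .json files (Apache 2.4+ syntax).
# A server-side script reading the file from disk is unaffected,
# because that access never goes through HTTP.
<FilesMatch "\.json$">
    Require all denied
</FilesMatch>
```

Treat this as a sketch: your server-side endpoint that reads the file and returns its data is something you would still have to write yourself.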
I am hosting multiple websites on the same server. Instead of uploading the same pictures for each website into individual folders, I would like to make ONE main folder on the server that all websites get their images from, so I don't end up with duplicates.
I've tried everything but cannot seem to get it working. Can anyone help me out?
Hosting on Ubuntu 16.04 with Apache2.
My host file:
Alias "/product-image" "/var/www/uploads"
<Directory /var/www/mysite.com/>
Options Indexes FollowSymLinks MultiViews
AllowOverride All
Order allow,deny
allow from all
</Directory>
So basically what I want is when my SRC goes to:
mysite.com/product-image/ferrari/f1.jpg
it should be served from
/var/www/uploads/ferrari/f1.jpg
Tried multiple tutorials but nothing worked so far.
P.S. When I go to the URL mysite.com/product-image I would expect to see my uploads folder, but instead I get an error:
Not Found
The requested URL /product-image was not found on this server.
Apache/2.4.18 (Ubuntu) Server at bedrijfskledinggroothandel.nl Port 443
If your multiple websites are set up as subdomains in your hosting (as mine are), each website at run time can only see the files in or below its own subdirectory; the hosting puts this layer of security in place.
If this is your own server rather than external hosting, the same may well apply, but you may be able to override this part of the configuration if you want to. (To me, the point of this security layer is to stop users realising the content all lives in the same place and trying to take advantage of that in some way.)
You could, however, get to what I think is your objective by putting an HTTP redirect (not a file-level one) in place via .htaccess, so that the subdomain interprets https://my-website.com/Images (or whatever) as https://www.my-main-domain.com/central-image-directory. That would do the trick, I think.
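There may also be a simpler fix for the configuration shown in the question: the `<Directory>` block there covers `/var/www/mysite.com/` but not the Alias target `/var/www/uploads`, and on Apache 2.4 an alias target without its own access grant is refused. A sketch of what the vhost could look like instead (paths taken from the question; adjust to your layout):

```apache
Alias "/product-image" "/var/www/uploads"

# The alias target needs its own <Directory> block;
# without "Require all granted" Apache 2.4 rejects the request.
<Directory "/var/www/uploads">
    Options Indexes FollowSymLinks
    AllowOverride None
    Require all granted
</Directory>
```

With this in place, a request for mysite.com/product-image/ferrari/f1.jpg should be served from /var/www/uploads/ferrari/f1.jpg.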
I've put my Laravel site online (just for testing), but when I go to, for example, www.mysite.nl/.env it shows my password etc. for my database. How can I prevent this?
It should be mentioned that use of .env files is intended to be for development only, not production.
Once you're ready to take the site live, the values that you put in the .env file should be moved to the server environment variables.
This should be more secure for two reasons:
The problem you've discovered, that the .env file is accessible, will no longer apply, since there will be no more .env file. Plus, this won't require any server configuration changes (.htaccess files or similar) to restrict access to the .env file.
Server environment variables will not be accessible to anyone without shell access to the server.
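As an illustration, on Apache the values could be set in the virtual host instead of a .env file. Treat the snippet below as a sketch: the variable names mirror Laravel's defaults, and the values are placeholders.

```apache
# Inside the <VirtualHost> block. These values become visible to PHP
# (e.g. via getenv('DB_PASSWORD')), so no .env file is needed.
SetEnv APP_ENV production
SetEnv DB_DATABASE myapp
SetEnv DB_USERNAME myapp_user
SetEnv DB_PASSWORD "a-long-random-secret"
```

One caveat worth knowing: values set this way can show up in debugging output such as phpinfo(), so make sure that is disabled in production.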
Also keep in mind that the .env file should not be reachable by users; only the contents of the public/ folder should be. Set your server configuration accordingly (not always possible, though).
Otherwise, for your production environment you can simply omit the .env file and define all the settings directly in app/config/.
Some hosts also provision their servers with Forge.
Remember to always add your .env file to .gitignore if you are using one.
Have a nice day
Removing .env alone doesn't guarantee that your code is safe. The only reason this can happen in the first place is ignoring the recommended default structure, which is to give the web server access to the public directory only.
Now, let's say you removed the .env file. Are you sure that:
Nobody can access storage/logs/laravel.log (or the daily rotating log files)?
Your cache and session data are safe, if you're using the file-based drivers?
Nobody can peek at your compiled Blade views under storage/framework/views?
Don't ever skip or compromise security to work around a web-hosting limitation.
To hide .env on an Apache server, add the code below at the top of your .htaccess file. It will also disable the directory listing and hide .gitignore, the webpack config, and more. Hope that helps.
# Disable directory listing
Options -Indexes
# Block files that need to be hidden; list the file extensions to match
<Files ~ "\.(env|json|config.js|md|gitignore|gitattributes|lock)$">
Order allow,deny
Deny from all
</Files>
# List full file names here, separated by '|'
<Files ~ "(artisan)$">
Order allow,deny
Deny from all
</Files>
Be careful, the accepted answer skips over an important detail: even if the .env file is removed, the directory it resides in should not be accessible in the first place.
Instead of attempting to deny access to a list of files or folders as some people suggest, the web server's Document Root should be set to the /public folder in a Laravel application. All requests will then be mapped safely starting from /public, which means that the source files of your application are not accessible from the web unless you explicitly give access to them.
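A minimal Apache vhost sketch of that setup (paths and the server name are examples, not part of the question):

```apache
<VirtualHost *:80>
    ServerName example.com
    # Only public/ is exposed; .env, storage/ and app/ live one level
    # above the document root and are unreachable over HTTP.
    DocumentRoot /var/www/laravel-app/public

    <Directory /var/www/laravel-app/public>
        AllowOverride All   # lets Laravel's own .htaccess handle rewrites
        Require all granted
    </Directory>
</VirtualHost>
```

With this layout, a request for /.env never maps to a file at all, so there is nothing to deny.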
Someone hacked my WordPress site and I cannot access FTP, cPanel, or the admin area.
I contacted the hosting company and they sent me a new password so I can access FTP, but I still cannot get into the admin area or cPanel.
How can I solve this problem? And How can I prevent this in the future?
I saw that there are some plugins like "Better WP Security". Is it enough to prevent future attacks?
Thanks for your help
Using Better WP Security is an option, yes, but above all use strong passwords and always keep your plugins, and WordPress itself, up to date.
Do not store your passwords on your computer as (text) files; try to remember them. I know that sounds hard to do, but it is the only safe way.
Also check all computers from which you log into the administration area for viruses, Trojan horses, and keyloggers.
This was for prevention.
Now, how to deal with the current situation. It depends, but the best way is to disable (and remove) all plugins and start with a clean WordPress installation. The posts and pages are in the database, so you should not lose any information, but make a backup of all of your files (and custom page templates, if any) first.
A lot depends on how your host manages security for WordPress and other PHP CMSes. A common way admin and cPanel accounts get compromised is a symlink attack. First, check all the permissions on the host (who can change and modify files); second, use a strong .htaccess in your main index directory. Also check every directory in your account: if any PHP shell exists there, delete it immediately.
There are certain key points that you can use to make your website more secure.
First, check your site on sucuri.net to get more info on malware, spam, etc.
1. Use security plugin
I recommend Wordfence, which has lots of features and is able to:
Scan against 44k+ malware definitions
Detect phishing attempts
Remove shells, backdoors, and Trojans
Monitor DNS security, and much more...
Better WP Security (aka iThemes Security) is also a good plugin to secure your WP installation, and it also has great features.
(Both plugins work together, no doubt.)
Comparison of Better WP Security and WordFence
2. Secure your .htaccess
Secure wp-config.php
<Files wp-config.php>
order allow,deny
deny from all
</Files>
Disable directory browsing
# directory browsing
Options All -Indexes
Protect .htaccess itself
<Files .htaccess>
order allow,deny
deny from all
</Files>
Disable hot linking
RewriteEngine on
RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http(s)?://(www\.)?YourDomain [NC]
RewriteRule \.(jpg|jpeg|png|gif)$ - [NC,F,L]
3. Protect yourself
Use strong passwords and never share them with anyone
Guard yourself against social engineering
Tips from the Codex (Hardening WordPress)
4. Keep everything updated.
Use up-to-date versions of WordPress, plugins, and themes.
I want to prevent user access to my "~/Content/..." folder. I wrote the following in "Global.asax.cs" and put this line of code above all other routes:
routes.IgnoreRoute("Content/{*pathInfo}");
but it does not work. In fact, the user can see every file in the Content folder by typing its URL into the browser.
Am I missing something?
How did you figure out that it does not work? Give an example.
You may have put it last in the routing table, so try moving it up so that it gets added to the routing table first. The route collection is an ordered list of routes.
Also try this: routes.IgnoreRoute("Content/");, though your version of the ignore is also correct and should work.
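In code, the ordering point above might look like the following sketch, assuming the usual ASP.NET MVC Global.asax.cs / RouteConfig layout (route names here are the framework defaults, not taken from the question):

```csharp
public static void RegisterRoutes(RouteCollection routes)
{
    // IgnoreRoute entries must be registered before MapRoute,
    // because routes are matched in the order they were added.
    routes.IgnoreRoute("{resource}.axd/{*pathInfo}");
    routes.IgnoreRoute("Content/{*pathInfo}");

    routes.MapRoute(
        name: "Default",
        url: "{controller}/{action}/{id}",
        defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional }
    );
}
```

If an IgnoreRoute comes after a MapRoute whose pattern also matches the request, the ignore never gets a chance to apply.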
Lastly, I'm not sure what I should make of "the user can see all the contents of the Content folder". Isn't that the point? Users must be able to download files from the folder; we usually just need MVC to ignore those requests so that IIS can serve the files directly.
Or did you mean that directory browsing is enabled and you want to disable it? In that case, go to IIS Manager, select your website, find the Directory Browsing option, and disable it as shown here.
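If it is only directory listing you want to stop, that can also be done in web.config rather than through the IIS Manager UI. A sketch (this assumes the directoryBrowse section is not locked at a higher level of the IIS configuration):

```xml
<configuration>
  <system.webServer>
    <!-- Stops IIS from listing folder contents; individual files
         in ~/Content are still served directly by their URLs. -->
    <directoryBrowse enabled="false" />
  </system.webServer>
</configuration>
```

Placed in the site root, this applies to the whole site; placed inside the Content folder, it applies only there.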
Your problem cannot be solved by routing constraints. There are three significant steps in processing a request:
IIS receives the request.
IIS looks at the filesystem for a file directly corresponding to the URL.
If IIS doesn't find such a file, it hands the request to ASP.NET MVC for processing.
So you need to configure folder security to forbid direct access to files while still allowing access to the application, as here.
But I don't recommend securing a folder that is meant to be shared; I don't believe your site has no images to display :) If you have some secured content, create a separate folder for it.
My current hosting company doesn't allow me to place any content above the server root, so I have no way to protect my config.php files from evil people. I know a way to stop them being accessed by browsers (fake 404 messages), but it's very easy to get past that.
Do you know any other way to protect files from users while still allowing PHP scripts to access them?
If your host allows the use of .htaccess files, you can add a file called .htaccess to the desired directory with the content:
deny from all
Then nobody can access the files in that directory over HTTP (but your PHP interpreter can still read them from disk).
You could give those files a custom extension, or protect them by name with an .htaccess rule denying access. Something like the following:
<Files config.php>
Order allow,deny
deny from all
</Files>
If you're using a config.php, just set some variable in your main script, like $include_config, and then check for it in the config file itself. If the variable is not set, call die(), and nothing at all will be output by config.php when it's requested directly.
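A sketch of that guard pattern ($include_config is just the example name from the answer; file names are illustrative):

```php
<?php
// config.php: refuses to reveal anything when requested directly.
if (!isset($include_config)) {
    die(); // direct browser access: output nothing at all
}

$db_password = 'secret'; // only reached when included by a script

// In the including script (e.g. index.php) you would write:
//   $include_config = true;
//   require 'config.php';
```

Note this only hides the file's values from direct requests; if the server ever serves .php files as plain text (e.g. a misconfiguration), the source is still exposed, so the .htaccess approach above is a useful second layer.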