Joomla 1.5 site keeps getting blocked by host

I have a Joomla 1.5.26 site which I have had since Aug 2012. It has been stable since then, with no changes to components etc. It is firewalled with RS Firewall and all the other usual security precautions have been taken.
During the past few weeks the site has started to be blocked by the hosting company that holds the site, who claim that there are too many active connections. I have hunted through the site, disabled various components, etc., and am still getting the same problems.
Has anyone experienced similar issues? I am thinking of moving the site to a more reputable host for Joomla sites to see if it is more robust elsewhere. I just can't understand why this keeps happening. The host blocks the IP address of any machine we use to administer the site once the connections get too many, and then we are locked out for about fifteen minutes. As I said previously, nothing has changed on the site, and I cannot find any evidence of the files or database being hacked.
Any ideas?
Much appreciated
James

If your host is telling you that there have been too many connections, it most likely means one of the following three things:
Your site has reached its maximum monthly bandwidth allowance
Your site has too much traffic for your current host and might need to be moved to a VPS or something more powerful
Your host is plain crap
Check whether you have reached your maximum monthly bandwidth allowance (if you have one); otherwise, if your site isn't an extremely popular site generating thousands of users a day, I would probably recommend transferring to a different host. If the host actually means database connections rather than bandwidth, the sketch below shows how to see what is holding them open.
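On the off-chance "connections" means MySQL connections (an assumption; the host didn't say), a minimal PHP sketch to list what is currently connected. The host/user/password/database values are placeholders: copy the real ones from your Joomla site's configuration.php.

```php
<?php
// List active MySQL connections to see what the host may be counting.
// Credentials below are placeholders; take the real values from
// your Joomla configuration.php.
$db = new mysqli('localhost', 'db_user', 'db_pass', 'joomla_db');
if ($db->connect_error) {
    die('Connect failed: ' . $db->connect_error . "\n");
}
// Note: without the PROCESS privilege (typical on shared hosting),
// this only shows your own account's connections - which is exactly
// what the host is likely counting against you.
$result = $db->query('SHOW FULL PROCESSLIST');
while ($row = $result->fetch_assoc()) {
    $query = isset($row['Info']) ? $row['Info'] : '';
    printf("%s\t%s\t%ss\t%s\n", $row['Id'], $row['User'], $row['Time'], $query);
}
$db->close();
```

If this shows long-running or piled-up queries, a misbehaving component or a bot hammering the site is the more likely culprit than the host.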

Related

Should strange site-visitor user agents be blocked? If so, how?

I am using shared hosting. My site was showing "ERR_CONNECTION_REFUSED", so I went to look at the visitors to my (SSL) site. I found that, instead of regular names in the "User Agent" list, the cPanel visitors list is showing a user agent that reads: "Expanse indexes the network perimeters of our customers. If you have any questions or concerns, please reach out to: scaninfo#example.com".
I want to know whether this is harmful and, if yes, how to avoid such unknown user agents. Is there something I should do with the ".htaccess" file?
Once again, I am using shared hosting (so I have limited access).
The ERR_CONNECTION_REFUSED you saw when accessing your website had nothing to do with the visitor you saw in cPanel. You might have had a different issue with your server configuration or shared-hosting provider.
That "visitor" was an internet crawler, most likely from Palo Alto Networks, who owns Expanse. Long story short, it shouldn't cause any harm. They say that their crawlers are used to index/categorize URLs around the internet and/or to spot malicious content.
I advise you to ignore it, since there's not much you can do - I assume they have some ranges of IPs for their crawlers so you wouldn't be able to blacklist all of them anyway.
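That said, since you asked about ".htaccess": if you still want to turn a specific crawler away, most shared hosts do honour .htaccess files. A minimal sketch, assuming mod_rewrite is enabled (matching on "Expanse" is just this example's target; adjust to the agent you want to refuse):

```apache
# Refuse requests whose User-Agent contains "Expanse" (case-insensitive).
# Assumes mod_rewrite is available, which is common but not guaranteed
# on shared hosting.
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Expanse [NC]
RewriteRule .* - [F,L]
```

Bear in mind the point above still applies: crawlers like this rotate IPs and can change their user-agent string, so blocking is mostly cosmetic.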

Best practice to store App Key in Laravel

I have been doing a lot of research on this and I can't seem to find a definitive answer. Obviously security is a big issue these days; major companies that invest millions into security are still getting hacked.
I work with Laravel a lot and use shared hosting with Hostgator or some similar company of high repute. Laravel comes with a built-in function for encrypting database info and decrypting it for the user when requested.
However, I have a question about how secure this ACTUALLY is. If someone gets into my cPanel, my app key, which is used for encryption, is right there in front of them. Granted, my cPanel password is the one that's auto-generated by Hostgator, and it's complete gibberish with semicolons and alphanumeric strings all over, so it's not easy to guess.
But I'm trying to learn a little bit more about security. If my app key in my .env file is locked securely behind my cPanel login, is Laravel's built-in "encrypt()" method "enough" to call an app "secure"? Are there other measures within Laravel or my host provider that could make it more secure than just tight passwords? Is there some sort of practice of referencing the app key from an external source that's not located in the cPanel area, so that even if my cPanel got hacked, my app key wouldn't be in those files and get exposed?
I'm not a security expert, but there are a few points I can share from my experience working at highly secured companies.
First, Laravel itself is fine. You can generally trust open-source software, since it's transparent and security bugs get discovered and addressed early. So you don't need to improve Laravel; just use it as is, preferably an LTS version.
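To make that concrete, the built-in encryption the question refers to is symmetric AES keyed by APP_KEY, exposed through the Crypt facade (or the encrypt()/decrypt() helpers). A minimal sketch; the value being encrypted is just an example:

```php
<?php

use Illuminate\Support\Facades\Crypt;

// Laravel's built-in symmetric encryption, keyed by APP_KEY.
// If the key leaks, everything encrypted with it can be decrypted,
// so protecting the key matters more than the cipher itself.
$encrypted = Crypt::encryptString('some sensitive value'); // example value
$plain     = Crypt::decryptString($encrypted);             // back to plaintext
```

This is why the key, not the cipher, is the weak point. On hosts that allow it, you can set APP_KEY as a real server-level environment variable instead of keeping it in the .env file, so a leaked file listing doesn't expose it.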
Then, cPanel is a liability. You should minimize the externally accessible weak points on your system. Get a VPS or a private server and access it via SSH; don't run tools like cPanel and phpMyAdmin on it. The less software you have that talks to the outside world, the less vulnerable you are to bugs in that software.
In my current company the production server can only be accessed via SSH from a single IP address, the address of the dev server. So I log in to dev server first, and then log in from there to the prod. It denies all connections from all other IPs.
If you are limited to using cPanel or something similar, consider protecting the login page with HTTP Basic Auth; some hosting providers allow that, as in the sketch below.
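A minimal sketch of that, assuming your host honours .htaccess overrides; the password-file path is a placeholder:

```apache
# Protect a directory with HTTP Basic Auth.
# The AuthUserFile path is a placeholder; use an absolute path outside
# your web root, created with: htpasswd -c /home/youruser/.htpasswd user
AuthType Basic
AuthName "Restricted"
AuthUserFile /home/youruser/.htpasswd
Require valid-user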
You also want to keep your system and software up to date - but not too new either, as the newest versions may have bugs that haven't been caught yet. Our devops prefer to stay a couple of minor versions behind, so that the community has time to test things out and get hacked for you.
That's all I know as a web dev; sure enough, there are specialized tools and DDoS-protection services, but those are beyond a dev's concern IMO. If you just follow these steps, you should be safe. Hope that helped a bit, cheers :)

Random people in different places seeing old broken version of site

A few months ago, I launched a new version of a site for a client on a new server with a different IP address. It works fine for 90% of people, but some people in different geographical locations have been reporting seeing the old (broken) site. No amount of clearing caches on the browser or server side makes any difference. Could it be that an ISP (Charter?) is caching the site? The new site is on a WordPress Multisite install and uses MaxCDN (I've cleared all of these caches several times).
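One mechanism worth ruling out (a guess from the symptoms, not something confirmed in the question): when a site moves to a new IP, resolvers and ISP DNS caches that still hold the old record keep sending some visitors to the old server, and no amount of page-cache clearing helps. A quick check, as a sketch; the domain is a placeholder:

```php
<?php
// Run this (or any DNS lookup) from an affected network and compare
// the result with the new server's IP. If it prints the OLD server's
// IP, the problem is stale DNS, not browser or CDN caching.
echo gethostbyname('www.example.com'), "\n";
```

If stale DNS is the cause, the fix is on the DNS side (lower the record's TTL well before a move, and decommission or redirect the old server), not on the caching layers.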

numsessions limit hits on Parallels

I hope someone can help me figure out this issue.
I have a Windows-based VPS with 6 GB of RAM and enough disk space.
I have only 3 websites hosted, and all three are not advertised publicly, so hardly anyone accesses them.
The issue is that the server is slow to respond whenever we try to load the sites in a browser, connect over RDP, or use the Parallels Plesk panel. Everything is slow to respond.
Every one to three minutes I get a lot of numsessions limit hits, going from the green zone to the red zone.
I have browsed SO, read the Parallels docs, and even browsed their forum, and no one has mentioned a real solution. They say that numsessions is hit when many RDP or Plesk panel sessions are left open. In my case no one has access to the server and no one is logging in to it either. I have rebooted the server many times with only one session open (the one controlling the server via Virtuozzo, i.e. Parallels Power Panel), and still the numsessions limit is hit again within 3 minutes of the reboot.
I have talked to the idiots at 1and1 (where we bought the VPS XXL) and they have no clue, saying it is not their problem but mine, or Microsoft Windows'! I have not installed any third-party or even proprietary software on the server that could cause the issue. The server is brand new, and I have only created new sites via the Plesk panel. Email is not working either.
The Windows Event Viewer doesn't show much information either.
The last resort is to re-image the server, which may solve the issue, but I doubt it, since the problem seems to have been present from the moment we bought the server.
Could anyone shed some light on this, please?
Thanks
Just noticed my resource log is full of these as well. I think the issue is that a session is counted as soon as an RDP connection is made, so bots trying common admin passwords count towards this.
The real issue is that there is no way (that I can find) to filter these out of the resource alerts, so you basically can't spot the real problem you have; the logs are just full of numsessions hits.
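If password-guessing bots really are what drives the counter up (an assumption; neither post confirms the cause), one mitigation is to scope RDP to the addresses you actually connect from, using the built-in Windows firewall. A sketch; the rule name varies by Windows version and the IP is a placeholder:

```bat
rem Scope the built-in Remote Desktop rule to a single admin IP.
rem "Remote Desktop (TCP-In)" is the usual rule name on Server 2008+;
rem check yours with: netsh advfirewall firewall show rule name=all
rem 203.0.113.5 is a placeholder for your own static IP.
rem Run this from the provider's console (e.g. Parallels Power Panel),
rem not over RDP, in case you lock yourself out.
netsh advfirewall firewall set rule name="Remote Desktop (TCP-In)" new remoteip=203.0.113.5
```

With the noise gone from port 3389, any remaining numsessions hits should point at the real culprit.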

Enough bandwidth to support

I have a client that is paying $1500 per month for hosting of 1 website (1 domain name; email is hosted elsewhere). The website is pretty low-traffic - like, 100 unique visitors a week. The only catch (and why it is so expensive) is that their database is 15 GB and is replicated from the hosting company to inside my small company's office.
Inside the office, there is a desktop application that hits the internal database quite a bit. From the website, some data is entered into THAT version of the database. Replication keeps both databases in sync on a schedule of every 5 minutes.
My client has a T1 that runs into their office. I want to knock out the hosting provider altogether, host their website from a server they already have (more than capable of handling this website), and dump the replication. This would save them $1500 per month, and for a company of 5, it would really make a difference to them.
Assuming I already have a backup strategy in place (way to move a copy of the DB offsite every day), what are the problems with this?
Support? They can reboot their server as easily as the hosting provider can.
What if server goes down for good? There is a duplicate that I can bring up in a couple of hours, and that is all the level of service they really require.
What am I missing here? I want to save them money, but I don't want to screw them over...
EDIT: Some of the answers and comments make it clear that I myself wasn't clear. My client (Company A, not a hosting provider) is paying Company B to host their website. The website has a database (MS SQL Server 2000) that is 15 GB. That SQL Server DB is being replicated back to a server at Company A.
Company B is charging Company A $1500 per month for this service.
Company A already has a T1 for connectivity to the internet. They are located inside of a run of the mill business park.
I am proposing doing away with any outside hosting, getting a DNS provider to point the website to Company A's static IP and hosting the website on a server inside Company A. Then there would be no need for any replication at all, and they wouldn't be paying company B $1500 per month.
I hope that explains it. I'm going to re-read and comment on all the current answers.
Really, any advice is very appreciated.
Sounds to me like your only risk in moving the server in-house is if your T1 goes down. If you have a backup strategy in place for that, go for it.
The other option is to co-lo your own server with your own SQL Server licence on it. Hosting companies charge a lot for hosting SQL Server databases because they have to pay per-CPU licensing for it. So they build up a powerful server to serve lots of clients' databases, but SQL Server offers no way to do usage accounting, so the only way they can bill/screw you is on database size.
It sounds like the traffic is low enough on your site that you can get a dual-core server and a 1-CPU licence of SQL Server for a one-off cost of a few thousand dollars, and then you're only paying the monthly co-lo price.
A hosting provider can monitor the server 24x7. What if the server crashes at 8 pm? The people at the small company are not working around the clock, are they?
Depends on the service this DB is providing. What are the requirements to its uptime?
Database replication isn't that expensive in bandwidth terms - well, assuming you're not doing a hot copy of the entire DB files across the link, that is.
Check out log shipping, or any of the supported replication options that will replicate the DB using minimal bandwidth. (you never said what the DB was, so I can't comment further there)
I would move to the new server and keep replication. At the very least, if you're really worried about data loss, then get another server in the same facility and copy across to that one - even if you copy 15 GB every 5 minutes, it'll be using non-chargeable bandwidth without even going outside the switch they're connected to.
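To make the log-shipping suggestion concrete (the question's edit says the database is MS SQL Server 2000, which supports this): only transaction-log backups cross the wire, not the full 15 GB. A hand-rolled sketch; the database name and share path are placeholders:

```sql
-- On the primary: back up only the transaction log (requires the
-- database to be in the FULL recovery model).
BACKUP LOG ClientDb
    TO DISK = N'\\standby\backups\ClientDb_20100101.trn';

-- On the standby: apply the log, leaving the database able to
-- accept further log restores.
RESTORE LOG ClientDb
    FROM DISK = N'\\standby\backups\ClientDb_20100101.trn'
    WITH NORECOVERY;
```

Scheduled every few minutes, this gives you a warm standby for a fraction of the bandwidth of copying the database files themselves.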
