Joomla Hacked and redirecting to dndelectric website - joomla

I have searched all my files for what could be redirecting or injecting this malware into my sites, to no avail. Has anyone had this before? It's spread across multiple sites I host. The common component between them is jevents, but even its code has not been changed.
Is there a way I can block these requests from .htaccess?

You really need to take your site offline - actually remove the files from the server and identify what type of hack this is.
First though I would suggest turning off javascript in your browser and visiting your page - do you still get redirected?
If not - then the problem is either:
a) a javascript file has been added to your site - or an existing javascript file has been edited. Examine all the .js files loading in the page.
b) an sql injection has added javascript directly into your articles (perhaps each and every article)
Assuming you ARE redirected while javascript is turned off - then you are looking at either:
a) an edited .htaccess file redirecting you elsewhere
b) an edited (or 'included') php file setting headers and taking you elsewhere.
Are any of the add-ons in your site(s) listed here:
http://www.exploit-db.com/search/?action=search&filter_page=1&filter_description=joomla&filter_exploit_text=&filter_author=&filter_platform=0&filter_type=0&filter_lang_id=0&filter_port=&filter_osvdb=&filter_cve=
You need to know whether it is only your site that is compromised, whether other sites on the server are too, or even whether the server itself has been taken over - and that is a question for your host. Immediate re-infection after rebuilding from a backup could mean:
a) a cron job has been set up to re-infect you at a set period (a quick check for this is shown after this list)
b) another account on the server is infected and is reaching out to re-infect others
c) your site was compromised previously (files dropped within the site) but these have sat dormant, waiting for either an individual or a botnet to connect and take control.
d) or that the server is totally compromised and the hacker just reconnects to re-infect
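If you have shell access, listing your own account's cron jobs only takes a second (your host will need to check the other accounts and the system crontabs):
crontab -l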
There are some steps you could take - but frankly this is one area where if you need to ask it is probably a sign you aren't equipped to deal with the issue without expert assistance.
You could grep your files for likely patterns: c99, r57, web shell, eval(base64_decode(, etc.
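For example, a rough sketch run from the site root (extend the patterns with whatever else you suspect):
grep -rlF --include='*.php' -e 'eval(base64_decode(' -e 'c99' -e 'r57' .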
You could scan for files with recent creation dates or recent modified dates/times
Files changed in the last x days (1 day in this case)
find . -mtime -1
Files changed between two dates
find . -type f -newermt "2010-01-01" ! -newermt "2010-06-01"
You should scan log files for suspicious activity
You could download the files and let your anti-virus program scan them - this can give you a place to start (don't let it delete the files though as their contents can give further clues).
You should block access from known automated / scripted user agents (wget, libwww, etc.)
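To answer the .htaccess question directly, blocking those user agents can be done with mod_rewrite along these lines (a sketch only - extend the list from what actually shows up in your logs):
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (wget|libwww|curl) [NC]
RewriteRule .* - [F,L]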
All in all though you could spend days battling this with no guarantee of success. My advice would be to get some assistance from a Joomla security expert.

For your next installation, you might also want to consider installing a host-based intrusion detection system such as OSSEC. It offers several security features, including file integrity checking, which would have detected the tampering with your .htaccess files. OSSEC is free and open source.
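As an example, a minimal syscheck entry in ossec.conf could look like this (assuming your web root is /var/www - adjust to your own paths):
<syscheck>
  <directories check_all="yes">/var/www</directories>
</syscheck>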

You'll probably have to compare directories carefully and, if you can, compare against the original installations of the extensions you used. Check directory permissions - if you see something with 777 or unusually high permissions, that could be a good clue to where the problem is originating. Check error logs too; they may point to something missing or something that has changed and is now throwing an error.
You want to try and identify the problem/malware/virus because it will help you with the next part.
Find the last backup before you were hacked that is a good valid copy, and go from there. Best bet is to completely get rid of the current site and restore from backup (fully) - assuming the backup doesn't contain the virus/malware.
Sometimes there will just be one or two lines of code added to existing files that cause the redirect - these are tough to track down and identify, but you can do it if you compare file sizes and the like. It's just time consuming. I hope this information helps - good luck.
Also, it sounds like your whole server is compromised if it's spreading - do not use the same password for multiple sites, or the default username (admin). Always change the passwords and the username. If you leave the username default as 'admin', hackers already have 50% of the login figured out. Make it tough on them by changing the user name. Alert your host you've been hacked - they will help in these cases, and can keep it from spreading any further than your account. Change your passwords with that host, change the passwords for each site (preferably after you've cleaned the malware/virus).

I experienced this same .htaccess hack on my shared hosting account. I had 5 sites running Joomla! versions 1.5 through 2.x. After hours of experimenting with permissions and every other conceivable way to stop the malicious .htaccess files from regenerating, I found that two of my live Joomla! installs had mysterious .php files in the 'tmp' directory [joomla_root/tmp]. One file was named something like 'jos_AjnJA.php' and the other was 'j.php'. I changed the permissions on those two files to 000, then once again restored my original .htaccess files to their respective folders. Presto! The .htaccess files finally did not get maliciously rewritten within minutes, as before. 24 hours later and everything is still working as it should on all my Joomla! installs.
I cannot stress enough: I am sure there are variations of this exploit, but do yourself a huge favor and first check all tmp folders on your Joomla! installs for any suspicious .php files!
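A quick way to list any PHP files sitting in those tmp folders across all the installs under one account (adjust the path - this is just a sketch):
find /path/to/your/sites -type f -path '*/tmp/*.php'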

Related

Are there any ways to monitor all HTTP protocols and block certain ones using a single script on Windows?

I want to write a program that can monitor all system HTTP/HTTPS protocols used to open the default browser, block certain ones, and automatically change certain requested URLs into others. The process of changing a URL is simple, but the monitoring and blocking part is quite puzzling.
e.g. When clicking on the URL 'https://example.com/asdf.htm', the request will be blocked by the program and the Windows system will receive the command for 'http://www.example2.org/asdf.htm' instead, so the latter URL will be opened by the default browser instead of the former.
I am an amateur developer and student who does not have much experience in solving such problems.
I searched the web and found someone asked a similar question years ago:
https://superuser.com/questions/554668/block-specific-http-request-from-windows
However, I didn't find any useful advice on coding in the page. Maybe I can use an antivirus program to block certain URLs, or change the hosts file to block them, but then the URL replacement cannot be done. Admittedly, pointing the hosts file at a server of my own that redirects certain requests might work, but that's too complex. I hope someone can help me solve the problem by suggesting a simple method of monitoring the Windows system itself. Thanks!
To summarize our conversation in the comments: in order to redirect or restrict traffic, whether to sites or to ports (protocols are actually "mapped" via ports), the main solutions usually are:
a software firewall - keep in mind that software firewalls don't usually redirect, they just permit or block traffic via ports
a hardware firewall (or an advanced router - not the commercial ones, but enterprise grade) - they do what you want, but they are very expensive and not worth it for a home experiment
a proxy server - this can do what you want
Other alternatives that might or might not work include editing the hosts file, as you said - but as stated earlier, I don't recommend it: it's a system file, and if you forget about it, it can become a hindrance (also keep in mind that normally you should not use a Windows user with admin rights even at home, but that is another story). A browser extension is another option, though I would guess it only changes content on pages, not the way the browser works (such as changing URLs).
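For completeness, a blocking entry in the hosts file is just a line like the one below (hypothetical domain); note that it can only make a whole hostname unreachable or point it at a different IP - it cannot rewrite the path part of a URL, which is why it doesn't fit your use case:
0.0.0.0 example.com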
I think a proxy server is the best pick here. Try it and let me know.
Keep in mind I still recommend you read about networking in order to get a better idea of what you can and can't do in each setup.

Server is serving old versions of static files, but trimmed or padded to match length of new versions

The symptoms of my problem match this question pretty much exactly: Changed static files are cropped/padded to the new size and served the old - Fedora
Whenever I make changes to my static files (e.g. .js and .css), those changes don't show up in the served file. However, if my changes cause the file to change length, then the served file does match the new length:
If I delete characters from anywhere in the static file, then the served file is trimmed at the end by that many characters.
If I add characters to anywhere in the static file, then the served file is padded with that many � (that's U+FFFD) characters at the end.
The reason the linked answer doesn't solve my problem is that I'm not using Apache. I'm writing a Python web app for Heroku, so I'm using gunicorn and Flask (and therefore Werkzeug). The linked answer says that disabling sendfile in Apache solved the problem, so I tried setting the 'USE_X_SENDFILE' variable to False, as per this page, but it didn't help. I also set 'SEND_FILE_MAX_AGE_DEFAULT' to 1 in a further attempt to prevent some sort of caching from happening, and that didn't help either.
If it's not obvious, I really don't know much at all about configuring web servers, so having run out of relevant Google hits, I'm at a loss as to what might be causing this. Help?
VirtualBox does not play well with sendfile. If you turn it off, it should work.
For example, in nginx you would add sendfile off;.
In Apache it's just EnableSendfile off.
In the end it turned out that this was probably being caused by VM issues. The code in question was located on a Windows drive, but was being served from within a Linux VM that was accessing the code via a shared directory. Remounting the share seemed to fix the problem (although not necessarily reliably).

My Joomla! site loads too slowly

My Joomla! site loads very slowly and sometimes returns this error:
Fatal error: Maximum execution time of 30 seconds exceeded in D:\Hosting\6926666\html\libraries\joomla\environment\request.php on line 11
Note that the file and line (D:\Hosting\6926666\html\libraries\joomla\environment\request.php on line 11 in this case) always change - it is not the same path every time.
Hint: locally my site works very well; the problem happens when both the site and the database are on the server, or when only the database is on the server.
My site is Joomla! 1.6 and my host server is godaddy.com
I changed to another template and the website sped up considerably. I then did some investigating and found that the template's index.php had been HACKED and there was eval() and base64() code in there! As soon as I put in a "clean" version of the PHP, the website was back to normal!
I agree with Jav_Rock: GoDaddy is not "compatible" with Joomla.
So, to solve your problem, you should find another hosting provider!
Are you using any third-party extensions? If so, how many? The more you use, the more likely it is that one of them has introduced an 'issue'.
I'd like to suggest that you review the code and database structure associated with each extension to determine where there might be a problem. I'm currently doing this with a fairly simple site that is using a LOT of extensions (so many that it breaks the admin UI). Some of them are terribly written - e.g. making many database lookups when 1 is required, writing to the database with every single request, storing several values in a single column, etc.
However, you say you're a 'beginner', so I think your best bet is either:
Get someone who is not a beginner to do the above review for you
Experiment by enabling/disabling each extension in turn to determine which are problematic. That's not very scientific, but might get you somewhere.
If you can, try enabling debug (you can do that via the Global Configuration). This will show you - amongst other things - how many database queries are required to generate your page. I can't give you an absolute figure for what is 'good' or not, suffice it to say that fewer is better! So, for example, if you identify a low-priority extension whose removal saves you 25% of the total, you might well decide to disable that module - and, possibly, look for an alternative. As a guide, the site I'm working on originally made 500-600 queries for the home page. In my opinion that is far, FAR too many.
For:
Fatal error: Maximum execution time of 30 seconds exceeded
first create a php.ini file containing only the line
max_execution_time=120
(that sets the execution time to 120 seconds, i.e. 2 minutes - adjust to suit) and put that file into your Joomla administrator folder.
Or you can edit php.ini directly if you have access to it.
Also enable caching for the required modules.

Building a file upload site that scales

I'm attempting to build a file upload site as a side project, and I've never built anything that needed to handle a large amount of files like this. As far as I can tell, there are three major options for storing and retrieving the files (note that there can be multiple files per upload, so, for example, website.com/a23Fc may let you download a single or multiple files, depending on how many the user originally uploaded - similar to imgur.com):
Stick all the files in one huge files directory, and use a (relational) DB to figure out which files belong to which URLs, then return a list of filenames depending on that. Example: user loads website.com/abcde, so it queries the DB for all files related to the abcde uploads, returns their filenames, and the site outputs those.
Use CouchDB because it allows you to actually attach files to individual records in the DB, so each URL/upload could be a DB record with files attached to it. Example, user loads website.com/abcde, CouchDB grabs the document with the ID of abcde, grabs the files attached to that document, and gives them to the user.
Skip out on using a DB completely, and for each upload, create a new directory and stick the files in that. Example: user loads website.com/abcde, site looks for a /files/abcde/ directory, grabs all the files out of there, and gives them to the user, so a database isn't involved at all.
Which of these seems the most scalable? Like I said, I have very little experience in this area, so if I'm completely off or if there is an obvious fourth option, I'm more than open to it. Having thousands or millions of files in a single directory (i.e., option 1) doesn't seem very smart, but having thousands or millions of directories in a directory (i.e., option 3) doesn't seem much better.
A company I used to work for faced this exact problem with about a petabyte of image files. Their solution was to use the Andrew File System (see http://en.wikipedia.org/wiki/Andrew_File_System for more) to store the files in a directory structure that matched the URL structure. This scaled very well in practice.
They also recorded the existence of the files in a database for other reasons that were internal to their application.
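To illustrate, one way to make the on-disk layout mirror the URL while keeping any single directory small (my own sketch, not necessarily exactly what that company did) is to shard by a prefix of the upload ID, e.g. for an upload at website.com/a23Fc:
files/a2/a23Fc/photo1.jpg
files/a2/a23Fc/photo2.jpg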
I recommend whichever solution you can personally complete in the shortest amount of time. If you already have working CouchDB prototypes, go for it! Same for a relational-oriented or filesystem-oriented solution.
Time-to-market is more important than architecture for two reasons:
This is a side project, you should try to get as far along as possible.
If the site becomes popular, since the primary purpose is file upload, you are likely to rebuild the core service at least once, perhaps more, during the life of the site.
If you are going to use ASP.NET, here is an article that describes how to use Distributed File System for a web farm: http://weblogs.asp.net/owscott/archive/2006/06/07/DFS-for-Webfarm-Usage---Content-Replication-and-Failover.aspx

Downloading large files to PC from OAS Server

We have an Oracle 10g forms application running on a Solaris OAS server, with the forms displaying in IE. Part of the application involves uploading and downloading files (Word docs and PDFs, mainly) from the PC to the OAS server, using Oracle's webutil utility.
The problem is with large files (anything over 25 MB or so): it takes a long time, sometimes many minutes. Uploading seems to work, even with large files. Downloading large files, though, will cause it to error out part way through the download.
I've been testing with a 189 MB file in our development system. Using WEBUTIL_FILE_TRANSFER.Client_To_DB (or Client_To_DB_with_Progress), the download would error out after about 24 MB. I switched to WEBUTIL_FILE_TRANSFER.URL_To_Client_With_Progress, and finally got the entire file to download, but it took 22 minutes. Doing it without the progress bar got it down to 18 minutes, but that's still too long.
I can display files in the browser, and my test file displayed in about 5 seconds, but many files need to be downloaded for editing and then re-uploaded.
Any thoughts on how to make this uploading and downloading faster? At this point, I'm open to almost any idea, whether it uses webutil or not. Solutions that are at least somewhat native to Oracle are preferred, but I'm open to suggestions.
Thanks,
AndyDan
This may be totally out to lunch, but since you're looking for any thoughts that might help, here are mine.
First of all, I'm assuming that the actual editing of the files happens outside the browser, and that you're just looking for a better way to get the files back and forth.
In that case, one option I've used in the past is just to route around the web application using Apache, or any other vanilla web server you like. For downloading, create a unique file session token, remember it in the web application, and place a copy of the file, named with the token (e.g. <unique token>.doc), in a download directory visible to Apache. Then provide a link to the file that will be served via Apache.
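The Apache side of that can be as simple as an alias onto the staging directory (a sketch in Apache 2.2-era syntax, with hypothetical paths):
Alias /transfers /var/www/transfers
<Directory /var/www/transfers>
    Options -Indexes
    Order allow,deny
    Allow from all
</Directory>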
For upload, you have a couple of options. One is to use the mechanism you've got, then when a file is uploaded, you just have to match on the token in the name to patch the file back into your archive. Alternately, you could create a very simple file upload form separate from your application that will upload the file to a temp directory via Apache, then route the user back into your application and provide the token in the URL HTTP GET-style or else in a cookie.
Before you go to all that trouble, you'll want to make sure that your vanilla web server will provide better upload and download speed and reliability than your current solution, but it should.
As an aside, I don't know whether the application server you're using provides HTTP compression, but if it does, you should make sure it's enabled and working. This is probably the best single thing you can do to increase transfer speed of large files, assuming they're fairly compressible. If your application server doesn't support it, then most any vanilla web server will.
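If that front end is Apache, mod_deflate is the usual way to turn compression on, e.g. (a sketch; note that PDFs are already compressed internally, so the gain is mainly for formats like .doc):
AddOutputFilterByType DEFLATE text/html text/plain application/msword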
I hope that helps.
I ended up using CLIENT_HOST to call an FTP command to download the files. My 189MB test file took 20-22 minutes to download using WEBUTIL_FILE_TRANSFER.URL_To_Client_With_Progress, and only about 20 seconds using FTP. It's not the best solution because it leaves the FTP password exposed on the PC temporarily, but only for as long as the download takes, and even then the user would have to know where to find it.
So, we're implementing this for now, and looking for a more secure but still performant long term solution.
