All my JavaScript files have code added at the bottom - CodeIgniter

My website uses CodeIgniter. Today I found that some code has been added at the bottom of all my website's JavaScript files, including the jQuery file. The code looks like this:
/*4fd970*/
You are blocked by day limit
/*/4fd970*/
My folder permissions are set to 755.
Why is this code being added to my files? Has someone hacked my site, or is it caused by my server?

This is some kind of virus; it happened to me too. Only the index.php and index.html files got modified, right? I think this is a password stealer - it steals your FTP passwords from FileZilla or some other FTP client and then automatically modifies the index pages.

Yes, someone hacked your site... though not very well. The same thing happened to me, but when I looked closer, similar code had also been added to EVERY HTML file (not at the end, but somewhere in the middle of the page):
<!--0f2490--><script type="text/javascript" language="javascript" > You are blocked by day limit</script><!--/0f2490-->
As best my web host and I could determine, someone made a failed attempt to insert some kind of malware into the code. It was formatted in such a way (see code above) that it wasn't very apparent. If you select all (Ctrl+A), it shows up much more easily.
Some additional info from my web host:
http://sitecheck.sucuri.net/
web site: xxxxxx
status: Site infected with malware
web trust: Not Blacklisted
Malware entry: MW:JS:ENCODED:BADINJECTED1
Description: We identified a suspicious javascript block that results from a failed attempt to inject malicious content (blackhole injection). The site was compromised, but due to an error by the attackers, the malware was not properly added.

Yes you were hacked, and so was I.
You should first change your server access password. Someone probably managed to get his hands on it and uploaded the malware to your server.
Remove all the infected files and upload backup versions.
In my case, my antivirus told me my .js and .php files were infected with Exploit:JS/Blacole.BW. You could also see those "4fd970" markers littering the code. In some cases the hacker will change the code to make your users download malware. If you only have a few scripts, you can restore a backup version of all of them. If deleting everything is not an option, you can diff against the previous version and you should be able to find out what was changed.
Check for any file that shouldn't be there.
I also had an .htaccess file added to my server that made visitors download the Blacole malware. I replaced it with a proper one.
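To help with the cleanup steps above, here is a minimal sketch of a script that walks a site root and reports files containing the injected marker comment. The marker string and the extension list are assumptions based on the infection described in this thread; substitute whatever marker appears on your own site.

```python
import os

def find_infected(site_root, marker="4fd970"):
    """Return paths of site files that contain the injected marker string."""
    infected = []
    for dirpath, _dirnames, filenames in os.walk(site_root):
        for name in filenames:
            # Only check the file types this infection was seen in.
            if not name.endswith((".js", ".php", ".html", ".htaccess")):
                continue
            path = os.path.join(dirpath, name)
            try:
                with open(path, encoding="utf-8", errors="ignore") as f:
                    if marker in f.read():
                        infected.append(path)
            except OSError:
                pass  # unreadable file; inspect it manually
    return infected
```

Running it from the site root gives you a list of files to diff against backups or restore outright.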


XHR request shows pending for a certain file format in all browsers on my machine

On my Windows machine I run a simple file server that serves files from a folder, and I access those files via the Chrome/Firefox browsers.
For a certain file format (in my case, ".bin" files) the XHR request always stalls with the status "pending". But if I rename the file extension to ".cbin" and reload the page, it works.
Why are the browsers preventing a certain file from being loaded? All of this worked a month ago without issues (i.e. loading the .bin files). I have also tried disabling my antivirus.
Any help would be invaluable. Thanks.
After hours of searching the web and experimenting, I realized that browsers have recently added rules that block downloads of certain file types (exe, dmg, zip, gzip, bin, etc.) over an HTTP connection, for security reasons.
Hope this will help someone who faces the same issue.
You can read more about this issue here and here.
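The rename trick from the question can be automated. Below is a small sketch that renames every ".bin" asset in a folder to a ".cbin" extension that the browser doesn't treat as a blocked download type. The extension mapping is an assumption based on the asker's observation; serving the site over HTTPS is the more robust fix.

```python
import os

def rename_blocked(folder, old_ext=".bin", new_ext=".cbin"):
    """Rename files with a browser-blocked extension to a safe one."""
    renamed = []
    for name in os.listdir(folder):
        if name.endswith(old_ext):
            src = os.path.join(folder, name)
            dst = os.path.join(folder, name[: -len(old_ext)] + new_ext)
            os.rename(src, dst)
            renamed.append(dst)
    return renamed
```

Remember to update any XHR URLs in the page to request the new extension.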

How to modify an old Joomla website to remove a dangerous link flagged by Google

A client told me his old website, running on Joomla, was flagged by Google for having links to a malicious website. The website was blocked with the typical red security warning in Google Chrome. I redirected the website to a temporary page, but my client wants to bring back the old website while we work on something new.
However, my local machine and server are running Windows Server. I have the original files of the website and the database. Is there a quick way I can remove the links (the Google tool only mentions the website "mosaictriad.com") from the Joomla site on my machine? I've tried doing a Ctrl+F for mosaictriad.com in the SQL file but didn't find anything.
Thanks for your opinion on what I should do next; the objective is simply to quickly clear the website of the security warning and send it back to the people managing his old server.
PS: I don't have direct access to his server, only the files associated with his Joomla website.
Additional details given by Google:
Some pages on this website redirect visitors to dangerous websites that install malware on visitors' computers, including: mosaictriad.com.
Dangerous websites have been sending visitors to this website, including: navis.be and umblr.com.
Yes, there is a way. Register in Google Webmaster Tools, add your site, and submit your sitelinks. Then ask Google to rescan your website; they will remove the warning within 24 hours if the scan result is negative for malware.
Running the virus scanner on your local machine over the files may detect some of the malicious files.
Alternatively, restore the website to a temporary folder on the web and use a commercial scanning service to help identify and clean the website. I use and recommend myjoomla.com but there are other services such as sucuri.net.
I think your strategy is wrong - you should quickly clean up the website (try overwriting the core files with files from a fresh Joomla install) and then secure it. Once you do that, contact Google through Webmaster Tools with a reconsideration request (this typically takes a few days to process if it's a first offense). Once Google approves the reconsideration request, the red flag should be removed and the website should be accessible to everyone.
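One possible reason the Ctrl+F search in the SQL dump found nothing is that injected links are often stored base64 encoded. As a sketch (the domain comes from the question; the site path is whatever folder you extracted the files into), you could search the files for both the literal domain and its base64 form:

```python
import base64
import os

def find_flagged_domain(root, domain="mosaictriad.com"):
    """Return paths of files containing the domain, plain or base64 encoded."""
    needles = [domain.encode(), base64.b64encode(domain.encode())]
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                with open(path, "rb") as f:
                    data = f.read()
            except OSError:
                continue
            if any(n in data for n in needles):
                hits.append(path)
    return hits
```

Any file this flags is a candidate for the injected link; compare it against a fresh Joomla install of the same version.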

How to remove files from Varnish-cache

I'm developing a game in JS/PHP. When I first uploaded my project, it contained a file named "index.html" with nonsensical content (only the word "bla" and a Facebook like button). I later deleted that "index.html" so that requests to the domain would hit my "index.php" instead (which contains the actual game).
This happened over a week ago, and I still see people (friends I asked to test the game) getting this dumb "index.html" when they open the site in their browsers. I also see this happening in roughly a third of the browsers when requesting screenshots via browserstack.com or browsershots.org.
I'm assuming the index.html is still cached by cloudControl's Varnish cache, but I can't find any way to clear this cache for my site. How can I do this, or what else can I do to get rid of the cached version?
For anyone who wants to test this live: http://dotgame2.cloudcontrolled.com/ (note that this doesn't happen always, or for everyone).
Consider using cache breakers dependent on the deployment version. You can also try our *.cloudcontrolapp.com routing tier, which does not provide caching at all - http://dotgame2.cloudcontrolapp.com.
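A minimal cache-breaker sketch for the suggestion above: append a version query parameter to each asset URL so the Varnish cache treats every deployment as a new object. The version string is an assumption - it could be a build number or git hash injected at deploy time.

```python
from urllib.parse import parse_qsl, urlencode, urlparse, urlunparse

def cache_bust(url, version):
    """Append a ?v=<version> parameter so caches see a fresh URL per deploy."""
    parts = urlparse(url)
    query = parse_qsl(parts.query)
    query.append(("v", version))
    return urlunparse(parts._replace(query=urlencode(query)))
```

For example, linking to `cache_bust("http://dotgame2.cloudcontrolled.com/index.php", "42")` yields a URL ending in `?v=42`, which the cache has never seen before.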

Files are not changing when I update them via FTP

I made some changes to a CSS file, uploaded it, and saw no change. I cleared my browser's cache, repeated the process, and still nothing. I also tried another browser and then experimented with other files - all with the same result. I then deleted the CSS file altogether - the website still looks the same, and I can still see the file in the browser's console.
I can only get results if I actually change the file names altogether (which is really inconvenient). I don't think there is an issue with FTP overwriting the file, as there are no errors in FileZilla's logs.
Is there another way a website can cache itself? Would anyone know why this is occurring?
EDIT:
I also tried this in cPanel's File Manager and viewed it on another PC - same result
Squid and other web accelerators often sit between a hosted server and your browser. Although they are supposed to invalidate their caches when the backing file changes, that information isn't always sent to specification or acted on properly.
Indeed, there can be multiple caches between you and the server each of which has a chance of hanging onto old data.
First, use Firebug or "Inspect Element" in Chrome.
Verify that the CSS file the browser is loading is actually the file you think it should load.
Good luck.
Browsers can cache things; did you try Shift+F5 on your webpage to force a reload of everything?
Maybe the main server has caching configured across other servers; check with your IT department. If this is the case, you need to ask them to invalidate the cache on all of the caching servers.
I had the same issue with FileZilla. To solve it, you need to clear the FileZilla cache or change the names of the files you are uploading:
Open FileZilla and click on the Edit menu.
Choose Clear Private Data.
In the new dialog box, check the categories you'd like to clear: Quickconnect history, Reconnect information, Site Manager entries, Transfer queue.
Finally, click OK to validate.
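If an intermediate cache turns out to be the culprit, one way to rule it out is to tell caches not to reuse the stylesheet without revalidating. A hedged sketch of an Apache `.htaccess` fragment, assuming the host runs Apache with `mod_headers` enabled:

```apache
# Force browsers and intermediate caches to revalidate CSS on every request.
# Diagnostic measure only - remove once the stale-cache problem is found.
<IfModule mod_headers.c>
    <FilesMatch "\.css$">
        Header set Cache-Control "no-cache, must-revalidate"
    </FilesMatch>
</IfModule>
```

If the updated CSS appears after adding this, you know a cache (rather than the FTP upload) was serving the old file.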

How to avoid occasional corrupted downloads

My website hosts an MSI file that users need to download. There is nothing special about the file; it lives in a directory on the webserver with a regular href pointing to it that users click on. Occasionally a user will complain that they can't open the MSI file because Windows Installer claims the file is corrupt. Redownloading the file doesn't help. I end up emailing the file as an attachment, which usually works.
My guess is that the file is either corrupted in the user's browser cache or perhaps an intermediary proxy's cache which the user goes through.
Why does this happen? Is there a technique / best practice that will minimize chances of corruption or, perhaps make sure users will get a fresh copy of the file if it does get corrupted during download?
Well, if the cause really is just the cache, then I think you could just rename the file before having them download it again. This would work for any proxies too.
Edit: Also, I believe most browsers won't reuse a cached page unless the GET and POST parameters remain the same, and the same probably applies to any URL in general. Try adding a unique GET (or POST) parameter to the end of the download URL - you could use the current time, a random number, etc. Rather than a hyperlink, you could have a button that, when clicked, submits a form with a unique parameter to the download URL.
My advice would be:
Recommend that users avoid IE (especially the older versions), because of truncated downloads, cache pollution, etc.
Advise users to clear the cache before re-downloading the file.
Host the file on an FTP server instead of over HTTP.
Provide an MD5 checksum for users to verify the download.
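The checksum suggestion above can be sketched as follows: compute the MD5 of the hosted MSI once, publish the hex digest next to the download link, and have users compare it against their downloaded copy. The file path below is hypothetical.

```python
import hashlib

def md5sum(path, chunk_size=8192):
    """Compute the MD5 hex digest of a file, reading it in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

On Windows, users can compare against the output of `certutil -hashfile installer.msi MD5`; any mismatch means the download was corrupted and should be retried.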
