Is there an offline tool for finding broken links that can get past a login page?

Our team has finished the development phase of a web application, and I want to check every page for broken links. I have tried several tools, such as Xenu and LinkChecker, but they cannot navigate the pages behind the login page; only the home page gets checked, because authentication is required.

Is there a way to pass the authentication parameters ("userName" and "password") to the tool, so that it can navigate the pages behind the login and check them?

Edit your login module so that you can pass the username and password in the URL, then start the tool from something like http://yourwebsite.com/login?username=...&password=.... You can then leave the work to the tool and your web app, provided your tool manages cookies correctly (Xenu has an option for that).
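For illustration, here is a minimal sketch of such a login route in Ruby (Sinatra); the route, the parameter names, and the authenticate helper are hypothetical stand-ins for your real login module, and you would only want something like this enabled in a test environment:

    # Hypothetical sketch: a login route that also accepts credentials from
    # the query string, so a crawler can start at
    # http://yourwebsite.com/login?username=...&password=...
    require 'sinatra'

    enable :sessions

    # Stand-in for your real credential check.
    def authenticate(user, pass)
      user == 'crawler' && pass == 'secret'
    end

    get '/login' do
      if authenticate(params['username'], params['password'])
        session[:user] = params['username'] # the session cookie the tool must keep
        redirect '/'
      else
        halt 401, 'Bad credentials'
      end
    end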
After reading the cookies section in the Xenu FAQ, I realized it can access IE's cookies, so you can try logging in with IE and then (while the browser is still running) run Xenu with its cookies option enabled.

The only enterprise-grade solution I have found so far is IBM Rational Policy Tester. It does a whole lot more than simply checking for broken links, and it does it well (spell check, grammar, regular expressions, SEO, Section 508 accessibility, ...). The configuration of the tool is a pain and the UI is incredibly dated too... Having said that, authentication isn't an issue and, once configured, it does the job like a boss.
https://www.ibm.com/support/knowledgecenter/en/SSAUB3_8.5.0/com.ibm.pt.help.doc/helpindex_pt.html

Related

How to modify an old Joomla website to remove a dangerous link flagged by Google

A client told me his old website, running on Joomla, was flagged by Google for having links to a malicious website; the site was blocked with the typical red security warning in Google Chrome. I redirected the website to a temporary page, but my client wants to bring the old website back while we work on something new.

However, my local machine and server are running Windows Server. I have the original files of the website and its database. Is there a quick way I could remove the links (the Google tool only mentions the website "mosaictriad.com") from the Joomla pages on my machine? I've tried doing a Ctrl+F for mosaictriad.com in the SQL file but didn't find anything.

Thanks for your opinion on what I should do next; the objective is simply to clear the website of the security warning quickly and send it back to the people managing his old server.

PS: I don't have direct access to his server, only the files associated with his Joomla website.

Additional details given by Google:
Some pages on this website redirect visitors to dangerous websites that install malware on visitors' computers, including: mosaictriad.com.
Dangerous websites have been sending visitors to this website, including: navis.be and umblr.com.
Yes, there is a way. Register your site in Google Webmaster Tools, submit your sitemap, and ask Google to rescan your website. They will remove the warning within 24 hours if the scan result is negative for malware.
Running a virus scanner on your local machine over the files may detect some malicious files.

Alternatively, restore the website to a temporary folder on the web and use a commercial scanning service to help identify and clean it. I use and recommend myjoomla.com, but there are other services such as sucuri.net.
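Since a Ctrl+F in a single file is easy to get wrong, it may also help to search every file in the backup (templates, extensions, and the SQL dump) for the flagged domains in one pass. A rough Ruby sketch, assuming the files live in a local folder named site_backup (a placeholder):

    # Rough sketch: recursively search the Joomla files and SQL dump for the
    # domains Google flagged. The folder path and domain list are assumptions.
    domains = %w[mosaictriad.com navis.be umblr.com]

    Dir.glob('site_backup/**/*') do |path|
      next unless File.file?(path)
      File.foreach(path, encoding: 'BINARY').with_index(1) do |line, lineno|
        domains.each do |d|
          puts "#{path}:#{lineno}: #{d}" if line.include?(d)
        end
      end
    end

Note that a plain string search will miss obfuscated (for example base64-encoded) injections, which is why the scanning services above are still worth running.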
I think your strategy is wrong: you should quickly clean up the website (try overwriting the core files with files from a fresh Joomla install) and then secure it. Once you have done that, contact Google through Webmaster Tools with a reconsideration request (this typically takes a few days to process if it's a first offense). Once Google approves the reconsideration request, the red flag should be removed and the website should be accessible to everyone.

Which are the best extensions for using Facebook while blocking cookies and tracking?

I'm trying Adblock Plus, Ghostery, Disconnect, and Self-Destructing Cookies in Firefox.
Can you recommend any other tips to stop tracking?
The "Do Not Track" feature in Firefox is useful for telling sites that you do not want to be tracked. A detailed guide on enabling this feature is available on Firefox's support page : http://mzl.la/WL6fUP .
Besides, if you want an extra level of security, I would suggest using the "NoScript" browser extension (https://addons.mozilla.org/en-US/firefox/addon/noscript/). NoScript blocks JavaScript and other executable content on websites, effectively protecting you from tracking code.
And if you want real privacy, use a proxy or VPN. Another good idea is to use the Tor Browser (https://torproject.org/projects/torbrowser.html.en).

Unable to clear cookies using browser.cookies.clear

I am unable to clear cookies using watir-webdriver and browser.cookies.clear.
Are there any other alternatives?
This is, as far as I know, a browser-based limitation, due to security and privacy concerns.

WebDriver interacts with the browser via JavaScript, and JavaScript is not allowed to clear all cookies (think how nasty that could be on a malicious site). Outside of a testing environment, most JS that executes came from the server of the site you are accessing, so the most it is allowed to do is clear the cookies for the current domain, i.e. the cookies belonging to the current URL. A web page may clear its own cookies, but it is not allowed to clear cookies belonging to other domains.
So if you want all your tests to start with fresh cookies, you will need something in the Before hook in env.rb that goes to the root of the site in question and then clears the cookies, as sketched below.

By the way, the same limitation applies to setting cookies: if you want to create cookies for a specific site, you need to navigate to that site before trying to create them, or nothing gets created.
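A minimal sketch of such a hook, assuming Cucumber with watir-webdriver; BASE_URL and the browser setup are placeholders for your own:

    # features/support/env.rb -- sketch of a Before hook that starts each
    # scenario with fresh cookies for the application's domain.
    require 'watir-webdriver'

    BASE_URL = 'http://localhost:3000' # assumption: the app under test

    browser = Watir::Browser.new :firefox

    Before do
      @browser = browser
      @browser.goto BASE_URL   # must be on the domain before touching cookies
      @browser.cookies.clear   # clears only this domain's cookies
    end

    at_exit { browser.close }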
This is not an answer, but FYI only.

Suppose Chrome is the chosen browser. While Cucumber is running, the output of ps -ef | grep chrome will show the customized user data directory, via an argument that looks something like this:

--user-data-dir=/tmp/.org.chromium.Chromium.G2DgPo

Inside that directory, the Cookies file is stored under the Default folder.

I am not sure whether directly deleting that database file would fulfill the need, because in a normal browser session such runtime data is stored at

~/.config/google-chrome/Default/Cookies
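If you want to experiment with that approach anyway, here is a rough sketch that pulls the directory out of the ps output (Linux only; since Chrome keeps cookies in memory while it runs, deleting the file may not stick):

    # Rough sketch: locate the temporary Chrome profile that chromedriver
    # created and delete its cookie database. May have no lasting effect,
    # since Chrome rewrites the file from memory.
    line = `ps -ef`.lines.find { |l| l.include?('--user-data-dir=/tmp/') }
    if line && line =~ /--user-data-dir=(\S+)/
      cookies_db = File.join($1, 'Default', 'Cookies')
      File.delete(cookies_db) if File.exist?(cookies_db)
    end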

Testing links in a web app

I need to test links inside a web application. I have looked at a couple of tools (Xenu, various browser plugins, link-checker (Ruby)), but nothing quite fits my needs, which I detail below:

- I need to get past a login form
- the test needs to be rerun for different types of users (multiple sets of login credentials)
- I would like to automate this under a CI server (Jenkins)
- the ability to spider the site

Does anyone have any ideas? Bonus if I can use Ruby to do this!
What you are asking for is beyond most test tools once you throw in the ability to spider the site.

That final requirement pushes you into the realm of hand-coding something. Using the Mechanize gem you could do all of those things, but you get to code a lot of the navigation of the site yourself.

Mechanize uses Nokogiri internally, so it's easy to grab all the links in a page, which you could store in a database to be checked by a different thread or by some subsequent code. That said, writing a spider is not hard when you own the pages you're hitting, because you can be pretty brutal about accessing the server and let the code run at full speed, without worrying about being banned for excessive bandwidth use.
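To make that concrete, here is a sketch of a Mechanize-based checker; the login URL, form field names, and credentials are hypothetical and need adjusting for the real application, and the non-zero exit status is what lets Jenkins fail the build:

    # Sketch: log in through a form, spider same-site links, report broken ones.
    require 'mechanize'

    agent = Mechanize.new
    page  = agent.get('http://example.com/login')
    form  = page.forms.first
    form['username'] = 'user1'    # rerun with each set of credentials
    form['password'] = 'secret'
    agent.submit(form)

    seen   = {}
    queue  = ['http://example.com/']
    broken = []

    until queue.empty?
      url = queue.shift
      next if seen[url]
      seen[url] = true
      begin
        page = agent.get(url)
      rescue Mechanize::ResponseCodeError => e
        broken << [url, e.response_code]
        next
      end
      next unless page.is_a?(Mechanize::Page) # skip images, PDFs, etc.
      page.links.each do |link|
        href = link.resolved_uri.to_s rescue next
        queue << href if href.start_with?('http://example.com/')
      end
    end

    broken.each { |url, code| puts "#{code} #{url}" }
    exit(broken.empty? ? 0 : 1) # non-zero exit fails the Jenkins build

Rerun the script once per credential set to cover the different user types.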

Copy cookies to another domain. Firefox? Chromium?

Is there any possibility of copying a set of cookies from one domain to another? I badly need this for web development.
You cannot just copy a set of cookies, but you can write your own PHP/Python code to set several cookies for another domain, using values from the old set.
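The same idea sketched in Ruby (Sinatra) rather than PHP/Python: a throwaway endpoint on the target domain that re-creates cookies from values you copied out of the old domain (names and values below are placeholders):

    # Throwaway sketch: an endpoint on the *new* domain that sets cookies
    # using values copied by hand from the old domain.
    require 'sinatra'

    OLD_COOKIES = { 'session_id' => 'abc123', 'lang' => 'en' } # copied values

    get '/import-cookies' do
      OLD_COOKIES.each do |name, value|
        response.set_cookie(name, value: value, path: '/')
      end
      'Cookies re-created for this domain.'
    end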
No, the same-origin policy prohibits sites from setting or reading cookies on behalf of other sites, outside of a few special cases. Some browser extensions will let you copy and paste cookies to sync them manually, though.
From your comments it sounds like you want to use the same cookies for your development and production systems. How about using something like 'local.example.com' instead of 'localhost' as your development domain, and setting wildcard cookies valid for all subdomains, as sketched below?

We use this pattern so we don't need to register multiple API keys for web services, since most of them have wildcard support for subdomains.
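The key detail is the cookie's domain attribute: a cookie set with domain '.example.com' is sent to local.example.com and www.example.com alike. A Sinatra sketch with placeholder names:

    # Sketch: a cookie scoped to all subdomains of example.com, so
    # local.example.com (development) and www.example.com (production)
    # both see it.
    require 'sinatra'

    get '/login' do
      response.set_cookie('session_id',
                          value:  'abc123',
                          domain: '.example.com', # leading dot covers subdomains
                          path:   '/')
      'Logged in.'
    end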
I'm not sure I would recommend something like that just to be able to use the same cookies in development and production, because it has other implications as well. For example, if you serve static assets from another subdomain, the browser will send cookie headers unnecessarily, and there may be more details that make debugging harder rather than easier.

If you could explain the problem at hand in a bit more detail, there might be other solutions or best practices for staging and production environments that can help you.
You can do this manually, using Greasemonkey:

1. Go to Tools -> Page Info.
2. Select the Security tab.
3. Click the View Cookies button.
4. Type the domain you wish to read from.
5. Make a note of all the cookies and their content.
6. Now go to the domain you wish to copy to.
7. Install the Greasemonkey add-on for Firefox (or, better yet, use Cookies Manager+, another Firefox add-on).
8. Run some JavaScript code to re-create the noted cookies and their values.
