I use HtmlUnit for automated tests for my site.
My site uses the Google Maps API, and sending requests to that external site takes a lot of time (I have a few hundred tests and a few thousand page loads).
I need some way to tell HtmlUnit to load only local pages (served by IIS Express) and to forbid loading external resources, so that my tests run more quickly.
You can prevent HtmlUnit from accessing certain URLs using a WebConnectionWrapper:
browser.setWebConnection(new WebConnectionWrapper(browser) {
    @Override
    public WebResponse getResponse(final WebRequest request) throws IOException {
        if (<<CONDITION HERE(such as `request.getUrl().toString().contains("uq.edu.au")`)>>) {
            return super.getResponse(request);
        } else {
            return new StringWebResponse("", request.getUrl());
        }
    }
});
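The condition can be any test on the request URL. As a sketch of one option (the hostnames and class name here are assumptions, not from the original code), a small helper that only lets requests to the local test server through might look like this:

```java
import java.net.URL;

public class LocalOnlyFilter {
    // Returns true when the URL points at the local test server;
    // only those requests would be passed on to super.getResponse,
    // everything else would get the empty StringWebResponse.
    static boolean isLocal(URL url) {
        String host = url.getHost();
        return host.equals("localhost") || host.equals("127.0.0.1");
    }

    public static void main(String[] args) throws Exception {
        System.out.println(isLocal(new URL("http://localhost:8080/index.html")));
        System.out.println(isLocal(new URL("http://maps.googleapis.com/maps/api/js")));
    }
}
```

Inside the wrapper the condition would then simply be `isLocal(request.getUrl())`.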
Unless you need to test them, you may want to consider disabling items like JavaScript and CSS; I find that also speeds things up.
I have the following Spring Boot web application, which uses Tomcat as the server and Thymeleaf for my HTML templates. I have two questions regarding my first Spring Boot project.
This is what my code looks like so far:
@RequestMapping(value = "/myIpAdress")
public String displayMyIpAdress(HttpServletRequest request, Model model) throws Exception
{
    model.addAttribute("myIpAdress", getMyIpAdress());
    return "myIpAdress";
}

private String getMyIpAdress()
{
    // here is a simple HTTP request returning my IP address from a server
}
This is the myIpAdress.html
<div th:text="${myIpAdress}"></div>
All it does is display my IP address in my view, and it works great. Nothing fancy so far.
First question:
How do I set up a timer to display my IP every second in my HTML file? I want an IMMEDIATE change in my view as soon as my IP changes. In Java I would simply use a scheduled timer, or code a while(true) loop with a Thread.sleep(1000L), like this:
public static void main(String[] args) throws Exception
{
    while (true) {
        System.out.println("My IP is: " + getMyIpAdress());
        Thread.sleep(1000L);
    }
}
This Java code works VERY fast; the console displays my IP address almost every second, which is exactly what I wanted initially. But now that I've made a Spring Boot web app out of it, it is incredibly slow: making the request and rendering the view takes up to 15 seconds.
So my second question is: why is it that slow, and how can I improve the speed? I don't have any CSS or JavaScript running in the HTML file; it really only displays my IP address.
All I want is to fetch the IP from the server and display it in my view, every second. How do I achieve this?
In pure Thymeleaf you can't update the static view every time something changes in your backend without sending a new request from the client side. It works one way only: requests go from the client to the server, not the other way around.
The easiest way to achieve what you want is to add a simple piece of JavaScript to the template which fetches the IP address every second (e.g. from a dedicated endpoint) and updates the view.
Or you could add websocket support to the template, but I don't think that's needed in your case (it seems like over-engineering).
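As a sketch of that polling approach (the endpoint URL, element id, and function names below are assumptions, not from the original code), the polling logic can be kept in one small function so the fetch and the DOM update stay pluggable:

```javascript
// Poll for the IP once per interval. fetchIp returns a Promise of the
// IP string, update receives it; both are injected so this logic is
// independent of fetch() and the DOM (all names here are illustrative).
function startPolling(fetchIp, update, intervalMs) {
    const tick = () => fetchIp().then(update).catch(err => console.error(err));
    tick();                                // fire immediately
    return setInterval(tick, intervalMs);  // then once per interval
}

// In the browser, wired to a hypothetical plain-text endpoint:
// startPolling(
//     () => fetch('/api/myIpAdress').then(r => r.text()),
//     ip => { document.getElementById('ip').textContent = ip; },
//     1000
// );
```

Remember to stop the timer with `clearInterval(...)` when the view is torn down.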
I'm still exploring REST, Node.js, and web development in general. What I've found out is that XMLHttpRequest is mostly (if not always) used when doing AJAX. As I've learned, AJAX stands for Asynchronous JavaScript and XML. So my question is: should I be using XMLHttpRequest in my Node.js project only when I want asynchronous parts on my webpage? Or does Node.js HTTP also offer asynchronous JavaScript? How can I balance the use of HTTP and XMLHttpRequest (or AJAX) so that I don't get too messy in all my REST API stuff?
P.S. I'd rather not use AJAX, because of the XML. I've heard that XML is much heavier in data than JSON and isn't worth using anymore. Is that true? What would you recommend?
Non-async on Node?
You're trying to build an API endpoint, so all the other cases for not using async should be thrown out the window. As soon as you have a single piece of non-async code in your Node.js project, it will freeze the entire process until it completes. Remember, Node.js runs a single thread (theoretically), which means all the other concurrent users get frozen; that's one way to make people really upset.
Say, for instance, you need to read a file from your Node.js server on a GET request from a client (let's say a browser): you want to make that a callback/promise. Never do non-async work in an API server; there is just no reason to (in your case).
Example below:
const express = require('express');
const fs = require('fs');

const app = express();

// async: the process stays free to serve other users while the file is read
app.get('/getFileInfo', function (req, res) {
    fs.readFile('filePath', 'utf-8', function (err, data) {
        if (err) {
            console.log(err);
            res.json({ error: err });
        } else {
            res.json({ data: data });
        }
    });
});

// non-async: all users will freeze while the file is read, until it is done
app.get('/nonasync', function (req, res) {
    const data = fs.readFileSync('path', 'utf-8');
    res.json({ data: data });
});
The exact same idea applies to your web browser: if you do something non-async in the browser's JavaScript, the entire web application becomes unresponsive, because the browser runs in the same manner - it has one main loop, and unless operations go through callbacks/promises/observables, the website will freeze.
Ajax is a much neater/nicer way to implement POST/GET/PUT/DELETE/GET-by-id requests to a server than raw XMLHttpRequest, and both can send and receive JSON, not only XML. Ajax libraries are also safer with regard to browser compatibility, as XMLHttpRequest has some limitations in IE and Safari, I believe.
NOTE: if you're not using a framework with Node.js, you should; it helps keep your endpoints neat and testable, and lets you pass the project on to others without them having to learn the way you implemented your req/res structure.
Some frameworks for Node:
Express 4 (my preference; the API doc is really good and the community is strong)
Restify (used by Netflix - really light)
Hapi (never used it, but have heard of it)
Some frameworks for web browsers you might like:
Angular 2 (my preference, as I'm from a MEAN stack)
ReactJS (created by Facebook)
KnockoutJS (simple and easy)
All the browser frameworks have their own implementations of RESTful API calls, but more are leaning towards Observable objects.
Is there a best practice for supporting self- and web-hosting (at the same time)?
There are many problems I had to solve. Under self-hosting, Autofac does not work properly because HttpContext.Current is not set, and GlobalConfiguration is not accessible in self-hosting.
Are there other problems to be aware of?
Have a look at this answer: https://stackoverflow.com/a/13285693/463785 It shows you how to structure your solution in a hosting-layer-agnostic way.
Basically, put your ASP.NET Web API logic into a separate project and, as @DarrelMiller suggested, don't use any hosting-specific context in that project. Don't even reference unnecessary assemblies (e.g. System.Web) inside this project. However, you will have some hosting-layer-specific needs, such as getting the consumer's IP address (this cannot be done through what ASP.NET Web API gives you). In such cases, employ some sort of contract between your core API and hosting layers.
For example, the message handler below sets the IP address of the consumer for each request; I registered it through my WebHost project:
public class UserHostAddressSetterHandler : DelegatingHandler {

    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken) {
        request.Properties[ApiCommonRequestKeys.UserHostAddressKey] = request.GetUserHostAddress();
        return base.SendAsync(request, cancellationToken);
    }
}

internal static class HttpRequestMessageExtensions {

    internal static HttpContextBase GetHttpContext(this HttpRequestMessage request) {
        return (HttpContextBase)request.Properties[Constants.MS_HttpContextKey];
    }

    internal static string GetUserHostAddress(this HttpRequestMessage request) {
        return request.GetHttpContext().Request.UserHostAddress;
    }
}
Then, at the API core layer, I know that the hosting layer has set the IP address, and I can reach it through Properties[ApiCommonRequestKeys.UserHostAddressKey] on the HttpRequestMessage instance at any time.
Have a look at this project: https://github.com/tugberkugurlu/PingYourPackage It is a nice ASP.NET Web API project which has been structured in a hosting-layer-agnostic way. It may give you some hints.
You should not use HttpContext.Current in any Web API project. Everything you need should be in HttpRequestMessage.Properties. I'm not aware of any DI issues with Self-host, I know you can use Unity in Self-host without any problems.
Web API has a different pipeline and uses an HttpHandler, which sits at a lower level. Hence, using HttpContext.Current is not a good idea.
After running a load test, the results data shows that cached requests accumulate throughout the duration of the test, increasing over time.
In my web test, each URL has the Cache Control setting turned off, which is supposed to mean don't cache.
Furthermore, in my load test scenario settings, I have the "Percentage of New Users" setting set to 100, which means that each user should be treated as a new user, and not use caching.
With these settings, why are the test results still showing the increasing amount of cached requests throughout the load test?
I attached an image of the load test results graph of cached requests for clarification.
As you know, there is a property named “Cache Control” on each request.
When the Cache Control property on a request in the Web test is false, the request is always issued.
When the Cache Control property is true, the VSTS load test runtime code attempts to emulate the Browser caching behavior.
However, the Cache Control property is automatically set to true for all dependent requests (images, style sheets, javascripts, ...).
In a load test, the browser caching behavior is simulated separately for each user running in the load test. But even if “Percentage of New Users” is set to 100, the cache will still be used during each virtual user's session. If your web test contains many pages, the cache will be used.
Since VSTS 2008, you can write a WebTestPlugin that disables caching of all dependent requests.
Note: when running a Web test by itself, the Cache Control property is automatically set to false for all dependent requests so they are always fetched; this allows you to view the HTML page in the browser.
Thanks to this blog I created the following class.
using System.ComponentModel;
using Microsoft.VisualStudio.TestTools.WebTesting;

namespace QuranX.Web.LoadTest.WebTestPlugins
{
    [DisplayName("Enable browser caching")]
    public class EnableBrowserCachingPlugin : WebTestPlugin
    {
        [DisplayName("Allow caching")]
        [Description("If True then server responses will be cached")]
        public bool AllowCaching { get; set; } = true;

        public override void PostRequest(object sender, PostRequestEventArgs e)
        {
            foreach (WebTestRequest dependentRequest in e.Request.DependentRequests)
            {
                dependentRequest.Cache = AllowCaching;
            }
        }
    }
}
The instructions then show how to install the plugin:
Build the app
Open the WebTest file
In the toolbar, click "Add Web Test Plugin"
Set the plugin's "Allow caching" property to false
I really don't know where to begin with this question, but the site I'm working on sometimes has some really slow page loads - especially after doing a build, but not always. I usually have to refresh the page 5-10 times before it actually comes up. I'm trying to figure out where exactly I should begin to look.
ASP.NET MVC 3
Ninject
AutoMapper
Entity Framework Code First 4.1
SQL Server 2008
Razor
UPDATES
Concerning some of the questions: it can do this long loading on every page, but after it loads, it's fairly quick on all the pages.
After posting this and getting your replies, I started the application and it is still loading - it probably won't ever load unless I click reload in the browser.
No caching, and the EF models aren't huge.
I am using Razor and Visual Studio 2010, with 6 GB of memory and an i7 processor.
I am using IIS Express, the default web server, when debugging. It also does this on IIS 7 on the main server.
I may look into the MVC Profiler and Glimpse to see what I can find.
Below is some code that runs when it hits the homepage. I would say it never loads when I first start up the server. I put a breakpoint at var model, which never gets hit; if I reload the page, then it does.
public ActionResult Index()
{
    var model = new HomeViewModel();
    model.RecentHeadlines = _headlineService.GetHeadlines(1, Config.RecentHeadlinesPageSize, string.Empty);

    return View(model);
}
Below is my data context setup as well.
public class DatabaseFactory : Disposable, IDatabaseFactory
{
    private DataContext _dataContext;

    public DataContext Get()
    {
        return _dataContext ?? (_dataContext = new DataContext());
    }

    protected override void DisposeCore()
    {
        if (_dataContext != null)
            _dataContext.Dispose();
    }
}

public class Disposable : IDisposable
{
    private bool isDisposed;

    ~Disposable()
    {
        Dispose(false);
    }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);
    }

    private void Dispose(bool disposing)
    {
        if (!isDisposed && disposing)
        {
            DisposeCore();
        }
        isDisposed = true;
    }

    protected virtual void DisposeCore()
    {
    }
}

public class UnitOfWork : IUnitOfWork
{
    private readonly IDatabaseFactory _databaseFactory;
    private DataContext _dataContext;

    public UnitOfWork(IDatabaseFactory databaseFactory)
    {
        _databaseFactory = databaseFactory;
    }

    protected DataContext DataContext
    {
        get { return _dataContext ?? (_dataContext = _databaseFactory.Get()); }
    }

    public void Commit()
    {
        DataContext.Commit();
    }
}
I'd start by checking what the timeouts are set to in IIS for the process to be recycling itself.
I'm also a very big fan of the MVC Mini-Profiler, which can show you exactly how long various parts of your page load take; definitely take a look at it.
Edit:
It is worth noting that the Glimpse project is also great for this task these days.
Sounds like it might be an issue with IIS AppPool recycling if you're experiencing it after builds or after periods of inactivity.
To help with AppPool timeouts you can utilize a batch file I created to help mitigate the issue.
That won't solve the problem for you after new builds, because your ASP.NET MVC application needs to be JIT-compiled upon first run. If you're really eager to eliminate that issue, you can use ASP.NET precompilation.
Try Glimpse or use ASP.NET Tracing.
You could also precompile your views if you are using the Razor view engine via Razor Single File Generator for MVC.
It depends on what happened in your previous run; sometimes, if you throw an error and don't clear it out, you will have issues running the application. It helps to restart the browser every time you build if there was an error.
However, this could be a caching issue. It is possible that your database is caching due to poorly maintained context disposal. This would cause the lookups to run faster and faster as they are encountered on pages. Make sure you always call Dispose() when done with your database transactions.
Funny - I've noticed something similar once with Unity and MVC, but I believe the problem resolved itself. You could also try ANTS Profiler to see if the problem is outside of MVC.
If you let a single request sit there (without requesting 5+ times) what happens?
Let a single request run - is ANY of your code hit? Set up logging (log4net, NLog, etc.) to see whether Application_Start or anything else is getting called after the compile.