How do I define the Abp language to use during unit test execution?

I am creating unit tests for my service layer. I used the existing UserAppService_Tests test that comes with the downloaded template as a guide.
However I am seeing this exception thrown.
Abp.AbpException : No language defined!
My test inherits from GpTestBase, which in turn inherits from AbpIntegratedTestBase<GpTestModule>.
GpTestModule has:
Configuration.Modules.Zero().LanguageManagement.EnableDbLocalization();
So I thought it should be OK.
Any clues?
public override void PreInitialize()
{
    Configuration.UnitOfWork.Timeout = TimeSpan.FromMinutes(30);
    Configuration.UnitOfWork.IsTransactional = false;

    // Disable static mapper usage since it breaks unit tests (see https://github.com/aspnetboilerplate/aspnetboilerplate/issues/2052)
    Configuration.Modules.AbpAutoMapper().UseStaticMapper = false;

    Configuration.BackgroundJobs.IsJobExecutionEnabled = false;

    // Use database for language management
    Configuration.Modules.Zero().LanguageManagement.EnableDbLocalization();

    RegisterFakeService<AbpZeroDbMigrator<GpDbContext>>();

    Configuration.ReplaceService<IEmailSender, NullEmailSender>(DependencyLifeStyle.Transient);
}

You should not be defining the language to use explicitly.
To have a localization context, you should log in as a user.
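With the downloaded template this is usually a one-liner at the top of the test. A minimal sketch, assuming your GpTestBase exposes the template's usual LoginAsDefaultTenantAdmin helper (the test name is illustrative):

[Fact]
public void Should_Have_Localization_Context()
{
    // Gives AbpSession a user and tenant, which in turn gives
    // localization a language context to resolve against.
    LoginAsDefaultTenantAdmin();

    // ... call the app service under test here ...
}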

This can happen if you don't have any languages defined in the AbpLanguages table. Example here.
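If the table is empty, seeding a language in your test data builder also fixes it. A minimal sketch, assuming your GpTestBase has the template's UsingDbContext helper; the ApplicationLanguage constructor shown matches Abp.Zero, but verify it against your version:

UsingDbContext(context =>
{
    // Seed one host-level (tenantId == null) language so the
    // language manager has something to resolve.
    context.Languages.Add(new ApplicationLanguage(
        tenantId: null,
        name: "en",
        displayName: "English"));
});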

In my case, it was caused by public const bool MultiTenancyEnabled = true;. Changing MultiTenancyEnabled to false solved the problem.
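For reference, that constant lives in the project's Consts class in the ABP template; the class name below follows the Gp prefix used above and is an assumption:

public class GpConsts
{
    // Setting this to false resolved the "No language defined!" exception.
    public const bool MultiTenancyEnabled = false;
}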

Related

Customizing Apache Camel's ExchangeFormatter using Spring Boot

By default I assume that Spring Boot/Camel is using org.apache.camel.support.processor.DefaultExchangeFormatter.
I wonder how I can set the 'showHeaders' flag inside a Spring Boot app, because I hope to see the headers in the "org.apache.camel.tracing" log as well.
Wish all a wonderful day.
DefaultTracer is used in Camel to trace routes by default.
It is created with the showHeaders(false) formatter option set.
Therefore you could implement another Tracer (consider extending DefaultTracer) to enable putting headers into traced messages.
I need this mostly in my tests, so I have built this into my base test class:
@BeforeEach
public void before() {
    // camelContext is the injected CamelContext.
    // Camel's default tracer formats exchanges with showHeaders(false);
    // flip the flag so traced messages include their headers.
    if (camelContext.getTracer().getExchangeFormatter() instanceof DefaultExchangeFormatter) {
        DefaultExchangeFormatter def = (DefaultExchangeFormatter) camelContext.getTracer().getExchangeFormatter();
        def.setShowHeaders(true);
    }
}

Java debugger can't call some default method implementations

I'm coding in IntelliJ IDEA. When debugging my application, I can't use some default method implementations in Watches.
Here is a condensed example:
public class Friendship {

    interface Friend {
        default void sayHiTo(Friend friend) {
            System.out.println("Hi, " + friend.hashCode());
        }

        default int amountOfHands() {
            return 2;
        }
    }

    public static class BasicFriend implements Friend {
        int numberOfFaces() {
            return 1;
        }
    }

    public static void main(String[] args) {
        System.out.println("Put a breakpoint here");
    }
}
In the main() method I put a breakpoint and set up three watches:
// Default interface method with dependency
new BasicFriend().sayHiTo(new BasicFriend())
// Default interface method without dependency
new BasicFriend().amountOfHands()
// Class method
new BasicFriend().numberOfFaces()
The first watch throws NoSuchMethodException complaining that method Friendship$BasicFriend.sayHiTo() doesn't exist.
The second watch runs successfully, but strangely it reports a boxed object {java.lang.Integer#537} "2" instead of just a primitive 2.
The third watch reports a primitive 1, just as expected.
Why is the first watch not working? Is this a bug? Is this actually IDE related? Is it because of some conceptual flaw of default methods? Should it be working as I want it to in the first place? Is the strange result of the second watch somehow related to the issue in the first watch?
Prior to JDK 8u40, default and static interface methods were not supported by JDI (Java Debugger Interface), JDWP (Java Debugger Wire Protocol) and JDB (the standard Java debugger). This is bug JDK-8042123, which is recorded as fixed in 8u40 and a corresponding blurb appears in the 8u40 release notes.
Update to 8u40 or later to fix this issue, at least on the JDK side.
From the bug description, it looks like debugger-side changes are also required: instead of casting com.sun.jdi.InterfaceType objects to com.sun.jdi.ClassType, debuggers should call InterfaceType.invokeMethod() directly.
In the specific case of IntelliJ, Suseika confirmed in a comment that 14.1.2 has mostly fixed the issue (except the unexpected boxing), though Mike Kobit still experiences this problem on that version with a ClassCastException suggestive of the incorrect cast above.

Fetch Windows setting value

How do I fetch the Measurement System setting value in JavaScript?
I'm guessing that it would be through some WinJS call.
The logical place would be Windows.Globalization, but I'm not seeing it offered there. One pretty simple workaround - faster to write than to research the setting :) - is to create a Windows Runtime Component in C# that calls into System.Globalization:
namespace WindowsRuntimeComponent
{
    public sealed class RegionalSettings
    {
        public bool isMetric()
        {
            return System.Globalization.RegionInfo.CurrentRegion.IsMetric;
        }
    }
}
Then add as a reference to your JavaScript app and invoke there:
var r = new WindowsRuntimeComponent.RegionalSettings();
var isMetric = r.isMetric();

What is the equivalent of TestPropertyAttribute available for a class, not a method?

Is there the equivalent of the TestPropertyAttribute available for a class? I'd like to mark a bunch of tests with a property, without having to mark each test.
Thanks in advance.
Unfortunately there is none. We don't have a TestProperty equivalent at the class level.
Just found this use of TestProperty in the class initializer to set a class member. It sets a default, and you mark the tests that need different settings with [TestProperty("thing", "non-default")], as sketched further below.
[ClassInitialize()]
public static void InitializeClass(TestContext testContext)
{
    // Overridden by [TestProperty("thing", "non-default")] on a test
    if (testContext.Properties.Contains("thing"))
        _thing = testContext.Properties["thing"] as string;
    else
        _thing = "default";
}
Further explanation is here.
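Following that approach, the marking side would look like this (the test name is illustrative, and _thing is the class member set in the initializer; verify the property actually flows through TestContext in your MSTest version):

[TestMethod]
[TestProperty("thing", "non-default")]
public void Test_With_NonDefault_Thing()
{
    // Per the initializer above, _thing should have been set from the
    // "thing" property rather than falling back to "default".
    Assert.AreEqual("non-default", _thing);
}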

Enterprise Library Validation Block - Should validation be placed on class or interface?

I am not sure where the best place to put validation (using the Enterprise Library Validation Block) is? Should it be on the class or on the interface?
Things that may affect it:
Validation rules would not be changed in classes that implement the interface.
Validation rules would not be changed in classes that inherit from the class.
Inheritance will occur from the class in most cases; I suspect some fringe cases will inherit from the interface directly (but I would try to avoid that).
The interface's main use is for DI, which will be done with the Unity block.
Given the way you are trying to use the Validation Block with DI, I don't think it's a problem to set the attributes at the interface level, and I don't think it should create problems in the inheritance chain. However, I have mostly seen this block used at the class level, with the intent of keeping interfaces from over-specifying things. IMO there is no big threat in doing it either way.
Be very careful here: your test is too simple.
This will not work as you expect for SelfValidation validators or class validators, only for simple property validators like the one you have there.
Also, if you are using the PropertyProxyValidator in an ASP.NET page, I don't believe it will work either, because it only looks at field validators, not inherited/implemented validators.
Yes, big holes in the VAB if you ask me.
For the sake of completeness I decided to write a small test to make sure it would work as expected, and it does. I'm posting it here in case anyone else wants it in the future.
using System;
using Microsoft.Practices.EnterpriseLibrary.Validation;
using Microsoft.Practices.EnterpriseLibrary.Validation.Validators;

namespace ConsoleApplication1
{
    class Program
    {
        static void Main(string[] args)
        {
            ISpike spike = new Spike();
            spike.Name = "A really long name that will fail.";

            ValidationResults r = Validation.Validate<ISpike>(spike);
            if (!r.IsValid)
            {
                throw new InvalidOperationException("Validation error found.");
            }
        }
    }

    public class Spike : ConsoleApplication1.ISpike
    {
        public string Name { get; set; }
    }

    interface ISpike
    {
        [StringLengthValidator(2, 5)]
        string Name { get; set; }
    }
}
What version of Enterprise Library are you using for your code example? I tried it using Enterprise Library 5.0, but it didn't work.
I tracked it down to the following section of code within the EL 5.0 source code:
// namespace Microsoft.Practices.EnterpriseLibrary.Validation
// public static class Validation
public static ValidationResults Validate<T>(T target, ValidationSpecificationSource source)
{
    Type targetType = target != null ? target.GetType() : typeof(T);
    Validator validator = ValidationFactory.CreateValidator(targetType, source);
    return validator.Validate(target);
}
If the target object is non-null, then target.GetType() returns the most specific class type, NOT the interface type.
My workaround is to replace your line:
ValidationResults r = Validation.Validate<ISpike>(spike);
With:
ValidationResults r = ValidationFactory.CreateValidator<ISpike>().Validate(spike);
This got it working for me.
