Different Validator For Create and Update in Spring MVC

I am looking for different validation logic for forms when I create and update entities.
For instance, when I create a "UserClass" object it requires an ID to be defined, but when I update it, I do not need the ID again, because it was defined by the user at the creation step. I have lots of entities and I need to find the most appropriate approach.
For instance, is this interface logical?
public interface RecordGroupValidator {
    public void validateNew(RecordGroup recordGroup, Errors errors);
    public void validateUpdate(RecordGroup recordGroup, Errors errors);
}
Validator:
public class RecordGroupValidatorImpl implements RecordGroupValidator {

    @Autowired
    RecordGroupService recordGroupService;

    @Override
    public void validateNew(RecordGroup recordGroup, Errors errors) {
        if (!ValidationHandler.validText(recordGroup.getIds())) {
            errors.rejectValue(ColumnIdentifier.COLUMN.Ids.name(), TextParameters.SERVLET_RESPONSE.InvalidParameter.getText());
        }
        if (!ValidationHandler.validText(recordGroup.getName())) {
            errors.rejectValue(ColumnIdentifier.COLUMN.Name.name(), TextParameters.SERVLET_RESPONSE.InvalidParameter.getText());
        }
        if (recordGroup.getRecordGroupType() == null) {
            errors.rejectValue(ColumnIdentifier.COLUMN.RecordGroupType.name(), TextParameters.SERVLET_RESPONSE.InvalidParameter.getText());
        }
        if (recordGroupService.idsExist(recordGroup.getIds())) {
            errors.rejectValue(ColumnIdentifier.COLUMN.Ids.name(), TextParameters.SERVLET_RESPONSE.DuplicateEntry.getText());
        }
        if (recordGroupService.nameExist(recordGroup.getName())) {
            errors.rejectValue(ColumnIdentifier.COLUMN.Name.name(), TextParameters.SERVLET_RESPONSE.DuplicateEntry.getText());
        }
    }

    @Override
    public void validateUpdate(RecordGroup recordGroup, Errors errors) {
        ValidationUtils.rejectIfEmptyOrWhitespace(errors, ColumnIdentifier.COLUMN.Name.name(), TextParameters.SERVLET_RESPONSE.InvalidParameter.getText());
        if (recordGroup.getRecordGroupType() == null) {
            errors.rejectValue(ColumnIdentifier.COLUMN.Type.name(), TextParameters.SERVLET_RESPONSE.InvalidParameter.getText());
        }
    }
}

I think you should create two validation methods: one for create and one for update. This makes for a cleaner architecture, because for now you have only one difference, but in the future you may have more. In my opinion you should split them now, as in the controller sketch below.
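For illustration, here is a sketch of how a controller could dispatch to the two methods. The controller class, request mappings, and view names are hypothetical, not from the question:
// Hypothetical controller: create() runs the full check (ID + duplicates),
// update() runs the reduced check since the ID was fixed at creation time.
@Controller
@RequestMapping("/recordGroups")
public class RecordGroupController {

    @Autowired
    private RecordGroupValidator validator;

    @RequestMapping(value = "/create", method = RequestMethod.POST)
    public String create(@ModelAttribute RecordGroup recordGroup, BindingResult errors) {
        validator.validateNew(recordGroup, errors);
        if (errors.hasErrors()) {
            return "recordGroup/create";
        }
        // ... persist and redirect
        return "redirect:/recordGroups";
    }

    @RequestMapping(value = "/update", method = RequestMethod.POST)
    public String update(@ModelAttribute RecordGroup recordGroup, BindingResult errors) {
        validator.validateUpdate(recordGroup, errors);
        if (errors.hasErrors()) {
            return "recordGroup/update";
        }
        // ... persist and redirect
        return "redirect:/recordGroups";
    }
}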


Using FluentScheduler - ASP.NET Core MVC

I currently have a simple website set up with ASP.NET Core MVC (.NET 4.6.1), and I would like to periodically run some processes, like automatically sending emails to the registered members at the end of every day.
After doing some searching, I came across two common solutions: Quartz.NET and FluentScheduler.
Based on this SO thread, I found the FluentScheduler approach easier to digest and use for my simple task. After quickly implementing the following lines of code in my Program.cs class, I had the emails going out successfully every minute (for testing purposes).
public class Program
{
    public static void Main(string[] args)
    {
        var host = new WebHostBuilder()
            .UseKestrel()
            .UseContentRoot(Directory.GetCurrentDirectory())
            .UseIISIntegration()
            .UseStartup<Startup>()
            .Build();

        var registry = new Registry();
        JobManager.Initialize(registry);
        JobManager.AddJob(() => MyEmailService.SendEmail(), s => s
            .ToRunEvery(1)
            .Minutes());

        host.Run();
    }
}
However, now apart from sending emails I also need to do some back-end processing, e.g. updating the user records in the DB when the mails are sent out. For this, I normally inject my Entity Framework context into the constructor of my controllers and use it to get/update SQL records.
My question is: since I cannot really inject these services into the Main method, where would be the appropriate place to initialize the registry and add jobs for scheduling?
Thanks for the help, I am a little new to this so a little guidance would be much appreciated!
Instead of Program's Main function, I initialized the same in Startup.cs, before app.UseMvc:
public void Configure(...., IDependencyObject dependencyObject)
{
    ....
    JobManager.Initialize(new MyRegistry(dependencyObject));

    app.UseMvc(routes =>
    {
        routes.MapRoute(
            name: "default",
            template: "api/{controller}/{action}/{id?}");
    });
}
My registry class looks like this:
public class MyRegistry : Registry
{
    public MyRegistry(IDependencyObject dependencyObject)
    {
        Schedule(() => new SyncUpJob(dependencyObject)).ToRunNow().AndEvery(10).Seconds();
    }
}
My Job class looks like this:
public class SyncUpJob : IJob
{
    public SyncUpJob(IDependencyObject dependencyObject)
    {
        DependencyObject = dependencyObject;
    }

    public IDependencyObject DependencyObject { get; set; }

    public void Execute()
    {
        // call the method to run weekly here
    }
}
You can define all your jobs and their schedules by subclassing the FluentScheduler Registry class, something like:
public class JobRegistry : Registry
{
    public JobRegistry()
    {
        Schedule<EmailJob>().ToRunEvery(1).Days();
        Schedule<SomeOtherJob>().ToRunEvery(1).Seconds();
    }
}

public class EmailJob : IJob
{
    public DbContext Context { get; } // we need this dependency, right?!

    public EmailJob(DbContext context) // constructor injection
    {
        Context = context;
    }

    public void Execute()
    {
        // Job implementation code: send emails to users and update the database
    }
}
For injecting dependencies into jobs, you need to implement the FluentScheduler IJobFactory interface. The GetJobInstance method is called by FluentScheduler to create job instances. Here you can use any DI library you want; in this sample implementation, I'm going to assume that you use Ninject:
public class MyNinjectModule : NinjectModule
{
    public override void Load()
    {
        Bind<DbContext>().To<MyDbContextImplementation>();
    }
}

public class JobFactory : IJobFactory
{
    private IKernel Kernel { get; }

    public JobFactory(IKernel kernel)
    {
        Kernel = kernel;
    }

    public IJob GetJobInstance<T>() where T : IJob
    {
        return Kernel.Get<T>();
    }
}
Now you can start your jobs in the Main method by calling:
JobManager.JobFactory = new JobFactory(new StandardKernel(new MyNinjectModule()));
JobManager.Initialize(new JobRegistry());

Read Application Object from GemFire using Spring Data GemFire. Data stored using SpringXD's gemfire-json-server

I'm using the gemfire-json-server module in SpringXD to populate a GemFire grid with JSON representations of "Order" objects. I understand the gemfire-json-server module saves data in PDX form in GemFire. I'd like to read the contents of the GemFire grid back into "Order" objects in my application. I get a ClassCastException that reads:
java.lang.ClassCastException: com.gemstone.gemfire.pdx.internal.PdxInstanceImpl cannot be cast to org.apache.geode.demo.cc.model.Order
I'm using the Spring Data GemFire libraries to read the contents of the cluster. The code snippet to read the contents of the grid follows:
public interface OrderRepository extends GemfireRepository<Order, String> {
    Order findByTransactionId(String transactionId);
}
How can I use Spring Data GemFire to convert data read from the GemFire cluster into an Order object?
Note: The data was initially stored in GemFire using SpringXD's gemfire-json-server-module
Still waiting to hear back from the GemFire PDX engineering team, specifically on Region.get(key), but, interestingly enough, if you annotate your application domain object with...
@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")
public class Order ... {
    ...
}
This works!
Under the hood, I knew the GemFire JSONFormatter class (see here) used Jackson's API to un/marshal (de/serialize) JSON data to and from PDX.
However, orderRepository.findOne(ID) and ordersRegion.get(key) still do not function as I would expect. See the updated test class below for more details.
Will report back again when I have more information.
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = GemFireConfiguration.class)
@SuppressWarnings("unused")
public class JsonToPdxToObjectDataAccessIntegrationTest {

    protected static final AtomicLong ID_SEQUENCE = new AtomicLong(0L);

    private Order amazon;
    private Order bestBuy;
    private Order target;
    private Order walmart;

    @Autowired
    private OrderRepository orderRepository;

    @Resource(name = "Orders")
    private com.gemstone.gemfire.cache.Region<Long, Object> orders;

    protected Order createOrder(String name) {
        return createOrder(ID_SEQUENCE.incrementAndGet(), name);
    }

    protected Order createOrder(Long id, String name) {
        return new Order(id, name);
    }
    protected <T> T fromPdx(Object pdxInstance, Class<T> toType) {
        try {
            if (pdxInstance == null) {
                return null;
            }
            else if (toType.isInstance(pdxInstance)) {
                return toType.cast(pdxInstance);
            }
            else if (pdxInstance instanceof PdxInstance) {
                return new ObjectMapper().readValue(JSONFormatter.toJSON(((PdxInstance) pdxInstance)), toType);
            }
            else {
                throw new IllegalArgumentException(String.format("Expected object of type PdxInstance; but was (%1$s)",
                    pdxInstance.getClass().getName()));
            }
        }
        catch (IOException e) {
            throw new RuntimeException(String.format("Failed to convert PDX to object of type (%1$s)", toType), e);
        }
    }

    protected void log(Object value) {
        System.out.printf("Object of Type (%1$s) has Value (%2$s)", ObjectUtils.nullSafeClassName(value), value);
    }

    protected Order put(Order order) {
        Object existingOrder = orders.putIfAbsent(order.getTransactionId(), toPdx(order));
        return (existingOrder != null ? fromPdx(existingOrder, Order.class) : order);
    }

    protected PdxInstance toPdx(Object obj) {
        try {
            return JSONFormatter.fromJSON(new ObjectMapper().writeValueAsString(obj));
        }
        catch (JsonProcessingException e) {
            throw new RuntimeException(String.format("Failed to convert object (%1$s) to JSON", obj), e);
        }
    }
    @Before
    public void setup() {
        amazon = put(createOrder("Amazon Order"));
        bestBuy = put(createOrder("BestBuy Order"));
        target = put(createOrder("Target Order"));
        walmart = put(createOrder("Wal-Mart Order"));
    }

    @Test
    public void regionGet() {
        assertThat((Order) orders.get(amazon.getTransactionId()), is(equalTo(amazon)));
    }

    @Test
    public void repositoryFindOneMethod() {
        log(orderRepository.findOne(target.getTransactionId()));
        assertThat(orderRepository.findOne(target.getTransactionId()), is(equalTo(target)));
    }

    @Test
    public void repositoryQueryMethod() {
        assertThat(orderRepository.findByTransactionId(amazon.getTransactionId()), is(equalTo(amazon)));
        assertThat(orderRepository.findByTransactionId(bestBuy.getTransactionId()), is(equalTo(bestBuy)));
        assertThat(orderRepository.findByTransactionId(target.getTransactionId()), is(equalTo(target)));
        assertThat(orderRepository.findByTransactionId(walmart.getTransactionId()), is(equalTo(walmart)));
    }
    @Region("Orders")
    @JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")
    public static class Order implements PdxSerializable {

        protected static final OrderPdxSerializer pdxSerializer = new OrderPdxSerializer();

        @Id
        private Long transactionId;

        private String name;

        public Order() {
        }

        public Order(Long transactionId) {
            this.transactionId = transactionId;
        }

        public Order(Long transactionId, String name) {
            this.transactionId = transactionId;
            this.name = name;
        }

        public String getName() {
            return name;
        }

        public void setName(final String name) {
            this.name = name;
        }

        public Long getTransactionId() {
            return transactionId;
        }

        public void setTransactionId(final Long transactionId) {
            this.transactionId = transactionId;
        }

        @Override
        public void fromData(PdxReader reader) {
            Order order = (Order) pdxSerializer.fromData(Order.class, reader);
            if (order != null) {
                this.transactionId = order.getTransactionId();
                this.name = order.getName();
            }
        }

        @Override
        public void toData(PdxWriter writer) {
            pdxSerializer.toData(this, writer);
        }

        @Override
        public boolean equals(Object obj) {
            if (obj == this) {
                return true;
            }
            if (!(obj instanceof Order)) {
                return false;
            }
            Order that = (Order) obj;
            return ObjectUtils.nullSafeEquals(this.getTransactionId(), that.getTransactionId());
        }

        @Override
        public int hashCode() {
            int hashValue = 17;
            hashValue = 37 * hashValue + ObjectUtils.nullSafeHashCode(getTransactionId());
            return hashValue;
        }

        @Override
        public String toString() {
            return String.format("{ @type = %1$s, id = %2$d, name = %3$s }",
                getClass().getName(), getTransactionId(), getName());
        }
    }
    public static class OrderPdxSerializer implements PdxSerializer {

        @Override
        public Object fromData(Class<?> type, PdxReader in) {
            if (Order.class.equals(type)) {
                return new Order(in.readLong("transactionId"), in.readString("name"));
            }
            return null;
        }

        @Override
        public boolean toData(Object obj, PdxWriter out) {
            if (obj instanceof Order) {
                Order order = (Order) obj;
                out.writeLong("transactionId", order.getTransactionId());
                out.writeString("name", order.getName());
                return true;
            }
            return false;
        }
    }

    public interface OrderRepository extends GemfireRepository<Order, Long> {
        Order findByTransactionId(Long transactionId);
    }
    @Configuration
    protected static class GemFireConfiguration {

        @Bean
        public Properties gemfireProperties() {
            Properties gemfireProperties = new Properties();
            gemfireProperties.setProperty("name", JsonToPdxToObjectDataAccessIntegrationTest.class.getSimpleName());
            gemfireProperties.setProperty("mcast-port", "0");
            gemfireProperties.setProperty("log-level", "warning");
            return gemfireProperties;
        }

        @Bean
        public CacheFactoryBean gemfireCache(Properties gemfireProperties) {
            CacheFactoryBean cacheFactoryBean = new CacheFactoryBean();
            cacheFactoryBean.setProperties(gemfireProperties);
            //cacheFactoryBean.setPdxSerializer(new MappingPdxSerializer());
            cacheFactoryBean.setPdxSerializer(new OrderPdxSerializer());
            cacheFactoryBean.setPdxReadSerialized(false);
            return cacheFactoryBean;
        }

        @Bean(name = "Orders")
        public PartitionedRegionFactoryBean ordersRegion(Cache gemfireCache) {
            PartitionedRegionFactoryBean regionFactoryBean = new PartitionedRegionFactoryBean();
            regionFactoryBean.setCache(gemfireCache);
            regionFactoryBean.setName("Orders");
            regionFactoryBean.setPersistent(false);
            return regionFactoryBean;
        }

        @Bean
        public GemfireRepositoryFactoryBean orderRepository() {
            GemfireRepositoryFactoryBean<OrderRepository, Order, Long> repositoryFactoryBean =
                new GemfireRepositoryFactoryBean<>();
            repositoryFactoryBean.setRepositoryInterface(OrderRepository.class);
            return repositoryFactoryBean;
        }
    }
}
So, as you are aware, GemFire (and by extension, Apache Geode) stores JSON in PDX format (as a PdxInstance). This is so GemFire can interoperate with many different language-based clients (native C++/C#, web-oriented languages (JavaScript, Python, Ruby, etc.) using the Developer REST API, in addition to Java) and also to be able to use OQL to query the JSON data.
After a bit of experimentation, I am surprised GemFire is not behaving as I would expect. I created an example, self-contained test class (i.e. no Spring XD, of course) that simulates your use case: essentially storing JSON data in GemFire as PDX and then attempting to read the data back out as the Order application domain object type using the Repository abstraction, logical enough.
Given the use of the Repository abstraction and implementation from Spring Data GemFire, the infrastructure will attempt to access the application domain object based on the Repository generic type parameter (in this case "Order" from the "OrderRepository" definition).
However, the data is stored in PDX, so now what?
No matter, Spring Data GemFire provides the MappingPdxSerializer class to convert PDX instances back to application domain objects using the same "mapping meta-data" that the Repository infrastructure uses. Cool, so I plug that in...
@Bean
public CacheFactoryBean gemfireCache(Properties gemfireProperties) {
    CacheFactoryBean cacheFactoryBean = new CacheFactoryBean();
    cacheFactoryBean.setProperties(gemfireProperties);
    cacheFactoryBean.setPdxSerializer(new MappingPdxSerializer());
    cacheFactoryBean.setPdxReadSerialized(false);
    return cacheFactoryBean;
}
You will also notice that I set the PDX 'read-serialized' property (cacheFactoryBean.setPdxReadSerialized(false);) to false in order to ensure data access operations return the domain object and not the PDX instance.
However, this had no effect on the query method. In fact, it had no effect on the following operations either...
orderRepository.findOne(amazonOrder.getTransactionId());
ordersRegion.get(amazonOrder.getTransactionId());
Both calls returned a PdxInstance. Note, the implementation of OrderRepository.findOne(..) is based on SimpleGemfireRepository.findOne(key), which uses GemfireTemplate.get(key), which just performs Region.get(key), and so is effectively the same as ordersRegion.get(amazonOrder.getTransactionId()). This outcome should not occur, especially with Region.get() and read-serialized set to false.
With the OQL query (SELECT * FROM /Orders WHERE transactionId = $1) generated from findByTransactionId(String id), the Repository infrastructure has a bit less control over what the GemFire query engine will return relative to what the caller (OrderRepository) expects (based on the generic type parameter), so running OQL statements could potentially behave differently from direct Region access using get.
Next, I went on to try modifying the Order type to implement PdxSerializable, to handle the conversion during data access operations (direct Region access with get, OQL, or otherwise). This had no effect.
So, I tried to implement a custom PdxSerializer for Order objects. This had no effect either.
The only thing I can conclude at this point is that something is getting lost in translation between Order -> JSON -> PDX and then from PDX -> Order. Seemingly, GemFire needs the additional type meta-data required by PDX (something like @JsonTypeInfo(use = JsonTypeInfo.Id.CLASS, include = JsonTypeInfo.As.PROPERTY, property = "@type")) in the JSON data that the JSONFormatter recognizes, though I am not certain it does.
Note, in my test class, I used Jackson's ObjectMapper to serialize the Order to JSON and then GemFire's JSONFormatter to serialize the JSON to PDX, which I suspect Spring XD is doing similarly under-the-hood. In fact, Spring XD uses Spring Data GemFire and is most likely using the JSON Region Auto Proxy support. That is exactly what SDG's JSONRegionAdvice object does (see here).
Anyway, I have an inquiry out to the rest of the GemFire engineering team. There are also things that could be done in Spring Data GemFire to ensure the PDX data is converted, such as making use of the MappingPdxSerializer directly to convert the data automatically on behalf of the caller if the data is indeed of type PdxInstance. Similar to how JSON Region Auto Proxying works, you could write an AOP interceptor for the Orders Region to automagically convert PDX to an Order; a rough sketch of that idea follows.
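As a minimal sketch of that interceptor idea (assuming the Orders Region is exposed as a Spring-proxied bean so the advice can apply; the aspect name, pointcut, and hard-coded Order target type are illustrative assumptions, not SDG's actual JSON Region Auto Proxy implementation):
import com.fasterxml.jackson.databind.ObjectMapper;
import com.gemstone.gemfire.pdx.JSONFormatter;
import com.gemstone.gemfire.pdx.PdxInstance;
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;

// Illustrative only: convert a PdxInstance returned by Region.get(..) back into an
// Order using the same JSONFormatter/Jackson round-trip as the test class above.
@Aspect
public class PdxToOrderAspect {

    private final ObjectMapper objectMapper = new ObjectMapper();

    @Around("execution(* com.gemstone.gemfire.cache.Region.get(..))")
    public Object convertPdx(ProceedingJoinPoint joinPoint) throws Throwable {
        Object result = joinPoint.proceed();
        if (result instanceof PdxInstance) {
            // JSONFormatter.toJSON renders the PdxInstance as JSON; Jackson maps it to Order
            return objectMapper.readValue(JSONFormatter.toJSON((PdxInstance) result), Order.class);
        }
        return result;
    }
}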
Though, I don't think any of this should be necessary as GemFire should be doing the right thing in this case. Sorry I don't have a better answer right now. Let's see what I find out.
Cheers and stay tuned!
See subsequent post for test code.

Spring Security: AuthenticationFailureCredentialsExpiredEvent not fired

I am using spring.security.version = 3.1.0.RELEASE. The problem I am having is that, for some reason, the AuthenticationFailureCredentialsExpiredEvent is not fired.
While debugging the code I found that AbstractUserDetailsAuthenticationProvider does display "User account credentials have expired" in the console. But I am still baffled as to why the event in question is not triggered.
Here is my code:
class JpaUserDetails implements UserDetails {
    ...
    ...
    @Override
    public boolean isCredentialsNonExpired() {
        if (some logic) {
            return true;
        }
        else {
            return false;
        }
    }
}
}
I do see AbstractUserDetailsAuthenticationProvider displaying "User account credentials have expired" in the console, from the following lines of Spring code:
public abstract class AbstractUserDetailsAuthenticationProvider implements AuthenticationProvider, InitializingBean, MessageSourceAware {
    ...
    ...
    private class DefaultPostAuthenticationChecks implements UserDetailsChecker {
        public void check(UserDetails user) {
            if (!user.isCredentialsNonExpired()) {
                logger.debug("User account credentials have expired");
                throw new CredentialsExpiredException(messages.getMessage(
                    "AbstractUserDetailsAuthenticationProvider.credentialsExpired",
                    "User credentials have expired"), user);
            }
        }
    }
}
The issue is that when the user credentials have expired, I am expecting Spring to generate the AuthenticationFailureCredentialsExpiredEvent, which I am handling in the following way:
class SecurityEventDispatcher implements ApplicationListener<ApplicationEvent> {

    final List<SecurityEventListener> listeners = new ArrayList<SecurityEventListener>();

    public void registerListener(SecurityEventListener listener) {
        this.listeners.add(listener);
    }

    public void onApplicationEvent(ApplicationEvent event) {
        for (SecurityEventListener listener : this.listeners) {
            if (listener.canHandle(event)) {
                listener.handle(event);
            }
        }
    }
}
This is how I am handling the login failure event:
public class LoginFailedEvent extends SecurityEventListener {

    @Override
    public boolean canHandle(Object event) {
        if (event instanceof AbstractAuthenticationFailureEvent) {
            return true;
        }
        else {
            return false;
        }
    }

    @Override
    public void handle(Object event) {
        if (event instanceof AuthenticationFailureBadCredentialsEvent) {
            // do something
        }
        if (event instanceof AuthenticationFailureCredentialsExpiredEvent) {
            // do something
        }
    }
}
The issue, as I mentioned before, is that the AuthenticationFailureCredentialsExpiredEvent is never fired. I have tested the AuthenticationFailureBadCredentialsEvent, which works fine.
This is what I get in the event for bad credentials (which is working fine):
org.springframework.security.authentication.event.AuthenticationFailureBadCredentialsEvent
This is what I get in the event for an expired password:
ServletRequestHandledEvent: url=[/app/loginFailure] with failureCause = null
Does anyone have any idea what could be wrong? Any help will be highly appreciated.
Here is the answer to this question, since there isn't much literature out there regarding the issue.
You probably need to set the ProviderManager's eventPublisher to be something other than NullEventPublisher. There is not a simple way to do this via the security namespace tag, so you will want to create the AuthenticationProvider using standard beans configuration and inject it into a standard Spring Bean for the ProviderManager.
-- Rob Winch, Spring Security Lead
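A minimal sketch of that wiring in Java config, for anyone who prefers it over XML (the daoAuthenticationProvider bean is an assumption; substitute your actual AuthenticationProvider):
// Replaces ProviderManager's default NullEventPublisher so failure events,
// including AuthenticationFailureCredentialsExpiredEvent, actually get published.
@Bean
public AuthenticationManager authenticationManager(ApplicationEventPublisher applicationEventPublisher,
        AuthenticationProvider daoAuthenticationProvider) {
    ProviderManager providerManager = new ProviderManager(Arrays.asList(daoAuthenticationProvider));
    providerManager.setAuthenticationEventPublisher(
        new DefaultAuthenticationEventPublisher(applicationEventPublisher));
    return providerManager;
}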
If anyone is running into this issue, just upgrade Spring Security to 3.1.2 or later; the issue is fixed.
(Applies to Spring Security 5)
Spring's default publisher is the NullEventPublisher, which effectively publishes nothing. To get events, please configure a DefaultAuthenticationEventPublisher:
import org.springframework.security.authentication.DefaultAuthenticationEventPublisher;

@Bean
public DefaultAuthenticationEventPublisher defaultAuthenticationEventPublisher() {
    return new DefaultAuthenticationEventPublisher();
}
Now the events are published and one can consume them like any other event:
@EventListener
public void logAuditEvents(AbstractAuthenticationEvent event) {
    ...
}
and
@EventListener
public void logAuditEvents(AbstractAuthorizationEvent event) {
    ...
}

How can I access previously set fields in a seam #Observer method?

My current setup is JBoss Seam 2.2 on JBoss 4.2.3.GA.
I have two Beans like so:
@Name("mailingManager")
@Scope(ScopeType.PAGE)
public class MailingMgr {

    private Mailing selectedMailing;

    @Observer("mailing.letter.success")
    public void recordSuccess(final Object arg) {
        if (null != selectedMailing) {
            // store arg
        }
    }

    public void send() {
        selectedMailing = new Mailing();
        if ("EMAIL".equals(determineType())) {
            EmailSender mailer = (EmailSender) Component.getInstance(EmailSender.class);
            mailer.send(getAddresses());
        }
        // ... more options
    }
}

@Name("emailSender")
@Scope(ScopeType.PAGE)
public class EmailSender {

    public void send(final Set<String> addresses) {
        for (String addr : addresses) {
            // ... create a mail
            Events.instance().raiseEvent("mailing.letter.success", getGeneratedMail());
        }
    }
}
The problem is that when recordSuccess() is called, selectedMailing is always null.
As a workaround I'm setting selectedMailing in the conversation context manually before calling any code that could potentially trigger my events, and then annotating my field with @In(required = false) to inject it again before recordSuccess is called. But is there a more elegant solution (keeping the decoupling intact)? And why isn't the calling bean reused to handle the event?

EclipseLink converts Enum to BigDecimal

I am trying to convert an Enum into a BigDecimal using an EclipseLink Converter. The conversion works, but the resulting database column has type String. Is it possible to set a parameter so that EclipseLink builds a decimal column type in the database?
I use a class which implements org.eclipse.persistence.mappings.converters.Converter.
The application server logs:
The default table generator could not locate or convert a java type (null) into a database type for database field (xyz). The generator uses java.lang.String as default java type for the field.
This message is generated for every field that uses a converter. How can I define a specific database type for these fields?
public enum IndirectCosts {
    EXTENDED {
        public BigDecimal getPercent() {
            return new BigDecimal("25.0");
        }
    },
    NORMAL {
        public BigDecimal getPercent() {
            return new BigDecimal("12.0");
        }
    },
    NONE {
        public BigDecimal getPercent() {
            return new BigDecimal("0.0");
        }
    };

    public abstract BigDecimal getPercent();

    public static IndirectCosts getType(BigDecimal percent) {
        for (IndirectCosts v : IndirectCosts.values()) {
            if (v.getPercent().compareTo(percent) == 0) {
                return v;
            }
        }
        throw new IllegalArgumentException();
    }
}
The database has to store the numeric values. I use the following converter:
public class IndirectCostsConverter implements Converter {

    @Override
    public Object convertObjectValueToDataValue(Object objectValue, Session session) {
        if (objectValue == null) {
            return objectValue;
        } else if (objectValue instanceof IndirectCosts) {
            return ((IndirectCosts) objectValue).getPercent();
        }
        throw new TypeMismatchException(objectValue, IndirectCosts.class);
    }

    @Override
    public Object convertDataValueToObjectValue(Object dataValue, Session session) {
        if (dataValue == null) {
            return dataValue;
        } else if (dataValue instanceof String) {
            return IndirectCosts.getType(new BigDecimal((String) dataValue));
        }
        throw new TypeMismatchException(dataValue, BigDecimal.class);
    }

    @Override
    public boolean isMutable() {
        return false;
    }

    @Override
    public void initialize(DatabaseMapping databaseMapping, Session session) {
    }
}
Within convertDataValueToObjectValue(..) I have to use String because the SQL generator defines the database column as varchar(255). I would like to have decimal(15,2) or something similar.
Thanks a lot
Andre
The EclipseLink Converter interface defines an initialize(DatabaseMapping mapping, Session session) method that you can use to set the type to use for the field. Someone else posted an example showing how to get the field from the mapping here: Using UUID with EclipseLink and PostgreSQL
The DatabaseField's columnDefinition, if set, will be the only thing used to define the type for DDL generation, so set it carefully. The other settings (not null, nullable, etc.) will only be used if the columnDefinition is left unset.
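Adapted to this question, the initialize(..) override in IndirectCostsConverter might look like the sketch below (the DECIMAL(15,2) column definition is an assumption; check it against your database platform):
// Sketch: make EclipseLink's DDL generator emit a DECIMAL column for converted fields.
// DatabaseField lives in org.eclipse.persistence.internal.helper.
@Override
public void initialize(DatabaseMapping databaseMapping, Session session) {
    DatabaseField field = databaseMapping.getField();
    field.setSqlType(java.sql.Types.DECIMAL);
    field.setColumnDefinition("DECIMAL(15,2)");
}
Note that once the column is numeric, the JDBC driver will likely hand convertDataValueToObjectValue a BigDecimal rather than a String, so the instanceof String branch would then need to check for BigDecimal instead.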
