Spring Boot 2 + Spring Data JPA: transactions not rolling back

I have a Spring Boot 2 project with Spring Data JPA for persistence.
I have a manager with a method that internally calls three different save methods. Each save is located in a service annotated as @Transactional at class level. The manager's method is also annotated as @Transactional.
The DB is PostgreSQL.
The manager's method is:
@Override
@Transactional
public void saveUserComplete(CustomUserDetail user, UserAnag anag, List<UserRolePermission> rolePermissions) {
this.userDetailsService.save(user);
this.userAnagService.save(anag);
this.userRolePermissionService.saveAll(rolePermissions);
}
The three service methods called inside are:
public void save(CustomUserDetail user) {
Calendar c = Calendar.getInstance();
Date now = c.getTime();
UserEntity entity = null;
if(null != user.getId()){
Optional<UserEntity> entityOpt = this.userRepository.findById(user.getId());
if(entityOpt.isPresent()){
entity = entityOpt.get();
entity.setUpdatedAt(now);
//entity.setUpdateUser();
}
}
else{
entity = new UserEntity();
entity.setCreatedAt(now);
//entity.setCreatUser();
}
entity.fillEntityFromPojo(user);
if(null != entity) { this.userRepository.save(entity); }
}
the first method called,
public void save(UserAnag userAnag){
Calendar c = Calendar.getInstance();
Date now = c.getTime();
UserAnagEntity entity = null;
if (null != userAnag.getUsername()) {
Optional<UserAnagEntity> entityOpt = this.userAnagRepository.findByUsername(userAnag.getUsername());
if (entityOpt.isPresent()) {
entity = entityOpt.get();
entity.setUpdatedAt(now);
//entity.setUpdateUser();
} else {
entity = new UserAnagEntity();
entity.setCreatedAt(now);
//entity.setCreatUser();
}
entity.fillEntityFromPojo(userAnag);
}
if (null != entity) {
this.userAnagRepository.save(entity);
}
}
the second method called
public void saveAll(List<UserRolePermission> authorities) {
Calendar c = Calendar.getInstance();
Date now = c.getTime();
List<UserRolePermissionEntity> entities = Lists.newArrayListWithExpectedSize(authorities.size());
for (UserRolePermission urp : authorities) {
UserRolePermissionEntity entity = null;
if (null != urp.getUsername()) {
Optional<UserRolePermissionEntity> entityOpt = this.userRolePermissionRepository.findByUsernameAndRolenameAndPermission(urp.getUsername(), urp.getRolename(), urp.getPermission());
if (entityOpt.isPresent()) {
entity = entityOpt.get();
entity.fillEntityFromPojo(urp);
entity.setUpdatedAt(now);
// entity.setUpdateUser();
} else {
entity = new UserRolePermissionEntity();
entity.fillEntityFromPojo(urp);
entity.setCreatedAt(now);
// entity.setCreatUser();
}
entities.add(entity);
}
}
if (!entities.isEmpty()) {
String username = entities.get(0).getUsername();
this.userRolePermissionRepository.deleteByUsername(username);
this.userRolePermissionRepository.flush();
this.userRolePermissionRepository.saveAll(entities);
}
}
the last one.
What I expect is that if one of the save methods fails, the others are rolled back; instead, if one fails the others are still performed, so I end up with only partial data saved in the DB.
What's wrong in my code?
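One general Spring fact worth checking in situations like this (stated as background, not necessarily the root cause here): with proxy-based @Transactional, rollback happens automatically only for unchecked exceptions, and self-invocation bypasses the proxy entirely. The default rollback rule can be simulated in plain Java, with no Spring involved:

```java
// Simulation of Spring's default rollback rule for @Transactional:
// by default only RuntimeException and Error trigger a rollback;
// checked exceptions commit unless rollbackFor is configured.
public class RollbackRuleDemo {

    // Mirrors the behavior of the default transaction attribute's rollback check.
    static boolean rollsBackByDefault(Throwable ex) {
        return (ex instanceof RuntimeException) || (ex instanceof Error);
    }

    public static void main(String[] args) {
        // unchecked -> rolls back
        System.out.println(rollsBackByDefault(new IllegalStateException("boom")));
        // checked -> does NOT roll back by default
        System.out.println(rollsBackByDefault(new java.sql.SQLException("boom")));
    }
}
```

If the failing save throws a checked exception, `@Transactional(rollbackFor = Exception.class)` changes this rule; if one service calls another through `this` rather than through the injected bean, the proxy (and thus the transaction) is skipped.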

Related

transactional unit testing with ObjectifyService - no rollback happening

We are trying to use Google Cloud Datastore in our project, with Objectify as the ORM since Google recommends it. I have carefully tried everything I could read about and think of, but somehow the transactions don't seem to work. Following is my code and setup.
@RunWith(SpringRunner.class)
@EnableAspectJAutoProxy(proxyTargetClass = true)
@ContextConfiguration(classes = { CoreTestConfiguration.class })
public class TestObjectifyTransactionAspect {
private final LocalServiceTestHelper helper = new LocalServiceTestHelper(
// Our tests assume strong consistency
new LocalDatastoreServiceTestConfig().setApplyAllHighRepJobPolicy(),
new LocalMemcacheServiceTestConfig(), new LocalTaskQueueTestConfig());
private Closeable closeableSession;
@Autowired
private DummyService dummyService;
@BeforeClass
public static void setUpBeforeClass() {
// Reset the Factory so that all translators work properly.
ObjectifyService.setFactory(new ObjectifyFactory());
}
/**
 * @throws java.lang.Exception
 */
@Before
public void setUp() throws Exception {
System.setProperty("DATASTORE_EMULATOR_HOST", "localhost:8081");
ObjectifyService.register(UserEntity.class);
this.closeableSession = ObjectifyService.begin();
this.helper.setUp();
}
/**
 * @throws java.lang.Exception
 */
@After
public void tearDown() throws Exception {
AsyncCacheFilter.complete();
this.closeableSession.close();
this.helper.tearDown();
}
@Test
public void testTransactionMutationRollback() {
// save initial list of users
List<UserEntity> users = new ArrayList<UserEntity>();
for (int i = 1; i <= 10; i++) {
UserEntity user = new UserEntity();
user.setAge(i);
user.setUsername("username_" + i);
users.add(user);
}
ObjectifyService.ofy().save().entities(users).now();
try {
dummyService.mutateDataWithException("username_1", 6L);
} catch (Exception e) {
e.printStackTrace();
}
List<UserEntity> users2 = this.dummyService.findAllUsers();
Assert.assertEquals("Size mismatch on rollback", users2.size(), 10);
boolean foundUserIdSix = false;
for (UserEntity userEntity : users2) {
if (userEntity.getUserId() == 1) {
Assert.assertEquals("Username update failed in transactional context rollback.", "username_1",
userEntity.getUsername());
}
if (userEntity.getUserId() == 6) {
foundUserIdSix = true;
}
}
if (!foundUserIdSix) {
Assert.fail("Deleted user with userId 6 but it is not rolledback.");
}
}
}
Since I am using Spring, the idea is to use an aspect with a custom annotation to weave Objectify's transact around the Spring service bean methods that call my DAOs.
But somehow the update from ObjectifyService.ofy().save().entities(users).now(); is not getting rolled back, even though the thrown exception causes Objectify to run its rollback code. I tried printing the ObjectifyImpl instance hashcodes and they are all the same, but it still doesn't roll back.
Can someone help me understand what I am doing wrong? I haven't tried the actual web-based setup yet; if it can't pass transactional test cases, there is no point in actual transaction usage in a web request scenario.
Update: adding the aspect, services, and DAO as well to give the complete picture. The code uses Spring Boot.
DAO class. Note I am not using any transactions here because, per the code of com.googlecode.objectify.impl.TransactorNo.transactOnce(ObjectifyImpl&lt;O&gt;, Work&lt;R&gt;), a transactional ObjectifyImpl is flushed and committed in that method, which I don't want. I want the commit to happen once, with everything else joining in on that transaction. Basically this is the problematic code in com.googlecode.objectify.impl.TransactorNo; I will try to explain my understanding later in the question.
@Component
public class DummyDaoImpl implements DummyDao {
@Override
public List<UserEntity> loadAll() {
Query<UserEntity> query = ObjectifyService.ofy().transactionless().load().type(UserEntity.class);
return query.list();
}
@Override
public List<UserEntity> findByUserId(Long userId) {
Query<UserEntity> query = ObjectifyService.ofy().transactionless().load().type(UserEntity.class);
//query = query.filterKey(Key.create(UserEntity.class, userId));
return query.list();
}
@Override
public List<UserEntity> findByUsername(String username) {
return ObjectifyService.ofy().transactionless().load().type(UserEntity.class).filter("username", username).list();
}
@Override
public void update(UserEntity userEntity) {
ObjectifyService.ofy().save().entity(userEntity);
}
@Override
public void update(Iterable<UserEntity> userEntities) {
ObjectifyService.ofy().save().entities(userEntities);
}
@Override
public void delete(Long userId) {
ObjectifyService.ofy().delete().key(Key.create(UserEntity.class, userId));
}
}
Below is the Service class
@Service
public class DummyServiceImpl implements DummyService {
private static final Logger LOGGER = LoggerFactory.getLogger(DummyServiceImpl.class);
@Autowired
private DummyDao dummyDao;
public void saveDummydata() {
List<UserEntity> users = new ArrayList<UserEntity>();
for (int i = 1; i <= 10; i++) {
UserEntity user = new UserEntity();
user.setAge(i);
user.setUsername("username_" + i);
users.add(user);
}
this.dummyDao.update(users);
}
/* (non-Javadoc)
 * @see com.bbb.core.objectify.test.services.DummyService#mutateDataWithException(java.lang.String, java.lang.Long)
 */
@Override
@ObjectifyTransactional
public void mutateDataWithException(String usernameToMutate, Long userIdToDelete) throws Exception {
//update one
LOGGER.info("Attempting to update UserEntity with username={}", "username_1");
List<UserEntity> mutatedUsersList = new ArrayList<UserEntity>();
List<UserEntity> users = dummyDao.findByUsername(usernameToMutate);
for (UserEntity userEntity : users) {
userEntity.setUsername(userEntity.getUsername() + "_updated");
mutatedUsersList.add(userEntity);
}
dummyDao.update(mutatedUsersList);
//delete another
UserEntity user = dummyDao.findByUserId(userIdToDelete).get(0);
LOGGER.info("Attempting to delete UserEntity with userId={}", user.getUserId());
dummyDao.delete(user.getUserId());
throw new RuntimeException("Dummy Exception");
}
/* (non-Javadoc)
 * @see com.bbb.core.objectify.test.services.DummyService#findAllUsers()
 */
@Override
public List<UserEntity> findAllUsers() {
return dummyDao.loadAll();
}
}
The aspect, which wraps any method annotated with @ObjectifyTransactional as transactional work:
@Aspect
@Component
public class ObjectifyTransactionAspect {
private static final Logger LOGGER = LoggerFactory.getLogger(ObjectifyTransactionAspect.class);
@Around(value = "execution(* *(..)) && @annotation(objectifyTransactional)")
public Object objectifyTransactAdvise(final ProceedingJoinPoint pjp, ObjectifyTransactional objectifyTransactional) throws Throwable {
try {
Object result = null;
Work<Object> work = new Work<Object>() {
@Override
public Object run() {
try {
return pjp.proceed();
} catch (Throwable throwable) {
throw new ObjectifyTransactionExceptionWrapper(throwable);
}
}
};
switch (objectifyTransactional.propagation()) {
case REQUIRES_NEW:
int limitTries = objectifyTransactional.limitTries();
if(limitTries <= 0) {
Exception illegalStateException = new IllegalStateException("limitTries must be more than 0.");
throw new ObjectifyTransactionExceptionWrapper(illegalStateException);
} else {
if(limitTries == Integer.MAX_VALUE) {
result = ObjectifyService.ofy().transactNew(work);
} else {
result = ObjectifyService.ofy().transactNew(limitTries, work);
}
}
break;
case NOT_SUPPORTED :
case NEVER :
case MANDATORY :
result = ObjectifyService.ofy().execute(objectifyTransactional.propagation(), work);
break;
case REQUIRED :
case SUPPORTS :
ObjectifyService.ofy().transact(work);
break;
default:
break;
}
return result;
} catch (ObjectifyTransactionExceptionWrapper e) {
String packageName = pjp.getSignature().getDeclaringTypeName();
String methodName = pjp.getSignature().getName();
LOGGER.error("An exception occurred while executing [{}.{}] in a transactional context."
, packageName, methodName, e);
throw e.getCause();
} catch (Throwable ex) {
String packageName = pjp.getSignature().getDeclaringTypeName();
String methodName = pjp.getSignature().getName();
String fullyQualifiedmethodName = packageName + "." + methodName;
throw new RuntimeException("Unexpected exception while executing ["
+ fullyQualifiedmethodName + "] in a transactional context.", ex);
}
}
}
Now, the problematic code as I see it is the following, in com.googlecode.objectify.impl.TransactorNo:
@Override
public <R> R transact(ObjectifyImpl<O> parent, Work<R> work) {
return this.transactNew(parent, Integer.MAX_VALUE, work);
}
@Override
public <R> R transactNew(ObjectifyImpl<O> parent, int limitTries, Work<R> work) {
Preconditions.checkArgument(limitTries >= 1);
while (true) {
try {
return transactOnce(parent, work);
} catch (ConcurrentModificationException ex) {
if (--limitTries > 0) {
if (log.isLoggable(Level.WARNING))
log.warning("Optimistic concurrency failure for " + work + " (retrying): " + ex);
if (log.isLoggable(Level.FINEST))
log.log(Level.FINEST, "Details of optimistic concurrency failure", ex);
} else {
throw ex;
}
}
}
}
private <R> R transactOnce(ObjectifyImpl<O> parent, Work<R> work) {
ObjectifyImpl<O> txnOfy = startTransaction(parent);
ObjectifyService.push(txnOfy);
boolean committedSuccessfully = false;
try {
R result = work.run();
txnOfy.flush();
txnOfy.getTransaction().commit();
committedSuccessfully = true;
return result;
}
finally
{
if (txnOfy.getTransaction().isActive()) {
try {
txnOfy.getTransaction().rollback();
} catch (RuntimeException ex) {
log.log(Level.SEVERE, "Rollback failed, suppressing error", ex);
}
}
ObjectifyService.pop();
if (committedSuccessfully) {
txnOfy.getTransaction().runCommitListeners();
}
}
}
transactOnce is, by code/design, always using a single transaction. It will either commit or roll back that transaction; there is no provision to chain transactions the way a normal enterprise app would want: a service calls multiple DAO methods in a single transaction and commits or rolls back depending on how things look.
Keeping this in mind, I removed all annotations and transact method calls from my DAO methods so that they don't start an explicit transaction; the aspect on the service wraps the service method in transact and ultimately in transactOnce. So basically the service method runs in a transaction and no new transaction is fired again. This is a very basic scenario; in actual production apps, services can call other service methods that might carry the annotation, and we could still end up in a chained transaction. But anyway, that is a different problem to solve.
I know NoSQL stores don't support write consistency at table or inter-table level, so am I asking too much from Google Cloud Datastore?
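The retry loop in transactNew quoted above can be simulated framework-free; in this sketch a `Supplier` stands in for Objectify's `Work`, and `ConcurrentModificationException` plays the role of a Datastore contention failure:

```java
import java.util.ConcurrentModificationException;
import java.util.function.Supplier;

// Framework-free sketch of the transactNew retry loop quoted above:
// re-run the unit of work on optimistic-concurrency failures, up to limitTries.
public class RetryDemo {

    static <R> R transactNew(int limitTries, Supplier<R> work) {
        if (limitTries < 1) throw new IllegalArgumentException("limitTries must be >= 1");
        while (true) {
            try {
                return work.get(); // stands in for transactOnce(parent, work)
            } catch (ConcurrentModificationException ex) {
                if (--limitTries > 0) {
                    // a real implementation logs the contention failure and retries
                } else {
                    throw ex; // retries exhausted, propagate
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] attempts = {0};
        // Fails twice with contention, then succeeds on the third attempt.
        String result = transactNew(5, () -> {
            if (++attempts[0] < 3) throw new ConcurrentModificationException();
            return "committed";
        });
        System.out.println(result + " after " + attempts[0] + " attempts");
    }
}
```

Note the key property the question is circling: each invocation of the work is one complete transaction attempt; there is no mechanism here for an outer caller to join an attempt already in flight.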

What happens internally for an auto-increment field, on the database side and the application side?

I have a bean in a Spring application:
@Entity
@Table(name="tbl_apply_leave")
public class ApplyLeaveModel{
@Id
@GeneratedValue(strategy=GenerationType.AUTO)
private int dbid;
private String employee_name;
public int getDbid() {
return dbid;
}
public void setDbid(int dbid) {
this.dbid = dbid;
}
public String getEmployee_name() {
return employee_name;
}
public void setEmployee_name(String employee_name) {
this.employee_name = employee_name;
}
}
Then in the controller, before saving, the value of dbid is null, but after saving the object using Hibernate, dbid contains a value.
I don't know what's happening. The controller code is:
System.out.println(applyLeaveModel.getDbid()); // null
leave_dao.saveApplyLeaveModel(applyLeaveModel);
System.out.println(applyLeaveModel.getDbid()); // 5
public void saveApplyLeaveModel(ApplyLeaveModel applyLeaveModel) {
Session session = null;
Transaction trans_obj = null;
try {
session = sessionFactory.openSession();
if (session.isOpen() && session != null) {
trans_obj = session.beginTransaction();
session.persist(applyLeaveModel);
}
} catch (Exception e) {
System.out.println("save ApplyLeaveModel session " + e);
} finally {
trans_obj.commit();
session.close();
}
}
Hibernate executes PreparedStatement.getGeneratedKeys() to obtain the generated id after the insert.
Also related: hibernate.jdbc.use_get_generated_keys.
Side note: primitive types in Java cannot hold null. You probably mean 0, or the type of the id is Integer.
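The observable effect (the id field of the very same object is filled in during save) can be simulated without Hibernate or a database; the DAO below is a hypothetical stand-in for what an auto-increment column plus getGeneratedKeys() produces:

```java
import java.util.concurrent.atomic.AtomicInteger;

// Hypothetical stand-in for an auto-increment column + getGeneratedKeys():
// the store assigns the id during save and writes it back into the entity,
// which is why the same object reference shows a value afterwards.
public class GeneratedKeyDemo {

    static class ApplyLeaveModel {
        Integer dbid;          // Integer, not int, so "no id yet" can be null
        String employeeName;
    }

    static class InMemoryDao {
        private final AtomicInteger sequence = new AtomicInteger(0);

        void save(ApplyLeaveModel m) {
            if (m.dbid == null) {
                m.dbid = sequence.incrementAndGet(); // DB generates the key...
            }                                        // ...and the ORM copies it back
        }
    }

    public static void main(String[] args) {
        InMemoryDao dao = new InMemoryDao();
        ApplyLeaveModel model = new ApplyLeaveModel();
        System.out.println(model.dbid); // null before save
        dao.save(model);
        System.out.println(model.dbid); // 1 after save
    }
}
```

With a real database the sequence lives in the DB (identity column or sequence), and Hibernate performs the write-back after reading the generated key from the JDBC driver.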

Initial data on JPA repositories

I'm looking for a convenient way to provide initial data for my application. Currently I've implemented a Spring Data JPA based project, which is the foundation of all my database-related operations.
Example:
I've got an entity Role which can be assigned to the entity User. On a clean application start I would like to directly provide some default roles (e.g. admin, manager, etc.).
I built a random data factory:
public class RandomDataFactory {
private static final String UNGENERATED_VALUE_MARKER = "UNGENERATED_VALUE_MARKER";
private static void randomlyPopulateFields(Object object) {
new RandomValueFieldPopulator().populate(object);
}
/**
* Instantiates a single object with random data
*/
public static <T> T getSingle(Class<T> clazz) throws IllegalAccessException, InstantiationException {
T object = clazz.newInstance();
randomlyPopulateFields(object);
return object;
}
/**
 * Returns an unmodifiable list of specified type objects with random data
 *
 * @param clazz the myPojo.class to be instantiated with random data
 * @param maxLength the length of list to be returned
 */
public static <T> List<T> getList(Class<T> clazz, int maxLength) throws IllegalAccessException, InstantiationException {
List<T> list = new ArrayList<T>(maxLength);
for (int i = 0; i < maxLength; i++) {
T object = clazz.newInstance();
randomlyPopulateFields(object);
list.add(i, object);
}
return Collections.unmodifiableList(list);
}
/**
 * Returns an unmodifiable list of specified type T objects with random data
 * <p>List length will be 3</p>
 *
 * @param clazz the myPojo.class to be instantiated with random data
 */
public static <T> List<T> getList(Class<T> clazz) throws InstantiationException, IllegalAccessException {
return getList(clazz, 3);
}
public static <T> T getPrimitive(Class<T> clazz) {
return (T) RandomValueFieldPopulator.generateRandomValue(clazz);
}
public static <T> List<T> getPrimitiveList(Class<T> clazz) {
return getPrimitiveList(clazz, 3);
}
public static <T> List<T> getPrimitiveList(Class<T> clazz, int length) {
List<T> randoms = new ArrayList<T>(length);
for (int i = 0; i < length; i++) {
randoms.add(getPrimitive(clazz));
}
return randoms;
}
private static class RandomValueFieldPopulator {
public static Object generateRandomValue(Class<?> fieldType) {
Random random = new Random();
if (fieldType.equals(String.class)) {
return UUID.randomUUID().toString();
} else if (Date.class.isAssignableFrom(fieldType)) {
return new Date(System.currentTimeMillis() - random.nextInt());
} else if (LocalDate.class.isAssignableFrom(fieldType)) {
Date date = new Date(System.currentTimeMillis() - random.nextInt());
return new LocalDate(date);
} else if (fieldType.equals(Character.class) || fieldType.equals(Character.TYPE)) {
return (char) (random.nextInt(26) + 'a');
} else if (fieldType.equals(Integer.TYPE) || fieldType.equals(Integer.class)) {
return random.nextInt();
} else if (fieldType.equals(Short.TYPE) || fieldType.equals(Short.class)) {
return (short) random.nextInt();
} else if (fieldType.equals(Long.TYPE) || fieldType.equals(Long.class)) {
return random.nextLong();
} else if (fieldType.equals(Float.TYPE) || fieldType.equals(Float.class)) {
return random.nextFloat();
} else if (fieldType.equals(Double.TYPE)) {
return random.nextInt(); //if double is used, jsonPath uses bigdecimal to convert back
} else if (fieldType.equals(Double.class)) {
return random.nextDouble(); //if double is used, jsonPath uses bigdecimal to convert back
} else if (fieldType.equals(Boolean.TYPE) || fieldType.equals(Boolean.class)) {
return random.nextBoolean();
} else if (fieldType.equals(BigDecimal.class)) {
return new BigDecimal(random.nextFloat());
} else if (Enum.class.isAssignableFrom(fieldType)) {
Object[] enumValues = fieldType.getEnumConstants();
return enumValues[random.nextInt(enumValues.length)];
} else if (Number.class.isAssignableFrom(fieldType)) {
return random.nextInt(Byte.MAX_VALUE) + 1;
} else {
return UNGENERATED_VALUE_MARKER;
}
}
public void populate(Object object) {
ReflectionUtils.doWithFields(object.getClass(), new RandomValueFieldSetterCallback(object));
}
private static class RandomValueFieldSetterCallback implements ReflectionUtils.FieldCallback {
private final Object targetObject;
public RandomValueFieldSetterCallback(Object targetObject) {
this.targetObject = targetObject;
}
@Override
public void doWith(Field field) throws IllegalAccessException {
Class<?> fieldType = field.getType();
if (!Modifier.isFinal(field.getModifiers())) {
Object value = generateRandomValue(fieldType);
if (!value.equals(UNGENERATED_VALUE_MARKER)) {
ReflectionUtils.makeAccessible(field);
field.set(targetObject, value);
}
}
}
}
}
}
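The core trick in the factory above (reflectively walking fields and skipping types it can't generate, via a marker value) can be shown framework-free. This sketch assumes only java.lang.reflect, covers just a few field types, and uses a hypothetical Person POJO for the demo:

```java
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.Random;
import java.util.UUID;

// Framework-free sketch of the RandomDataFactory idea:
// walk declared fields reflectively, set a random value per supported type,
// and leave unsupported types untouched (the marker pattern).
public class MiniRandomFactory {
    private static final Object UNGENERATED = new Object();
    private static final Random RANDOM = new Random();

    static Object randomValue(Class<?> type) {
        if (type.equals(String.class)) return UUID.randomUUID().toString();
        if (type.equals(Integer.TYPE) || type.equals(Integer.class)) return RANDOM.nextInt();
        if (type.equals(Boolean.TYPE) || type.equals(Boolean.class)) return RANDOM.nextBoolean();
        return UNGENERATED; // unsupported type: leave the field alone
    }

    public static <T> T single(Class<T> clazz) {
        try {
            T object = clazz.getDeclaredConstructor().newInstance();
            for (Field field : clazz.getDeclaredFields()) {
                if (Modifier.isFinal(field.getModifiers())) continue;
                Object value = randomValue(field.getType());
                if (value != UNGENERATED) {
                    field.setAccessible(true);
                    field.set(object, value);
                }
            }
            return object;
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }

    // Example POJO (hypothetical, for the demo only).
    static class Person { String name; int age; Object unsupported; }

    public static void main(String[] args) {
        Person p = single(Person.class);
        System.out.println(p.name != null);        // String field was populated
        System.out.println(p.unsupported == null); // marker left it untouched
    }
}
```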
Look into an in-memory H2 database.
http://www.h2database.com/html/main.html
Maven Dependency
<!-- H2 Database -->
<dependency>
<groupId>com.h2database</groupId>
<artifactId>h2</artifactId>
<version>1.4.178</version>
</dependency>
Spring Java Config Entry
@Bean
public DataSource dataSource() {
System.out.println("**** USING H2 DATABASE ****");
EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
return builder.setType(EmbeddedDatabaseType.H2).addScript("/schema.sql").build();
}
You can create/load the H2 database with a SQL script in the above code using .addScript().
If you are using it for unit tests and need a different state for different tests, there is http://dbunit.sourceforge.net/; specifically for Spring there is http://springtestdbunit.github.io/spring-test-dbunit/.
If you need to initialize it only once and are using EmbeddedDatabaseBuilder for testing, then, as Brandon said, you can use EmbeddedDatabaseBuilder.
@Bean
public DataSource dataSource() {
EmbeddedDatabaseBuilder builder = new EmbeddedDatabaseBuilder();
return builder.setType(EmbeddedDatabaseType.H2).addScript("/schema.sql").build();
}
If you want it to be initialised on application start, you can add a @PostConstruct method to your configuration bean; it will run after the configuration bean has been created.
@PostConstruct
public void initializeDB() {
}
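The seeding logic that would live in such an initializer (insert default roles only if absent, so restarts don't duplicate them) can be sketched framework-free; the in-memory role store below is a hypothetical stand-in for a Spring Data repository:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Framework-free sketch of "seed default rows on startup":
// insert each default role only if it is not already present, so
// running the initializer on every start never duplicates data.
public class RoleSeeder {

    // Hypothetical stand-in for a RoleRepository (findByName / save).
    private final Map<String, String> store = new LinkedHashMap<>();

    void seedDefaults() {
        for (String role : List.of("admin", "manager", "user")) {
            store.putIfAbsent(role, role); // skip roles that already exist
        }
    }

    int count() { return store.size(); }

    public static void main(String[] args) {
        RoleSeeder seeder = new RoleSeeder();
        seeder.seedDefaults();
        seeder.seedDefaults(); // idempotent: a second run adds nothing
        System.out.println(seeder.count()); // 3
    }
}
```

With a real repository the same shape holds: query by name first, save only when the lookup comes back empty.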

What is the recommended way of doing NHibernate session management in ASP.NET MVC that supports second-level cache, transactions, etc.?

I am struggling to get second-level caching and transactions working in my ASP.NET MVC site, and I think it has to do with how I have my session management set up.
Basically I have the following classes:
NhibernateRepository
SessionManager
and I am using the Unity IoC container:
this.RegisterType<IRepository, NHibernateRepository>(new PerResolveLifetimeManager());
this.RegisterType<ISessionManager, SessionManager>(new PerResolveLifetimeManager());
The NHibernateRepository class looks like this, with a Session property:
public NHibernateRepository(UserModel userModel, ISessionManager sessionManager)
{
UserModel = userModel;
SessionManager = sessionManager;
}
public ISession Session
{
get
{
using (_lock.WaitToRead())
{
if (_session != null) return _session;
}
using (_lock.WaitToWrite())
{
if (_session != null) return _session;
_session = SessionManager.GetSession(UserModel == null ? "Task" : UserModel.FullName);
return _session;
}
}
}
The SessionManager class looks like this:
public class SessionManager : ISessionManager
{
private static readonly ResourceLock _lock = new OneManyResourceLock();
public static ISessionFactory Factory { get; set; }
public ISession GetSession(string userName)
{
ISession session = GetSessionFactory().OpenSession(new AuditInterceptor(userName));
return session;
}
private static ISessionFactory GetSessionFactory()
{
using (_lock.WaitToRead())
{
if (Factory != null) return Factory;
}
using (_lock.WaitToWrite())
{
if (Factory != null) return Factory;
string connectionString = ConfigurationManager.ConnectionStrings["DomainConnection"].ConnectionString;
Factory = FluentlyConfigureFactory(connectionString, false);
return Factory;
}
}
private static ISessionFactory FluentlyConfigureFactory(string connectionString, bool showSql)
{
MsSqlConfiguration databaseConfiguration = MsSqlConfiguration.MsSql2005
.ConnectionString(c => c.Is(connectionString))
.Dialect<SparcMsSqlDialect>()
.UseOuterJoin()
.UseReflectionOptimizer();
if (showSql)
{
databaseConfiguration.ShowSql();
}
databaseConfiguration.Raw("generate_statistics", showSql.ToString());
FluentConfiguration configuration = Fluently.Configure().Database(databaseConfiguration);
return configuration
.Mappings(m => m.FluentMappings.AddFromAssemblyOf<ApplicationMap>().Conventions.Add(typeof(Conventions)))
.ExposeConfiguration(
c => {
c.SetProperty("cache.provider_class", "NHibernate.Caches.SysCache.SysCacheProvider, NHibernate.Caches.SysCache");
c.SetProperty("cache.use_second_level_cache", "true");
c.SetProperty("cache.use_query_cache", "true");
c.SetProperty("expiration", "86400");
})
.BuildSessionFactory();
}
Does anyone see anything fundamentally wrong with this? From googling I see all sorts of opinions on how you should set up ASP.NET MVC with NHibernate (adding a transaction in BeginRequest and committing on EndRequest, etc.), but I can't find the canonical way of getting it working with second-level caching that seems to be the best practice for high scalability.
I tried adding transactions into this code given what I have read, but I now seem to be getting this error:
Initializing[ (one of my domain objects) #1]-Could not initialize proxy - no Session.
so I reverted that code. Basically I am hoping there is a "go to" best practice at this point for using a second-level cache and transactions in ASP.NET MVC.
I like to use an implementation based on the MemoryCache class, which practically handles the object locks for you. I've implemented a custom module for a second-level cache for NHibernate that you can plug in with some small configuration; what you need to implement is the ICache interface.
public class NHibernateCache2 : ICache
{
private static readonly IInternalLogger Log = LoggerProvider.LoggerFor(typeof(NHibernateCache2));
private readonly string _region;
private string _regionPrefix;
private readonly MemoryCache _cache;
private TimeSpan _expiration;
private CacheItemPriority _priority;
// The name of the cache key used to clear the cache. All cached items depend on this key.
private readonly string _rootCacheKey;
private bool _rootCacheKeyStored;
private static readonly TimeSpan DefaultExpiration = TimeSpan.FromSeconds(300);
private static readonly string DefauktRegionPrefix = string.Empty;
private const string CacheKeyPrefix = "NHibernate-Cache:";
public NHibernateCache2():this("nhibernate", null)
{
}
public NHibernateCache2(string region):this(region, null)
{
}
/// There are two (2) configurable parameters:
/// expiration = number of seconds to wait before expiring each item
/// priority = a numeric cost of expiring each item, where 1 is a low cost, 5 is the highest, and 3 is normal. Only values 1 through 5 are valid.
/// All parameters are optional. The defaults are an expiration of 300 seconds and the default priority of 3.
public NHibernateCache2(string region, IDictionary<string, string> properties)
{
_region = region;
_cache = MemoryCache.Default;
Configure(properties);
_rootCacheKey = GenerateRootCacheKey();
StoreRootCacheKey();
}
/// Defines property in order to get the region for the NHibernate's Cache.
public string Region
{
get { return _region; }
}
/// Obtains a expiration value that indicates the time in seconds after which an object is automatically
/// evicted from the cache.
public TimeSpan Expiration
{
get { return _expiration; }
}
/// Obtains a priority value that indicates the likelihood that an object of that region evicts
/// another already cached object of a lower priority region.
public CacheItemPriority Priority
{
get { return _priority; }
}
private void Configure(IDictionary<string, string> props)
{
if (props == null)
{
if (Log.IsWarnEnabled)
{
Log.Warn("configuring cache with default values");
}
_expiration = DefaultExpiration;
_priority = CacheItemPriority.Default;
_regionPrefix = DefauktRegionPrefix;
}
else
{
_priority = GetPriority(props);
_expiration = GetExpiration(props);
_regionPrefix = GetRegionPrefix(props);
}
}
private static string GetRegionPrefix(IDictionary<string, string> props)
{
string result;
if (props.TryGetValue("regionPrefix", out result))
{
Log.DebugFormat("new regionPrefix :{0}", result);
}
else
{
result = DefauktRegionPrefix;
Log.Debug("no regionPrefix value given, using defaults");
}
return result;
}
private static TimeSpan GetExpiration(IDictionary<string, string> props)
{
TimeSpan result = DefaultExpiration;
string expirationString;
if (!props.TryGetValue("expiration", out expirationString))
{
props.TryGetValue(NHibernate.Cfg.Environment.CacheDefaultExpiration, out expirationString);
}
if (expirationString != null)
{
try
{
int seconds = Convert.ToInt32(expirationString);
result = TimeSpan.FromSeconds(seconds);
Log.Debug("new expiration value: " + seconds);
}
catch (Exception ex)
{
Log.Error("error parsing expiration value");
throw new ArgumentException("could not parse 'expiration' as a number of seconds", ex);
}
}
else
{
if (Log.IsDebugEnabled)
{
Log.Debug("no expiration value given, using defaults");
}
}
return result;
}
private static CacheItemPriority GetPriority(IDictionary<string, string> props)
{
CacheItemPriority result = CacheItemPriority.Default;
string priorityString;
if (props.TryGetValue("priority", out priorityString))
{
result = ConvertCacheItemPriorityFromXmlString(priorityString);
if (Log.IsDebugEnabled)
{
Log.Debug("new priority: " + result);
}
}
return result;
}
private static CacheItemPriority ConvertCacheItemPriorityFromXmlString(string priorityString)
{
if (string.IsNullOrEmpty(priorityString))
{
return CacheItemPriority.Default;
}
var ps = priorityString.Trim().ToLowerInvariant();
if (ps.Length == 1 && char.IsDigit(priorityString, 0))
{
// the priority is specified as a number
int priorityAsInt = int.Parse(ps);
if (priorityAsInt >= 1 && priorityAsInt <= 6)
{
return (CacheItemPriority)priorityAsInt;
}
}
else
{
/// change for your own priority settings
switch (ps)
{
case "abovenormal":
return CacheItemPriority.Default;
case "belownormal":
return CacheItemPriority.Default;
case "default":
return CacheItemPriority.Default;
case "high":
return CacheItemPriority.Default;
case "low":
return CacheItemPriority.Default;
case "normal":
return CacheItemPriority.Default;
case "notremovable":
return CacheItemPriority.NotRemovable;
}
}
Log.Error("priority value out of range: " + priorityString);
throw new IndexOutOfRangeException("Priority must be a valid System.Web.Caching.CacheItemPriority; was: " + priorityString);
}
private string GetCacheKey(object key)
{
return String.Concat(CacheKeyPrefix, _regionPrefix, _region, ":", key.ToString(), "#", key.GetHashCode());
}
/// Gets an object that exist in the second level cache of NHibernate by the specified key.
///A unique identifier for the cache entry to get.
///Returns an entry from the NHibernate's Cache.
public object Get(object key)
{
if (key == null)
{
return null;
}
string cacheKey = GetCacheKey(key);
if (Log.IsDebugEnabled)
{
Log.Debug(String.Format("Fetching object '{0}' from the cache.", cacheKey));
}
object obj = _cache.Get(cacheKey);
if (obj == null)
{
return null;
}
var de = (DictionaryEntry)obj;
if (key.Equals(de.Key))
{
return de.Value;
}
else
{
return null;
}
}
/// Adds a specific object inside the in the second level cache of NHibernate by using its key and its content.
/// A key value of an item from the second level cache of NHibernate.
/// Data for an entry of second level cache of NHibernate.
public void Put(object key, object value)
{
if (key == null)
{
throw new ArgumentNullException("key", "null key not allowed");
}
if (value == null)
{
throw new ArgumentNullException("value", "null value not allowed");
}
string cacheKey = GetCacheKey(key);
if (_cache[cacheKey] != null)
{
if (Log.IsDebugEnabled)
{
Log.Debug(String.Format("updating value of key '{0}' to '{1}'.", cacheKey, value));
}
// Remove the key to re-add it again below
_cache.Remove(cacheKey);
}
else
{
if (Log.IsDebugEnabled)
{
Log.Debug(String.Format("adding new data: key={0}&value={1}", cacheKey, value));
}
}
if (!_rootCacheKeyStored)
{
StoreRootCacheKey();
}
var cacheItemPolicy = new CacheItemPolicy()
{
AbsoluteExpiration = DateTime.Now.Add(_expiration),
SlidingExpiration = ObjectCache.NoSlidingExpiration,
Priority = _priority,
};
cacheItemPolicy.ChangeMonitors.Add(_cache.CreateCacheEntryChangeMonitor(new[] { _rootCacheKey }));
_cache.Add(
cacheKey,
new DictionaryEntry(key, value),
cacheItemPolicy);
}
/// <summary>Removes a cache entry from the NHibernate second-level cache by key.</summary>
/// <param name="key">The key of the item to remove.</param>
public void Remove(object key)
{
if (key == null)
{
throw new ArgumentNullException("key");
}
string cacheKey = GetCacheKey(key);
if (Log.IsDebugEnabled)
{
Log.Debug("removing item with key: " + cacheKey);
}
_cache.Remove(cacheKey);
}
/// <summary>Clears all entries from this cache region.</summary>
public void Clear()
{
RemoveRootCacheKey();
StoreRootCacheKey();
}
/// <summary>Generates a unique root key that all cache items depend upon.</summary>
private string GenerateRootCacheKey()
{
return GetCacheKey(Guid.NewGuid());
}
private void RootCacheItemRemoved(CacheEntryRemovedArguments arguments)
{
_rootCacheKeyStored = false;
}
private void StoreRootCacheKey()
{
_rootCacheKeyStored = true;
var policy = new CacheItemPolicy
{
AbsoluteExpiration = ObjectCache.InfiniteAbsoluteExpiration,
SlidingExpiration = ObjectCache.NoSlidingExpiration,
Priority = CacheItemPriority.Default,
RemovedCallback = RootCacheItemRemoved
};
_cache.Add(
_rootCacheKey,
_rootCacheKey,
policy);
}
private void RemoveRootCacheKey()
{
_cache.Remove(_rootCacheKey);
}
/// <summary>Destroys the cache region, clearing all of its entries.</summary>
public void Destroy()
{
Clear();
}
public void Lock(object key)
{
// Do nothing
}
public void Unlock(object key)
{
// Do nothing
}
/// <summary>Obtains the next timestamp value.</summary>
public long NextTimestamp()
{
return Timestamper.Next();
}
/// <summary>Gets the lock timeout for the second-level cache, in the units NHibernate expects.</summary>
public int Timeout
{
get { return Timestamper.OneMs * 60000; } // 60 seconds
}
/// <summary>Gets the name of this NHibernate second-level cache region.</summary>
public string RegionName
{
get { return _region; }
}
}
then you need to define an ICacheProvider implementation:
public class NHibernateCacheProvider2 : ICacheProvider
{
private static readonly Dictionary<string, ICache> Caches;
private static readonly IInternalLogger Log;
static NHibernateCacheProvider2()
{
Log = LoggerProvider.LoggerFor(typeof(NHibernateCacheProvider2));
Caches = new Dictionary<string, ICache>();
}
/// <summary>Builds a new cache for the given region with the supplied properties.</summary>
/// <param name="regionName">The name of the cache region.</param>
/// <param name="properties">Configuration settings.</param>
/// <returns>An ICache instance for the requested region.</returns>
public ICache BuildCache(string regionName, IDictionary<string, string> properties)
{
if (regionName == null)
{
regionName = string.Empty;
}
ICache result;
if (Caches.TryGetValue(regionName, out result))
{
return result;
}
// create cache
if (properties == null)
{
properties = new Dictionary<string, string>(1);
}
if (Log.IsDebugEnabled)
{
var sb = new StringBuilder();
sb.Append("building cache with region: ").Append(regionName).Append(", properties: ");
foreach (KeyValuePair<string, string> de in properties)
{
sb.Append("name=");
sb.Append(de.Key);
sb.Append("&value=");
sb.Append(de.Value);
sb.Append(";");
}
Log.Debug(sb.ToString());
}
// Store the new cache so the TryGetValue lookup above finds it next time.
result = new NHibernateCache2(regionName, properties);
Caches.Add(regionName, result);
return result;
}
public long NextTimestamp()
{
return Timestamper.Next();
}
public void Start(IDictionary<string, string> properties) { /* no start-up work needed */ }
public void Stop() { }
}
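If you configure NHibernate through hibernate.cfg.xml instead of Fluent NHibernate, the provider can be registered with the standard cache properties (the assembly name below is a placeholder for wherever you put the class):

```xml
<property name="cache.provider_class">MyAssembly.NHibernateCacheProvider2, MyAssembly</property>
<property name="cache.use_second_level_cache">true</property>
<property name="cache.use_query_cache">true</property>
```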
if you are using fluent nhibernate you can register it with the following configuration:
Fluently.Configure()
    .Database(MsSqlConfiguration.MsSql2008
        .ConnectionString(builder => builder.FromConnectionStringWithKey(connectionStringKey)))
    .ExposeConfiguration(c => c.SetProperty("show_sql", "true"))
    .Cache(builder => builder.ProviderClass<NHibernateCacheProvider2>()
        .UseSecondLevelCache()
        .UseQueryCache());
I hope that helps

ASMX Web Service, Stored Procedures and MVC Models

I am developing a web application using MVC 3. This application connects to an SQL Server database through ASMX Web Services. Each Web Method calls a Stored Procedure and returns a DataTable.
This is the code I'm using to call the Stored Procedure:
public static DataTable ExecSP(string StoredProcedureName, List<string> ParameterNames, List<Object> ParameterValues)
{
SqlConnection Connection = new SqlConnection(ConfigurationManager.ConnectionStrings["SQLServer"].ConnectionString);
SqlDataReader Reader = null;
DataTable SPResult = null;
try
{
Connection.Open();
SqlCommand Command = new SqlCommand("dbo." + StoredProcedureName, Connection);
Command.CommandType = CommandType.StoredProcedure;
if (ParameterNames != null)
{
for (int i = 0; i < ParameterNames.Count; i++)
{
SqlParameter Parameter = new SqlParameter(ParameterNames[i], ParameterValues[i]);
if (Parameter.SqlDbType.Equals(SqlDbType.NVarChar))
{
Parameter.SqlDbType = SqlDbType.VarChar;
}
if (Parameter.SqlValue == null)
{
Parameter.SqlValue = DBNull.Value;
}
Command.Parameters.Add(Parameter);
}
}
Reader = Command.ExecuteReader();
SPResult = new DataTable();
SPResult.Load(Reader);
}
finally
{
// Close the reader before the connection so both are released cleanly.
if (Reader != null)
{
Reader.Close();
}
Connection.Close();
}
return SPResult;
}
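As a sketch of how this helper might be called, the stored procedure name, parameter, and column below are hypothetical:

```csharp
// Hypothetical call: executes dbo.GetUsersByRole with one parameter.
// Assumes the "SQLServer" connection string is configured in web.config.
DataTable users = ExecSP(
    "GetUsersByRole",
    new List<string> { "@RoleName" },
    new List<object> { "Admin" });

foreach (DataRow row in users.Rows)
{
    Console.WriteLine(row["Username"]);
}
```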
I would like to know if there is a straightforward way to convert this DataTable into a model that can then be passed to a view (like, for example, the model binding that happens in an AJAX post) and, if there isn't, what the alternatives are. I know that using LINQ would probably solve this problem, but I can't use it.
Thanks in advance.
Best regards.
Found a solution.
I built a generic method that translates any DataTable into a List of whatever class I specify:
public static List<T> Translate<T>(DataTable SPResult, Func<object[],T> del)
{
List<T> GenericList = new List<T>();
foreach (DataRow Row in SPResult.Rows)
{
GenericList.Add(del(Row.ItemArray));
}
return GenericList;
}
where del is a delegate. When calling this method, del should be the constructor of the specified class. Then, in every model class, I built a constructor that receives an object[] RowFromTable:
public class MyClass
{
public int ID { get; set; }
public string Description { get; set; }
public MyClass(object[] RowFromTable)
{
this.ID = (int)RowFromTable[0];
this.Description = RowFromTable[1].ToString();
}
}
Finally, to put it all together, this is what happens when I call the Web Method:
public List<MyClass> GetAll()
{
DataTable SPResult = MyWebService.GetAll().Table;
return Translate<MyClass>(SPResult, l => new MyClass(l));
}
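From there, getting the result into a view is ordinary strongly-typed model binding. A minimal sketch, assuming the controller and view names below (they are only for illustration):

```csharp
public class MyController : Controller
{
    // Passes the translated list to the view as its model.
    public ActionResult Index()
    {
        List<MyClass> model = GetAll();
        return View(model); // Index.cshtml declares @model List<MyClass>
    }
}
```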
Got the idea from here
