Redis hasKey method returns NULL - Spring

When can the Redis hasKey method return null? I have seen that if we pass null as the method parameter it returns false, so is there any scenario in which that method can return null?

If you use Spring Data Redis, the behaviour mainly falls into the following situations:
the key does not exist (returns false);
inside a pipeline (returns null);
inside a transaction, i.e. while queueing (returns null).
You can take a look at the underlying implementation, org.springframework.data.redis.connection.jedis.JedisKeyCommands#exists(byte[]...). Details as follows:
@Override
public Boolean hasKey(K key) {
    byte[] rawKey = rawKey(key);
    return execute(connection -> connection.exists(rawKey), true);
}

@Nullable
@Override
public Long exists(byte[]... keys) {
    Assert.notNull(keys, "Keys must not be null!");
    Assert.noNullElements(keys, "Keys must not contain null elements!");
    try {
        if (isPipelined()) {
            pipeline(connection.newJedisResult(connection.getRequiredPipeline().exists(keys)));
            return null;
        }
        if (isQueueing()) {
            transaction(connection.newJedisResult(connection.getRequiredTransaction().exists(keys)));
            return null;
        }
        return connection.getJedis().exists(keys);
    } catch (Exception ex) {
        throw connection.convertJedisAccessException(ex);
    }
}
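For illustration, here is a minimal sketch of how that null shows up in practice when the EXISTS call is issued inside a pipeline. It assumes a configured StringRedisTemplate bean (any RedisTemplate works); inside the callback the exists(...) result is null, and the real Boolean only arrives in the list returned by executePipelined:

import java.nio.charset.StandardCharsets;
import java.util.List;
import org.springframework.data.redis.core.RedisCallback;
import org.springframework.data.redis.core.StringRedisTemplate;

public class ExistsInPipelineSketch {

    public Boolean keyExists(StringRedisTemplate redisTemplate, String key) {
        List<Object> results = redisTemplate.executePipelined((RedisCallback<Object>) connection -> {
            // Inside a pipeline, exists(...) returns null (see the isPipelined() branch above);
            // the actual result is only collected when the pipeline is closed.
            Boolean inPipeline = connection.keyCommands().exists(key.getBytes(StandardCharsets.UTF_8));
            // inPipeline is null here
            return null; // executePipelined requires the callback itself to return null
        });
        return (Boolean) results.get(0); // the real EXISTS result, deserialized from the pipeline
    }
}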

Related

Moshi with Graal has all reflection registered but cannot map fields

I'm trying to use Moshi with GraalVM's native-image, and trying to get the reflection to work.
I have my class:
public class SimpleJson {
    private String message;

    public SimpleJson(String message) { this.message = message; }

    public String getMessage() { return message; }

    public void setMessage(String message) { this.message = message; }
}
and code
var simpleJsonJsonAdapter = moshi.adapter(SimpleJson.class);
var simpleJsonString = "{\"message\": \"hello there\"}";
var simpleJsonObj = simpleJsonJsonAdapter.fromJson(simpleJsonString);
var simpleJsonStringBack = simpleJsonJsonAdapter.toJson(simpleJsonObj);
System.out.println("Converting: " + simpleJsonString);
System.out.println("Simple json has message: " + simpleJsonObj.getMessage());
System.out.println("Simple message full json coming back is: " + simpleJsonStringBack);
which prints:
Converting: {"message": "hello there"}
Simple json has message: null
Simple message full json coming back is: {}
and this only works (avoiding an exception saying SimpleJson is instantiated reflectively but was never registered) with the following chunk of code, which registers everything for reflection:
@AutomaticFeature
public class RuntimeReflectionRegistrationFeature implements Feature {

    @Override
    public void beforeAnalysis(BeforeAnalysisAccess access) {
        try {
            // Enable the moshi adapters
            var moshiPkgs = "com.squareup.moshi";
            // Standard shared models
            var pkgs = "my.models";

            // Register moshi
            new ClassGraph()
                .enableClassInfo()
                .acceptPackages(moshiPkgs)
                .scan()
                .getSubclasses(JsonAdapter.class.getName())
                .forEach(
                    classInfo -> {
                        System.out.println("Building moshi adapter class info for " + classInfo);
                        registerMoshiAdapter(classInfo.loadClass());
                    });

            // Register everything we've got
            new ClassGraph()
                .enableClassInfo() // Scan classes, methods, fields, annotations
                .acceptPackages(pkgs) // Scan package(s) and subpackages
                .scan()
                .getAllClasses()
                .forEach(
                    classInfo -> {
                        System.out.println("Building class info for " + classInfo);
                        registerGeneralClass(classInfo.loadClass());
                    });
        } catch (Exception e) {
            e.printStackTrace();
            throw e;
        }
    }

    private void registerMoshiAdapter(Class<?> classInfo) {
        try {
            RuntimeReflection.register(classInfo);
            Arrays.stream(classInfo.getMethods()).forEach(RuntimeReflection::register);
            ParameterizedType superclass = (ParameterizedType) classInfo.getGenericSuperclass();
            // extends JsonAdapter<X>()
            var valueType = Arrays.stream(superclass.getActualTypeArguments()).findFirst();
            if (valueType.isPresent() && valueType.get() instanceof Class) {
                Arrays.stream(((Class<?>) valueType.get()).getConstructors())
                    .forEach(RuntimeReflection::register);
            }
            RuntimeReflection.register(classInfo.getConstructor(Moshi.class));
        } catch (RuntimeException | NoSuchMethodException name) {
            // expected
        }
    }

    private void registerGeneralClass(Class<?> classInfo) {
        try {
            RuntimeReflection.register(classInfo);
            Arrays.stream(classInfo.getDeclaredMethods()).forEach(RuntimeReflection::register);
            Arrays.stream(classInfo.getDeclaredConstructors()).forEach(RuntimeReflection::register);
        } catch (RuntimeException name) {
            // expected
        }
    }
}
(inspired by this issue, although I believe that's trying to address generated MoshiAdapters, which is a Kotlin-only thing).
So, Java no longer complains about reflection (which it previously did, hence the error message mentioned), but Moshi isn't actually doing anything.
Does anyone have any suggestions on how to work around this?
Note, I did try the manual reflect-config.json approach with
[
  {
    "allDeclaredClasses": true,
    "queryAllDeclaredConstructors": true,
    "queryAllPublicConstructors": true,
    "name": "my.models.SimpleJson",
    "queryAllDeclaredMethods": true,
    "queryAllPublicMethods": true,
    "allPublicClasses": true
  }
]
but this resulted in an error around Runtime reflection is not supported for... - also not good!
The solution was simple in the end... the registration just needed the following line added:
Arrays.stream(classInfo.getDeclaredFields()).forEach(RuntimeReflection::register);
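For clarity, this is a sketch of what registerGeneralClass looks like with that line in place (same method as above, nothing else changed):

private void registerGeneralClass(Class<?> classInfo) {
    try {
        RuntimeReflection.register(classInfo);
        Arrays.stream(classInfo.getDeclaredMethods()).forEach(RuntimeReflection::register);
        Arrays.stream(classInfo.getDeclaredConstructors()).forEach(RuntimeReflection::register);
        // The missing piece: Moshi's reflective adapter reads and writes the fields directly,
        // so the fields themselves must be registered for reflection as well.
        Arrays.stream(classInfo.getDeclaredFields()).forEach(RuntimeReflection::register);
    } catch (RuntimeException e) {
        // expected
    }
}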

Why my Spring Redis cache doesn't work even with annotations

I followed the instructions in this tutorial (https://www.baeldung.com/spring-boot-redis-cache):
@Cacheable(value = "itemCache")
public UserInfo getUserInfo(String id) {
    // without explicitly manipulating the cache, the value can't be retrieved from Redis
    UserInfo res = cacheManager.getCache("itemCache").get(id, UserInfo.class);
    if (res != null) {
        return res;
    }
    try {
        ... retrieve from database ...
        res = convertMapToUserInfo(id, userInfoMap);
        // without explicitly manipulating the cache, the value can't be stored in Redis
        cacheManager.getCache("itemCache").put(id, res);
        return res;
    } catch (Exception e) {
        ...
    } finally {
        ...
    }
    return null;
}
The weird thing is, I have to put/get items from the cache manually, even though I use the @Cacheable annotation. Without explicitly manipulating the cache, the return value of getUserInfo doesn't get cached.
The RedisConfiguration contains this code:
@Bean
public RedisCacheManagerBuilderCustomizer redisCacheManagerBuilderCustomizer() {
    return (builder) -> builder
        .withCacheConfiguration("itemCache",
            this.cacheConfiguration());
}

@Bean
public RedisCacheConfiguration cacheConfiguration() {
    return RedisCacheConfiguration.defaultCacheConfig()
        .disableCachingNullValues()
        .serializeValuesWith(RedisSerializationContext.SerializationPair.fromSerializer(new GenericJackson2JsonRedisSerializer()));
}
And I also added @EnableCaching to my Application class. Could anyone help me figure out why my cache doesn't take effect? Thanks!
I've recently found out that if we use just one entry with the builder it doesn't work. But if you have more than one cache in your project and include all of those entries in this method, or even if you add a 'dummy' entry, it works. Maybe a bug or a known issue in the Spring implementation, I don't know.
@Bean
public RedisCacheManagerBuilderCustomizer redisCacheManagerBuilderCustomizer() {
    return (builder) -> builder
        .withCacheConfiguration("itemCache", this.cacheConfiguration())
        .withCacheConfiguration("dummy", this.cacheConfiguration());
}
I hope this helps.
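For reference, once @Cacheable is actually intercepting calls, the method body should not need to touch the CacheManager at all. A minimal sketch of what the method could then look like (retrieveUserInfoFromDatabase is a hypothetical helper standing in for the database lookup):

@Cacheable(value = "itemCache")
public UserInfo getUserInfo(String id) {
    // Spring's caching proxy checks the "itemCache" cache before invoking this body
    // and stores the returned value afterwards, so no manual get/put is needed.
    return retrieveUserInfoFromDatabase(id); // hypothetical helper for the DB lookup
}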

Return null or something nullable from lambda in transformation method - Webflux (Intellij warning)

I get the following warning in my code, and additionally, on the dev environment, I surprisingly get an NPE, because I thought that my endpoints return an empty Flux and not NULL!
public Mono<BackupResponse> getBackupsNumber(UUID organizationId) {
    return confServiceApi
        .getObjectsXByGroupId(organizationId) // should not return NULL at all, in the worst case an empty Flux, am I right???
        .flatMap(
            objectX ->
                catalogApi.getBackupsNumber(
                    objectX.getAgentId(), objectX.getProtectionId().toString())) // NPE here!!!
        .map(BackupsNumberResponse::getBackups) // here WARNING -> Return null or something nullable from lambda in transformation method
        .reduce(0, Integer::sum)
        .map(Mapper::createBackupResponse);
}
confServiceApi.getObjectsXByGroupId:
public Flux<ObjectXResponse> getObjectsXByGroupId(UUID groupId) {
    return objectXByGroupIdRepository
        .findByGroupId(groupId)
        .map(Mapper::toObjectXResponse);
}
And catalogApi.getBackupsNumber returns a Mono<BackupsNumberResponse> with an Integer field in the response object:
public Mono<ResponseEntity<BackupsNumberResponse>> getBackupsNumber(
        UUID agentId, String protectionId) {
    return backupService
        .getBackupsNumber(agentId, protectionId)
        .map(ResponseEntity::ok);
}
And backupService.getBackupsNumber:
return backupRepository
    .countByAgentIdAndProtectionId(agentId, protectionId)
    .map(BackupMapper::createObjectResponse);
And mapper:
public static BackupsResponse createBackupsNumberResponse(Long numberOfRecoveryPoints) {
    return new BackupsResponse().backups(numberOfBackups.intValue());
}

How to avoid caching when values are null?

I am using Guava to cache hot data. When the data does not exist in the cache, I have to get it from the database:
public final static LoadingCache<ObjectId, User> UID2UCache = CacheBuilder.newBuilder()
    //.maximumSize(2000)
    .weakKeys()
    .weakValues()
    .expireAfterAccess(10, TimeUnit.MINUTES)
    .build(
        new CacheLoader<ObjectId, User>() {
            @Override
            public User load(ObjectId k) throws Exception {
                User u = DataLoader.datastore.find(User.class).field("_id").equal(k).get();
                return u;
            }
        });
My problem is that when the data does not exist in the database, I want it to return null and not do any caching. But Guava saves null with the key in the cache and throws an exception when I get it:
com.google.common.cache.CacheLoader$InvalidCacheLoadException:
CacheLoader returned null for key shisoft.
How do we avoid caching null values?
Just throw some exception if the user is not found and catch it in client code when using the get(key) method.
new CacheLoader<ObjectId, User>() {
    @Override
    public User load(ObjectId k) throws Exception {
        User u = DataLoader.datastore.find(User.class).field("_id").equal(k).get();
        if (u != null) {
            return u;
        } else {
            throw new UserNotFoundException();
        }
    }
}
From CacheLoader.load(K) Javadoc:
Returns:
the value associated with key; must not be null
Throws:
Exception - if unable to load the result
Answering your doubts about caching null values:
Returns the value associated with key in this cache, first loading
that value if necessary. No observable state associated with this
cache is modified until loading completes.
(from LoadingCache.get(K) Javadoc)
If you throw an exception, load is not considered as complete, so no new value is cached.
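To illustrate the call-site side of this approach, here is a minimal sketch. UserNotFoundException is the hypothetical exception from the loader above; if it is a checked exception, get(key) wraps it in an ExecutionException (an unchecked one would surface as UncheckedExecutionException instead):

User user;
try {
    user = UID2UCache.get(userId);
} catch (ExecutionException e) {
    if (e.getCause() instanceof UserNotFoundException) {
        user = null; // not found; importantly, nothing was cached for this key
    } else {
        throw new RuntimeException(e);
    }
}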
EDIT:
Note that in Caffeine, which is sort of Guava cache 2.0 and "provides an in-memory cache using a Google Guava inspired API", you can return null from the load method:
Returns:
the value associated with key or null if not found
If you consider migrating, your data loader could freely return null when the user is not found.
Simple solution: use com.google.common.base.Optional<User> instead of User as value.
public final static LoadingCache<ObjectId, Optional<User>> UID2UCache = CacheBuilder.newBuilder()
    ...
    .build(
        new CacheLoader<ObjectId, Optional<User>>() {
            @Override
            public Optional<User> load(ObjectId k) throws Exception {
                return Optional.fromNullable(DataLoader.datastore.find(User.class).field("_id").equal(k).get());
            }
        });
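A usage sketch for the call site (orNull() is Guava Optional's accessor; getUnchecked avoids the checked ExecutionException since this loader no longer needs to throw):

// Returns null if the user was absent; note that the absence itself is now cached as Optional.absent().
User user = UID2UCache.getUnchecked(userId).orNull();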
EDIT: I think @Xaerxess' answer is better.
Faced the same issue, because missing values in the source were part of the normal workflow. Haven't found anything better than writing some code myself using the getIfPresent, get and put methods. See the method below, where local is a Cache<Object, Object>:
private <K, V> V getFromLocalCache(K key, Supplier<V> fallback) {
    @SuppressWarnings("unchecked")
    V s = (V) local.getIfPresent(key);
    if (s != null) {
        return s;
    } else {
        V value = fallback.get();
        if (value != null) {
            local.put(key, value);
        }
        return value;
    }
}
When you want to cache some NULL values, you could use some other value that stands in for NULL.
And before giving the solution, I would suggest that you not expose the LoadingCache to the outside. Instead, you should use a method to restrict the scope of the cache.
For example, you could use LoadingCache<ObjectId, List<User>> as the return type and then return an empty list when you couldn't retrieve values from the database. You could use -1 as an Integer or Long NULL value, "" as a String NULL value, and so on. After this, you should provide a method to handle the NULL value, along the lines of:
if (value.equals(NULL_PLACEHOLDER)) { // e.g. -1 or ""
    return null;
}
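A minimal sketch of that wrapper-method idea under these assumptions: NONE is a hypothetical sentinel User instance that the loader returns instead of null on a miss, and findUser is the only access point to the cache:

// Hypothetical sentinel stored in the cache in place of null.
private static final User NONE = new User();

// The only way callers touch the cache; the sentinel never leaks outside this method.
public static User findUser(ObjectId id) throws ExecutionException {
    User u = UID2UCache.get(id); // the loader returns NONE instead of null when nothing is found
    return u == NONE ? null : u;
}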
I use the getIfPresent
@Test
public void cache() throws Exception {
    System.out.println("3-------" + totalCache.get("k2"));
    System.out.println("4-------" + totalCache.getIfPresent("k3"));
}

private LoadingCache<String, Date> totalCache = CacheBuilder
    .newBuilder()
    .maximumSize(500)
    .refreshAfterWrite(6, TimeUnit.HOURS)
    .build(new CacheLoader<String, Date>() {
        @Override
        @ParametersAreNonnullByDefault
        public Date load(String key) {
            Map<String, Date> map = ImmutableMap.of("k1", new Date(), "k2", new Date());
            return map.get(key);
        }
    });

EclipseLink converts Enum to BigDecimal

I am trying to convert an enum into a BigDecimal using EclipseLink's Converter. The conversion works, but the resulting database column has a type of String. Is it possible to set a parameter so that EclipseLink builds a decimal column type in the database?
I use a class which implements org.eclipse.persistence.mappings.converters.Converter.
The application server logs
The default table generator could not locate or convert a java type (null) into a database type for database field (xyz). The generator uses java.lang.String as default java type for the field.
This message is generated for every field, which uses a converter. How can I define a specific database type for these fields?
public enum IndirectCosts {
    EXTENDED {
        public BigDecimal getPercent() {
            return new BigDecimal("25.0");
        }
    },
    NORMAL {
        public BigDecimal getPercent() {
            return new BigDecimal("12.0");
        }
    },
    NONE {
        public BigDecimal getPercent() {
            return new BigDecimal("0.0");
        }
    };

    public abstract BigDecimal getPercent();

    public static IndirectCosts getType(BigDecimal percent) {
        for (IndirectCosts v : IndirectCosts.values()) {
            if (v.getPercent().compareTo(percent) == 0) {
                return v;
            }
        }
        throw new IllegalArgumentException();
    }
}
The database has to store the numeric values. I use such a converter:
public class IndirectCostsConverter implements Converter {

    @Override
    public Object convertObjectValueToDataValue(Object objectValue, Session session) {
        if (objectValue == null) {
            return objectValue;
        } else if (objectValue instanceof IndirectCosts) {
            return ((IndirectCosts) objectValue).getPercent();
        }
        throw new TypeMismatchException(objectValue, IndirectCosts.class);
    }

    @Override
    public Object convertDataValueToObjectValue(Object dataValue, Session session) {
        if (dataValue == null) {
            return dataValue;
        } else if (dataValue instanceof String) {
            return IndirectCosts.getType(new BigDecimal((String) dataValue));
        }
        throw new TypeMismatchException(dataValue, BigDecimal.class);
    }

    @Override
    public boolean isMutable() {
        return false;
    }

    @Override
    public void initialize(DatabaseMapping databaseMapping, Session session) {
    }
}
Within convertDataValueToObjectValue(Object dataValue, Session session) I have to use String because the SQL generator defines the database column as varchar(255). I would like to have decimal(15,2) or something similar.
Thanks a lot
Andre
The EclipseLink Converter interface defines an initialize(DatabaseMapping mapping, Session session) method that you can use to set the type to use for the field. Someone else posted an example showing how to get the field from the mapping here: Using UUID with EclipseLink and PostgreSQL
The DatabaseField's columnDefinition, if set, will be the only thing used to define the type for DDL generation, so set it carefully. The other settings (not null, nullable, etc.) will only be used if the columnDefinition is left unset.
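A minimal sketch of that idea adapted to this converter; this is an assumption based on the linked answer rather than a verified implementation for a specific EclipseLink version, and the cast to AbstractDirectMapping and the DECIMAL(15,2) definition are illustrative:

@Override
public void initialize(DatabaseMapping mapping, Session session) {
    if (mapping instanceof AbstractDirectMapping) {
        DatabaseField field = ((AbstractDirectMapping) mapping).getField();
        // Tell DDL generation to build a numeric column instead of the default VARCHAR(255).
        field.setSqlType(java.sql.Types.DECIMAL);
        field.setColumnDefinition("DECIMAL(15,2)");
    }
}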
