How to use lambda/Stream api to filter distinct elements by object Attribute/Property - java-8

I have a List of objects. Every object has a map with a key named "xyz". I want the elements in the list that have a unique value for that particular key.
I know we can do this easily with a set/map, but I'm particularly looking for a lambda solution.
I thought this would work:
list.stream()
.filter(distinctByXyz(f -> f.getMap.get("xyz")))
.collect(Collectors.toList()));
I have a function to filter them for distinctness:
private <T> Predicate<T> distinctByKey(Function<? super T, Object> keyExtractor) {
    Map<Object, Boolean> map = new ConcurrentHashMap<>();
    return t -> map.putIfAbsent(keyExtractor.apply(t), Boolean.TRUE) == null;
}
The problem is that the call f.getMap() inside the filter isn't working. It shows a compilation error (Cannot resolve method).

You seem to have a few typos in your code; this should work:
list.stream()
    .filter(distinctByKey(f -> f.getMap().get("xyz")))
    .collect(Collectors.toList());
You were using distinctByXyz when it should really be distinctByKey, f.getMap should be f.getMap(), and you also had an extra closing parenthesis.
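For reference, here is a minimal, self-contained sketch of the whole approach; the Item class, its getMap() accessor, and the sample data are hypothetical stand-ins for the poster's objects:

import java.util.*;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class DistinctByKeyDemo {

    // hypothetical element type: each object just carries a map of attributes
    static class Item {
        private final Map<String, String> map;
        Item(Map<String, String> map) { this.map = map; }
        Map<String, String> getMap() { return map; }
    }

    // keeps only the first element seen for each extracted key
    private static <T> Predicate<T> distinctByKey(Function<? super T, Object> keyExtractor) {
        Map<Object, Boolean> seen = new ConcurrentHashMap<>();
        return t -> seen.putIfAbsent(keyExtractor.apply(t), Boolean.TRUE) == null;
    }

    public static void main(String[] args) {
        List<Item> list = Arrays.asList(
                new Item(Collections.singletonMap("xyz", "a")),
                new Item(Collections.singletonMap("xyz", "a")),   // duplicate value for "xyz"
                new Item(Collections.singletonMap("xyz", "b")));

        List<Item> distinct = list.stream()
                .filter(distinctByKey(f -> f.getMap().get("xyz")))
                .collect(Collectors.toList());

        System.out.println(distinct.size());   // prints 2
    }
}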

Related

List.stream().filter() vs removeIf

Please help me understand the difference.
I'm trying to remove elements from a list. If I use stream().filter(), it works, but when I use removeIf, it throws an UnsupportedOperationException.
private void filterEmployees(EmployeeResponse employeeResponse) {
    List<Employee> employees = employeeResponse.getEmployees();
    List<Employee> employeesFiltered = employees.stream()
            .filter(employee -> employee.getRole().equals("01")
                    || employee.getRole().equals("02")
                    || employee.getRole().equals("03"))
            .collect(Collectors.toList());
    employeeResponse.setEmployees(employeesFiltered);
}
The code below throws an UnsupportedOperationException:
private void filterEmployees(EmployeeResponse employeeResponse, List<String> rolesList) {
    List<Employee> employees = employeeResponse.getEmployees();
    employees.removeIf(employee -> !rolesList.contains(employee.getRole()));
    employeeResponse.setEmployees(employees);
}
Exception:
java.lang.UnsupportedOperationException: remove
at java.base/java.util.Iterator.remove(Iterator.java:102)
at java.base/java.util.Collection.removeIf(Collection.java:545)
Streaming the list with a filter simply goes over the current list, skips the unwanted values, and collects the rest into a new list. It is equivalent to iterating the list yourself and, whenever you see an unwanted value, skipping it; otherwise you add it to the new list, as the loop sketched below shows.
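Roughly, the loop equivalent of that pipeline would look like this (reusing the names from the second method in the question):

// manual equivalent of employees.stream().filter(...).collect(Collectors.toList())
List<Employee> filtered = new ArrayList<>();
for (Employee employee : employees) {
    if (rolesList.contains(employee.getRole())) {   // keep only the wanted roles
        filtered.add(employee);
    }
}
employeeResponse.setEmployees(filtered);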
removeIf was added in Java 8. It is declared in the Collection interface, which List extends. That method is not supported by immutable collections, such as the ones created by List.of.
I can't tell if that is the problem, so you may want to check whether that instance is immutable by doing the following:
List<Employee> employees = employeeResponse.getEmployees();
System.out.println(employees.getClass().getName());
One way to possibly solve that problem is to do the following:
List<Employee> employees = new ArrayList<>(employeeResponse.getEmployees());
This creates a mutable ArrayList and populates it with the employee list. Then removeIf should work as expected, as sketched below.
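Applied to the original method, that defensive copy would look roughly like this (same assumed EmployeeResponse accessors as in the question):

private void filterEmployees(EmployeeResponse employeeResponse, List<String> rolesList) {
    // copy into a mutable list first, in case getEmployees() returns an immutable one
    List<Employee> employees = new ArrayList<>(employeeResponse.getEmployees());
    employees.removeIf(employee -> !rolesList.contains(employee.getRole()));
    employeeResponse.setEmployees(employees);
}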

Iterate over Collected list in Java 8 GroupingBy

I have a List of objects, say List<Type1>, that I have grouped by type (using groupingBy).
Now I want to convert that Map<Integer, List<Type1>> into a list of Type2 objects, each holding both the list and the key (type) of its group.
class Type1 {
    int id;
    int type;
    String name;
}

class Type2 {
    int type;
    List<Type1> type1List;
}
This is what I have written to achieve this:
myCustomList.stream()
    .collect(groupingBy(Type1::getType))
    .entrySet()
    .stream()
    .map(type1Item -> new Type2() {
        {
            setType(type1Item.getKey());
            setType1List(type1Item.getValue());
        }
    })
    .collect(Collectors.toList());
This works perfectly, but I am trying to make the code even cleaner. Is there a way to avoid streaming this thing all over again, perhaps using some kind of flatMap, to achieve this?
You can pass a finisher function to collectingAndThen to do the work after the initial map has been formed:
List<Type2> result = myCustomList.stream()
    .collect(Collectors.collectingAndThen(Collectors.groupingBy(Type1::getType),
            m -> m.entrySet().stream()
                    .map(e -> new Type2(e.getKey(), e.getValue()))
                    .collect(Collectors.toList())));
You should give Type2 a constructor of the form
Type2(int type, List<Type1> type1List) {
    this.type = type;
    this.type1List = type1List;
}
Then, you can write .map(type1Item -> new Type2(type1Item.getKey(), type1Item.getValue())) instead of
.map(type1Item -> new Type2() {
    {
        setType(type1Item.getKey());
        setType1List(type1Item.getValue());
    }
})
See also What is Double Brace initialization in Java?
In short, this creates a memory leak, as it creates an anonymous subclass of Type2 that captures type1Item for its entire lifetime.
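A quick way to see that subclass, assuming the setters used in the question exist on Type2:

Type2 t = new Type2() {
    {
        setType(1);
    }
};
System.out.println(t.getClass());                  // an anonymous class (e.g. something like Main$1), not Type2
System.out.println(t.getClass().getSuperclass());  // class Type2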
But you can perform the conversion as part of the downstream collector of the groupingBy. This implies that you have to make the toList explicit, so you can combine it via collectingAndThen with the subsequent mapping:
Collection<Type2> collect = myCustomList.stream()
    .collect(groupingBy(Type1::getType,
            collectingAndThen(toList(), l -> new Type2(l.get(0).getType(), l))))
    .values();
If you really need a List, you can use
List<Type2> collect = myCustomList.stream()
    .collect(collectingAndThen(groupingBy(Type1::getType,
                    collectingAndThen(toList(), l -> new Type2(l.get(0).getType(), l))),
            m -> new ArrayList<>(m.values())));
You can also do it as shown below, mapping each entry of the grouped map to a Type2:
myCustomList.stream()
    .collect(groupingBy(Type1::getType))
    .entrySet()
    .stream()
    .map(type1Item -> new Type2(type1Item.getKey(), type1Item.getValue()))
    .collect(Collectors.toList());

map from string to multiple different strings and add to list with Java stream

I'm new to Java 8 and Streams.
I have a PolicyDefinition object that has two methods, getAlias and getName, which both return a String.
Is there an elegant way to create a list with all the aliases and names of the policy definitions, using a Stream (created from a collection of PolicyDefinition), in one statement?
With two statements it's not a problem:
List<String> policyNames = policyDefinitions.stream()
    .map(definition -> definition.getName())
    .collect(Collectors.toList());

List<String> policyAlias = policyDefinitions.stream()
    .map(definition -> definition.getAlias())
    .collect(Collectors.toList());
But is it possible in one?
Thanks a lot for the help.
flatMap it!
List<String> policyNames = policyDefinitions.stream()
    .flatMap(definition -> Stream.of(definition.getName(), definition.getAlias()))
    .collect(Collectors.toList());
As mentioned in the comments, for tidiness, create a method in PolicyDefinition:
public Stream<String> allNames() {
    return Stream.of(getName(), getAlias());
}
Then
List<String> policyNames = policyDefinitions.stream()
    .flatMap(PolicyDefinition::allNames)
    .collect(Collectors.toList());
OP comments "I forgot to mention that getAlias might be null, what do you do than[sic]"
In that case, use Optional:
public Stream<String> allNames() {
    // note: Optional.stream() was added in Java 9
    return Stream.concat(Stream.of(getName()), Optional.ofNullable(getAlias()).stream());
}
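If you are stuck on Java 8 (where Optional has no stream() method), a plain null filter gives the same effect; this is just a sketch, not part of the original answer:

public Stream<String> allNames() {
    // java.util.Objects::nonNull drops a null alias; no Optional.stream() needed
    return Stream.of(getName(), getAlias())
            .filter(Objects::nonNull);
}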
You can also create a Map with the alias as the key and the name as the value, using the groupingBy collector; a sketch follows.
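That idea could look roughly like the sketch below; whether you want toMap or groupingBy depends on whether an alias can repeat, and both variants assume non-null aliases (illustrative only, not from the original answer):

// one name per alias; the merge function keeps the first name if an alias repeats
Map<String, String> nameByAlias = policyDefinitions.stream()
        .collect(Collectors.toMap(PolicyDefinition::getAlias,
                PolicyDefinition::getName,
                (first, second) -> first));

// or, if several definitions can share an alias, collect all their names
Map<String, List<String>> namesByAlias = policyDefinitions.stream()
        .collect(Collectors.groupingBy(PolicyDefinition::getAlias,
                Collectors.mapping(PolicyDefinition::getName, Collectors.toList())));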

HashMap null check in Merge Operation

Why is HashMap.merge doing a null check on the value? HashMap supports null keys and null values, so can someone please explain why the null check in merge is required?
@Override
public V merge(K key, V value,
               BiFunction<? super V, ? super V, ? extends V> remappingFunction) {
    if (value == null)
        throw new NullPointerException();
    if (remappingFunction == null)
        throw new NullPointerException();
Because of this, I am unable to use Collectors.toMap(Function.identity(), this::get) to collect values into a Map.
The behavior is mandated by the Map.merge contract:
Throws:
…
NullPointerException - if the specified key is null and this map does not support null keys or the value or remappingFunction is null
Note that using Map.merge for Collectors.toMap without a merge function is an implementation detail; it not only disallows null values, it also does not provide the desired behavior for reporting duplicate keys: the Java 8 implementation wrongly reports one of the two values as the key when there are duplicate keys.
In Java 9, the implementation was completely rewritten; it does not use Map.merge anymore. But the new implementation is behaviorally compatible, now with code that explicitly throws when the value is null. So the behavior of Collectors.toMap not accepting null values is now fixed in the code and is no longer an artifact of using Map.merge. (This still refers only to the toMap collector without a merge function.)
Unfortunately, the documentation does not mention it.
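To make the restriction concrete, here is a small sketch in the spirit of the question's Collectors.toMap(Function.identity(), this::get) call; the map contents are made up:

Map<String, String> source = new HashMap<>();
source.put("a", "1");
source.put("b", null);   // a null value, which HashMap itself happily stores

// throws NullPointerException once the null value is reached:
// in Java 8 via Map.merge, in Java 9+ via the explicit null check
Map<String, String> copy = source.keySet().stream()
        .collect(Collectors.toMap(Function.identity(), source::get));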
Internally, Collectors.toMap uses Map#merge, so you can't really do anything about it. Using the static Collectors.toMap is not an option (which, by the way, is documented to throw a NullPointerException).
But spinning up a custom collector to do what you want (which you have not shown) is not that complicated; here is an example:
Map<Integer, Integer> result = Arrays.asList(null, 1, 2, 3)
    .stream()
    .collect(HashMap::new,
            (map, i) -> {
                map.put(i, i);
            },
            HashMap::putAll);
As a workaround for the mentioned problems with null values in toMap and merge, you can use a custom collector in the following manner:
public static <T, R> Map<T, R> mergeTwoMaps(final Map<T, R> map1,
                                            final Map<T, R> map2,
                                            final BinaryOperator<R> mergeFunction) {
    return Stream.of(map1, map2)
            .flatMap(map -> map.entrySet().stream())
            .collect(HashMap::new,
                    (accumulator, entry) -> {
                        R value = accumulator.containsKey(entry.getKey())
                                ? mergeFunction.apply(accumulator.get(entry.getKey()), entry.getValue())
                                : entry.getValue();
                        accumulator.put(entry.getKey(), value);
                    },
                    HashMap::putAll);
}
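For example, a quick usage sketch of that helper with made-up maps; null values pass through, and the merge function decides what to keep for keys present in both maps:

Map<String, Integer> first = new HashMap<>();
first.put("x", 1);
first.put("y", null);    // null values are fine with this collector

Map<String, Integer> second = new HashMap<>();
second.put("x", 5);

// "x" is present in both maps, so the merge function is consulted; here we keep the second value
Map<String, Integer> merged = mergeTwoMaps(first, second, (v1, v2) -> v2);
// merged now contains x=5 and y=null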

Functional programming in java: Cloning Vs Mutating. Good or bad?

Mutating:
"transformEmployeeNameToUpperCase" function to transform employee name to uppercase.
List<Employee> employeesStartsWithDInUppercase1 = employees.stream()
    .filter(employee -> employee.getName().startsWith("D"))
    .map(Main::transformEmployeeNameToUpperCase)
    .collect(Collectors.toList());

public static Employee transformEmployeeNameToUpperCase(Employee employee) {
    employee.setName(employee.getName().toUpperCase());
    return employee;
}
Cloning:
"createEmployeeWithUpperCaseName" function to new employee with name in uppercase.
List<Employee> employeesStartsWithDInUppercase2 = employees.stream()
    .filter(employee -> employee.getName().startsWith("D"))
    .map(Main::createEmployeeWithUpperCaseName)
    .collect(Collectors.toList());

public static Employee createEmployeeWithUpperCaseName(Employee e) {
    return new Employee(e.getId(), e.getName().toUpperCase(), e.getDesignation(), e.getAge());
}
Does "createEmployeeWithUpperCaseName" follow rule 1(above) as they say
yes: the employee is not being modified
In case of "transformEmployeeNameToUpperCase", does it follow rule 2(above)?
yes, although the rule uses an incorrect terminology. It creates an object, not a variable. You can't create a variable.
Is it good practice to use transformEmployeeNameToUpperCase way?
No, at least not the way you're doing it. There's nothing bad per se in modifying mutable objects: they're mutable for a reason. But a map() operation shouldn't modify its input and return it. You're perverting its purpose. A future reader of your code wouldn't expect a map operation to mutate its input, and you're thus making your code do unexpected things, leading to bugs and/or misunderstandings. It would be better to do it this way:
employees.stream()
    .filter(employee -> employee.getName().startsWith("D"))
    .forEach(e -> e.setName(e.getName().toUpperCase()));
That way, it makes it clear that the point of the pipeline is to have a side effect on the elements of the list. And it doesn't create a (probably) useless copy of the list, too.
I agree with @JB Nizet, but if you don't want to change the original object and still want the employee name in uppercase, you can use object cloning.
Pseudocode:
List<Employee> employeeWithUpperCaseName = employees.parallelStream()
    .filter(e -> e.getName().startsWith("D"))
    .map(x -> {
        Employee s = null;
        try {
            s = (Employee) x.clone();
            s.setName(x.getName().toUpperCase());
        } catch (CloneNotSupportedException e) {
            e.printStackTrace();
        } finally {
            return s;
        }
    })
    .collect(Collectors.toList());
You can write it in a better way; one possible cleanup is sketched below.
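For instance, moving the clone handling into a helper keeps the pipeline readable and avoids returning from a finally block. This is only a sketch: it assumes Employee overrides clone() as public (keeping CloneNotSupportedException in its throws clause, as the original snippet implies) and that the helper lives in Main:

// hypothetical helper; clones the employee and uppercases the copy's name
private static Employee cloneWithUpperCaseName(Employee e) {
    try {
        Employee copy = (Employee) e.clone();
        copy.setName(e.getName().toUpperCase());
        return copy;
    } catch (CloneNotSupportedException ex) {
        throw new IllegalStateException("Employee must support clone()", ex);
    }
}

List<Employee> employeeWithUpperCaseName = employees.stream()
    .filter(e -> e.getName().startsWith("D"))
    .map(Main::cloneWithUpperCaseName)
    .collect(Collectors.toList());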

Resources