Several ways to de-duplicate lists
Here I would like to share a few ways to deduplicate a list. If there are any mistakes, please feel free to point them out in the comments.
1. The Stream distinct() method
distinct() is a method provided by Stream in Java 8 that returns a stream consisting of the distinct elements of that stream. distinct() uses the hashCode() and equals() methods to decide which elements are distinct, so classes whose instances need to be deduplicated must implement hashCode() and equals(). In other words, we can override hashCode() and equals() in our own classes to deduplicate by whatever criteria a specific requirement calls for.
The distinct() method is declared as follows:
Stream<T> distinct();
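As a sketch of that last point (my own illustration, not from the original post), here is a hypothetical Product class that should be deduplicated by its code field alone, ignoring name:

import java.util.Objects;

// Hypothetical class: two Products compare equal, and are therefore
// collapsed by distinct(), whenever their code fields match.
public class Product {
    private final String code;
    private final String name;

    public Product(String code, String name) {
        this.code = code;
        this.name = name;
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof Product)) return false;
        return Objects.equals(code, ((Product) o).code);
    }

    @Override
    public int hashCode() {
        return Objects.hash(code);
    }
}

With this in place, list.stream().distinct() keeps only the first Product seen for each code.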
1.1 Deduplicating a String list
Because the String class already overrides the equals() and hashCode() methods, it can be deduplicated successfully.
@Test
public void listDistinctByStreamDistinct() {
    // 1. Deduplicate the String list
    // ("out" is a static import: import static java.lang.System.out;)
    List<String> stringList = new ArrayList<String>() {{
        add("A");
        add("A");
        add("B");
        add("B");
        add("C");
    }};
    out.print("Before deduplication: ");
    for (String s : stringList) {
        out.print(s);
    }
    out.println();
    stringList = stringList.stream().distinct().collect(Collectors.toList());
    out.print("After deduplication: ");
    for (String s : stringList) {
        out.print(s);
    }
    out.println();
}
The results are as follows:
Before deduplication: AABBC
After deduplication: ABC
1.2 Deduplicating an entity class list
Note: in the code we use the Lombok plugin's @Data annotation, which automatically overrides the equals() and hashCode() methods for us.
/**
 * Define an entity class
 */
@Data
public class Student {
    private String stuNo;
    private String name;
}
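The tests in this post also call a getStudentList() helper whose source the original does not show. Judging from the printed output in this section, it presumably builds something like the list below (the data here is inferred from that output, and the examples in section 2 appear to use slightly different values):

private List<Student> getStudentList() {
    // Inferred test data: the duplicate of stuNo 001 is what distinct() removes
    List<Student> studentList = new ArrayList<>();
    studentList.add(newStudent("001", "Tom"));
    studentList.add(newStudent("002", "Mike"));
    studentList.add(newStudent("001", "Tom"));
    return studentList;
}

private Student newStudent(String stuNo, String name) {
    // @Data generates setters, so we populate the fields through them; adding
    // @AllArgsConstructor to Student would make this small helper unnecessary.
    Student student = new Student();
    student.setStuNo(stuNo);
    student.setName(name);
    return student;
}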
@Test
public void listDistinctByStreamDistinct() throws JsonProcessingException {
    ObjectMapper objectMapper = new ObjectMapper();
    // 1. Deduplicate the Student list
    List<Student> studentList = getStudentList();
    out.print("Before deduplication: ");
    out.println(objectMapper.writeValueAsString(studentList));
    studentList = studentList.stream().distinct().collect(Collectors.toList());
    out.print("After deduplication: ");
    out.println(objectMapper.writeValueAsString(studentList));
}
The results are as follows:
Before deduplication: [{"stuNo":"001","name":"Tom"}, {"stuNo":"002","name":"Mike"}, {"stuNo":"001","name":"Tom"}]
After deduplication: [{"stuNo":"001","name":"Tom"}, {"stuNo":"002","name":"Mike"}]
2. Deduplicating a List<Object> by one of the object's attributes
2.1 Creating a new list
@Test
public void distinctByProperty1() throws JsonProcessingException {
    // The first way: deduplicate by one attribute of the object by creating
    // a new list that contains only the distinct elements
    ObjectMapper objectMapper = new ObjectMapper();
    List<Student> studentList = getStudentList();
    out.print("Before deduplication: ");
    out.println(objectMapper.writeValueAsString(studentList));
    studentList = studentList.stream().distinct().collect(Collectors.toList());
    out.print("After distinct(): ");
    out.println(objectMapper.writeValueAsString(studentList));
    // Here we use two statically imported methods together with a TreeSet<>
    // to pick out the distinct elements:
    // 1. import static java.util.stream.Collectors.collectingAndThen;
    // 2. import static java.util.stream.Collectors.toCollection;
    studentList = studentList.stream().collect(
            collectingAndThen(
                    toCollection(() -> new TreeSet<>(Comparator.comparing(Student::getName))),
                    ArrayList::new));
    out.print("After deduplication by name: ");
    out.println(objectMapper.writeValueAsString(studentList));
}
The results are as follows:
Before deduplication: [{"stuNo":"001","name":"Tom"}, {"stuNo":"001","name":"Tom"}, {"stuNo":"003","name":"Tom"}]
After distinct(): [{"stuNo":"001","name":"Tom"}, {"stuNo":"003","name":"Tom"}]
After deduplication by name: [{"stuNo":"001","name":"Tom"}]
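If the deduplication key has to span more than one attribute, the comparator handed to the TreeSet can be chained with thenComparing(); a minimal sketch of the same collector (my own addition, assuming we want both name and stuNo to match):

// Deduplicate by name and stuNo together: only elements equal on both
// attributes are treated as duplicates.
studentList = studentList.stream().collect(
        collectingAndThen(
                toCollection(() -> new TreeSet<>(
                        Comparator.comparing(Student::getName)
                                .thenComparing(Student::getStuNo))),
                ArrayList::new));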
2.2 Using the filter() method
First we create a method that returns a Predicate, to be passed as the argument to Stream.filter(); it decides whether an element can be added to a Set, as follows:
private static <T> Predicate<T> distinctByKey(Function<? super T, ?> keyExtractor) {
    // A concurrent Set records the keys seen so far; Set.add() returns false
    // for a key that is already present, so the predicate keeps only the
    // first element carrying each key.
    Set<Object> seen = ConcurrentHashMap.newKeySet();
    return t -> seen.add(keyExtractor.apply(t));
}
Use as follows:
@Test
public void distinctByProperty2() throws JsonProcessingException {
    // The second way: use filtering to deduplicate by one attribute of the object
    ObjectMapper objectMapper = new ObjectMapper();
    List<Student> studentList = getStudentList();
    out.print("Before deduplication: ");
    out.println(objectMapper.writeValueAsString(studentList));
    studentList = studentList.stream().distinct().collect(Collectors.toList());
    out.print("After distinct(): ");
    out.println(objectMapper.writeValueAsString(studentList));
    // Here we pass the distinctByKey() method to filter() to drop the
    // elements whose key could not be added to the Set
    studentList = studentList.stream().filter(distinctByKey(Student::getName)).collect(Collectors.toList());
    out.print("After deduplication by name: ");
    out.println(objectMapper.writeValueAsString(studentList));
}
The results are as follows:
Before deduplication: [{"stuNo":"001","name":"Tom"}, {"stuNo":"001","name":"Tom"}, {"stuNo":"003","name":"Tom"}]
After distinct(): [{"stuNo":"001","name":"Tom"}, {"stuNo":"003","name":"Tom"}]
After deduplication by name: [{"stuNo":"001","name":"Tom"}]
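One detail worth pointing out (my own note, not from the original post): because distinctByKey() records seen keys in a concurrent Set, the same filter can also be used on a parallel stream; the trade-off is that which one of the duplicates survives is then no longer guaranteed to be the first in encounter order:

// Safe on a parallel stream thanks to ConcurrentHashMap.newKeySet(),
// but the surviving duplicate is no longer deterministic.
studentList = studentList.parallelStream()
        .filter(distinctByKey(Student::getName))
        .collect(Collectors.toList());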
3. Summary
These are the ways to deduplicate lists that I wanted to share with you. Of course, I haven't gone into a more detailed performance analysis here; hopefully I can follow up with one later. If there is a mistake, please feel free to point it out.
Code address: Github