What is a Stream?
- A Stream, introduced in Java 8, is a completely different concept from InputStream and OutputStream in the java.io package. Put simply, a Stream is a sequence of data items, but it is not a storage container; it is better thought of as an algorithmic pipeline over those items.
- Scala and Groovy demonstrate that functions as first-class citizens can greatly expand a programmer's toolkit. Stream brings the same functional-programming ideas to Java, greatly boosting productivity. Master the power of Stream and you can write cleaner, more expressive code, and maybe even say goodbye to 996.
2. Stream construction
- collection.stream() or collection.parallelStream() on a collection;
- Static factory methods of Stream, such as Stream.of();
- Random number streams, such as Random.ints();
- Wrapper types are the common case, but primitives have streams too: use the static methods of, range and empty on the IntStream, LongStream and DoubleStream interfaces;
- Constructed from an input source such as a file.
//1. From a collection
Arrays.asList(1, 2, 3, 4, 5).stream()... ;
//2. Static factory
Stream.of(1, 2, 3, 4, 5)...
//3. Random number streams
//IntStream
new Random(100).ints();
IntStream.of(1, 2, 3);
IntStream.range(1, 100);
//LongStream
new Random(100).longs();
//DoubleStream
new Random(100).doubles();
//4. IntStream/LongStream/DoubleStream
// The boxed Stream<Integer>, Stream<Long>, Stream<Double> forms can also be used
IntStream.of(new int[] {1, 2, 3});
//5. File input construction
// Normally a Stream does not need to be closed; only a Stream running on an IO channel must be closed
try (final Stream<String> lines = Files.lines(Paths.get("somePath"))) {
    lines.forEach(System.out::println);
}
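Pulled together, the construction routes above can be sketched as one runnable class (the class and method names here are my own, not from any library):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;
import java.util.stream.Stream;

public class StreamConstruction {
    // Build a stream from a primitive range and box it into a List
    static List<Integer> rangeToList(int startInclusive, int endExclusive) {
        return IntStream.range(startInclusive, endExclusive)
                .boxed()
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        // From a collection
        Arrays.asList(1, 2, 3, 4, 5).stream().forEach(System.out::println);
        // From a static factory
        Stream.of("a", "b", "c").forEach(System.out::println);
        // From a primitive range (end is exclusive)
        System.out.println(rangeToList(1, 5)); // [1, 2, 3, 4]
    }
}
```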
3. Stream workflow
- A Stream works like an iterator: it traverses each element of the stream once, in a single pass. But a Stream can do much more than iterate.
- Stream operations divide into intermediate operations and terminal operations. When the IDE pops up suggestions after a stream call, developers often do not understand why both kinds appear, because they have not grasped this distinction.
- In simple terms, an intermediate operation returns a stream after execution, similar to the methods in the Builder design pattern that return this; returning a stream is what enables the chained-call syntax. A terminal operation, on the other hand, terminates the pipeline and generally returns void or a non-stream result: toList(), toSet(), toMap() and toArray() produce non-stream results, and the side-effecting forEach() returns void.
- map, flatMap, filter, peek, limit, skip, distinct, sorted... are intermediate operations; forEach, forEachOrdered, collect, findFirst, min and max are terminal operations.
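The laziness of intermediate operations is easy to verify: a peek with no terminal operation never runs. A small sketch (the class and method names are mine):

```java
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.Stream;

public class LazyDemo {
    // Count how many elements peek actually sees, with or without a terminal operation
    static int countPeeked(boolean runTerminalOp) {
        AtomicInteger peeked = new AtomicInteger();
        Stream<Integer> s = Stream.of(1, 2, 3).peek(x -> peeked.incrementAndGet());
        if (runTerminalOp) {
            s.forEach(x -> { }); // the terminal operation pulls elements through peek
        }
        return peeked.get();
    }

    public static void main(String[] args) {
        System.out.println(countPeeked(false)); // intermediate only: nothing runs
        System.out.println(countPeeked(true));  // terminal op triggers the pipeline
    }
}
```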
3.1. Intermediate operation
3.1.1, map
- Applies to every element in stream
// Two equivalent ways to uppercase and sort strings
//1. Method reference
Stream.of("apple", "banana", "orange", "grapes", "melon", "blueberry", "blackberry")
    .map(String::toUpperCase)
    .sorted();
//2. Lambda expression
Stream.of("apple", "banana", "orange", "grapes", "melon", "blueberry", "blackberry")
    .map(v -> v.toUpperCase())
    .sorted();
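As a self-contained version of the example above (the class and method names are mine):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class MapDemo {
    // Uppercase each element, then sort alphabetically
    static List<String> upperSorted(String... fruits) {
        return Stream.of(fruits)
                .map(String::toUpperCase)
                .sorted()
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(upperSorted("banana", "apple", "orange"));
    }
}
```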
3.1.2, flatMap
- Like map, flatMap applies a function to each element in a stream.
- As the signatures show, map's mapper returns a plain object, and those objects form the new stream; flatMap's mapper returns a Stream for each element, and flatMap merges those streams instead of nesting them. It is usually used to flatten a stream.
// map() signature
<R> Stream<R> map(Function<? super T, ? extends R> mapper);
// flatMap() signature
<R> Stream<R> flatMap(Function<? super T, ? extends Stream<? extends R>> mapper);
// flatMap's mapper returns a Stream
Stream.of(1, 22, 33).flatMap(v -> Stream.of(v * v)).collect(Collectors.toList());
// map's mapper returns an object
Stream.of(1, 22, 33).map(v -> v * v).collect(Collectors.toList());
// flatMap flattening example
List<Map<String, String>> list = new ArrayList<>();
Map<String, String> map1 = new HashMap<>();
map1.put("1", "one");
map1.put("2", "two");
Map<String, String> map2 = new HashMap<>();
map2.put("3", "three");
map2.put("4", "four");
list.add(map1);
list.add(map2);
Set<String> output = list.stream()    // Stream<Map<String, String>>
    .map(Map::values)                 // Stream<Collection<String>>
    .flatMap(Collection::stream)      // Stream<String>
    .collect(Collectors.toSet());     // Set<String>: [one, two, three, four]
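A minimal runnable sketch of the flattening behaviour (the class and method names are mine):

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;

public class FlatMapDemo {
    // Flatten a list of lists into a single list
    static List<Integer> flatten(List<List<Integer>> nested) {
        return nested.stream()
                .flatMap(List::stream)   // each inner List becomes a Stream; flatMap merges them
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(flatten(Arrays.asList(
                Arrays.asList(1, 2), Arrays.asList(3, 4))));
    }
}
```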
3.1.3, peek
- peek also operates on each element of the stream: besides producing a new stream containing all the original elements, it takes a Consumer to run on each one.
- Compared with map, peek is for output, external processing and other side effects during stream processing; its consumer returns no value. The new stream contains all the elements of the original, and each element runs peek's consumer as it is consumed.
// Perform a side effect on each element
List<Integer> list = new ArrayList<>();
List<Integer> result = Stream.of(1, 2, 3, 4)
    .peek(x -> list.add(x))
    .map(x -> x * 2)
    .collect(Collectors.toList());
System.out.println(list);   // [1, 2, 3, 4]
// map() signature: the mapper returns a value R
<R> Stream<R> map(Function<? super T, ? extends R> mapper);
// peek() signature: the consumer returns void
Stream<T> peek(Consumer<? super T> action);
3.1.4, filter
- Applies a predicate and produces a new stream of the elements that match.
// Keep only elements > 0
Arrays.asList(1, 2, 3, 4, 5)
    .stream()
    .filter(v -> v > 0)
    .toArray(Integer[]::new);
// Keep only strings starting with the letter a
Stream.of("apple", "banana", "orange", "grapes", "melon", "blueberry", "blackberry")
    .filter(s -> s.startsWith("a"))
    .forEach(System.out::println);
3.1.5, limit/skip
- limit(n) keeps only the first n elements of the stream; skip(n) discards the first n elements.
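A common combination is pagination; a sketch assuming a simple page/size scheme of my own:

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class LimitSkipDemo {
    // Take one "page" of results: skip the first (page * size) elements, then keep size of them
    static List<Integer> page(int page, int size) {
        return IntStream.rangeClosed(1, 100)
                .boxed()
                .skip((long) page * size)
                .limit(size)
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(page(2, 3)); // [7, 8, 9]
    }
}
```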
3.1.6, distinct
- Removes duplicate elements from the stream.
// Deduplicate
Stream.of(1, 2, 3, 3, 3, 2, 4, 5, 6)
    .distinct()
    .collect(Collectors.toSet());
3.2 Terminal operation
3.2.1, findFirst
- Returns the first element of the stream wrapped in an Optional; if the stream is empty, the Optional is empty rather than null.
- An Optional may or may not hold a value; its main purpose is to avoid NPEs where possible.
Optional<String> ops = Stream.of("apple", "banana", "orange", "blueberry", "blackberry")
    .filter(s -> s.startsWith("b"))
    .findFirst();
ops.orElse("apple");    // banana
Optional<String> ops2 = Stream.of("apple", "banana", "orange", "blueberry", "blackberry")
    .filter(s -> s.startsWith("c"))
    .findFirst();
ops2.orElse("apple");   // apple
3.2.2 powerful collect
First, untangle the similarly named types: Collection, Collections, collect, Collector and Collectors.
1. Collection is the ancestor interface of the Java collections framework;
2. Collections is a utility class in java.util containing static methods for working with collections;
3. java.util.stream.Stream#collect(java.util.stream.Collector) is a Stream method responsible for collecting the stream's elements;
4. java.util.stream.Collector is the interface that declares what a collector does;
5. java.util.stream.Collectors is a utility class providing ready-made common Collectors, such as Collectors.toList().
- toList/toMap
//toList
// way 1
List<String> list = Stream.of("I", "love", "you", "too")
    .collect(ArrayList::new, ArrayList::add, ArrayList::addAll);
// way 2
List<String> list2 = stream.collect(Collectors.toList());
//toMap
Map<Integer, Integer> collect1 = Stream.of(1, 3, 4)
    .collect(Collectors.toMap(x -> x, x -> x + 1));
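One caveat worth knowing: Collectors.toMap throws IllegalStateException on duplicate keys unless you pass a merge function as its third argument. A sketch (the class and method names are mine):

```java
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class ToMapDemo {
    // Index words by length, keeping the last word seen for each length
    static Map<Integer, String> lastByLength(String... words) {
        return Stream.of(words)
                .collect(Collectors.toMap(
                        String::length,               // key
                        w -> w,                       // value
                        (first, second) -> second));  // merge function for duplicate keys
    }

    public static void main(String[] args) {
        System.out.println(lastByLength("one", "two", "three"));
    }
}
```

Without the third argument, "one" and "two" (both length 3) would collide and the collector would throw.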
- Array to List
List<String> list = new ArrayList<>(Arrays.asList("a", "b", "c"));
Integer[] myArray = { 1, 2, 3 };
List<Integer> myList = Arrays.stream(myArray).collect(Collectors.toList());
// Primitive arrays work too (boxed() boxes each element)
int[] myArray2 = { 1, 2, 3 };
List<Integer> myList2 = Arrays.stream(myArray2).boxed().collect(Collectors.toList());
- Perhaps collect is the most commonly used terminal operation: toList, toSet and toMap cover most needs in one go. But look at collect's signatures and you will find it is not so simple.
//collect overload 1
<R> R collect(
    Supplier<R> supplier,
    BiConsumer<R, ? super T> accumulator,
    BiConsumer<R, R> combiner)
//collect overload 2
/** @param1 supplier: creates the result container.
    @param2 accumulator: folds each element into the container.
    @param3 combiner: the merge policy for containers produced by parallel subtasks. */
<R, A> R collect(Collector<? super T, A, R> collector);
- From the signatures you can see that toList, toSet and toMap use only the second overload of collect, with a Collector supplied by the JDK. Because these operations are so common, the JDK provides these Collectors out of the box.
- We can also implement our own Collector:
/*
 * T: the type of the elements in the stream to be collected
 * A: the type of the accumulator, the object that accumulates partial results during collection
 * R: the type of the result of the collection operation (usually, but not necessarily, a collection)
 */
public interface Collector<T, A, R> {
    // Creates the result container
    Supplier<A> supplier();
    // The accumulator: folds one element into the container
    BiConsumer<A, T> accumulator();
    // Merges two result containers
    BinaryOperator<A> combiner();
    // Applies the final transformation to the result container
    Function<A, R> finisher();
    // Characteristics of the collector
    Set<Characteristics> characteristics();
}

// Custom Collector, method by method
//1. supplier() creates a new result container; it must return an empty container for the collection process.
//   toList returns an empty List; toSet and toMap are analogous.
@Override
public Supplier<List<T>> supplier() {
    return ArrayList::new;
}

//2. accumulator() performs the actual accumulation.
//   The BiConsumer returns void and takes two arguments: the accumulated container and the current element.
@Override
public BiConsumer<List<T>, T> accumulator() {
    // Add the current element to the list
    return List::add;
}

//3. finisher() converts the final result container.
//   After traversal completes, the result sometimes needs a final transformation; finisher() returns the
//   last function called in the accumulation, converting the accumulator object into the final result.
@Override
public Function<List<T>, List<T>> finisher() {
    // Output as-is; Function.identity() would also work
    return (i) -> i;
}

//4. combiner() merges containers.
//   Streams support parallel execution; combiner() specifies how the results of parallel subtasks are merged.
@Override
public BinaryOperator<List<T>> combiner() {
    // Each subtask accumulates its own List; merge the second into the first
    return (list1, list2) -> {
        list1.addAll(list2);
        return list1;
    };
}

//5. characteristics() declares the collector's characteristics.
- A toList-style Collector; the core is understanding T, A and R:
public class MyCollector<T> implements Collector<T, List<T>, List<T>> {
    @Override
    public Supplier<List<T>> supplier() {
        return ArrayList::new;
    }
    @Override
    public BiConsumer<List<T>, T> accumulator() {
        return (List<T> l, T t) -> {
            l.add(t);
        };
    }
    @Override
    public BinaryOperator<List<T>> combiner() {
        return (List<T> l, List<T> r) -> {
            List<T> list = new ArrayList<>(l);
            list.addAll(r);
            return list;
        };
    }
    @Override
    public Function<List<T>, List<T>> finisher() {
        return Function.identity();
    }
    @Override
    public Set<Characteristics> characteristics() {
        return Collections.unmodifiableSet(EnumSet.of(Characteristics.IDENTITY_FINISH));
    }
}
Stream<String> apple = Stream
    .of("apple", "banana", "orange", "grapes", "melon", "blueberry", "blackberry");
System.out.println(apple.collect(new MyCollector<>()));
// Concatenate string concat
String concat = Stream
    .of("apple", "apple", "banana", "orange", "grapes", "melon", "berry", "blary")
    .collect(StringBuilder::new,
        StringBuilder::append,
        StringBuilder::append)
    .toString();
// Written with explicit lambdas, the same collect reads more clearly
String concat2 = stringStream.collect(() -> new StringBuilder(), (l, x) -> l.append(x), (r1, r2) -> r1.append(r2)).toString();
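For string concatenation specifically, the JDK also provides Collectors.joining, which wraps the same StringBuilder idea; a sketch (the class and method names are mine):

```java
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class JoiningDemo {
    // Join parts with a delimiter using the built-in joining collector
    static String joined(String delimiter, String... parts) {
        return Stream.of(parts).collect(Collectors.joining(delimiter));
    }

    public static void main(String[] args) {
        System.out.println(joined(", ", "apple", "banana", "orange"));
    }
}
```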
3.2.3 Reduce Operation
- reduce combines the stream's elements. It takes an initial value (seed) and a combining rule (BinaryOperator), applies the rule to the seed and the first element, then to that result and the second element, and so on through the last element. In this sense, string concatenation, sum, min, max and average over arrays are all special cases of reduce.
Integer sum = integers.reduce(0, (a, b) -> a + b);
// or
Integer sum = integers.reduce(0, Integer::sum);
- Without a seed, reduce combines the first two elements of the stream, then folds in the rest, and returns an Optional (a seedless reduce may run on a stream with no elements, so the API is designed to return Optional).
// By hand with collect
final Integer[] integers = Lists.newArrayList(1, 2, 3, 4, 5)
    .stream()
    .collect(() -> new Integer[]{0}, (a, x) -> a[0] += x, (a1, a2) -> a1[0] += a2[0]);
// The reducing collector
final Integer collect = Lists.newArrayList(1, 2, 3, 4, 5)
    .stream()
    .collect(Collectors.reducing(0, Integer::sum));
// Of course Stream also provides reduce directly
final Integer collect2 = Lists.newArrayList(1, 2, 3, 4, 5)
    .stream().reduce(0, Integer::sum);
- Reduce: string concatenation
String concat = Stream.of("A", "B", "C", "D")
    .reduce("", String::concat);
- Reduce evaluates to the minimum value
double minValue = Stream.of(-1.5, 1.0, -3.0, -2.0)
    .reduce(Double.MAX_VALUE, Double::min);
- Reduce sum
// With a seed
int sumValue = Stream.of(1, 2, 3, 4)
    .reduce(0, Integer::sum);
// Without a seed
int sumValue2 = Stream.of(1, 2, 3, 4)
    .reduce(Integer::sum).get();
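For numeric sums, a primitive stream is often more direct than reduce, since mapToInt avoids boxing; a sketch (the class and method names are mine):

```java
import java.util.stream.Stream;

public class SumDemo {
    // Sum via a primitive IntStream instead of reduce
    static int sum(Integer... values) {
        return Stream.of(values)
                .mapToInt(Integer::intValue) // unbox once into an IntStream
                .sum();
    }

    public static void main(String[] args) {
        System.out.println(sum(1, 2, 3, 4)); // 10
    }
}
```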
4. Common uses in development
4.1. Generate your own stream
- You can control how a stream is generated by implementing the Supplier interface.
- Pass the Supplier instance to Stream.generate(); the resulting stream is serial (as opposed to parallel) and unordered (as opposed to ordered) by default. Since it is infinite, the pipeline must bound it with an operation such as limit.
Random seed = new Random();
Supplier<Integer> random = seed::nextInt;
Stream.generate(random).limit(10).forEach(System.out::println);
// Another way
IntStream.generate(() -> (int) (System.nanoTime() % 100))
    .limit(10).forEach(System.out::println);
4.2. Generate arithmetic sequence
// Step size 3
Stream.iterate(0, n -> n + 3)
    .limit(10)
    .forEach(x -> System.out.print(x + " "));
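The same iterate pipeline, wrapped into a runnable method that collects instead of printing (the class and method names are mine):

```java
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class IterateDemo {
    // First n terms of the arithmetic sequence starting at seed with the given step
    static List<Integer> arithmetic(int seed, int step, int n) {
        return Stream.iterate(seed, x -> x + step)
                .limit(n) // iterate is infinite; limit bounds it
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        System.out.println(arithmetic(0, 3, 5)); // [0, 3, 6, 9, 12]
    }
}
```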
4.3 Implementing filter with reduce and collect
1. collect is simpler and more efficient. 2. reduce must create a new ArrayList each time because the accumulator BiFunction given as reduce's second parameter must not mutate its acc argument; each accepted element therefore copies the list with new ArrayList<>(acc) and returns the new list.
// reduce version
public static <T> List<T> filter(Stream<T> stream, Predicate<T> predicate) {
    return stream.reduce(new ArrayList<T>(), (acc, t) -> {
        if (predicate.test(t)) {
            List<T> lists = new ArrayList<T>(acc);
            lists.add(t);
            return lists;
        }
        return acc;
    }, (List<T> left, List<T> right) -> {
        List<T> lists = new ArrayList<T>(left);
        lists.addAll(right);
        return lists;
    });
}
// collect version
public static <T> List<T> filter(Stream<T> stream, Predicate<T> predicate) {
return stream.collect(ArrayList::new, (acc, t) -> {
if (predicate.test(t))
acc.add(t);
}, ArrayList::addAll);
}
4.4 Specifying the reduction container type
// Use toCollection() to specify the container the stream is reduced into
ArrayList<String> arrayList = stream.collect(Collectors.toCollection(ArrayList::new));
HashSet<String> hashSet = stream.collect(Collectors.toCollection(HashSet::new));
4.5 groupingBy and downstream collectors
- The collector produced by partitioningBy() suits splitting the elements of a stream into two complementary, non-overlapping parts based on binary logic (yes or no), such as sex, or pass versus fail
- groupingBy() groups data by an attribute; elements with the same attribute value are mapped to the same key of the Map
//partitioningBy()
Map<Boolean, List<Student>> passingFailing = students.stream()
.collect(Collectors.partitioningBy(s -> s.getGrade() >= PASS_THRESHOLD));
//groupingBy
Map<Department, List<Employee>> byDept = employees.stream()
.collect(Collectors.groupingBy(Employee::getDepartment));
// Downstream mapping
// Group employees by department and keep only their names
Map<Department,List<String>> byDept=employees.stream()
.collect(Collectors.groupingBy(Employee::getDepartment,
Collectors.mapping(Employee::getName, // Downstream collector
Collectors.toList()))); // Further downstream collector
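counting() is another common downstream collector; since Employee and Department are not defined in this excerpt, this sketch groups plain strings instead (the class and method names are mine):

```java
import java.util.Map;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class GroupingDemo {
    // Group words by length and count each group; a stand-in for the Employee example above
    static Map<Integer, Long> countByLength(String... words) {
        return Stream.of(words)
                .collect(Collectors.groupingBy(
                        String::length,            // classifier
                        Collectors.counting()));   // downstream collector
    }

    public static void main(String[] args) {
        System.out.println(countByLength("apple", "melon", "fig"));
    }
}
```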