I’ve been reading Modern Java in Action, 2nd Edition on and off, and I’ve learned a lot (it’s nice to have some “me time” after having a baby). I had read the first edition (titled “Java 8 in Action”), but not as a paper book, and I never finished even the first, foundational part, so it didn’t leave much of a mark. In day-to-day development afterwards I still had to turn to Google and Stack Overflow constantly for basic Java 8 language features, which made me realize both how important the fundamentals are and how big the gap was. When I went book shopping I found there was a new edition, and that it was no longer limited to Java 8.
This time I read quite carefully, underlining and annotating as I went. Still, one problem has always bothered me: no matter how strongly something resonates at the time, after a while I retain almost nothing ("read fast, forget fast"). So I decided to turn my notes into an electronic version, both to review the book once more and for quick reference later.
Structure of the book
The book is divided into six parts. Part 1 (Chapters 1-3) covers the fundamentals: Lambda expressions, method references, and behavior parameterization. Part 2 (Chapters 4-7), “Functional data processing with streams,” discusses the newly introduced Stream API in detail and helps readers process data declaratively (in contrast to the old imperative style built on the Collection API). Part 3 (Chapters 8-10), “Efficient programming with streams and Lambda,” introduces higher-level programming ideas for writing more efficient Java code. Part 4 (Chapters 11-14), “Everyday Java,” covers a number of new features in Java 8 and Java 9, including the java.util.Optional class, the new date-and-time API, default methods on interfaces, and the module system introduced in Java 9. Part 5 (Chapters 15-17), “Improving Java concurrency,” discusses how to build concurrent programs with Java's advanced features: the thinking behind asynchronous APIs, CompletableFuture, the Flow API introduced in Java 9, and reactive programming. Part 6 (Chapters 18-21), “Functional programming and the future evolution of Java,” discusses how to write efficient functional programs in Java, introduces functional-programming terminology and advanced techniques (higher-order functions, currying, lazy lists, pattern matching, and so on), reviews Java's gradual move toward functional programming since Java 8, and looks ahead to possible future enhancements.
In addition, it is worth mentioning that the book’s four appendices are also worth reading. Appendix A summarizes some minor Java 8 features not discussed in the main text. Appendix B outlines other updates to the Java class library (new methods on the collection APIs, updates to the java.util.concurrent package, new methods on the Number, Math, and Files classes, and so on). Appendix C is a continuation of Part 2 and covers advanced uses of streams. Appendix D briefly explores how the Java compiler implements Lambda expressions behind the scenes (they are not simply syntactic sugar for anonymous inner classes).
Part 1 Basics
Chapter 1 Changes in Java 8, 9, 10, and 11
This chapter summarizes the major changes in Java (Lambda expressions, method references, streams, and default methods) in preparation for what follows.
Reasons for introducing new features
- Two pressing needs: cleaner code and easier access to multi-core processors
- The Stream API supports parallel operations of multiple data processes in a similar way to SQL — the requirements are described from a high-level perspective, and the “implementation” (in this case the Stream library) chooses the underlying best execution strategy. This avoids writing explicit concurrent code
- Locking with `synchronized` is not only error-prone; it is often also more costly than expected on multi-core CPUs
  - Synchronization forces code to execute sequentially, which defeats the purpose of parallel processing
  - Multi-core CPUs have a separate cache per core; locking forces these caches to be kept in sync, which requires slow cache-coherence protocol traffic between cores
Stream processing
- Background: Unix pipes. Note that commands connected by pipes run concurrently, like the stations of a car assembly line: even though the items pass through in sequence, the different stations generally work in parallel
- The Stream API’s many methods can be linked together to form a complex pipeline, just as Unix commands are linked together
- The keys that make this approach work
- Java programs can be written at a higher level of abstraction, and the code is more readable
- Almost free parallelism, because Java 8 transparently takes unrelated parts of the input to several CPU cores to execute the Stream pipeline you gave them
- I say “almost” free because the behavior provided to the stream must be able to execute safely on different inputs at the same time, which may require us to get a little used to writing “pure functions” (or “side-effect free functions” or “stateless functions”)
From object-oriented programming to functional programming
- The major changes in Java 8 reflect a shift away from classical object-oriented thinking, which often focuses on changing existing values, and toward the realm of functional programming
- In functional programming, the most important thing is thinking about what you want done, kept separate from how it gets done
- Taken to the extreme, traditional object-oriented programming and functional programming may seem contradictory. But the idea is to take the best of both programming paradigms in order to find the ideal tool for the task
- Method and Lambda as first-class citizens
- The whole purpose of a programming language is to manipulate values, which are traditionally called first-class values (or first-class citizens). Other constructs in the language may help express the structure of values but cannot be passed around during program execution, and are therefore second-class citizens (methods and classes in Java, for example: you can define classes, and classes can be instantiated to produce values, but neither is itself a value)
- Before Java 8, the usual way to pass code was to pass an instance of an anonymous class to a method, which is cumbersome and hard to read, simply because methods are second-class citizens in Java; now all you need is a method reference, written with the `::` syntax ("use this method as a value")
- Method references can be passed around just as object references are, because methods are now first-class citizens! The point is that as long as the code lives in a method, it can be passed via a method reference
- In addition to allowing (named) functions to become first-class citizens, Java 8 also embodies the idea of functions as values in a broader sense, including lambdas (or anonymous functions)
- With the introduction of Lambda, you don’t even need to write a definition for a method that is used only once, and the code is cleaner. But if the Lambda is more than a few lines long or its behavior isn’t obvious, it’s better to use a method reference that points to a method with a descriptive name (after all, the method name documents the code and can be reused); a small sketch follows below
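The following is my own minimal sketch of that shift, using the File::isHidden example the book opens with; the variable names are mine, not the book's.

```java
// Before Java 8: wrap the behavior in an anonymous class
FileFilter hiddenFilter = new FileFilter() {
    public boolean accept(File file) {
        return file.isHidden();
    }
};

// Java 8: pass the method itself as a value with a method reference
File[] hiddenFiles = new File(".").listFiles(File::isHidden);

// ...or pass an anonymous function (Lambda) when no suitable named method exists yet
FileFilter smallFiles = file -> file.length() < 1024;
```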
Streams
- The parallelism Java 8 offers for streams rarely needs `synchronized`; the functional style focuses on partitioning the data rather than coordinating access to it
- The Stream API processes data differently from the Collection API: collections rely on external iteration (the programmer drives the iteration), while streams use internal iteration (the programmer doesn’t write loops at all; the data processing happens entirely inside the library)
- Another headache with collections is that it requires programmers to handle multithreading themselves, which is no easy task
- Access and updates to shared variables need to be carefully coordinated
- This model is less understandable than the sequential model of step by step execution
- The Stream API solves two problems
- Boilerplate and obscurity when processing data with collections (unreadable?)
- Difficult to utilize multiple cores
- Why Java 8 designed the Stream API this way
- There are many recurring data processing patterns (such as filter, map, grouping, etc.)
- Such operations can often be done in parallel
- A Collection is used to store and access data, while a Stream is used to evaluate data.
Default methods
- One of the real problems Java designers face when introducing many of these new features is that existing interfaces also need to be improved, but adding a new method to an interface is a disaster for thousands of users of the interface, because all users of the interface need to implement the new method. The default approach introduced in Java 8 resolves this problem (supports interface evolution)
- The `default` keyword marks a default method in an interface
- Some restrictions (three rules, given later in the book) avoid problems like the infamous diamond-inheritance problem in C++
Other features
- Java 8 provides the `Optional<T>` class to help avoid `NullPointerException`; Chapter 11 discusses this topic in detail
  - It is a container object that may or may not contain a value
- It allows programmers to explicitly indicate that a variable may be missing values through a type system
- Java 9 provides a module system that allows you to define a module consisting of a series of packages syntactically, a topic discussed in Chapter 14
- Better control over namespace and package visibility
- Simple JAR-like components have been enhanced to have structure
Chapter 1 Summary
Two core ideas that Java introduced from functional programming, both of which are used in the new Stream API
- Treating methods and lambdas as first-class values
- In the absence of mutable shared state, functions or methods can be efficiently and safely executed in parallel
Chapter 2 Passing code with behavior parameterization
Responding to changing requirements: Ideally, the effort needed to respond to changing requirements should be minimal, and new features like these should be simple to implement and easy to maintain over time. A good rule of thumb is to abstract similar code as much as possible
Behavior parameterization
- Behavior parameterization corresponds to value parameterization
- The apple-filtering example for the farmer shows, step by step, that adding more parameters to a method is not a good way to cope with change; it is better to step back and look for a higher level of abstraction, which leads to passing behavior rather than plain parameter values, and from there to the strategy pattern: define a family of algorithms and encapsulate each one (each called a "strategy")
Dealing with verbosity
- The policy pattern is appropriate in this example to address the different requirements of filtering Apple, but the code is too verbose. You need to define an interface and then multiple implementation classes to parameterize behavior by passing instances of different implementation classes to existing methods
- Continue to improve
- Named classes -> anonymous classes, which allow simultaneous declaration and instantiation of a class, in other words, allow build-as-you-go
- The code looks clunky, with extra wrapping when all you want is code in a method
- It can be confusing at times; a classic Java puzzler illustrates this: inside an anonymous class, `this` refers to the anonymous class itself, not the enclosing class
  - In general, verbosity is bad; good code should be self-explanatory
- Anonymous classes -> Lambda expressions
- Introduce the generic parameter T (you can go further on the road to abstraction than the problem at hand)
More examples
- There are many methods in the Java API that can be parameterized with different behaviors, and the book shows four typical examples
- Sorting with a `Comparator`: `sort(Comparator comparator)`
- Executing a block of code with `Runnable`: `Thread(Runnable runnable)`
- Returning a result from a task with `Callable`: `executorService.submit(Callable callable)`
- GUI event handling with `EventHandler` (similar to `onClick = () => {}` in JavaScript)
Chapter 2 Summary
- Behavior parameterization allows code to better adapt to changing requirements, reducing future workload
- Passing code means passing new behavior as an argument to a method (like function pointers in C++, or callbacks in C#?). Before Java 8, though, passing code was very verbose; anonymous classes removed the need to declare lots of single-use concrete classes, but the code still wasn’t very readable. With Lambda this can now be done in a single line
Chapter 3 Lambda expressions
A first look at Lambda
- A Lambda expression can be thought of as a concise anonymous function that can be passed around
  - Anonymous: unlike an ordinary method it has no explicit name, so there is less to write
- It is a function because, unlike methods, it does not belong to a particular class
- Passable, which can be passed to a method as a parameter or stored in a variable
- Concise, you don’t need to write as much template code as anonymous classes
- Lambda expressions primarily provide a concise way to pass code (compared to anonymous classes). In theory, Lambda can’t do anything that Java 8 couldn’t do before
- The basic syntax of a Lambda is `(parameters) -> expression`, called an expression-style Lambda. Note that this style has no `return` statement, because the return is implicit. The other form is `(parameters) -> { statements; }`, called a block-style Lambda (a small sketch contrasting the two follows below)
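A quick sketch of the two styles, assuming the book's Apple class from the running example:

```java
// Expression-style: the value of the expression is returned implicitly
Comparator<Apple> byWeight = (a1, a2) -> a1.getWeight().compareTo(a2.getWeight());

// Block-style: statements inside braces; a value-returning Lambda needs an explicit return here
Comparator<Apple> byWeightBlock = (a1, a2) -> {
    int result = a1.getWeight().compareTo(a2.getWeight());
    return result;
};
```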
Use of Lambda expressions
- Where to use:Lambda expressions can only be used where functional interfaces are accepted
- Can be assigned to a variable
- Or to a method that takes a functional interface as an argument
- A functional interface is an interface that defines exactly one abstract method (e.g. `Comparator`, `Runnable`, `Callable`). Note that an interface with many default methods is still a functional interface as long as it declares only one abstract method
- Lambda expressions let you provide the implementation of the abstract method of a functional interface directly inline, and the whole expression counts as an instance of the functional interface (strictly speaking, an instance of a concrete implementation of the functional interface)
- `@FunctionalInterface` is a marker annotation indicating that the interface is intended to be a functional interface, similar in spirit to `@Override`
- Example: the execute-around (surround execution) pattern, such as
```java
public void doSomething(Function<T, R> processor) {
    // Open the resource (boilerplate)
    // The concrete business logic (the part that changes) => passed in as the method's argument
    // Clean up (boilerplate)
}
```
Use of functional interfaces
- The signature of an abstract method of a functional interface is called a function descriptor. As mentioned above, Lambda expressions can be applied to methods that take functional interfaces as arguments, so to apply different Lambda expressions, we need a set of functional interfaces that can describe common function descriptors. Java 8 comes with some commonly used functional interfaces in the
`java.util.function` package, such as

```java
@FunctionalInterface
public interface Predicate<T> {     // T -> boolean
    boolean test(T t);
}

@FunctionalInterface
public interface Consumer<T> {      // T -> void
    void accept(T t);
}

@FunctionalInterface
public interface Supplier<T> {      // () -> T
    T get();
}

@FunctionalInterface
public interface Function<T, R> {   // T -> R
    R apply(T t);
}
```
- Base type specialization
- Generics can only be bound to reference types, not primitive types, because of the way Java generics are implemented internally (again, a little bit on the topic of generics in Chapter 20).
- So in Java, there are boxing and unboxing mechanisms, as well as automatic boxing mechanisms to help programmers perform conversions between primitive types and corresponding reference types
- There is a cost to packing and unpacking in terms of performance. The boxed value is essentially wrapped around the base type and stored in the heap, so the boxed value requires more memory and additional memory searches to retrieve the wrapped base value
- The Java 8 functional interfaces come with primitive-specialized versions to avoid auto-boxing when inputs or outputs are primitives, such as `DoublePredicate`, `IntConsumer`, `LongBinaryOperator`, `IntFunction`, `ToIntFunction<T>`, etc. (a small boxing-vs-specialization sketch follows below)
- If necessary, you can also design your own generic functional interfaces or the primitive specializations you need
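A minimal sketch of the difference (my own example, using java.util.function types):

```java
// IntPredicate works directly on int, so no boxing happens
IntPredicate evenNumbers = i -> i % 2 == 0;
evenNumbers.test(1000);

// Predicate<Integer> boxes every int into an Integer before calling test
Predicate<Integer> oddNumbers = i -> i % 2 != 0;
oddNumbers.test(1000);
```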
Type checking and type inference
- The type of Lambda is inferred from the context in which Lambda is used, and type checking is done by checking whether the Lambda expression conforms to the function descriptor
- The same Lambda expression can be associated with different functional interfaces, as long as their abstract method signatures are compatible
- Type inference lets programmers dispense with parameter-type annotations in the Lambda syntax. For example,
```java
// Without type inference
Comparator<Apple> c1 = (Apple a1, Apple a2) -> a1.getWeight().compareTo(a2.getWeight());
// With type inference
Comparator<Apple> c2 = (a1, a2) -> a1.getWeight().compareTo(a2.getWeight());
```
But sometimes it’s easier to show the type and sometimes it’s easier to remove it. There’s no rule that says which is better. It’s up to the programmer to decide.
- Local variables (or free variables, defined in outer scopes) can also be used in Lambda expressions, saying Lambda captures XX variables
- The Java compiler places the following restrictions on capturing local variables
- The local variable must be `final` or effectively `final` (i.e., never modified after initialization); a small sketch follows below
- The reason for this restriction is that local variables live on the stack, which implicitly ties them to a single thread. Allowing capture of mutable local variables would open up thread-unsafe possibilities (there is no such restriction on instance variables, because they live on the heap, which is shared between threads)
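A tiny sketch of the restriction (my own example, mirroring the book's portNumber illustration):

```java
int portNumber = 1337;
Runnable r = () -> System.out.println(portNumber);   // OK: portNumber is effectively final

int hits = 0;
// Runnable r2 = () -> System.out.println(hits);     // would not compile together with...
hits++;                                               // ...this later reassignment
```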
Method references
- Method references let you reuse existing methods and pass them directly
- How do I build method references
There are three main types of method references.
- Method references to static methods (e.g. `Integer::parseInt`)
- Method references to instance methods of an arbitrary type (e.g. `String::length`)
- Method references to an instance method of an existing object or expression (e.g. `someInstance::doSomething` or `this::getValue`)
- The second and third kinds are easy to confuse at first, but they are not hard to tell apart; the following deduction helps
```java
// String::length is equivalent to the following Lambda. Note that s is only a parameter,
// not an object that already exists, so this cannot be written as s::length
(String s) -> s.length();

// someInstance::doSomething is equivalent to the following Lambda. Note that someInstance
// is not a parameter but an object that already exists
SomeClass someInstance;
Consumer<T> consumer = (args) -> someInstance.doSomething(args);
```
Compound Lambda expression
Multiple simple lambdas can be compounded into complex expressions
- Comparator composition, e.g
```java
// Reversed order
inventory.sort(comparing(Apple::getWeight).reversed());
// Comparator chaining
inventory.sort(comparing(Apple::getWeight).reversed()
        .thenComparing(Apple::getCountry));
```
- Predicate composition, as shown in the following code
```java
// negate
Predicate<Apple> notRedApple = redApple.negate();
// and, or
Predicate<Apple> redAndHeavyAppleOrGreen =
        redApple.and(apple -> apple.getWeight() > 150)
                .or(apple -> GREEN.equals(apple.getColor()));
```
Note that `and` and `or` are applied left to right according to their position in the chain, i.e. `a.or(b).and(c)` can be read as `(a || b) && c`
- Function composition. Lambdas represented by the `Function` interface can also be composed, much like composing functions in mathematics
```java
// h(x) = g(f(x))
Function<Integer, Integer> h1 = f.andThen(g);
// h(x) = f(g(x))
Function<Integer, Integer> h2 = f.compose(g);
```
- What is the application of function composition in practice? You can build a variety of transformation pipelines by combining various tool methods, as shown below
```java
Function<String, String> addHeader = Letter::addHeader;
Function<String, String> transformationPipeline =
        addHeader.andThen(Letter::checkSpelling).andThen(Letter::addFooter);
```
Part 2 uses streams for functional data processing
Chapter 4 Introducing streams
The reasons for introducing streams: while collections are indispensable to almost any Java application, collection operations are far from perfect
- A lot of business logic involves database-like operations (find, group, filter, etc.), and you can express what you want to do in SQL without worrying about explicitly implementing those queries. Why can’t that be the case in collections?
- When dealing with large numbers of elements, you need parallel processing and a multi-core architecture to improve performance. But writing parallel code is more complex than using iterators, and debugging is boring
Benefits of the Stream API
- Declarative, code more concise and easy to read. This approach, coupled with behavioral parameterization, allows you to easily respond to changing demands
- Composite, more flexible. You can link several basic operations together to express a complex data processing pipeline while keeping your code clear and readable
- Parallelizable, with better performance. The operations are decoupled from the threading model: `filter`, `sorted`, `map`, `collect` and the like are high-level building blocks that do not depend on a specific threading model, so their internal implementation can be single-threaded or can transparently exploit a multi-core architecture. In practice, this means you don’t have to worry about threads and locks to parallelize certain data-processing tasks; the Stream API does it for you!
What is a stream
- What exactly is a stream? Answer: "a sequence of elements from a source that supports data-processing operations"
- Collections are about data, streams are about computation: collections are data structures whose primary purpose is to store and access elements with a specific time/space complexity, whereas the purpose of a stream is to express a computation
- Data processing operations for streams can be performed sequentially or in parallel; Streams generated from ordered collections retain the original order
- Streams have two important characteristics
  - Pipelining
Many stream operations themselves return a stream so that multiple operations can be linked together to form a larger pipeline
- Internal iteration
Streams vs. collections
- Computing time
- Collections are eagerly constructed (producer-driven): a collection is an in-memory data structure holding all the values it currently has, and every element has to be computed before it can become part of the collection
- Streams are lazy creation/requirements driven. Streams are conceptually fixed data structures (no elements can be added or removed), and the idea is that the user merely extracts the required values from the stream, and these values are calculated or generated on demand
- Iterative way
- Collections use external iteration: a loop explicitly pulls out each element and processes it (a small comparison sketch follows after this block)
  - The for-each construct is syntactic sugar that hides some of the complexity of iteration; the `Iterator` object behind it is uglier to write out
- Streams use internal iteration, which can transparently be processed in parallel or in a more optimized order
- A stream, like an iterator, can only be traversed once. After traversing, we say that the stream has been consumed. For example, the following code throws an exception for trying to repeat the consumption stream
```java
List<String> title = Arrays.asList("Modern", "Java", "in", "Action");
Stream<String> s = title.stream();
s.forEach(System.out::println);
s.forEach(System.out::println);   // throws IllegalStateException: the stream has already been consumed
```
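The comparison mentioned above, as my own sketch assuming the book's Dish class and menu list:

```java
// External iteration: the loop pulls each element out explicitly
List<String> names = new ArrayList<>();
for (Dish dish : menu) {
    names.add(dish.getName());
}

// Internal iteration: the library drives the iteration
List<String> names2 = menu.stream()
                          .map(Dish::getName)
                          .collect(Collectors.toList());
```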
Stream operations
- The use of streams generally involves three things
- The data source
- Intermediate chain of operation
- Terminal operation
- The `java.util.stream.Stream` interface defines many operations, which fall into two broad categories: operations that can be chained together are called intermediate operations, and operations that close a stream are called terminal operations
  - Intermediate operations are lazy and do no processing until a terminal operation is invoked on the pipeline
    - This is because intermediate operations can usually be merged and processed in one pass by the terminal operation
  - A terminal operation produces a result from the stream pipeline; the result is any non-stream value (e.g. a `List`, an `Integer`, or even `void`)
- The idea behind chaining stream operations is similar to the builder pattern: `xxxBuilder.setX().setY().setZ().build();` (a small pipeline sketch follows below)
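A small pipeline sketch (my own, assuming the book's Dish/menu running example) showing where the laziness ends:

```java
List<String> threeHighCalorieDishNames =
        menu.stream()                                  // data source
            .filter(dish -> dish.getCalories() > 300)  // intermediate operation (lazy)
            .map(Dish::getName)                        // intermediate operation (lazy)
            .limit(3)                                  // intermediate operation (short-circuits)
            .collect(Collectors.toList());             // terminal operation: triggers the whole pipeline
```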
Chapter 5 Working with streams
The various stream operations
-
Filtering: filter and distinct
```java
List<Integer> numbers = Arrays.asList(1, 2, 1, 3, 3, 2, 4);
numbers.stream()
       .filter(i -> i % 2 == 0)
       .distinct()
       .forEach(System.out::println);
```
-
Slicing a stream: selecting with takeWhile/dropWhile, truncating with limit, skipping with skip
```java
// The drawback of filter is that it traverses the whole stream, applying the predicate to every element
// It cannot exploit the fact that the elements happen to be sorted and stop early
List<Integer> numbers = Arrays.asList(1, 2, 3, 4, 5, 8, 15);
List<Integer> smallNumbers1 = numbers.stream()
        .filter(n -> n <= 5)
        .collect(toList());

// takeWhile (introduced in Java 9) stops processing at the first element that fails the predicate
List<Integer> smallNumbers2 = numbers.stream()
        .takeWhile(n -> n <= 5)
        .collect(toList());

// dropWhile (also Java 9) is the complement of takeWhile: it discards elements until the predicate
// fails and returns the rest; no code pasted here
// Truncating: limit returns another stream that is no longer than a given length
// Skipping: skip is complementary to limit and returns a stream with the first n elements discarded
```
-
Mapping: map and flatMap
- Translation: Mapping and transformation have similar meanings, with the slight difference that mapping is “create a new version” rather than “modify in place”
- The `flatMap` method flattens each generated stream into a single stream. In short, it lets you replace every value in a stream with another stream and then joins all those streams into one. In plain terms, it turns a stream of streams into a stream: with `map` you would get something like `Stream<Stream<T>>` or `Stream<List<T>>`, whereas `flatMap` flattens it into the `Stream<T>` you actually want (see the sketch below)
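The classic unique-characters sketch (my reconstruction of the book's example):

```java
List<String> words = Arrays.asList("Hello", "World");

// map alone would give a Stream<String[]>; flatMap flattens it into a Stream<String>
List<String> uniqueCharacters =
        words.stream()
             .map(word -> word.split(""))   // Stream<String[]>
             .flatMap(Arrays::stream)       // Stream<String>
             .distinct()
             .collect(Collectors.toList()); // [H, e, l, o, W, r, d]
```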
-
Match: The allMatch, anyMatch, and noneMatch match operations take a predicate argument, return a Boolean value, and are therefore terminal operations
-
Find: findAny and findFirst
- `findAny`/`findFirst` return an arbitrary/the first element of the current stream and are therefore terminal operations. Neither method takes any arguments (the names make it easy to wrongly assume they accept a predicate)
- Why have both `findAny` and `findFirst`? The answer is parallelism: finding the first element is more constraining in parallel, so if you don’t care which element is returned, `findAny` is the better choice because it is less restrictive on parallel streams
- A find operation may not find any element at all, which is why the return type is `Optional<T>` (I suspect avoiding `null` in stream operations was a big reason and motivation for introducing `Optional<T>` in the first place); a small sketch follows below
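A tiny sketch (my own, assuming the book's Dish/menu example):

```java
// Return any vegetarian dish if there is one; findAny itself takes no arguments,
// the selection is done by filter beforehand
Optional<Dish> dish = menu.stream()
                          .filter(Dish::isVegetarian)
                          .findAny();
dish.ifPresent(d -> System.out.println(d.getName()));
```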
-
Reducing
- The reduction operation combines all the elements in the stream repeatedly to get a value (reduce the stream to a value). In functional programming language terminology, this is called “folding,” because you can think of this operation as folding a long piece of paper (stream) repeatedly into a small square (result of folding operation)
- Element sum
```java
// Summing the elements of a list of numbers with a for-each loop
int sum = 0;
for (int x : numbers) {
    sum += x;
}

// The reduce operation takes two arguments: an initial value, and a BinaryOperator<T>
// that combines two elements to produce a new value
int sum2 = numbers.stream().reduce(0, (a, b) -> a + b);

// In Java 8 the Integer class has a static sum method, so this can be written more simply
int sum3 = numbers.stream().reduce(0, Integer::sum);

// There is also an overloaded reduce without an initial value; it returns an Optional
// (the stream might contain no elements at all)
Optional<Integer> sum4 = numbers.stream().reduce((a, b) -> a + b);
```
- Maximum/minimum value
```java
Optional<Integer> max = numbers.stream().reduce(Integer::max);
Optional<Integer> min = numbers.stream().reduce(Integer::min);
```
- count
```java
// The map-reduce pattern
int count = numbers.stream().map(x -> 1).reduce(0, Integer::sum);
// The built-in count method counts elements and returns a long
long count2 = numbers.stream().count();
```
- Advantages of the reduction method
- In the iterative-sum example, the shared variable `sum` is updated in place, which is hard to parallelize. If you add synchronization, thread contention is likely to cancel out the performance that parallelism was supposed to bring. Parallelizing this computation needs a different approach (sum the chunks and merge the partial results at the end), but then the code is no longer very readable
- Using `reduce` abstracts the iteration away into internal iteration, which lets the internal implementation choose to execute the `reduce` operation in parallel
- In short, the mutable-accumulator pattern is a dead end for parallelization, and `reduce` provides a new model
Numeric streams
- Primitive type flow specialization
- Java 8 introduces three primitive-specialized stream interfaces to avoid implicit boxing costs: `IntStream`, `DoubleStream`, and `LongStream`. Each adds methods to reduce over common numeric values (e.g. `sum`, `max`, `average`) as well as methods to convert back to a stream of objects when necessary
  - Remember, the reason for introducing specialized streams is not the complexity of streams but the efficiency difference caused by boxing
- Conversion of a numeric stream to an object stream
```java
// Map to a numeric stream for the sum, avoiding the boxing overhead a reduce-based sum would have
IntStream intStream = menu.stream().mapToInt(Dish::getCalories);
int calories = intStream.sum();

// Convert back to an object Stream to use the more general operations defined on the Stream interface
Stream<Integer> stream = intStream.boxed();
Stream<int[]> objStream = intStream.mapToObj(i -> new int[]{i, i});
```
- The default value
```java
// Find the largest element of the numeric stream; if there is none, supply an explicit default
// OptionalDouble and OptionalLong work the same way
OptionalInt maxValue = intStream.max();
int max = maxValue.orElse(0);
```
- Numeric range streams
```java
// [1, 100]
IntStream numberStream = IntStream.rangeClosed(1, 100);
// [1, 100)
IntStream numberStream2 = IntStream.range(1, 100);
```
Building streams
- Create a flow from a value
```java
Stream<String> stream = Stream.of("Modern", "Java", "in", "Action");
Stream<String> emptyStream = Stream.empty();
```
- Create a flow from nullable objects
```java
// Java 8
String s = getString();
Stream<String> stringStream8 = s == null ? Stream.empty() : Stream.of(s);
// Java 9
Stream<String> stringStream9 = Stream.ofNullable(getString());
```
- Create streams from arrays
```java
int[] numbers = {1, 2, 3, 4, 5};
// Sum the array elements in one line
int sum = Arrays.stream(numbers).sum();
```
- Creating streams from files: the NIO API has been updated to take advantage of the Stream API, for example the many static methods in `java.nio.file.Files`. Note the small detail in the comment in the code below
```java
long uniqueWords = 0;
// Stream implements AutoCloseable, so there is no need to close it explicitly in a finally block
try (Stream<String> lines = Files.lines(Paths.get("data.txt"), Charset.defaultCharset())) {
    uniqueWords = lines.flatMap(line -> Arrays.stream(line.split(" ")))
                       .distinct()
                       .count();
} catch (IOException e) {
}
```
- Generate streams from functions: Create infinite streams
- The Stream API provides two static methods, `Stream.iterate` and `Stream.generate`, to generate streams from functions. Both can create so-called infinite streams: the resulting stream creates values on demand with the given function, so it can go on forever
  - The Fibonacci sequence is generated in two ways in this section; see P113/P116 for details, and the sketch after this list for the iterate-based version
  - In general, `iterate` should be used when you need to produce a sequence of successive values
- iterate
  - `iterate` takes an initial value and a Lambda (of type `UnaryOperator<T>`) that is applied in turn to each newly generated value. This iteration is essentially sequential, because each result depends on the previous application
```java
// Generate a stream of even numbers
Stream.iterate(0, n -> n + 2).limit(10).forEach(System.out::println);
```
  - In Java 9, `iterate` was enhanced so that the iteration can be stopped. An example with `IntStream.iterate`:
```java
// The second argument is a predicate that determines when the iteration terminates
IntStream.iterate(0, n -> n < 100, n -> n + 2);
```
- generate
  - Instead of applying a function successively to each newly generated value, `generate` takes a `Supplier<T>` Lambda that provides new values. For example,
```java
Stream.generate(Math::random).limit(5).forEach(System.out::println);
```
- Because these are infinite streams, you must use a `limit` operation to explicitly bound their size, otherwise the terminal operation would run forever! Similarly, an infinite stream cannot be sorted or reduced, because that would require processing all of the elements, which would never finish
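My sketch of the iterate-based Fibonacci version referred to above (the int[] pair trick follows the book's approach; the exact listing is on the cited pages):

```java
// Each element of the stream is a pair (a, b); the next pair is (b, a + b)
Stream.iterate(new int[]{0, 1}, t -> new int[]{t[1], t[0] + t[1]})
      .limit(10)
      .map(t -> t[0])                 // keep only the first number of each pair
      .forEach(System.out::println);  // 0 1 1 2 3 5 8 13 21 34
```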
Chapter 5 summary
- If the data source is known to be sorted, `takeWhile`/`dropWhile` are usually much more efficient than `filter`
- The find and match operations all use short-circuiting: they stop as soon as a result is found, with no need to process the whole stream. The `takeWhile`/`dropWhile` mentioned above are short-circuiting operations as well
Chapter 6 Collecting data with streams
Introduction to collectors
- A collector (`Collector`) lets you concisely and flexibly define the criterion the `collect` operation uses to produce its result, just as a comparator (`Comparator`) defines the criterion used by sorting methods
- The `collect` operation is essentially a reduction operation
- The `Collectors` utility class offers many static factory methods that make it easy to create instances of common collectors. They provide three main capabilities
  - Reducing and summarizing the stream elements into a value
- Grouping elements
- Element partition
Reduction and summary
- Counting / max and min / summarizing / joining strings
```java
import static java.util.stream.Collectors.*;

// Counting
long howManyDishes = menu.stream().collect(counting());

// Maximum/minimum
Optional<Dish> mostCalorieDish =
        menu.stream().collect(maxBy(Comparator.comparingInt(Dish::getCalories)));

// Summarization
int totalCalories = menu.stream().collect(summingInt(Dish::getCalories));
double avgCalories = menu.stream().collect(averagingInt(Dish::getCalories));

// All of the summary statistics in one operation
// IntSummaryStatistics{count=9, sum=4300, min=120, average=477.777778, max=800}
IntSummaryStatistics menuStat = menu.stream().collect(summarizingInt(Dish::getCalories));
System.out.print(menuStat);

// Joining strings: concatenates the result of calling toString on each object in the stream
// Note that joining uses a StringBuilder internally, not += or concat
String shortMenu = menu.stream().collect(joining());
```
- Collectors are typically used when you need to regroup the items of a stream into a collection, as in the common `collect(toList())` call. More broadly, a collector can be used whenever you want to combine all the items in a stream into a single result (of any type)
- In fact, the `Collectors.reducing` factory method is a generalization of all of these special cases; arguably, the special cases exist only for programmer convenience (but remember, programmer convenience and readability come first). `reducing` takes three arguments: an initial value, a transformation function, and an accumulation function
- Since `collect` is itself a reduction operation, how do `collect` and `reduce` differ?
  - Semantically, `reduce` is meant to combine two values into a new one; it is an immutable reduction. `collect`, on the other hand, is designed to mutate a container in order to accumulate the result to be output (e.g. `collect(toList())` keeps adding stream elements to the list container)
  - In practical terms, using `reduce` with mutating semantics prevents the reduction from running in parallel, because multiple threads concurrently modifying the same data structure can corrupt it, and allocating a new container on every step to stay thread-safe would hurt performance; `collect` is designed for exactly this kind of mutable-container reduction and works well in parallel
- Semantically,
grouping
- `groupingBy` comes in one-argument and two-argument versions. Seen this way, the plain one-argument `groupingBy(f)`, where f is the classification function, is just shorthand for `groupingBy(f, toList())`; when the second argument is itself a `groupingBy`, you get multi-level grouping, and this can be nested to any number of levels
  - To understand multi-level grouping, think of `groupingBy` as creating buckets: the first `groupingBy` creates a bucket for each key, and a downstream collector then collects the elements inside each bucket, giving n-level grouping
- Nesting several collectors like this is very common. Going further, the second collector passed to `groupingBy` can be of any type; it does not have to be another `groupingBy`. For example, to count the number of dishes of each category on the menu, pass the `counting` collector as the second argument; to find the highest-calorie dish in each category, pass `maxBy`. This is collecting data in subgroups (a sketch follows after the partitioning section below)
- With the Collection API, a single-criterion grouping can still be written readably, but scaling to multi-level grouping is hard; the Stream API achieves it through efficient composition
- A quick note: `collect(toList())` makes no guarantee about the type of the returned `List`. If you want more control, you can use something like `collect(toCollection(ArrayList::new))`
partition
- Partitioning (`partitioningBy`) is a special case of grouping (`groupingBy`) in which a predicate, called the partitioning function, is used as the classification function; the resulting `Map` therefore has `Boolean` keys, so the elements are split into a `true` group and a `false` group. For example, the following code partitions numbers into primes and non-primes
```java
public boolean isPrime(int candidate) {
    int candidateRoot = (int) Math.sqrt((double) candidate);
    return IntStream.rangeClosed(2, candidateRoot).noneMatch(i -> candidate % i == 0);
}

public Map<Boolean, List<Integer>> partitionPrimes(int n) {
    return IntStream.rangeClosed(2, n).boxed()
                    .collect(partitioningBy(candidate -> isPrime(candidate)));
}
```
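The downstream-collector composition mentioned in the grouping section, as my own sketch (assuming the book's Dish class, its Dish.Type enum, the menu list, and a static import of java.util.stream.Collectors.*):

```java
// Count the dishes in each category
Map<Dish.Type, Long> countByType =
        menu.stream().collect(groupingBy(Dish::getType, counting()));

// Highest-calorie dish in each category
Map<Dish.Type, Optional<Dish>> mostCaloricByType =
        menu.stream().collect(groupingBy(Dish::getType,
                maxBy(Comparator.comparingInt(Dish::getCalories))));

// partitioningBy accepts a downstream collector in exactly the same way
Map<Boolean, Long> countByVegetarian =
        menu.stream().collect(partitioningBy(Dish::isVegetarian, counting()));
```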
Collector interface
- All of the predefined collectors above are implementations of the `Collector` interface, which provides a template for implementing a specific reduction operation, i.e. a collector
```java
// T is the type of the items in the stream to be collected, A is the type of the accumulator,
// and R is the type of the object resulting from the collection (usually, but not always, a collection)
// The first three methods are enough to reduce a stream sequentially; the fourth (built on the
// fork/join framework introduced in Java 7) makes parallel reduction of the stream possible
public interface Collector<T, A, R> {
    // Creates a new mutable result container
    Supplier<A> supplier();
    // Accumulates an element into the result container
    BiConsumer<A, T> accumulator();
    // Applies the final transformation to the result container; if no transformation is needed,
    // return the identity function: Function.identity()
    Function<A, R> finisher();
    // Combines two result containers (used for parallel reduction)
    BinaryOperator<A> combiner();
    // Describes the collector's behavior, in particular hints about whether the reduction can be
    // done in parallel and which optimizations can be applied
    Set<Characteristics> characteristics();
}
```
- A small point from the sample code in this section (P143): when you need to return an empty list, the singleton `Collections.emptyList()` can replace the usual `new ArrayList<>()`
- P144-148 implements a custom collector that partitions numbers into primes and non-primes; P148-149 compares its performance against the collector created by the predefined `partitioningBy` factory method. The comparison partitions the first million natural numbers, runs it ten times, and compares the elapsed time (the result is roughly a 30% performance improvement). It is mentioned here that a more scientific test would use a framework such as JMH (the Java Microbenchmark Harness); P154 uses JMH to compare the performance of two versions of a method in order to choose between them
  - Here is an excerpt of that code (P146):
```java
public Map<K, V> xxxMethod() {
    // Here the Map is initialized while it is being created
    // I think the double braces are just normal braces with an instance-initializer (non-static) block inside
    return new HashMap<K, V>() {{
        put(XXX, xxx);
        put(YYY, yyy);
    }};  // don't forget the semicolon here
}
```
Chapter 7 Parallel data processing and performance
(This chapter focuses on the fork/join framework and spliterators, which are rarely used directly in day-to-day work, so these are only curated highlights; I will revisit them with other sources if needed.)
Overview
- Prior to Java 7, parallel processing of collection data was cumbersome
- The data structure that contains the data needs to be unambiguously broken down into subparts
- You need to assign a separate thread to each subpart
- You need to synchronize them at the right time to avoid race conditions and wait for all threads to complete, merging these partial results
- Java 7 introduced a framework called fork/join to make these operations more stable and less error-prone
Parallel streams
- A parallel stream is a stream that splits content into chunks of data, with different threads processing each chunk separately. This automatically distributes work to all the cores of the multicore processor, keeping them busy
- Calling the `parallel` method on a sequential stream converts it to a parallel stream. Note that this does not actually change the stream itself; it just sets an internal `boolean` flag indicating that you want all operations after the call to `parallel` to run in parallel. Similarly, calling `sequential` on a parallel stream turns it into a sequential stream
- The last call to `parallel` or `sequential` wins: whether the pipeline runs in parallel depends on which of the two was invoked last
- Parallel streams internally use the default `ForkJoinPool`, whose default number of threads equals the number of processors (as returned by `Runtime.getRuntime().availableProcessors()`)
- Measuring stream performance
- Always follow the golden rule when optimizing performance: Measure, measure, measure
  - In the book's measurement example, a large heap is configured and garbage collection is requested after each benchmark iteration (`System.gc()`) to minimize the impact of GC. Even so, many factors can affect execution time, so benchmark results should still be taken with a grain of salt
  - Marking a stream as parallel does not always improve performance. It must be recognized that some stream operations are easier to parallelize than others (counter-example: `iterate` is inherently sequential, since each application of the function depends on the previous result, so it is hard to split into independently executable chunks). In that case, marking the stream as parallel actually adds overhead on top of sequential processing, because each partial sum also has to run on a separate thread
```java
// Problem: sum the first N numbers
Stream.iterate(1L, i -> i + 1).limit(N).parallel().reduce(0L, Long::sum);

// This version performs much better, for two reasons: (1) it works on primitive longs, avoiding boxing;
// (2) the generated range is easy to split into independent chunks
LongStream.rangeClosed(1, N).parallel().reduce(0L, Long::sum);
```
- Parallelism is not free. The parallelization process itself requires recursive partitioning of the streams, assigning the reduction operations of each substream to different threads, and then combining the results of those operations into a single value. Also, moving data between cores can be more expensive than you think
- The number one cause of errors when using parallel streams is algorithms that mutate some shared state (i.e. functions with side effects)
- Suggestions for efficient use of streams
- It is easy to turn sequential flows into parallel flows, but it is not necessarily correct or efficient
  - Prefer primitive streams such as `IntStream` wherever possible to avoid auto-boxing and unboxing, which can significantly degrade performance
  - Some operations inherently perform worse on a parallel stream than on a sequential one (e.g. `limit` and `findFirst`, which depend on element order, are expensive on parallel streams)
  - For small amounts of data, choosing a parallel stream is almost never a good decision
  - Consider whether the data structure behind the stream decomposes well (e.g. an `ArrayList` splits much more efficiently than a `LinkedList`, because the former can be split evenly without traversal; `IntStream.range` decomposes much better than `Stream.iterate`)
  - Also consider the cost of the merge step in the terminal operation (the `Collector::combiner` method)
  - The infrastructure behind parallel streams is the fork/join framework introduced in Java 7; understanding how parallel streams work internally is critical to using them correctly
The fork/join framework
- The fork/join framework aims to recursively split a parallelizable task into smaller tasks and then combine the results of the subtasks into an overall result. It is an implementation of the `ExecutorService` interface that assigns subtasks to worker threads in a thread pool (a `ForkJoinPool`)
ForkJoinPool
It doesn’t make much sense, so it’s usually a singleton - Having a large number of small tasks rather than a few large ones helps balance the load better between worker threads. Branch/merge framework to solve practical each subtask time may vary considerably in the problem of stealing technology called work (work ‘dealing) : each thread is assigned to its mission to save a two-way linked list, when all of its own task, randomly selected from a different thread and “stolen” from the tail of the queue a task.
- The automatic stream splitting mechanism behind parallel streams is called
Spliterator
Is a new interface added to Java 8.Spliterator
The name stands for “separable iterator” (splitable iterator), which defines how a parallel stream splits the data it is iterating over. andIterator
The same,Spliterator
It is also used to iterate over elements in a data source, but is designed for parallel execution - In practice, you may not need to develop it yourself
Spliterator
, but knowing how it is implemented will enhance your understanding of how parallel flows work. P166-173 explains this and uses the word counting task as an example to demonstrate customizationSpliterator
The process of
Part 3 uses streams and Lambda for efficient programming
Chapter 8 enhancements to the Collection API
Create a collection
The factory method added to Java 9 simplifies the creation of small lists, sets, or maps
```java
// The list created this way is immutable (read-only): elements cannot be added, removed, or replaced
List<String> wordList = List.of("aaa", "bbb", "ccc");
// Both of these throw java.lang.UnsupportedOperationException
wordList.add("ddd");
wordList.set(0, "ddd");

// Create an immutable Set
Set<String> wordSet = Set.of("aaa", "bbb", "ccc");

// Create an immutable Map
// The of factory method is convenient for small maps
Map<String, Integer> ageOfFriends = Map.of("Tom", 10, "Jerry", 9, "Jack", 11);
// For larger maps, consider the Map.ofEntries factory method
Map<String, Integer> ageOfFriends2 =
        Map.ofEntries(Map.entry("Tom", 10), Map.entry("Jerry", 9), Map.entry("Jack", 11));
```
Processing collections (lists and sets)
-
Java 8 introduces the following new methods for List and Set: Collection::removeIf, List::replaceAll, List::sort. Note that these methods work on the calling object itself, that is, they change the collection itself (modify in place). This is very different from the operation of a stream (making a new copy)
-
Why introduce these new methods? Because prior to Java 8, collection modification was cumbersome and error-prone (a classic example would be deleting list elements during iteration)
```java
// Removing elements while iterating over the collection with for-each
for (Integer number : numbers) {
    if (number % 2 == 0) {
        // Throws ConcurrentModificationException
        numbers.remove(number);
    }
}

// The reason: the for-each loop is built on an Iterator, so the code above really looks like this
for (Iterator<Integer> iterator = numbers.iterator(); iterator.hasNext(); ) {
    Integer number = iterator.next();
    if (number % 2 == 0) {
        // The problem: two different objects are used to iterate over and to modify the collection.
        // The Iterator queries the source with next() and hasNext(); the collection object itself
        // is asked to remove the element with remove(). Mixing the two is error-prone, because the
        // state of the Iterator is no longer in sync with the state of the collection
        numbers.remove(number);
    }
}

// The usual pre-Java 8 fix is to use the Iterator object explicitly instead of for-each
for (Iterator<Integer> iterator = numbers.iterator(); iterator.hasNext(); ) {
    Integer number = iterator.next();
    if (number % 2 == 0) {
        // Call remove() on the Iterator itself
        iterator.remove();
    }
}
```
Obviously, this code becomes cumbersome with the above solution, so here’s the code using the removeIf method provided with Java 8
```java
numbers.removeIf(number -> number % 2 == 0);
```
-
Sometimes what we want to do is not remove elements from a list, but replace them. To that end, Java 8 added the replaceAll method
```java
// Before Java 8 you could use a ListIterator (which provides a set() method to replace elements)
for (ListIterator<String> iterator = words.listIterator(); iterator.hasNext(); ) {
    String word = iterator.next();
    iterator.set(word.toUpperCase());
}

// With Java 8...
words.replaceAll(word -> word.toUpperCase());
```
-
In addition, a list can now sort itself with List.sort, instead of being sorted via Collections.sort as before
Processing maps
Java 8 adds several new default methods to the Map interface. They aim to provide idiomatic patterns (by "patterns" I mean frequently recurring situations or needs), reduce the overhead of re-implementing them, and help us write cleaner code
- Iterating
  - Iterating over all the keys and values of a `Map` used to be a clumsy operation that required an iterator over `Map.Entry<K, V>` to visit every member of the entry set
  - The `forEach` method accepts a `BiConsumer` that receives each key-value pair of the `Map`
```java
persons.forEach((name, age) -> System.out.println(name + ":" + age));
```
- Sorting
```java
// Sort the entries by key (name); sorting by value works the same way with Entry.comparingByValue
persons.entrySet().stream()
       .sorted(Entry.comparingByKey())
       .forEachOrdered(System.out::println);
```
- Missing keys: note that if the key does exist in the `Map` but its value just happens to be `null`, `getOrDefault` still returns `null`
- Compute patterns: `computeIfAbsent` / `computeIfPresent` / `compute`
```java
// Before Java 8
List<String> movies = friendsToMovies.get("Tom");
if (movies == null) {
    movies = new ArrayList<String>();
    friendsToMovies.put("Tom", movies);
}
movies.add("Star Wars");

// In Java 8
friendsToMovies.computeIfAbsent("Tom", name -> new ArrayList<>()).add("Star Wars");
```
- Removing: the `remove(key, value)` method removes a mapping only when the key is currently mapped to that particular value
- Replacing: `replace` / `replaceAll`
```java
persons.replaceAll((name, age) -> age + 1);
```
- Merging
  - The `merge` method handles conflicts when combining maps more flexibly, whereas the older `putAll` simply overwrites; `merge` can also be used to perform an initialize-or-update check
```java
// Note the naming of the map object (moviesToCount)
Map<String, Long> moviesToCount = new HashMap<>();
String movieName = "Matrix";

// Before Java 8
Long count = moviesToCount.get(movieName);
if (count == null) {
    moviesToCount.put(movieName, 1L);
} else {
    moviesToCount.put(movieName, count + 1);
}

// Java 8 reads much better: if the key has no associated value (or it is null), associate it with 1L;
// otherwise apply the BiFunction to the current count (its arguments are the old value and the given value)
moviesToCount.merge(movieName, 1L, (prevCount, increment) -> prevCount + 1L);
```
ConcurrentHashMap
- Three new kinds of operations are supported: `forEach` / `reduce` / `search` (a small sketch follows below)
- It inherits the new default methods from `Map` and provides thread-safe implementations of them (worth reading the source some time to see how it does that efficiently)
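A minimal sketch of those three operation families (my own example; the parallelism-threshold semantics are as documented for ConcurrentHashMap):

```java
ConcurrentHashMap<String, Long> map = new ConcurrentHashMap<>();
map.put("a", 1L);
map.put("b", 2L);

// Below this estimated size the operation runs sequentially;
// 1 means "use maximum parallelism", Long.MAX_VALUE means "run single-threaded"
long parallelismThreshold = 1;

// Reduce over the values
Optional<Long> maxValue =
        Optional.ofNullable(map.reduceValues(parallelismThreshold, Long::max));

// search returns the first non-null result produced by the given function
String keyOfTwo = map.search(parallelismThreshold, (k, v) -> v == 2L ? k : null);

// mappingCount returns the number of mappings as a long
long mappings = map.mappingCount();
```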
Chapter 8 summary
- Java 9 adds collection factory methods: `List.of`, `Set.of`, `Map.of`, and `Map.ofEntries` create small immutable `List`, `Set`, and `Map` instances. The returned objects are immutable, i.e. their state cannot change after creation (in effect, the enhanced Collection API now supports constant collections)
- The `Map` interface gained several new default methods for common patterns (needs), which also reduces the chance of defects
Chapter 9 Refactoring, testing, and debugging
Improved readability
- Migrating anonymous classes to Lambda expressions is sometimes a little more complicated
  - `this` and `super` mean different things in anonymous classes and in Lambda expressions (in an anonymous class, `this` refers to the anonymous class itself; in a Lambda, it refers to the enclosing class)
  - An anonymous class can shadow variables of the enclosing context (i.e. hide local variables defined in the surrounding method), whereas a Lambda expression cannot (it is a compile error)
- In contexts where overloading is involved, converting anonymous classes to Lambda expressions can result in less readable code. Because the type of an anonymous class is determined at initialization, the type of a Lambda depends on its context
```java
interface Task {
    public void execute();
}

public static void doSomething(Runnable r) { r.run(); }
public static void doSomething(Task t) { t.execute(); }

// Anonymous class: OK
doSomething(new Task() {
    public void execute() {
        System.out.println("Hello");
    }
});

// Lambda: ambiguous between Runnable and Task
doSomething(() -> System.out.println("Lambda"));

// An explicit cast resolves the ambiguity
doSomething((Task) () -> System.out.println("Lambda"));
```
- Prefer method references over Lambdas where possible, since a method name usually conveys the intent of the code more clearly
- Switch from imperative data processing to Stream
- The Stream API expresses the intent of a data-processing pipeline more clearly, and streams can be optimized with short-circuiting, lazy evaluation, built-in parallelism and other techniques
- Converting imperative code into the Stream API form can be difficult, because you have to understand the control-flow statements (e.g. `break` / `continue` / `return`) and pick the corresponding stream operations. The good news is that there are tools that can help with this task, such as LambdaFicator
Increased flexibility
Lambda is good for parameterization of behavior, and you can introduce Lambda to increase your code’s flexibility in two general patterns
- Conditional deferred execution. We often see code in which control statements are intermixed with business logic; typical scenarios are security checks and log output
```java
// Problem 1: the logger's state (its log level) is exposed to client code
// Problem 2: the logger's state must be queried explicitly before every log statement
if (logger.isLoggable(Log.FINER)) {
    logger.finer("Problem: " + generateDiagnostic());
}

// Hiding the state check inside the method helps, but generateDiagnostic() is still
// evaluated (perhaps expensively) even when the message will not be logged
logger.log(Level.FINER, "Problem: " + generateDiagnostic());

// New API and its internal implementation
public void log(Level level, Supplier<String> msgSupplier) {
    if (logger.isLoggable(level)) {
        log(level, msgSupplier.get());
    }
}

// The new client code simply switches from eager to deferred evaluation
logger.log(Level.FINER, () -> "Problem: " + generateDiagnostic());
```
The general refactoring pattern is as follows:
```java
// Old API and client code
public void oldMethod(T value);

if (instance.checkState(state)) {
    instance.oldMethod(calcValue());
}

// New API and client code
public void newMethod(State state, Supplier<T> supplier) {
    if (checkState(state)) {
        oldMethod(supplier.get());
    }
}

instance.newMethod(State.READY, () -> calcValue());
```
- Execute around: reuse the setup and cleanup template code around a task, and parameterize the behavior in the middle by passing it in as a Lambda
Use Lambda to refactor the design pattern
- The strategy pattern
  - The strategy pattern represents a general solution to a family of algorithms, letting you choose which one to use at run time
  - The strategy pattern has three parts
    - An interface representing the algorithm (Strategy)
    - One or more concrete implementations of that interface, representing different algorithms (AStrategy / BStrategy)
    - One or more clients that use strategy objects (depending only on the interface, so different implementations can be passed in)
  - With Lambdas there is no need to declare new implementation classes; passing Lambda expressions achieves the same goal (a minimal sketch appears at the end of this section on design patterns)
- The template method pattern creates an algorithm skeleton and lets concrete subclasses override or implement certain parts
```java
abstract class ClassX {
    public void publicMethod() {
        // common logic
        abstractMethod(id);
        // common logic
    }
    abstract void abstractMethod(String id);
}

// Different behavior can now be plugged in directly by passing a Lambda expression,
// so there is no need to create and subclass the abstract class ClassX
class ClassY {
    public void publicMethod(Consumer<String> abstractFunction) {
        // common logic
        abstractFunction.accept(id);
        // common logic
    }
}
```
- The observer pattern
  - Use this pattern when one object (the Subject) must automatically notify several other objects (the Observers) when certain events occur (such as a state change)
  - In simple cases, e.g. when the Observer interface defines only one method, observers can be created directly from Lambdas, avoiding the boilerplate of declaring several implementation classes
  - If the Observer logic is complex, defines several methods, or holds state, keep applying the pattern in its classic form
- The chain of responsibility pattern
  - The chain of responsibility pattern is a general scheme for creating a chain of processing objects. Typically you define an abstract class representing a processing object, with a field recording its successor; once an object has finished its own work, it hands the result on to its successor
```java
// The chain of responsibility pattern here also incorporates the template method pattern
public abstract class ProcessingObject<T> {
    protected ProcessingObject<T> successor;

    public void setSuccessor(ProcessingObject<T> successor) {
        this.successor = successor;
    }

    public T handle(T input) {
        T r = handleWork(input);
        if (successor != null) {
            return successor.handle(r);
        }
        return r;
    }

    abstract protected T handleWork(T input);
}
```
- This pattern is really just chaining (i.e. composing) functions: you can create several functions and link them with the `andThen` method
```java
UnaryOperator<String> headerProcessing = (String text) -> "From Jerry: " + text;
UnaryOperator<String> spellCheckerProcessing = (String text) -> text.replaceAll("labda", "lambda");

Function<String, String> pipeline = headerProcessing.andThen(spellCheckerProcessing);
String result = pipeline.apply("I have to say labda is really cool!");
```
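As a minimal sketch of the strategy-pattern point above, a `Predicate<String>` can play the role of the strategy interface, so each concrete strategy becomes a lambda instead of a separate class (the `Validator` wrapper and the two validation rules are toy examples assumed here):
```java
import java.util.function.Predicate;

public class StrategyDemo {
    // The "strategy" interface is just a functional interface, so Predicate<String> works as-is
    static class Validator {
        private final Predicate<String> strategy;
        Validator(Predicate<String> strategy) { this.strategy = strategy; }
        boolean validate(String s) { return strategy.test(s); }
    }

    public static void main(String[] args) {
        // Instead of writing IsNumeric / IsAllLowerCase implementation classes,
        // pass the concrete algorithm as a lambda
        Validator numericValidator = new Validator(s -> s.matches("\\d+"));
        Validator lowerCaseValidator = new Validator(s -> s.matches("[a-z]+"));

        System.out.println(numericValidator.validate("aaaa"));   // false
        System.out.println(lowerCaseValidator.validate("bbbb")); // true
    }
}
```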
Test and debug
- Unit tests can be done even if Lambda expressions are used. But you should generally focus on the behavior of methods that use Lambda expressions (that is, think of a Lambda expression as a small piece of concrete code inside the containing method, and therefore test the containing method, since the “unit” of a unit test is the method itself).
- Examining the stack trace: stack traces involving Lambda expressions can be hard to read, because Lambdas have no name and the compiler has to invent one (e.g. `lambda$main$0`); this is an area where future versions of the Java compiler could improve
- Logging with `peek`: use `peek` to inspect the values flowing through a Stream pipeline; `peek` is designed to run an action on each element just before it is passed on, without consuming the stream
```java
// With peek you can see exactly what each step of the pipeline produces
List<Integer> result = numbers.stream()
        .peek(x -> System.out.println("from stream: " + x))
        .map(x -> x + 17)
        .peek(x -> System.out.println("after map: " + x))
        .filter(x -> x % 2 == 0)
        .peek(x -> System.out.println("after filter: " + x))
        .limit(3)
        .peek(x -> System.out.println("after limit: " + x))
        .collect(toList());

// from stream: 2
// after map: 19
// from stream: 3
// after map: 20
// after filter: 20
// after limit: 20
// from stream: 4
// after map: 21
// from stream: 5
// after map: 22
// after filter: 22
// after limit: 22
```
Don't you think this kind of debugging is rather cumbersome? It seems functional code is currently easy to write but not so easy to debug.
Chapter 10 Lambda-based domain-specific languages
This chapter focuses on DSLs and API design, which I find a very interesting topic. However, since my current goal is to pick up the core knowledge of the Java ecosystem quickly, I will skip this chapter for now. After all, there are still too many topics waiting for me, such as Spring, MyBatis, MySQL, Redis, ES, and concurrency. (Be sure to come back sometime and fill in this chapter.)
Part 4: Ubiquitous Java
This part introduces several new features in Java 8 and Java 9 that help you write more stable and reliable code with less effort. It mainly covers the java.util.Optional class, the new date-time API, default methods on interfaces, and the Java module system
Chapter 11 replaces NULL with Optional
Problems with NULL
One of the authors’ points is that null checks are just a cover. The first impulse of almost any Java programmer to run into a NullPointerException is to add an if statement to check for a quick fix. But if you approach the problem this way, without even thinking about whether your algorithm or data structure should return null in this case, you haven’t really solved the problem, just temporarily covered it up, making it more difficult to investigate and fix the problem the next time
- The mother of all errors: `NullPointerException` is the most typical exception in Java programs
- It bloats the code: code fills up with deeply nested `null` checks (hurting readability) or with too many exit points (hurting maintainability)
- It carries no semantics of its own
- It breaks Java's philosophy: Java has always tried to hide pointers from programmers, with one exception, the `null` pointer
- It is a hole in the type system: `null` belongs to no type, yet it can be assigned to any reference-type variable
Getting started with Optional
- When a value is present, the `Optional` class simply wraps it; when the value is absent, the missing value is modeled as an "empty" `Optional` object returned by the method `Optional.empty()`, which returns a singleton instance of `Optional`
- `Optional` enriches the semantics of your model, making knowledge that used to be hidden in the domain model explicit in the code through the type system
- It is worth emphasizing that the intent of introducing the `Optional` class is not to eliminate every single `null` reference. Rather, its goal is to help you design better APIs, so that a programmer can tell from a method's signature whether an optional value is to be expected
Application of Optional
- Creating an `Optional` object
```java
// An empty Optional
Optional<Car> optCar = Optional.empty();

// Throws a NullPointerException immediately if car == null
Optional<Car> optCar = Optional.of(car);

// Returns Optional.empty() if car == null
Optional<Car> optCar = Optional.ofNullable(car);
```
- Extracting and transforming values from `Optional` objects
  - `map` is an instance method: if the `Optional` contains a value, the function passed to `map` is applied to it as a transformation; if the `Optional` is empty, nothing happens
  - `flatMap` is an instance method that, as with Stream, flattens an `Optional<Optional<T>>` into an `Optional<T>`
- Since the `Optional` class was not designed to be used as a class field, it does not implement the `Serializable` interface. Using `Optional` in your domain model can therefore cause problems if your application relies on libraries or frameworks that require serialization
```java
// An alternative for domain models that need serialization: the field is not of type
// Optional, but a public accessor returns an Optional
public class Person {
    private Car car;
    public Optional<Car> getCarAsOptional() {
        return Optional.ofNullable(car);
    }
}
```
-
Java 9 introduced the stream() method of the Optional class, which returns a stream containing the value if it has a value, and an empty stream otherwise
- The `Optional` class provides several methods for reading the value held in an `Optional` instance
  - `get()` is the simplest but least safe method; compared with nested `null` checks it is not much of an improvement
  - `orElse(T other)` provides a default value when no value is present
  - `orElseGet(Supplier<? extends T> other)` is the lazy counterpart of `orElse`: the `Supplier` is invoked only when the `Optional` has no value. If creating the default is an expensive operation, consider this method to improve performance
  - `orElseThrow` is similar to `get` when it meets an empty `Optional`, but it lets you choose the type of exception to throw
  - `or` / `ifPresent` / `ifPresentOrElse` etc.: see the Java API docs
- Use the `filter` method to reject certain values
```java
Optional<Person> optPerson = ...;

// Important: think of an Optional as a Stream containing at most one element,
// and the behavior of filter becomes clear.
// If the Optional is empty, nothing happens; otherwise the predicate is applied to its value.
// If the predicate returns true, the Optional is returned unchanged;
// otherwise the value is filtered out, leaving an empty Optional.
optPerson.filter(person -> person.age >= 35)
         .ifPresent(p -> System.out.println("warning: age is 35 or above"));
```
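A sketch tying the methods of this section together: `flatMap` navigates a chain of optional values, `map` transforms the final one, and `orElse` supplies a default when any step is empty (the Person / Car / Insurance classes below are toy stand-ins):
```java
import java.util.Optional;

public class OptionalChainDemo {
    // Hypothetical domain classes used only for illustration
    static class Insurance { String getName() { return "CheapCar"; } }
    static class Car { Optional<Insurance> getInsurance() { return Optional.of(new Insurance()); } }
    static class Person { Optional<Car> getCar() { return Optional.empty(); } }

    public static void main(String[] args) {
        Optional<Person> optPerson = Optional.of(new Person());

        // map converts the wrapped value; flatMap avoids Optional<Optional<...>>;
        // orElse supplies a default when any step in the chain is empty
        String insuranceName = optPerson
                .flatMap(Person::getCar)
                .flatMap(Car::getInsurance)
                .map(Insurance::getName)
                .orElse("Unknown");

        System.out.println(insuranceName); // "Unknown", because getCar() is empty here
    }
}
```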
Transform old code with Optional
Optional is a good thing, but it came late: to preserve backward compatibility, it is hard to change old Java APIs to use Optional. This section explains how to work around these problems
- Almost all existing Java APIs signal that a value is missing or unavailable for some reason by returning `null`; such return values can be wrapped at the call site
```java
Optional<Object> value = Optional.ofNullable(map.get("key"));
```
- Besides returning `null`, another common convention in the Java API is to throw an exception when a function cannot produce a value for some reason
```java
// Wrap a method like this in a utility class, e.g. OptionalUtility, and call it directly later
public static Optional<Integer> optParseInt(String s) {
    try {
        return Optional.of(Integer.parseInt(s));
    } catch (NumberFormatException e) {
        return Optional.empty();
    }
}
```
- Primitive-type Optionals are not recommended. As with Stream, `Optional` has primitive specializations (`OptionalInt` / `OptionalLong` / `OptionalDouble`). For `Stream`, the primitive specializations pay off for performance when there are many elements, but that argument does not hold for an `Optional`, which contains at most one value. Moreover, the primitive versions lack very useful methods such as `map` / `flatMap` / `filter`, so they are not recommended
Chapter 12 new date and time API
- Why Java 8 needed a new date and time library
  - The `Date` / `Calendar` classes are flawed in themselves (e.g. months in `Calendar` start at 0)
  - Having both `Date` and `Calendar` adds to programmers' confusion about which one to use
  - The formatters (`DateFormat`) are not thread-safe
  - `Date` / `Calendar` objects are mutable
Use the new class
Java 8 provides classes such as LocalDate, LocalTime, LocalDateTime, Instant, Duration, and Period in the java.time package
- `LocalDate` / `LocalTime` / `LocalDateTime`
  - `LocalDate` provides a plain date with no time-of-day and no time zone information (the "Local" prefix is not just decoration)
  - `LocalTime` represents a time of day
  - `LocalDateTime` is the combination of `LocalDate` and `LocalTime`, representing a date and time without time zone information. `LocalDateTime` objects can be created directly or by merging a date object and a time object
  - `LocalDate` / `LocalTime` can be created with the static method `parse` from their string representations. `parse` can also take a `DateTimeFormatter`, the recommended replacement for the old `java.util.DateFormat`
- `Instant`
  - The date-time classes above are meant for humans to read and use; from a machine's point of view the most natural way to model time is a single large number representing the duration since a reference point (traditionally midnight, January 1, 1970, UTC)
  - The `Instant` class supports nanosecond precision and is designed for machine use, so the two kinds of classes should not be mixed
- `Duration` / `Period`
  - All of the classes above implement the `Temporal` interface, which defines how to read and manipulate the values of objects modeling time
  - `Duration` / `Period` measure the interval between two `Temporal` objects
  - `Duration` measures an amount of time in seconds or nanoseconds, so its static `between` method accepts `LocalTime` / `LocalDateTime` / `Instant` objects but not `LocalDate`
  - `Period` represents the interval in terms of years, months, days, and so on
-
It is important to note that all of the above date/time objects are immutable to better support functional programming and ensure thread-safety
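A compact sketch of the classes above (all values are arbitrary):
```java
import java.time.Duration;
import java.time.Instant;
import java.time.LocalDate;
import java.time.LocalDateTime;
import java.time.LocalTime;
import java.time.Period;

public class NewDateTimeDemo {
    public static void main(String[] args) {
        LocalDate date = LocalDate.of(2020, 10, 30);           // 2020-10-30, no time, no zone
        LocalTime time = LocalTime.of(13, 45, 20);             // 13:45:20
        LocalDateTime dateTime = LocalDateTime.of(date, time); // merge date and time
        LocalDate parsed = LocalDate.parse("2020-10-30");      // ISO format by default

        Instant now = Instant.now();                           // machine time: seconds + nanos since the epoch

        Period tenDays = Period.between(date, date.plusDays(10));              // date-based amount
        Duration twoHours = Duration.between(dateTime, dateTime.plusHours(2)); // time-based amount

        System.out.println(dateTime + " " + parsed + " " + now + " " + tenDays + " " + twoHours);
    }
}
```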
Manipulation of the date
- Manipulating the attributes of a `LocalDate` in an absolute way
  - The `get` / `with` methods are declared on the `Temporal` interface, and all of the date-time API classes implement them. Like `get` / `set`, they respectively read / modify the value of a `Temporal` object. The difference is that `with` does not modify the existing `Temporal` object; it returns a copy created from the original with some state changed, a so-called functional update
```java
LocalDate date  = LocalDate.of(2020, 10, 30);                // 2020-10-30
LocalDate date1 = date.withYear(2022);                       // 2022-10-30
LocalDate date2 = date1.withDayOfMonth(25);                  // 2022-10-25
LocalDate date3 = date2.with(ChronoField.MONTH_OF_YEAR, 2);  // 2022-02-25
```
- Modifying the attributes of a `LocalDate` in a relative way
```java
LocalDate date  = LocalDate.of(2020, 10, 1);          // 2020-10-01
LocalDate date1 = date.plusWeeks(1);                  // 2020-10-08
LocalDate date2 = date1.minusYears(2);                // 2018-10-08
LocalDate date3 = date2.plus(3, ChronoUnit.MONTHS);   // 2019-01-08
```
- Using a `TemporalAdjuster` to adjust a date: `TemporalAdjuster` lets us manipulate a date in more subtle and flexible ways than changing one value at a time; check the Java API for the full set of predefined static factories
```java
import static java.time.temporal.TemporalAdjusters.*;

LocalDate date1 = LocalDate.of(2014, 3, 18);                 // 2014-03-18
LocalDate date2 = date1.with(nextOrSame(DayOfWeek.SUNDAY));  // 2014-03-23
LocalDate date3 = date2.with(lastDayOfMonth());              // 2014-03-31

// If no predefined TemporalAdjuster fits, implement the interface yourself
// (better still, TemporalAdjuster is a functional interface, so a Lambda will do;
// a sketch of such a custom adjuster follows)
```
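Since `TemporalAdjuster` is a functional interface (its single method is `Temporal adjustInto(Temporal)`), a custom adjuster can be written inline as a lambda; the next-working-day rule below is a sketch of that idea:
```java
import java.time.DayOfWeek;
import java.time.LocalDate;
import java.time.temporal.ChronoField;
import java.time.temporal.ChronoUnit;
import java.time.temporal.TemporalAdjuster;

public class NextWorkingDayDemo {
    public static void main(String[] args) {
        // Skip to the next working day: Friday jumps 3 days, Saturday 2, anything else 1
        TemporalAdjuster nextWorkingDay = temporal -> {
            DayOfWeek dow = DayOfWeek.of(temporal.get(ChronoField.DAY_OF_WEEK));
            int daysToAdd = 1;
            if (dow == DayOfWeek.FRIDAY) daysToAdd = 3;
            else if (dow == DayOfWeek.SATURDAY) daysToAdd = 2;
            return temporal.plus(daysToAdd, ChronoUnit.DAYS);
        };

        LocalDate friday = LocalDate.of(2014, 3, 21);     // a Friday
        System.out.println(friday.with(nextWorkingDay));  // 2014-03-24, the following Monday
    }
}
```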
Parse and format date/time objects
The `java.time.format` package is dedicated to formatting and parsing date-time objects; the most important class in this package is `DateTimeFormatter`.
```java
// Note that format/parse are methods of the date-time classes themselves, not of the formatter
// (unlike the old SimpleDateFormat)
LocalDate date = LocalDate.of(2014, 3, 18);
String s1 = date.format(DateTimeFormatter.BASIC_ISO_DATE);   // 20140318
String s2 = date.format(DateTimeFormatter.ISO_LOCAL_DATE);   // 2014-03-18

LocalDate date3 = LocalDate.parse("20140318", DateTimeFormatter.BASIC_ISO_DATE);
LocalDate date4 = LocalDate.parse("2014-03-18", DateTimeFormatter.ISO_LOCAL_DATE);

// Create a DateTimeFormatter from a given pattern
DateTimeFormatter formatter = DateTimeFormatter.ofPattern("MM/dd/yyyy");
LocalDate date1 = LocalDate.of(2020, 2, 15);
String formattedDate = date1.format(formatter);
LocalDate date2 = LocalDate.parse(formattedDate, formatter);
```
- If finer-grained control is needed, the `DateTimeFormatterBuilder` class lets you build more sophisticated formatters (for example lenient parsing, where the parser uses heuristics to accept input that does not exactly match the specified pattern); consult the Java API for details
- Most importantly, unlike the old `java.util.DateFormat`, all `DateTimeFormatter` instances are thread-safe. So we can create formatter instances as singletons (like the constants the class itself defines, such as `BASIC_ISO_DATE`) and share them across multiple threads
Deal with different time zones and calendars
The new java.time.ZoneId class replaces the old java.util.TimeZone and greatly simplifies time zone handling. Like the other date-time classes, ZoneId is immutable
- Using time zones
  - The `ZoneRules` class holds about 40 time zone instances. Each specific `ZoneId` has a region identifier in "area/city" format, as in `ZoneId romeZone = ZoneId.of("Europe/Rome")`
  - Once you have a `ZoneId` object, you can combine it with a `LocalDate`, `LocalDateTime` or `Instant` object to construct a `ZonedDateTime` instance (Figure 12-1 on p. 275 of the book is helpful for understanding the relationship between `LocalDate` / `LocalTime` / `LocalDateTime` / `ZoneId` / `ZonedDateTime`)
```java
LocalDate date = LocalDate.of(2014, Month.MARCH, 18);
ZonedDateTime zdt1 = date.atStartOfDay(romeZone);

LocalDateTime dateTime = LocalDateTime.of(2014, Month.MARCH, 18, 13, 45);
ZonedDateTime zdt2 = dateTime.atZone(romeZone);

Instant instant = Instant.now();
ZonedDateTime zdt3 = instant.atZone(romeZone);
```
  - A `ZoneId` also connects `LocalDateTime` and `Instant`
```java
LocalDateTime dateTime = LocalDateTime.of(2014, Month.MARCH, 18, 13, 45);
Instant instantFromDateTime = dateTime.atZone(romeZone).toInstant();

Instant instant = Instant.now();
LocalDateTime timeFromInstant = LocalDateTime.ofInstant(instant, romeZone);
```
  - To bridge the old and new APIs, the old `Date` class gained a `toInstant` instance method and a static `from(Instant)` factory connecting `Date` and `Instant`
Chapter 13 default methods
- Interfaces in Java 8 can provide code implementation of methods as both static and default methods
- A default method starts with the `default` modifier and has a method body, just like a regular class method
```java
// List
default void sort(Comparator<? super E> c) {
    Collections.sort(this, c);
}

// Collection
default Stream<E> stream() {
    return StreamSupport.stream(spliterator(), false);
}
```
- Default methods are aimed primarily at library designers; they were introduced so that libraries such as the Java API can evolve in a compatible way
- Pairing an interface with a companion utility class is a common pattern in Java (e.g. `Collection` and `Collections`): the utility class defines many static methods that work with instances of the interface. Now that static methods can live inside the interface itself, such helper classes are no longer necessary in your own code, and those static methods can be moved into the interface
- Different kinds of compatibility: binary compatibility, source compatibility, and behavioral compatibility
- A functional interface contains only one abstract method, and the default method is non-abstract
- What, then, is the difference between an abstract class and an interface in Java 8? Don't both combine abstract methods with method bodies?
- A class can inherit only one abstract class, but can implement multiple interfaces
- An abstract class can hold state through instance variables (fields), whereas an interface cannot have instance variables
- Some wrong ideas about inheritance (p. 288)
  - Inheritance should not be the default answer to code reuse; it can introduce unnecessary complexity
  - Some classes are deliberately declared `final` to prevent this anti-pattern and to keep core functionality from being polluted
  - Every class declared `final` has its own reasons and considerations; for example, `String` is declared `final` because we do not want anyone tampering with such core functionality
Resolve the conflict
- In Java a class can extend only one parent class but implement multiple interfaces. With the introduction of default methods, a class may therefore inherit more than one method with the same signature. Which one does it use in that case?
- Three rules resolve the problem (if the conflict still cannot be resolved, compilation fails)
- A method explicitly declared in the class or a parent class takes precedence over all default methods
- If it is still not clear (that is, neither the class itself nor its parent provides an explicit declaration), then choose the interface that provides the default method for the most concrete implementation (my understanding: the lowest level in the inheritance hierarchy)
- If the conflict remains unresolved (several default methods are equally specific, e.g. the diamond problem: `B extends A`, `C extends A`, `D implements B, C`), the class must override the default method itself and explicitly call the one it wants
```java
// Java 8 introduces the syntax ClassX.super.method(...) to make the choice explicit
public class D implements B, C {
    public void method() {
        C.super.method();
    }
}
```
Chapter 14 Java module system
This chapter introduces the Java module system: rather than simply piling code together haphazardly, separation of concerns and information hiding help create software that is easier to understand. Since I have no immediate need for this, I only skimmed a few pages of this chapter and my understanding is shallow; I will need to come back to it. My general impression is that Java borrowed a good deal here from other languages, such as require and export in JavaScript.
Part 5 improves Concurrency in Java
This part covers how to build concurrent programs with Java's advanced features, including asynchronous and reactive programming. It is important, but I only skimmed a few pages; I will come back to it in detail when I do a dedicated pass on concurrency
Part 6 functional programming and the future evolution of Java
This section focuses on how to write efficient functional programs in Java
Chapter 18 functional thinking
Why functional programming
- Declarative programming
- There are generally two ways of thinking about implementing a system: one focuses on how it is done (object-oriented / imperative programming); the other focuses on what is to be done (declarative programming)
- Side-effect-free computation
- A side effect means a function's impact goes beyond the function itself; in the long run, having fewer shared mutable data structures helps reduce maintenance and debugging costs; consider immutable objects
- If the components that make up a system adhere to the no-side-effects principle, the system can exploit multi-core concurrency entirely without locking, because no method can interfere with any other
- Functional programming practices declarative programming and side-effect free computation, two ideas that make it easier to build and maintain systems
- As discussed in Chapter 1, changes in hardware (such as multicore) and programmer expectations (such as manipulating data in a database-like query-like language) have driven the Java style of software engineering to become increasingly functional to some extent
What is functional programming
- What is functional programming? The simplest answer: “It’s a way of programming with functions.” When we talk about functions, we mean “like mathematical functions, without side effects.”
- Reference transparency: No perceived side effects (no changes to variables visible to callers, no I/O, no exceptions thrown). In other words, a function is functional if it consistently returns the same result with the same input, no matter where or when it is called. The rule is that a function or method called functional can only modify local variables, and any object it references should be immutable
Recursion and iteration
- Recursion is a technique particularly promoted in functional programming; it encourages you to think in terms of what to do rather than how to do it
- In general, a recursive call is more expensive than the single machine-level branch instruction an iteration executes. How do you write a program that produces a `StackOverflowError`? Make it recursive
- Functional languages address this memory cost with tail-call optimization; the bad news is that Java does not support it at the time of writing (2018)
- When programming in Java 8, the authors' advice is to use `Stream` wherever possible to replace iteration and avoid mutation; and if recursion lets you implement an algorithm more concisely and without side effects, prefer recursion over iteration (a small sketch follows this list)
- Recursive implementations tend to be easier to write, read and understand, and most of the time programmer efficiency matters more than small differences in execution time
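A small sketch of the trade-off, using factorial as the usual toy example: the recursive version grows the call stack, while the Stream version iterates internally with no mutation visible to the caller:
```java
import java.util.stream.LongStream;

public class FactorialDemo {
    // Recursive version: easy to read, but each call adds a stack frame,
    // so a large n can end in StackOverflowError (Java has no tail-call optimization)
    static long factorialRecursive(long n) {
        return n <= 1 ? 1 : n * factorialRecursive(n - 1);
    }

    // Stream version: no growing call stack, no externally visible mutation
    static long factorialStream(long n) {
        return LongStream.rangeClosed(1, n).reduce(1, (a, b) -> a * b);
    }

    public static void main(String[] args) {
        System.out.println(factorialRecursive(5)); // 120
        System.out.println(factorialStream(5));    // 120
    }
}
```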
Chapter 19. Techniques for functional programming
function
- First-class functions are functions that can be passed as arguments, returned as results, and stored in data structures
- Higher-order functions are functions that take one or more functions as arguments or return another function. Typical higher-order functions in Java include `comparing`, `andThen`, `compose`, etc.
- Currying is a technique that helps us modularize functions and improve code reuse: it converts a function taking an n-tuple of arguments into a chain of n unary functions (a small sketch follows)
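A minimal currying sketch, using the unit-converter idea: the configuration arguments are supplied first and produce a one-argument function that is applied later (the conversion factors are just illustrative):
```java
import java.util.function.DoubleUnaryOperator;

public class CurryingDemo {
    // A unit converter f(x) = x * factor + baseline, curried so that
    // the two configuration arguments are supplied first and x later
    static DoubleUnaryOperator curriedConverter(double factor, double baseline) {
        return x -> x * factor + baseline;
    }

    public static void main(String[] args) {
        DoubleUnaryOperator convertCtoF = curriedConverter(9.0 / 5, 32);  // Celsius -> Fahrenheit
        DoubleUnaryOperator convertKmToMi = curriedConverter(0.6214, 0);  // km -> miles

        System.out.println(convertCtoF.applyAsDouble(100));   // about 212.0
        System.out.println(convertKmToMi.applyAsDouble(10));  // about 6.214
    }
}
```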
Immutable data structures
(The book's original heading is "Persistent data structures", which felt a little obscure; based on the context, especially the discussion under Figure 19-4 on p. 417, I have renamed it "immutable data structures".)
- Functional methods do not allow modification of any global data structures or structures passed in as parameters, otherwise two identical calls are likely to produce different results — this violates the principle of referential transparency and makes it impossible to think of a method simply as a mapping from parameters to results
- Since methods with side effects are prohibited, how do you update variables? The solution to functional programming is: if you need to use a data structure that represents the result, create a copy of it instead of directly modifying the existing data structure
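A minimal sketch of the copy-instead-of-mutate rule just described (the `append` helper is hypothetical): the argument list is never touched, so the function stays referentially transparent:
```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class FunctionalUpdateDemo {
    // Instead of mutating the list passed in, return a new list; the argument is left untouched
    static List<String> append(List<String> source, String element) {
        List<String> copy = new ArrayList<>(source);
        copy.add(element);
        return Collections.unmodifiableList(copy);
    }

    public static void main(String[] args) {
        List<String> original = List.of("a", "b");
        List<String> extended = append(original, "c");
        System.out.println(original); // [a, b]  -- unchanged
        System.out.println(extended); // [a, b, c]
    }
}
```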
Lazy evaluation of Stream
- For various reasons, including efficiency, the Stream design has some limitations: for example you cannot define a recursive Stream, because a Stream can be consumed only once, and once a terminal operation has been invoked on it, it is finished for good
- The solution is lazy evaluation; the book illustrates the details with examples
- A Stream is deliberately designed to be deferred: a Stream is like a black box that receives requests and generates results; When a series of action requests are made to a Stream, the requests are simply saved one by one and only evaluated when the terminal action is initiated.
- The obvious advantage of this design is that no matter how many operations are chained on a Stream, it only needs to be traversed once, rather than traversing all the elements once per operation (personally this feels a bit like ordering in a restaurant: the waiter just writes down each request, and only when the order is actually placed are the requests carried out)
- A lazy data structure can be obtained by changing a `Map<String, Object>` parameter into a `Supplier<Map<String, Object>>` parameter, so the value is created only on demand (a small sketch follows). However, lazy evaluation does not always perform better (there is extra overhead in calling the functional interface's abstract method). The authors' advice: use it if it makes programming easier, and fall back to the traditional approach if the performance penalty is unacceptable
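A small sketch of the `Supplier`-based lazy parameter described above (the method and setting names are made up): the expensive map is built only if the branch that needs it actually runs:
```java
import java.util.Map;
import java.util.function.Supplier;

public class LazyEvaluationDemo {
    // Lazy version: the caller passes a recipe; the map is built only on demand
    static void lazy(Supplier<Map<String, Object>> settingsSupplier, boolean needed) {
        if (needed) {
            Map<String, Object> settings = settingsSupplier.get(); // expensive work happens here
            System.out.println("loaded " + settings.size() + " settings");
        }
    }

    public static void main(String[] args) {
        lazy(() -> {
            System.out.println("building settings...");  // never printed when needed == false
            return Map.of("timeout", 30, "retries", 3);
        }, false);
    }
}
```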
miscellaneous
- A functional method returns the same result every time it is called, which for reference types would strictly mean the same object. In practice, functional programming usually does not use `==` (reference equality) but compares data structures with `equals`: if the two objects show no logical difference, the function is still considered referentially transparent
- A combinator is a functional idea: it combines two or more functions or data structures into a new one
```java
static <A, B, C> Function<A, C> compose(Function<B, C> g, Function<A, B> f) {
    // g(f(x))
    return x -> g.apply(f.apply(x));
}
```
Chapter 20 mixing Object-oriented and functional programming: Java and Scala comparison
Java and Scala are both programming languages that combine object-oriented and functional features, and both run on the JVM; Scala offers richer support for functions than Java does. I will skip this chapter for now, because there are too many things to review... (these days it seems you can hardly claim to know Java without having dabbled in Scala or Clojure).
Chapter 21 concludes and the future of Java
Review the Java 8 language features
- Behavior parameterization (Lambda and method references)
- Streams: what is so wrong with collections that they need to be replaced, or augmented, by the similar-but-different concept of a Stream? The larger the dataset, the more important it is to reduce how many times it is traversed; the Stream API uses lazy evaluation to fuse multiple operations into a pipeline that completes them all in a single pass
- `CompletableFuture`: Java 5 already provided the `Future` interface; `CompletableFuture` is to `Future` what `Stream` is to `Collection`
- `Optional`
- Default methods
Java 9
Java 9 does not add new language features, but the main change is to further improve on the work initiated in Java 8 by adding some new methods. The focus of Java 9 is the introduction of a new module system
- Modular systems Modular systems improve the way we design and implement applications from an architectural perspective, clearly defining the boundaries of the various sub-parts and defining the way they interact. An important reason for introducing such a change was that we wanted better, stricter cross-package encapsulation, and that the new module system would help us slice the Java runtime into more fine-grained pieces
- The Flow API: Java 9 standardizes reactive streams, whose pull-based backpressure protocol prevents a slow consumer from being overwhelmed by one or more fast producers. The Flow API contains four core interfaces: `Publisher`, `Subscriber`, `Subscription` and `Processor`
Java 10
- Local variable type inference
The future of Java
- Limitations of Java generics
  - Limitation 1: generic type arguments can only be object types, not primitive types; limitation 2: the resulting boxing and unboxing hurts performance
  - When generics were introduced in Java 5, compatibility forced an erasure model of generic polymorphism: `ArrayList<String>` and `ArrayList<Integer>` have the same runtime representation. In C# the two have genuinely different runtime representations, a design known as the reified model of generic polymorphism (reified generics)
  - What we clearly want is a generics design that blends primitive types and their corresponding object types more gracefully. The main difficulty for Java is backward compatibility, both in the JVM and in legacy code that uses reflection and relies on generics being erased
- A curious fact: the object type `Void` does contain a value; it has one and only one value, `null`
- To do true functional programming in Java, language-level support is needed, for example "immutable values"
  - Functional programming is strict about never modifying existing data structures: neither the field itself, nor any object reachable directly or indirectly through that field, may change. The existing `final` keyword does not really achieve this
  - Immutable values embody the idea that values themselves never change; only variables (which store values) can be reassigned, and then only to other immutable values
- Value types
  - We would like value types in Java because the immutable objects handled by functional programming have no reference identity
  - We would like primitive types to be a special case of value types, without Java's current erasure model of generics, which today means value types cannot be used with generics without boxing
  - The current dilemma: because of erasure, the boxed object versions of the primitive types remain essential to collections and Java generics; yet since they inherit from Object (and therefore have reference identity), that is undesirable. Solving any one of these problems means solving them all
Make Java grow faster
- Java's old, slow release cadence no longer matched the need for the language to evolve quickly: small changes had to wait for larger ones to be ready before they could ship in a release
- The development cycle was therefore changed to six months, with a long-term support (LTS) release every three years, each supported for three years
- The authors predict that the idea of functional programming and its influence will continue to guide Java in the near future
Appendix A Updates to other language features
Appendix A discusses three Java 8 language features not covered in the main text: repeating annotations, type annotations, and generalized target-type inference
annotations
In Java, annotations are a mechanism for decorating program elements with additional information. In other words, it is like syntactic metadata. Improvements to Java 8 annotations include:
- Repeating annotations can now be defined
```java
@Repeatable(Authors.class)
@interface Author {
    String name();
}

@interface Authors {
    Author[] value();
}

@Author(name = "Raoul")
@Author(name = "Raoul")
@Author(name = "Raoul")
class Book { }
```
- Annotations can be added for any type (prior to Java 8, only declarations could be annotated)
```java
@NotNull String name = person.getName();
List<@NotNull Car> cars = new ArrayList<>();
```
General target type inference
Java 8 has enhanced inference of generic parameters
```java
// Generic method signature
static <T> List<T> emptyList();

// Java 7
List<Car> cars = Collections.<Car>emptyList();

// In Java 8, the target type also includes arguments passed to methods,
// so the explicit generic argument is no longer required
List<Car> cars = Collections.emptyList();

// Likewise, there is no need to write the more verbose Collectors.<Car>toList()
List<Car> cleanCars = dirtyCars.stream()
        .filter(Car::isClean)
        .collect(Collectors.toList());
```
Appendix B Updates to other class libraries
Collections
Chapter 8 above already covered many of the methods added to the collection classes (Collection / Collections / List / Set / Map). The main thing worth mentioning here is the changes to the Comparator interface
- New instance methods
  - `reversed`: returns a new `Comparator` that sorts in the reverse order of the current one
  - `thenComparing`: when two objects compare as equal, falls back to a second `Comparator`
  - `thenComparingInt` / `thenComparingDouble` / `thenComparingLong`
- New static methods
  - `comparing`: returns a `Comparator` built from a function that extracts the sort key
  - `comparingInt` / `comparingDouble` / `comparingLong`
  - `naturalOrder`: returns a `Comparator` that orders `Comparable` objects in their natural order
  - `nullsFirst` / `nullsLast`: return a `Comparator` that treats null as smaller (or larger) than non-null objects
  - `reverseOrder`: equivalent to `naturalOrder().reversed()`
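A short sketch combining these Comparator methods (the `Person` class is a toy example): sort by age, break ties by name, then reverse the whole ordering:
```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class ComparatorDemo {
    // A toy class used only for illustration
    static class Person {
        final String name; final int age;
        Person(String name, int age) { this.name = name; this.age = age; }
        String getName() { return name; }
        int getAge() { return age; }
    }

    public static void main(String[] args) {
        List<Person> people = new ArrayList<>(Arrays.asList(
                new Person("Alice", 30), new Person("Bob", 25), new Person("Carol", 30)));

        // comparingInt builds a Comparator from a key extractor, thenComparing breaks ties,
        // reversed flips the whole ordering
        people.sort(Comparator.comparingInt(Person::getAge)
                              .thenComparing(Person::getName)
                              .reversed());

        people.forEach(p -> System.out.println(p.getName() + " " + p.getAge()));
    }
}
```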
concurrent
- Parallel streams
- `CompletableFuture`
- The classes in the `java.util.concurrent.atomic` package gained new methods (`getAndUpdate` / `updateAndGet` / `getAndAccumulate` / `accumulateAndGet`)
- Adder / Accumulator: in a multithreaded setting where several threads perform frequent updates but reads are rare, the Java API documentation recommends the new `LongAdder` / `DoubleAdder` / `LongAccumulator` / `DoubleAccumulator` classes over the corresponding atomic types. These classes are designed to grow dynamically and therefore reduce contention between threads
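A minimal sketch of `LongAdder` under contention, using a parallel stream to simulate many concurrent increments:
```java
import java.util.concurrent.atomic.LongAdder;
import java.util.stream.IntStream;

public class LongAdderDemo {
    public static void main(String[] args) {
        // LongAdder spreads updates over several internal cells, so heavily contended
        // increments scale better than a single AtomicLong
        LongAdder hits = new LongAdder();

        IntStream.range(0, 1_000).parallel().forEach(i -> hits.increment());

        // sum() folds the cells together; call it when reads are infrequent
        System.out.println(hits.sum()); // 1000
    }
}
```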
ConcurrentHashMap
- `ConcurrentHashMap` is the modern, concurrency-friendly version of `HashMap`: it allows concurrent insertions and updates because it locks only parts of its internal data structure, so it offers better read and write performance than the synchronized `Hashtable`
- Its internal data structure was also tuned for performance: in Java 8, a bucket is dynamically replaced by a sorted tree once it becomes too crowded
- Three new kinds of operations are supported: `forEach` / `reduce` / `search`, each in four flavors that operate on keys, values, `Map.Entry` objects, or key/value pairs
- Counting: the new `mappingCount` method returns a `long` rather than an `int` like the old `size`, so prefer it because it supports larger counts
Arrays
The Arrays class now supports parallel operations
- `parallelSort`: sorts the specified array in parallel
- `setAll` / `parallelSetAll`: set all elements of the specified array, sequentially / in parallel
- `parallelPrefix`: cumulatively combines the elements of the array in parallel using a supplied binary operator (e.g. running sums); a small sketch follows
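A small sketch of `parallelSetAll` and `parallelPrefix` (running sums over a toy array):
```java
import java.util.Arrays;

public class ArraysParallelDemo {
    public static void main(String[] args) {
        int[] values = new int[10];

        // parallelSetAll initializes every slot from its index, possibly in parallel
        Arrays.parallelSetAll(values, i -> i + 1);        // 1, 2, 3, ..., 10

        // parallelPrefix replaces each element with the cumulative result so far
        Arrays.parallelPrefix(values, (a, b) -> a + b);   // 1, 3, 6, ..., 55

        System.out.println(Arrays.toString(values));
    }
}
```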
Number and Math
- The `Short`, `Integer`, `Long`, `Float` and `Double` classes gained static reduction methods such as `sum`, `min` and `max`
- For operations that may overflow, the `Math` class gained new "exact" methods (e.g. `addExact`) that throw an `ArithmeticException` on overflow
Files
The most important change is that streams can now be obtained directly from files, through methods such as `Files.lines`, `Files.list`, `Files.walk` and `Files.find`. Because streams are consumed lazily, these methods are valuable when the amount of data is large; a small sketch follows
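A minimal sketch of streaming a file lazily with `Files.lines` (the file name `data.txt` is hypothetical); try-with-resources makes sure the underlying handle is closed:
```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.util.stream.Stream;

public class FilesStreamDemo {
    public static void main(String[] args) {
        // Files.lines returns a lazily populated Stream<String>
        try (Stream<String> lines = Files.lines(Paths.get("data.txt"))) {
            long nonEmpty = lines.filter(line -> !line.isEmpty()).count();
            System.out.println(nonEmpty + " non-empty lines");
        } catch (IOException | UncheckedIOException e) {
            e.printStackTrace();
        }
    }
}
```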
Reflection
The new changes mainly support several changes to the annotation mechanism (such as repeated annotations)
String
The static method `join` was added
```java
String authors = String.join(", ", "Raoul", "Mario", "Alan");
```
Appendix C How to perform multiple operations on the same stream concurrently
Appendix C mainly shows some advanced uses of the public APIs, and how to get what you want creatively, by a roundabout route, when the language does not yet provide the feature
- The design of streams in Java 8 has one very big (and possibly the biggest) limitation: when used, you can only get one processing result at a time. But we often want to get multiple results at the same time, in other words, to pass multiple Lambda expressions in a stream at once, preferably in a concurrent fashion and get their respective results
- Copying a stream, such as a fork operation, is not supported in Java 8. Appendix C builds this feature out of public APIs: `Spliterator`, combined with `BlockingQueue` and `Future`
Appendix D Lambda expressions and JVM bytecode
Appendix D briefly discusses how Java compiles Lambda expressions by examining the compiled.class files. (Personal note: Understanding JVM bytecode is important)
- The downsides of anonymous classes
  - The compiler generates a new .class file for every anonymous class, usually named in the form `ClassName$1`; the large number of class files directly hurts application startup performance
  - Each new anonymous class also creates a new subtype of the class/interface
- Comparing the compiled bytecode shows that anonymous classes and Lambda expressions use different bytecode instructions (inspect a class file with `javap -c -v ClassName`)
- Code that uses an anonymous class creates the extra class with the `new` instruction; code that uses a Lambda expression uses the `invokedynamic` instruction
- The `invokedynamic` bytecode instruction was originally introduced in JDK 7 to support dynamically typed languages on the JVM: it adds a level of indirection to method invocation, so part of the dispatch logic can be decided according to the dynamic language's own rules
- For Lambda expressions, `invokedynamic` defers generating the bytecode that implements the Lambda until run time. The effect is similar to the pseudo-code below
```java
public class LambdaDemo {
    Function<Object, String> f = [dynamic invocation of lambda$1]

    static String lambda$1(Object obj) {
        return obj.toString();
    }
}
```
- This design brings a series of benefits (P485 ~ P486)