Author: acupt

Preface

When multithreaded concurrency is not involved, non-thread-safe container classes such as ArrayList and HashMap tend to be more efficient. In concurrent scenarios, thread-safe container classes such as ConcurrentHashMap and ArrayBlockingQueue are used instead, sacrificing some efficiency but gaining safety.

All thread-safe containers mentioned above are in the java.util.concurrent package.

This is just a brief introduction; deeper study is left for further exploration.



Introduction to Concurrent Containers

  1. ConcurrentHashMap: a concurrent HashMap
  2. CopyOnWriteArrayList: a concurrent ArrayList
  3. CopyOnWriteArraySet: a concurrent Set
  4. ConcurrentLinkedQueue: a concurrent queue (based on a linked list)
  5. ConcurrentLinkedDeque: a concurrent deque (based on a doubly linked list)
  6. ConcurrentSkipListMap: a concurrent Map based on a skip list
  7. ConcurrentSkipListSet: a concurrent Set based on a skip list
  8. ArrayBlockingQueue: a blocking queue (array-based)
  9. LinkedBlockingQueue: a blocking queue (based on a linked list)
  10. LinkedBlockingDeque: a blocking queue (based on a doubly linked list)
  11. PriorityBlockingQueue: a thread-safe priority queue
  12. SynchronousQueue: a queue in which reads and writes are paired (synchronous data exchange)
  13. LinkedTransferQueue: a data exchange queue based on a linked list
  14. DelayQueue: a delay queue

1. ConcurrentHashMap: a concurrent HashMap

One of the most common concurrent containers; it can be used as a cache in concurrent scenarios. The underlying data structure is still a hash table, but it changed significantly in Java 8. Since Java 7 and Java 8 are both in widespread use, the two implementations are often compared (for example, in interviews).

One big difference is that Java 7 uses segmented locking to reduce lock contention, while Java 8 drops segmented locking in favor of CAS (a form of optimistic locking). In addition, to cope with severe hash collisions (when a collision occurs, a linked list is created at that bucket and objects with the same hash value are chained together), Java 8 converts the list into a red-black tree once its length reaches a threshold (8), because a tree's query performance is more stable than a linked list's.
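
For instance, a minimal sketch of using it as a simple cache, assuming a hypothetical loadFromDatabase lookup; computeIfAbsent computes and stores the value atomically, so the loader runs at most once per key even under concurrency:

import java.util.concurrent.ConcurrentHashMap;

public class UserCache {
    // hypothetical cache keyed by user id
    private final ConcurrentHashMap<Long, String> cache = new ConcurrentHashMap<>();

    public String getUserName(long id) {
        // loads the value at most once per key, even when called concurrently
        return cache.computeIfAbsent(id, this::loadFromDatabase);
    }

    // placeholder for a real database lookup
    private String loadFromDatabase(long id) {
        return "user-" + id;
    }
}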

2. CopyOnWriteArrayList: a concurrent ArrayList

The concurrent version of ArrayList, also backed by an array. Unlike ArrayList, when adding or removing an element it creates a new array, applies the addition or removal to that new array, and finally replaces the original array with the new one.

Application scenario: read operations are not locked, while write operations (add, delete, modify) are locked, so it suits scenarios with many reads and few writes.

Limitations: since reads are not locked (reading is as efficient as with a plain ArrayList), a reader sees the current snapshot of the array and may therefore read stale data. If that matters to you, this is not the right choice.

Take a look at the source code to get a feel for it:

public class CopyOnWriteArrayList<E>
    implements List<E>, RandomAccess, Cloneable, java.io.Serializable {

    final transient ReentrantLock lock = new ReentrantLock();

    private transient volatile Object[] array;

    public boolean add(E e) {
        final ReentrantLock lock = this.lock;
        lock.lock();
        try {
            Object[] elements = getArray();                          // current array
            int len = elements.length;
            Object[] newElements = Arrays.copyOf(elements, len + 1); // create a new array with one more slot than the old one
            newElements[len] = e;                                    // put the element to be added into the new array
            setArray(newElements);                                   // replace the old array with the new array
            return true;
        } finally {
            lock.unlock();
        }
    }

    public E get(int index) {
        return get(getArray(), index);
    }
}
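
As a small usage sketch of the copy-on-write behavior (the element values here are arbitrary): an iterator works on the array snapshot taken when it was created, so modifying the list while iterating neither blocks nor throws ConcurrentModificationException:

import java.util.concurrent.CopyOnWriteArrayList;

public class CopyOnWriteDemo {
    public static void main(String[] args) {
        CopyOnWriteArrayList<String> listeners = new CopyOnWriteArrayList<>();
        listeners.add("logListener");
        listeners.add("metricsListener");

        // the loop iterates over the snapshot of two elements;
        // the adds below happen on new arrays and are not seen by this iterator
        for (String listener : listeners) {
            listeners.add("addedDuringIteration");
            System.out.println(listener);
        }
        System.out.println(listeners.size()); // 4
    }
}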


3. CopyOnWriteArraySet: a concurrent Set

The implementation is based on CopyOnWriteArrayList (it holds a CopyOnWriteArrayList member variable), so the underlying structure is also an array. That means every add has to traverse the whole collection to check whether the element already exists, and only inserts it (under the lock) if it does not.

Application scenario: the same as for CopyOnWriteArrayList, with the extra caveat that the set should not be too large, since every add traverses the entire collection.
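
A minimal sketch of the deduplication behavior:

import java.util.concurrent.CopyOnWriteArraySet;

public class CopyOnWriteSetDemo {
    public static void main(String[] args) {
        CopyOnWriteArraySet<String> tags = new CopyOnWriteArraySet<>();
        System.out.println(tags.add("java")); // true: not present yet, so it is inserted
        System.out.println(tags.add("java")); // false: add traverses the array, finds the element, and skips the insert
        System.out.println(tags.size());      // 1
    }
}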



4. ConcurrentLinkedQueue: concurrent queue (based on a linked list)

A concurrent queue implemented with a linked list, using optimistic locking (CAS) to ensure thread safety. Because the data structure is a linked list, there is in theory no limit on the queue size, which means adding an element is guaranteed to succeed.
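
A minimal sketch: offer always succeeds because the queue is unbounded, and poll returns null rather than blocking when the queue is empty:

import java.util.concurrent.ConcurrentLinkedQueue;

public class LinkedQueueDemo {
    public static void main(String[] args) {
        ConcurrentLinkedQueue<String> queue = new ConcurrentLinkedQueue<>();
        System.out.println(queue.offer("task-1")); // true: no capacity limit, adding always succeeds
        System.out.println(queue.poll());          // task-1
        System.out.println(queue.poll());          // null: non-blocking, an empty queue just returns null
    }
}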



5. ConcurrentLinkedDeque: concurrent queue (based on a doubly linked list)

A concurrent deque based on a doubly linked list. It can be operated on from both the head and the tail, so besides FIFO (first in, first out) it also supports FILO (first in, last out), in which case it is really being used as a stack.
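
A minimal sketch of the stack-style usage (push and pop work on the head):

import java.util.concurrent.ConcurrentLinkedDeque;

public class DequeDemo {
    public static void main(String[] args) {
        ConcurrentLinkedDeque<Integer> deque = new ConcurrentLinkedDeque<>();
        deque.push(1);                        // addFirst
        deque.push(2);
        deque.push(3);
        System.out.println(deque.pop());      // 3: removeFirst, i.e. last in, first out
        System.out.println(deque.pollLast()); // 1: the tail can be consumed as well
    }
}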



6. ConcurrentSkipListMap: concurrent Map based on a skip list

A skip list (SkipList) is a data structure that trades space for time: it keeps redundant data to build layer-by-layer indexes over a linked list, so that lookups work much like binary search.
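
A minimal sketch: unlike ConcurrentHashMap, the keys stay sorted, so ordered operations such as firstKey and headMap are cheap:

import java.util.concurrent.ConcurrentSkipListMap;

public class SkipListMapDemo {
    public static void main(String[] args) {
        ConcurrentSkipListMap<Integer, String> map = new ConcurrentSkipListMap<>();
        map.put(30, "c");
        map.put(10, "a");
        map.put(20, "b");
        System.out.println(map.firstKey());  // 10: keys are kept in sorted order
        System.out.println(map.headMap(25)); // {10=a, 20=b}: a range view over keys below 25
    }
}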



7. ConcurrentSkipListSet: concurrent Set based on a skip list

Internally it is implemented with a ConcurrentSkipListMap (it holds a ConcurrentSkipListMap member variable).



8. ArrayBlockingQueue: blocking queue (array-based)

A blocking queue implemented on top of an array. Its capacity must be supplied at construction time. If the array is full, inserts block until space becomes available (returning immediately and waiting with a timeout are also supported). A single ReentrantLock is used to ensure thread safety.

Take the take and offer operations as an example:

public class ArrayBlockingQueue<E> extends AbstractQueue<E>
        implements BlockingQueue<E>, java.io.Serializable {

    /**
     * Reads and writes share this one lock. Threads communicate through the two
     * conditions below, which are tied to the lock (created by its newCondition method)
     * and behave much like Object's wait/notify.
     */
    final ReentrantLock lock;

    /** Condition for waiting takes: the queue is not empty */
    private final Condition notEmpty;

    /** Condition for waiting puts: the queue is not full */
    private final Condition notFull;

    // Take an element from the head of the queue, waiting while the queue is empty
    public E take() throws InterruptedException {
        final ReentrantLock lock = this.lock;
        lock.lockInterruptibly();
        try {
            while (count == 0)
                notEmpty.await();
            return dequeue();
        } finally {
            lock.unlock();
        }
    }

    // Insert an element at the tail of the queue; if the queue is full,
    // wait up to the specified time before giving up
    public boolean offer(E e, long timeout, TimeUnit unit)
        throws InterruptedException {
        checkNotNull(e);
        long nanos = unit.toNanos(timeout);
        final ReentrantLock lock = this.lock;
        lock.lockInterruptibly();
        try {
            // loop and wait until the queue has free space
            while (count == items.length) {
                if (nanos <= 0)
                    return false;
                nanos = notFull.awaitNanos(nanos);
            }
            enqueue(e); // insert the element
            return true;
        } finally {
            lock.unlock(); // unlock
        }
    }
}
At first glance it is a little confusing: the same lock is used for both reading and writing, so how do take and offer wait for each other?

The answer lies in notEmpty and notFull, two Condition objects derived from the lock that let a ReentrantLock be used much like synchronized + wait + notify.
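
A minimal usage sketch of the blocking behavior described above (the capacity and values are arbitrary):

import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.TimeUnit;

public class BoundedBufferDemo {
    public static void main(String[] args) throws InterruptedException {
        ArrayBlockingQueue<String> queue = new ArrayBlockingQueue<>(2); // capacity is fixed at construction

        System.out.println(queue.offer("a"));                      // true
        System.out.println(queue.offer("b"));                      // true, the queue is now full
        System.out.println(queue.offer("c"));                      // false: returns immediately
        System.out.println(queue.offer("c", 1, TimeUnit.SECONDS)); // false after waiting on notFull for one second

        System.out.println(queue.take()); // a: frees a slot and signals notFull
    }
}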



9. LinkedBlockingQueue: blocking queue (based on a linked list)

A blocking queue implemented with a linked list. Compared with the non-blocking ConcurrentLinkedQueue, it adds a capacity limit; if none is set, the capacity defaults to Integer.MAX_VALUE.



10. LinkedBlockingDeque: blocking queue (based on a doubly linked list)

Similar to LinkedBlockingQueue, but with the operations specific to a doubly linked list, i.e. it can be accessed from both ends.



11. PriorityBlockingQueue: thread-safe priority queue

You can pass a comparator at construction time. The effect is that the elements you put in are kept ordered and are then consumed in that order when read. Note that low-priority elements may never be consumed if higher-priority elements keep arriving.
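
A minimal sketch with a comparator supplied at construction (shorter strings are treated as higher priority here, purely for illustration):

import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;

public class PriorityDemo {
    public static void main(String[] args) throws InterruptedException {
        PriorityBlockingQueue<String> queue =
                new PriorityBlockingQueue<>(11, Comparator.comparingInt(String::length));

        queue.put("medium-task");
        queue.put("hi");
        queue.put("very-long-task");

        System.out.println(queue.take()); // hi: the highest-priority (shortest) element comes out first
        System.out.println(queue.take()); // medium-task
    }
}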



12. SynchronousQueue: synchronous data exchange queue

A "queue" in name only, because it has no real space to store elements: every insert must be matched by a corresponding take, and a producer cannot keep putting elements without a consumer taking them.

To get a feel for it, here is a simple example:

import java.util.concurrent.*;

public class Main {
    public static void main(String[] args) {
        SynchronousQueue<Integer> queue = new SynchronousQueue<>();
        // writer thread: tries its best to put elements into the queue
        new Thread(() -> {
            try {
                for (int i = 0; ; i++) {
                    System.out.println("Put in: " + i);
                    queue.put(i);
                }
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }).start();
        // reader thread: takes an element, then sleeps for a while
        new Thread(() -> {
            try {
                while (true) {
                    System.out.println("Take out: " + queue.take());
                    Thread.sleep((long) (Math.random() * 2000));
                }
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }).start();
    }
}
/* Output:
Put in: 0
Take out: 0
Put in: 1
Take out: 1
Put in: 2
Take out: 2
Put in: 3
Take out: 3
*/
As you can see, the writing thread never sleeps and tries its best to put elements into the queue, while the reading thread is rather lazy, sleeping for a while after each read. The result is that reads and writes proceed strictly in pairs.

One usage scenario in the JDK is Executors.newCachedThreadPool(), which creates a cached thread pool.

public static ExecutorService newCachedThreadPool() {
    return new ThreadPoolExecutor(0,                        // core pool size is 0: idle threads are eventually discarded
                                  Integer.MAX_VALUE,        // the maximum number of threads is effectively unlimited
                                  60L, TimeUnit.SECONDS,    // idle threads are kept alive for 60 seconds
                                  new SynchronousQueue<Runnable>()); // if no idle thread takes the task, the offer fails and the pool creates a new thread
}


13. LinkedTransferQueue: a data exchange queue based on a linked list

It implements the TransferQueue interface. When an element is put in via the transfer method, if a thread is already blocked waiting to take an element, the element is handed directly to that waiting thread. If no one is waiting to consume it, the element is placed at the tail of the queue and the method blocks until someone takes it. It is similar to SynchronousQueue, but more flexible.
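
A minimal sketch contrasting put (returns immediately) with transfer (blocks until a consumer has received the element); the timings are arbitrary:

import java.util.concurrent.LinkedTransferQueue;

public class TransferDemo {
    public static void main(String[] args) throws InterruptedException {
        LinkedTransferQueue<String> queue = new LinkedTransferQueue<>();

        // consumer shows up one second later and takes two elements
        new Thread(() -> {
            try {
                Thread.sleep(1000);
                System.out.println("Got: " + queue.take()); // takes "a", which was queued by put
                System.out.println("Got: " + queue.take()); // takes "b", which unblocks transfer
            } catch (InterruptedException e) {
                e.printStackTrace();
            }
        }).start();

        queue.put("a");      // returns immediately; the element just waits at the tail
        queue.transfer("b"); // blocks here until the consumer has taken "b"
        System.out.println("transfer returned");
    }
}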



14. DelayQueue: delay queue

Elements put into the queue can only be taken out by consumers after their specified delay has expired, and the elements must implement the Delayed interface.
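
A minimal sketch, assuming a hypothetical DelayedTask element type: getDelay reports how long until the element may be taken, and compareTo orders elements by remaining delay:

import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

public class DelayQueueDemo {
    // hypothetical element: becomes available once its deadline has passed
    static class DelayedTask implements Delayed {
        final String name;
        final long deadline; // absolute time in milliseconds

        DelayedTask(String name, long delayMillis) {
            this.name = name;
            this.deadline = System.currentTimeMillis() + delayMillis;
        }

        @Override
        public long getDelay(TimeUnit unit) {
            return unit.convert(deadline - System.currentTimeMillis(), TimeUnit.MILLISECONDS);
        }

        @Override
        public int compareTo(Delayed other) {
            return Long.compare(getDelay(TimeUnit.MILLISECONDS), other.getDelay(TimeUnit.MILLISECONDS));
        }
    }

    public static void main(String[] args) throws InterruptedException {
        DelayQueue<DelayedTask> queue = new DelayQueue<>();
        queue.put(new DelayedTask("later", 2000));
        queue.put(new DelayedTask("soon", 500));

        System.out.println(queue.take().name); // soon: available after roughly 0.5 s
        System.out.println(queue.take().name); // later: available after roughly 2 s in total
    }
}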



Conclusion

This has been a brief introduction to some of the container classes in the java.util.concurrent package. Knowing that they exist means you can reach for the right tool in the right scenario; to understand why they work, you will have to dig a little deeper.



Finally

Welcome to follow my public account [programmer chasing wind]; articles will be updated there, and collected resources will be posted there as well.