Hash table
A hash table is a data structure that performs inserts, deletes, and lookups in constant average time. An operation on an element usually works on a particular part of the element's data, called the key. We call the size of the table tableSize; a hash function maps each key to a number in the range [0, tableSize - 1], which is then used as an index into an array. Ideally, the hash function is cheap to compute and maps any two different keys to different slots. When two different keys map to the same slot, we have a hash collision; a common way to resolve collisions is separate chaining, which stores all elements that map to the same slot in a linked list.
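To make the idea concrete, here is a minimal separate-chaining table in Java. It is only an illustrative sketch (the class and method names are invented for this example), not the HashMap implementation examined below.

class ChainedTable<K, V> {
    private static class Entry<K, V> {
        final K key;
        V value;
        Entry<K, V> next;
        Entry(K key, V value, Entry<K, V> next) { this.key = key; this.value = value; this.next = next; }
    }

    private final Entry<K, V>[] buckets;

    @SuppressWarnings("unchecked")
    ChainedTable(int tableSize) { buckets = (Entry<K, V>[]) new Entry[tableSize]; }

    // The hash function: map a key into [0, tableSize - 1]
    private int indexFor(K key) { return (key.hashCode() & 0x7fffffff) % buckets.length; }

    void put(K key, V value) {
        int i = indexFor(key);
        for (Entry<K, V> e = buckets[i]; e != null; e = e.next)
            if (e.key.equals(key)) { e.value = value; return; }  // key already present: replace
        buckets[i] = new Entry<>(key, value, buckets[i]);        // collision or empty slot: prepend to the chain
    }

    V get(K key) {
        for (Entry<K, V> e = buckets[indexFor(key)]; e != null; e = e.next)
            if (e.key.equals(key)) return e.value;
        return null;
    }
}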
Basic attributes
The basic structure
HashMap is a hash table implementation that resolves collisions with separate chaining, plus some optimizations. In a HashMap, the table is an array of nodes:
transient Node<K,V>[] table;
static class Node<K,V> implements Map.Entry<K,V> {
    final int hash;
    final K key;
    V value;
    Node<K,V> next;

    Node(int hash, K key, V value, Node<K,V> next) {
        this.hash = hash;
        this.key = key;
        this.value = value;
        this.next = next;
    }

    public final K getKey()        { return key; }
    public final V getValue()      { return value; }
    public final String toString() { return key + "=" + value; }

    public final int hashCode() {
        return Objects.hashCode(key) ^ Objects.hashCode(value);
    }

    public final V setValue(V newValue) {
        V oldValue = value;
        value = newValue;
        return oldValue;
    }

    public final boolean equals(Object o) {
        if (o == this)
            return true;
        if (o instanceof Map.Entry) {
            Map.Entry<?,?> e = (Map.Entry<?,?>)o;
            if (Objects.equals(key, e.getKey()) &&
                Objects.equals(value, e.getValue()))
                return true;
        }
        return false;
    }
}
A Node stores the key, the value, and the key's hash value (computed only once); next points to the following node, forming the linked list that resolves collisions. There is a second kind of node in HashMap:
static final class TreeNode<K,V> extends LinkedHashMap.Entry<K,V> {
    TreeNode<K,V> parent;  // red-black tree links
    TreeNode<K,V> left;
    TreeNode<K,V> right;
    TreeNode<K,V> prev;    // needed to unlink next upon deletion
    boolean red;
    TreeNode(int hash, K key, V val, Node<K,V> next) {
        super(hash, key, val, next);
    }
    // ... some code omitted
}
TreeNode is a red-black tree node. It is an optimization introduced in JDK 8: when a bucket's linked list grows past a certain length, the list is rebuilt as a red-black tree.
static final int TREEIFY_THRESHOLD = 8;
static final int UNTREEIFY_THRESHOLD = 6;
static final int MIN_TREEIFY_CAPACITY = 64;
When the table length is at least 64 and a bucket's linked list grows past 8 nodes, the list is converted into a red-black tree; when the number of nodes in a tree bucket drops to 6 or fewer, the tree is converted back into a linked list. So the basic structure of a HashMap is an array of buckets, each holding either a linked list or a red-black tree of nodes.
The hash function
In HashMap, locating a key is a two-step process. The first step is to compute the key's hash value:
static final int hash(Object key) {
    int h;
    return (key == null) ? 0 : (h = key.hashCode()) ^ (h >>> 16);
}
The hash function XORs the high 16 bits of the key's hashCode with the low 16 bits; the comment on the method explains why:
Computes key.hashCode() and spreads (XORs) higher bits of hash to lower. Because the table uses power-of-two masking, sets of hashes that vary only in bits above the current mask will always collide. (Among known examples are sets of Float keys holding consecutive whole numbers in small tables.) So we apply a transform that spreads the impact of higher bits downward. There is a tradeoff between speed, utility, and quality of bit-spreading. Because many common sets of hashes are already reasonably distributed (so don’t benefit from spreading), and because we use trees to handle large sets of collisions in bins, we just XOR some shifted bits in the cheapest possible way to reduce systematic lossage, as well as to incorporate impact of the highest bits that would otherwise never be used in index calculations because of table bounds.
The second step determines the key's index (subscript) in the array. Since the array length is always a power of 2, the modulo operation is replaced by a bitwise AND of the hash value with (array length - 1):
(n - 1) & hash
However, when the array length is small, only the low bits of the hash take part in this calculation. So that the high bits can still influence the index, while balancing speed, utility, and quality of bit-spreading, the high and low 16 bits are XORed together. Given an array length of 8, let's compute the position of the string hello world:
h = hashCode()        : 0110101011101111 1110001011000100 = 1794106052
h >>> 16              : 0000000000000000 0110101011101111
hash = h ^ (h >>> 16) : 0110101011101111 1000100000101011
8 - 1                 : 0000000000000000 0000000000000111
(n - 1) & hash        : 0000000000000000 0000000000000011 = 3
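The same computation can be checked with a few lines of Java. This is only a sketch; the literal 1794106052 is the hash code taken from the example above.

int h = 1794106052;             // key.hashCode() from the example
int hash = h ^ (h >>> 16);      // spread the high 16 bits into the low 16 bits
int n = 8;                      // table length, a power of two
int index = (n - 1) & hash;     // equivalent to hash % n when n is a power of two
System.out.println(index);      // prints 3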
The array length is kept a power of 2 by the following method:
/**
 * A HashMap method.
 * -1 = 0b11111111111111111111111111111111, so (-1 >>> k) equals 2^(32-k) - 1.
 * If cap <= 1 the shift leaves n negative, and 1 (= 2^0) is returned.
 * So, no matter what, the array length is a power of 2.
 */
static final int tableSizeFor(int cap) {
    int n = -1 >>> Integer.numberOfLeadingZeros(cap - 1);
    return (n < 0) ? 1 : (n >= MAXIMUM_CAPACITY) ? MAXIMUM_CAPACITY : n + 1;
}

/**
 * Integer method: returns the number of zero bits before the highest set bit of i.
 * For example, 00000000 00000000 00000000 00000001 returns 31;
 * 00000000 00000000 11111111 11111111 returns 16;
 * 10000000 00000000 00000000 00000000 returns 0.
 * The calculation works like a binary search over the bit positions.
 */
public static int numberOfLeadingZeros(int i) {
    // i == 0 (0b00000000000000000000000000000000) returns 32
    // i < 0 means the sign bit is 1, so return 0
    if (i <= 0)
        return i == 0 ? 32 : 0;
    int n = 31;
    // If i >= 2^16, there are at least 16 fewer leading zeros
    if (i >= 1 << 16) { n -= 16; i >>>= 16; }
    // If i >= 2^8, subtract another 8
    if (i >= 1 << 8)  { n -= 8;  i >>>= 8; }
    // If i >= 2^4, subtract another 4
    if (i >= 1 << 4)  { n -= 4;  i >>>= 4; }
    // If i >= 2^2, subtract another 2
    if (i >= 1 << 2)  { n -= 2;  i >>>= 2; }
    // If i >= 2^1, subtract another 1
    return n - (i >>> 1);
}
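As a quick illustration of the rounding behaviour, the same logic can be copied into a standalone method (tableSizeFor is package-private in HashMap, so this is a sketch rather than a call into the JDK):

static int tableSizeFor(int cap) {
    int n = -1 >>> Integer.numberOfLeadingZeros(cap - 1);
    return (n < 0) ? 1 : (n >= (1 << 30)) ? (1 << 30) : n + 1;
}

// tableSizeFor(1)  -> 1
// tableSizeFor(13) -> 16
// tableSizeFor(16) -> 16
// tableSizeFor(17) -> 32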
Other attributes
There are a few other important attributes in the HashMap:
// Default initial array size = 16
static final int DEFAULT_INITIAL_CAPACITY = 1 << 4; // aka 16
// The array can be at most 2^30 long
static final int MAXIMUM_CAPACITY = 1 << 30;
// Default load factor
static final float DEFAULT_LOAD_FACTOR = 0.75f;
// The actual number of stored elements
transient int size;
// Number of structural modifications
transient int modCount;
// Threshold for the next resize: when size exceeds it, the table is resized
// threshold = table.length * loadFactor
int threshold;
// Load factor
final float loadFactor;
HashMap provides four constructors:
public HashMap(int initialCapacity, float loadFactor) {
    if (initialCapacity < 0)
        throw new IllegalArgumentException("Illegal initial capacity: " +
                                           initialCapacity);
    if (initialCapacity > MAXIMUM_CAPACITY)
        initialCapacity = MAXIMUM_CAPACITY;
    if (loadFactor <= 0 || Float.isNaN(loadFactor))
        throw new IllegalArgumentException("Illegal load factor: " +
                                           loadFactor);
    this.loadFactor = loadFactor;
    this.threshold = tableSizeFor(initialCapacity);
}

public HashMap(int initialCapacity) {
    this(initialCapacity, DEFAULT_LOAD_FACTOR);
}

public HashMap() {
    this.loadFactor = DEFAULT_LOAD_FACTOR; // all other fields defaulted
}

public HashMap(Map<? extends K, ? extends V> m) {
    this.loadFactor = DEFAULT_LOAD_FACTOR;
    putMapEntries(m, false);
}
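For example, the capacity-related arguments behave as follows (a sketch; the values mentioned in the comments are internal details of the fields described above):

Map<String, Integer> m1 = new HashMap<>();          // table of length 16 allocated on first insert, load factor 0.75
Map<String, Integer> m2 = new HashMap<>(13);        // threshold is set to tableSizeFor(13) = 16
Map<String, Integer> m3 = new HashMap<>(16, 0.5f);  // table of length 16, resized once size exceeds 16 * 0.5 = 8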
Source code parsing
Add elements
/** Inserts an element; if the key already exists, the old value is replaced. */
public V put(K key, V value) {
    return putVal(hash(key), key, value, false, true);
}

/** Inserts an element; if the key already exists, the old value is NOT replaced. */
@Override
public V putIfAbsent(K key, V value) {
    return putVal(hash(key), key, value, true, true);
}

final V putVal(int hash, K key, V value, boolean onlyIfAbsent,
               boolean evict) {
    Node<K,V>[] tab; Node<K,V> p; int n, i;
    // If the table is null or has length 0, resize first
    if ((tab = table) == null || (n = tab.length) == 0)
        n = (tab = resize()).length;
    // If the target bucket is empty, place the new node there
    if ((p = tab[i = (n - 1) & hash]) == null)
        tab[i] = newNode(hash, key, value, null);
    else {
        // e will point to the existing node with the same key, if there is one
        Node<K,V> e; K k;
        // The first node in the bucket already holds the key
        if (p.hash == hash &&
            ((k = p.key) == key || (key != null && key.equals(k))))
            e = p;
        // The bucket holds a tree: insert into the tree
        else if (p instanceof TreeNode)
            e = ((TreeNode<K,V>)p).putTreeVal(this, tab, hash, key, value);
        else {
            // Walk the linked list
            for (int binCount = 0; ; ++binCount) {
                // Reached the tail: append the new node
                if ((e = p.next) == null) {
                    p.next = newNode(hash, key, value, null);
                    // binCount >= 7 means the bucket already held 8 nodes, so with
                    // the new node the chain is long enough to be treeified
                    if (binCount >= TREEIFY_THRESHOLD - 1) // -1 for 1st
                        treeifyBin(tab, hash);
                    break;
                }
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k))))
                    break;
                p = e;
            }
        }
        // e != null means the key already exists
        if (e != null) { // existing mapping for key
            V oldValue = e.value;
            if (!onlyIfAbsent || oldValue == null)
                e.value = value;
            // hook used by LinkedHashMap
            afterNodeAccess(e);
            return oldValue;
        }
    }
    ++modCount;
    // If the number of elements exceeds the threshold, resize
    if (++size > threshold)
        resize();
    // hook used by LinkedHashMap
    afterNodeInsertion(evict);
    return null;
}
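The difference between put and putIfAbsent (the onlyIfAbsent flag above) is easy to see in use:

Map<String, Integer> map = new HashMap<>();
map.put("a", 1);             // returns null: there was no previous mapping
map.put("a", 2);             // returns 1: the old value is replaced
map.putIfAbsent("a", 3);     // returns 2: the existing value is kept
map.putIfAbsent("b", 4);     // returns null: the new mapping is added
System.out.println(map);     // e.g. {a=2, b=4} (order is not guaranteed)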
The resize method is annotated as follows:
Initializes or doubles table size. If null, allocates in accord with initial capacity target held in field threshold. Otherwise, because we are using power-of-two expansion, the elements from each bin must either stay at same index, or move with a power of two offset in the new table.
If the table is null it is initialized (default length 16); otherwise its size is doubled. Because expansion is always by a power of two (the new array is twice the length of the old one), an element's new position is either its old index or its old index plus the old array length. Why? Take the hello world example again, with the array expanding from 8 to 16 and the element originally at index 3:
Before expansion:
hash = h ^ (h >>> 16) : 0110101011101111 1000100000101011
8 - 1                 : 0000000000000000 0000000000000111
(n - 1) & hash        : 0000000000000000 0000000000000011 = 3

After expansion:
hash = h ^ (h >>> 16) : 0110101011101111 1000100000101011
16 - 1                : 0000000000000000 0000000000001111
(n - 1) & hash        : 0000000000000000 0000000000001011 = 3 + 8 = 11

Assume another key whose spread hash is 0110101011101111 1000100000100011:

Before expansion:
hash = h ^ (h >>> 16) : 0110101011101111 1000100000100011
8 - 1                 : 0000000000000000 0000000000000111
(n - 1) & hash        : 0000000000000000 0000000000000011 = 3

After expansion:
hash = h ^ (h >>> 16) : 0110101011101111 1000100000100011
16 - 1                : 0000000000000000 0000000000001111
(n - 1) & hash        : 0000000000000000 0000000000000011 = 3
Therefore, an element's position in the new array is decided by a single newly significant bit of its hash, which is exactly what (e.hash & oldCap) == 0 tests:

hash = h ^ (h >>> 16) : 0110101011101111 1000100000101011
oldCap = 8            : 0000000000000000 0000000000001000
oldCap & hash         : 0000000000000000 0000000000001000 != 0, so the element moves to index 3 + 8 = 11
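The same check can be written out directly; this is a sketch using the spread hash value from the example above.

int hash = 0b01101010111011111000100000101011;           // the spread hash from the example
int oldCap = 8;
int oldIndex = (oldCap - 1) & hash;                       // 3
boolean staysPut = (hash & oldCap) == 0;                  // false: the extra bit is 1
int newIndex = staysPut ? oldIndex : oldIndex + oldCap;   // 3 + 8 = 11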
The resize method:
final Node<K,V>[] resize() {
    // This part determines the size of the new array; if the old size
    // is already 2^30, the table is not expanded
    Node<K,V>[] oldTab = table;
    int oldCap = (oldTab == null) ? 0 : oldTab.length;
    int oldThr = threshold;
    int newCap, newThr = 0;
    if (oldCap > 0) {
        if (oldCap >= MAXIMUM_CAPACITY) {
            threshold = Integer.MAX_VALUE;
            return oldTab;
        }
        else if ((newCap = oldCap << 1) < MAXIMUM_CAPACITY &&
                 oldCap >= DEFAULT_INITIAL_CAPACITY)
            newThr = oldThr << 1; // double threshold
    }
    else if (oldThr > 0) // initial capacity was placed in threshold
        newCap = oldThr;
    else {               // zero initial threshold signifies using defaults
        newCap = DEFAULT_INITIAL_CAPACITY;
        newThr = (int)(DEFAULT_LOAD_FACTOR * DEFAULT_INITIAL_CAPACITY);
    }
    if (newThr == 0) {
        float ft = (float)newCap * loadFactor;
        newThr = (newCap < MAXIMUM_CAPACITY && ft < (float)MAXIMUM_CAPACITY ?
                  (int)ft : Integer.MAX_VALUE);
    }
    // Update the threshold for the next resize
    threshold = newThr;
    @SuppressWarnings({"rawtypes","unchecked"})
    Node<K,V>[] newTab = (Node<K,V>[])new Node[newCap];
    table = newTab;
    if (oldTab != null) {
        // Walk the old table and reposition every element: it either stays at its
        // original index or moves to the original index plus the old capacity
        for (int j = 0; j < oldCap; ++j) {
            Node<K,V> e;
            if ((e = oldTab[j]) != null) {
                oldTab[j] = null;
                // Only one element in this bucket
                if (e.next == null)
                    newTab[e.hash & (newCap - 1)] = e;
                // A tree bucket: split the tree
                else if (e instanceof TreeNode)
                    ((TreeNode<K,V>)e).split(this, newTab, j, oldCap);
                // Walk the list, appending at the tail to preserve order
                else { // preserve order
                    // Chain of nodes that stay at the original index
                    Node<K,V> loHead = null, loTail = null;
                    // Chain of nodes that move to original index + old capacity
                    Node<K,V> hiHead = null, hiTail = null;
                    Node<K,V> next;
                    do {
                        next = e.next;
                        // stays at the original index
                        if ((e.hash & oldCap) == 0) {
                            if (loTail == null)
                                loHead = e;
                            else
                                loTail.next = e;
                            loTail = e;
                        }
                        // moves to the new index
                        else {
                            if (hiTail == null)
                                hiHead = e;
                            else
                                hiTail.next = e;
                            hiTail = e;
                        }
                    } while ((e = next) != null);
                    // Place the two chains into the new table
                    if (loTail != null) {
                        loTail.next = null;
                        newTab[j] = loHead;
                    }
                    if (hiTail != null) {
                        hiTail.next = null;
                        newTab[j + oldCap] = hiHead;
                    }
                }
            }
        }
    }
    return newTab;
}

/**
 * TreeNode.split
 * TreeNode extends Node, so a tree bucket can still be traversed via next.
 */
final void split(HashMap<K,V> map, Node<K,V>[] tab, int index, int bit) {
    TreeNode<K,V> b = this;
    // Relink into lo and hi lists, preserving order
    TreeNode<K,V> loHead = null, loTail = null;
    TreeNode<K,V> hiHead = null, hiTail = null;
    int lc = 0, hc = 0;
    for (TreeNode<K,V> e = b, next; e != null; e = next) {
        next = (TreeNode<K,V>)e.next;
        e.next = null;
        if ((e.hash & bit) == 0) {
            if ((e.prev = loTail) == null)
                loHead = e;
            else
                loTail.next = e;
            loTail = e;
            ++lc;
        }
        else {
            if ((e.prev = hiTail) == null)
                hiHead = e;
            else
                hiTail.next = e;
            hiTail = e;
            ++hc;
        }
    }
    if (loHead != null) {
        // If the chain has 6 or fewer nodes, turn the tree back into a linked list
        if (lc <= UNTREEIFY_THRESHOLD)
            tab[index] = loHead.untreeify(map);
        else {
            tab[index] = loHead;
            if (hiHead != null) // (else is already treeified)
                loHead.treeify(tab);
        }
    }
    if (hiHead != null) {
        if (hc <= UNTREEIFY_THRESHOLD)
            tab[index + bit] = hiHead.untreeify(map);
        else {
            tab[index + bit] = hiHead;
            if (loHead != null)
                hiHead.treeify(tab);
        }
    }
}
When the list length reaches a threshold, the list becomes a tree:
final void treeifyBin(Node<K,V>[] tab, int hash) {
    int n, index; Node<K,V> e;
    // If the table is null, or its length is less than 64, just resize instead
    if (tab == null || (n = tab.length) < MIN_TREEIFY_CAPACITY)
        resize();
    else if ((e = tab[index = (n - 1) & hash]) != null) {
        TreeNode<K,V> hd = null, tl = null;
        // Convert each list node into a tree node, keeping the linked order
        do {
            TreeNode<K,V> p = replacementTreeNode(e, null);
            if (tl == null)
                hd = p;
            else {
                p.prev = tl;
                tl.next = p;
            }
            tl = p;
        } while ((e = e.next) != null);
        if ((tab[index] = hd) != null)
            // Build the red-black tree
            hd.treeify(tab);
    }
}
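A bucket only becomes a tree when many keys collide. The sketch below (BadKey and TreeifyDemo are made up for this demo) forces every key into the same bucket; once the table has grown to 64 slots and the chain is long enough, that bucket is treeified and lookups in it drop from O(n) to O(log n).

import java.util.HashMap;
import java.util.Map;

public class TreeifyDemo {
    // Every BadKey returns the same hashCode, so all entries land in one bucket.
    static final class BadKey implements Comparable<BadKey> {
        final int id;
        BadKey(int id) { this.id = id; }
        @Override public int hashCode() { return 42; }
        @Override public boolean equals(Object o) {
            return o instanceof BadKey && ((BadKey) o).id == id;
        }
        @Override public int compareTo(BadKey other) { return Integer.compare(id, other.id); }
    }

    public static void main(String[] args) {
        Map<BadKey, Integer> map = new HashMap<>();
        for (int i = 0; i < 100; i++) {
            map.put(new BadKey(i), i);   // all 100 entries share one bucket
        }
        System.out.println(map.get(new BadKey(57)));  // 57, found via the treeified bucket
    }
}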
Find elements
/** Finds the value mapped to the given key. */
public V get(Object key) {
    Node<K,V> e;
    return (e = getNode(hash(key), key)) == null ? null : e.value;
}

final Node<K,V> getNode(int hash, Object key) {
    Node<K,V>[] tab; Node<K,V> first, e; int n; K k;
    // The table is non-null, its length is greater than 0, and the target bucket is non-empty
    if ((tab = table) != null && (n = tab.length) > 0 &&
        (first = tab[(n - 1) & hash]) != null) {
        // Check the first node: the hash values match and the keys are equal
        if (first.hash == hash && // always check first node
            ((k = first.key) == key || (key != null && key.equals(k))))
            return first;
        // Otherwise search the rest of the bucket
        if ((e = first.next) != null) {
            if (first instanceof TreeNode)
                return ((TreeNode<K,V>)first).getTreeNode(hash, key);
            do {
                if (e.hash == hash &&
                    ((k = e.key) == key || (key != null && key.equals(k))))
                    return e;
            } while ((e = e.next) != null);
        }
    }
    return null;
}

public boolean containsKey(Object key) {
    return getNode(hash(key), key) != null;
}

/** Returns the given default value if the key is absent. */
@Override
public V getOrDefault(Object key, V defaultValue) {
    Node<K,V> e;
    return (e = getNode(hash(key), key)) == null ? defaultValue : e.value;
}
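A few calls showing the lookup methods above, including the special case of a null key (hash(null) is 0, so a null key always lands in bucket 0):

Map<String, Integer> map = new HashMap<>();
map.put("a", 1);
map.get("a");                // 1
map.get("b");                // null: no such key
map.getOrDefault("b", 0);    // 0: the default is returned instead of null
map.containsKey("a");        // true
map.put(null, 42);           // a null key is allowed
map.get(null);               // 42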
Remove elements
public V remove(Object key) {
    Node<K,V> e;
    return (e = removeNode(hash(key), key, null, false, true)) == null ?
        null : e.value;
}

final Node<K,V> removeNode(int hash, Object key, Object value,
                           boolean matchValue, boolean movable) {
    Node<K,V>[] tab; Node<K,V> p; int n, index;
    // The table is non-null, its length is greater than 0, and the target bucket is non-empty
    if ((tab = table) != null && (n = tab.length) > 0 &&
        (p = tab[index = (n - 1) & hash]) != null) {
        Node<K,V> node = null, e; K k; V v;
        // Check whether the first node is the one to delete
        if (p.hash == hash &&
            ((k = p.key) == key || (key != null && key.equals(k))))
            node = p;
        else if ((e = p.next) != null) {
            // A tree bucket: search the tree
            if (p instanceof TreeNode)
                node = ((TreeNode<K,V>)p).getTreeNode(hash, key);
            // Otherwise search the linked list
            else {
                do {
                    if (e.hash == hash &&
                        ((k = e.key) == key || (key != null && key.equals(k)))) {
                        node = e;
                        break;
                    }
                    p = e;
                } while ((e = e.next) != null);
            }
        }
        // A node was found and the other conditions (value match, if required) hold
        if (node != null && (!matchValue || (v = node.value) == value ||
                             (value != null && value.equals(v)))) {
            // Removing a tree node
            if (node instanceof TreeNode)
                ((TreeNode<K,V>)node).removeTreeNode(this, tab, movable);
            // Ordinary linked-list unlinking
            else if (node == p)
                tab[index] = node.next;
            else
                p.next = node.next;
            ++modCount;
            --size;
            // hook used by LinkedHashMap
            afterNodeRemoval(node);
            return node;
        }
    }
    return null;
}

/** TreeNode.removeTreeNode */
final void removeTreeNode(HashMap<K,V> map, Node<K,V>[] tab,
                          boolean movable) {
    int n;
    if (tab == null || (n = tab.length) == 0)
        return;
    int index = (n - 1) & hash;
    TreeNode<K,V> first = (TreeNode<K,V>)tab[index], root = first, rl;
    TreeNode<K,V> succ = (TreeNode<K,V>)next, pred = prev;
    if (pred == null)
        tab[index] = first = succ;
    else
        pred.next = succ;
    if (succ != null)
        succ.prev = pred;
    if (first == null)
        return;
    if (root.parent != null)
        root = root.root();
    // Only when the root is null, or the root's children are null
    // (i.e. the tree holds roughly 0 to 2 elements),
    // is the tree converted back into a linked list
    if (root == null
        || (movable
            && (root.right == null
                || (rl = root.left) == null
                || rl.left == null))) {
        tab[index] = first.untreeify(map); // too small
        return;
    }
    // ... subsequent code omitted
}
The iterator
In a HashMap, all iterators, including those for KeySet, Values, and EntrySet, inherit from HashIterator:
abstract class HashIterator {
    Node<K,V> next;        // next node to return
    Node<K,V> current;     // current node
    int expectedModCount;  // for fast-fail if the map is modified during iteration
    int index;             // current slot in the table

    HashIterator() {
        expectedModCount = modCount;
        Node<K,V>[] t = table;
        current = next = null;
        index = 0;
        // On construction, advance to the first non-empty bucket
        if (t != null && size > 0) { // advance to first entry
            do {} while (index < t.length && (next = t[index++]) == null);
        }
    }

    public final boolean hasNext() {
        return next != null;
    }

    final Node<K,V> nextNode() {
        Node<K,V>[] t;
        Node<K,V> e = next;
        if (modCount != expectedModCount)
            throw new ConcurrentModificationException();
        if (e == null)
            throw new NoSuchElementException();
        // If the current bucket is exhausted, advance to the next non-empty bucket
        if ((next = (current = e).next) == null && (t = table) != null) {
            do {} while (index < t.length && (next = t[index++]) == null);
        }
        return e;
    }

    public final void remove() {
        Node<K,V> p = current;
        if (p == null)
            throw new IllegalStateException();
        if (modCount != expectedModCount)
            throw new ConcurrentModificationException();
        current = null;
        removeNode(p.hash, p.key, null, false, false);
        expectedModCount = modCount;
    }
}

final class KeyIterator extends HashIterator
    implements Iterator<K> {
    public final K next() { return nextNode().key; }
}

final class ValueIterator extends HashIterator
    implements Iterator<V> {
    public final V next() { return nextNode().value; }
}

final class EntryIterator extends HashIterator
    implements Iterator<Map.Entry<K,V>> {
    public final Map.Entry<K,V> next() { return nextNode(); }
}
So, whichever view (keySet(), values(), or entrySet()) is used to traverse a HashMap, the traversal order is the same.
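For example (the actual order is arbitrary, but it is the same across all three views):

Map<String, Integer> map = new HashMap<>();
map.put("one", 1);
map.put("two", 2);
map.put("three", 3);
System.out.println(map.keySet());    // e.g. [one, two, three]
System.out.println(map.values());    // e.g. [1, 2, 3] -- the i-th value belongs to the i-th key
System.out.println(map.entrySet());  // e.g. [one=1, two=2, three=3]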
Serialization
Among HashMap's fields, all except threshold and loadFactor are marked transient, so the table itself is not serialized directly. Instead, writeObject records the capacity, the size, and each key/value pair:
private void writeObject(java.io.ObjectOutputStream s)
    throws IOException {
    int buckets = capacity();
    // Write out the threshold, loadfactor, and any hidden stuff
    s.defaultWriteObject();
    s.writeInt(buckets);
    s.writeInt(size);
    // Write out the stored key/value pairs
    internalWriteEntries(s);
}

void internalWriteEntries(java.io.ObjectOutputStream s) throws IOException {
    Node<K,V>[] tab;
    if (size > 0 && (tab = table) != null) {
        for (Node<K,V> e : tab) {
            for (; e != null; e = e.next) {
                s.writeObject(e.key);
                s.writeObject(e.value);
            }
        }
    }
}

private void readObject(java.io.ObjectInputStream s)
    throws IOException, ClassNotFoundException {
    // Read in the threshold (ignored), loadfactor, and any hidden stuff
    s.defaultReadObject();
    reinitialize();
    if (loadFactor <= 0 || Float.isNaN(loadFactor))
        throw new InvalidObjectException("Illegal load factor: " +
                                         loadFactor);
    s.readInt();                // Read and ignore number of buckets
    int mappings = s.readInt(); // Read number of mappings (size)
    if (mappings < 0)
        throw new InvalidObjectException("Illegal mappings count: " +
                                         mappings);
    else if (mappings > 0) { // (if zero, use defaults)
        // Size the table using given load factor only if within
        // range of 0.25...4.0
        // This part determines the table size and the threshold for the next resize
        float lf = Math.min(Math.max(0.25f, loadFactor), 4.0f);
        float fc = (float)mappings / lf + 1.0f;
        int cap = ((fc < DEFAULT_INITIAL_CAPACITY) ?
                   DEFAULT_INITIAL_CAPACITY :
                   (fc >= MAXIMUM_CAPACITY) ?
                   MAXIMUM_CAPACITY :
                   tableSizeFor((int)fc));
        float ft = (float)cap * lf;
        threshold = ((cap < MAXIMUM_CAPACITY && ft < MAXIMUM_CAPACITY) ?
                     (int)ft : Integer.MAX_VALUE);

        // Check Map.Entry[].class since it's the nearest public type to
        // what we're actually creating.
        SharedSecrets.getJavaObjectInputStreamAccess().checkArray(s, Map.Entry[].class, cap);
        @SuppressWarnings({"rawtypes","unchecked"})
        Node<K,V>[] tab = (Node<K,V>[])new Node[cap];
        table = tab;

        // Read each key/value pair and insert it into the new table
        // Read the keys and values, and put the mappings in the HashMap
        for (int i = 0; i < mappings; i++) {
            @SuppressWarnings("unchecked")
            K key = (K) s.readObject();
            @SuppressWarnings("unchecked")
            V value = (V) s.readObject();
            putVal(hash(key), key, value, false, false);
        }
    }
}
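A round-trip sketch: because the table is transient, deserialization rebuilds it by re-inserting every mapping read from the stream. (HashMapSerDemo is a made-up demo class.)

import java.io.*;
import java.util.HashMap;

public class HashMapSerDemo {
    public static void main(String[] args) throws IOException, ClassNotFoundException {
        HashMap<String, Integer> original = new HashMap<>();
        original.put("a", 1);
        original.put("b", 2);

        // Serialize: writeObject stores the capacity, the size and each key/value pair,
        // but not the (transient) table array itself.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bos)) {
            out.writeObject(original);
        }

        // Deserialize: readObject recomputes a suitable capacity and re-inserts every mapping.
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            @SuppressWarnings("unchecked")
            HashMap<String, Integer> copy = (HashMap<String, Integer>) in.readObject();
            System.out.println(copy.equals(original)); // true: contents restored into a fresh table
        }
    }
}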
Conclusion

HashMap:

- Both the key and the value may be null.
- There is no guarantee that elements are visited in insertion order, nor that the visiting order stays the same over time.
- The length of the table is always a power of two.
- A resize happens in two cases: 1) when size exceeds table.length * loadFactor; 2) when a bucket's list reaches the treeify threshold (8) but the table length is still less than 64. However, once the table length reaches the maximum (2^30), it is never expanded again.
- A tree is converted back into a linked list in two cases: 1) during a resize, when a split tree bucket ends up with 6 or fewer nodes; 2) when deleting a tree node leaves the tree too small (roughly 0 to 2 nodes).