Preface
In the previous article, iOS weak underlying implementation principle (1): the data structures SideTables, SideTable, weak_table_t, and weak_entry_t, we walked through all the data structures involved in weak references, touching on the related operation functions only in passing. This article goes straight into objc-weak.mm and chews through its functions line by line from the top. Once these functions have been analyzed, the overall implementation principle of weak should come to mind naturally. Since this article is already quite long, a separate article will summarize and verify weak as a whole. ⛽️⛽️
TABLE_SIZE macro definition
#define TABLE_SIZE(entry) (entry->mask ? entry->mask + 1 : 0)
Returns the total capacity currently allocated for the hash array of a weak_entry_t or weak_table_t.
- In weak_entry_t, when an object has no more than 4 weak references, the fixed-length array weak_referrer_t inline_referrers[WEAK_INLINE_COUNT] (length 4) stores the weak_referrer_t values; when there are more than 4, the hash array weak_referrer_t *referrers is used instead.
- The hash array of weak_table_t has an initial length of 64. When more than 3/4 of it is occupied, it is expanded to twice its total capacity and the existing data is rehashed into the new space. When data is removed from the hash array, it is shrunk to keep lookups efficient: once the total capacity exceeds 1024 and less than 1/16 of it is in use, the array is reduced to 1/8 of its total capacity, and the remaining data is rehashed into the new space. (Both expansion and shrinking allocate new space with calloc and rehash the old data into it, whereas cache_t simply discards its old data; this difference is worth remembering.) Note that this shrinking rule applies only to the hash array of weak_table_t.
- weak_entry_t starts with the fixed-length array of 4. When a new weak reference arrives and the fixed-length array has no free slot, a hash array of length 4 is allocated and the 4 existing entries are copied over index by index; it looks like a plain copy, but grow_refs_and_insert then rehashes them. The expansion check sees that usage has reached 3/4 of the capacity and doubles it, so after the first expansion the weak_entry_t hash array has length 8. The weak_entry_t hash array is never shrunk. Removing a weak reference simply sets the stored pointer slot to nil; if afterwards the fixed-length array or the hash array is completely empty, the weak_entry_t itself is removed from the weak_table_t hash array, and weak_table_t may then shrink its own capacity.
- weak_entry_t and weak_table_t can share TABLE_SIZE because they use mask in exactly the same way: in both structures, mask is always the total capacity minus 1 and takes part in the hash calculation. The reasons weak_entry_t never shrinks and starts with a fixed-length array are optimizations: an object rarely has many weak references.

Now that I think about it, the value of mask is still very clever. (All the uses of mask have been covered before, and many other places in objc4 use the same trick.)
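To make the role of mask concrete, here is a minimal sketch in plain C (valid Objective-C); the names are illustrative, not runtime API, showing how the power-of-two capacity, mask = capacity - 1, and the & operation cooperate:

#include <assert.h>
#include <stdint.h>
#include <stddef.h>

int main(void) {
    // Capacity is always a power of two, so mask = capacity - 1 is all 1s
    // in binary, and (hash & mask) is a cheap substitute for (hash % capacity).
    size_t capacity = 64;
    uintptr_t mask = capacity - 1;                          // 0b111111

    uintptr_t hash = (uintptr_t)0x123456789abcdef0ULL;      // pretend ptr_hash() result (64-bit)
    size_t begin = hash & mask;                             // always in [0, capacity - 1]
    assert(begin == hash % capacity);

    // Linear probing also wraps around with the same trick:
    size_t index = (begin + 1) & mask;                      // never goes out of bounds
    assert(index < capacity);
    return 0;
}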
⬇️
static void append_referrer(weak_entry_t *entry, objc_object **new_referrer);
As its name suggests, this function appends new_referrer (the address of a weak variable) to the specified weak_entry_t. This is only a forward declaration; the implementation comes later. It is declared here so that functions defined before the implementation can call it.
objc_weak_error
// BREAKPOINT_FUNCTION
/* Use this for functions that are intended to be breakpoint hooks.
   If you do not, the compiler may optimize them away.
   BREAKPOINT_FUNCTION( void stop_on_error(void) ); */
# define BREAKPOINT_FUNCTION(prototype)                               \
    OBJC_EXTERN __attribute__((noinline, used, visibility("hidden"))) \
    prototype { asm(""); }

BREAKPOINT_FUNCTION(
    void objc_weak_error(void)
);
GCC extension: __attribute__((visibility("hidden")))
bad_weak_table
static void bad_weak_table(weak_entry_t *entries)
{
_objc_fatal("bad weak table at %p. This may be a runtime bug or a "
"memory error somewhere else.", entries);
}
_objc_fatal is used to abort the program and print the reason. Here it indicates that some weak_entry_t in weak_table_t has a memory error. A global search shows that this function is only called when, while probing after a hash collision, index keeps increasing until it wraps around and equals begin again.
hash_pointer and w_hash_pointer
/**
 * Unique hash function for object pointers only.
 *
 * @param key The object pointer
 * @return Size unrestricted hash of pointer.
 */
static inline uintptr_t hash_pointer(objc_object *key) {
return ptr_hash((uintptr_t)key);
}
Hashes the pointer to an objc_object; used to find the corresponding weak_entry_t in the weak_table_t hash table.
/**
 * Unique hash function for weak object pointers only.
 *
 * @param key The weak object pointer.
 * @return Size unrestricted hash of pointer.
 */
static inline uintptr_t w_hash_pointer(objc_object **key) {
return ptr_hash((uintptr_t)key);
}
Hashes a pointer to an objc_object pointer (here, the address of a weak variable); used to find a weak_referrer_t in the weak_entry_t hash table, for example to set the stored weak reference to nil or remove it from the table.
ptr_hash
// Pointer hash function.
// This is not a terrific hash, but it is fast
// and not outrageously flawed for our purposes.
(A pointer hash function: not a great hash, but fast and not unreasonably flawed for these purposes.)
// Based on principles from http://locklessinc.com/articles/fast_hash/
// and evaluation ideas from http://floodyberry.com/noncryptohashzoo/
#if __LP64__
static inline uint32_t ptr_hash(uint64_t key)
{
key ^= key >> 4;
key *= 0x8a970be7488fda55;
key ^= __builtin_bswap64(key);
return (uint32_t)key;
}
#else
static inline uint32_t ptr_hash(uint32_t key)
{
key ^= key >> 4;
key *= 0x5052acdb;
key ^= __builtin_bswap32(key);
return key;
}
#endif
__LP64__ refers to an environment where both long and pointer are 64 bits.
ptr_hash is a pointer hash function; as you can see in objc4-781, it is used in many places. For the 64-bit version:

- XOR key with itself shifted right by 4 bits.
- Multiply by the constant 0x8a970be7488fda55 (a hardcoded value, presumably one Apple found to work well).
- Byte-swap the 64-bit value with __builtin_bswap64, XOR the result into key again, then truncate to 32 bits.

For __builtin_bswap64, see the notes on the GCC __builtin_ functions.
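As a quick illustration of the steps above (a small sketch; the input value is just an example, not anything from the runtime):

#include <assert.h>
#include <stdint.h>

int main(void) {
    // __builtin_bswap64 reverses the byte order of a 64-bit value.
    uint64_t v = 0x0123456789abcdefULL;
    assert(__builtin_bswap64(v) == 0xefcdab8967452301ULL);

    // The same mixing steps as the 64-bit ptr_hash above:
    uint64_t key = v;
    key ^= key >> 4;
    key *= 0x8a970be7488fda55;
    key ^= __builtin_bswap64(key);
    (void)(uint32_t)key;   // ptr_hash truncates the result to 32 bits
    return 0;
}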
grow_refs_and_insert
Expands the hash array of weak_entry_t, inserts new_referrer, and rehashes the original data into the new space.
/**
 * Grow the entry's hash table of referrers. Rehashes each
 * of the referrers.
 *
 * @param entry Weak pointer hash set for a particular object.
 */
__attribute__((noinline, used))
static void grow_refs_and_insert(weak_entry_t *entry,
                                 objc_object **new_referrer)
{
    // An assertion (DEBUG) that weak_entry_t is currently in hash-array mode
    ASSERT(entry->out_of_line());

    // The new capacity is twice the old capacity
    size_t old_size = TABLE_SIZE(entry);
    size_t new_size = old_size ? old_size * 2 : 8;

    // Record the number of entries currently in use
    size_t num_refs = entry->num_refs;
    // Record the start of the old hash array, to be freed at the end
    weak_referrer_t *old_refs = entry->referrers;
    // mask is still the total capacity minus 1
    entry->mask = new_size - 1;

    // Allocate space for the new hash array:
    // total capacity * sizeof(weak_referrer_t) (8) bytes
    entry->referrers = (weak_referrer_t *)
        calloc(TABLE_SIZE(entry), sizeof(weak_referrer_t));
    // Reset to 0
    entry->num_refs = 0;
    entry->max_hash_displacement = 0;

    for (size_t i = 0; i < old_size && num_refs > 0; i++) {
        if (old_refs[i] != nil) {
            // Put each entry of the old hash array into the new hash array
            append_referrer(entry, old_refs[i]);
            // One fewer old entry left to move
            num_refs--;
        }
    }
    // Then insert new_referrer itself into the new hash array
    append_referrer(entry, new_referrer);

    // Free the old hash array
    if (old_refs) free(old_refs);
}
append_referrer
Adds the given referrer to the weak_entry_t hash array (or its internal array of fixed length 4).

/**
 * Add the given referrer to set of weak pointers in this entry.
 * Does not perform duplicate checking (b/c weak pointers are never
 * added to a set twice).
 *
 * @param entry The entry holding the set of weak pointers.
 * @param new_referrer The new weak pointer to be added.
 */
static void append_referrer(weak_entry_t *entry, objc_object **new_referrer)
{
    if (!entry->out_of_line()) {
        // Try to insert inline.
        // If weak_entry_t is not yet using the hash array, go here
        for (size_t i = 0; i < WEAK_INLINE_COUNT; i++) {
            // Find an empty slot to put new_referrer in
            if (entry->inline_referrers[i] == nil) {
                entry->inline_referrers[i] = new_referrer;
                return;
            }
        }

        // Couldn't insert inline. Allocate out of line.
        // inline_referrers is full, so switch to the referrers hash array
        // Allocate space for the hash array
        weak_referrer_t *new_referrers = (weak_referrer_t *)
            calloc(WEAK_INLINE_COUNT, sizeof(weak_referrer_t));
        // This constructed table is invalid, but grow_refs_and_insert
        // will fix it and rehash it.
        // Copy the inline_referrers data into the hash array.
        // It looks like a plain index-for-index copy, but the data
        // will be rehashed when the table is grown below.
        for (size_t i = 0; i < WEAK_INLINE_COUNT; i++) {
            new_referrers[i] = entry->inline_referrers[i];
        }
        // Assign referrers
        entry->referrers = new_referrers;
        // The entry currently holds 4 weak references
        entry->num_refs = WEAK_INLINE_COUNT;
        // Set out_of_line_ness to REFERRERS_OUT_OF_LINE,
        // marking that weak_entry_t now uses the hash array to hold
        // the pointers to weak variables
        entry->out_of_line_ness = REFERRERS_OUT_OF_LINE;
        // Note the minus 1 here:
        // mask is assigned the total capacity minus 1
        entry->mask = WEAK_INLINE_COUNT-1;
        // The maximum hash collision offset is 0
        entry->max_hash_displacement = 0;
    }

    // From here on, the dynamic hash array is handled
    // Assert: the dynamic array must be in use at this point
    ASSERT(entry->out_of_line());

    // #define TABLE_SIZE(entry) (entry->mask ? entry->mask + 1 : 0)
    // TABLE_SIZE is mask + 1
    // If usage has reached 3/4 of the total capacity
    if (entry->num_refs >= TABLE_SIZE(entry) * 3/4) {
        // Grow weak_entry_t's hash array and insert new_referrer
        return grow_refs_and_insert(entry, new_referrer);
    }

    // If no expansion is needed, perform a normal insertion
    size_t begin = w_hash_pointer(new_referrer) & (entry->mask);
    size_t index = begin;
    size_t hash_displacement = 0;
    while (entry->referrers[index] != nil) {
        hash_displacement++;
        index = (index+1) & entry->mask;
        // If index wraps around to begin, the weak table is corrupted
        if (index == begin) bad_weak_table(entry);
    }
    // Update the maximum collision offset
    if (hash_displacement > entry->max_hash_displacement) {
        entry->max_hash_displacement = hash_displacement;
    }
    // An empty slot was found; store the weak reference pointer there
    weak_referrer_t &ref = entry->referrers[index];
    ref = new_referrer;
    // Increment num_refs
    entry->num_refs++;
}
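As a concrete illustration of the inline-to-out-of-line transition implemented above, here is a minimal sketch assuming ARC; the variable names are illustrative:

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        NSObject *obj = [[NSObject alloc] init];
        // The first 4 weak references fit in inline_referrers:
        __weak NSObject *w1 = obj, *w2 = obj, *w3 = obj, *w4 = obj;
        // The 5th one finds no free inline slot: append_referrer switches the
        // entry to the referrers hash array, and grow_refs_and_insert then
        // doubles it, so the entry's hash array capacity becomes 8.
        __weak NSObject *w5 = obj;
        NSLog(@"%@ %@ %@ %@ %@", w1, w2, w3, w4, w5);
    }
    return 0;
}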
remove_referrer
Remove weak-referenced addresses from weak_entry_t’s hash array (or internal array of fixed length 4).
/**
 * Remove old_referrer from set of referrers, if it's present.
 * Does not remove duplicates, because duplicates should not exist.
 *
 * @todo this is slow if old_referrer is not present. Is this ever the case?
 *
 * @param entry The entry holding the referrers.
 * @param old_referrer The referrer to remove.
 */
static void remove_referrer(weak_entry_t *entry, objc_object **old_referrer)
{
    // If the fixed-length array of 4 is currently in use
    if (!entry->out_of_line()) {
        // Loop to find old_referrer and set its slot to nil,
        // removing old_referrer from the array
        for (size_t i = 0; i < WEAK_INLINE_COUNT; i++) {
            if (entry->inline_referrers[i] == old_referrer) {
                entry->inline_referrers[i] = nil;
                return;
            }
        }
        // The current weak_entry_t does not contain the given old_referrer.
        // This is a clear error: report it and call objc_weak_error
        _objc_inform("Attempted to unregister unknown __weak variable "
                     "at %p. This is probably incorrect use of "
                     "objc_storeWeak() and objc_loadWeak(). "
                     "Break on objc_weak_error to debug.\n",
                     old_referrer);
        objc_weak_error();
        return;
    }

    // Remove old_referrer from the hash array (set its slot to nil)
    size_t begin = w_hash_pointer(old_referrer) & (entry->mask);
    size_t index = begin;
    size_t hash_displacement = 0;
    while (entry->referrers[index] != old_referrer) {
        index = (index+1) & entry->mask;
        if (index == begin) bad_weak_table(entry);
        hash_displacement++;
        if (hash_displacement > entry->max_hash_displacement) {
            _objc_inform("Attempted to unregister unknown __weak variable "
                         "at %p. This is probably incorrect use of "
                         "objc_storeWeak() and objc_loadWeak(). "
                         "Break on objc_weak_error to debug.\n",
                         old_referrer);
            objc_weak_error();
            return;
        }
    }
    // Set the old_referrer slot to nil and decrement num_refs
    entry->referrers[index] = nil;
    entry->num_refs--;
}
weak_entry_insert
Adds a new weak_entry_t to the hash array of the given weak_table_t.

/**
 * Add new_entry to the object's table of weak references.
 * Does not check whether the referent is already in the table.
 */
static void weak_entry_insert(weak_table_t *weak_table, weak_entry_t *new_entry)
{
    // The starting address of the hash array
    weak_entry_t *weak_entries = weak_table->weak_entries;
    ASSERT(weak_entries != nil);

    size_t begin = hash_pointer(new_entry->referent) & (weak_table->mask);
    size_t index = begin;
    size_t hash_displacement = 0;
    while (weak_entries[index].referent != nil) {
        index = (index+1) & weak_table->mask;
        // If index wraps around to begin, something is wrong:
        // an empty slot must exist, because the hash array of weak_table_t
        // is always resized before this function is called,
        // doubling whenever it is more than 3/4 full
        if (index == begin) bad_weak_table(weak_entries);
        // Increase the collision offset
        hash_displacement++;
    }

    // Copy new_entry directly into the hash array
    weak_entries[index] = *new_entry;
    // Increment num_entries
    weak_table->num_entries++;

    // Record the maximum hash collision offset
    if (hash_displacement > weak_table->max_hash_displacement) {
        weak_table->max_hash_displacement = hash_displacement;
    }
}
Because this function contains no nested function calls, the implementation is relatively simple. First, why does it not check whether the referent is already in weak_table_t? A global search shows that it is called from only two places:

- weak_resize: when the weak_table_t hash array is resized, the existing weak_entry_t values are rehashed into the new space, so they are certainly not yet in the new hash array.
- weak_register_no_lock: before calling weak_entry_insert, it has already called weak_entry_for_referent and found that no corresponding weak_entry_t exists, so weak_entry_insert does not need to repeat the check. (It creates a new weak_entry_t and adds it to the hash array of weak_table_t.)

Another point to note: max_hash_displacement of weak_table_t is updated at the end of the function, recording the maximum hash-collision offset.
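A small illustration of what hash_displacement and max_hash_displacement record during such an insertion (hypothetical slot occupancy, not taken from a real run):

#include <assert.h>
#include <stddef.h>
#include <stdbool.h>

int main(void) {
    // Hypothetical probe sequence: begin == 10, slots 10 and 11 already occupied.
    bool occupied[64] = {false};
    occupied[10] = occupied[11] = true;

    size_t mask = 63, begin = 10;
    size_t index = begin, hash_displacement = 0, max_hash_displacement = 0;
    while (occupied[index]) {
        index = (index + 1) & mask;
        hash_displacement++;
    }
    assert(index == 12 && hash_displacement == 2);   // stored two slots away

    // weak_entry_insert records the worst case so that lookups can stop early:
    // weak_entry_for_referent gives up once its own displacement exceeds this maximum.
    if (hash_displacement > max_hash_displacement) {
        max_hash_displacement = hash_displacement;
    }
    assert(max_hash_displacement == 2);
    return 0;
}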
weak_resize
Adjust the capacity of weak_table_t hash array, and re-hash weak_entry_t in the original hash array into the new space.
static void weak_resize(weak_table_t *weak_table, size_t new_size)
{
    size_t old_size = TABLE_SIZE(weak_table);

    // The start of the old hash array
    weak_entry_t *old_entries = weak_table->weak_entries;
    // Allocate new_size * sizeof(weak_entry_t) bytes for the new hash array,
    // zero it, and assign the starting address to new_entries
    weak_entry_t *new_entries = (weak_entry_t *)
        calloc(new_size, sizeof(weak_entry_t));

    // mask is updated to new_size minus 1
    weak_table->mask = new_size - 1;
    // weak_entries is updated to point at new_entries
    weak_table->weak_entries = new_entries;
    // max_hash_displacement and num_entries default to 0
    // and are updated by the insert operations below
    weak_table->max_hash_displacement = 0;
    weak_table->num_entries = 0;  // restored by weak_entry_insert below

    // If the old hash array contains data
    if (old_entries) {
        weak_entry_t *entry;
        // The end of the old hash array
        weak_entry_t *end = old_entries + old_size;
        for (entry = old_entries; entry < end; entry++) {
            if (entry->referent) {
                // num_entries and max_hash_displacement are updated here
                weak_entry_insert(weak_table, entry);
            }
        }
        // Free the memory of the old hash array
        free(old_entries);
    }
}
weak_grow_maybe
Expands the hash array of weak_table_t when needed.
// Grow the given zone's table of weak references if it is full.
static void weak_grow_maybe(weak_table_t *weak_table)
{
    size_t old_size = TABLE_SIZE(weak_table);

    // Grow if at least 3/4 full.
    // If usage has reached 3/4 of the total capacity, expand.
    if (weak_table->num_entries >= old_size * 3 / 4) {
        // The second parameter is the new total capacity:
        // 64 if the table has not been initialized yet,
        // otherwise twice the previous capacity
        weak_resize(weak_table, old_size ? old_size*2 : 64);
    }
}
This function is fairly clear and makes three points explicit:

- When the number of elements in the weak_table_t hash array reaches 3/4 of its total capacity, the hash array is expanded.
- The initial capacity of the weak_table_t hash array is 64.
- On expansion, the capacity is doubled.

The actual resizing is done by weak_resize, which was analyzed above.
weak_compact_maybe
If the weak_table_t hash array is mostly empty, shrink it.
// Shrink the table if it is mostly empty.
static void weak_compact_maybe(weak_table_t *weak_table)
{
    size_t old_size = TABLE_SIZE(weak_table);

    // Shrink if larger than 1024 buckets and at most 1/16 full.
    // When the total capacity exceeds 1024 and usage is at most 1/16 of it, shrink
    if (old_size >= 1024 && old_size / 16 >= weak_table->num_entries) {
        // Shrink to 1/8 of the total capacity
        weak_resize(weak_table, old_size / 8);

        // leaves new table no more than 1/2 full
        // Combined with the 1/16 condition above, this guarantees that the
        // shrunken table is still no more than 1/2 full
    }
}
When the total capacity is at least 1024 and the occupied part is at most 1/16 of it, the array is reduced to 1/8 of its total capacity; after the reduction, the occupied part is still no more than 1/2 of the new capacity.
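A quick numeric check of the two resize rules (a small sketch in plain C, valid Objective-C; the numbers follow directly from weak_grow_maybe and weak_compact_maybe above):

#include <assert.h>
#include <stddef.h>

int main(void) {
    // Growth rule (weak_grow_maybe): initial capacity 64,
    // doubled once num_entries reaches 3/4 of the capacity.
    size_t capacity = 64;
    assert(48 >= capacity * 3 / 4);      // inserting the 48th entry triggers growth
    capacity *= 2;                        // new capacity: 128
    (void)capacity;

    // Shrink rule (weak_compact_maybe): capacity >= 1024 and num_entries <= capacity / 16.
    size_t big = 1024, used = 64;
    assert(big >= 1024 && big / 16 >= used);   // qualifies for shrinking
    size_t shrunk = big / 8;                    // new capacity: 128
    assert(used <= shrunk / 2);                 // still no more than 1/2 full
    return 0;
}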
weak_entry_remove
Removes the specified weak_entry_t from the hash array of weak_table_t.
/** * Remove entry from the zone's table of weak references. */
static void weak_entry_remove(weak_table_t *weak_table, weak_entry_t *entry)
{
// remove entry
// If Weak_entry_t currently uses a dynamic hash array, it frees its memory
if (entry->out_of_line()) free(entry->referrers);
// Set the sizeof(*entry) byte space from entry to 0
bzero(entry, sizeof(*entry));
// Decrement num_entries
weak_table->num_entries--;
// Reduce the hash array capacity of weak_table_t
weak_compact_maybe(weak_table);
}
weak_entry_for_referent
Finds the weak_entry_t for referent in the hash array of weak_table_t; returns NULL if it is not found.

/**
 * Return the weak reference table entry for the given referent.
 * If there is no entry for referent, return NULL.
 * Performs a lookup.
 *
 * @param weak_table
 * @param referent The object. Must not be nil; an assertion fires if it is.
 * @return The table of weak referrers to this object.
 */
static weak_entry_t *
weak_entry_for_referent(weak_table_t *weak_table, objc_object *referent)
{
    // If referent is NULL, the assertion fires
    ASSERT(referent);

    // The start of the hash array
    weak_entry_t *weak_entries = weak_table->weak_entries;
    // The hash array is empty; nothing to find
    if (!weak_entries) return nil;

    // Compute the hash of referent; & mask keeps begin within [0, mask]
    size_t begin = hash_pointer(referent) & weak_table->mask;
    // begin records the starting point; index drives the loop
    size_t index = begin;
    // Records the hash collision offset
    size_t hash_displacement = 0;
    while (weak_table->weak_entries[index].referent != referent) {
        // On a hash collision, advance index by 1; & mask prevents going out of bounds
        index = (index+1) & weak_table->mask;
        // If index wraps around to begin, weak_entries has a memory error
        if (index == begin) bad_weak_table(weak_table->weak_entries);
        // Record the offset
        hash_displacement++;
        // If the offset exceeds the recorded maximum,
        // no matching weak_entry_t exists; return nil
        if (hash_displacement > weak_table->max_hash_displacement) {
            return nil;
        }
    }

    // Return a pointer to the matching weak_entry_t
    return &weak_table->weak_entries[index];
}
weak_unregister_no_lock
From the source comments: unregister an already-registered weak reference. This is used when the referrer's storage is about to go away, but the referent is still alive. (Otherwise, zeroing the referrer later would be a bad memory access: the object has not been freed, but the weak variable's storage has, so writing nil into it would be a wild-pointer access.) Does nothing if referent/referrer is not a currently active weak reference. Does not zero referrer.

FIXME (from the source): currently requires the old referent value to be passed in (lame); unregistration should be automatic if the referrer is collected.

In short: this removes the given weak reference address (referrer) from the hash array (or internal array of fixed length 4) of the weak_entry_t corresponding to the referent.
void
weak_unregister_no_lock(weak_table_t *weak_table, id referent_id,
                        id *referrer_id)
{
    // Convert id to an objc_object * pointer
    objc_object *referent = (objc_object *)referent_id;
    // referrer_id is the address of the weak variable, hence the **
    objc_object **referrer = (objc_object **)referrer_id;

    weak_entry_t *entry;

    if (!referent) return;

    // Find the referent's weak_entry_t in weak_table
    if ((entry = weak_entry_for_referent(weak_table, referent))) {
        // If the entry is found, remove referrer from weak_entry_t's
        // hash array (or internal array of fixed length 4)
        remove_referrer(entry, referrer);

        bool empty = true;
        // After unregistering the referrer, decide whether the weak_entry_t
        // itself should be removed.
        // If weak_entry_t currently uses the hash array and num_refs is not 0,
        // the hash array is not empty and the entry must be kept
        if (entry->out_of_line()  &&  entry->num_refs != 0) {
            empty = false;
        }
        else {
            // Otherwise check whether the fixed-length array of 4
            // still holds any weak_referrer_t
            for (size_t i = 0; i < WEAK_INLINE_COUNT; i++) {
                if (entry->inline_referrers[i]) {
                    empty = false;
                    break;
                }
            }
        }

        // If the entry no longer holds any weak reference addresses, remove it
        if (empty) {
            weak_entry_remove(weak_table, entry);
        }
    }

    // Do not set *referrer = nil. objc_storeWeak() requires that the
    // value not change.
}
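A minimal sketch (assuming ARC) of when this unregister path is taken; the compiler-inserted calls are objc_initWeak and objc_destroyWeak, which internally reach weak_register_no_lock and weak_unregister_no_lock:

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        NSObject *obj = [[NSObject alloc] init];
        {
            // The compiler rewrites this as objc_initWeak(&weakObj, obj),
            // which calls weak_register_no_lock: &weakObj is recorded
            // in obj's weak_entry_t.
            __weak NSObject *weakObj = obj;
            (void)weakObj;
        }   // weakObj's storage goes away while obj is still alive:
            // objc_destroyWeak(&weakObj) runs, which reaches
            // weak_unregister_no_lock and removes &weakObj from the entry.
        NSLog(@"obj is still alive here: %@", obj);
    }
    return 0;
}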
weak_register_no_lock
Registers an object and the address of a weak reference to it into the corresponding weak_entry_t in weak_table_t.
/**
 * Registers a new (object, weak pointer) pair. Creates a new weak
 * object entry (weak_entry_t) if it does not exist.
 *
 * @param weak_table The global weak table (weak_table_t).
 * @param referent The object pointed to by the weak reference.
 * @param referrer The weak pointer address.
 */
id
weak_register_no_lock(weak_table_t *weak_table, id referent_id,
                      id *referrer_id, bool crashIfDeallocating)
{
    // The object pointer
    objc_object *referent = (objc_object *)referent_id;
    // The address of the weak variable
    objc_object **referrer = (objc_object **)referrer_id;

    // If the object does not exist or is a Tagged Pointer, return it as-is
    if (!referent  ||  referent->isTaggedPointer()) return referent_id;

    // ensure that the referenced object is viable
    // Determine whether the object is being deallocated
    bool deallocating;
    if (!referent->ISA()->hasCustomRR()) {
        deallocating = referent->rootIsDeallocating();
    }
    else {
        // Custom retain/release: get the IMP of the object's
        // allowsWeakReference method as a function pointer
        BOOL (*allowsWeakReference)(objc_object *, SEL) =
            (BOOL(*)(objc_object *, SEL))
            object_getMethodImplementation((id)referent,
                                           @selector(allowsWeakReference));
        if ((IMP)allowsWeakReference == _objc_msgForward) {
            return nil;
        }
        // Call the method through the function pointer
        deallocating =
            ! (*allowsWeakReference)(referent, @selector(allowsWeakReference));
    }

    // If the object is being deallocated or does not allow weak references,
    // and crashIfDeallocating is true, abort with _objc_fatal
    if (deallocating) {
        if (crashIfDeallocating) {
            _objc_fatal("Cannot form weak reference to instance (%p) of "
                        "class %s. It is possible that this object was "
                        "over-released, or is in the process of deallocation.",
                        (void*)referent, object_getClassName((id)referent));
        } else {
            return nil;
        }
    }

    // now remember it and where it is being stored
    weak_entry_t *entry;
    // Look for the weak_entry_t corresponding to referent in weak_table
    if ((entry = weak_entry_for_referent(weak_table, referent))) {
        // If found, call append_referrer to put the address of the
        // __weak variable into its hash array
        append_referrer(entry, referrer);
    }
    else {
        // If no entry is found, create a new one
        weak_entry_t new_entry(referent, referrer);
        // Grow weak_table_t's hash array if needed
        weak_grow_maybe(weak_table);
        // Insert the new weak_entry_t into weak_table_t's hash array
        weak_entry_insert(weak_table, &new_entry);
    }

    // Do not set *referrer. objc_storeWeak() requires that the value not change.
    return referent_id;
}
The process is extremely long, but each step is clear.
- First, check whether referent is a Tagged Pointer; only if it is not does the rest of the flow run. Tagged Pointers do not support weak references. (Tagged Pointer deserves its own analysis, like weak.)
- Then determine whether the object is being deallocated and whether it allows weak references. Classes inheriting from NSObject support them by default: the allowsWeakReference implementations in NSObject.mm show that the class method returns YES by default, and the instance method returns YES as long as the object is not being deallocated.
- (BOOL)_isDeallocating {
return _objc_rootIsDeallocating(self);
}
+ (BOOL)allowsWeakReference {
return YES;
}
- (BOOL)allowsWeakReference {
return ! [self _isDeallocating];
}
- Based on deallocating (a flag for whether the object is being freed or does not allow weak references) and the crashIfDeallocating argument, decide whether to abort the program.
- Look up the weak_entry_t corresponding to referent in weak_table_t. If an entry is found, call append_referrer to insert referrer (the address of the weak variable) into the weak_entry_t hash array (or its internal array of fixed length 4).
- If no weak_entry_t is found, create a new_entry, call weak_grow_maybe to grow the table if needed, and then call weak_entry_insert to insert new_entry into the hash array of weak_table_t.
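As an aside on the crashIfDeallocating branch above, here is a minimal sketch (assuming ARC; deliberately incorrect code, do not copy it) that hits the "Cannot form weak reference to instance..." abort by forming a __weak reference inside dealloc:

#import <Foundation/Foundation.h>

@interface Person : NSObject
@end

@implementation Person
- (void)dealloc {
    // self is already deallocating here. objc_initWeak -> weak_register_no_lock
    // sees deallocating == true with crashIfDeallocating == true, so the
    // process aborts via _objc_fatal("Cannot form weak reference to instance ...").
    __weak Person *weakSelf = self;
    (void)weakSelf;
}
@end

int main(void) {
    @autoreleasepool {
        Person *p = [[Person alloc] init];
        (void)p;
    }   // p is released here, -dealloc runs, and the program aborts
    return 0;
}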
weak_is_registered_no_lock
A function called only in DEBUG builds. It checks whether an object is registered in weak_table_t, which can be understood as whether the object currently has any weak references. (Registered = it has weak references; not registered = it has none. When an object gains a weak reference, the system registers it in weak_table_t, i.e. its weak_entry_t can be found in the weak_table_t hash array.)
#if DEBUG
bool
weak_is_registered_no_lock(weak_table_t *weak_table, id referent_id)
{
// Call weak_entry_for_referent to check whether the object has corresponding entry
return weak_entry_for_referent(weak_table, (objc_object *)referent_id);
}
#endif
It simply calls weak_entry_for_referent to determine whether the object is registered in weak_table_t.
weak_clear_no_lock
This function is called while the object's dealloc is executing. Its main job is to set all weak reference pointers that point to the object to nil when the object is deallocated.
/**
 * Called by dealloc; nils out all weak pointers that point to the
 * provided object so that they can no longer be used.
 *
 * @param weak_table
 * @param referent The object being deallocated.
 */
void
weak_clear_no_lock(weak_table_t *weak_table, id referent_id)
{
    objc_object *referent = (objc_object *)referent_id;

    // Find the weak_entry_t for referent in weak_table_t's hash array
    weak_entry_t *entry = weak_entry_for_referent(weak_table, referent);
    // If no entry exists, return directly
    if (entry == nil) {
        /// XXX shouldn't happen, but does with mismatched CF/objc
        //printf("XXX no entry for clear deallocating %p\n", referent);
        return;
    }

    // zero out references
    // referrers records the weak_referrer_t values (addresses of weak variables)
    // typedef DisguisedPtr<objc_object *> weak_referrer_t;
    weak_referrer_t *referrers;
    size_t count;

    // If weak_entry_t currently uses the hash array
    if (entry->out_of_line()) {
        // Record the start of the hash array
        referrers = entry->referrers;
        // The total length:
        // both weak_entry_t's mask and weak_table_t's mask are the total length minus 1
        count = TABLE_SIZE(entry);
    }
    else {
        // If the object currently has no more than 4 weak references,
        // the inline_referrers array records the weak variable addresses
        // Record the start of inline_referrers
        referrers = entry->inline_referrers;
        // count is 4
        count = WEAK_INLINE_COUNT;
    }

    // This loop sets each weak pointer in inline_referrers or the hash array to nil
    for (size_t i = 0; i < count; ++i) {
        // The address of a weak variable
        objc_object **referrer = referrers[i];
        if (referrer) {
            // If the weak variable points to referent, set it to nil
            if (*referrer == referent) {
                *referrer = nil;
            }
            else if (*referrer) {
                // The weak variable recorded in this weak_entry_t does not
                // point to referent. This is probably incorrect use of
                // objc_storeWeak() and objc_loadWeak();
                // break on objc_weak_error to debug
                _objc_inform("__weak variable at %p holds %p instead of %p. "
                             "This is probably incorrect use of "
                             "objc_storeWeak() and objc_loadWeak(). "
                             "Break on objc_weak_error to debug.\n",
                             referrer, (void*)*referrer, (void*)referent);
                objc_weak_error();
            }
        }
    }

    // Finally remove the entry from weak_table_t
    weak_entry_remove(weak_table, entry);
}
The flow is long, but the idea is clear. When an object's dealloc runs, this function is called. It first finds the weak_entry_t corresponding to the referent_id argument in the weak_table, then walks the entry's hash array (or its fixed-length inline_referrers array), which stores the addresses of the weak variables, setting each weak variable to nil. Finally it removes the weak_entry_t from the weak_table.
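A minimal sketch (assuming ARC) that verifies this behavior: after the object is deallocated, weak_clear_no_lock has already set the weak variable to nil:

#import <Foundation/Foundation.h>

int main(void) {
    __weak NSObject *weakObj = nil;
    @autoreleasepool {
        NSObject *obj = [[NSObject alloc] init];
        weakObj = obj;
        NSLog(@"inside pool: %@", weakObj);   // prints the object
    }   // obj's dealloc runs here; weak_clear_no_lock nils out &weakObj
    NSLog(@"after dealloc: %@", weakObj);     // prints (null)
    return 0;
}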
Reference links 🔗
- Objective-c Runtime mechanism (6) — The underlying implementation of weak references
- The underlying iOS — weak describes the principle of object storage
- SideTables, SideTable, Weak_table, weak_entry_t