Preface
Recursive solutions often feel like something you "get" but cannot quite explain: the code is easy to read, yet when you try to write it yourself you freeze up, which is very frustrating. The reason is simply that we haven't practiced enough and don't understand recursion deeply enough.
What is recursion
Examples of recursion are easy to find. Take, for example, the function f(x) = x + f(x - 1):
def f(x):
    return x + f(x - 1)
Calling f(2) returns 2 + f(1); this calls f(1), which returns 1 + f(0); this calls f(0), which returns 0 + f(-1); and so on.
The program will run endlessly until it crashes.
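In practice it does not even get to run forever: CPython caps the call depth, so the call dies with a RecursionError once that limit is exceeded. A minimal sketch of what happens (the sys call is only there to show the cap):

import sys

def f(x):
    return x + f(x - 1)  # no termination condition: every call makes another call

print(sys.getrecursionlimit())  # default call-depth cap, typically 1000

try:
    f(2)
except RecursionError as err:
    print("crashed:", err)      # "maximum recursion depth exceeded"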
If we add a check for x > 0:
def f(x):
    if x > 0:
        return x + f(x - 1)
    else:  # f(0) = 0
        return 0
Now the calculation is f(2) = 2 + f(1) = 2 + 1 + f(0) = 2 + 1 + 0 = 3.
We can summarize two rules:
A recursive function must have a termination condition; otherwise it recurses forever and eventually crashes;
A recursive function calls itself repeatedly until it encounters a termination condition and then backtracks, eventually returning the answer.
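To see the "recurse until the termination condition, then backtrack" behavior directly, here is a traced variant of f; the depth parameter and the prints are only there for illustration:

def f(x, depth=0):
    indent = "  " * depth
    print(f"{indent}call f({x})")
    if x > 0:
        result = x + f(x - 1, depth + 1)  # recurse on a smaller problem
    else:
        result = 0                        # termination condition: f(0) = 0
    print(f"{indent}return {result}")     # backtracking: results bubble back up
    return result

f(2)
# call f(2)
#   call f(1)
#     call f(0)
#     return 0
#   return 1
# return 3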
Example
Merge two sorted linked lists into one sorted list and return it. The new list is formed by splicing together the nodes of the two given lists. Example:
Input: 1->2->4, 1->3->4 Output: 1->1->2->3->4->4
The recursive method
We can define the merge operation on two lists recursively (ignoring boundary cases such as empty lists for now) as:

merge(l1, l2) = l1[0] + merge(l1[1:], l2)   if l1[0] < l2[0]
merge(l1, l2) = l2[0] + merge(l1, l2[1:])   otherwise

That is, the list whose head is smaller contributes its head node, followed by the result of merging the remaining elements. Applying the two rules above to this problem:
Termination condition: when one of the lists is empty, there is nothing left to merge, so we return the other list.
How to recurse: compare the heads of l1 and l2, take the smaller node, and point its next pointer at the result of merging the remaining nodes (the recursive call).
Code
Python:
# Definition for singly-linked list (provided by LeetCode):
# class ListNode:
#     def __init__(self, val=0, next=None):
#         self.val = val
#         self.next = next

class Solution:
    def mergeTwoLists(self, l1: ListNode, l2: ListNode) -> ListNode:
        if not l1: return l2  # termination condition: one list is empty
        if not l2: return l1
        if l1.val <= l2.val:  # recursive call on the rest of the lists
            l1.next = self.mergeTwoLists(l1.next, l2)
            return l1
        else:
            l2.next = self.mergeTwoLists(l1, l2.next)
            return l2
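To run the solution outside LeetCode, the lists have to be built by hand. The ListNode class below just fills in the commented definition above, and build_list / print_list are throwaway helpers for this demo (they are not part of the problem); Solution is the class defined above:

class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def build_list(values):            # demo helper: Python list -> linked list
    head = None
    for v in reversed(values):
        head = ListNode(v, head)
    return head

def print_list(node):              # demo helper: print a linked list
    vals = []
    while node:
        vals.append(str(node.val))
        node = node.next
    print("->".join(vals))

merged = Solution().mergeTwoLists(build_list([1, 2, 4]), build_list([1, 3, 4]))
print_list(merged)                 # 1->1->2->3->4->4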
Java:
class Solution {
    public ListNode mergeTwoLists(ListNode l1, ListNode l2) {
        if (l1 == null) {
            return l2;
        }
        else if (l2 == null) {
            return l1;
        }
        else if (l1.val < l2.val) {
            l1.next = mergeTwoLists(l1.next, l2);
            return l1;
        }
        else {
            l2.next = mergeTwoLists(l1, l2.next);
            return l2;
        }
    }
}
C++:
class Solution {
public:
    ListNode* mergeTwoLists(ListNode* l1, ListNode* l2) {
        if (l1 == NULL) {
            return l2;
        }
        if (l2 == NULL) {
            return l1;
        }
        if (l1->val <= l2->val) {
            l1->next = mergeTwoLists(l1->next, l2);
            return l1;
        }
        l2->next = mergeTwoLists(l1, l2->next);
        return l2;
    }
};
Complexity analysis
How do you work out the time and space complexity of a recursive algorithm? For time, a common rule of thumb is:
For a recursive algorithm, the total time complexity O(T) is the number of recursive calls R multiplied by the cost of the work done inside each call, O(s): O(T) = R * O(s)
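As a concrete illustration of the formula, here is a quick sketch that counts the calls made by the earlier f(x); the global counter is only for demonstration:

calls = 0

def f(x):
    global calls
    calls += 1                 # count every recursive call
    if x > 0:
        return x + f(x - 1)
    return 0                   # termination condition: f(0) = 0

print(f(5), calls)             # 15 6 -> R = x + 1 calls, each doing O(1) work
# So O(T) = R * O(1) = O(x) for this function.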
Time complexity: O(m + n).
Here m and n are the numbers of nodes in l1 and l2. Each recursive call removes one node from the problem until one of the lists is empty, so the function is called R = O(m + n) times. Inside each call we only do a comparison and one next-pointer assignment, which is O(1), so the total time complexity is O(T) = R * O(1) = O(m + n).
Space complexity: O(m + n).
Each recursive call to self.mergeTwoLists() adds a stack frame. By the time the termination condition is reached and backtracking begins, the recursion is at most m + n calls deep, using O(m + n) stack frames, so the space complexity is O(m + n).
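Those stack frames matter in practice: CPython's default recursion limit is about 1000, so merging long lists with this recursive solution can hit a RecursionError unless the limit is raised. A rough sketch, reusing the ListNode, Solution, and build_list definitions from the Python demo above:

import sys

a = build_list(list(range(0, 6000, 2)))    # 3000 even numbers
b = build_list(list(range(1, 6000, 2)))    # 3000 odd numbers

sys.setrecursionlimit(10_000)              # make room for roughly m + n = 6000 frames
merged = Solution().mergeTwoLists(a, b)    # with the default limit this call would
                                           # exceed the maximum recursion depth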
Related topics
Here are some basic but classic problems worth practicing:
Reverse String (leetcode-cn.com/problems/re…)
Tower of Hanoi (leetcode-cn.com/problems/ha…)
Swap Nodes in Pairs (leetcode-cn.com/problems/sw…)
Maximum Depth of Binary Tree (leetcode-cn.com/problems/ma…)
If you have any questions, feel free to leave a comment and discuss.