When college students learn sorting algorithms, it is often more important to learn how to evaluate and analyze a sorting algorithm than to memorize its principle and code. Sorting is familiar to almost every programmer: most programming languages ship with sorting functions, and we use sorting all the time in ordinary projects. Sorting is important! This chapter starts with how to analyze an algorithm, then works step by step through the sorting algorithms you should have mastered by the end of your four years of university!

Feel free to think about that for a minute or two; with the question in mind, here we go! Also note that the text highlighted in red (in the original formatting) marks the points that are most noteworthy, or most easily overlooked.

How to analyze a “sorting algorithm”?

1. The execution efficiency of sorting algorithm

To analyze the execution efficiency of a sorting algorithm, we generally measure it from the following three aspects:

1.1. Best-case, worst-case, and average-case time complexity

When we analyze the time complexity of a sorting algorithm, we should give the best-case, worst-case, and average-case time complexity separately. In addition, you should state what kind of raw input produces the best case and the worst case.

Why distinguish between these three time complexities? First, some sorting algorithms behave very differently across the three cases, so giving all of them makes comparison fair. Second, the data to be sorted may be nearly ordered or completely unordered; different degrees of initial order certainly affect execution time, so we need to know how a sorting algorithm performs on different inputs.

1.2. Coefficients, constants, and lower-order terms of time complexity

As we know, time complexity describes a growth trend as the data size n becomes large, so it ignores coefficients, constants, and lower-order terms. However, in actual software development we often sort small data sets of 10, 100, or 1,000 elements, so when comparing sorting algorithms with the same time complexity we do need to take coefficients, constants, and lower-order terms into consideration.

1.3. Number of comparisons and number of swaps (or moves)

This chapter and the next are about comparison-based sorting algorithms. The execution of a comparison-based sorting algorithm involves two operations: comparing the sizes of elements, and exchanging or moving elements. Therefore, when analyzing the efficiency of a sorting algorithm, we should also take into account the number of comparisons and the number of swaps (or moves).

2. Memory consumption of sorting algorithm

As we mentioned earlier, the memory consumption of an algorithm is measured by its space complexity, and sorting algorithms are no exception. For sorting, however, we also introduce a new concept: sorting in place. An in-place sorting algorithm is one whose space complexity is O(1).

3. Stability of sorting algorithm

Stability is not to be ignored. Execution efficiency and memory consumption alone are not enough to evaluate a sorting algorithm; stability is another important metric. The concept: if the sequence contains elements with equal values, a stable sort keeps their relative order unchanged after sorting.

Let me illustrate with an example. Say we sort the sequence 2, 9, 3, 4, 8, 3 by size; the result is 2, 3, 3, 4, 8, 9. If the 3 that originally came first still comes before the other 3, the algorithm is stable; if the two 3s may end up reordered, it is unstable.

Stable sorting algorithm
Unstable sorting algorithm
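To see the difference concretely, here is a minimal sketch (my own demo, not from the original article): each value is tagged with its original index, and the array is sorted by value alone, once with bubble sort (stable) and once with selection sort (unstable). The sample array {5, 8, 5, 2, 9} is chosen so the instability actually shows up.

```java
import java.util.Arrays;

public class StabilityDemo {
    public static void main(String[] args) {
        int[][] a = tagged();          // {value, originalIndex} pairs
        bubbleSort(a);                 // stable: never swaps equal neighbors
        System.out.println(Arrays.deepToString(a));
        // [[2, 3], [5, 0], [5, 2], [8, 1], [9, 4]] -- the equal 5s keep their order

        int[][] b = tagged();
        selectionSort(b);              // unstable: long-range swaps can reorder equal values
        System.out.println(Arrays.deepToString(b));
        // [[2, 3], [5, 2], [5, 0], [8, 1], [9, 4]] -- the equal 5s are reversed
    }

    // Tag each value with its original position
    static int[][] tagged() {
        int[] values = {5, 8, 5, 2, 9};
        int[][] t = new int[values.length][2];
        for (int i = 0; i < values.length; i++) { t[i][0] = values[i]; t[i][1] = i; }
        return t;
    }

    static void bubbleSort(int[][] a) {
        for (int i = 0; i < a.length - 1; i++)
            for (int j = 0; j < a.length - 1 - i; j++)
                if (a[j][0] > a[j + 1][0]) {   // strict >, so equal values never swap
                    int[] tmp = a[j]; a[j] = a[j + 1]; a[j + 1] = tmp;
                }
    }

    static void selectionSort(int[][] a) {
        for (int i = 0; i < a.length; i++) {
            int min = i;
            for (int j = i + 1; j < a.length; j++)
                if (a[j][0] < a[min][0]) min = j;
            int[] tmp = a[i]; a[i] = a[min]; a[min] = tmp;   // this swap can jump over an equal value
        }
    }
}
```

The first swap in selection sort moves 2 into position 0 and throws the leading 5 behind the other 5, which is exactly the instability described above.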

You might ask, what does it matter which three comes first or which three comes after, what does it matter if it’s stable or not? Why do we want to look at the stability of sorting algorithms?

In data structures and algorithms material, sorting is usually explained with integers as the example, but in real software development we tend to sort not bare integers but a set of objects, ordered by some key of the object.

For example, suppose we want to sort the "orders" in an e-commerce trading system. An order has two attributes: the order time and the order amount. Say we have 100,000 orders and we want them sorted by amount from smallest to largest, with orders of the same amount ordered by order time from earliest to latest. How do we handle a sorting requirement like this?

The first method that comes to mind: sort the order data by amount, then iterate over the sorted data and, within each group of equal amounts, sort by order time. The idea is not hard to understand, but it can be messy to implement.

With a stable sorting algorithm, this problem can be solved very concisely. The solution: first sort the orders by order time (note: by time, not by amount). Then re-sort them by amount using a stable sorting algorithm. After these two passes, the orders are sorted by amount from smallest to largest, and orders with the same amount are sorted by time from earliest to latest. Why is that?

A stable sorting algorithm keeps two objects with the same amount in their existing relative order after sorting. After the first pass, all orders are ordered by time from earliest to latest. Because the second pass uses a stable algorithm, orders with equal amounts still appear in time order after it.
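The two-pass idea can be sketched in a few lines of Java (the Order class and the sample data are my own illustration, not from the article; java.util.List.sort is documented to be a stable sort, which is what makes the second pass safe):

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class OrderSortDemo {
    static class Order {
        final long time;   // order timestamp (smaller = earlier)
        final int amount;  // order amount
        Order(long time, int amount) { this.time = time; this.amount = amount; }
        public String toString() { return "(" + amount + "@" + time + ")"; }
    }

    public static void main(String[] args) {
        List<Order> orders = new ArrayList<>(List.of(
                new Order(3, 100), new Order(1, 200), new Order(2, 100)));
        // Pass 1: sort by order time.
        orders.sort(Comparator.comparingLong(o -> o.time));
        // Pass 2: stable sort by amount; equal amounts keep their time order.
        orders.sort(Comparator.comparingInt(o -> o.amount));
        System.out.println(orders);  // [(100@2), (100@3), (200@1)]
    }
}
```

The two orders with amount 100 come out in time order (time 2 before time 3) without any per-group post-processing.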

At this point, the framework for analyzing a "sorting algorithm" is complete. Got it? Next, we put the analysis into practice.

Start analyzing the "bubble sort" algorithm

1. Bubble sort description

Bubble sort description: bubble sort only operates on pairs of adjacent data. Each bubbling pass compares adjacent elements and checks whether they satisfy the required order; if they don't, it swaps them. One pass moves at least one element to its final position, so after n passes, n numbers are sorted.

2. Graphical bubble sort

3. Code to achieve bubble sort
package BubbleSort;

import java.util.Arrays;

public class generalBubble {
    public static void main(String[] args) {
        int[] arr = new int[] {5, 7, 2, 9, 4, 1, 0, 5, 8, 7};
        System.out.println(Arrays.toString(arr));
        bubbleSort(arr);
        System.out.println(Arrays.toString(arr));
    }

    // Bubble sort
    public static void bubbleSort(int[] arr) {
        // Outer loop controls how many passes are made
        for (int i = 0; i < arr.length - 1; i++) {
            // Inner loop controls the number of comparisons in this pass
            for (int j = 0; j < arr.length - 1 - i; j++) {
                if (arr[j] > arr[j + 1]) {
                    int temp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = temp;
                }
            }
        }
    }
}

Test effect:

4. Code optimization bubble sort

In fact, this bubbling process can be optimized: if a pass performs no swaps at all, the array is already in order and the remaining passes can be skipped. In the example below, six elements become fully sorted after only four passes.

// Bubble sort; a is the array, n is the array size
public void bubbleSort(int[] a, int n) {
    if (n <= 1) return;

    for (int i = 0; i < n; ++i) {
        // Flag for exiting the bubble loop early
        boolean flag = false;
        for (int j = 0; j < n - i - 1; ++j) {
            if (a[j] > a[j + 1]) { // Swap
                int tmp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = tmp;
                flag = true; // Indicates a data swap happened
            }
        }
        if (!flag) break; // No swaps in this pass: already sorted, exit early
    }
}

Now, with the three aspects of the sorting algorithm I just analyzed, let’s look at the bubble sorting algorithm.

First: is bubble sort an in-place sorting algorithm?

First of all, recall that an in-place sorting algorithm is one with O(1) space complexity; I mentioned this above (go back and re-read it if you skimmed past...).

The bubbling process only swaps adjacent elements and needs only constant-level temporary space, so its space complexity is O(1): bubble sort is an in-place sorting algorithm.

Second: is bubble sort a stable sorting algorithm?

In bubble sort, only a swap can change the relative order of two elements. To guarantee stability, we simply don't swap adjacent elements of equal value; elements with equal values then keep their original order after sorting, so bubble sort is a stable sorting algorithm.

Third: what is the time complexity of bubble sort?

In the best case, the data is already in order: with the early-exit flag, a single bubbling pass suffices, so the best-case time complexity is O(n). In the worst case, the data is in reverse order and we need n bubbling passes, so the worst case is O(n²); the average case is also O(n²).

Start analyzing the “insertion sort” algorithm

1. Insertion sort description

Insertion sort is a simple and intuitive sorting algorithm. It builds an ordered sequence: for each unsorted element, it scans the sorted part from back to front, finds the appropriate position, and inserts the element there. Implementations usually sort in place (using only O(1) extra space), so while scanning from back to front, the sorted elements are shifted backward one step at a time to make room for the element being inserted.

2. Graphical insertion sort

3. Code to achieve insertion sort
import java.util.Arrays;

public class InsertSort {

	public static void main(String[] args) {
		int[] arr = new int[] {5, 3, 2, 8, 5, 9, 1, 0};
		insertSort(arr);
		System.out.println(Arrays.toString(arr));
	}

	// Insertion sort
	public static void insertSort(int[] arr) {
		// Iterate over all the numbers
		for (int i = 1; i < arr.length; i++) {
			// If the current number is smaller than the previous number
			if (arr[i] < arr[i - 1]) {
				// Save the number being inserted
				int temp = arr[i];
				int j;
				// Walk backward over the sorted part, shifting larger elements one slot right
				for (j = i - 1; j >= 0 && temp < arr[j]; j--) {
					arr[j + 1] = arr[j];
				}
				// Place the saved value into the gap left behind
				arr[j + 1] = temp;
			}
		}
	}
}

Now, with the three aspects of sorting that I’ve been looking at, let’s look at insertion sort.

First: is insertion sort an in-place sort algorithm?

It is obvious from the implementation process that the insertion sort algorithm does not require additional storage space to run, so the space complexity is O(1), that is, this is an in-place sort algorithm.

Second: Is insertion sort a stable sorting algorithm?

In insertion sort, for elements with equal values, we can choose to insert the element that appears later after the elements that appeared before it, so the original order remains unchanged: insertion sort is a stable sorting algorithm.

Third: What is the time complexity of insert sort?

If the data to be sorted is already in order, we don't need to move any data. Searching for the insertion position from back to front in the ordered part, we need only one comparison per element to confirm it is already in place, so the best-case time complexity is O(n). Note that this assumes the ordered part is scanned from back to front.

If the array is in reverse order, each insertion places the new element at the first position of the array, so a large amount of data must be shifted; the worst-case time complexity is therefore O(n²).

Remember the average time complexity of inserting one element into an array? That's right: O(n). Insertion sort performs n such insertions, so its average time complexity is O(n²).

Start analyzing the “selection Sort algorithm”

1. Selection sort description

Selection sort is a simple and intuitive sorting algorithm. It works by first picking the smallest (or largest) element from the data to be sorted and storing it at the start of the sequence, then repeatedly finding the smallest (or largest) element among the remaining unsorted elements and appending it to the end of the sorted part, until no unsorted elements remain. Selection sort is an unstable sorting method.

2. Graphical selection sorting

3. Code to achieve selection sorting
import java.util.Arrays;

public class SelectSort {

	public static void main(String[] args) {
		int[] arr = new int[] {3, 4, 5, 7, 1, 2, 0, 3, 6, 8};
		selectSort(arr);
		System.out.println(Arrays.toString(arr));
	}

	// Selection sort
	public static void selectSort(int[] arr) {
		// Iterate over all the numbers
		for (int i = 0; i < arr.length; i++) {
			int minIndex = i;
			// Compare the current number with all subsequent numbers, recording the index of the smallest
			for (int j = i + 1; j < arr.length; j++) {
				// If a later number is smaller than the smallest recorded so far
				if (arr[minIndex] > arr[j]) {
					// Record its index
					minIndex = j;
				}
			}
			// If the smallest element is not already at position i, swap it into place
			if (i != minIndex) {
				int temp = arr[i];
				arr[i] = arr[minIndex];
				arr[minIndex] = temp;
			}
		}
	}
}
4. Analyze the selection sorting algorithm

Selection sort is an in-place, unstable sorting algorithm. Best time complexity: T(n) = O(n²); worst time complexity: T(n) = O(n²); average time complexity: T(n) = O(n²).

Start analyzing the “Shell sort” algorithm

1. Shell sort description

Shell sort (named after Donald Shell, and sometimes mistranslated as “Hill sort”) is a variant of insertion sort: a more efficient version of simple insertion sort, also known as diminishing increment sort, and one of the first algorithms to break through O(n²). It differs from insertion sort in that it compares elements that are far apart first. Shell sort groups the records by index using a certain increment (gap) and sorts each group with direct insertion sort. As the increment shrinks, each group contains more and more elements; when the increment reaches 1, the whole file forms a single group and the algorithm terminates. The increments typically go:
1. Pick an initial increment gap = length / 2.
2. Keep shrinking it: gap = gap / 2, i.e. n/2, (n/2)/2, ..., 1.
A little dizzy? Let's look at the diagram haha ~

3. Code to achieve Shell sort
import java.util.Arrays;

public class ShellSort {

	public static void main(String[] args) {
		int[] arr = new int[] {3, 5, 2, 7, 8, 1, 2, 0, 4, 7, 4, 3, 8};
		System.out.println(Arrays.toString(arr));
		shellSort(arr);
		System.out.println(Arrays.toString(arr));
	}

	public static void shellSort(int[] arr) {
		int k = 1;
		// Iterate over all the gap sizes
		for (int d = arr.length / 2; d > 0; d /= 2) {
			// Iterate over all elements from index d onward
			for (int i = d; i < arr.length; i++) {
				// Compare each element with the earlier elements of its group
				for (int j = i - d; j >= 0; j -= d) {
					// If the current element is larger than the element one gap later
					if (arr[j] > arr[j + d]) {
						int temp = arr[j];
						arr[j] = arr[j + d];
						arr[j + d] = temp;
					}
				}
			}
			System.out.println("Round " + k + " result: " + Arrays.toString(arr));
			k++;
		}
	}
}
4. Analyze the Shell sort algorithm

Shell sort is an in-place, unstable sorting algorithm. Its time complexity depends on the gap sequence; for the n/2, n/4, ..., 1 sequence used here, the best case is O(n log n) and the worst case is O(n²), with the average commonly quoted as around O(n^1.3).

Start analyzing “Quicksort algorithm”

1. Quicksort description

We often shorten it to “quick sort”. Quicksort also uses the idea of divide and conquer. At first glance it looks a bit like merge sort, but the idea is completely different: one partitioning pass separates the records into two independent parts such that every key in one part is smaller than every key in the other; sorting the two parts then orders the whole sequence. The general steps of quicksort:
1. Pick an element from the sequence as the pivot; a simple choice is the first element.
2. Reorder the array so that all elements smaller than the pivot come before it and all elements larger come after it (equal elements can go to either side). After this pass, the pivot sits in its final position; this is called the partition operation.
3. Recursively sort the subsequence of elements smaller than the pivot and the subsequence of elements larger than it.

2. Graphical quicksort

3. Code to achieve quicksort
import java.util.Arrays;

public class QuickSort {

	public static void main(String[] args) {
		int[] arr = new int[] {3, 4, 6, 7, 2, 7, 2, 8, 0, 9, 1};
		quickSort(arr, 0, arr.length - 1);
		System.out.println(Arrays.toString(arr));
	}

	public static void quickSort(int[] arr, int start, int end) {
		if (start < end) {
			// Use the first element of the range as the pivot
			int stard = arr[start];
			// Indices walking in from both ends
			int low = start;
			int high = end;
			// Move numbers smaller than the pivot left and larger ones right
			while (low < high) {
				// Skip numbers on the right that are >= the pivot
				while (low < high && stard <= arr[high]) {
					high--;
				}
				// Move the smaller right-side number into the left slot
				arr[low] = arr[high];
				// Skip numbers on the left that are <= the pivot
				while (low < high && arr[low] <= stard) {
					low++;
				}
				arr[high] = arr[low];
			}
			// Drop the pivot into its final position
			arr[low] = stard;
			// Sort the numbers smaller than the pivot (the pivot itself is already placed)
			quickSort(arr, start, low - 1);
			// Sort the numbers larger than the pivot
			quickSort(arr, low + 1, end);
		}
	}
}
4. Analyze quicksort algorithms

Quicksort is an in-place, unstable sorting algorithm. Best time complexity: T(n) = O(n log n); worst time complexity: T(n) = O(n²) (for example, already-sorted input with the first element as pivot); average time complexity: T(n) = O(n log n).
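A side note on that worst case: with the first element always chosen as the pivot, already-sorted input degrades to O(n²). A common mitigation, sketched below as my own addition rather than part of the article's code, is to swap a randomly chosen element into the pivot slot before partitioning; the rest mirrors the partition scheme shown above.

```java
import java.util.Arrays;
import java.util.Random;

public class RandomPivotQuickSort {
    private static final Random RAND = new Random();

    public static void quickSort(int[] arr, int start, int end) {
        if (start < end) {
            // Pick a random element in [start, end] and move it to the front as the pivot
            int p = start + RAND.nextInt(end - start + 1);
            int t = arr[start]; arr[start] = arr[p]; arr[p] = t;

            int stard = arr[start];
            int low = start, high = end;
            while (low < high) {
                while (low < high && stard <= arr[high]) high--;
                arr[low] = arr[high];
                while (low < high && arr[low] <= stard) low++;
                arr[high] = arr[low];
            }
            arr[low] = stard;
            quickSort(arr, start, low - 1);
            quickSort(arr, low + 1, end);
        }
    }

    public static void main(String[] args) {
        int[] arr = {3, 4, 6, 7, 2, 7, 2, 8, 0, 9, 1};
        quickSort(arr, 0, arr.length - 1);
        System.out.println(Arrays.toString(arr));
        // [0, 1, 2, 2, 3, 4, 6, 7, 7, 8, 9]
    }
}
```

With a random pivot, no fixed input can reliably trigger the worst case; the expected running time is O(n log n) regardless of the initial order.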

Start analyzing the “merge sort” algorithm

Merge sort is an efficient sorting algorithm based on the merge operation and a typical application of divide and conquer. Ordered subsequences are merged to obtain a fully ordered sequence: first make each subsequence ordered, then make the merged segments ordered. Merging two ordered lists into one ordered list is called a two-way merge.

1. Merge sort description

The merge operation works as follows:
First step: allocate a buffer whose size is the sum of the two sorted sequences; it will hold the merged sequence.
Second step: set two pointers, each initially at the start of one of the sorted sequences.
Third step: compare the elements the two pointers point to, copy the smaller one into the merge buffer, and advance that pointer. Repeat this step until one pointer runs past the end of its sequence, then copy all remaining elements of the other sequence directly to the end of the merge buffer.

2. Graphical merge sort

3. Code to achieve merge sort
import java.util.Arrays;

public class MergeSort {

	public static void main(String[] args) {
		int[] arr = new int[] {1, 3, 5, 2, 4, 6, 8, 10};
		System.out.println(Arrays.toString(arr));
		mergeSort(arr, 0, arr.length - 1);
		System.out.println(Arrays.toString(arr));
	}

	// Merge sort
	public static void mergeSort(int[] arr, int low, int high) {
		int middle = (high + low) / 2;
		if (low < high) {
			// Handle the left half
			mergeSort(arr, low, middle);
			// Handle the right half
			mergeSort(arr, middle + 1, high);
			// Merge the two halves
			merge(arr, low, middle, high);
		}
	}

	public static void merge(int[] arr, int low, int middle, int high) {
		// Temporary array to hold the merged result
		int[] temp = new int[high - low + 1];
		// Index walking through the first half
		int i = low;
		// Index walking through the second half
		int j = middle + 1;
		// Index for writing into the temporary array
		int index = 0;
		// Take the smaller head element of the two halves each time
		while (i <= middle && j <= high) {
			// The first half's element is smaller (or equal, which keeps the sort stable)
			if (arr[i] <= arr[j]) {
				// Put the smaller value into the temporary array
				temp[index] = arr[i];
				i++;
			} else {
				temp[index] = arr[j];
				j++;
			}
			index++;
		}
		// Copy whatever remains in either half
		while (j <= high) {
			temp[index] = arr[j];
			j++;
			index++;
		}
		while (i <= middle) {
			temp[index] = arr[i];
			i++;
			index++;
		}
		// Copy the merged data back into the original array
		for (int k = 0; k < temp.length; k++) {
			arr[k + low] = temp[k];
		}
	}

}
4. Analyze the merge sort algorithm

Merge sort is a stable sorting algorithm, though not an in-place one: it needs O(n) extra space for the merge buffer. Best, worst, and average time complexity are all T(n) = O(n log n), since this implementation always splits and merges regardless of the initial order.

Start analyzing “Radix Sort algorithm”

1. Radix sort description

Radix sort is a non-comparative sorting algorithm. It sorts digit by digit, starting from the lowest digit; its complexity is O(kN), where N is the length of the array and k is the number of digits in the largest number. Radix sort distributes the numbers into buckets by their lowest digit and collects them back in bucket order, then repeats with the next digit, and so on up to the highest digit. When records have attributes with different priorities, we sort by the lowest-priority attribute first and the highest-priority one last; the final order then ranks by the high-priority attribute, with ties broken by the lower-priority one.

2. Graphic radix sort


3. Code to achieve radix sort
import java.util.Arrays;

public class RadixSort {

	public static void main(String[] args) {
		int[] arr = new int[] {23, 6, 189, 45, 9, 287, 56, 1, 798, 34, 65, 652, 5};
		radixSort(arr);
		System.out.println(Arrays.toString(arr));
	}

	public static void radixSort(int[] arr) {
		// Find the largest number in the array
		int max = Integer.MIN_VALUE;
		for (int i = 0; i < arr.length; i++) {
			if (arr[i] > max) {
				max = arr[i];
			}
		}
		// Number of digits in the largest number = number of distribution rounds
		int maxLength = (max + "").length();
		// Buckets used to temporarily hold the numbers, one per digit 0-9
		int[][] temp = new int[10][arr.length];
		// Number of values currently stored in each bucket
		int[] counts = new int[10];
		// One round per digit of the longest number
		for (int i = 0, n = 1; i < maxLength; i++, n *= 10) {
			// Distribute each number by its current digit
			for (int j = 0; j < arr.length; j++) {
				// Extract the current digit
				int ys = arr[j] / n % 10;
				// Put the number into the bucket for that digit
				temp[ys][counts[ys]] = arr[j];
				// Update the bucket's count
				counts[ys]++;
			}
			// Position in arr where the next element goes
			int index = 0;
			// Collect the numbers back from the buckets in order
			for (int k = 0; k < counts.length; k++) {
				// Skip empty buckets
				if (counts[k] != 0) {
					// Copy this bucket's elements back
					for (int l = 0; l < counts[k]; l++) {
						arr[index] = temp[k][l];
						// Advance the output position
						index++;
					}
					// Reset the bucket's count for the next round
					counts[k] = 0;
				}
			}
		}
	}

}
4. Analyze the radix sorting algorithm

Radix sort is a stable sorting algorithm. Best, worst, and average time complexity are all T(n) = O(n * k), where k is the number of digits in the largest number.

Start analyzing “heapsort algorithm”

1. Heap sort description

Heapsort refers to a sorting algorithm designed around the heap data structure. A heap is a nearly complete binary tree that satisfies the heap property: the key of each child node is always smaller than (or, in a min-heap, larger than) that of its parent.

In a heap, the maximum value is always at the root node (when heaps back priority queues as min-heaps, the minimum is at the root). Heapsort relies on the following operations:
1. Max Heapify: sift a node down so that each of its children is smaller than its parent.
2. Build Max Heap: apply Max Heapify bottom-up so the whole array satisfies the heap property.
3. HeapSort: repeatedly swap the root (the maximum) with the last element of the heap, shrink the heap by one, and recursively re-run Max Heapify on the root.

2. Diagram heap sort

3. Code heap sort
import java.util.Arrays;

public class HeapSort {

	public static void main(String[] args) {
		int[] arr = new int[] {9, 6, 8, 7, 0, 1, 10, 4, 2};
		heapSort(arr);
		System.out.println(Arrays.toString(arr));
	}

	public static void heapSort(int[] arr) {
		// Start at the parent of the last node (the last non-leaf node)
		int start = (arr.length - 1) / 2;
		// Build the max heap
		for (int i = start; i >= 0; i--) {
			maxHeap(arr, arr.length, i);
		}
		// Repeatedly swap the root (the maximum) with the last item of the heap
		for (int i = arr.length - 1; i > 0; i--) {
			int temp = arr[0];
			arr[0] = arr[i];
			arr[i] = temp;
			// Re-heapify the shrunken heap
			maxHeap(arr, i, 0);
		}
	}

	public static void maxHeap(int[] arr, int size, int index) {
		// Left child node
		int leftNode = 2 * index + 1;
		// Right child node
		int rightNode = 2 * index + 2;
		int max = index;
		// Compare with the two child nodes to find the largest node
		if (leftNode < size && arr[leftNode] > arr[max]) {
			max = leftNode;
		}
		if (rightNode < size && arr[rightNode] > arr[max]) {
			max = rightNode;
		}
		// Swap if a child is larger
		if (max != index) {
			int temp = arr[index];
			arr[index] = arr[max];
			arr[max] = temp;
			// The swap may break the heap below, so sift down recursively
			maxHeap(arr, size, max);
		}
	}
}
4. Analyze the heap sort algorithm

Heap sort is an in-place, unstable sorting algorithm. Best time complexity: T(n) = O(n log n); worst time complexity: T(n) = O(n log n); average time complexity: T(n) = O(n log n).

Why is insert sort more popular than bubble sort?

Bubble sort and insertion sort both have O(n²) time complexity, and both are in-place sorting algorithms. So why is insertion sort more popular than bubble sort?

When we analyzed bubble sort and insertion sort, we noted that no matter how bubble sort is optimized, the number of element swaps is fixed: it equals the number of inversions (the degree of disorder) in the original data. The same goes for insertion sort: no matter how it is optimized, the number of element moves equals the number of inversions in the original data.

However, from the code's point of view, a data swap in bubble sort is more expensive than a data move in insertion sort: a swap needs three assignments, while a move needs only one. Look at these two operations:

Data exchange operation in bubble sort:
if (a[j] > a[j+1]) { // Swap
    int tmp = a[j];
    a[j] = a[j+1];
    a[j+1] = tmp;
    flag = true;
}

Data move operation in insertion sort:
if (a[j] > value) {
    a[j+1] = a[j]; // Data move
} else {
    break;
}

If we roughly count the time to execute one assignment statement as one unit_time, and then sort the same array with K inversions using each algorithm: bubble sort needs K swaps at three assignments each, for a total of 3*K units of time, while insertion sort's data movement takes only K units of time.

This is just theoretical analysis. To check it experimentally, I wrote a performance comparison for the bubble sort and insertion sort Java code above: randomly generate 10,000 arrays, each containing 200 elements, then sort them with each algorithm on my machine. Bubble sort takes about 700 ms to finish, while insertion sort takes only about 100 ms!
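For readers who want to reproduce the comparison, here is a rough benchmark sketch (the array count, element range, random seed, and timing method are my own choices; absolute times will differ from machine to machine):

```java
import java.util.Random;

public class BubbleVsInsert {
    public static void main(String[] args) {
        int[][] data1 = randomArrays(10000, 200, 42);
        int[][] data2 = deepCopy(data1); // identical input for a fair comparison

        long t0 = System.nanoTime();
        for (int[] a : data1) bubbleSort(a);
        long t1 = System.nanoTime();
        for (int[] a : data2) insertSort(a);
        long t2 = System.nanoTime();

        System.out.println("bubble: " + (t1 - t0) / 1_000_000 + " ms");
        System.out.println("insert: " + (t2 - t1) / 1_000_000 + " ms");
    }

    // Generate n random arrays of the given size from a fixed seed (reproducible runs)
    static int[][] randomArrays(int n, int size, long seed) {
        Random r = new Random(seed);
        int[][] out = new int[n][size];
        for (int[] a : out)
            for (int i = 0; i < size; i++) a[i] = r.nextInt(10000);
        return out;
    }

    static int[][] deepCopy(int[][] src) {
        int[][] out = new int[src.length][];
        for (int i = 0; i < src.length; i++) out[i] = src[i].clone();
        return out;
    }

    // Optimized bubble sort (early exit when a pass swaps nothing): 3 assignments per swap
    static void bubbleSort(int[] a) {
        for (int i = 0; i < a.length; i++) {
            boolean swapped = false;
            for (int j = 0; j < a.length - i - 1; j++) {
                if (a[j] > a[j + 1]) {
                    int tmp = a[j]; a[j] = a[j + 1]; a[j + 1] = tmp;
                    swapped = true;
                }
            }
            if (!swapped) break;
        }
    }

    // Insertion sort: a single assignment per element move
    static void insertSort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int value = a[i];
            int j = i - 1;
            while (j >= 0 && a[j] > value) {
                a[j + 1] = a[j];
                j--;
            }
            a[j + 1] = value;
        }
    }
}
```

Both algorithms receive identical copies of the same random input, so the difference in elapsed time reflects only the swap-versus-move cost discussed above.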

So although bubble sort and insertion sort share the same O(n²) time complexity, insertion sort is clearly the better choice when we want to squeeze out performance. There is also plenty of room to optimize the insertion idea further; we have only covered the most basic version. If you are interested in optimizing insertion sort, revisit Shell sort on your own.

Below are the analysis diagrams of the eight classical algorithms:

That wraps up the analysis of the eight classical algorithms above, all of which are based on arrays. Would these sorting algorithms still work if the data were stored in linked lists? If so, what would the corresponding time and space complexity be? I look forward to comments from the experts.

If this article is helpful to you, even a little bit, please give a thumbs up. Thank you

Welcome to follow my official account, where we can discuss technology together, yearn for technology, and pursue technology... As the saying goes, those who come are friends...