For the last few lectures in computer science, we have been discussing sorting and efficiency. There are multiple types of sorting algorithms, each with its own efficiency. Some perform better on shorter lists, while others perform better on longer ones, and performance also depends on how sorted the list already is.
Bubble sort is an algorithm that pushes the largest item to the end of the list and repeats this process for each remaining item. Since the first pass makes n-1 comparisons, the next makes n-2, and so on down to a final comparison of two elements, the algorithm makes about n*n comparisons in the worst case.
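A minimal sketch of the idea in Python (the function name is my own choice, not something from lecture):

```python
def bubble_sort(items):
    """Repeatedly bubble the largest remaining item to the end of the list."""
    n = len(items)
    for pass_end in range(n - 1, 0, -1):      # n-1 comparisons, then n-2, ...
        for i in range(pass_end):
            if items[i] > items[i + 1]:
                # swap neighbours so the larger one moves toward the end
                items[i], items[i + 1] = items[i + 1], items[i]
    return items
```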
Selection sort is similar to bubble sort. However, instead of swapping positions after each comparison (when a swap is needed), it only moves the largest element of the unsorted section into place at the end of each pass (with each pass skipping the already sorted section). The number of comparisons is still about n*n, but it is a more efficient algorithm in practice because it makes at most n exchanges. This helps for lists that already have their largest elements near the end, since fewer replacements are needed.
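A rough sketch of selection sort under the same assumptions as above:

```python
def selection_sort(items):
    """Find the largest item in the unsorted section and swap it into place."""
    for end in range(len(items) - 1, 0, -1):  # end of the unsorted section
        max_index = 0
        for i in range(1, end + 1):
            if items[i] > items[max_index]:
                max_index = i
        # only one exchange per pass, unlike bubble sort's many swaps
        items[end], items[max_index] = items[max_index], items[end]
    return items
```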
Insertion sort is similar to selection sort, but instead of building the sorted section backwards from the end of the list, it grows a sorted section at the start of the list and inserts each new item into its correct position there. Again the efficiency is about n*n in the worst case, but this method works well for lists that are already sorted, or nearly sorted, at the beginning.
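A sketch of insertion sort, again with names of my own choosing:

```python
def insertion_sort(items):
    """Grow a sorted section at the start of the list, inserting each new item into it."""
    for i in range(1, len(items)):
        current = items[i]
        pos = i
        # shift larger items in the sorted section one slot to the right
        while pos > 0 and items[pos - 1] > current:
            items[pos] = items[pos - 1]
            pos -= 1
        items[pos] = current                  # drop the item into its gap
    return items
```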
Merge sort breaks each list into smaller parts and recursively sorts those parts, eventually merging the sorted smaller lists into a larger sorted list. This is one of the sorts that uses recursion: it calls itself on each half, splitting again and again until each sublist has size 1. Then the sublists are merged back together into sorted lists and returned. Each level of merging does about n comparisons and there are about log n levels of recursive calls, so the sorting efficiency is n log n.
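A sketch of the recursive split-then-merge structure described above (this version returns a new list rather than sorting in place):

```python
def merge_sort(items):
    """Recursively split the list in half, sort each half, then merge the halves."""
    if len(items) <= 1:
        return items                          # a list of size 1 is already sorted
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # merge the two sorted halves into one sorted list
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```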
Quick sort is another recursive sort, but it does not need the extra storage that merge sort uses. A pivot point is selected (often the first element, or one chosen at random). All items less than the pivot are put to its left and all items greater than it are put to its right, which guarantees that the pivot ends up in its correct slot in the final list. The sort is then called again on each sublist, returning a sorted list. On average there are about log n levels of recursion, each doing about n comparisons, giving an efficiency of n log n.
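A simplified sketch of the partition-and-recurse idea, assuming the first element is used as the pivot. Note that this version builds new lists for clarity rather than partitioning in place, so unlike a true in-place quicksort it does use extra storage:

```python
def quick_sort(items):
    """Partition around a pivot, then recursively sort the two partitions."""
    if len(items) <= 1:
        return items
    pivot = items[0]                          # pivot: first element (could also be random)
    less = [x for x in items[1:] if x < pivot]
    greater_or_equal = [x for x in items[1:] if x >= pivot]
    # the pivot lands in its final sorted position between the two partitions
    return quick_sort(less) + [pivot] + quick_sort(greater_or_equal)
```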
A note about efficiency: big O notation is used to describe the worst-case efficiency of a program. For example, traversing a 2-D array with the same width and height takes about n*n steps (n rows in the outer list, and n items in each row). This can be written as O(n*n).
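A small illustration of that O(n*n) traversal (the function name and grid layout are just for the example):

```python
def traverse_square_grid(grid):
    """Visit every cell of an n-by-n grid: n rows times n cells each, so O(n*n) steps."""
    count = 0
    for row in grid:          # n rows in the outer list
        for cell in row:      # n cells per row
            count += 1
    return count

# A 3x3 grid takes 3*3 = 9 steps to traverse.
print(traverse_square_grid([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))  # prints 9
```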