1.) Description:
a.) Insertion sort is a simple sorting algorithm, a comparison sort in which the sorted array (or list) is built one entry at a time. It is much less efficient on large lists than more advanced algorithms such as quicksort, heapsort, or merge sort.
b.) Bucket sort, or bin sort, is a sorting algorithm that works by partitioning an array into a number of buckets. Each bucket is then sorted individually, either using a different sorting algorithm, or by recursively applying the bucket sorting algorithm. It is a cousin of radix sort in the most-to-least significant digit flavour. Bucket sort is a generalization of pigeonhole sort. Since bucket sort is not a comparison sort, the Ω(n log n) lower bound does not apply; its complexity estimates instead involve the number of buckets.
c.) Bubble sort is a simple comparison sort that repeatedly steps through the list, compares each pair of adjacent elements, and swaps them if they are in the wrong order, passing over the list until no swaps are needed. It is easy to implement but takes O(n²) time in the average and worst cases.
d.) Quicksort is a well-known sorting algorithm developed by C. A. R. Hoare that, on average, makes Θ(n log n) comparisons to sort n items. However, in the worst case, it makes Θ(n²) comparisons. Typically, quicksort is significantly faster in practice than other Θ(n log n) algorithms, because its inner loop can be implemented efficiently on most architectures, and for most real-world data it is possible to make design choices that minimize the probability of requiring quadratic time. Quicksort is a comparison sort and, in efficient implementations, is not a stable sort.
e.) Heapsort is a comparison-based sorting algorithm and part of the selection sort family. Although somewhat slower in practice on most machines than a good implementation of quicksort, it has the advantage of a worst-case Θ(n log n) runtime. Heapsort is an in-place algorithm, but it is not a stable sort.
f.) Merge sort is an O(n log n) comparison-based sorting algorithm. In most implementations it is stable, meaning that it preserves the input order of equal elements in the sorted output. It is an example of the divide-and-conquer algorithmic paradigm. It was invented by John von Neumann in 1945.
g.) Shell sort is a sorting algorithm that is a generalization of insertion sort, based on two observations:
- insertion sort is efficient if the input is "almost sorted", and
- insertion sort is typically inefficient because it moves values just one position at a time.
Shell sort addresses the second point by first comparing and exchanging elements that are far apart, using a shrinking gap (increment), so that the array is "almost sorted" by the time the gap reaches 1 and the final pass becomes an ordinary insertion sort.
2.) Pseudocode:
a.)
insertionSort(array A)
begin
    for i := 1 to length[A]-1 do
    begin
        value := A[i];
        j := i-1;
        while j ≥ 0 and A[j] > value do
        begin
            A[j + 1] := A[j];
            j := j-1;
        end;
        A[j+1] := value;
    end;
end;
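For reference, a minimal runnable Python version of the insertion-sort pseudocode above (Python and the name insertion_sort are illustrative, not part of the original):

def insertion_sort(a):
    # Grow a sorted prefix a[0..i-1]; on each pass, insert a[i] into it.
    for i in range(1, len(a)):
        value = a[i]
        j = i - 1
        # Shift larger entries one slot to the right to make room for value.
        while j >= 0 and a[j] > value:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = value
    return a

# e.g. insertion_sort([5, 2, 4, 6, 1, 3]) returns [1, 2, 3, 4, 5, 6]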
b.)
function bucket-sort(array, n) is
    buckets ← new array of n empty lists
    for i = 0 to (length(array)-1) do
        insert array[i] into buckets[msbits(array[i], k)]
    for i = 0 to n - 1 do
        next-sort(buckets[i])
    return the concatenation of buckets[0], ..., buckets[n-1]
(here msbits(x, k) returns the k most significant bits of x, so each bucket covers a contiguous range of values)
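A runnable Python sketch of the same idea, assuming non-negative numeric input; the bucket index is taken from each value's size relative to the maximum (standing in for msbits(x, k)), and Python's built-in sorted() plays the role of next-sort:

def bucket_sort(a, n_buckets=10):
    # Assumes non-negative numbers; negative values would need an offset first.
    if len(a) <= 1:
        return list(a)
    top = max(a)
    buckets = [[] for _ in range(n_buckets)]
    for x in a:
        # Map each value into one of the n_buckets slots (0 .. n_buckets-1).
        idx = int(x / top * (n_buckets - 1)) if top > 0 else 0
        buckets[idx].append(x)
    result = []
    for b in buckets:
        result.extend(sorted(b))   # sort each bucket individually
    return result

# e.g. bucket_sort([0.42, 0.32, 0.23, 0.52, 0.25, 0.47, 0.51])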
c.)
procedure bubbleSort( A : list of sortable items ) defined as:
    do
        swapped := false
        for each i in 0 to length(A) - 2 inclusive do:
            if A[ i ] > A[ i + 1 ] then
                swap( A[ i ], A[ i + 1 ] )
                swapped := true
            end if
        end for
    while swapped
end procedure
(optimized version: the scan range shrinks by one after each pass, since the largest remaining element has already bubbled to the end)
procedure bubbleSort( A : list of sortable items ) defined as:
    n := length( A )
    do
        swapped := false
        n := n - 1
        for each i in 0 to n - 1 inclusive do:
            if A[ i ] > A[ i + 1 ] then
                swap( A[ i ], A[ i + 1 ] )
                swapped := true
            end if
        end for
    while swapped
end procedure
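A minimal runnable Python version of the optimized bubble sort above (the name bubble_sort is just for this sketch):

def bubble_sort(a):
    # After each pass the largest remaining element has bubbled to the end,
    # so the scanned range shrinks by one; stop early when a pass swaps nothing.
    n = len(a)
    swapped = True
    while swapped:
        swapped = False
        n -= 1
        for i in range(n):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
    return a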
d.)
function quicksort(array)
    var list less, greater
    if length(array) ≤ 1
        return array
    select and remove a pivot value pivot from array
    for each x in array
        if x ≤ pivot then append x to less
        else append x to greater
    return concatenate(quicksort(less), pivot, quicksort(greater))
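A short runnable Python sketch of this simple out-of-place quicksort; it takes the first element as the pivot, which is only one possible choice:

def quicksort(array):
    # Lists of length 0 or 1 are already sorted.
    if len(array) <= 1:
        return list(array)
    pivot, rest = array[0], array[1:]        # select and remove a pivot value
    less = [x for x in rest if x <= pivot]
    greater = [x for x in rest if x > pivot]
    # Recurse on both partitions and concatenate around the pivot.
    return quicksort(less) + [pivot] + quicksort(greater)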
e.) (zero-based)
function heapSort(a, count) is
    input: an unordered array a of length count
    (first place a in max-heap order)
    heapify(a, count)
    end := count - 1
    while end > 0 do
        (swap the root (maximum value) of the heap with the last element of the heap)
        swap(a[end], a[0])
        (decrease the size of the heap by one so that the previous max value will stay in its proper placement)
        end := end - 1
        (put the heap back in max-heap order)
        siftDown(a, 0, end)

function heapify(a, count) is
    (start is assigned the index in a of the last parent node)
    start := (count - 2) / 2
    while start ≥ 0 do
        (sift down the node at index start to the proper place such that all nodes below the start index are in heap order)
        siftDown(a, start, count - 1)
        start := start - 1
    (after sifting down the root all nodes/elements are in heap order)

function siftDown(a, start, end) is
    input: end represents the limit of how far down the heap to sift
    root := start
    while root * 2 + 1 ≤ end do         (while the root has at least one child)
        child := root * 2 + 1           (root*2+1 points to the left child)
        (if the child has a sibling and the child's value is less than its sibling's...)
        if child + 1 ≤ end and a[child] < a[child + 1] then
            child := child + 1          (...then point to the right child instead)
        if a[root] < a[child] then      (out of max-heap order)
            swap(a[root], a[child])
            root := child               (repeat to continue sifting down the child now)
        else
            return
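A runnable Python version of the zero-based heapsort pseudocode above (the function names are illustrative):

def sift_down(a, start, end):
    # Sift the node at index `start` down until the subtree rooted there
    # satisfies the max-heap property; `end` is the last index in the heap.
    root = start
    while root * 2 + 1 <= end:
        child = root * 2 + 1                       # left child
        if child + 1 <= end and a[child] < a[child + 1]:
            child += 1                             # right child is larger
        if a[root] < a[child]:
            a[root], a[child] = a[child], a[root]
            root = child
        else:
            return

def heap_sort(a):
    count = len(a)
    # heapify: sift down every parent node, starting from the last parent.
    for start in range((count - 2) // 2, -1, -1):
        sift_down(a, start, count - 1)
    # Repeatedly swap the max (root) to the end and shrink the heap.
    for end in range(count - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(a, 0, end - 1)
    return a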
f.)
function merge_sort(m)
    var list left, right, result
    if length(m) ≤ 1
        return m
    // This calculation is for 1-based arrays. For 0-based, use length(m)/2 - 1.
    var middle = length(m) / 2
    for each x in m up to middle
        add x to left
    for each x in m after middle
        add x to right
    left = merge_sort(left)
    right = merge_sort(right)
    result = merge(left, right)
    return result
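Since the pseudocode above leaves merge() undefined, here is a runnable Python sketch that includes a simple stable merge:

def merge(left, right):
    # Merge two already-sorted lists into one sorted list.
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:        # <= keeps equal elements in order (stable)
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])            # one of these tails is already empty
    result.extend(right[j:])
    return result

def merge_sort(m):
    if len(m) <= 1:
        return list(m)
    middle = len(m) // 2
    left = merge_sort(m[:middle])      # sort each half recursively
    right = merge_sort(m[middle:])
    return merge(left, right)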
g.)
input: an array a of length n
inc ← round(n/2)
while inc > 0 do:
    for i = inc .. n − 1 do:
        temp ← a[i]
        j ← i
        while j ≥ inc and a[j − inc] > temp do:
            a[j] ← a[j − inc]
            j ← j − inc
        a[j] ← temp
    inc ← round(inc / 2.2)
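A runnable Python version of the shell sort pseudocode, using the same shrinking increment sequence (start at n/2, then divide by 2.2 and round):

def shell_sort(a):
    n = len(a)
    inc = round(n / 2)
    while inc > 0:
        # Gapped insertion sort: compare each element with the entry
        # `inc` positions back instead of its immediate neighbour.
        for i in range(inc, n):
            temp = a[i]
            j = i
            while j >= inc and a[j - inc] > temp:
                a[j] = a[j - inc]
                j -= inc
            a[j] = temp
        inc = round(inc / 2.2)
    return a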
3.) References:
["Wiki"] http://en.wikipedia.org/wiki/Bubble_sort
["Wiki"] http://en.wikipedia.org/wiki/Insertion_sort
["Wiki"] http://en.wikipedia.org/wiki/Shell_sort
["Wiki"] http://en.wikipedia.org/wiki/Heapsort
["Wiki"] http://en.wikipedia.org/wiki/Merge_sort
["Wiki"] http://en.wikipedia.org/wiki/Quicksort
["Wiki"] http://en.wikipedia.org/wiki/Bucket_sort