Sorting arrays in linear time is possible, but only under certain restrictions: the input array has to satisfy some specific properties.

## Problem

Input: an array $A[1..n]$ of integers in $[1..R]$. That is, the integers in this array fall within a certain range bounded by $R$.

Output: a sorted version of $A$.

## Count sort

- Given our array $A$ and a range $1..R$, we create a new array $C[1..R]$. For each element $A[i]$, we use its *value* to index into $C$ and increment that entry by one. That is, $C[A[i]] = C[A[i]] + 1$.
- We then iterate through $C$, and for each index and its frequency, we write that index into a new array $B$, repeated frequency-many times. When done, $B$ holds all the elements of $A$ in sorted order.
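The two steps above can be sketched in Python as follows (a minimal sketch; the function name `count_sort` is our own, and we assume the values of $A$ are integers in $1..R$):

```python
def count_sort(A, R):
    """Sort a list of integers in the range 1..R."""
    C = [0] * (R + 1)            # C[v] = frequency of value v; O(R) to initialize
    for v in A:                  # O(n): index into C by *value* and increment
        C[v] += 1
    B = []
    for v in range(1, R + 1):    # O(R): emit each value v, C[v] times
        B.extend([v] * C[v])
    return B
```

Note that $B$ is built by walking $C$ in index order, so the output is sorted without ever comparing two elements of $A$.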

### Runtime analysis

$O(R)$ to initialize $C$ to all zeroes, $O(n)$ to iterate through $A$ and map elements to frequencies, and then $O(R)$ to iterate through $C$ to populate $B$ and duplicate elements as needed.

Total runtime is therefore $O(n+R)$. When $R = O(n)$, this algorithm runs in linear time.

## Radix sort

We begin with count sort.

- Instead of an array $C$ of numbers, we make $C$ an array of linked lists. Each linked list stores the instances of a number, along with an additional piece of info: the index of where each instance appeared in $A$. This lets us remember the order in which elements were inserted into $C$, i.e., it makes the sort stable.
- We represent each $A[i]∈A$ in a base-$k$ representation. This means that it takes at most $\lceil\log_{k}R\rceil$ digits to represent every possible number in $A$ in base $k$.
- Beginning with the least significant digit, we use count sort to sort on each digit (column). Because each pass is stable, elements that tie on the current digit keep the relative order established by the earlier passes; so after the pass on the most significant digit, the array is fully sorted.
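A sketch of these passes in Python (the name `radix_sort` is our own; per-digit buckets are plain Python lists standing in for the linked lists in $C$, and appending to them preserves insertion order, which is what makes each pass stable):

```python
def radix_sort(A, R, k):
    """Sort integers in 1..R by repeated stable count sort on base-k digits."""
    # Number of base-k digits needed to represent R: ceil(log_k R).
    digits = 1
    while k ** digits <= R:
        digits += 1
    for d in range(digits):                    # least significant digit first
        C = [[] for _ in range(k)]             # one bucket per digit value 0..k-1
        for v in A:
            C[(v // k ** d) % k].append(v)     # stable: append keeps prior order
        A = [v for bucket in C for v in bucket]
    return A
```

Each pass is exactly a count sort keyed on one digit, so with $k = 10$ and $R = 999$ the loop makes three passes over the data.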

### Runtime analysis

Count sort takes $O(n+k)$, and there are $\log_{k}R$ columns/digits, so the runtime is $O(n+k)⋅O(\log_{k}R)$.

Using logarithm rules ($\log_{k}R = \frac{\log R}{\log k}$), we get $O(n+k)⋅O\left(\frac{\log R}{\log k}\right)$. Choosing $k=n$ simplifies this to

$O(n)⋅O(\log_{n}R)$. For values $R≤n$, this simplifies down to $O(n)⋅O(1)=O(n)$.