The Computational Essence of Sorting Algorithms — Background WG 2.8 3.3 Final coalgebra
• the final object in this category, the final F-coalgebra, is the ‘greatest’ fixed point of F

                 unfold c
        C -------------------> ν F
        |                       |
        c                      out
        ↓                       ↓
       F C ------------------> F (ν F)
               F (unfold c)

• finality entails that there is a unique homomorphism to the final coalgebra from any coalgebra c, called unfold c
• final List-coalgebra: finite and infinite lists (in Set)
University of Oxford — Ralf Hinze 30-86
The Computational Essence of Sorting Algorithms — Background WG 2.8 3.3 Final coalgebra in Haskell
• in Haskell, ν f can be defined:
  newtype ν f = Out◦ { out :: f (ν f) }
• as an aside, Out◦ a will be written as ⌊ a ⌋
• since out is an isomorphism, we can turn the commuting diagram into a generic definition of unfold:
  unfold :: (Functor f) ⇒ (a → f a) → (a → ν f)
  unfold f = out◦ · map (unfold f) · f
• Haskell: initial algebras and final coalgebras coincide!
University of Oxford — Ralf Hinze 31-86
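• a concrete plain-GHC rendering of ν and unfold, for readers who want to experiment (the names ListF, Nu, Out', iterateN and takeN are my own, not from the slides):

  {-# LANGUAGE DeriveFunctor #-}

  -- base functor of lists over keys k
  data ListF k x = NilF | ConsF k x deriving Functor

  -- greatest fixed point: Out' plays the role of Out°, out is its inverse
  newtype Nu f = Out' { out :: f (Nu f) }

  unfold :: Functor f => (a -> f a) -> a -> Nu f
  unfold c = Out' . fmap (unfold c) . c

  -- an infinite inhabitant of the final ListF-coalgebra: the stream n, n+1, n+2, ...
  iterateN :: Int -> Nu (ListF Int)
  iterateN = unfold (\n -> ConsF n (n + 1))

  -- observe a finite prefix
  takeN :: Int -> Nu (ListF k) -> [k]
  takeN 0 _                   = []
  takeN n (Out' NilF)         = []
  takeN n (Out' (ConsF k ks)) = k : takeN (n - 1) ks

  -- ghci> takeN 5 (iterateN 3)  ==>  [3,4,5,6,7]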
The Computational Essence of Sorting Algorithms — Background WG 2.8 3.4 Embedding initial into final
Least fixed points can be embedded into greatest fixed points.
  upcast :: (Functor f) ⇒ µ f → ν f
How to define upcast? We can write it as a fold . . .
  fold (unfold c) : µ F → ν F
  unfold c        : F (ν F) → ν F
  c               : F (ν F) → F (F (ν F))
. . . or as an unfold:
  unfold (fold a) : µ F → ν F
  fold a          : µ F → F (µ F)
  a               : F (F (µ F)) → F (µ F)
Obvious candidates: c = map out and a = map in.
University of Oxford — Ralf Hinze 32-86
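• the same derivation, transcribed into a runnable plain-GHC sketch (Mu, Nu, ListF and the helper names are assumptions of mine); both candidate definitions of upcast type-check and agree:

  {-# LANGUAGE DeriveFunctor #-}

  data ListF k x = NilF | ConsF k x deriving Functor

  newtype Mu f = In   { inOp :: f (Mu f) }   -- least fixed point,   inOp plays in°
  newtype Nu f = Out' { out  :: f (Nu f) }   -- greatest fixed point, Out' plays Out°

  fold :: Functor f => (f a -> a) -> Mu f -> a
  fold a = a . fmap (fold a) . inOp

  unfold :: Functor f => (a -> f a) -> a -> Nu f
  unfold c = Out' . fmap (unfold c) . c

  -- as a fold, with algebra unfold (fmap out)
  upcast1 :: Functor f => Mu f -> Nu f
  upcast1 = fold (unfold (fmap out))

  -- as an unfold, with coalgebra fold (fmap In)
  upcast2 :: Functor f => Mu f -> Nu f
  upcast2 = unfold (fold (fmap In))

  -- upcast1 and upcast2 send the finite list below to the same Nu value
  example :: Nu (ListF Int)
  example = upcast1 (In (ConsF 1 (In (ConsF 2 (In NilF)))))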
The Computational Essence of Sorting Algorithms — Background WG 2.8
The coalgebra fold (map in) is the inverse of in; the algebra unfold (map out) is the inverse of out. Moreover,

              F upcast
   F (µ F) --------------> F (ν F)
      |                       |
      in                      out◦ = unfold (map out)
      ↓        upcast         ↓
     µ F  --------------->   ν F
      |                       |
      in◦ = fold (map in)     out
      ↓                       ↓
   F (µ F) --------------> F (ν F)
              F upcast

where upcast = fold (unfold (map out)) = unfold (fold (map in)).
(The triples ⟨ µ F, in, in◦ ⟩ and ⟨ ν F, out◦, out ⟩ are examples of bialgebras, more later.)
University of Oxford — Ralf Hinze 33-86
The Computational Essence of Sorting Algorithms — Background WG 2.8 3.4 Intermediate summary • initial algebra : syntax (finite trees) • folds: replacing constructors by functions • (denotational semantics: compositional valuation function that maps syntax to semantics—folding over syntax trees) • final coalgebra : behaviour (finite and infinite trees) • unfolds: tracing a state space • (operational semantics: unfolding to transition trees) • we have seen a glimpse of type-driven program development • running time (assuming a strict setting): • fold : proportional to the size of the input • unfold : proportional to the size of the output (output-sensitive algorithm) University of Oxford — Ralf Hinze 34-86
The Computational Essence of Sorting Algorithms — Exchange sort WG 2.8 Section 4 Exchange sort University of Oxford — Ralf Hinze 36-86
The Computational Essence of Sorting Algorithms — Exchange sort WG 2.8 4.0 Back to sorting A sorting function takes a list to an ordered list, sort :: µ List → ν List where ν List is the datatype of ordered lists: data List list = Nil | Cons K list instance Functor List where map f Nil = Nil map f ( Cons k ks ) = Cons k ( f ks ) (No guarantees, we use List for emphasis.) University of Oxford — Ralf Hinze 37-86
The Computational Essence of Sorting Algorithms — Exchange sort WG 2.8
To define a sorting function let us follow a type-directed approach:
  f :: µ List → ν List
  f = unfold c
  c :: µ List → List (µ List)
  c = fold a
  a :: List (List (µ List)) → List (µ List)
  a Nil = Nil
  a (Cons x Nil) = Cons x ⌈ Nil ⌉
  a (Cons x (Cons y xs))
    | x ≤ y     = Cons x ⌈ Cons y xs ⌉
    | otherwise = Cons y ⌈ Cons x xs ⌉
University of Oxford — Ralf Hinze 38-86
The Computational Essence of Sorting Algorithms — Exchange sort WG 2.8 4.1 Bubble sort
We have re-invented bubble sort!
  bubbleSort :: µ List → ν List
  bubbleSort = unfold bubble
  bubble :: µ List → List (µ List)
  bubble = fold bub
  bub :: List (List (µ List)) → List (µ List)
  bub Nil = Nil
  bub (Cons x Nil) = Cons x ⌈ Nil ⌉
  bub (Cons x (Cons y xs))
    | x ≤ y     = Cons x ⌈ Cons y xs ⌉
    | otherwise = Cons y ⌈ Cons x xs ⌉
University of Oxford — Ralf Hinze 39-86
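• on ordinary Haskell lists the same shape, an unfold whose coalgebra is a fold, can be played back with foldr and unfoldr; a sketch (Maybe (a, [a]) stands in for List (µ List), and the helper names mirror the slide):

  import Data.List (unfoldr)

  -- one bubbling pass as a fold: the smallest element floats to the front
  bub :: Ord a => a -> Maybe (a, [a]) -> Maybe (a, [a])
  bub x Nothing                    = Just (x, [])
  bub x (Just (y, ys)) | x <= y    = Just (x, y : ys)
                       | otherwise = Just (y, x : ys)

  bubble :: Ord a => [a] -> Maybe (a, [a])
  bubble = foldr bub Nothing

  bubbleSort :: Ord a => [a] -> [a]
  bubbleSort = unfoldr bubble

  -- ghci> bubbleSort [2,4,1,3]  ==>  [1,2,3,4]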
The Computational Essence of Sorting Algorithms — Exchange sort WG 2.8
Dually, we can start with a fold:
  f :: µ List → ν List
  f = fold a
  a :: List (ν List) → ν List
  a = unfold c
  c :: List (ν List) → List (List (ν List))
  c Nil = Nil
  c (Cons x ⌊ Nil ⌋) = Cons x Nil
  c (Cons x ⌊ Cons y xs ⌋)
    | x ≤ y     = Cons x (Cons y xs)
    | otherwise = Cons y (Cons x xs)
University of Oxford — Ralf Hinze 40-86
The Computational Essence of Sorting Algorithms — Exchange sort WG 2.8 4.2 Naïve insertion sort
We obtain a naïve variant of insertion sort!
  naiveInsertionSort :: µ List → ν List
  naiveInsertionSort = fold naiveInsert
  naiveInsert :: List (ν List) → ν List
  naiveInsert = unfold naiveIns
  naiveIns :: List (ν List) → List (List (ν List))
  naiveIns Nil = Nil
  naiveIns (Cons x ⌊ Nil ⌋) = Cons x Nil
  naiveIns (Cons x ⌊ Cons y xs ⌋)
    | x ≤ y     = Cons x (Cons y xs)
    | otherwise = Cons y (Cons x xs)
Why naïve?
University of Oxford — Ralf Hinze 41-86
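• why naïve: on ordinary lists the unfold-based insert can only keep producing output, so it copies the entire remainder of the list even after the new element has found its place; a hedged plain-list sketch:

  -- insertion by an unfold: every step emits one element and keeps unfolding,
  -- so the already sorted suffix is copied instead of being reused
  naiveInsert :: Ord a => a -> [a] -> [a]
  naiveInsert x []     = [x]
  naiveInsert x (y : ys)
    | x <= y           = x : naiveInsert y ys   -- keeps copying y : ys
    | otherwise        = y : naiveInsert x ys

  naiveInsertionSort :: Ord a => [a] -> [a]
  naiveInsertionSort = foldr naiveInsert []

  -- ghci> naiveInsertionSort [2,4,1,3]  ==>  [1,2,3,4]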
The Computational Essence of Sorting Algorithms — Exchange sort WG 2.8
The algebra and the coalgebra are almost identical:
  a :: List (List (µ List)) → List (µ List)
  a Nil = Nil
  a (Cons x Nil) = Cons x ⌈ Nil ⌉
  a (Cons x (Cons y xs))
    | x ≤ y     = Cons x ⌈ Cons y xs ⌉
    | otherwise = Cons y ⌈ Cons x xs ⌉

  c :: List (ν List) → List (List (ν List))
  c Nil = Nil
  c (Cons x ⌊ Nil ⌋) = Cons x Nil
  c (Cons x ⌊ Cons y xs ⌋)
    | x ≤ y     = Cons x (Cons y xs)
    | otherwise = Cons y (Cons x xs)

We can unify them in a single natural transformation:
  swap :: List (List a) → List (List a)
  swap Nil = Nil
  swap (Cons x Nil) = Cons x Nil
  swap (Cons x (Cons y xs))
    | x ≤ y     = Cons x (Cons y xs)
    | otherwise = Cons y (Cons x xs)
University of Oxford — Ralf Hinze 42-86
The Computational Essence of Sorting Algorithms — Exchange sort WG 2.8
  swap :: List (List x) → List (List x)
  swap Nil = Nil
  swap (Cons x Nil) = Cons x Nil
  swap (Cons x (Cons y xs))
    | x ≤ y     = Cons x (Cons y xs)
    | otherwise = Cons y (Cons x xs)
We can re-define bubble and naïve insertion sort using swap:
  bubbleSort :: µ List → ν List
  bubbleSort = unfold (fold (map in · swap))
  naiveInsertionSort :: µ List → ν List
  naiveInsertionSort = fold (unfold (swap · map out))
In a sense, swap extracts the computational ‘essence’ of bubble and naïve insertion sorting.
University of Oxford — Ralf Hinze 43-86
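• the whole construction transcribes directly into plain GHC Haskell; the following self-contained sketch (the names ListF, Mu, Nu, fromList and toList are mine) runs both sorts:

  {-# LANGUAGE DeriveFunctor #-}

  data ListF k x = NilF | ConsF k x deriving Functor

  newtype Mu f = In   { inOp :: f (Mu f) }   -- least fixed point,   inOp plays in°
  newtype Nu f = Out' { out  :: f (Nu f) }   -- greatest fixed point, Out' plays Out°

  fold :: Functor f => (f a -> a) -> Mu f -> a
  fold a = a . fmap (fold a) . inOp

  unfold :: Functor f => (a -> f a) -> a -> Nu f
  unfold c = Out' . fmap (unfold c) . c

  -- the natural transformation: compare and possibly exchange two adjacent elements
  swap :: Ord k => ListF k (ListF k x) -> ListF k (ListF k x)
  swap NilF           = NilF
  swap (ConsF x NilF) = ConsF x NilF
  swap (ConsF x (ConsF y xs))
    | x <= y          = ConsF x (ConsF y xs)
    | otherwise       = ConsF y (ConsF x xs)

  bubbleSort :: Ord k => Mu (ListF k) -> Nu (ListF k)
  bubbleSort = unfold (fold (fmap In . swap))

  naiveInsertionSort :: Ord k => Mu (ListF k) -> Nu (ListF k)
  naiveInsertionSort = fold (unfold (swap . fmap out))

  -- glue to ordinary lists, for experimenting in ghci
  fromList :: [k] -> Mu (ListF k)
  fromList = foldr (\k ks -> In (ConsF k ks)) (In NilF)

  toList :: Nu (ListF k) -> [k]
  toList (Out' NilF)         = []
  toList (Out' (ConsF k ks)) = k : toList ks

  -- ghci> toList (bubbleSort         (fromList [2,4,1,3]))  ==>  [1,2,3,4]
  -- ghci> toList (naiveInsertionSort (fromList [2,4,1,3]))  ==>  [1,2,3,4]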
The Computational Essence of Sorting Algorithms — Exchange sort WG 2.8
[Figure: traces of bubble sort and naïve insertion sort on the input 2 4 1 3; both proceed by exchanging adjacent elements (for example 2 ↔ 1, 4 ↔ 1, 1 ↔ 3, …) and both arrive at the output 1 2 3 4.]
University of Oxford — Ralf Hinze 44-86
The Computational Essence of Sorting Algorithms — Exchange sort WG 2.8 4.2 Intermediate summary
• swap exchanges adjacent elements
• swap is the computational essence of bubble sort and naïve insertion sort
• running time Θ(n²)
• how can we write true insertion sort?
• first: proof that bubbleSort and naiveInsertionSort are equal (in a strong sense)
University of Oxford — Ralf Hinze 45-86
The Computational Essence of Sorting Algorithms — Bialgebras and distributive laws WG 2.8 Section 5 Bialgebras and distributive laws University of Oxford — Ralf Hinze 47-86
The Computational Essence of Sorting Algorithms — Bialgebras and distributive laws WG 2.8
Recall that bubble is a List-algebra homomorphism.

   List (µ List) ----map bubble----> List (List (µ List))
        |                                   |
        in                                swap
        |                                   ↓
        |                            List (List (µ List))
        |                                   |
        |                                map in
        ↓                                   ↓
      µ List ---------bubble-------> List (µ List)

University of Oxford — Ralf Hinze 48-86
The Computational Essence of Sorting Algorithms — Bialgebras and distributive laws WG 2.8
Let us rearrange the diagram.

   List (µ List) ------in------> µ List --------bubble--------> List (µ List)
         |                                                           ↑
    map bubble                                                    map in
         ↓                                                           |
   List (List (µ List)) -------------swap-------------> List (List (µ List))

The algebra in and the coalgebra bubble form a swap-bialgebra: ⟨ µ List, in, bubble ⟩.
University of Oxford — Ralf Hinze 49-86
The Computational Essence of Sorting Algorithms — Bialgebras and distributive laws WG 2.8
Recall that naiveInsert is a List-coalgebra homomorphism.

   List (ν List) --naiveInsert--> ν List ---------out---------> List (ν List)
         |                                                           ↑
      map out                                               map naiveInsert
         ↓                                                           |
   List (List (ν List)) -------------swap-------------> List (List (ν List))

The algebra naiveInsert and the coalgebra out also form a swap-bialgebra: ⟨ ν List, naiveInsert, out ⟩.
University of Oxford — Ralf Hinze 50-86
The Computational Essence of Sorting Algorithms — Bialgebras and distributive laws WG 2.8 5.1 Bialgebra
For an algebra a and coalgebra c to be a swap-bialgebra, we must have that

   List X ---------a---------> X ------------c------------> List X
      |                                                         ↑
   List c                                                    List a
      ↓                                                         |
   List (List X) -------------------swap-------------> List (List X)

University of Oxford — Ralf Hinze 51-86
The Computational Essence of Sorting Algorithms — Bialgebras and distributive laws WG 2.8 5.2 Bialgebra homomorphism
A swap-bialgebra homomorphism h is simultaneously a List-algebra and a List-coalgebra homomorphism.

   List X ----List h---> List Y
      |                     |
      a                     b
      ↓                     ↓
      X ---------h------->  Y
      |                     |
      c                     d
      ↓                     ↓
   List X ----List h---> List Y

swap-bialgebras and homomorphisms form a category.
University of Oxford — Ralf Hinze 52-86
The Computational Essence of Sorting Algorithms — Bialgebras and distributive laws WG 2.8 5.2 Initial and final bialgebra
The initial object in this category is ⟨ µ List, in, bubble ⟩; the final object is ⟨ ν List, naiveInsert, out ⟩.

   List (µ List) -----------------------------------> List (ν List)
        |                                                  |
        in                                            naiveInsert
        ↓        fold naiveInsert = unfold bubble          ↓
      µ List  ------------------------------------->    ν List
        |                                                  |
      bubble                                              out
        ↓                                                  ↓
   List (µ List) -----------------------------------> List (ν List)

By uniqueness, naiveInsertionSort and bubbleSort are equal.
University of Oxford — Ralf Hinze 53-86
The Computational Essence of Sorting Algorithms — Bialgebras and distributive laws WG 2.8 5.2 Intermediate summary
• swap is a distributive law
• ⟨ µ List, in, bubble ⟩ is the initial swap-bialgebra
• ⟨ ν List, naiveInsert, out ⟩ is the final swap-bialgebra
• bubble sort and naïve insertion sort are two (strongly related) variations of the same idea: repeatedly exchanging adjacent elements
University of Oxford — Ralf Hinze 54-86
The Computational Essence of Sorting Algorithms — Insertion and selection sort II WG 2.8 Section 6 Insertion and selection sort II University of Oxford — Ralf Hinze 56-86
The Computational Essence of Sorting Algorithms — Insertion and selection sort II WG 2.8
• sorting algorithms as folds of unfolds or unfolds of folds necessarily have a running time of Θ(n²)
• to define insertion and selection sort, we need variants of folds and unfolds, so-called para- and apomorphisms
University of Oxford — Ralf Hinze 57-86
The Computational Essence of Sorting Algorithms — Insertion and selection sort II WG 2.8 6.1 Paramorphism
• we start by defining products
  data a × b = As { outl :: a, outr :: b }
  (△) :: (c → a) → (c → b) → (c → a × b)
  (f △ g) x = As (f x) (g x)
• we write pairs with the constructor As, used like Haskell’s as-pattern a @ b
• we are now ready to define paramorphisms:
  para :: (Functor f) ⇒ (f (µ f × a) → a) → (µ f → a)
  para f = f · map (id △ para f) · in◦
  a paramorphism also provides the intermediate input: the ‘algebra’ has type f (µ f × a) → a instead of f a → a
• slogan: eats its argument and keeps it too
University of Oxford — Ralf Hinze 58-86
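• for intuition, a list-specialised paramorphism in plain Haskell (the names paraL and suffixes are mine); the step function sees both the tail itself and the result computed from it:

  -- list paramorphism: like foldr, but the step also sees the remaining tail
  paraL :: (a -> ([a], b) -> b) -> b -> [a] -> b
  paraL f e []       = e
  paraL f e (x : xs) = f x (xs, paraL f e xs)

  -- example: all proper suffixes of a list
  suffixes :: [a] -> [[a]]
  suffixes = paraL (\_ (xs, rest) -> xs : rest) []

  -- ghci> suffixes "abc"  ==>  ["bc","c",""]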
The Computational Essence of Sorting Algorithms — Insertion and selection sort II WG 2.8 6.2 Apomorphism
• products dualise to sums
  data a + b = Stop a | Play b
  (▽) :: (a → c) → (b → c) → (a + b → c)
  (f ▽ g) (Stop a) = f a
  (f ▽ g) (Play b) = g b
• we write Stop a as a ◾, and Play b as ▸ b
• paramorphisms dualise to apomorphisms:
  apo :: (Functor f) ⇒ (a → f (ν f + a)) → (a → ν f)
  apo f = out◦ · map (id ▽ apo f) · f
  the corecursion is split into two branches, with no recursive call on the left
• apomorphisms improve the running time
University of Oxford — Ralf Hinze 59-86
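• likewise, a list-specialised apomorphism in plain Haskell (the names apoL and append are mine); the Stop branch splices in a ready-made list, the Play branch keeps unfolding:

  apoL :: (b -> Maybe (a, Either [a] b)) -> b -> [a]
  apoL g b = case g b of
    Nothing            -> []
    Just (a, Left xs)  -> a : xs          -- stop: splice in the rest as-is
    Just (a, Right b') -> a : apoL g b'   -- play: keep unfolding

  -- example: append copies its first argument and then stops,
  -- sharing the second argument unchanged
  append :: [a] -> [a] -> [a]
  append xs ys = apoL step xs
    where
      step []        = case ys of
                         []       -> Nothing
                         (z : zs) -> Just (z, Left zs)
      step (x : xs') = Just (x, Right xs')

  -- ghci> append [1,2] [3,4]  ==>  [1,2,3,4]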
The Computational Essence of Sorting Algorithms — Insertion and selection sort II WG 2.8
With apomorphisms, we can write the insertion function as one that stops scanning after inserting an element:
  insertSort :: µ List → ν List
  insertSort = fold insert
  insert :: List (ν List) → ν List
  insert = apo ins
  ins :: List (ν List) → List (ν List + List (ν List))
  ins Nil = Nil
  ins (Cons x ⌊ Nil ⌋) = Cons x (⌊ Nil ⌋ ◾)
  ins (Cons x ⌊ Cons y xs ⌋)
    | x ≤ y     = Cons x (⌊ Cons y xs ⌋ ◾)
    | otherwise = Cons y (▸ (Cons x xs))
University of Oxford — Ralf Hinze 60-86
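• on ordinary lists the apomorphic insert is the familiar insertion function (a sketch): once the element is placed, the rest of the list is returned as-is rather than rebuilt:

  insert :: Ord a => a -> [a] -> [a]
  insert x []       = [x]
  insert x (y : ys)
    | x <= y        = x : y : ys       -- stop: ys is shared, not copied
    | otherwise     = y : insert x ys  -- play: keep scanning

  insertSort :: Ord a => [a] -> [a]
  insertSort = foldr insert []

  -- ghci> insertSort [2,4,1,3]  ==>  [1,2,3,4]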
The Computational Essence of Sorting Algorithms — Insertion and selection sort II WG 2.8
From ins we can extract a natural transformation, which we call swop for swap‘n’stop:
  swop :: List (a × List a) → List (a + List a)
  swop Nil = Nil
  swop (Cons x (As xs Nil)) = Cons x (xs ◾)
  swop (Cons x (As xs (Cons y ys)))
    | x ≤ y     = Cons x (xs ◾)
    | otherwise = Cons y (▸ (Cons x ys))
University of Oxford — Ralf Hinze 61-86
The Computational Essence of Sorting Algorithms — Insertion and selection sort II WG 2.8 From swop we get both insertion and selection sort: insertSort :: µ List → ν List insertSort = fold ( apo ( swop · map ( id △ out ))) selectSort :: µ List → ν List selectSort = unfold ( para ( map ( id ▽ in ) · swop )) In general, a natural transformation such as swop gives rise to two algorithms. Algorithms for free! University of Oxford — Ralf Hinze 62-86
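• for comparison, a plain-list rendering of the selection half (the helper name select is mine): the coalgebra extracts the minimum while rebuilding, paramorphism-style, the rest of the input:

  import Data.List (unfoldr)

  selectSort :: Ord a => [a] -> [a]
  selectSort = unfoldr select
    where
      -- extract the minimum together with the remaining elements
      select []       = Nothing
      select (x : xs) = Just (go x xs)
        where
          go m []       = (m, [])
          go m (y : ys)
            | m <= y    = let (m', rest) = go m ys in (m', y : rest)
            | otherwise = let (m', rest) = go y ys in (m', m : rest)

  -- ghci> selectSort [2,4,1,3]  ==>  [1,2,3,4]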
The Computational Essence of Sorting Algorithms — Insertion and selection sort II WG 2.8 6.3 Intermediate summary
• apomorphisms improve the running time
• running time of insertion sort: worst case still Θ(n²), but best case Θ(n)
• (paramorphisms don’t improve the running time)
• the computational essence of insertion and selection sort is the natural transformation swop
• in general, we shall seek natural transformations of type F (A × G A) → G (A + F A)
• (proof of equality involves (co-)pointed functors)
University of Oxford — Ralf Hinze 63-86
The Computational Essence of Sorting Algorithms — Quicksort and treesort WG 2.8 Section 7 Quicksort and treesort University of Oxford — Ralf Hinze 65-86
The Computational Essence of Sorting Algorithms — Quicksort and treesort WG 2.8
• so far: one-phase sorting algorithms µ List → ν List
• to improve performance we need to exchange non-adjacent elements
• next: two-phase sorting algorithms that make use of an intermediate data structure µ List → ν Tree → µ Tree → ν List
• the intermediate data structure can sometimes be deforested (turning a data structure into a control structure)
• we can play our game for each phase
University of Oxford — Ralf Hinze 66-86
The Computational Essence of Sorting Algorithms — Quicksort and treesort WG 2.8 7.0 Search trees • an obvious intermediate data structure is a binary tree data Tree tree = Empty | Node tree K tree instance Functor Tree where map f Empty = Empty map f ( Node l k r ) = Node ( f l ) k ( f r ) • we assume a ‘horizontal’ ordering type SearchTree = Tree University of Oxford — Ralf Hinze 67-86
The Computational Essence of Sorting Algorithms — Quicksort and treesort WG 2.8 7.1 Phase one: growing a search tree
• the essence of growing a search tree
  sprout :: List (a × SearchTree a) → SearchTree (a + List a)
  sprout Nil = Empty
  sprout (Cons x (As t Empty)) = Node (t ◾) x (t ◾)
  sprout (Cons x (As t (Node l y r)))
    | x ≤ y     = Node (▸ (Cons x l)) y (r ◾)
    | otherwise = Node (l ◾) y (▸ (Cons x r))
• this is the only sensible definition: no choices
• we compare elements across some distance
University of Oxford — Ralf Hinze 68-86
The Computational Essence of Sorting Algorithms — Quicksort and treesort WG 2.8 • we can either recursively partition a list, building subtrees from the resulting sublists, or start with an empty tree and repeatedly insert the elements into it grow :: µ List → ν SearchTree grow = unfold ( para ( map ( id ▽ in ) · sprout )) grow ′ :: µ List → ν SearchTree grow ′ = fold ( apo ( sprout · map ( id △ out ))) • the algebra is a useful function on its own: insertion into a search tree • efficient insertion into a tree is necessarily an apomorphism University of Oxford — Ralf Hinze 69-86
The Computational Essence of Sorting Algorithms — Quicksort and treesort WG 2.8 7.2 Phase two: withering a search tree
• the essence of withering a search tree
  wither :: SearchTree (a × List a) → List (a + SearchTree a)
  wither Empty = Nil
  wither (Node (As l Nil) x (As r _)) = Cons x (r ◾)
  wither (Node (As l (Cons x l′)) y (As r _)) = Cons x (▸ (Node l′ y r))
• again, this is the only sensible definition
University of Oxford — Ralf Hinze 70-86
The Computational Essence of Sorting Algorithms — Quicksort and treesort WG 2.8 • this should surprise no one: the second phase would surely be an in-order traversal flatten :: µ SearchTree → ν List flatten = fold ( apo ( wither · map ( id △ out ))) flatten ′ :: µ SearchTree → ν List flatten ′ = unfold ( para ( map ( id ▽ in ) · wither )) • the algebra is essentially a ternary version of append • the coalgebra deletes the leftmost element from a search tree University of Oxford — Ralf Hinze 71-86
The Computational Essence of Sorting Algorithms — Quicksort and treesort WG 2.8 7.2 Putting things together We obtain the famous quicksort and the less prominent treesort algorithms, quickSort :: µ List → ν List quickSort = flatten · downcast · grow treeSort :: µ List → ν List treeSort = flatten · downcast · grow ′ where downcast :: ( Functor f ) ⇒ ν f → µ f projects the final coalgebra onto the initial algebra. University of Oxford — Ralf Hinze 72-86
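• a plain-Haskell rendering of the two phases (the names insertT, growTree and flattenT are mine); growing corresponds to sprout, flattening to wither:

  data Tree k = Empty | Node (Tree k) k (Tree k)

  -- phase one (cf. sprout): apomorphic insertion into a search tree;
  -- the subtree that is not descended into is shared untouched
  insertT :: Ord k => k -> Tree k -> Tree k
  insertT x Empty = Node Empty x Empty
  insertT x (Node l y r)
    | x <= y    = Node (insertT x l) y r
    | otherwise = Node l y (insertT x r)

  growTree :: Ord k => [k] -> Tree k
  growTree = foldr insertT Empty

  -- phase two (cf. wither): in-order traversal, essentially a ternary append
  flattenT :: Tree k -> [k]
  flattenT Empty        = []
  flattenT (Node l x r) = flattenT l ++ [x] ++ flattenT r

  treeSort :: Ord k => [k] -> [k]
  treeSort = flattenT . growTree

  -- ghci> treeSort [2,4,1,3]  ==>  [1,2,3,4]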
The Computational Essence of Sorting Algorithms — Quicksort and treesort WG 2.8 7.2 Intermediate summary
• once the intermediate data structure has been fixed, everything falls into place: no choices
• observation: only the first phase performs comparisons
• quicksort and treesort are two (strongly related) variations of the same idea
• running time: worst case still Θ(n²), but average case Θ(n log n)
University of Oxford — Ralf Hinze 73-86
The Computational Essence of Sorting Algorithms — Heapsort and minglesort WG 2.8 Section 8 Heapsort and minglesort University of Oxford — Ralf Hinze 75-86
The Computational Essence of Sorting Algorithms — Heapsort and minglesort WG 2.8 8.0 Heaps • a search tree imposes a horizontal ordering • we can also assume a ‘vertical’ ordering type Heap = Tree University of Oxford — Ralf Hinze 76-86
The Computational Essence of Sorting Algorithms — Heapsort and minglesort WG 2.8 8.1 Phase one: piling up a heap
• the essence of piling up a heap
  pile :: List (a × Heap a) → Heap (a + List a)
  pile Nil = Empty
  pile (Cons x (As t Empty)) = Node (t ◾) x (t ◾)
  pile (Cons x (As t (Node l y r)))
    | x ≤ y     = Node (▸ (Cons y r)) x (l ◾)
    | otherwise = Node (▸ (Cons x r)) y (l ◾)
• now we have a choice (3rd equation)! Braun’s trick!
• let a = x ‘min‘ y and b = x ‘max‘ y; the right-hand sides could equally be
    = Node (▸ (Cons b l)) a (r ◾)
    = Node (r ◾) a (▸ (Cons b l))
    = Node (l ◾) a (▸ (Cons b r))
    = Node (▸ (Cons b r)) a (l ◾)
University of Oxford — Ralf Hinze 77-86
The Computational Essence of Sorting Algorithms — Heapsort and minglesort WG 2.8 • as usual we obtain two algorithms heapify :: µ List → ν Heap heapify = unfold ( para ( map ( id ▽ in ) · pile )) heapify ′ :: µ List → ν Heap heapify ′ = fold ( apo ( pile · map ( id △ out ))) • the algebra is a useful function on its own: insertion into a heap University of Oxford — Ralf Hinze 78-86
The Computational Essence of Sorting Algorithms — Heapsort and minglesort WG 2.8 8.2 Phase two: sifting through a heap
• the essence of sifting through a heap
  sift :: Heap (a × List a) → List (a + Heap a)
  sift Empty = Nil
  sift (Node (As l Nil) x (As r _)) = Cons x (r ◾)
  sift (Node (As l _) x (As r Nil)) = Cons x (l ◾)
  sift (Node (As l (Cons y l′)) x (As r (Cons z r′)))
    | y ≤ z     = Cons x (▸ (Node l′ y r))
    | otherwise = Cons x (▸ (Node l z r′))
• when constructing the heap node to continue with, we have the option to swap left with right, but this buys us nothing
University of Oxford — Ralf Hinze 79-86
The Computational Essence of Sorting Algorithms — Heapsort and minglesort WG 2.8
• again, we obtain two algorithms
  unheapify :: µ Heap → ν List
  unheapify = fold (apo (sift · map (id △ out)))
  unheapify′ :: µ Heap → ν List
  unheapify′ = unfold (para (map (id ▽ in) · sift))
• the coalgebra deletes the minimum element from a heap
University of Oxford — Ralf Hinze 80-86
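• a plain-Haskell sketch of both heap phases (the names insertH, deleteMin and heapSort are mine): insertH mirrors pile, including the child swap that keeps insertions balanced (Braun’s trick), and deleteMin plays the role of sift’s coalgebra, merging the two subheaps:

  import Data.List (unfoldr)

  data Heap k = Empty | Node (Heap k) k (Heap k)

  -- cf. pile: the smaller key stays at the root, the larger one is pushed down;
  -- swapping the children on the way keeps the tree roughly balanced
  insertH :: Ord k => k -> Heap k -> Heap k
  insertH x Empty = Node Empty x Empty
  insertH x (Node l y r)
    | x <= y    = Node (insertH y r) x l
    | otherwise = Node (insertH x r) y l

  -- cf. sift: remove the root and merge the two subheaps by their roots
  deleteMin :: Ord k => Heap k -> Maybe (k, Heap k)
  deleteMin Empty        = Nothing
  deleteMin (Node l x r) = Just (x, merge l r)
    where
      merge Empty h = h
      merge h Empty = h
      merge h1@(Node l1 y1 r1) h2@(Node l2 y2 r2)
        | y1 <= y2  = Node l1 y1 (merge r1 h2)
        | otherwise = Node l2 y2 (merge h1 r2)

  heapSort :: Ord k => [k] -> [k]
  heapSort = unfoldr deleteMin . foldr insertH Empty

  -- ghci> heapSort [2,4,1,3]  ==>  [1,2,3,4]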