Complexity Theory

Anuj Dawar
Computer Laboratory, University of Cambridge
Lent Term 2004

http://www.cl.cam.ac.uk/Teaching/current/Complexity/


Texts

The main texts for the course are:

• Computational Complexity. Christos H. Papadimitriou.
• Introduction to the Theory of Computation. Michael Sipser.

Other useful references include:

• Computers and Intractability: A Guide to the Theory of NP-Completeness. Michael R. Garey and David S. Johnson.
• Structural Complexity, Vols I and II. J.L. Balcázar, J. Díaz and J. Gabarró.
• Computability and Complexity from a Programming Perspective. Neil Jones.


Outline

A rough lecture-by-lecture guide, with relevant sections from the text by Papadimitriou (or Sipser, where marked with an S).

• Algorithms and problems. 1.1–1.3.
• Time and space. 2.1–2.5, 2.7.
• Time complexity classes. 7.1, S7.2.
• Nondeterminism. 2.7, 9.1, S7.3.
• NP-completeness. 8.1–8.2, 9.2.
• Graph-theoretic problems. 9.3.
• Sets, numbers and scheduling. 9.4.
• coNP. 10.1–10.2.
• Cryptographic complexity. 12.1–12.2.
• Space complexity. 7.1, 7.3, S8.1.
• Hierarchy. 7.2, S9.1.
• Protocols. 12.2, 19.1–19.2.
Complexity Theory

Complexity Theory seeks to understand what makes certain problems algorithmically difficult to solve.

In Data Structures and Algorithms, we saw how to measure the complexity of specific algorithms, by asymptotic measures of the number of steps.

In Computation Theory, we saw that certain problems were not solvable at all, algorithmically.

Both of these are prerequisites for the present course.


Algorithms and Problems

Insertion Sort runs in time O(n^2), while Merge Sort is an O(n log n) algorithm.

The first half of this statement is short for:

  If we count the number of steps performed by the Insertion Sort algorithm on an input of size n, taking the largest such number from among all inputs of that size, then the function of n so defined is eventually bounded by a constant multiple of n^2.

It makes sense to compare the two algorithms, because they seek to solve the same problem.

But what is the complexity of the sorting problem?


Lower and Upper Bounds

What is the running time complexity of the fastest algorithm that sorts a list?

By the analysis of the Merge Sort algorithm, we know that this is no worse than O(n log n).

The complexity of a particular algorithm establishes an upper bound on the complexity of the problem.

To establish a lower bound, we need to show that no possible algorithm, including those as yet undreamed of, can do better.

In the case of sorting, we can establish a lower bound of Ω(n log n), showing that Merge Sort is asymptotically optimal.

Sorting is a rare example where known upper and lower bounds match.


Review

The complexity of an algorithm (whether measuring number of steps, or amount of memory) is usually described asymptotically.

Definition. For functions f : ℕ → ℕ and g : ℕ → ℕ, we say that:

• f = O(g) if there is an n_0 ∈ ℕ and a constant c such that for all n > n_0, f(n) ≤ c·g(n);
• f = Ω(g) if there is an n_0 ∈ ℕ and a constant c such that for all n > n_0, f(n) ≥ c·g(n);
• f = Θ(g) if f = O(g) and f = Ω(g).

Usually, O is used for upper bounds and Ω for lower bounds.
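For concreteness, here is a minimal Python sketch (not part of the original notes) of the two sorting algorithms compared above. In the worst case, insertion_sort may shift every previously placed element for each new one, giving on the order of n^2 steps, while merge_sort does O(n) work on each of roughly log n levels of splitting and merging.

    def insertion_sort(xs):
        """Worst case O(n^2): each new element may be compared with
        (and shifted past) every element already placed."""
        xs = list(xs)
        for i in range(1, len(xs)):
            x, j = xs[i], i - 1
            while j >= 0 and xs[j] > x:
                xs[j + 1] = xs[j]
                j -= 1
            xs[j + 1] = x
        return xs

    def merge_sort(xs):
        """O(n log n): about log n levels of halving, O(n) merging work per level."""
        if len(xs) <= 1:
            return list(xs)
        mid = len(xs) // 2
        left, right = merge_sort(xs[:mid]), merge_sort(xs[mid:])
        merged, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                merged.append(left[i]); i += 1
            else:
                merged.append(right[j]); j += 1
        return merged + left[i:] + right[j:]

Both functions solve the same problem, which is what makes the O(n^2) versus O(n log n) comparison meaningful.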
Lower Bound on Sorting

Consider an algorithm A sorting a list of n distinct numbers a_1, ..., a_n. Its behaviour on all inputs of length n can be pictured as a decision tree: each internal node is a comparison (a_i < a_j?, a_p < a_q?, a_k < a_l?, a_r < a_s?, ...), with one subtree for each outcome, and each leaf is marked done.

To work for all permutations of the input list, the tree must have at least n! leaves and therefore height at least log_2(n!) = Θ(n log n) (since n! ≥ (n/2)^(n/2), we have log_2(n!) ≥ (n/2) log_2(n/2), and clearly log_2(n!) ≤ n log_2 n).


Travelling Salesman

Given

• V — a set of vertices, and
• c : V × V → ℕ — a cost matrix,

find an ordering v_1, ..., v_n of V for which the total cost

  c(v_n, v_1) + Σ_{i=1}^{n-1} c(v_i, v_{i+1})

is the smallest possible.


Complexity of TSP

Obvious algorithm: try all possible orderings of V and find the one with lowest cost. The worst-case running time is Θ(n!).

Lower bound: an analysis like that for sorting shows a lower bound of Ω(n log n).

Upper bound: the currently fastest known algorithm has a running time of O(n^2 · 2^n).

Between these two is the chasm of our ignorance.


Formalising Algorithms

To prove a lower bound on the complexity of a problem, rather than of a specific algorithm, we need to prove a statement about all algorithms for solving it.

In order to prove facts about all algorithms, we need a mathematically precise definition of algorithm.

We will use the Turing machine.

The simplicity of the Turing machine means it is not useful for actually expressing algorithms, but it is very well suited for proofs about all algorithms.
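Returning to the upper bound for TSP quoted above: the O(n^2 · 2^n) bound is achieved by the classic dynamic-programming algorithm usually attributed to Held and Karp. The following Python sketch is only an illustration (the function name and representation are mine, and it assumes n ≥ 2): it records, for each set S of vertices not containing vertex 0 and each endpoint j ∈ S, the cheapest path that starts at vertex 0, visits exactly the vertices of S, and ends at j.

    from itertools import combinations

    def tsp_held_karp(cost):
        """Cheapest tour visiting every vertex once and returning to vertex 0.
        cost is an n x n matrix; time O(n^2 * 2^n), space O(n * 2^n)."""
        n = len(cost)
        # best[(S, j)]: cheapest path starting at 0, visiting exactly the
        # vertices in frozenset S (0 not in S), and ending at j (j in S).
        best = {(frozenset([j]), j): cost[0][j] for j in range(1, n)}
        for size in range(2, n):
            for subset in combinations(range(1, n), size):
                S = frozenset(subset)
                for j in subset:
                    best[(S, j)] = min(best[(S - {j}, k)] + cost[k][j]
                                       for k in subset if k != j)
        everything = frozenset(range(1, n))
        return min(best[(everything, j)] + cost[j][0] for j in range(1, n))

For instance, tsp_held_karp([[0, 1, 2], [1, 0, 3], [2, 3, 0]]) evaluates to 6, the cost of the tour 0 → 1 → 2 → 0. The point of the sketch is that there are at most n · 2^n table entries, each filled in O(n) time, which is where the O(n^2 · 2^n) bound comes from.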
Turing Machines

For our purposes, a Turing machine consists of:

• K — a finite set of states;
• Σ — a finite set of symbols, including ⊔;
• s ∈ K — an initial state;
• δ : (K × Σ) → (K ∪ {acc, rej}) × Σ × {L, R, S} — a transition function that specifies, for each state and symbol, a next state (or accept acc or reject rej), a symbol to overwrite the current symbol, and a direction for the tape head to move in (L – left, R – right, or S – stationary).


Configurations

A complete description of the configuration of a machine can be given if we know what state it is in, what the contents of its tape are, and what the position of its head is. This can be summed up in a simple triple.

Definition. A configuration is a triple (q, w, u), where q ∈ K and w, u ∈ Σ⋆.

The intuition is that (q, w, u) represents a machine in state q with the string wu on its tape, and the head pointing at the last symbol in w.

The configuration of a machine completely determines the future behaviour of the machine.


Computations

Given a machine M = (K, Σ, s, δ), we say that a configuration (q, w, u) yields in one step (q′, w′, u′), written

  (q, w, u) →_M (q′, w′, u′),

if

• w = va;
• δ(q, a) = (q′, b, D); and
• either D = L and w′ = v and u′ = bu,
  or D = S and w′ = vb and u′ = u,
  or D = R and w′ = vbc and u′ = x, where u = cx (if u is empty, then w′ = vb⊔ and u′ is empty).

The relation →⋆_M is the reflexive and transitive closure of →_M.

A sequence of configurations c_1, ..., c_n, where for each i, c_i →_M c_{i+1}, is called a computation of M.

The language L(M) ⊆ Σ⋆ accepted by the machine M is the set of strings

  { x | (s, ⊲, x) →⋆_M (acc, w, u) for some w and u }.

A machine M is said to halt on input x if, for some w and u, either (s, ⊲, x) →⋆_M (acc, w, u) or (s, ⊲, x) →⋆_M (rej, w, u).
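The yields-in-one-step relation above can be transcribed almost literally into code. The following Python sketch is not part of the notes: BLANK and LEFT_END are stand-ins for ⊔ and ⊲, δ is represented as a dictionary from (state, symbol) pairs to (state, symbol, direction) triples, and the start state is assumed to be named 's', with 'acc' and 'rej' as the halting states.

    BLANK = ' '       # stands in for the blank symbol ⊔
    LEFT_END = '>'    # stands in for the left end marker ⊲

    def step(delta, config):
        """One application of the yields-in-one-step relation.
        config = (q, w, u): state q, tape contents w + u, head on the
        last symbol of w."""
        q, w, u = config
        v, a = w[:-1], w[-1]          # w = va
        q2, b, d = delta[(q, a)]      # delta(q, a) = (q', b, D)
        if d == 'L':
            return (q2, v, b + u)
        if d == 'S':
            return (q2, v + b, u)
        # d == 'R': move onto the first symbol of u, or onto a fresh blank
        if u:
            return (q2, v + b + u[0], u[1:])
        return (q2, v + b + BLANK, '')

    def run(delta, x, max_steps=10_000):
        """Iterate from the initial configuration (s, ⊲, x); return the
        halting configuration and the number of steps taken."""
        config, steps = ('s', LEFT_END, x), 0
        while config[0] not in ('acc', 'rej'):
            if steps >= max_steps:
                raise RuntimeError("no halt within max_steps")
            config = step(delta, config)
            steps += 1
        return config, steps

As in the definition, a machine that never reaches acc or rej simply does not halt; in this toy runner the step budget runs out instead.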
Decidability

A language L ⊆ Σ⋆ is recursively enumerable if it is L(M) for some machine M.

A language L is decidable if it is L(M) for some machine M which halts on every input.

A language L is semi-decidable if it is recursively enumerable.

A function f : Σ⋆ → Σ⋆ is computable if there is a machine M such that, for all x, (s, ⊲, x) →⋆_M (acc, f(x), ε).


Example

Consider the machine with δ given by:

        ⊲            0           1            ⊔
  s     s, ⊲, R      s, 0, R     s, 1, R      q, ⊔, L
  q     acc, ⊲, R    q, ⊔, L     rej, ⊔, R    q, ⊔, L

This machine will accept any string that contains only 0s before the first blank (but only after replacing them all by blanks).


Multi-Tape Machines

The formalisation of Turing machines extends in a natural way to multi-tape machines. For instance, a machine with k tapes is specified by:

• K, Σ, s; and
• δ : (K × Σ^k) → (K ∪ {acc, rej}) × (Σ × {L, R, S})^k.

Similarly, a configuration is of the form:

  (q, w_1, u_1, ..., w_k, u_k)


Complexity

For any function f : ℕ → ℕ, we say that a language L is in TIME(f(n)) if there is a machine M = (K, Σ, s, δ) such that:

• L = L(M); and
• the running time of M is O(f(n)).

Similarly, we define SPACE(f(n)) to be the languages accepted by a machine which uses O(f(n)) tape cells on inputs of length n.

In defining space complexity, we assume a machine M which has a read-only input tape and a separate work tape. We only count cells on the work tape towards the complexity.
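As a cross-check (again, not part of the notes), the example machine above can be encoded for the toy simulator sketched after the definition of computations; the BLANK and LEFT_END placeholders and the run function are the ones assumed there. Counting steps also illustrates the TIME definition: as simulated here, on a string of n 0s the machine halts after 2n + 3 steps, so it accepts its language within time O(n).

    # δ for the example machine, keyed by (state, symbol)
    delta = {
        ('s', LEFT_END): ('s', LEFT_END, 'R'),
        ('s', '0'):      ('s', '0', 'R'),
        ('s', '1'):      ('s', '1', 'R'),
        ('s', BLANK):    ('q', BLANK, 'L'),
        ('q', LEFT_END): ('acc', LEFT_END, 'R'),
        ('q', '0'):      ('q', BLANK, 'L'),
        ('q', '1'):      ('rej', BLANK, 'R'),
        ('q', BLANK):    ('q', BLANK, 'L'),
    }

    config, steps = run(delta, '0000')
    print(config[0], steps)    # acc 11: scan right, then erase the 0s leftwards
    config, steps = run(delta, '0100')
    print(config[0], steps)    # rej 9: the 1 met on the way back causes rejection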