Why Simple Algorithms Can Be Complex: Toward a Systematic Study of Algorithms?

  • Date: 06/12/2008
  • Time: 16:00

Professor Allan Borodin (University of Toronto)


University of British Columbia


Richard Karp has called Computer Science the systematic study of algorithms. This raises the question of how much the design and analysis of algorithms is a systematic study or science, as opposed to, say, an art form, as suggested by the title of Don Knuth's influential texts. For discrete computation, the well-accepted Church-Turing thesis provides a precise mathematical model for what is "computable" and hence (indirectly) for what can constitute an "algorithm". While there has been a rich and ongoing development of new and often surprising algorithms in diverse areas, conceptually simple algorithms are often used for expediency, and sometimes even to obtain the best-known results, for many fundamental problems. An ambitious (or perhaps naive) goal is to develop a theory for "conceptually simple combinatorial algorithms", starting off with greedy or myopic algorithms. I will review some of the history of previous efforts in this regard, some recent ideas and results, and my current thinking about the power and (provable) limitations of simple algorithmic paradigms. In particular, I will present the priority algorithm framework as a model for greedy-like optimization algorithms, and then discuss how this framework can be extended to model some common forms of simple dynamic programming, backtracking and primal-dual algorithms.
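As a concrete illustration of the "greedy-like" paradigm the abstract refers to, the sketch below shows a fixed-priority greedy algorithm for interval scheduling. This is the editor's own illustrative example, not the formal priority algorithm model from the lecture: the algorithm fixes an ordering (a priority) on the input items, then examines them one at a time in that order, making an irrevocable accept/reject decision for each.

```python
def priority_interval_scheduling(intervals):
    """Greedy interval scheduling in the priority-algorithm spirit.

    intervals: list of (start, finish) pairs.
    Returns a maximum-size set of pairwise-disjoint intervals.
    """
    # Priority function: an earlier finish time means higher priority.
    ordered = sorted(intervals, key=lambda iv: iv[1])
    accepted = []
    last_finish = float("-inf")
    for start, finish in ordered:
        # Irrevocable decision: accept iff the interval does not
        # overlap any previously accepted interval.
        if start >= last_finish:
            accepted.append((start, finish))
            last_finish = finish
    return accepted

print(priority_interval_scheduling(
    [(1, 4), (3, 5), (0, 6), (5, 7), (4, 8), (6, 9)]))
# → [(1, 4), (5, 7)]
```

The key structural restriction, which makes the paradigm amenable to provable limitations, is that each decision is made myopically and can never be revisited.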


Location: GEOG 100, UBC


CRM-Fields-PIMS Prize Lecture 2008