Algorithmic Theory of Networks: 2012-2015

OVERVIEW

The technology revolution of the 1990s and 2000s owes much of its existence to advances in computer networking. These advances have profoundly changed how we model, construct, modify, maintain, use, and, ultimately, view our networks. This Collaborative Research Group will work on the theoretical foundations of new-generation networks: we will develop and analyze models, design algorithms and prove their correctness and optimality, and explore the theoretical limits of communication and computation in this new context.

 

RATIONALE AND OBJECTIVES

As networking technologies evolve, our understanding of computer (and other types of) networks changes in scale, structure, and functionality. The scientific and technical community now faces new networking paradigms that are complex, heterogeneous, and data-intensive. The resulting theoretical and practical problems are multifaceted, requiring a wide range of expertise from many fields. In this CRG we bring together researchers from four major universities in Western Canada (Simon Fraser University, University of Victoria, University of Calgary, and University of British Columbia) with the necessary range of interests and expertise in different aspects of these networks and the technical skills to investigate these problems. The collaboration will focus on the modelling and algorithmic aspects of networks, with special emphasis on sensor networks and probabilistic techniques.

Networking theory is not yet a fully coherent field; much of the related research is scattered across computer science, mathematics, and engineering, organized largely by the approach used or the particular application. Although networking groups in Western Canada share significant elements of a general theme (distributed decision-making, sensor networks, game-theoretic questions) and exhibit complementary technical expertise, so far they have been working and publishing without much coordination or collaboration. We aim to bring the expertise of these researchers together to improve our understanding of the new types of networks that have become so important in our lives.

The following scientific and technological challenges provide motivation for this work:

   • Increased quantities of data: Modern user and system applications generate and store previously unthinkable amounts of data. For example, Cisco projects that Internet traffic will reach 1 zettabyte (10^21 bytes) in 2015. This will require a new approach to “feasible computation.”

   • New constraints on memory and power: Data transmission is now very easy to achieve, and this has led to non-traditional types of networks with new kinds of constraints, such as sensors with very small memories and short battery life. Theoretical computer scientists and mathematicians now have to deal with such technical restrictions in order to develop models and efficient algorithms for these networks.

   • Heterogeneity: The ability to connect many different kinds of networks to the Internet has resulted in a structure that is very heterogeneous and unpredictable. The area of distributed computing has historically dealt with comparatively homogeneous and predictable networks, and the challenge now is to move beyond these assumptions, which are untenable for very large networks.

   • Social nature of networks: The mathematical understanding of social networks, their structure, and the spread of information and influence within them has barely begun, and it will impact the study of sociology and other social sciences.

   • New parallel programming paradigms: Paradigms such as MapReduce and Hadoop, developed for massively parallel systems, are not adequately modelled by shared-memory models or by the fixed-topology networks studied previously. They require designs that allow for efficient data partitioning, minimising both the communication cost between processors and the number of rounds of computation (see the sketch following this list).
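
As a toy illustration (not part of the CRG research plan), the following minimal Python sketch shows a single MapReduce-style round: mappers emit key-value pairs, a shuffle step partitions the keys across a set of reducers, and each reducer aggregates its own partition. The input documents, the hash-based partitioner, and the number of reducers are assumptions made purely for this example; the pairs moved during the shuffle and the number of such rounds are the communication and round costs that new models would need to capture.

    # Minimal single-round MapReduce-style word count (illustrative sketch only).
    # The input documents, hash partitioner, and NUM_REDUCERS are assumptions
    # made for this example, not details taken from the CRG description.
    from collections import defaultdict

    NUM_REDUCERS = 3  # assumed number of reducer machines

    def map_phase(text):
        # Mapper: emit a (word, 1) pair for every word in one document.
        return [(word, 1) for word in text.split()]

    def partition(key):
        # Shuffle: assign each key to a reducer; the pairs moved here are the
        # inter-processor communication cost of the round.
        return hash(key) % NUM_REDUCERS

    def reduce_phase(key, values):
        # Reducer: aggregate all values seen for one key.
        return key, sum(values)

    if __name__ == "__main__":
        docs = ["networks of sensors", "sensors and networks of data"]

        # Map: run every mapper (on separate machines, in a real system).
        pairs = [kv for text in docs for kv in map_phase(text)]

        # Shuffle: group the emitted pairs by reducer, then by key within a reducer.
        reducers = defaultdict(lambda: defaultdict(list))
        for key, value in pairs:
            reducers[partition(key)][key].append(value)

        # Reduce: each reducer processes its own partition independently.
        for r, groups in sorted(reducers.items()):
            print("reducer", r, [reduce_phase(k, vs) for k, vs in sorted(groups.items())])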
 

PARTICIPANTS

Principal Investigators
Petra Berenbrink (SFU)
Funda Ergun (SFU)
Valerie King (UVic)
 

PIMS Faculty
Will Evans (UBC)
Nick Harvey (UBC)
Lisa Higham (UCalgary)
Bruce Kapron (UVic)
David Kirkpatrick (UBC)
Cenk Sahinalp (SFU)
Venkatesh Srinivasan (UVic)
Philipp Woelfel (UCalgary)

Other Faculty
David Bremner (University of New Brunswick)
Tim Craig (Princess Margaret Hospital)
Antoine Deza (McMaster University)
Robert Mifflin (Washington State University)
Michael Sharpe (Princess Margaret Hospital)

Advisory Committee
James Aspnes (Yale U.)
Pavol Hell (SFU)
Anna Karlin (UW)
David Lee (HP Labs)

PLANNED SCIENTIFIC ACTIVITIES

2012:

*Fall 2012: CRG Kickoff Meeting, Victoria, BC, October 9.

*Winter 2012: Research Meeting and Inaugural Distinguished Lecture at UBC

 

2013:

*Spring 2013: The Third Pacific Northwest Theory Day

 

2014:

*Summer 2014: Summer School on Randomized Techniques for Combinatorial Algorithms at SFU.

*October 2014: Workshop on Big Data in Networks and Distributed Systems at SFU.

 

2015:

*March 2015: Algorithmic Theory of Networks Workshop at SFU.

 

 

Ongoing events:
• A distinguished lecture series with twelve visitors from academia, research labs, and industry, hosted at the participating institutions over two years. We expect six of these visitors to stay for longer-term visits.
• Longer-term visits by the PIs and their highly qualified personnel between the participating institutions.
• Collaboration with US institutions such as AT&T, Yahoo Research, Cisco, HP Labs, IBM, MIT, and the University of New Mexico. Collaborators: Ali Begen (Cisco), Ravi Kumar (Yahoo), David Lee (HP), Kostas Oikonomou (AT&T), Kadangode Ramakrishnan (AT&T), Ronitt Rubinfeld (MIT), Jared Saia (UNM), Rakesh Sinha (AT&T), and David Woodruff (IBM).

 

Postdoctoral Fellows
Two PIMS PDFs for two years each.