Networked Systems Seminar
Talk #5: Friday, May 1st, 2009, DBH 6011, 11am
Joint talk with the CS Seminar

Getting an Edge at High Speeds: Randomized Algorithms and Networking Hardware
George Varghese, UCSD
About the Talk:
Even commercial router vendors have adopted randomized algorithms in a few cases because of their simplicity, speed, and memory efficiency. Further, because they can see every member of the population (i.e., every arriving packet) and preserve summary information about the entire population, such randomized algorithms can obtain an "edge" over standard algorithms that merely sample the population. I illustrate this thesis with three algorithms.

First, I will describe a simple algorithm for finding sources that send a large proportion of traffic, and its application in a worm detection chip.
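The abstract does not name the specific algorithm, but one classic randomized technique for finding heavy senders in this line of work is sample and hold: sample each packet from an untracked source with small probability, and once a source is sampled, count all of its subsequent traffic exactly. A minimal sketch (parameter values illustrative):

```python
import random

def sample_and_hold(packets, sample_prob=0.01, threshold=1000):
    """Report sources whose counted bytes exceed `threshold`.

    `packets` is an iterable of (source, size) pairs. Each packet from
    an untracked source is sampled with probability `sample_prob`;
    once sampled, every later packet from that source is counted, so
    large senders are caught with probability near 1 while the flow
    table stays small.
    """
    counts = {}  # flow table: source -> bytes counted since sampling
    for source, size in packets:
        if source in counts:
            counts[source] += size          # hold: count exactly
        elif random.random() < sample_prob:
            counts[source] = size           # sample: start tracking
    return {s: b for s, b in counts.items() if b >= threshold}
```

The "edge" over plain sampling is that a large flow, once caught, is measured exactly rather than estimated from its sampled packets.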
Second, I will describe an algorithm that provides an inexpensive technique for measuring the average and variance of packet latencies and loss on a link. By contrast, the majority of routers have no support for fine-grained latency measurement; managers must instead rely on approximate methods such as sending probe packets or using "tomographic" techniques.

If time permits, I will describe a third algorithm that allows scalable logging, say of millions of network addresses infected after an attack, using only a small amount of memory. In all three cases I will quantify the edge obtained over simple sampling.
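Logging millions of addresses in a small amount of memory suggests a compact randomized set summary; the talk's actual data structure is not specified here, but a Bloom filter is the textbook example of the idea: membership of millions of items in a few hundred kilobytes, at the cost of a tunable false-positive rate. A minimal sketch (sizes illustrative):

```python
import hashlib

class BloomLog:
    """Compact membership log: record items in a fixed-size bit array.

    Lookups never miss a logged item (no false negatives); a small,
    tunable fraction of unlogged items may test positive.
    """

    def __init__(self, num_bits=1 << 20, num_hashes=4):
        self.num_bits = num_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(num_bits // 8)

    def _positions(self, item):
        # Derive k bit positions from k salted hashes of the item.
        for i in range(self.num_hashes):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.num_bits

    def add(self, item):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item):
        return all(self.bits[p // 8] & (1 << (p % 8))
                   for p in self._positions(item))
```

One megabit of memory sized this way can summarize on the order of a hundred thousand infected addresses with a false-positive rate well under one percent, versus many megabytes for an explicit list.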
Joint work with Cristi Estan, Ramana Kompella, Kirill Levchenko, Terry Lam and Alex Snoeren. [slides]

About the Speaker:
