The Paris C. Kanellakis Memorial Lecture, a seventeen-year tradition at Brown University's Department of Computer Science (Brown CS), honors Paris Kanellakis, a distinguished computer scientist who was an esteemed and beloved member of the Brown CS community. Paris came to Brown in 1981 and became a full professor in 1990. His research area was theoretical computer science, with emphasis on the principles of database systems, logic in computer science, the principles of distributed computing, and combinatorial optimization. He died in an airplane crash on December 20, 1995, along with his wife, Maria Teresa Otoya, and their two young children, Alexandra and Stephanos Kanellakis.
Each year, Brown CS invites one of the field's thought leaders to address wide-ranging topics in honor of Paris. Last year, Donald Knuth of Stanford University returned to Brown to give a "history of clever ideas that arose around the world" as he traced the evolution of a combinatorial problem dating back to antiquity. Just this week, on December 11, 2017, Piotr Indyk of the Massachusetts Institute of Technology delivered the seventeenth annual Paris C. Kanellakis Memorial Lecture.
Indyk is a Professor of Electrical Engineering and Computer Science at MIT. He joined MIT in 2000, after earning his PhD from Stanford University and his Magister degree from Uniwersytet Warszawski in 1995. His research interests lie in the design and analysis of efficient algorithms, with specific interests in high-dimensional computational geometry, sketching and streaming algorithms, and sparse recovery. Indyk has received the Sloan Fellowship (2003), the Packard Fellowship (2003), and the Simons Investigator Award (2013). His work on the Sparse Fourier Transform was named to the Technology Review "TR10" list in 2012, and his work on locality-sensitive hashing received the 2012 ACM Paris Kanellakis Theory and Practice Award.
Speaking in front of an enthusiastic crowd that filled CIT 368, Indyk gave a lecture ("Below P vs. NP: Conditional Quadratic-Time Hardness for Big Data Problems") that worked outward from one of the key concepts of computational complexity, the theory of NP-hardness, to address some of the newest challenges presented by the era of Big Data. Quadratic-time algorithms, Indyk stated, can be efficient when used on moderately-sized inputs, but lose their usefulness on problems that involve gigabytes or more of data. Using recent research in string processing and machine learning as evidence, he made the case that under a natural complexity-theoretic conjecture, there are no near-linear time algorithms for such problems, and he described how this framework has led to the development of new algorithms.
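A concrete instance of this quadratic barrier from string processing is edit distance, one of the problems covered by this line of conditional-hardness research: the textbook algorithm is a dynamic program that takes time proportional to the product of the two string lengths, which is impractical once the inputs reach gigabyte scale. The following minimal sketch (not taken from the lecture) shows that classic quadratic-time dynamic program:

```python
def edit_distance(a: str, b: str) -> int:
    """Classic O(len(a) * len(b)) dynamic program for edit distance.

    For two gigabyte-length strings, this inner loop would run on the
    order of 10^18 times, which is why a quadratic running time is
    considered unusable in the Big Data regime Indyk described.
    """
    m, n = len(a), len(b)
    prev = list(range(n + 1))  # distances from a[:0] to every prefix of b
    for i in range(1, m + 1):
        curr = [i] + [0] * n
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            curr[j] = min(prev[j] + 1,         # delete a[i-1]
                          curr[j - 1] + 1,     # insert b[j-1]
                          prev[j - 1] + cost)  # match or substitute
        prev = curr
    return prev[n]

print(edit_distance("kitten", "sitting"))  # → 3
```

Under the complexity-theoretic conjecture discussed in the lecture, no algorithm can improve on this quadratic running time by a polynomial factor, so practical progress comes instead from approximation and from algorithms tailored to structured inputs.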
Professor James Tompkin of Brown CS, whose research is in graphics, vision, and interaction techniques, was one of the many attendees. Asked about Indyk and his work, he says, "His and his colleagues' work on analyzing the complexity of empirical risk minimization is highly relevant to computer vision, as we often use supervised learning methods like SVM and neural networks. For instance, Indyk showed that quadratic time is required to compute the gradient of the empirical loss in neural nets."
For more information, contact Brown CS Communication Outreach Specialist Jesse C. Polhemus.