| Moses Charikar | |
|---|---|
| Name | Moses Charikar |
| Fields | Computer Science |
| Workplaces | Stanford University; Princeton University; Google Research |
| Alma mater | IIT Bombay; Stanford University |
| Doctoral advisor | Rajeev Motwani |
| Known for | Locality-sensitive hashing (SimHash); Count sketch; streaming algorithms; approximation algorithms |
Moses Charikar is a theoretical computer scientist known for foundational contributions to approximation algorithms, randomized techniques, and data structures for large-scale data processing. He has held faculty and research positions at Princeton University, Stanford University, and Google Research, and his work spans approximation algorithms, streaming algorithms, and the algorithmic foundations of similarity search.
Charikar completed his undergraduate education at IIT Bombay and received his PhD in computer science from Stanford University under the supervision of Rajeev Motwani. His graduate work placed him in the theoretical computer science community organized around conferences such as STOC, FOCS, SODA, and ICALP.
After finishing his PhD, Charikar spent time at Google before joining the faculty of Princeton University, where he taught for over a decade; he later moved to Stanford University. Alongside his academic appointments he has collaborated with industrial and academic research groups and served on program committees and editorial boards for venues associated with ACM and IEEE.
Charikar's research produced several widely cited algorithms and conceptual tools for streaming, sketching, sublinear algorithms, and similarity search. His SimHash technique, a locality-sensitive hashing scheme based on rounding random hyperplane projections, estimates the cosine similarity between high-dimensional vectors and has been applied to near-duplicate detection in web search. With Kevin Chen and Martin Farach-Colton he introduced the Count sketch, a randomized data structure for finding frequent items in a data stream that preceded and influenced the related Count-Min sketch. He has also developed approximation algorithms and hardness-of-approximation results for clustering and metric problems presented at venues such as STOC and FOCS.
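As an illustration of the locality-sensitive hashing idea discussed above, here is a minimal sketch of the random-hyperplane (SimHash-style) variant. This is an informal sketch assuming NumPy; the function names and parameter choices are illustrative, not from any particular implementation.

```python
import numpy as np

def simhash_signature(vec, hyperplanes):
    # One bit per hyperplane: the sign of the projection onto it.
    return (hyperplanes @ vec >= 0).astype(np.uint8)

def estimate_angle(sig_a, sig_b):
    # For random hyperplanes, P(bits differ) = angle(a, b) / pi,
    # so the fraction of differing bits estimates the angle.
    return np.mean(sig_a != sig_b) * np.pi

rng = np.random.default_rng(0)
dim, bits = 64, 1024
planes = rng.standard_normal((bits, dim))  # random hyperplane normals

a = rng.standard_normal(dim)
b = a + 0.3 * rng.standard_normal(dim)  # a slightly perturbed copy of a

true_angle = np.arccos(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
approx_angle = estimate_angle(simhash_signature(a, planes),
                              simhash_signature(b, planes))
print(true_angle, approx_angle)  # the two values should be close
```

Because the signature is a short bit vector, comparing signatures by Hamming distance is far cheaper than computing cosine similarity on the original vectors, which is what makes the scheme useful at web scale.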
His work on similarity-preserving hashing and metric embeddings builds on threads from the Johnson–Lindenstrauss lemma and the study of metric embeddings, and it has direct relevance to machine learning and data mining pipelines, where compact sketches of high-dimensional data enable fast nearest-neighbor search and deduplication.
Charikar's contributions have been recognized with major honors, including the 2012 ACM Paris Kanellakis Theory and Practice Award, shared with Andrei Broder and Piotr Indyk, for groundbreaking work on locality-sensitive hashing. He has also received best-paper recognitions at leading theory conferences and has been an invited speaker at major algorithmic workshops.
Charikar's publication record includes seminal papers at leading venues: his similarity-estimation paper introducing SimHash appeared at STOC 2002, and his Count sketch paper on finding frequent items in data streams appeared at ICALP 2002. His results are widely cited, have been incorporated into production systems, and appear in standard textbooks and courses on streaming and sublinear algorithms.