Duke Computer Science Colloquium
Effort-Light StructMine: Turning Massive Corpora into Structures
Monday, March 27, 2017
12:00pm - 1:00pm
D106 LSRC, Duke
Pizza will be served at 11:45.
Real-world data, though massive, are hard for machines to interpret because they are largely unstructured, existing in the form of natural-language text. One of the grand challenges is to turn such massive corpora into machine-actionable structures. Yet most existing systems rely heavily on human effort to structure these corpora, slowing the development of downstream applications.
In this talk, I will introduce a data-driven framework, Effort-Light StructMine, that extracts structured facts from massive corpora without explicit human labeling effort. In particular, I will discuss how to solve three structure mining tasks under the Effort-Light StructMine framework: identifying typed entities in text, fine-grained entity typing, and extracting typed relationships between entities. Together, these three solutions form a clear roadmap for turning a massive corpus into a structured network that represents its factual knowledge. Finally, I will share some directions toward mining corpus-specific structured networks for knowledge discovery.
Xiang Ren is a Computer Science PhD candidate at the University of Illinois at Urbana-Champaign, working with Jiawei Han in the Data and Information System (DAIS) Research Lab. Xiang's research develops data-driven methods for turning unstructured text data into machine-actionable structures. More broadly, his research interests span data mining, machine learning, and natural language processing, with a focus on making sense of massive text corpora. His research has been recognized with many awards, including a Google PhD Fellowship, a Yahoo!-DAIS Research Excellence Award, and a C. W. Gear Outstanding Graduate Student Award from UIUC. Techniques and algorithms he has developed have been transferred to the US Army Research Lab, NIH, Microsoft, Yelp, and TripAdvisor.
Hosted by: Ashwin Machanavajjhala