
ACL 2012

50th Annual Meeting of the Association for Computational Linguistics

Tutorial Abstracts of ACL 2012

July 8, 2012
Jeju Island, Korea


©2012 The Association for Computational Linguistics

Order copies of this and other ACL proceedings from:

Association for Computational Linguistics (ACL)
209 N. Eighth Street
Stroudsburg, PA 18360
USA
Tel: +1-570-476-8006
Fax: [email protected]

ISBN 978-1-937284-28-2



Introduction

A total of 17 tutorial proposals were submitted to the ACL 2012 Tutorials track, of which six were finally accepted. I am grateful to the ACL community for the diverse and high-quality proposals I received. This guarantees a strong tutorials track at ACL 2012, but it also made the selection process very difficult. All 17 proposals were reviewed by the chair, with assistance from colleagues and experts from the NLP community where necessary. The final selection was approved by the ACL 2012 General Chair.

The following criteria guided the selection: (1) Quality: content, scope and organization of the proposal, and the competence and experience of the presenters. (2) Diversity: I tried to include diverse topics, ranging from linguistically motivated approaches to current developments in machine learning. (3) Novelty: tutorials recently held at ACL events were not selected.

One of my aims was to convince the presenters to make their tutorials accessible to novices in the respective areas. In my acceptance email I told them: “When preparing for the tutorial, please keep in mind that the target audience is not your buddies whom you may want to impress, but researchers and graduate students who are not familiar with your topic.”

I would like to thank all the presenters for putting so much effort into their tutorials. I am indebted to the Local Chairs, the Publication Chairs and the ACL 2012 General Chair for making it happen.

Enjoy,

Michael Strube, HITS gGmbH
ACL 2012 Tutorial Chair



Tutorial Chair:

Michael Strube
Heidelberg Institute for Theoretical Studies gGmbH
Heidelberg, Germany



Table of Contents

Qualitative Modeling of Spatial Prepositions and Motion Expressions
Inderjeet Mani and James Pustejovsky . . . . . . . . . . 1

State-of-the-Art Kernels for Natural Language Processing
Alessandro Moschitti . . . . . . . . . . 2

Topic Models, Latent Space Models, Sparse Coding, and All That: A Systematic Understanding of Probabilistic Semantic Extraction in Large Corpus
Eric Xing . . . . . . . . . . 3

Multilingual Subjectivity and Sentiment Analysis
Rada Mihalcea, Carmen Banea and Janyce Wiebe . . . . . . . . . . 4

Deep Learning for NLP (without Magic)
Richard Socher, Yoshua Bengio and Christopher D. Manning . . . . . . . . . . 5

Graph-based Semi-Supervised Learning Algorithms for NLP
Amar Subramanya and Partha Pratim Talukdar . . . . . . . . . . 6



Tutorial Program

Sunday July 8, 2012

Morning

9:00-12:30 Qualitative Modeling of Spatial Prepositions and Motion Expressions
Inderjeet Mani and James Pustejovsky

State-of-the-Art Kernels for Natural Language Processing
Alessandro Moschitti

Topic Models, Latent Space Models, Sparse Coding, and All That: A Systematic Understanding of Probabilistic Semantic Extraction in Large Corpus
Eric Xing

12:30-2:00 Lunch Break

Afternoon

2:00-5:30 Multilingual Subjectivity and Sentiment Analysis
Rada Mihalcea, Carmen Banea and Janyce Wiebe

Deep Learning for NLP (without Magic)
Richard Socher, Yoshua Bengio and Christopher D. Manning

Graph-based Semi-Supervised Learning Algorithms for NLP
Amar Subramanya and Partha Pratim Talukdar



Tutorial Abstracts of ACL 2012, page 1, Jeju, Republic of Korea, 8 July 2012. ©2012 Association for Computational Linguistics

Qualitative Modeling of Spatial Prepositions and Motion Expressions

Inderjeet Mani
Children’s Organization of Southeast Asia
Thailand
[email protected]

James Pustejovsky
Computer Science Department
Brandeis University
Waltham, MA, USA
[email protected]

The ability to understand spatial prepositions and motion in natural language will enable a variety of new applications involving systems that can respond to verbal directions, map travel guides, display incident reports, etc., providing for enhanced information extraction, question answering, information retrieval, and more principled text-to-scene rendering. Until now, however, the semantics of spatial relations and motion verbs has been highly problematic. This tutorial presents a new approach to the semantics of spatial descriptions and motion expressions based on linguistically interpreted qualitative reasoning. Our approach allows for formal inference from spatial descriptions in natural language, while leveraging annotation schemes for time, space, and motion, along with machine learning from annotated corpora. We introduce a compositional semantics for motion expressions that integrates spatial primitives drawn from qualitative calculi.

No previous exposure to the semantics of spatial prepositions or motion verbs is assumed. The tutorial will sharpen cross-linguistic intuitions about the interpretation of spatial prepositions and motion constructions. The attendees will also learn about qualitative reasoning schemes for static and dynamic spatial information, as well as three annotation schemes: TimeML, SpatialML, and ISO-Space, for time, space, and motion, respectively.

While both cognitive and formal linguistics have examined the meaning of motion verbs and spatial prepositions, these earlier approaches do not yield precise computable representations that are expressive enough for natural languages. However, the previous literature makes it clear that communication of motion relies on imprecise and highly abstract geometric descriptions, rather than Euclidean ones that specify the coordinates and shapes of every object. This property makes these expressions a fit target for the field of qualitative spatial reasoning in AI, which has developed a rich set of geometric primitives for representing time, space (including distance, orientation, and topological relations), and motion. The results of such research have yielded a wide variety of spatial and temporal reasoning logics and tools. By reviewing these calculi and resources, this tutorial aims to systematically connect qualitative reasoning to natural language.
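To give a flavor of what a qualitative calculus looks like, the sketch below is not from the tutorial itself: it classifies the qualitative relation between two time intervals in the style of Allen's interval algebra, one of the classic calculi in this family (and the kind of relation inventory TimeML-style temporal annotation draws on). The tutorial's own calculi for space and motion are richer; this toy classifier is only an illustration.

```python
def allen_relation(a, b):
    """Return the qualitative Allen relation between two closed
    intervals a = (start, end) and b = (start, end).
    Covers all 13 relations of the interval algebra."""
    (a_s, a_e), (b_s, b_e) = a, b
    if a_e < b_s: return "before"
    if b_e < a_s: return "after"
    if a_e == b_s: return "meets"
    if b_e == a_s: return "met-by"
    if (a_s, a_e) == (b_s, b_e): return "equal"
    if a_s == b_s:  # same start, different end
        return "starts" if a_e < b_e else "started-by"
    if a_e == b_e:  # same end, different start
        return "finishes" if a_s > b_s else "finished-by"
    if b_s < a_s and a_e < b_e: return "during"
    if a_s < b_s and b_e < a_e: return "contains"
    return "overlaps" if a_s < b_s else "overlapped-by"

# e.g. "the walk (1-4) happened during the meeting (0-5)"
print(allen_relation((1, 4), (0, 5)))  # -> during
```

Qualitative reasoners compose such relations (e.g., before ∘ before = before) instead of computing with exact coordinates, which is precisely what makes them a good fit for vague linguistic descriptions.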

Tutorial Schedule:

I. Introduction. i. Overview of geometric idealizations underlying spatial PPs; ii. Linguistic patterns of motion verbs across languages; iii. A qualitative model for static spatial descriptions and for path verbs; iv. Overview of relevant annotation schemes.

II. Calculi for Qualitative Spatial Reasoning. i. Semantics of spatial PPs mapped to qualitative spatial reasoning; ii. Qualitative calculi for representing topological and orientation relations; iii. Qualitative calculi to represent motion.

III. Semantics of Motion Expressions. i. Introduction to Dynamic Interval Temporal Logic (DITL); ii. DITL representations for manner-of-motion verbs and path verbs; iii. Compositional semantics for motion expressions in DITL, with the spatial primitives drawn from qualitative calculi.

IV. Applications and Research Topics. i. Route navigation, mapping travel narratives, QA, scene rendering from text, and generating event descriptions; ii. Open issues and further research topics.




State-of-the-Art Kernels for Natural Language Processing

Alessandro Moschitti
Department of Computer Science and Information Engineering
University of Trento
Via Sommarive 5, 38123 Povo (TN), Italy
[email protected]

Introduction

In recent years, machine learning (ML) has been used more and more to solve complex tasks in different disciplines, ranging from Data Mining to Information Retrieval and Natural Language Processing (NLP). These tasks often require the processing of structured input; e.g., the ability to extract salient features from syntactic/semantic structures is critical to many NLP systems. Mapping such structured data into explicit feature vectors for ML algorithms requires considerable expertise, intuition and deep knowledge about the target linguistic phenomena. Kernel Methods (KM) are powerful ML tools (see, e.g., (Shawe-Taylor and Cristianini, 2004)) which can alleviate the data representation problem. They substitute feature-based similarities with similarity functions, i.e., kernels, defined directly between training/test instances, e.g., syntactic trees. Hence feature vectors are no longer needed. Additionally, kernel engineering, i.e., the composition or adaptation of several prototype kernels, facilitates the design of the effective similarities required for new tasks, e.g., (Moschitti, 2004; Moschitti, 2008).
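To make the "feature vectors are no longer needed" point concrete, here is a minimal, hypothetical sketch (not the tutorial's SVM-Light-TK setup): a character-trigram string kernel plugged into a kernel perceptron, where both learning and prediction touch instances only through kernel evaluations, never through an explicit feature vector.

```python
from collections import Counter

def ngram_kernel(s, t, n=3):
    """Similarity of two strings = number of shared character trigrams.
    The trigram feature space stays implicit: we only ever compute k(s, t)."""
    grams_s = Counter(s[i:i + n] for i in range(len(s) - n + 1))
    grams_t = Counter(t[i:i + n] for i in range(len(t) - n + 1))
    return sum(grams_s[g] * grams_t[g] for g in grams_s)

def kernel_perceptron(train, epochs=10, kernel=ngram_kernel):
    """Mistake-driven kernel perceptron: the model is just a weight
    per training instance, and scoring sums kernel evaluations."""
    alpha = [0.0] * len(train)

    def score(x):
        return sum(a * y_j * kernel(x_j, x)
                   for a, (x_j, y_j) in zip(alpha, train) if a)

    for _ in range(epochs):
        for i, (x, y) in enumerate(train):
            if y * score(x) <= 0:   # mistake: upweight this instance
                alpha[i] += 1.0

    return lambda x: 1 if score(x) > 0 else -1

# toy binary task: "animal" sentences (+1) vs "finance" sentences (-1)
train = [("the cat sat", 1), ("a cat ran", 1),
         ("stock prices fell", -1), ("prices rose today", -1)]
predict = kernel_perceptron(train)
print(predict("the cat ran"))  # -> 1
```

Swapping `ngram_kernel` for a tree kernel over parse trees changes the notion of similarity without touching the learner, which is the engineering flexibility the abstract emphasizes.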

Tutorial Content

The tutorial aims at addressing the problems above: firstly, it will introduce essential and simplified theory of Support Vector Machines and KM, with the sole aim of motivating practical procedures and interpreting the results. Secondly, it will describe the current best practices for designing applications based on effective kernels. For this purpose, it will survey state-of-the-art kernels for diverse NLP applications, reconciling the different approaches with a uniform and global notation/theory. This survey will benefit from practical expertise acquired from directly working on many natural language applications, ranging from Text Categorization to Syntactic/Semantic Parsing. Moreover, practical demonstrations using the SVM-Light-TK toolkit will support the application-oriented perspective of the tutorial. The latter will lead NLP researchers with heterogeneous backgrounds to acquire the KM know-how, which can be used to design any target NLP application.

Finally, the tutorial will propose interesting new best practices, e.g., some recent methods for large-scale learning with structural kernels (Severyn and Moschitti, 2011), structural lexical similarities (Croce et al., 2011) and reverse kernel engineering (Pighin and Moschitti, 2009).

References

Danilo Croce, Alessandro Moschitti, and Roberto Basili. 2011. Structured Lexical Similarity via Convolution Kernels on Dependency Trees. In Proceedings of EMNLP.

Alessandro Moschitti. 2004. A Study on Convolution Kernels for Shallow Semantic Parsing. In Proceedings of ACL.

Alessandro Moschitti. 2008. Kernel Methods, Syntax and Semantics for Relational Text Categorization. In Proceedings of CIKM.

Daniele Pighin and Alessandro Moschitti. 2009. Efficient Linearization of Tree Kernel Functions. In Proceedings of CoNLL.

Aliaksei Severyn and Alessandro Moschitti. 2011. Fast Support Vector Machines for Structural Kernels. In Proceedings of ECML.

John Shawe-Taylor and Nello Cristianini. 2004. Kernel Methods for Pattern Analysis. Cambridge University Press.




Topic Models, Latent Space Models, Sparse Coding, and All That: A Systematic Understanding of Probabilistic Semantic Extraction in Large Corpus

Eric Xing
School of Computer Science
Carnegie Mellon University

Abstract

Probabilistic topic models have recently gained much popularity in information retrieval and related areas. Via such models, one can project high-dimensional objects such as text documents into a low-dimensional space where their latent semantics are captured and modeled; one can integrate multiple sources of information, "sharing statistical strength" among components of a hierarchical probabilistic model; and one can structurally display and classify otherwise unstructured object collections. However, to many practitioners it remains unclear, in a principled way, how topic models work; what to expect and not to expect from a topic model; how it differs from and relates to classical matrix-algebraic techniques such as LSI and NMF in NLP; how to empower topic models to deal with complex scenarios such as multimodal data, conversational text in social media, evolving corpora, or the presence of supervision such as labeling and rating; and how to make topic modeling computationally tractable even on web-scale data. In this tutorial, I will demystify the conceptual, mathematical, and computational issues behind all such problems surrounding topic models and their applications by presenting a systematic overview of the mathematical foundation of topic modeling and its connections to a number of related methods popular in other fields, such as LDA, admixture models, mixed membership models, latent space models, and sparse coding. I will offer a simple and unifying view of all these techniques under the framework of multi-view latent space embedding, and outline the roadmap of model extension and algorithmic design toward different applications in IR and NLP. A main theme of this tutorial, tying together a wide range of issues and problems, will build on the "probabilistic graphical model" formalism, a formalism that exploits the conjoined talents of graph theory and probability theory to build complex models out of simpler pieces. I will use this formalism as the main aid to discuss both the mathematical underpinnings of the models and the related computational issues in a unified, simple, transparent, and actionable fashion.
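As a small, concrete companion to the abstract, the following is a toy collapsed Gibbs sampler for LDA in pure Python: each token's topic is resampled from its conditional given all other assignments, and documents come out as mixtures over topics. The corpus, two-topic setting, and hyperparameters are illustrative assumptions, not anything from the tutorial.

```python
import random
from collections import defaultdict

def gibbs_lda(docs, K=2, iters=200, alpha=0.1, beta=0.01, seed=0):
    """Minimal collapsed Gibbs sampler for LDA.
    Returns theta: per-document topic proportions."""
    rng = random.Random(seed)
    V = len({w for d in docs for w in d})
    ndk = [[0] * K for _ in docs]            # doc-topic counts
    nkw = [defaultdict(int) for _ in range(K)]  # topic-word counts
    nk = [0] * K                              # topic totals
    z = []                                    # topic assignment per token
    for d, doc in enumerate(docs):
        zs = []
        for w in doc:  # random initialization
            k = rng.randrange(K)
            zs.append(k); ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
        z.append(zs)
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]  # remove the token's current assignment
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # p(topic j | everything else), up to a constant
                weights = [(ndk[d][j] + alpha) * (nkw[j][w] + beta)
                           / (nk[j] + V * beta) for j in range(K)]
                k = rng.choices(range(K), weights)[0]
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return [[(ndk[d][j] + alpha) / (len(doc) + K * alpha)
             for j in range(K)] for d, doc in enumerate(docs)]

docs = [["nlp", "parsing", "syntax"], ["nlp", "syntax", "grammar"],
        ["stocks", "market", "prices"], ["market", "prices", "trade"]]
theta = gibbs_lda(docs)
```

On this tiny corpus the sampler tends to pull the "nlp" documents and the "market" documents toward different topics, which is the low-dimensional projection the abstract describes.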




Multilingual Subjectivity and Sentiment Analysis

Rada Mihalcea
University of North Texas
Denton, TX
[email protected]

Carmen Banea
University of North Texas
Denton, TX
[email protected]

Janyce Wiebe
University of Pittsburgh
Pittsburgh, PA
[email protected]

Abstract

Subjectivity and sentiment analysis focuses on the automatic identification of private states, such as opinions, emotions, sentiments, evaluations, beliefs, and speculations in natural language. While subjectivity classification labels text as either subjective or objective, sentiment classification adds an additional level of granularity, by further classifying subjective text as either positive, negative or neutral.

While much of the research work in this area has been applied to English, research on other languages is growing, including Japanese, Chinese, German, Spanish, and Romanian. While most of the researchers in the field are familiar with the methods applied to English, few of them have closely looked at the original research carried out in other languages. For example, in languages such as Chinese, researchers have been looking at the ability of characters to carry sentiment information (Ku et al., 2005; Xiang, 2011). In Romanian, due to markers of politeness and additional verbal modes embedded in the language, experiments have hinted that subjectivity detection may be easier to achieve (Banea et al., 2008). These additional sources of information may not be available across all languages; yet various articles have pointed out that by investigating a synergistic approach for detecting subjectivity and sentiment in multiple languages at the same time, improvements can be achieved not only in other languages, but in English as well. The development of and interest in these methods is also highly motivated by the fact that only 27% of Internet users speak English (www.internetworldstats.com/stats.htm, Oct 11, 2011), and that number diminishes further every year, as more people across the globe gain Internet access.

The aim of this tutorial is to familiarize the attendees with the subjectivity and sentiment research carried out on languages other than English in order to enable and promote cross-fertilization. Specifically, we will review work along three main directions. First, we will present methods where the resources and tools have been specifically developed for a given target language. In this category, we will also briefly overview the main methods that have been proposed for English, but which can be easily ported to other languages. Second, we will describe cross-lingual approaches, including several methods that have been proposed to leverage the resources and tools available in English by using cross-lingual projections. Third, we will show how the expression of opinions and polarity pervades language boundaries, and thus how methods that holistically explore multiple languages at the same time can be effectively considered.
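A deliberately naive sketch of the cross-lingual projection idea mentioned above: push an English subjectivity/polarity lexicon through a tiny, made-up bilingual dictionary, marking every translation of a labeled English word with the same label. Real projection methods must handle translation ambiguity and sense shifts, which this sketch ignores.

```python
def project_lexicon(src_lexicon, bilingual_dict):
    """Project a source-language polarity lexicon into a target
    language via a bilingual dictionary (naive: every translation
    inherits the source word's label)."""
    target = {}
    for word, label in src_lexicon.items():
        for translation in bilingual_dict.get(word, []):
            target[translation] = label
    return target

# toy English lexicon and English-Romanian dictionary (illustrative)
en_lex = {"beautiful": "positive", "terrible": "negative"}
en_ro = {"beautiful": ["frumos"], "terrible": ["teribil", "groaznic"]}
print(project_lexicon(en_lex, en_ro))
# -> {'frumos': 'positive', 'teribil': 'negative', 'groaznic': 'negative'}
```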

References

C. Banea, R. Mihalcea, and J. Wiebe. 2008. A Bootstrapping Method for Building Subjectivity Lexicons for Languages with Scarce Resources. In Proceedings of LREC 2008, Marrakech, Morocco.

L. W. Ku, T. H. Wu, L. Y. Lee, and H. H. Chen. 2005. Construction of an Evaluation Corpus for Opinion Extraction. In Proceedings of NTCIR-5, Tokyo, Japan.

L. Xiang. 2011. Ideogram Based Chinese Sentiment Word Orientation Computation. Computing Research Repository, page 4, October.




Deep Learning for NLP (without Magic)

Richard Socher, Yoshua Bengio∗, Christopher D. Manning
[email protected], [email protected], [email protected]

Computer Science Department, Stanford University
∗ DIRO, Université de Montréal, Montréal, QC, Canada

1 Abstract

Machine learning is everywhere in today’s NLP, but by and large machine learning amounts to numerical optimization of weights for human-designed representations and features. The goal of deep learning is to explore how computers can take advantage of data to develop features and representations appropriate for complex interpretation tasks. This tutorial aims to cover the basic motivation, ideas, models and learning algorithms in deep learning for natural language processing. Recently, these methods have been shown to perform very well on various NLP tasks such as language modeling, POS tagging, named entity recognition, sentiment analysis and paraphrase detection, among others. The most attractive quality of these techniques is that they can perform well without any external hand-designed resources or time-intensive feature engineering. Despite these advantages, many researchers in NLP are not familiar with these methods. Our focus is on insight and understanding, using graphical illustrations and simple, intuitive derivations. The goal of the tutorial is to make the inner workings of these techniques transparent and intuitive and their results interpretable, rather than black boxes labeled “magic here”.

The first part of the tutorial presents the basics of neural networks, neural word vectors, several simple models based on local windows, and the math and algorithms of training via backpropagation. In this section, applications include language modeling and POS tagging.

In the second section we present recursive neural networks, which can learn structured tree outputs as well as vector representations for phrases and sentences. We cover both the equations and applications. We show how training can be achieved by a modified version of the backpropagation algorithm introduced before. These modifications allow the algorithm to work on tree structures. Applications include sentiment analysis and paraphrase detection. We also draw connections to recent work in semantic compositionality in vector spaces. The principal goal, again, is to make these methods appear intuitive and interpretable rather than mathematically confusing. By this point in the tutorial, the audience members should have a clear understanding of how to build a deep learning system for word-, sentence- and document-level tasks.
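The recursive composition described here can be sketched as a forward pass: leaves are word vectors, and each internal node of a binary parse is the nonlinear transform of its children's concatenated vectors, p = tanh(W[left; right] + b). Training by backpropagation through structure is omitted, and the dimensions, sentence, and random parameters below are purely illustrative.

```python
import math
import random

def compose(left, right, W, b):
    """One composition step: p = tanh(W [left; right] + b)."""
    x = left + right  # concatenate the two child vectors
    return [math.tanh(sum(W_i[j] * x[j] for j in range(len(x))) + b_i)
            for W_i, b_i in zip(W, b)]

def encode(tree, vecs, W, b):
    """Bottom-up encoding: strings are leaves (word vectors),
    pairs are internal nodes composed from their children."""
    if isinstance(tree, str):
        return vecs[tree]
    left = encode(tree[0], vecs, W, b)
    right = encode(tree[1], vecs, W, b)
    return compose(left, right, W, b)

random.seed(0)
d = 4  # toy vector dimension
vocab = ["the", "movie", "was", "great"]
vecs = {w: [random.uniform(-0.1, 0.1) for _ in range(d)] for w in vocab}
W = [[random.uniform(-0.1, 0.1) for _ in range(2 * d)] for _ in range(d)]
b = [0.0] * d

# binary parse ((the movie) (was great)) as nested tuples
sent = (("the", "movie"), ("was", "great"))
v = encode(sent, vecs, W, b)  # a d-dimensional sentence vector
```

The same `W` is reused at every node, so phrases of any size land in the same d-dimensional space; a classifier on top of `v` would give the sentence-level predictions the section mentions.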

The last part of the tutorial gives a general overview of the different applications of deep learning in NLP, including bag-of-words models. We will provide a discussion of NLP-oriented issues in modeling, interpretation, representational power, and optimization.




Graph-based Semi-Supervised Learning Algorithms for NLP

Amar Subramanya
Google Research
[email protected]

Partha Pratim Talukdar
Carnegie Mellon University
[email protected]

Abstract

While labeled data is expensive to prepare, ever-increasing amounts of unlabeled linguistic data are becoming widely available. In order to adapt to this phenomenon, several semi-supervised learning (SSL) algorithms, which learn from labeled as well as unlabeled data, have been developed. In a separate line of work, researchers have started to realize that graphs provide a natural way to represent data in a variety of domains. Graph-based SSL algorithms, which bring together these two lines of work, have been shown to outperform the state of the art in many applications in speech processing, computer vision and NLP. In particular, recent NLP research has successfully used graph-based SSL algorithms for PoS tagging (Subramanya et al., 2010), semantic parsing (Das and Smith, 2011), knowledge acquisition (Talukdar et al., 2008), sentiment analysis (Goldberg and Zhu, 2006) and text categorization (Subramanya and Bilmes, 2008).

Recognizing this promising and emerging area of research, this tutorial focuses on graph-based SSL algorithms (e.g., label propagation methods). The tutorial is intended to be a sequel to the ACL 2008 SSL tutorial, focusing exclusively on graph-based SSL methods and recent advances in this area, which were beyond the scope of the previous tutorial.

The tutorial is divided into two parts. In the first part, we will motivate the need for graph-based SSL methods, introduce some standard graph-based SSL algorithms, and discuss connections between these approaches. We will also discuss how linguistic data can be encoded as graphs and show how graph-based algorithms can be scaled to large amounts of data (e.g., web-scale data).
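The label propagation family mentioned above can be sketched in a few lines: clamp the labeled seed nodes and repeatedly replace each unlabeled node's label scores with the weighted average of its neighbors' scores over a similarity graph. The word graph and edge weights below are invented for illustration; real systems build such graphs from, e.g., distributional similarity.

```python
def label_propagation(edges, seeds, iters=50):
    """Simple iterative label propagation on an undirected weighted
    graph. `edges` = [(u, v, weight)], `seeds` = {node: label}.
    Seed nodes are clamped; all others average their neighbors."""
    nodes = sorted({n for u, v, _ in edges for n in (u, v)})
    nbrs = {n: [] for n in nodes}
    for u, v, w in edges:
        nbrs[u].append((v, w))
        nbrs[v].append((u, w))
    labels = sorted(set(seeds.values()))
    score = {n: {l: 0.0 for l in labels} for n in nodes}
    for n, l in seeds.items():
        score[n][l] = 1.0  # seed nodes start (and stay) at their label
    for _ in range(iters):
        for n in nodes:
            if n in seeds:
                continue  # clamp labeled nodes
            total = sum(w for _, w in nbrs[n]) or 1.0
            for l in labels:
                score[n][l] = sum(w * score[v][l] for v, w in nbrs[n]) / total
    return {n: max(score[n], key=score[n].get) for n in nodes}

# toy sentiment word graph (weights are made-up similarities)
edges = [("good", "great", 0.9), ("great", "nice", 0.8),
         ("bad", "awful", 0.9), ("awful", "poor", 0.7),
         ("nice", "poor", 0.1)]
seeds = {"good": "pos", "bad": "neg"}
print(label_propagation(edges, seeds))
```

With only two seeds, "pos" mass flows along good-great-nice and "neg" mass along bad-awful-poor, so the three unlabeled words on each side inherit the nearby seed's label, which is exactly the semi-supervised effect the abstract describes.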

Part 2 of the tutorial will focus on how graph-based methods can be used to solve several critical NLP tasks, including basic problems such as PoS tagging and semantic parsing, and more downstream tasks such as text categorization, information acquisition, and sentiment analysis. We will conclude the tutorial with some exciting avenues for future work.

Familiarity with semi-supervised learning and graph-based methods will not be assumed, and the necessary background will be provided. Examples from NLP tasks will be used throughout the tutorial to convey the necessary concepts. At the end of this tutorial, the attendee will walk away with the following:

• An in-depth knowledge of the current state of the art in graph-based SSL algorithms, and the ability to implement them.

• The ability to decide on the suitability of graph-based SSL methods for a problem.

• Familiarity with different NLP tasks where graph-based SSL methods have been successfully applied.

In addition to the above goals, we hope that this tutorial will better prepare the attendee to conduct exciting research at the intersection of NLP and other emerging areas with natural graph-structured data (e.g., computational social science).

Please visit http://graph-ssl.wikidot.com/ for details.

References

Dipanjan Das and Noah A. Smith. 2011. Semi-supervised frame-semantic parsing for unknown predicates. In Proceedings of the ACL: Human Language Technologies.

Andrew B. Goldberg and Xiaojin Zhu. 2006. Seeing stars when there aren’t many stars: graph-based semi-supervised learning for sentiment categorization. In Proceedings of the Workshop on Graph Based Methods for NLP.

Amarnag Subramanya and Jeff Bilmes. 2008. Soft-supervised text classification. In EMNLP.

Amarnag Subramanya, Slav Petrov, and Fernando Pereira. 2010. Graph-based semi-supervised learning of structured tagging models. In EMNLP.

Partha Pratim Talukdar, Joseph Reisinger, Marius Pasca, Deepak Ravichandran, Rahul Bhagat, and Fernando Pereira. 2008. Weakly supervised acquisition of labeled class instances using graph random walks. In EMNLP.



Author Index

Banea, Carmen, 4
Bengio, Yoshua, 5
Mani, Inderjeet, 1
Manning, Christopher D., 5
Mihalcea, Rada, 4
Moschitti, Alessandro, 2
Pustejovsky, James, 1
Socher, Richard, 5
Subramanya, Amar, 6
Talukdar, Partha Pratim, 6
Wiebe, Janyce, 4
Xing, Eric, 3
