Transcript of CS388: Natural Language Processing, Lecture 15: Semantics II / Seq2seq I

Page 1

CS388: Natural Language Processing

Greg Durrett

Lecture 15: Semantics II / Seq2seq I

credit: Nawaphon Isarathanachaikul on imgflip

Page 2

Administrivia

‣ Project 2 out today

‣ Mini 2 graded by tomorrow

‣ Final project feedback soon

Page 3

Recall: Parses to Logical Forms

[Figure: parse tree for "Lady Gaga sings and dances". NP "Lady Gaga" → e470; VBP "sings" → λy. sings(y); VBP "dances" → λy. dances(y)]

‣ Coordination rule: VP: λy. a(y) ∧ b(y) -> VP: λy. a(y) CC VP: λy. b(y), giving λy. sings(y) ∧ dances(y)

‣ General rules: S: f(x) -> NP: x VP: f, giving sings(e470) ∧ dances(e470)

Page 4

Recall: CCG

‣ Steedman + Szabolcsi, 1980s: formalism bridging syntax and semantics

‣ Syntactic categories (for this lecture): S, NP, "slash" categories

‣ S\NP: "if I combine with an NP on my left side, I form a sentence" (a verb)

‣ (S\NP)/NP: "I need an NP on my right and then one on my left" (a verb with a direct object)

Eminem: NP: e728    sings: S\NP: λy. sings(y)    ⇒    S: sings(e728)

Oklahoma: NP: e101    borders: (S\NP)/NP: λx. λy. borders(y, x)    Texas: NP: e89
⇒    borders Texas: S\NP: λy. borders(y, e89)    ⇒    S: borders(e101, e89)

Page 5

This Lecture

‣ Seq2seq models

‣ Seq2seq models for semantic parsing

‣ Intro to attention

Page 6

Encoder-Decoder Models

Page 7

Motivation

‣ Parsers have been pretty hard to build…

‣ Constituency/graph-based: complex dynamic programs

‣ Transition-based: complex transition systems

‣ CCG/semantic parsers: complex syntax/semantics interface, challenging inference, challenging learning

‣ For semantic parsing in particular: bridging the syntax-semantics divide results in structural weirdnesses in parsers, and it is hard to learn the right semantic grammar

‣ Encoder-decoder models can (in principle) predict any linearized sequence of tokens

Page 8

Encoder-Decoder

‣ Semantic parsing:

"what states border Texas" → λx. state(x) ∧ borders(x, e89)

‣ Syntactic parsing:

"the dog ran" → (S (NP (DT the) (NN dog) ) (VP (VBD ran) ) )

(but what if we produce an invalid tree or one with different words?) 🤔

‣ Machine translation, summarization, and dialogue can all be viewed in this framework as well

Page 9

Encoder-Decoder

‣ Encode a sequence into a fixed-sized vector

the movie was great

‣ Now use that vector to produce a series of tokens as output from a separate LSTM decoder

le film était bon [STOP]

Sutskever et al. (2014)

Page 10

Encoder-Decoder

‣ Is this true? Sort of… we'll come back to this later

Page 11

Model

‣ Generate the next word conditioned on the previous word as well as the hidden state

[Figure: encoder reads "the movie was great"; the decoder starts from <s>]

‣ W is |vocab| x |hidden state|, so this is a softmax over the entire vocabulary

‣ The decoder has separate parameters from the encoder, so it can learn to be a language model (produce a plausible next word given the current one)

P(y | x) = ∏_{i=1}^{n} P(y_i | x, y_1, …, y_{i-1})

P(y_i | x, y_1, …, y_{i-1}) = softmax(W h̄)
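A minimal sketch of this step in PyTorch (module names, sizes, and the token index are illustrative, not from the lecture):

```python
import torch
import torch.nn as nn

# Illustrative sizes; real systems are much larger.
vocab_size, embed_size, hidden_size = 1000, 64, 128

embed = nn.Embedding(vocab_size, embed_size)
decoder_cell = nn.LSTMCell(embed_size, hidden_size)
W = nn.Linear(hidden_size, vocab_size)   # the |vocab| x |hidden state| matrix

# One decoder step: previous token + previous hidden state -> distribution.
prev_token = torch.tensor([42])          # made-up index of the previous word
h = torch.zeros(1, hidden_size)          # would come from the encoder
c = torch.zeros(1, hidden_size)
h, c = decoder_cell(embed(prev_token), (h, c))
probs = torch.softmax(W(h), dim=-1)      # P(y_i | x, y_1, ..., y_{i-1}) = softmax(W h-bar)
```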

Page 12

Inference

‣ Generate the next word conditioned on the previous word as well as the hidden state

‣ During inference: need to compute the argmax over the word predictions and then feed that to the next RNN state

‣ Need to actually evaluate the computation graph up to this point to form the input for the next state

‣ The decoder is advanced one state at a time until [STOP] is reached (see the sketch below)

[Figure: encoder reads "the movie was great"; from <s>, the decoder emits le, film, était, bon, [STOP] one step at a time]
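A sketch of the greedy decoding loop, reusing embed, decoder_cell, and W from the snippet above (the start/stop token indices are assumptions):

```python
def greedy_decode(h, c, start_token=0, stop_token=1, max_len=50):
    """Advance the decoder one state at a time until [STOP] is reached,
    feeding each argmax prediction back in as the next input."""
    tokens = []
    prev = torch.tensor([start_token])            # <s>
    for _ in range(max_len):
        h, c = decoder_cell(embed(prev), (h, c))  # evaluate the graph up to here
        prev = W(h).argmax(dim=-1)                # argmax over word predictions
        if prev.item() == stop_token:
            break
        tokens.append(prev.item())
    return tokens
```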

Page 13

Implementing seq2seq Models

‣ Encoder: consumes a sequence of tokens, produces a vector. Analogous to encoders for classification/tagging tasks

‣ Decoder: separate module, single cell. Takes two inputs: hidden state (vector h or tuple (h, c)) and previous token. Outputs a token + a new state

[Figure: Encoder reads "the movie was great"; Decoder cells map <s> → le and le → film]
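A sketch of the encoder abstraction in the same PyTorch style, using an LSTM's final state as the fixed-size vector (all names and sizes are illustrative):

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Consumes a sequence of tokens and produces a vector: here, the final
    (h, c) state of an LSTM run over the embedded input."""
    def __init__(self, vocab_size=1000, embed_size=64, hidden_size=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_size)
        self.lstm = nn.LSTM(embed_size, hidden_size, batch_first=True)

    def forward(self, tokens):                    # tokens: [batch, src_len]
        _, (h, c) = self.lstm(self.embed(tokens))
        return h[0], c[0]                         # fixed-size summary: [batch, hidden]

# The decoder is seeded with this state, e.g.:
# h, c = Encoder()(torch.tensor([[4, 9, 2, 7]]))  # "the movie was great", say
```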

Page 14

Training

‣ Objective: maximize ∑_{(x,y)} ∑_{i=1}^{n} log P(y*_i | x, y*_1, …, y*_{i-1})

‣ One loss term for each target-sentence word; feed the correct word regardless of the model's prediction (called "teacher forcing")

[Figure: encoder reads "the movie was great"; decoder inputs are the gold words <s> le film était bon, with predictions le, …, était, …, [STOP]]
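A sketch of the per-example loss with teacher forcing, again reusing embed, decoder_cell, and W (the <s> index is an assumption):

```python
import torch
import torch.nn.functional as F

def teacher_forced_loss(h, c, gold, start_token=0):
    """Sum of -log P(y*_i | x, y*_1, ..., y*_{i-1}): one loss term per target
    word, always feeding the gold previous word, never the model's prediction."""
    loss = torch.zeros(())
    prev = torch.tensor([start_token])
    for y_star in gold:                           # gold token indices, ending in [STOP]
        h, c = decoder_cell(embed(prev), (h, c))
        log_probs = F.log_softmax(W(h), dim=-1)
        loss = loss - log_probs[0, y_star]
        prev = torch.tensor([y_star])             # teacher forcing: gold, not argmax
    return loss
```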

Page 15

Training: Scheduled Sampling

‣ Scheduled sampling: with probability p, take the gold token as input, else sample the model's prediction (see the sketch below)

‣ Starting with p = 1 (teacher forcing) and decaying it works best

[Figure: encoder reads "the movie was great"; the decoder emits "la film étais bon [STOP]" while each input (le, film, était, …) is either the gold word or a sample from the model]

‣ The model needs to do the right thing even with its own predictions

‣ The "right" thing: train with reinforcement learning

Bengio et al. (2015)
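Concretely, the change to the teacher-forcing loop above is one line: choose the next input stochastically. A sketch, with W and the decoder modules as before:

```python
import torch

def next_input(y_star, h, p):
    """Scheduled sampling: feed the gold token with probability p, else a
    token sampled from the model's own predicted distribution."""
    if torch.rand(()) < p:
        return torch.tensor([y_star])                  # teacher forcing
    probs = torch.softmax(W(h), dim=-1)
    return torch.multinomial(probs, num_samples=1)[0]  # model's sample
```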

Page 16

Implementation Details

‣ Sentence lengths vary for both the encoder and the decoder:

‣ Typically pad everything to the right length and use a mask or indexing to access a subset of terms (see the sketch below)

‣ Encoder: looks like what you did in Mini 2

‣ Decoder: execute one step of computation at a time, so the computation graph is formulated as taking one input + a hidden state

‣ Test time: do this until you generate the stop token

‣ Training: do this until you reach the gold stopping point
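A sketch of the padding-plus-mask idea (token indices made up):

```python
import torch

# Variable-length sequences padded to a common length, with a mask marking
# real positions (1) versus padding (0).
seqs = [[4, 9, 2, 7], [5, 3]]
max_len = max(len(s) for s in seqs)
padded = torch.tensor([s + [0] * (max_len - len(s)) for s in seqs])
mask = torch.tensor([[1] * len(s) + [0] * (max_len - len(s)) for s in seqs])
# Loss terms (or attention scores) at positions where mask == 0 get ignored.
```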

Page 17

Implementation Details (cont'd)

‣ Batching is pretty tricky: the decoder runs across timesteps, so you probably want your label tensors to look like [num timesteps x batch size x num labels] and iterate upward by timestep

‣ Beam search: can help with lookahead. Finds the (approximate) highest-scoring sequence:

argmax_y ∏_{i=1}^{n} P(y_i | x, y_1, …, y_{i-1})

Page 18

Beam Search

‣ Maintain decoder state and token history in the beam

[Figure: beam search with beam size 2 on input "the movie was great". From <s>, the candidates are la (0.4), le (0.3), les (0.1), scored log(0.4), log(0.3), log(0.1). Expanding: la → film (0.4) scores log(0.4) + log(0.4); le → film (0.8) scores log(0.3) + log(0.8), so "le film" now outranks "la film"]

‣ Keep both "film" states! The hidden state vectors are different
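A sketch of this procedure, reusing the decoder pieces from earlier; each hypothesis carries its own hidden state, which is exactly why both "film" states are kept:

```python
import heapq
import torch

def beam_search(h, c, beam_size=2, start_token=0, stop_token=1, max_len=50):
    """Maintain the beam_size best hypotheses, each carrying its running
    log-probability, token history, and its own decoder hidden state."""
    beams = [(0.0, [start_token], (h, c))]
    for _ in range(max_len):
        candidates = []
        for score, toks, state in beams:
            if toks[-1] == stop_token:            # finished hypothesis: keep as-is
                candidates.append((score, toks, state))
                continue
            h_i, c_i = decoder_cell(embed(torch.tensor([toks[-1]])), state)
            log_probs = torch.log_softmax(W(h_i), dim=-1)[0]
            top = torch.topk(log_probs, beam_size)
            for lp, idx in zip(top.values, top.indices):
                candidates.append((score + lp.item(), toks + [idx.item()], (h_i, c_i)))
        beams = heapq.nlargest(beam_size, candidates, key=lambda b: b[0])
        if all(b[1][-1] == stop_token for b in beams):
            break
    return beams[0][1]                            # highest-scoring token sequence
```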

Page 19

Other Architectures

‣ What's the basic abstraction here?

‣ Encoder: sentence -> vector

‣ Decoder: hidden state, output prefix -> new hidden state, new output

‣ OR: sentence, output prefix -> new output (more general)

‣ A wide variety of models can apply here: CNN encoders; decoders can be any autoregressive model, including certain types of CNNs

‣ Transformer: another model, discussed next lecture

Page 20

Seq2seq Semantic Parsing

Page 21

Semantic Parsing as Translation

‣ Write down a linearized form of the semantic parse, then train seq2seq models to directly translate into this representation:

"what states border Texas" → lambda x ( state( x ) and border( x , e89 ) ) )

‣ What might be some concerns about this approach? How do we mitigate them?

‣ What are some benefits of this approach compared to grammar-based parsing?

Jia and Liang (2016)

Page 22

Handling Invariances

"what states border Texas"    "what states border Ohio"

‣ Parsing-based approaches handle these the same way

‣ Possible divergences: features, different weights in the lexicon

‣ Can we get seq2seq semantic parsers to handle these the same way?

‣ Key idea: don't change the model, change the data

‣ "Data augmentation": encode invariances by automatically generating new training examples

Page 23

Data Augmentation

‣ Abstract out entities: now we can "remix" examples and encode invariance to entity IDs. More complicated remixes are possible too

‣ Lets us synthesize a "what states border ohio?" example (see the sketch below)

Jia and Liang (2016)
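A toy sketch of the abstract-and-remix idea (this is not Jia and Liang's actual pipeline, which induces a synchronous grammar; the placeholder name and the Ohio entity ID here are invented for illustration):

```python
# Toy sketch of entity abstraction for data augmentation. The entity table
# (and e93 for Ohio) is made up; the entity swap shown here is only the
# simplest recombination rule.
entities = {"texas": "e89", "ohio": "e93"}        # e93 is illustrative

def abstract(nl, lf, name, eid):
    """Replace a concrete entity with a typed placeholder."""
    return nl.replace(name, "STATE_ID"), lf.replace(eid, "STATE_ID")

def augment(nl_template, lf_template):
    """Remix: fill the placeholder with every known entity of that type."""
    return [(nl_template.replace("STATE_ID", name),
             lf_template.replace("STATE_ID", eid))
            for name, eid in entities.items()]

nl, lf = abstract("what states border texas",
                  "lambda x ( state( x ) and border( x , e89 ) )",
                  "texas", "e89")
examples = augment(nl, lf)  # includes a "what states border ohio" example
```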

Page 24

Semantic Parsing as Translation

‣ Prolog

‣ Lambda calculus

‣ Other DSLs

‣ Handle all of these with uniform machinery!

Jia and Liang (2016)

Page 25

Semantic Parsing as Translation

‣ Three forms of data augmentation all help

‣ Results on these tasks are still not as strong as hand-tuned systems from 10 years ago, but the same simple model can do well at all of these problems

Jia and Liang (2016)

Page 26

Regex Prediction

‣ Predict a regex from text

‣ Problem: requires a lot of data: 10,000 examples are needed to get ~60% accuracy on pretty simple regexes

‣ Does not scale when regex specifications are more abstract ("I want to recognize a decimal number less than 20")

Locascio et al. (2016)

Page 27

SQL Generation

‣ Convert a natural language description into a SQL query against some DB

‣ How to ensure that well-formed SQL is generated?

‣ Three seq2seq models

‣ How to capture column names + constants?

‣ Pointer mechanisms, to be discussed later

Zhong et al. (2017)

Page 28

Attention

"what states border Texas" → lambda x ( state( x ) and border( x , e89 ) ) )

‣ Not too hard to learn to generate: start with lambda, always follow it with x, follow that with a paren, etc.

‣ The orange-highlighted pieces of the output are probably reused across many problems

‣ But the LSTM has to remember the value of Texas for 13 steps!

‣ Next: attention mechanisms that let us "look back" at the input to avoid having to remember everything

Page 29

Attention

Page 30

Problems with Seq2seq Models

‣ Encoder-decoder models like to repeat themselves:

Un garçon joue dans la neige → A boy plays in the snow boy plays boy plays

‣ Why does this happen?

‣ Models are trained poorly

‣ The input is forgotten by the LSTM, so it gets stuck in a "loop" of generating the same output tokens again and again

‣ Need some notion of input coverage, i.e., which input words we've translated

Page 31

Problems with Seq2seq Models

‣ Bad at long sentences: 1) a fixed-size hidden representation doesn't scale; 2) LSTMs still have a hard time remembering things for really long periods of time

[Figure: performance by sentence length. RNNenc: the model we've discussed so far. RNNsearch: uses attention and holds up much better on long sentences]

Bahdanau et al. (2014)

Page 32

Problems with Seq2seq Models

‣ Unknown words:

‣ Encoding these rare words into a vector space is really hard

‣ In fact, we don't want to encode them; we want a way of directly looking back at the input and copying them (e.g., Pont-de-Buis)

Page 33

Aligned Inputs

[Figure: "the movie was great" aligned word by word with "le film était bon"; decoder inputs <s> le film était bon yield outputs le film était bon [STOP]]

‣ Suppose we knew the source and target would be translated word by word

‣ In that case, we could look at the corresponding input word when translating; this might improve handling of long sentences!

‣ How can we achieve this without hardcoding it?

Page 34

Attention

‣ At each decoder state, compute a distribution over source inputs based on the current decoder state

[Figure: decoder steps <s> and le each attend over the encoder states for "the movie was great"]

‣ Use the weighted sum of the input tokens to predict the output (see the sketch below)
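A sketch using dot-product scoring, which is one common choice (Bahdanau et al. instead score with a small feedforward network):

```python
import torch

def attend(decoder_state, encoder_states):
    """Score each source position against the current decoder state, softmax
    into a distribution over the input, and return the weighted sum."""
    # decoder_state: [hidden]; encoder_states: [src_len, hidden]
    scores = encoder_states @ decoder_state       # [src_len]
    weights = torch.softmax(scores, dim=-1)       # distribution over source tokens
    context = weights @ encoder_states            # weighted sum: [hidden]
    return context, weights

# e.g., 4 source positions ("the movie was great"), hidden size 8:
context, weights = attend(torch.randn(8), torch.randn(4, 8))
```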

Page 35

Takeaways

‣ Rather than combining syntax and semantics like in CCG, we can either parse to semantic representations directly or generate them with seq2seq models

‣ Seq2seq models are a very flexible framework; some weaknesses can potentially be patched with more data

‣ How to fix their shortcomings? Next time: attention, copying, and transformers