Deep Multi-Task Learning with Shared Memory
Pengfei Liu, Xipeng Qiu, Xuanjing Huang
EMNLP 2016 reading group, presenter: Ryosuke Miyazaki
Abstract
Due to their large number of parameters, neural models need a large-scale corpus.
→ unsupervised pre-training is effective
Multi-task learning also improves the final performance.
This paper proposes an LSTM with external memory for multi-task learning.
Model: ME-LSTM
Key vector, Erase vector, Add vector
Model: ME-LSTM
Reading operation
K segments, M dimensions per segment
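The reading operation can be sketched as content-based attention over the K memory segments. The sketch below assumes NTM-style cosine-similarity addressing; the function name, the sharpening scalar `beta`, and the random shapes are illustrative, not from the slides:

```python
import numpy as np

def read_memory(memory, key, beta=1.0):
    """Content-based read from an external memory (a sketch).

    memory: (K, M) array -- K segments, M dimensions per segment
    key:    (M,) key vector emitted by the LSTM
    beta:   sharpening scalar for the attention distribution
    """
    # cosine similarity between the key and every memory segment
    sim = memory @ key / (
        np.linalg.norm(memory, axis=1) * np.linalg.norm(key) + 1e-8)
    # softmax over segments gives the read attention weights
    weights = np.exp(beta * sim)
    weights /= weights.sum()
    # the read vector is the attention-weighted sum of segments
    return weights @ memory, weights

rng = np.random.default_rng(0)
memory = rng.standard_normal((8, 16))  # K=8 segments, M=16 dims each
key = rng.standard_normal(16)
r, w = read_memory(memory, key)
```

The read vector `r` has the segment dimensionality M, and the weights form a proper distribution over the K segments.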
Model: ME-LSTM
Deep fusion strategy
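One common form of such a fusion is a learned gate that mixes the LSTM hidden state with the memory read vector per dimension. This is a minimal sketch under that assumption; `W_h`, `W_r`, and `b` are hypothetical learned parameters, and the paper's exact parameterization may differ:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def deep_fusion(h_t, r_t, W_h, W_r, b):
    # fusion gate g_t: decides, per dimension, how much memory content to admit
    g_t = sigmoid(W_h @ h_t + W_r @ r_t + b)
    # fused state: a convex mix of the internal state and the memory read
    return g_t * r_t + (1.0 - g_t) * h_t

rng = np.random.default_rng(0)
h = rng.standard_normal(16)                 # LSTM hidden state
r = rng.standard_normal(16)                 # read vector from external memory
W_h = 0.1 * rng.standard_normal((16, 16))   # hypothetical gate parameters
W_r = 0.1 * rng.standard_normal((16, 16))
b = np.zeros(16)
fused = deep_fusion(h, r, W_h, W_r, b)
```

Because the gate lies in (0, 1), each fused dimension stays between the corresponding entries of `h` and `r`.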
Model: ME-LSTM
Writing operation
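Given the erase and add vectors named earlier, the write step can be sketched in the standard NTM erase-then-add form; this is a generic sketch of that scheme, not necessarily the paper's exact update:

```python
import numpy as np

def write_memory(memory, weights, erase, add):
    """One NTM-style write step (a sketch of the erase/add scheme).

    memory : (K, M) external memory
    weights: (K,) write attention over segments
    erase  : (M,) erase vector, entries in [0, 1]
    add    : (M,) add vector
    """
    memory = memory * (1.0 - np.outer(weights, erase))  # erase old content
    return memory + np.outer(weights, add)              # add new content

# a one-hot weight with a full erase replaces that segment outright
M0 = np.ones((4, 3))
w = np.array([0.0, 1.0, 0.0, 0.0])
new = write_memory(M0, w, erase=np.ones(3), add=np.array([5.0, 6.0, 7.0]))
```

Segments with zero attention weight are left untouched, so related tasks can write to different parts of the shared memory.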
Two architectures
ARC-1 ARC-2
Training
Task-specific output layer
Linear combination of the cost functions
λ_m is the weight for each task m
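The combined objective above is just the weighted sum of per-task costs; as a check (the loss values and weights here are made-up numbers for illustration):

```python
import numpy as np

def multitask_loss(task_losses, lambdas):
    # L = sum_m lambda_m * L_m, with lambda_m the weight of task m
    return float(np.dot(lambdas, task_losses))

total = multitask_loss(task_losses=[0.5, 1.0], lambdas=[0.7, 0.3])
```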
Experiment: text classification
Result: Movie
Result: Product
Analysis: Visualize the deep fusion gate
[Heatmap: sentiment score vs. dimensions of the deep fusion gate g_t; activated dimensions shown in black]
Analysis: Visualize the deep fusion gate
Conclusion
・This paper proposes two deep architectures for multi-task learning.
・They design an external memory to store knowledge shared by related tasks.
・The deep fusion strategy enables the model to incorporate the shared information.