Methodology: Text style transfer

A summary of methodologies (focusing mainly on papers from 2019)

1. Style Transformer: Unpaired Text Style Transfer without Disentangled Latent Representation

  • "encoder-decoder" framework
    • Hu et al., 2017 (Toward controlled generation of text)
    • Shen et al., 2017 (Style transfer from non-parallel text by cross-alignment)
    • Fu et al., 2018 (Style transfer in text: Exploration and evaluation)
    • Carlson et al., 2017
    • Zhang et al., 2018b,a (Style transfer as unsupervised machine translation)
    • Prabhumoye et al., 2018 (Style transfer through back-translation)
    • Jin et al., 2019
    • Melnyk et al., 2017
    • dos Santos et al., 2018
  • based on a learned latent representation
    • Shen et al. (2017)
    • Hu et al. (2017)
    • Fu et al., 2018
    • John et al., 2018
    • Zhang et al., 2018a,b
  • without manipulating latent representation
    • Xu et al. (2018) (Unpaired sentiment-to-sentiment translation: A cycled reinforcement learning approach.)
    • Li et al. (2018) (Delete, Retrieve, Generate: A Simple Approach to Sentiment and Style Transfer)
    • Lample et al. (2019) (Multiple-attribute text rewriting) 

2. Transforming Delete, Retrieve, Generate Approach for Controlled Text Style Transfer

  • learned latent representations to disentangle style and content from sentences
    • Hu et al., 2017
    • Shen et al., 2017
    • Fu et al., 2018
  • find that style attributes are often marked by distinctive words or phrases
    • Li et al. (2018) 
  • do not rely on a latent representation to separate content and attribute
    • Xu et al., 2018
    • Gong et al., 2019 (Reinforcement learning based text style transfer without parallel training corpus.)
    • Subramanian et al., 2018 (Multiple-attribute text style transfer.)
    • Li et al., 2018
  • methods that use attention weights to extract attribute significance
    • Feng et al., 2018; Li et al., 2016; Globerson and Roweis, 2006
    • including the salience deletion method of Li et al. (2018)
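The salience deletion method of Li et al. (2018) mentioned above can be sketched as a smoothed frequency ratio between two style corpora: a word that occurs far more often in one style's corpus than in the other's is treated as an attribute word. A minimal sketch (the toy corpora below are illustrative, not from any paper's data):

```python
from collections import Counter

def style_salience(word, src_counts, tgt_counts, smoothing=1.0):
    """Smoothed frequency ratio in the spirit of Li et al. (2018):
    a high score marks `word` as an attribute word of the source style."""
    return (src_counts[word] + smoothing) / (tgt_counts[word] + smoothing)

# Toy style corpora (illustrative only).
pos = ["the food was delicious", "delicious pastries and great service"]
neg = ["the food was terrible", "terrible service and slow staff"]
pos_counts = Counter(w for s in pos for w in s.split())
neg_counts = Counter(w for s in neg for w in s.split())

print(style_salience("delicious", pos_counts, neg_counts))  # 3.0: positive-attribute word
print(style_salience("food", pos_counts, neg_counts))       # 1.0: style-neutral content word
```

Words whose salience exceeds a tuned threshold are labeled attribute markers; attention-based variants (Sudhakar et al.) replace this count ratio with a classifier's attention weights.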

3. A Dual Reinforcement Learning Framework for Unsupervised Text Style Transfer

  • learns a style-independent content representation vector via adversarial training, and then passes it to a style-dependent decoder for rephrasing
    • Shen et al., 2017
    • Fu et al., 2018
    • Hu et al., 2017
    • Prabhumoye et al., 2018 (Style transfer through back-translation)
  • directly removes the specific style-attribute words from the input, then feeds the neutralized sequence, which contains only content words, to a style-dependent generation model
    • Li et al., 2018
    • Xu et al., 2018
  • learn style-independent content representation
    • Fu et al., 2018
    • Shen et al., 2017
    • Hu et al., 2017
    • Yang et al., 2018b (Unsupervised text style transfer using language models as discriminators)
    • Prabhumoye et al., 2018
  • easy to fool the discriminator without actually removing the style information
    • Li et al., 2017
    • Lample et al., 2019
  • propose to separate content and style by directly removing the style words
    • Li et al., 2018
    • Zhang et al., 2018a (Learning sentiment memories for sentiment modification without parallel data)
    • Xu et al., 2018
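The delete step shared by these approaches (Li et al., 2018; Xu et al., 2018; Zhang et al., 2018a) can be sketched as dropping words whose frequency ratio between the two style corpora exceeds a threshold, leaving the neutralized content sequence. A minimal sketch, with illustrative toy corpora and threshold (a real system would then pass the result to a style-dependent generator):

```python
from collections import Counter

def delete_style_words(sentence, style_corpus, other_corpus, gamma=2.0):
    """Sketch of the 'delete' step: drop words that are much more frequent
    in the source-style corpus than in the other style's corpus, keeping
    only the style-neutral content words."""
    src = Counter(w for s in style_corpus for w in s.split())
    other = Counter(w for s in other_corpus for w in s.split())
    kept = [w for w in sentence.split()
            if (src[w] + 1) / (other[w] + 1) < gamma]
    return " ".join(kept)

# Toy corpora (illustrative only).
neg = ["the service was terrible", "terrible food and rude staff"]
pos = ["the service was great", "friendly staff and great food"]

print(delete_style_words("the staff was terrible and rude", neg, pos))
# prints "the staff was and" — attribute words removed, content retained
```

The neutralized output is deliberately ungrammatical; the downstream generator (or retrieved target-style phrases) is responsible for producing a fluent sentence in the target style.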

4. Mask and Infill: Applying Masked Language Model to Sentiment Transfer

  • try to learn the disentangled representation of content and attribute of a sentence in a hidden space
    • Shen et al., 2017
    • Prabhumoye et al., 2018
    • Fu et al., 2018
  • explicitly separate style from content in feature-based ways and encode them into hidden representations respectively
    • Xu et al., 2018
    • Li et al., 2018
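The two-step pipeline of this paper can be sketched as: mask the source-style attribute words, then let a target-style masked language model fill each slot. In the sketch below the `infill` callable is a stand-in (my assumption) for the real masked LM the paper fine-tunes on the target-style corpus:

```python
def mask_and_infill(sentence, attribute_words, infill):
    """Sketch of the Mask & Infill pipeline: (1) replace source-style
    attribute words with [MASK]; (2) let a target-style masked LM
    (stubbed here by `infill`) predict a word for each masked slot."""
    masked = ["[MASK]" if w in attribute_words else w
              for w in sentence.split()]
    return " ".join(infill(i, masked) if w == "[MASK]" else w
                    for i, w in enumerate(masked))

# Stand-in for the masked LM: always predicts one positive word.
toy_infill = lambda i, tokens: "delicious"

print(mask_and_infill("the food was terrible", {"terrible"}, toy_infill))
# prints "the food was delicious"
```

Because the content words are never touched, content preservation comes for free; the quality of the transfer rests entirely on the masked LM's target-style predictions.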

5. Domain Adaptive Text Style Transfer

  • explored this direction by assuming the disentanglement can be achieved in an auto-encoding procedure with a suitable style regularization, implemented by either adversarial discriminators or style classifiers.
    • Hu et al. (2017)
    • Fu et al. (2018)
    • Shen et al. (2017)
    • Yang et al. (2018)
    • Gong et al. (2019)
    • Lin et al. (2017) 
  • achieved disentanglement by filtering the stylistic words of input sentences
    • Li et al. (2018)
    • Xu et al. (2018)
    • Zhang et al. (2018c) 
  • proposed to use back-translation for text style transfer
    • Prabhumoye et al. (2018) 

6. Overall summary

  • learned latent representations to disentangle style and content from sentences
    • hu2017toward (Toward controlled generation of text)
    • shen2017style (Style transfer from non-parallel text by cross-alignment)
    • fu2018style (Style transfer in text: Exploration and evaluation)
    • prabhumoye-etal-2018-style (Style transfer through back-translation)
    • logeswaran2018content (Content preserving text generation with attribute controls)
  • without manipulating latent representation
    • Delete-based approaches
      • Li et al. (2018) (Delete, Retrieve, Generate: A Simple Approach to Sentiment and Style Transfer)
      • Sudhakar et al., 2019 (Transforming Delete, Retrieve, Generate Approach for Controlled Text Style Transfer)
      • Zhang et al., 2018a (Learning sentiment memories for sentiment modification without parallel data)
      • Xu et al. (2018) (Unpaired sentiment-to-sentiment translation: A cycled reinforcement learning approach.)
      • Wu et al., 2019 (Mask and Infill: Applying Masked Language Model to Sentiment Transfer)
    • Approaches that do not remove style words at all
      • dai-etal-2019-style (Style Transformer: Unpaired Text Style Transfer without Disentangled Latent Representation)
      • subramanian2018multiple (Lample) (Multiple-attribute text style transfer.)
      • Gong et al., 2019 (Reinforcement learning based text style transfer without parallel training corpus.)
      • ijcai2019-711 (A Dual Reinforcement Learning Framework for Unsupervised Text Style Transfer)
