Hi, I'm Kayo
Bonjour, je suis Kayo
今日は、かよと申します 

[kajo iɴ]
she/her
kayoyin🥸berkeley.edu
Curriculum Vitae


Anonymous feedback
I'm a PhD student at UC Berkeley working on machine learning and natural language processing. I am co-advised by Jacob Steinhardt and Dan Klein, and I am a member of Berkeley AI Research and Berkeley NLP.

I interned at DeepMind, where I worked with Chris Dyer; I did my master's at Carnegie Mellon University, where I was advised by Graham Neubig; and I did my undergrad at École Polytechnique. In a previous life, I wanted to become a classical musician, and I hold a CEM from the Conservatoire Frédéric Chopin.

I come from Akashi, Japan and grew up in Paris, France. I like playing music, martial arts, backcountry snowboarding, outdoor climbing, solo traveling, hiking quickly, running slowly, and making oddly specific Spotify playlists.

  • 2022-08-19 Gave an invited talk at the Workshop on Pronouns and Machine Translation.
  • 2022-07-27 Gave an invited presentation at IJCAI on Including Signed Languages in NLP. My first in-person conference yay!
  • 2022-07-09 Gave a keynote talk at the Queer in AI Workshop @NAACL.
  • 2022-06-06 I started my internship at DeepMind! If you're in London this summer, let's meet up :)
  • 2022-05-19 I was invited to the NLP Highlights Podcast for their episode on Including Signed Languages in NLP!
  • 2022-04-15 I will join UC Berkeley for my PhD next Fall!
  • 2021-11-05 Gave an invited talk at DeepMind on Natural Language Processing for Signed Languages.
  • 2021-10-07 Gave an invited talk at University of Pittsburgh on Extending Neural Machine Translation to Dialogue and Signed Languages.
  • 2021-09-23 Extremely honored to be selected as a Siebel Scholar Class of 2022!
  • 2021-09-17 Gave an invited talk at SIGTYP on Understanding, Improving and Evaluating Context Usage in Context-aware Machine Translation.
  • 2021-07-05 Extremely thrilled to receive the Best Theme Paper award at ACL 2021!
  • 2021-03-01 Gave an invited talk at Unbabel on Do Context-Aware Translation Models Pay the Right Attention?
  • 2020-10-18 Gave an invited talk at Computer Vision Talks on Sign Language Translation with Transformers
  • 2020-09-21 Extremely honored to be awarded Global Winner in Computer Science at The Global Undergraduate Awards 2020!
  • 2020-08-31 Started my Master's degree at CMU LTI!
    Research

    My current research interests are as difficult to pin down as my music taste. I am generally motivated by ideas that 1) improve our understanding of how things (e.g. black-box models, natural languages, cognition, the universe...) work, 2) promote safety, inclusivity, and fairness, and 3) offer elegant solutions to really hard problems that will hold up in the long run.

    Here are some topics that I have worked on:

  • Model Interpretability: I am interested in understanding how current NLP models use contextual information to make decisions during language generation (EMNLP'22), and in using this understanding to guide models to process context more efficiently. I am also interested in how neural networks process language at a fundamental level and how machine intelligence compares with human cognition.

  • Sign Language Processing: I am interested in modeling signed languages from a linguistic perspective and extending existing language technologies to signed languages. I have argued for the importance, both social and scientific, of including signed languages in NLP and for how the community can get involved (ACL'21). I have also researched how to translate a signed language into a spoken language (ECCV'20, COLING'20), how to perform data augmentation for Sign Language Translation (MTSummit'21), and how to perform coreference resolution of pronominal pointing signs (EMNLP'21).

  • Context-aware Machine Translation: I am interested in when context, whether intra-sentential (within the current sentence), inter-sentential (across multiple sentences), or extra-linguistic (e.g. social, temporal, cultural), is required during translation, and in how to model these features in machine translation. My work examines when translation requires context and how well models perform on these translations (arXiv'21), how much context-aware models actually use context (ACL'21), whether models use the type of context we expect them to (ACL'21), and how to encourage models to use more and better context.


    Please reach out if you'd like to chat or collaborate! I am generally responsive to email and Twitter messages, but I do not check LinkedIn very often, and I prefer not to be contacted on social media that I have not listed on this website.

    Publications


    * = equal contribution

    2022

    • Interpreting Language Models with Contrastive Explanations
      Kayo Yin and Graham Neubig.
      Conference on Empirical Methods in Natural Language Processing (EMNLP). December 2022.
      PDF Code

    2021

    • Signed Coreference Resolution
      Kayo Yin, Kenneth DeHaan and Malihe Alikhani.
      Conference on Empirical Methods in Natural Language Processing (EMNLP). November 2021.
      PDF Code Video

    • When is Wall a Pared and when a Muro?: Extracting Rules Governing Lexical Selection
      Aditi Chaudhary, Kayo Yin, Antonios Anastasopoulos and Graham Neubig.
      Conference on Empirical Methods in Natural Language Processing (EMNLP). November 2021.
      PDF Code

    • When Does Translation Require Context? A Data-driven, Multilingual Exploration
      Kayo Yin*, Patrick Fernandes*, André F. T. Martins and Graham Neubig.
      In submission.
      PDF Code

    • 🏆 Best Theme Paper: Including Signed Languages in Natural Language Processing
      Kayo Yin, Amit Moryossef, Julie Hochgesang, Yoav Goldberg and Malihe Alikhani.
      Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP). August 2021.
      PDF Video

    • Do Context-Aware Translation Models Pay the Right Attention?
      Kayo Yin, Patrick Fernandes, Danish Pruthi, Aditi Chaudhary, André F. T. Martins and Graham Neubig.
      Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP). August 2021.
      PDF Code Video

    • Measuring and Increasing Context Usage in Context-Aware Machine Translation
      Patrick Fernandes, Kayo Yin, Graham Neubig and André F. T. Martins.
      Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP). August 2021.
      PDF Code

    • Data Augmentation for Sign Language Gloss Translation
      Amit Moryossef*, Kayo Yin*, Graham Neubig and Yoav Goldberg.
      18th Biennial Machine Translation Summit (MTSummit), 1st International Workshop on Automatic Translation for Signed and Spoken Languages (AT4SSL). August 2021.
      PDF

    2020

    • Better Sign Language Translation with STMC-Transformer
      Kayo Yin and Jesse Read.
      Proceedings of the 28th International Conference on Computational Linguistics (COLING). November 2020.
      PDF Code

    • Sign Language Translation with Transformers
      Kayo Yin and Jesse Read.
      European Conference on Computer Vision (ECCV) Workshop on Sign Language Recognition, Translation and Production (SLRTP). August 2020.
      PDF Code Video

    Awards



    Copyright © Kayo Yin 2021-2022
    Last updated October 14 2022