Mo Yu

Ph.D. Student

Machine Intelligence & Translation Lab

School of Computer Science and Technology

Harbin Institute of Technology

E-mail:

yumo AT mtlab DOT edu DOT cn

I have moved to a new website: https://sites.google.com/site/moyunlp/

Education

  • 2011.9 to present: Ph.D. in Computer Science and Technology, School of Computer Science and Technology, Harbin Institute of Technology. Advisor: Tie-Jun Zhao
  • 2009.9 to 2011.7: M.S. in Computer Science and Technology, School of Computer Science and Technology, Harbin Institute of Technology. Advisor: Tie-Jun Zhao
  • 2005.8 to 2009.7: B.S. in Computer Science and Technology, School of Honor, Harbin Institute of Technology.
Internship and Work Experience

    • The Center for Language and Speech Processing, Johns Hopkins University (2013.12 to present)

    I am currently a joint Ph.D. student at JHU, working on learning structured representations for NLP. My advisors are Prof. Mark Dredze and Prof. Raman Arora.

    • Baidu Inc., Natural Language Processing Group (2013.7-2013.12)

    I was a research intern working on dependency parsing, learning of syntactic representations, and online learning algorithms for structured prediction problems. Part of my work was advised by Prof. Tong Zhang.

    • Baidu Inc., Natural Language Processing Group (2012.5-2013.6)

    I was a research intern working on deep learning for NLP. I received the “Excellent Intern” award in December 2012.

    • Microsoft Research Asia, Natural Language Computing Group (2010.08 to 2010.12)

    I was an intern working on Twitter sentiment analysis, supervised by Long Jiang.

    • Microsoft Research Asia, Web Search & Mining Group (2008.12 to 2009.8)

    I was an intern working on intelligent advertising and on re-ranking image retrieval results, supervised by Dr. Xin-Jing Wang. My work earned an “excellent” certification.

Research Interests

    I am mainly interested in machine learning, syntactic parsing, and their applications.

    • Representation Learning for NLP Problems

    Structured representations and their applications

    Working on learning structured representations and their applications to relation extraction and phrase similarity.

    Learning of word embedding and its applications

    Working on learning strategies for word embeddings, and applying word embeddings to NER, Chinese POS tagging, and cross-lingual projection tasks.

    • Syntactic Parsing and its applications

    Online learning for dependency parsing

    Improving online learning algorithms and distributed online learning algorithms for dependency parsing.

    Domain adaptation for dependency parsing

    Analyzed the performance of graph-based and transition-based dependency parsers and the domain differences in dependency parsing; researched domain adaptation methods for dependency parsing.

    Target-dependent and Context-based Sentiment Analysis

    Researched target-dependent (using information from dependency parsing) and context-based (using conditional random fields) sentiment analysis for tweets.

    • Machine Learning and its Applications to NLP

    Online learning and its applications to NLP

    Working on online learning methods for structured prediction tasks, parallel online learning, and other general online learning techniques.

    Semi-supervised learning and its applications to NLP

    Extended semi-supervised learning methods (e.g., co-training and label propagation) to structured prediction problems, applied them to cross-lingual projection and other NLP tasks, and carried out theoretical analysis of the experimental results.

    • Image Retrieval

    Re-ranking of image retrieval results based on Dirichlet Processes (DP)

    Studied the principles of DP and worked on multi-view image clustering (combining textual and visual information) with DP and its application to image search re-ranking.

    Advertising based on users’ photo collections

    Extracted topics from images based on image annotation results and the surrounding text of each image, then matched the images with adverts in the topic space.

    Publications

    Learning representations for NLP tasks:

  • Mo Yu, Matthew Gormley, Mark Dredze. Factor-based Compositional Embedding Models. The NIPS 2014 workshop on Learning Semantics. [extended abstract] [code & data]
  • Mo Yu, Mark Dredze. Improving Lexical Embeddings with Semantic Knowledge. Association for Computational Linguistics (ACL), 2014 short (accepted). [pdf] [code & data]
  • Mo Yu, Tiejun Zhao, Yalong Bai, Hao Tian, Dianhai Yu. Cross-lingual Projections between Languages from Different Families. ACL 2013 short paper. [pdf]
  • Mo Yu, Tiejun Zhao, Daxiang Dong, Hao Tian, Dianhai Yu: Compound Embedding Features for Semi-supervised Learning. NAACL 2013 short paper. [pdf]
    Syntactic parsing and its applications:

  • Mo Yu, Tiejun Zhao, Yalong Bai. Learning Domain Differences Automatically for Dependency Parsing Adaptation. IJCAI 2013 poster. [pdf]
  • Long Jiang, Mo Yu, Ming Zhou, Xiaohua Liu, Tiejun Zhao: Target-dependent Twitter Sentiment Classification. ACL 2011: 151-160
  • Zhen Li, Hanjing Li, Mo Yu, Tiejun Zhao, Sheng Li: Event Entailment Extraction Based on EM Iteration. IALP 2010: 101-104
    Machine learning and its applications to NLP:

  • Tuo Zhao*, Mo Yu*, Yiming Wang, Raman Arora, Han Liu. Accelerated Mini-batch Randomized Block Coordinate Descent Method. NIPS 2014. [pdf] (Both authors contributed equally.)
  • Mo Yu, Tiejun Zhao, Penglong Hu. A Theoretical Analysis on Structured Learning with Noisy Data and its Applications. Journal of Software. 2013 (In Chinese). [pdf]
  • Lemao Liu, Hailong Cao, Taro Watanabe, Tiejun Zhao, Mo Yu, Conghui Zhu. Locally Training the Log-Linear Model for SMT. EMNLP-CoNLL 2012.
  • Huashen Liang, Lemao Liu, Mo Yu, Yupeng Liu, Penglong Hu, Tingting Li, Chunyue Zhang, Hailong Cao, Tiejun Zhao. Technique reports of HIT-Machine Intelligence & Translation Lab for CWMT2011 (In Chinese)
  • Mo Yu, Shu Wang, Conghui Zhu, Tiejun Zhao: Semi-supervised learning for word sense disambiguation using parallel corpora. FSKD 2011: 1490-1494
  • PengLong Hu, Mo Yu, Jing Li, Conghui Zhu, Tiejun Zhao: Semi-supervised Learning Framework for Cross-Lingual Projection. Web Intelligence/IAT Workshops 2011: 213-216
  • Han Xi-Wu, Yu Mo, Zhu Cong-Hui, and Zhao Tie-Jun. A Sequence Kernel Method for Chinese Subcategorization Analysis. Chinese Journal of Electronics. Vol.19, No.3, July 2010.
  • Cong-Hui Zhu, Mo Yu, Tie-Jun Zhao. Chinese Word Segmenter Based on Discriminative Classifiers Integration. Journal of Computational Information Systems, 3:5 (2008), 1-7.
    Image retrieval:

  • Yu-Heng Ren, Mo Yu, Xin-Jing Wang, Lei Zhang, Wei-Ying Ma. Diversifying Landmark Image Search Results by Learning Interested Views from Community Photos. WWW 2010 Proceedings.
  • Xin-Jing Wang, Mo Yu, Lei Zhang, Wei-Ying Ma, Argo: Intelligent Advertising Made Possible from Users’ Photos (demo paper). ACM MM09 Proceedings.
  • Xin-Jing Wang, Mo Yu, Lei Zhang, Wei-Ying Ma. Advertising based on users’ photos. IEEE ICME 2009 workshop.
  • Xin-Jing Wang, Mo Yu, Lei Zhang, Rui Cai, Wei-Ying Ma. Argo: Intelligent Advertising by Mining a User’s Interest from His Photo Collections. ADKDD’09, June 28, 2009
Research Proposal & Funding

    With my advisor, I wrote a proposal to the Natural Science Foundation of China on “Cross-lingual projection based on semi-supervised structured learning,” which was funded (NSFC 61173073) in 2012. I am now in charge of the work related to this grant in my lab.

    Skills & Expertise

    1. Machine Learning
    2. Natural Language Processing
    3. Deep Learning
    4. Representation Learning
    5. Relation Extraction
    6. Semantic Similarity
    7. Cross-lingual Projection
    8. Sentiment Analysis
    9. Named Entity Recognition
    10. Online Learning
    11. Multi-view Learning
    12. Domain Adaptation
    13. Dependency Parsing
    14. Machine Translation
    15. Information Retrieval
    16. C/C++
    17. C#
    18. Perl
    19. Java
    20. Python
    21. MATLAB