A special issue of the Natural Language Engineering journal
17 July 2018, by Reinhard Zierke
The Language Technology group is involved in the organization of a special issue of the Natural Language Engineering journal on informing neural architectures for NLP with linguistic and background knowledge.
There has been a huge amount of research on the use of deep neural architectures for Natural Language Processing (NLP). In recent years (roughly since 2010), the proceedings of all major conferences in Artificial Intelligence and Computational Linguistics, including AAAI, IJCAI, ACL, NAACL, NIPS, and ICLR, have included a substantial number of contributions on deep neural networks applied to NLP. Although automatic representation learning, as opposed to manual feature engineering, has become the de facto standard methodological framework, linguistic knowledge has not become obsolete. Such knowledge, encoded informally in the form of human expertise and intuitions about language, but also formally in large symbolic linguistic resources, remains an invaluable resource that most modern NLP technologies require to reach peak performance.

This special issue aims to collect state-of-the-art contributions on the development and use of linguistic and background knowledge for neural architectures in NLP. Examples include task-specific objective functions informed by linguistic knowledge, and the use of linguistic resources such as lexical knowledge bases and multilingual dictionaries to generate training data for neural architectures as well as to specialize and improve text representations.
The deadline for submissions is 1 November 2018.