Search for dissertations about: "Neural language model"

Showing results 1 - 5 of 31 Swedish dissertations containing the words Neural language model.

  1. Why the pond is not outside the frog? Grounding in contextual representations by neural language models

    Author : Mehdi Ghanimifard; Göteborgs universitet; []
    Keywords : NATURVETENSKAP; NATURAL SCIENCES; Computational linguistics; Language grounding; Spatial language; Distributional semantics; Computer vision; Language modelling; Vision and language; Neural language model; Grounded language model;

    Abstract : In this thesis, we study grounded neural language models with the aim of building a multi-modal system for language generation and understanding. The literature in psychology informs us that spatial cognition involves different aspects of knowledge, including visual perception and human interaction with the world.

  2. Representation learning for natural language

    Author : Olof Mogren; RISE; []
    Keywords : NATURVETENSKAP; NATURAL SCIENCES; artificial neural networks; artificial intelligence; natural language processing; deep learning; machine learning; summarization; representation learning;

    Abstract : Artificial neural networks have obtained astonishing results in a wide range of tasks. One of the reasons for their success is their ability to learn the whole task at once (end-to-end learning), including the representations for the data.

  3. Word Sense Embedded in Geometric Spaces - From Induction to Applications using Machine Learning

    Author : Mikael Kågebäck; Chalmers tekniska högskola; []
    Keywords : NATURVETENSKAP; NATURAL SCIENCES; HUMANIORA; HUMANITIES; word sense induction; word sense disambiguation; word embeddings; extractive summarisation; neural networks; deep learning; natural language processing; reinforcement learning;

    Abstract : Words are not detached individuals but part of a beautiful interconnected web of related concepts, and to capture the full complexity of this web they need to be represented in a way that encapsulates all the semantic and syntactic facets of the language. Further, to enable computational processing, they need to be expressed in a consistent manner so that similar properties are encoded in a similar way.

  4. The Search for Syntax : Investigating the Syntactic Knowledge of Neural Language Models Through the Lens of Dependency Parsing

    Author : Artur Kulmizev; Joakim Nivre; Roger Levy; Uppsala universitet; []
    Keywords : NATURVETENSKAP; NATURAL SCIENCES; syntax; language models; dependency parsing; universal dependencies; Datorlingvistik; Computational Linguistics;

    Abstract : Syntax — the study of the hierarchical structure of language — has long featured as a prominent research topic in the field of natural language processing (NLP). Traditionally, its role in NLP was confined to developing parsers: supervised algorithms tasked with predicting the structure of utterances (often for use in downstream applications).

  5. Segmenting and Tagging Text with Neural Networks

    Author : Yan Shao; Joakim Nivre; Jörg Tiedemann; Christian Hardmeier; Yue Zhang; Uppsala universitet; []
    Keywords : NATURVETENSKAP; NATURAL SCIENCES; neural networks; sequence labelling; multilinguality; word segmentation; sentence segmentation; morpheme segmentation; transliteration; joint word segmentation and POS tagging;

    Abstract : Segmentation and tagging of text are important preprocessing steps for higher-level natural language processing tasks. In this thesis, we apply a sequence labelling framework based on neural networks to various segmentation and tagging tasks, including sentence segmentation, word segmentation, morpheme segmentation, joint word segmentation and part-of-speech tagging, and named entity transliteration.