[PDF] Low-Dimensional Structure in the Space of Language Representations is Reflected in Brain Responses
Description
How related are the representations learned by neural language models, translation models, and language tagging tasks? We answer this question by adapting an encoder-decoder transfer learning method from computer vision to investigate the structure among 100 different feature spaces extracted from hidden representations of various networks trained on language tasks. This method reveals a low-dimensional structure where language models and translation models smoothly interpolate between word embeddings, syntactic and semantic tasks, and future word embeddings. We call this low-dimensional structure a language representation embedding because it encodes the relationships between representations needed to process language for a variety of NLP tasks. We find that this representation embedding can predict how well each individual feature space maps to human brain responses to natural language stimuli recorded using fMRI. Additionally, we find that the principal dimension of this structure can be used to create a metric which highlights the brain's natural language processing hierarchy. This suggests that the embedding captures some part of the brain's natural language representation structure.
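To make the general idea concrete, here is a minimal sketch (not the authors' exact pipeline) of one way to build such a representation embedding: measure how well each feature space can be linearly mapped onto every other, then take the leading principal components of that transfer matrix. The feature-space shapes, the ridge regression, the train/test split, and the PCA step are all assumptions for illustration.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.decomposition import PCA

# Hypothetical inputs: each feature space is an (n_tokens, dim) matrix of
# representations for the same token sequence (shapes are placeholders).
rng = np.random.default_rng(0)
feature_spaces = [rng.standard_normal((1000, d)) for d in (64, 128, 256, 96)]

n = len(feature_spaces)
transfer = np.zeros((n, n))  # transfer[i, j] = R^2 of predicting space j from space i

for i, Xi in enumerate(feature_spaces):
    for j, Xj in enumerate(feature_spaces):
        if i == j:
            transfer[i, j] = 1.0
            continue
        # Fit a ridge map from space i to space j on the first half of the data,
        # then score its generalization on the held-out second half.
        split = Xi.shape[0] // 2
        model = Ridge(alpha=1.0).fit(Xi[:split], Xj[:split])
        transfer[i, j] = model.score(Xi[split:], Xj[split:])

# Transferability is asymmetric, so symmetrize before embedding; the leading
# principal components give a low-dimensional "representation embedding" in
# which similar feature spaces land near one another.
sym = 0.5 * (transfer + transfer.T)
embedding = PCA(n_components=2).fit_transform(sym)
print(embedding)

With real model activations rather than random matrices, nearby points in this embedding would correspond to feature spaces that transfer well to one another, which is the sense in which the abstract describes language and translation models interpolating between word embeddings and task-specific representations.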
Shared neural representations and temporal segmentation of political content predict ideological similarity
When self comes to a wandering mind: Brain representations and dynamics of self-generated concepts in spontaneous thought
Autoencoders in Deep Learning: Tutorial & Use Cases [2023]
Considerations for constructing a protein sequence database for metaproteomics - Computational and Structural Biotechnology Journal
Neural representation of linguistic feature hierarchy reflects second-language proficiency - ScienceDirect
Physics-Inspired Structural Representations for Molecules and Materials
Machine learning in the analysis of biomolecular simulations
Conscious cognitive effort in cognitive control - Shepherd - 2023 - WIREs Cognitive Science - Wiley Online Library
What Is ChatGPT Doing … and Why Does It Work?—Stephen Wolfram Writings
Knowledge Across Reference Frames: Cognitive Maps and Image Spaces: Trends in Cognitive Sciences
Abstract representations emerge naturally in neural networks trained to perform multiple tasks