Search Results - "Vo, Vy A"
1. Value-driven attentional capture enhances distractor representations in early visual cortex
Published in PLoS Biology (09-08-2019). “…When a behaviorally relevant stimulus has been previously associated with reward, behavioral responses are faster and more accurate compared to equally…”
Journal Article
2. Dissociable signatures of visual salience and behavioral relevance across attentional priority maps in human cortex
Published in Journal of Neurophysiology (01-06-2018). “…Computational models posit that visual attention is guided by activity within spatial maps that index the image-computable salience and the behavioral…”
Journal Article
3. Spatial Tuning Shifts Increase the Discriminability and Fidelity of Population Codes in Visual Cortex
Published in The Journal of Neuroscience (22-03-2017). “…Selective visual attention enables organisms to enhance the representation of behaviorally relevant stimuli by altering the encoding properties of single…”
Journal Article
4. Young Children Bet on Their Numerical Skills: Metacognition in the Numerical Domain
Published in Psychological Science (01-09-2014). “…Metacognition, the ability to assess one's own knowledge, has been targeted as a critical learning mechanism in mathematics education. Yet the early childhood…”
Journal Article
5. Computational Language Modeling and the Promise of In Silico Experimentation
Published in Neurobiology of Language (01-04-2024). “…Language neuroscience currently relies on two major experimental paradigms: controlled experiments using carefully hand-designed stimuli, and natural stimulus…”
Journal Article
6. Shared Representational Formats for Information Maintained in Working Memory and Information Retrieved from Long-Term Memory
Published in Cerebral Cortex (19-02-2022). “…Current theories propose that the short-term retention of information in working memory (WM) and the recall of information from long-term memory (LTM)…”
Journal Article
7. Inverted Encoding Models Assay Population-Level Stimulus Representations, Not Single-Unit Neural Tuning
Published in eNeuro (01-05-2018).
Journal Article
8. Memory-Augmented Graph Neural Networks: A Brain-Inspired Review
Published in IEEE Transactions on Artificial Intelligence (01-05-2024). “…Graph neural networks (GNNs) have been extensively used for many domains where data are represented as graphs, including social networks, recommender systems,…”
Journal Article
9. Memory-Augmented Graph Neural Networks: A Brain-Inspired Review
Published 22-09-2022. “…We provide a comprehensive review of the existing literature on memory-augmented GNNs. We review these works through the lens of psychology and neuroscience,…”
Journal Article
10. OMPar: Automatic Parallelization with AI-Driven Source-to-Source Compilation
Published 23-09-2024. “…Manual parallelization of code remains a significant challenge due to the complexities of modern software systems and the widespread adoption of multi-core…”
Journal Article
11. Brain encoding models based on multimodal transformers can transfer across language and vision
Published 20-05-2023. “…Encoding models have been used to assess how the human brain represents concepts in language and vision. While language and vision rely on similar concept…”
Journal Article
12. Assessing Episodic Memory in LLMs with Sequence Order Recall Tasks
Published 10-10-2024. “…Current LLM benchmarks focus on evaluating models' memory of facts and semantic relations, primarily assessing semantic aspects of long-term memory. However,…”
Journal Article
13. Memory in humans and deep language models: Linking hypotheses for model augmentation
Published 04-10-2022. “…The computational complexity of the self-attention mechanism in Transformer models significantly limits their ability to generalize over long temporal…”
Journal Article
14. MPIrigen: MPI Code Generation through Domain-Specific Language Models
Published 14-02-2024. “…The imperative need to scale computation across numerous nodes highlights the significance of efficient parallel computing, particularly in the realm of…”
Journal Article
15. MonoCoder: Domain-Specific Code Language Model for HPC Codes and Tasks
Published 20-12-2023. “…With easier access to powerful compute resources, there is a growing trend in AI for software development to develop large language models (LLMs) to address a…”
Journal Article
16. Scope is all you need: Transforming LLMs for HPC Code
Published 18-08-2023. “…With easier access to powerful compute resources, there is a growing trend in the field of AI for software development to develop larger and larger language…”
Journal Article
17. The Landscape and Challenges of HPC Research and LLMs
Published 02-02-2024. “…Recently, language models (LMs), especially large language models (LLMs), have revolutionized the field of deep learning. Both encoder-decoder models and…”
Journal Article
18. Multi-timescale Representation Learning in LSTM Language Models
Published 26-09-2020 (International Conference on Learning Representations 2021). “…Language models must capture statistical dependencies between words at timescales ranging from very…”
Journal Article
19. Slower is Better: Revisiting the Forgetting Mechanism in LSTM for Slower Information Decay
Published 12-05-2021. “…Sequential information contains short- to long-range dependencies; however, learning long-timescale information has been a challenge for recurrent neural…”
Journal Article