Search Results - "Vo, Vy A"

  • Showing 1-19 of 19 results
  1.

    Value-driven attentional capture enhances distractor representations in early visual cortex by Itthipuripat, Sirawaj, Vo, Vy A, Sprague, Thomas C, Serences, John T

    Published in PLoS biology (09-08-2019)
    “…When a behaviorally relevant stimulus has been previously associated with reward, behavioral responses are faster and more accurate compared to equally…”
    Journal Article
  2.

    Dissociable signatures of visual salience and behavioral relevance across attentional priority maps in human cortex by Sprague, Thomas C, Itthipuripat, Sirawaj, Vo, Vy A, Serences, John T

    Published in Journal of neurophysiology (01-06-2018)
    “…Computational models posit that visual attention is guided by activity within spatial maps that index the image-computable salience and the behavioral…”
    Journal Article
  3.

    Spatial Tuning Shifts Increase the Discriminability and Fidelity of Population Codes in Visual Cortex by Vo, Vy A, Sprague, Thomas C, Serences, John T

    Published in The Journal of neuroscience (22-03-2017)
    “…Selective visual attention enables organisms to enhance the representation of behaviorally relevant stimuli by altering the encoding properties of single…”
    Journal Article
  4.

    Young Children Bet on Their Numerical Skills: Metacognition in the Numerical Domain by Vo, Vy A., Li, Rosa, Kornell, Nate, Pouget, Alexandre, Cantlon, Jessica F.

    Published in Psychological science (01-09-2014)
    “…Metacognition, the ability to assess one's own knowledge, has been targeted as a critical learning mechanism in mathematics education. Yet the early childhood…”
    Journal Article
  5.

    Computational Language Modeling and the Promise of In Silico Experimentation by Jain, Shailee, Vo, Vy A, Wehbe, Leila, Huth, Alexander G

    Published in Neurobiology of language (01-04-2024)
    “…Language neuroscience currently relies on two major experimental paradigms: controlled experiments using carefully hand-designed stimuli, and natural stimulus…”
    Journal Article
  6.

    Shared Representational Formats for Information Maintained in Working Memory and Information Retrieved from Long-Term Memory by Vo, Vy A, Sutterer, David W, Foster, Joshua J, Sprague, Thomas C, Awh, Edward, Serences, John T

    Published in Cerebral cortex (New York, N.Y. 1991) (19-02-2022)
    “…Abstract Current theories propose that the short-term retention of information in working memory (WM) and the recall of information from long-term memory (LTM)…”
    Journal Article
  7.
  8.

    Memory-Augmented Graph Neural Networks: A Brain-Inspired Review by Ma, Guixiang, Vo, Vy A., Willke, Theodore L., Ahmed, Nesreen K.

    “…Graph neural networks (GNNs) have been extensively used for many domains where data are represented as graphs, including social networks, recommender systems,…”
    Journal Article
  9.

    Memory-Augmented Graph Neural Networks: A Brain-Inspired Review by Ma, Guixiang, Vo, Vy A, Willke, Theodore, Ahmed, Nesreen K

    Published 22-09-2022
    “…We provide a comprehensive review of the existing literature on memory-augmented GNNs. We review these works through the lens of psychology and neuroscience,…”
    Journal Article
  10.

    OMPar: Automatic Parallelization with AI-Driven Source-to-Source Compilation by Kadosh, Tal, Hasabnis, Niranjan, Soundararajan, Prema, Vo, Vy A, Capota, Mihai, Ahmed, Nesreen, Pinter, Yuval, Oren, Gal

    Published 23-09-2024
    “…Manual parallelization of code remains a significant challenge due to the complexities of modern software systems and the widespread adoption of multi-core…”
    Journal Article
  11.

    Brain encoding models based on multimodal transformers can transfer across language and vision by Tang, Jerry, Du, Meng, Vo, Vy A, Lal, Vasudev, Huth, Alexander G

    Published 20-05-2023
    “…Encoding models have been used to assess how the human brain represents concepts in language and vision. While language and vision rely on similar concept…”
    Journal Article
  12.

    Assessing Episodic Memory in LLMs with Sequence Order Recall Tasks by Pink, Mathis, Vo, Vy A, Wu, Qinyuan, Mu, Jianing, Turek, Javier S, Hasson, Uri, Norman, Kenneth A, Michelmann, Sebastian, Huth, Alexander, Toneva, Mariya

    Published 10-10-2024
    “…Current LLM benchmarks focus on evaluating models' memory of facts and semantic relations, primarily assessing semantic aspects of long-term memory. However,…”
    Journal Article
  13.

    Memory in humans and deep language models: Linking hypotheses for model augmentation by Raccah, Omri, Chen, Phoebe, Willke, Ted L, Poeppel, David, Vo, Vy A

    Published 04-10-2022
    “…The computational complexity of the self-attention mechanism in Transformer models significantly limits their ability to generalize over long temporal…”
    Journal Article
  14.

    MPIrigen: MPI Code Generation through Domain-Specific Language Models by Schneider, Nadav, Hasabnis, Niranjan, Vo, Vy A, Kadosh, Tal, Krien, Neva, Capotă, Mihai, Tamir, Guy, Willke, Ted, Ahmed, Nesreen, Pinter, Yuval, Mattson, Timothy, Oren, Gal

    Published 14-02-2024
    “…The imperative need to scale computation across numerous nodes highlights the significance of efficient parallel computing, particularly in the realm of…”
    Journal Article
  15.

    MonoCoder: Domain-Specific Code Language Model for HPC Codes and Tasks by Kadosh, Tal, Hasabnis, Niranjan, Vo, Vy A, Schneider, Nadav, Krien, Neva, Capota, Mihai, Wasay, Abdul, Ahmed, Nesreen, Willke, Ted, Tamir, Guy, Pinter, Yuval, Mattson, Timothy, Oren, Gal

    Published 20-12-2023
    “…With easier access to powerful compute resources, there is a growing trend in AI for software development to develop large language models (LLMs) to address a…”
    Journal Article
  16.

    Scope is all you need: Transforming LLMs for HPC Code by Kadosh, Tal, Hasabnis, Niranjan, Vo, Vy A, Schneider, Nadav, Krien, Neva, Wasay, Abdul, Ahmed, Nesreen, Willke, Ted, Tamir, Guy, Pinter, Yuval, Mattson, Timothy, Oren, Gal

    Published 18-08-2023
    “…With easier access to powerful compute resources, there is a growing trend in the field of AI for software development to develop larger and larger language…”
    Journal Article
  17.

    The Landscape and Challenges of HPC Research and LLMs by Chen, Le, Ahmed, Nesreen K, Dutta, Akash, Bhattacharjee, Arijit, Yu, Sixing, Mahmud, Quazi Ishtiaque, Abebe, Waqwoya, Phan, Hung, Sarkar, Aishwarya, Butler, Branden, Hasabnis, Niranjan, Oren, Gal, Vo, Vy A, Munoz, Juan Pablo, Willke, Theodore L, Mattson, Tim, Jannesari, Ali

    Published 02-02-2024
    “…Recently, language models (LMs), especially large language models (LLMs), have revolutionized the field of deep learning. Both encoder-decoder models and…”
    Journal Article
  18.

    Multi-timescale Representation Learning in LSTM Language Models by Mahto, Shivangi, Vo, Vy A, Turek, Javier S, Huth, Alexander G

    Published 26-09-2020
    “…International Conference on Learning Representations 2021. Language models must capture statistical dependencies between words at timescales ranging from very…”
    Journal Article
  19.

    Slower is Better: Revisiting the Forgetting Mechanism in LSTM for Slower Information Decay by Chien, Hsiang-Yun Sherry, Turek, Javier S, Beckage, Nicole, Vo, Vy A, Honey, Christopher J, Willke, Ted L

    Published 12-05-2021
    “…Sequential information contains short- to long-range dependencies; however, learning long-timescale information has been a challenge for recurrent neural…”
    Journal Article