Search Results - "Andersen, Tobias S."
1. Bayesian binding and fusion models explain illusion and enhancement effects in audiovisual speech perception
Published in PloS one (19-02-2021): “…Speech is perceived with both the ears and the eyes. Adding congruent visual speech improves the perception of a faint auditory speech stimulus, whereas adding…”
Journal Article
2. Intellectually able adults with autism spectrum disorder show typical resting-state EEG activity
Published in Scientific reports (08-11-2022): “…There is broad interest in discovering quantifiable physiological biomarkers for psychiatric disorders to aid diagnostic assessment. However, finding…”
Journal Article
3. Speech-specific audiovisual integration modulates induced theta-band oscillations
Published in PloS one (16-07-2019): “…Speech perception is influenced by vision through a process of audiovisual integration. This is demonstrated by the McGurk illusion where visual speech (for…”
Journal Article
4. Face configuration affects speech perception: Evidence from a McGurk mismatch negativity study
Published in Neuropsychologia (01-01-2015): “…We perceive identity, expression and speech from faces. While perception of identity and expression depends crucially on the configuration of facial features…”
Journal Article
5. Electrophysiological evidence for differences between fusion and combination illusions in audiovisual speech perception
Published in The European journal of neuroscience (01-11-2017): “…Incongruent audiovisual speech stimuli can lead to perceptual illusions such as fusions or combinations. Here, we investigated the underlying audiovisual…”
Journal Article
6. The early maximum likelihood estimation model of audiovisual integration in speech perception
Published in The Journal of the Acoustical Society of America (01-05-2015): “…Speech perception is facilitated by seeing the articulatory mouth movements of the talker. This is due to perceptual audiovisual integration, which also causes…”
Journal Article
7. Factors influencing audiovisual fission and fusion illusions
Published in Brain research. Cognitive brain research (01-11-2004): “…Information processing in auditory and visual modalities interacts in many circumstances. Spatially and temporally coincident acoustic and visual information…”
Journal Article
8. Towards a wearable multi-modal seizure detection system in epilepsy: A pilot study
Published in Clinical neurophysiology (01-04-2022): “…•Wearable EEG, ECG and accelerometry can detect correlates of seizure activity in an epilepsy monitoring unit. •A support vector machine trained on the…”
Journal Article
9. Classification of independent components of EEG into multiple artifact classes
Published in Psychophysiology (01-01-2015): “…In this study, we aim to automatically identify multiple artifact types in EEG. We used multinomial regression to classify independent components of EEG data,…”
Journal Article
10. Data-driven analysis of gaze patterns in face perception: Methodological and clinical contributions
Published in Cortex (01-02-2022): “…Gaze patterns during face perception have been shown to relate to psychiatric symptoms. Standard analysis of gaze behavior includes calculating fixations…”
Journal Article
11. Multistage audiovisual integration of speech: dissociating identification and detection
Published in Experimental brain research (01-02-2011): “…Speech perception integrates auditory and visual information. This is evidenced by the McGurk illusion where seeing the talking face influences the auditory…”
Journal Article
12. Resting-state EEG functional connectivity predicts post-traumatic stress disorder subtypes in veterans
Published in Journal of neural engineering (01-12-2022): “…Post-traumatic stress disorder (PTSD) is highly heterogeneous, and identification of quantifiable biomarkers that could pave the way for targeted treatment…”
Journal Article
13. Smartphones as pocketable labs: Visions for mobile brain imaging and neurofeedback
Published in International journal of psychophysiology (01-01-2014): “…Mobile brain imaging solutions, such as the Smartphone Brain Scanner, which combines low cost wireless EEG sensors with open source software for real-time…”
Journal Article
14. The Effect of Exposure Duration on Visual Character Identification in Single, Whole, and Partial Report
Published in Journal of experimental psychology. Human perception and performance (01-04-2012): “…The psychometric function of single-letter identification is typically described as a function of stimulus intensity. However, the effect of stimulus exposure…”
Journal Article
15. Audiovisual integration of stimulus transients
Published in Vision research (Oxford) (01-11-2008): “…A change in sound intensity can facilitate luminance change detection. We found that this effect did not depend on whether sound intensity and luminance…”
Journal Article
16. How low can you go: Spatial frequency sensitivity in a patient with pure alexia
Published in Brain and language (01-08-2013): “…•We test contrast sensitivity for a range of spatial frequencies in a pure alexia case. •We used a sensitive and demanding psychophysical task. •Detection and…”
Journal Article
17. Audiovisual integration of speech in a patient with Broca's Aphasia
Published in Frontiers in psychology (28-04-2015): “…Lesions to Broca's area cause aphasia characterized by a severe impairment of the ability to speak, with comparatively intact speech perception. However, some…”
Journal Article
18. The role of visual spatial attention in audiovisual speech perception
Published in Speech communication (01-02-2009): “…Auditory and visual information is integrated when perceiving speech, as evidenced by the McGurk effect in which viewing an incongruent talking face…”
Journal Article
19. Audio–visual speech perception is special
Published in Cognition (01-05-2005): “…In face-to-face conversation speech is perceived by ear and eye. We studied the prerequisites of audio–visual speech perception by using perceptually ambiguous…”
Journal Article
20. Maximum Likelihood Integration of rapid flashes and beeps
Published in Neuroscience letters (20-05-2005): “…Maximum likelihood models of multisensory integration are theoretically attractive because the goals and assumptions of sensory information processing are…”
Journal Article