about
Defining Auditory-Visual Objects: Behavioral Tests and Physiological Mechanisms
Potential Use of MEG to Understand Abnormalities in Auditory Function in Clinical Populations
Using neuroimaging to understand the cortical mechanisms of auditory selective attention
Temporal alignment of pupillary response with stimulus events via deconvolution
Auditory attention strategy depends on target linguistic properties and spatial configuration.
Directing eye gaze enhances auditory spatial cue discrimination.
Auditory selective attention is enhanced by a task-irrelevant temporally coherent visual stimulus in human listeners.
Switching auditory attention using spatial and non-spatial features recruits different cortical networks.
Measuring auditory selective attention using frequency tagging.
The cortical dynamics underlying effective switching of auditory spatial attention.
Nothing is irrelevant in a noisy world: sensory illusions reveal obligatory within- and across-modality integration
Auditory selective attention reveals preparatory activity in different cortical regions for selection based on source location and source pitch.
Selective attention in an overcrowded auditory scene: implications for auditory-based brain-computer interface design.
A sound element gets lost in perceptual competition
Improving spatial localization in MEG inverse imaging by leveraging intersubject anatomical differences
Network dynamics underlying speed-accuracy trade-offs in response to errors
Attention drives synchronization of alpha and beta rhythms between right inferior frontal and primary sensory neocortex.
Multimodal neuroimaging dissociates hemodynamic and electrophysiological correlates of error processing
Leveraging anatomical information to improve transfer learning in brain-computer interfaces.
Anomalous use of context during task preparation in schizophrenia: a magnetoencephalography study
Pupillometry shows the effort of auditory attention switching.
Mapping cortical dynamics using simultaneous MEG/EEG and anatomically-constrained minimum-norm estimates: an auditory attention example.
Towards a next-generation hearing aid through brain state classification and modeling.
Incorporating modern neuroscience findings to improve brain-computer interfaces: tracking auditory attention.
Auditory Brainstem Responses to Continuous Natural Speech in Human Listeners.
Integration of Visual Information in Auditory Cortex Promotes Auditory Scene Analysis through Multisensory Binding.
Neural Switch Asymmetry in Feature-Based Auditory Attention Tasks
P50
Q26771599-77A900FD-33DB-47F0-9BF6-DE8D3FE8F0D7
Q26822436-79F0ACA0-40C5-4477-89F8-51CAADFB6864
Q26824258-04A51291-634D-4C87-B5B3-9549F540C5C9
Q30358520-8A33B79E-9C08-43C0-B2E1-EA07AEF10DDE
Q30380288-91F5B998-7A19-40BD-A25E-2DB1BEB16BD3
Q30415694-9A74FA1C-CEE1-4E6A-A46E-FB98CE67B2C1
Q30417238-92A93B54-E4A0-4B2A-B7F1-4A2526C6D894
Q30422041-8C02F5C5-FCB5-4342-A52E-1D0247544A93
Q30442888-7B901EBF-5083-4AD4-9495-227747E564C4
Q30445184-F52DE50B-0248-42C8-B3D9-94F42B326221
Q30456055-6E2E5D17-26DB-4616-BA2D-BE4F44BF819D
Q30459028-25DBD859-9E76-4CBA-9C78-D65946E0B953
Q30460953-66A1A1B0-28DA-4547-AC40-56716F42EDCF
Q30479769-2A69437B-2608-4269-B60F-7F4B00544122
Q34368762-6897D811-9DF9-4676-A78C-CF39B11EAA44
Q34998795-79B3DD26-405B-40B7-A68A-F0317F29A2C4
Q35044806-DA692DFB-2AC2-44BD-BEBE-7A54F257401B
Q35409104-A0463A73-6021-46D5-A7D4-572DC2B76EAF
Q35925370-11A33AF8-A61C-4388-9DEC-DA3C904F03CE
Q36809024-65A2910F-AB6A-4254-9745-82286871CD74
Q38808472-F20F292B-C5B9-49C9-8412-C3DB74978893
Q41460421-CFFC6BBC-AE55-4E61-923A-FA2137D3A40C
Q44156028-B57449B9-C65B-4F03-8490-40B7A316DBD4
Q48517685-8BF6E1EC-40B2-48D8-9C2C-CE5BCB04E836
Q49587454-E97F651E-4139-49D9-8039-DF3606A79703
Q50196220-00ED73EC-ABDF-436A-A480-D7D580D23AC8
Q91190613-5AD1684E-2184-49CD-9AB3-CD59599B84D5
description
investigador
@es
researcher
@en
wetenschapper
@nl
name
Adrian K C Lee
@en
Adrian K C Lee
@nl
type
label
Adrian K C Lee
@en
Adrian K C Lee
@nl
prefLabel
Adrian K C Lee
@en
Adrian K C Lee
@nl
P108
P31
P496
0000-0002-7611-0500
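The coded keys in this record (P50, P31, P108, P496) match Wikidata property IDs, where P50 is "author", P31 is "instance of", P108 is "employer", and P496 is "ORCID iD". Assuming that convention holds here, a minimal sketch of turning such a record into human-readable pairs might look like the following; the `record` dict and the `describe` helper are illustrative, not part of the source data.

```python
# Minimal sketch, assuming the P-codes follow Wikidata property semantics:
# P50 = author, P31 = instance of, P108 = employer, P496 = ORCID iD.
PROPERTY_LABELS = {
    "P50": "author",
    "P31": "instance of",
    "P108": "employer",
    "P496": "ORCID iD",
}

# A small excerpt of the record above, restructured as a dict (illustrative).
record = {
    "name": {"en": "Adrian K C Lee", "nl": "Adrian K C Lee"},
    "description": {"en": "researcher", "es": "investigador", "nl": "wetenschapper"},
    "P496": "0000-0002-7611-0500",
}

def describe(record):
    """Replace coded property keys with readable labels, keeping values as-is."""
    return [(PROPERTY_LABELS.get(key, key), value) for key, value in record.items()]

for label, value in describe(record):
    print(f"{label}: {value}")
```

Keys without a known mapping (like the plain `name` and `description` fields) pass through unchanged, so the sketch degrades gracefully on unfamiliar properties.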