Frontal top-down signals increase coupling of auditory low-frequency oscillations to continuous speech in human listeners.
Cites work (P2860):
- Multi-Dimensional Dynamics of Human Electromagnetic Brain Activity
- The Role of High-Level Processes for Oscillatory Phase Entrainment to Speech Sound
- Auditory processing in noise is associated with complex patterns of disrupted functional connectivity in autism spectrum disorder
- Contributions of local speech encoding and functional connectivity to audio-visual speech perception
- A statistical framework for neuroimaging data analysis based on mutual information estimated via a Gaussian copula
- Auditory cortical delta-entrainment interacts with oscillatory power in multiple fronto-parietal networks
- Atypical right hemisphere response to slow temporal modulations in children with developmental dyslexia
- Prediction Signatures in the Brain: Semantic Pre-Activation during Language Comprehension
- Brain-inspired speech segmentation for automatic speech recognition using the speech envelope as a temporal reference
- Lip movements entrain the observers' low-frequency brain oscillations to facilitate speech intelligibility
- Irregular Speech Rate Dissociates Auditory Cortical Entrainment, Evoked Responses, and Frontal Alpha
- Tuning Neural Phase Entrainment to Speech
- Synchronization by the hand: the sight of gestures modulates low-frequency activity in brain responses to continuous speech
- Visual cortex entrains to sign language
- Visual Benefits in Apparent Motion Displays: Automatically Driven Spatial and Temporal Anticipation Are Partially Dissociated
- A Predictive Coding Perspective on Beta Oscillations during Sentence-Level Language Comprehension
- Ongoing slow oscillatory phase modulates speech intelligibility in cooperation with motor cortical activity
- Prefrontal cortex modulates posterior alpha oscillations during top-down guided visual perception
- Dysfunction of Rapid Neural Adaptation in Dyslexia
- Difficulties in auditory organization as a cause of reading backwardness? An auditory neuroscience perspective
- Evidence for causal top-down frontal contributions to predictive processes in speech perception
- Non-linear auto-regressive models for cross-frequency coupling in neural time series
- Phase Entrainment of Brain Oscillations Causally Modulates Neural Responses to Intelligible Speech
- An interactive model of auditory-motor speech perception
- Neuronal populations in the occipital cortex of the blind synchronize to the temporal dynamics of speech
- A dynamical systems approach for estimating phase interactions between rhythms of different frequencies from experimental data
- Synchronization of Electrophysiological Responses with Speech Benefits Syntactic Information Processing
- Motor origin of temporal predictions in auditory attention
- High-frequency neural activity predicts word parsing in ambiguous speech streams
- Linguistic Bias Modulates Interpretation of Speech via Neural Delta-Band Oscillations
- The coupling between auditory and motor cortices is rate-restricted: Evidence for an intrinsic speech-motor rhythm
- The Human Neural Alpha Response to Speech is a Proxy of Attentional Control
- Cortical Measures of Phoneme-Level Speech Encoding Correlate with the Perceived Clarity of Natural Speech
- Perceptually relevant speech tracking in auditory and motor cortex reflects distinct linguistic features
- Large-scale network dynamics of beta-band oscillations underlie auditory perceptual decision-making
- A Visual Cortical Network for Deriving Phonological Information from Intelligible Lip Movements
- IFCN-endorsed practical guidelines for clinical magnetoencephalography (MEG)
- Representational interactions during audiovisual speech entrainment: Redundancy in left posterior superior temporal gyrus and synergy in left motor cortex
- Lower Beta: A Central Coordinator of Temporal Prediction in Multimodal Speech
Description:
- 2015 paper (@nan)
- scholarly article published in May 2015 (@hyw)
- scholarly article published in May 2015 (@hy)
- 2015 paper (@ja)
- 2015 paper (@yue)
- 2015 paper (@zh-hant)
- 2015 paper (@zh-hk)
- 2015 paper (@zh-mo)
- 2015 paper (@zh-tw)
- 2015 paper (@wuu)
Name / label / prefLabel:
- Frontal top-down signals increase coupling of auditory low-frequency oscillations to continuous speech in human listeners. (@ast)
- Frontal top-down signals increase coupling of auditory low-frequency oscillations to continuous speech in human listeners. (@en)
Statements:
- P2860 (cites work)
- P50 (author)
- P1433 (published in)
- P1476 (title): Frontal top-down signals increase coupling of auditory low-frequency oscillations to continuous speech in human listeners (@en)
- P2093 (author name string): Philippe G Schyns; Robin A A Ince
- P304 (page(s))
- P356 (DOI): 10.1016/J.CUB.2015.04.049
- P407 (language of work or name)
- P577 (publication date): 2015-05-28T00:00:00Z