about
A Citation-Based Analysis and Review of Significant Papers on Timing and Time Perception
Brain Bases of Working Memory for Time Intervals in Rhythmic Sequences.
A brain basis for musical hallucinations
Neural Correlates of Auditory Figure-Ground Segregation Based on Temporal Coherence
Periodicity versus Prediction in Sensory Perception.
Large-Scale Analysis of Auditory Segregation Behavior Crowdsourced via a Smartphone App.
Commentary: Beta-Band Oscillations Represent Auditory Beat and Its Metrical Hierarchy in Perception and Imagery
A unified model of time perception accounts for duration-based and beat-based timing mechanisms
Working memory for time intervals in auditory rhythmic sequences
Beta drives brain beats.
The right hemisphere supports but does not replace left hemisphere auditory function in patients with persisting aphasia.
Segregation of complex acoustic scenes based on temporal coherence.
Resource allocation and prioritization in auditory working memory
Navigating the auditory scene: an expert role for the hippocampus.
Gamma band pitch responses in human auditory cortex measured with magnetoencephalography.
Distinct neural substrates of duration-based and beat-based auditory timing.
Brain bases for auditory stimulus-driven figure-ground segregation.
Slow GABA transient and receptor desensitization shape synaptic responses evoked by hippocampal neurogliaform cells.
Single-subject oscillatory γ responses in tinnitus
Structure predicts function: combining non-invasive electrophysiology with in-vivo histology.
Reading front to back: MEG evidence for early feedback effects during word recognition.
Properties of the internal clock: first- and second-order principles of subjective time.
Resource allocation models of auditory working memory.
Auditory training changes temporal lobe connectivity in 'Wernicke's aphasia': a randomised trial.
Temporal Processing in Audition: Insights from Music.
Representations of specific acoustic patterns in the auditory cortex and hippocampus.
Recent advances in understanding the auditory cortex
Evidence for the Common Coding of Location in Auditory and Visual Space
P50 (author)
Q26740749-90745F9E-5694-483D-BEB3-3076FE889408
Q27312287-6B7601E7-4456-4779-A0A8-617DA25AD2E5
Q30358105-879A7B79-0044-4493-BD0C-4BC343D73452
Q30375587-4517C7DF-C1D6-4C00-9C44-BCD0DB486233
Q30377099-677A9F8A-8B1D-43DA-8136-7642094CDFE6
Q30384957-21386CF3-0FE2-409B-8D92-3DADCA32FA6A
Q30392604-B170B01B-537F-41B6-A473-61F7A2F54EC1
Q30412816-4FA78DCF-B9D9-4642-92E4-84F9CDCCAF4A
Q30423613-1171870D-3F0A-4E98-952B-48A61108D93F
Q30430934-741A125F-C105-46AE-9890-EAB44D3D185C
Q30437462-157F300E-5DB5-4885-807A-70913FCF815A
Q30451734-AF1939C7-8AF3-40F9-BCBA-20B6B24B9CEF
Q30457501-26599C6C-E306-401A-93B0-194E5F5190FB
Q30457865-F6B1D9CD-30AD-41BE-9943-E48753B621CF
Q30471436-FCD13FB0-31B8-469A-A5E9-380E8855021E
Q30474376-DC2CCD85-F7AD-4064-8E60-EDC0B0C63243
Q30474899-951EF77E-89C6-4527-B099-33F6BA357E42
Q30516500-FAA1230D-1A82-44B2-A963-1B15242FCC23
Q30525644-BAB887D9-8278-4A67-8D44-673E847D8A23
Q35102053-5F3CDCEE-426C-4B23-A6DF-A7E4F048B134
Q37577108-D5BA2501-E4D1-4F84-A746-37278A29D3DB
Q38139503-880E4339-18DC-41B4-87EE-4FAFE9495155
Q38719396-2A40139C-95A7-4B22-AA17-E3A17B3D38B9
Q47148866-939EB6D3-290C-4A98-B2F0-E5B302995EBB
Q47565752-DCE7295F-9E93-4F86-AC22-65E93D4FFE44
Q55658832-4B1CC08D-7217-4D24-AE8F-4BD79D8A431D
Q57821273-212C17EC-3F05-4482-BD66-759FD8712BD6
Q63988691-8AE1119F-530C-460D-BDFB-14CD9B616E0E
description
researcher
@en
scientist ("wetenschapper")
@nl
researcher ("հետազոտող")
@hy
name
Sundeep Teki (@ast, @en, @es, @nl, @sl)
type
label
Sundeep Teki (@ast, @en, @es, @nl, @sl)
prefLabel
Sundeep Teki (@ast, @en, @es, @nl, @sl)
P106 (occupation)
P1153 (Scopus author ID)
36446755200
P31 (instance of)
P496 (ORCID iD)
0000-0002-7951-6581