TriplyDB: Wikidata
Statement: http://www.wikidata.org/entity/statement/Q27321525-4E730253-BFB1-4E73-94C2-25F5D998E105

Subject (Q27321525): "No, there is no 150 ms lead of visual speech on auditory speech, but a range of audiovisual asynchronies varying from small audio lead to large audio lag"
Property: P2860 (cites work)
Object: "Auditory cortex tracks both auditory and visual stimulus dynamics using low-frequency neuronal phase modulation."

type: Statement, BestRank
rank: NormalRank
wasDerivedFrom: 59b84643b59e9717829f62a5c200775c4b4ddcd3
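For reference, the flattened statement view above corresponds to a handful of RDF triples in the Wikibase data model. The sketch below is an assumption about that shape, not TriplyDB output: it uses plain Python tuples (no RDF library), the conventional Wikidata prefix names (wd, wds, wdref, p, wikibase, prov), and reads Q27321525 as the subject entity and the second title as the cited work.

```python
# Conventional Wikidata/Wikibase namespace prefixes (assumed, per common usage).
PREFIXES = {
    "wd":       "http://www.wikidata.org/entity/",
    "wds":      "http://www.wikidata.org/entity/statement/",
    "wdref":    "http://www.wikidata.org/reference/",
    "p":        "http://www.wikidata.org/prop/",
    "wikibase": "http://wikiba.se/ontology#",
    "prov":     "http://www.w3.org/ns/prov#",
    "rdf":      "http://www.w3.org/1999/02/22-rdf-syntax-ns#",
}

def iri(curie: str) -> str:
    """Expand a prefix:local CURIE into a full IRI."""
    prefix, local = curie.split(":", 1)
    return PREFIXES[prefix] + local

STMT = "wds:Q27321525-4E730253-BFB1-4E73-94C2-25F5D998E105"

# The triples shown in the statement view above.
triples = [
    ("wd:Q27321525", "p:P2860", STMT),            # cites work -> statement node
    (STMT, "rdf:type", "wikibase:Statement"),
    (STMT, "rdf:type", "wikibase:BestRank"),
    (STMT, "wikibase:rank", "wikibase:NormalRank"),
    (STMT, "prov:wasDerivedFrom",
     "wdref:59b84643b59e9717829f62a5c200775c4b4ddcd3"),
]

expanded = [(iri(s), iri(p), iri(o)) for s, p, o in triples]
for s, p, o in expanded:
    print(s, p, o)
```

The statement node carries the rank and provenance, while the plain value of the claim (the cited paper's title shown above) lives on the statement's value properties, which this sketch omits.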