about
Localizing non-retinotopically moving objects
Post-determined emotion: motor action retrospectively modulates emotional valence of visual images
Weight lifting can facilitate appreciative comprehension for museum exhibits
Pattern randomness aftereffect
Placing joy, surprise and sadness in space: a cross-linguistic study
Fear of eyes: triadic relation among social anxiety, trypophobia, and discomfort for eye cluster
Appraisal of space words and allocation of emotion words in bodily space
Factor Structure, Reliability, and Validity of the Japanese Version of the Disgust Propensity and Sensitivity Scale-Revised
Can you eat it? A link between categorization difficulty and food likability
The stuffed animal sleepover: enhancement of reading and the duration of the effect
Crossmodal Modulation of Spatial Localization by Mimetic Words
When categorization-based stranger avoidance explains the uncanny valley: A comment on MacDorman and Chattopadhyay (2016)
Early Visual Perception Potentiated by Object Affordances: Evidence From a Temporal Order Judgment Task
Does weight lifting improve visual acuity? A replication of Gonzalo-Fonrodona and Porras (2013)
Directionless vection: A new illusory self-motion perception
Regular Is Longer
Erroneous selection of a non-target item improves subsequent target identification in rapid serial visual presentations
Object Affordances Potentiate Responses but Do Not Guide Attentional Prioritization
Awareness shaping or shaped by prediction and postdiction: Editorial
Trypophobic Discomfort is Spatial-Frequency Dependent
I speak fast when I move fast: the speed of illusory self-motion (vection) modulates the speed of utterances
Avoidance of Novelty Contributes to the Uncanny Valley
Involuntary protection against dermatosis: A preliminary observation on trypophobia
The scintillating grid illusion: influence of size, shape, and orientation of the luminance patches
Emotion biases voluntary vertical action only with visible cues
One's own name distorts visual space
Emotional sounds influence vertical vection
Emotion colors time perception unconsciously
Scents boost preference for novel fruits
Time-to-contact estimation modulated by implied friction
How an abrupt onset cue can release motion-induced blindness
Temporal course of position shift for a peripheral target
Manipulating the Alpha Level Cannot Cure Significance Testing
Arousing emoticons edit stream/bounce perception of objects moving past each other
Disgust and the rubber hand illusion: a registered replication report of Jalal, Krishnakumar, and Ramachandran (2015)
How to Crack Pre-registration: Toward Transparent and Open Science
Two noncontiguous locations can be attended concurrently: evidence from the attentional blink
Mislocalization of a target toward subjective contours: attentional modulation of location signals
Dividing attention between two different categories and locations in rapid serial visual presentations
Invisible motion contributes to simultaneous motion contrast
P50
Q27320785-54D7BD2D-C6E3-4D10-A2F6-5FB9FB3D979F
Q27333122-309A596B-B191-4FAB-858B-3B1859BACE67
Q28658601-C512A32F-DAC2-4AB7-AE0B-514E28ECEDFA
Q30550039-086D5CAD-1667-4A4A-9D0E-C188F6BD3933
Q33840833-49EE2A31-D975-4F56-B1F5-D879EF05AB33
Q34526141-4DCCA18C-0545-40AC-AAFA-5424F43BDE58
Q35069724-06F6CD33-100E-4714-9B62-22C704CF12DD
Q36161569-F546E794-11B3-4FAB-AE9D-F7E21E1FB78E
Q36211506-F9DED934-DBFE-44AC-96BE-C6B6AD2AB018
Q36303714-9DFBFA44-1CE3-44CB-87F8-8E79E3DD97D7
Q37499414-F2F1D2EA-0151-44E0-9F34-44E3DE6A91D3
Q39376140-51F86446-EB5B-4BC3-B18F-143D4741024E
Q41140424-498B3C77-93D1-4807-AEE9-3964AC86C2AD
Q41211882-14D73ED8-7C14-41A7-A27F-18D9C2560BD4
Q41536587-FEB053AE-6D89-4EE2-9D62-183A57BE051F
Q41714338-6C2523B8-30CA-4253-BC18-BF109C9E2421
Q42010931-4020D5F4-AE56-4579-A1B4-C74155A9AD57
Q42087949-0384042B-C210-4B75-88C4-7E69022C32B1
Q42215505-D1036E17-47A4-4B7E-B29D-674A17DE1863
Q42374707-FA84100E-554F-4873-9779-06C259BFD432
Q42547633-5F2EDFA4-CE91-4A29-B8D7-C2BAEA050FDC
Q47119222-7AFD75D5-84D3-4B7F-B1C3-4F4E3087B981
Q47158684-60B020DF-5FAA-4799-A6E3-B0B01EF178C7
Q47172515-99520809-4326-405A-B013-81AC92ABB179
Q47612257-419ECEA8-DAF6-4FA3-A3F2-3C107DCC2A16
Q48307526-B25D46F0-03F8-4D5C-98C1-68E35A3531E3
Q50548889-D680CE18-CCE6-45D7-A506-E5E1947D75CC
Q50619296-2AA850DE-E25F-41CA-BB51-EDDE36312DDA
Q50655275-E253033E-0700-423C-8BA4-D77059F2455C
Q50656264-9989F9D0-755D-45F9-AFA9-D08441569B20
Q50725497-BDF5DC78-1A38-46E8-A345-93B40F11A198
Q51867764-C62EB4A0-880B-44B4-BAD2-6312BF07D78B
Q55109981-CEFC5EA7-D9C7-41F8-B153-8AA60B7ED253
Q55280591-EA186EC9-BB2F-4288-AC2E-CB1848D221C9
Q55333190-4296F03E-AB32-4D1C-BD11-53EC7EA54352
Q57491659-83E83F34-A849-4A84-85C6-AB257C5EE3DA
Q79481701-4223D968-895F-4691-9F6D-ED18E9A362F7
Q79859019-9EBC47C7-87CD-4E37-A8B2-E81A93088BB3
Q80008500-6D94AA56-DF05-4C64-8EF1-5344AB469BC2
Q83277407-5A4A95FF-6E61-4BA3-B3B5-DAD8DB5FB5BD
P50
description
onderzoeker
@nl
researcher
@en
հետազոտող
@hy
name
Yuki Yamada
@ast
Yuki Yamada
@en
Yuki Yamada
@es
Yuki Yamada
@sl
type
label
Yuki Yamada
@ast
Yuki Yamada
@en
Yuki Yamada
@es
Yuki Yamada
@sl
prefLabel
Yuki Yamada
@ast
Yuki Yamada
@en
Yuki Yamada
@es
Yuki Yamada
@sl
P1053
B-2671-2008
P106
P1153
55375527500
P2798
P31
P3829
P496
0000-0003-1431-568X