Deep supervised, but not unsupervised, models may explain IT cortical representation
about
A Model of Representational Spaces in Human Cortex
Category-Selectivity in Human Visual Cortex Follows Cortical Topology: A Grouped icEEG Study
Comparison of deep neural networks to spatio-temporal cortical dynamics of human visual object recognition reveals hierarchical correspondence
Toward an Integration of Deep Learning and Neuroscience
Searching for Category-Consistent Features: A Computational Approach to Understanding Visual Category Representation
The Role of Architectural and Learning Constraints in Neural Network Models: A Case Study on Visual Space Coding
Contextual modulation of primary visual cortex by auditory signals
How Invariant Feature Selectivity Is Achieved in Cortex
Neural representations of emotion are organized around abstract event features
Representation of Naturalistic Image Structure in the Primate Visual Cortex
Contributions of low- and high-level properties to neural processing of visual scenes in the human brain
Resolving the neural dynamics of visual and auditory scene processing in the human brain: a methodological approach
Generic decoding of seen and imagined objects using hierarchical visual features
Shape Selectivity of Middle Superior Temporal Sulcus Body Patch Neurons
Applying artificial vision models to human scene understanding
Deep Neural Networks as a Computational Model for Human Shape Sensitivity
Comparison of Object Recognition Behavior in Human and Monkey
Representational models: A common framework for understanding encoding, pattern-component, and representational-similarity analysis
A data driven approach to understanding the organization of high-level visual cortex
Neural evidence that three dimensions organize mental state representation: Rationality, social impact, and valence
Visual features as stepping stones toward semantics: Explaining object similarity in IT and perception with non-negative least squares
Semantics of the Visual Environment Encoded in Parahippocampal Cortex
Shape-independent object category responses revealed by MEG and fMRI decoding
Selectivity and tolerance for visual texture in macaque V2
Humans and Deep Networks Largely Agree on Which Kinds of Variation Make Object Recognition Harder
What the success of brain imaging implies about the neural code
Computational approaches to fMRI analysis
Representation of Semantic Similarity in the Left Intraparietal Sulcus: Functional Magnetic Resonance Imaging Evidence
Fixed versus mixed RSA: Explaining visual representations by fixed and mixed feature sets from shallow and deep computational models
Evolutionary Constraints on Human Object Perception
Modeling the N400 ERP component as transient semantic over-activation within a neural network model of word comprehension
Multi-Connection Pattern Analysis: Decoding the representational content of neural communication
Adjudicating between face-coding models with individual-face fMRI responses
Using goal-driven deep learning models to understand sensory cortex
Representational Distance Learning for Deep Neural Networks
Models of visual categorization
Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions
Hierarchical Neural Representation of Dreamed Objects Revealed by Brain Decoding with Deep Neural Network Features
Making Sense of Real-World Scenes
Representational Dynamics of Facial Viewpoint Encoding
P2860
Q27302403-C97796B4-489E-491C-80C9-855E9730BA1C
Q27340018-AF460E73-52C3-4123-B6B8-7D0E9D458B88
Q27342471-FB030946-C671-40A8-8C5F-6C7BE567A017
Q28595812-5A506A5E-6EDF-4428-B2F7-BD3A1C0183DB
Q30355389-9610835E-05F9-4E62-8AA6-6E1CEEB75ADB
Q30359506-77FABD38-7810-48DE-9957-3771233A5084
Q30362991-EA08272A-D7DD-4CDB-97D6-5BA71F76EB11
Q30374870-3225ACA1-4D79-45A2-9645-BF9E8B2BAC5B
Q30377349-86710211-91E3-45E5-836E-EE4D6C7C5673
Q30944949-04815400-3ADD-4A1D-80B1-9647D66F4AAB
Q31152340-35F5EA9A-53D3-48D5-B30B-0267B3B85773
Q31152343-410D1079-A326-4E3B-AFB8-9B5C32378EF7
Q33761430-6D64302E-9770-4A7E-ADDA-4C16CE4D599B
Q33836348-E502FB4E-0122-4FEF-8AEA-FDA89886CF44
Q35048019-BE39C703-E37D-4518-A85A-DA2BA2928C7C
Q36002072-1561D971-D6DB-4874-A5C6-621497EA5CA0
Q36017231-9C903A6D-030E-480E-A674-3005423AA944
Q36355061-B749C9F7-CE05-4A05-8480-9D6192A3396A
Q36406948-CCB8050E-AD07-40B6-B0A1-F85FDF24100A
Q36459396-721EF085-A841-4F4B-BE30-63A9C2FAB87E
Q36666141-F2DBA4DC-5C49-4B12-AB79-522C51A99A3A
Q36884283-70886496-D27E-492B-B27E-2F65AC5842C8
Q36907654-28F563D3-5862-4402-B3AA-F9D49B70B179
Q36978211-883E15AB-7CC7-449A-BB02-308316177E22
Q37240979-7257B098-AC4A-4AF2-AFFE-5EF15DAB612D
Q37594645-929B1B47-A175-4292-9CDA-7C5EF4819F2F
Q38225906-DF3ABC36-025F-4499-A87E-6C9236503906
Q38371895-7B0E8F7D-CF4C-4488-9411-F998BC6EB3E3
Q38379394-0C4E0667-528E-4A75-AC07-0F8F3B6D2798
Q38382985-8C7AC08D-268F-41F6-A8DB-195EAA0DFC46
Q38384443-0BC39724-601D-40CA-AAEF-2A2B8504AEDA
Q38622062-F15A4CAF-4D61-426A-B133-739F72BEBF6C
Q38660212-12681098-D7CA-4B11-9895-260F2036592F
Q38746874-FC8620DF-9FA3-4CF4-9D12-909C7025F7BF
Q38771086-85DB66B9-1566-4F09-B395-59D6B8933182
Q38782536-24FE6E92-CB26-4A3F-90A3-44BAC960BB63
Q38911276-5F0E47C2-E69A-4DA3-BE38-C23DA8F9C856
Q38961524-18613287-87FC-4885-B47B-5923787AFBC8
Q38987858-B1E692BD-C60E-4D01-A3F1-2726728F4B9F
Q39239442-1481C8DB-3D9E-44D9-A456-A342767133DD
P2860
Deep supervised, but not unsupervised, models may explain IT cortical representation
description
2014 paper
@nan
scientific article published in November 2014
@hyw
scientific article published in November 2014
@hy
2014 paper
@ja
2014 paper
@yue
2014 paper
@zh-hant
2014 paper
@zh-hk
2014 paper
@zh-mo
2014 paper
@zh-tw
2014 paper
@wuu
name
Deep supervised, but not unsupervised, models may explain IT cortical representation
@ast
Deep supervised, but not unsupervised, models may explain IT cortical representation
@en
Deep supervised, but not unsupervised, models may explain IT cortical representation
@nl
type
label
Deep supervised, but not unsupervised, models may explain IT cortical representation
@ast
Deep supervised, but not unsupervised, models may explain IT cortical representation
@en
Deep supervised, but not unsupervised, models may explain IT cortical representation
@nl
prefLabel
Deep supervised, but not unsupervised, models may explain IT cortical representation
@ast
Deep supervised, but not unsupervised, models may explain IT cortical representation
@en
Deep supervised, but not unsupervised, models may explain IT cortical representation
@nl
P2860
P3181
P1476
Deep supervised, but not unsupervised, models may explain IT cortical representation
@en
P2093
Nikolaus Kriegeskorte
P2860
P304
P3181
P356
10.1371/JOURNAL.PCBI.1003915
P407
P577
2014-11-01T00:00:00Z