Assessing agreement with multiple raters on correlated kappa statistics.
about
What About Their Performance Do Free Jazz Improvisers Agree Upon? A Case Study.
Asymptotic distributions of kappa statistics and their differences with many raters, many rating categories and two conditions.
Rating experiments in forestry: How much agreement is there in tree marking?
Training Programs on Endoscopic Scoring Systems for Inflammatory Bowel Disease Lead to a Significant Increase in Interobserver Agreement Among Community Gastroenterologists.
P2860 (cites work)
description
2016 academic article
@en
name
Assessing agreement with multiple raters on correlated kappa statistics.
@en
type
label
Assessing agreement with multiple raters on correlated kappa statistics.
@en
prefLabel
Assessing agreement with multiple raters on correlated kappa statistics.
@en
P1476 (title)
Assessing agreement with multiple raters on correlated kappa statistics.
@en
P2093 (author name string)
Anne F Peery
Evan S Dellon
Hongyuan Cao
Pranab K Sen
P356 (DOI)
10.1002/BIMJ.201500029
P577 (publication date)
2016-02-18T00:00:00Z