Rubric development and inter-rater reliability
Our ePortfolio cohort met regularly to refine successive drafts of the HDFS ePortfolio Rubric. Before each meeting, two new ePortfolio exemplars were selected from the ePortfolio Project webpage and rated independently by all cohort members. Ratings were recorded at the beginning of each meeting, and discrepancies among ratings were discussed. These discussions led to the creation and continual adjustment of the ePortfolio Supporting Document, which contains precise definitions and conventions for interpreting the ePortfolio rubric.
Following the rubric development meetings, inter-rater reliability coefficients were calculated from the unadjusted exemplar ratings recorded at the beginning of each meeting. Coefficients from early implementations of the rubric demonstrated adequate inter-rater reliability, e.g., ICC(2,6) = 0.54. However, through the process of revising both the rubric and the ePortfolio Supporting Document, we achieved excellent inter-rater reliability, e.g., ICC(2,6) = 0.88, 95% CI [0.78, 0.95].
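The source does not specify how the coefficients were computed, but an ICC(2,k) average-measures coefficient such as ICC(2,6) follows the standard Shrout and Fleiss (1979) two-way random-effects formula. A minimal sketch, assuming a complete targets-by-raters matrix with no missing ratings (the function name and data layout are illustrative, not taken from the study):

```python
import numpy as np

def icc_2k(ratings):
    """ICC(2,k): two-way random effects, average of k raters
    (Shrout & Fleiss convention). `ratings` is an
    n_targets x k_raters array with no missing values."""
    ratings = np.asarray(ratings, dtype=float)
    n, k = ratings.shape
    grand = ratings.mean()
    row_means = ratings.mean(axis=1)   # per-target (exemplar) means
    col_means = ratings.mean(axis=0)   # per-rater means
    # Sums of squares for the two-way layout
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_total = ((ratings - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols
    bms = ss_rows / (n - 1)             # between-targets mean square
    jms = ss_cols / (k - 1)             # between-raters mean square
    ems = ss_err / ((n - 1) * (k - 1))  # residual mean square
    # Average-measures, two-way random effects
    return (bms - ems) / (bms + (jms - ems) / n)
```

With six raters per exemplar (k = 6), the same function yields the ICC(2,6) values reported above; libraries such as pingouin (`pingouin.intraclass_corr`) compute the full family of ICC forms and their confidence intervals.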
The final version of the ePortfolio rubric, along with the supporting document, was implemented during the Fall 2014 semester in the HDFS 2030 course.