The University of Auckland

Faculty development for strengthening online teaching capability: being responsive to what staff want, evaluated with Kirkpatrick’s model of teaching effectiveness

Version 4 2023-06-24, 03:55
Version 3 2023-06-22, 23:02
Version 2 2023-06-21, 01:30
Version 1 2023-06-02, 21:39
Posted on 2023-06-22 - 23:02 authored by Rachelle Singleton
<p>We gathered five datasets: an all-teaching staff survey (staff n=105), interviews (staff n=20), staff H5P workshop exit surveys (staff n=86), student grades and student evaluations (n=809). </p> <p><br></p> <p>Each dataset speaks to a different level of evaluation using the BEME 2006 adaptation (Steinert <em>et al.</em> 2006) of Kirkpatrick's Four-Level Model (BEME evaluation) to evaluate the teaching effectiveness of FD initiatives (see Electronic supplementary material (ESM) A). Datasets I and II capture "what staff want" in terms of increasing their confidence and capability to teach online. Dataset III captures the reaction and learning of teaching staff after their H5P workshop (BEME evaluation levels 1 and 2A). Datasets IV and V provide results of implementing H5P with students (BEME evaluation level 4B). We used institutional H5P license usage data to evaluate staff behavior change and adoption of H5P (BEME evaluation levels 3 and 4B). </p> <p><br></p> <p><em>Datasets I and II:</em> We designed the all-teaching staff survey (dataset I) (see ESM B) and interviews (dataset II) (see ESM C) to capture a holistic view of teaching staff's perceptions of all the Faculty Development (FD) available and how they have applied any knowledge or skills acquired from it. Members of the research team completed the survey and refined it. We emailed the online survey to the Faculty of Medical and Health Sciences (FMHS) University of Auckland teaching community, comprising approximately 460 staff members. We included an invitation to an interview via a link at the end of the survey. Following face-validity checks, we pilot tested the dataset I survey (all-teaching staff survey) with the first five responses, made minor changes, and continued recruiting. As this is a descriptive study, we did not undertake further validity and reliability testing of the dataset I survey. 
We iteratively refined the interview question guide with our external consulting agency (<a href="https://www.academic-consulting.co.nz/" target="_blank">Academic Consulting LTD</a>), to whom we outsourced collection of the interview data in order to minimize bias in the study.</p> <p><br></p> <p>Qualitative sampling continues until data saturation, the point at which no new information emerges from the collected data, which often occurs with 12 participants in a relatively homogeneous sample (Ando et al. 2014). We closed the all-teaching staff survey (dataset I) at 86 respondents and stopped interview recruitment at 20 participants (dataset II). </p> <p><br></p> <p><em>Dataset III:</em> The H5P workshop exit survey (see ESM D) incorporated elements of two existing questionnaires: it drew on Brookfield's (1998) critical incident questionnaire, captured participants' reactions in the emotional domain using an emoji sentiment scale (Marder <em>et al.</em> 2020), and used the Net Promoter Score (Reichheld 2003) as a global satisfaction metric. Members of the research team completed the survey and refined it. As this is a descriptive study, we did not undertake further validity and reliability testing of the dataset III survey. We distributed these surveys at the end of each H5P FD workshop and obtained an approximate 30% response rate. </p> <p><br></p> <p><em>Dataset IV:</em> We collected routine anonymous Student Evaluation of Teaching [SET] data (IVa) and formative evaluations (IVb) from the course 'MEDSCI 203 - Mechanisms of Disease' from 2018 to 2021. We retrospectively analyzed the data to compare students' perceptions of the course before and after the incorporation of interactive online resources made with H5P, first introduced in 2019 by an early adopter of H5P. 
Response rates ranged from 14.1% to 97%, with class sizes ranging from 182 (2018) to 314 (2021) students per year.</p> <p><br></p> <p><em>Dataset V:</em> We analyzed student grades from the MEDSCI 203 course to compare student grades in coursework assignments before (2018) and after incorporating interactive online resources made with H5P in 2019.</p>
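For readers unfamiliar with the Net Promoter Score used in the dataset III exit survey, the following minimal sketch (not part of the study's analysis code; the function name and sample ratings are purely illustrative) shows how the metric is conventionally derived from 0-10 "likelihood to recommend" ratings (Reichheld 2003):

```python
def net_promoter_score(ratings):
    """Return the NPS (-100 to 100) for a list of 0-10 integer ratings.

    Conventionally, promoters score 9-10 and detractors score 0-6;
    NPS = (% promoters) - (% detractors).
    """
    if not ratings:
        raise ValueError("no ratings supplied")
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)


# Illustrative sample: 5 promoters, 3 passives (7-8), 2 detractors.
sample = [10, 9, 9, 10, 9, 7, 8, 8, 3, 6]
print(net_promoter_score(sample))  # 30.0
```

The score therefore ranges from -100 (all detractors) to +100 (all promoters), with passives (7-8) counted in the denominator but in neither group.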

FUNDING

The author(s) declared that no grants were involved in supporting this work.

Publisher

University of Auckland
