Improving eLearning Project Intake in a Government Agency
Amy Jones
University of West Florida Libraries
Doctor of Education (EdD), University of West Florida
2024
Abstract
In this study, I identified, implemented, and evaluated an appropriate solution set to improve the quality of asynchronous online training (i.e., eLearning) project intake at a research site in a government agency. The current project intake mechanism left eLearning production teams without sufficient information to move forward with educational projects efficiently.
Organizations hold education departments accountable for the return on investment for training (Rossett, 2006). Data collected during project intake and effective translation of those data set the stage for evaluating training projects and determining their value (Rossett, 2006; Watkins et al., 2012). Thus, finding appropriate remedies for eLearning intake issues is imperative to ensure education departments can collect necessary data during intake and evaluation to validate return on investment for their products.
I conducted a study to examine what the research site could do to improve eLearning project intake. The selected framework was grounded in human performance technology methodology (Van Tiem et al., 2012) and best practices, including front-end analysis, gap and cause analyses, intervention selection, implementation, and evaluation.
The front-end analysis included organizational and environmental analysis activities, which required data collection. I collected elicited data through qualitative one-on-one interviews and focus groups with research site employees participating in intake activities. Extant data sources included artifacts from previous analyses, organizational transformation activities, intake records, and agency regulations. Each source underwent a systematic document review process. I collected all data to inform the current and desired states of eLearning project intake.
The Road Map for the Process of Qualitative Analysis guided the overarching qualitative data analysis process (Bloomberg, 2023). I deidentified and sorted transcripts into units of meaning. I developed coding, categorization, theming, and key findings according to the codes-to-theory model for qualitative inquiry and best practices outlined in The Coding Manual for Qualitative Researchers (Saldaña, 2021). I defined codes in a codebook to ensure consistency in the coding process. Categories shaped themes in the data, translating into assertions about the characteristics of the larger population. These assertions revealed four performance gaps between the research site’s current and desired eLearning intake mechanism states.
I prioritized the four performance gaps according to organizational importance and how feasible they would be to address in the organization’s current state. One gap regarding knowledge and understanding of the instructional design role in intake was feasible and important for the organization to address.
Due to organizational limitations at the time of this study, I could not address the root cause of eLearning project intake deficiencies. However, using a current reality tree, I identified two contributing causes, one of which was the gap in knowledge and understanding of the instructional design role. I then aligned the contributing causes with an appropriate solution: an instructional design community of practice (CoP). Additionally, participant control was essential to mitigate the risk of change fatigue, given the many organizational changes staff had experienced in the 3 years leading up to this study.
I designed the Instructional Design CoP following CoP recommendations and best practices from several sources, including Building Successful Communities of Practice (Webber, 2016). Principles from the diffusion of innovations change management framework (Rogers, 2003) guided CoP implementation. CoP design experts recommend that members of a CoP develop the community organically, with support from the organization and a human performance technology practitioner (Barab et al., 2006; Webber, 2016). Per change management best practices, the initial CoP meetings included instructional designers who were considered innovators and early adopters of change (Rogers, 2003).
I used the Dessinger-Moseley full-scope evaluation model (Dessinger & Moseley, 2002) to guide the CoP evaluation procedures and plans. CoP design underwent formative evaluation through a series of meetings with the executive sponsor. I developed a plan for summative and confirmative evaluations, which followed the Dessinger-Moseley full-scope evaluation model (Dessinger & Moseley, 2002), the guiding or evaluative criteria for examining a designed CoP (Barab et al., 2006), and the collection of value creation stories (Wenger et al., 2002). Following the initial collaboration of CoP members, I transitioned CoP activities and evaluation plans to the primary stakeholders.