

Predicting and understanding student dropout in vocational education - PART II

Irene Eegdeman



In January 2020, we wrote a blog about predicting and understanding student dropout in vocational education. Back then, the conclusion was that, to improve students’ educational careers and their success rate in vocational education, we needed to come up with better measures of predicting dropout as early as possible. Achieving this goal would also require richer data and machine learning algorithms that could process such data to identify at-risk students. In the meantime, we have gained more insight into the expectations students hold at the beginning of their program, and we have designed a method that uses a series of machine learning algorithms to identify at-risk students efficiently. Finally, we compared our machine learning models with the predictions teachers make based on their gut feeling. This blog serves to update the LEARN! audience on the developments of our project.

 

Do students who drop out have different expectations about the content of their program than successful students?


Unrealistic expectations about one’s study program have been linked to negative consequences for future academic success (e.g., Baker, McNeil, & Siryk, 1985; Fonteyne, Duyck, & De Fruyt, 2017; Helland, Stallings, & Braxton, 2002; Jacob & Wilder, 2010; Maloshonok & Terentev, 2017; Tinto, 1975; Wigfield & Cambria, 2010; Zijlstra & Meijers, 2006). However, the studies that have shown this are often retrospective and focus on performance-related expectations (e.g., expected grades), whereas unrealistic expectations about the required effort and the content of the program (content-related expectations) may be more relevant for explaining dropout in tertiary education.

In our study, we prospectively investigated whether the content-related expectations of 208 Dutch Sport Academy students, elicited before the start of their vocational program, are associated with subsequent dropout and academic performance. Our results show that students who dropped out did not differ from successful students in their expected grades (even though they did differ in prevocational GPA). Moreover, their content-related expectations at the start of the program did not differ from those of successful students, nor were they less realistic. Still, upon retrospective inquiry, 50% of the students answered that the program did not fit them. This result suggests that retrospective reports of inadequate expectations may not reflect low expectations before starting the program. Instead, tertiary educational programs may defy the expectations of both successful and unsuccessful students, with surprises being pleasant for the former and unpleasant for the latter.

 

Teacher vs. computer: who can predict dropout best?


Machine learning algorithms use data to identify at-risk students early on so that future dropout can be prevented. We presented a new method that uses a series of machine learning algorithms to efficiently identify students at risk and that makes the sensitivity/precision trade-off inherent in targeting students for dropout prevention explicit (Eegdeman, Cornelisz, Meeter, & van Klaveren, 2022). Teachers, on the other hand, tend to rely on subjective rules, or gut feeling, to signal at-risk students (Südkamp, Kaiser, & Möller, 2012). Are such subjective observations by teachers indeed predictive of dropout, and can they help increase the prediction performance of machine learning algorithms?
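To make the sensitivity/precision trade-off concrete, here is a minimal sketch in Python. It is not the method from Eegdeman et al. (2022): it trains a single logistic regression on synthetic enrolment data (the features, coefficients, and dropout rate are invented for illustration) and shows how sensitivity and precision shift as the threshold for flagging a student as at-risk is varied.

```python
# Minimal sketch of the sensitivity/precision trade-off in dropout prediction.
# Synthetic data only; not the model or data from Eegdeman et al. (2022).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_recall_curve
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500
# Hypothetical features: prevocational GPA (Dutch 1-10 scale),
# attendance rate in the first weeks, and age at enrolment (noise here).
X = np.column_stack([
    rng.normal(6.5, 0.8, n),
    rng.uniform(0.5, 1.0, n),
    rng.integers(16, 22, n),
])
# Synthetic rule: dropout risk rises with lower GPA and lower attendance.
logit = 4.0 - 0.6 * X[:, 0] - 2.5 * X[:, 1]
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
probs = model.predict_proba(X_test)[:, 1]

# For each decision threshold, report sensitivity (recall) and precision.
precision, recall, thresholds = precision_recall_curve(y_test, probs)
for p, r, t in list(zip(precision, recall, thresholds))[::20]:
    print(f"threshold={t:.2f}  sensitivity={r:.2f}  precision={p:.2f}")
```

Lowering the threshold catches more of the eventual dropouts (higher sensitivity) but also flags more students who would have completed the program (lower precision), which matters when counseling capacity is limited.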

We put nine teachers in upper secondary vocational education to the test. For each of the 95 freshman students enrolled in the program, these teachers were asked whether that student would drop out by the end of the freshman year. Teachers answered this question at the beginning of the program and again after the first ten weeks. The results show that the average teacher predicts dropout better than the machine learning algorithms at the start of the program, but not after the first ten weeks, even though teachers’ prediction accuracy increases over time.
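For readers curious what such a comparison looks like in code, the sketch below scores hypothetical binary teacher judgements and model predictions against observed dropout. The numbers (95 students, the dropout rate, the agreement rates) are invented placeholders, not the study’s data.

```python
# Illustrative teacher-vs-model comparison on hypothetical data.
import numpy as np
from sklearn.metrics import accuracy_score, precision_score, recall_score

rng = np.random.default_rng(1)
n_students = 95
actual = rng.binomial(1, 0.3, n_students)           # observed dropout (0/1)
# Hypothetical predictions: each agrees with the outcome a fixed share of the time.
teacher = np.where(rng.random(n_students) < 0.75, actual, 1 - actual)
model = np.where(rng.random(n_students) < 0.70, actual, 1 - actual)

for name, pred in [("teacher", teacher), ("model", model)]:
    print(f"{name:8s} accuracy={accuracy_score(actual, pred):.2f}  "
          f"sensitivity={recall_score(actual, pred):.2f}  "
          f"precision={precision_score(actual, pred):.2f}")
```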

 

To improve students’ educational careers and their success rate in vocational education, we need to make our models better, especially at the start of the program, when we can still help students. Future research might assess whether including teachers’ predictions at the start of the program as potential features to be selected by the machine learning algorithms allows these algorithms to increase their accuracy; a rough sketch of this idea follows below. Doing so might enable a highly targeted approach to combating student dropout in the face of capacity constraints for counseling or participation in dropout prevention programs. Next, we want to know not only whom to target for a dropout prevention program, but also what to do in such a program. Stay tuned to learn about the next steps and what a ‘practoraat’ is…
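Before signing off, here is a rough illustration of that future direction. It is an assumed setup, not an analysis from the studies above: a hypothetical start-of-program teacher judgement is added as an extra feature column next to administrative data, and cross-validated accuracy is compared with and without it.

```python
# Sketch: does adding a (hypothetical) teacher judgement as a feature help?
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(2)
n = 95
gpa = rng.normal(6.5, 0.8, n)                 # administrative feature
teacher_flag = rng.binomial(1, 0.3, n)        # hypothetical "at risk" judgement
# Synthetic dropout process in which the teacher flag carries some signal.
p_drop = 1 / (1 + np.exp(-(2.5 - 0.6 * gpa + 1.0 * teacher_flag)))
dropout = rng.binomial(1, p_drop)

X_admin = gpa.reshape(-1, 1)                          # administrative data only
X_plus = np.column_stack([gpa, teacher_flag])         # plus teacher judgement

for label, X in [("administrative only", X_admin),
                 ("plus teacher judgement", X_plus)]:
    scores = cross_val_score(LogisticRegression(max_iter=1000), X, dropout, cv=5)
    print(f"{label:24s} mean CV accuracy = {scores.mean():.2f}")
```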

 

Irene Eegdeman is a PhD student with a doctoral grant for teachers from the Dutch Research Council; her (co-)promotors are Martijn Meeter and Chris van Klaveren. Ilja Cornelisz also contributed to several studies. Irene hopes to finish her PhD trajectory this year.

 

 

 


MEET THE AUTHOR

Irene Eegdeman, MSc

External PhD Candidate, Faculty of Behavioural and Movement Sciences, Educational and Family Studies

External PhD Candidate, LEARN! - Learning sciences

 


