Abstract
Teachers' assessments of students' writing are key to adapting instruction to individual needs but are often influenced by biases. The present studies examined whether student characteristics (gender and migration background) affect text assessments in the digital tool Student Inventory. In Study 1 (n = 117), pre-service teachers assessed six texts that differed in quality and in the student's gender. In Study 2 (n = 127), the student's migration background was manipulated instead. Texts were assessed holistically and analytically, considering content, style, and linguistic accuracy. Results from both studies showed that participants reliably differentiated between low- and high-quality texts on all assessment scales. The studies revealed small but significant gender biases only for medium-quality texts. The findings suggest that the Student Inventory facilitates an objective assessment of students' writing and supports fair assessment practices, largely regardless of students' gender or migration background.