The German Panel of Teacher Education Students: Surveying (Prospective) Teachers from Higher Education into Working Life

Open Access | May 2023

1 Background

This paper describes the design, survey instruments, and data of the German Panel of Teacher Education Students (Lehramtsstudierenden-Panel; LAP) as well as their potential uses. The LAP is a longitudinal study of (prospective) teachers in Germany that follows their professional and competence development from the very beginning of teacher training through to preparatory service and into professional life. It is a collaborative project of the Leibniz Institute for Educational Trajectories (Leibniz-Institut für Bildungsverläufe; LIfBi) and the German Centre for Higher Education Research and Science Studies (Deutsches Zentrum für Hochschul- und Wissenschaftsforschung; DZHW). The project has been funded by the Federal Ministry of Education and Research (Bundesministerium für Bildung und Forschung; BMBF).

The study is closely intertwined with Starting Cohort First-Year Students (SC5) of the National Educational Panel Study (NEPS; Blossfeld & Roßbach, 2019; Brachem et al., 2019), a panel study that started in 2010 and accompanies first-year students from the winter term of 2010/2011 onwards and well into their careers.1 When the sample was drawn, teacher education students were considerably oversampled, thus preparing the ground for the LAP study. In terms of design, the LAP study is fully integrated into NEPS SC5; in terms of constructs and survey instruments, it has complemented NEPS SC5 since 2014 (wave 8). In total, data were collected in 19 panel waves conducted between 2010 and 2022.

The main objective of the LAP is to improve the data basis for research on teacher education and the professional activities of teachers in Germany by (1) taking a longitudinal perspective (see Chapter 2.1), (2) drawing a large sample that covers the entire range of teacher education programmes in all German states (see Chapter 2.4), (3) addressing a broad range of topics (see Chapter 2.5), (4) allowing for comparisons with students and professionals in non-teaching areas, and (5) making the data available to the scientific community (see Chapter 3). The primary research interest underlying data collection is to describe and explain the development of teachers’ professional competencies and educational practices.

Central to this research focus is a theoretical approach that considers (prospective) teachers’ competencies, professional development, and teaching activities to be the result of the interplay between individual attributes of (prospective) teachers and the provision and use of learning opportunities. Following this opportunity-use model (Fend, 2006; Helmke, 2012), we measured both individual attributes of the panel members such as personality, interests, and self-concept (cf. Wohlkinger et al., 2019) and the provision and characteristics of learning opportunities. The assessment of the quality of learning environments is based on the so-called SSCO model (Bäumer et al., 2019; Schaeper & Weiß, 2016). This theoretical approach is applied throughout the NEPS and distinguishes four dimensions: structure (S), support (S), challenge (C), and orientation (O).

The concept of competence used in the LAP study follows Baumert and Kunter’s (2013) multidimensional competence model, which includes both cognitive (e.g., professional knowledge) and non-cognitive (e.g., beliefs, motivation) dimensions. Since it was not possible to measure teachers’ professional knowledge, the study focuses on beliefs (e.g., beliefs about teaching and learning), motivational orientations (e.g., motivation for choosing teacher education, teaching-related self-efficacy), and occupational self-regulation. However, as part of the NEPS SC5 testing programme, basic domain-specific competencies and domain-general or meta-competencies such as ICT literacy and social competencies (cf. Weinert et al., 2019) were assessed (see Chapter 2.5.2). This data can be, and has been, used to answer teacher-related questions, such as the level of ICT literacy (Senkbeil et al., 2021) or the impact of social competencies (Carstensen & Klusmann, 2021). In addition, data on teaching-related abilities in information and communication technologies (ICT) was collected using self-report instruments.

When assessing professional competencies, the LAP study measures both general aspects (e.g., professional self-concept for teaching, enthusiasm for teaching) and specific aspects. Given the challenges associated with inclusive teaching and increasingly multicultural classes, and the growing importance of ICT for teaching and learning, the study places a special emphasis on these three areas and gathers information, for example, on inclusion-related, multicultural, and ICT-related beliefs. In order to provide an opportunity to expand knowledge on the effects of these competencies and the driving factors behind them, we collected data on learning opportunities for teaching in inclusive and multicultural classes and also on several teaching practices. Using this data, Menge et al. (2021) examined the effect of learning opportunities and experiences on (prospective) teachers’ beliefs towards inclusion and self-efficacy regarding inclusive teaching.

The LAP study addresses all stages of teacher formation. First, the initial phase in higher education: Although it was not possible to include specific teacher-related questions in the survey programme before 2014, the comprehensive basic surveys of NEPS SC5 open up the opportunity to examine many research questions on teacher candidates and teacher education. Complementing previous research on specific characteristics of teacher education students, the large and diverse sample of the LAP study allows for a more differentiated view, as taken, for example, by Hartmann and Ertl (2021), Hartmann et al. (2022), Neugebauer (2020), and Osada and Schaeper (2021), and for a focus on specific groups, such as students with a migrant history (cf. Besa & Vietgen, 2017; Gülen, 2021, 2022). Other questions that can be analysed using the core data of NEPS SC5 concern learning experiences (see Rochnia et al., 2019) and the factors that influence educational choices during higher education.

The second stage of teacher training, the preparatory service or induction phase, combines practical training in schools with studies in educational theory and subject-related didactics at seminars, and concludes with a state examination. Since this Referendariat or Vorbereitungsdienst is considered a crucial phase of teacher education, which is often associated with high levels of occupational stress (Drüge et al., 2014; Klusmann et al., 2012), the LAP study gathers information on relevant aspects of this learning environment.

‘Learning while working as a teacher’ constitutes the third stage of teacher education (Fussangel et al., 2016). This stage includes professional development (PD) activities and continuing education in formal and nonformal learning environments but also learning in informal settings (for the distinction between different types of learning contexts see Bäumer et al. (2019)). Following the model of Richter et al. (2010) for explaining participation in continuing education, the LAP study collects information on context-specific, social, psychological, and situational factors.

Not only do colleagues and school leaders play a role in individual decisions to take advantage of continuing learning opportunities; colleagues also offer informal learning opportunities, and both colleagues and the school management are elements of the school context that influence teaching and teachers’ learning (Lipowsky & Rzejak, 2019). The LAP study measures cooperation among colleagues as perceived by the individual teacher using a three-stage model that distinguishes between exchange, joint work, and co-construction (Dizinger & Böhm-Kasper, 2019; Gräsel et al., 2006).

Through their leadership style, school leaders can have a significant impact on the school climate and culture, and in this way also on individual teachers’ motivation, beliefs, well-being, and instructional practices (Blömeke & Klein, 2013; Leithwood & Sun, 2012; Pietsch & Tulowitzki, 2017; Windlinger et al., 2020). The LAP study collects information on two widely adopted (Pietsch & Tulowitzki, 2017) and highly predictive (see the meta-analyses of Hattie, 2009; Judge & Piccolo, 2004; Robinson et al., 2008; Sturm et al., 2011) concepts: instructional leadership, which is learning- and teaching-centred (Bush & Glover, 2014; Pietsch & Tulowitzki, 2017), and transformational leadership, which is directed towards “building up the capacity of those they work with, motivating them towards instilling change and transformation” (Pietsch & Tulowitzki, 2017, 632).

Finally, the LAP study addresses “alternatively certified” teachers, who have come into the focus of policy and research because of teacher shortages, especially in STEM subjects. In response, Germany has opened up alternatives to the traditional route into the teaching profession. Under certain conditions, it is possible to work as a teacher without a teaching-related higher education degree, either after completing preparatory service (Quereinstieg) or without having undertaken preparatory service (Seiteneinstieg; cf. Lucksnat et al., 2020).

The variety of topics briefly mentioned above shows that the LAP data can be used to address a wide range of research questions, particularly in educational psychology. More details can be found in Chapter 4.

2 Methods

2.1 Study design

As mentioned above, the main general objective of the NEPS is to collect longitudinal data on the development of competencies, educational processes, educational decisions, and returns to education in formal, nonformal, and informal contexts throughout the life span. In order to make relevant information on educational transitions available as soon as possible, the educational biography has been divided into eight stages with a particular focus on critical transitions. Within these eight stages, the NEPS started with six different cohorts in order to observe educational careers and transitions. This multicohort-sequence design (Blossfeld et al., 2009; von Maurice et al., 2016) makes it possible to collect comparative data on competence development and educational pathways over the entire life course, while at the same time providing the data in a relatively short timeframe. Theoretical dimensions, so-called ‘pillars’, provide a conceptual framework and integrate data collection in these six starting cohorts.

One of the starting cohorts mentioned refers to the stage of higher education. For this cohort, first-year students enrolled at a German higher education institution (HEI) in the winter term of 2010/2011 were selected (Aßmann et al., 2019). HEIs include universities and equivalent institutions such as colleges of art or music, as well as universities of applied sciences. Students attending private HEIs and students enrolled in a teaching degree programme were oversampled. As mentioned above, the latter form the basis of the LAP project. For details on the sampling design see Chapter 2.4.

In Starting Cohort First-Year Students several modes of data collection were employed: self-administered paper-and-pencil questionnaires (PAPI), computer-assisted telephone and personal interviewing (CATI and CAPI), online surveys (computer-assisted web interviewing; CAWI), and competence testing.

Following the initial PAPI in the winter term of 2010/2011, annual CATIs (and one CAPI in wave 12) focus primarily on (an update of) the life course, including the educational and employment history of the participants. Central longitudinal measures of the NEPS pillars can be found in the CATIs/CAPIs as well as in the CAWIs, which were conducted annually from 2011 until 2014 and biennially thereafter. In addition, questions particularly aimed at higher education students are integrated into the CAWIs. All of these survey instruments were also addressed to (prospective) teachers and already open up a wide range of possibilities for research on teacher training and the teaching profession. With the start of the LAP project, an additional survey programme specifically for (prospective) teachers has been implemented in the NEPS SC5 surveys from wave 8 (autumn 2014) onwards, with the LAP study fully integrated into the survey design.

The data collected alongside the life course information within a wave can be interpreted as cross-sectional data. However, since the central constructs are usually collected in more than one wave, this data often contains longitudinal information that makes it possible to observe intra-individual developments.

A variety of modes were used for the competence tests (Brachem et al., 2019). The first competence test in wave 1 was administered as a paper-based assessment (PBA). For the second test of competencies in wave 5, a mode experiment was conducted with individual web-based testing (CBWA), as well as three test modes in group settings: conventional paper-based assessment (PBA), paper-based assessment with digital pencils (E-Pen), and computer-based assessment (CBA). In wave 7, a subject-specific competence test in business administration was administered as an individual paper-and-pencil test to students or graduates of business administration and economics. The last competence test of NEPS SC5 was conducted in wave 12, again as a mode experiment in two modes: CBA during a CAPI survey and CBWA after a CATI survey (for more details, see the ‘Information on Competence Tests’ and ‘Field Reports’ at: https://www.neps-data.de/Data-Center/Data-and-Documentation/Start-Cohort-Students/Documentation).

2.2 Time of data collection

Between 2010 and 2022, 19 panel waves were conducted. The CATIs, with their focus on the life course, always started in spring, while the online surveys usually started in autumn. Only the nineteenth wave, in 2022, deviates somewhat from this pattern: wave 19 is a combined CATI and CAWI survey in which all study participants who took part in the telephone interview were invited to the online survey immediately afterwards. In addition, an unscheduled online survey on the effects of the Corona pandemic was conducted with the study participants of all NEPS starting cohorts in May/June 2020.2 A detailed list of the studies conducted can be found in Table 1. The table also shows when the competence tests took place. The data published to date comprises data from the first 17 panel waves and the additional 2020 Corona survey.

Table 1

Overview of surveys and tests.

WAVE | MODE/METHOD | START | END | COMMENT
1 | PAPI (recruitment survey) & CATI | 30.11.2010 | 28.01.2012 | –
1T | Competence test | 21.03.2011 | 22.07.2011 | Group administered PBA
2 | CAWI | 26.10.2011 | 11.12.2011 | –
3 | CATI | 10.04.2012 | 18.08.2012 | –
4 | CAWI | 29.10.2012 | 17.12.2012 | –
5 | CATI | 19.03.2013 | 03.08.2013 | –
5T | Competence test | 02.05.2013 | 31.07.2013 | Mode experiment: group administered PBA/E-Pen/CBA & individual CBWA
6 | CAWI | 29.10.2013 | 15.12.2013 | –
7T | Competence test | 24.01.2014 | 06.04.2014 | Individual subject-specific PBA
7 a) | CATI | 28.04.2014 | 13.09.2014 | –
8 b) | CAWI | 29.10.2014 | 07.12.2014 | –
9 | CATI | 27.04.2015 | 31.08.2015 | –
10 | CATI | 21.03.2016 | 07.08.2016 | –
11 | CAWI | 02.11.2016 | 11.12.2016 | –
12 | CAPI / CATI | 27.02.2017 / 25.04.2017 | 04.08.2017 / 23.09.2017 | Mode experiment: CAPI/CATI
12T | Competence test | 27.02.2017 | 30.11.2017 | Mode experiment: individual CBA and CBWA
13 | CATI | 23.04.2018 | 01.09.2018 | –
14 | CAWI | 07.11.2018 | 16.12.2018 | –
15 | CATI | 18.03.2019 | 03.08.2019 | –
16 | CATI | 16.03.2020 | 01.08.2020 | –
– | CAWI | 13.05.2020 | 22.06.2020 | Additional NEPS survey on the Corona pandemic, published since version 14.1.0 of the Scientific Use File (SUF)
17 | CAWI | 04.11.2020 | 11.01.2021 | –
18 c) | CATI | 22.03.2021 | 21.08.2021 | –
19 | CATI + CAWI | 19.04.2022 | 17.09.2022 / 13.11.2022 | Combined CATI and CAWI survey
a) Without the LAP oversample; b) Additional questions specific to teaching for the first time (see Chapter 2.1 and 2.5); c) For internal reasons, the field period ended for the LAP oversample on 19.06.2021 and the additional questions specific to teaching were suspended.

Please note that for funding reasons the initial sample of teacher education students was randomly divided into a “LAP basic sample” and a “LAP oversample”. The relative size of the basic sample corresponds to the proportion of teacher education students in the population, and the size of the oversample is the surplus. The two groups were sometimes treated differently.

2.3 Location of data collection

All data was collected across all German federal states. In addition, study participants who belong to the initial sample were also surveyed abroad when they moved to another country.

2.4 Sampling, sample and data collection

The target population of NEPS Starting Cohort First-Year Students consists of all students who enrolled at a public or state-approved HEI in Germany for the first time in the winter term of 2010/2011 and were aiming to complete one of the following degrees: a bachelor’s degree, a state examination (Staatsexamen) in medicine, law, pharmacy, or teaching, or a diploma or master’s degree in theology. However, first-year students studying at higher education institutions run by federal ministries or federal states for members of their own public services were excluded. During sampling, special emphasis was placed on students with non-traditional entrance qualifications, i.e., students without a school-leaving certificate qualifying for higher education. Furthermore, students at private HEIs and teacher education students were overrepresented in the sample, the latter forming the initial sample of the LAP. More details on the population and sampling are described in Zinn et al. (2017) and Aßmann et al. (2019).

The sample for the first wave was drawn using a stratified cluster sampling approach. A particular field of study at a specific higher education institution represented one cluster (primary sampling unit), and all students within each cluster were surveyed. To oversample teacher education students (and students at private HEIs), a first stratification level distinguished between clusters in terms of type of HEI, institutional control of HEI, and teacher education programme. This first stratification level consisted of four strata (h1 = clusters linked to teaching tracks; h2 = all other fields of study at public universities; h3 = all other fields of study offered at public universities of applied sciences; h4 = all study tracks at private universities or private universities of applied sciences). To reduce sampling error, a second stratification level was introduced, which was defined using aggregated fields of study. A detailed report of the stratified cluster sampling approach as well as a detailed description of the clusters and strata can be found in Zinn et al. (2017).
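
As a purely illustrative aid, the following minimal sketch shows how a selection of this kind (clusters sampled within strata, with all students of a sampled cluster retained) could be expressed in Python. The column names, the toy sampling frame, and the fixed number of clusters per stratum are assumptions made for illustration only; the actual selection rates per stratum, which produce the oversampling of teaching tracks, and the second stratification level are documented in Zinn et al. (2017).

```python
# Illustrative sketch of a stratified cluster sample: clusters (field of study
# x institution) are sampled within strata, and every student in a sampled
# cluster enters the gross sample. All column names are hypothetical.
import pandas as pd

def draw_cluster_sample(frame: pd.DataFrame, clusters_per_stratum: int,
                        seed: int = 1) -> pd.DataFrame:
    """Sample whole clusters within each stratum and keep all of their students."""
    parts = []
    for _, stratum_frame in frame.groupby("stratum"):
        clusters = stratum_frame["cluster_id"].drop_duplicates()
        chosen = clusters.sample(n=min(clusters_per_stratum, len(clusters)),
                                 random_state=seed)
        parts.append(stratum_frame[stratum_frame["cluster_id"].isin(chosen)])
    return pd.concat(parts, ignore_index=True)

# Toy sampling frame: one row per first-year student; strata h1-h4 as in the text.
frame = pd.DataFrame({
    "student_id": range(8),
    "stratum":    ["h1", "h1", "h1", "h2", "h2", "h3", "h4", "h4"],
    "cluster_id": ["u1_teaching", "u1_teaching", "u2_teaching",
                   "u1_economics", "u2_law", "uas1_engineering",
                   "priv1_business", "priv1_business"],
})
gross_sample = draw_cluster_sample(frame, clusters_per_stratum=1)
```

Sampling whole clusters keeps fieldwork efficient, at the cost of intra-cluster correlation; the cluster and stratification variables released with the ‘Weights’ file allow data users to take this into account.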

In order to achieve higher response rates, two separate approaches were used for recruiting study participants: In a first step, each student was invited by letter to participate. In a second step, field workers were sent into central first-year courses to personally recruit participants for the survey. This method had been previously tested in a pilot study and resulted in both better participation rates and better panel retention (Brachem et al., 2019).

In the first wave, a total of 17,909 first-year students participated in the NEPS student cohort, including 5,554 teacher education students (according to variable tg02001; see Chapter 3.10), who form the initial LAP target group. The various school types of the German school system are also mirrored in different teacher education programmes. In the initial sample, 13% of the teacher education students had taken up a training course for primary education, 19.5% for lower secondary education (excluding lower secondary education at a Gymnasium), 54.5% for upper secondary education (including lower secondary education at a Gymnasium but excluding vocational education), 5% for special education, and 6.5% for vocational education. One and a half per cent of the teacher education students stated that their degree programme did not differentiate between school types. As is common in teacher education programmes, most students are female (75.5%).

However, educational and occupational trajectories are not linear, and dropping out or changing study programmes is common. Hence, the target population changes over time due to people dropping out of or moving into teacher education, or entering the teaching profession without having completed a teaching degree (e.g., lateral entrants). The LAP target population was initially limited to teacher education students, trainee teachers in preparatory service and working teachers. From wave 11 onwards, the definition of the target population was expanded to include respondents who (a) have completed preparatory service and intend to work as teachers, but are not yet employed, (b) have completed the first phase of teacher education and intend to complete preparatory service but have not yet started it, and (c) have temporarily interrupted employment as a teacher, e.g., due to parental leave.

Table 2 reports the number of participants from wave 8 to wave 17 who belong to the LAP target group. The table also shows the auxiliary variables generated from wave 8 onwards to filter the teacher-specific questions and determine the LAP sample (an illustrative sketch following Table 2 shows how such a variable might be used). More information on identifying the LAP target population, also in earlier waves, and on the auxiliary variables provided for this purpose is given in Chapter 3.10. A full description of the student cohort’s recent panel development can be found in Ziesmer (2022).

Table 2

Size of the LAP target group by status (wave 8 to wave 17).

WAVE | VARIABLE | TEACHER EDUCATION PROGRAMME | PREPARATORY SERVICE | EMPLOYED AS TEACHER | TOTAL
8 a) | tg60011 | 2,757 | – | – | 2,757
9 b) | tg60014 & tg64012 | 2,870 | 341 | – | 3,211
10 | tg60022_v1, tg60015 & tg64012 | 1,679 | 857 | 282 | 2,818
11 | tg60012_v1 | 905 | 776 | 291 | 1,972
12 | tg60013_v1 | 823 | 1,042 | 691 | 2,556
13 | tg60013_g1v2 | 420 | 686 | 1,023 | 2,129
14 | tg60017_v1 | 222 | 304 | 840 | 1,366
15 | tg60013_g1v2 | 244 | 318 | 1,279 | 1,841
16 | tg60013 | 135 | 152 | 1,444 | 1,731
17 | tg60017 | 109 | 82 | 1,197 | 1,388
a) Information on participants in preparatory service or working as teachers not shown separately; b) Information on participants working as teachers not shown separately.
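
The following minimal sketch illustrates how one of the wave-specific auxiliary variables from Table 2 might be used to flag the LAP target group in the corresponding questionnaire file. The file name, the name of the wave variable, and the assumption that valid (non-missing, positive) codes of tg60012_v1 indicate membership in the target group are assumptions; the authoritative coding is documented in the codebook and the LAP subchapter of the data manual.

```python
# Hedged sketch: identify the LAP target group in wave 11 via the auxiliary
# variable tg60012_v1 (see Table 2). File name, wave variable, and value
# coding are assumptions to be checked against the codebook and data manual.
import pandas as pd

cawi = pd.read_stata("SC5_pTargetCAWI_D_17-0-0.dta",   # hypothetical SUF file name
                     convert_categoricals=False)

wave11 = cawi[cawi["wave"] == 11]                      # wave identifier assumed
status = wave11["tg60012_v1"]                          # auxiliary status variable, wave 11
lap_wave11 = wave11[status.notna() & (status > 0)]     # assumed coding of valid statuses

print(f"LAP target group in wave 11: {len(lap_wave11)} respondents")
```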

Participation rates are consistently lower in the CAWIs than in the CATIs, and the proportion of students who could be reached for an interview or test declined significantly over time. The main reasons for panel attrition are missing contact information, continuous non-participation over a period of three years (so-called ‘final dropouts’), and withdrawal of panel consent (Aßmann et al., 2019). There is evidence that accessibility and participation of study participants depend on the topic of the survey and personal availability as well as on socio-demographic factors (Liebeskind & Vietgen, 2017; Zinn et al., 2018). To account for selective study participation, weights have been estimated for every wave. More details on participation development, panel attrition, sample selectivity and weights can be found in Zinn et al. (2018) and, for the most recent Scientific Use File (SUF), in Ziesmer (2022).
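
To give an idea of how the wave-specific weights enter an analysis, the following self-contained sketch computes a design-weighted share. The variable names and values are made up; the actual weight variables and their intended use are described in the documentation accompanying the ‘Weights’ file (see Table 3).

```python
# Toy example of a design-weighted estimate to adjust for selective participation.
# Variable names ('is_teacher', 'w_wave') are purely illustrative.
import numpy as np
import pandas as pd

df = pd.DataFrame({
    "is_teacher": [1, 0, 1, 1, 0],            # hypothetical analysis variable
    "w_wave":     [1.2, 0.8, 1.5, 0.9, 1.1],  # hypothetical wave-specific weight
})

weighted_share = np.average(df["is_teacher"], weights=df["w_wave"])
unweighted_share = df["is_teacher"].mean()
print(f"weighted: {weighted_share:.3f}, unweighted: {unweighted_share:.3f}")
```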

2.5 Materials and survey instruments

2.5.1 Materials

The survey instruments of all published waves as well as other information materials are well documented and publicly available. The following list describes some of the main sources of information and documentation materials for data users (see also Table 4):

Table 3

Overview of NEPS SC5/LAP datasets (as of February 2023).

DATA FILE a) | CONTENT
Basics | Basic information about participants, such as sociodemographic characteristics
Biography | Integrated and edited life course data; serves to facilitate the analysis of complex life course data
CohortProfile | Meta information for all participants and waves, such as participation status and interview date
EditionBackups | Information on single values that have been changed or modified in the data edition process
Education | Longitudinal information on transitions in participants’ educational history
MethodsCATI | Paradata from the CATI interviews, e.g., interview duration, interviewer characteristics
MethodsCAWI | Paradata from the CAWI interviews, e.g., interview duration
MethodsCompetencies | Paradata from the competence tests, e.g., test date, interviewer characteristics
pTargetCATI | Data from CATI questionnaires
pTargetCAWI | Data from CAWI questionnaires
pTargetMicrom | Small-scale regional indicators of participants’ place of residence; only available via on-site access
spChild | Information on participants’ biological, foster, and adopted children
spChildCohab | Data on cohabitation spells with children
spCourses | Information on courses/trainings taken during employment, unemployment, parental leave, and other episodes
spEmp | Information on employment episodes
spFurtherEdu1 | Information on further courses since the last interview that have not been reported elsewhere
spFurtherEdu2 | Additional information on two randomly selected courses from the ‘spCourses’ and ‘spFurtherEdu1’ modules
spGap | Information on gaps in the individual life course identified by a check module (e.g., type of gap)
spInternship | Information on internship episodes that took place after completing school and before or during higher education
spMilitary | Data on episodes of military/civilian service and voluntary gap years
spParLeave | Data on parental leave episodes
spPartner | Data on the participants’ partnership history and basic information about the participants’ partners
spSchool | Data on participants’ general school history
spSchoolExtExam | Information on school certificates acquired by recognition or external examination
spSibling | Basic information on participants’ siblings reported in wave 1
spUnemp | Data on unemployment episodes
spVocBreaks | Information on breaks in further education and training with a special focus on higher education
spVocExtExam | Information on vocational education certificates acquired outside the regular German educational system, e.g., certificates acquired abroad or as an external examinee
spVocPrep | Data on episodes of vocational preparation after general education
spVocTrain | Information on participants’ vocational education history covering all further (vocational and/or academic) trainings
StudyStates | Information on participants’ higher education phase or status, derived from ‘spVocTrain’; e.g., type of degree acquired, ongoing/completed degree programme
Weights | Sample weights, cluster and stratification variables
xEcoCAPI | Data of a subject-specific competence test in business administration, conducted on a subsample of students in economics
xInstitution | Context data for all higher education institutions (e.g., size, regional unemployment rate) and subject areas within institutions (e.g., number of female and male first-year students, number of professors); most variables only available on-site
xPlausibleValues | Plausible values of the competence data stored in ‘xTargetCompetencies’
xTargetCompetencies | Data from competence tests in reading (German), mathematics, science, ICT, English, and domain-general cognitive functioning
xTargetCORONA | Data collected in an additional survey in May/June 2020 regarding the impact of the COVID-19 pandemic on participants’ lives
NEPS-SC5-ADIAB | Combined NEPS SC5 data and administrative data from the Institute for Employment Research (IAB); available in restricted access environments
a) The complete names of the datasets consist of (1) an indicator for the starting cohort (e.g., SC5), (2) the unique file name indicating content and type of data (e.g., Basics), (3) an indicator for the confidentiality level (D = download; R = remote access; O = on-site access), and (4) an indicator for the release version. For example, the ‘Basics’ file of NEPS SC5, download version, SUF release 17.0.0 is labelled ‘SC5_Basics_D_17-0-0’. Unique file names with the prefix ‘x’ refer to cross-sectional data, the prefix ‘sp’ indicates spell data, and the prefix ‘p’ panel data. Datasets without a prefix are generated by the LIfBi Research Data Center.

Table 4

Overview of relevant documents and materials (as of February 2023).

DOCUMENT/MATERIAL | DESCRIPTION
NEPSplorer a) | Tool for searching through survey instruments of all NEPS Starting Cohorts
Glossary b) | Overview of the NEPS terminology and abbreviations and specific terms in the German education and occupation system
Questionnaires b) | Provided as SUF and field versions (SUF versions in German and English)
Field reports b) | Information on the data collection process
Interviewer manuals b) | Information for the interviewers on the process and content of the interviews
Check module for life course data b) | Description of the module that checks the life course data reported in CATI for gaps between and overlaps of episodes
Information on competence tests b) | Description of the measured constructs, data, and psychometric properties
Data manual b) | Overview of the panel waves and data
Codebook b) | Overview of all variables measured
Semantic data structure file b) | Contains meta-data stored in the SUF; useful to explore the data
Release notes b) | Information on known bugs, solutions, and changes compared to previous versions
Anonymisation manual b) c) | Description of the anonymisation procedure; overview of restricted variables
Sample, weights, nonresponse b) | Information on the sampling process and the construction of the weighting variables
Merging matrix b) | Instructions on how to link information from different datasets
Non-traditional students b) | Documentation of the variable tg24150 (non-traditional students)
Context data b) | Description of the context data for higher education institutions (file ‘xInstitution’)
Regional data b) | Documentation of the regional structural information; can be merged on-site
Immigrants in the NEPS d) | Information on how immigrants were identified and categorised in the NEPS
Special Stata commands e) | NEPS-specific Stata commands in the package ‘NEPStools’
a) https://www.neps-data.de/Data-Center/Overview-and-Assistance/NEPSplorer.
b) https://www.neps-data.de/Data-Center/Data-and-Documentation/Start-Cohort-Students/Documentation.
c) https://www.neps-data.de/Data-Center/Data-Access/Sensitive-Information.
d) Olczyk et al., 2016.
e) https://www.neps-data.de/Data-Center/Overview-and-Assistance/Stata-Tools.
  • The NEPSplorer (https://www.neps-data.de/Data-Center/Overview-and-Assistance/NEPSplorer; description in Fuß & Wenzig, 2019; Skopek et al., 2016) is a tool that “performs a full text search through the German and English survey instruments of all released Scientific Use Files with the exception of competence tests” (Fuß & Wenzig, 2019, 370).

  • The Codebook of the most recent SUF gives an overview of all variables measured and provides information on the number of cases and measurement time points. The codebook for NEPS SC5 including the LAP data is available in German and English.

  • Questionnaires are provided as SUF and field versions. While the field versions consist of the original paper-and-pencil questionnaires or programming templates, the SUF versions include additional information such as the variable names. The SUF versions of the survey instruments have also been translated into English. Survey instruments of panel waves not yet published can be accessed after registering as a NEPS data user. However, it is not possible to obtain exact information on the competence tests (e.g., wording of the test items, instructions).

  • The Data Manual is an important and useful source of information on general features of the panel waves, conventions, and data structure. In addition, it contains a chapter dealing with special issues such as special types of variables and coding strategies. In this chapter, data users will also find a special LAP subchapter. We strongly recommend that users check the data manual regularly for updates and changes made in the published SUF.

  • Field reports give information on the process of data collection, and document—for example—tracking of panel members, incentives given to respondents, reminders, and measures taken for contacting participants. They are written by the social research institute “infas Institute for Applied Social Sciences” (infas Institut für angewandte Sozialwissenschaft, Bonn), which conducted most of the surveys and tests. Field reports are only available in German.

Apart from the NEPSplorer, all documentation materials described above can be downloaded from https://www.neps-data.de/Data-Center/Data-and-Documentation/Start-Cohort-Students/Documentation. Links to the documentation of previous SUFs can also be found on this page.
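
The note to Table 3 describes how dataset names are composed. The sketch below builds such a name and merges two files in Python; it assumes the Stata (.dta) distribution of the SUF and a common person identifier (here called ID_t), both of which should be verified against the merging matrix and the data manual.

```python
# Sketch of working with the naming convention from the note to Table 3.
# The .dta file format and the person identifier 'ID_t' are assumptions;
# consult the merging matrix and data manual for the authoritative details.
import pandas as pd

def suf_filename(cohort: str, name: str, level: str, version: str) -> str:
    """Compose a dataset name such as 'SC5_Basics_D_17-0-0'."""
    return f"{cohort}_{name}_{level}_{version}"

basics = pd.read_stata(suf_filename("SC5", "Basics", "D", "17-0-0") + ".dta",
                       convert_categoricals=False)
profile = pd.read_stata(suf_filename("SC5", "CohortProfile", "D", "17-0-0") + ".dta",
                        convert_categoricals=False)

# 'CohortProfile' holds per-person, per-wave meta information; add the
# time-constant basic information to it.
merged = profile.merge(basics, on="ID_t", how="left")
```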

2.5.2 Survey instruments

In the following sections, we describe the survey instruments specifically targeted to (prospective) teachers and selected survey instruments of the basic NEPS SC5 programme addressed to all panel members (see Section “Selected NEPS constructs” below). Because of the many panel waves conducted in NEPS SC5, it is not possible to give an exhaustive account of the constructs measured. However, more information on the instruments used in NEPS SC5 including theoretical frameworks is given in various chapters of the books edited by Blossfeld et al. (2016) and Blossfeld and Roßbach (2019). For clarity, we present basic information such as construct name, subscales, and sources in tabular form where appropriate, and organise the description in subsections. Information on psychometric properties is not provided but will be published shortly in the documentation of the LAP survey instruments.

Although the data of the most recent panel waves has not yet been released, we will describe the measured constructs of all waves. As described in Chapter 3.8, this data will be published in the foreseeable future.

Selected NEPS constructs: Aspects related to studying and the course of studies, personality, motivation, and well-being

As in other NEPS starting cohorts, data collection in NEPS SC5 revolves around the question of competence acquisition and development in formal and nonformal/informal learning environments, educational decisions and transitions, their determinants and consequences, and monetary as well as nonmonetary returns to education (Brachem et al., 2019). To this end, the study assesses domain-general cognitive abilities (figural reasoning, perceptual speed), basic domain-specific cognitive competencies (German-language competencies, mathematical literacy, scientific literacy, English-language competencies), meta-competencies (ICT literacy) and social competencies using competence tests (Weinert et al., 2019). In NEPS SC5, in addition, a subject-specific competence test in business administration has been developed and was administered in wave 7 (Brachem et al., 2019; Lauterbach, 2016).

A central feature of the NEPS is the life course approach. The NEPS, therefore, records and updates the individual biography in different life spheres (e.g., schooling, vocational training (including higher education), employment, family and partnership) and provides detailed event-history data for these domains. In this way, the complete course of study is recorded in great detail, including moves to another higher education institution, change of subject, and change of degree. In addition, the NEPS collects information on learning environments, academic performance, psychological factors, and outcomes. Some of the constructs that represent these domains and are considered relevant to the study of teacher education, teaching and the teaching profession are described in Table A1 in the Appendix.

Preparatory service

In order to describe the preparatory service, the LAP study gathers information on basic characteristics such as teaching track, perceived learning environment, and different teaching practices applied in lessons given during the preparatory service (see Appendix, Table A2). Questions about all of these aspects were asked in computer-assisted telephone interviews (CATI).

The instrument used for characterising the learning environment focuses on the support dimension of the above-mentioned SSCO model. While most taxonomies of social support distinguish between informational, instrumental, and emotional support (Helgeson, 2003), the LAP study only addresses instrumental support, which has been found to be predictive of emotional exhaustion and self-efficacy (Richter et al., 2011) and “involves people providing concrete assistance” (Helgeson, 2003, 25). However, social support is measured in relation to three important reference groups of trainee teachers: peers, mentor teachers, and the head of the teaching seminar. In addition to social support, the LAP study also gathers information on the challenge dimension of the SSCO model (e.g., constructivist interaction with mentors; see Appendix, Table A2) and the orientation dimension (e.g., integration of theory and practice; see Appendix, Table A2).

To measure instructional practices applied during preparatory service, we used two subscales of the instrument proposed by Weresch-Deperrois et al. (2009). These subscales are informed by the “Standards for teacher education: Educational sciences”, which have been adopted by the Kultusministerkonferenz (Standing Conference of the Ministers of Education and Cultural Affairs of the Länder in the Federal Republic of Germany; Kultusministerkonferenz, 2014) and describe relevant instructional skills. However, it turned out that the psychometric properties of the scales were not convincing. Therefore, a different operationalisation was used when surveying the teaching practices of all teachers in computer-assisted web interviews (CAWI; see below).

Learning opportunities and professional experiences

In the assessment of teacher competencies, the LAP project placed particular emphasis on competencies related to heterogeneity in the classroom and, in later waves of the survey, on the use of ICT for teaching. Since the development of these competencies depends not least on corresponding learning opportunities, the surveys asked which experiences the respondents had gained with regard to inclusion, multiculturalism, and digital media during teacher training and in the teaching profession (see Appendix, Table A3).

The learning opportunities and professional experiences regarding multiculturalism and inclusion were measured in the online surveys from 2016 onwards, based on instruments proposed by Laschke and König (2014; learning opportunities) and also by the project “Attitudes towards inclusive education in schools” (Projekt E1NS) at Hildesheim University (Stiftung Universität Hildesheim, 2016; professional experiences). Starting in 2018, professional experiences with different special education focuses (e.g., visual impairment, autism, or special needs in physical and motor development) were collected in a more differentiated manner, based on an instrument of NEPS Starting Cohorts Grade 5 (SC3) and Grade 9 (SC4) (PAPI 2012/13; LIfBi, 2016). In 2021, learning opportunities and professional experiences related to the use of digital media in the classroom were added to the survey, replacing the focus on inclusion.

As an extension of capturing professional experiences, teachers were asked about the extent to which they perceive a heterogeneous student body as an impediment to their teaching. The dimensions of heterogeneity considered include cultural and social heterogeneity, performance heterogeneity, and differences in behaviour and motivation. The nine-item instrument, based on a measurement by Baumert et al. (2008), has been used in CATI studies since 2020 (see Appendix, Table A3).

General aspects of professional competencies

Following the competence model proposed by Baumert and Kunter (2013), the LAP study measures general motivational orientations, beliefs, and self-regulation. General motivational orientations include motivation for choosing teacher education or a career in teaching, teacher enthusiasm, and teacher self-efficacy.

The motivation for career choice was measured twice: The first measurement took place retrospectively in 2014 (wave 8) with regard to choosing a teacher education programme (see Appendix, Table A4; instrument adapted from Pohlmann & Möller, 2010; Retelsdorf & Möller, 2012). Please note that the data is stored in different variables, depending on whether the respondents were still studying for a teaching degree or had already earned a degree in teaching; the variables can and should be analysed together (see the sketch below). The second measurement, in wave 19, focuses on the career decision of teachers who had not earned a higher education degree in teaching. Part of the instrument is identical to the one used in wave 8; however, some subscales were omitted, the subscale “fallback career” was added (items adapted from Watt et al., 2012), and some items had to be rephrased.
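
As a purely illustrative aid to the note that the two variable versions should be analysed together, the following sketch combines them into one analysis variable. The variable names are placeholders, not the actual item names, which can be looked up in the codebook.

```python
# Placeholder sketch: combine the two versions of a career-choice motivation item
# (one asked of current teacher education students, one of respondents who already
# hold a teaching degree) into a single analysis variable. Names are hypothetical.
import pandas as pd

df = pd.DataFrame({
    "motive_student":  [4.0, None, 3.0, None],  # version for current students
    "motive_graduate": [None, 2.0, None, 5.0],  # version for degree holders
})
df["motive_combined"] = df["motive_student"].combine_first(df["motive_graduate"])
```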

Teacher enthusiasm can be conceptualised as an affective construct that belongs to the domain of positive emotion and intrinsic motivation (Kunter et al., 2011). Teacher self-efficacy, however, can also be conceived as a specific form of beliefs, namely “self-beliefs” (Kunter & Pohlmann, 2015; Pajares, 1992).

While self-efficacy is classified differently, beliefs about teaching and learning and teachers’ professional self-concept clearly belong to the competence aspect “beliefs”. Beliefs about teaching and learning have been measured in all CAWI waves from wave 8 onwards, albeit with differing instruments. In wave 8, the German-language questionnaire of the Teaching and Learning International Survey (TALIS) was applied (OECD, 2009). Because the measurement quality of this instrument proved unsatisfactory, the LAP project decided to implement a shortened version of the instrument described by Kunter et al. (2017) in subsequent waves. Both instruments cover the dimensions of transmission beliefs and constructivist beliefs.

Teachers’ professional self-concept was measured in all CAWI waves from wave 8 onwards, using selected subscales and items of the questionnaire developed by Retelsdorf et al. (2014). Because it became necessary to shorten the survey, the dimension “self-concept in consulting” was omitted from wave 11 onward.

The questionnaire used to measure occupational self-regulation is based on Schaarschmidt and Fischer’s (2001) AVEM inventory. Although this inventory was reduced from eleven dimensions with 66 items to four dimensions with 13 items, the shortened version still showed sufficient measurement quality (Menge & Schaeper, 2019).

Specific aspects of professional competencies: dealing with inclusive education, cultural diversity, and digital media; teachers’ stereotypes

Regarding specific aspects of teachers’ professional competencies, the LAP study addresses inclusive education, cultural diversity, and teaching with digital media, and measures corresponding beliefs and self-efficacy expectations (see Appendix, Table A5). While the issues of inclusive education and teaching in culturally diverse classes have been included since CAWI wave 11, teaching with digital media was added to the survey programme later (wave 17 or 19) and partially replaced the topic of inclusion.

Information on the quality of the scales used to measure inclusion-related beliefs and self-efficacy expectations can be found in Menge et al. (2021). To date, validity and reliability analyses of the other scales using the LAP data have not been published. However, in NEPS SC3 teachers’ cultural beliefs were measured using an almost identical instrument. The only difference is that in NEPS SC3 the subscale “multicultural beliefs” consists of one additional item (“During counselling sessions with parents who have a different cultural background than I do, I try to respect cultural particularities.”). A recent analysis of this data yielded satisfactory values for Cronbach’s alpha and confirmed the three-dimensional structure of teachers’ cultural beliefs (Schotte et al., 2022).

Stereotypes, conceptualised as “beliefs about the characteristics, attributes, and behaviors of members of a certain group” (Hilton & von Hippel, 1996, 240; cited from Wenz et al., 2016, 3), are a major source of discrimination. Therefore, stereotypes held by teachers about competencies and abilities of different groups may help to explain discriminatory judgements and behaviours and, consequently, educational and performance inequalities (for details see Wenz, 2020; Wenz et al., 2016). To measure teachers’ stereotypes, Wenz and colleagues (Wenz, 2020; Wenz et al., 2016) developed a questionnaire that was first used in NEPS Starting Cohort “Kindergarten” (SC2). This questionnaire assessed teachers’ stereotypes regarding reading and mathematical competencies in the following groups: female and male students; students with low, middle, and high socioeconomic backgrounds; and students of Turkish and Russian origin, immigrants in general, and ethnic majority students in general. A slightly different instrument was implemented in the LAP study (see Appendix, Table A5). While focusing on the same social groups, the 18 items address (prospective) teachers’ stereotypes relating to reading competencies and parental support.

Contextual information on the professional situation

As important contextual information on employment as a teacher (see Appendix, Table A6), we measured the type of school, the (level and size of) classes and the subject groups that are (predominantly) taught by the respondents. This contextual information plays an important role especially when corresponding differentiations are required in data analyses.

Since dealing with heterogeneity in the classroom is one of the focal points of the LAP project, we also collected information on the proportion of students in the school with a migration background and the composition of students in the classes taught.

In order to be able to analyse, for example, the influence of the teaching staff, the school management or the classroom context on teachers, the length of time the participants had been working at their school was also recorded. As a supplement to the questions about the leadership style of the school management (see below and Appendix, Table A6), the participants were also asked if they worked as (deputy) head teachers, since in this case they were not asked the further questions on leadership style that would require them to evaluate themselves.

Teaching practices

It goes without saying that students’ competence development depends on the way teachers teach and the quality of instruction (Hattie, 2009). Classroom management (Structure), cognitive activation (Challenge), and constructive support (Support) are considered to be central dimensions of teaching quality (Klieme et al., 2006). Together with Orientation as a fourth dimension (Radisch et al., 2007) these factors are the components of the above-mentioned SSCO model.

Cognitive activation was initially measured using three items adapted from Kunter et al. (2017) (see Appendix, Table A7). To increase the internal consistency of the scale, two more items from the same source were added as of wave 14.

As has often been the case in previous research, the LAP study distinguished two dimensions of classroom management: disruptions/effective use of time and monitoring. Initially, both dimensions were measured using three items from Kunter et al. (2017). Due to the low Cronbach’s alpha value of the monitoring scale, two additional items from other instruments were later introduced.
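
For readers who want to reproduce such scale checks, the following self-contained snippet computes Cronbach's alpha for a short scale using the standard formula; the item responses are invented purely for illustration.

```python
# Cronbach's alpha for a k-item scale:
# alpha = k/(k-1) * (1 - sum(item variances) / variance(sum score)).
import numpy as np

items = np.array([   # rows = respondents, columns = items of one scale (made-up data)
    [3, 4, 3],
    [2, 2, 3],
    [4, 5, 4],
    [1, 2, 2],
    [5, 4, 5],
], dtype=float)

k = items.shape[1]
item_variances = items.var(axis=0, ddof=1)
total_variance = items.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_variances.sum() / total_variance)
print(f"Cronbach's alpha = {alpha:.2f}")
```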

Central features of the support dimension are positive emotional relationships, support for autonomy and competence, and social embedding (Bäumer et al., 2019). Measures of support also include internal differentiation and individualisation of instruction (Kunter et al., 2013). Differentiation and individualisation, in turn, are crucial for the participation of all students in the classroom, i.e., inclusion (Gebhardt et al., 2014). These teaching principles are not only characteristic of inclusive teaching but also of a supportive approach to dealing with cultural heterogeneity. Just as for the two previously mentioned facets of teaching practices, differentiation/individualisation was measured four times, in CAWI waves 11, 14, 17, and 19.

The next-to-last instrument listed in Table A7 in the Appendix has a somewhat different focus. It is not directed towards teaching practices but rather towards the emphasis teachers give to developing different ICT-based abilities. The items were taken from Vennemann et al. (2021) and applied once in CAWI wave 19.

Proactive behaviour in occupations

In 2020, the LIfBi launched the first open “Call for Modules” (CfM), giving external researchers the opportunity to incorporate additional questions into the NEPS survey programme. Three proposals were selected, and one of them was implemented in the NEPS SC5 survey in 2022. The proposal “The innovative teacher? Proactive behaviour in different occupations”, submitted by Mareike Kunter, Franziska Baier, Julia Dohrmann and Verena Jörg from the Leibniz Institute for Research and Information in Education (Leibniz-Institut für Bildungsforschung und Bildungsinformation; DIPF), includes a five-item scale (see Appendix, Table A7) measuring proactive professional behaviour. Proactive behaviour, defined as self-initiated and change-oriented activities (Hüttges & Fay, 2019; Ohly et al., 2006; Thomas et al., 2010), is a prerequisite for dealing with changes both in society and in the school system, and has been shown to be predictive of occupational success (Hüttges & Fay, 2019; Thomas et al., 2010). It was measured using the subscale “Voice”, which is conceptualised as the “individuals’ propensity to proactively discuss change-oriented and constructive ideas” (Thomas et al., 2010, 277). The operationalisation is based on the instrument of van Dyne and LePine (1998), which was translated, shortened and adapted for teachers by Kunter et al. (2017) (see Appendix, Table A7). To allow proactive professional behaviour to be compared across professions, two versions exist: a version for teachers, taken from Kunter et al. (2017), and a slightly modified version for study participants in other professions.

Experiences and situation during the Corona pandemic

As already mentioned in various publications (e.g., Fickermann & Edelstein, 2021), the Covid-19 pandemic and school closures required many short-notice changes to regular approaches to teaching in schools very early on, thus exposing students and their families, but also schools and teachers, to manifold challenges and difficulties. Various aspects of these new requirements and adaptations of teaching were conceptualised and measured during and after the pandemic-related school closures (see Appendix, Table A8). (Prospective) teachers were asked first and foremost about the situation in their schools and classes and, if they were still studying or in preparatory service, how much the pandemic affected the course of study or preparatory service or the study situation (e.g., taking exams, attending courses).

Additionally, data was collected on the challenges faced by teachers during school closures (e.g., with respect to providing learning material, motivating students for remote learning, and support from colleagues and principals). Other questions refer to communication with families and the provision of information and learning materials for students during school closures (e.g., via online platforms, school clouds, telephone, or regular mail). The scales were adopted and partly adapted from similar research projects on teaching during the pandemic-related school closures (e.g., NEPS-C (https://www.neps-data.de/Data-Center/Data-and-Documentation/NEPS-C); Lorenz et al., 2020).

As mentioned above, teachers were asked about their attitudes towards digital media in schools, their self-efficacy with respect to using digital media for teaching (see Appendix, Table A5), and teaching with digital media both in general (see Appendix, Table A3) and specifically with regard to school closures. The corresponding questions, adapted from Bos et al. (2010), measure the extent to which the use of digital media for various teaching purposes changed after the reopening of the schools compared to the time before the schools were closed.

Pandemic-related questions were also asked of all study participants of NEPS SC5. The first survey with pandemic-related questions was conducted in May 2020 during the first nationwide school closures in Germany. All panel members were invited to participate in a web-based survey with general questions regarding their family situation and child-care arrangements in the context of closed education institutions and, if they had children themselves at that time, how well they managed at home with remote schooling. Furthermore, they were asked about their own experiences with remote learning if they were still attending formal education, as well as about life satisfaction and their work or study situation during the lockdown. In the subsequent surveys from autumn 2020 through spring 2021, as well as one year later, repeated measures of pandemic-related instruments were included in the regular questionnaire programme.

Professional development activities

Lifelong learning is embedded in educational and professional careers (Allmendinger et al., 2019). Therefore, data on continuing education is also collected in the NEPS SC5 life course interviews. As continuing education plays an important role in maintaining and expanding teachers’ professional competence (see Chapter 1), additional questions have been included in the surveys since 2019 (see Appendix, Table A9).

All teachers who reported at least one further training in the life course interview were asked whether they had participated in PD activities related to their job as a teacher in the last twelve months. If yes, the respondents were asked to indicate the topic(s) of the training. Ten of the eleven topics presented were selected from TALIS (Teaching and Learning International Survey) 2008, conducted by the German Education Union (GEW) (Gagarina & Saldern, 2010), TALIS 2013 (Europäische Kommission/EACEA/Eurydice, 2015), IGLU (International Primary School Reading Survey) 2016 (Hußmann et al., 2017), and the general teacher questionnaire of NEPS SC4. Where necessary, the original items were modified; one item was developed in-house. The selection covers the subject-specific/content pedagogical, general pedagogical and method-oriented topics identified by Richter et al. (2013).

In addition, data was collected on factors that influence teachers’ participation in PD activities. Referring to Richter et al. (2010), who used Cookson’s (1986) model for explaining participation in continuing education and applied it to the situation of teachers, the LAP project distinguishes between context-specific, social, psychological, and situational factors.

Apart from predictors described in the previous sections, the LAP, therefore, measured teachers’ beliefs regarding the importance of continuing education and the relevant school context. The subjective importance of continuing education was operationalised using the scale from the IQB National Assessment Study 2011 (Richter et al., 2014), which was only slightly modified. Regarding aspects of the relevant school context, the focus was placed on attitudes towards continuing education in the teaching staff and in the school, support from school principals (the aforementioned aspects were measured with selected items from the IQB National Assessment Study 2011), and on the availability of resources for PD activities (measured by adapted items of IGLU 2001 (Bos et al., 2005)) (see Appendix, Table A9).

Teacher cooperation and school leadership styles

As pointed out in Chapter 1, colleagues and school leaders are important context factors, predictive of various other factors at the teacher, student, and school levels. Several models have been proposed in order to conceptualise and describe cooperation among teachers. The LAP study opted for the three-stage model of Gräsel et al. (2006), which is strongly influenced by the work of Little (1990) and her typology of four ideal-typical forms of teacher cooperation. Gräsel et al. (2006) distinguish three dimensions or stages (exchange, joint work, and co-construction), which are characterised by increasing requirements. The three dimensions were measured in CAWI waves 14, 17, and 19, each with three to four items (see Appendix, Table A10).

There are several models for conceptualising leadership and classifying leadership styles (overview in Bush & Glover, 2014; Greubel, 2017). In accordance with the relevance of instructional and transformational leadership for teachers, students, and schools (see Chapter 1), the LAP study also includes scales that measure these leadership styles of school principals, as perceived by the teachers. The instructional leadership scale, taken from Pietsch et al. (2014) and slightly modified, initially consisted of four items and was later expanded to five items. The transformational leadership scale is based on the German adaptation (Heinitz & Rowold, 2007) of the Transformational Leadership Inventory (TLI) by Podsakoff et al. (1990). Of the six dimensions that measure transformational leadership, three subscales were selected (articulating a vision, fostering the acceptance of group goals, providing an appropriate model), each comprising three items. The exact wording was taken from Ewen (2013), who adapted the questionnaire to the school context. These instruments have also been used three times since CAWI wave 14.

Occupational well-being

Indicators of occupational well-being are outcomes of educational and professional careers, situations, and experiences, and can therefore be considered as non-monetary returns to education (Gross et al., 2019). On the other hand, they are predictors of educational and occupational decisions, competencies and behaviours and can help to explain dropping out from preparatory service and from the teaching profession (Blömeke et al., 2017; Klassen & Chiu, 2011; Skaalvik & Skaalvik, 2011, 2017), instructional quality (Klusmann et al., 2008; Kunter et al., 2013), and student outcomes (Arens & Morin, 2016; Klusmann et al., 2016; Klusmann & Richter, 2014).

The LAP study focuses on two aspects of occupational well-being: emotional exhaustion and job (or career) satisfaction. Emotional exhaustion “represents the basic individual stress dimension of burnout […] and refers to feelings of being overextended and depleted of one’s emotional and physical resources” (Maslach et al., 2001, 399). It was operationalised using the slightly modified four-item scale proposed by Kunter et al. (2017) (see Appendix, Table A11). This scale in turn is based on the Maslach Burnout Inventory (MBI; Maslach & Jackson, 1981) and the German translation by Enzmann and Kleiber (1989). In the LAP, the four items were presented once for measuring emotional exhaustion during the preparatory service and annually from CATI wave 10 onwards for target persons working as teachers.

Job satisfaction can be defined as “a positive (or negative) evaluative judgment one makes about one’s job or job situation” (Weiss, 2002, 175). Accordingly, Skaalvik and Skaalvik (2011, 1030) conceptualise teacher job satisfaction as teachers’ affective reactions to their work or to their teaching role. Some researchers, however, distinguish between career satisfaction (Berufszufriedenheit) and job satisfaction (Arbeitszufriedenheit) (e.g., Abele et al., 2011; Hagmaier et al., 2018): While job satisfaction refers to the evaluation of one’s working conditions (Abele et al., 2011), career satisfaction is considered a broader construct that takes “a long-term perspective to work experiences” (Hagmaier et al., 2018, 142) and refers to the “evaluation of the accumulated experiences in one’s career” (Hagmaier et al., 2018, 142). Focusing on the overall assessment of career choice, the four-item scale used in the LAP study and taken from Kunter et al. (2017) represents career satisfaction rather than job satisfaction. Data was collected from respondents who are in or have completed preparatory service or who are working as teachers.

2.6 Quality Control

Study planning in the larger NEPS context

In the NEPS, a large network of education researchers from different disciplines and institutions works together, coordinating and exchanging information on the survey and testing programme. For each study, a master plan meeting, several coordination meetings, and a meeting for the final approval of the survey or test are held. To ensure high data quality and comparability across cohorts and over time, several standardised processes for the development of survey instruments and for fieldwork have been implemented in the NEPS. These processes have also been adopted by the LAP study.

Selection of survey content and assurance of the quality of the constructs

Once relevant contents for the survey had been chosen based on extensive literature reviews and pre-announced in the consortium, potential measures were examined. In most cases, the LAP had to rely on tested and well-established operationalisations and could not develop and pre-test new instruments. The quality of established questionnaires was controlled by reviewing the documentation and further literature regarding survey instruments and theoretical approaches, contacting item developers, and performing secondary analyses using available data from other studies in which the instruments were used.

In many cases, time restrictions made it necessary to shorten the original measures for an application in the LAP programme. In order to obtain a short scale with the best possible scale properties, statistical criteria such as reliabilities, internal consistency, discriminant and convergent validity, discriminatory power of items, and factor structure with and without certain items were all considered in addition to theoretical considerations and occasional recommendations from experts.

After the initial use of an instrument in the LAP study, each instrument was checked regarding, for example, factor structure, internal consistency, and item non-response. In individual cases, if the scale properties did not meet the quality standards, the instrument was modified for the next use. Following the maxim “If you want to measure change, don’t change the measure” (Beaton & Zwick, 1990, cited from Beaton & Barone, 2017, 252), moderate adjustments to the wording or response scale were made rarely and only when necessary, e.g., to take account of changing conditions regarding attainable degrees or childcare arrangements. Before any adjustment was made, measurement accuracy and comparability were weighed up for each individual case. Adjustments are documented and can be recognised in the data by the versioning of the variables concerned. Documentation of the LAP survey instruments describing their properties, sources, and changes was not yet available at the time of writing but will be published in the near future to provide quick access to this information.

Questionnaire construction, field preparation and field control

The NEPS/LAP studies were mainly conducted by the survey institute infas (infas Institute for Applied Social Sciences, Bonn, Germany).3 To ensure high survey quality, several steps were taken, which partly differ between online and telephone surveys.

For online and computer-assisted telephone surveys, the survey institute and members of the LAP project and the NEPS consortium thoroughly controlled the programming in several test loops. The surveys were checked for correct filtering, proper construction of auxiliary variables and – in the case of web-based surveys – correct display on different technical devices (e.g., PCs, laptops, smartphones). Since 2016, the online surveys have been optimised for different devices. When programming was completed, an additional data storage check was performed.

In the case of computer-assisted telephone surveys, interviewers were trained before each new survey. The interviewer training was complemented by an interviewer manual that provided important information about the sample, the survey programme, and the interview process. Supervisors in the CATI call centre and a field operations manager were available for support.

Once a study was in the field, the survey institute regularly reported the number of contacted target persons, realised interviews, and interview duration. At several points during the field period, all data collected up to that point was transmitted. This intermediate data was checked in detail. This allowed for accurate field control and timely intervention in case of any anomalies. To control the quality in the field, several telephone interviews were recorded and examined. Additionally, feedback from interviewers in telephone surveys and comments from target persons in online surveys were analysed and checked for potential adjustments for the following studies.

2.7 Data anonymisation and ethical issues

To participate in the panel study, students had to give prior written informed consent. Moreover, the educational institutions also gave informed consent so that their students were allowed to be contacted. The data protection and security officers of the LIfBi and the DZHW approved the consent procedure.

The anonymity of the participating students and the respective HEIs is guaranteed by following two principles of the NEPS anonymisation concept: First, disclosure of participants’ identities should be impossible and, second, a high utility of the data should be preserved (Schier et al., 2019). Therefore, information that would allow identification of individuals or HEIs (e.g., names, addresses) is not provided in the Scientific Use Files. Direct identifiers are separated from the data and are not delivered from the survey institute to the LIfBi. In addition, the anonymous identifiers used by the survey institute are replaced by new ones in the SUFs.

To provide secure and convenient data access for the scientific community, a combination of five approaches has been implemented (see Schier et al., 2019).

Organisational data protection: The data is only available to the scientific community. Access to the data is managed by the LIfBi Research Data Center, which checks the status of the data user, their connection to a university or a research institute and the scientific interest of the data request.

Legal data protection: Data users are informed about data protection and data security. They are committed to data protection and must sign a contract with corresponding regulations.

Statistical data protection: Techniques of statistical data protection are used to ensure the respondents’ privacy by generating factually anonymous data. Various modification approaches exist and are applied in the NEPS, e.g., aggregating data or slightly changing variable values. Which approach is chosen primarily depends on the way in which data access is realised (see below).

Informational data protection: LIfBi staff offer a special training programme to data users that explains the complex data structure and gives information on data protection and data security. Furthermore, detailed documentation is provided that includes information about data protection and anonymisation measures.

Technical data protection: As the data collected in LAP/NEPS is digital, technical data protection in the form of hardware and software solutions is essential. One area of technical data protection concerns data dissemination. Depending on the sensitivity of the data, LIfBi offers three different modes of access to the Scientific Use Files (see Chapter 3.7). Regardless of the access mode, all data is anonymised using random person and institution identifiers so that neither individual study participants nor HEIs can be identified. Information that is more sensitive is only available under restricted conditions following data protection rules. For example, information on federal states and places of residence of participants is not available in the downloadable dataset but in a data-secured environment that can only be accessed with an additional data usage contract (Fuß & Wenzig, 2019). The anonymisation manual on neps-data.de provides information on which variables are available for which access mode (see https://www.neps-data.de/Data-Center/Data-and-Documentation/Start-Cohort-Students/Documentation).

2.8 Existing use of data

All publications that make use of the NEPS data can be found on the LIfBi website: https://www.neps-data.de/Project-Overview/Publications. As of December 2022, 140 publications had been recorded for the cohort of first-year students, 19 of which relate to (prospective) teachers; they are listed in Table A12 in the Appendix. In addition, another paper has been accepted and will be published in 2023. The number of Data Use Agreements concluded with the LIfBi for accessing the NEPS and LAP data (see https://www.neps-data.de/Data-Center/Research-Projects) shows that the database is used by a large number of research projects: in total, there are 4,446 contracts for all NEPS Starting Cohorts combined. Consequently, more research results will be published shortly.

3 Dataset description, access and how to use the data

3.1 Repository location

All data collected by the LAP project is fully integrated in the datasets of NEPS SC5, which are published as Scientific Use Files by the LIfBi Research Data Center.

The most recently released SUF of NEPS SC5 and LAP data is version 17.0.0 (doi:10.5157/NEPS:SC5:17.0.0; NEPS Network, 2022). Data users are advised to always use the most recently published release. Older versions and their accompanying materials remain accessible.

After signing the mandatory Data Use Agreement (see also Chapter 3.7), data users can download the datasets from https://www.neps-data.de/Data-Center/Data-and-Documentation/NEPS-Data-Portfolio. More information on how to access the data can be found at: https://www.neps-data.de/Data-Center/Data-Access.

3.2 Object/file name

Due to the complex structure and the large amount of NEPS/LAP data, the data is stored not in just one but in several different datasets (see Table 3). To help data users deal with this complexity, the LIfBi Research Data Center provides a merging matrix, which gives an overview of how to link information from different datasets. In addition, users may find the semantic data-structure file useful. This file does not contain actual data but meta-data such as variable names, labels, and scheme options. With this information, data users can explore the data structure without signing a Data Use Agreement. For more information on the data structure and its complexity, see the data manual. All the materials mentioned are listed in Table 4.
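
As a rough illustration of such linkage, the following Stata sketch merges the participation status variable tx80220 (stored in ‘CohortProfile’, see Chapter 3.10) onto a person-wave file; the file names, the person identifier ID_t, and the wave variable are assumptions that should be verified against the merging matrix and the data manual.

    * Minimal merging sketch; file names, ID_t, and wave are assumptions.
    use "SC5_pTargetCATI_D_17-0-0.dta", clear

    * Attach the per-wave participation status from 'CohortProfile'.
    merge 1:1 ID_t wave using "SC5_CohortProfile_D_17-0-0.dta", ///
        keepusing(tx80220) keep(master match) nogenerate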

3.3 Data types

The NEPS SC5/LAP Scientific Use File contains different types of data but mostly primary data from the target persons. Information on other (context) people such as partners and children was collected from the participants themselves. These data files include many derived variables, for example, classifications of occupations (e.g., ISCO, EGP) and education (e.g., ISCED, CASMIN), aggregated test scores and plausible values, and auxiliary variables.

In addition to primary data, methods data, context data, and process-generated employment-related social welfare data (administrative data) is also available. Context and administrative data can only be analysed on-site at the LIfBi and, in the case of the administrative data, at the Institute for Employment Research (IAB) in Nürnberg or at some of the other IAB locations. The methods data files contain paradata such as the occurrence of problems during interviewing or testing, the length of the interview or survey, participation, and, in the case of telephone interviews and competence tests, information about the interviewer.

For NEPS SC5/LAP two types of context data are provided: Information for all 413 higher education institutions listed in the codebook of the Federal Statistical Office in 2010/2011, and information on the study programmes (aggregated to subject areas) offered by these HEIs (see Weber, 2014). Examples of variables to be found in this dataset are size and institutional control of HEI, gender composition and financial resources for each subject area in a HEI, and economic and social structure of the regional context of a HEI. The second type of context data provides information on the area of the respondents’ home such as age distribution, house type, milieus, family structure, and probability of payment default (see Schönberger & Koberg, 2017).

The last dataset to be mentioned is the NEPS-SC5-ADIAB (doi:10.5164/IAB.FDZD.2112.en.v1; Bachbauer et al., 2022). It is provided jointly by LIfBi and IAB and consists of the NEPS SC5/LAP data and the administrative data of the IAB. This administrative data includes, for example, information on employment history, benefit recipient history, and jobseeker history.

3.4 Format names and versions

The download version of the SUF is provided as Stata and SPSS files and data users need their own software licences. If users analyse the data via remote access (RemoteNEPS) or on site, they use the statistical software implemented on the LIfBi server system. In this case, Stata, SPSS, and R are available but not, for example, Mplus.

3.5 Language

Most of the documentation and supplementary materials are provided in English. The Stata files contain German and English variable and value labels. Stata users can easily choose between the languages (de, en) with the Stata command “label language languagename”.
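
For example, once a Stata file from the SUF has been loaded:

    * Switch all variable and value labels to English, or back to German.
    label language en
    label language de

    * List the label languages defined in the dataset.
    label language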

3.6 License

All NEPS/LAP data published as SUFs is only accessible for scientific use by researchers, due to German data protection laws and corresponding participation agreements with all survey participants. As data access is regulated by Data Use Agreements, the application of (open) licenses is not required.

3.7 Limits to sharing

The LIfBi Research Data Center aims to make the NEPS/LAP data available to researchers as quickly as possible and tries to publish the Scientific Use File within 18 months of the end of fieldwork. Data access is granted to the scientific community for scientific purposes only and on condition that a Data Use Agreement is concluded (see below and Fuß & Wenzig, 2019). The amount of sensitive information that researchers can analyse depends on the access mode: The downloadable SUF version is characterised by the highest degree of data modification and anonymisation and contains little sensitive information. Datasets available via remote access provide more sensitive information than the download SUF. Finally, the on-site version, which requires a guest stay at the LIfBi, has the lowest level of anonymisation and the highest degree of sensitivity. If data users want to analyse the remote access SUF data or the on-site data version, they need to sign supplements to the NEPS Data Use Agreement. Using the data via remote access also requires a biometric authentication (keystroke biometrics). Further information on which variables may be restricted in use due to sensitivity and are therefore only accessible via remote access or on site can be found in the anonymisation report, which is released with every data issue (https://www.neps-data.de/Data-Center/Data-and-Documentation/Start-Cohort-Students/Documentation; short overview: https://www.neps-data.de/Data-Center/Data-Access/Sensitive-Information).

3.8 Publication date

The most recently published Scientific Use File has been available since 7 November 2022. With each new version, the data is extended and updated (e.g., by adding survey waves or datasets, or by applying corrections). Information on the publication date of earlier SUF releases can be found online (https://www.neps-data.de/Data-Center/Data-and-Documentation/Start-Cohort-Students/Data-and-Citation). By clicking on the respective link in the table, the corresponding documentation can be accessed.

Since the last NEPS SC5/LAP survey was conducted in 2022, there are only a few panel waves that have not yet been published. According to the publication guideline mentioned above, publication of the 2021 CATI can be expected around spring 2023 (SUF version 18) and the release of the combined online and telephone survey conducted in 2022 is due in spring 2024 (SUF version 19). The data release schedule on the NEPS website (https://www.neps-data.de/Data-Center/Overview-and-Assistance/Zeitplan-en-US) is constantly updated.

3.9 FAIR data/Codebook

Findability: LAP data is integrated in the NEPS SC5 datasets and is therefore easy to find via the DOI (digital object identifier). Each subsequent data release is given its own and thus unique DOI. Part of this identifier code is the release number of that Scientific Use File, which is also used to link all related materials to a specific issue. In this way, data users can easily connect documentation material such as the data manual or codebook to the specific data. The LIfBi Research Data Center provides a wealth of information and documentation materials on their website (https://www.neps-data.de/Data-Center/Data-and-Documentation/Start-Cohort-Students/Documentation).

Accessibility: Access to the data is free of charge and managed by the LIfBi Research Data Center. Each researcher who wants to use the data has to sign a Data Use Agreement (see Chapter 3.7 for more information). New users are recommended to participate in one of the NEPS data training courses, which take place regularly and are offered free of charge by the LIfBi Research Data Center. A signed Data Use Agreement is not limited to a specific SUF release; if a new SUF version is released during the contract period, data users are free to use the new release. Information connected to outdated releases remains accessible via the LIfBi Research Data Center website.

Interoperability: Interoperability is ensured by providing SUFs in widely-used data formats (e.g., Stata or SPSS). For more information on how to access the NEPS/LAP data see Chapter 3.7.

Reuse: The LIfBi Research Data Center provides rich documentation materials (such as codebooks, questionnaires, and data manuals) and help for data users (such as the NEPSplorer, the NEPSforum, and data training courses). The website lists a wide variety of open access publications (e.g., NEPS Survey Papers, LIfBi Working Papers); moreover, each data usage contract is listed on the website with a short description, which provides an overview of ongoing research. The Data Use Agreement necessary to use the NEPS and therefore the LAP data clearly defines the regulations that all data users must follow.

Relevant meta-data: In addition to the materials mentioned in Chapter 2.5.1, there are further helpful sources of information: the methods datasets; the documentation of samples and weights; the NEPSplorer, which shows which items were part of which survey(s), the theoretical constructs they belong to, and which references have to be cited (BibTeX); and, soon, the documentation of the LAP survey instruments.

3.10 How to analyse the LAP data – technical advice, special variables and more

Due to the complexity of the study design and the longitudinal information on the teacher education students, working with the data can be challenging. In particular, identifying the LAP target persons (teacher education students, future teachers in preparatory service, and in-service teachers) can be difficult, since no separate dataset for the LAP exists; rather, all information is integrated into the NEPS SC5 SUF.

To facilitate data use, specific variables can be used to identify LAP target persons and the phase of teacher education they were in at the time of (selected) surveys.

Identifying teacher education students in NEPS SC5

Information about the first study programme of the study participants was collected in two different surveys: the initial paper-and-pencil questionnaire and the first telephone interview. Respondents who classified themselves as teacher education students in the initial PAPI can be identified with variable tg02001 in the data file ‘pTargetCATI’, which distinguishes between various degrees, or with variable tg02001_g1 (also in ‘pTargetCATI’), which takes into account the possibility of polyvalent bachelor’s degrees with the option of specialising in teacher education. To identify teacher education students at the beginning of their studies in the winter term of 2010/2011 with data from the CATI, variable tg24201_g1 in ‘pTargetCATI’ can be used. Since the information on the intended degree at the start of studies was gathered in different ways, the number of teacher education students differs between the first PAPI and the first CATI.
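
A minimal Stata sketch of this step might look as follows; the file name is hypothetical, and the assumption that the first CATI is stored as wave 1 should be checked against the data manual.

    * Identify teacher education students at the start of studies (2010/2011).
    use "SC5_pTargetCATI_D_17-0-0.dta", clear

    * Self-classification from the initial PAPI (including polyvalent
    * bachelor's programmes with a teaching option).
    tab tg02001_g1, missing

    * Intended (teaching) degree reported in the first CATI
    * (assumption: the first CATI corresponds to wave 1).
    tab tg24201_g1 if wave == 1, missing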

Whether or not study participants were enrolled in a teaching degree programme in the course of the subsequent educational history can be determined from the episode data in the data file ‘spVocTrain’ using the variable ts15221_g1 (Intended vocational qualification, revised) in combination with tg24201 (Intended teaching degree). A more detailed description of the different ways to identify teacher education students can be found in the most recent data manual for the Scientific Use File of NEPS SC5.
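
Assuming a hypothetical file name, a corresponding check of the episode data could look like this; the relevant category codes of ts15221_g1 are not reproduced here and must be taken from the codebook.

    * Inspect teaching-related study episodes in 'spVocTrain'.
    use "SC5_spVocTrain_D_17-0-0.dta", clear

    * Intended vocational qualification (revised) and intended teaching degree.
    tab ts15221_g1, missing
    tab tg24201, missing
    * A flag for teaching-degree episodes can then be constructed from the
    * category codes documented in the codebook.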

Which respondents in CAWI waves 2, 4, 6, and 8 were enrolled in a teaching degree programme at the time of the survey can be determined in more than one way. Regardless of the approach chosen, data users need to combine information from the data file ‘spVocTrain’ of the last CATI in which the respondents participated with information from the respective CAWI.

As of wave 11, auxiliary variables have been introduced to help identify LAP target persons at the time of the survey and to distinguish between different phases of teacher training and employment. The variables tg60012 and tg60017 refer to CAWI waves and are stored in the data file ‘pTargetCAWI’. Variable tg60017, introduced in wave 14, is an updated version of tg60012 and takes into account whether respondents later in the survey denied a previously stated teacher (education) context. Similarly, variable tg60013 refers to CATI waves. These variables exist in different versions (see the most recent data manual) and capture the training/employment status of (future) teachers at the time of the telephone or online interview.
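
Assuming the hypothetical file name below, the auxiliary variables can be inspected per wave as follows.

    * Training/employment status of (future) teachers at the time of the CAWIs.
    use "SC5_pTargetCAWI_D_17-0-0.dta", clear
    tab wave tg60017, missing

    * For telephone surveys, tg60013 in 'pTargetCATI' can be used analogously.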

Coding and identifying the type of (intended) teaching degree

Information on the type of intended teaching degree was collected with an open-ended question and then coded based on a classification of the Standing Conference of the Ministers of Education and Cultural Affairs of the Länder in the Federal Republic of Germany (Kultusministerkonferenz; KMK). The coding scheme, however, does not distinguish teacher education programmes that span several school levels. Instead, when more than one type of teaching degree or a degree programme spanning several levels was reported, the answer was coded under the highest level.

Information on the type of intended teaching degree that refers to the first study programme in the winter term of 2010/2011 can be found in ‘pTargetCATI’ (tg24202_g2, tg24202_ha, tg03001_g2). The intended teaching degree reported for different study episodes is stored in ‘spVocTrain’ in variable tg24202_g1. Determining the type of teaching degree (and subjects) in the CAWI waves is somewhat more complicated, as the information was not newly collected but was rather updated when respondents had changed their intended degree (or subject). Therefore, it is necessary to use information stored in ‘spVocTrain’ and perhaps combine it with data from the CAWI.
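
A brief sketch under the same assumptions about file names:

    * Type of intended teaching degree for the first study programme ...
    use "SC5_pTargetCATI_D_17-0-0.dta", clear
    tab tg24202_g2, missing

    * ... and per study episode in the episode data.
    use "SC5_spVocTrain_D_17-0-0.dta", clear
    tab tg24202_g1, missing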

Identifying teacher candidates in preparatory service

Information on whether participants attended preparatory service is provided in ‘spEmp’. For each employment episode (including episodes as a trainee teacher), (1) the type of employment (e.g., employment with training character, self-employment) is determined, (2) if applicable, the type of employment with training character is recorded (variable ts23214), and (3) in the case of a Referendariat (preparatory service), respondents are asked whether the Referendariat qualifies for teaching. This information is stored in tg64001 (Teaching Referendariat yes/no).
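
A sketch of how these pieces of information could be inspected (the file name is hypothetical; the value coding of both variables must be taken from the codebook):

    * Employment episodes with training character and teaching Referendariat.
    use "SC5_spEmp_D_17-0-0.dta", clear
    tab ts23214, missing      // type of employment with training character
    tab tg64001, missing      // teaching Referendariat (yes/no)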

From wave 11 onwards, the auxiliary variables (and their versions) tg60012 or tg60017 in ‘pTargetCAWI’ and tg60013 in ‘pTargetCATI’ can be used to identify participants who are in preparatory service at the time of the survey.

Identifying in-service teachers

The information on whether the study participants are already working as teachers at the time of the interview is again provided in the auxiliary variables available from wave 11 or wave 14 onwards. If data users are interested in whether panel members have worked as teachers in the course of their employment history, they can use the information on occupations in the episode dataset ‘spEmp’. This information is collected with an open-ended question and then coded using established classifications of job titles. For example, in the German Classification of Occupations of the Federal Employment Agency (KldB 2010; variable ts23201_g2), all codes beginning with 841 stand for teachers in general education schools, and codes beginning with 842 refer to teachers of vocational subjects, in-company training, and in-company pedagogy. Depending on the specific research interest and questions, other classifications may also be used.

Please note: Variables based on job titles do not distinguish between trainee teachers and in-service teachers; a primary school teacher in preparatory service receives the same code as a fully qualified primary school teacher. To make this distinction, the variable tg64001 can be used.
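
The following sketch combines both pieces of information. It assumes that ts23201_g2 is a five-digit numeric KldB 2010 code and that negative NEPS missing codes have already been set to missing (see the section on missing data below); the file name is again hypothetical.

    * Flag employment episodes with a teaching occupation (KldB 2010: 841x, 842x).
    use "SC5_spEmp_D_17-0-0.dta", clear
    gen byte teaching_occ = inrange(floor(ts23201_g2/100), 841, 842) if !missing(ts23201_g2)

    * Cross-tabulate with tg64001 to separate trainee teachers (teaching
    * Referendariat) from in-service teachers; check the value coding first.
    tab teaching_occ tg64001, missing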

Auxiliary variables that data users should use with caution

The auxiliary variables tg60011 (wave 8), tg60014 (wave 9), tg60015 (wave 10), and tg60016 (from wave 13 on) have been generated for purposes of navigation through the survey only. Therefore, it is highly recommended to use these variables with caution. While they may give a first overview of the data, for detailed analyses the original episode data should be used.

Missing data and weights

Different codes distinguish between the types of missing values that appear in the data. All missing values are either coded with negative values or defined as “system missing”. Basically, three categories of missing codes are distinguished: first, item non-response, if a study participant did not (validly) answer a question; second, not applicable, covering missing values that occur when an item does not apply to a respondent; and third, edition missings, which are generated in the data preparation process and include codes for anonymised data (see the data manual for more information).
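
As a minimal sketch (the exact missing codes per variable should be checked in the codebook; the NEPStools package mentioned in note 4 also provides helpers for handling NEPS conventions):

    * With 'spEmp' in memory: recode negative NEPS missing codes to Stata
    * missing values before analysis (verify the exact codes per variable).
    mvdecode ts23201_g2 tg64001, mv(-99/-1)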

Regarding unit non-response in the individual surveys and to account for the sampling procedure, weights are provided for each wave. Detailed information on the construction of design and panel weights, as well as on their successive adjustments, is given by Zinn et al. (2017). Weights for the latest wave (17) are described in Ziesmer (2022); they will be updated for the upcoming waves.
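
A purely hypothetical sketch of applying a panel weight, assuming a dataset in memory that already contains the person identifier ID_t, a wave-specific panel weight (here called w_t17 as a placeholder), and an analysis variable y; the actual weight names and the file in which they are stored are documented by Zinn et al. (2017), Ziesmer (2022), and the data manual.

    * Declare the survey design with the (hypothetical) panel weight and
    * estimate a weighted mean of the analysis variable y.
    svyset ID_t [pweight = w_t17]
    svy: mean y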

General recommendations

Versioned variables: Variables sometimes need to be modified (e.g., because of additional categories or corrections); such changes result in different versions of the variables. All versioned variables can be identified by the suffix “_v*” added to the variable name. All those changes are documented in the release notes of each published Scientific Use File and the data manual. Additional information can be retrieved with the NEPS-specific Stata command “infoquery varname”.4
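
For example, with NEPStools installed:

    * Query additional information (including versioning) for a variable.
    infoquery tg24202_g1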

Using spell data/harmonised episodes: If users want to analyse information stored in spell data format, they are advised to read the corresponding chapters in the data manual carefully. In particular, the use of subspells and harmonised episodes may be challenging.

Handling participation or dropout status of respondents: In all NEPS SC5 surveys, a special definition of participation status was applied. It differentiates between respondents who participated in the previous CATI and those who are considered temporary dropouts. Temporary dropouts are defined as panel members who did not participate in the last one or two CATIs. Therefore, the gross sample of each survey consists of panel members who took part in at least one of the last three CATIs (and who did not withdraw their consent before the start of the following survey). Participants who did not take part in any of the last three telephone surveys are defined as final dropouts. These differentiations are only made for telephone surveys; the gross sample of an online survey is therefore nearly identical to that of the following CATI. For all participants, the participation status (at the time of each interview) is stored in the variable tx80220 in the data file ‘CohortProfile’.
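
A minimal sketch (the file name is hypothetical):

    * Cross-tabulate participation status by wave.
    use "SC5_CohortProfile_D_17-0-0.dta", clear
    tab wave tx80220, missing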

Further helpful tips and advice for working with NEPS data in general can be found in the chapters by Bela (2016) and Fuß and Wenzig (2019).

4 Reuse potential

Strengths and limitations of the data and data collection process

The LAP project has created an important and unique database on (prospective) teachers that has significantly expanded the existing data basis for empirical educational research in Germany. Combined with the NEPS study, it provides longitudinal information from over ten years of panel surveys, covering the entire teacher education phase and the early years in the teaching profession with a large sample of (prospective) teachers in Germany. Unlike other large teacher surveys in Germany, the LAP data is not limited to certain German federal states, school types, or subjects taught. Consequently, it allows for a more comprehensive analysis of teacher training and the teaching profession as well as for comparisons between different teaching programmes. Through the publication of the Scientific Use Files (currently SUF 17.0.0), the data can be used for a variety of (secondary) analyses of relevant educational processes within the different phases of teacher education and of aspects of the professional activities of teachers.

Not only does the data provide the empirical basis for numerous questions on teacher education, the transition to the teaching profession, and the first years in the profession; the measurement instruments (further) developed in the project are also beneficial for future research on these issues. In order to cover a variety of relevant topics without overburdening respondents, many short scales were developed on the basis of existing instruments. The use of short instruments is sometimes viewed critically; however, the quality of the scales used here was examined in detail (see Chapter 2.6). Documentation of the items and scales used in the LAP study is being prepared and will be published in the near future.

As is common in longitudinal studies, the NEPS/LAP study is confronted with significant panel attrition. Due to the wide range of personal information about the respondents, their psychological characteristics and life circumstances, interesting methodological analyses of panel attrition are possible.

Research potential

The data can be used to analyse the educational and occupational trajectories as well as the professional situation, professional practices, and self-assessed competencies of (prospective) teachers. A wide range of research questions about teacher education can be answered, such as the factors that influence educational decisions and trajectories, educational outcomes, and the importance of learning environments for competence development and educational decisions.

Regarding educational choices, little is known about the decision to stay in teacher education, to move to non-teaching programmes, to drop out of higher education or to move from non-teaching programmes into teacher education. Furthermore, moves within teacher education, e.g., changing the teacher training track or the teaching subject, have rarely been examined. Because the NEPS collects detailed event history data on educational biography, including moves to other higher education institutions, it provides a unique opportunity to examine the course of studies of teacher education students.

Regarding the teaching profession, it is possible, for example, to examine how teachers prove themselves during career entry or how educational experiences in higher education and preparatory service influence later professional practices. The data also allows for analysing in-service learning as the third phase of teacher education in Germany. In this context, it is possible, for example, to study the determinants of participation in further education or to investigate how collegial cooperation, the leadership behaviour of principals, and participation in continuing professional education influence the professional competencies, well-being, resilience, emotional exhaustion, and career retention of teachers at the beginning of their careers.

The sample also includes participants who chose teaching as a second career and did not pursue a traditional teaching degree. The data makes it possible to examine the factors that lead to these teachers’ career decisions and to compare their career choice motives and self-assessed competencies with those of their traditionally certified colleagues.

By linking the LAP data with the NEPS data, it is also possible to address a wide range of further research questions, including comparisons between different career choices and professions. For example, one can compare professional well-being between teachers, lawyers, physicians, and other occupational groups in order to check whether teachers represent a particularly stressed occupational group. One can examine how the choice of a teaching career or of other career paths is influenced by interests, personality traits, and school performance. It can also be analysed whether students from different disciplines differ in terms of study duration and dropout risk, or whether and to what extent the transition to the labour market after graduation and salaries depend on the field of study.

Implications for practice and policy

LAP data can be used to narrow research gaps in empirical teacher education research and to expand knowledge in the field of teacher-related research. From the project’s research findings published and presented at conferences so far, as well as from further research with the data, recommendations can be derived for practice (e.g., the design of teacher education) as well as for education policy and monitoring. The findings may also provide the basis for reform measures in teacher education. For example, studies on predictors of dropout and career change can help to investigate some of the causes of the (internationally prevailing) shortage of teachers and to derive recommendations on how to support trainee teachers in successfully completing their teacher training and how to make the teaching profession more attractive again. And by examining second-career teachers more closely, it will be possible to better assess their potential and risks and to identify their support needs regarding, for example, the transition into the teaching profession and their teaching quality.

Additional File

The additional file for this article can be found as follows:

Appendix

Tables A1 to A12. DOI: https://doi.org/10.5334/jopd.76.s1

Notes

[1] In the following, we refer to this starting cohort as NEPS SC5.

[2] The Corona pandemic was addressed for a second time in the autumn 2020 web survey of NEPS SC5.

[3] Only the web-based surveys of waves 2, 4, 6, and 8 were administered by the DZHW.

[4] This Stata command is part of the package ‘NEPStools’ that can be found at: https://www.neps-data.de/Data-Center/Overview-and-Assistance/Stata-Tools.

Acknowledgements

This paper uses data from the National Educational Panel Study (NEPS; see Blossfeld & Roßbach, 2019). The NEPS is carried out by the Leibniz Institute for Educational Trajectories (LIfBi, Germany) in cooperation with a nationwide network. Over the course of the project phases the LAP project teams were supported by many parties to whom we are very grateful. The study was conducted in cooperation with the NEPS and in collaboration with the scientific network of the NEPS, especially with the team from NEPS stage 7 at the DZHW. It was coordinated and prepared for publication as Scientific Use Files by multiple research data infrastructure units at the LIfBi. Furthermore, many student assistants supported the project during all three project phases.

We want to take the opportunity and thank our former colleagues who contributed to the LAP project in all stages and aspects (in alphabetic order): Dr. Thomas Bäumer, joint project coordinator at the LIfBi, who participated in the first two project phases (2014–2019) and was mainly responsible for developing parts of the grant proposals; Dr. Kris-Stephen Besa (2014–2016), Thorsten Euler (2018) and Dr. Julia-Carolin Osada (née Brachem; 2015–2016), all members of the project research group at the DZHW, who were mainly responsible for developing the survey instruments and programming templates. All of them also contributed to the research project with presentations and publications.

We would also like to thank the anonymous reviewers and the editors of this Special Issue for their helpful comments and remarks, which helped to improve our manuscript.

Funding Statement

The project has been funded by the German Federal Ministry of Education and Research (Bundesministerium für Bildung und Forschung; BMBF). The first project phase “Lehramtsstudierenden-Panel (LAP)—Zusatzstudie zur Längsschnittuntersuchung der Studienanfängerinnen und Studienanfänger des Wintersemesters 2010/2011 (NEPS-Startkohorte 5)” (grant numbers B1014A and B1014B) lasted from October 2014 to December 2018. The second project phase “Lehramtsstudierenden-Panel (LAP II)—Professionelles Handeln und professionelle Entwicklung von Lehrkräften im Kontext Schule und Weiterbildung. Weiterführung des Lehramtsstudierenden-Panels” (grant numbers B1018A and B1018B) lasted from January 2019 to June 2021. The third and final project phase “Lehramtsstudierenden-Panel (LAP III)—Herausforderungen in den ersten Berufsjahren: Professionelles Handeln und professionelle Entwicklung von Lehrkräften unter besonderer Berücksichtigung von Quereinstieg und Digitalisierung. Fortführung des Lehramtsstudierenden-Panels 2021–2023” (grant numbers B1021A and B1021B) started in July 2021 and will end in June 2023.

Competing Interests

The authors have no competing interests to declare.

Peer Review Comments

Journal of Open Psychology Data has blind peer review, which is unblinded upon article acceptance. The editorial history of this article can be downloaded here:

PR File 1

Peer Review History. DOI: https://doi.org/10.5334/jopd.76.pr1

DOI: https://doi.org/10.5334/jopd.76 | Journal eISSN: 2050-9863
Language: English
Published on: May 10, 2023
Published by: Ubiquity Press

© 2023 Hilde Schaeper, Andreas Ortenburger, Sebastian Franz, Stefanie Gäckle, Claudia Menge, Ilka Wolter, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.