In the words of the project's National Advisory Committee, PROM/SE has assembled an extensive and unprecedented database of highly curriculum-sensitive baseline information on student learning, teachers' knowledge, preparation, and instructional time on topics, and district standards, organized to provide useful information to participating schools and districts. This database contains data secured from the administration of 11 instruments. Three of the 11 instruments provide evidence of curriculum instantiations and are described below. The design and implementation of an elaborate system for measuring curriculum at the state, district, school, and classroom levels is a unique feature of PROM/SE.
In the spring of 2004, approximately 204,000 students in 587 schools were administered assessments in mathematics and science. Blueprints for these assessments were created by national experts, mathematicians and scientists, and mathematics and science educators from Michigan State University. On the recommendation of psychometricians experienced in large-scale assessment, the duplex design of Bock and Mislevy (1988) was adopted. Fifteen parallel forms were designed for both mathematics and science for each of three grade bands: grades 3-5, grades 6-8, and grades 9-12. In total, the mathematics item pool contained over 1,300 items, representing 22 distinct strands (topics) at the elementary level, 26 at the middle school level, and 27 at the high school level. Thus, we were able to obtain student scores for each of the strands and provide curriculum-specific information to the schools and districts. The science items were created similarly, with a total of over 640 items representing 11-12 strands.
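The strand-level reporting described above amounts to aggregating item scores by topic strand. A minimal sketch of that aggregation step is shown below; this is not the PROM/SE scoring code, and the item identifiers, strand names, and scores are purely illustrative.

```python
from collections import defaultdict

def strand_subscores(item_scores, item_to_strand):
    """Average a student's item scores within each strand (topic)."""
    by_strand = defaultdict(list)
    for item, score in item_scores.items():
        by_strand[item_to_strand[item]].append(score)
    return {strand: sum(s) / len(s) for strand, s in by_strand.items()}

# Illustrative items and strands (not the actual item pool):
item_to_strand = {"M01": "Fractions", "M02": "Fractions", "M03": "Geometry"}
scores = {"M01": 1, "M02": 0, "M03": 1}  # 1 = correct, 0 = incorrect
subscores = strand_subscores(scores, item_to_strand)
# subscores["Fractions"] == 0.5, subscores["Geometry"] == 1.0
```

In practice the duplex design means each student answers only a subset of the pool, so strand estimates are assembled across the parallel forms; the grouping logic itself, however, is as simple as this sketch.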
To provide international benchmarks and linkages, a substantial portion of items from the 1995 TIMSS was included in all three assessments. Each student assessment booklet also contained background items, which were used to create the socio-economic indicators employed in multilevel hierarchical analyses of TIMSS data.
This instrument, administered as a web survey, asked teachers to indicate the number of class periods during which they taught specific mathematics or science topics. The number of class periods was converted to the percent of teaching time devoted to each topic. The list of topics exhaustively covered school mathematics and science as represented by the TIMSS Curriculum Frameworks for Mathematics and Science. Responses were received from approximately 6,500 teachers.
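The conversion from reported class periods to percent of teaching time is a simple normalization. A minimal sketch, with illustrative (not actual TIMSS Framework) topic names:

```python
def periods_to_percent(periods_by_topic):
    """Convert raw class-period counts to percent of total teaching time."""
    total = sum(periods_by_topic.values())
    if total == 0:
        return {topic: 0.0 for topic in periods_by_topic}
    return {topic: 100.0 * n / total for topic, n in periods_by_topic.items()}

# One teacher's hypothetical survey responses:
reported = {"Fractions": 30, "Geometry": 20, "Measurement": 10}
percent_time = periods_to_percent(reported)
# percent_time["Fractions"] == 50.0 (30 of 60 reported periods)
```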
The Topic Trace Map (TTM) methodology employed in the 1995 TIMSS was adapted for use with the PROM/SE districts. Mathematics and science curriculum specialists in each district were asked to indicate which topics were intended to be taught at which grade(s). Completed TTMs were received from all 62 participating school districts. This indicator of the intended curriculum demonstrated significant relationships with both teaching time and student learning in analyses of the 1995 TIMSS data.
Grade level summaries of teacher content goal data were used to depict the variability in teachers' coverage of topics at the school and district level. Grade-level disaggregation of data at the school level supported curriculum review and revision processes within each school.
Similar summaries were provided for teacher background characteristics, preparation, and self-reported confidence in teaching specific topics. Student assessment data were summarized, and in addition to overall score comparisons against international benchmarks, at the district level we were able to generate subscores for broad topics in mathematics and science. The subscores provided teachers and district personnel with useful curriculum-sensitive information for professional development and curriculum revision.
Topic Trace Map data were used to generate a profile of each district's intended curriculum. The state curriculum was mapped to each district's curriculum to determine whether there was any district-level variation.
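A TTM can be represented as a set of (topic, grade) pairs marking intended coverage, which makes the state-to-district comparison a set difference. A minimal sketch under that assumption; the topics and grades below are illustrative, not actual PROM/SE or state curriculum entries.

```python
def ttm_differences(state_ttm, district_ttm):
    """Return (topic, grade) pairs where a district departs from the state map."""
    only_state = state_ttm - district_ttm      # state-intended, district omits
    only_district = district_ttm - state_ttm   # district adds beyond the state
    return only_state, only_district

# Hypothetical Topic Trace Maps as sets of (topic, grade) pairs:
state = {("Fractions", 4), ("Fractions", 5), ("Congruence", 7)}
district = {("Fractions", 4), ("Congruence", 8)}

omitted, added = ttm_differences(state, district)
# omitted == {("Fractions", 5), ("Congruence", 7)}
# added == {("Congruence", 8)}   (district shifts Congruence to grade 8)
```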
Relational analyses were conducted to assess the linkages between teacher content coverage and student achievement. Based on coverage of specific topics, a derived score called International Grade Placement (IGP) was calculated for each classroom. The relational analysis used the IGP and student assessment scores to determine the nature and strength of the relationship.
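A minimal sketch of this analysis, under the assumption that an IGP-style score is a coverage-weighted average of each topic's international grade placement and that the IGP-achievement relationship is summarized with a Pearson correlation; the topic placements and classroom data below are illustrative, not PROM/SE results.

```python
def igp_score(percent_time, grade_placement):
    """Coverage-weighted mean grade placement across a classroom's topics."""
    total = sum(percent_time.values())
    return sum(percent_time[t] * grade_placement[t] for t in percent_time) / total

def pearson_r(xs, ys):
    """Pearson correlation between classroom IGPs and mean assessment scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# One classroom's hypothetical IGP: equal time on two topics whose
# (assumed) international grade placements are 4.0 and 6.0.
igp = igp_score({"Fractions": 50, "Geometry": 50},
                {"Fractions": 4.0, "Geometry": 6.0})  # -> 5.0

# Hypothetical classroom-level data for the relational step:
igps = [4.2, 4.8, 5.5, 6.1]
means = [480, 495, 510, 540]
r = pearson_r(igps, means)  # strength of the IGP-achievement association
```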