Diagnostic Science Assessments for Middle School Teachers serve two purposes: (1) to describe the breadth and depth of teachers' science content knowledge so that researchers and evaluators can measure knowledge growth over time, gauge the effects of particular experiences (courses, professional development) on teachers' knowledge, or examine relationships among teacher content knowledge, teaching practice, and student performance; and (2) to describe middle school teachers' strengths and weaknesses in science knowledge so that teachers can make appropriate decisions about courses or further professional development.
The assessments measure science knowledge in three content domains: Physical Science, Life Science, and Earth/Space Science. Each assessment is composed of 25 items (20 multiple-choice and 5 open-response). Six versions of each assessment are available in paper-and-pencil format so that researchers, professional development providers, and course instructors can administer them as pre- and post-tests before and after workshops, institutes, or courses to determine growth in teachers' content knowledge.
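As a minimal illustration of the pre/post use case, the sketch below computes each teacher's score gain from hypothetical pre- and post-workshop administrations. The teacher names, scores, and the simple gain calculation are placeholders for illustration only; actual scoring is performed by CRMSTD staff.

```python
# Hypothetical pre/post raw scores (invented values) from two administrations
# of the same content-area assessment.
pre_scores = {"Teacher A": 14, "Teacher B": 18, "Teacher C": 11}
post_scores = {"Teacher A": 19, "Teacher B": 21, "Teacher C": 17}

# Gain for each teacher, plus the group mean gain often reported in
# professional-development evaluations.
gains = {name: post_scores[name] - pre_scores[name] for name in pre_scores}
mean_gain = sum(gains.values()) / len(gains)

for name, gain in gains.items():
    print(f"{name}: pre={pre_scores[name]}, post={post_scores[name]}, gain={gain:+d}")
print(f"Mean gain: {mean_gain:.2f}")
```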
Teams of researchers analyzed standards documents and research literature to synthesize the science content (detailed below) that middle school teachers should know. Five types of knowledge (detailed below) were also identified. Together these formed a two-dimensional chart within which questions were generated to ensure both breadth of coverage (content) and depth of coverage (knowledge type). Click on Middle School Science Content Summary Chart [PDF] to see a summary of the content analysis of these documents. The numbers in each cell represent page numbers in the documents, and the letters (A1, PS3, NC6, ...) represent bibliographic references for research articles. Science topics identified in more than half of the sources (A in the far right column) were included in the assessments. The chart below summarizes this structure for the physical science assessment. Click on Types of Science Knowledge for Middle School Teacher Assessments to see descriptions of the knowledge types.
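As a rough sketch of the topic-inclusion rule described above (a topic is retained when more than half of the analyzed sources identify it), the example below tallies hypothetical source citations per topic. The topic names, source labels, and source count are invented for illustration and are not taken from the content summary chart.

```python
# Hypothetical mapping of science topics to the source documents that mention
# them; the topics and source labels are placeholders.
topic_sources = {
    "Conservation of matter": {"A1", "PS3", "NC6", "B2"},
    "Plate tectonics": {"A1", "NC6"},
    "Photosynthesis": {"A1", "PS3", "NC6", "B2", "S5"},
}
n_sources = 6  # assumed total number of standards documents and research sources

# Keep only topics identified in more than half of the sources,
# mirroring the inclusion rule used to build the content dimension.
included = [t for t, srcs in topic_sources.items() if len(srcs) > n_sources / 2]
print(included)  # ['Conservation of matter', 'Photosynthesis']
```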
Teams of practicing science teachers, science teacher educators, and scientists generated test items, each intended to target a particular content area and a particular knowledge type simultaneously. Across each assessment, items were balanced along both dimensions.
Test items for each content area were sent to approximately 40 external reviewers drawn from the same three groups (science teachers, science educators, scientists). These reviewers categorized each question into a content category and a knowledge type, rated the appropriateness of each question, and offered suggestions for improving the questions.
Based on reviewer feedback, questions were selected, revised, and assembled into field tests. Parallel questions were generated to produce six versions of each content-area field test. Each test is designed to be completed within an hour and consists of 20 multiple-choice and 5 open-response questions. Each assessment covers 3-4 science subdomains. Click on Middle School Science Subcategories to see the specific topics in each subcategory. The table below summarizes the subdomains for each assessment:
| Physical Science | Life Science | Earth/Space Science |
|---|---|---|
| Matter | Structure/Function | Atmosphere/Hydrosphere |
| Motion and Forces | Internal Regulation | Lithosphere |
| Energy | Heredity/Diversity | Space |
| | Interdependence | |
Each team developed an item specification chart for each assessment. These charts describe the content and knowledge type of the items on each assessment. Click on Physical Science [PDF], Life Science [PDF], or Earth/Space Science [PDF] to view the item specification chart for each assessment.
Evidence of the validity of the items for measuring teacher content knowledge in the various categories was established through external review. Items were edited, sorted into randomized sets, and sent to reviewers along with a review form that solicited: 1) the correct answer to each multiple-choice item; 2) categorization of each item into a content category and subcategory; 3) categorization of each item into a knowledge type; 4) a rating of the item as STS or not; and 5) a rating of the appropriateness of the item for middle school teachers.
Reviewers for each content assessment included scientists, science educators, and science teachers. Each item was reviewed by 27-31 reviewers in life science, 29-33 reviewers in physical science, and 20-22 reviewers in earth science. Each person reviewed about 75 items.
Data from the reviewers were analyzed to identify items that met criteria the DTAMS staff established for measuring the assigned constructs. An item was verified as fitting a content category if at least 75% of the reviewers identified it as assessing that category; to ensure a balanced distribution within each category, more than 50% of the reviewers also had to agree on the subcategory, and both conditions were required for an item to satisfy the content criterion. The knowledge type criterion was met if more than 50% of reviewers placed the item in a single knowledge type. For appropriateness, items with an average rating above 2.4 (on a scale of 1=low, 2=medium, 3=high) were considered appropriate. Items that met all three criteria were accepted for the field tests. Items that met two of the criteria were reviewed to determine whether their wording could be clarified or improved; revised items were or will be sent out for a second review. Items that met the review criteria served as prototypes for items in the field tests.
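Assuming each item's review data are summarized as simple agreement percentages and an appropriateness average, a minimal sketch of this acceptance logic might look like the following. The function name, field names, and numeric examples are illustrative assumptions, not the DTAMS staff's actual analysis scripts.

```python
def review_item(content_pct, subcategory_pct, knowledge_type_pct, appropriateness_avg):
    """Apply the three review criteria described above to one item.

    content_pct          -- % of reviewers agreeing on the content category
    subcategory_pct      -- % of reviewers agreeing on the subcategory
    knowledge_type_pct   -- % of reviewers agreeing on one knowledge type
    appropriateness_avg  -- mean appropriateness rating (1=low, 2=medium, 3=high)
    """
    content_ok = content_pct >= 75 and subcategory_pct > 50
    knowledge_ok = knowledge_type_pct > 50
    appropriate_ok = appropriateness_avg > 2.4

    met = sum([content_ok, knowledge_ok, appropriate_ok])
    if met == 3:
        return "accept for field test"
    if met == 2:
        return "revise wording and re-review"
    return "reject"

# Illustrative item summaries (invented numbers).
print(review_item(82, 60, 71, 2.6))  # accept for field test
print(review_item(80, 45, 66, 2.5))  # revise wording and re-review
print(review_item(60, 40, 48, 2.1))  # reject
```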
These assessments are currently available for use free of charge; however, CRMSTD staff will score them for a fee of $10 per teacher per assessment. Once scoring is complete, CRMSTD staff will send instructors and professional development providers a detailed summary of teachers' performance that includes scores on individual items, on each science subdomain in the content area, and on four knowledge types (memorized, conceptual understanding, higher-order thinking, pedagogical content knowledge). Instructors and providers can use this summary to analyze performance on specific items, subdomain topics, or knowledge types.
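As a hedged sketch of how such a summary could be assembled from item-level scores, the example below groups hypothetical item results by subdomain and by knowledge type. The item metadata, point values, and grouping function are assumptions for illustration and do not reflect CRMSTD's actual scoring files or reports.

```python
from collections import defaultdict

# Hypothetical item-level results for one teacher: each entry gives the item's
# subdomain, knowledge type, points earned, and points possible (all invented).
items = [
    {"subdomain": "Matter", "knowledge": "memorized", "earned": 1, "possible": 1},
    {"subdomain": "Matter", "knowledge": "conceptual understanding", "earned": 0, "possible": 1},
    {"subdomain": "Energy", "knowledge": "higher-order thinking", "earned": 2, "possible": 3},
    {"subdomain": "Motion and Forces", "knowledge": "pedagogical content knowledge", "earned": 3, "possible": 4},
]

def summarize(items, key):
    """Total earned/possible points grouped by the given item attribute."""
    totals = defaultdict(lambda: [0, 0])
    for item in items:
        totals[item[key]][0] += item["earned"]
        totals[item[key]][1] += item["possible"]
    return {k: f"{e}/{p}" for k, (e, p) in totals.items()}

print(summarize(items, "subdomain"))   # e.g. {'Matter': '1/2', 'Energy': '2/3', ...}
print(summarize(items, "knowledge"))
```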
Send an email to CRMSTD staff indicating your interest, with a brief description of your intended use (e.g., with a Math-Science Partnership grant, for a research study, or for other professional development purposes). Also include the following information to help us plan and schedule our scorers:
If you have other questions about these assessments, please contact Dr. Thomas Tretter at 502-852-0595 or tom.tretter@louisville.edu or Dr. Sherri Brown at 502-852-0599 or s.brown@louisville.edu.