These questions are loosely based on ones asked by the Chicago Public Schools Board when it was considering adopting the MOP curriculum.
How do your program and materials align with the Illinois Learning Standards or the NRC's National Science Education Standards?
See our comparison of the Minds•On Physics approach with the NSES (1996). A more detailed comparison can be found in the PDF attachment at the bottom of that page.
Do you have any data about student achievement in districts that have been using the program for a minimum of two years?
No, but the evaluation team headed by Allan Feldman wrote the following in its executive summary of the beta field-testing:
When the MOP approach is used with the MOP materials as a comprehensive curriculum... students gain access to knowledge and skills that allow them to develop expert-like, concept-based problem solving abilities that are inaccessible with traditional curricula. In addition, students who used MOP regularly showed a greater awareness of their metacognitive process in solving physics problems than did students who used MOP only occasionally. (Feldman & Kropf, MOP Executive Summary, p.2, 7/23/97)
Can you give us some relevant background about the instructional framework of the materials and the research that guided the materials development?
The instructional framework for MOP is called Concept-based Problem Solving. The approach emphasizes analysis and reasoning over both pure conceptual understanding and pure numerical problem solving. Students spend considerably less time solving problems, but they develop deeper understanding and more robust problem-solving skills.
The developers are researchers in physics education, with credentials in expert-novice studies, metacognition, and bilingual research. The materials were developed using a cognitive framework based on multiple strands of educational research, including misconceptions, schema acquisition, cognitive overload, and the knowledge store and problem-solving techniques of experts and novices.
The approach is described in several sources. For instance, Supplement B of the (MOP) Motion Teacher's Guide is entitled Concept-based Problem Solving: Combining educational research results and practical experience to create a framework for learning physics and to derive effective classroom practices. A slightly shorter version can be found under the title Analysis-based Problem Solving: Making analysis and reasoning the focus of physics instruction.
Can you give us a detailed description of the development process, including the names of the primary developers and the pilot/field testing process, and a list of field or pilot test sites?
The development process can be divided into three phases: piloting, field-testing, and publishing. We started in 1990 with a team of four physics education researchers (UMPERG) and four high school teachers from a range of settings and with a range of skills and experience. The teachers reviewed the module activities drafted by UMPERG, making comments, corrections, and suggestions that were then used to revise the modules before piloting them. The teachers piloted the modules, gave UMPERG more advice about the materials, and allowed UMPERG into their classrooms to see how they were used and to talk to students. At the end of the pilot project in 1992, we had developed and tested 24 modules, covering only about 1/3 of the school year. The accompanying teacher support materials consisted of answers with short explanations to all of the module questions. The modules were developed under the National Science Foundation grant MDR-9050213.
In the field-testing phase (1993-97), we divided the modules into smaller activities that could be started and completed within a single class period. We also began developing additional activities to span a full year, and we added a "Student Reader" to summarize the ideas raised in the activities. The teacher support materials became "Answers and Instructional Aids for Teachers," with sections such as Preparation for Students, Anticipated Difficulties for Students, and Suggested Points for Class Discussion. We added 30 more teachers in Massachusetts, Tennessee, and Louisiana to field-test the materials, meeting with the local teachers about every six weeks to gather insights and share experiences. Meetings with teachers in Chattanooga, TN and New Orleans, LA occurred less frequently but were just as informative. At the end of this phase, we had field-tested more than 100 activities. These activities were developed under NSF grant ESI-9255713.
In the publishing phase (1997-2003), we organized the activities into a "core" curriculum and a "supplemental" curriculum having three volumes each. The field-tested activities became the core curriculum, and we developed another 80 activities for the supplemental curriculum. We also organized and expanded the existing Answers and Instructional Aids for Teachers into more complete Teacher's Guides, one to accompany each volume of activities, and wrote Teacher's Guides for the three volumes of new activities. The last Teacher's Guide was published earlier this year.
For more detailed information about the actual development process, please refer to our model-based design paradigm as described in section IV of the technical report ASK-IT/A2L: Assessing Student Knowledge with Instructional Technology.
UMPERG [as of the writing of MOP --- ed] is William Gerace, Jose Mestre, Robert Dufresne, and William Leonard. All four hold PhDs in physics and have extensive teaching and research experience. They have conducted more than 60 workshops and mini-courses on learning and instruction, written numerous articles on educational issues, given almost 200 talks on physics instruction, and presented more than 70 science demonstration shows for nearly 2000 school children in the U.S. and South Africa. (These shows demonstrate how physics is manifest in everyday situations with common items.)
The pilot sites were in four very different settings. One is an excellent private school in northwestern Massachusetts. Another is a suburban high school with a diverse student population. A third is an average urban high school in Springfield, MA, with a large (though not predominant) minority student population. The fourth is a bottom-rung, inner-city, 100% minority school of last resort in Hartford, CT.
The pilot teachers are also very different. One has an Ed.D. from the University of Massachusetts and has co-authored his own instructional manual for teachers; he is involved in many innovative programs and is always willing to try something new. The second is a traditional teacher, well trained in physics but not overly creative in his teaching. The third is a biology teacher teaching physics with almost no content knowledge in the subject; he is brilliant at creating group activities, however, and we learned much from him about adapting our materials to group work. The fourth is an energetic and highly adaptable physics and astronomy teacher.
A list of the field-test teachers can be found in the Acknowledgments of any one of the Student Activities books. There are 34 in all. Most are located in Massachusetts, but one cadre was located in and around Chattanooga, TN, and another was located in and around New Orleans, LA.
How does the program or materials address diversity in the student population?
Diversity is not limited to ethnic and cultural diversity. Every student has a unique set of skills, past experiences, and approaches, and so, in Minds•On Physics, we show teachers that there are many paths to success and many different ways to do physics. If a student, for example, does not have the algebraic skills to solve a traditional problem, we show how to help that student reason through to an answer. If a school system does not have a large equipment budget, we show its teachers how to think, live, and do physics without any expensive equipment.
If you look at MOP, you will see an emphasis on thinking, analyzing, and reasoning, which anyone can be encouraged to do at any age and with almost any background. You will also see common everyday manipulatives, such as balls, toy cars, and rubber bands: items that students are familiar with and that almost any school system can afford.
MOP is activity-based, which means the teacher decides the depth and level of coverage. MOP has been used in many different contexts, such as 8th and 9th grade physical science, 11th and 12th grade college prep physics, supplementary materials in introductory college physics, and graduate-level teacher preparation courses. A few students we know of have even used the "Complex Systems" volume to help with their junior-level university statistical physics class.
MOP has been used in schools with large minority student populations in Chattanooga, TN, New Orleans, LA, Springfield, MA, and Hartford, CT, and it has recently been adopted by Grand Rapids, MI. MOP has also been used in "bridging" programs for under-prepared black university students in South Africa.
Have you done an external evaluation of the materials by objective and respected sources?
The evaluation of Minds•On Physics was done by a team of researchers led by Allan Feldman of the University of Massachusetts School of Education. An executive summary is available.
| Attachment | Size |
| --- | --- |
| MOP Evaluation Executive Summary | 41.25 KB |