What is your expected success rate if this kind of relevance explanation is delivered to middle school or secondary school students? That example works for math people, not us mere mortals. Jaime Escalante could have done it, but only after he lassoed his students' motivation and got them to appreciate how the procedure actually makes their job easier.
In my state, the lesson is taught in high school.
Does the teacher deliver the relevance through an explanation like the one you gave above, or through something tied to an interest like NASCAR or football? In other words, in our diverse classrooms, how is the teacher trained to tell when the relevance is understandable to all of the students rather than to one, or even half?
You are assuming that all relevance must align to the personal likes and dislikes of the student. I can see nothing in the day-to-day activities of a student that would benefit from polynomial long division, but does that mean it is unimportant? If so, why teach it?
We must be teaching the lesson for a reason, and we need to explain that reason to students. Yes, the closer we attach the lesson to the students' personal lives the better, but it is not a fundamental requirement.
This is tied to the discussion because those of us who like to think of education as a process need to understand that a process must be appropriate for the materials, apply the right treatments, and use appropriate measurement methods before it can be judged effective. The tricky thing is that you're measuring people - notoriously fickle and difficult - and gauging not only the lesson's practicality but also the students' motivation to learn it.
I'm not measuring people; I'm measuring behaviors. I'm looking for things to take place, not gauging the worth of the teacher. Big difference.
Suppose a teacher teaches her students that analyzing poetry builds the skills needed in advertising to write ads, and then provides some examples. I have no way of knowing to what degree the relevance struck home with each student. That is the job for the educational psychologist. I focus on the teacher, and in this case the teacher gets credited for teaching the relevance of the lesson.
And I also don't overly focus on that particular lesson. Suppose the teacher botched the relevance by stating something that was incorrect. It doesn't matter to me, because the worth of a teaching method is not lesson-specific. What matters is that the teacher understands the importance of teaching relevance and attempts to do so, not whether the relevance on that particular day is high-quality.
I think you would benefit from three years as a teacher's aide in the schools: you would get one-on-one time with students, be held responsible for the outcomes of the methods you are proposing, track their improvement over that period, and try to recapture the ones we're losing - they are the reason our schools are performing so poorly. The poor test scores come from students who fell behind in lessons and whose teacher never noticed, or was not able or willing to bridge the gap. Solve that and you'll make yourself famous.
Again, you cannot overly focus on the student. Students come and go, so a teaching method has to be robust enough that it can be applied day in and day out over a number of years. That doesn't mean the students' characteristics are unimportant, but I don't measure students. That would be the job for an educational psychologist.
I respect your efforts but the renaissance is already underway, just quietly so, district by district. If you want to improve education nationally, check out what's already been developed and proven effective, then make it go viral.
Quiet improvement over a limited number of districts is not a renaissance. On a large scale, the situation to me is getting worse, not better. The reaction to standardized testing is one reason.
Looking at what District X is doing to see what we should be doing in District Y usually fails. Why? Because District Y does not have the type of process control that will enable the staff to optimize the method to its own situation. They just insert the method and wait a year for the test scores to come out. When the scores don't show the improvements they were expecting, they quit. And since that method failed, why bother with anything else? "Been there, done that."
It's all about process control. If we need anything to go viral, that is it. I was hoping ISO 9001 would act as a vehicle for making process control go viral, but apparently not.