Tuesday, July 15, 2008

Interactive Concept Development

I'm reading through the Overview and Implementation Guide for enVisionMath, by Scott Foresman-Addison Wesley (Texas). Because I am used to a traditional, instructivist curriculum, where the teacher guides the student step by step through the lesson, teaching algorithms to build students' knowledge and understanding, I just naturally have some concerns. So, how "new math", how "fuzzy math", is this textbook?

(My principal stated, as she was giving me the materials, that there were many group activities for developing new concepts and I might want to look over them. Hmmm.)

Several things JUMP out at me.

On about every page so far: "Research says . . ." followed by an explanation of what enVisionMath provides. (There are lots of "Research says . . ." explanations.)

I notice that this seems to be TAKS-driven (TAKS is our Texas state test), and I see that all 20 topics are designed to be covered before the spring date when students take that test.

[One of my prior posts (Expectations Need to be Measurable and Concepts Need Time) referred to an article by William H. Schmidt, where he explains that "top-achieving countries" focus on fewer concepts so that teachers can cover them in depth, rather than the many (up to 20) that our country's math curricula force teachers to cover.]

And I see that these 20 topics are covered in about 127 lessons (so they can be covered prior to the TAKS), some topics being covered in as few as 4 lessons. So some of the topics are small, bite-sized pieces that I hope are reinforced throughout the year. I hope there is built-in reinforcement. I'm hoping . . .

Uh-ohhh! Here it is! Interactive Learning

Research Says that students learn best when they have opportunities to interact with teachers and with other students. . . [Research says] Problem-based instruction (before making math concepts explicit) enhances learning because it gets students actively engaged in thinking about a problem and shows students that their thinking is valued.

Teachers are instructed to pose a problem, ask students to work on it in groups, and have them share their thinking "before receiving teacher guidance that makes the math explicit".

Before making math concepts explicit ???

And I see lots of writing, writing to list and explain the steps you used, writing to explain what you and your partner decided to do to get the answer and why you chose that method.

Oh my!!!

And while these interactive groups are working, teachers are to be making sure that students are discussing what they are doing and that they are using the proper language and vocabulary in their discussions. How on earth can a teacher be listening to 10 pairs of students at one time to be sure they are using proper math language????

This is why teacher-directed instruction is so important. When I teach (prior to the group practice) and when I make concepts explicit (prior to group practice), I can also make sure students practice explicit oral vocabulary (prior to group practice).

More later . . .

6 comments:

Anonymous said...

Hi! I'm really keen to know if any of the 'research says' statements are followed by citations. I'm researching fluency building and was fascinated by your blog, particularly the idea that the authors of the new-new maths are claiming they have research behind them. What sort of research is it? Who are they citing? Von, Perth, Western Australia

Concerned Teacher (Happily Retired) said...

To anonymous:

I have read much on "research says", specifically 3-4 years ago when I first became aware of "fuzzy math". What I have found is that curriculum writers can make research say whatever they want it to say.

I've read articles (I can't put my fingers on them right now) which challenged the research, which showed that the "research study groups" were not properly formed, where the research did not use a diverse enough group or did not control the study group properly, or even apparently deliberately chose study groups designed to produce the desired result.

Test groups can completely skew the results. For example, in the state of Texas, school districts have been caught deliberately not even testing specific groups of children which they knew would drag down the test results of that school district.

In the new book, enVision, that I am looking at, all of the "research says" statements cite specific studies, with the researchers' names and the year. I did my best, on one of them, to find the original study and was not able to find research proving what the research supposedly said. I'm trying to be very careful here and not accuse the text of error.

I do know that fuzzy math programs all have claimed that their research shows their curriculum raised scores. YET, they create problems wherever they are used. So be skeptical of all "research says" when, all across the U.S., school district after school district has had opposite results.

You might try going to Kitchen Table Math archives and search through their writings from 3 years ago for writings on studies.

I'm going to give you the information right from the Overview and Implementation Guide.

"Research says . . . Problem-based instruction (before making math concepts explicit) enhances learning because it gets students actively engaged in thinking about a problem and shows students that their thinking is valued." (Mack, 1990)

And from the Research Bibliography:

Mack, N. K. "Learning Fractions with Understanding: Building on Informal Knowledge." Journal for Research in Mathematics Education 21 (1990), pp. 16-32.

They cite 60+ authors and their studies, far too many to mention here.

Anonymous said...

Thank you so much for responding!

I'm getting all fired up about precision teaching because it is evidence-based. One day I am going to have to discuss it with my teacher friends, including one who researches how children learn maths, and I need to be really sure I know what sort of research they have behind them.

I was hoping you would say that there are never any citations, but now I see I am actually going to have to look into some of these articles. As a psychologist I have a bit of an idea about randomization etc., but I am terrified of maths, so I hope I can make some sense of it all! I'm still writing up my thesis on fluency building, so it will have to wait until next year. By then, my oldest will be 7 and further immersed in what I suspect is a pretty fuzzy maths curriculum. Tick, tick . . .

Thanks again for your detailed response. I'm trying not to read blogs too often at the moment in order to get some study done, so I'm sorry I didn't reply straight away.

Von, Perth, Western Australia

1crosbycat said...

Thanks for your post - I am in the middle of an email to our principal to complain about this enVision math, and since it is new there are not a lot of reviews. What I have been seeing my 3rd grader bring home is ridiculous - this is our 2nd year; the board changed the curriculum without advising anyone first. I can say from our experience that enVision math indeed does cause psychosis in parents!

Concerned Teacher (Happily Retired) said...

One important thing to remember about all "new math" programs is that difficult problems usually appear a few lessons ahead (AHEAD) of the lesson where the concept is taught! This is terribly inefficient for students, causing them to waste valuable time on hard problems (called "challenging") that will be explained later. Not only do such problems waste a student's (as well as a parent's) time, but they frustrate the student, make them feel foolish, and give them such a negative attitude about Math that the student often can never again be convinced that they can "get" Math!

As a parent, I'd want to know what instructions or practice had been given in class to prepare students for a particular problem.

Our 5th grade enVision had a number of different problems in each lesson; some were reinforcement for the lesson being taught, others not at all related. As a Math teacher, I could see a connection, but the student had been taught nothing yet on which to "hang" that new reasoning!! I had to carefully select certain problems, omitting others. Sometimes I missed a problem and we had tears, because only about 10% of the class had the reasoning ability to "catch" it.

1crosbycat said...

enVisionMATH has very long pre-tests, 20 or 25 problems for 3rd grade (they take my kids a long time) and 40 problems for 5th. I do not see why this is good, effective, or useful. If they haven't learned something yet, why are we testing them on it? Especially for students who have been at the same school for years, so the teachers know what has been taught. But the prevailing theory is assess, assess, assess, even if it's a pointless waste of time.

I did ask our district and our enVisionMATH representative here in Western PA for documentation of the research, but the principal of our school had nothing to offer besides the test of actual schools using it (results of the 2nd year of the study, a little late now, plus not the data they all bragged about), and Mario the Pearson guy ignored me. Perhaps I'll post my emails to him everywhere . . . because a lot of kids do not resonate with the all-word-problems-all-the-time approach in 3rd grade, and it's not reasonable or effective if a portion of your average-intelligence kids cannot do 9 HW problems in less than an agonizing hour at home. Something is wrong with the program, not the kids . . .