The New York Times, in yet another of its front-page articles extolling improvements in education, is very excited that: “Starting this fall, the school district in Chappaqua, N.Y., is setting aside 40 minutes every other day for all sixth, seventh and eighth graders to read books of their own choosing.”
Woo hoo!
You mean occasionally they will allow children to do something that they are actually interested in doing in school?
Not so fast.
Will students be able to bring in Popular Mechanics or even the New York Times? No, of course not. They will choose books approved by teachers. But even this appalls the Times-approved Bush appointee Diane Ravitch, who is always on the side of everything backward in education. She worries that no “child is going to pick up Moby Dick.”
Indeed.
The Times goes on to say that: “In the method familiar to generations of students, an entire class reads a novel — often a classic — together to draw out the themes and study literary craft. That tradition, proponents say, builds a shared literary culture among students, exposes all readers to works of quality and complexity and is the best way to prepare students for standardized tests.”
It didn’t take them long, did it? Yea, tests.
This is just more baloney intended to make the public feel like things are getting better in schools when in fact things are so bad that no one is happy (except maybe Diane Ravitch). You can’t allow real choice in school because then you can’t test it to see what kids have done.
I once built a program meant to get kids to learn the geography of the U.S. without really trying, as they searched around the country for stuff they were interested in. It worked quite well. Kids loved it and they learned geography.
Nope. Rejected. Why?
Because some students might go to California and others might go to New York. How would we test them? As soon as the tests appear, innovation goes out the window. You mean kids would learn different stuff? Omigod!
In any case, this “choose what to read” program is an illusion. It is better than being force-fed Moby Dick for sure, but what is the real goal? The Times says: “Letting students choose their own books, they say, can help to build a lifelong love of reading.”
That is the goal. Making kids read a lot in the hope that some of them will like it. Same as the math goal of shoving algebra down their throats in case anyone likes it. Kids rarely like what you make them do, or am I the only one who has noticed that?
Can you live a long and happy life without having a love of literature? I think so. It is important to learn to read, but that does not mean, by any means, that one needs to read “literature.” If it isn’t obvious to people by now, literature will soon be ancient history anyway. While humans have always told stories and always learned from them, they have not always had “literature.” Novels have been commonplace for only a very brief moment in human history and are now clearly being replaced by television and movies (for better or for worse, that is what is happening).
Teachers and politicians hate this, of course. What I hate is that the idea of discussing life choices and issues in getting along in this world, which is a positive benefit of discussing literature, can only be done by reading Moby Dick according to the experts. There are many other ways to learn to think about life.
We have, as a society, lost the forest for the trees. While we could be teaching students to think deeply about why people do what they do, instead we are teaching them to pass tests. We insist that they learn what was fashionable for the elite to learn a century ago. And we torture them and wonder why they drop out. Moby Dick indeed!
Sunday, August 30, 2009
Friday, August 28, 2009
Strengthening Student Support: A Sensible Proposal with What Results?
Cross-posted from Brainstorm
Anyone who's taken a hard look at the reasons why so many students drop out of community college realizes it's got to have at least something to do with their need for more frequent, higher-quality advising. After all, in many cases these are students who are juggling multiple responsibilities, only one of which is attending college, and they need to figure out a lot of details-- how to take the right courses to fit their particular program (especially if they hope to later transfer credits), how to get the best financial aid package, how to work out a daily schedule that can maximize their learning, etc. It's fairly easy to see that community college students likely stand to benefit more from good advising than their counterparts at many 4-year institutions.
Except high-quality advising isn't what they get. Counselor-student ratios at community colleges average about 1000:1. That's right-- one counselor for a population the size of a decent high school. In elementary and secondary schools the ratio is 479:1. There's a pay disparity as well-- in K-12, the Bureau of Labor Statistics reports that median annual earnings for a counselor in 2006 were nearly $54,000. For counselors at community colleges the figure was $48,000 (and for those at other colleges it was $42,000). Now, perhaps the salary differentials reflect the different workload, and assumptions about it being easier to counsel adults. But I tend to think this is off base-- these are outdated notions of who community college students are and what they need.
So what would happen if we reduced the counselor/student ratio at community colleges to a standard even better than the national average in k-12? And at the same time ramped up the intensity of the counseling? Theory would suggest we should see some meaningful results. Many studies, including my own, point toward a persistent relationship between parental education and college outcomes that's indicative of the importance of information-- and information (plus motivation) is what counseling provides. So, putting more counselors into a community college and increasing the quality of what they provide should work-- if students actually go and see them.
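To get a feel for the scale of that change, here is a rough back-of-the-envelope sketch. The enrollment figure is a hypothetical assumption of mine; the two ratios and the $48,000 median salary are the figures cited above, so treat the output as an illustration rather than a cost estimate.

```python
# Back-of-the-envelope cost of lowering the counselor/student ratio.
# The enrollment is hypothetical; the ratios and the $48,000 median
# community-college counselor salary are the figures cited in the post.

import math

enrollment = 10_000      # hypothetical community college enrollment
current_ratio = 1_000    # roughly one counselor per 1,000 students
target_ratio = 479       # the K-12 national average cited above
median_salary = 48_000   # 2006 median for community-college counselors

current_staff = math.ceil(enrollment / current_ratio)
target_staff = math.ceil(enrollment / target_ratio)
added_cost = (target_staff - current_staff) * median_salary

print(f"Counselors now: {current_staff}; at the K-12 ratio: {target_staff}")
print(f"Added salary cost (before benefits): ${added_cost:,}")
# -> roughly 11 more counselors, on the order of $500,000 a year
```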
To test these hypotheses, MDRC (a terrific NYC-based evaluation firm) recently conducted a randomized program evaluation in two Ohio community colleges. In a nutshell, at college A students in the treatment group were assigned (at random) to receive services from a full-time counselor serving only 81 students, while at college B students in the treatment group had a counselor serving 157 students. In both cases, the control group students saw counselors serving more than 1,000 students each. In addition to serving far fewer students than is typical, these counselors were instructed to provide services that were "more intensive, comprehensive, and personalized." In practice, students in the treatment group did see their counselors more often. The "treatment" lasted two semesters.
The students in this study are Midwesterners, predominantly (75%) women, predominantly white (54%), with an average age of 24, half living below the poverty line and half working while in school. I think it's also worth pointing out that while all applied for financial aid, these were not folks overwhelmingly facing circumstances of deprivation-- 88% had access to a working car, and 64% had a working computer in their home. And 98% were U.S. citizens.
The results were only modest. After one semester of program implementation, the biggest effect occurred-- students in the treatment group were 7 percentage points more likely to register for another semester (65% vs. 58%). But those differences quickly disappeared, and there were no notable differences in the number of credits taken or other academic outcomes. Moreover, the researchers didn't find other kinds of effects you might expect--such as changes in students' educational goals, feelings of connection to the college, or measured ability to cope with struggles.
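For a sense of how much sample size matters in judging a 7-percentage-point difference, here is a minimal sketch of a two-proportion z-test. The group sizes are assumptions for illustration only-- the post doesn't report the actual samples-- so this is not a re-analysis of MDRC's data, just a reminder of how the bar for statistical significance moves with the number of students.

```python
# Illustrative two-proportion z-test on the re-registration effect (65% vs. 58%).
# Group sizes below are hypothetical; the point is how sample size determines
# whether a 7-percentage-point gap clears conventional significance.

from statistics import NormalDist

def two_prop_z(p1, n1, p2, n2):
    """Two-sided z-test for a difference in proportions."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = (pooled * (1 - pooled) * (1 / n1 + 1 / n2)) ** 0.5
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

for n in (100, 250, 500):            # assumed students per group
    z, p = two_prop_z(0.65, n, 0.58, n)
    print(f"n per group = {n}: z = {z:.2f}, p = {p:.3f}")
# With ~250 per group the 7-point gap falls short of p < .05;
# with ~500 per group it clears it comfortably.
```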
So what's going on? The folks at MDRC suggest 3 possibilities: (1) the program didn't last long enough to generate impacts, (2) the services weren't comprehensive enough, (3) advising may need to be linked to other supports--including more substantial financial aid--in order to generate effects. I think these are reasonable hypotheses, but I'd like to add some more to this list.
First and foremost, there's a selection problem. MDRC tested an effect of enhanced advising on a population of students already more likely to seek advice-- those who signed up for a study and for more services. Now, of course, this is a common problem in research and it doesn't compromise the internal validity of the results (e.g. I'm not saying that they mis-estimated the size of the effect). And MDRC did better than usual in using a list of qualified students (all of whom, by the way, had to have completed a FAFSA) and actively recruiting them into the study-- rather than simply selecting participants from folks who showed up to a sign-up table and agreed to enter a study. But, in the end, they are testing the effects of advising on a group that was responsive to the study intake efforts of college staff. And we're not provided with any data on how that group differed from the students who weren't responsive to those efforts--not even on the measures included on the FAFSA (which it seems the researchers have access to). Assuming participants are different from non-participants (and they almost always are), I'm betting the participants have characteristics that make them more likely to seek help-- and therefore are perhaps less likely to accrue the biggest benefits from enhanced advising. I wish we had survey measures to test this hypothesis-- for example, we could look at the expectations of participants at baseline and compare them to those of more typical students-- but the first survey wasn't administered until a full year after the treatment began. To sum up, this issue doesn't compromise the internal validity of the results, but it may help explain why such small effects were observed-- there are often heterogeneous effects of programs, and the students for whom you might anticipate the bigger effects weren't in the study at all.
A second issue: we just don't know nearly enough about the counterfactual in this case-- specifically, what services students in the control group received. (We know a bit more about differences in what they were offered, e.g. from Table 3.3, but not in terms of what they received.) We are provided comparisons in services received by treatment status for only one measure-- services received 3+ times during the first year of the study (Appendix Table C.3), but not for the full range of services such as those shown in Appendix Table C.1. For example, we don't know whether students in the control and treatment groups had similar chances of contacting a counselor 1 or 2 times; we know only the incidence of 3+ contacts. If the bar was set rather high, it may have been tougher to clear (e.g. the treatment would've needed to have a bigger impact to register as significant).
Having raised those issues, I want to note that these are fairly common problems in evaluation research (not knowing much about either study non-participants or about services received by the control group), and they don't affect MDRC's interpretations of findings. But these problems may help us understand a little bit more about why more substantial effects weren't observed.
Before wrapping up, I want to give MDRC credit for paying attention to more than simply academic outcomes in this study-- they tested for social and health effects as well, including effects on stress (but didn't find any). As I've written here before, we need to bring the study of student health and stress into educational research in a more systematic way, and I'm very glad to see MDRC doing that.
So, in the end, what have we learned? I have no doubt that the costs of changing these advising ratios are substantial, and the impacts in this case were clearly low. Right now, that doesn't lend too much credence to increasing spending on student services. But, this doesn't mean that more targeted advising might not be more effective. Perhaps it can really help men of color (who are largely absent from this study). Clearly, (drumroll/eye-rolling please), more research is needed.
Wednesday, August 26, 2009
The "myth" of educational reform
The Obama administration is very busy bemoaning the "myths" that the general public believes about its proposed health care package. In a recent statement they mentioned myths such as: "About five out of 10 believe the federal government will become directly involved in making personal health care decisions." and "Roughly six out of 10 Americans believe taxpayers will be required to pay for abortions."
Who is to blame for the fact that the American public cannot separate truth from myth and cannot think their way out of a paper bag? Here is my best guess: the schools. The schools do not teach people how to think, how to discern truth, or how to figure out what the real agenda of talk show hosts might be. They learn algebra, and they analyze Shakespeare, and they memorize physics formulas, and still they can't think. Amazing!
Could we try teaching them to think so that they won't believe "myths?" Apparently not. Mr. Obama insists on a national curriculum and more testing on the same old crap. Rest assured, Mr. President, that future Presidents will have to deal with these kinds of myths as well, because the students you will be creating will be just as incapable of thinking as the citizens you have to deal with now on a daily basis.
Wednesday, August 12, 2009
Answering: “what should I go to school for?”
These days one can easily find out how people get to one’s website. My outrage column is often found via the question "what should I go to school for?" This question drives the answer seeker to my column on “why little girls shouldn’t go to school,” which is certainly not what they were looking for. (Of course, I don’t think little boys should go to school either, in case you were wondering.)
So, I thought I would attempt to answer their question, since people keep asking it. The problem is that the question is ambiguous. They could be asking why they should go to school at all, or they could be asking what they should study in school. As I have no idea which meaning predominates, I will take a shot at answering both questions. I will make the assumption that the people asking these questions are in high school and perhaps thinking about going to college.
Why go to school at all?
In a society other than the one in which we live, this is a very good question. I think school, as it exists today, is a very bad idea. Still, I would be remiss in answering this question by saying drop out. Dropouts are viewed badly in our society. School is stupid, but dropping out is stupider. Why? Because, as one travels through life, one accumulates a set of accomplishments. Quitting, no matter what you quit, is never a great accomplishment. Unless, of course, you quit for something better. If you have a good plan that will net you something better and enable you to say “I quit to start Microsoft” or the equivalent, by all means quit. One learns very little of value in high school. Still, the credential entitles you to a minimal amount of respect that you may need at some point. So stick it out if you can.
Now to the more important question. What should you study in high school, or more importantly, because there are more choices, in college? Let’s start with what you shouldn’t study. Study no academic subject. Do not study English, History, Math, Physics, Biology, or any of the other standard subjects that one always starts with in high school. Whoa! Did I really say that? Heresy. So, why not then?
It is important to realize that there are many myths in our society and that these myths are usually offered by people who stand to gain if people believe in them. The “you must drink 8 glasses of water a day” myth, for example, is offered up by companies that sell bottled water. In school, the significance of studying literature, or mathematics, or history, or science, is offered up by those who teach those subjects, those who make a living testing those subjects, and more importantly by book publishers and others who have serious vested interests in selling things related to those subjects. In addition, the educated elite, having been educated in those subjects, can pooh-pooh anyone who doesn’t know them and keep the high ground for themselves. If you don’t know what they know, you can’t be much. This attitude has always been with us, in every society, but the subjects change. Sometimes the subject is religion, sometimes astrology, sometimes some secret knowledge that only the village elders have. These days it is literature, which certainly won’t last, mathematics, which makes hardly any sense at all in the age of computers, and history, which never made any sense since history is written by those who come out best in the telling. Science seems to be making a big move these days. When I was young, science was for geeks, and those who knew it were looked down upon by the people who knew important stuff. Things change.
There is, not surprisingly, a serious lack of employment possibilities in those areas of study. So many people have been pushed to study those subjects that there is a serious oversupply of job seekers who were English majors, for example. It should not be possible to be an English major, but tell that to English professors.
So what should you go to school for? This is really an easy question to answer. First ask yourself what you really like to do in life, what you think about on a regular basis, whom you admire, and who you wish to be. Only you can answer those questions. When you come up with answers, ask if there are jobs in that area. Be creative. Make up a job if you don’t think one exists. Ask what you need to learn to do in order to become a person who thinks about or does all day whatever it is you like to think about and do all day. Extrapolate up. If you like working on your car, maybe you would like working on airplanes or ships, for example. If you like hanging out and talking, ask yourself who gets paid to do that (salesmen?). Find out where those who do what seems to be fun learned to do it. Often the answer is “on the job.” If that is the answer, ask yourself how you can get a low-level job in that area and work your way up. People learn by doing. Ask how you can start doing.
If you do need training to start doing what you want, find a community college that offers that kind of training. Most of all, do not go to school if you have no inkling at all about what you think you would like to learn to do. Work for a while and start finding out more about the world, then ask the above questions again.
In the U.S. most people go to college immediately after high school. My experience as a professor was that those students who did something else first, who went into the army, the Peace Corps, traveled around, worked for a while and such, made much better students in college. They knew why they were there. Do not go to school if the only reason you are there is to get a degree. Wrong reason. Know yourself first, then learn what you need to know to become a person you would respect.