Textbooks seem to have suddenly become a hot topic again, with the recent release of a study of NYC schools’ textbook habits by Charles Sahm. Robert Pondiscio has a nice summary of Sahm’s important work. He also ends with a mention of my textbook FOIA work and some quotes from me about how silly it is that we don’t track these data routinely.
In that spirit, I thought it might be of interest if I gave a little update on where things stand with my research. Perhaps this will also explain my somewhat diminished presence here and on Twitter in the past two weeks.
I sent out 3,014 FOIA requests a bit before Memorial Day, and the responses started coming back on Tuesday, May 26. Since then, I’ve received:
- Textbook adoption data reported on http://www.nsftextbookstudy.org for approximately 650 districts.
- Email or snail mail responses for approximately 850 districts (some overlap with the above, but not much).
Of these 850 email/snail mail responses, I’d estimate about 10% say they don’t keep textbook data, about 50-60% provide the data, and about 30-40% say they need more time to collect it.
And of those who provide the data, it's relatively clear that some substantial proportion of them--perhaps half--do not routinely keep such a list but instead pulled something together just for me (by law, by the way, they do not have to do this, so I am quite appreciative). Some of these pulled-together responses include handwritten lists in cursive.
Five districts have so far charged me for the information (ranging from $1.19 to $27), which they are legally entitled to do if it took them time to pull the documents together or if they made copies.
One school originally demanded that I come pick the data up in Rochester, New York. However, after a somewhat testy email from me, the principal finally offered to email me a PDF for $2.25.
Two district leaders have sent threatening emails or left nasty messages indicating they would rat me out to my Dean (for what, I don't know). I told my Dean, and she said it was no big deal.
Perhaps 10 respondents have expressed great interest in the research or sent thoughtful notes.
It's very clear that New York districts track these data less routinely than Illinois or Texas districts. However, my sense is that the response rate is quite a bit higher in New York than in the other two states, so perhaps it is just that the districts that don't track the data in Illinois and Texas are ignoring me, whereas in New York they're telling me they don't track it.
So that’s where we are at this point. We have data of some kind from ~40% of districts in these four states. I haven’t really gotten beyond that to look at what they’re actually reporting yet. We’ll start with follow-up emails and letters to the nonrespondents in a few weeks.
All in all, I’ve been amazed at how unbelievably effective this has been as a research strategy, even if I feel somewhat bad for having had to deploy it. I’m very excited for the data gathered to this point, and I think it will be useful both to me and to other researchers moving forward.