1) Exposure to the Bible would convince people there is something to Christianity
2) Exposure to the Bible improves morality
I'm not so sure either of those is true.
Regardless of how states might get the Bible into schools, it doesn't seem like there's any constitutional way for teachers to proselytize, moralize, etc. using it. They're not going to be able to sell Christianity with the Bible as a foundation for doing so.
So what do folks think? Is exposure to the Bible really all that effective on its own?