It isn’t one or the other. The history of the American continent is one of a land that became dominated by Christianity in all its forms (don’t forget the many Catholic Hispanics and the like). But the influence of many other religions is just as deeply interlinked and fundamental to the nation’s formation. You can’t ignore the huge financial and political power that the Jewish community held and still holds. Asian religions and others lend their voices through the voices of migrant workers. And don’t forget the European state religions (versions of Christianity) whose persecution drove the founding settlers to seek a better life.
And which version of Christianity would you want to teach? Out of the entire collection of different versions, which is most worthy of state indoctrination? Protestant, Catholic, JWs, LDS, Creationism? Which?
Any way you cut it, to suggest you only need to teach a specific Christian viewpoint to teach young Americans the depth and breadth of the history of the land is to ignore the true depth and richness of America’s theological heritage.
If you were Muslim, would you still insist that Christianity is the key to teaching American history? Somehow I doubt it.