When you learned about slavery and segregation in America, was it taught as American history, or more specifically Southern history?
As a kid growing up, I just thought of it as American history. It wasn’t until I was older that I really thought about the differences between the North and South during segregation and the civil rights movement. History was my worst subject in school; I hated it. A lot probably didn’t sink in even if it was taught, but I really don’t remember my teachers emphasizing that the racial divide was primarily a Southern phenomenon. The South certainly wasn’t the only part of the country with a history of conflict between races or ethnic groups.