Don't know where to stick it so posting it here :)
I've been reading up on American history: the era of the English colonists, how they dealt with the Native Americans, and, not long after the settlements were built, with African slaves.
Quite an appalling history. Of course I had already read about it many times before, but it remains horrifying.
I understand that the Indians had a different status when the colonists first arrived, which makes sense, but later they were treated as badly as the Africans.
I'm wondering: who is worse off these days in the US in general, African Americans or the Indians? Is there a big difference between the North and the South on this?
Do these two groups get along, or do they discriminate against each other too?
One more question: do Americans learn about this history at school, and if so, is it the truth or a glorified version, along the lines of "the whites had the right to do what they did"?
Or does education not place much emphasis on this history, focusing more on your war against Britain and so on?