• cstine@lemmy.uncomfortable.business

    As a product of American education, I can say resolutely that no, that was absolutely not taught.

    Of course, this is partially because American education sucks and partially because we never HAD common land here: everything was privately owned after it was stolen from the people who already lived here, and then much of it was worked by enslaved people who had no say in the matter, for the benefit of the people who stole the land.

    Of course, this is ALSO not really taught, because it’d make people feel sad and make the US look kinda bad, so while both subjects get mentioned, you get maybe a week of coverage on each, at most.

    • KairuByte@lemmy.world

      It’s all but against the law in Florida (maybe other states as well?) to teach that aspect of history. Wouldn’t want the white kids to feel guilty for being white… because they know about things that happened in the past.