Ok so I have been living in Hawaii now for over a year and a half. Like most people I really love this place, but I have never understood the history between Native Hawaiians and the United States of America. Living here you hear about how America stole this land. Of course I am familiar with the concept of Manifest Destiny, the idea that the United States of America should stretch from sea to shining sea, even if the land belonged to a native population. It is obvious that it was that kind of mentality that led to the overthrow of the Hawaiian monarchy.
Today in my Writing for New Media class we were shown a film called Act of War. It was the perfect documentary for really understanding how these islands became American territory and then a state. I was shocked at how forceful our government was with the overthrow of the monarchy and the annexation to America. I can't help but feel a sense of white guilt, since it was all conspired by white men in suits and impeccable mustaches.
This was all very long ago in history, and what's done is done, but I realize how important it is to acknowledge the way this great state came to be. It simply was not just. I can't help but think, though, about how things would have been if all that had not happened. Would there still be a Hawaiian monarchy today? Would some other country have taken advantage of the strategic position these islands possess?
There are plenty of what-ifs, but those questions do not ease the pain Native Hawaiians feel. Hawaiians were told for years that they were inferior. After seeing their lands taken, their way of life radically altered, their culture degraded, and their Queen pushed around, it's no wonder that so many of them have suffered the perils of addiction, homelessness, and incarceration. If you tell people they're less than human, after a while they must start to believe it.
The problem is, there is no easy way to reconcile such a painful history.