John Wayne’s Racist Remarks From 48 Years Ago Are Tarnishing His Legacy

Although John Wayne has been dead for decades, his words still reach many people through his movies. Wayne always played the stalwart hero who unflinchingly faced the bad guys, who were often minorities. Now, however, his entire image is being called into question after a 1971 interview with Playboy magazine resurfaced in which Wayne made the alarming proclamation that he was a white supremacist.

Wayne made it clear that he believed white people were superior to other races. That is a disturbing thing to hear from a man portrayed as a hero on the big screen. That his beliefs were not far removed from the ideals of Adolf Hitler and the Nazi party, which led to the massacre of millions of innocent people, should alarm everyone. Wayne was supposed to be the good guy, but the statements published in that Playboy issue shed light on a much darker side of John Wayne, one with the power to erode his reputation forever.

“I believe in white supremacy,” he said to the Hugh Hefner-owned magazine.

Besides his belief that white people should hold power, Wayne also looked down on African Americans, arguing that they were not his equals and should not be given positions of leadership in America.

“We can’t all of a sudden get down on our knees and turn everything over to the leadership of the blacks. I don’t believe in giving authority and positions of leadership and judgment to irresponsible people,” he said.

Wayne did not stop with those highly controversial and racist comments. He took his views a step further, proclaiming that he felt no shame over what America’s white founders did to enslaved Africans.

“I don’t feel guilty about the fact that five or 10 generations ago these people were slaves,” he said. “Now, I’m not condoning slavery. It’s just a fact of life, like the kid who gets infantile paralysis and has to wear braces so he can’t play football with the rest of us.”

When colonists came to North America, they killed countless Native Americans in cold blood just to steal their land. This is a pivotal part of American history that is left out of most history books. John Wayne, who built a career out of fighting Native Americans in his films, felt no empathy for what happened to their cultures.

“I don’t feel we did wrong in taking this great country away from them if that’s what you’re asking,” Wayne said. “Our so-called stealing of this country from them was just a matter of survival. There were great numbers of people who needed new land, and the Indians were selfishly trying to keep it for themselves.”

Wayne had previously sparked controversy when it was discovered that he avoided serving in World War II while other actors went to war, leaving him to take the roles they left behind in Hollywood. He felt guilty about not serving his country and had to live with that.
