Zombie-themed entertainment has been popular for a long time. Even as far back as “Night of the Living Dead” in the sixties and “Dawn of the Dead” in the seventies, there always seemed to be a subtle subtext pointing to politics, race, and consumerism. Today I feel there is a new hidden drive, perhaps not even intended: the dehumanization of those we see as “other.”
I have been a horror film fan since childhood, and I have always looked with interest at the zombie phenomenon. I am fascinated by how the formula works. Zombie and vampire television shows follow the same process: an infectious disease somehow manifests, creating an apocalyptic scenario. All those who are infected suddenly and irredeemably become part of “them.” Once you “go zombie,” you never go back. The uninfected, “the good,” then spend much of the show brutalizing zombies, who threaten to infect the whole world.
What does this mean for our society? Why are so many drawn to these shows? I suspect part of the attraction lies in the fact that we are more polarized than ever. This is certainly true in the United States, and it appears to be true worldwide. Many feel that compromise is not an option.
What if the real infection is self-righteousness? Are we truly justified in our disrespectful patterns of behavior? What if we are the zombies? A zombie is not fully conscious; it follows a simplistic, biased set of preconceived values and infects others with the same view. Perhaps we are the problem.