I feel like a lot of zombie fiction where characters already know what zombies are and the dangers of getting bitten ends up being semi-satirical comedy. Movies and shows set in worlds where the idea of zombies didn’t previously exist seem to be a bit more serious, from what I’ve experienced. I don’t know if it’s the aura of suspense and mystery, or because it leads to more pandemonium.
Not necessarily. I mean, WWZ - while it’s never explicitly stated - clearly exists in a world with zombie media.
Book, not film, obvs.
WWZ is based on a book? Damn. TIL.
…in the same way Christianity is based on the teachings of Jesus.
There’s not much in common.
In that the movie has the same name.
The book is far better.