When the rumors of WWE being sold first started going around years ago, many fans were against it because they were worried about what a corporate goliath would do to wrestling.
At this point, though, any wrestling fan should be rooting for WWE to be sold, right? Even after seeing how companies like Disney have treated other fan favorites, the best possible thing that could happen to WWE (and pro wrestling in general) would be a sale, so we can finally get rid of Vince and his yes men and give other people a chance to run the most recognizable name in sports entertainment.
WWE could still keep its identity – good-looking people, muscles, a heavier focus on entertainment, flashy presentation, etc. – but without the awful creative direction that exists to please an audience of one.
I've been thinking about this for a while, and there are too many examples to list, but the dual gut punch of Gunther and Sarray being turned into a magical schoolgirl feels like, okay, we're really done here.
I know Disney is always the popular pick to buy them, but NBC Universal is really the more realistic suitor and the one most likely to end up getting them. As long as whoever buys it starts monetizing the old content again, I'll be happy.