Why does everyone give aliens such a bad image?
Why does everyone assume aliens will eat or destroy us?
Why does everyone assume they want to conquer Earth?
Why does everyone assume that they are the intelligent species?
Why does everyone assume that they have special powers?

It really isn't fair! Everyone's been watching too many movies and built up a stereotypical view of them!

Well, thanks a lot!

Now they'll never come down to meet us, make peace, and offer an alliance! Because if they do, they'll just get locked up and dissected by the stupid FBI or MIB!

Not Fair!