I've heard of all this. I just don't care for it, though. I mean, if any of this is true, why was it only far-left 'investigative' reporters talking about it, and not the mainstream media? Yeah, I know, "Because the Right controls the media." Except I don't believe that. If that were the case, why are the vast majority of the stories you hear now negative reporting on Iraq? If the Right controlled the media, you'd think they'd be propagandizing the Iraq war like crazy. <_<
Of course, my original thought on the matter was how pissed off I was that Florida was handed to Bush. Keyword: was. Well, this is why I try to keep myself out of political topics.