Is this guy serious?!
-
@box said:
ccbiggs/cory, let me show you how your post reads, I'll chop a section out of the middle just to make it shorter.
@ccbiggs said:
Just part of an email that ended up in my inbox.
EMAIL
**A LITTLE GUN HISTORY**
In 1929, the Soviet Union established gun control. From 1929 to 1953, about 20 million dissidents, unable to defend themselves, were rounded up and exterminated.
...
Defenseless people rounded up and exterminated in the 20th Century because of gun control: 56 million.
------------------------------
Your Conclusion:
Guns in the hands of honest citizens save lives and property and, yes, gun-control laws adversely affect only the law-abiding citizens.
Take note my fellow Americans, before it's too late!
The next time someone talks in favor of gun control, please remind them of this history lesson.
With guns, we are 'citizens'. Without them, we are 'subjects'.
During WWII the Japanese decided not to invade America because they knew most Americans were ARMED!
I'm aware now that this isn't what you meant, but that is how it comes across at first reading, so you might be able to see how it could lead to some strong reactions and some confusion over the next posts.
I think he was just quoting an email not actually writing that himself - just to clarify.
-
Yes I got that, was just pointing out that it could be confused.
-
Here is a very interesting article, one I have read many times, and still I find myself captive to my own biases on many issues.
I know it's long, but worth the read.
How Facts Backfire
It's one of the great assumptions underlying modern democracy that an informed citizenry is preferable to an uninformed one. "Whenever the people are well-informed, they can be trusted with their own government," Thomas Jefferson wrote in 1789. This notion, carried down through the years, underlies everything from humble political pamphlets to presidential debates to the very notion of a free press. Mankind may be crooked timber, as Kant put it, uniquely susceptible to ignorance and misinformation, but it's an article of faith that knowledge is the best remedy. If people are furnished with the facts, they will be clearer thinkers and better citizens. If they are ignorant, facts will enlighten them. If they are mistaken, facts will set them straight.
In the end, truth will out. Won't it?
Maybe not. Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It's this: Facts don't necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.
This bodes ill for a democracy, because most voters - the people making decisions about how the country runs - aren't blank slates. They already have beliefs, and a set of facts lodged in their minds. The problem is that sometimes the things they think they know are objectively, provably false. And in the presence of the correct information, such people react very, very differently than the merely uninformed. Instead of changing their minds to reflect the correct information, they can entrench themselves even deeper.
"The general idea is that it's absolutely threatening to admit you're wrong," says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon - known as "backfire" - is "a natural defense mechanism to avoid that cognitive dissonance."
These findings open a long-running argument about the political ignorance of American citizens to broader questions about the interplay between the nature of human intelligence and our democratic ideals. Most of us like to believe that our opinions have been formed over time by careful, rational consideration of facts and ideas, and that the decisions based on those opinions, therefore, have the ring of soundness and intelligence. In reality, we often base our opinions on our beliefs, which can have an uneasy relationship with facts. And rather than facts driving beliefs, our beliefs can dictate the facts we choose to accept. They can cause us to twist facts so they fit better with our preconceived notions. Worst of all, they can lead us to uncritically accept bad information just because it reinforces our beliefs. This reinforcement makes us more confident we're right, and even less likely to listen to any new information. And then we vote.
This effect is only heightened by the information glut, which offers - alongside an unprecedented amount of good information - endless rumors, misinformation, and questionable variations on the truth. In other words, it's never been easier for people to be wrong, and at the same time feel more certain that they're right.
"Area Man Passionate Defender Of What He Imagines Constitution To Be," read a recent Onion headline. Like the best satire, this nasty little gem elicits a laugh, which is then promptly muffled by the queasy feeling of recognition. The last five decades of political science have definitively established that most modern-day Americans lack even a basic understanding of how their country works. In 1996, Princeton University's Larry M. Bartels argued, "the political ignorance of the American voter is one of the best documented data in political science."
On its own, this might not be a problem: People ignorant of the facts could simply choose not to vote. But instead, it appears that misinformed people often have some of the strongest political opinions. A striking recent example was a study done in the year 2000, led by James Kuklinski of the University of Illinois at Urbana-Champaign. He led an influential experiment in which more than 1,000 Illinois residents were asked questions about welfare - the percentage of the federal budget spent on welfare, the number of people enrolled in the program, the percentage of enrollees who are black, and the average payout. More than half indicated that they were confident that their answers were correct - but in fact only 3 percent of the people got more than half of the questions right. Perhaps more disturbingly, the ones who were the most confident they were right were by and large the ones who knew the least about the topic. (Most of these participants expressed views that suggested a strong antiwelfare bias.)
Studies by other researchers have observed similar phenomena when addressing education, health care reform, immigration, affirmative action, gun control, and other issues that tend to attract strong partisan opinion. Kuklinski calls this sort of response the "I know I'm right" syndrome, and considers it a "potentially formidable problem" in a democratic system. "It implies not only that most people will resist correcting their factual beliefs," he wrote, "but also that the very people who most need to correct them will be least likely to do so."
What's going on? How can we have things so wrong, and be so sure that we're right? Part of the answer lies in the way our brains are wired. Generally, people tend to seek consistency. There is a substantial body of psychological research showing that people tend to interpret information with an eye toward reinforcing their preexisting views. If we believe something about the world, we are more likely to passively accept as truth any information that confirms our beliefs, and actively dismiss information that doesn't. This is known as "motivated reasoning." Whether or not the consistent information is accurate, we might accept it as fact, as confirmation of our beliefs. This makes us more confident in said beliefs, and even less likely to entertain facts that contradict them.
New research, published in the journal Political Behavior last month, suggests that once those facts - or "facts" - are internalized, they are very difficult to budge. In 2005, amid the strident calls for better media fact-checking in the wake of the Iraq war, Michigan's Nyhan and a colleague devised an experiment in which participants were given mock news stories, each of which contained a provably false, though nonetheless widespread, claim made by a political figure: that there were WMDs found in Iraq (there weren't), that the Bush tax cuts increased government revenues (revenues actually fell), and that the Bush administration imposed a total ban on stem cell research (only certain federal funding was restricted). Nyhan inserted a clear, direct correction after each piece of misinformation, and then measured the study participants to see if the correction took.
For the most part, it didn't. The participants who self-identified as conservative believed the misinformation on WMD and taxes even more strongly after being given the correction. With those two issues, the more strongly the participant cared about the topic - a factor known as salience - the stronger the backfire. The effect was slightly different on self-identified liberals: When they read corrected stories about stem cells, the corrections didn't backfire, but the readers did still ignore the inconvenient fact that the Bush administration's restrictions weren't total.
It's unclear what is driving the behavior - it could range from simple defensiveness, to people working harder to defend their initial beliefs - but as Nyhan dryly put it, "It's hard to be optimistic about the effectiveness of fact-checking."
It would be reassuring to think that political scientists and psychologists have come up with a way to counter this problem, but that would be getting ahead of ourselves. The persistence of political misperceptions remains a young field of inquiry. "It's very much up in the air," says Nyhan.
But researchers are working on it. One avenue may involve self-esteem. Nyhan worked on one study in which he showed that people who were given a self-affirmation exercise were more likely to consider new information than people who had not. In other words, if you feel good about yourself, you'll listen - and if you feel insecure or threatened, you won't. This would also explain why demagogues benefit from keeping people agitated. The more threatened people feel, the less likely they are to listen to dissenting opinions, and the more easily controlled they are.
There are also some cases where directness works. Kuklinski's welfare study suggested that people will actually update their beliefs if you hit them "between the eyes" with bluntly presented, objective facts that contradict their preconceived ideas. He asked one group of participants what percentage of its budget they believed the federal government spent on welfare, and what percentage they believed the government should spend. Another group was given the same questions, but the second group was immediately told the correct percentage the government spends on welfare (1 percent). They were then asked, with that in mind, what the government should spend. Regardless of how wrong they had been before receiving the information, the second group indeed adjusted their answer to reflect the correct fact.
Kuklinski's study, however, involved people getting information directly from researchers in a highly interactive way. When Nyhan attempted to deliver the correction in a more real-world fashion, via a news article, it backfired. Even if people do accept the new information, it might not stick over the long term, or it may just have no effect on their opinions. In 2007 John Sides of George Washington University and Jack Citrin of the University of California at Berkeley studied whether providing misled people with correct information about the proportion of immigrants in the US population would affect their views on immigration. It did not.
And if you harbor the notion - popular on both sides of the aisle - that the solution is more education and a higher level of political sophistication in voters overall, well, that's a start, but not the solution. A 2006 study by Charles Taber and Milton Lodge at Stony Brook University showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they're totally wrong. Taber and Lodge found this alarming, because engaged, sophisticated thinkers are "the very folks on whom democratic theory relies most heavily."
In an ideal world, citizens would be able to maintain constant vigilance, monitoring both the information they receive and the way their brains are processing it. But keeping atop the news takes time and effort. And relentless self-questioning, as centuries of philosophers have shown, can be exhausting. Our brains are designed to create cognitive shortcuts - inference, intuition, and so forth - to avoid precisely that sort of discomfort while coping with the rush of information we receive on a daily basis. Without those shortcuts, few things would ever get done. Unfortunately, with them, we're easily suckered by political falsehoods.
Nyhan ultimately recommends a supply-side approach. Instead of focusing on citizens and consumers of misinformation, he suggests looking at the sources. If you increase the "reputational costs" of peddling bad info, he suggests, you might discourage people from doing it so often. "So if you go on 'Meet the Press' and you get hammered for saying something misleading," he says, "you'd think twice before you go and do it again."
Unfortunately, this shame-based solution may be as implausible as it is sensible. Fast-talking political pundits have ascended to the realm of highly lucrative popular entertainment, while professional fact-checking operations languish in the dungeons of wonkery. Getting a politician or pundit to argue straight-faced that George W. Bush ordered 9/11, or that Barack Obama is the culmination of a five-decade plot by the government of Kenya to destroy the United States - that's easy. Getting him to register shame? That isn't.
Joe Keohane is a writer in New York.
-
Thanks Pete... that's really depressing.
-
@ccbiggs said:
There is no difference between a gun or a knife.
Then why buy a gun, given there's likely knives around the kitchen?
-
@pbacot said:
Thanks Pete... that's really depressing.
yeah.. at least the pic he posted at the end was funny
(i like the little baby with the bedpan head)
-
More batshit crazy stuff from Alex the douche Jones:
-
@solo said:
More batshit crazy stuff from Alex the douche Jones:
How the hell did he pass the mental checks when buying his 50 guns?
-
@rodentpete said:
@solo said:
More batshit crazy stuff from Alex the douche Jones:
How the hell did he pass the mental checks when buying his 50 guns?
Simple answer is he never had to take any mental exams to own a gun, they are not required in Texas.
-
What can I say? What can anyone say? It seems this thread came to a grinding halt after that video. The lunacy in this unbalanced, evidently psychotic individual runs deep. The scary thing is that it stands to "reason" (a word the concept of which Loony Jones would have no familiarity with) that he is not alone. Very scary, very scary indeed.
-
@solo said:
More batshit crazy stuff from Alex the douche Jones:
maybe batshit crazy in the way he lets it out (he's an entertainer after-all..)
some of the stuff he's talking (or whatever that is he's doing) about though, might have some truth in there...
[edit] the thing i personally find batshit crazy about him (and i've mentioned this earlier itt), is that he's seriously contemplating/imagining/(fantasizing about?) being in some sort of gunfight with the people he's ripping on.. straight up suicide at this point in the game..
-
"[edit] the thing i personally find batshit crazy about him (and i've mentioned this earlier itt), is that he's seriously contemplating/imagining/(fantasizing about?) being in some sort of gunfight with the people he's ripping on.. straight up suicide at this point in the game."
Not sure if this would be a bad thing, as long as innocent bystanders are not caught up in it. The conspiracy folk seem to end up like Timothy McVeigh. Escalation to the point of blowing up buildings is a whole new level of psychosis.
-
Let me make an adjustment - "Believes his interpretation of the constitution is right"
-
Isn't that a rifle for destroying a tank? Why? Why do you need one?!
-
You know, in case.
-
"Isn't that a rifle for destroying a tank? Why? Why do you need one?!"
It actually looks like an old .30 Browning machine gun. We had them on armoured cars like the Ferret for light duties in tank and infantry units. Why it has the stock etc. added is a mystery, as these were normally fired from vehicle mounts or heavy tripods. A case of never mind the quality, get a load of my big gun, bigger than yours, etc.
Whether or not the weapon could be fired accurately in that state of build is questionable; those things have a heck of a kick on a ground mount, and I believe the chances of shoulder dislocation are more than good should anyone try to fire it like that.
Darwin had more than a point imho.
-
Sorry, it's actually an elephant gun, due to its "tusks"... I watch Sons of Guns. Sorry to be precise, but it's an anti-tank rifle, shot by a soldier! Come on, guys, I'm not even American and I know this.
It's a Finnish rifle: http://en.wikipedia.org/wiki/Lahti_L-39
-
Thanks for the clarification, Oli. I was looking at the receiver between the barrel and the stock, which has a very chunky oblong shape (tall and narrow) and is what made me think of a Browning machine gun. Very similar in the original shot. Thanks again, interesting weapon if you are at the 'blunt' end.
-
Mike, I am by no means an enthusiast, but I love watching the American gun shows on Discovery, if only for their engineering passion and skills.
Sorry to be off topic, but there are even more ridiculous rifles!!