I'm Big Into Anal
Arcane
i'll second that
please, please, teach me how to increase my ratio
You are not smart enough to call me a dumbfuck.You're a dumbfuck, so good riddance, faggot. Go eat shit and cease your pathetic life.
Goodness there are like 10 retards in this thread. None of them are smart enough. I haven't really read any of the responses but I assume they are supportive
Thanks for the interesting read. Here’s a simple arithmetic question: A bat and ball cost a dollar and ten cents. The bat costs a dollar more than the ball. How much does the ball cost?
The vast majority of people respond quickly and confidently, insisting the ball costs ten cents. This answer is both obvious and wrong. (The correct answer is five cents for the ball and a dollar and five cents for the bat.)
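For anyone who wants the algebra spelled out, here is a minimal check (my own illustration in Python, not part of the original piece): call the ball's price x, the bat's price x plus one dollar, and solve 2x + 1.00 = 1.10.

```python
# Quick check of the bat-and-ball puzzle: the ball costs x,
# the bat costs x + 1.00, and together they cost 1.10,
# so 2x + 1.00 = 1.10 and x = 0.05.
total = 1.10        # combined price in dollars
difference = 1.00   # the bat costs a dollar more than the ball

ball = (total - difference) / 2
bat = ball + difference

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")   # ball = $0.05, bat = $1.05
assert abs((ball + bat) - total) < 1e-9          # prices sum to $1.10
assert abs((bat - ball) - difference) < 1e-9     # bat is exactly $1.00 more
```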
For more than five decades, Daniel Kahneman, a Nobel Laureate and professor of psychology at Princeton, has been asking questions like this and analyzing our answers. His disarmingly simple experiments have profoundly changed the way we think about thinking. While philosophers, economists, and social scientists had assumed for centuries that human beings are rational agents—reason was our Promethean gift—Kahneman, the late Amos Tversky, and others, including Shane Frederick (who developed the bat-and-ball question), demonstrated that we’re not nearly as rational as we like to believe.
When people face an uncertain situation, they don’t carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on a long list of mental shortcuts, which often lead them to make foolish decisions. These shortcuts aren’t a faster way of doing the math; they’re a way of skipping the math altogether. Asked about the bat and the ball, we forget our arithmetic lessons and instead default to the answer that requires the least mental effort.
Although Kahneman is now widely recognized as one of the most influential psychologists of the twentieth century, his work was dismissed for years. Kahneman recounts how one eminent American philosopher, after hearing about his research, quickly turned away, saying, “I am not interested in the psychology of stupidity.”
The philosopher, it turns out, got it backward. A new study in the Journal of Personality and Social Psychology led by Richard West at James Madison University and Keith Stanovich at the University of Toronto suggests that, in many instances, smarter people are more vulnerable to these thinking errors. Although we assume that intelligence is a buffer against bias—that’s why those with higher S.A.T. scores think they are less prone to these universal thinking mistakes—it can actually be a subtle curse.
West and his colleagues began by giving four hundred and eighty-two undergraduates a questionnaire featuring a variety of classic bias problems. Here’s an example:
In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?
Your first response is probably to take a shortcut and divide the final answer in half. That leads you to twenty-four days. But that’s wrong. The correct solution is forty-seven days: because the patch doubles every day, it covers half of the lake exactly one day before it covers all of it.
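To make the doubling logic concrete, here is a small sketch (again my own Python illustration, not taken from the study) that works backward from the fully covered lake:

```python
# Lily-pad puzzle: the patch doubles in size every day and covers
# the whole lake on day 48, so step backward, halving the coverage,
# until only half the lake is covered.
FULL_DAY = 48
coverage = 1.0   # fraction of the lake covered on day 48

day = FULL_DAY
while coverage > 0.5:
    coverage /= 2   # one day earlier, the patch was half as large
    day -= 1

print(day)   # 47 -- half the lake is covered one day before it is full
```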
West also gave a puzzle that measured subjects’ vulnerability to something called “anchoring bias,” which Kahneman and Tversky had demonstrated in the nineteen-seventies. Subjects were first asked if the tallest redwood tree in the world was more than X feet, with X ranging from eighty-five to a thousand feet. Then the students were asked to estimate the height of the tallest redwood tree in the world. Students exposed to a small “anchor”—like eighty-five feet—guessed, on average, that the tallest tree in the world was only a hundred and eighteen feet. Given an anchor of a thousand feet, their estimates increased seven-fold.
But West and colleagues weren’t simply interested in reconfirming the known biases of the human mind. Rather, they wanted to understand how these biases correlated with human intelligence. As a result, they interspersed their tests of bias with various cognitive measurements, including the S.A.T. and the Need for Cognition Scale, which measures “the tendency for an individual to engage in and enjoy thinking.”
The results were quite disturbing. For one thing, self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.” This finding wouldn’t surprise Kahneman, who admits in “Thinking, Fast and Slow” that his decades of groundbreaking research have failed to significantly improve his own mental performance. “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy”—a tendency to underestimate how long it will take to complete a task—“as it was before I made a study of these issues,” he writes.
Perhaps our most dangerous bias is that we naturally assume that everyone else is more susceptible to thinking errors, a tendency known as the “bias blind spot.” This “meta-bias” is rooted in our ability to spot systematic mistakes in the decisions of others—we excel at noticing the flaws of friends—and inability to spot those same mistakes in ourselves. Although the bias blind spot itself isn’t a new concept, West’s latest paper demonstrates that it applies to every single bias under consideration, from anchoring to so-called “framing effects.” In each instance, we readily forgive our own minds but look harshly upon the minds of other people.
And here’s the upsetting punch line: intelligence seems to make things worse. The scientists gave the students four measures of “cognitive sophistication.” As they report in the paper, all four of the measures showed positive correlations, “indicating that more cognitively sophisticated participants showed larger bias blind spots.” This trend held for many of the specific biases, indicating that smarter people (at least as measured by S.A.T. scores) and those more likely to engage in deliberation were slightly more vulnerable to common mental mistakes. Education also isn’t a savior; as Kahneman and Shane Frederick first noted many years ago, more than fifty per cent of students at Harvard, Princeton, and M.I.T. gave the incorrect answer to the bat-and-ball question.
What explains this result? One provocative hypothesis is that the bias blind spot arises because of a mismatch between how we evaluate others and how we evaluate ourselves. When considering the irrational choices of a stranger, for instance, we are forced to rely on behavioral information; we see their biases from the outside, which allows us to glimpse their systematic thinking errors. However, when assessing our own bad choices, we tend to engage in elaborate introspection. We scrutinize our motivations and search for relevant reasons; we lament our mistakes to therapists and ruminate on the beliefs that led us astray.
The problem with this introspective approach is that the driving forces behind biases—the root causes of our irrationality—are largely unconscious, which means they remain invisible to self-analysis and impermeable to intelligence. In fact, introspection can actually compound the error, blinding us to those primal processes responsible for many of our everyday failings. We spin eloquent stories, but these stories miss the point. The more we attempt to know ourselves, the less we actually understand.
Do you have a friend who's super smart, but when it comes to street smarts he's ... let's say "lacking"? Even the smartest people pull dumb moves sometimes, and for some reason it's extra surprising and disappointing when a smart person screws up. How could that president or general carry on an affair knowing it could easily get out? How did that company CEO think he could embezzle millions and no one would find out?
The truth is that book smarts or business savvy don't make a person perfect. Or streetwise. In fact, smart people seem prone to spectacular lapses in judgment more so than "average" people.
Why? One study published in the Journal of Personality and Social Psychology gave logic problems to people to solve and found that smart people tended to make more mistakes than those of average intellect, because smart people were more likely to take shortcuts or make assumptions out of overconfidence. They were also worse at spotting those errors in their own thinking, a tendency known as the bias blind spot [source: West et al].
Of course, overconfidence isn't the only road to a dumb decision. Many of the dumb choices you'll see on this list were motivated by greed, pride, stress, and even sheer laziness. Let's look at 10 memorable moments of "what were you thinking?"
After serving two terms in the U.S.'s highest office, President Bill Clinton started the Clinton Foundation to address some of the most pressing issues affecting the world today, from childhood obesity and climate change to global health. So, how did such a charitable and intelligent guy become part of one of the most notorious presidential sex scandals?
In 1999, President Clinton faced an impeachment trial after details of an affair with 21-year-old intern Monica Lewinsky became public. While the affair itself was a pretty dumb move -- if you're going to have an affair, maybe don't choose someone who works for you -- the even dumber thing Clinton did was lie under oath.
The affair came to light in 1998 as part of a sexual harassment lawsuit filed by Paula Jones against Clinton [source: Linder]. In January 1998, Clinton was questioned about it formally by Jones's lawyers and lied under oath, saying the affair with Lewinsky never happened. Who can forget Clinton wagging his finger at the press and saying, "I did not have sexual relations with that woman, Miss Lewinsky"? He stuck to that lie until that August, when her infamous blue dress -- stained with Clinton's semen -- came to light. Clinton later said they had "only" had oral sex, so he had not lied when he said they did not have sexual relations.
If Clinton hadn't lied under oath about his affair with Lewinsky, there would have been much less fodder for an impeachment case later on, but Clinton was acting out of fear and stress that the revelation would hurt his political career [source: Linder].
Whether it did is debatable. While Clinton was found not guilty in his impeachment trial, some say the whole ordeal damaged the mystique of the presidency [source: Linder]. However, Clinton's other acts as president -- like ending the war in Bosnia and balancing the federal budget -- helped save his reputation. In fact, he left office with the highest approval rating of any postwar president [source: American Experience].
Gary Hart was a married politician, lawyer, author, and college professor whose hubris led him to make an incredibly dumb move: provoking the media.
Hart's pitfall -- besides having an affair with a model named Donna Rice while running for office -- was assuming that he was smarter than reporters. Hart must have thought that he could count on absolute discretion from Rice and everyone else who knew about the relationship. And with his background he should have known better.
Hart was a campaign manager-turned-politician, and in 1987, the favorite for the Democratic presidential nomination [source: Currie]. Reporters suspected an affair between Hart and Rice, but it was Hart's arrogance that did him in. When rumors surfaced that he was cheating on his wife, rather than dodging the questions or coming clean, Hart adamantly denied the rumors, and dared the media to follow him around. ("You'll be bored," was his actual comment.)
Surprise! Reporters did just that, and that same day, they spotted Rice leaving Hart's house. Then they discovered that Hart had taken a romantic cruise with Rice, on a boat called -- no, seriously -- "Monkey Business." Then, reporters began hounding Rice's close friend (and "Monkey Business" shipmate) Lynn Armandt about the relationship. Armandt dodged reporters for weeks before she finally caved and confessed to knowing first-hand about the Hart-Rice affair [source: Green]. From the account of the affair that Armandt later shared with People Magazine, the biggest surprise in this scandal is that it didn't break sooner. Neither party was very discreet, and Rice had told several friends about her tryst.
The Rice scandal rocked Hart's presidential bid, and he withdrew from the race in May of 1987 [source: Sabato].
Robert McCormick was CEO of the Internet technology company Savvis, but that position didn't prevent him from making a colossal blunder in the common sense department.
McCormick went to an exclusive "gentlemen's" club -- appropriately named Scores -- and managed to ring up a $241,000 tab on his company credit card [source: Maull]. Yes, we said the company credit card. Scores is known for its high prices: $10,000 lap dances, bottles of champagne that cost thousands of dollars, and -- McCormick claims -- for fraud.
When McCormick received the extravagant bill, he disputed almost all of the charges, telling American Express that he rang up no more than a paltry $20,000. Scores countered that the club has a policy in place to verify any charge over $10,000: staff take the cardholder's fingerprint and even have the customer call the credit card company to confirm the charges over the phone. After two years without payment, and with McCormick unable to produce any documentation showing fraud, American Express sued McCormick for the money [source: Maull].
Savvis, McCormick and American Express eventually settled the case confidentially and out of the courtroom, but not before McCormick resigned from the company over the scandal [source: Rivera].
In 1998, Dr. Andrew Wakefield, a well-regarded scientist, published an article in the prestigious medical journal The Lancet claiming that there was a link between autism and the measles, mumps, and rubella (MMR) vaccine.
The trouble is, Wakefield falsified much of the data in that paper.
Investigative reporters and the medical community have since discovered that Wakefield's paper was a complete fraud. He faked his patients' medical histories and published the results of his fraudulent study all in the name of money. What Wakefield didn't count on was that payoff coming to light.
The British Medical Journal discovered that Wakefield had received $674,000 from lawyers who were hoping to sue vaccine companies [source: CNN]. In order to get the results that the lawyers wanted, Wakefield faked his data in a couple of different ways: He chose some patients in his 12-person study who already had signs of autism and lied about others developing autism after getting the MMR vaccine [source: CNN].
In 2004, some of his fellow researchers found out about the law firm backing the research and withdrew their names as study co-authors [source: CNN]. The Lancet retracted the paper in 2010 and Wakefield was stripped of his medical license.
Wakefield and some of his fellow scientists continue to defend the study, saying that there was a scheme to cover up the link between vaccines and autism, but no peer-reviewed study has been able to replicate Wakefield's results [source: CNN].
That faked paper from the '90s is having real public health effects to this day. Some parents -- fearing for their children's safety -- are still opting not to get the MMR vaccine. This drop in vaccination rates has caused a spike in cases of measles, a dangerous childhood illness [source: CNN].
It's good to see that someone here started reading the books I recommended. Here’s a simple arithmetic question: A bat and ball cost a dollar and ten cents.
They should be. It would make the Codex a better place
Was there a particular event that caused his mod powers to be revoked in the first place? Or was it a gradual decline of his mental state? Or a Jew trick?
0.05 dollar? Here’s a simple arithmetic question: A bat and ball cost a dollar and ten cents. The bat costs a dollar more than the ball. How much does the ball cost?
please, please, teach me how to increase my ratio
nudes