The resurgence of Islam and of Christian fundamentalism has a common foundation: modern culture (not necessarily only American, despite the country's long tradition of anti-intellectualism) devalues knowledge and rationalism. Supporting material includes findings that only about half of Americans read a book in any year, only 26% accept Darwin's theory of evolution, and only a minority can name the four gospels or the first book of the Bible.
Jacoby also contends that anti-intellectualism and contempt for knowledge are worse in the U.S. than in any other developed economy, but offers no evidence.
McCarthyism, the growth of fundamentalism and junk science, and a celebrity-focused culture might be financed by defenders of the status quo, but the reality is more complex. There is little evidence for this except in the case of junk science, especially neoclassical economics, which definitely has right-wing backers.
An important point is that the impact of anti-intellectualism is much greater today than in the 1800s, when science and medicine had much less to offer.
Other candidates should also be considered for blame: the growth of particularly strong anti-intellectualism among inner-city African-American youth; endless self-promoting junk-science "research" from other sources (e.g., drug companies and various "diet gurus"); the elevation of race-and-gender-based reverse discrimination (the ouster of Larry Summers is one recent example); the growth of "political correctness"; truth-twisting by politicians; misleading and overly simplistic books and articles (especially stock-market casino promotions, e.g., inferring causation from correlation); and the media's minimal efforts at investigative journalism.
Jacoby also fails to note that the average citizen's aversion to knowledge and rationalism can at least be partially explained. After all, who wants more work after eight-plus hours on the job, fighting traffic, preparing and eating breakfast and dinner, PLUS taking care of the children and other family matters?
Further, separating junk science from the real thing requires considerable subject matter knowledge.
As for politics, the situation is more complex, and intellectuals in politics might be more dangerous than useful. In a way, this might be the only area where some level of anti-intellectualism makes sense. Too many intellectuals are political radicals, often with a negative bias that treats involvement as a waste of time: "nothing changes," "they all lie," and "only big donors have input."
On the other hand, it is troubling to see how readily misinformed Americans acquiesce to non-thinking ideology and major misdirections in American governance. And it hurts to see those pathetic performances on "Are You Smarter Than a Fifth-Grader," and how U.S. pupils fare against their foreign peers.
It might make sense to make it academically much more difficult to enter college, as it is in Asia, and to add a requirement for understanding logic and statistical reasoning. Finally, lawmakers should demand that PBS provide additional documentaries, and encourage ABC, CBS, NBC, etc. to do likewise.
The whole 'equality of opportunity' is just BS thrown from the Right to cover the looting of society by the Right's wealthy and corporate sponsors, and to paralyze their critics on the Left.
Look at Scott Walker and the Wisconsin legislature: $250 million cut from the state budget for the University of Wisconsin, whose activities help support *Equality of Opportunity,* going instead to subsidize 'Inequality of Outcome:'
A sports arena whose profits will be harvested by billionaires. And what else does the Right say? "Oh, now the university will be more efficient," and "Those billionaires will go elsewhere if we don't give them enough money."
The fact is, while the sports arena does provide entertainment 'value,' it is not a factory, and produces nothing of substance, nothing of lasting wealth for society. It is a tax on the resources of society. The university, however, is a social investment: It is an investment in its students, and the knowledge base of society, both of which provide society with lasting wealth and income. Scott Walker is not even an agent of capitalism. He is an agent of looters, and Wisconsin would be better off if those looters went elsewhere.
Anybody who just *listens* to what the Right says, and does not watch what they actually do, is a danger to themselves and others.
anne said in reply to anne...
http://www.voxeu.org/article/inequality-opportunity-policy-construct
June 10, 2015
Inequality of opportunity: Useful policy construct or will o' the wisp?
By Ravi Kanbur and Adam Wagstaff
Reducing inequality of opportunity, rather than inequality of outcome, is often heralded as an appropriate target for policy. This column explores the challenges of identifying inequality of opportunity. Disentangling how effort and circumstance contribute to outcomes is difficult, and this leads to a tendency to underestimate inequality of opportunity.
This lends support for generalised social protection measures in dimensions such as income, health and education, irrespective of whether the outcomes can be specifically attributed to circumstance or to effort.
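A toy illustration of that attribution problem (my own sketch, not from the Kanbur-Wagstaff column; the model and all parameter values are assumptions for illustration): if "effort" is itself partly shaped by circumstance, then crediting the overlapping variation to effort understates the share of outcome inequality that is due to circumstance.

```python
# Minimal Monte Carlo sketch of the effort-vs-circumstance attribution problem.
# Hypothetical model: effort is partly inherited from circumstance, so a
# "naive" decomposition that counts only the direct circumstance term
# understates inequality of opportunity.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

circumstance = rng.normal(0.0, 1.0, n)                  # e.g., parental background
effort = 0.5 * circumstance + rng.normal(0.0, 1.0, n)   # effort partly shaped by circumstance
outcome = circumstance + effort + rng.normal(0.0, 0.5, n)

# Naive view: only the direct circumstance term counts as "unfair" inequality.
naive_share = np.var(circumstance) / np.var(outcome)

# Fuller view: circumstance also works through effort (total effect = 1.5 * c).
total_share = np.var(1.5 * circumstance) / np.var(outcome)

print(f"naive circumstance share of outcome variance: {naive_share:.2f}")   # ~0.29
print(f"total circumstance share of outcome variance: {total_share:.2f}")   # ~0.64
```

Under these made-up parameters the naive decomposition attributes roughly 29% of outcome variance to circumstance, while tracing circumstance through effort attributes roughly 64% — the direction of underestimation the column describes, and one reason blanket social protection may be safer than fine-grained attribution.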
pgl said...
Brad DeLong and Tom Davis discuss US macroeconomic policy on Bloomberg TV:
"Brad DeLong: Well, I don't think it is just Democrats who would like to see more spending. Back in the 1970s Milton Friedman looked back at the Great Depression. He talked about what his teachers had recommended as policies and what he would have advocated in the Great Depression. He called for, in situations like that, and, I think, in situations like this, for coordinated monetary and fiscal expansion. With interest rates at their extraordinarily low levels, now, as in the 1930s, is a once-in-a-century opportunity to pull all the infrastructure spending we will be doing over the next generation forward in time and do it over the next five years, when the government can finance it at such extraordinarily good terms.
Matt Miller: We have a national infrastructure crisis, right? Roads and bridges, ports and airports are at levels that are critical and certainly not worthy of a first-world country. Tom, don't you agree we need to fix that up quickly?
Tom Davis: I agree with that. Look, I think that with the stimulus package that was passed in 2009 they blew an opportunity to do more for infrastructure. We should have had something to show at the end of that. With the money, maybe we got a short-term stimulus, but we should have gotten something long-term.
Brad DeLong: They had to get it through with only Democratic votes. Why weren't there any Republicans willing to deal? We could have gotten a larger and much better-crafted program."
An excellent discussion which I hope JohnH paid attention to.
Lafayette said in reply to pgl...
{Why weren't there any Republicans willing to deal?}
They did, with the first go-round (ARRA), which spent close to $850B to stimulate the economy.
The second time around, when it became obvious that ARRA had worked (at least arresting the skyrocketing unemployment rate at 10%) and that an ARRA2 was necessary to actually reduce unemployment, the Replicants spewed their nonsensical Austerity Budgeting palaver.
Why? Wickedly, they strategized that high unemployment would be useful in helping to elect the Replicant candidate in the 2012 presidential elections. So, as a consequence, more Americans had to suffer in the slow crawl out from the Great Recession ...
Chris Herbert said...
Keynes counseled that the only metric one should use in determining what policy lever to use in a recession or depression is employment. If the lever raised employment, use it. If not, move on.
FDR said the nation needed to 'experiment' in order to find ways out of the Great Depression. Politically, he was allowed the freedom to do so, particularly in his first term. But into his second term, conservatives pushed back sufficiently well to stall his programs. Only temporarily, as it turned out, because the cutback in spending turned into another recession.
FDR also appreciated an important point. He called it 'peace,' as in peace of mind. He called his opponents the 'enemies of peace.' Today I think people would understand that FDR was identifying programs that reduced 'anxiety' amongst the general population: health care, minimum wages, jobs. He moved heaven and earth to relieve anxiety amongst the general population. And for the most part it worked quite well.
But it faced persistent opposition from the wealthy class, which protects its capital assets above all else. Unfortunately, in the United States of Amnesia, working men and women don't appreciate that their financial improvement can come only at the expense of the wealthy. Class warfare? Absolutely. And believe you me, the wealthy understand this right down to their trust funds.
Lafayette said in reply to Lafayette...
Any sense of a Social Democracy, with equal opportunity, means that whatever is essential to that opportunity (like education) is or should be a Public Service.
And if, indeed, education/training is an essential Public Service, then it should be provided in a manner that allows all citizens of a nation to access it uniformly.
Which means (to overburden the word) Educational Opportunity should be subsidized by Federal Funding, and be as close to free, gratis and for nothing as is humanly possible.
(Ahem - this is what Europe does.)
My point 1: Education does not guarantee either a job or a level of career success, but it almost certainly is an important determinant. The other one is ONE HELLUVA LOTTA LUCK - often consisting simply of being at the right place at the right time.
My point 2: Achievement in America is overly dependent upon financial success as a barometer. Which is why the listing of billionaires is so often reported, or at least insinuated, in the news. Picasso was never a billionaire, but was a great success in his chosen profession. Mozart as well, though he was buried in a pauper's grave ...
DrDick said...
The reality is that there is not, and cannot be, real equality of opportunity when there is pre-existing massive inequality of outcomes. Those from wealthier and higher status families have far more and better opportunities than those from poor and low status families. As George and Jeb Bush illustrate, as well as Donald Trump, privilege confers success even on the totally incompetent.
Sam said...
One half of Americans, workers and students, are below median IQ. These are the ones to be concerned about. This is a circumstance that higher education cannot correct.
Apr 18, 2015 | science.slashdot.org
timothy on Saturday April 18, 2015 @11:22AM
HughPickens.com writes that David Robson has an interesting article at the BBC on the relationship between high intelligence and happiness. "We tend to think of geniuses as being plagued by existential angst, frustration, and loneliness," writes Robson. "Think of Virginia Woolf, Alan Turing, or Lisa Simpson – lone stars, isolated even as they burn their brightest." As Ernest Hemingway wrote: "Happiness in intelligent people is the rarest thing I know."
The first steps to studying the question were taken in 1926, when psychologist Lewis Terman decided to identify and study a group of gifted children. Terman selected 1,500 pupils with an IQ of 140 or more – 80 of whom had IQs above 170. Together, they became known as the "Termites", and the highs and lows of their lives are still being studied to this day. "As you might expect, many of the Termites did achieve wealth and fame – most notably Jess Oppenheimer, the writer of the classic 1950s sitcom I Love Lucy. Indeed, by the time his series aired on CBS, the Termites' average salary was twice that of the average white-collar job. But not all the group met Terman's expectations – there were many who pursued more "humble" professions such as police officers, seafarers, and typists.
For this reason, Terman concluded that "intellect and achievement are far from perfectly correlated". Nor did their smarts endow personal happiness. Over the course of their lives, levels of divorce, alcoholism and suicide were about the same as the national average."
According to Robson, one possibility is that knowledge of your talents becomes something of a ball and chain. During the 1990s, the surviving Termites were asked to look back at the events in their 80-year lifespan. Rather than basking in their successes, many reported that they had been plagued by the sense that they had somehow failed to live up to their youthful expectations (PDF).
Bo'Bob'O (95398) on Saturday April 18, 2015 @11:41AM (#49500291)
The third factor (Score:5, Insightful)
I surely wouldn't qualify as one of the "termites" in the study, but there are still things in my life that I take to quickly. There is a third metric that I have come to respect even more: motivation and inspiration.
There is a big difference between having the ability to do something, having the need to do something, and having a want and drive to do something. That last one seems to get people much further than being at the very top in intelligence. It also provides a framework of interaction and social connection between peers, if it is truly a passion.
So maybe it takes being the best and brightest to be first chair violinist in a prestigious symphony, but being brilliant alone won't get you there. Meanwhile, hundreds of others make a long and successful career out of their perseverance.
radtea (464814) on Saturday April 18, 2015 @11:57AM (#49500359)
Re:The third factor (Score:5, Interesting)
You've likely encountered this quote, but it bears repeating:
Nothing in the world can take the place of Persistence. Talent will not; nothing is more common than unsuccessful men with talent. Genius will not; unrewarded genius is almost a proverb. Education will not; the world is full of educated derelicts. Persistence and determination alone are omnipotent. The slogan 'Press On' has solved and always will solve the problems of the human race. -- Calvin Coolidge, 30th president of US (1872 - 1933)
E-Rock (84950) on Saturday April 18, 2015 @01:39PM (#49500767) Homepage
Re:Persistence is not omnipotent. (Score:5, Insightful)
Persistence doesn't mean trying the same thing over and over until it works. Persistence is trying to achieve your goals over and over again until you're successful.
So you might bang your head on the wall a few times, realize that won't work and then try different things until you break it down.
NixieBunny (859050) on Saturday April 18, 2015 @12:08PM (#49500403) Homepage
Re:The third factor (Score:4, Interesting)
Happiness has a lot to do with attitude. I find that being generally happy is easy if you use your abilities to put yourself into situations that make you happy. I used to work for a place that got to be more and more like Dilbert.
Instead of drowning in it, I broke loose and made a new life, using my brains to create interesting, fun things. I found part-time work in the sciences, and have extra time to make wacky inventions and volunteer with kids, teaching them how to do similar things.
I am careful to take on projects only if they are likely to make me happier. The latest was building the red telephone for this [rollingstone.com]...
Bengie (1121981) on Saturday April 18, 2015 @05:37PM (#49501635)
Re:The third factor (Score:2)
If you have no peers, you can get lonely and no amount of attitude can completely help a human who is lonely.
lkcl (517947) <[email protected]> on Saturday April 18, 2015 @11:42AM (#49500295) Homepage
Read "Outliers" (Score:5, Informative)
this is nothing new: i believe the same study was the basis of the famous book "Outliers", which is a fascinating study of what makes people successful. if i recall correctly, it's completely the opposite of what people expect: your genes *do* matter. your attitude *does* matter. your circumstances *do* matter. working hard *does* matter. and luck matters as well. but it's all of these things - luck, genetics, circumstances *and* hard work - that make for the ultimate success story. bill gates is one of the stories described. he had luck and opportunity - by being born at just the right time when personal computing was beginning - and circumstances - by going to one of the very very few schools in the USA that actually had a computer available (for me, that opportunity was when i was 8: i went to one of the very very few secondary schools in the UK that had a computer: a Pet 3032).
so, yeah - it's not a very popular view, particularly in the USA, as it goes against the whole "anyone can make it big" concept. but, put simply, the statistics show that it's a combination of a whole *range* of factors, all of which contribute, that make up success. just "being intelligent" simply is not enough.
drinkypoo (153816) <[email protected]> on Saturday April 18, 2015 @02:27PM (#49500967) Homepage Journal
Re:Read "Outliers" (Score:4, Insightful)
bill gates is one of the stories described. he had luck and opportunity - by being born at just the right time when personal computing was beginning - and circumstances - by going to one of the very very few schools in the USA that actually had a computer available
Yes, and by having rich parents. That is the single most reliable predictor of economic success. As such, it is anything but surprising that Gates was successful.
PeterM from Berkeley (15510) <[email protected]> on Saturday April 18, 2015 @11:44AM (#49500299) Journal
Scientific American begs to differ (Score:3)
Some ten or fifteen years ago, Scientific American published an article about the positive correlation of "general intelligence" with virtually every measure of success in life.
Like earning enough money to be comfortable, having the emotional intelligence to have a successful marriage, etc.
They showed that "general intelligence," which is correlated with but not directly measured by things like SAT scores, was basically a ticket to (or highly correlated with) a good life, and even good health.
And the article was mighty persuasive.
--PeterM
the_skywise (189793) on Saturday April 18, 2015 @12:03PM (#49500387)
The problem isn't intelligence - per se (Score:5, Insightful)
(See? I used per se, so I'm... oh never mind...)
Intelligence and being highly observant are great skills both in society and from an evolutionary/survivalist standpoint.
But in a society I've found it brings up two downsides:
Guilt, because your intelligence allows you to avoid pain or achieve a higher level of comfort in society. You weren't "superman"; you just made rational choices based upon your understanding of how the system works, and now your friends and family are suffering because they didn't. You want to help them, which requires more energy and effort, or you can't, which means your intelligence has limits and all you can do is watch them suffer.
Stress and anxiety. Once you figure out that you can problem-solve and improve your quality of life, it's natural, like any athlete, to grow and push your boundaries. But intellectual pursuits aren't as cut and dried as physical ones. It's easy to know that you can only bench press 200 lbs and that's what you need to work on; less so when you're trying to solve problems like familial and social discord but nobody will listen, or trying to improve your company's fortunes by making proper investment choices. More to the point, I'm an engineer, and there's nothing more frustrating than trying to solve a problem you've encountered with a design that YOU pushed for, when you can't figure out why it's not working, it might not work AT ALL, and the boss is breathing down your neck (oh, and the company is on the line). There are plenty of days I've driven by a building crew and daydreamed about just running the earth mover or driving a dump truck.
In an agrarian society - in a pre-industrialized world - these issues just didn't come up for intellectuals, partially because intellect wasn't as much of a survival skill. (And that's probably why steampunk is so romanticized today.)
reboot246 (623534) on Saturday April 18, 2015 @07:29PM (#49502135) Homepage
This may be why (Score:5, Interesting)
The danger when you have the intelligence to do anything you want to do in life is doing nothing. You hesitate to focus narrowly on one field of study because that means you'll have less time for all the others.
I won't say what my IQ is, but it's up there. My grades, especially in science courses, were practically perfect. People were expecting me to go into all kinds of careers, including medicine, chemistry, physics, computer science, etc. But I'm interested in everything! Always have been. I chose a career that didn't need much thought so I could keep up with what was happening in science and technology. It's worked. How many 62-year-olds do you know who build their own computers? Or just bought two new microscopes? Or diagnose their own problems before going to the doctor?
I know a lot of successful people. Most of them have very little time for fishing, hunting, camping, going to ball games, watching television, listening to music, playing with the children & grandchildren, or working in the garden. I have all the time in the world to enjoy life. Isn't that what it's all about?
Anonymous Coward on Saturday April 18, 2015 @01:12PM (#49500661)
Re:*Grabs a bowl of popcorn* (Score:5, Interesting)
I do not know if I qualify as a genius, but I would like to think I am above average in intelligence. I topped my undergraduate class in engineering, scored a near-perfect score on the GRE (2380/2400, back when it actually included an analytical section with puzzles), and was a graduate student in quantum computing at a top school.
I subsequently dropped out because I realized two things:
- Most of my classmates were really good at the subject (e.g., people who won International Math and Physics Olympiads). They started their PhDs at a really young age and were almost bored by the coursework. Homework that I would spend a Saturday doing was completed by these bored teenagers while still in class.
- Most of them really loved the subject (i.e., people who loved doing physics at the expense of all else, such as dating, money, or having a social life). Or the subject was so easy that they had the time to pursue other things.
I realized I neither loved physics unconditionally nor was I good enough at it to warrant the pursuit of a PhD, not to mention the subsequent post doc and so on. All this happened at the same time that I fell in love with my now-wife, started a company, and subsequently got into management consulting to make money instead.
I do not mean to present this as a dichotomy (i.e., that doing a PhD is mutually exclusive with making money or having a social life), but in my experience, the biggest sacrifice was watching classmates who were relatively mediocre (in my opinion) get "business" degrees and do exceedingly well in life in terms of money and relationships.
Most of my cohort completed their PhDs and now have very successful academic careers. I still love math, theoretical physics, and computer science. I keep myself apprised of most of the publications in the field, and occasionally, write a paper or two myself, and I certainly miss the challenge of advanced math and physics. I still envy my peers, and I am sure some of them envy me.
But now being in an unhappy relationship, being a parent, having the burdens of a pointless life (the hardest thing I do is a spreadsheet that just helps some fool company make millions of dollars), I question my past choices. So much possibility lay ahead of me, and I gave it all up for what? For a few bucks, beers, and a few lays?
I'm probably considered successful by the measure of the quintessential American dream -- by ~30, I was a rising star at a top management consulting firm, had over 7 figures to my name, owned a large home in one of the best neighborhoods in Boston, and had a beautiful wife and son. I drove expensive cars, wore bespoke suits and expensive watches, spent time mountaineering in the Alps and the Himalayas, and traveled the world. But still, I always felt that I had missed something. That I will never come ahead of time. That no matter how successful I become in life, I will probably never have a theorem named after me or spend my days basking in the beauty of math.
No amount of sex or expensive liquor or material goods can equal the joy of just proving a theorem. I will forever have this knowledge: that I could have been more, and chose less. My life now reminds me of a Pink Floyd lyric -- "Did you exchange a walk-on part in the war for a lead role in a cage?"
justthinkit (954982) <[email protected]> on Saturday April 18, 2015 @07:51PM (#49502227) Homepage Journal
Here is what you are missing (Score:3)
Here is what you are missing -- helping others.
Most of the activities of my life have been trivially easy for decades. Helping others remains challenging.
If you really are "so smart", you are able to see what a disaster this world is today. Well, get busy changing it. You will be up against the most powerful, greedy, selfish & moneyed people on the face of the Earth. Challenge enough for me. What about you?
Spugglefink (1041680) on Saturday April 18, 2015 @05:35PM (#49501619)
Re:The biggest problem: the "long view" (Score:5, Interesting)
I can relate to that. People who live more in the moment are happier, because the long view always involves decline, death, and dying. I'm petting and really enjoying my dog, and somewhere I'm thinking how I might have another eight years before I have a 120 pound problem who is pissing and shitting huge logs everywhere, who is going to be a royal bitch to dig a hole for one day. I'm having sex with my wife, and somewhere I'm thinking how much it's going to suck looking at her when she's 80. The big picture long view always seems to have a down side, and it's depressing.
I can relate to the expectations thing too. Everybody looks up to you, and a lot of them are jealous of you, and it makes it that much harder to choose an ordinary life. I'm a truck driver, and I like my profession fine, but I constantly feel a need to apologize for not owning the trucking company or being a professor or something; for not aiming higher in general. I've found a lot of people don't like me, because they don't think they're good enough for me for some reason, and yet I feel the same toward them. I'd love to just be normal, and not have to think so much about everything. Too much knowledge can be crippling, instead of helpful. It's hard to invest in a business idea, knowing every conceivable way it might fail, and what all the odds are.
My mother was even more intelligent than I am, and she died young, of alcoholism. She was a miserable woman.
Intelligence is overrated. One side effect for me is that I can never enjoy the opiate of a nice handy sky daddy to make me feel less infinitesimal in the scheme of things. We evolved to see sky daddies in everything, and I have the same need in my brain as any other human, but there's nothing to plug into it. I haven't found the religion yet that wasn't just totally inconsistent and goofy.
captjc (453680) on Saturday April 18, 2015 @10:06PM (#49502711)
Re:The biggest problem: the "long view" (Score:3)
That has nothing to do with intelligence and everything to do with outlook and perspective. Let's just say I'm a pretty smart guy, and the best piece of advice I was ever given was to focus on the now. It is easy to foresee problems and possible scenarios, and it is good to take measures to prevent the obvious. However, the sooner you realize that shit happens that you will never be able to plan for, or that there are various inevitable outcomes that will be sad and painful and that you simply will not want to deal with, the sooner you will realize that there is just no point in worrying about them.
It has almost become a catchphrase for me, "Cross that bridge when you get to it." Focus on what can be dealt with now. Try to keep yourself in the best possible situation that you can and don't worry about what is around the corner until it is within sight to actually deal with it. Friends will come and go, loved ones will leave you, cars and tools will fail you when you need them the most, at some point your job will end, and eventually you will die. These are simple truths of life but if you spend even a second worrying about any of them before there is anything you can do about them, it is purely wasted energy that could be put to use tackling the problems that you do have.
I'm not saying it is easy to change the way you look at the world. It can take some work if not serious effort and it is easy to let yourself fall into ruts of depression and self-loathing. I know, I was there. That is nothing but perverse mental masturbation that does nothing but waste your energy and destroy what little happiness you can achieve. If you can learn to refocus yourself to only what you can affect, the happier and more productive you will become.
marknesop.wordpress.com
et Al, April 3, 2015 at 3:02 pm
Slashdot: Google 'Makes People Think They Are Smarter Than They Are'
http://search.slashdot.org/story/15/04/02/178220/google-makes-people-think-they-are-smarter-than-they-are
Karen Knapton reports at The Telegraph that, according to a study at Yale University, search engines like Google or Yahoo make people think they are smarter than they actually are because they have the world's knowledge at their fingertips, giving people a 'widely inaccurate' view of their own intelligence that can lead to over-confidence when making decisions. In a series of experiments, participants who had searched for information on the internet believed they were far more knowledgeable about a subject than those who had learned by normal routes, such as reading a book or talking to a tutor. Internet users also believed their brains were sharper….
#### This is never more obvious than in the retarded comments you read in the Pork Pie News Networks. It is one thing to look up a 'fact,' but to understand it within context and its limitations, and not stretch it way beyond reasonable interpretation to fit your argument, takes it into altogether different territory.
I think the good news is that the Internet is still quite young and people are learning that a) the first answer you find may not be true; and b) it helps to do more research if you can be bothered. It's not hard to differentiate between the political bs'ers and the properly curious.
The best thing, I think, is that we are also learning to ask the right questions in the right way. Most of us can now spot obfuscation through deliberately complicated answers (a technique often used by people who think they are clever) and are starting to spot what isn't there, or what isn't said, simply through logic and following the process or steps that should lead to a logical conclusion. If that is not done or followed, or it points to some other conclusion, then red flags (I don't mean communist ones!) should go up that something is not quite kosher and it should be treated with care. Still, it's early days.
kirill, April 3, 2015 at 3:06 pm
People are brainwashed from birth to believe that knowledge of facts is the same as intelligence. I have seen this trope in numerous TV shows and movies. It is total rubbish. People spend years at university and in post-doctoral studies engaged in problem solving. No amount of Google searches is going to teach internet Einsteins that skill.
et Al, April 3, 2015 at 3:23 pm
I can't be as pessimistic as you. Yes, brainwashing does start very early, but this is just the beginning of a brave new world (if we don't become nuclear toast first) and the new industrial revolution has only just started. The field is wide open and old actors will be turfed out or overturned by the new and hungry.
If the turdification of higher education continues in certain countries, then those countries are simply hollowing out themselves from the inside. They simply will not be able to find sufficient numbers of competent people to maintain what they have.
It is one of the many reasons that I am for free education and unlimited free (or at least heavily subsidized) return to education and retraining until you pop your clogs. In fact, I think it is essential if we are going to live longer and more productive lives. If the state (us) fund it, then we all benefit from it over the long term. So far Western countries have been able to attract some of the best foreign talent from other countries and benefit from it, but the rest of the world is catching up fast.
March 18, 2015 | The American Conservative
A phony populism is denying Americans the joys of serious thought.
... ... ...
Universities, too, were at fault. They had colonized critics by holding careers hostage to academic specialization, requiring them to master the arcane tongues of ever-narrower disciplines, forcing them to forsake a larger public. Compared to the Arcadian past, the present, in this view, was a wasteland.
It didn't have to be this way. In the postwar era, a vast project of cultural uplift sought to bring the best that had been thought and said to the wider public. Robert M. Hutchins of the University of Chicago and Mortimer J. Adler were among its more prominent avatars. This effort, which tried to deepen literacy under the sign of the "middlebrow," and thus to strengthen the idea that an informed citizenry was indispensable for a healthy democracy, was, for a time, hugely successful. The general level of cultural sophistication rose as a growing middle class shed its provincialism in exchange for a certain worldliness that was one legacy of American triumphalism and ambition after World War II. College enrollment boomed, and the percentage of Americans attending the performing arts rose dramatically. Regional stage and opera companies blossomed, new concert halls were built, and interest in the arts was widespread. TV hosts Steve Allen, Johnny Carson, and Dick Cavett frequently featured serious writers as guests. Paperback publishers made classic works of history, literature, and criticism available to ordinary readers whose appetite for such works seemed insatiable.
Mass circulation newspapers and magazines, too, expanded their coverage of books, movies, music, dance, and theater. Criticism was no longer confined to such small but influential journals of opinion as Partisan Review, The Nation, and The New Republic. Esquire embraced the irascible Dwight Macdonald as its movie critic, despite his well-known contempt for "middlebrow" culture. The New Yorker threw a lifeline to Pauline Kael, rescuing her from the ghetto of film quarterlies and the art houses of Berkeley. Strong critics like David Riesman, Daniel Bell, and Leslie Fiedler, among others, would write with insight and pugilistic zeal books that often found enough readers to propel their works onto bestseller lists. Intellectuals such as Susan Sontag were featured in the glossy pages of magazines like Vogue. Her controversial "Notes on Camp," first published in 1964 in Partisan Review, exploded into public view when Time championed her work. Eggheads were suddenly sexy, almost on a par with star athletes and Hollywood celebrities. Gore Vidal was a regular on Johnny Carson. William F. Buckley Jr.'s "Firing Line" hosted vigorous debates that often were models of how to think, how to argue, and, at their best, told us that ideas mattered.
As Scott Timberg, a former arts reporter for the Los Angeles Times, puts it in his recent book Culture Crash: The Killing of the Creative Class, the idea, embraced by increasing numbers of Americans, was that drama, poetry, music, and art were not just a way to pass the time, or advertise one's might, but a path to truth and enlightenment. At its best, this was what the middlebrow consensus promised. Middlebrow said that culture was accessible to a wide strat[um] of society, that people needed some but not much training to appreciate it, that there was a canon worth knowing, that art was not the same as entertainment, that the study of the liberal arts deepens you, and that those who make, assess, and disseminate the arts were somehow valuable for our society regardless of their impact on GDP.
So what if culture was increasingly just another product to be bought and sold, used and discarded, like so many tubes of toothpaste? Even Los Angeles, long derided as a cultural desert, would by the turn of the century boast a flourishing and internationally respected opera company, a thriving archipelago of museums with world-class collections, and dozens of bookstores selling in some years more books per capita than were sold in the greater New York area. The middlebrow's triumph was all but assured.
The arrival of the Internet by century's end promised to make that victory complete. As the Wall Street Journal reported in a front-page story in 1998, America was "increasingly wealthy, worldly, and wired." Notions of elitism and snobbery seemed to be collapsing upon the palpable catholicity of a public whose curiosities were ever more diverse and eclectic and whose ability to satisfy them had suddenly and miraculously expanded. We stood, it appeared, on the verge of a munificent new world-a world in which technology was rapidly democratizing the means of cultural production while providing an easy way for millions of ordinary citizens, previously excluded from the precincts of the higher conversation, to join the dialogue. The digital revolution was predicted to empower those authors whose writings had been marginalized, shut out of mainstream publishing, to overthrow the old monastic self-selecting order of cultural gatekeepers (meaning professional critics). Thus would critical faculties be sharpened and democratized. Digital platforms would crack open the cloistered and solipsistic world of academe, bypass the old presses and performing-arts spaces, and unleash a new era of cultural commerce. With smart machines there would be smarter people.
Harvard's Robert Darnton, a sober and learned historian of reading and the book, agreed. He argued that the implications for writing and reading, for publishing and bookselling-indeed, for cultural literacy and criticism itself-were profound. For, as he gushed in The Case for Books: Past, Present, and Future, we now had the ability to make "all book learning available to all people, or at least those privileged enough to have access to the World Wide Web. It promises to be the ultimate stage in the democratization of knowledge set in motion by the invention of writing, the codex, movable type, and the Internet." In this view, echoed by innumerable worshippers of the New Information Age, we were living at one of history's hinge moments, a great evolutionary leap in the human mind. And, in truth, it was hard not to believe that we had arrived at the apotheosis of our culture. Never before in history had more good literature and cultural works been available at such low cost to so many. The future was radiant.
Others, such as the critics Evgeny Morozov and Jaron Lanier, were more skeptical. They worried that whatever advantages might accrue to consumers and the culture at large from the emergence of such behemoths as Amazon, not only would proven methods of cultural production and distribution be made obsolete, but we were in danger of being enrolled, whether we liked it or not, in an overwhelmingly fast and visually furious culture that, as numerous studies have shown, renders serious reading and cultural criticism increasingly irrelevant, hollowing out habits of attention indispensable for absorbing long-form narrative and sustained argument. Indeed, they feared that the digital tsunami now engulfing us may even signal an irrevocable trivialization of the word. Or, at the least, a sense that the enterprise of making distinctions between bad, good, and best was a mug's game that had no place in a democracy that worships at the altar of mass appeal and counts its receipts at the almighty box office.
... ... ...
...Today, America's traditional organs of popular criticism-newspapers, magazines, journals of opinion-have been all but overwhelmed by the digital onslaught: their circulations plummeting, their confidence eroded, their survival in doubt. Newspaper review sections in particular have suffered: jobs have been slashed, and cultural coverage vastly diminished. Both the Los Angeles Times and the Washington Post have abandoned their stand-alone book sections, leaving the New York Times as the only major American newspaper still publishing a significant separate section devoted to reviewing books.
Such sections, of course, were always few. Only a handful of America's papers ever deemed book coverage important enough to dedicate an entire Sunday section to it. Now even that handful is threatened with extinction, and thus is a widespread cultural illiteracy abetted, for at their best the editors of those sections tried to establish the idea that serious criticism was possible in a mass culture. In the 19th century, Margaret Fuller, literary editor of the New York Tribune and the country's first full-time book reviewer, understood this well. She saw books as "a medium for viewing all humanity, a core around which all knowledge, all experience, all science, all the ideal as well as all the practical in our nature could gather." She sought, she said, to tell "the whole truth, as well as nothing but the truth."
The arrival of the Internet has proved no panacea. The vast canvas afforded by the Internet has done little to encourage thoughtful and serious criticism. Mostly it has provided a vast Democracy Wall on which any crackpot can post his or her manifesto. Bloggers bloviate and insults abound. Discourse coarsens. Information is abundant, wisdom scarce. It is a striking irony, as Leon Wieseltier has noted, that with the arrival of the Internet, "a medium of communication with no limitations of physical space, everything on it has to be in six hundred words." The Internet, he said, is the first means of communication invented by humankind that privileges one's first thoughts as one's best thoughts. And he rightly observed that if "value is a function of scarcity," then "what is most scarce in our culture is long, thoughtful, patient, deliberate analysis of questions that do not have obvious or easy answers." Time is required to think through difficult questions. Patience is a condition of genuine intellection. The thinking mind, the creating mind, said Wieseltier, should not be rushed. "And where the mind is rushed and made frenetic, neither thought nor creativity will ensue. What you will most likely get is conformity and banality. Writing is not typed talking."
The fundamental idea at stake in the criticism of culture generally is the self-image of society: how it reasons with itself, describes itself, imagines itself. Nothing in the excitements made possible by the digital revolution banishes the need for the rigor such self-reckoning requires. It is, as Wieseltier says, the obligation of cultural criticism to bear down on what matters.
♦♦♦
Where is such criticism to be found today? We inhabit a remarkably arid cultural landscape, especially when compared with the ambitions of postwar America, ambitions which, to be sure, were often mocked by some of the country's more prominent intellectuals. Yes, Dwight Macdonald famously excoriated the enfeeblements of "mass cult and midcult," and Irving Howe regretted "This Age of Conformity," but from today's perspective, when we look back at the offerings of the Book-of-the-Month Club and projects such as the Great Books of the Western World, their scorn looks misplaced. The fact that their complaints circulated widely in the very midcult worlds Macdonald condemned was proof that trenchant criticism had found a place within the organs of mass culture. One is almost tempted to say that the middlebrow culture of yesteryear was a high-water mark.
The reality, of course, was never as rosy as much of it looks in retrospect. Cultural criticism in most American newspapers, even at its best, was almost always confined to a ghetto. You were lucky at most papers to get a column or a half-page devoted to arts and culture. Editors encouraged reporters, reviewers, and critics to win readers and improve circulation by pandering to the faux populism of the marketplace. Only the review that might immediately be understood by the greatest number of readers would be permitted to see the light of day. Anything else smacked of "elitism"-a sin to be avoided at almost any cost.
This was a coarse and pernicious notion, one that lay at the center of the country's longstanding anti-intellectual tradition. From the start of the republic, Americans have had a profoundly ambivalent relationship to class and culture, as Richard Hofstadter famously observed. He was neither the first nor the last to notice this self-inflicted wound. As even the vastly popular science-fiction writer Isaac Asimov understood, "Anti-intellectualism has been a constant thread winding its way through our political and cultural life, nurtured by the false notion that democracy means that 'my ignorance is just as good as your knowledge.'"
... ... ...
When did "difficulty" become suspect in American culture, widely derided as anti-democratic and contemptuously dismissed as evidence of so-called elitism? If a work of art isn't somehow immediately "understood" or "accessible" by and to large numbers of people, it is often ridiculed as "esoteric," "obtuse," or even somehow un-American. We should mark such an argument's cognitive consequences. A culture filled with smooth and familiar consumptions produces in people rigid mental habits and stultified conceptions. They know what they know, and they expect to find it reinforced when they turn a page or click on a screen. Difficulty annoys them, and, having become accustomed to so much pabulum served up by a pandering and invertebrate media, they experience difficulty not just as "difficult," but as insult. Struggling to understand, say, Faulkner's stream-of-consciousness masterpiece The Sound and the Fury or Alain Resnais's Rubik's Cube of a movie "Last Year at Marienbad" needn't be done. The mind may skip trying to solve such cognitive puzzles, even though the truth is they strengthen it as a workout tones the muscles.
Sometimes it feels as if the world is divided into two classes: one very large class spurns difficulty, while the other very much smaller delights in it. There are readers who, when encountering an unfamiliar word, instead of reaching for a dictionary, choose to regard it as a sign of the author's contempt or pretension, a deliberate refusal to speak in a language ordinary people can understand. Others, encountering the same word, happily seize on it as a chance to learn something new, to broaden their horizons. They eagerly seek a literature that upends assumptions, challenges prejudices, turns them inside out and forces them to see the world through new eyes.
The second group is an endangered species. One reason is that the ambitions of mainstream media that, however fitfully, once sought to expose them to the life of the mind and to the contest of ideas, have themselves shrunk. We have gone from the heyday of television intellection which boasted shows hosted by, among others, David Susskind and David Frost, men that, whatever their self-absorptions, were nonetheless possessed of an admirable highmindedness, to the pygmy sound-bite rants of Sean Hannity and the inanities of clowns like Stephen Colbert. Once upon a time, the ideal of seriousness may not have been a common one, but it was acknowledged as one worth striving for. It didn't have to do what it has to today, that is, fight for respect, legitimate itself before asserting itself. The class that is allergic to difficulty now feels justified in condemning the other as "elitist" and anti-democratic. The exercise of cultural authority and artistic or literary or aesthetic discrimination is seen as evidence of snobbery, entitlement and privilege lording it over ordinary folks. A perverse populism increasingly deforms our culture, consigning some works of art to a realm somehow more rarified and less accessible to a broad public. Thus is choice constrained and the tyranny of mass appeal deepened in the name of democracy.
... ... ...
Steve Wasserman, former literary editor of the Los Angeles Times, is editor-at-large for Yale University Press.
This essay is adapted with permission from his chapter in the forthcoming The State of the American Mind: Sixteen Critics on the New Anti-Intellectualism, edited by Adam Bellow and Mark Bauerlein, to be published by Templeton Press in May 2015.
Sep 24, 2012 | stumblingandmumbling.typepad.com
Chris Skidmore, one of the authors of Britannia Unchained, says:
People aren't interested in looking at medians and graphs. We have a duty to try and broaden that message outside of the think tank zone.
I don't know what to make of this. It could be that Skidmore is recommending that politicians use social science in the way Paul Krugman urges economists to use maths - you base your policy upon it, but then find a way of advocating the policy in more populist language.
Sadly, though, it is not at all obvious that Britannia Unchained's authors are using this reasonable approach. They seem instead to have skipped the science and evidence and gone straight to the populism.
This suggests an unkinder interpretation - that Skidmore thinks formal science has no place in politics. What matters is what sells, not what's right.
The problem here is that there is no strong obstacle to this descent into post-modern politics. The anti-scientific culture of our mainstream media means they will not call politicians out on their abuse of facts, unless the abuser is not in their tribe - as Jonathan complained in noting the press's reaction to Britannia Unchained.
But does this matter? In one sense, maybe not. Expert support and empirical evidence does not guarantee that a policy will be a success - though I suspect it improves the odds.
Instead, what worries me is that this threatens to further corrode the standard of political discourse.
Fact-free politics need not be the sole preserve of the right; some of my readers will have the name of Richard Murphy in their minds. And if we go down this road, we'll end up with one tribe thinking the poor are all scroungers and the other thinking our economic problem can be solved by a crackdown on tax dodging. And the two tribes will just be throwing insults at each other. And there's a few of us who think this would be dull.
BenSix | September 22, 2012 at 12:09 PM
I don't think that's what Skidmore's saying but nor do I think that what he's saying is any less silly. He replies to charges of slipshod research and laziness by saying...
"...it's a 116-page book, there's 433 footnotes to it."
I see this a lot: the implicit claim that the merit of a work can be judged by the number of references it contains. Yet that says nothing about the quality of its research or interpretation. I could argue that I'm God and add 433 footnotes referencing self-published blog posts in which I proclaim that I'm a deity, but it wouldn't make it a work of scholarship.
Chris | September 22, 2012 at 10:05 PM
"Fact-free politics need not be the sole preserve of the right"
They need not be, but they are.
Blissex | September 23, 2012 at 12:47 PM
Continuing my previous comment on voter hypocrisy, yes there are many voters who consider politics a spectator sport, a source of entertainment, just like news.
But my impression is that "fact free" politics is really a cover for an unwillingness to discuss the available facts, because they are unpleasant, as they relate to nasty self interest and distributional issues.
Politics thus may be fact-free because the facts cannot be discussed in a politically correct way, and therefore dog whistling abounds.
It is not a question of tribes, but of interests, even if these interests relate fairly directly to culture and in particular theology (most "culture" is the corrupted legacy of some dead theologian).
Consider this quote:
http://www.independent.co.uk/voices/commentators/owen-jones-workingclass-toryism-is-dying-and-its-taking-the-party-with-it-7851880.html
"When I was at university, a one-time very senior Tory figure put it succinctly at an off-the-record gathering: the Conservative Party, he explained, was a "coalition of privileged interests. Its main purpose is to defend that privilege. And the way it wins elections is by giving just enough to just enough other people"."
Sam | September 24, 2012 at 05:31 PM
But my impression is that "fact free" politics is really a cover for an unwillingness to discuss the available facts, because they are unpleasant, as they relate to nasty self interest and distributional issues.
Perhaps. It could also be that looking at the facts will force you to realize that your simplistic 1D-model of how things work doesn't actually fit the available data.
Dec 02, 2014 | Crooked Timber
One especially unfortunate aspect of economics is that its penchant for just-so stories can reinforce its imperialist blindnesses.
If you've been trained systematically to look for examples of market efficiency winning out, you'll likely be inclined to treat your own, and your discipline's success as examples of market efficiency in action.
George Mason University law school's Moneybollocks mythology provides one cautionary tale as to how this can lead one to systematically overlook the role of politics in determining who wins and who loses.
The underlying point of the Fourcade et al. article is that politics and power play a far larger role in determining both the success of economics and the success of economists than economists are prepared to admit in public. Or, more succinctly, sociology provides a much better account of economics' success than economics itself does. Obviously, that's a claim that's going to be uncongenial to economists, as well as one that many economists will have difficulty in absorbing (they usually aren't trained to think in that way). If they were better versed in sociology, and also somewhat paranoid, they might want to treat the piece as a meta-Bourdieuian Trojan horse, one that inherently elevates sociology at the expense of economics (although these imaginary well-read paranoid economists would still somehow have to deal with Fourcade's previous work, which has tacitly rebuked economic sociology for its obsession with disproving economics). But the point would still remain – that the internal structures of economics, as well as its external influence, are very far indeed from a free market.
Ben 12.02.14 at 9:24 pm
In other words, economics has done a better job of cozying up to power.
Which is not entirely the discipline's own fault, given that the powerful have always seen it as an important tool for their legitimation. (e.g. see here)
js 12.02.14 at 10:40 pm
It's a great paper. And man, further confirmation-if any were needed-that business schools majorly suck!
Sasha Clarkson 12.02.14 at 11:13 pm
"… It's probably because…drumroll…economics is the discipline that studies the economy. "
Economics is not the discipline: it is a set of disciplines which share a name, some jargon and common ideas: not unlike, say, evolutionary biology and "intelligent" design creationism. Except that in economics there is more than one form of creationism: Marxism and Austrianism come to mind. Creationist economics studies the world(s) some people believe ought to exist. Austrianists even define "inflation" in their own unique way, but then try to coerce everyone else's reality to match their theology.
The economics of the real world has evolved since Keynes, just as biology has changed since Darwin and cosmology since Kepler. But the key thing about non-creationist economics is that evidence matters and will lead to the model being modified accordingly. After Tycho Brahe died, in 1601, Johannes Kepler tried to develop a new cosmological theory based on circular orbits around an off-centre sun. After years of work, he rejected his beloved theory because it was incorrect about Mars' position by 8 minutes of arc: 2/15 of a degree. Kepler wrote: "Because these 8′ could not be ignored, they alone have led to a total reformation of astronomy."* He then spent several more years developing his theory of elliptical orbits which made predictions accurate enough to satisfy him.
Some "economists" have been predicting US hyperinflation for years: its failure to materialise is a vast error compared with Kepler's 8 minutes of arc, but there has been no urge to modify the theory.
*Translated by Arthur Koestler in The Sleepwalkers.
door 12.02.14 at 11:21 pm
The root problem is that economists, and economics, frequently make policy prescriptions - normative judgments - even though they have very little knowledge of or training in normative ethics and moral philosophical argumentation. The real normative action is instead often concealed under technical phrases and axioms and simplifications that when scrutinized lack convincing justification and tend to be biased towards right-wing policies and the status quo distribution of power and wealth.
Rakesh 12.03.14 at 12:59 am
Another great thing is the comfort economics gives me that amidst all the chaos of unemployment, bankruptcies, and cycles there is an equilibrium that markets are just about to achieve, perhaps with just a little expert guidance and the right human sacrifices at the right time.
JanieM 12.03.14 at 4:38 am

"The economy" has become like this mythical, but nonetheless terribly important and pitifully fragile, little flower, that we can't ever actually see or touch, but all have to look after and think about and be terrifically careful of, otherwise it will just suddenly die and take us all with it. It's nonsense. We should be thinking about how we, as human beings, look after each other and the earth that sustains us.
Well said.
John Emerson 12.03.14 at 5:11 am
Suppose a Pol Pot came to power and all economists were liquidated. How much would the economy suffer? Would the economics profession be revived in its present form, or would something strikingly different be developed which did all the jobs economics does, but without the arrogance and the ideological and methodological dead weight?

This is a THOUGHT EXPERIMENT, not a suggestion. Like a trolley car problem. There are no actual trolley cars with fat men being pushed in front of them, and there is no actual Pol Pot in the offing. Perhaps I should have hypothesized that The Rapture carried off every economist in the world, but no one else, but that's even less realistic. Let's just assume that all economists were pensioned off at twice their present salary on the condition that they quit doing economics. That would be both humane and practical.
Bruce Wilder 12.03.14 at 6:02 am
Rakesh @ 19 - I admit it: I got that far before catching on.

Tom @ 16:
Economists have quant skills (as Smith says) and also their skills can be used in sectors where there is a lot of money. Finance, first of all. . . . That is why they make more than statisticians and that is why actuaries make more than economists. And that is why math and engineering people who study finance end up making a fair amount of money. . . . Obviously expertise is a matter of legitimation . . .
Economists have legitimated making a lot of money in finance, even though making a lot of money in finance is pretty obviously deleterious for society, aka the vast majority of people. It is kind of circular: bad economics opens opportunities for bad economists to make a lot of money doing bad economic things.
Commenter @ 13: All the evidence it contains is consistent with economics being a hierarchy-obsessed cargo cult. But all its evidence is also consistent with economists having better and more consistent quality criteria, better sorting, and larger grad programs at the top.
Two mints in one! Policy macroeconomics pretty much is a cargo cult, as far as its content is concerned. This is widely acknowledged in Naked Emperor remarks and so on, but it doesn't seem to matter. Which, I suppose, figures in the motivation for the research of Fourcade et alia. It is the contrast between the political critiques and "derision" directed at economics from outside and the arrogant confidence of its inside practitioners about which the authors are most curious.
Tabasco @ 9
You don't need to know anything about the economy to be a highly successful economist, in the sense of getting papers published in the best journals. And knowing a lot about the economy not only is no guarantee of career success, it invites condescension from economists who wear ignorance of the economy as a badge of honor and sneer openly that economists who study the economy are just journalists.
Said as plainly and bluntly as that, it can seem like superficial sarcasm, but it is so accurate a description. The emphasis on "rigor" and the pride in irrelevant maths become a remarkable absence of curiosity and a doctrinal rigidity in a sizeable and highly influential minority of economists. And all of that is compounded by the rank corruption afforded by those fabled consulting opportunities.
A H 12.03.14 at 7:05 am
The market for private-sector PhD economists is not that large, so I don't think there is a direct outside demand pulling up econ wages. If anything, PhDs have a reputation for being awful at making money in finance.* Though it is pretty easy for a new PhD to jump into a consulting career.

My guess is that wages are being driven up by the demand for business school teachers. As inequality increases, MBAs become a path to the 1% and B-schools become profit centers. They need lots of econ profs, hence wages go up in econ.
*Here is a fun recent example http://thereformedbroker.com/2014/05/28/brokers-liquid-alts-and-the-fund-that-never-goes-up/
Sep 27, 2014 | Economist's View
Paul Krugman reviews Jeff Madrick's book "Seven Bad Ideas: How Mainstream Economists Have Damaged America and the World":
Seven Bad Ideas: The economics profession has not, to say the least, covered itself in glory these past six years. Hardly any economists predicted the 2008 crisis - and the handful who did tended to be people who also predicted crises that didn't happen. More significant, many and arguably most economists were claiming, right up to the moment of collapse, that nothing like this could even happen.

Furthermore, once crisis struck, economists seemed unable to agree on a response. They'd had 75 years since the Great Depression to figure out what to do if something similar happened again, but the profession was utterly divided when the moment of truth arrived.

In "Seven Bad Ideas: How Mainstream Economists Have Damaged America and the World," Jeff Madrick - a contributing editor at Harper's Magazine and a frequent writer on matters economic - argues that the professional failures since 2008 didn't come out of the blue but were rooted in decades of intellectual malfeasance. ...

Jesse:

This is what happens when professionals become addicted to the pursuit of money and personal power. Granted, it was the prevailing compulsion of the age of greed, and economists were hardly the worst, being more enablers than perpetrators.
So what ought professional thought leaders do next?
- examine past errors with the help of an objective advisor;
- make amends for these errors;
- learn to live a new life with a new code of behavior;
- help others who suffer from the same compulsions, especially in the areas of finance and politics, and from the damages that have been inflicted on them.

pgl:
This is a book we all need to read.
ilsm -> pgl:
Historical context:
Age of Greed: The Triumph of Finance and the Decline of America, 1970 to the Present by Jeff Madrick (Author) (C) 2011.
anne :
September 26, 2014
Europe's Austerity Zombies
By Joseph E. Stiglitz

NEW YORK – "If the facts don't fit the theory, change the theory," goes the old adage. But too often it is easier to keep the theory and change the facts – or so German Chancellor Angela Merkel and other pro-austerity European leaders appear to believe. Though facts keep staring them in the face, they continue to deny reality.
Austerity has failed. But its defenders are willing to claim victory on the basis of the weakest possible evidence: the economy is no longer collapsing, so austerity must be working! But if that is the benchmark, we could say that jumping off a cliff is the best way to get down from a mountain; after all, the descent has been stopped.
But every downturn comes to an end. Success should not be measured by the fact that recovery eventually occurs, but by how quickly it takes hold and how extensive the damage caused by the slump.
Viewed in these terms, austerity has been an utter and unmitigated disaster, which has become increasingly apparent as European Union economies once again face stagnation, if not a triple-dip recession, with unemployment persisting at record highs and per capita real (inflation-adjusted) GDP in many countries remaining below pre-recession levels. In even the best-performing economies, such as Germany, growth since the 2008 crisis has been so slow that, in any other circumstance, it would be rated as dismal.
The most afflicted countries are in a depression. There is no other word to describe an economy like that of Spain or Greece, where nearly one in four people – and more than 50% of young people – cannot find work. To say that the medicine is working because the unemployment rate has decreased by a couple of percentage points, or because one can see a glimmer of meager growth, is akin to a medieval barber saying that a bloodletting is working, because the patient has not died yet.
Extrapolating Europe's modest growth from 1980 onwards, my calculations show that output in the eurozone today is more than 15% below where it would have been had the 2008 financial crisis not occurred, implying a loss of some $1.6 trillion this year alone, and a cumulative loss of more than $6.5 trillion. Even more disturbing, the gap is widening, not closing (as one would expect following a downturn, when growth is typically faster than normal as the economy makes up lost ground).
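The arithmetic behind an output-gap claim of this kind is straightforward to reproduce: extrapolate the pre-crisis trend forward and compare it with actual output. The sketch below is a minimal illustration of that mechanics; every figure in it (the GDP level, the trend growth rate, the actual level seven years on) is an assumption chosen for illustration, not Stiglitz's actual data.

# Minimal sketch of the trend-extrapolation arithmetic behind an
# "output gap" claim. All figures below are assumed for illustration;
# they are not Stiglitz's data.

PRE_CRISIS_GDP = 14.0  # real GDP at the pre-crisis peak, $ trillions (assumed)
TREND_GROWTH = 0.018   # modest pre-crisis trend growth, 1.8%/yr (assumed)
ACTUAL_GDP = 13.8      # actual real GDP seven years later, $ trillions (assumed)

years = 7
trend_gdp = PRE_CRISIS_GDP * (1 + TREND_GROWTH) ** years  # counterfactual path
gap = trend_gdp - ACTUAL_GDP

print(f"trend: {trend_gdp:.2f}T  actual: {ACTUAL_GDP:.2f}T  "
      f"gap: {gap:.2f}T ({gap / trend_gdp:.1%} below trend)")

With these made-up numbers the economy comes out roughly 13% below trend; the point of the exercise is only that a persistent shortfall compounds into a large cumulative loss, exactly as the article argues.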
Simply put, the long recession is lowering Europe's potential growth. Young people who should be accumulating skills are not. There is overwhelming evidence that they face the prospect of significantly lower lifetime income than if they had come of age in a period of full employment.
Meanwhile, Germany is forcing other countries to follow policies that are weakening their economies – and their democracies. When citizens repeatedly vote for a change of policy – and few policies matter more to citizens than those that affect their standard of living – but are told that these matters are determined elsewhere or that they have no choice, both democracy and faith in the European project suffer.
France voted to change course three years ago. Instead, voters have been given another dose of pro-business austerity. One of the longest-standing propositions in economics is the balanced-budget multiplier – increasing taxes and expenditures in tandem stimulates the economy. And if taxes target the rich, and spending targets the poor, the multiplier can be especially high. But France's so-called socialist government is lowering corporate taxes and cutting expenditures – a recipe almost guaranteed to weaken the economy, but one that wins accolades from Germany.
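The balanced-budget multiplier is easy to verify in the simplest textbook Keynesian-cross model. The sketch below is my own illustration with assumed parameter values (it is not taken from Stiglitz's article): when taxes and spending rise in tandem by one unit, equilibrium output rises by one unit, and if spending reaches households with a higher propensity to consume than the taxed households, the effect is larger still.

def equilibrium_output(a, mpc, invest, g, t):
    # Solve the Keynesian cross Y = a + mpc*(Y - T) + I + G for Y.
    return (a - mpc * t + invest + g) / (1 - mpc)

mpc = 0.6  # assumed marginal propensity to consume
y0 = equilibrium_output(a=1.0, mpc=mpc, invest=2.0, g=3.0, t=3.0)
# Raise government spending and taxes together by the same amount:
y1 = equilibrium_output(a=1.0, mpc=mpc, invest=2.0, g=4.0, t=4.0)
print(round(y1 - y0, 9))  # 1.0: a balanced one-unit rise in G and T raises Y by one unit

The cancellation is exact in this toy model because the spending multiplier 1/(1-mpc) and the tax multiplier -mpc/(1-mpc) sum to one regardless of the value of mpc.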
The hope is that lower corporate taxes will stimulate investment. This is sheer nonsense. What is holding back investment (both in the United States and Europe) is lack of demand, not high taxes. Indeed, given that most investment is financed by debt, and that interest payments are tax-deductible, the level of corporate taxation has little effect on investment.
Likewise, Italy is being encouraged to accelerate privatization. But Prime Minister Matteo Renzi has the good sense to recognize that selling national assets at fire-sale prices makes little sense. Long-run considerations, not short-run financial exigencies, should determine which activities occur in the private sector. The decision should be based on where activities are carried out most efficiently, serving the interests of most citizens the best.
Privatization of pensions, for example, has proved costly in those countries that have tried the experiment. America's mostly private health-care system is the least efficient in the world. These are hard questions, but it is easy to show that selling state-owned assets at low prices is not a good way to improve long-run financial strength.
All of the suffering in Europe – inflicted in the service of a man-made artifice, the euro – is even more tragic for being unnecessary. Though the evidence that austerity is not working continues to mount, Germany and the other hawks have doubled down on it, betting Europe's future on a long-discredited theory. Why provide economists with more facts to prove the point?
Joseph E. Stiglitz is a Nobel laureate in economics and University Professor at Columbia University.

Sandwichman:
According to Krugman "Madrick is able to claim that Say's Law is pervasive in mainstream economics only by lumping it together with a number of other concepts that, correct or not, are actually quite different." Krugman asks: "But is this 'mainstream economics'?"
Well, yes and no. Not literally or in Keynes's "supply creates its own demand" caricature. But Nassau W. Senior reformulated Say's argument (which wasn't presented by Say as a law) in terms of a "first fundamental proposition" that he described as a law with the same universality and certainty as the law of gravity in physics.
It might not sound like "supply creates its own demand" on first hearing, but Senior deduces from it the same conclusion: the impossibility of a general glut. Senior's proposition IS widely taken for granted or propounded in mainstream economics. It states:
"That every man is desirous to obtain, with as little sacrifice as possible, as much as possible of the articles of wealth."
One might even detect an affinity with Lionel Robbins's definition of economics as "the science which studies human behavior as a relationship between ends and scarce means which have alternative uses." In a word, it's the maximization principle.
In "Expectation and Rational Conduct" (1937) Terence Hutchison argued that Senior's first fundamental proposition shared "one remarkable characteristic" with "almost all" formulations of the utilitarian maximization doctrine: "they appear further to postulate, and only are applicable if the further postulate is made, that all expectations are perfectly correct." In these formulations, uncertainty is relegated to an ambiguous ceteris paribus alibi which makes the utilitarian calculus immune from criticism.
anne -> Sandwichman:
http://www.nytimes.com/2014/09/28/books/review/seven-bad-ideas-by-jeff-madrick.html
September 27, 2014
No. 2 on Madrick's bad idea list is Say's Law, which states that savings are automatically invested, so that there cannot be an overall shortfall in demand. A further implication of Say's Law is that government stimulus can never do any good, because deficit spending by the public sector will always crowd out an equal amount of private spending.
But is this "mainstream economics"? Madrick cites two University of Chicago professors, Casey Mulligan and John Cochrane, who did indeed echo Say's Law when arguing against the Obama stimulus. But these economists were outliers within the profession. Chicago's own business school regularly polls a representative sample of influential economists for their views on policy issues; when it asked whether the Obama stimulus had reduced the unemployment rate, 92 percent of the respondents said that it had. Madrick is able to claim that Say's Law is pervasive in mainstream economics only by lumping it together with a number of other concepts that, correct or not, are actually quite different....
-- Paul Krugman
Sandwichman -> anne:
Say's Law DOESN'T state that savings are automatically invested. It both assumes it and implies it, though -- because it is a tautology it implies what it assumes. Say stated that every product brought to the market opens up a market for (other) goods to the full extent of its value.
anne -> Sandwichman:
"A further implication of Say's Law is that government stimulus can never do any good, because deficit spending by the public sector will always crowd out an equal amount of private spending."
-- Paul Krugman
[ This is the passage that I mean to argue against. ]
Main Street Muse :
The devotion some economists have for idealistic visions like "the invisible hand," etc. is appalling.
And some still trot out that failed BS called "trickle down" - despite the fact that the evidence has never supported it.
But who needs facts when ideology trumps all? At least that's how it seems when looking at practitioners of that dismal science.
Economist's View
Paul Krugman continues the conversation on New Classical economics:
The New Classical Clique: Simon Wren-Lewis thinks some more about macroeconomics gone astray; Robert J. Waldmann weighs in. For those new to this conversation, the question is why, starting in the 1970s, much of academic macroeconomics was taken over by a school of thought that began by denying any useful role for policies to raise demand in a slump, and eventually coalesced around denial that the demand side of the economy has any role in causing slumps.

I was a grad student and then an assistant professor as this was happening, albeit doing international economics – and international macro went in a different direction, for reasons I'll get to in a bit. So I have some sense of what was really going on. And while both Wren-Lewis and Waldmann hit on most of the main points, neither, I think, gets at the important role of personal self-interest. New classical macro was and still is many things – an ideological bludgeon against liberals, a showcase for fancy math, a haven for people who want some kind of intellectual purity in a messy world. But it's also a self-promoting clique. ...

MaxSpeak:

Regarding Waldmann's remark about the ideological proclivities of, among others, Martin Feldstein: at the recent NBER meetings in D.C. he presented a paper on reducing tax expenditures to lower the deficit, wherein no tax expenditure favoring saving or investment fell to his axe. When someone in the audience mentioned that tax expenditures for saving probably reduced saving more than increased it (since the cost to the government exceeds the marginal effect on saving), he professed ignorance of whether or not that is true. (It is.)
bakho :
In the 70s, modeling took off in a lot of fields besides economics, including biology, environmental science, and ecology. In part it looks to have been physics envy. But the physicists made models of systems that were far simpler than economies. Modelers in the 70s went to work using computers that filled whole rooms but had less computing power than my laptop. By necessity they had to make assumptions that made the models crude predictors. But computing power was increasing, and the optimists believed that eventually it would be possible to model a system as complex as an economy from micro foundations - a lifetime's work... Not. Micro-founded models make about as much sense as building a weather model from individual atoms. Computers still are not there yet and may never be. The modelers of the 70s were overoptimistic about what they could deliver and buffaloed many people into thinking they were hot stuff. It was an exclusive club, with higher math skills required as the ticket of admission. It is most difficult to cut losses on sunk costs, but that is their legacy.
Rather than detailed models that provide insight into the minutiae of economies, the models have many shortcuts and assumptions that take as givens what might be important insights. They assumed the economy of the time: full employment and supply-limited. The models of the 70s fell apart in the 80s recession but regained their footing after the recovery and the Great Moderation, when the economy was in a sweet spot that required little action from the Fed. Thus the models had several decades to coast along without major failure. Then we hit an economy that was demand-limited, with high unemployment. The things many models simply assumed away were the very problems that needed addressing.
The 70s models belong on the dust heap of history in the company of many failed models in other disciplines that have long since been abandoned. As the Nobel Laureate Max Planck noted, "Science advances one funeral at a time."
bakho -> bakho...
SWL wonders why Keynes was dismissed in the 70s.
Very wealthy special interests disliked Keynes, disliked the New Deal, and spent some of their money to support academics and intellectuals who could dismiss Keynes or show that his policies were in error. They funded people to work on the project of undermining Keynes, and still do. Wealthy elites were eager to support economists working on projects that denounced Keynes. Researchers of all stripes try to keep their patrons happy. If refutation of Keynes was the price demanded in order to build up a computer-based economics from micro foundations, so be it. Keynes wasn't needed for micro foundations. Keynes was an impediment to funding. It is not surprising that Keynes was jettisoned. When micro foundations sputtered, there was too much crow to be eaten. Better to double down than die of embarrassment.
Skeptical Inquirer
It is no coincidence that the world's first great popularizer of totalitarianism was also the first great spokesman in the West of Philosophical Idealism, the doctrine which preaches that the everyday horrors with which men beset mankind are of no real consequence or significance, are indeed nonexistent, illusions, figments of our own perverted outlook created by our blinded, crippled senses. It was Plato who advocated the "Noble Lie," the lie the ruler, the Philosopher King, would broadcast to the ruled, always of course for the ruled's own good.
Rulers of church and state, the sempiternal Establishment of this world, have always seen things in this congenial light, the light that Plato ignited for them 2,400 years ago. No doubt they still would have if Plato had never lived, but with Plato as their Authority, the argument that pain and injustice are unreal, mere images and imaginings, gains repute, upstanding, righteousness, and, above all, philosophical status.
All dictatorships that have emerged in the West since 350 B.C. are a mere exegesis on Plato, the man who wrote that laughter is undignified, who chose Sparta rather than Athens; and on Philosophical Idealism, which is nothing more than an attempt to divide Existence itself into two unequal parts, external appearance and Inner Truth, The Good Essence and its bad shadow, the Divine Inexpressible and the sublunary meat and potatoes. As Bergen Evans reminds us in The Natural History of Nonsense, "Obscurantism and tyranny go together.... The mist of mysticism has always provided good cover for those who do not want their actions too closely looked into."
This danger inherent in obscurantism is not merely of theoretical interest. Martin Heidegger, Carl Jung, Konrad Lorenz, Alexis Carrel, Ezra Pound, Louis-Ferdinand Celine, D.H. Lawrence, and T.S. Eliot were all highly intelligent and, at least two of them, humane, kind, and thoroughly decent men. Yet each had no trouble whatsoever taking a strong element of fascism to his heart, this after a lifetime spent in the contemplation and evocation of obscurantism. Once one has developed the habit of abjuring the rational in favor of the willfully obscure and mystical, the descent to the bottom of the night is an easy ride.

Obscurantism is ten parts humbug, and humbug is Tyranny's first name, the one it has chosen for itself and by which it is known to all its closer acquaintances. Was there one honest tyrant ever, was there one, who said: I have taken charge and mean to keep it for the good of myself. Not for the sake of the People, nor the State, nor this Faction nor that Party, nor God, nor the Holy Mother Church, nor the Prophet of God, nor Right, Freedom, Equality, Justice, and the Brotherhood of Man, but for myself, for my own good, because it pleases me to do so.
Only in the name of humbug shall Tyranny declare itself, at least in its more public utterances. Privately, in its own house, Tyranny may unbutton its vest, put on its slippers, and call itself honestly enough, though even of this we cannot be sure. But whenever it broadcasts its message it uses only its first name, humbug, just as any king or emperor, as if it had no parents, no ancestry, but had sprung, full-panoplied, out from the skull of God.
But it is not only obscurantism that forms a smokescreen behind which tyranny can hide. There is also the little matter of dividing all existence into two, and just two, totally opposed, totally opposite categories. People from whom we might expect better reasoning processes to be in evidence surprise us by their unexpected lack of perspicuity.
Sociologists and psychologists tell us about "dominant" or "leader" types, "dominated" or "follower" types, implicitly or very often explicitly inquiring of us which we would rather be, implying through it all that of course our choice, assuming nature allows us to have one--which is another kettle of red herring--should be to lead, to command, to dominate, to impose our wills and our concepts on others, for their own good of course, to win friends and influence people. This "either/or" insistence by supposedly intelligent and mentally skilled professionals is deeply worrying. It never seems to occur to these highly trained experts on human behavior that the only sensible answer to the query "Do you wish to be a leader or follower, a master or a slave?" is: what sane person would want to be either?
Dividing everything into two is all right in the monkey house, the sandbox, and the digital computer, but one can only wish this binary reasoning, this plus-or-minus-and-no-nonsense-please approach to things applied in social situations, would end somewhere between our twelfth and sixteenth birthdays.
The ability to see the world, its meaning, and its humanity, in strict dualistic terms, as two, and only two, distinct and mutually exclusive entities, forms the backdrop and backbone of almost all religion, philosophy, politics, economics, law, sociology, psychology--indeed virtually every human enterprise that has led to disintegration or petrification of society or the individual. Plato or Aristotle, St. Augustine or Descartes, Calvin or Torquemada, Freud or Jung, Nietzsche or Baudelaire, Eysenck or Skinner, Stalin or Hitler, no matter in what specifics they may have differed one from the other, agree that everything is to be divided into Two, which, whatever they choose to call them, can be reduced to a common denominator: the saved and the unsaved, those on God's side and those on Satan's, the abstractions of the pure reasoner and the concretisms of the pure observer, the totally sensory and the completely analytic, the blessed and the damned, the holy and the infidel, the material and the spiritual, the mind and the heart, wave and particle, holism and reductionism, nature and nurture, the Hellenic and the Hebraic, the superman and the herd, the Romanticist and the Classicist, the representational and the symbolic, the exploited and the exploiter, the gifted and the dull-witted, inductive analysis and hypothetico-deductivism, the justly rich and the deservedly poor, the black and white, Us and Them. Pluralism remains a luxury that only a few minds can afford, or even window-shop for.
February 17, 2008 | Washington Post
Call Me a Snob, but Really, We're a Nation of Dunces
"The mind of this country, taught to aim at low objects, eats upon itself." Ralph Waldo Emerson offered that observation in 1837, but his words echo with painful prescience in today's very different United States. Americans are in serious intellectual trouble -- in danger of losing our hard-won cultural capital to a virulent mixture of anti-intellectualism, anti-rationalism and low expectations.
This is the last subject that any candidate would dare raise on the long and winding road to the White House. It is almost impossible to talk about the manner in which public ignorance contributes to grave national problems without being labeled an "elitist," one of the most powerful pejoratives that can be applied to anyone aspiring to high office. Instead, our politicians repeatedly assure Americans that they are just "folks," a patronizing term that you will search for in vain in important presidential speeches before 1980. (Just imagine: "We here highly resolve that these dead shall not have died in vain . . . and that government of the folks, by the folks, for the folks, shall not perish from the earth.") Such exaltations of ordinariness are among the distinguishing traits of anti-intellectualism in any era.
The classic work on this subject by Columbia University historian Richard Hofstadter, "Anti-Intellectualism in American Life," was published in early 1963, between the anti-communist crusades of the McCarthy era and the social convulsions of the late 1960s. Hofstadter saw American anti-intellectualism as a basically cyclical phenomenon that often manifested itself as the dark side of the country's democratic impulses in religion and education. But today's brand of anti-intellectualism is less a cycle than a flood. If Hofstadter (who died of leukemia in 1970 at age 54) had lived long enough to write a modern-day sequel, he would have found that our era of 24/7 infotainment has outstripped his most apocalyptic predictions about the future of American culture.
Dumbness, to paraphrase the late senator Daniel Patrick Moynihan, has been steadily defined downward for several decades, by a combination of heretofore irresistible forces. These include the triumph of video culture over print culture (and by video, I mean every form of digital media, as well as older electronic ones); a disjunction between Americans' rising level of formal education and their shaky grasp of basic geography, science and history; and the fusion of anti-rationalism with anti-intellectualism.
First and foremost among the vectors of the new anti-intellectualism is video. The decline of book, newspaper and magazine reading is by now an old story. The drop-off is most pronounced among the young, but it continues to accelerate and afflict Americans of all ages and education levels.
Reading has declined not only among the poorly educated, according to a report last year by the National Endowment for the Arts. In 1982, 82 percent of college graduates read novels or poems for pleasure; two decades later, only 67 percent did. And more than 40 percent of Americans under 44 did not read a single book -- fiction or nonfiction -- over the course of a year. The proportion of 17-year-olds who read nothing (unless required to do so for school) more than doubled between 1984 and 2004. This time period, of course, encompasses the rise of personal computers, Web surfing and video games.
Does all this matter? Technophiles pooh-pooh jeremiads about the end of print culture as the navel-gazing of (what else?) elitists. In his book "Everything Bad Is Good for You: How Today's Popular Culture Is Actually Making Us Smarter," the science writer Steven Johnson assures us that we have nothing to worry about. Sure, parents may see their "vibrant and active children gazing silently, mouths agape, at the screen." But these zombie-like characteristics "are not signs of mental atrophy. They're signs of focus." Balderdash. The real question is what toddlers are screening out, not what they are focusing on, while they sit mesmerized by videos they have seen dozens of times.
Despite an aggressive marketing campaign aimed at encouraging babies as young as 6 months to watch videos, there is no evidence that focusing on a screen is anything but bad for infants and toddlers. In a study released last August, University of Washington researchers found that babies between 8 and 16 months recognized an average of six to eight fewer words for every hour spent watching videos.
I cannot prove that reading for hours in a treehouse (which is what I was doing when I was 13) creates more informed citizens than hammering away at a Microsoft Xbox or obsessing about Facebook profiles. But the inability to concentrate for long periods of time -- as distinct from brief reading hits for information on the Web -- seems to me intimately related to the inability of the public to remember even recent news events. It is not surprising, for example, that less has been heard from the presidential candidates about the Iraq war in the later stages of the primary campaign than in the earlier ones, simply because there have been fewer video reports of violence in Iraq. Candidates, like voters, emphasize the latest news, not necessarily the most important news.
No wonder negative political ads work. "With text, it is even easy to keep track of differing levels of authority behind different pieces of information," the cultural critic Caleb Crain noted recently in the New Yorker. "A comparison of two video reports, on the other hand, is cumbersome. Forced to choose between conflicting stories on television, the viewer falls back on hunches, or on what he believed before he started watching."
As video consumers become progressively more impatient with the process of acquiring information through written language, all politicians find themselves under great pressure to deliver their messages as quickly as possible -- and quickness today is much quicker than it used to be. Harvard University's Kiku Adatto found that between 1968 and 1988, the average sound bite on the news for a presidential candidate -- featuring the candidate's own voice -- dropped from 42.3 seconds to 9.8 seconds. By 2000, according to another Harvard study, the daily candidate bite was down to just 7.8 seconds.
The shrinking public attention span fostered by video is closely tied to the second important anti-intellectual force in American culture: the erosion of general knowledge.
People accustomed to hearing their president explain complicated policy choices by snapping "I'm the decider" may find it almost impossible to imagine the pains that Franklin D. Roosevelt took, in the grim months after Pearl Harbor, to explain why U.S. armed forces were suffering one defeat after another in the Pacific. In February 1942, Roosevelt urged Americans to spread out a map during his radio "fireside chat" so that they might better understand the geography of battle. In stores throughout the country, maps sold out; about 80 percent of American adults tuned in to hear the president. FDR had told his speechwriters that he was certain that if Americans understood the immensity of the distances over which supplies had to travel to the armed forces, "they can take any kind of bad news right on the chin."
This is a portrait not only of a different presidency and president but also of a different country and citizenry, one that lacked access to satellite-enhanced Google maps but was far more receptive to learning and complexity than today's public. According to a 2006 survey by National Geographic-Roper, nearly half of Americans between ages 18 and 24 do not think it necessary to know the location of other countries in which important news is being made. More than a third consider it "not at all important" to know a foreign language, and only 14 percent consider it "very important."
That leads us to the third and final factor behind the new American dumbness: not lack of knowledge per se but arrogance about that lack of knowledge. The problem is not just the things we do not know (consider the one in five American adults who, according to the National Science Foundation, thinks the sun revolves around the Earth); it's the alarming number of Americans who have smugly concluded that they do not need to know such things in the first place. Call this anti-rationalism -- a syndrome that is particularly dangerous to our public institutions and discourse. Not knowing a foreign language or the location of an important country is a manifestation of ignorance; denying that such knowledge matters is pure anti-rationalism. The toxic brew of anti-rationalism and ignorance hurts discussions of U.S. public policy on topics from health care to taxation.
There is no quick cure for this epidemic of arrogant anti-rationalism and anti-intellectualism; rote efforts to raise standardized test scores by stuffing students with specific answers to specific questions on specific tests will not do the job. Moreover, the people who exemplify the problem are usually oblivious to it. ("Hardly anyone believes himself to be against thought and culture," Hofstadter noted.) It is past time for a serious national discussion about whether, as a nation, we truly value intellect and rationality. If this indeed turns out to be a "change election," the low level of discourse in a country with a mind taught to aim at low objects ought to be the first item on the change agenda.
Susan Jacoby's latest book is "The Age of American Unreason."
Mayberry Machiavelli is a satirically pejorative phrase coined by John J. DiIulio Jr., Ph.D., a former staffer in the George W. Bush administration who ran Bush's Faith-Based Initiative. After his early resignation from his White House post in late 2001, DiIulio described the administration to journalist Ron Suskind, as published in Esquire magazine:
"What you've got is everything--and I mean everything--being run by the political arm. It's the reign of the Mayberry Machiavellis."
The phrase is meant to invoke infamous Machiavellian power politics coupled with a prejudicial sense of regional backwardness and incompetence in the American South, as supposedly exemplified by the fictional small, rural North Carolina town of Mayberry, the setting of The Andy Griffith Show (which ran on the American television network CBS from 1960 to 1968) and of its sequel, Mayberry R.F.D.
The phrase, seeming so felicitously apt, was picked up and virally repeated by many critics of the George W. Bush presidency.
"The trouble with most folks ain't so much their ignorance as knowing so many things that ain't so." "Men will never be free until the last king is strangled with the entrails of the last priest."-Meslier, Voltaire, Diderot?
What a world of contradictions. A world of many dead ends. Today I celebrate with anger the birthday of revolutionary Baptist minister Martin Luther King, Jr., mourn the death of jazz musician Alice Coltrane (a convert to Hinduism), and commemorate the birthday of a pioneer of freethought and the Enlightenment:
Jean Meslier (January 1664 - 1729): Priest, Materialist, Atheist
Here in the USA of course we are preoccupied with the threats of the Christian Right and fundamentalist Islam. More generally, we are known to complain about the Abrahamic religions-Judaism, Christianity, Islam-and more generally still about theism. But that's only the half of it. The rest of the world is as bankrupt as the half we know.
Some of us also have an interest in Eastern religions and mysticisms and are concerned with their validity or invalidity. Then of course there are African belief systems, which outside of their areas of origin have a significant impact only on segments of the black diaspora.

It's a world of ignorance, superstition, and savagery.
But it's also important to note that there is a whole history of collusion between Western and non-Western obscurantism, beginning with the European penetration of China and India in the 17th century, i.e. linkages to the most reactionary indigenous ideologies - Confucianism and Hinduism. Such collusion persists in altered forms in the present day, with Western postmodernism fueling Hindu and Confucian revivals, for example.

Globalization, instead of heralding a new Enlightenment, is bringing us to the verge of a new Dark Age. The main culprits are the neoliberal economic order, neo-imperialism, and neo-fascist religious revivalism, but this barbarism carries on its work in the realms of theology and philosophy as well.
Here are a few links to show you what I mean.

First, you can keep up with other relevant writings of mine on my own blog:
Studies in a Dying Culture

The permalinks for recent entries are:

- Reactionary Chinese & other wisdom in comparative perspective
- The Legitimacy of Chinese Philosophy (1)
- The Legitimacy of Chinese Philosophy (2)

On another front, see a blog entry from December:
The Dead End of African Philosophy: Which Way Out?
On still another, see: Swami Agehananda Bharati (1923–1991)

In December I published a review in the Indian press:
"Secularism, science and the Right"
[Review of Meera Nanda, The Wrongs of the Religious Right: Reflections on Science, Secularism and Hindutva], Frontline, Volume 23, Issue 24, Dec. 02–15, 2006.

See also: Meera Nanda Online
"Fascism has awakened a sleeping world to the realities of the irrational, mystical character structure of the people of the world."
-Wilhelm Reich
February 20, 2008
Is the U.S. a deeply anti-intellectual, anti-learning culture, and thus a deeply ignorant one? Every few years comes a book which argues persuasively, "yes." This year's entry is The Age of American Unreason. Longtime correspondent U. Doran alerted me to the book via this story link: Susan Jacoby: Bemoaning an America that values stupidity.
A generation ago the book du jour chastising the dumbing down of America was The Closing of the American Mind, which, judging by sales on amazon.com, remains very much in the public consciousness.
I asked frequent contributor Michael Goodfellow for his take on the issue, and he responded with a number of fresh points of view:
This has been commented on a lot recently on the Net. I agree with some of the sentiment expressed here:

There are more sources of information than ever if you want it. And it's not clear that we're really worse off in terms of intellectual health than before. American pop culture has been idiotic for decades, and people have moaned about lack of knowledge on the part of the public for decades as well. Interestingly, I remember reading a comment about 19th-century England after the Sherlock Holmes stories were first serialized. The upper classes weren't celebrating that ordinary people were reading -- they were complaining that people who should be working were wasting their time reading novels!
Here's an analogy for you. If we were asking about American physical health, we'd be talking about increased obesity, lack of exercise, poor diet, etc. You could make the same comparisons about average American intellectual life, as the Jacoby article does. On the other hand, if you were asking about American athletics, you'd be talking about the increased numbers of serious athletic programs from grade school to college, the increased number of people who take athletics seriously, the improved training methods and broken records in practically every sport. In other words, nothing but serious improvement.
Again, you could say the same about serious intellectual life in this country. There's more and more to do and learn, better ways to do it, our best universities are world class, and there's never been more possibilities for making a living at intellectual pursuits. The results are obvious in science and technology. I'm not familiar enough with the arts to even guess at whether you can say there's been an improvement, but there's certainly more of it, from serious art to commercial art down to YouTube.
So overall, I'd say at worst, there's a widening split between the part of the country that enjoys intellectual activity, and the average person who doesn't. I really blame the school systems for that, not any increase in anti-intellectualism in the population. It's amazing anyone gets through K-12 public education in the U.S. and still wants to learn anything.
There is another interesting point of view I've heard on all of this. The story is that originally education was seen as "male" and had status. When mandatory education became widespread in the 19th century, and many cheap school teachers were needed, it was mostly women who filled that role. And so education became seen as "female", and its status dropped. For boys, education was some boring, spinster schoolmarm who slapped your wrist with a ruler if you didn't sit still. Within 50 years, the entertainment industry had converted the college professor from a "wise man" image to an overeducated dunce with no common sense -- a bit of a clown. The overall culture followed that same line, with women reading and men avoiding books.
"The old SDS dictum, 'People have to be organized around the issues that really affect their lives,' is really true… That is to say, that racism and imperialism really are issues that affect people's lives. And it was these things that people moved on, not dorm rules, or democratizing university governance, or any of that bullshit." -Mark Rudd, "Columbia-Notes on the Spring Rebellion"
- THESIS ONE: The war on Iraq represents, among other things, a crisis in education. It has been proven beyond a doubt that the war was waged on false pretenses, that the consent for the ongoing imperial occupation has been based on the inability of the American public to access real and useful information. [What about the media's role?] Often, when students are exposed to alternative information in progressive classes, their reaction is one of frustration. They realize that our education has failed us: we have not been provided with the intellectual resources to understand political questions within the context of history, we have not been trained to practice the public debate and civic engagement that are the necessary precondition of democracy (as argued in Henry Giroux's writings, http://www.henryagiroux.com). Instead, the academic-military-industrial complex has trained us in the logic of empire, leaving us prey to the invasion of our campuses by the empire's vultures: military recruiters who promise to make up for the state's unwillingness to fund our education.
- THESIS TWO: Higher education has failed the student because the university, along with most of domestic American culture, has been militarized. The most direct expression is in the military's major investments in research funding; but it goes beyond this. The standards of free political debate, so crucial to an atmosphere of intellectual growth, have been replaced by standards of defending "Western civilization" against the "terrorism" of dissent: look at the repression of Ward Churchill, Joseph Massad, and countless other professors. These professors are persecuted because they have not been playing their roles as American intellectuals, which, as Chomsky has tirelessly demonstrated, is to "manufacture the consent" of the masses. This is the future role of college students, who are learning early that those who question the currents of mainstream thought are punished and those who deceive the public and rationalize war crimes are rewarded. As future intellectuals, students are carefully taught the doctrines of imperialism: the Orientalist demonization of Arabs as a civilizational Other, a neoliberal economics that portrays the victims of transnational corporations as the beneficiaries of "development", the political mythology that portrays American military hegemony and its acts of aggression as benevolence. The university ensures the transmission of a system of ideology which guarantees that the decisions made by the corporate-authoritarian elite remain unquestioned by the disempowered American public.
- THESIS THREE: The academy acts in the interests of corporations because the university itself has moved toward a corporate structure. Former CEOs are hired as presidents, research funding comes from corporations, corporate sponsorships impose brand names on students, and the corporate ethic of competition and hierarchy are imposed upon faculty and students. Graduate students are often unable to unionize, and service and clerical employees are paid low wages. This corporatization arises as an expression of the role of the university within capitalist society: to reproduce the relations of production by training the majority of students to be laborers and consumers rather than citizens. Gramsci said that every relationship of hegemony was necessarily an educational one, that class rule was secured by training the working class to accept its exploitation; but he could not have foreseen that capital's invasive violations of the public sphere would make every relationship of education a hegemonic one. Historically, the mission of the university has been defined by the tradition of humanistic education, which seeks to develop the individual into a participating social agent. This democratic ideal has always coexisted with the pressures of the market and the ruling class, which varyingly attempt to turn the university into another kind of industry or an ideological state apparatus. The movements of militarization and corporatization have caused education to lose the battle.
- THESIS FOUR: The unusual socioeconomic role of the student has made the university a crucial battleground (see the work of Andre Gorz). The necessity of a techno-managerial elite in the workplace and the ongoing transition from material to immaterial production brought on by recent advancements in communications technology create a need for rigorous training in the universities. Of course, the students must develop the necessary technical ability, but they must also learn the ideological dynamics that preserve the social and economic hierarchy. Instead of doing useful or fulfilling work, students must practice rote memorization under the regime of exams, controlled by the absolute authority of teachers and judged by the artificial standards of grades; students are alienated from the teachers, who are often overworked and are equally constrained by the system; students are divided into hierarchies based on grades, class ranks, and social status; decisions are made exclusively by the administration or through a petty and useless student bureaucracy; the curriculum and pedagogy suppress critical thought; students must pay the constantly rising tuition, often forcing them to pay off loans for 10 or 20 years after graduation.
- THESIS FIVE: The alienated structure of education results immediately in a generalized discontent, and because of the transience and inconvenience of the student life, the student is not integrated into the system. A contradiction emerges, nascent within the intellectual rigor demanded by specialized training: the intellectual and social development that is necessary to the training of the specialized class creates a potential for critical thought, which contains the potential for radical action. After all, the university is not just an ideology machine-it is a battleground, a space in which the struggle for the human mind is fought. In exchange for incorporation into capitalism's bureaucratic class, the student is granted an unjust privilege-access to knowledge on a scale denied to those unable to afford it-but this privilege is a double-edged sword.
- THESIS SIX: In preparing us for alienated labor and consumption, student privilege brings us the stupidity and banality of student life, the emptiness of campus culture, the intellectual charlatanry of professors who use terms like "humanitarian intervention" and "free market." At the same time, it grants us the time, energy, and resources to study and understand history; it gives us a space in which to revolt and make history. In the face of a schooling system that seeks to train us for an intellectually, culturally, morally, existentially bankrupt life under the regime of bureaucratic capitalism, we can use our empty privilege to demand a real, critical education that prepares us to participate as active citizens in an autonomous society of our making. This entails nothing less than the abolition of the student; it means the institution of free education as a universal right of the citizen. Our society has developed the ability to realize this ambition, but has instead summoned its vast technological, intellectual, and physical powers in the service of death and destruction. Our training as students plays a central role in the perpetuation of the system which demands that violence rule the global order. To resist the ongoing occupation of Iraq is to resist a growing empire that robs the wretched of the earth of their lives and robs us of the potential to fulfill the dream of a participatory and cooperative society.
- THESIS SEVEN: We came to college for an education, but education is impossible in a society ruled by the logic of empire, which has reached its alarming peak in the subjugation of the people of Iraq. To realize the principles of a real education inside and beyond the university means an upheaval against empire. In the words of Che Guevara, "Revolution is the best education for honorable men."
Visit http://www.tools4change.org/wcr to learn more about and participate in the Week of Campus Resistance.
Asad Haider is a student and activist in State College, Pennsylvania. His writing has appeared on ZNet, Politics and Culture, Left Hook, Dissident Voice, and elsewhere. He can be reached at [email protected].