
For God and Country

Megancer

Human structure and behavior are determined by thermodynamics. Not a single human characteristic escapes its sculpting influence, and that includes our brains. I’ve been thinking lately about the pervasiveness of religion, and it too is likely a conserved trait that has been strongly selected for within the architecture of the brain. And why would a structure that leads to irrational belief be selected? Surely a more realistic or scientific view contributes more to survival. If we go back into prehistory, before written knowledge began accumulating en masse, the typical mind of a family or tribe member explained the world in terms of myths, spirits and ancestors. These animal and ancestor spirits could probably be spoken to in order to gain some influence over the unfolding of uncertain events.

But the greatest survival advantage would come from the coalescing of smaller groups or tribes into the worship of a common God or gods, through the intercession of a priestly class, perhaps the tribal leaders of the coalesced groups. Citizens of this new “civilization”, based upon the worship of a God at a central temple, would organize themselves around supplying the temple with adequate sacrifices and offerings to satiate the all-powerful God(s). This net energy, brought in from the area surrounding the temple, would feed the specialists residing there, and complexity would flourish.

These large civilizations would easily defeat smaller tribes and gain slaves, resources, and territory at the periphery. The glue that holds it all together is the belief that a God made everything, intercedes on the earth, and responds to the ritualized actions of priests and worshipers. Scientific explanations tend to dissolve the glue that binds, and currently there is a tug-of-war between religious and scientific explanations. Which is most important? Can we maintain and practice technology and still pay homage to the Gods? I think that is exactly what we are doing, while downplaying or obfuscating much of unpalatable reality. Those societies in which individuals have transcendent experiences, or large releases of dopamine/opioids in the presence of the all-powerful God, are the more cohesive and the most likely to continue believing as they conquer and assimilate the non-believers. The individuals of the religious societies, those most likely to be rewarded for believing in the magic of a God, have greater reproductive success than groups that lack the tendency to coalesce into religious tribes. Over time much of the world’s population has become hard-wired with a tendency (sometimes overcome) to be magical thinkers and to believe that cause and effect are the whims, rewards and punishments emanating from the agency of a God.

https://qz.com/852450/the-neuroscience-argument-that-religion-shaped-the-very-structure-of-our-brains/

Fast forward to today, and even with the wide availability of knowledge, religions still persist and thrive. Why? Because the population has evolved to believe that a God makes everything happen, and people are unwilling, unable or too lazy to supplant that fantasy with a real study of physics, chemistry and biology. Atheism and intellectualism are denigrated and rejected in favor of the simple religious glue that holds society together and provides a common identity. For most, a reading of Charles Darwin’s “On the Origin of Species” simply does not have the same religiously orgasmic effect as the almighty power of God at a church revival. It is unlikely that magical thinking will be abandoned, because the tendency is hard-wired in the brain and it has emotionally rallying survival value in holding societies together. Even the government, businesses and the Federal Reserve jump on the God bandwagon for acceptance.

So there is a reluctance to accept real explanations that may undermine the belief system that helps hold a society together. A rejection of scientific explanations leaves room for divine intervention brought forth by prayer and offerings and a reward of eternal life in heaven.

Humans will likely ignore scientific warnings of our current perils, continuing to pursue rewards while hoping for magical interventions from a God. Natural selection will take any course that has a healthy positive EROEI, even toward warped human brain wiring that provides the framework for a skewed interpretation of reality. Rob Mielcarski of www.un-denial.com brought this article to my attention, and I’ve provided an excerpt:

https://www.newyorker.com/magazine/2017/02/27/why-facts-dont-change-our-minds


Gail Tverberg

Economists tell us that within the economy there is a lot of substitutability, and they are correct. However, there are a couple of not-so-minor details that they overlook:

  • There is no substitute for energy. It is possible to harness energy from another source, or to make a particular object run more efficiently, but the laws of physics prevent us from substituting something else for energy. Energy is required whenever physical changes are made, such as when an object is moved, or a material is heated, or electricity is produced.
  • Supplemental energy leverages human energy. The reason why the human population is as high as it is today is because pre-humans long ago started learning how to leverage their human energy (available from digesting food) with energy from other sources. Energy from burning biomass was first used over one million years ago. Other types of energy, such as harnessing the energy of animals and capturing wind energy with sails of boats, began to be used later. If we cut back on our total energy consumption in any material way, humans will lose their advantage over other species. Population will likely plummet because of epidemics and fighting over scarce resources.

Many people appear to believe that stimulus programs by governments and central banks can substitute for growth in energy consumption. Others are convinced that efficiency gains can substitute for growing energy consumption. My analysis indicates that workarounds, in the aggregate, don’t keep energy prices high enough for energy producers. Oil prices are at risk, but so are coal and natural gas prices. We end up with a different energy problem than most have expected: energy prices that remain too low for producers. Such a problem can have severe consequences.

Let’s look at a few of the issues involved:

[1] Despite all of the progress being made in reducing birth rates around the globe, the world’s population continues to grow, year after year.

Figure 1. 2019 World Population Estimates of the United Nations. Source: https://population.un.org/wpp/Download/Standard/Population/

Advanced economies in particular have been reducing birth rates for many years. But despite these lower birth rates, world population continues to rise because of the offsetting impact of increasing life expectancy. The UN estimates that in 2018, world population grew by 1.1%.
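To put that 1.1% figure in perspective, the standard compound-growth arithmetic (my illustration, not part of the UN source) shows how quickly such a rate doubles a population if sustained:

```python
import math

# Doubling time under compound growth: t = ln(2) / ln(1 + r)
r = 0.011  # UN-estimated world population growth rate for 2018

doubling_years = math.log(2) / math.log(1 + r)
print(round(doubling_years, 1))  # prints 63.4
```

In other words, even a seemingly modest 1.1% annual rate, if it persisted, would double world population in roughly 63 years.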

[2] This growing world population leads to a growing use of natural resources of every kind.

There are three reasons we might expect growing use of material resources:

(a) The growing world population in Figure 1 needs food, clothing, homes, schools, roads and other goods and services. All of these needs lead to the use of more resources of many different types.

(b) The world economy needs to work around the problems of an increasingly resource-constrained world. Deeper wells and more desalination are required to handle the water needs of a rising population. More intensive agriculture (with more irrigation, fertilization, and pest control) is needed to harvest more food from essentially the same number of arable acres. Metal ores are increasingly depleted, requiring more soil to be moved to extract the ore needed to maintain the use of metals and other minerals. All of these workarounds to accommodate a higher population relative to base resources are likely to add to the economy’s material resource requirements.

(c) Energy products themselves are also subject to limits. Greater energy use is required to extract, process, and transport energy products, leading to higher costs and lower net available quantities.

Somewhat offsetting these rising resource requirements is the inventiveness of humans and the resulting gradual improvements in technology over time.

What does actual resource use look like? UN data summarized by MaterialFlows.net shows that extraction of world material resources does indeed increase most years.

Figure 2. World total extraction of physical materials used by the world economy, calculated using weight in metric tons. Chart is by MaterialFlows.net. Amounts shown are based on the Global Material Flows Database of the UN International Resource Panel. Non-metallic minerals include many types of materials including sand, gravel and stone, as well as minerals such as salt, gypsum and lithium.

Continue reading »

Norman Pagett

But that prosperity is now propped up by infinite debt, to sustain the original illusion.

It is not just a European problem, but a global problem, where world commerce is supported by infinite debt.

While we might think of money as supporting our economy, only energy can support the solvency of a nation, and only surplus energy can fulfill the aspirations of its rulers and the desires of its citizens. Until the advent of the industrial revolution, and in particular the universal availability of cheap oil, that energy could only come from territory that could produce sufficient food and other essentials for any level of civilized living. We might ‘demand’ that our leaders provide new hospitals, schools, roads and all the other things that make life comfortable, but without the necessary surplus energy to do it, it is impossible. No political posturing or promises or taxation can change that.

Yet the gullible are convinced that prosperity can be voted into office.

Most deny it, but we live in an energy economy, not a money economy. Without the continually increasing forward thrust of energy input, no economy can exist in the context that we think of as ‘normal’. Our current society is not normal.

Those charged with governing every nation on Earth have lost sight of the fundamental law of collective survival: if a nation doesn’t produce enough indigenous surplus energy to support the demands of its people, it must beg, buy, borrow or steal it from somewhere else, or face eventual collapse and starvation until its numbers reach a sustainable level.

Our lifestyle support system has been based on that premise since prehistory. Nomadic tribesmen, probably in the region of present day Iraq, had the bright idea of fixing borders around land, then growing their food supply instead of chasing after it. Fences and borders meant land could be owned and given value that could be measured in energy terms.

What we know as civilization is based on that simple concept. Land and its potential energy became capital, and our genetic forces ensured it was exploited to the full. Primitive farmers knew nothing of calorific values, or capitalism; only that too little food meant starvation, sufficient food averted famines, and surplus food offered prosperity. No one wanted to starve, few were content with sufficient, so the drive for surplus became relentless. It still is; only the scale has changed, and it has become the profit motive in everything we do. Everybody wants a pay rise; few refuse one. We are all capitalists, differing only in scale.

Enclosed land needed strong control and the will to fight for it. Strength prevailed while weakness went under as resource competition ebbed and flowed across tribal territories. If land produced enough spare food and other necessary commodities, it was possible to equip and feed an army, and use it to occupy more territory. In that way collective energy could rapidly roll up small territories into a nation or an empire, create warlords and kings, and give credence to gods who were invariably on the winning side.

Possession of land and what it produces is the hidden support of what we now understand as our economy and the viability of our infrastructure. Conflict makes that economy even more profitable and one that is built on power and aggression provides the potential for endless resource warfare, whether bloody or political, or both.

The more land that could be held and ruled, the more food-energy could be produced. Surplus energy that came in the form of meat and grain and timber couldn’t be carried around, so tokens of gold and silver became an accepted measure of energy value.

Different civilisations arose and used different monetary systems, but all broadly followed the pattern we are locked into now: those who controlled the land controlled the energy that supported the prevalent economy, whether primitive or sophisticated, warlike or peaceful. With sufficient surplus and a big enough labour force held in some kind of serfdom or dependency, tokenized energy could be diverted to pay for the construction of cities, castles and cathedrals. While the labour of men to build them, the allegiance of soldiers to guard them, and the faith of priests to pray over them might be bought with gold and silver, the system depended on a supply of food and basic commodities well above subsistence level, ultimately provided by the heat of the sun. That’s why the great early civilisations and empires began in the warm tropical and subtropical regions of the world. And why Eskimos did not field armies, build cities, or inflict the hysteria of mass religion on themselves; they didn’t get enough sunshine to provide the energy resources to do so.

Continue reading »

10 x Agustina

A Sibila (1954)
O Sermão do Fogo (1962)
A Dança das Espadas (1965)
As Fúrias (1977)
Conversações com Dmitri e Outras Fantasias (1979)
Adivinhas de Pedro e Inês (1983)
Um Bicho da Terra (1984)
Prazer e Glória (1988)
Ordens Menores (1992)
Alegria do Mundo I (1996)

Elizabeth Kolbert

In 1975, researchers at Stanford invited a group of undergraduates to take part in a study about suicide. They were presented with pairs of suicide notes. In each pair, one note had been composed by a random individual, the other by a person who had subsequently taken his own life. The students were then asked to distinguish between the genuine notes and the fake ones.

Some students discovered that they had a genius for the task. Out of twenty-five pairs of notes, they correctly identified the real one twenty-four times. Others discovered that they were hopeless. They identified the real note in only ten instances.

As is often the case with psychological studies, the whole setup was a put-on. Though half the notes were indeed genuine—they’d been obtained from the Los Angeles County coroner’s office—the scores were fictitious. The students who’d been told they were almost always right were, on average, no more discerning than those who had been told they were mostly wrong.

In the second phase of the study, the deception was revealed. The students were told that the real point of the experiment was to gauge their responses to thinking they were right or wrong. (This, it turned out, was also a deception.) Finally, the students were asked to estimate how many suicide notes they had actually categorized correctly, and how many they thought an average student would get right. At this point, something curious happened. The students in the high-score group said that they thought they had, in fact, done quite well—significantly better than the average student—even though, as they’d just been told, they had zero grounds for believing this. Conversely, those who’d been assigned to the low-score group said that they thought they had done significantly worse than the average student—a conclusion that was equally unfounded.

“Once formed,” the researchers observed dryly, “impressions are remarkably perseverant.”

A few years later, a new set of Stanford students was recruited for a related study. The students were handed packets of information about a pair of firefighters, Frank K. and George H. Frank’s bio noted that, among other things, he had a baby daughter and he liked to scuba dive. George had a small son and played golf. The packets also included the men’s responses on what the researchers called the Risky-Conservative Choice Test. According to one version of the packet, Frank was a successful firefighter who, on the test, almost always went with the safest option. In the other version, Frank also chose the safest option, but he was a lousy firefighter who’d been put “on report” by his supervisors several times. Once again, midway through the study, the students were informed that they’d been misled, and that the information they’d received was entirely fictitious. The students were then asked to describe their own beliefs. What sort of attitude toward risk did they think a successful firefighter would have? The students who’d received the first packet thought that he would avoid it. The students in the second group thought he’d embrace it.

Even after the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs,” the researchers noted. In this case, the failure was “particularly impressive,” since two data points would never have been enough information to generalize from.

The Stanford studies became famous. Coming from a group of academics in the nineteen-seventies, the contention that people can’t think straight was shocking. It isn’t any longer. Thousands of subsequent experiments have confirmed (and elaborated on) this finding. As everyone who’s followed the research—or even occasionally picked up a copy of Psychology Today—knows, any graduate student with a clipboard can demonstrate that reasonable-seeming people are often totally irrational. Rarely has this insight seemed more relevant than it does right now. Still, an essential puzzle remains: How did we come to be this way?

In a new book, “The Enigma of Reason” (Harvard), the cognitive scientists Hugo Mercier and Dan Sperber take a stab at answering this question. Mercier, who works at a French research institute in Lyon, and Sperber, now based at the Central European University, in Budapest, point out that reason is an evolved trait, like bipedalism or three-color vision. It emerged on the savannas of Africa, and has to be understood in that context.

Stripped of a lot of what might be called cognitive-science-ese, Mercier and Sperber’s argument runs, more or less, as follows: Humans’ biggest advantage over other species is our ability to coöperate. Coöperation is difficult to establish and almost as difficult to sustain. For any individual, freeloading is always the best course of action. Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve the problems posed by living in collaborative groups.

“Reason is an adaptation to the hypersocial niche humans have evolved for themselves,” Mercier and Sperber write. Habits of mind that seem weird or goofy or just plain dumb from an “intellectualist” point of view prove shrewd when seen from a social “interactionist” perspective.

Consider what’s become known as “confirmation bias,” the tendency people have to embrace information that supports their beliefs and reject information that contradicts them. Of the many forms of faulty thinking that have been identified, confirmation bias is among the best catalogued; it’s the subject of entire textbooks’ worth of experiments. One of the most famous of these was conducted, again, at Stanford. For this experiment, researchers rounded up a group of students who had opposing opinions about capital punishment. Half the students were in favor of it and thought that it deterred crime; the other half were against it and thought that it had no effect on crime.

The students were asked to respond to two studies. One provided data in support of the deterrence argument, and the other provided data that called it into question. Both studies—you guessed it—were made up, and had been designed to present what were, objectively speaking, equally compelling statistics. The students who had originally supported capital punishment rated the pro-deterrence data highly credible and the anti-deterrence data unconvincing; the students who’d originally opposed capital punishment did the reverse. At the end of the experiment, the students were asked once again about their views. Those who’d started out pro-capital punishment were now even more in favor of it; those who’d opposed it were even more hostile.

If reason is designed to generate sound judgments, then it’s hard to conceive of a more serious design flaw than confirmation bias. Imagine, Mercier and Sperber suggest, a mouse that thinks the way we do. Such a mouse, “bent on confirming its belief that there are no cats around,” would soon be dinner. To the extent that confirmation bias leads people to dismiss evidence of new or underappreciated threats—the human equivalent of the cat around the corner—it’s a trait that should have been selected against. The fact that both we and it survive, Mercier and Sperber argue, proves that it must have some adaptive function, and that function, they maintain, is related to our “hypersociability.”

Mercier and Sperber prefer the term “myside bias.” Humans, they point out, aren’t randomly credulous. Presented with someone else’s argument, we’re quite adept at spotting the weaknesses. Almost invariably, the positions we’re blind about are our own.

A recent experiment performed by Mercier and some European colleagues neatly demonstrates this asymmetry. Participants were asked to answer a series of simple reasoning problems. They were then asked to explain their responses, and were given a chance to modify them if they identified mistakes. The majority were satisfied with their original choices; fewer than fifteen per cent changed their minds in step two.

In step three, participants were shown one of the same problems, along with their answer and the answer of another participant, who’d come to a different conclusion. Once again, they were given the chance to change their responses. But a trick had been played: the answers presented to them as someone else’s were actually their own, and vice versa. About half the participants realized what was going on. Among the other half, suddenly people became a lot more critical. Nearly sixty per cent now rejected the responses that they’d earlier been satisfied with.

This lopsidedness, according to Mercier and Sperber, reflects the task that reason evolved to perform, which is to prevent us from getting screwed by the other members of our group. Living in small bands of hunter-gatherers, our ancestors were primarily concerned with their social standing, and with making sure that they weren’t the ones risking their lives on the hunt while others loafed around in the cave. There was little advantage in reasoning clearly, while much was to be gained from winning arguments.

Continue reading »