A web exclusive story from Astronomy magazine

A Sickness Over The Land

Collage by Astronomy: Chuck Braasch
Why scientists believe what they do, and why the public too often doesn’t go along.
By David J. Eicher

There’s a sickness over the land. It may not, as the biblical plagues did, kill anyone. But it is corrupting millions of people with an unrealistic, warped view of the universe around them.

Every hour, every day, every week, and every month, the quality of information shared between people seems to get worse, plagued by inaccuracies, exaggerations, overreactions, and outright lies. It happens in newspapers, in magazines, on television, on the Internet, and especially on social media.

It’s a gross misrepresentation of science, the rational thinking process that allows us to interpret the world.

And it shows absolutely no signs of stopping. In fact, it seems to accelerate as time rolls on.

“The Big Bang never happened!”

“GMOs are bad for you and there’s a huge conspiracy to push them at consumers.” 

 “Global warming doesn’t exist.” 

“Vaccines are killing children.” 

In far too much of the media, let alone individual writings and postings, wholesale, anti-scientific sentiments seem to be winning the day.

Why, exactly, is this?

Illustration by Astronomy: Chuck Braasch

What is science, anyway?

You may not be familiar with the term epistemology, but it’s been in operation since the earliest days of humanity. It’s the investigation of what determines truth: How do we know the facts about the things and events that surround us? This theory of knowledge governs what we believe and how we perceive reality.

You come home after a long day of work and find that the toolbox with your hammer, specifically set aside for a project that night, is missing. Why is this? Your thought process tells you, perhaps with some investigation, what likely happened, and you believe that to be the truth.

Several methods of determining the truth exist. But for the sake of argument, we can examine the “big four,” the most common modes of reasoning that most everyone uses in some form every day.

The first is intuition, which encompasses imagination, speculation, and revelation. This most primitive thought process employs a priori reasoning, is highly dogmatic, and involves assumptions and even clairvoyance and extrasensory perception. It’s highly emotional and personal. (“I know that Frank took my toolbox because I dreamed he did” might represent this kind of thinking.)

The second method, slightly better than intuition, is authoritarianism. We have all used this one a lot in our lives, whether we’d like to admit it or not. This is how schools, governments, and organizations usually work. You believe something to be the truth because parents, teachers, clergy, physicians, politicians, or celebrities tell you it is so. Such testimony from authority is most readily accepted by those with less education and by philosophical traditionalists who resist change, innovation, and criticism. (“I know that Susie took my toolbox because Frank told me she did.”)

The third method, called rational thinking, is better than the first two. This approach includes mathematics and logic, and is based on stochastics (probabilities), heuristics (invention), and analogies (comparisons). Definitions and semantics become very important in this mode. Rules of logic prevail; for example, if A = B and both are nonzero, then A ÷ B = 1. Statistics, which rely on this kind of reasoning, are valid only when based on sufficiently large samples. (“I know that Herb took my toolbox because I think he was the only one who had the time and opportunity to do it.”)
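To see why that caveat about sample size matters, here is a minimal sketch in Python (my illustration, not from the article): it estimates the odds of heads for a fair coin from samples of different sizes, and the small samples scatter widely while the large ones settle on the true value.

```python
import random

# A minimal sketch (not from the article) of why statistics are trustworthy
# only with large samples: estimate the probability of heads for a fair coin
# from samples of different sizes and compare the spread of the estimates.

random.seed(42)

def estimate_heads(num_flips):
    """Flip a fair coin num_flips times; return the observed fraction of heads."""
    heads = sum(random.random() < 0.5 for _ in range(num_flips))
    return heads / num_flips

for n in (10, 100, 10_000):
    estimates = [round(estimate_heads(n), 3) for _ in range(5)]
    print(f"{n:>6} flips -> five estimates: {estimates}")

# Typical result: with 10 flips the estimates range from roughly 0.2 to 0.8,
# while with 10,000 flips they cluster tightly around the true value of 0.5.
```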

The fourth and best mode of determining the truth, according to scientists, is empiricism. This is the basis, in fact, of science. Empirical methods rely on coincidental observations and agreement among educated people. They are unconstrained and rely on multiple sources of information. They involve experimentation, measurements, repetition, analysis, and prediction. (“I know that John took my toolbox because the surveillance camera shows us that he did, and when shown the footage, he admitted it.”)

In the classic 1960 film Inherit the Wind, science meets tradition in a dramatic rendition of the 1925 Scopes “Monkey Trial” that occurred in Dayton, Tennessee. In reality, schoolteacher John T. Scopes was tried for teaching evolution as fact in Tennessee schools, which at the time violated state law. In the film, the actors’ personal beliefs ran opposite to their roles: conservative Catholic Spencer Tracy portrayed the liberal attorney Henry Drummond (modeled after Clarence Darrow), while liberal Unitarian Fredric March played the archconservative lawyer and presidential candidate Matthew Harrison Brady, modeled after William Jennings Bryan.
YouTube

The dawn of science

Applying this spectrum of ways of thinking about reality, about what constitutes the truth, in conflicting manners has been an issue ever since the dawn of humanity. In ancient Greece, philosophers managed education, priests served religion, and kings controlled government, while more than 95 percent of the population was illiterate. And yet these wise men set off on a world-class search for the truth. Their carefully considered title, philosopher, came from the Greek philosophos, meaning lover of wisdom.

Thales of Miletus, generally credited with being the first scientifically thinking philosopher, is shown in this woodcut from the Nuremberg Chronicles, published in 1493.
Wikimedia Commons

The road to truly scientific thinking was, of course, a long one. Thales of Miletus (c. 624 B.C.–c. 546 B.C.) discounted mythology in favor of nature study. Eratosthenes of Cyrene (c. 276 B.C.–c. 195/194 B.C.) determined Earth was a sphere and measured its circumference to a surprising degree of accuracy. Claudius Ptolemy of Alexandria (c. 90–c. 168) produced the first great star catalog (based on Hipparchus’ earlier work), which influenced society for a millennium.

During the Middle Ages, the first great educational institutions sprang up. The universities in Paris, Oxford, and Cambridge were all founded within a few decades of each other beginning in the mid-12th century. Italian Dominican Thomas Aquinas (1225–1274) blended intuition with authority and rationalism in his major works. English Franciscan William of Ockham (c. 1287–1347) proposed the law of parsimony (better known as Ockham’s Razor), a staple of science to come, that the shortest explanations and simplest hypotheses were probably correct. (“Explanations should not be multiplied unnecessarily.”) In 1440, German printer Johannes Gutenberg (c. 1398–1468) produced his first works on paper with movable type. This created a revolution in sharing information that greatly advanced education.

And then came the Renaissance. Italian Leonardo da Vinci (1452–1519) typified the revolutionary thought processes of the period and became one of the greatest self-starting scientific experimentalists. Polish astronomer Nicolaus Copernicus (1473–1543) commenced a revolution in understanding Earth’s place in the cosmos with his model that placed the Sun at the center of the solar system. Yet during the same period, French astrologer and seer Nostradamus (Michel de Nostredame, 1503–1566) became the epitome of anti-science thinking, writing more than 6,000 prophecies, most lacking definite descriptions, locations, or timings. His 5 percent record of accuracy could have been matched by anyone.
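That last claim is easy to check with arithmetic. The toy simulation below is my own illustration under made-up assumptions (each prophecy is compared against a pool of 1,000 later events, with a 1-in-20,000 chance that its vague wording loosely fits any single event); it is not a model from the article, but it shows how sheer volume plus vagueness guarantees a few percent of apparent hits.

```python
import random

# A toy Monte Carlo sketch (hypothetical numbers, not the article's):
# thousands of vague predictions are bound to score some "hits" by chance.

random.seed(1)

NUM_PROPHECIES = 6000      # roughly the number attributed to Nostradamus
EVENTS_CHECKED = 1000      # assumed pool of later events to match against
MATCH_CHANCE = 1 / 20000   # assumed chance one vague prophecy fits one event

hits = sum(
    any(random.random() < MATCH_CHANCE for _ in range(EVENTS_CHECKED))
    for _ in range(NUM_PROPHECIES)
)

print(f"{hits} of {NUM_PROPHECIES} prophecies 'came true' "
      f"({100 * hits / NUM_PROPHECIES:.1f} percent) purely by chance.")
```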

Galileo’s breakthrough

And then came Italian mathematician and astronomer Galileo Galilei (1564–1642). With his newly built telescope, turned one night from the church steeple near his house in Padua to the Moon, Galileo set off a new and more forceful revolution. His 10 theses, based on his telescopic observations and published in Sidereus Nuncius (“Starry Messenger”) in 1610, shook the world. They were:

Italian mathematician and astronomer Galileo Galilei (1564–1642) commenced the modern era of astronomy when he turned his telescope from a church steeple to the Moon, unleashing a flood of discovery. This portrait by Justus Sustermans was rendered in 1636.
Wikimedia Commons

1. The Moon is not smooth, but contains “mountains” and “seas.”
2. The Moon has sunshine on its bright areas and earthshine on its dark portions.
3. The Moon’s terminator is curved, proving Earth is a sphere. (This observation was also made from earlier lunar eclipse viewings.)
4. The Moon’s period around Earth and its rotation are about the same, so that the same side of the Moon always faces Earth.
5. The Milky Way is a dense lane of stars.
6. The Sun’s spots move in the same direction, proving the Sun rotates.
7. The planets are disks, proving they are closer than the stars.
8. Venus shows phases, indicating it revolves around the Sun.
9. Saturn has peculiarities (rings).
10. Jupiter has its own satellites (moons).

Galileo pushed the Catholic Church too far with his Dialogue Concerning the Two Chief World Systems, published in 1632, in which he spoofed the pre-Copernican understanding of the cosmos, earning him a charge of heresy. Galileo was held under house arrest throughout his remaining years. In 1992, Pope John Paul II expressed regret for the handling of the Galileo affair, stating that Galileo was “convicted in error,” some 350 years after the Italian scientist’s death.

The scientific method emerges

Galileo’s work, though it cost him his freedom, laid the cornerstone of what evolved into the modern scientific method. The basics of the scientific method are simple: First, scientists make observations. They observe everything they can, in every way they can. Scientific observations should be simple, relevant, reliable, precise, coherent, and comprehensive. Scientists then use the observations to create a hypothesis, an educated guess about what they have seen, which may be true or false. This is what many people incorrectly call a “theory,” which to scientists means something different. (Even scientists have often misused the terms; both Einstein and Darwin called their hypotheses “theory” in their first publications.)

Hypotheses need to be tested, reviewed, and refined in order to be believed. Scientists make measurements and classifications, and the larger the number of observations, the more credence the hypothesis gets. By contrast, a scientific law is an accepted fact, which is specific in application and has been tested ad nauseam. These include Boyle’s law for the behavior of gases, Newton’s law of universal gravitation, and many others.

Finally, to scientists, a theory is a large body of principles accepted in the absence of conflicting information. A theory may be rejected or revised as new observations occur. Examples are Ptolemy’s geocentric theory of the cosmos being replaced by Copernicus’ heliocentric theory and Stahl’s phlogiston theory being replaced by Lavoisier’s oxidation theory in chemistry.

English polymath William Whewell (1794–1866), depicted here in the early 1860s, coined the term scientist and made numerous contributions to science and philosophy.
Wikimedia Commons

Over time, theories receive incredible scrutiny, particularly in the modern age with millions of scientists making observations, and need to stand up universally in order to carry on as the leading idea of how a fact of science works. And yet theories with overwhelming evidence backing them often receive little support from those not educated as scientists — but more on that later.

The concept of science and scientists is a relatively modern one. It was the English polymath William Whewell (1794–1866) who, in 1833, coined the term scientist. For many centuries beforehand, natural inquiry was inextricably locked up in an amalgam of religion and philosophy, so that science and religion could never be cleanly separated, as Whewell foresaw they should be.

The steady course of Charles Darwin

One of those who came to symbolize the divergence between science and religion, at a critical time in world history, was of course the English naturalist Charles Darwin (1809–1882). A member of the famous Darwin and Wedgwood families, Charles grew up in a manor house with gardens, orchards, farms, greenhouses, stables, workshops, museums, and libraries. He studied theology, medicine, geology, history, botany, and zoology.

English evolutionary biologist Charles Darwin (1809–1882) came to symbolize the struggle of science against old traditions; this portrait was made around 1874 by Leonard Darwin.
Wikimedia Commons

In 1831, Darwin joined the HMS Beagle for a planned one-year cruise to South America, and he didn’t return until 1836! As a scientist on the voyage, Darwin visited many lands and studied people, climate, geography, mineralogy, plants, and animals. Darwin sent back 3,907 specimens and wrote 1,529 detailed reports, afterward working on and eventually publishing 13 books, including the legendary The Origin of Species and The Descent of Man.

Darwin became an expert on comparative morphology and evolution. He described how variation is a characteristic of all species. He noted that more organisms are produced than can possibly survive due to limitations in the environment. He pointed out that real and inadvertent competition always exists for resources. And he demonstrated how the transmission of variations and adaptations occurs, governed by the laws of heredity.

Long after Darwin’s time, debates centering on evolution have continued to flare. Famously, in 1925 in Dayton, Tennessee, the Scopes “Monkey Trial” pitted the archconservative William Jennings Bryan against the liberal Clarence Darrow, as dramatized in the popular film Inherit the Wind. It was astonishing enough to have such arguments taking place 60 years after the publication of Darwin’s milestone study. Yet despite the overwhelming evidence supporting the theory of evolution, the debate is still going on today, another 90 years later: A 2014 Gallup poll showed that 42 percent of Americans believe in creationism, the idea that the universe and life on Earth originated from divine acts by a supreme being.

Wikimedia Commons

Science and religion

The poll also showed that 31 percent believe in evolution guided by a supreme being, while only 19 percent believe in human evolution without any sort of godly interference. This is interesting, as much has been written about the philosophical separation between science and religion. I have lectured at the Vatican Observatory at Castel Gandolfo and have had chats with George Coyne and José Funes, two Vatican astronomers and wonderful gentlemen, and their colleagues. Many whose spirit is deeply religious, and who also pursue science, keep their science and their faith in separate containers, it seems, and have no conflict whatsoever with the intersection of the two. And that’s fine.

Size of major religious groups, 2010. Percentages are of the global population.
Pew Research Center

For many, belief in science does not necessarily mean a disbelief in God, or vice versa. I have lots of friends and associates who belong to various religions, and some who are atheists or agnostics. But for me, I see science as a purely empirical exercise in knowing the clearest, sharpest view of the truth, and I want evidence for what I believe. That is a purely scientific viewpoint in my mind, and so I follow it to the logical conclusion.

I have evidence of atoms, of matter, of energy, of the countless discoveries of science, of Earth, of stars and galaxies, and I adore nature with the evidence that exists. Spiritualism, in my mind, can exist as a pure love and concern for fellow members of the human race — and for all the living beings on our planet. That, to me, is spirit. I don’t need anything more.

The war against scientific thinking

Regardless of religion, the gap between scientific discovery and widespread public acceptance is stunning. “Ordinary folks” certainly understand the scientific properties of everyday things like tables and chairs, farms and cities, cats and dogs, and adults and children. They live with the fruits of applied science countless times every day, perhaps most frequently when they pull out their cell phones.

But they do not understand so well the other realms that scientists study — atoms and molecules, electrons and quarks, genes and nucleotides, and galaxies and dark matter. These subjects may cause concern and even fright among some folks who have not been schooled in these areas. Yet lack of knowledge does not prevent strong opinions. Some people fear the unknown, resist change, or tend to be comfortable in philosophically conservative, reactionary modes.

Nearly everyone believes in applied science when they are lying in a hospital bed. Why not the rest of the time?

Scientists would say that if people were better educated, they would embrace science and understand it far better. Teachers often ask, “Has education been a failure due to ignorance or due to apathy?” People who fail to see the value of education respond with, “I don’t know, and I don’t care!” 

The American comedian George Carlin (1937–2008) summed it up with: “Think of how stupid the average person is, and realize half of them are even stupider than that.” 

We all need to engage in more general education. The lack of it is a serious problem that limits what we as humans can accomplish.

With more education, we may prevent the kind of movement described in a March 2015 National Geographic cover story as “The War on Science.” The story, by Washington Post science writer Joel Achenbach, correctly describes how “we live in an age when all manner of scientific knowledge — from safety of fluoride and vaccines to the reality of climate change — faces organized and often furious opposition.”

For a variety of reasons — lack of dedication to advanced education, constant ridiculous streams of entertainment, and many others — Americans are especially caught up in a web of misinformation. Entire movements exist in which factions believe the Moon landings (and the recent Pluto flyby, too!) were faked. Rather than understanding the basics of genetics and the fact that organisms have been genetically modified wholesale for eons through traditional breeding, 57 percent of U.S. adults think that genetically modified organisms are unsafe to eat (compared to 11 percent of scientists), according to a 2014 Pew Research Center study.

The science of climate change, of introducing more carbon dioxide into Earth’s atmosphere and seeing a corresponding rise in mean temperatures (and resulting chaos), is relatively basic. The entire field of climate scientists agrees, with few exceptions, that global warming is a human-induced problem. Yet, according to another recent Pew Research Center study, only 40 percent of Americans believe that global warming has been driven by fossil fuel burning, in contrast to the 97 percent of scientists who do.
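As a rough illustration of how basic the core arithmetic is, here is a back-of-envelope sketch in Python. It uses the standard simplified logarithmic expression for the radiative forcing from added CO2 together with an assumed climate sensitivity of about 0.8 kelvin of warming per watt per square meter; these are textbook approximations of my choosing, not figures from the article.

```python
import math

# Back-of-envelope warming estimate (a sketch, not the article's calculation):
# simplified CO2 radiative forcing, delta_F = 5.35 * ln(C / C0) W/m^2,
# multiplied by an assumed climate sensitivity of ~0.8 K per W/m^2.

def co2_warming_estimate(c_ppm, c0_ppm=280.0, sensitivity_k_per_wm2=0.8):
    """Rough equilibrium warming (kelvins) for raising CO2 from c0_ppm to c_ppm."""
    forcing_wm2 = 5.35 * math.log(c_ppm / c0_ppm)
    return sensitivity_k_per_wm2 * forcing_wm2

# Preindustrial (~280 ppm) to today's ~420 ppm, and to a doubling at 560 ppm:
for level in (420.0, 560.0):
    print(f"CO2 at {level:.0f} ppm -> roughly {co2_warming_estimate(level):.1f} K of warming")

# Prints roughly 1.7 K for 420 ppm and 3.0 K for a doubling, in the same
# ballpark as the published consensus estimates, from two lines of arithmetic.
```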

One of the great anti-scientists of the Renaissance, the Frenchman Michel de Nostredame (1503–1566), better known as Nostradamus, made countless astrological and supernatural predictions, or “prophecies,” and about 5 percent of them came true, more or less. If you make thousands of very generalized predictions, you too can “predict the future” with about 5 percent accuracy.
Wikimedia Commons

Recent flaps over the alleged connection between vaccines and autism, a bonfire fanned by celebrities like actress Jenny McCarthy, are legendary. The anti-vaccine movement took off following a 1998 paper published in the British medical journal The Lancet that supposedly linked a vaccine to autism. But the study was later discredited and retracted. The idea that vaccines cause autism nonetheless took hold among a vocal minority and has gained momentum ever since. Kids have died because of this kind of false information.

And the same huge rift exists between scientists and the general public in a thousand other areas. That 2014 Gallup poll showed that more than a third of Americans believe that human beings have existed just as they are now since the beginning of time. Science television programming devotes as much coverage to ghosts, UFOs, time travel, and mythological creatures as it does to anything that might actually begin to resemble science in a lab.

And these are just some examples of the misinformation that goes on every day on TV and throughout the Internet, and even in the press offices of institutions and universities. Hungry to make as big a splash as possible with their latest press release in order to sustain funding, these offices often cross the line, exaggerating the importance of studies. It gets to the point where nearly every scientific paper about to be published is hailed as resetting what we know about some small portion of a research area.

This kind of runaway hyperbole and wishful thinking about the magnitude of scientific results is disastrous. It’s contaminating and dumbing down the minds of countless readers every day. Unfortunately, science publications often jump right on the bandwagon, unable or unwilling to keep enough expertise on staff to set these claims in context. They sound the alarm, proclaiming that numerous findings each week and each month are radically rewriting what we know. It’s often way, way out in front of reality.

In the current world of science journalism, in countless blogs, newspapers, and magazines, nearly everything comes off as a eureka moment.

But science is a slow, gradual, self-correcting process of accumulating information to build those hypotheses and, eventually, those laws and theories.

Is all hope lost?

Is the cause of real scientific understanding lost? As I mentioned, it does seem that entertainment and communications are currently pushing the average person’s understanding of science, if anywhere, in reverse. But should we be pessimistic about the future of science in society? It all depends on the timescale at which you look.

If we go back half a millennium, to the year 1500, the average person was far worse off in understanding the world around her. The first portable watch had just hit the streets. Copernicus had yet to unveil the heliocentric theory. Magellan was just about to set off on the circumnavigation of the globe. Erasmus was readying himself to test the church’s thinking on free will. Disease was rampant, and life was hard. Understanding of life on Earth and the greater heavens around us was stuck in a relatively primitive state.

There is hope for the future. Now, 500 years later, nearly half of Americans, in some contexts, understand and embrace the real tenets of science. (Even the Gallup poll showed that only 28 percent of 18- to 29-year-olds in the U.S. believe in creationism over evolution.) But should we hope for a more universal understanding of science, and better education, for a larger percentage of the population?

At the very least, 10 thousand billion billion (10²²) stars exist in the cosmos. The universe, in fact, may be infinite. We now see planets around a large number of stars near us in the Milky Way Galaxy. I sometimes wonder, sitting outside under the stars at night, if planets exist where most all of the beings are believers in science and in math — the language and governing laws of the universe. Maybe there are just a few, huh?

I know that knowledge must move in the right direction as time goes on. But right now we are in a sad state, in which the will to separate bad information from good is undercut by the hunger for click-throughs and revenue, and in a wide-open digital market where anyone can play expert.

I’m usually an optimist. But when I’m not, when I’m in one of those down-low states, I can only gaze into the sky at distant stars and think of those words of Eric Idle’s, from the famous “Galaxy Song”: “And pray that there’s intelligent life somewhere out in space / ’Cause there’s bugger all down here on Earth.”

David J. Eicher is Editor-in-Chief of Astronomy, author of 21 books on science and history, a board member of the Starmus Festival, and President of the Astronomy Foundation.