Distance to the Sun

Posted by Jane Kincaid
Grade level: 7-9
School: Silverado Middle School
City: Napa
State/Province: CA
Country: USA

Message: How did astronomers first determine the distance to the Sun, and what methods were used? How did early astronomers determine the distance from the Earth to the Sun, and what method is used today to determine distances in space?
(First, a quick note: this is a lengthy explanation. At the end is a series of links with more terse descriptions answering your question about distances. I don't know of any good sources with info on how the Earth-Sun distance was first found, so for that I wrote a longer description.)

By the late 1600s, astronomers had determined the distances to the other planets in the solar system relative to the Earth's distance from the Sun. For example, they knew that Jupiter was about 5 times as far from the Sun as the Earth. The problem was that they didn't know how far away the Sun was! Without knowing the actual distance from the Earth to the Sun (called an Astronomical Unit, or AU), they didn't know how far away the other planets were. We now know that an AU is about 150 million kilometers. But how was it first found?

In 1653, an astronomer named Christiaan Huygens (pronounced "Hoy-gens") was the first to find this distance. He used a very clever idea, but as you'll see in a moment, he had to make a guess about one of his numbers. By pure blind coincidence, he guessed correctly, and so his measurement of the AU is essentially correct. However, since his determination was not rigorous, the actual first measurement is usually credited to Cassini, who used a method involving the parallax of Mars. Cassini did this in 1672.

So how did Huygens do it? He knew that Venus shows phases when viewed through a telescope, just like our own Moon does. He also knew that the actual phase of Venus depends on the angle it makes with the Sun as seen from the Earth. When Venus is between the Earth and the Sun, the far side is lit, and so we see Venus as dark. When Venus is on the far side of the Sun from the Earth, we can see the entire half facing us as lit, and Venus looks like a full Moon. When Venus, the Sun and the Earth form a right angle, Venus looks half lit, like a half Moon.

Now, if you can measure any two internal angles in a triangle, and know the length of one of its sides, you can determine the lengths of the other sides. Since Huygens knew the Sun-Venus-Earth angle (from the phases), and he could directly measure the Sun-Earth-Venus angle (simply by measuring Venus' apparent distance from the Sun on the sky), all he needed was the distance from the Earth to Venus. Then he could use some simple trigonometry to get the Earth-Sun distance.

This is where Huygens tripped up. He knew that if you measure the apparent size of an object, and know its true size, you can find the distance to that object. Huygens thought he knew the actual size of Venus using such unscientific techniques as numerology and mysticism. Using these methods he concluded that Venus was the same size as the Earth. As it turns out, that is correct! Venus is indeed very close to being the same size as the Earth, but in this case he got it right by pure chance. And since he had the right number, he wound up getting just about the correct number for the AU. Still, because his method was not rigorous (that is, not completely scientifically grounded), he is not usually given credit for being the first to find the value of an AU. In 1672, Cassini used a method involving the parallax of Mars to get the AU, and his method was correct.
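To make the triangle argument concrete, here is a minimal numerical sketch of it in Python. The particular inputs (Venus' apparent size, the measured elongation, Venus' true diameter) are modern, assumed values chosen for easy checking, not Huygens' own data:

```python
import math

# Step 1: the Earth-Venus distance from Venus' apparent size.
# Huygens *assumed* Venus is the same physical size as the Earth;
# here we simply use Venus' modern diameter.
venus_diameter_km = 12_104          # true size of Venus
apparent_size_arcsec = 24.1         # roughly what a telescope shows near
                                    # greatest elongation (assumed value)
apparent_size_rad = apparent_size_arcsec * math.pi / (180 * 3600)
earth_venus_km = venus_diameter_km / apparent_size_rad
print(f"Earth-Venus distance: {earth_venus_km:.3e} km")

# Step 2: two angles of the Sun-Earth-Venus triangle.
# When Venus looks exactly half lit, the Sun-Venus-Earth angle is 90 deg;
# the Sun-Earth-Venus angle (the elongation) is measured on the sky.
angle_at_venus_deg = 90.0           # from the observed phase
angle_at_earth_deg = 46.0           # measured elongation (assumed value)
angle_at_sun_deg = 180.0 - angle_at_venus_deg - angle_at_earth_deg

# Step 3: the law of sines gives the Earth-Sun distance (the AU).
# The side opposite the angle at Venus is Earth-Sun; the side opposite
# the angle at the Sun is Earth-Venus.
earth_sun_km = earth_venus_km * (math.sin(math.radians(angle_at_venus_deg)) /
                                 math.sin(math.radians(angle_at_sun_deg)))
print(f"Earth-Sun distance:   {earth_sun_km:.3e} km (true value ~1.496e8 km)")
```

Run as written, this comes out within about half a percent of the modern value of the AU, which shows how much of the answer hinges on guessing Venus' true size correctly.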
Now, to answer your other question: the most common way to measure the distance to nearby stars is parallax. Try this simple experiment: hold a finger up about 10-20 centimeters from your nose. Now alternately blink your eyes, so that at first you are looking at your finger through your left eye, then your right, back and forth. See how your finger appears to jump back and forth? That's because your eyes are separated from each other by a few centimeters. That effect is called parallax. If you know the separation between your eyes, and can measure the angle by which your finger appears to jump, you can calculate the distance using trigonometry.

The farther away something is, the wider the separation must be between the two observations (try looking at a telephone pole across the street while blinking your eyes; the pole is so far away that it doesn't appear to move at all). Stars are very far away, so we need observations taken very far apart. Luckily, the Earth's orbit is very wide! If you measure the position of a star, then wait six months for the Earth to go halfway around its orbit, the baseline between the two measurements is 2 AU (aren't you glad we know that distance now?). Stellar distances can be measured with fair accuracy this way out to several hundred light years.

Once you measure a star's distance, you can use it to measure other stars too far away for parallax. Say you measure the distance to a nearby star, and you also measure how bright it is. If you can find another star just like it, but much farther away, you can measure how much fainter the farther star is, and then figure out its distance, because apparent brightness drops with the square of the distance. This method can actually be used to determine the distance to nearby galaxies, which have very bright stars in them.
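Here is a minimal sketch of the parallax arithmetic described above, assuming a made-up star whose position appears to shift by 1.5 arcseconds over the 2 AU baseline (roughly what the very nearest stars actually show):

```python
import math

AU_KM = 1.496e8                     # the Astronomical Unit, in km
LIGHT_YEAR_KM = 9.461e12            # one light year, in km

# Observe a star, wait six months, observe again: the baseline between
# the two observations is the diameter of the Earth's orbit, 2 AU.
baseline_km = 2 * AU_KM

# Suppose the star appears to shift by 1.5 arcseconds between the two
# observations (an assumed value).
total_shift_arcsec = 1.5
total_shift_rad = total_shift_arcsec * math.pi / (180 * 3600)

# For such tiny angles, distance is simply baseline / angle.
distance_km = baseline_km / total_shift_rad
print(f"Distance: {distance_km:.3e} km "
      f"= {distance_km / LIGHT_YEAR_KM:.1f} light years")
```

That works out to about 4.3 light years, which happens to be about the distance to the nearest star system.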
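The brightness-comparison (standard candle) step described above relies on the inverse-square law: a star that looks N times fainter than an identical twin is the square root of N times farther away. Here is a sketch with made-up values:

```python
import math

# Distance of the nearby star, already known from its parallax (assumed).
nearby_distance_ly = 4.3

# The distant "twin" star looks 10,000 times fainter (assumed measurement).
brightness_ratio = 10_000.0

# Inverse-square law: fainter by N means farther by sqrt(N).
far_distance_ly = nearby_distance_ly * math.sqrt(brightness_ratio)
print(f"Distance to the far star: about {far_distance_ly:.0f} light years")
```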
At this point this explanation is getting too long. Here are some links that explain all this as well, and also point to other methods for determining astronomical distances.