KronoMakina t1_ja6gz2g wrote

Wouldn't 2 million years worth of ice be a lot thicker? How do we know the ice didn't melt and renew in 2 million years through warm or cold phases?

9

ShadowDV t1_ja6jrpx wrote

The ice sheet is up to 3 miles deep. Antarctica is also a desert, getting less than 2 inches of precipitation a year. And the glacial ice flows outward: imagine pouring something very viscous, like honey, on top of a bowling ball. That's kind of how continental glaciers flow.

So as the ice flows out, each year's accumulation thins, to the point where one year's layer can be around 1 mm thick. Those layers can be read like tree rings to date an ice core.
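The depth-to-age idea above could be sketched like this (Python, with a made-up starting thickness and thinning rate, purely illustrative):

```python
# Toy model: annual layers start ~20 cm thick at the surface and get
# slightly thinner with age as the ice flows outward (numbers invented).
surface_thickness_m = 0.20
thinning_per_year = 0.9999  # each year's layer is a hair thinner than the last

depth = 0.0
age = 0
while depth < 1000.0:  # walk down the top 1000 m of this toy core
    depth += surface_thickness_m * (thinning_per_year ** age)
    age += 1

print(f"~{age} annual layers in the top 1000 m of this toy core")
```

The point is just that thinning packs many more years into each metre the deeper you go.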

56

llanthas t1_ja6rs84 wrote

Would it be safe to assume that precipitation levels observed in the past 50-ish years have been consistent for millions of years?

7

ShadowDV t1_ja6sae8 wrote

In the case of Antarctica, at the South Pole, yes. Landmasses at a planet's poles are generally going to be drier, since the weather patterns there are not nearly as affected by the Coriolis effect as they are at lower latitudes.

29

Undercover_in_SF t1_ja8g2z9 wrote

To add to the other response you received, precipitation doesn't have to be the same. The annual warming/cooling of the seasons leaves a mark like a tree ring (I'm simplifying), so they can tell one year's precipitation from the next regardless of how much fell in each year.

None of it is exact, so they triangulate lots of different measurements to increase confidence.

For example, if you know a big volcano erupted 1,000 years ago, you’d look for ash at the depth the layers tell you is 1,000 years and see how accurate you were.
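That volcano cross-check is basically a percent-error calculation; a toy version in Python (all numbers hypothetical):

```python
# Hypothetical cross-check: layer counting puts the ash layer at a depth
# corresponding to 1,015 years before present, while the eruption is
# independently known to be 1,000 years ago.
layer_count_age = 1015
known_eruption_age = 1000

error_pct = abs(layer_count_age - known_eruption_age) / known_eruption_age * 100
print(f"layer chronology off by {error_pct:.1f}%")
```

Repeating that against several independently dated events gives a sense of how trustworthy the layer count is at each depth.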

12

Retepss t1_jaa0hct wrote

As long as there was some precipitation during the year. IIRC, the one assumption we have to make is that there was SOME precipitation for most years on Antarctica. Which doesn't seem unreasonable.

2

Goldenslicer t1_ja7hyz9 wrote

How do we know any particular depth corresponds to a particular age?

2

BlueFlannelJacket t1_ja7k8l3 wrote

That's where radiometric dating comes in. It uses the half-life of known isotopes to measure how long ago that ice formed, and that age, combined with the depth it was taken from, can be used to figure out the environment at a certain time.

17

Retepss t1_jaa6xnx wrote

There are lots of clues. One of the simpler ones is counting seasons. I've only seen ice cores from the Arctic, which don't go as far back, but just looking at them you can tell summer ice from winter ice. Counting the layers gives you years. There are also more accurate ways to measure the difference (you can look up what δ18O, delta-O-18, means).

You will lose count if there was a period where summers got warm enough to melt more ice than was formed during the winter, but you can then use the other methods to try and correct for that.

Even so, being off by 100 years isn't too bad when you are counting 100,000.
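The season-counting idea above could be sketched like this (Python, with an invented seasonal signal standing in for real summer/winter contrast):

```python
import math

# Toy seasonal signal down a core: one summer/winter cycle per "year",
# 50 samples per year, 40 years total (all numbers illustrative).
samples = [math.sin(2 * math.pi * i / 50) for i in range(40 * 50)]

# Count "summers" as local maxima of the signal, like counting tree rings.
summers = sum(
    1
    for i in range(1, len(samples) - 1)
    if samples[i - 1] < samples[i] >= samples[i + 1]
)
print(summers)  # 40 annual layers counted
```

A real δ18O record is much noisier, which is why a missed melt year throws the count off and other methods are needed as a correction.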

2

Pandarmy t1_ja6lpux wrote

The other reply has some great info, but I can add a bit. Another way we know it hasn't melted is radiometric dating. I'll use carbon dating as the example since I'm most familiar with it. Carbon-14 is radioactive (half-life ~5,700 years) and naturally present at about 1 part per trillion (ppt). As long as a substance has gas exchange with the atmosphere, it will keep that 1 ppt amount of carbon-14. Once it's sealed off, that number falls as the carbon decays. Since the ice sheet has a much lower fraction of C-14 (or whatever radioactive isotope is being tested), the ice must have been there for a long time.
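The dating math behind that works out to a one-line formula; a sketch in Python (the 5,700-year half-life is from the comment above, the 0.25 fraction is just an example):

```python
import math

HALF_LIFE_C14 = 5700  # years, half-life of carbon-14 per the comment above

def age_from_c14(remaining_fraction):
    """Years since gas exchange stopped, from the fraction of C-14 left."""
    return HALF_LIFE_C14 * math.log2(1.0 / remaining_fraction)

# Ice holding only a quarter of the atmospheric C-14 level has been
# sealed for two half-lives:
print(age_from_c14(0.25))  # 11400.0 years
```

The same formula works for any isotope once you swap in its half-life.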

11

Undercover_in_SF t1_ja8ggxq wrote

And accuracy decreases the farther back you go. Once you get to around 50,000 years you're at roughly 10 half-lives, so any residual C-14 is too small to measure accurately.
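A quick sanity check of that 10-half-life limit (Python, exact arithmetic):

```python
# Fraction of C-14 remaining after n half-lives is 0.5 ** n.
for n in (1, 5, 10):
    print(f"{n:2d} half-lives -> {0.5 ** n:.6f} of original C-14 left")
# After 10 half-lives (~57,000 years) under 0.1% remains, so the signal
# gets swamped by measurement noise and contamination.
```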

3

Pandarmy t1_ja9bocd wrote

Right, which is why they probably use potassium or krypton instead of carbon. I'm just not as familiar with their isotopes or half-lives, which is why I used carbon in my example.

5