News aggregator

Preparing for the Inevitable Sea-Level Rise Caused by Climate Change - The Atlantic

Featured News - Mon, 02/29/2016 - 17:47
Scientists are struggling to figure out the timeline for how climate change will affect vulnerable waterfront communities. The Atlantic talks with Lamont's Maureen Raymo about the challenges.

Trials & Tribulations of Coring the Agulhas Plateau

When Oceans Leak - Sun, 02/28/2016 - 14:03

Sedimentologists Thibaut Caley of the University of Bordeaux and Andreas Koutsodendris of the University of Heidelberg, along with Deborah Tangunan, a paleontologist from the University of Bremen, work in the core lab aboard the JOIDES Resolution. Photo: Tim Fulton/IODP

Read Sidney Hemming’s first post to learn more about the goals of her two-month research cruise off southern Africa and its focus on the Agulhas Current and collecting climate records for the past 5 million years.

A lot has happened since my last post. As we were heading south to the Agulhas Plateau, one of the scientists had to be evacuated by helicopter for medical treatment. We were within a day of the Agulhas Plateau site and had to go back to near Port Elizabeth for the handoff and then return to drill the plateau. The weather at the plateau was bad enough that we were probably going to have a delay anyway, so we didn’t lose too much time. Our colleague is fine now, and our drilling on the Agulhas Plateau has been a success.

We have had some trials and tribulations because of the large ocean swells and because the sediments do not have as strong a physical-property signal as those at the previous site. Both of these factors increased the challenge for the stratigraphic correlators, so it has been a real cliff-hanger to find out whether we can splice together a continuous section. Because of the low signal-to-noise ratio of the physical properties, the scanning took longer and the records for correlating are not quite as clear. This has created a backup in the workflow, and it means the descriptions and scanning (and some sampling) of the split cores will continue as we begin our transit. And it means that until all this is completed we will not know for sure how continuous a record we have. We are reasonably sure we will have few or no gaps in the splice, but it will be nice to see it all completed.


The end of a fresh core, just brought aboard the JOIDES Resolution. Photo: Tim Fulton/IODP

Meanwhile, we came here thinking that we would get a high accumulation rate record for the last million years, but the accumulation rates are modest between the surface and about 100 meters – approximately 2 cm per thousand years. Below that, they turned out to be really quite nice, approaching 7 cm per thousand years through much of the Pliocene. The low accumulation in the Pleistocene is a disappointment, as there is great interest in the mid-Pleistocene climate transition, but it does look like it is a continuous record. The higher accumulation in the older sediment is exciting because the early Pliocene is a warm time in Earth’s history and the most recent interval with global temperatures as warm as modern times. So we Earth scientists are quite eager to understand everything we can about this interval. The Agulhas Plateau site, near where the Agulhas Current swings back toward the east, is well situated to provide some important information about linkages of different factors in the climate system.

Again at this site, as with the previous site, the development of the time scale has been fun and exciting to watch. We have four groups of organisms that are aiding in our time scale – in addition to foraminifera and nannofossils, there are abundant diatoms and dinoflagellates here. This is great for the biostratigraphy and also great for our participants whose post cruise research will use diatoms for documenting paleo-environmental changes. The magnetic stratigraphy started out looking bleak because the weak signal was messed up by the coring process in the first hole, due to the ship’s heave in the waves.  They almost gave up, but the second core preserved a great record. So we are going to have an excellent time scale for this site as well.


Expedition 361’s coring sites. APT is the Agulhas Plateau. NV is the Natal Valley. Credit: IODP

Meanwhile the saga continues in our quest to get permission from Mozambique to drill in their waters. We have word from our contact in the American embassy that the form has been signed by the Foreign Ministry and is now with the ministry that deals with fisheries. While that process continues, we have to start toward our next site. Our decision is to head toward the Zambezi site, as it is going to take us six days to get there anyway. If we don’t get permission before we arrive, we’ll have to turn around and head for the Cape site.

The Zambezi and Limpopo sites are near major rivers. We hope they will give us a record of the terrestrial climate variability in southeastern Africa through the last 5 million years that can be compared with the Agulhas Current and other oceanographic factors. The hope is that we will get a continuous record with a variety of proxy data for factors such as precipitation, runoff, distribution of vegetation on the landscape, and surface ocean temperatures. The coring is going to be fast at these sites because they are much shallower. In the happy case that we get to drill there, we will then have another long transit to finish off the analyses.

Sidney Hemming is a geochemist and professor of Earth and Environmental Sciences at Lamont-Doherty Earth Observatory. She uses the records in sediments and sedimentary rocks to document aspects of Earth’s history.

Snowpacks Relied on for Water are Declining - WAMC Academic Minute

Featured News - Sun, 02/28/2016 - 12:00
Justin Mankin, a postdoctoral fellow at Lamont, describes how a changing climate may change the way cultures get their water in the spring and summer.

7 Years of Earthquakes in Japan in 52 Seconds - TimeOut

Featured News - Thu, 02/25/2016 - 12:00
We know an earthquake involves movement, but what if you could capture these seismic tremors in sounds too? This thought experiment proved to be the catalyst for the Seismic Sound Lab, a project by Lamont geophysicist Ben Holtzman.

The Science Behind Ethiopia’s Hunger Crisis - Mother Jones

Featured News - Thu, 02/25/2016 - 12:00
Ethiopia's last mega-droughts killed hundreds of thousands. Could the same thing happen again? Lamont's Park Williams and Richard Seager weigh in on why the drought is not a surprise.

An Airborne Look Through the Ice - The Antarctic Sun

Featured News - Wed, 02/24/2016 - 13:00
Scientists are working to fill in one of the largest remaining blank spots on ocean charts: the sea floor beneath Antarctica's Ross Ice Shelf. Lamont-Doherty's Kirsty Tinto discusses the IcePod and how it's mapping that area.

Changing Faces on the Ice - Nature

Featured News - Wed, 02/24/2016 - 12:00
Diverse faces are coming to work in the polar regions, Lamont's Robin Bell tells Nature.

Coral Reef Growth Is Declining. Is There Hope? - Christian Science Monitor

Featured News - Wed, 02/24/2016 - 12:00
Scientists find more evidence that coral reefs are suffering from environmental changes. But, they say it's not too late. Lamont's Bärbel Hönisch discusses the possibilities.

Sailing into a Storm as We Head for the Agulhas Plateau

When Oceans Leak - Fri, 02/19/2016 - 16:49
 " data-cycle-speed="750" data-cycle-center-horz="true" data-cycle-caption="#gslideshow_captions" data-cycle-caption-template="{{alt}}" >
The team aboard the JOIDES Resolution collected the first four cores of Expedition 361 from the Natal Valley site. Here, scientists prepare to open the first. (Tim Fulton/IODP) To drill down into the sea floor, the ship uses a large rig and a professional drilling crew. ( Jens Gruetzner, Alfred-Wegener-Institut for Polar and Marine Research) Professional rig personnel bring the first core aboard. Each segment is 9.5 meters long and is sectioned into smaller pieces for analysis and storage. (Tim Fulton) Scientists work the core catcher, which holds each segment of core in as it lifted from the sea floor. (Tim Fulton) Once the cores are split, they are photographed in shipboard imaging equipment called a Section Half Image Logger. (Tim Fulton) The shipboard labs are ready for scientists to go to work. Kaoru Kubota of the University of Tokyo and Xibin Han of China work on reports in the core lab. (Tim Fulton) Sedimentologists Julien Crespin of the University of Bordeaux and Alejandra Cartagena-Sierra of Notre Dame take down descriptions of the core. (Tim Fulton) Nambiyathodi Lathika, a physical properties specialist from India's National Centre for Antarctic and Ocean Research, enters core data at the sample table. (Jens Gruetzner) Jeroen van der Lubbe of the University of Amsterdam works with a cyrogenic magnetometer to analyze the magnetic properties of a sample. (Jens Gruetzner) A map of the Agulhas Current, which the scientists of Expedition 361 are studying along with southern Africa's climate history. (Courtesy of Arnold Gordon)
<
>
The team aboard the JOIDES Resolution collected the first four cores of Expedition 361 from the Natal Valley site. Here, scientists prepare to open the first. (Tim Fulton/IODP)

Read Sidney Hemming’s first post to learn more about the goals of her two-month research cruise off southern Africa and its focus on the Agulhas Current and collecting climate records for the past 5 million years.

We finished up at our first core site yesterday, and now we are steaming south toward the Agulhas Plateau. The groups presented their summaries this morning, and the results from this site are awesome.

Inevitably there is a gap between each sediment core because of how the cores have to be taken. The drilling crew drills down to the level of the previous core’s penetration and then sends the piston core down again. There are potential coring artifacts, such as “suck-in” of sediment at the bottom, “fall-in” of sediment at the top (these are descriptive terms), or loss of sediment because the core catcher didn’t catch. That is the reason for triple – and in this case quadruple – holes at each site. The stratigraphic correlators, Steve Barker and Chris Charles, take data from the physical properties measurement tracks and work fast to determine how the cores that are coming up can be correlated to previous cores. They find the gaps and work with the drillers to ensure that the gaps from all the holes do not overlap. With the help of the lithologic description team, they also avoid parts of the core that have been disturbed by the drilling.

At the Natal Valley site, the correlators were able to achieve a continuous splice going back more than 5 million years!  We obtained samples older than that, but they are “floating”. The splice, like it probably sounds, uses the overlapping physical features in the cores to identify a combination of cores that, when pieced together, yield a full continuous record. This splice is where most of the sampling and measurements will be conducted at the post-cruise sampling party. That will probably take place in September to give enough time for the cores to be shipped to Texas A&M and, we hope, scanned with an XRF scanner in order to construct the best sampling plan.


Women scientists and technicians aboard the JOIDES Resolution for IODP Expedition 361 on UN Women in Science Day. (Tim Fulton)

We still do not have permission to drill in Mozambique waters. This is causing considerable anxiety for the team, and by steaming south, we will definitely not be able to drill our northern-most planned site. However, if permission comes before we finish the Agulhas Plateau site, we could still meet our Zambezi and Limpopo objectives. The Zambezi and Limpopo are major rivers that drain from the African continent. We hope to capture sediment coming from those two rivers to answer questions about rainfall and runoff and weathering.

The Agulhas Plateau is also an exciting site. It is a little bit north of the sub-tropical front, which is the northern boundary of the Antarctic Circumpolar Current. Today, it is under the Agulhas retroflection, where the Agulhas Current swings back toward the east (see the map at the end of the slideshow). Our goal is to understand how the position of the sub-tropical front and the Agulhas retroflection have changed through the last 5 million years, and how those changes are related to climatic variability in southern Africa.

Allison Franzese’s Ph.D. research used a time-slice approach to compare the modern composition of sediments washed in from the continents with those deposited during the last glaciation about 20,000 years ago (ice at that time covered what today is New York City). She has suggested that the path of the Agulhas retroflection was very similar between modern and glacial times. Her results are enigmatic because they indicate both a weakening of the current and a similar path for the retroflection, and these observations are inconsistent with what is predicted from physical oceanographic modeling. The cores we collect here should allow her to extend her studies back to 5 million years and achieve a much greater understanding of how the system has worked under a range of climate conditions, particularly when combined with results from the Natal Valley site and the Cape site (the final site of the cruise).

As an aside, I am pretty pleased that even though we are sailing through a storm with about 30 knot winds right now and the ship is swaying, I’m still feeling pretty good. Apparently this could change – we are heading toward the “roaring 40s”.

Sidney Hemming is a geochemist and professor of Earth and Environmental Sciences at Lamont-Doherty Earth Observatory. She uses the records in sediments and sedimentary rocks to document aspects of Earth’s history.

Sea Level Rise Could Put NYC's Proposed Transit System Under Water - Vice News

Featured News - Fri, 02/19/2016 - 09:31
Lamont geologist Klaus Jacob says that while the proposed Brooklyn-Queens Connector project solves desperate transportation needs, the problem is that it runs along current and future flood zones.

Can Germany's Renewable Energy Revolution Be Replicated in the US? - Bulletin of the Atomic Scientists

Featured News - Thu, 02/18/2016 - 12:00
As governments around the world invest in new energy policies and climate strategies, none has gone as far as Germany. Could the model be replicated? Lamont adjunct research scientist Beate Liepert explores the possibilities.

How the Climate Challenge Could Derail a Brilliant Human Destiny - New York Times

Featured News - Mon, 02/15/2016 - 12:00
A conversation on the importance of sustained engagement on a big challenge, whether intellectual, as in revealing spacetime ripples, or potentially existential, as in pursuing ways to move beyond energy choices that are reshaping Earth for hundreds of generations to come. Cites Lamont's review and research by a group that included Lamont Adjunct Senior Research Scientist Anders Levermann.

6 Million Years of Sediment, Studded with Tiny Fossils

When Oceans Leak - Fri, 02/12/2016 - 21:25

Jeroen van der Lubbe examines the first sediment core of Expedition 361 brought up by the team aboard the JOIDES Resolution. Photo: Sidney Hemming

Read Sidney Hemming’s first post to learn more about the goals of her two-month research cruise off southern Africa and its focus on the Agulhas Current and collecting climate records for the past 5 million years.

We have our first core! The team pulled up 254.1 meters of sediment from the Natal Valley site off southern Africa, near the start of the Agulhas Current. We think we have about 6 million years of history in that core that should be able to tell us details about how the region’s currents and climate changed through time.

The whole core (IODP 361 U1474A) is actually several cores. Each is 9.5 meters long and is cut into sections that are 1.5 meters for analysis and storage. The top of each of those cores was jostled by the coring process, but we hope we will be able to fill the gaps with cores from the two other holes we plan to drill at the Natal Valley site. It has become standard in the International Ocean Discovery Program’s (IODP) paleoceanography drilling to core three holes, deliberately offsetting the gaps, in order to have a full record from the site. This presents a challenge for our stratigraphic correlators, Steve Barker and Chris Charles, but they are definitely up for it. They spent the time in port and during transit working hard to master the software.  I understand it is powerful, but not easy to work with (a sign on one of their computers says: “exercise extreme patience”).


Steve Barker’s computer used for correlations comes with a warning: “Caution: exercise extreme patience.” Photo: Sidney Hemming

Seeing our first core come up was very exciting. As it was happening, we were getting age estimates in real time from the paleontologists, and not too much later from the paleomagnetists. The paleomagnetic measurements appear robust, and they show several magnetic reversals, so we can use the known magnetic reversal time scale to help date each part of the core. The foraminifera and calcareous nannofossil species changes in the core catchers (more on this below) are providing a similar estimate of the ages as the paleomagnetics. This is so exciting to watch in real time, I keep thinking about whether there is a way we could reenact this in a classroom setting.

Now we are on the second hole. Hole B is primarily being taken for squeezing out pore waters – this is the water captured in pores of the deep sediment. We hope it can tell us about the salinity and oxygen isotope composition of the water long ago. The oxygen isotope composition of glacial water is important for understanding how much ice there was, as well as the temperature of the deep ocean. Sophie Hines, a graduate student working with Jess Adkins at Caltech, is leading the effort. She and the other four geochemists, including Lamont’s Allison Franzese, are working hard to get the pore waters squeezed. It is a tough operation, and the presses are not always as cooperative as one might wish. The geochemists decorated the most cantankerous press with a photo of Jess, who wrote the proposal for the pore waters project, to ensure they remember whom to blame.

I mentioned the foraminifera and calcareous nannofossils (calcareous means they build their shells with calcium carbonate) that were found in the core catchers. At the bottom of the core barrel – remember Lisa’s analogy to the straw in the milk shake – there is a mechanism that is open when the barrel is going down into the mud. When the core barrel has penetrated into the sediment and the rig starts to pull it up, that mechanism snaps shut, thereby catching the core and keeping it from falling back out, hence the term core catcher. Some of the sediment goes into the core catcher, and it’s a bit messy, so it is taken immediately and used for examining the microfossil content. Sometimes it’s disappointing, but in this case, the microfossils except for the siliceous ones are working out really well. More about the siliceous ones in another post.

Sidney Hemming is a geochemist and professor of Earth and Environmental Sciences at Lamont-Doherty Earth Observatory. She uses the records in sediments and sedimentary rocks to document aspects of Earth’s history.

Court Ruling on Clean Power Plan a Setback, But…

The 2015 Paris Climate Summit - Wed, 02/10/2016 - 16:47

Big Bend Power Station, a coal-fired plant, near Apollo Beach, Fla. Photo: Wikimedia Commons

The U.S. Supreme Court this week put a hold on one of the key programs in the United States’ efforts to control CO2 emissions and combat global warming. The decision puts aside new regulations to control emissions from power plants until a challenge from more than two dozen states is resolved in federal appeals court.

The court’s 5-4 decision to postpone implementation of the Clean Power Plan represents a clear setback for the Obama administration’s efforts to combat climate change; but the damage to the U.S. ability to meet pledges it made at the Paris climate summit in December “is less than it might seem,” says Michael Gerrard in a commentary posted on the Sabin Center for Climate Change Law’s website.

“That is not because the Clean Power Plan wasn’t important; it is because the plan didn’t do nearly enough,” says Gerrard, director of the Sabin Center.

Gerrard notes that the plan’s emissions reductions won’t begin until 2022, meaning they won’t play a role in meeting the nation’s stated goal of reducing carbon emissions by 17 percent by 2020. Even beyond that date, the plan alone won’t be enough to meet the goal of reducing emissions by 26 to 28 percent by 2025. That, and future reductions, will depend on many other measures. Those would include higher efficiency standards for buildings and appliances and greater efforts to reduce energy consumption in the industrial and transportation sectors.

“In sum,” Gerrard writes, “while the Clean Power Plan is the biggest game in town in terms of achieving the Paris goals, it is by no means the only game in town. While we express our justifiable fury over the Supreme Court’s action, we need to bear in mind that there are many other things that the U.S. must do in the next several years to control greenhouse gas emissions.”

You can read the full commentary at the Sabin Center’s website.


Gearing Up for Our First Cores

When Oceans Leak - Mon, 02/08/2016 - 14:56

Bubba Attryde, a core technician, shows scientists on the JOIDES Resolution some of the ship’s drilling tools. Tim Fulton/IODP

Read Sidney Hemming’s first post to learn more about the goals of her two-month research cruise off southern Africa and its focus on the Agulhas Current and collecting climate records for the past 5 million years.

Our first day on the ocean was pretty rough. We left the harbor in Mauritius into high winds and choppy seas, and I don’t think I was alone in feeling pretty miserable.  I woke up the next day to calm seas and a much better perspective.

We have been busy with meetings, training sessions, and planning for the core flow, and I think people are getting close to being ready for the 12-hour shifts. My shift is 3 p.m. to 3 a.m., and my co-chief scientist Ian Hall’s is the opposite. It works out pretty well relative to our home clocks (when I start my shift, it’s 8 a.m. back in New York), and we’ll have significant overlap. I plan to get started by noon, and Ian will hang around until 6 or so before going to bed. We have decided we’ll take a break for exercise—should be a good strategy.

The staff is wonderful on the ship. They feed me great meals, and there is even an espresso machine right outside the science office where I sit. Today Kevin Grieger, our operations manager, gave us a tour of the bridge, the drilling rig and the core shack, where we met Bubba Attryde, who has been the core specialist since Glomar Challenger days and continues to make innovations. We went down through the motors and pumps, past the moon pool, and out to the JOIDES Resolution’s helideck.

The helideck has a special role this cruise. On March 26, Ian Hall and Steve Barker will be running in the IAAF/Cardiff University World Half Marathon Championships. It requires 328 laps around the deck, which is noisy and hot. They are doing it to raise money for a small South African charity called the Goedgedacht Trust, which promotes education to help poor rural African children escape grinding poverty. Ian has learned that the money raised will help bring solar power to schools. When we reach Cape Town, some of the children plan to tour the ship.

It is now official that we will start with the Natal Valley site while we wait for clearance from Mozambique to work on what would have been our first site.

The Natal Valley is at the beginning of the Agulhas Current, where the waters flowing through the Mozambique Channel and the East Madagascar Current come together and flow along the southern Africa coast. A central goal of the expedition is to understand the history of the Agulhas Current and its role in climate variability, and this site could help us characterize how the microorganisms and the land-derived sediments it carries have changed over the last 5 million years.

Recently published evidence from the past 270,000 years from very close to the Natal Valley site also shows that there have been significant changes in rainfall in southern Africa on millennial time scales. We are very interested in getting a longer record of rainfall changes with this expedition. So in effect, we have the dual goals of understanding the nearby climate record from Africa and understanding the ocean currents below which the core is located—both the Agulhas Current and deep water circulation, which currently flows north along the western Natal Valley and is the reason for the sediment “contourite” accumulation that we are coring.

We will be getting to the Natal Valley site about 8 p.m. local time on Tuesday, so we should have cores coming in before daylight on Wednesday. You can feel the excitement start to build. Our staff scientist, Leah, has organized everybody well. The groups gave reports on their methods this morning and will turn in drafts of their methods before we get to the first site. It’s getting close!

Sidney Hemming is a geochemist and professor of Earth and Environmental Sciences at Lamont-Doherty Earth Observatory. She uses the records in sediments and sedimentary rocks to document aspects of Earth’s history.

On the Surface, Feeling Further Away from the Ocean than Ever

Sampling the Barren Sea - Mon, 02/08/2016 - 11:11

By Frankie Pavia

How far is five kilometers, vertically? We leaned over the edge of the boat, staring into the water, watching the last glimmer of light from the in-situ pump disappear into the abyss. The furthest down we could see the pump was 50 meters from the surface—remarkably far to still see light anywhere in the ocean, courtesy of the life-devoid upper waters of the South Pacific.

That’s a comprehensible depth, 50 meters. It’s about the same as a 15-story building. But five kilometers? My German colleague and I could conceptualize five kilometers horizontally—the same as her bike ride to work, the same as the first ever race I ran. Neither of us could quite grasp what flipping 5 kilometers 90 degrees might mean, as our pump continued on its 3-hour vertical journey to that depth.


Ocean researcher Frankie Pavia.

The spirit of exploration is embedded within all scientific research. It is a quest to probe and understand the unknown. But oceanographers and astronauts have something more than that—the work they do also involves the physical exploration of spaces that have yet to come under dominion of humanity. The ocean and space have not yet been rendered permanently habitable. No human lives at sea or in space without having to depend on land for survival.

I expected to conclude the cruise with a deeper connection to the ocean. I expected to feel like I had performed an act of exploration by sailing from one land mass to another, and as a result to have gained some fundamental understanding of the ocean’s spatial domain.

Yet a week after I stepped off the FS Sonne for good, I am left feeling like the ocean is further from my grasp than ever. Five kilometers depth, and all I did was sail across a tiny fraction of the surface. Sure, I hauled back samples from the deep, and I will certainly learn an incredible amount about it from chemical measurements. But did I explore the deep ocean? Is it possible to explore a place without actually traveling there?

I wonder how astronauts feel when they return to Earth. Just like oceanographers experience only the top of the ocean, astronauts only scratch the surface of an incomprehensibly large volume of space. Does it make them feel like a part of something greater, or does experiencing its massive scale make them feel even smaller?

While the ocean is a vast nexus of life, space is seemingly devoid of it. The ocean certainly holds clues as to how life formed on our planet, and where it may exist on distant moons in our solar system. On Mars, it is the locations of long-desiccated oceans and running water where life is thought to have been possible in the distant past. In habitability, oceans are our pluperfect, Earth is our future perfect, space is our future.

The connection between oceans and space will certainly be a source of excitement for science in the coming years. Ice-covered moons in our solar system have liquid water oceans; surely there are planets and moons orbiting stars other than ours that have them as well. How will we ever understand them if we have only seen such a small portion of the ocean’s volume on Earth?

And so we plunge onward into the indomitable vastness of the oceans, of space. I came away feeling further than ever from the oceans after this cruise. To fix that, I must keep exploring.

Frankie Pavia is a second-year graduate student studying oceanography and geochemistry at Lamont-Doherty Earth Observatory.

Setting Off for Two Months at Sea

When Oceans Leak - Wed, 02/03/2016 - 18:59

The scientists of Expedition 361, including Co-Chief Scientist Sidney Hemming, will be spending the next two months aboard the Joides Resolution.

Read Sidney Hemming’s first post to learn more about the goals of her two-month research cruise off Southern Africa and its focus on the Agulhas Current and collecting climate records for the past 5 million years.

It’s almost midnight here, and we’ll be setting sail around 7 a.m. The transit to the first coring site will take approximately six days. Right now, there is a chance that we will not have Mozambique clearance in time for the first intended site, so we will have to make a decision when we get to the tip of Madagascar about whether to head toward the proposed first site, or instead go to the site that would be #4, the northernmost site in South African waters. Apparently it is normal for permissions not to be granted until just as the ship leaves (we hope that happens here); in our case there are rumors that the form has been signed, but it is unclear where it is.

So Kevin Grieger, our operations manager, has been calculating times for alternative plans and considering plans we may have to drop if we cannot stick with the original schedule. We may have to skip some of the operations, and we may even have to forgo a site. Our highest priority site is the sixth out of six on our geographic path, so we have to be judicious in our planning in order to ensure we get there. And it is the closest to the port in Cape Town — word is we only have eight hours in the schedule between the coring site and the port — exciting but also scary because of all the work we have to get done before getting into port.

My husband, Gary, and I had fun in Mauritius before we came to meet up with the JOIDES Resolution. Ian Hall (the other co-chief), Leah LeVay (the IODP staff scientist) and I boarded the ship on Jan. 30, and we went into Port Luis for dinner that night to meet up with a few of the scientists, Allison Franzese, Steve Barker (former Lamont post-doc), and Sophie Hines. Sophie is a Caltech graduate student who is leading the pore water sampling program for her advisor Jess Adkins (also a former Lamont post-doc) who was unable to participate in the expedition.

So we have been living on the ship since the 30th and getting ready for the cruise. That involves a lot of meetings and training. Many of the science team did not know each other before we got here, and we also did not know about the others’ research plans. The plans will evolve as we discuss potential overlaps and collaborations. And they will also change as we find out what we really are going to encounter in the cores. We are all getting to know each other, learning what each other’s interests are, and trying to come up with a plan that will maximize what we can discover with the materials we will collect on this cruise. It is very different from anything I have done before, and it is exciting. I think it will be a really rewarding experience. The group seems to already have developed a good rapport, and we are all very optimistic.


Before the JOIDES Resolution leaves port, Lisa Crowder (left) and Rebecca Robinson (right) take students from a girls’ school in Mauritius on a tour. Photo: Tim Fulton/IODP

While we have been in Mauritius, the BBC picked up on our work, and twitter has been atwitter with blurbs about the cruise and people on the cruise. We have also had quite a few tours through the ship. Dick Norris (from Scripps Institution of Oceanography) and I went to a girls’ school yesterday and discussed global change and encouraged them to think about science. A small group of the girls from that school came for a tour today, and they seemed really keen and engaged. Lisa Crowder, who oversees the ship’s technicians working with core processing protocols and lab equipment, gave a really awesome show that we all enjoyed! She used the straw-in-the-milkshake analogy for coring in the ocean. It was a great visual!

It is supposed to be quite windy tomorrow, so I’m nervous about being seasick and I’m going to take my Dramamine first thing in the morning. I sure hope I am going to get my sea legs quickly!

Sidney Hemming is a geochemist and professor of Earth and Environmental Sciences at Lamont-Doherty Earth Observatory. She uses the records in sediments and sedimentary rocks to document aspects of Earth’s history.

Uncovering the Stories of Southern Africa’s Climate Past

When Oceans Leak - Wed, 01/27/2016 - 10:19

The Agulhas Current runs along the southern coast of Africa and is influenced by other flows. Credit: Arnold L. Gordon.

I am on my way to Mauritius to spend a few days with my husband, Gary, before boarding the JOIDES Resolution in Port Louis for a two-month research cruise, IODP Expedition 361, South African Climates. This is my first cruise as co-chief scientist, so I am both excited and nervous. The goal of the cruise is to obtain climate records for the past 5 million years at six sites around southern Africa. Each has its own special focus.

As with research cruises in general, this represents the culmination of a huge effort over many years, in this case led by Rainer Zahn and my co-chief scientist, Ian Hall. Those efforts included planning workshops, site survey cruises, proposal writing and re-writing, and a lot of development of stratigraphic records and proxies of climate-sensitive factors such as temperature and salinity of surface and intermediate waters, positions of the “Agulhas Retroflection,” and evidence for deep ocean circulation.

The cruise is going to an exciting place in the global climate system. Evidence for the vigor of North Atlantic Deep Water overturning circulation (a.k.a. the Great Ocean Conveyor) can be found in the same sediment cores taken from the floor of the southern Cape Basin, off the southwest coast of South Africa, where scientists have found evidence for Agulhas “leakage” of warm, salty water from the Indian Ocean into the Atlantic Ocean.

The Agulhas Current is the strongest western boundary current in the world’s oceans. It flows along the eastern side of southern Africa, and when it reaches the tip of Africa, it is “retroflected” to flow east, parallel to the Antarctic Circumpolar Current. Changes in the Agulhas Current are coincident with climate change in Africa, and thus it may even have been an important factor in the evolution of our species in Africa. Lamont-Doherty Earth Observatory’s Arnold Gordon has made the case that leakage of salt and heat from the Agulhas Current into the Atlantic Ocean is one of the ingredients that enhances North Atlantic Deep Water overturning circulation.

On this cruise, we’re studying the current to collectively try to uncover the story of southern African climates and their connections with global ocean circulation and climate variability for the past 5 million years.

My role is to use the layers of sediment on the ocean floor that either blew in or washed in from land to contribute to understanding of rainfall and runoff, weathering on Africa, and changes in the Agulhas. (I’m working on this in collaboration with fellow sailing scientists Allison Franzese of Lamont, Margit Simon of the University of Bergen, and Ian Hall of Cardiff University, and shore-based scientist Steve Goldstein at Lamont). Questions about rainfall and runoff and weathering will be tackled in sediment cores that are near major rivers. These efforts will also serve to characterize the composition of sediments being carried in the Agulhas Current.

Fortuitously, the sources of sediments along the eastern coast of South Africa have significantly different radiogenic isotopes than those on the western side. Radiogenic isotopes are isotope systems that change due to radioactive decay of a parent isotope and thus respond to the time-aspects of geologic history. We discovered during the Ph.D. thesis of former graduate student Randye Rutberg that in RC11-83, a pretty famous sediment core from the southern Cape Basin, the land-derived sediments have a higher ratio of strontium-87 to strontium-86 during warmer climate intervals than during colder intervals, and the values are so high as to require an external source—that is, sediment from the eastern side of Africa carried into the Atlantic via the Agulhas leakage.

Franzese did her Ph.D. thesis on land-derived sediment evidence of changes in the Agulhas between glacial times, about 20,000 years ago, and modern times. She documented the map pattern of variability of both the sources and sediment changes, and further confirmed the role of the Agulhas Current in depositing sediments in the South Atlantic. It will be super exciting to extend the observations back to 5 million years and explore how the sources, as well as the Agulhas Current itself, may have changed.

Sidney Hemming is a geochemist and professor of Earth and Environmental Sciences at Lamont-Doherty Earth Observatory. She uses the records in sediments and sedimentary rocks to document aspects of Earth’s history.

Analysis with paprica

Chasing Microbes in Antarctica - Mon, 01/25/2016 - 12:42

paprikaThis tutorial is both a work in progress and a living document.  If you see an error, or want something added, please let me know by leaving a comment.

I’ve been making a lot of improvements to paprica, our program for conducting metabolic inference on 16S rRNA gene sequence libraries.  The following is a complete analysis example with paprica to clarify the steps described in the manual, and to highlight some of the recent improvements to the method.  I’ll continue to update this tutorial as the method evolves.  This tutorial assumes that you have all the dependencies for paprica_run.sh installed and in your PATH.  If you’re a Mac user you can follow the instructions here.  If you’re a Linux user (including Windows users running Linux in a VirtualBox) installation is a bit simpler, just follow the instructions in the manual.  The current list of dependencies are:

  1.  Infernal (including Easel)
  2.  pplacer (including Guppy)
  3. Python version 2.7 (I strongly recommend using Anaconda)
  4. Seqmagick

It also assumes that you have the following Python modules installed:

  1.  Pandas
  2.  Joblib
  3.  Biopython

Finally, it assumes that you are using the provided database of metabolic pathways and genome data included in the ref_genome_database directory.  At some point in the future I’ll publish a tutorial describing how to use paprica_build.sh to build a custom database.  All the dependencies are installed and tested?  Before we start let’s get familiar with some terminology.

community structure: The taxonomic structure of a bacterial assemblage.

edge: An edge is a point of placement on a reference tree.  Think of it as a branch of the reference tree.  Edges take the place of OTUs in this workflow, and are ultimately far more interesting and informative than OTUs.  Refer to the pplacer documentation for more.

metabolic structure: The abundance of different metabolic pathways within a bacterial assemblage.

reference tree: This is the tree of representative 16S rRNA gene sequences from all completed Bacterial genomes in Genbank.  The topology of the tree defines what pathways are predicted for internal branch points.

Now let’s get paprica.  There are two options here.  Option 1 is to use git to download the development version.  The development version has the most recent bug fixes and features, but has not been fully tested.  That means that it’s passed a cursory test of paprica_build.sh and paprica_run.sh on my development system (a Linux workstation), but I haven’t yet validated paprica_run.sh on an independent machine (a Linux VirtualBox running on my Windows laptop).  The development version can be downloaded and made executable with:

git clone https://github.com/bowmanjeffs/paprica.git
cd paprica
chmod a+x *sh

Option 2 is to download the last stable version.  In this case stable means that paprica_build.sh and paprica_run.sh successfully built on the development system, paprica_run.sh successfully ran on a virtual box, and that paprica_build.sh and paprica_run.sh both successfully ran a second time from the home directory of the development machine.  The database produced by this run is the one that can be found in the latest stable version.  Downloading and making the latest stable version executable is the recommended way of getting paprica, but requires a couple of additional steps.  For the current stable release (v0.22):

wget https://github.com/bowmanjeffs/paprica/archive/paprica_v0.22.tar.gz
tar -xzvf paprica_v0.22.tar.gz
mv paprica-paprica_v0.22 paprica
cd paprica
chmod a+x *sh

Now using your text editor of choice (I recommend nano) you should open the file titled paprica_profile.txt.  This file will look something like:

## Variables necessary for the scripts associated with paprica_run.sh and paprica_build.sh.

# This is the location of the reference directory.
ref_dir=~/paprica/ref_genome_database/

# This is the location of the covariance model used by Infernal.
cm=~/paprica/bacterial_ssu.cm

## Variables associated with paprica_build.sh only.

# This is the number of cpus RAxML will use. See RAxML manual for guidance.
cpus=8

# This is the location where you want your pgdbs placed. This should match
# what you told pathway-tools, or set to pathway-tools default location if
# you didn't specify anything.
pgdb_dir=~/ptools-local/pgdbs/user/

## Variables associated with paprica_run.sh only.

# The fraction of terminal daughters that need to have a pathway for it
# to be included in an internal node.
cutoff=0.5

For the purposes of this tutorial, and a basic analysis with paprica, we are only concerned with the ref_dir, cm, and cutoff variables.  The ref_dir variable is the location of the reference database.  If you downloaded paprica to your home directory, and you only intend to use paprica_run.sh, you shouldn’t need to change it.  Ditto for cm, which is the location of the covariance model used by Infernal.  The cutoff variable specifies the fraction of genomes in a given clade in which a metabolic pathway needs to appear for it to be assigned to a read placed to that clade.  In practice 50 % works well, but you may wish to be more or less conservative depending on your objectives.  If you want to change it simply edit that value to be whatever you want.
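
To make the cutoff concrete, here is a minimal illustrative sketch (not paprica's actual code) of the decision it controls; the clade size and pathway count are made-up numbers.

# Illustrative only -- not paprica's internal code.
# A pathway is attributed to an internal node when the fraction of terminal
# genomes in that clade that contain the pathway meets or exceeds the cutoff.

cutoff = 0.5                  # value from paprica_profile.txt

genomes_in_clade = 8          # hypothetical clade size
genomes_with_pathway = 5      # hypothetical number of genomes with the pathway

fraction = genomes_with_pathway / float(genomes_in_clade)
print('fraction = %.2f, pathway assigned: %s' % (fraction, fraction >= cutoff))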

Now go ahead and make sure that things are working right by executing paprica_run.sh using the provided file test.fasta.  From the paprica directory:

./paprica_run.sh test

This will produce a variety of output files in the paprica directory:

ls test*

test.clean.align.sto
test.clean.fasta
test.combined_16S.tax.clean.align.csv
test.combined_16S.tax.clean.align.fasta
test.combined_16S.tax.clean.align.jplace
test.combined_16S.tax.clean.align.phyloxml
test.combined_16S.tax.clean.align.sto
test.edge_data.csv
test.fasta
test.pathways.csv
test.sample_data.txt
test.sum_pathways.csv

Each sample fasta file that you run will produce similar output, with the following being particularly useful to you:

test.combined_16S.tax.clean.align.jplace: This is a file produced by pplacer that contains the placement information for your sample.  You can do all kinds of interesting things with sets of jplace files using Guppy.  Refer to the Guppy documentation for more details.

test.combined_16S.tax.clean.align.phyloxml: This is a “fat” style tree showing the placements of your query reads on the reference tree.  You can view this tree using Archaeopteryx.

test.edge_data.csv: This is a csv format file containing data on edge location in the reference tree that received a placement, such as the number of reads that placed, predicted 16S rRNA gene copies, number of reads placed normalized to 16S rRNA gene copies, GC content, etc.  This file describes the taxonomic structure of your sample.

test.pathways.csv: This is a csv file of all the metabolic pathways inferred for test.fasta, by placement.  All possible metabolic pathways are listed, the number attributed to each edge is given in the column for that edge.

test.sample_data.txt: This file describes some basic information for the sample, such as the database version that was used to make the metabolic inference, the confidence score, total reads used, etc.

test.sum_pathways.csv: This csv format file describes the metabolic structure of the sample, i.e. pathway abundance across all edges.
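
If you want a quick look at these tables, pandas (already a dependency) will do the job. The snippet below is just a sketch; it assumes the first column of each csv can serve as the row index, so check the column labels in your own output.

# Sketch: peek at the test output with pandas.  Column names and index layout
# may differ slightly between paprica versions -- inspect your own files.
import pandas as pd

edges = pd.read_csv('test.edge_data.csv', index_col=0)
pathways = pd.read_csv('test.sum_pathways.csv', index_col=0)

print(edges.head())        # per-edge data: reads placed, 16S copies, etc.
print(pathways.head())     # pathway abundance across all edges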

Okay, that was all well and good for the test.fasta file, which has placements only for a single edge and is not particularly exciting.  Let’s try something a bit more advanced.  Create a directory for some new analysis on your home directory and migrate the necessary paprica files to it:

cd ~
mkdir my_analysis
cp paprica/paprica_run.sh my_analysis
cp paprica/paprica_profile.txt my_analysis
cp paprica/paprica_place_it.py my_analysis
cp paprica/paprica_tally_pathways.py my_analysis

Now add some fasta files for this tutorial.  The two fasta files are from Luria et al. 2014 and Bowman and Ducklow, 2015.  They are a summer and winter surface sample from the same location along the West Antarctic Peninsula.  I recommend wget for downloads of this sort; if you don’t have it, and don’t want to install it for some reason, use curl.

cd my_analysis
wget http://www.polarmicrobes.org/extras/summer.fasta
wget http://www.polarmicrobes.org/extras/winter.fasta

We’re cheating a bit here, because these samples have already been QC’d.  That means I’ve trimmed for quality and removed low quality reads, removed chimeras, and identified and removed mitochondria, chloroplasts, and anything that didn’t look like it belonged to the domain Bacteria.  I used Mothur for all of these tasks, but you may wish to use other tools.

Run time may be a concern for you if you have many query files to run, and/or they are particularly large.  The rate limiting step in paprica is pplacer.  We can speed pplacer up by telling paprica_place_it.py to split the query fasta into several pieces that pplacer will run in parallel.  Be careful of memory usage!  pplacer creates two threads automatically when it runs, and each thread uses about 4 Gb of memory.  So if your system has only 2 cpus and 8 Gb of memory don’t use this option!  If your system has 32 Gb of RAM I’d recommend 3 splits, so that you don’t max things out.
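
As a rough rule of thumb based on the numbers above (two pplacer threads per split at roughly 4 Gb each, so about 8 Gb per split), you can sketch out a sensible value for -splits like this; the inputs are assumptions about your own machine, not anything paprica calculates for you.

# Rule-of-thumb sketch for choosing -splits; illustrative only.
ram_gb = 32                     # assumed system memory
cpus = 8                        # assumed cpu count

gb_per_split = 2 * 4            # two pplacer threads at ~4 Gb each
max_by_memory = ram_gb // gb_per_split
max_by_cpus = cpus // 2         # each split wants two threads

splits = max(1, min(max_by_memory, max_by_cpus) - 1)   # leave some headroom
print('suggested -splits: %d' % splits)                # 3 for a 32 Gb, 8 cpu box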

While we’re modifying run parameters let’s make one additional change.  The two provided files have already been subsampled so that they have equal numbers of reads (1,977).  We can check this with:

grep -c '>' *fasta

summer.fasta:1977
winter.fasta:1977

But suppose this wasn’t the case?  It’s generally a good idea to subsample your reads to the size of the smallest library so that you are viewing diversity evenly across samples.  You can get paprica to do this for you by specifying the number of reads paprica_place_it.py should use.
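
If you ever want to subsample a fasta yourself, outside of paprica, a minimal Biopython sketch (Biopython is already a dependency) might look like the following; the file names and read count are hypothetical, and the -n flag described next has paprica do this for you.

# Sketch: random subsampling of a fasta to n reads with Biopython.
# File names and n are hypothetical; paprica's -n flag handles this internally.
import random
from Bio import SeqIO

n = 1000
records = list(SeqIO.parse('summer.fasta', 'fasta'))
subsample = random.sample(records, min(n, len(records)))
SeqIO.write(subsample, 'summer.sub.fasta', 'fasta')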

To specify the number of splits and the number of reads edit the paprica_place_it.py flags in paprica_run.sh:

## default line
#python paprica_place_it.py -query $query -ref combined_16S.tax -splits 1

## new line
python paprica_place_it.py -query $query -ref combined_16S.tax -splits 3 -n 1000

This will cause paprica to subsample the query file (by random selection) to 1000 reads, and split the subsampled file into three query files that will be run in parallel. The parallel run is invisible to you; the output should be identical to a run with no splits (-splits 1). If you use subsampling you’ll also need to change the paprica_tally_pathways.py line, as the input file name will be slightly different.

## default line
python paprica_tally_pathways.py -i $query.sub.combined_16S.tax.clean.align.csv -o $query

## new line
python paprica_tally_pathways.py -i $query.combined_16S.tax.clean.align.csv -o $query

Here we are only analyzing two samples, so running them manually isn’t too much of a pain. But you might have tens or hundreds of samples, and need a way to automate that. We do this with a simple loop. I recommend generating a file with the prefixes of all your query files and using that in the loop. For example the file samples.txt might have:

summer
winter

This file can be inserted into a loop as:

while read f;do
    ./paprica_run.sh $f
done < samples.txt

Note that we don’t run them in parallel using say, gnu parallel, because Infernal is intrinsically parallelized, and we already forced pplacer to run in parallel using -splits.

Once you’ve executed the loop you’ll see all the normal paprica output, for both samples.  It’s useful to concatenate some of this information for downstream analysis.   The provided utility combine_edge_results.py can do this for you.  Copy it to your working directory:

cp ~/paprica/utilities/combine_edge_results.py combine_edge_results.py

This script will automatically aggregate everything with the suffix .edge_data.csv.  You need to specify a prefix for the output files.

python combine_edge_results.py my_analysis

This produces two files:

my_analysis.edge_data.csv: This file contains the mean genome parameters for each sample.  Lots of good stuff in here; see the column labels.

my_analysis.edge_tally.csv: Edge abundance for each sample (corrected for 16S rRNA gene copy).  This is your community structure, and is equivalent to an OTU table (but much better!).
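
From here the edge tally can feed whatever community analysis you like. A minimal pandas sketch, assuming samples are the rows and edges the columns of my_analysis.edge_tally.csv (check the orientation in your own file), converts the corrected counts to relative abundances and lists the top edges in each sample:

# Sketch: relative abundances from the combined edge tally (orientation assumed).
import pandas as pd

tally = pd.read_csv('my_analysis.edge_tally.csv', index_col=0).fillna(0)
rel = tally.div(tally.sum(axis=1), axis=0)    # normalize each sample to 1

for sample in rel.index:
    print(sample)
    print(rel.loc[sample].sort_values(ascending=False).head(5))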

To be continued…

Installing paprica on Mac OSX

Chasing Microbes in Antarctica - Wed, 01/20/2016 - 09:55

The following is a paprica installation tutorial for novice users on Mac OSX (installation on Linux is quite a bit simpler). If you’re comfortable editing your PATH and installing things using bash you probably don’t need to follow this tutorial, just follow the instructions in the manual. If command line operations stress you out, and you haven’t dealt with a lot of weird bioinformatics program installs, use this tutorial.

Please note that this tutorial is a work in progress.  If you notice errors, inconsistencies, or omissions please leave a comment and I’ll be sure to correct them.

paprica is 90 % an elaborate wrapper script (or set of scripts) for several core programs written by other groups. The scripts that execute the pipeline are bash scripts; the scripts that do the actual work are Python. Therefore you need to get Python up and running on your system. The version that came with your system won’t suffice without heavy modification. Best to use a free third-party distro like Anaconda (preferred) or Canopy.  If you already have a mainstream v2.7 Python distro going just make sure that the biopython, joblib, and pandas modules are installed and you’re good to go.

If not please download the Anaconda distro and install it following the developer’s instructions. Allow the installer to modify your PATH variable. Once the installation is complete update it by executing:

conda update conda
conda update --all

Then you’ll need to install biopython, joblib, and pandas:

conda install biopython
conda install joblib
conda install pandas

In case you have conflicts with other Python installations, or some other mysterious problems, it’s a good idea to test things out at this point. Open a shell, type "python", and you should get a welcome message that specifies Anaconda as your distro. Type:

import Bio
import joblib
import pandas

If you get any error messages something somewhere is wrong. Burn some incense and try again. If that doesn’t work try holy water.
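
If the imports succeed, it can also be handy to confirm which versions Anaconda is providing, especially when reporting a problem. This is just a small optional sketch; the __version__ attributes are standard for these three packages.

# Optional: report the module versions that Anaconda is providing.
import Bio
import joblib
import pandas

print('biopython ' + Bio.__version__)
print('joblib    ' + joblib.__version__)
print('pandas    ' + pandas.__version__)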

One challenge with paprica on OSX has to do with the excellent program pplacer. The pplacer binary for Darwin needs the GNU Scientific Library (GSL), specifically v1.6 (at the time of writing). You can try to compile this from source, but I’ve had trouble getting this to work on OSX. The easier option is to use a package manager, preferably Homebrew. This means, however, that you have to marry one of the OSX package managers and never look back. Fink, Macports, and Homebrew will all get you a working version of GSL. I recommend using Homebrew.

To download Homebrew (assuming you don’t already have it) type:

ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

Follow the on-screen instructions. Once it is downloaded type:

brew install GSL

This should install the GNU Scientific Library v1.6.

Assuming all that went okay go ahead and download the software you need to execute just the paprica_run.sh portion of paprica. First, the excellent aligner Infernal. From your home directory:

curl -O http://selab.janelia.org/software/infernal/infernal-1.1.1-macosx-intel.tar.gz
tar -xzvf infernal-1.1.1-macosx-intel.tar.gz
mv infernal-1.1.1-macosx-intel infernal

Then pplacer, which also includes Guppy:

curl -O https://github.com/matsen/pplacer/releases/download/v1.1.alpha17/pplacer-Darwin-v1.1.alpha17.zip
unzip pplacer-Darwin-v1.1.alpha17.zip
mv pplacer-Darwin-v1.1.alpha17 pplacer

Now comes the tricky bit: you need to add the locations of the executables for these programs to your PATH variable. Don’t screw this up. It isn’t hard to undo screw-ups, but it will freak you out. Before you continue please read the excellent summary of shell startup scripts as they pertain to OSX here:

http://hayne.net/MacDev/Notes/unixFAQ.html#shellStartup

Assuming that you are new to the command line, and did not have a .bash_profile or .profile file already, the Anaconda install would have created .profile and added its executables to your path. From your home directory type:

nano .profile

Navigate to the end of the file and type:

export PATH=/Users/your-user-name/infernal/binaries:/Users/your-user-name/pplacer:${PATH}

Don’t be the guy or gal who types your-user-name. Replace with your actual user name. Hit ctrl-o to write out the file, and ctrl-x to exit nano. Re-source .profile by typing:

source .profile

Confirm that you can execute the following programs by navigating to your home directory and executing each of the following commands:

cmalign
esl-alimerge
pplacer
guppy

You should get an error message that is clearly from the program, not a bash error like “command not found”.
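
If you’d rather check programmatically, a small Python sketch using the standard library (find_executable ships with Python 2.7’s distutils) will report whether each binary is visible on your PATH:

# Sketch: confirm the paprica dependencies are visible on the PATH.
from distutils.spawn import find_executable

for tool in ['cmalign', 'esl-alimerge', 'pplacer', 'guppy']:
    location = find_executable(tool)
    print('%-13s %s' % (tool, location if location else 'NOT FOUND'))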

Now you need to install the final dependency, Seqmagick. Confirm the most current stable release by going to Github, then download it:

curl -O https://github.com/fhcrc/seqmagick/archive/0.6.1.tar.gz
tar -xzvf 0.6.1.tar.gz
cd seqmagick-0.6.1
python setup.py install

Check the installation by typing:

seqmagick mogrify

You should get a sensible error that is clearly seqmagick yelling at you.

Okay, now you are ready to download paprica and do some analysis! Download the latest stable version of paprica (don’t just blindly download, please check Github for the latest stable release):

curl -O https://github.com/bowmanjeffs/paprica/archive/paprica_v0.23.tar.gz
tar -xzvf paprica_v0.23.tar.gz
mv paprica-paprica_v0.23 paprica

Now you need to make paprica_run.sh executable:

cd paprica
chmod a+x paprica_run.sh

At this point you should be ready to rock. Take a deep breath and type:

./paprica_run.sh test

You should see a lot of output flash by on the screen, and you should find the files test.pathways.csv, test.edge_data.csv, test.sample_data.txt, and test.sum_pathways.csv in your directory. These are the primary output files from paprica. The other files of interest are the Guppy output files test.combined_16S.tax.clean.align.phyloxml and test.combined_16S.tax.clean.align.jplace. Check out the Guppy documentation for the many things you can do with jplace files. The phyloxml file is an edge-fattened tree of the query placements on the reference tree. It can be viewed using Archaeopteryx or another phyloxml-capable tree viewer.

To run your own analysis, say on amazing_sample.fasta, simply type:

./paprica_run.sh amazing_sample

Please, please, please, read the manual (included in the paprica download) for further details, such as how to greatly decrease the run time on large fasta files, and how to sub-sample your input fasta. Remember that the fasta file you input should contain only reads you are reasonably sure come from bacteria (an archaeal version is a long term goal), and they should be properly QC’d (i.e. low quality ends and adapters and barcodes and such trimmed away).
