News aggregator

Sea Level Rise Could Put NYC's Proposed Transit System Under Water - Vice News

Featured News - Fri, 02/19/2016 - 09:31
Lamont geologist Klaus Jacob says that while the proposed Brooklyn-Queens Connector project solves desperate transportation needs, the problem is that it runs along current and future flood zones.

Can Germany's Renewable Energy Revolution Be Replicated in the US? - Bulletin of the Atomic Scientists

Featured News - Thu, 02/18/2016 - 12:00
As governments around the world invest in new energy policies and climate strategies, none has gone as far as Germany. Could the model be replicated? Lamont adjunct research scientist Beate Liepert explores the possibilities.

How the Climate Challenge Could Derail a Brilliant Human Destiny - New York Times

Featured News - Mon, 02/15/2016 - 12:00
A conversation on the importance of sustained engagement on a big challenge, whether intellectual, as in revealing spacetime ripples, or potentially existential, as in pursuing ways to move beyond energy choices that are reshaping Earth for hundreds of generations to come. Cites Lamont's review and research by a group that included Lamont Adjunct Senior Research Scientist Anders Levermann.

6 Million Years of Sediment, Studded with Tiny Fossils

When Oceans Leak - Fri, 02/12/2016 - 21:25

Jeroen van der Lubbe examines the first sediment core of Expedition 361 brought up by the team aboard the JOIDES Resolution. Photo: Sidney Hemming

Read Sidney Hemming’s first post to learn more about the goals of her two-month research cruise off southern Africa and its focus on the Agulhas Current and collecting climate records for the past 5 million years.

We have our first core! The team pulled up 254.1 meters of sediment from the Natal Valley site off southern Africa, near the start of the Agulhas Current. We think we have about 6 million years of history in that core that should be able to tell us details about how the region’s currents and climate changed through time.

The whole core (IODP 361 U1474A) is actually several cores. Each is 9.5 meters long and is cut into 1.5-meter sections for analysis and storage. The top of each of those cores was jostled by the coring process, but we hope to fill the gaps with cores from the two other holes we plan to drill at the Natal Valley site. It has become standard in the International Ocean Discovery Program’s (IODP) paleoceanography drilling to core three holes, deliberately offsetting the gaps, in order to have a full record from the site. This presents a challenge for our stratigraphic correlators, Steve Barker and Chris Charles, but they are definitely up for it. They spent the time in port and during transit working hard to master the software. I understand it is powerful, but not easy to work with (a sign on one of their computers says: “exercise extreme patience”).


Steve Barker’s computer used for correlations comes with a warning: “Caution: exercise extreme patience.” Photo: Sidney Hemming

Seeing our first core come up was very exciting. As it was happening, we were getting age estimates in real time from the paleontologists, and not too much later from the paleomagnetists. The paleomagnetic measurements appear robust, and they show several magnetic reversals, so we can use the known magnetic reversal time scale to help date each part of the core. The foraminifera and calcareous nannofossil species changes in the core catchers (more on this below) are providing age estimates similar to the paleomagnetics. This is so exciting to watch in real time that I keep thinking about whether there is a way we could reenact it in a classroom setting.

Now we are on the second hole. Hole B is primarily being taken for squeezing out pore waters – this is the water captured in pores of the deep sediment. We hope it can tell us about the salinity and oxygen isotope composition of the water long ago. The oxygen isotope composition of glacial water is important for understanding how much ice there was, as well as the temperature of the deep ocean. Sophie Hines, a graduate student working with Jess Adkins at Caltech, is leading the effort. She and the other four geochemists, including Lamont’s Allison Franzese, are working hard to get the pore waters squeezed. It is a tough operation, and the presses are not always as cooperative as one might wish. The geochemists decorated the most cantankerous press with a photo of Jess, who wrote the proposal for the pore waters project, to ensure they remember whom to blame.

I mentioned the foraminifera and calcareous nannofossils (calcareous means they build their shells with calcium carbonate) that were found in the core catchers. At the bottom of the core barrel – remember Lisa’s analogy to the straw in the milk shake – there is a mechanism that is open when the barrel is going down into the mud. When the core barrel has penetrated into the sediment and the rig starts to pull it up, that mechanism snaps shut, thereby catching the core and keeping it from falling back out, hence the term core catcher. Some of the sediment goes into the core catcher, and it’s a bit messy, so it is taken immediately and used for examining the microfossil content. Sometimes it’s disappointing, but in this case, the microfossils except for the siliceous ones are working out really well. More about the siliceous ones in another post.

Sidney Hemming is a geochemist and professor of Earth and Environmental Sciences at Lamont-Doherty Earth Observatory. She uses the records in sediments and sedimentary rocks to document aspects of Earth’s history.

Court Ruling on Clean Power Plan a Setback, But…

The 2015 Paris Climate Summit - Wed, 02/10/2016 - 16:47

Big Bend Power Station, a coal-fired plant, near Apollo Beach, Fla. Photo: Wikimedia Commons

The U.S. Supreme Court this week put a hold on one of the key programs in the United States’ efforts to control CO2 emissions and combat global warming. The decision puts aside new regulations to control emissions from power plants until a challenge from more than two dozen states is resolved in federal appeals court.

The court’s 5-4 decision to postpone implementation of the Clean Power Plan represents a clear setback for the Obama administration’s efforts to combat climate change; but the damage to the U.S. ability to meet pledges it made at the Paris climate summit in December “is less than it might seem,” says Michael Gerrard in a commentary posted on the Sabin Center for Climate Change Law’s website.

“That is not because the Clean Power Plan wasn’t important; it is because the plan didn’t do nearly enough,” says Gerrard, director of the Sabin Center.

Gerrard notes that the plan’s emissions reductions won’t begin until 2022, meaning they won’t play a role in meeting the nation’s stated goal of reducing carbon emissions by 17 percent by 2020. Even beyond that date, the plan alone won’t be enough to meet the goal of reducing emissions by 26 to 28 percent by 2025. That, and future reductions, will depend on many other measures. Those would include higher efficiency standards for buildings and appliances and greater efforts to reduce energy consumption in the industrial and transportation sectors.

“In sum,” Gerrard writes, “while the Clean Power Plan is the biggest game in town in terms of achieving the Paris goals, it is by no means the only game in town. While we express our justifiable fury over the Supreme Court’s action, we need to bear in mind that there are many other things that the U.S. must do in the next several years to control greenhouse gas emissions.”

You can read the full commentary at the Sabin Center’s website.


Gearing Up for Our First Cores

When Oceans Leak - Mon, 02/08/2016 - 14:56

Bubba Attryde, a core technician, shows scientists on the JOIDES Resolution some of the ship’s drilling tools. Tim Fulton/IODP

Read Sidney Hemming’s first post to learn more about the goals of her two-month research cruise off southern Africa and its focus on the Agulhas Current and collecting climate records for the past 5 million years.

Our first day on the ocean was pretty rough. We left the harbor in Mauritius into high winds and choppy seas, and I don’t think I was alone in feeling pretty miserable.  I woke up the next day to calm seas and a much better perspective.

We have been busy with meetings, training sessions, and planning for the core flow, and I think people are getting close to being ready for the 12-hour shifts. My shift is 3 p.m. to 3 a.m., and my co-chief scientist Ian Hall’s is the opposite. It works out pretty well relative to our home clocks (when I start my shift, it’s 8 a.m. back in New York), and we’ll have significant overlap. I plan to get started by noon, and Ian will hang around until 6 or so before going to bed. We have decided we’ll take a break for exercise—should be a good strategy.

The staff on the ship is wonderful. They feed me great meals, and there is even an espresso machine right outside the science office where I sit. Today Kevin Grieger, our operations manager, gave us a tour of the bridge, the drilling rig and the core shack, where we met Bubba Attryde, who has been the core specialist since Glomar Challenger days and continues to make innovations. We went down through the motors and pumps, past the moon pool, and out to the JOIDES Resolution‘s helideck.

The helideck has a special role this cruise. On March 26, Ian Hall and Steve Barker will be running in the IAAF/Cardiff University World Half Marathon Championships. It requires 328 laps around the deck, which is noisy and hot. They are doing it to raise money for a small South African charity called the Goedgedacht Trust, which promotes education to help poor rural African children escape grinding poverty. Ian has learned that the money raised will help bring solar power to schools. When we reach Cape Town, some of the children plan to tour the ship.

It is now official that we will start with the Natal Valley site while we wait for clearance from Mozambique to work on what would have been our first site.

The Natal Valley is at the beginning of the Agulhas Current, where the waters flowing through the Mozambique Channel and the East Madagascar Current come together and flow along the southern Africa coast. A central goal of the expedition is to understand the history of the Agulhas Current and its role in climate variability, and this site could help us characterize how the microorganisms and the land-derived sediments it carries have changed over the last 5 million years.

Recently published evidence from the past 270,000 years from very close to the Natal Valley site also shows that there have been significant changes in rainfall in southern Africa on millennial time scales. We are very interested in getting a longer record of rainfall changes with this expedition. So in effect, we have the dual goals of understanding the nearby climate record from Africa and understanding the ocean currents below which the core is located—both the Agulhas Current and deep water circulation, which currently flows north along the western Natal Valley and is the reason for the sediment “contourite” accumulation that we are coring.

We will be getting to the Natal Valley site about 8 p.m. local time on Tuesday, so we should have cores coming in before daylight on Wednesday. You can feel the excitement start to build. Our staff scientist, Leah, has organized everybody well. The groups gave reports on their methods this morning and will turn in drafts of their methods before we get to the first site. It’s getting close!

Sidney Hemming is a geochemist and professor of Earth and Environmental Sciences at Lamont-Doherty Earth Observatory. She uses the records in sediments and sedimentary rocks to document aspects of Earth’s history.

On the Surface, Feeling Further Away from the Ocean than Ever

Sampling the Barren Sea - Mon, 02/08/2016 - 11:11

By Frankie Pavia

How far is five kilometers, vertically? We leaned over the edge of the boat, staring into the water, watching the last glimmer of light from the in-situ pump disappear into the abyss. The furthest down we could see the pump was 50 meters from the surface—remarkably far to still see light anywhere in the ocean, courtesy of the life-devoid upper waters of the South Pacific.

That’s a comprehensible depth, 50 meters. It’s about the same as a 15-story building. But five kilometers? My German colleague and I could conceptualize five kilometers horizontally—the same as her bike ride to work, the same as the first ever race I ran. Neither of us could quite grasp what flipping 5 kilometers 90 degrees might mean, as our pump continued on its 3-hour vertical journey to that depth.

Ocean researcher Frankie Pavia.


The spirit of exploration is embedded within all scientific research. It is a quest to probe and understand the unknown. But oceanographers and astronauts have something more than that—the work they do also involves the physical exploration of spaces that have yet to come under dominion of humanity. The ocean and space have not yet been rendered permanently habitable. No human lives at sea or in space without having to depend on land for survival.

I expected to conclude the cruise with a deeper connection to the ocean. I expected to feel like I had performed an act of exploration by sailing from one land mass to another, and as a result to have gained some fundamental understanding of the ocean’s spatial domain.

Yet a week after I stepped off the FS Sonne for good, I am left feeling like the ocean is further from my grasp than ever. Five kilometers depth, and all I did was sail across a tiny fraction of the surface. Sure, I hauled back samples from the deep, and I will certainly learn an incredible amount about it from chemical measurements. But did I explore the deep ocean? Is it possible to explore a place without actually traveling there?

I wonder how astronauts feel when they return to Earth. Just like oceanographers experience only the top of the ocean, astronauts only scratch the surface of an incomprehensibly large volume of space. Does it make them feel like a part of something greater, or does experiencing its massive scale make them feel even smaller?

While the ocean is a vast nexus of life, space is seemingly devoid of it. The ocean certainly holds clues as to how life formed on our planet, and where it may exist on distant moons in our solar system. On Mars, it is the locations of long-desiccated oceans and running water where life is thought to have been possible in the distant past. In habitability, oceans are our pluperfect, Earth is our future perfect, space is our future.

The connection between oceans and space will certainly be a source of excitement for science in the coming years. Ice-covered moons in our solar system have liquid water oceans; surely there are planets and moons orbiting stars other than ours that have them as well. How will we ever understand them if we have only seen such a small portion of the ocean’s volume on Earth?

And so we plunge onward into the indomitable vastness of the oceans, of space. I came away feeling further than ever from the oceans after this cruise. To fix that, I must keep exploring.

Frankie Pavia is a second-year graduate student studying oceanography and geochemistry at Lamont-Doherty Earth Observatory.

Setting Off for Two Months at Sea

When Oceans Leak - Wed, 02/03/2016 - 18:59

The scientists of Expedition 361, including Co-Chief Scientist Sidney Hemming, will be spending the next two months aboard the Joides Resolution.

Read Sidney Hemming’s first post to learn more about the goals of her two-month research cruise off Southern Africa and its focus on the Agulhas Current and collecting climate records for the past 5 million years.

It’s almost midnight here, and we’ll be setting sail around 7 a.m. The transit to the first coring site will take approximately six days. Right now we are uncertain whether we will have Mozambique clearance in time for the first intended site, so when we reach the tip of Madagascar we will have to decide whether to head toward the proposed first site, or instead go to the site that would be #4, the northernmost site in South African waters. Apparently it is normal for permissions not to be granted until just as the ship leaves (we hope that happens here); in our case, rumor has it the form has been signed, but it is unclear where it is.

So Kevin Grieger, our operations manager, has been calculating times for alternative plans and considering plans we may have to drop if we cannot stick with the original schedule. We may have to skip some of the operations, and we may even have to forgo a site. Our highest priority site is the sixth out of six on our geographic path, so we have to be judicious in our planning in order to ensure we get there. And it is the closest to the port in Cape Town — word is we only have eight hours in the schedule between the coring site and the port — exciting but also scary because of all the work we have to get done before getting into port.

My husband, Gary, and I had fun in Mauritius before we came to meet up with the JOIDES Resolution. Ian Hall (the other co-chief), Leah LeVay (the IODP staff scientist) and I boarded the ship on Jan. 30, and we went into Port Luis for dinner that night to meet up with a few of the scientists, Allison Franzese, Steve Barker (former Lamont post-doc), and Sophie Hines. Sophie is a Caltech graduate student who is leading the pore water sampling program for her advisor Jess Adkins (also a former Lamont post-doc) who was unable to participate in the expedition.

So we have been living on the ship since the 30th and getting ready for the cruise. That involves a lot of meetings and training. Many of the science team did not know each other before we got here, and we also did not know about the others’ research plans. The plans will evolve as we discuss potential overlaps and collaborations. And they will also change as we find out what we really encounter in the cores. We are all getting to know each other, learning what each other’s interests are, and trying to come up with a plan that will maximize what we can discover with the materials we collect on this cruise. It is very different from anything I have done before, and it is exciting. I think it will be a really rewarding experience. The group seems to have already developed a good rapport, and we are all very optimistic.


Before the JOIDES Resolution leaves port, Lisa Crowder (left) and Rebecca Robinson (right) take students from a girls’ school in Mauritius on a tour. Photo: Tim Fulton/IODP

While we have been in Mauritius, the BBC picked up on our work, and twitter has been atwitter with blurbs about the cruise and people on the cruise. We have also had quite a few tours through the ship. Dick Norris (from Scripps Institution of Oceanography) and I went to a girls’ school yesterday and discussed global change and encouraged them to think about science. A small group of the girls from that school came for a tour today, and they seemed really keen and engaged. Lisa Crowder, who oversees the ship’s technicians working with core processing protocols and lab equipment, gave a really awesome show that we all enjoyed! She used the straw-in-the-milkshake analogy for coring in the ocean. It was a great visual!

It is supposed to be quite windy tomorrow, so I’m nervous about being seasick and I’m going to take my Dramamine first thing in the morning. I sure hope I am going to get my sea legs quickly!

Sidney Hemming is a geochemist and professor of Earth and Environmental Sciences at Lamont-Doherty Earth Observatory. She uses the records in sediments and sedimentary rocks to document aspects of Earth’s history.

Uncovering the Stories of Southern Africa’s Climate Past

When Oceans Leak - Wed, 01/27/2016 - 10:19

The Agulhas Current runs along the southern coast of Africa and is influenced by other flows. Credit: Arnold L. Gordon.

I am on my way to Mauritius to spend a few days with my husband, Gary, before boarding the JOIDES Resolution in Port Louis for a two-month research cruise, IODP Expedition 361, South African Climates. This is my first cruise as co-chief scientist, so I am both excited and nervous. The goal of the cruise is to obtain climate records for the past 5 million years at six sites around southern Africa. Each has its own special focus.

As with research cruises in general, this represents the culmination of a huge effort over many years, in this case led by Rainer Zahn and my co-chief scientist, Ian Hall. Those efforts included planning workshops, site survey cruises, proposal writing and re-writing, and a lot of development of stratigraphic records and proxies of climate-sensitive factors such as temperature and salinity of surface and intermediate waters, positions of the “Agulhas Retroflection,” and evidence for deep ocean circulation.

The cruise is going to an exciting place in the global climate system. Evidence for the vigor of North Atlantic Deep Water overturning circulation (a.k.a. the Great Ocean Conveyor) can be found in the same sediment cores taken from the floor of the southern Cape Basin, off the southwest coast of South Africa, where scientists have found evidence for Agulhas “leakage” of warm, salty water from the Indian Ocean into the Atlantic Ocean.

The Agulhas Current is the strongest western boundary current in the world’s oceans. It flows along the eastern side of southern Africa, and when it reaches the tip of Africa, it is “retroflected” to flow east, parallel to the Antarctic Circumpolar Current. Changes in the Agulhas Current are coincident with climate change in Africa, and thus it may even have been an important factor in the evolution of our species in Africa. Lamont-Doherty Earth Observatory’s Arnold Gordon has made the case that leakage of salt and heat from the Agulhas Current into the Atlantic Ocean is one of the ingredients that enhances North Atlantic Deep Water overturning circulation.

On this cruise, we’re studying the current to collectively try to uncover the story of southern African climates and their connections with global ocean circulation and climate variability for the past 5 million years.

My role is to use the layers of sediment on the ocean floor that either blew in or washed in from land to contribute to understanding of rainfall and runoff, weathering on Africa, and changes in the Agulhas. (I’m working on this in collaboration with fellow sailing scientists Allison Franzese of Lamont, Margit Simon of the University of Bergen, and Ian Hall of Cardiff University, and shore-based scientist Steve Goldstein at Lamont). Questions about rainfall and runoff and weathering will be tackled in sediment cores that are near major rivers. These efforts will also serve to characterize the composition of sediments being carried in the Agulhas Current.

Fortuitously, the sources of sediments along the eastern coast of South Africa have significantly different radiogenic isotopes than those on the western side. Radiogenic isotopes are isotope systems that change due to radioactive decay of a parent isotope and thus respond to the time aspects of geologic history. We discovered during the Ph.D. thesis of former graduate student Randye Rutberg that in RC11-83, a pretty famous sediment core from the southern Cape Basin, the land-derived sediments have a higher ratio of strontium-87 to strontium-86 during warmer climate intervals than during colder ones, and the values are so high as to require an external source: sediment from the eastern side of Africa, carried into the Atlantic via the Agulhas leakage.

Franzese did her Ph.D. thesis on land-derived sediment evidence of changes in the Agulhas between glacial times, about 20,000 years ago, and modern times. She documented the map pattern of variability of both the sources and sediment changes, and further confirmed the role of the Agulhas Current in depositing sediments in the South Atlantic. It will be super exciting to extend the observations back to 5 million years and explore how the sources, as well as the Agulhas Current itself, may have changed.

Sidney Hemming is a geochemist and professor of Earth and Environmental Sciences at Lamont-Doherty Earth Observatory. She uses the records in sediments and sedimentary rocks to document aspects of Earth’s history.

Analysis with paprica

Chasing Microbes in Antarctica - Mon, 01/25/2016 - 12:42

This tutorial is both a work in progress and a living document. If you see an error, or want something added, please let me know by leaving a comment.

I’ve been making a lot of improvements to paprica, our program for conducting metabolic inference on 16S rRNA gene sequence libraries.  The following is a complete analysis example with paprica to clarify the steps described in the manual, and to highlight some of the recent improvements to the method.  I’ll continue to update this tutorial as the method evolves.  This tutorial assumes that you have all the dependencies for paprica_run.sh installed and in your PATH.  If you’re a Mac user you can follow the instructions here.  If you’re a Linux user (including Windows users running Linux in a VirtualBox) installation is a bit simpler, just follow the instructions in the manual.  The current list of dependencies are:

  1.  Infernal (including Easel)
  2.  pplacer (including Guppy)
  3. Python version 2.7 (I strongly recommend using Anaconda)
  4. Seqmagick

It also assumes that you have the following Python modules installed:

  1.  Pandas
  2.  Joblib
  3.  Biopython

Finally, it assumes that you are using the provided database of metabolic pathways and genome data included in the ref_genome_database directory.  At some point in the future I’ll publish a tutorial describing how to use paprica_build.sh to build a custom database.  All the dependencies are installed and tested?  Before we start let’s get familiar with some terminology.

community structure: The taxonomic structure of a bacterial assemblage.

edge: An edge is a point of placement on a reference tree.  Think of it as a branch of the reference tree.  Edges take the place of OTUs in this workflow, and are ultimately far more interesting and informative than OTUs.  Refer to the pplacer documentation for more.

metabolic structure: The abundance of different metabolic pathways within a bacterial assemblage.

reference tree: This is the tree of representative 16S rRNA gene sequences from all completed Bacterial genomes in Genbank.  The topology of the tree defines what pathways are predicted for internal branch points.

Now let’s get paprica.  There are two options here.  Option 1 is to use git to download the development version.  The development version has the most recent bug fixes and features, but has not been fully tested.  That means that it’s passed a cursory test of paprica_build.sh and paprica_run.sh on my development system (a Linux workstation), but I haven’t yet validated paprica_run.sh on an independent machine (a Linux VirtualBox running on my Windows laptop).  The development version can be downloaded and made executable with:

git clone https://github.com/bowmanjeffs/paprica.git
cd paprica
chmod a+x *sh

Option 2 is to download the last stable version.  In this case stable means that paprica_build.sh and paprica_run.sh successfully built on the development system, paprica_run.sh successfully ran on a virtual box, and that paprica_build.sh and paprica_run.sh both successfully ran a second time from the home directory of the development machine.  The database produced by this run is the one that can be found in the latest stable version.  Downloading and making the latest stable version executable is the recommended way of getting paprica, but requires a couple of additional steps.  For the current stable release (v0.22):

wget https://github.com/bowmanjeffs/paprica/archive/paprica_v0.22.tar.gz
tar -xzvf paprica_v0.22.tar.gz
mv paprica-paprica_v0.22 paprica
cd paprica
chmod a+x *sh

Now using your text editor of choice (I recommend nano) you should open the file titled paprica_profile.txt.  This file will look something like:

## Variables necessary for the scripts associated with paprica_run.sh and paprica_build.sh.

# This is the location of the reference directory.
ref_dir=~/paprica/ref_genome_database/

# This is the location of the covariance model used by Infernal.
cm=~/paprica/bacterial_ssu.cm

## Variables associated with paprica_build.sh only.

# This is the number of cpus RAxML will use. See RAxML manual for guidance.
cpus=8

# This is the location where you want your pgdbs placed. This should match
# what you told pathway-tools, or set to pathway-tools default location if
# you didn't specify anything.
pgdb_dir=~/ptools-local/pgdbs/user/

## Variables associated with paprica_run.sh only.

# The fraction of terminal daughters that need to have a pathway for it
# to be included in an internal node.
cutoff=0.5

For the purposes of this tutorial, and a basic analysis with paprica, we are only concerned with the ref_dir, cm, and cutoff variables. The ref_dir variable is the location of the reference database. If you downloaded paprica to your home directory, and you only intend to use paprica_run.sh, you shouldn’t need to change it. Ditto for cm, which is the location of the covariance model used by Infernal. The cutoff variable specifies the fraction of genomes in a given clade in which a metabolic pathway must appear for it to be assigned to a read placed to that clade. In practice 50 percent works well, but you may wish to be more or less conservative depending on your objectives. If you want to change it, simply edit that value to be whatever you want.
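To make the cutoff rule concrete, here is a toy sketch of how it plays out. This is my own illustration, not paprica's actual code; the function name and the pathway and genome contents are invented:

```python
# Toy illustration of the cutoff rule: a pathway is assigned to a read
# placed at an internal node only if at least `cutoff` of the terminal
# genomes descending from that node contain the pathway.

def pathways_for_node(terminal_genomes, cutoff=0.5):
    """terminal_genomes: list of sets of pathway names, one set per genome."""
    n = len(terminal_genomes)
    all_pathways = set().union(*terminal_genomes)
    return {p for p in all_pathways
            if sum(p in g for g in terminal_genomes) / float(n) >= cutoff}

# Three hypothetical genomes under one internal node:
genomes = [{"glycolysis", "nitrate reduction"},
           {"glycolysis"},
           {"glycolysis", "sulfur oxidation"}]

print(pathways_for_node(genomes, cutoff=0.5))  # {'glycolysis'}
```

With cutoff=0.5 only glycolysis (present in 3 of 3 genomes) survives; lowering the cutoff admits the rarer pathways, which is the sense in which the setting is more or less conservative.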

Now go ahead and make sure that things are working right by executing paprica_run.sh using the provided file test.fasta.  From the paprica directory:

./paprica_run.sh test

This will produce a variety of output files in the paprica directory:

ls test*

test.clean.align.sto
test.clean.fasta
test.combined_16S.tax.clean.align.csv
test.combined_16S.tax.clean.align.fasta
test.combined_16S.tax.clean.align.jplace
test.combined_16S.tax.clean.align.phyloxml
test.combined_16S.tax.clean.align.sto
test.edge_data.csv
test.fasta
test.pathways.csv
test.sample_data.txt
test.sum_pathways.csv

Each sample fasta file that you run will produce similar output, with the following being particularly useful to you:

test.combined_16S.tax.clean.align.jplace: This is a file produced by pplacer that contains the placement information for your sample. You can do all kinds of interesting things with sets of jplace files using Guppy. Refer to the Guppy documentation for more details.

test.combined_16S.tax.clean.align.phyloxml: This is a “fat” style tree showing the placements of your query on the reference tree. You can view this tree using Archaeopteryx.

test.edge_data.csv: This is a csv format file containing data on each edge in the reference tree that received a placement, such as the number of reads that placed, predicted 16S rRNA gene copies, number of reads normalized to 16S rRNA gene copies, GC content, etc. This file describes the taxonomic structure of your sample.

test.pathways.csv: This is a csv file of all the metabolic pathways inferred for test.fasta, by placement.  All possible metabolic pathways are listed; the number attributed to each edge is given in that edge’s column.

test.sample_data.txt: This file describes some basic information for the sample, such as the database version that was used to make the metabolic inference, the confidence score, the total reads used, etc.

test.sum_pathways.csv: This csv format file describes the metabolic structure of the sample, i.e. pathway abundance across all edges.

Okay, that was all well and good for the test.fasta file, which has placements for only a single edge and is not particularly exciting.  Let’s try something a bit more advanced.  Create a directory for some new analysis in your home directory and copy the necessary paprica files to it:

cd ~
mkdir my_analysis
cp paprica/paprica_run.sh my_analysis
cp paprica/paprica_profile.txt my_analysis
cp paprica/paprica_place_it.py my_analysis
cp paprica/paprica_tally_pathways.py my_analysis

Now add some fasta files for this tutorial.  The two fasta files are from Luria et al. 2014 and Bowman and Ducklow, 2015.  They are a summer and a winter surface sample from the same location along the West Antarctic Peninsula.  I recommend wget for downloads of this sort; if you don’t have it, and don’t want to install it for some reason, use curl.

cd my_analysis
wget http://www.polarmicrobes.org/extras/summer.fasta
wget http://www.polarmicrobes.org/extras/winter.fasta

We’re cheating a bit here, because these samples have already been QC’d.  That means I’ve trimmed for quality and removed low quality reads, removed chimeras, and identified and removed mitochondria, chloroplasts, and anything that didn’t look like it belonged to the domain Bacteria.  I used Mothur for all of these tasks, but you may wish to use other tools.

Run time may be a concern for you if you have many query files to run, and/or they are particularly large.  The rate limiting step in paprica is pplacer.  We can speed pplacer up by telling paprica_place_it.py to split the query fasta into several pieces that pplacer will run in parallel.  Be careful of memory usage!  pplacer creates two threads automatically when it runs, and each thread uses about 4 GB of memory.  So if your system has only 2 CPUs and 8 GB of memory, don’t use this option!  If your system has 32 GB of RAM I’d recommend 3 splits, so that you don’t max things out.
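As a rough sketch of that arithmetic (the function name and the leave-one-out headroom rule are mine; the figures of 2 threads per pplacer run and roughly 4 GB per thread come from the paragraph above):

```python
def recommend_splits(ram_gb, threads_per_run=2, gb_per_thread=4):
    """Suggest a -splits value that leaves some memory headroom.

    Each parallel pplacer run uses threads_per_run threads at roughly
    gb_per_thread GB each, so one split costs about 8 GB by default.
    Subtract one from the maximum that fits so the system isn't maxed out.
    """
    gb_per_split = threads_per_run * gb_per_thread
    max_splits = ram_gb // gb_per_split
    return max(1, max_splits - 1)

print(recommend_splits(32))  # 32 GB RAM -> 3 splits, matching the text
print(recommend_splits(8))   # 8 GB RAM -> stick with 1 split
```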

While we’re modifying run parameters let’s make one additional change.  The two provided files have already been subsampled so that they have equal numbers of reads (1,977).  We can check this with:

grep -c '>' *fasta
summer.fasta:1977
winter.fasta:1977

But suppose this wasn’t the case?  It’s generally a good idea to subsample your reads to the size of the smallest library so that you are viewing diversity evenly across samples.  You can get paprica to do this for you by specifying the number of reads paprica_place_it.py should use.
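paprica does the subsampling for you via the -n flag described below, but the operation itself is simple random selection without replacement.  A minimal sketch in plain Python, assuming reads are held as (header, sequence) tuples; the function name is mine and paprica’s own implementation may differ:

```python
import random

def subsample_fasta(records, n, seed=None):
    """Randomly select n records from a list of (header, sequence) tuples.

    Mirrors the idea behind subsampling to the smallest library: draw
    reads without replacement so every sample is compared at equal depth.
    """
    rng = random.Random(seed)
    return rng.sample(records, n)

# toy example: four reads subsampled to two
reads = [(">read_%i" % i, "ACGT" * 5) for i in range(4)]
subset = subsample_fasta(reads, 2, seed=42)
print(len(subset))  # 2
```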

To specify the number of splits and the number of reads edit the paprica_place_it.py flags in paprica_run.sh:

## default line
#python paprica_place_it.py -query $query -ref combined_16S.tax -splits 1
## new line
python paprica_place_it.py -query $query -ref combined_16S.tax -splits 3 -n 1000

This will cause paprica to subsample the query file (by random selection) to 1000 reads, and split the subsampled file into three query files that will be run in parallel. The parallel run is invisible to you; the output should be identical to a run with no splits (-splits 1). If you use subsampling you’ll also need to change the paprica_tally_pathways.py line, as the input file name will be slightly different.

## default line
#python paprica_tally_pathways.py -i $query.combined_16S.tax.clean.align.csv -o $query
## new line
python paprica_tally_pathways.py -i $query.sub.combined_16S.tax.clean.align.csv -o $query

Here we are only analyzing two samples, so running them manually isn’t too much of a pain. But you might have tens or hundreds of samples, and need a way to automate that. We do this with a simple loop. I recommend generating a file with the prefixes of all your query files and using that in the loop. For example the file samples.txt might have:

summer winter

This file can be inserted into a loop as:

while read f; do
  ./paprica_run.sh $f
done < samples.txt

Note that we don’t run them in parallel using, say, GNU Parallel, because Infernal is intrinsically parallelized, and we have already forced pplacer to run in parallel using -splits.

Once you’ve executed the loop you’ll see all the normal paprica output for both samples.  It’s useful to concatenate some of this information for downstream analysis.  The provided utility combine_edge_results.py can do this for you.  Copy it to your working directory:

cp ~/paprica/utilities/combine_edge_results.py combine_edge_results.py

This script will automatically aggregate everything with the suffix .edge_data.csv.  You need to specify a prefix for the output files.

python combine_edge_results.py my_analysis

This produces two files:

my_analysis.edge_data.csv: This file contains the mean genome parameters for each sample.  There’s lots of good stuff in here; see the column labels.

my_analysis.edge_tally.csv: Edge abundance for each sample (corrected for 16S rRNA gene copy).  This is your community structure, and is equivalent to an OTU table (but much better!).
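Since the edge tally is effectively an OTU table, a typical first step downstream is converting the copy-corrected counts to relative abundances.  A toy sketch (the edge names and numbers here are invented; check the actual column headers in your my_analysis.edge_tally.csv):

```python
def relative_abundance(counts):
    """Convert an {edge: count} dict for one sample to fractions summing to 1."""
    total = sum(counts.values())
    return {edge: c / float(total) for edge, c in counts.items()}

# toy sample: three edges with copy-corrected read counts
summer = {"edge_101": 500.0, "edge_202": 300.0, "edge_303": 200.0}
fractions = relative_abundance(summer)
print(fractions["edge_101"])  # 0.5
```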

To be continued…

Installing paprica on Mac OSX

Chasing Microbes in Antarctica - Wed, 01/20/2016 - 09:55

The following is a paprica installation tutorial for novice users on Mac OSX (installation on Linux is quite a bit simpler). If you’re comfortable editing your PATH and installing things using bash you probably don’t need to follow this tutorial, just follow the instructions in the manual. If command line operations stress you out, and you haven’t dealt with a lot of weird bioinformatics program installs, use this tutorial.

Please note that this tutorial is a work in progress.  If you notice errors, inconsistencies, or omissions please leave a comment and I’ll be sure to correct them.

paprica is 90% an elaborate wrapper script (or set of scripts) for several core programs written by other groups. The scripts that execute the pipeline are bash scripts; the scripts that do the actual work are Python. Therefore you need to get Python up and running on your system. The version that came with your system won’t suffice without heavy modification. It’s best to use a free third-party distro like Anaconda (preferred) or Canopy.  If you already have a mainstream v2.7 Python distro going, just make sure that the biopython, joblib, and pandas modules are installed and you’re good to go.

If not, please download the Anaconda distro and install it following the developer’s instructions. Allow the installer to modify your PATH variable. Once the installation is complete update it by executing:

conda update conda
conda update --all

Then you’ll need to install biopython, joblib, and pandas:

conda install biopython
conda install joblib
conda install pandas

In case you have conflicts with other Python installations, or some other mysterious problems, it’s a good idea to test things out at this point. Open a shell, type “python”, and you should get a welcome message that specifies Anaconda as your distro. Type:

import Bio
import joblib
import pandas

If you get any error messages something somewhere is wrong. Burn some incense and try again. If that doesn’t work try holy water.

One challenge with paprica on OSX has to do with the excellent program pplacer. The pplacer binary for Darwin needs the GNU Scientific Library (GSL), specifically v1.6 (at the time of writing). You can try to compile this from source, but I’ve had trouble getting this to work on OSX. The easier option is to use a package manager, preferably Homebrew. This means, however, that you have to marry one of the OSX package managers and never look back. Fink, Macports, and Homebrew will all get you a working version of GSL. I recommend using Homebrew.

To download Homebrew (assuming you don’t already have it) type:

ruby -e "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/master/install)"

Follow the on-screen instructions. Once it is downloaded type:

brew install gsl

This should install the GNU Scientific Library v1.6.

Assuming all that went okay, go ahead and download the software you need to execute just the paprica_run.sh portion of paprica. First, the excellent aligner Infernal. From your home directory:

curl -O http://selab.janelia.org/software/infernal/infernal-1.1.1-macosx-intel.tar.gz
tar -xzvf infernal-1.1.1-macosx-intel.tar.gz
mv infernal-1.1.1-macosx-intel infernal

Then pplacer, which also includes Guppy:

curl -LO https://github.com/matsen/pplacer/releases/download/v1.1.alpha17/pplacer-Darwin-v1.1.alpha17.zip
unzip pplacer-Darwin-v1.1.alpha17.zip
mv pplacer-Darwin-v1.1.alpha17 pplacer

Now comes the tricky bit: you need to add the locations of the executables for these programs to your PATH variable. Don’t screw this up. It isn’t hard to undo screw-ups, but it will freak you out. Before you continue please read the excellent summary of shell startup scripts as they pertain to OSX here:

http://hayne.net/MacDev/Notes/unixFAQ.html#shellStartup

Assuming that you are new to the command line, and did not have a .bash_profile or .profile file already, the Anaconda install would have created .profile and added its executables to your path. From your home directory type:

nano .profile

Navigate to the end of the file and type:

export PATH=/Users/your-user-name/infernal/binaries:/Users/your-user-name/pplacer:${PATH}

Don’t be the guy or gal who types your-user-name. Replace with your actual user name. Hit ctrl-o to write out the file, and ctrl-x to exit nano. Re-source .profile by typing:

source .profile

Confirm that you can execute the following programs by navigating to your home directory and executing each of the following commands:

cmalign
esl-alimerge
pplacer
guppy

You should get an error message that is clearly from the program, not a bash error like “command not found”.

Now you need to install the final dependency, Seqmagick. Confirm the most current stable release by going to Github, then download it:

curl -LO https://github.com/fhcrc/seqmagick/archive/0.6.1.tar.gz
tar -xzvf 0.6.1.tar.gz
cd seqmagick-0.6.1
python setup.py install

Check the installation by typing:

seqmagick mogrify

You should get a sensible error that is clearly seqmagick yelling at you.

Okay, now you are ready to download paprica and do some analysis! Download the latest stable version of paprica (don’t just blindly download, please check Github for the latest stable release):

curl -LO https://github.com/bowmanjeffs/paprica/archive/paprica_v0.23.tar.gz
tar -xzvf paprica_v0.23.tar.gz
mv paprica-paprica_v0.23 paprica

Now you need to make paprica_run.sh executable:

cd paprica
chmod a+x paprica_run.sh

At this point you should be ready to rock. Take a deep breath and type:

./paprica_run.sh test

You should see a lot of output flash by on the screen, and you should find the files test.pathways.csv, test.edge_data.csv, test.sample_data.txt, and test.sum_pathways.csv in your directory. These are the primary output files from paprica. The other files of interest are the Guppy output files test.combined_16S.tax.clean.align.phyloxml and test.combined_16S.tax.clean.align.jplace. Check out the Guppy documentation for the many things you can do with jplace files. The phyloxml file is an edge-fattened tree of the query placements on the reference tree. It can be viewed using Archaeopteryx or another phyloxml-capable tree viewer.

To run your own analysis, say on amazing_sample.fasta, simply type:

./paprica_run.sh amazing_sample

Please, please, please read the manual (included in the paprica download) for further details, such as how to greatly decrease the run time on large fasta files, and how to subsample your input fasta. Remember that the fasta file you input should contain only reads you are reasonably sure come from bacteria (an archaeal version is a long term goal), and they should be properly QC’d (i.e., low-quality ends, adapters, barcodes, and such trimmed away).

In Isolation, Community

Sampling the Barren Sea - Tue, 01/19/2016 - 10:38

By Frankie Pavia

I was talking to a colleague on board today while we were subsampling sediment cores we had taken from the last station. The cores were especially interesting – the entire surface was covered in manganese nodules, some the size of baseballs. Our conversation was interrupted by a mysterious occurrence. In one of the subcores we’d taken, there were manganese nodules sitting at 15 cm and 18 cm deep in the sediment. Conventional wisdom and that infamous beacon of knowledge, scientific consensus, stated that the nodules stayed on top of the sediment and were never buried after formation. There were also bright streaks of white carbonate nearby polluting the otherwise pristine red clay that occupied the rest of the core.


Mysterious manganese nodules in a core.

We had been talking about what it would be like to be back on land after a long cruise like this. My colleague has been to sea a few times before, and I was curious as to what she thought would be the most different to us upon returning to dry land. She explained that for her, the biggest change was interacting with strangers. There are only 64 people aboard the boat, and by now I can match a name to every face. I may not speak to them regularly, but I may have seen how they take their coffee, or what kind of cake they prefer in the afternoon, or exchanged a casual “moin” (hello) in the hallway. I haven’t seen a new face in almost four weeks.

I anticipated that being at sea might be lonely. I knew I would miss my friends and family. It has hit especially hard the past two Sundays when my hometown NFL team, the Seahawks, have played playoff games. I usually watch Seahawks games with my best friends in New York and fire texts back and forth to my friends I grew up with the entire time. Those are the times I am most in contact with the people I love. Sitting alone in my cabin aboard the ship, frantically updating Twitter, trying to follow the happenings and score of the game, feels especially isolating.

In a way, being a scientist is an isolating endeavor, no matter what. A friend of mine who writes for a hip-hop website is easy for any music lover to connect with. I talk to him every time a new mix tape drops, debating which tracks are the most fire. Another friend works for a soccer analytics company; he tracks the most popular sport in the world. I talk to him every time I’m watching an entertaining game or have a question about a soccer article I’ve read. But not many of my friends have burning questions about isotope geochemistry. The rare conversations we have had about protactinium have tended to be short and one-sided. I love talking about my research. I love learning about other peoples’ research. On land, I have limited opportunity to have these conversations.

On the ship, these conversations are nonstop. Oceanography is what the scientists on board all have in common – how could we not constantly talk about it? I might not know what someone’s favorite color is, or what town they grew up in. But I could probably give a pretty solid explanation of the questions they’re trying to answer with their research. I’ve detailed the systematics of protactinium and thorium isotopes countless times to other scientists on board and gotten genuinely interested responses, rather than blank stares. I began to understand what my colleague meant about interacting with strangers being the most difficult thing about returning to land. Returning to land will mean returning to the real world. There, my research and much of my identity will get suppressed until I can find my way back to the company of fellow scientists.

But as I had that realization, I was immediately distracted. The manganese nodules had made their first appearance within the deep sediment where they didn’t belong. Reality on land could wait. My colleague and I began to volley back and forth ideas about how they could have been emplaced so deep, and what experiments we could design to test our hypotheses. This is my beautiful reality at sea.

All I Wanted for Christmas Was for These Pumps to Work

Sampling the Barren Sea - Wed, 12/30/2015 - 08:58

The cruise track and sampling stations for the FS Sonne.

By Frankie Pavia

We’ve just completed our first full station and are remarkably pleased with the results. We collected 8 seawater samples to measure helium isotopes; 20 to measure thorium and protactinium isotopes; 7 in-situ pump filters to measure particulate thorium and protactinium isotopes; 6 manganese oxide cartridges that were attached to the pumps to measure actinium and radium isotopes; and 1 box core of the ocean floor to measure sedimentary thorium and protactinium isotopes. I was going to make this paragraph into the Twelve Days of Christmas song, but 7 pumps-a-pumping doesn’t really roll off the tongue that well.


A pump destined for the deep.

What all this means is that the first station was a smashing success for us. The only thing that didn’t quite go as planned was the nine-meter-long gravity corer coming up empty. We suspect it may have been due to the corer not being able to penetrate the hard carbonate layer, about 15 centimeters thick, that we saw in our box core. Nonetheless, we are delighted.

We were especially pleased that our in-situ pumps worked. We arrived on the cruise with the knowledge that the pumps would be there, but figured that somebody would be an expert on how to program them, maintain them and operate them. The pumps are essentially motors hung on a line deep in the water, drawing thousands of liters of water through a filter, catching the ocean’s suspended particles.

After a week of poring over the manual, we were finally ready to deploy the pumps. It would take 2.5 hours for the pumps to descend to 3,600 meters water depth, 6 hours of pumping, and 2.5 hours for the deepest pump to return. A convenient time to have them pump is overnight. Sleep is hard to come by while on station, so six hours of pumps pumping away at depth is a great excuse to scuttle off to bed.

We were pretty nervous as to whether they would actually work. We had invested a lot of time and energy getting them up and running. What a bummer it’d be if they spent six hours in the deep ocean not doing anything because I had accidentally programmed them to pump at the wrong time, or something. Our test run the previous day had been a bit spotty, too. The flow rate of the pumps had been about a third of what it should have been.

We woke up at 4 a.m. the next day to wait for the pumps to arrive back on deck, driven by caffeine and nervous energy. Christmas had been two days previous. On Christmas Eve the crew put on a terrific party in the hangar, and the pumps had been decorated with big red ribbons. We were about to find out whether the pumps were a present we actually wanted, or if they were one of those fancy battery-powered toys you get with a list of parts that has three missing and ends up never working.

All the pumps have names. We were able to name the four new pumps after ourselves, while the other four pumps already had names: Claudia, Bernhard, Sebastian, Frankie, Laura, Frauke, Jimmy and Hulda. They all seemed to have a little personality too – especially the old ones, Laura, Frauke, Jimmy, and Hulda. Parts of Laura were backwards, Hulda’s screws refused to come loose, Jimmy’s pump head had missing pieces.

Claudia was the first to arrive at the surface. Immediately upon getting her out of the water, we put a shower cap over the filter holder to protect the filter from contamination by atmospheric aerosols and any dust floating around the hangar. We pumped the remaining water from the bottom through the filter, removed the filter holder and brought it to the lab. We carefully unscrewed the top, opened it up, and…

The filter was covered in particles! One by one, the pumps came up with filters that were coated by an even distribution of particles. Everything worked perfectly. Even Laura, Hulda, and Jimmy, though they were stubborn above water, did everything they were supposed to do once they were submerged.

We plan to measure protactinium and thorium isotopes on the particles to learn about the kinetics of particle movement in the ocean – sinking rates, adsorption coefficients for trace metals, and export fluxes. Particles are the vectors that move elements out of the surface ocean, so studying their characteristics will be crucial for understanding how things like carbon and iron are pumped and exported to the deep.

Functional pumps meant that it was a happy Christmas for us. The next full station starts this afternoon. We’ll spend 42 hours sitting in one place, measuring dissolved, particulate, and sediment samples. Yesterday we had to change all the batteries on the pumps. Each pump requires 24 D batteries per deployment, and uses them all. So for every cast of 8 pumps, we use 192 D batteries. We’ll send the pumps out tonight and retrieve them at 4 a.m. again tomorrow morning.

We’re hoping these pumps are gifts that keep on giving.

Wrapping up the season

Chasing Microbes in Antarctica - Wed, 12/23/2015 - 09:34

One of many large chains of the diatom Chaetoceros now blooming in Arthur Harbor.

Yesterday morning the Gould returned to Palmer Station, which means that it’s time for Jamie and me to take off. I’m looking forward to getting home and working through all the data we’ve collected (and who wouldn’t want to spend Christmas sick in the Drake Passage?), but sad to be leaving at an ecologically interesting point in the season. After a particularly windy spring we’ve had a week of calm conditions. As expected this resulted in a huge increase in primary production. The water at our regular sampling stations has turned green almost overnight. In an ideal world we would have seen those conditions two weeks ago, at the height of our sampling, but there’s no predicting the timing of these events! Consistent with what we’ve seen in the minor blooms all season this major bloom is composed mostly of Chaetoceros. Instead of short chains however, we’ve got dense chains of many tens of cells. If these calm conditions persist a little longer it bodes well for the krill (and everything else) this season. To keep track of what the Palmer LTER group is up to for the remainder of the season you can check out Nicole Couto’s blog here.

All in all it was an extremely busy and productive early season.  Many thanks to everyone at Palmer Station for making it happen!


Celebrating the summer solstice: a new species of penguin clusters along the shoreline near Palmer Station.

Doing Science When There’s No Science to Be Done

Sampling the Barren Sea - Tue, 12/22/2015 - 13:16

By Frankie Pavia

Six days after we were supposed to have departed, the UltraPac scientists and ship’s crew remain stranded at port aboard the FS Sonne. Containers with the last of our missing science gear are on a truck driving up from San Antonio, Chile, where the port felt comfortable unloading our acids and radioisotopes. The Sonne’s spare parts are being unloaded from a ship across the harbor that I can see from my cabin’s windows. With an abundance of time and a dearth of work, we have begun to devise ways of doing science before we can actually do science at sea.


Setup for photographing particle filters, devised in port.

We first discussed how to optimize our sample depth selections. In the first three stations, the deep waters will be downwind of the East Pacific Rise, one of the fastest spreading mid-ocean ridges in the world. At ridge axes, water that has percolated through the ocean crust and weathered mantle-derived rocks is erupted back out by volcanism and hydrothermal vents. This ‘plume water’ bears a distinct signature of the Earth’s mantle – high in rare noble gases like 3He, biologically critical trace metals like iron and manganese, and small particles that are reactive sites for removing other elements like phosphorous, magnesium, and most importantly (for me!), protactinium and thorium.

When this plume water enters the ocean, it is very hot and less dense than the surrounding waters. It rises until it attains a state of neutral buoyancy – when its density is the same as ambient seawater. Then it simply moves and acts like any other water – in currents and eddies. But since it bears distinct chemical signatures, chemical oceanographers can easily find it – after they’ve measured something in it.

But we want to know where it is before we sample it. We want to understand the processes going on inside the plume. What kind of particles are there? How fast do they remove trace metals from the ocean? How much iron enters the ocean from submarine volcanism? If we are to answer these questions, we must first be able to sample exactly within the plume waters – which means we must know where they are before we deploy our bottles.

Luckily, past cruises from the World Ocean Circulation Experiment (WOCE) have measured helium isotopes and density in the Pacific before. As a result, we know roughly what density surface is associated with the neutrally-buoyant plume waters. When we sample, we will send down a line with a CTD sensor to measure temperature, salinity, and pressure, from which we can calculate density. That line will have our bottles on it. We can instantaneously calculate the density of the waters we are sampling, find the depth of the density surface we know is associated with plume waters, then tell our bottles to open and sample at that depth. Problem solved!
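The depth-targeting logic described above amounts to interpolating a CTD profile: find the depth at which the measured density reaches the value known to mark the plume.  A toy illustration, not the actual shipboard software, and with made-up numbers:

```python
def depth_of_density(depths, densities, target):
    """Linearly interpolate the depth at which density first reaches target.

    depths and densities are parallel lists ordered from shallow to deep;
    density increases with depth, so find the bracketing pair of points
    and interpolate between them.
    """
    for i in range(1, len(depths)):
        if densities[i - 1] <= target <= densities[i]:
            frac = (target - densities[i - 1]) / (densities[i] - densities[i - 1])
            return depths[i - 1] + frac * (depths[i] - depths[i - 1])
    return None  # target density not found in this profile

# made-up CTD profile: in-situ density (kg/m^3) vs depth (m)
depths = [2000, 2500, 3000]
densities = [1027.0, 1027.5, 1028.0]
print(depth_of_density(depths, densities, 1027.75))  # 2750.0
```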

We also set up an imaging system to take pictures of the particle filters we bring back. At seven depths of each station, we will deploy in-situ pumps that filter thousands of liters of seawater through a filter at a given depth. We then haul the pumps back to the surface, remove the filters, and analyze them.

We would like to photograph the filters before we analyze them so we can visually assess how much material there is on each filter, to confirm the results from our chemistry. To do this accurately, we must photograph every filter from the same angle, with the same lighting, with the same shutter speed. We went to a hardware store in town yesterday and bought some supplies, not knowing if the imagined setup would actually work.

It worked! We turned a lamp with a flexible stand for adjusting light height into a camera holder, decapitating the lamp portion and replacing it with a tripod holding the camera. Then we installed software allowing the camera to be controlled from a phone, so we could take pictures and adjust shutter speed remotely. We bought a clip-on lamp that will be attached to the camera holder for constant lighting (this one used for its true purpose!).

We are finally scheduled to receive our last missing container and depart port late tonight, around 22:00. While the delay has been frustrating, I suppose it hasn’t been all bad. We were scheduled to leave December 17, the day before the new Star Wars movie came out. Six extra days in port meant we were able to go into town to watch it. It was our last little leisure activity on land. Now it’s time for the ocean.

Day 2: What Am I Doing Here, Anyway?

Sampling the Barren Sea - Mon, 12/21/2015 - 10:03

The planned cruise track of the FS Sonne. The stars are stations that will be sampled during the UltraPac cruise. In color is the bathymetry of the seafloor.

By Frankie Pavia

Still stuck in Antofagasta, the scientists are becoming increasingly antsy. Every day we are stuck at the port is a day of sampling we won’t be able to do at sea. Every time we want to take a sample from the bottom of the ocean, at around 5,000 meters depth (16,404 feet), it will take us four hours to lower the line, several hours to do the sampling (fill bottles, pumps, etc.), and four hours to pull it back up. There are several of these casts at each of eight stations. Every hour we have at sea is precious for returning valuable samples.

What am I doing here, anyway? I am an oceanographer and an isotope geochemist. Originally, I only planned to measure naturally occurring radionuclides thorium and protactinium dissolved in seawater and stuck onto ocean particles. But slowly, more scientists found out there was a chance to get seawater from this part of the ocean and asked us to take samples for them.

The South Pacific Gyre is the most oligotrophic (nutrient-poor) region in the ocean. This makes it largely barren of life and matter—the waters are the clearest in the ocean. The sediments accumulate below the water at rates as low as 0.1 millimeter per thousand years. So, 10 centimeters of seafloor are equivalent to one million years of material deposition in the South Pacific.

The scarcity of particles and lack of eukaryotic life are two major reasons the South Pacific is fascinating to a chemical oceanographer.

Surface biology and dust deposition are the two main factors regulating the flux of particles through the ocean interior. Because the South Pacific is so far from land and upwind of major dust sources, almost no atmospheric material makes its way there. Since there is no dust, and no eukaryotes, the particles must largely be made up of tiny bacteria, of which there are millions in each milliliter of seawater.


Jellyfish floating near the boat. Photo: Frankie Pavia

Much of what we are setting out to do is simply the chemical characterization of the region. We are exploring the ocean using chemistry. We can’t see the scarce sinking particles, but trusty old thorium and protactinium can. They are extremely insoluble. Every time they encounter a particle, they stick to it. We exploit this simple characteristic to provide rare accounts of rates in the ocean. Just by measuring protactinium and thorium, we can calculate how fast particles are sinking through the water, how much dust is entering the water column, how fast different elements are being removed from the water at the seafloor, and more. It’s almost incomprehensible that two obscure elements can teach us so much.

These isotopes are the oceanographer’s equivalent to the Hubble telescope.
They help us see where we cannot. We measure thorium and protactinium to tell us input and removal rates. We measure helium isotopes to trace hydrothermal plumes in the deep ocean. We measure radium and actinium isotopes to determine the mixing rates of waters in the deep ocean. None of these processes are discernible by eye, yet all are crucial for understanding the chemical and physical state of the entire ocean.

So we continue to wait to make our measurements and do our science until we can depart. The void is filled by lighthearted scientific arguments, such as whether we could make a jetpack for one of the massive hordes of dead jellyfish floating around the boat. The idea is that you could throw a bit of dry ice underneath the jellyfish, which would then sublimate, expand, and rise out of the water, taking the jellyfish with it.

Ultimately, no one ever tried it. Who wants to do an experiment where you can just see the answer with your own eyes?

Frankie Pavia is a second-year graduate student studying oceanography and geochemistry at Lamont-Doherty Earth Observatory.

Setting Sail? Plan for the Unexpected

Sampling the Barren Sea - Fri, 12/18/2015 - 10:32

Stepping aboard. Photo: Frankie Pavia

By Frankie Pavia

Days -3 to 1, the delay: In the weeks before departing for my first scientific cruise, everyone I knew who had ever been to sea gave me some form of the same advice: nothing ever works the way you expect it to at sea. Four days ago, their words lingered heavily in my head as I groggily walked to board my final connection to Antofagasta, Chile. I wondered to myself, “How can the unexpected happen when I have no idea what to expect in the first place?”

I am typing from my cabin aboard the FS Sonne, a German research vessel scheduled to depart on a scientific sampling cruise between Antofagasta, Chile, and Wellington, New Zealand, crossing the entire South Pacific, between Dec. 17 and Jan. 28. Fellow Lamont-Doherty graduate student Sebastian Vivancos and I will be measuring a variety of chemical tracers in the ocean for our Ph.D. theses on the cruise. The South Pacific is a bit of a unicorn for oceanographers—due to its remoteness, cruises rarely go there, making it hard to get samples. We were presented with the opportunity to collect seawater aboard the FS Sonne as part of the UltraPac program and jumped at the chance.

We arrived three days ago to load the ship, with the help of our lab’s research technician and scientific cruise guru, Marty Fleisher. The expectation was that we would stay in a nearby hotel on the 14th and 15th, set up the ship, have everything ready to go by the 16th, stay that night aboard the Sonne, and depart the next morning.

My question was answered almost immediately. One of the bags filled with crucial last-minute additions to our sampling equipment had been lost in transit by the airline. After a massive series of calls and emails, we had a tracking number, but no idea whether the bag would arrive by the morning of the 17th, when we were due to depart. Some of the items we could buy at a local hardware store, but others were likely too specialized. I, for one, have never seen a hardware store that sells ceramic pizza slicers.


Sitting by the dock of the bay: the FS Sonne. Photo: Frankie Pavia

Once again the unexpected struck. Upon meeting with the other scientists on the cruise, we quickly learned that everyone was missing their supplies. Everyone else’s supplies were scheduled to arrive on the 18th—the day after the cruise was scheduled to depart. Then the bombshell: spare parts for the ship were still at least four days away from arriving, and we couldn’t depart until we had them.

The presence of the unexpected was forcing me to realize what my expectations were in the first place. I had been incredibly anxious leading up to the cruise about leaving my life behind for two months while I went to sea with limited connection to the outside world. I was hoping that the solitude of the ocean and the engagement of constant scientific work would buffer that anxiety. The going is slow. There’s not much work to do until the parts get here and we depart. I’m living on the ship, but my journey hasn’t yet begun.

The best-case scenario is that we leave on Sunday. No one has said anything about the worst-case scenario yet, but there have been rumblings of a Christmas in Chile. That leaves somewhere between three and seven more days of the uncertain, with the unexpected grasping my life’s puppet strings and postponing the adventure.

Listen in: Sachs & Cohen on the Paris Agreement

The 2015 Paris Climate Summit - Wed, 12/16/2015 - 13:33
Jeffrey Sachs


Earth Institute Director Jeffrey Sachs sat down with Brian Lehrer at WNYC on Tuesday to talk about what happens post-Paris. The climate talks are over, but the real work is just beginning. Sachs talks about the details of the agreement, what the implications are, and what obstacles we face moving forward on climate change.

Listen to the interview here.

Executive Director Steve Cohen was on WNYC on Monday talking with Soterios Johnson about the implications of the Paris accord on New York City. You can hear that interview here.

Understanding the Paris Climate Accord and Its Implications

The 2015 Paris Climate Summit - Tue, 12/15/2015 - 17:14
The climate agreement is adopted.


On Saturday, Dec. 12, 2015, 195 countries reached a history-making agreement to reduce their greenhouse gas emissions in order to avert the direst effects of climate change. The groundbreaking pact requires that nearly every country, large and small, developed or developing, take action.

Here are some of the best and most reliable resources to help you understand the Paris accord and its implications.

The COP21 official site asserts that the rise in global temperature must be kept under 2˚C above pre-industrial levels to avoid the most catastrophic effects of climate change, and establishes for the first time the aim of keeping the increase below 1.5˚C to protect island countries, which are most vulnerable to the risks of sea level rise.

The United Nations Framework Convention on Climate Change newsroom explained that the climate agreement encompasses mitigation, the effort to reduce emissions quickly enough to reach the temperature goal; a transparent and global stock-taking system to monitor progress; adaptation, to help countries deal with the impacts of climate change; loss and damage, to aid countries recovering from the impacts of climate change; and financial and technical support to help nations build sustainable resiliency.

To curb the temperature rise, countries submitted “nationally determined contributions” that indicate how much they will reduce their emissions and what actions they will take to do so, but these are not legally binding. The New York Times said that countries are legally bound by the agreement, however, to monitor and report on their emissions and progress, and ratchet up their efforts to reduce emissions in the future.

The climate pledges that have been made thus far will not cut emissions enough to keep below the 2˚ target, so beginning in 2018, countries must submit new plans every five years that increase their emissions reductions, reported CNN. There is, however, no mechanism to punish any country that violates its commitment.

The New York Times examined some salient points of the agreement. The aspiration to stay below 1.5˚C as part of the 2˚ limit makes this temperature target more ambitious than those of the past. Forests must be preserved, with continued incentives to reduce the deforestation and forest degradation that increase emissions. A transparent system will be established to evaluate how countries implement their nationally determined contributions, and countries must come up with increasingly ambitious reduction targets every five years. The parties are encouraged to reach a peak in greenhouse gas emissions as soon as possible. The agreement also recognizes loss and damage resulting from climate change impacts. And while the agreement does not set forth a specific dollar amount, developed countries are encouraged to provide and marshal financing from various sources to help developing countries.

Developed countries agreed to continue their commitment to provide $100 billion a year from 2020 until 2025, after which financing will increase. However, the $100 billion figure does not appear in the legally binding part of the agreement.

National Geographic took a look at some of the surprises, as well as the winners and losers of the climate agreement.

The International Energy Agency estimated that fulfilling all the climate pledges would entail investments of $13.5 trillion in energy efficiency and low-carbon technologies between 2015 and 2030. If $3 trillion more were invested, the temperature increase could be held to 2˚ C. While $16.5 trillion sounds like a huge sum, the world is projected to spend $68 trillion anyway by 2040 on energy systems. The climate agreement ensures that the investments will go towards low-carbon technologies.
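The arithmetic behind those figures is simple enough to check in a few lines. The variable names are mine; the dollar figures are the ones quoted above, in trillions of U.S. dollars.

```python
# Sanity check of the IEA investment figures quoted above (trillions USD).
pledged = 13.5           # investment implied by the national climate pledges, 2015-2030
extra_for_2C = 3.0       # additional investment needed to hold warming to 2 degrees C
total_2C = pledged + extra_for_2C
projected_energy_spend = 68.0   # projected world energy-system spending by 2040

print(total_2C)                           # 16.5
print(total_2C / projected_energy_spend)  # ~0.24: roughly a quarter of spending
                                          # already projected to happen anyway
```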

Each country will decide how best to fulfill its climate pledge. The Deep Decarbonization Pathways Project, an initiative of the Sustainable Development Solutions Network, has put together research teams from the world’s biggest greenhouse gas emitting countries that are developing concrete and detailed strategies for reducing emissions in their countries.

The World Resources Institute’s analysis of the accord said that it presents a new model of international cooperation where developed and developing countries are united and engaged in a common goal. The agreement also signals the recognition that acting to stem climate change can provide tremendous opportunities and benefits.

The accord will be open for signature at United Nations headquarters in New York City from April 22, 2016, to April 21, 2017, with a high-level signing ceremony on April 22, 2016. It will enter into force once it has been ratified by at least 55 countries representing at least 55 percent of global emissions.
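The entry-into-force rule is a double threshold, which a tiny sketch makes explicit. This is a hypothetical helper for illustration, not any official tally.

```python
# The accord's double-threshold entry-into-force test: at least 55 parties
# that together account for at least 55 percent of global emissions.
def in_force(num_ratified: int, emissions_share_pct: float) -> bool:
    return num_ratified >= 55 and emissions_share_pct >= 55.0

print(in_force(60, 40.0))  # False: enough countries, not enough emissions covered
print(in_force(55, 55.0))  # True: both thresholds met
```

The second condition matters because a large bloc of small emitters could otherwise bring the treaty into force without the major emitters on board.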


Secretary of State John Kerry addresses the country delegates after adoption of the agreement

REACTIONS TO THE CLIMATE AGREEMENT

Around the world, countries hailed the accord, while most newspaper editorial boards cheered the historic agreement and praised their own countries’ negotiators.

The Los Angeles Times questioned whether the countries of the world could truly work together to stay within the 2˚ target and if even the aspirational 1.5˚ goal was low enough to save us from catastrophic impacts, but called it a “good moment for the planet.”

The Washington Post said that the agreement will challenge climate deniers to “explain not only why they reject science but also why they would harm the U.S. standing in the world by seeking to slow the progress so many countries are making.”

The Wall Street Journal asserted that the accord would make the world poorer and slow technological progress. It is betting that the agreement will not make an impact on global temperatures because the commitments to reduce emissions are not legally binding.

Many businesses, such as Coca-Cola, DuPont, General Mills, HP and Unilever are supportive of the agreement, recognizing that the policies developed in accord with the Paris agreement would bring more certainty to investors and generate business opportunities, reported The New York Times. The climate agreement sends a strong signal to businesses and investors that the fossil fuel era is on its way out.

Over 100 corporations pledged to reduce their carbon emissions in the effort to support the climate agreement’s 2˚ target. InsideClimate News reported that companies like Wal-Mart, IKEA, Honda, Unilever and Xerox are participating in the new initiative organized by the World Resources Institute, World Wildlife Fund, Carbon Disclosure Project and the UN Global Compact.

Reuters provided a sampling of reactions by prominent political and business leaders around the world.

Paul Krugman, Nobel Prize-winning economist and New York Times op-ed columnist, said that despite the challenges that still exist, there is reason to believe the agreement can change the world’s trajectory because the costs of renewable energy have fallen so dramatically. This means reducing emissions will cost much less than was previously assumed.

Michael Mann, climatologist, geophysicist and Distinguished Professor of Meteorology at Pennsylvania State University, said, “One cannot understate the importance of the agreement arrived at in Paris. For the first time, world leaders have faced up to the stark warnings that climate scientists have been issuing for years instead of shrinking away with denial and delay.”

Bill McKibben, the founder of 350.org, the global climate campaign, called the climate pledges “modest.” While they might have kept the planet at 1.5˚ back in 1995 when the first climate conference occurred, he said, now we need to proceed at breakneck speed, leaving most of the remaining fossil fuels in the ground and transitioning to renewable energy as soon as possible.

James Hansen, former NASA scientist and leading climate scientist, called the agreement a “fraud” and “a fake.” Without a mechanism, such as a carbon tax, to drive up the cost of fossil fuels, he said, they will continue to be the cheapest fuels available and continue to be burned.

Representative Lamar Smith, R-Texas, chairman of the House Science, Space and Technology Committee, contended the Paris climate accord will slow economic growth in the U.S., raise electricity bills and have little impact on the environment. He believes the answer lies in relying on technological advances.

According to a 2014 report, climate denialism is more prevalent in the United States than in any other country in the world.

Many Republicans have vowed to fight President Obama’s climate agenda, The Wall Street Journal reported. Moreover, most of the Republican presidential candidates, if elected, would work to undo Obama’s executive actions on climate change. Because of the way the climate agreement is structured, however, it does not need to be approved by Congress.

Meanwhile, Bill Gates and other tech leaders such as Mark Zuckerberg, Richard Branson, Jeff Bezos and Jack Ma have established the Breakthrough Energy Coalition, committing billions of dollars to invest in early stage, high-risk, breakthrough energy companies because the world needs reliable, affordable, clean energy. They will also invest in Mission Innovation, a consortium of 20 countries, including the U.S., that have pledged to double their investment in clean energy over the next five years.


The Paris climate agreement is momentous and historic because the countries of the world have been struggling to deal with climate change for over 20 years. In 1992, numerous countries first joined an international treaty, the United Nations Framework Convention on Climate Change, to figure out how they could limit global temperature increases and cope with the impacts of climate change. Here is the history of earlier attempts to negotiate an effective agreement to deal with climate change.

 

VIDEOS

A standing ovation for the acceptance of the Paris accord 

President Obama announces the historic climate agreement

The World Bank Group’s president, Jim Yong Kim, on the climate agreement and its implications for business and investment:

Felipe Calderon, former president of Mexico, discusses the difference between the 2009 climate conference and COP21 with Tom Friedman, Pulitzer Prize-winning journalist

Secretary of State John Kerry talks to Tom Friedman about China’s climate progress

The meetings of COP21

A message from the world’s astronauts to COP21

New App Explores Ice & Sea Level Change through Time

American Geophysical Union Fall Meeting - Tue, 12/15/2015 - 15:39

Sea level trends, 1955-2012, from the app Polar Explorer: Sea Level

Why does sea level change at different rates? How has it changed in the past? Who will be at risk from more extreme weather and sea level rise in the future? Our scientists often hear questions like these from students, government officials and the media.

To help share the answers more widely, we created a new app that lets users explore a series of maps of the planet, from the deepest trenches in the oceans to the ice at the poles. You can see how ice, the oceans, precipitation and temperatures have changed over time and listen as scientists explain what you’re seeing and why.

“We wanted to make climate data accessible and engaging to the public, for everyone from students to interested adults. The data is displayed in interactive maps with just enough guidance to support independent exploration,” said Margie Turrin, education coordinator at Lamont-Doherty Earth Observatory, who designed the free app called “Polar Explorer: Sea Level” with Bill Ryan, Robin Bell, Dave Porter and Andrew Goodwillie. She is presenting the just-released app this week at the American Geophysical Union meeting in San Francisco.


Map and explanation of Antarctica’s elevation.

Turrin leads science education programs with schools across the New York region and hands-on public projects, including monitoring the health of the Hudson River Estuary. She and Ryan created the app with broad-based education in mind and have found a welcome reception from middle and high school teachers, undergraduate science instructors, and the inquisitive public. The app pulls global information from databases created by internationally respected scientific organizations, including the National Oceanic and Atmospheric Administration, the National Aeronautics and Space Administration, the U.S. Geological Survey and the Center for International Earth Science Information Network (CIESIN) at Lamont.

The app starts with the basics: How and why we study the remnants of old shore lines, what the sea floor looks like, and why the ocean is not actually flat.

It then looks more closely at how the oceans differ from region to region and how their temperature, salinity and sea level have changed over time. It explores the role of the atmosphere and changes in temperature, radiation and rainfall over time. Flip quickly through the rainfall map month by month over the past three years, and you’ll clearly see the rain band across the tropics and a time-lapse of how areas of rain shift over time.

In exploring the contribution of glaciers and ice sheets to sea level rise, you can find the largest glaciers in the world and see how the Greenland and Antarctic ice sheets changed over a decade. If the Greenland Ice Sheet, which sits on land, were to completely melt, it would raise sea level by 6 to 7 meters. The app explains why and then looks at the impact.
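The 6-to-7-meter figure can be roughly reproduced with round numbers: convert Greenland's ice volume to an equivalent volume of water and spread it over the surface of the world ocean. This is a deliberately crude sketch using approximate textbook values, not a glaciological calculation.

```python
# Rough check of the 6-7 m sea level rise figure for a complete Greenland melt.
ICE_VOLUME_KM3 = 2.85e6   # approx. volume of the Greenland Ice Sheet
ICE_TO_WATER = 0.917      # density ratio of ice to liquid water
OCEAN_AREA_KM2 = 3.61e8   # approx. surface area of the world ocean

water_volume_km3 = ICE_VOLUME_KM3 * ICE_TO_WATER
rise_m = water_volume_km3 / OCEAN_AREA_KM2 * 1000.0  # km -> m

print(round(rise_m, 1))  # ~7.2 m, consistent with the quoted 6-7 m range
```

The real answer is somewhat lower because ocean area grows as coastlines flood, which is one reason published estimates sit a little below this naive number.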


Economic risk from flooding.

Data from CIESIN and other organizations take these scientific explanations a step further, exploring the impact of sea level rise, flooding and extreme weather on communities around the world. The app answers questions about who is vulnerable by looking at population density, low-lying coastlines and islands, and historic hurricane and cyclone tracks and storm surges over the past 135 years. The data also maps the areas of the U.S., China and other parts of Asia that will be most vulnerable to economic losses as extreme weather increases and sea levels rise.

Sea level change is a topic that brings together complex science with both societal and economic impacts. Tying together the range of physical science interactions and matching them to the implications for human populations is important, yet not always easy to do. The app was designed to pull together the data and put it straight into people’s hands to explore and interact with.

While the dynamic data is primarily from recent decades, the app also draws in historical data to help tell the story of climate change through past centuries. It lets you compare three periods – 1700, before the Industrial Revolution; 5,000 years ago, during the mid-Holocene; and an interglacial period 125,000 years ago when temperatures were warmer than today. You can see air temperatures, sea surface temperatures, salinity, precipitation, evaporation, ice thickness and ice concentration at each point in time. You can also explore how large ice sheets in North America and Europe receded over the time span from 20,000 years ago to 5,000 years ago.

“Polar Explorer: Sea Level” works well in formal and informal education settings, but it holds lessons for everyone interested in understanding more about our planet, the climate and how and why sea level is rising, Turrin said.

The app is currently available for the iPad, and an iPhone version is due out in a few days. A browser version is also available for classrooms and seminars. You can download the app at http://www.polarexplorer.org.
