Chasing Microbes in Antarctica

Bowman Lab: Marine microbial ecology

Separating bacterial and archaeal reads for analysis with paprica

Fri, 09/07/2018 - 09:50

One of the most popular primer sets for 16S rRNA gene amplicon analysis right now is the 515F/806R set. One of the advantages of this pair is that it amplifies broadly across the domains Archaea and Bacteria. This reduces by half the amount of work required to characterize prokaryotic community structure, and allows a comparison of the relative (or absolute, if you have counts) abundance of bacteria and archaea.  However, paprica and many other analysis tools aren’t designed to simultaneously analyze reads from both domains.  Different reference alignments or covariance models, for example, might be required.  Thus it’s useful to split an input fasta file into separate bacterial and archaeal files.

We like to use the Infernal tool cmscan for this purpose.  First, you’ll need to acquire covariance models for the 16S/18S rRNA genes from all three domains of life.  You can find those on the Rfam website; they are also included in paprica/models if you’ve downloaded paprica.  Copy the models to a new subdirectory in your working directory, combining them into a single file:

mkdir cms
cat ../paprica/models/*.cm > cms/all.cm
cd cms

Now you need to compress and index the covariance models using the cmpress utility provided by Infernal.  Assuming the combined model file from the previous step is named all.cm, that looks like:

cmpress all.cm

This takes a while.
Pretty simple.  Now you’re ready to do some work.  The whole Infernal suite of tools has pretty awesome built-in parallelization, but with only three covariance models in the file you won’t get much out of it.  Best to minimize cmscan’s use of cores and instead push lots of files through it at once.  This is easily done with the GNU Parallel command:

ls *.fasta | parallel -u cmscan --cpu 1 --tblout {}.txt cms/all.cm {} > /dev/null

Next comes the secret sauce.  The command above produces an easy-to-parse, easy-to-read table with classification stats for each of the covariance models that we searched against.  Paprica contains a utility in paprica/utilities/ to parse the table, figure out which model scored best for each read, and then make three new fasta files, one for each of the domains Bacteria, Archaea, and Eukarya (the primers will typically pick up a few euks).  We’ll parallelize the script just as we did for cmscan.

ls *.fasta | parallel -u python ../paprica/utilities/pick_16S_domain.py -prefix {} -out {}

Now you have domain-specific files that you can analyze in paprica or your amplicon analysis tool of choice!
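If you’re curious what that parsing step boils down to, here’s a minimal sketch in plain Python.  It assumes Infernal’s default --tblout layout (model name in the first field, read name in the third, bit score in the fifteenth) and the Rfam SSU model names; paprica’s own utility is more complete than this.

```python
def best_domain_per_read(tblout_lines):
    """Return {read_id: domain}, keeping the highest-scoring model per read.

    Assumes Infernal's default --tblout layout: model name in field 1,
    query (read) name in field 3, bit score in field 15.
    """
    # Map covariance model names to domains.  These are the Rfam SSU
    # model names; adjust if your models are named differently.
    model_to_domain = {
        'SSU_rRNA_bacteria': 'bacteria',
        'SSU_rRNA_archaea': 'archaea',
        'SSU_rRNA_eukarya': 'eukarya',
    }
    best = {}  # read_id -> (score, domain)
    for line in tblout_lines:
        if line.startswith('#') or not line.strip():
            continue  # skip comment and blank lines
        fields = line.split()
        model, read, score = fields[0], fields[2], float(fields[14])
        domain = model_to_domain.get(model)
        if domain and (read not in best or score > best[read][0]):
            best[read] = (score, domain)
    return {read: domain for read, (score, domain) in best.items()}
```

From the resulting dictionary it’s straightforward to write each read out to its domain-specific fasta file.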



We’re joining MOSAiC!

Sat, 08/11/2018 - 14:56

The predicted drift track of Polarstern during MOSAiC, taken from the MOSAiC website.

This has been a (rare) good week for funding decisions.  A loooooong time ago when I was a third year PhD student I wrote a blog post that mentioned the Multidisciplinary drifting Observatory for the Study of Arctic Climate (MOSAiC) initiative.  The Arctic has changed a lot over the last couple of decades, as evidenced by the shift from perennial (year-round) to seasonal sea ice cover.  This is a problem for climate modelers, whose parameterizations and assumptions are based on observational records largely developed in the “old” Arctic.  MOSAiC was conceived as a coupled ocean-ice-atmosphere study to understand key climatic and ecological processes as they function in the “new” Arctic.  The basic idea is to drive the German icebreaker Polarstern into the Laptev Sea in the fall of 2019, tether it to an ice floe, and allow it to follow the transpolar drift (Fram style) for a complete year.  This will provide continuous time-series observations across the complete seasonal cycle, and an opportunity to carry out a number of key experiments.

I first attended a MOSAiC workshop in 2012 (when I wrote that first blog post).  It only took six years and two tries, but we’ve officially received NSF support to join the MOSAiC expedition.  Co-PI Brice Loose at URI and I will be carrying out a series of experiments to better understand how microbial community structure and ecophysiology control fluxes of O2 (as a proxy for carbon) and methane in the central Arctic Ocean.  The central Arctic Ocean is a weird place and we know very little about how these processes play out there. Like the subtropical gyres it’s extremely low in nutrients, but the low temperatures, extreme seasonality, (seasonal) sea ice cover, and continental shelf influences make it very different from the lower-latitude oceans.

The German icebreaker Polarstern, which will host the MOSAiC field effort.

CURE-ing microbes on ocean plastic

Thu, 08/09/2018 - 18:47

I was excited to learn today that our proposal CURE-ing microbes on ocean plastic, led by collaborators Ana Barral and Rachel Simmons at National University, was just funded by the National Science Foundation through the Hispanic Serving Institutions (HSI) program.  The Bowman Lab will play a small but important role in this project and it should be quite a bit of fun.  As a private, non-profit college, National University is a somewhat unusual organization that serves a very different demographic than UC San Diego.  A large number of the students at National University are non-traditional, meaning that they aren’t going straight to college from high school.  A significant number are veterans, and National University is classified as an HSI.  So working with National University is a great opportunity to interact with a very different student body.

A microbial colonization pilot experiment attached to our mounting bracket on the SIO Pier in May.  If you look closely you can see the biofilm developing on the plastic wafers inside the cage.  The composition of the biofilm and its potential to degrade plastics will be explored by students at National University.

For this project Ana and Rachel developed a course-based undergraduate research experience (CURE) around the topic of ocean plastics.  This is an issue that is getting quite a bit of attention in the popular press, but has largely fallen through the cracks as far as research goes, possibly because it’s too applied for NSF and too basic (and expensive) for NOAA.  A lot of somewhat dubious work is going on in the fringe around the theme of cleaning up marine plastics, but we lack a basic understanding of the processes controlling the decomposition of plastics and their entry into marine foodwebs (topics that do less to stoke the donor fires than cleanup does).  This CURE is a way to get students (and us) thinking about the science of marine plastics, specifically the colonization and degradation of plastics by marine microbes, while learning basic microbiology, molecular biology, and bioinformatic techniques.  The basic idea is to have the students deploy plastic colonization experiments in the coastal marine environment, isolate potential plastic degraders, sequence genomes, and carry out microbial community structure analysis.  Different courses will target different stages in the project, and students taking the right sequence of courses could be hands-on from start to finish.

Our role in the project is to provide the marine microbiology expertise.  Lab members will have an opportunity to give lectures and provide mentoring to National University students, and we’ll handle the deployment and recovery of plastic-colonization experiments on the SIO Pier (yay, more diving!).  We’ll also play a role in analyzing and publishing the data from these experiments.  Many thanks to lab alumnus Emelia DeForce (formerly with MoBio and now with Fisher Scientific) for bringing us all together on this project!


Ana Barral’s (center) B100A: Survey of Biosciences Lab at National University after recovery of the microbial colonization pilot experiment in June.  Thanks Allison Lee (at left) for coming in to dive on a Saturday morning!

A practical lesson in marine metals

Sat, 07/28/2018 - 10:36

This week Alyssa Demko from the Jenkins Lab and I dove to make repairs on our sample pump intake on the SIO pier.  Very soon the pump will supply water to our membrane inlet mass spectrometer and biological sampling manifold, so we’re eager to keep things in good working condition.  Our pump intake is secured to the pier by a very heavy stainless steel bracket.  When we first installed the bracket we opted for silicon bronze hardware; silicon bronze is pricey but among the most corrosion-resistant alloys available.  When we last dove I noticed the hardware was corroding very rapidly, to the point that a good storm would have ripped the whole contraption off the pier.  Fortunately it’s summer!  Here’s some of the hardware that we recovered from our dive:


When installed these were 5 x 0.5 inch bolts with 3/4 inch heads.  This is some serious corrosion for a four-month deployment!  Silicon bronze is supposed to be corrosion resistant, so what happened?

The problem is that when two or more metals are in contact in the presence of an electrolyte (like seawater) they interact.  Specifically, some metals like to donate electrons to other metals.  The metal doing the donating (called the anode) corrodes more quickly than the metal that receives them (called the cathode).  Because this transfer replaces electrons that the cathode loses to seawater, the presence of the anode actually slows the corrosion of the cathode.  This is a well-known process that we planned for when we designed the system, and we included a zinc plate called a sacrificial anode that serves no purpose other than to donate electrons to the stainless steel bracket.  Enter silicon bronze.

How readily one metal donates electrons to another metal is indicated by their locations on the galvanic series.  The further apart two metals are, the more readily electrons flow from the anodic to the cathodic metal.  Silicon bronze is more anodic than stainless steel, particularly the 316 alloy we are using, but I figured it was close enough.  Apparently not, however, and we didn’t account for other factors that can influence the rate of electron transfer.  An important one is the surface area of the cathode relative to the anode.  Remember that the cathode is losing electrons to seawater; this is what is driving the flow of electrons from the anode to the cathode in the first place (if you put them in contact in, say, air, mineral oil, or some other non-conductive medium nothing will happen).  So the more surface of the cathode is in contact with seawater, the more electrons will flow from the cathode to seawater, and from the anode to the cathode.  In our system the relatively small bronze bolts were attached to a very large stainless steel bracket, and I think this accounts for the rapid corrosion.
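The anode/cathode logic can be made concrete with a toy lookup.  The potentials below are illustrative seawater galvanic-series values, not measured numbers; the point is just that the metal with the more negative potential becomes the anode, and larger separations drive faster corrosion:

```python
# Approximate galvanic-series potentials in seawater (volts vs. a
# reference electrode).  Illustrative values for this sketch only;
# consult a published galvanic series for real design work.
POTENTIAL = {
    'zinc': -1.03,
    'silicon bronze': -0.29,
    '316 stainless (passive)': -0.08,
}

def anode(metal_a, metal_b):
    """Of two coupled metals, the one with the more negative potential
    donates electrons and corrodes preferentially (the anode)."""
    return min((metal_a, metal_b), key=POTENTIAL.get)

def driving_voltage(metal_a, metal_b):
    """Separation on the series; larger differences drive faster corrosion."""
    return abs(POTENTIAL[metal_a] - POTENTIAL[metal_b])
```

By these numbers silicon bronze sits a couple of tenths of a volt on the anodic side of 316 stainless, which is enough to matter when the cathode’s wetted surface area dwarfs the anode’s.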

There is one thing that I still don’t understand, which is why the zinc anode in the system didn’t protect the bronze and stainless steel.  Bronze will sacrifice itself for stainless steel (as we’ve clearly demonstrated), but zinc should sacrifice itself to bronze and stainless steel.  However, our sacrificial zinc anode looks almost as good as it did when I installed it.  In the meantime here’s a video of the impressive lobster party taking place on the piling where our sampling system is located (don’t even think about it, it’s a no-take zone!). At the end of the video you can see our shiny new 316 stainless steel bolts. Hopefully these last!

Simons ECIMMEE award

Thu, 05/10/2018 - 00:04

The Simons Foundation Early Career Investigator in Marine Microbial Ecology and Evolution (ECIMMEE) awards were announced today and I’m thrilled that the Bowman Lab was selected.  Our project is centered on using the SIO Pier as a unique platform for collecting ecological data at a high temporal resolution.  Consider that marine heterotrophic bacteria often have cell division times on the order of hours – even less under optimal conditions – and that entire populations can be decimated by grazing or viral attack on similarly short timescales.  A typical long term study might sample weekly; the resulting time-series is blind to the dynamics that occur on shorter time-scales.  This makes it challenging to model key ecological processes, such as the biogeochemical consequences of certain microbial taxa being active in the system.

Over the past year we’ve been slowly developing the capability to conduct high(er)-resolution time-series sampling at the SIO Pier.  This award will allow us to take these efforts to the next level and really have some fun, from an ecological observation and modeling point of view.  Our goal is to develop a sampling system that is agnostic to time, but instead observes microbial community structure and other parameters along key ecological gradients like temperature and salinity.  Following the methods in our 2017 ISME paper, we can model key processes like bacterial production from community structure and physiology data, allowing us to predict those processes for stochastic events that would be impossible to sample in person.

The SIO Pier provides a unique opportunity to sample ocean water within minutes of the lab. 

Sometimes it’s useful to visualize all that goes into scientific observations as a pyramid.  At the tip of this pyramid is a nice model describing our ecological processes of interest.  Way down at the base is a whole lot of head-scratching, knuckle-scraping labor to figure out how to make the initial observations to inform that model.  One of our key challenges – which seems so simple – is just getting water up to the pier deck where we can sample it in an automated fashion.  The pier deck is about 30 feet above the water, and most pumps designed to raise water to that height deliver much too much for our purposes.  We identified a pneumatic pumping system that does the job nicely, but the pump intake requires fairly intensive maintenance and a lot of effort to keep the biology off it.  Here’s a short video of me attempting to clean and reposition the (kelp covered) pump intake on Monday, shot by Gabriel Castro, a graduate student in the Marine Natural Products program at SIO (thanks Gabriel for the assist!).  Note the intense phytoplankton bloom and moderate swell, not an easy day!

New seascape analysis of the western Antarctic Peninsula

Wed, 04/18/2018 - 18:49

We have a paper “Recurrent seascape units identify key ecological processes along the western Antarctic Peninsula” that is now available via advance-online through the journal Global Change Biology.  I place full blame for this paper on my former postdoctoral advisor and co-author Hugh Ducklow.  Shortly after I arrived for my postdoc at the Lamont-Doherty Earth Observatory, Hugh suggested using all the core Palmer LTER parameters since the start of that program, and “some kind of multivariate method” to identify different years.  The presumption was that different years would map to some kind of recognizable ecological or climatic phenomenon.

At the time I knew nothing about seascapes or geospatial analysis.  However, I had been playing around with self-organizing maps (SOMs) to segment microbial community structure data.  I thought that similarly segmenting geospatial data would yield an interesting result, so we gave it a go.  This involved carefully QC’ing all the core Palmer LTER data since 1993 (sadly discarding several years with erroneous or missing parameters), interpolating the data for each year to build three-dimensional maps of each parameter (you can find these data here), then classifying each point in these maps with a SOM trained on the original data.  After a lot of back and forth with co-authors Maria Kavanaugh and Scott Doney, we elected to use the term “seascape unit” for different regions of the SOM.  Our classification scheme ultimately maps these seascape units to the original sampling grid.  By quantifying the extent of each seascape unit in each year we can attempt to identify similar years, and also identify climatic phenomena that exert controls on seascape unit abundance.
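The classification step at the end of that pipeline is conceptually simple once the SOM is trained: each interpolated grid point is assigned to the unit with the nearest weight vector.  A minimal sketch in plain Python (the weight vectors here are hypothetical; a real analysis would use a SOM library):

```python
def best_matching_unit(weights, point):
    """Index of the SOM unit whose weight vector is closest (squared
    Euclidean distance) to the observation."""
    return min(range(len(weights)),
               key=lambda i: sum((w - p) ** 2 for w, p in zip(weights[i], point)))

def classify_grid(weights, grid_points):
    """Assign every interpolated grid point to a seascape unit."""
    return [best_matching_unit(weights, p) for p in grid_points]
```

Counting how many grid points land in each unit per year then gives the seascape-unit abundances used in the rest of the analysis.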

If you’re scratching your head at why it’s necessary to resort to seascape units for such an analysis it’s helpful to take a look at the training data in the standard T-S plot space.

Fig. 2 from Bowman et al., 2018.  Distribution of the training data in a) silicate-nitrate space and b) T-S space.  The color of the points gives the seascape unit.

The “V” distribution of points is characteristic of the western Antarctic Peninsula (WAP), and highlights the strong, dual relationship between temperature and salinity.  The warmest, saltiest water is associated with upper circumpolar deepwater (UCDW) and originates from offshore in the Antarctic Circumpolar Current (ACC).  The coldest water is winter water (WW), which is formed from brine rejection and heat loss during the winter.  Warm, fresh water is associated with summer surface water (SW).  Note, however, that multiple seascape units are associated with each water mass.  The reason for this is that nutrient concentrations can vary quite a bit within each water mass.  My favorite example is WW, which we usually think of as rich in nutrients (nutrient replete); it is, but WW associated with SU 2 is a lot less nutrient rich than that associated with SU 3.  Both will support a bloom, but the strength, duration, and composition of the bloom is likely to differ.

To evaluate how different climatic phenomena might influence the distribution of seascape units in different years we applied elastic-net regression as described here.  This is where things got a bit frustrating.  It was really difficult to build models that described a reasonable amount of the variance in seascape unit abundance.  Where we could, the usual suspects popped out as good predictors; October and January ice conditions play a major role in determining the ecological state of the WAP.  But it’s clear that lots of other things do as well, or that the tested predictors are interacting in non-linear ways that make it very difficult to predict the occurrence of a given ecosystem state.

We did get some interesting results predicting clusters of years.  Based on hierarchical clustering, the relative abundance of seascape units in a core sampling area suggests two very distinct types of years.  We tested models based on combinations of time-lagged variables (monthly sea ice extent, fraction of open water, ENSO, SAM, etc.) to predict year-type, with June and October within-pack ice open water extent best predicting year-type.  This fits well with our understanding of the system; fall and spring storm conditions are known to exert a control on bloom conditions the following year.  In our analysis, when the areal extent of fall and spring within-pack ice open water is high (think cold but windy), chlorophyll the following summer is associated with a specific seascape (SU 1 below) that is found most frequently offshore.  When the opposite conditions prevail, chlorophyll the following summer is associated with a specific seascape (SU 8) that is found most frequently inshore.  Interestingly, the chlorophyll inventory isn’t that different between year-types, but the density and distribution of chlorophyll is, which presumably matters for the higher trophic levels (that analysis is somewhere on my to-do list).

Fig. 3 from Bowman et al., 2018.  Clustering of the available years into different year-types.  The extent of within-pack ice open water in June and October are reasonable predictors of year-type.  Panel A shows the relative abundance of seascape unit for each year-type.  Panel B shows the fraction of chlorophyll for each year that is associated with each seascape unit.

One of our most intriguing findings was a steady increase in the relative abundance of SU 3 over time.  SU 3 is one of those seascapes associated with winter water; it’s the low nutrient winter water variant.  That steady increase means that there has been a steady decrease in the bulk silicate inventory in the study area with time.  I’m not sure what that means, though my guess is there’s been an increase in early season primary production which could draw down nutrients while winter water is still forming.

Fig. 5 from Bowman et al., 2018.  SU 3, a low-nutrient flavor of winter water, has been increasing in relative abundance.  This has driven a significant decrease in silicate concentrations in the study area during the Palmer LTER cruise.

Tutorial: How to make a map using QGIS

Mon, 01/29/2018 - 16:31

Hi! I’m Natalia Erazo, currently working on the Ecuador project aimed at examining biogeochemical processes in mangrove forests. In this tutorial, we’ll learn the basics of (free) QGIS: how to import vector data and make a map using data obtained from our recent field trip to the Ecological Reserve Cayapas Mataje in Ecuador!  We’ll also learn standard map elements and the QGIS Print Composer function for generating a map.


I. Install QGIS

II. Learn how to upload raster data using the Plugin OpenLayers and QuickMap services.

III. Learn how to import vector data: import latitude, longitude data and additional data. Learn how to select attributes from the data e.g., salinity values and plot them.

IV. Make a map using Print Composer in QGIS.

I. QGIS- Installation

QGIS is a very powerful, user-friendly, open-source geographic information system that runs on Linux, Unix, macOS, and Windows. QGIS can be downloaded here. You should follow the instructions and install gdal complete.pkg, numpy.pkg, matplotlib.pkg, and qgis.pkg.

II. Install QGIS Plug-in and Upload a base map.

  1. Install QGIS Plug-in

Go to Plugins and select Manage and Install plugins. This will open the plugins dialogue box; type OpenLayers Plugin and click on Install plugin.

This plugin will give you access to Google Maps, OpenStreetMap layers, and others, and it is very useful for making quick maps from Google satellite, physical, and street layers. However, the OpenLayers plugin can generate zoom errors in your maps.  There is another plugin, QuickMapServices, which uses tile servers rather than the direct API for getting Google layers and others. This is a very useful plugin which offers more options for base maps and fewer zoom errors. To install it you should follow the same steps as you did for the OpenLayers plugin, except this time you’ll type QuickMapServices and install the plugin.

Also, if you want to experiment with QuickMapServices you can expand the plugin: go to Web -> QuickMapServices -> Settings -> More services and click on Get contributed pack. This will generate more options for mapping.

2. Add the base layer Map:

I recommend playing with the various options in either OpenLayers like the Google satellite, physical, and other maps layers, or QuickMap Service.

For this map, we will use the ESRI library from QuickMapServices. Go to Web -> QuickMapServices -> ESRI -> ESRI Satellite.

You should see your satellite map.

You can click on the zoom-in icon to adjust the zoom, as shown in the map below where I zoomed in on the Galapagos Islands. You’ll also notice that on the left side you have a Layers panel box; this box shows all the layers you add to your map. Layers can be raster data or vector data; in this case we see the layer ESRI Satellite. At the far left you’ll see a list of icons that are used to import your layers. It is important to know what kind of data you are importing to QGIS so you can use the correct function.

III. Adding our vector data.

We will now add our data file, which contains the latitude and longitude of all the sites where we collected samples, in addition to values for salinity, temperature, and turbidity. You can do this with your own data by creating a file in Excel with a column for longitude, a column for latitude, and columns for other variables, saved as a csv file. To input the data go to the icons on the far left and click on “Add Delimited Text Layer”. Or you can click on Layer -> Add Layer -> Add Delimited Text Layer.

You’ll browse to the file with your data. Make sure that csv is selected for File format. Additionally, make sure that X field represents the column for your longitude points and Y field for latitude. QGIS is smart enough to recognize longitude and latitude columns but double check! You can also see an overview of the data with columns for latitude, longitude, Barometer mmHg, conductivity, Salinity psu and other variables. You can leave everything else as default and click ok.

You’ll be prompted to select a coordinate reference system, and this is very important: if you do not select the right one your points will end up in the wrong location. For GPS coordinates, like the data we are using here, you need to select WGS 84 (EPSG:4326).

Now we can see all the points where we collected data!

As we saw earlier, the data contains environmental measurements such as: salinity, turbidity, temperature and others. We can style the layer with our sampling points based on the variables of our data. In this example we will  create a layer representing salinity values.

You’ll right click on the layer with our data in the Layer Panel, in this case our layer: 2017_ecuador_ysi_dat.. and select properties.

There are many styles you can choose for the layer, and the styling options are located in the Style tab of the Properties dialogue. Clicking on the drop-down button in the Style dialogue, you’ll see there are five options available: Single Symbol, Categorized, Graduated, Rule Based, and Point Displacement. We’ll use Graduated, which allows you to break down the data into unique classes. Here we will use the salinity values and classify them into 3 classes: low, medium, and high salinity. There are 5 modes available in the Graduated style to do this: Equal Interval, Quantile, Natural Breaks, Standard Deviation, and Pretty Breaks. You can read more about these options in the QGIS documentation.

In this tutorial, for simplicity, we’ll use the Quantile option. This method decides the classes such that the number of values in each class is the same; for example, if there are 100 values and we want 4 classes, the quantile method decides the classes such that each class will have 25 values.
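If you’re curious what the quantile (equal count) mode is doing under the hood, it can be sketched in a few lines of Python (illustrative only; QGIS handles this for you):

```python
def quantile_breaks(values, n_classes):
    """Upper break values for n_classes classes holding roughly equal
    numbers of observations (a sketch of QGIS's Quantile mode)."""
    s = sorted(values)
    # The k-th break is the largest value in the k-th equal-count slice.
    return [s[int(len(s) * k / n_classes) - 1] for k in range(1, n_classes + 1)]
```

For 100 values and 4 classes this puts 25 values in each class, just as in the example above; for our salinity layer we would ask for 3 classes instead.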

In the Style section: Select->Graduated, in Column->salinity psu, and in color ramp we’ll do colors ranging from yellow to red.

In the Classes box enter 3 and select Mode -> Quantile. Click on Classify, and QGIS will classify your values into different ranges.

Now we have all the data points colored in the 3 different ranges: low, medium, and high salinity.

However, we have a lot of points and it is hard to visualize the data points. We can edit the points by right clicking on the marker points and select edit symbol.

Now, I am going to get rid of the black outline to make the points easy to visualize. Select the point by clicking on Simple Marker and in Outline style select the No Pen. Do the same for the remaining two points.

Nice, now we can better see variations in our points based on salinity!

IV. Print Composer: making a final map

We can start to assemble the final version of our map. QGIS has a Print Composer where you can edit your map. Go to Project -> New Print Composer.

You will be prompted to enter a title for the composer, enter the title name and hit ok. You will be taken to the Composer window.

In the Print composer window, we want to bring the map view that we see in the QGIS canvas to the composer. Go to Layout-> Add a Map. Once the Add map button is active, hold the left mouse and drag a rectangle where you want to insert the map. You will see that the rectangle window will be rendered with the map from the main QGIS canvas.

You can see in the far right end the Items box; this  shows you the map you just added. If you want to make changes, you’ll select the map and edit it under item properties. Sometimes it is useful to edit the scale until you are happy with the map.

We can also add a second map of the location of Cayapas Mataje in South America as a  geographic reference. Go to the main qgis canvas and zoom out the map until you can see where in South America the reserve is located.

Now go back to Print Composer and add the map of  the entire region. You’ll do the same as with the first map. Go to Layout–> Add map. Drag a rectangle where you want to insert the map. You will see that the rectangle window will be rendered with the map from the main QGIS canvas. In Items box, you can see you have Map 0 and Map 1. Select Map 1, and add a frame under Item properties, click on Frame to activate it and adjust the thickness to 0.40mm.

We can add a North arrow to the map. The print composer comes with a collection of map related images including many North arrows. Click layout–> add image.

Hold on the left mouse button, draw a rectangle on the top-right corner of the map canvas.

On the right-hand panel, click on the Item Properties tab and expand the Search directories and select the north arrow image you like the most. Once you’ve selected your image, you can always edit the arrow under SVG parameters.

Now we’ll add a scale bar. Click on Layout–> Add a Scale bar. Click on the layout where you want the scale bar to appear. Choose the Style and units that fit your requirement. In the Segments panel, you can adjust the number of segments and their size. Make sure Map 0 is selected under main properties.

I’ll add a legend to the map. Go to Layout–> add a Legend. Hold on the left mouse button, and draw a rectangle on the area you want the legend to appear. You can make any changes such as adding a title in the item properties, changing fonts and renaming your legend points by clicking on them and writing the text you want.

It’s time to label our map. Click on Layout -> Add Label. Click on the map and draw a box where the label should be. In the Item Properties tab, expand the Label section and enter the text as shown below. You can also make additional changes to your font and size by editing the label under Appearance.

Once you have your final version, you can export it as an Image, PDF, or SVG. For this tutorial, let’s export it as an image. Click Composer -> Export as Image.

Here is our final map!

Now you can try the tutorial with your own data. Making maps is always a bit challenging but put your imagination to work!

Here is a list of links that could help with QGIS:

-QGIS blog with various tutorials and new info on functions to use: here.

-If you want more information on how QGIS handles symbol and vector data styling: here  is a good tutorial.

-If you need data, a good place to start is Natural Earth: Free vector and raster basemap data used for almost any cartographic endeavor.

If you have specific questions please don’t hesitate to ask.


Hunting for halophiles at the South Bay Saltworks

Sat, 12/16/2017 - 17:23

Hello! My name is Melissa Hopkins. I just finished my first quarter as an undergraduate researcher in the Bowman Lab. The project I am working on involves the diversity of halophiles from the South Bay Saltworks lakes in San Diego. The South Bay Saltworks is an active solar salt harvesting facility that is part of the San Diego Bay National Wildlife Refuge. The goal of this project is to use 16S/18S community structure to identify microbial taxa that are currently poorly represented in existing genomes. We want to contribute new halophile genomes to learn more about halophiles, and the community structure data helps us decide which new genomes to sequence.

Aharon Oren’s open access review “Microbial life at high salt concentrations: phylogenetic and metabolic diversity” explains the different classes and orders that contain halophiles, as well as the similarities of strategies used by these halophiles. Here, he uses the definition of halophiles as microbes that are able to tolerate 100 g/L salt but grow optimally at 50 g/L salt (seawater contains 35 g/L salt). Extreme halophiles (including the haloarchaea) are defined as growing best at salt concentrations of 2.5-5.2 M, and moderate halophiles as growing optimally at salt concentrations of 1.5-4.0 M. This assumes that the salt is sodium chloride; however, different salts such as magnesium chloride can present additional challenges to life.
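To compare the g/L and molar figures above it helps to convert between the two units. A quick sketch, assuming the salt is pure NaCl (molar mass about 58.44 g/mol):

```python
NACL_G_PER_MOL = 58.44  # molar mass of NaCl in g/mol

def g_per_l_to_molar(g_per_l):
    """Convert an NaCl concentration from g/L to mol/L."""
    return g_per_l / NACL_G_PER_MOL

def molar_to_g_per_l(molar):
    """Convert an NaCl concentration from mol/L to g/L."""
    return molar * NACL_G_PER_MOL
```

By this conversion, seawater at 35 g/L is about 0.6 M, the 100 g/L tolerance threshold is about 1.7 M, and the extreme-halophile upper bound of 5.2 M corresponds to roughly 300 g/L, near saturation.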

Halophiles are able to survive these high salt concentration environments in two different ways: by pumping salt ions into their cells from the surrounding environment, or by synthesizing organic solutes to match the concentration of their surrounding environment. Synthesizing organic solutes keeps salt ions out of the cytoplasm, but is more energetically expensive because of the energy needed to make the high concentration of organic solutes required. Nonetheless, this strategy is found widely across halophile species. Pumping in specific salt ions that don’t interfere with biological processes is less energetically expensive, but requires the proteins in the cell to be specially adapted to high salt conditions. Because of this, that strategy is not seen as often across the different species of halophiles. Different families and orders of halophiles use variations of these strategies to survive. Some halophile species, such as Salinibacter, are outliers, in that they use a different survival strategy than the other halophiles they are related to. Sequencing many more halophile genomes will give us new information on how these adaptive strategies are distributed across different halophiles.

For this project, Jeff, Natalia, and I spent the day sampling lakes at the South Bay Saltworks on October 6. Our goal was to sample from 3 different points in each of several lakes of different salt concentrations, and to sample as many lakes as possible.

We started out at the lower salinity lakes to see how the equipment would function and get used to the sampling process. At each point, we took unfiltered samples using a peristaltic pump for respiration tests, bacterial abundance, measurements of photosynthetic efficiency and chlorophyll concentration, turbidity, and salinity.

We then placed a GF/F filter on the housing of the pump to collect a coarsely filtered sample for ion composition analysis, FDOM, and dissolved inorganic nutrients. Finally, we placed a 0.2 micron filter on the housing to collect bacteria, archaea, and phytoplankton for DNA analysis.

In all, we sampled 7 lakes: one low salt concentration lake, one medium salt concentration lake, 2 high salt concentration lakes, and 3 magnesium chloride lakes. Unfortunately, due to time constraints and the high viscosity of the magnesium chloride lakes (it’s like trying to filter maple syrup), we were only able to sample from 1 point in each of the magnesium chloride lakes.

Natalia and I setting up for sampling on one of the lower salinity lakes.  You can see the large piles of harvested salt in the background.

One of the magnesium chloride lakes we sampled. Due to the high salt concentration and extremely high attraction between water and magnesium chloride these lakes have an oily texture and high viscosity, making them difficult to sample.

One of the saltier sodium chloride lakes that we sampled from.  The pink color comes from pigments in the different microorganisms that we’re studying.

Ecuador update

Sun, 11/26/2017 - 19:52

Today we took the last sample of our Ecuador field effort, though we have a few days left in-country. Right now we are in the town of Mompiche, just down the coast from our second field site near Muisne. Tomorrow we’ll be sorting out gear and getting ready for a few days of meetings in Guayaquil. Then it’s time to fly home and start working up some data! I’m too tired to write a coherent post on the last two (intensive!) weeks of sampling, but here’s a photo summary of our work in the Cayapas-Mataje Ecological Reserve, where we spent the bulk of our time. Enjoy!

Pelicans congregating along a front in the Cayapas-Mataje estuary.

The town of Valdez, near the mouth of the Cayapas-Mataje estuary.  The Reserve is right on the border with Colombia, and up until a few years ago Valdez had a reputation as a trafficking hub.  Drug trafficking is still a problem throughout the Reserve, but with the conflict with FARC more or less over I understand that tension in the local communities has gone way down.  Valdez seems okay, and the people we met there were friendly.

Another view of Valdez.

Shrimp farm in the Cayapas-Mataje Ecological Reserve.  You can’t build a new farm in the Reserve, but old shrimp farms were grandfathered in when the Reserve was created.

The R/V Luz Yamira, at its home port of Tambillo.  Tambillo was a vibrant, friendly little town where we spent a bit of time.  The town is struggling to hold onto its subsistence fishing lifestyle in the face of declining fish stocks.

ADCP, easy as 1-2-3

Birds of a feather…

Morning commute from the city of San Lorenzo.

The Cayapas-Mataje Ecological Reserve has the tallest mangrove trees in the world.

I took this picture just for the good people at UCSD risk management.

Team Cayapas-Mataje.  From left; Jessie, Natalia, and Jeff.  We are standing in front of Jessie’s house in Tambillo.  Many thanks to Jessie and his wife for letting us stay a night and get away from the craziness of San Lorenzo!

A very full car ready to head to Muisne.  It’s a good thing Natalia and I are both fairly short.

A large shrimp farm near Muisne.  The area around Muisne has been almost entirely deforested for shrimp aquaculture.  By comparing this area with the more pristine Cayapas-Mataje, we hope to better understand the biogeochemical consequences of coastal land-use change.

Initial field effort in Ecuador

Tue, 11/14/2017 - 03:08

As any reader of this blog will know, most of the research in the Bowman Lab is focused on polar microbial ecology. Although focusing a research program on a set of geographically-linked environments does have advantages, primarily the ability to spend more time thinking in depth about them, there is, I think, something lost with this approach. Insights are often gained by bringing a fresh perspective to a new study area, or by applying lessons learned in such an area to places that one has studied for years. With this in mind, lab member Natalia Erazo and I are launching a new field effort in coastal Ecuador. Natalia is an Ecuadorean native, and gets credit for developing the idea and sorting out the considerable logistics for this initial effort. Our first trip is very much a scouting effort, but we will carry out some sampling in the Cayapas-Mataje Ecological Reserve and near the town of Muisne. Depending on funding we hope to return during the rainy season in January-February for a more intensive effort.

Dense coastal forest in the Cayapas-Mataje Ecological Reserve.  Photo Natalia Erazo.

Our primary objective is to understand the role of mangrove forests in coastal biogeochemical cycling. Mangroves are salt-tolerant trees that grow in tropical and sub-tropical coastal areas around the world. They are known to provide a range of positive ecosystem functions; serving as fish habitat, stabilizing shorelines, and providing carbon and nutrient subsidies to the coastal marine environment. Globally mangroves are under threat. The population density of many tropical coastal areas is increasing, and that inevitably leads to land-use changes (such as deforestation) and a loss of ecosystem services – the social and economic benefits of an ecosystem – as economic activity ramps up. The trick to long-term sustainability is to maintain ecosystem services during economic development, allowing standards of living to increase in the short term without a long-term economic loss resulting from ecological failure (such as the collapse of a fishery or catastrophic coastal flooding). This is not easily done, and requires a much better understanding than we often have of exactly what functions the specific at-risk components of an ecosystem provide.

One particular land-use threat to the mangrove ecosystem is shrimp aquaculture. Mangrove forests in Ecuador and in other parts of the world have been deforested on a massive scale to make room for shrimp aquaculture ponds. In addition to scaling back any ecosystem functions provided by the mangrove forest, the shrimp aquaculture facilities are a source of nutrients in the form of excrement and excess feed. On this trip we will try to locate estuaries more and less perturbed by aquaculture. By comparing nutrient and carbon concentrations, sediment load, and microbial community structure between these areas, we will gain a preliminary understanding of what happens to the coastal ecosystem when mangroves are removed and aquaculture facilities are installed in their place.

Our first stop on this search will be San Lorenzo, a small city in the Cayapas-Mataje Ecological Reserve near the border with Colombia. I’m extremely excited to visit the Reserve, which has the distinction of hosting the tallest mangrove trees anywhere on Earth. We may have some meager internet access in San Lorenzo, so I’ll try to update this blog as I’m able. Because of the remote nature of some of our proposed field sites we’ll have a Garmin InReach satellite messenger with us. We plan to leave the device on during our field outings, so you can track our location in real time on the map below! The Garmin map interface is a bit clunky; you should ignore the other Scripps Institution of Oceanography users listed on the side panel, as I can’t seem to make them disappear.

Microbial session at POLAR 2018 in Davos

Tue, 10/03/2017 - 10:38

Davos, Switzerland, site of the POLAR2018 conference.  Image from

With colleagues Maria Corsaro, Eric Collins, Maria Tutino, Jody Deming, and Julie Dinasquet I’m convening a session on polar microbial ecology and evolution at the upcoming POLAR2018 conference in Davos, Switzerland.  POLAR2018 is shaping up to be a unique and excellent conference; for the first time (I think) the major international Arctic science organization (IASC) is joining forces with the major international Antarctic science organization (SCAR) for a joint meeting.  Because many polar scientists specialize in one region, and thus have few opportunities to learn from the other, a joint meeting will be a great opportunity to share ideas.

Here’s the full-text description for our session.  If your work is related to microbial evolution, adaption, or ecological function in the polar regions I hope you’ll consider applying!


Denitrifying bacteria and oysters

Thu, 09/21/2017 - 16:03

The eastern oyster.  Denitrification never tasted so good!  Photo from

I’m happy to be co-author on a study that was just published by Ann Arfken, a PhD student at the Virginia Institute of Marine Science (VIMS).  The study evaluated the composition of the microbial community associated with the eastern oyster Crassostrea virginica to determine if oysters play a role in denitrification.  Denitrification is an ecologically significant anaerobic microbial metabolism.  In the absence of oxygen certain microbes use other oxidized compounds as electron acceptors.  Nitrate (NO3–) is a good alternate electron acceptor, and the process of reducing nitrate to nitrite (NO2–), and ultimately to elemental nitrogen (N2), is called denitrification.  Unfortunately nitrate is needed by photosynthetic organisms like plants and phytoplankton, so the removal of nitrate can be detrimental to ecosystem health.  Oxygen is easily depleted in the guts of higher organisms by high rates of metabolic activity, creating a niche for denitrification and other anaerobic processes.

Predicted relative abundance (paprica) as a function of measured (qPCR) relative abundance of nosZI genes.  From Arfken et al. 2017.

To evaluate denitrification in C. virginica, Arfken et al. coupled actual measurements of denitrification in sediments and oysters with an analysis of microbial community structure in oyster shells and digestive glands.  We then used paprica with a customized database to predict the presence of denitrification genes, and validated the predictions with qPCR.

I was particularly happy to see that the qPCR results agreed well with the paprica predictions for the nosZ gene, which codes for the enzyme responsible for reducing nitrous oxide (N2O) to N2.  I believe this is the first example of qPCR being used to validate metabolic inference – which currently lacks a good method for validation.  Surprisingly, however (at least to me), denitrification in C. virginica was largely associated with the oyster shell rather than the digestive gland.  We don’t really know why this is.  Arfken et al. suggest rapid colonization of the oyster shell by denitrifying bacteria that also produce antibiotic compounds to exclude predators, but further work is needed to demonstrate this!
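For readers curious what this kind of validation looks like in practice, here is a minimal sketch in R using made-up relative abundances; the real data, and the actual statistics, are in Arfken et al. 2017:

```r
# Hypothetical example: compare metabolic-inference predictions against
# qPCR measurements with a simple linear model. These numbers are
# invented for illustration only.
predicted <- c(0.02, 0.05, 0.11, 0.08, 0.20)  # predicted relative abundance
measured  <- c(0.03, 0.04, 0.10, 0.09, 0.18)  # qPCR-measured relative abundance

model <- lm(predicted ~ measured)
summary(model)$r.squared  # strength of agreement between methods

plot(measured, predicted,
     xlab = 'Measured relative abundance (qPCR)',
     ylab = 'Predicted relative abundance (paprica)')
abline(model)
```

A high R-squared, as in the figure above, suggests the inference is capturing real variation in gene abundance rather than noise.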

Analyzing flow cytometry data with R

Fri, 08/11/2017 - 11:33

We recently got our CyFlow Space flow cytometer in the lab and have been working out the kinks.  From a flow cytometry perspective the California coastal environment is pretty different from the western Antarctic Peninsula, where I’ve done most of my flow cytometry work.  Getting my eyes calibrated to a new flow cytometer and the coastal California environment has been an experience.  Helping me on this task is Tia Rabsatt, a SURF REU student from the US Virgin Islands.  Tia will be heading home in a couple of weeks, which presents a challenge; once she leaves she won’t have access to the proprietary software that came with the flow cytometer.  To continue analyzing the data she collected over the summer as part of her project she’ll need a different solution.

To give her a way to work with the FCS files I put together a quick R script that reads in the file, sets some event limits, and produces a nice plot.  With a little modification one could “gate” and count different regions.  The script uses the flowCore package to read in the FCS format files, and the hist2d command in gplots to make a reasonably informative plot.

library('flowCore')
library('gplots')

#### parameters ####

f.name <- ''          # name of the file you want to analyze, without the ".FCS" extension
sample.size <- 1e5    # number of events to plot, use "max" for all points
fsc.ll <- 1           # FSC lower limit
ssc.ll <- 1           # SSC lower limit
fl1.ll <- 1           # FL1 lower limit (ex488/em536)

#### functions ####

## plotting function

plot.fcm <- function(fcm, x.param, y.param){
  hist2d(log10(fcm[,x.param]),
         log10(fcm[,y.param]),
         col = c('grey', colorRampPalette(c('white', 'lightgoldenrod1', 'darkgreen'))(100)),
         nbins = 200,
         bg = 'grey',
         ylab = paste0('log10(', y.param, ')'),
         xlab = paste0('log10(', x.param, ')'))
  box()
}

#### read in file ####

fcm <- read.FCS(paste0(f.name, '.FCS'))
fcm <- as.data.frame(exprs(fcm))

#### analyze file and make plot ####

## eliminate values that are below or equal to the thresholds you
## defined above

fcm$SSC[fcm$SSC <= ssc.ll|fcm$FSC <= fsc.ll|fcm$FL1 <= fl1.ll] <- NA
fcm <- na.omit(fcm)

## subsample events for plotting, unless sample.size is "max"

fcm.sample <- fcm

if(sample.size != 'max'){
  try({fcm.sample <- fcm[sample(length(fcm$SSC), sample.size),]}, silent = T)
}

## plot events in a couple of different ways

plot.fcm(fcm.sample, 'FSC', 'SSC')
plot.fcm(fcm.sample, 'FSC', 'FL1')

## make a presentation quality figure

png(paste0(f.name, '_FSC', '_FL1', '.png'),
    width = 2000,
    height = 2000,
    pointsize = 50)

plot.fcm(fcm.sample, 'FSC', 'FL1')
dev.off()

And here’s the final plot:
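As a starting point for the gating mentioned above, here is a minimal sketch of a rectangular gate. It assumes the data frame produced by the script above; the parameter names and gate boundaries are arbitrary placeholders that should be adjusted to the populations visible in your own plots:

```r
# Count events falling inside a rectangular region of
# log10(x.param) vs log10(y.param) space.
gate.count <- function(fcm, x.param, y.param, x.range, y.range){
  x <- log10(fcm[[x.param]])
  y <- log10(fcm[[y.param]])
  sum(x >= x.range[1] & x <= x.range[2] &
      y >= y.range[1] & y <= y.range[2], na.rm = TRUE)
}

## e.g. count events in a hypothetical high-FL1 region:
# gate.count(fcm, 'FSC', 'FL1', x.range = c(2, 4), y.range = c(2.5, 5))
```

Dividing the gated count by the total number of events gives the fraction of the community in that region, which is often the number you actually want.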