Category Archives: Citizen Climate Science

NOAA’s Annual Global Temperature Anomaly Trends

Here is an alternative way to chart the same data using dygraphs (link).

NOAA has released its December 2014 global anomaly data, allowing us to examine the 1880 – 2014 global temperature anomaly trend.

This R script produces a trend chart showing the annual and decadal-average temperature anomalies for the NOAA data series.

 

NOAA’s data, like NASA GISS’s and the Japan Meteorological Agency’s, show that the global mean annual temperature continues to rise. 2014 was the warmest year in the warmest decade for all three global temperature data series.

Here is my R script for those who would like to reproduce my chart:

 


############## RClimate Script: NOAA Annual Temperature Anomaly Trend     ##################
##   1/16/15     Retrieve data from NOAA and plot annual and decadal trends               ##
#############################################################################################
 library(plyr); library(reshape); library(RCurl)
 NOAA_ann_link <- "http://www.ncdc.noaa.gov/cag/time-series/global/globe/land_ocean/ytd/12/1880-2014.csv"
  n <- read.table(NOAA_ann_link, skip = 2, sep = ",", dec=".",
                 row.names = NULL, header = T,  as.is = T, colClasses = rep("numeric",2),
                 col.names = c("yr", "anom") )
## Find last report year and last anomaly value
  num_rows <- nrow(n)
  NOAA_last_yr <- as.integer(n[num_rows,1])
  NOAA_last_anom <- signif(n[nrow(n),2],2)
# Decade calculations
 dec_mean <- numeric(14)        # pre-allocate decade mean and start-year vectors
 dec_st <- numeric(14)
 base_yr <- 1880
 n$dec_n <-  (as.numeric((n$yr - base_yr) %/% 10) * 10) + base_yr
# df <- data.frame(df, dec_n)
 for (i in 1:13) {dec_st[i] <- base_yr + i*10
                 dec_sub <- subset(n, dec_n == dec_st[i])
                 dec_mean[i] <- mean(dec_sub$anom, na.rm = TRUE)
         }
 dec_st[14] <- 2020              # Need to have for last step line across decade
 dec_mean[14] <- dec_mean[13]
 dec<- data.frame(dec_st, dec_mean)
# Trend chart function
  plot_func<- function() {
   par(las=1); par(ps=12); par(oma=c(2.5,1,1,1)); par(mar=c(2.5,4,2,1))
   p_xmin <- 1880;   p_xmax <- n[num_rows,1]
   title <- paste("NOAA Land and Sea Temperature Annual Anomaly Trend \n", p_xmin, " to ", NOAA_last_yr, sep="")
   plot(n$yr, n$anom, type = "l", col = "grey",
      xlim = c(p_xmin, p_xmax), ylab = "Temperature Anomaly - \u00B0C (1901-2000 Baseline)",
     xlab="", main = title,cex.main = 0.85)
   points(NOAA_last_yr, NOAA_last_anom, col = "red", pch=19)
  last_pt <- paste( NOAA_last_yr, " @ ", NOAA_last_anom, " \u00B0C",sep="")
  points(dec$dec_st, dec$dec_mean, type="s", col="blue")
  ## add legend
  legend(1882,0.6, c("Decadal Avg Anomaly" ,"Annual anomaly", last_pt), col = c("blue", "grey", "red"),
       text.col = "black", lty = c(1,1,0),pch=c(0,0,16),pt.cex=c(0,0,1),
       merge = F, bg = "white", bty="o", cex = .75, box.col="white")
   out <- paste("NOAA Annual Temperature Anomaly \nData updated through: " , NOAA_last_yr, sep="")
   t_pos <- p_xmin + 0.5*(p_xmax-p_xmin)
  text(t_pos, -0.55, out, cex = 0.7, adj = 0)
  data_source <- paste("Data Source: ", NOAA_ann_link, sep="")
# Plot Annotation
 mtext(data_source,1,0, cex = 0.75, adj=0.5, outer=T)
 mtext("D Kelly O'Day - https://chartsgraphs.wordpress.com", 1,1, adj = 0, cex = 0.8, outer=TRUE)
 mtext(format(Sys.time(), "%m/%d/%Y"), 1, 1, adj = 1, cex = 0.8, outer=TRUE)
  }
##########################################################################################
plot_func()

NASA GISS’s Annual Global Temperature Anomaly Trends

Update: Here is an alternative way to chart the same data (link).

NASA’s Goddard Institute for Space Studies (GISS) has released its December 2014 anomaly data, showing that 2014 was the warmest year in the warmest decade of the 1880 – 2014 instrumental temperature record.

NASA’s results are consistent with the Japan Meteorological Agency report (here).

 

Here is my R script for those who would like to reproduce my chart.

 

############## RClimate Script: GISS Annual Temperature Anomaly ###########################
##                   http://chartsgraphs.wordpress.com    1/16/15                          ##
############################################################################################
  library(plyr); library(reshape)
 ## Download the GISS data file
  url <- c("http://data.giss.nasa.gov/gistemp/tabledata/GLB.Ts+dSST.txt")
  file <- c("GLB.Ts+dSST.txt")
  download.file(url, file)

## The first 7 rows and the last 12 rows contain descriptive text;
## row 8 holds the column headers
## Find the number of rows in the file, and exclude the last 12
    rows <- length(readLines(file)) - 12
## Read the file as a character vector, one line per row, keeping the
## column-header row (row 8) and the data rows that follow
    lines <- readLines(file, n=rows)[8:rows]
## Data Manipulation, R vector
## Use a regexp to replace all occurrences of **** with NA
    lines2 <- gsub("\\*{3,5}", " NA", lines, perl=TRUE)
## Convert the character vector to a dataframe
    df <- read.table(
      textConnection(lines2), header=TRUE, colClasses = "character")
    closeAllConnections()
## Keep the Year (column 1) and the annual mean (J-D) anomaly (column 14)
    df <- df[,c(1,14)]
## Convert all variables (columns) to numeric format
    df <- colwise(as.numeric) (df)
    df[,2] <- df[,2]/100               # GISS values are hundredths of a degree C
    names(df) <- c("Year", "anom")
## Remove rows where Year=NA from the dataframe
    df <- df [!is.na(df$Year),]
## Find last report year and last anomaly value
    GISS_last <- nrow(df)
    GISS_last_yr <- df$Year[GISS_last]
    GISS_last_temp <- df$anom[GISS_last]
## Calc decade averages
  dec_mean <- numeric(14)       # pre-allocate decade mean and start-year vectors
  dec_st <- numeric(14)
 # yr_n <- as.integer(df$Year)
  base_yr <- 1870
  df$dec_n <-  (as.numeric((df$Year - base_yr) %/% 10) * 10) + base_yr
 # df <- data.frame(df, dec_n)
  for (i in 1:13) {dec_st[i] <- base_yr + i*10
     dec_sub <- subset(df, dec_n == dec_st[i])
     dec_mean[i] <- mean(dec_sub$anom, na.rm = TRUE)
     }
 dec_st[14] <- 2020              # Need to have for last step line across decade
 dec_mean[14] <- dec_mean[13]
 dec<- data.frame(dec_st, dec_mean)
#### Plot function
  plot_func <- function() {
  par(las=1); par(ps=12)
  par(oma=c(2.5,1,1,1)); par(mar=c(2.5,4,2,1))
# specify plot yr min & max
  p_xmin <- 1880;   p_xmax <- GISS_last_yr+10
  title <- paste("GISS Land and Sea Temperature Annual Anomaly Trend \n", p_xmin, " to ",
   GISS_last_yr, sep="")
  plot(df[,1], df[,2], type = "l", col = "grey",
     xlim = c(p_xmin, p_xmax), ylab = "Temperature Anomaly - C (1951-1980 Baseline)",
     xlab="", main = title,cex.main = 1,cex.lab=0.8,cex.axis=0.85)
  points(GISS_last_yr, GISS_last_temp, col = "red", pch=19)
  last_pt <- paste(GISS_last_yr, " @ ", GISS_last_temp, " C", sep="")
  points(dec$dec_st, dec$dec_mean, type="s", col="blue")
## add legend
  legend(1880, 0.6, c("Decade Mean Anomaly", "Annual Anomaly", last_pt), col = c("blue", "grey", "red"),
       text.col = "black", lty = c(1,1,0), pch = c(0,0,16), pt.cex = c(0,0,1),
       merge = T, bg = "white", bty = "o", cex = .75, box.col = "white")
  }
##########################################################################################
plot_func()

Global Sea Level Rise and El Nino – La Nina

Here is an informative interview with NASA’s Josh Willis about global sea level rise and El Nino – La Nina. I first saw the video on Zeke Hausfather’s Yale Forum on Climate Change & The Media.

Comparison of UAH and RSS Time Series with Common Baseline

In this post I set both the UAH 5.4 and RSS 3.3 global temperature anomaly series to a common baseline period (1981-2010) in order to compare them. Since both series are satellite-based, they exhibit striking similarities.

Common Baseline

In this previous post, I showed how to convert a temperature anomaly time series from one baseline period to another. I then used this technique in this post to directly compare UAH 5.4 (baseline 1981-2010) and GISS.

In this post, I compare the satellite based UAH 5.4 (baseline 1981-2010) and RSS 3.3 (baseline 1979-1998) series.

The offsets are as follows:

  • UAH:  -0.000978
  • RSS:      0.098772

Since the UAH TLT 5.4 series is based on a 1981-2010 baseline, the offset is nearly zero (-0.00098 versus 0.0). The RSS offset changes the baseline from 1979-1998 to 1981-2010.

Users can reproduce my analysis on their own by downloading my CTS.csv file and applying the offsets to the UAH and RSS series.
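For R users, here is a minimal sketch of applying these offsets. The local file name and the "UAH" and "RSS" column names are assumptions (check the CTS.csv metadata for the actual names); the offsets are subtracted because each one is the series mean over the new 1981-2010 baseline period.

## Minimal sketch - file name and column names are assumptions
cts <- read.csv("CTS.csv")
uah_offset <- -0.000978
rss_offset <-  0.098772
cts$UAH_81_10 <- cts$UAH - uah_offset   # UAH is already on a 1981-2010 baseline
cts$RSS_81_10 <- cts$RSS - rss_offset   # re-baseline RSS from 1979-1998 to 1981-2010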

Comparison of 1981-2010 Baseline Series

Here is a plot of the UAH and RSS 12-month moving averages from 1979 to the present:
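A minimal sketch of producing such a plot, continuing from the re-baselining sketch above; the yr_frac decimal-year column is an assumption about the CTS.csv layout.

## 12-month trailing moving average of the re-baselined series (yr_frac assumed)
ma12 <- function(x) as.numeric(stats::filter(x, rep(1/12, 12), sides = 1))
plot(cts$yr_frac, ma12(cts$UAH_81_10), type = "l", col = "blue",
     xlab = "", ylab = "Temperature Anomaly - C (1981-2010 Baseline)")
lines(cts$yr_frac, ma12(cts$RSS_81_10), col = "red")
legend("topleft", c("UAH 5.4", "RSS 3.3"), col = c("blue", "red"), lty = 1, bty = "n")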


Comparison of UAH and GISS Time Series with Common Baseline

In this post I set both the UAH and GISS global temperature anomaly series to a common baseline period (1981-2010) and compare them. Even though the UAH series is satellite-based and the GISS series is station-based, the two series exhibit striking similarities.

Common Baseline

In this previous post, I showed how to convert a temperature anomaly time series from one baseline period to another. I use this technique in this post to directly compare the UAH (baseline 1981-2010) and GISS (baseline 1951-1980) series.
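The conversion boils down to computing a series' mean over the new baseline period and subtracting it from every value. Here is a minimal sketch of how the GISS offset can be computed from CTS.csv; the file name and the "GISS" and "yr" column names are assumptions.

## Minimal sketch - offset = series mean over the new 1981-2010 baseline period
cts <- read.csv("CTS.csv")
giss_offset <- mean(cts$GISS[cts$yr >= 1981 & cts$yr <= 2010], na.rm = TRUE)
cts$GISS_81_10 <- cts$GISS - giss_offset   # GISS anomalies on a 1981-2010 baseline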

The offsets are as follows:

  • UAH:  -0.000978
  • GISS: 0.34958

Since the UAH TLT 5.4 series is based on a 1981-2010 baseline, the offset is nearly zero (-0.00098 versus 0.0).

Users can reproduce my analysis on their own by downloading my CTS.csv file and applying the offsets to the UAH and GISS series.

Comparison of 1981-2010 Baseline Series

Here is a plot of the UAH and GISS 12-month moving averages from 1979 to the present.

September 2011 Arctic Sea Ice Extent Forecast

In this post, I use a quadratic regression model to forecast the September 2011 Arctic Sea Ice Extent. The model was developed with 1980 – 2010 data. Links to the R script, the source data, and a how-to article on polynomial regression are provided.

Arctic Sea Ice Extent Forecast for September, 2011

First, here is my forecast:

ASIE_forecast_2011

Based on the 1980 – 2010 downward Arctic Sea Ice Extent trend, my forecast is that the September 2011 SIE will decline by 0.36 million km^2 below the 2010 level, to 4.54 million km^2, with a confidence band of ±0.59 million km^2.

How Did I Develop My Forecast?

I have written a number of posts on Arctic Sea Ice Extent (here, here, here). In this post, I used the NSIDC’s monthly data file (link) to construct a quadratic regression model of September sea ice extent for the 1980 – 2010 period. I then used this model to predict the September 2011 Arctic Sea Ice Extent.

I have two main learning-curve sources for this model:

  • Tamino‘s post on Arctic Sea Ice decline provided the basic idea of using a quadratic model to fit Arctic SIE decline.
  • John Quick’s tutorial on polynomial regression provided the how-to instructions I needed to implement Tamino’s approach in R.
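For readers who want to see the mechanics before opening the full script, here is a minimal sketch of the quadratic fit and the 2011 prediction interval. The local file and column names are assumptions; the linked RClimate script shows the actual NSIDC data retrieval.

## Minimal sketch - file and column names (year, extent) are assumptions
sie <- read.csv("Sep_Arctic_SIE.csv")              # September extent, million km^2
fit <- lm(extent ~ poly(year, 2, raw = TRUE), data = sie,
          subset = year >= 1980 & year <= 2010)    # quadratic fit to 1980 - 2010
predict(fit, newdata = data.frame(year = 2011),
        interval = "prediction")                   # point forecast with prediction band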

RClimate Script and Links

Here is the link to my RClimate script.

Climate Time Series In a Single CSV File: Update 1

I am pleased to announce my CTS.csv file, which includes 18 monthly climate time series in one easy-to-access CSV file. This is part of my goal of giving do-it-yourself citizen climate scientists a painless, user-friendly way to get up-to-date agency climate time series.

Update 1: Reader Scott asked if I could provide metadata for the columns in my CTS.csv. This page lists the source agency and data links for each climate data series.

Here’s a snapshot of the first 6 rows of my CTS.csv file. The data extends from 1880 through the most recent month.
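R users can take the same quick look once the file is downloaded (the local file name is an assumption):

cts <- read.csv("CTS.csv")
head(cts)     # first 6 rows
names(cts)    # column names for the series (see the metadata page above)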

My hope is to make the CTS.csv the go-to file for citizen climate scientists who may want to:

  • Check temperature anomaly trends by series (GISS, HAD, NOAA, RSS, UAH)
  • Assess climate oscillation (AMO, AO, MEI, Nino34, PDO) trends
  • Evaluate CO2 versus temperature anomaly relationships
  • Evaluate the relationship between sunspot numbers and temperature anomaly trends
  • Compare atmospheric transmission, the SATO index and volcanic activity
  • Assess the impact of volcanoes on temperature anomaly trends
  • Compare MEI versus Nino34 ENSO indicators
  • Assess lower stratospheric trends using RSS’s TLS series

By having these climate time series in a single CSV file, R and Excel users can work with up-to-date data in a convenient form. The file will be updated automatically each month as the climate agencies release their latest data.

How can CTS.csv Help Do-It-Yourself Citizen Climate Scientists?

Interested climate observers who want to compare global SSTA versus Nino34 trends, for example, have to follow a multiphase process:

  1. Find the data file – even with Google this can take time
  2. Download the files
  3. Merge 2 or more files to get the data into a usable format – the source files all have different formats
  4. Perform the analysis

Steps 1-3 can be very time consuming, so many users don’t bother checking out their ideas. Rather, they may rely on climate blog  comments. With CTS.csv and some R or Excel analysis, they can find the facts themselves rather than just having opinions.  They can submit their analysis and charts to blog posts, hopefully increasing the rigor of blog discussions.
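As an example of step 4 becoming the only step, here is a minimal sketch of a global SSTA versus Nino34 comparison using CTS.csv. The local file name and the "SSTA" and "NINO34" column names are assumptions; check the metadata page for the actual names.

cts <- read.csv("CTS.csv")                          # one file, no merging needed
plot(cts$NINO34, cts$SSTA, pch = 20, col = "grey40",
     xlab = "Nino34 index", ylab = "Global SSTA (C)")
abline(lm(SSTA ~ NINO34, data = cts), col = "red")  # simple linear fit
cor(cts$NINO34, cts$SSTA, use = "complete.obs")     # strength of the relationship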

Climate bloggers can request that their readers submit charts to back up their climate trend claims.

Data & RClimate Scripts Are All Open Book

All of the RClimate script that I use to produce the CTS.csv is available on-line at this link. Source data links are included in the function for each series.