
Downloading GPM Data

To download the data for a single day, for example 2011-09-30:

library(idps)
# Define the date
DATE <- "2011-09-30"

path <- "/path/to/where/files/should/be/stored"
user <- "username"
password <- "your password"
# other options are: laterun or earlyrun
product <- "finalrun"
# other options are available, check the gpm_download function doc.
band <- "precipitationCal"
lonMin <- 20
lonMax <- 70
latMin <- -5
latMax <- 40
# The data is originally in HDF5 format; this will remove the original files
removeHDF5 <- TRUE
quiet <- TRUE

gpm_download(
  path = path,
  user = user,
  password = password,
  dates = DATE,
  product = product,
  band = band,
  lonMin = lonMin,
  lonMax = lonMax,
  latMin = latMin,
  latMax = latMax,
  removeHDF5 = removeHDF5,
  quiet = quiet
)
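
Since the dates argument is plural, it presumably accepts a vector of dates as well; assuming that holds, a whole month could be fetched in one call. This is a minimal sketch under that assumption, reusing the variables defined above:

# Assumed usage: pass a vector of dates to download an entire month at once
DATES <- seq(as.Date("2011-09-01"), as.Date("2011-09-30"), by = "day")

gpm_download(
  path = path,
  user = user,
  password = password,
  dates = DATES,
  product = product,
  band = band,
  lonMin = lonMin,
  lonMax = lonMax,
  latMin = latMin,
  latMax = latMax,
  removeHDF5 = removeHDF5,
  quiet = quiet
)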

Again, thanks to Cesar Aybar for their work here, which inspired this package.

Post-processing GPM Data

You might need to tidy your workspace, and for long-term storage, keeping thousands of individual files is not practical. Therefore, there is another function that takes a list of NetCDF files and aggregates them by calendar month.

library(idps)

path <- "/path/to/where/files/should/be/stored"
output.path <- "/path/to/where/aggregated-files/should/be/stored"

# Collect all NetCDF files in the download directory
netCDF.files <- list.files(
  path = path,
  full.names = TRUE,
  pattern = "\\.nc$"
)

aggregate_netCDF_files(
  netCDF.files,
  output.path
)

Note: Both functions use the R package terra to read and process the HDF5 and NetCDF files, writing output with a compression level of 9. For more information, please check here.
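
As a quick sanity check, any of the resulting NetCDF files can be opened and plotted directly with terra. This is a minimal sketch; the exact file names depend on how aggregate_netCDF_files names its output:

library(terra)

# List the aggregated monthly files written above
aggregated.files <- list.files(
  path = output.path,
  full.names = TRUE,
  pattern = "\\.nc$"
)

# Read the first file as a SpatRaster and plot the precipitation field
r <- rast(aggregated.files[1])
plot(r)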