(One of my summaries of a talk at the 2017 fossgis conference).
Till Adams works at mundialis, a firm that specializes in processing massive GIS datasets. This talk is about satellite-based temperature data. Processing here means making the data more complete, for instance by filling the holes caused by clouds.
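The talk summary doesn't spell out the exact gap-filling algorithm, but the basic idea can be sketched as temporal interpolation: for a pixel that is obscured by clouds at one moment, you estimate its value from nearby cloud-free observations. A minimal Python sketch, where the per-pixel time series and the linear interpolation are my own illustrative assumptions, not mundialis' actual method:

```python
import numpy as np

def fill_cloud_gaps(series):
    """Fill cloud gaps (NaN) in one pixel's temperature time series.

    Gaps are filled by linear interpolation between the nearest
    valid observations in time. Illustrative sketch only.
    """
    series = np.asarray(series, dtype=float)
    valid = ~np.isnan(series)
    if valid.sum() < 2:
        return series  # not enough cloud-free observations to interpolate
    indices = np.arange(len(series))
    filled = series.copy()
    filled[~valid] = np.interp(indices[~valid], indices[valid], series[valid])
    return filled

# Four measurements on one day; two were blocked by clouds.
print(fill_cloud_gaps([12.0, np.nan, np.nan, 9.5]))
# [12.         11.16666667 10.33333333  9.5       ]
```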
They have an archive spanning 15 years, with four temperature readings per day for every 250x250 meter cell. The data comes from the MODIS project.
15 years, 365 days, 4 sets per day: 21,900 data sets. For Germany, that is about 6 million pixels per data set.
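Those numbers are easy to sanity-check; the rough figure of 357,000 km² for Germany's area is my own assumption:

```python
# 15 years of 4 data sets per day.
datasets = 15 * 365 * 4
print(datasets)  # 21900

# Germany is roughly 357,000 km²; one 250x250 m pixel covers 1/16 km².
germany_km2 = 357_000
pixels_per_dataset = germany_km2 * 16
print(pixels_per_dataset)  # 5712000, i.e. about 6 million pixels
```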
The source data for central Europe is around 5 GB per year; in the end, the full archive for central Europe will be around 795 GB.
You don’t need a lot of machines for it. They currently use two servers with 32 GB RAM; processing takes around 9 minutes per data set. In the end, you have to weigh speed and costs against each other.
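A back-of-the-envelope calculation shows why that trade-off matters; the assumption that both servers process data sets independently and in parallel is mine:

```python
datasets = 15 * 365 * 4         # 21,900 data sets in the full archive
minutes_per_dataset = 9
servers = 2

total_minutes = datasets * minutes_per_dataset / servers
print(total_minutes / 60 / 24)  # ~68 days of continuous processing
```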
You can do all sorts of nice analyses with it: looking at “heat islands” in inner cities, or calculating the risk of ticks surviving a winter based on local temperature differences. Wine growers benefit a lot from the level of detail compared to most other data sets: it shows the local differences.