Hello-
I have hourly temperature readings at 0, 1.5, 2, 3, 4, 5, 6, 7, 8, 9, 10, and 11 meters beneath the surface of the water for an entire year. I would like to create a graph like the one below, with depth on the y-axis, time on the x-axis, and water temperature as the z (fill) dimension.
![the smooth, filled contour plot I am aiming for: temperature by time (x) and depth (y)]()
Unfortunately, due to the large size of the dataset (~100k observations for the year), I cannot attach it or easily replicate it here. I have included a snippet to at least show the general shape of the data:
data <- structure(
  list(
    date_time = structure(
      c(1339016400, 1339020000, 1339023600, 1339027200, 1339030800, 1339034400),
      class = c("POSIXct", "POSIXt"), tzone = ""
    ),
    temperature = c(21.103, 21.199, 20.96, 20.96, 21.103, 20.412),
    depth = c(0, 0, 0, 0, 0, 0)
  ),
  .Names = c("date_time", "temperature", "depth"),
  row.names = c("48730", "48731", "48732", "48733", "48734", "48735"),
  class = "data.frame"
)
You just have to imagine another 100k records!
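In case it helps, here is a rough way to simulate a dataset of the same shape. All of the values below are invented (a made-up thermocline pattern plus noise), just so there is something runnable at the full scale:

# Simulate ~100k rows with the same columns as the real data.
# The temperature pattern is invented: warmer near the surface,
# cooler at depth, a seasonal wiggle, and some noise.
set.seed(1)
depths <- c(0, 1.5, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11)
times  <- seq(as.POSIXct("2012-06-06 21:00:00"), by = "hour", length.out = 24 * 365)
data   <- expand.grid(date_time = times, depth = depths)
data$temperature <- 21 - 0.8 * data$depth +
  3 * sin(2 * pi * as.numeric(data$date_time) / (86400 * 365)) +
  rnorm(nrow(data), sd = 0.3)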
When I use ggplot2 and the stat_contour function, I produce the following graph:
library(ggplot2)

v <- ggplot(data, aes(date_time, depth, z = temperature)) +
  stat_contour(geom = "polygon") +                   # contour polygons from the z values
  geom_tile(aes(fill = temperature)) +               # one tile per raw reading
  scale_fill_gradientn(colours = rev(rainbow(10))) +
  scale_y_reverse()                                  # plot depth increasing downward
v
![my result: a choppy tile plot with visible gaps between the measured depths]()
This produces a very choppy graph with no data between the measured depths. Part of the difference between the two graphs is that the first one uses modeled data to fill in the gaps between depths; I believe it also interpolates more time steps to produce a smoother surface. I tried modeling the data with the loess function, which does smooth it, but predicting on a fine grid creates an incredibly large dataset and bumps into the memory limits of my machine. Not to mention it takes minutes to render millions of values.
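Here is a sketch of roughly what I tried. The span, degree, and grid resolution are placeholder values; on the full dataset, the loess fit and the fine prediction grid are exactly where I run out of memory:

# Fit a 2-D loess surface in (numeric) time and depth, predict on a
# finer grid, and tile the predictions instead of the raw readings.
library(ggplot2)

data$t_num <- as.numeric(data$date_time)       # loess wants numeric predictors
fit <- loess(temperature ~ t_num * depth, data = data,
             span = 0.05, degree = 1)          # placeholder smoothing settings

grid <- expand.grid(
  t_num = seq(min(data$t_num), max(data$t_num), length.out = 400),
  depth = seq(0, 11, by = 0.25)
)
grid$temperature <- predict(fit, newdata = grid)
grid$date_time   <- as.POSIXct(grid$t_num, origin = "1970-01-01")

ggplot(grid, aes(date_time, depth)) +
  geom_tile(aes(fill = temperature)) +
  scale_fill_gradientn(colours = rev(rainbow(10))) +
  scale_y_reverse()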
Maybe I'm using the wrong tool within R to produce this type of plot. I enjoy using ggplot and would like to continue to do so. I figured this group would have a solution.
Thanks for any help or suggestions.
-al