-> GitHub Pages Documentation Site <-
{timeseriesdb} maps R time series objects to PostgreSQL database relations for permanent storage. Instead of writing time series to spreadsheet files or .RData files on disk, {timeseriesdb} uses a set of PostgreSQL relations which allows data to be stored alongside extensive, multi-lingual meta information in a context-aware fashion. {timeseriesdb} was designed with official statistics in mind: it can keep track of different versions of the same time series to handle data revisions, e.g., in the case of GDP data.
{timeseriesdb} …
{timeseriesdb} is not built to incrementally append new observations as quickly as possible. {timeseriesdb} does not try to compete with the amazing speed of InfluxDB. It is not a time series store aimed at server logs or IoT data.
Make sure you have followed the installation notes so that all components of {timeseriesdb} are installed properly: PostgreSQL, the necessary PostgreSQL extensions, R, as well as the {timeseriesdb} R package.
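In case the R package itself is still missing, one way to install it is from GitHub via {remotes} (the repository path below is an assumption; adjust it to wherever you obtain {timeseriesdb} from):

# Install the {timeseriesdb} R package from GitHub.
# The repository path is an assumption -- adjust it to your source.
install.packages("remotes")
remotes::install_github("mbannert/timeseriesdb")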
The following examples illustrate basic use in a nutshell. To learn more about the use of {timeseriesdb}, read the vignette articles.
# Load DBI for dbConnect()/dbDisconnect(), RPostgres for the Postgres driver
# and {timeseriesdb} for the db_* functions.
library(DBI)
library(RPostgres)
library(timeseriesdb)

# Create DB connection.
# In this case connect to a local db running on port 1111
# /w lame passwords -- strongly discouraged for production.
con <- dbConnect(Postgres(),
                 dbname = "postgres", user = "dev_writer",
                 host = "localhost", port = 1111,
                 password = "dev_writer")
tsl <- list(ts1 = ts(rnorm(100), frequency = 12, start = 2002),
            ts2 = ts(rnorm(100), frequency = 12, start = 2001))
db_store_ts(con, tsl)
dbDisconnect(con)
# Reconnect, this time using {timeseriesdb}'s own connection helper.
con <- db_connection_create(
  dbname = "postgres",
  user = "dev_admin",
  host = "localhost",
  passwd = "dev_admin",
  port = 1111
)
tsl <- db_read_ts(con, c("some_ts_id", "another_ts_id"))
db_connection_close(con)
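db_read_ts() hands the series back keyed by their time series identifiers. A minimal sketch of inspecting the result, assuming the returned object behaves like a named list of ts objects (the identifiers are the placeholders from the call above):

# Assuming db_read_ts() returns a named list of ts objects,
# pick a single series by its identifier and inspect it.
single_series <- tsl[["some_ts_id"]]
str(single_series)
plot(single_series)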
Beyond merely storing the time series themselves, {timeseriesdb} offers a wealth of additional features: