Copy a local data frame to a remote database
This is an implementation of the dplyr copy_to() generic, and it is mostly a wrapper around DBI::dbWriteTable().
It is useful for copying small amounts of data to a database for examples, experiments, and joins. By default, it creates temporary tables which are only visible within the current connection to the database.
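As a minimal sketch of that default behaviour, assuming an in-memory RSQLite connection (the connection object con and the table name "mtcars_tmp" are illustrative, not taken from this page):

library(dplyr, warn.conflicts = FALSE)

# Illustrative in-memory SQLite database; any DBI backend works the same way
con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")

# The default temporary = TRUE creates a table visible only to this
# connection; it is dropped automatically when the connection is closed
mtcars_db <- copy_to(con, mtcars, "mtcars_tmp")
mtcars_db

DBI::dbDisconnect(con)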
## S3 method for class 'src_sql'
copy_to(
  dest,
  df,
  name = deparse(substitute(df)),
  overwrite = FALSE,
  types = NULL,
  temporary = TRUE,
  unique_indexes = NULL,
  indexes = NULL,
  analyze = TRUE,
  ...,
  in_transaction = TRUE
)
dest: remote data source.

df: A local data frame, a tbl_sql from the same source, or a tbl_sql from another source. If from another source, all data must transit through this R session, so it is likely to be slow.

name: name for the new remote table.

overwrite: If TRUE, will overwrite an existing table with name name. If FALSE, will throw an error if name already exists.

types: a character vector giving variable types to use for the columns. See https://www.sqlite.org/datatype3.html for available types.

temporary: if TRUE, will create a temporary table that is local to this connection and is automatically deleted when the connection expires.

unique_indexes: a list of character vectors. Each element of the list will create a new unique index over the specified column(s). Duplicate rows will result in failure.

indexes: a list of character vectors. Each element of the list will create a new index.

analyze: if TRUE (the default), will automatically ANALYZE the new table so that the query optimiser has useful information.

...: other parameters passed to methods.

in_transaction: Should the table creation be wrapped in a transaction? This typically makes things faster, but you may want to suppress it if the database doesn't support transactions, or if you're wrapping in a transaction higher up (and your database doesn't support nested transactions).
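The table-creation arguments above might be combined roughly as in the following sketch. The connection, the sales data frame, and the SQLite column type names are assumptions chosen for illustration, not values from this page:

library(dplyr, warn.conflicts = FALSE)

con <- DBI::dbConnect(RSQLite::SQLite(), ":memory:")
sales <- data.frame(id = 1:3, region = c("N", "S", "E"), amount = c(10, 20, 30))

sales_db <- copy_to(
  con, sales, "sales",
  temporary      = FALSE,                 # regular (non-temporary) table
  overwrite      = TRUE,                  # replace an existing "sales" table, if any
  types          = c(id = "INTEGER", region = "TEXT", amount = "REAL"),
  indexes        = list("region"),        # ordinary index on region
  unique_indexes = list("id")             # uniqueness enforced on id
)
sales_db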
copy_to() returns another tbl_lazy. Use show_query() to see the generated query, and use collect() to execute the query and return data to R.
library(dplyr, warn.conflicts = FALSE)

df <- data.frame(x = 1:5, y = letters[5:1])
db <- copy_to(src_memdb(), df)
db

df2 <- data.frame(y = c("a", "d"), fruit = c("apple", "date"))

# copy_to() is called automatically if you set copy = TRUE
# in the join functions
db %>% left_join(df2, copy = TRUE)
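Continuing the example above, show_query() and collect() (mentioned under the return value) can be used on the resulting lazy table, roughly as sketched here:

# Inspect the SQL that dbplyr generates for the lazy query
db %>% left_join(df2, copy = TRUE) %>% show_query()

# Execute the query and return the result to R as a tibble
db %>% left_join(df2, copy = TRUE) %>% collect()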