DBWriter
========

Bases: ``FrozenModel``

Class that specifies the schema and table to write your DataFrame to. |support_hooks|

.. versionadded:: 0.1.0

.. versionchanged:: 0.8.0 Moved ``onetl.core.DBWriter`` → ``onetl.db.DBWriter``

Parameters:

  • connection (:obj:`onetl.connection.DBConnection`) –

    Class which contains DB connection properties. See the :ref:`db-connections` section.

  • target (str) –

    Table/collection/etc name to write data to.

    If the connection has schema support, you need to specify the full name of the target, including the schema, e.g. ``schema.name``.

    .. versionchanged:: 0.7.0 Renamed ``table`` → ``target``

  • options (dict, :obj:`onetl.connection.DBConnection.WriteOptions`, default: ``None``) –

    Spark write options, either as a special ``WriteOptions`` object or a plain dict.

    For example: ``{"if_exists": "replace_entire_table", "compression": "snappy"}`` or ``Hive.WriteOptions(if_exists="replace_entire_table", compression="snappy")``

    .. note::

        Some sources do not support writing options.
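The ``options`` argument accepts either a plain dict or a ``WriteOptions`` object. A minimal sketch of that equivalence, using a hypothetical ``WriteOptionsSketch`` class (illustrative only, not onetl internals):

```python
# Hypothetical sketch of how a plain dict of options maps onto a
# WriteOptions-style object; these names are illustrative, not onetl code.
from dataclasses import dataclass, field


@dataclass
class WriteOptionsSketch:
    # Mirrors the "if_exists" behavior switch shown above
    if_exists: str = "append"
    # Any remaining connector-specific options (e.g. "compression")
    extra: dict = field(default_factory=dict)

    @classmethod
    def from_options(cls, options):
        # Accept None, an existing options object, or a plain dict
        if options is None:
            return cls()
        if isinstance(options, cls):
            return options
        options = dict(options)
        return cls(if_exists=options.pop("if_exists", "append"), extra=options)


opts = WriteOptionsSketch.from_options(
    {"if_exists": "replace_entire_table", "compression": "snappy"}
)
print(opts.if_exists)  # replace_entire_table
```

Either form carries the same information; the object form simply validates the known keys up front.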
    

Examples:

.. tabs::

    .. code-tab:: py Minimal example

        from onetl.connection import Postgres
        from onetl.db import DBWriter

        postgres = Postgres(...)

        writer = DBWriter(
            connection=postgres,
            target="fiddle.dummy",
        )

    .. code-tab:: py With custom write options

        from onetl.connection import Postgres
        from onetl.db import DBWriter

        postgres = Postgres(...)

        options = Postgres.WriteOptions(
            if_exists="replace_entire_table",
            batchsize=1000,
        )

        writer = DBWriter(
            connection=postgres,
            target="fiddle.dummy",
            options=options,
        )

run(df)
-------

Method for writing your DataFrame to the specified target. |support_hooks|

.. note:: This method supports only batch DataFrames; streaming DataFrames are not supported.

.. versionadded:: 0.1.0

Parameters:

  • df (DataFrame) –

    Spark DataFrame to write

Examples:

Write dataframe to target:

.. code:: python

    writer.run(df)
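The batch-only note above can be sketched as a guard on Spark's ``DataFrame.isStreaming`` flag. Here ``FakeDataFrame`` and ``ensure_batch`` are hypothetical stand-ins for illustration, not onetl code:

```python
# Hedged sketch of the batch-only restriction: a hypothetical guard that
# rejects streaming DataFrames before writing. ``isStreaming`` mirrors the
# flag Spark exposes on DataFrame; FakeDataFrame is a stand-in, not Spark.
class FakeDataFrame:
    def __init__(self, is_streaming: bool = False):
        self.isStreaming = is_streaming


def ensure_batch(df) -> None:
    # A batch-only writer rejects streaming DataFrames up front
    if getattr(df, "isStreaming", False):
        raise ValueError("run() supports only batch DataFrames")


ensure_batch(FakeDataFrame())          # batch DataFrame: passes silently
try:
    ensure_batch(FakeDataFrame(True))  # streaming DataFrame: rejected
except ValueError as e:
    print(e)  # run() supports only batch DataFrames
```

For streaming DataFrames, Spark's own ``writeStream`` API is the appropriate path instead.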