Write to Google BigQuery

To write data to Google BigQuery with Flux:

  1. Import the sql package.

  2. Pipe-forward data into `sql.to()` and provide the following parameters:

    • driverName: bigquery
    • dataSourceName: See data source name
    • table: Table to write to
    • batchSize: Number of parameters or columns that can be queued within each call to Exec (default is 10000)
```
import "sql"

data
    |> sql.to(
        driverName: "bigquery",
        dataSourceName: "bigquery://projectid/?apiKey=mySuP3r5ecR3tAP1K3y",
        table: "exampleTable",
    )
```

BigQuery data source name

The bigquery driver uses the following data source name (DSN) syntaxes (also known as connection strings):
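The forms below are a sketch reconstructed from the driver's documented DSN shape; `projectid` and `location` are placeholders, and exact forms may differ between driver versions:

```
bigquery://projectid/?param1=value&param2=value
bigquery://projectid/location?param1=value&param2=value
```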


Common BigQuery URL parameters

  • dataset - BigQuery dataset ID. When set, you can use unqualified table names in queries.
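As a sketch of how the `dataset` parameter is used (`projectid` and `exampleDataset` are placeholder values), a DSN that sets `dataset` lets `table` be an unqualified name:

```
import "sql"

data
    |> sql.to(
        driverName: "bigquery",
        // dataset is set in the DSN, so the table name below can be unqualified
        dataSourceName: "bigquery://projectid/?dataset=exampleDataset",
        table: "exampleTable",
    )
```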

BigQuery authentication parameters

The Flux BigQuery implementation uses the Google Cloud Go SDK. Provide your authentication credentials using one of the following methods:

  • Set the GOOGLE_APPLICATION_CREDENTIALS environment variable to identify the location of your credential JSON file.

  • Provide your base64-encoded service account, refresh token, or JSON credentials using the credentials URL parameter in your BigQuery DSN.

    Example credentials URL parameter
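A sketch of such a DSN (the `projectid` and the base64 `credentials` value are truncated placeholders, not real credentials):

```
bigquery://projectid/?credentials=eyJ0eXBlIjoiYXV0aG9yaXplZF91c2VyIi4uLg
```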

Flux to BigQuery data type conversion

`sql.to()` converts Flux data types to BigQuery data types.

| Flux data type | BigQuery data type |
| :------------- | :----------------- |
