tessl/golang-cloud-google-com--go--bigquery

tessl install tessl/golang-cloud-google-com--go--bigquery@1.72.0

Google Cloud BigQuery client library providing comprehensive Go APIs for querying, loading data, managing datasets and tables, streaming inserts, and accessing BigQuery's ecosystem of services, including the Storage, Analytics Hub, Data Transfer, and Migration APIs.

Google Cloud BigQuery Go Client Library

The Google Cloud BigQuery Go client library provides comprehensive APIs for interacting with BigQuery, Google's fully managed, serverless data warehouse and analytics platform. The library enables developers to execute SQL queries, manage datasets and tables, load and export data, perform streaming inserts, and access BigQuery's ecosystem of services, including the Storage API, Analytics Hub, Data Transfer, Migration, and more.

Package Information

  • Package Name: cloud.google.com/go/bigquery
  • Package Type: golang
  • Language: Go
  • Installation: go get cloud.google.com/go/bigquery@v1.72.0

Core Imports

Standard import for the main BigQuery client:

import (
    "context"
    "cloud.google.com/go/bigquery"
)

Common additional imports for working with BigQuery:

import (
    "context"
    "cloud.google.com/go/bigquery"
    "google.golang.org/api/iterator"
    "google.golang.org/api/option"
    "cloud.google.com/go/civil"  // For DATE, TIME, DATETIME types
)

Storage API imports:

import (
    "cloud.google.com/go/bigquery/storage/apiv1"
    "cloud.google.com/go/bigquery/storage/managedwriter"
)

Basic Usage

package main

import (
    "context"
    "fmt"
    "log"

    "cloud.google.com/go/bigquery"
    "google.golang.org/api/iterator"
)

func main() {
    ctx := context.Background()

    // Create a BigQuery client
    client, err := bigquery.NewClient(ctx, "my-project-id")
    if err != nil {
        log.Fatal(err)
    }
    defer client.Close()

    // Execute a query
    q := client.Query(`
        SELECT name, SUM(number) as total
        FROM ` + "`bigquery-public-data.usa_names.usa_1910_2013`" + `
        WHERE name = @name
        GROUP BY name
    `)
    q.Parameters = []bigquery.QueryParameter{
        {Name: "name", Value: "William"},
    }

    // Read query results
    it, err := q.Read(ctx)
    if err != nil {
        log.Fatal(err)
    }

    // Iterate through results
    for {
        var values []bigquery.Value
        err := it.Next(&values)
        if err == iterator.Done {
            break
        }
        if err != nil {
            log.Fatal(err)
        }
        fmt.Println(values)
    }
}

Architecture

The BigQuery Go client library is organized into several key areas:

Main Client Package

The cloud.google.com/go/bigquery package provides the primary interface for BigQuery operations including:

  • Client creation and configuration
  • Dataset and table management
  • Query execution with parameterization
  • Data loading from GCS and streaming inserts
  • Data extraction to GCS
  • Job management and monitoring
  • Schema inference and type conversion
  • ML model and routine management

Storage API

The cloud.google.com/go/bigquery/storage/* packages provide high-performance read and write APIs:

  • Read API: Efficient parallel reading of large datasets using Apache Arrow format
  • Managed Writer API: High-throughput streaming writes with exactly-once semantics
  • Multiple API versions (v1, v1beta1, v1beta2, v1alpha) for different stability levels

Service APIs

Additional packages provide access to BigQuery ecosystem services:

  • Analytics Hub: Data exchange and marketplace functionality
  • BigLake: Unified analytics for BigQuery and open source formats (Iceberg, Parquet)
  • Connection: External data source configuration (Cloud SQL, AWS, Azure, etc.)
  • Data Policies: Column-level security and data masking
  • Data Transfer: Scheduled data transfers from SaaS applications
  • Migration: SQL translation and migration from other data warehouses
  • Reservation: Capacity planning and slot reservation management

Capabilities

Client Setup and Configuration

Create and configure BigQuery clients with authentication, project settings, and location preferences.

func NewClient(ctx context.Context, projectID string, opts ...option.ClientOption) (*Client, error)
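
For example, a client can be configured with explicit credentials and a default location (the project ID and key file path below are placeholders):

ctx := context.Background()

// Explicit service account credentials; by default the client uses
// Application Default Credentials instead.
client, err := bigquery.NewClient(ctx, "my-project-id",
    option.WithCredentialsFile("/path/to/service-account.json"))
if err != nil {
    log.Fatal(err)
}
defer client.Close()

// Optional default location for subsequent dataset and job operations.
client.Location = "US"

Passing bigquery.DetectProjectID as the project ID tells the client to detect the project from the credentials.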

Dataset Management

Create, read, update, and delete datasets. Configure access control, encryption, table expiration, and other dataset properties.

func (c *Client) Dataset(id string) *Dataset
func (d *Dataset) Create(ctx context.Context, md *DatasetMetadata) error
func (d *Dataset) Metadata(ctx context.Context) (*DatasetMetadata, error)
func (d *Dataset) Update(ctx context.Context, dm DatasetMetadataToUpdate, etag string) (*DatasetMetadata, error)
func (d *Dataset) Delete(ctx context.Context) error
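
A minimal sketch (the dataset ID and settings are illustrative):

ds := client.Dataset("my_dataset")
if err := ds.Create(ctx, &bigquery.DatasetMetadata{
    Location:               "US",
    Description:            "Example dataset",
    DefaultTableExpiration: 30 * 24 * time.Hour, // new tables expire after 30 days
}); err != nil {
    log.Fatal(err)
}

// Update metadata; passing the current etag guards against concurrent
// modification (pass "" to skip the check).
md, err := ds.Metadata(ctx)
if err != nil {
    log.Fatal(err)
}
if _, err := ds.Update(ctx, bigquery.DatasetMetadataToUpdate{
    Description: "Updated description",
}, md.ETag); err != nil {
    log.Fatal(err)
}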

Table Management

Create and manage tables with schema definition, partitioning, clustering, and encryption. Support for regular tables, views, materialized views, and external tables.

func (d *Dataset) Table(tableID string) *Table
func (t *Table) Create(ctx context.Context, tm *TableMetadata) error
func (t *Table) Metadata(ctx context.Context) (*TableMetadata, error)
func (t *Table) Update(ctx context.Context, tm TableMetadataToUpdate, etag string) (*TableMetadata, error)
func (t *Table) Delete(ctx context.Context) error
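
For example, creating a table with an explicit schema (names are illustrative):

schema := bigquery.Schema{
    {Name: "name", Type: bigquery.StringFieldType, Required: true},
    {Name: "age", Type: bigquery.IntegerFieldType},
    {Name: "joined", Type: bigquery.TimestampFieldType},
}

t := client.Dataset("my_dataset").Table("people")
if err := t.Create(ctx, &bigquery.TableMetadata{
    Schema:         schema,
    ExpirationTime: time.Now().Add(7 * 24 * time.Hour), // optional table TTL
}); err != nil {
    log.Fatal(err)
}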

Query Execution

Execute SQL queries with support for parameterization, destination tables, and flexible result iteration using value slices or struct mapping.

func (c *Client) Query(q string) *Query
func (q *Query) Read(ctx context.Context) (*RowIterator, error)
func (q *Query) Run(ctx context.Context) (*Job, error)
func (it *RowIterator) Next(dst interface{}) error
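
Rows can also be decoded into structs instead of []bigquery.Value; column names are matched to field names case-insensitively. A sketch, assuming a query returning name and total columns (q is a *bigquery.Query as above):

type nameTotal struct {
    Name  string
    Total int64
}

it, err := q.Read(ctx)
if err != nil {
    log.Fatal(err)
}
for {
    var row nameTotal
    err := it.Next(&row)
    if err == iterator.Done {
        break
    }
    if err != nil {
        log.Fatal(err)
    }
    fmt.Printf("%s: %d\n", row.Name, row.Total)
}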

Data Loading

Load data into BigQuery from Google Cloud Storage, local files, or other sources. Support for batch loading with schema inference and format-specific options.

func (t *Table) LoaderFrom(src LoadSource) *Loader
func (l *Loader) Run(ctx context.Context) (*Job, error)
func NewGCSReference(uri ...string) *GCSReference
func NewReaderSource(r io.Reader) *ReaderSource
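
A sketch of loading a CSV file from GCS (bucket and object names are placeholders):

gcsRef := bigquery.NewGCSReference("gs://my-bucket/data.csv")
gcsRef.SourceFormat = bigquery.CSV
gcsRef.SkipLeadingRows = 1 // skip the header row

loader := client.Dataset("my_dataset").Table("people").LoaderFrom(gcsRef)
loader.WriteDisposition = bigquery.WriteTruncate // replace any existing data

job, err := loader.Run(ctx)
if err != nil {
    log.Fatal(err)
}
status, err := job.Wait(ctx)
if err != nil {
    log.Fatal(err)
}
if err := status.Err(); err != nil {
    log.Fatal(err) // the load job completed but failed
}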

Data Export and Table Copy

Extract table data to Google Cloud Storage in various formats (CSV, JSON, Avro, Parquet). Copy tables within BigQuery.

func (t *Table) ExtractorTo(dst *GCSReference) *Extractor
func (t *Table) CopierFrom(srcs ...*Table) *Copier
func (e *Extractor) Run(ctx context.Context) (*Job, error)
func (c *Copier) Run(ctx context.Context) (*Job, error)
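
A sketch of exporting to GCS and copying a table (paths are placeholders; the wildcard lets BigQuery shard large exports across files):

gcsRef := bigquery.NewGCSReference("gs://my-bucket/export-*.json")
gcsRef.DestinationFormat = bigquery.JSON // newline-delimited JSON

extractor := client.Dataset("my_dataset").Table("people").ExtractorTo(gcsRef)
job, err := extractor.Run(ctx)
if err != nil {
    log.Fatal(err)
}
status, err := job.Wait(ctx)
if err != nil {
    log.Fatal(err)
}
if err := status.Err(); err != nil {
    log.Fatal(err)
}

// Copy a table within BigQuery.
src := client.Dataset("my_dataset").Table("people")
dst := client.Dataset("my_dataset").Table("people_copy")
copyJob, err := dst.CopierFrom(src).Run(ctx) // wait on copyJob as above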

Streaming Inserts

Insert data in real time with deduplication support. High-throughput streaming for continuous data ingestion.

func (t *Table) Inserter() *Inserter
func (i *Inserter) Put(ctx context.Context, src interface{}) error
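
A sketch of inserting struct rows (the table layout is illustrative):

type person struct {
    Name string
    Age  int
}

ins := client.Dataset("my_dataset").Table("people").Inserter()
rows := []*person{
    {Name: "Ada", Age: 36},
    {Name: "Grace", Age: 41},
}
// Put accepts a struct, struct pointer, ValueSaver, or a slice of those.
if err := ins.Put(ctx, rows); err != nil {
    log.Fatal(err)
}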

Job Management

Monitor and control asynchronous BigQuery operations including queries, loads, extracts, and copies.

func (j *Job) Wait(ctx context.Context) (*JobStatus, error)
func (j *Job) Status(ctx context.Context) (*JobStatus, error)
func (j *Job) Cancel(ctx context.Context) error
func (c *Client) Jobs(ctx context.Context) *JobIterator
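
For example, running a query asynchronously and waiting on the resulting job (q is a *bigquery.Query):

job, err := q.Run(ctx) // starts the job without blocking
if err != nil {
    log.Fatal(err)
}
fmt.Println("started job:", job.ID())

status, err := job.Wait(ctx) // polls until the job completes
if err != nil {
    log.Fatal(err)
}
if err := status.Err(); err != nil {
    log.Fatal(err) // the job completed but failed
}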

BigQuery Storage Read API

High-performance parallel reading of large datasets using the BigQuery Storage API with Apache Arrow format support.

func (c *Client) EnableStorageReadClient(ctx context.Context, opts ...option.ClientOption) error
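
Once enabled, qualifying reads use the Storage API transparently; a minimal sketch:

// Requires permission to create BigQuery read sessions.
if err := client.EnableStorageReadClient(ctx); err != nil {
    log.Fatal(err)
}

// Subsequent large result sets are fetched via the Storage Read API.
it, err := client.Query("SELECT * FROM `my_dataset.people`").Read(ctx)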

BigQuery Storage Write API

Managed writer for high-throughput streaming writes with exactly-once semantics, schema updates, and optimized performance.

import "cloud.google.com/go/bigquery/storage/managedwriter"

func NewClient(ctx context.Context, projectID string, opts ...option.ClientOption) (*Client, error)
func (c *Client) NewManagedStream(ctx context.Context, opts ...WriterOption) (*ManagedStream, error)
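
A condensed sketch of the managed writer flow; descriptor and encodedRows here are placeholders (in practice the descriptor is usually derived from a generated protobuf message, e.g. with the managedwriter/adapt helpers):

wc, err := managedwriter.NewClient(ctx, "my-project-id")
if err != nil {
    log.Fatal(err)
}
defer wc.Close()

stream, err := wc.NewManagedStream(ctx,
    managedwriter.WithDestinationTable(
        managedwriter.TableParentFromParts("my-project-id", "my_dataset", "people")),
    managedwriter.WithSchemaDescriptor(descriptor), // *descriptorpb.DescriptorProto (placeholder)
)
if err != nil {
    log.Fatal(err)
}

// encodedRows is a [][]byte of proto-serialized rows matching the descriptor.
result, err := stream.AppendRows(ctx, encodedRows)
if err != nil {
    log.Fatal(err)
}
// GetResult blocks until the server acknowledges the append.
if _, err := result.GetResult(ctx); err != nil {
    log.Fatal(err)
}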

Advanced Features

Configure table partitioning, clustering, encryption, BigLake integration, and work with ML models and routines.

type TimePartitioning struct {
    Type                      TimePartitioningType
    Expiration                time.Duration
    Field                     string
    RequirePartitionFilter    bool
}

type Clustering struct {
    Fields []string
}

type EncryptionConfig struct {
    KMSKeyName string
}
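
For example, a day-partitioned, clustered, CMEK-encrypted table (the schema variable and KMS key name are placeholders):

err := client.Dataset("my_dataset").Table("events").Create(ctx, &bigquery.TableMetadata{
    Schema: schema, // assumed defined as in Table Management above
    TimePartitioning: &bigquery.TimePartitioning{
        Type:  bigquery.DayPartitioningType,
        Field: "event_time", // partition on this TIMESTAMP/DATE column
    },
    Clustering: &bigquery.Clustering{Fields: []string{"customer_id"}},
    EncryptionConfig: &bigquery.EncryptionConfig{
        KMSKeyName: "projects/p/locations/l/keyRings/r/cryptoKeys/k",
    },
})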

Analytics Hub

Manage data exchanges, listings, and subscriptions for sharing datasets across organizations.

BigLake

Unified analytics with support for Apache Iceberg, Delta Lake, and Hudi table formats on BigQuery.

Connection Management

Configure connections to external data sources including Cloud SQL, AWS, Azure, and other databases.

Data Exchange

Deprecated in favor of Analytics Hub. Legacy data exchange functionality.

Data Policies

Configure column-level security, data masking, and policy-based access control.

Data Transfer

Schedule automated data transfers from SaaS applications and external data sources.

Migration

Translate SQL from other dialects (Teradata, Redshift, Hive, Spark, Presto) and manage migration workflows.

Reservation Management

Manage BigQuery slot reservations, commitments, capacity reservations, and assignments for cost and performance optimization.

Common Types

Schema and Field Types

type Schema []*FieldSchema

type FieldSchema struct {
    Name                     string
    Description              string
    Repeated                 bool
    Required                 bool
    Type                     FieldType
    PolicyTags               *PolicyTagList
    Schema                   Schema
    MaxLength                int64
    Precision                int64
    Scale                    int64
    DefaultValueExpression   string
    Collation                string
    RangeElementType         *RangeElementType
    RoundingMode             RoundingMode
}

type FieldType string
// Constants: StringFieldType, BytesFieldType, IntegerFieldType, FloatFieldType,
// BooleanFieldType, TimestampFieldType, RecordFieldType, DateFieldType,
// TimeFieldType, DateTimeFieldType, NumericFieldType, GeographyFieldType,
// BigNumericFieldType, IntervalFieldType, JSONFieldType, RangeFieldType

Null Types

BigQuery-specific null-safe types for handling NULL values:

type NullBool struct {
    Bool  bool
    Valid bool
}

type NullString struct {
    StringVal string
    Valid     bool
}

type NullInt64 struct {
    Int64 int64
    Valid bool
}

type NullFloat64 struct {
    Float64 float64
    Valid   bool
}

type NullTimestamp struct {
    Timestamp time.Time
    Valid     bool
}

type NullDate struct {
    Date  civil.Date
    Valid bool
}

type NullTime struct {
    Time  civil.Time
    Valid bool
}

type NullDateTime struct {
    DateTime civil.DateTime
    Valid    bool
}

type NullGeography struct {
    GeographyString string
    Valid           bool
}

type NullJSON struct {
    JSONVal interface{}
    Valid   bool
}
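
These types work as struct fields when reading rows whose columns may be NULL; a sketch (it is a *bigquery.RowIterator):

type person struct {
    Name bigquery.NullString
    Age  bigquery.NullInt64
}

var p person
if err := it.Next(&p); err == nil {
    if p.Age.Valid {
        fmt.Println(p.Name.StringVal, p.Age.Int64)
    } else {
        fmt.Println(p.Name.StringVal, "(age is NULL)")
    }
}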

Interval Type

type IntervalValue struct {
    Years          int32
    Months         int32
    Days           int32
    Hours          int32
    Minutes        int32
    Seconds        int32
    SubSecondNanos int32
}

func IntervalValueFromDuration(in time.Duration) *IntervalValue
func ParseInterval(value string) (*IntervalValue, error)
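
A small example of building and printing an interval:

// Build an interval from a Go duration...
iv := bigquery.IntervalValueFromDuration(90*time.Minute + 30*time.Second)

// ...and render it as a BigQuery interval string.
fmt.Println(bigquery.IntervalString(iv))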

Value Types

type Value interface{}

type ValueLoader interface {
    Load(v []Value, s Schema) error
}

type ValueSaver interface {
    Save() (map[string]Value, string, error)
}
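
Implementing ValueSaver gives full control over how a row is serialized for streaming inserts, including its insert ID; a sketch (column names are illustrative):

type event struct {
    ID   string
    Kind string
}

// Save maps the struct to column values and supplies an insert ID used
// for best-effort deduplication; return bigquery.NoDedupeID to opt out.
func (e *event) Save() (map[string]bigquery.Value, string, error) {
    return map[string]bigquery.Value{
        "id":   e.ID,
        "kind": e.Kind,
    }, e.ID, nil
}

// Inserter.Put accepts ValueSaver implementations directly:
//   err := ins.Put(ctx, []*event{{ID: "e1", Kind: "click"}})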

Range Values

type RangeValue struct {
    Start Value
    End   Value
}

RangeValue represents a continuous range of values for BigQuery RANGE types. The supported element types are DATE, DATETIME, and TIMESTAMP. A missing Start or End value represents an unbounded range in that direction:

// Create a date range
dateRange := bigquery.RangeValue{
    Start: civil.Date{Year: 2024, Month: 1, Day: 1},
    End:   civil.Date{Year: 2024, Month: 12, Day: 31},
}

// Create an unbounded range (all dates from 2024-01-01 onwards)
unboundedRange := bigquery.RangeValue{
    Start: civil.Date{Year: 2024, Month: 1, Day: 1},
    // End is nil, representing unbounded end
}

// Use in a query parameter
q := client.Query("SELECT * FROM table WHERE date_col IN UNNEST(@date_range)")
q.Parameters = []bigquery.QueryParameter{
    {Name: "date_range", Value: dateRange},
}

Error Handling

type Error struct {
    Location string
    Message  string
    Reason   string
}

func (e Error) Error() string
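
Streaming inserts surface per-row failures as a PutMultiError, whose entries wrap these Error values; a sketch of unpacking one (ins and rows as in the streaming example above):

if err := ins.Put(ctx, rows); err != nil {
    var pme bigquery.PutMultiError
    if errors.As(err, &pme) {
        for _, rowErr := range pme {
            for _, e := range rowErr.Errors {
                // Row-level errors are typically *bigquery.Error values.
                if bqe, ok := e.(*bigquery.Error); ok {
                    log.Printf("row %d: %s (%s)", rowErr.RowIndex, bqe.Message, bqe.Reason)
                }
            }
        }
    } else {
        log.Fatal(err)
    }
}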

Functions

Schema Inference

func InferSchema(st interface{}) (Schema, error)
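
InferSchema derives a Schema from a Go struct; bigquery struct tags rename or exclude fields:

type Item struct {
    Name  string
    Count int    `bigquery:"total_count"` // rename the column
    Note  string `bigquery:"-"`           // exclude from the schema
}

schema, err := bigquery.InferSchema(Item{})
if err != nil {
    log.Fatal(err)
}
for _, f := range schema {
    fmt.Println(f.Name, f.Type)
}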

Numeric String Conversion

func NumericString(r *big.Rat) string
func BigNumericString(r *big.Rat) string

Time String Conversion

func CivilTimeString(t civil.Time) string
func CivilDateTimeString(dt civil.DateTime) string
func IntervalString(iv *IntervalValue) string

Random Seed

func Seed(s int64)

Constants

OAuth Scope

const Scope = "https://www.googleapis.com/auth/bigquery"

Project ID Detection

const DetectProjectID = "*detect-project-id*"

Deduplication

const NoDedupeID = "NoDedupeID"

Numeric Precision

const (
    NumericPrecisionDigits = 38
    NumericScaleDigits = 9
    BigNumericPrecisionDigits = 76
    BigNumericScaleDigits = 38
)

Storage Billing Models

const (
    LogicalStorageBillingModel = ""
    PhysicalStorageBillingModel = "PHYSICAL"
)

Routine Types

const (
    ScalarFunctionRoutine = "SCALAR_FUNCTION"
    ProcedureRoutine = "PROCEDURE"
    TableValuedFunctionRoutine = "TABLE_VALUED_FUNCTION"
)