dbt Cloud v0.1.30 published on Thursday, Mar 20, 2025 by Pulumi

dbtcloud.getBigQueryConnection

Using getBigQueryConnection

Two invocation forms are available. The direct form accepts plain arguments and, depending on the language, either blocks until the result is available or returns a Promise-wrapped result. The output form accepts Input-wrapped arguments and returns an Output-wrapped result, which composes with other resources' outputs.

function getBigQueryConnection(args: GetBigQueryConnectionArgs, opts?: InvokeOptions): Promise<GetBigQueryConnectionResult>
function getBigQueryConnectionOutput(args: GetBigQueryConnectionOutputArgs, opts?: InvokeOptions): Output<GetBigQueryConnectionResult>

def get_big_query_connection(connection_id: Optional[int] = None,
                             project_id: Optional[int] = None,
                             opts: Optional[InvokeOptions] = None) -> GetBigQueryConnectionResult
def get_big_query_connection_output(connection_id: Optional[pulumi.Input[int]] = None,
                                    project_id: Optional[pulumi.Input[int]] = None,
                                    opts: Optional[InvokeOptions] = None) -> Output[GetBigQueryConnectionResult]

func LookupBigQueryConnection(ctx *Context, args *LookupBigQueryConnectionArgs, opts ...InvokeOption) (*LookupBigQueryConnectionResult, error)
func LookupBigQueryConnectionOutput(ctx *Context, args *LookupBigQueryConnectionOutputArgs, opts ...InvokeOption) LookupBigQueryConnectionResultOutput

> Note: This function is named LookupBigQueryConnection in the Go SDK.

public static class GetBigQueryConnection 
{
    public static Task<GetBigQueryConnectionResult> InvokeAsync(GetBigQueryConnectionArgs args, InvokeOptions? opts = null)
    public static Output<GetBigQueryConnectionResult> Invoke(GetBigQueryConnectionInvokeArgs args, InvokeOptions? opts = null)
}

public static CompletableFuture<GetBigQueryConnectionResult> getBigQueryConnectionPlain(GetBigQueryConnectionPlainArgs args, InvokeOptions options)
public static Output<GetBigQueryConnectionResult> getBigQueryConnection(GetBigQueryConnectionArgs args, InvokeOptions options)

fn::invoke:
  function: dbtcloud:index/getBigQueryConnection:getBigQueryConnection
  arguments:
    # arguments dictionary
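To make the two forms concrete, here is a minimal TypeScript sketch (assuming the @pulumi/dbtcloud npm package; the IDs 123 and 456 are placeholders, not values from this page):

import * as dbtcloud from "@pulumi/dbtcloud";

// Direct form: plain arguments, Promise-wrapped result.
dbtcloud.getBigQueryConnection({
    connectionId: 123, // placeholder connection ID
    projectId: 456,    // placeholder project ID
}).then(conn => console.log(conn.name));

// Output form: Input-wrapped arguments, Output-wrapped result,
// which composes with other resources' outputs.
const conn = dbtcloud.getBigQueryConnectionOutput({
    connectionId: 123,
    projectId: 456,
});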

The following arguments are supported:

connectionId This property is required. int
Connection Identifier
projectId This property is required. int
Project ID the connection is created in

The names and types above are the canonical schema forms; each SDK adapts them to its own conventions: ConnectionId int in C# and Go, connectionId Integer in Java, connectionId number in TypeScript/JavaScript, connection_id int in Python, and connectionId Number in YAML.

getBigQueryConnection Result

The following output properties are available, again in canonical schema form (PascalCase names in C# and Go, camelCase in Java, TypeScript/JavaScript, and YAML, snake_case in Python, with int/bool/string mapped to each language's corresponding types):

authProviderX509CertUrl string
Auth Provider X509 Cert URL for the Service Account
authUri string
Auth URI for the Service Account
clientEmail string
Service Account email
clientId string
Client ID of the Service Account
clientX509CertUrl string
Client X509 Cert URL for the Service Account
connectionId int
Connection Identifier
dataprocClusterName string
Dataproc cluster name for PySpark workloads
dataprocRegion string
Google Cloud region for PySpark workloads on Dataproc
executionProject string
Project to bill for query execution
gcpProjectId string
GCP project ID
gcsBucket string
URI for a Google Cloud Storage bucket to host Python code executed via Dataproc
id string
The provider-assigned unique ID for this managed resource.
isActive bool
Whether the connection is active
isConfiguredForOauth bool
Whether the connection is configured for OAuth
location string
Location in which new Datasets are created
maximumBytesBilled int
Maximum number of bytes that can be billed for a given BigQuery query
name string
Connection name
priority string
The priority with which to execute BigQuery queries
privateKey string
Private key of the Service Account
privateKeyId string
Private key ID of the Service Account
projectId int
Project ID the connection is created in
retries int
Number of retries for queries
timeoutSeconds int
Timeout in seconds for queries
tokenUri string
Token URI for the Service Account
type string
The type of connection

Package Details

Repository
dbtcloud pulumi/pulumi-dbtcloud
License
Apache-2.0
Notes
This Pulumi package is based on the dbtcloud Terraform Provider.