feat: add Snowflake privatelink host injection and fix BigQuery preflight check #145
Open
MiroCillik wants to merge 6 commits into master from
Conversation
Force-pushed from b4b9a1e to e9d13a3
…e remoteDwh
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Force-pushed from e9d13a3 to fa1ccc3
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
`dataset->reload()` only tests read permission (`bigquery.datasets.get`), but dbt needs write permission (`bigquery.datasets.create`) for `CREATE SCHEMA IF NOT EXISTS`. GCP IAM propagates read permissions faster than write, so the check passed but dbt still failed with 403. Replaced with `dataset->update([])`, which requires `bigquery.datasets.update` — same IAM role as `datasets.create`.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
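The failure mode described above (a read-only probe passes while the write path still returns 403 because IAM propagates the two permissions independently) can be sketched in Python, even though the project itself is PHP. `Forbidden`, `wait_for_write_access`, and `SlowIamDataset` are hypothetical names for illustration only, not the PR's actual API:

```python
import time

class Forbidden(Exception):
    """Stand-in for a 403 from the BigQuery API."""

def wait_for_write_access(write_probe, retries=5, delay=0.0):
    """Retry a *write* probe until IAM has propagated.

    The probe must be a write call (the analogue of dataset.update),
    because read permission (datasets.get) can propagate before write
    permission does, so a read-only probe like dataset.reload can pass
    while dbt's CREATE SCHEMA still fails with 403.
    """
    for _ in range(retries):
        try:
            write_probe()
            return True
        except Forbidden:
            time.sleep(delay)
    return False

# Usage with a stub that simulates slow IAM propagation:
class SlowIamDataset:
    def __init__(self, ready_after):
        self.calls = 0
        self.ready_after = ready_after

    def update(self):
        self.calls += 1
        if self.calls < self.ready_after:
            raise Forbidden("403: missing bigquery.datasets.update")

ds = SlowIamDataset(ready_after=3)
print(wait_for_write_access(ds.update, retries=5))  # True (succeeds on the 3rd attempt)
```

With a read-only probe in place of `write_probe`, the loop would return `True` immediately while the write path is still broken, which is exactly the bug this commit fixes.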
The dbt `host` parameter is only needed for privatelink connections, where the actual hostname differs from what dbt infers from the account name. For regular Snowflake URLs, `account` alone is sufficient.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
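The decision above can be sketched as a small Python function (the project is PHP); the function name and the exact detection rule, comparing the workspace host against the hostname dbt would infer from the account, are assumptions for illustration, not the PR's real logic:

```python
def profile_host_override(account: str, workspace_host: str):
    """Return a `host` value to inject into profiles.yml, or None
    when dbt can infer the hostname from `account` alone."""
    inferred = f"{account}.snowflakecomputing.com"
    if workspace_host == inferred:
        return None           # regular connection: `account` is sufficient
    return workspace_host     # privatelink: the real hostname differs

# Regular URL: no host override needed.
print(profile_host_override(
    "xy12345.eu-central-1",
    "xy12345.eu-central-1.snowflakecomputing.com"))   # None

# Privatelink URL: inject the real hostname.
print(profile_host_override(
    "xy12345.eu-central-1",
    "xy12345.eu-central-1.privatelink.snowflakecomputing.com"))
```

Only in the second case would the generated `profiles.yml` carry an explicit `host` key alongside `account`.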
MiroCillik commented on Apr 20, 2026
```diff
 try {
     $retryProxy->call(function () use ($dataset): void {
-        $dataset->reload();
+        $dataset->update([]);
```
This is actually a fix for the retry from the previous BigQuery PR: the `reload` only verified the read permission, and the write permissions propagate more slowly, so the retry did not work correctly.
The previous check (`dataset->update`) tested a dataset-level permission (`bigquery.datasets.update`), but dbt's `CREATE SCHEMA IF NOT EXISTS` requires the project-level `bigquery.datasets.create` permission. These are different IAM bindings that propagate independently. Now creates a temporary probe dataset and deletes it, testing the exact permission dbt needs. A 409 (already exists) also proves the permission.
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
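The probe-dataset approach described above can be sketched in Python with a stubbed client; `can_create_datasets`, `FakeClient`, and the exception classes are hypothetical stand-ins for illustration, not the PR's real code:

```python
import uuid

class Forbidden(Exception):
    """Stand-in for a 403 response."""

class Conflict(Exception):
    """Stand-in for a 409 (dataset already exists) response."""

def can_create_datasets(client) -> bool:
    """Probe the project-level bigquery.datasets.create permission by
    actually creating (and then deleting) a throwaway dataset, i.e. the
    exact operation dbt's CREATE SCHEMA IF NOT EXISTS performs."""
    probe_name = f"probe_{uuid.uuid4().hex[:8]}"
    try:
        client.create_dataset(probe_name)
    except Conflict:
        return True    # a 409 still proves we hold datasets.create
    except Forbidden:
        return False   # 403: permission not propagated yet, retry later
    client.delete_dataset(probe_name)  # clean up the probe dataset
    return True

# Usage with a stub client:
class FakeClient:
    def __init__(self, error=None):
        self.error = error
        self.deleted = []
    def create_dataset(self, name):
        if self.error:
            raise self.error
    def delete_dataset(self, name):
        self.deleted.append(name)

print(can_create_datasets(FakeClient()))             # True
print(can_create_datasets(FakeClient(Conflict())))   # True
print(can_create_datasets(FakeClient(Forbidden())))  # False
```

The key design point from the comment: unlike a `datasets.update` probe, this exercises the same project-level IAM binding that dbt's schema creation needs, so a passing probe guarantees dbt will not 403.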
AJDA-2149
Follow-up to #132
Summary
- Inject the `host` parameter into `profiles.yml` only for privatelink connections, where the hostname differs from what dbt infers from the account name
- Use `dataset->update()` (a write operation) instead of `dataset->reload()` (read-only) to properly verify write permissions before dbt runs `CREATE SCHEMA IF NOT EXISTS`
- Extend `DbtProfilesYaml::dumpYaml()` to accept `additionalOptions` injected into every output after merge

Test plan
- `DbtYamlCreateTest` (profiles with host for privatelink URLs)
- `LocalBigQueryProviderTest` (dataset write permission check)
- `tests-in-kbc` passes (BigQuery jobs no longer fail with 403)

Release Notes
Justification
- The `host` parameter override is needed only for privatelink — regular Snowflake connections work fine with just the account.
- The old check used a read permission (`datasets.get`) to verify eventual consistency of IAM permissions, but dbt needs the write permission (`datasets.create`). GCP IAM propagates read permissions faster than write, so the check passed but dbt still failed with 403.

Plans for Customer Communication
Impact Analysis
Deployment Plan
Rollback Plan
Post-Release Support Plan
🤖 Generated with Claude Code