Migrate Databricks from sqlalchemy-databricks to databricks-sqlalchemy #26896
Conversation
```python
url = f"{connection.scheme.value}://{connection.hostPort}"
if connection.catalog:
    url = f"{url}?catalog={connection.catalog}"
return url
```
⚠️ Edge Case: Catalog value not URL-encoded in connection URL
The get_connection_url functions in both Databricks and Unity Catalog connection modules directly interpolate connection.catalog into the URL query string without URL-encoding. If a catalog name contains special characters (&, =, #, %, spaces), this will produce a malformed URL or cause SQLAlchemy to misparse the query parameters.
The codebase already uses quote_plus for URL parameters elsewhere (e.g., get_connection_url_common in builders.py).
Suggested fix:

```python
from urllib.parse import quote_plus

def get_connection_url(connection) -> str:
    url = f"{connection.scheme.value}://{connection.hostPort}"
    if connection.catalog:
        url = f"{url}?catalog={quote_plus(connection.catalog)}"
    return url
```
```python
# Suppress noisy deprecation warning from databricks-sqlalchemy using
# the deprecated '_user_agent_entry' parameter internally
logging.getLogger("databricks.sql.session").setLevel(logging.ERROR)
```
💡 Quality: Module-level log suppression is too broad
Setting logging.getLogger('databricks.sql.session').setLevel(logging.ERROR) at module level suppresses all WARNING and INFO messages from the Databricks SQL session logger globally and permanently, not just the _user_agent_entry deprecation warning. This could hide legitimate warnings about connection issues, timeouts, or other important diagnostic information from the driver.
A more targeted approach would use a warnings.filterwarnings call to suppress only the specific deprecation warning.
Suggested fix:

```python
import warnings

warnings.filterwarnings(
    "ignore",
    message=".*_user_agent_entry.*",
    category=DeprecationWarning,
)
```
Pull request overview
Migrates OpenMetadata’s Databricks-related connectors (Databricks, Unity Catalog, Databricks Pipeline) from the unmaintained sqlalchemy-databricks dialect to the official databricks-sqlalchemy dialect, updating connection URL scheme semantics and adapting profiler/ingestion logic for SQLAlchemy 2.0 compatibility.
Changes:
- Updated Databricks/Unity Catalog connection scheme from `databricks+connector` to `databricks` across JSON schemas, ingestion code, unit tests, and DB migrations.
- Adjusted connection URL generation to pass `catalog` via URL query parameter, and updated profiler compiler integration away from PyHive.
- Updated ingestion internals for SQLAlchemy 2.0 compatibility (Row handling, Column parenting) and removed legacy dialect preinstalls from CI images/actions.
Reviewed changes
Copilot reviewed 20 out of 20 changed files in this pull request and generated 7 comments.
| File | Description |
|---|---|
| openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/unityCatalogConnection.json | Updates Unity Catalog scheme enum/default to databricks. |
| openmetadata-spec/src/main/resources/json/schema/entity/services/connections/database/databricksConnection.json | Updates Databricks scheme enum/default to databricks. |
| openmetadata-spec/src/main/resources/json/schema/entity/applications/configuration/external/metadataExporterConnectors/databricksConnection.json | Aligns exporter connector schema scheme enum/default to databricks. |
| ingestion/tests/unit/topology/database/test_databricks.py | Updates expected scheme/URL in Databricks unit tests. |
| ingestion/tests/unit/topology/database/test_databricks_migration.py | Adds unit coverage for scheme enum, _type_map, default scheme, and pipeline URL scheme. |
| ingestion/tests/unit/test_source_connection.py | Updates Databricks URL expectations; adds Unity Catalog and pipeline URL coverage including catalog query param. |
| ingestion/tests/unit/observability/profiler/sqlalchemy/databricks/test_visit_column.py | Switches compiler mocking from PyHive HiveCompiler to SQLAlchemy SQLCompiler. |
| ingestion/src/metadata/profiler/interface/sqlalchemy/databricks/profiler_interface.py | Updates compiler integration to work with databricks-sqlalchemy statement compiler and SQLAlchemy 2.0. |
| ingestion/src/metadata/mixins/sqalchemy/sqa_mixin.py | Changes Databricks/Unity Catalog catalog selection DDL execution. |
| ingestion/src/metadata/ingestion/source/pipeline/databrickspipeline/connection.py | Updates pipeline connection URL scheme to databricks and adds log suppression. |
| ingestion/src/metadata/ingestion/source/database/unitycatalog/connection.py | Appends catalog as URL query param and adds log suppression. |
| ingestion/src/metadata/ingestion/source/database/databricks/metadata.py | Replaces PyHive _type_map; updates dialect import; fixes SQLAlchemy 2.0 Row iteration for comments/descriptions. |
| ingestion/src/metadata/ingestion/source/database/databricks/connection.py | Appends catalog as URL query param and adds log suppression. |
| ingestion/src/metadata/ingestion/source/database/common/data_diff/databricks_base.py | Updates default scheme fallback from databricks+connector to databricks. |
| ingestion/setup.py | Adds databricks-sqlalchemy dependency; updates connector versions; removes PyHive from databricks extra. |
| ingestion/operators/docker/Dockerfile.ci | Removes preinstall of legacy sqlalchemy-databricks dialect. |
| ingestion/Dockerfile.ci | Removes preinstall of legacy sqlalchemy-databricks dialect. |
| bootstrap/sql/migrations/native/1.13.0/postgres/schemaChanges.sql | Migrates stored Databricks/UnityCatalog scheme values to databricks in Postgres. |
| bootstrap/sql/migrations/native/1.13.0/mysql/schemaChanges.sql | Migrates stored Databricks/UnityCatalog scheme values to databricks in MySQL. |
| .github/actions/setup-openmetadata-test-environment/action.yml | Removes preinstall of legacy sqlalchemy-databricks in test environment setup. |
```diff
 if isinstance(
     self.service_connection_config,
     (UnityCatalogConnection, DatabricksConnection),
 ):
-    session.execute(
-        text("USE CATALOG :catalog"),
-        {"catalog": self.service_connection_config.catalog},
-    ).first()
+    catalog = self.service_connection_config.catalog
+    session.execute(text(f"USE CATALOG `{catalog}`"))
```
`set_catalog` now always executes `USE CATALOG` even when `catalog` is unset, which will run ``USE CATALOG None`` for connections without a catalog. Also, interpolating `catalog` directly into SQL is unsafe (identifier quoting / injection) and inconsistent with other Databricks/Unity Catalog codepaths that use the dialect identifier preparer to quote identifiers safely. Consider (a) guarding on a truthy `catalog` and (b) quoting via `session.bind.dialect.identifier_preparer.quote(...)` (or equivalent) and escaping backticks rather than direct f-string interpolation.
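Both suggestions can be combined in a small helper. This is a minimal sketch, not the actual OpenMetadata code: the function name is hypothetical, and it escapes backticks by doubling them (Databricks-style identifier quoting) instead of going through the dialect's identifier preparer.

```python
def set_catalog_statement(catalog):
    """Build a safe USE CATALOG statement, or None when no catalog is set.

    Hypothetical helper for illustration: guards against an unset catalog
    (avoiding the literal `USE CATALOG None`) and escapes embedded backticks
    by doubling them rather than interpolating the raw value.
    """
    if not catalog:  # guard on a truthy catalog
        return None
    escaped = catalog.replace("`", "``")
    return f"USE CATALOG `{escaped}`"

print(set_catalog_statement("my_catalog"))  # USE CATALOG `my_catalog`
print(set_catalog_statement("odd`name"))    # USE CATALOG `odd``name`
print(set_catalog_statement(None))          # None
```

In real code, `session.bind.dialect.identifier_preparer.quote(...)` would be preferable since it applies the dialect's own quoting rules.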
```diff
 def get_connection_url(connection: DatabricksConnection) -> str:
-    return f"{connection.scheme.value}://{connection.hostPort}"
+    url = f"{connection.scheme.value}://{connection.hostPort}"
+    if connection.catalog:
+        url = f"{url}?catalog={connection.catalog}"
+    return url
```
Building the connection URL with ?catalog={connection.catalog} does not URL-encode the catalog value. Catalog names containing spaces or reserved URL characters will produce an invalid URL and can break SQLAlchemy parsing. Consider using urllib.parse.urlencode/quote when appending query parameters.
```python
def get_connection_url(connection: UnityCatalogConnection) -> str:
    url = f"{connection.scheme.value}://{connection.hostPort}"
    if connection.catalog:
        url = f"{url}?catalog={connection.catalog}"
    return url
```
Building the connection URL with ?catalog={connection.catalog} does not URL-encode the catalog value. Catalog names containing spaces or reserved URL characters will produce an invalid URL and can break SQLAlchemy parsing. Consider using urllib.parse.urlencode/quote when appending query parameters.
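The `urllib.parse.urlencode` approach mentioned above can be sketched as follows. The function signature here is hypothetical, written with plain string arguments rather than the actual connection object so it runs standalone:

```python
from urllib.parse import urlencode

def get_connection_url(scheme, host_port, catalog=None):
    # Hypothetical stand-in for the reviewed helper, using plain strings.
    url = f"{scheme}://{host_port}"
    if catalog:
        # urlencode percent-escapes reserved URL characters (&, =, #, %)
        # and encodes spaces, producing a valid query string.
        url = f"{url}?{urlencode({'catalog': catalog})}"
    return url

print(get_connection_url("databricks", "host:443", "my catalog"))
# databricks://host:443?catalog=my+catalog
print(get_connection_url("databricks", "host:443", "a&b=c"))
# databricks://host:443?catalog=a%26b%3Dc
```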
```python
# Suppress noisy deprecation warning from databricks-sqlalchemy using
# the deprecated '_user_agent_entry' parameter internally
logging.getLogger("databricks.sql.session").setLevel(logging.ERROR)
```
Setting databricks.sql.session logger level at import time is a global side effect (affects all callers) and may hide useful INFO/WARN logs for debugging Databricks connectivity issues. If the goal is to suppress a specific deprecation warning, prefer filtering the specific warning/message (e.g., warnings.filterwarnings) or applying a targeted log filter closer to connection initialization rather than changing the logger’s level module-wide.
```python
# Suppress noisy deprecation warning from databricks-sqlalchemy using
# the deprecated '_user_agent_entry' parameter internally
logging.getLogger("databricks.sql.session").setLevel(logging.ERROR)
```
Setting databricks.sql.session logger level at import time is a global side effect (affects all callers) and may hide useful INFO/WARN logs for debugging Databricks connectivity issues. If the goal is to suppress a specific deprecation warning, prefer filtering the specific warning/message (e.g., warnings.filterwarnings) or applying a targeted log filter closer to connection initialization rather than changing the logger’s level module-wide.
```python
# Suppress noisy deprecation warning from databricks-sqlalchemy using
# the deprecated '_user_agent_entry' parameter internally
logging.getLogger("databricks.sql.session").setLevel(logging.ERROR)
```
Setting databricks.sql.session logger level at import time is a global side effect (affects all callers) and may hide useful INFO/WARN logs for debugging Databricks connectivity issues. If the goal is to suppress a specific deprecation warning, prefer filtering the specific warning/message (e.g., warnings.filterwarnings) or applying a targeted log filter closer to connection initialization rather than changing the logger’s level module-wide.
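A targeted log filter of the kind suggested above could look like this. It is a sketch under the assumption that the deprecation message contains the string `_user_agent_entry`; unrelated WARNING/INFO records from the same logger still get through:

```python
import logging

class DropUserAgentDeprecation(logging.Filter):
    """Drop only records mentioning '_user_agent_entry'; keep all other
    output from the Databricks session logger."""

    def filter(self, record):
        # Returning False suppresses the record; True lets it through.
        return "_user_agent_entry" not in record.getMessage()

# Attach the filter instead of raising the logger's level module-wide.
session_logger = logging.getLogger("databricks.sql.session")
session_logger.addFilter(DropUserAgentDeprecation())
session_logger.propagate = False  # keep the demo output self-contained

# Demo: capture what survives the filter.
captured = []

class _Capture(logging.Handler):
    def emit(self, record):
        captured.append(record.getMessage())

session_logger.addHandler(_Capture())
session_logger.warning("_user_agent_entry is deprecated")  # filtered out
session_logger.warning("connection timed out")             # kept
```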
```python
from databricks.sqlalchemy._ddl import DatabricksStatementCompiler

DatabricksStatementCompiler.visit_column = (
    DatabricksProfilerInterface.visit_column
)
DatabricksStatementCompiler.visit_table = (
    DatabricksProfilerInterface.visit_table
)
```
Importing DatabricksStatementCompiler from databricks.sqlalchemy._ddl relies on a private module (_ddl). This is fragile across databricks-sqlalchemy upgrades and can break at runtime if internals move. Consider guarding the import (try/except) and/or retrieving the statement compiler class from the dialect/engine in a supported way if available.
Suggested change:

```diff
-from databricks.sqlalchemy._ddl import DatabricksStatementCompiler
-
-DatabricksStatementCompiler.visit_column = (
-    DatabricksProfilerInterface.visit_column
-)
-DatabricksStatementCompiler.visit_table = (
-    DatabricksProfilerInterface.visit_table
-)
+# Override the Databricks statement compiler's visit methods to handle
+# struct columns and table names more robustly. Instead of importing
+# DatabricksStatementCompiler from a private module, retrieve the
+# active statement compiler class from the dialect in a supported way.
+try:
+    bind = getattr(self.session, "bind", None)
+    dialect = getattr(bind, "dialect", None)
+    compiler_cls = getattr(dialect, "statement_compiler", None)
+    if compiler_cls is not None:
+        compiler_cls.visit_column = DatabricksProfilerInterface.visit_column
+        compiler_cls.visit_table = DatabricksProfilerInterface.visit_table
+    else:
+        logger.debug(
+            "DatabricksProfilerInterface: dialect has no statement_compiler; "
+            "skipping compiler monkey-patching."
+        )
+except Exception as exc:  # Defensive: do not break initialization
+    logger.debug(
+        "DatabricksProfilerInterface: failed to patch statement compiler: %r",
+        exc,
+    )
```
✅ TypeScript Types Auto-Updated

The generated TypeScript types have been automatically updated based on JSON schema changes in this PR.
Code Review
The Python checkstyle failed. Please run You can install the pre-commit hooks with |
🛡️ TRIVY SCAN RESULT 🛡️
Target:

| Package | Vulnerability ID | Severity | Installed Version | Fixed Version |
|---|---|---|---|---|
| com.fasterxml.jackson.core:jackson-core | CVE-2025-52999 | 🚨 HIGH | 2.12.7 | 2.15.0 |
| com.fasterxml.jackson.core:jackson-core | CVE-2025-52999 | 🚨 HIGH | 2.13.4 | 2.15.0 |
| com.fasterxml.jackson.core:jackson-databind | CVE-2022-42003 | 🚨 HIGH | 2.12.7 | 2.12.7.1, 2.13.4.2 |
| com.fasterxml.jackson.core:jackson-databind | CVE-2022-42004 | 🚨 HIGH | 2.12.7 | 2.12.7.1, 2.13.4 |
| com.google.code.gson:gson | CVE-2022-25647 | 🚨 HIGH | 2.2.4 | 2.8.9 |
| com.google.protobuf:protobuf-java | CVE-2021-22569 | 🚨 HIGH | 3.3.0 | 3.16.1, 3.18.2, 3.19.2 |
| com.google.protobuf:protobuf-java | CVE-2022-3509 | 🚨 HIGH | 3.3.0 | 3.16.3, 3.19.6, 3.20.3, 3.21.7 |
| com.google.protobuf:protobuf-java | CVE-2022-3510 | 🚨 HIGH | 3.3.0 | 3.16.3, 3.19.6, 3.20.3, 3.21.7 |
| com.google.protobuf:protobuf-java | CVE-2024-7254 | 🚨 HIGH | 3.3.0 | 3.25.5, 4.27.5, 4.28.2 |
| com.google.protobuf:protobuf-java | CVE-2021-22569 | 🚨 HIGH | 3.7.1 | 3.16.1, 3.18.2, 3.19.2 |
| com.google.protobuf:protobuf-java | CVE-2022-3509 | 🚨 HIGH | 3.7.1 | 3.16.3, 3.19.6, 3.20.3, 3.21.7 |
| com.google.protobuf:protobuf-java | CVE-2022-3510 | 🚨 HIGH | 3.7.1 | 3.16.3, 3.19.6, 3.20.3, 3.21.7 |
| com.google.protobuf:protobuf-java | CVE-2024-7254 | 🚨 HIGH | 3.7.1 | 3.25.5, 4.27.5, 4.28.2 |
| com.nimbusds:nimbus-jose-jwt | CVE-2023-52428 | 🚨 HIGH | 9.8.1 | 9.37.2 |
| com.squareup.okhttp3:okhttp | CVE-2021-0341 | 🚨 HIGH | 3.12.12 | 4.9.2 |
| commons-beanutils:commons-beanutils | CVE-2025-48734 | 🚨 HIGH | 1.9.4 | 1.11.0 |
| commons-io:commons-io | CVE-2024-47554 | 🚨 HIGH | 2.8.0 | 2.14.0 |
| dnsjava:dnsjava | CVE-2024-25638 | 🚨 HIGH | 2.1.7 | 3.6.0 |
| io.airlift:aircompressor | CVE-2025-67721 | 🚨 HIGH | 0.27 | 2.0.3 |
| io.netty:netty-codec-http | CVE-2026-33870 | 🚨 HIGH | 4.1.96.Final | 4.1.132.Final, 4.2.10.Final |
| io.netty:netty-codec-http2 | CVE-2025-55163 | 🚨 HIGH | 4.1.96.Final | 4.2.4.Final, 4.1.124.Final |
| io.netty:netty-codec-http2 | CVE-2026-33871 | 🚨 HIGH | 4.1.96.Final | 4.1.132.Final, 4.2.11.Final |
| io.netty:netty-codec-http2 | GHSA-xpw8-rcwv-8f8p | 🚨 HIGH | 4.1.96.Final | 4.1.100.Final |
| io.netty:netty-handler | CVE-2025-24970 | 🚨 HIGH | 4.1.96.Final | 4.1.118.Final |
| net.minidev:json-smart | CVE-2021-31684 | 🚨 HIGH | 1.3.2 | 1.3.3, 2.4.4 |
| net.minidev:json-smart | CVE-2023-1370 | 🚨 HIGH | 1.3.2 | 2.4.9 |
| org.apache.avro:avro | CVE-2024-47561 | 🔥 CRITICAL | 1.7.7 | 1.11.4 |
| org.apache.avro:avro | CVE-2023-39410 | 🚨 HIGH | 1.7.7 | 1.11.3 |
| org.apache.derby:derby | CVE-2022-46337 | 🔥 CRITICAL | 10.14.2.0 | 10.14.3, 10.15.2.1, 10.16.1.2, 10.17.1.0 |
| org.apache.ivy:ivy | CVE-2022-46751 | 🚨 HIGH | 2.5.1 | 2.5.2 |
| org.apache.mesos:mesos | CVE-2018-1330 | 🚨 HIGH | 1.4.3 | 1.6.0 |
| org.apache.spark:spark-core_2.12 | CVE-2025-54920 | 🚨 HIGH | 3.5.6 | 3.5.7 |
| org.apache.thrift:libthrift | CVE-2019-0205 | 🚨 HIGH | 0.12.0 | 0.13.0 |
| org.apache.thrift:libthrift | CVE-2020-13949 | 🚨 HIGH | 0.12.0 | 0.14.0 |
| org.apache.zookeeper:zookeeper | CVE-2023-44981 | 🔥 CRITICAL | 3.6.3 | 3.7.2, 3.8.3, 3.9.1 |
| org.eclipse.jetty:jetty-server | CVE-2024-13009 | 🚨 HIGH | 9.4.56.v20240826 | 9.4.57.v20241219 |
| org.lz4:lz4-java | CVE-2025-12183 | 🚨 HIGH | 1.8.0 | 1.8.1 |
🛡️ TRIVY SCAN RESULT 🛡️
Target: Node.js
No Vulnerabilities Found
🛡️ TRIVY SCAN RESULT 🛡️
Target: Python
Vulnerabilities (13)
| Package | Vulnerability ID | Severity | Installed Version | Fixed Version |
|---|---|---|---|---|
| apache-airflow | CVE-2026-26929 | 🚨 HIGH | 3.1.7 | 3.1.8 |
| apache-airflow | CVE-2026-28779 | 🚨 HIGH | 3.1.7 | 3.1.8 |
| apache-airflow | CVE-2026-30911 | 🚨 HIGH | 3.1.7 | 3.1.8 |
| cryptography | CVE-2026-26007 | 🚨 HIGH | 42.0.8 | 46.0.5 |
| jaraco.context | CVE-2026-23949 | 🚨 HIGH | 5.3.0 | 6.1.0 |
| jaraco.context | CVE-2026-23949 | 🚨 HIGH | 6.0.1 | 6.1.0 |
| pyOpenSSL | CVE-2026-27459 | 🚨 HIGH | 24.1.0 | 26.0.0 |
| starlette | CVE-2025-62727 | 🚨 HIGH | 0.48.0 | 0.49.1 |
| urllib3 | CVE-2025-66418 | 🚨 HIGH | 1.26.20 | 2.6.0 |
| urllib3 | CVE-2025-66471 | 🚨 HIGH | 1.26.20 | 2.6.0 |
| urllib3 | CVE-2026-21441 | 🚨 HIGH | 1.26.20 | 2.6.3 |
| wheel | CVE-2026-24049 | 🚨 HIGH | 0.45.1 | 0.46.2 |
| wheel | CVE-2026-24049 | 🚨 HIGH | 0.45.1 | 0.46.2 |
🛡️ TRIVY SCAN RESULT 🛡️
Target: /etc/ssl/private/ssl-cert-snakeoil.key
No Vulnerabilities Found
🛡️ TRIVY SCAN RESULT 🛡️
Target: /ingestion/pipelines/extended_sample_data.yaml
No Vulnerabilities Found
🛡️ TRIVY SCAN RESULT 🛡️
Target: /ingestion/pipelines/lineage.yaml
No Vulnerabilities Found
🛡️ TRIVY SCAN RESULT 🛡️
Target: /ingestion/pipelines/sample_data.json
No Vulnerabilities Found
🛡️ TRIVY SCAN RESULT 🛡️
Target: /ingestion/pipelines/sample_data.yaml
No Vulnerabilities Found
🛡️ TRIVY SCAN RESULT 🛡️
Target: /ingestion/pipelines/sample_data_aut.yaml
No Vulnerabilities Found
🛡️ TRIVY SCAN RESULT 🛡️
Target: /ingestion/pipelines/sample_usage.json
No Vulnerabilities Found
🛡️ TRIVY SCAN RESULT 🛡️
Target: /ingestion/pipelines/sample_usage.yaml
No Vulnerabilities Found
🛡️ TRIVY SCAN RESULT 🛡️
Target: /ingestion/pipelines/sample_usage_aut.yaml
No Vulnerabilities Found
🛡️ TRIVY SCAN RESULT 🛡️
Target:

| Package | Vulnerability ID | Severity | Installed Version | Fixed Version |
|---|---|---|---|---|
| com.fasterxml.jackson.core:jackson-core | CVE-2025-52999 | 🚨 HIGH | 2.12.7 | 2.15.0 |
| com.fasterxml.jackson.core:jackson-core | CVE-2025-52999 | 🚨 HIGH | 2.13.4 | 2.15.0 |
| com.fasterxml.jackson.core:jackson-databind | CVE-2022-42003 | 🚨 HIGH | 2.12.7 | 2.12.7.1, 2.13.4.2 |
| com.fasterxml.jackson.core:jackson-databind | CVE-2022-42004 | 🚨 HIGH | 2.12.7 | 2.12.7.1, 2.13.4 |
| com.google.code.gson:gson | CVE-2022-25647 | 🚨 HIGH | 2.2.4 | 2.8.9 |
| com.google.protobuf:protobuf-java | CVE-2021-22569 | 🚨 HIGH | 3.3.0 | 3.16.1, 3.18.2, 3.19.2 |
| com.google.protobuf:protobuf-java | CVE-2022-3509 | 🚨 HIGH | 3.3.0 | 3.16.3, 3.19.6, 3.20.3, 3.21.7 |
| com.google.protobuf:protobuf-java | CVE-2022-3510 | 🚨 HIGH | 3.3.0 | 3.16.3, 3.19.6, 3.20.3, 3.21.7 |
| com.google.protobuf:protobuf-java | CVE-2024-7254 | 🚨 HIGH | 3.3.0 | 3.25.5, 4.27.5, 4.28.2 |
| com.google.protobuf:protobuf-java | CVE-2021-22569 | 🚨 HIGH | 3.7.1 | 3.16.1, 3.18.2, 3.19.2 |
| com.google.protobuf:protobuf-java | CVE-2022-3509 | 🚨 HIGH | 3.7.1 | 3.16.3, 3.19.6, 3.20.3, 3.21.7 |
| com.google.protobuf:protobuf-java | CVE-2022-3510 | 🚨 HIGH | 3.7.1 | 3.16.3, 3.19.6, 3.20.3, 3.21.7 |
| com.google.protobuf:protobuf-java | CVE-2024-7254 | 🚨 HIGH | 3.7.1 | 3.25.5, 4.27.5, 4.28.2 |
| com.nimbusds:nimbus-jose-jwt | CVE-2023-52428 | 🚨 HIGH | 9.8.1 | 9.37.2 |
| com.squareup.okhttp3:okhttp | CVE-2021-0341 | 🚨 HIGH | 3.12.12 | 4.9.2 |
| commons-beanutils:commons-beanutils | CVE-2025-48734 | 🚨 HIGH | 1.9.4 | 1.11.0 |
| commons-io:commons-io | CVE-2024-47554 | 🚨 HIGH | 2.8.0 | 2.14.0 |
| dnsjava:dnsjava | CVE-2024-25638 | 🚨 HIGH | 2.1.7 | 3.6.0 |
| io.airlift:aircompressor | CVE-2025-67721 | 🚨 HIGH | 0.27 | 2.0.3 |
| io.netty:netty-codec-http | CVE-2026-33870 | 🚨 HIGH | 4.1.96.Final | 4.1.132.Final, 4.2.10.Final |
| io.netty:netty-codec-http2 | CVE-2025-55163 | 🚨 HIGH | 4.1.96.Final | 4.2.4.Final, 4.1.124.Final |
| io.netty:netty-codec-http2 | CVE-2026-33871 | 🚨 HIGH | 4.1.96.Final | 4.1.132.Final, 4.2.11.Final |
| io.netty:netty-codec-http2 | GHSA-xpw8-rcwv-8f8p | 🚨 HIGH | 4.1.96.Final | 4.1.100.Final |
| io.netty:netty-handler | CVE-2025-24970 | 🚨 HIGH | 4.1.96.Final | 4.1.118.Final |
| net.minidev:json-smart | CVE-2021-31684 | 🚨 HIGH | 1.3.2 | 1.3.3, 2.4.4 |
| net.minidev:json-smart | CVE-2023-1370 | 🚨 HIGH | 1.3.2 | 2.4.9 |
| org.apache.avro:avro | CVE-2024-47561 | 🔥 CRITICAL | 1.7.7 | 1.11.4 |
| org.apache.avro:avro | CVE-2023-39410 | 🚨 HIGH | 1.7.7 | 1.11.3 |
| org.apache.derby:derby | CVE-2022-46337 | 🔥 CRITICAL | 10.14.2.0 | 10.14.3, 10.15.2.1, 10.16.1.2, 10.17.1.0 |
| org.apache.ivy:ivy | CVE-2022-46751 | 🚨 HIGH | 2.5.1 | 2.5.2 |
| org.apache.mesos:mesos | CVE-2018-1330 | 🚨 HIGH | 1.4.3 | 1.6.0 |
| org.apache.spark:spark-core_2.12 | CVE-2025-54920 | 🚨 HIGH | 3.5.6 | 3.5.7 |
| org.apache.thrift:libthrift | CVE-2019-0205 | 🚨 HIGH | 0.12.0 | 0.13.0 |
| org.apache.thrift:libthrift | CVE-2020-13949 | 🚨 HIGH | 0.12.0 | 0.14.0 |
| org.apache.zookeeper:zookeeper | CVE-2023-44981 | 🔥 CRITICAL | 3.6.3 | 3.7.2, 3.8.3, 3.9.1 |
| org.eclipse.jetty:jetty-server | CVE-2024-13009 | 🚨 HIGH | 9.4.56.v20240826 | 9.4.57.v20241219 |
| org.lz4:lz4-java | CVE-2025-12183 | 🚨 HIGH | 1.8.0 | 1.8.1 |
🛡️ TRIVY SCAN RESULT 🛡️
Target: Node.js
No Vulnerabilities Found
🛡️ TRIVY SCAN RESULT 🛡️
Target: Python
Vulnerabilities (24)
| Package | Vulnerability ID | Severity | Installed Version | Fixed Version |
|---|---|---|---|---|
| Authlib | CVE-2026-27962 | 🔥 CRITICAL | 1.6.6 | 1.6.9 |
| Authlib | CVE-2026-28490 | 🚨 HIGH | 1.6.6 | 1.6.9 |
| Authlib | CVE-2026-28498 | 🚨 HIGH | 1.6.6 | 1.6.9 |
| Authlib | CVE-2026-28802 | 🚨 HIGH | 1.6.6 | 1.6.7 |
| PyJWT | CVE-2026-32597 | 🚨 HIGH | 2.11.0 | 2.12.0 |
| Werkzeug | CVE-2024-34069 | 🚨 HIGH | 2.2.3 | 3.0.3 |
| aiohttp | CVE-2025-69223 | 🚨 HIGH | 3.12.12 | 3.13.3 |
| apache-airflow | CVE-2026-26929 | 🚨 HIGH | 3.1.7 | 3.1.8 |
| apache-airflow | CVE-2026-28779 | 🚨 HIGH | 3.1.7 | 3.1.8 |
| apache-airflow | CVE-2026-30911 | 🚨 HIGH | 3.1.7 | 3.1.8 |
| apache-airflow-providers-http | CVE-2025-69219 | 🚨 HIGH | 5.6.4 | 6.0.0 |
| cryptography | CVE-2026-26007 | 🚨 HIGH | 42.0.8 | 46.0.5 |
| jaraco.context | CVE-2026-23949 | 🚨 HIGH | 5.3.0 | 6.1.0 |
| jaraco.context | CVE-2026-23949 | 🚨 HIGH | 6.0.1 | 6.1.0 |
| protobuf | CVE-2026-0994 | 🚨 HIGH | 4.25.8 | 6.33.5, 5.29.6 |
| pyOpenSSL | CVE-2026-27459 | 🚨 HIGH | 24.1.0 | 26.0.0 |
| pyasn1 | CVE-2026-30922 | 🚨 HIGH | 0.6.2 | 0.6.3 |
| ray | CVE-2025-62593 | 🔥 CRITICAL | 2.47.1 | 2.52.0 |
| starlette | CVE-2025-62727 | 🚨 HIGH | 0.48.0 | 0.49.1 |
| tornado | CVE-2026-31958 | 🚨 HIGH | 6.5.4 | 6.5.5 |
| urllib3 | CVE-2025-66418 | 🚨 HIGH | 1.26.20 | 2.6.0 |
| urllib3 | CVE-2025-66471 | 🚨 HIGH | 1.26.20 | 2.6.0 |
| urllib3 | CVE-2026-21441 | 🚨 HIGH | 1.26.20 | 2.6.3 |
| wheel | CVE-2026-24049 | 🚨 HIGH | 0.45.1 | 0.46.2 |
🛡️ TRIVY SCAN RESULT 🛡️
Target: usr/bin/docker
Vulnerabilities (2)
| Package | Vulnerability ID | Severity | Installed Version | Fixed Version |
|---|---|---|---|---|
| stdlib | CVE-2025-68121 | 🔥 CRITICAL | v1.25.6 | 1.24.13, 1.25.7, 1.26.0-rc.3 |
| stdlib | CVE-2026-25679 | 🚨 HIGH | v1.25.6 | 1.25.8, 1.26.1 |
🛡️ TRIVY SCAN RESULT 🛡️
Target: /etc/ssl/private/ssl-cert-snakeoil.key
No Vulnerabilities Found
🛡️ TRIVY SCAN RESULT 🛡️
Target: /home/airflow/openmetadata-airflow-apis/openmetadata_managed_apis.egg-info/PKG-INFO
No Vulnerabilities Found
🔴 Playwright Results — 1 failure(s), 17 flaky

✅ 2827 passed · ❌ 1 failed · 🟡 17 flaky · ⏭️ 194 skipped

Genuine Failures (failed on all attempts)
❌



Summary

- Replaces `sqlalchemy-databricks==0.2.0` (pyhive-based) with the official `databricks-sqlalchemy~=2.0.9`, which has native SQLAlchemy 2.0 support
- Updates the connection scheme from `databricks+connector` to `databricks` across JSON schemas, generated models, frontend types, and Flyway migrations
- Replaces `HiveCompiler` references with `SQLCompiler`/`DatabricksStatementCompiler` in the profiler interface
- Passes the catalog as a URL query parameter (`?catalog=`) so the new dialect's internal methods (`get_pk_constraint`, `_describe_table_extended`) resolve the catalog correctly
- Changes `Row.values()` → `tuple(result)` for SQLAlchemy 2.0 Row compatibility in table/schema comment extraction
- Updates `Column._set_parent()` to pass the required `all_names` and `allow_replacements` kwargs for SQLAlchemy 2.0
- Changes `USE CATALOG :catalog` parameterized DDL to a literal `USE CATALOG` statement for NATIVE paramstyle compatibility
- Suppresses the `_user_agent_entry` deprecation warning from `databricks-sqlalchemy`
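The `Row.values()` → `tuple(result)` change mentioned above follows from SQLAlchemy 2.0 making `Row` a named-tuple-like object without the 1.x `values()` method. A minimal sketch using a `namedtuple` stand-in for a Row, so it runs without a live database connection:

```python
from collections import namedtuple

# Stand-in for a SQLAlchemy 2.0 Row: tuple-like, no .values() method.
Row = namedtuple("Row", ["col_name", "comment"])
row = Row("id", "primary key column")

# 1.x code could call row.values(); in 2.0 the row is iterated directly.
col_name, comment = tuple(row)
print(col_name, comment)  # id primary key column
```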
Test plan

- `_type_map` completeness and complex type registration
- `DatabricksBaseTableParameter` default scheme
- `visit_column`/`visit_table` with new compiler class (`pyhive` remains in hive extras)

🤖 Generated with Claude Code