Snowflake presented several product announcements at its Build developer event, focused on making it easier for companies to develop agentic AI applications.
Among the new features is the general availability of Snowflake Intelligence, a business intelligence agent that lets users answer complex questions in natural language and puts key information within easier reach of a company’s employees.
Additionally, improvements in Snowflake Openflow and Horizon Catalog make it easier to analyze data by connecting structured, unstructured, and semi-structured information from different sources and catalogs, all in a secure, interoperable environment compatible with multiple providers.
Snowflake also announced several enhancements to its native AI and collaboration tools, enabling teams to build, test, and deploy enterprise AI applications faster and more securely, reducing overhead and cost of ownership, all on the same platform.
Innovations in Horizon Catalog provide context for AI along with a unified security and governance framework that secures and connects data across regions, clouds, and formats, ensuring interoperability and avoiding vendor lock-in.
Openflow, for its part, lets business users securely automate the integration and ingestion of data from virtually any source, making it easier to centralize data in the enterprise data lake.
Additionally, by integrating the open Apache Polaris (Incubating) and Apache Iceberg™ REST Catalog APIs directly into Horizon Catalog, Snowflake delivers an enterprise lakehouse that centralizes governance, security, and interoperable access management across data in open table formats.
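Because the catalog is exposed through the open Iceberg REST Catalog API, any compatible client can reach the same governed tables. As a minimal sketch, assuming a PyIceberg client and placeholder endpoint, credentials, and table names (none of these values are published by Snowflake), the connection might look like this:

```python
# pip install "pyiceberg[pyarrow]"
from pyiceberg.catalog import load_catalog

# Connect to an Iceberg REST Catalog endpoint; the URI, credential, and
# warehouse values below are placeholders for illustration only.
catalog = load_catalog(
    "horizon",
    **{
        "type": "rest",
        "uri": "https://<account>.example.com/api/catalog",  # REST endpoint (placeholder)
        "credential": "<client_id>:<client_secret>",         # OAuth client credentials (placeholder)
        "warehouse": "<catalog_name>",                        # catalog to address (placeholder)
    },
)

# Browse and read through the open API; access is still mediated by the
# catalog's governance and security policies.
print(catalog.list_namespaces())
table = catalog.load_table("analytics.events")  # hypothetical namespace.table
print(table.schema())
```

The same open endpoint can be consumed by other Iceberg-aware engines, which is what makes the interoperability claim concrete.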
With Interactive Tables and Warehouses, now in private preview, Snowflake aims to redefine AI application and agent development, while near real-time streaming, coming soon to private preview, will let businesses act on live data quickly.
This makes it possible to combine live data with historical context to drive mission-critical use cases such as fraud detection, personalization, recommendations, observability, and IoT monitoring.
Snowflake is also expanding integration options through its agreement with Oracle, giving customers the ability to use near real-time change data capture, built on Openflow, to continuously stream transactional updates to the Snowflake AI Data Cloud.
Snowflake Postgres, born of the Crunchy Data acquisition, is also arriving soon in preview: a managed service that brings the Postgres database to the company’s platform. Snowflake has also open-sourced the Postgres pg_lake extensions, a move intended to make it easier to integrate Postgres with a lakehouse architecture.
Business Continuity and Disaster Recovery for managed Iceberg tables is likewise now in preview, improving protection of critical business data.
Developers can now optimize their data workflows with Cortex Code (in private preview), a revamped AI assistant in the Snowflake UI that lets users interact with their entire Snowflake environment using natural language.
Cortex Code helps users understand how Snowflake is used, optimize complex queries, and tune their results to maximize cost savings.
With improvements to Snowflake Cortex AISQL, developers can build scalable AI inference pipelines within Snowflake Dynamic Tables using declarative SQL queries.
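As a rough illustration of what such a declarative pipeline can look like, here is a minimal sketch using the Snowflake Python connector to define a Dynamic Table that scores incoming text with a Cortex function. The connection parameters, warehouse, and table and column names are assumptions, and the exact set of AISQL functions available depends on which preview features are enabled in a given account:

```python
# pip install snowflake-connector-python
import snowflake.connector

# Connection parameters are placeholders; substitute real account credentials.
conn = snowflake.connector.connect(
    account="<account>",
    user="<user>",
    password="<password>",
    warehouse="MY_WH",
    database="MY_DB",
    schema="PUBLIC",
)

# Declare a Dynamic Table that refreshes automatically and applies a Cortex
# AI function (sentiment scoring here) to rows arriving in a source table.
conn.cursor().execute("""
    CREATE OR REPLACE DYNAMIC TABLE review_sentiment
      TARGET_LAG = '1 hour'
      WAREHOUSE  = MY_WH
    AS
    SELECT
        id,
        review_text,
        SNOWFLAKE.CORTEX.SENTIMENT(review_text) AS sentiment_score
    FROM raw_reviews
""")
conn.close()
```

Because the pipeline is declarative, Snowflake handles refresh scheduling and incremental processing; the same pattern extends to other AISQL functions for classification, extraction, or completion.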
AI Redact (coming soon to public preview) within Cortex AISQL can detect and redact sensitive information in unstructured data, making it possible to prepare multimodal datasets for AI while maintaining security and privacy.
Snowflake’s centralized development environment, Workspaces, has been enhanced with Git integration, which provides a seamless way to manage version control, and VS Code integration, which lets users work from their preferred integrated development environment (IDE) and share code with the rest of their team.
Lastly, with dbt Projects on Snowflake, companies can develop, test, deploy, and monitor their dbt projects within their Snowflake environment.
