Data is the fuel behind the AI revolution — the foundational building block for the new technological world order. But data is immaterial, difficult to organize, and increasingly hemmed in by walled gardens and regulatory decrees. Businesses seeking to harness AI, therefore, often struggle to make the most of their data, this most vital of resources. Enter Snowflake.
At its annual Snowflake Summit user conference, the company announced the release of Openflow, a new service designed to integrate businesses’ data into a single, unified, and intelligible channel. Like disparate streams flowing into a single river, Openflow takes the whole of a company’s data — structured, unstructured, batch, and streaming — and collects it in such a way that it can be more easily visualized and leveraged.
The platform is also intended to simplify the process of creating new AI systems, including agents, which are able to automatically perform tasks on behalf of human users and work flexibly across an organization’s digital ecosystem.
“With Snowflake Openflow, we’re redefining what open, extensible, and managed data integration looks like, so our customers can quickly build AI-powered apps and agents without leaving their data behind,” Chris Child, VP of Product at Snowflake’s Data Engineering department, said in a statement.
From coal and iron to ones and zeroes
Companies today have to manage a vast amount of data coming from various sources. Every marketing email, internal presentation, customer service interaction, financial statement, video file, and market research survey represents a valuable bit of information that must be collected and stored. The rise of AI has complicated the picture further, as models are trained on the ingestion of this internal, multimodal data.
It’s a bit like an international corporation managing a vast network of mines on different continents. Such a corporation would require an equally vast bureaucracy to ensure that the quota for every individual ore is being met, and that each gets subsequently transported to wherever in the world it needs to go.
Snowflake has sought to occupy the new managerial role at a time when companies no longer primarily depend on physical materials like coal or iron, but on digital information. Openflow is the company’s latest step towards achieving that goal: the platform “makes the process of getting data from where it is created to where it can be used effortless,” the company said in a press release.
Snowflake isn’t the only company with its eye on this burgeoning and valuable niche: Box also recently announced that it will soon release its own AI agents that can help businesses organize and retrieve internal data.
New features
Openflow’s ability to operate across all of an organization’s data streams – “interoperability,” in tech parlance – opens the door to some powerful benefits, the company said.
For one, Openflow will enable customers to build dbt (data build tool) projects directly within the platform, a feature that will soon launch in public preview. It’s also integrated with Apache Iceberg, an open table format that makes it easier to track and manage data files across the full swath of a company’s internal data estate.
And thanks to another new feature called Snowpipe Streaming, now available in public preview, data throughput in Openflow has been ratcheted up to 10 gigabytes per second, with “significantly reduced latency,” according to the company.
All of these features have been built with data security and governance in mind – both of which are key considerations at a time when the proliferation of AI tools is also ramping up cybersecurity risks.