Microsoft today made good on its promise to offer simpler, more efficient pricing for its Microsoft Fabric suite, a new end-to-end platform for analytics and data workloads.
The pricing is based on how much total compute and storage a customer uses, VentureBeat has confirmed. It will not require customers to pay for separate buckets of compute and storage for each of Microsoft’s multiple services.
The move ups the ante for an array of competitors, including Google and Amazon, which fiercely vie for market share with Microsoft. Those competitors offer similar analytics and data products on their own clouds, but they (Amazon in particular) charge customers multiple times for the various, discrete analytics and data tools used on those clouds. And while Google has created its own fabric offering, called Google DataPlex, to avoid charging in buckets, Google’s analytics offering isn’t as comprehensive, said Forrester analyst Noel Yuhanna.
The pricing sheet, which shows set pricing for compute and storage across Fabric, is expected to be published tomorrow on Microsoft’s blog. An example of the pricing for U.S. west 2, which covers part of the West Coast, was obtained early Wednesday by VentureBeat and is embedded at the bottom of this story.
The new pricing comes after Microsoft announced last week that it was integrating its various data and analytics tools into the single Fabric suite. The suite combines six separate tools, including Azure Data Factory, Azure Synapse Analytics and Power BI, into a unified experience and data architecture. The offering is delivered as software-as-a-service (SaaS) and is designed to let engineers and developers more easily extract insights from data and present them to business decision-makers.
A singular data lake based on an open format
Fabric centers around a centralized data lake called Microsoft OneLake that stores a single copy of data in one place. OneLake is built around the open-source Apache Parquet format, allowing for a unified way to store and retrieve data natively across databases. All Fabric workloads are automatically wired into OneLake, just like all Microsoft 365 applications are wired into OneDrive.
This is where the savings come in. OneLake eliminates the need for developers, analysts and business users across a company to create data silos by provisioning and configuring their own storage accounts for the various tools they use.
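To make the single-copy idea concrete, here is a minimal sketch in Python using pandas: a “warehouse” job and a “BI” report both read the same Parquet file directly, with no per-tool extract or copy. The OneLake-style path, table name and columns are hypothetical assumptions for illustration, not Microsoft’s documented interface.

```python
import pandas as pd

# One Parquet file in shared storage; no per-tool copies or exports.
SHARED_TABLE = "onelake/sales_workspace/orders.parquet"  # hypothetical path

def warehouse_summary() -> pd.DataFrame:
    """A 'warehouse'-style aggregation reading the shared table."""
    orders = pd.read_parquet(SHARED_TABLE)
    return orders.groupby("region", as_index=False)["revenue"].sum()

def bi_report() -> pd.DataFrame:
    """A 'BI'-style report reading the same bytes, with no export step."""
    orders = pd.read_parquet(SHARED_TABLE)
    return orders.nlargest(10, "revenue")[["customer", "revenue"]]
```

Because both functions point at the same file, there is nothing to provision, sync or reconcile between the two workloads.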
So, for example, when a user of Microsoft’s business intelligence tool Power BI wants to run analysis on a Microsoft Synapse data warehouse, they no longer send a SQL query to Synapse. Power BI “simply goes to OneLake and pages the data,” according to Arun Ulagaratchagan, Microsoft’s corporate VP of Azure Data, who spoke with VentureBeat Tuesday.
“This does two things for customers,” he continued. “First, there’s pretty substantial performance acceleration, because if there’s no SQL query being executed, it’s simply going to run data that shares the same open format, across both Synapse and Power BI.”
He added, “The second thing is a big cost reduction for customers. Because you’re not paying for the SQL queries, because there are no SQL queries being done…this idea of a lake-centric and open architecture is so powerful to customers because they don’t have to worry about being locked in. They don’t have to worry about the costs piling up.”
Any unused compute capacity from one workload can be used by any of the other workloads.
Fabric adds generative AI and multi-cloud
Fabric promises a few other advances. Microsoft will soon add Copilot, a chatbot using generative AI, to every product interface within Fabric. This will allow developers and engineers to use conversational language to ask questions about data or to create data flows, pipelines, code and build machine learning (ML) models.
Second, Fabric supports multi-cloud. Through something called “Shortcuts,” OneLake can virtualize data lake storage in Amazon S3 and Google storage (coming soon).
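Conceptually, a Shortcut can be pictured as a logical name inside the lake that resolves to an external store. The sketch below illustrates that idea only; the mapping, paths and URIs are illustrative assumptions, not Fabric’s actual mechanism or API.

```python
# Hypothetical logical-to-physical mapping, for illustration only.
SHORTCUTS = {
    "onelake/sales_workspace/clickstream": "s3://partner-bucket/clickstream",  # Amazon S3
    "onelake/sales_workspace/ad_spend": "gs://marketing-bucket/ad_spend",      # Google storage (coming soon)
}

def resolve(logical_path: str) -> str:
    """Return the physical location backing a logical OneLake-style path."""
    for logical, physical in SHORTCUTS.items():
        if logical_path.startswith(logical):
            return physical + logical_path[len(logical):]
    return logical_path  # data that already lives in OneLake

# Example: the same lake namespace, backed by another cloud's storage.
print(resolve("onelake/sales_workspace/clickstream/2023/05.parquet"))
# -> s3://partner-bucket/clickstream/2023/05.parquet
```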
Microsoft also announced Data Activator, a no-code way for business analysts to automate actions based on data. For example, a sales manager can be alerted if a particular customer is behind on their payments.
In announcing Fabric last week, Microsoft said pricing would come in a separate step, with the formal release expected tomorrow. The move saves customers money because Microsoft no longer requires them to pay separate buckets of fees for each of the individual tools: previously, customers were charged once for using Power BI, again for Microsoft’s analytics tools, and again for Microsoft’s warehousing tools.
Microsoft also said a single security model will be used for OneLake, where all applications enforce a single security management system on the data as they process queries and jobs.
Ulagaratchagan said he’d pitched the idea of Fabric to 100 of the Fortune 500 companies over the past few years, and chief data officers told him they were “tired of paying what they consider an integration tax.”
Customers seeking simplicity, speed
This “integration tax” was levied not only by Microsoft’s own separate products, but also by the hundreds of other vendors selling the data and analytics products that enterprise companies need.
“This is why we introduced Microsoft Fabric: to give customers an end-to-end analytics platform that goes from the database to the business user making decisions, and to give every developer an opportunity to sign up within seconds and get real business value within minutes,” said Ulagaratchagan.
Amalgam Insights analyst Hyoun Park said the move by Microsoft puts pressure on Amazon and Google, its two largest competitors in the cloud, which have also been charging customers fees for the separate buckets of services they offer.
“For Amazon, that could be 200 different buckets, which is part of what makes cloud cost so challenging,” said Park.
An integrated package of capability
Fabric also puts pressure on some big vendors that only offer one part of the analytics and data stack, Park said. For example, it challenges Snowflake, a data warehouse that uses its own proprietary data formats and requires customers to transform their data to use in other applications. Similarly, it raises questions for business intelligence vendors like Qlik, TIBCO and SAS.
“Part of the innovation here is that Microsoft is providing all of these as an integrated package of capability,” said Park. “And as simple as that sounds, it’s not something that the majority of data and analytics vendors are able to provide.”
On the other hand, the broader, more ambitious offering will be a harder sell for Microsoft, according to Park.
By combining these products into one suite, Microsoft is no longer targeting different products at different roles within an organization, and it will now have to sell to the executive suite. Until now, engineers might have sought to buy Microsoft’s Data Factory product, analysts would vouch for its Power BI product, and developers might have wanted Microsoft Synapse.
“This is definitely billed as executive sales because nobody below the C-level can okay this,” said Park.
However, he pointed out that Microsoft is well-positioned to be able to make that pitch.
Forthcoming pricing changes
Here’s what Microsoft said it will be posting tomorrow about pricing:
Rather than requiring customers to provision and manage separate compute for each workload, Microsoft Fabric determines a customer’s bill by two variables: the amount of compute provisioned and the amount of storage used.
- Compute: A shared pool of capacity that powers all capabilities in Microsoft Fabric, from data modeling and data warehousing to business intelligence. Pay-as-you-go (per-second billing with a one-minute minimum).
- Storage: A single place to store all data. Pay-as-you-go ($ per GB per month).
By purchasing Fabric capacity, customers get a set of capacity units (CUs), units of measure that represent the pool of compute power needed to run queries, jobs, or tasks. CU consumption correlates closely with the underlying compute effort required while each capability does its processing, and each capability, along with its associated queries, jobs, or tasks, has its own consumption rate.
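To make the two-variable model concrete, here is a back-of-the-envelope sketch in Python. The per-second CU rate and per-GB storage rate are placeholder assumptions, not Microsoft’s published prices (those are on the forthcoming pricing sheet), and the one-minute minimum is assumed here to apply per compute run.

```python
# Placeholder rates -- NOT Microsoft's published prices.
CU_PRICE_PER_SECOND = 0.0002        # hypothetical $ per capacity unit per second
STORAGE_PRICE_PER_GB_MONTH = 0.023  # hypothetical $ per GB per month

def compute_charge(run_seconds: float, capacity_units: float) -> float:
    """Pay-as-you-go compute: per-second billing, one-minute minimum (assumed per run)."""
    return max(run_seconds, 60) * capacity_units * CU_PRICE_PER_SECOND

def monthly_bill(runs: list[tuple[float, float]], storage_gb: float) -> float:
    """Bill = sum of compute charges for (seconds, CUs) runs + storage used."""
    compute = sum(compute_charge(seconds, cus) for seconds, cus in runs)
    storage = storage_gb * STORAGE_PRICE_PER_GB_MONTH
    return round(compute + storage, 2)

# Example: two jobs (1 hour at 4 CUs, 30 seconds at 2 CUs) plus 500 GB stored.
print(monthly_bill([(3600, 4), (30, 2)], storage_gb=500))
```

The point of the model is that both terms are metered the same way across every Fabric workload, rather than being billed per tool.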