
[SERIES] Data Shopping Part 1 – How to Shop for Data Products

June 17, 2024

Just as shopping for goods online involves selecting items, adding them to a cart, and choosing delivery and payment options, the process of acquiring data within organizations has evolved in a similar manner. In the age of data products and data mesh, internal data marketplaces enable business users to search for, discover, and access data for their use cases.

In this series of articles, get an excerpt from our Practical Guide to Data Mesh and discover all there is to know about data shopping as well as Zeenea’s Data Shopping experience in its Enterprise Data Marketplace:

  1. How to shop for data products
  2. The Zeenea Data Shopping experience

 

---

 

As mentioned above, all classic marketplaces offer a very similar “checkout” experience, which is familiar to many people. The selected products are placed in a cart, and then, when validating the cart, the buyer is presented with various delivery and payment options.

The actual delivery usually takes place outside the marketplace, which nonetheless provides tracking functionalities. Delivery can be immediate (for digital products) or deferred (for physical products). Some marketplaces have their own logistics system, but most of the time, delivery is the responsibility of the seller. The delivery time is an important element of customer satisfaction – the shorter it is, the more satisfied users are.

How does this shopping experience translate into an Enterprise Data Marketplace? To answer this question, we need to consider what data delivery means in a business context and, for that, focus on the data consumer.

The delivery of data products

 

A data product offers one or more consumption protocols – these are its outbound ports. These protocols may vary from one data product to another, depending on the nature of the data – real-time data, for example, may offer a streaming protocol, while more static data may offer an SQL interface (and instructions for using this interface from various programming languages or in-house visualization tools).

For interactive consumption needs, such as in an application, the data product may also offer consumption APIs, which in turn may adhere to a standard (REST, GraphQL, OData, etc.), or it may simply allow the data to be downloaded in a file format.
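As a sketch, a data product's outbound ports can be modeled as a set of named consumption protocols. The class and port names below are illustrative assumptions, not Zeenea's API:

```python
from dataclasses import dataclass, field

@dataclass
class OutputPort:
    """One consumption protocol exposed by a data product."""
    name: str          # e.g. "sql", "rest", "streaming", "file"
    endpoint: str      # connection string, URL, or download location
    spec: str = ""     # optional standard the port follows (REST, GraphQL, OData...)

@dataclass
class DataProduct:
    name: str
    ports: list[OutputPort] = field(default_factory=list)

    def port(self, name: str) -> OutputPort:
        """Return the port matching a consumer's preferred protocol."""
        for p in self.ports:
            if p.name == name:
                return p
        raise KeyError(f"{self.name} does not expose a '{name}' port")

# A static product might expose SQL and file download; a real-time
# product would expose a streaming port instead.
orders = DataProduct("orders", [
    OutputPort("sql", "warehouse://analytics/orders"),
    OutputPort("file", "s3://exports/orders.parquet"),
])
```

The consumer picks whichever port suits their use case; asking for a protocol the product does not expose fails explicitly.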

Some consumers may integrate the data product into their own pipelines to build other data products or higher-level uses. Others may simply consume the data once, for example, to train an ML model. It is up to them to choose the protocol best suited to their use case.

Whatever protocols are chosen, they all have one essential characteristic: they are secure. This is one of the universal rules of governance – access to data must be controlled, and access rights supervised.

With few exceptions, the act of purchase therefore simply involves gaining access to the data via one of the consumption protocols.

Access Rights Management for Data Products

 

However, in the world of data, access management is not a simple matter, and for one simple reason: consuming data is a risky act.

Some data products can be desensitized – that is, stripped of the personal or sensitive data that poses the greatest risk. But this desensitization cannot be applied to the entire product portfolio: otherwise, the organization forfeits the opportunity to leverage data that is nonetheless highly valuable (such as sensitive financial or HR data, commercial data, market data, customer personal data, etc.). In one way or another, access control is therefore a critical activity for the development and widespread adoption of the data mesh.

In the logic of decentralization of the data mesh, risk assessment and granting access tokens should be carried out by the owner of the data product, who ensures its governance and compliance. This involves not only approving the access request but also determining any data transformations needed to conform to a particular use. This activity is known as policy enforcement.

Evaluating an access request involves analyzing three dimensions:

  • The data themselves (some carry more risk than others) – the what.
  • The requester, their role, and their location (geographical aspects can have a strong impact, especially at the regulatory level) – the who.
  • The purpose – the why.
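The three dimensions above can be sketched as a minimal evaluation function. The sensitivity levels, regions, and decision rules are illustrative assumptions, not prescriptions:

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    dataset_sensitivity: str  # the what: "public", "internal", "sensitive"
    requester_role: str       # the who: role of the requester
    requester_region: str     # the who: geography matters for regulation
    purpose: str              # the why: declared use case

def evaluate(req: AccessRequest) -> str:
    """Return a decision: grant as-is, grant with transformation, or escalate."""
    if req.dataset_sensitivity == "public":
        return "grant"
    # Sensitive data crossing a regulatory boundary needs expert review.
    if req.dataset_sensitivity == "sensitive" and req.requester_region != "EU":
        return "escalate"
    # Internal data for analytics can be granted once masking is applied.
    if req.purpose == "analytics":
        return "grant_with_transformation"
    return "escalate"
```

In practice the data owner applies far richer rules, but the shape is the same: a decision derived from the what, the who, and the why.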

Based on this analysis, the data may be consumed as is, or they may require transformation before delivery (data filtering, especially for data not covered by consent, anonymization of certain columns, obfuscation of others, etc.). Sometimes, additional formalities may need to be completed – for example, signing a redistribution contract for data acquired from a third party, or complying with retention and right-to-be-forgotten policies.
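These pre-delivery transformations can be sketched as a small pipeline. The column names, the consent check, and the choice of hashing for obfuscation are illustrative assumptions:

```python
import hashlib

def prepare_for_delivery(rows, consented_ids,
                         anonymize=("email",), obfuscate=("name",)):
    """Filter rows without consent, drop anonymized columns, hash obfuscated ones."""
    out = []
    for row in rows:
        if row["customer_id"] not in consented_ids:
            continue  # data not covered by consent is filtered out
        clean = {k: v for k, v in row.items() if k not in anonymize}
        for col in obfuscate:
            if col in clean:
                # Obfuscation keeps the column usable as a key without
                # revealing the original value.
                clean[col] = hashlib.sha256(str(clean[col]).encode()).hexdigest()[:12]
        out.append(clean)
    return out

rows = [
    {"customer_id": 1, "name": "Alice", "email": "a@x.io", "total": 120},
    {"customer_id": 2, "name": "Bob", "email": "b@x.io", "total": 80},
]
delivered = prepare_for_delivery(rows, consented_ids={1})
```

Only the consented row survives, its email column is removed, and its name column is replaced by a stable hash.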

Technically, data delivery can take various forms depending on the technologies and protocols used to expose them.

For less sensitive data, simply granting read-only access may suffice – this involves simply declaring an additional user. For sensitive data, fine-grained permission control is necessary, at the column and row levels. Most modern data platforms support native mechanisms to apply complex access rules through simple configuration – usually using data tags and a policy enforcement engine. Setting up access rights involves creating the appropriate policy or integrating a new consumer into an existing policy.

For older technologies that do not support sufficiently granular access control, it may be necessary to create a specific pipeline to transform the data to ensure compliance, store them in a dedicated space, and grant the consumer access to that space.
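The tag-plus-policy-engine mechanism can be sketched like this; real platforms expose the same idea through configuration rather than code, and the tags, roles, and actions below are illustrative assumptions:

```python
# Columns carry tags; policies map (tag, role) to an action applied at query time.
TAGS = {"email": {"pii"}, "salary": {"pii", "finance"}, "dept": set()}

POLICIES = [
    # (tag, role, action) -- onboarding a new consumer means adding
    # or extending a rule, not rewriting pipelines.
    ("pii", "analyst", "mask"),
    ("pii", "dpo", "allow"),
    ("finance", "analyst", "deny"),
]

def resolve(column: str, role: str) -> str:
    """The most restrictive matching action wins: deny > mask > allow."""
    actions = {a for tag, r, a in POLICIES if r == role and tag in TAGS[column]}
    for action in ("deny", "mask", "allow"):
        if action in actions:
            return action
    return "allow"  # untagged columns are readable by default
```

Because rules attach to tags rather than to individual columns, a single policy covers every current and future column carrying that tag.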

This is, of course, a lengthy and potentially costly approach, which can be optimized by migrating to a data platform supporting a more granular security model or by investing in a third-party policy enforcement solution that supports the existing platform.

Data Shopping in an Internal Data Marketplace

 

In the end, in a data marketplace, data delivery, which is at the heart of the consumer experience, translates into a more or less complex workflow whose main stages are as follows:

  • The consumer submits an access request – describing precisely their intended use of the data.
  • The data owner evaluates this request – in some cases, they may rely on risk or regulatory experts or require additional validations – and determines the required access rules.
  • An engineer in the domain or in the “Infra & Tooling” team sets up the access – this operation can be more or less complex depending on the technologies used.
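Chained together, the three stages above form a simple state machine. This sketch uses hypothetical class and method names to show the shape of the workflow, not any particular implementation:

```python
from enum import Enum

class Stage(Enum):
    SUBMITTED = "submitted"
    APPROVED = "approved"
    PROVISIONED = "provisioned"
    REJECTED = "rejected"

class ShoppingWorkflow:
    """Access-request workflow triggered from the marketplace."""

    def __init__(self, purpose: str):
        self.purpose = purpose        # the consumer describes the intended use
        self.stage = Stage.SUBMITTED
        self.access_rules = None

    def owner_review(self, approved: bool, rules: str = ""):
        # The data owner (possibly backed by risk or regulatory experts)
        # decides and determines the required access rules.
        self.stage = Stage.APPROVED if approved else Stage.REJECTED
        self.access_rules = rules if approved else None

    def provision(self):
        # An engineer applies the rules on the underlying platform.
        if self.stage is not Stage.APPROVED:
            raise RuntimeError("access must be approved before provisioning")
        self.stage = Stage.PROVISIONED

wf = ShoppingWorkflow(purpose="churn model training")
wf.owner_review(approved=True, rules="mask PII columns")
wf.provision()
```

The guard in `provision` enforces the ordering of the stages: no access is set up before the owner has ruled on the request.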

Shopping for the consumer involves triggering this workflow from the marketplace.

For Zeenea’s marketplace, we have chosen not to integrate this workflow directly into the solution but rather to interface with external solutions.

In our next article, discover the Zeenea Data Shopping experience and the technological choices that set us apart.

The Practical Guide to Data Mesh: Setting up and Supervising an enterprise-wide Data Mesh

 

Written by Guillaume Bodet, co-founder & CPTO at Zeenea, our guide was designed to arm you with practical strategies for implementing data mesh in your organization, helping you:

✅ Start your data mesh journey with a focused pilot project
✅ Discover efficient methods for scaling up your data mesh
✅ Acknowledge the pivotal role an internal marketplace plays in facilitating the effective consumption of data products
✅ Learn how Zeenea emerges as a robust supervision system, orchestrating an enterprise-wide data mesh


At Zeenea, we work hard to create a data fluent world by providing our customers with the tools and services that allow enterprises to be data driven.


Related posts


Be(come) data fluent

Read the latest trends on big data, data cataloging, data governance and more on Zeenea’s data blog.

Join our community by signing up to our newsletter!


Let's get started

Make data meaningful & discoverable for your teams


Soc 2 Type 2
Iso 27001
© 2024 Zeenea - All Rights Reserved