What is Cloud FinOps?

As organizations pursue their digital transformation journeys, Cloud Computing has become an essential foundation for business performance. However, the unlimited flexibility of Cloud services is sometimes accompanied by rising costs, prompting companies to look for ways to control expenditure without degrading the experience of the employees who rely on those services. To do so, they are implementing a Cloud financial management approach, also known as Cloud FinOps.

Does the term FinOps ring a bell? Derived from the contraction of Financial Operations, the term refers to a financial management methodology applied to Cloud Computing. The emergence of Cloud FinOps is linked to the need to control the costs associated with the exponential growth in the use of Cloud services. This approach aims to reconcile the actions of financial, operational, and technical teams to optimize Cloud spending and guarantee optimal use of resources.

Cloud FinOps focuses on cost transparency, identifying optimization opportunities, and empowering teams to take responsibility for their use of Cloud resources. By fostering collaboration between IT, finance, and business teams, Cloud FinOps improves visibility, cost predictability, and operational efficiency, enabling companies to maximize the benefits of the Cloud while maintaining strict financial control.

How does Cloud FinOps work?

Cloud FinOps works through a combination of specific practices, processes, and tooling. On the tooling side, cost monitoring solutions, such as Cloud Financial Management platforms, are deployed to collect real-time data on resource usage. This information is then analyzed to identify opportunities for optimization.
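
By way of illustration, here is a minimal sketch of the kind of usage-and-cost collection such platforms automate, assuming an AWS environment with the boto3 SDK and illustrative dates; a real FinOps platform consolidates this across providers:

```python
import boto3

# Cost Explorer client; assumes AWS credentials are configured locally.
ce = boto3.client("ce")

# Pull daily unblended costs for an illustrative month, grouped by service.
response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2023-06-01", "End": "2023-07-01"},
    Granularity="DAILY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for day in response["ResultsByTime"]:
    for group in day["Groups"]:
        service = group["Keys"][0]
        amount = group["Metrics"]["UnblendedCost"]["Amount"]
        print(day["TimePeriod"]["Start"], service, amount)
```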

In terms of processes, Cloud FinOps encourages close collaboration between financial, operational, and technical teams, establishing regular review cycles to evaluate costs and adjust resource allocations. This iterative approach enables you to optimize spending on an ongoing basis, ensuring that your company makes efficient use of Cloud services while creating the conditions for total cost control.

What are Cloud FinOps best practices?

The practice of Cloud FinOps relies on a combination of methods, tools, processes, and vision. To take full advantage of your Cloud FinOps approach, you'll need to adopt a number of best practices.

Transparency & Synergy

The founding principles of Cloud FinOps are based on cross-functional collaboration, closely involving financial, operational, and technical teams. This synergy enables a common understanding of business objectives and associated costs, promoting continuous optimization of Cloud resources.

Automation & Control

Automating processes is essential to ensure optimal day-to-day cost management. Using automation solutions for resource provisioning, instance scheduling, and other repetitive cloud management tasks improves operational efficiency and avoids unnecessary waste.
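
As a minimal sketch of instance scheduling, assuming AWS with boto3 and a hypothetical "env: dev" tagging convention, a script like the following could stop idle development instances outside business hours:

```python
import boto3

ec2 = boto3.client("ec2")

# Find running instances carrying a hypothetical "env: dev" tag.
reservations = ec2.describe_instances(
    Filters=[
        {"Name": "tag:env", "Values": ["dev"]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ]
)["Reservations"]

instance_ids = [
    instance["InstanceId"]
    for reservation in reservations
    for instance in reservation["Instances"]
]

# Stop them outside business hours; run this on a cron or serverless schedule.
if instance_ids:
    ec2.stop_instances(InstanceIds=instance_ids)
    print(f"Stopped {len(instance_ids)} development instances")
```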

Reporting & Analysis

To guarantee cost transparency, you need to be able to provide detailed, accessible reports on resource utilization. These reports enable teams to make informed decisions. This greater visibility encourages users to take responsibility and makes it easier to identify areas for improvement.

What are the main challenges for Cloud FinOps?

To deliver its full potential, Cloud FinOps must overcome the complexity of Cloud pricing models. The diversity of these models, which vary from one Cloud provider to another, makes it difficult to forecast costs accurately. Expenditure can fluctuate with demand, making budget planning more difficult.

Finally, compliance management, data security, and Cloud migration considerations are also complex aspects to integrate into an effective FinOps approach.

What does the future hold for Cloud FinOps?

As companies move further along the road to cloudification, the future of Cloud FinOps looks brighter month after month. Tools and platforms specializing in the financial management of Cloud resources, offering advanced cost analysis, automation, and forecasting capabilities, are likely to continue to grow in line with Cloud adoption.

Closer integration and collaboration between financial, operational, and technical teams will enable companies to place greater emphasis on financial governance in the Cloud, integrating FinOps principles right from the start of their Cloud projects.

What is Edge Analytics?

Edge Analytics enables data-driven companies to go straight to analyzing their data after it has been collected by IoT devices. It helps eliminate data processing bottlenecks.

Learn more about Edge Analytics, its benefits, and concrete use cases to better understand this new data trend.

Speed up data processing and analysis, and reduce the number of steps between collecting and using your data assets: That’s the promise of Edge Analytics. This method of data processing is all about proximity to the data source. It avoids all the steps involved in sending data to a data processing center.

How Edge Analytics works

Edge Analytics follows a very different logic from traditional data analysis, in which data is generally transferred to a remote processing center, such as a server or the cloud, where the analysis is performed. In the case of Edge Analytics, connected devices or sensors located at the edge of the network collect data in real-time from various sources such as industrial machines, vehicles, surveillance equipment, IoT sensors, etc.

The raw data collected is pre-processed locally: it is filtered and prepared for immediate analysis. The purpose of this local pre-processing is to clean, sample, normalize, and compress the data in order to reduce the quantity of data to be transferred and guarantee its quality prior to analysis. Once this preliminary phase has been completed, the data analysis itself is also carried out on-site, at the edge of the network, using algorithms and models previously deployed on local devices or servers.

With Edge Analytics, you can fine-tune your data analysis strategy by transferring only essential data or exceptional events to a remote processing center. The objective? Reduce network bandwidth requirements and save storage resources!
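
To make this filter-then-forward logic concrete, here is a minimal sketch with fabricated sensor readings and a hypothetical anomaly threshold; only the exceptional values would leave the edge device:

```python
import gzip
import json
import statistics

THRESHOLD = 2.0  # hypothetical z-score cut-off for "exceptional" readings

def preprocess(readings):
    """Clean and normalize raw sensor readings locally, at the edge."""
    cleaned = [r for r in readings if r is not None]   # drop missing values
    mean = statistics.mean(cleaned)
    stdev = statistics.stdev(cleaned)
    return [(r - mean) / stdev for r in cleaned]       # z-score normalization

def select_exceptional(normalized):
    """Keep only the anomalous values worth sending upstream."""
    return [z for z in normalized if abs(z) > THRESHOLD]

readings = [20.1, 20.3, None, 19.8, 57.2, 20.0, 20.2]  # fabricated IoT samples
anomalies = select_exceptional(preprocess(readings))

# Compress before transfer: only anomalies ever leave the edge device.
payload = gzip.compress(json.dumps(anomalies).encode("utf-8"))
print(f"{len(payload)} bytes to transmit instead of the full raw stream")
```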

What are the benefits of Edge Analytics?

While the proximity between the data source and the means of processing and analyzing it is the main advantage of Edge Analytics, you'll be able to reap five main benefits:

Benefit #1: Accelerate real-time decision-making

Less distance between the place where data is collected and the place where it is processed and analyzed means time savings on two levels. Because Edge Analytics processes data at the network edge, where the data is generated, it enables real-time analysis, eliminating the latency associated with sending data to a remote location. Another advantage of this real-time dimension is that it enables autonomous data analysis.

Benefit #2: Reduce latency between data collection and analysis

Edge Analytics is a promise of real-time exploitation of your data assets because data processing is done locally. In the case of applications requiring rapid responses, such as the Internet of Things (IoT) or industrial control systems (production or predictive maintenance, for example), proximity data processing drastically reduces latency and optimizes processing times.

Benefit #3: Limit network bandwidth requirements

Traditional data analysis almost always relies on the transfer of large quantities of data to a remote data processing center. The result: intensive use of network bandwidth. This is particularly true when your business generates large volumes of data at high speed. Edge Analytics has the advantage of reducing the amount of data that needs to be transferred, as part of the analysis is carried out locally. Only essential information or relevant analysis results are transmitted, reducing the load on the network.

Benefit #4: Optimize data security and confidentiality

As you know, not all data has the same level of criticality. Some sensitive data cannot be transferred outside the local network for security or confidentiality reasons. Edge Analytics enables this data to be processed locally, which can enhance security and confidentiality by avoiding transfers of sensitive data to external locations.

Benefit #5: Embark on the road to scalability

Because Edge Analytics allows part of the data analysis to be carried out locally, it significantly reduces network load. In doing so, Edge Analytics facilitates scalability by avoiding bandwidth bottlenecks, and paves the way for multiplying IoT devices without the risk of network overload.

Data analysis can be distributed across several processing nodes, facilitating horizontal scalability. Adding new devices or servers at the edge of the network increases overall processing capacity and enables you to cope with growing demand without having to reconfigure the centralized processing architecture.

What are the main use cases for Edge Analytics?

While the Edge Analytics phenomenon is relatively recent, it’s already being used massively in many business sectors.

Manufacturing

Edge Analytics is already widely used in manufacturing and industrial automation. In particular, it helps monitor production tools in real-time in order to detect breakdowns, optimize production, plan maintenance, and improve the overall efficiency of equipment and processes.

Healthcare

In the healthcare and telemedicine sector, Edge Analytics is used in connected medical devices to monitor patients’ vital signs, detect anomalies, and alert healthcare professionals in real-time.

Smart cities and mobility

Edge Analytics is also well suited to the urban mobility and smart cities sector. In the development of autonomous urban transport, for example, real-time analytics can detect obstacles, interpret the road environment, and make autonomous driving decisions.

Security & surveillance

The surveillance and security sector has also seized on Edge Analytics, which enables real-time analysis of video streams to detect motion or perform facial recognition.

What are Industry Cloud Platforms?

With 60% of enterprise data now stored in the Cloud, and companies around the world turning to Cloud solutions to manage their data, many are finding that general-purpose Cloud platforms are not always able to meet the specific needs of their industry. They must then turn to an Industry Cloud Platform.

In this article, find out everything you need to know about Industry Cloud Platforms.

All industries have specific requirements for managing and securing their data. Industry Cloud Platforms are Cloud platforms designed to meet the specific requirements of a given industry or sector.

Unlike general-purpose Cloud platforms, such as Amazon Web Services (AWS) or Microsoft Azure, Industry Cloud Platforms offer features and services tailored to industries such as healthcare, finance, logistics, retail, energy, agriculture, and many others. The popularity of these platforms continues to grow.

According to Gartner, nearly 40% of companies have already considered adopting an Industry Cloud Platform. 15% of them are already engaged in a pilot project. Even better, about 15% more are considering deployment by 2026. As a result, Gartner predicts that by 2027, companies will use Industry Cloud Platforms to accelerate more than 50% of their critical business initiatives, up from less than 10% in 2021!

How does an Industry Cloud Platform work?

Industry Cloud Platforms provide robust and scalable cloud infrastructure, along with features and services that are ideally suited to the specific needs of companies in each industry. This can include features such as data analytics tools, supply chain management platforms, industry-specific security solutions, and custom business applications.

Industry Cloud Platforms can help companies improve operational efficiency, reduce costs and innovate faster by providing easy access to specialized cloud services for their industry. In addition, these platforms can help companies better manage risk and comply with industry-specific regulations.

What are the advantages and benefits of Industry Cloud Platforms?

Using a specialized Industry Cloud Platform for your sector provides you with data analysis tools and customized business applications. The primary benefit of being able to rely on tailored tools and services is that you gain operational efficiency and productivity.

But that’s not all; Industry Cloud Platforms help reduce the cost of purchasing, maintaining, and upgrading your IT infrastructure by taking an industry-specific approach. The “hyper-specialization” of these Cloud Platforms and the services they contain means that you only have the solutions you really need, and you don’t have to invest in expensive infrastructure that you rarely use to its full potential. This is a “best of need” rather than a “best of breed” perspective.

Moreover, since Industry Cloud Platforms are designed to be scalable and flexible, they will enable you to adapt quickly to the growth of your business. You can easily add or remove Cloud resources as needed to quickly adapt to market fluctuations.

Finally, the use of an Industry Cloud Platform increases your capacity to innovate by giving you access to data analysis technologies adapted to your activity.

Examples of Industry Cloud Platforms for different industries

There are many major players in the Industry Cloud Platforms market, each offering specific solutions and services for a particular industry or sector. Here are some examples of major players:

  • Salesforce is one of the leading Industry Cloud Platform players in the sales and marketing industries, with its Salesforce Customer 360 platform.

  • Microsoft offers a range of cloud solutions for different industries, such as Dynamics 365 for Finance and Operations for the finance and manufacturing sectors, and Azure IoT Suite for the Internet of Things.

  • IBM is positioning itself in this market segment with a dedicated cloud platform for several industries, including healthcare, financial services, and supply chain, with its Watson Health, IBM Cloud for Financial Services, and IBM Sterling Supply Chain Suite solutions.

  • Amazon Web Services (AWS) offers a range of cloud services for different industries, including AWS Healthcare for healthcare and AWS Retail for retail. These offerings are distinct from Amazon Web Services' general-purpose offerings.

  • SAP has developed a cloud platform for several industries, including manufacturing, retail, financial services, and healthcare, with its SAP S/4HANA, SAP Commerce Cloud, and SAP Health solutions.

Unlock the value of your enterprise data by connecting your Industry Cloud Platform to Zeenea

Our integrated scanners and APIs automatically collect, consolidate, and connect metadata from your data ecosystem so your users can discover, trust, and understand their data.

For more information about our data discovery platform and its connectivity, contact us.

What is Cloud Security?

Cloud Security refers to the technologies, policies, controls, and services that protect data, applications, and infrastructure in the cloud from both internal and external threats.

Between essential protection and uncompromising lockdown, it is critical to find the right balance between security practices and the flexibility that business productivity requires.

Business applications, data storage, complete virtual machines… Almost everything can be governed by Cloud Computing. According to forecasts and observations from IDC, spending on cloud infrastructure will increase by 22% in 2022 compared to 2021, exceeding the $90 billion mark by the end of the calendar year. A record, as this is the highest annual growth rate since 2018! But the more our businesses become cloud-based, the more Cloud Security becomes a top priority.

Behind a concept as vast and complex as Cloud Security lies a set of strategies, technical means, and control solutions that together ensure the protection of stored data and the availability and reliability of the applications and infrastructure services essential to the operation of the cloud.

Protection against external and internal threats and vulnerabilities, resilience, and resistance to cyberattacks: Cloud Security is a concept intrinsically linked to Cloud Computing.

Private, public, hybrid… each cloud has its own security challenges

To fully understand Cloud Security, we must first differentiate the types of Cloud Computing. Public clouds are hosted by third-party cloud service providers.

When you use a public cloud, you get a turnkey cloud with no latitude to configure or administer it yourself: the services are fully managed by the cloud provider.

If, on the other hand, you move to a private cloud service, you have a more secure and potentially more customizable space.

Finally, hybrid cloud services combine the scalability of public clouds with the greater resource control of private clouds while offering lower pricing than private clouds.

In all cases, whether you choose a public, private or hybrid cloud service, it is critical to ensure Cloud Security.

Why is Cloud Security important?

The more your company leverages the cloud, the more agile it becomes. Even better: the cloud enables small and medium-sized businesses to have the same tools and functionalities as very big companies. The downside is that the power at your disposal increases the cloud usage of your teams and, consequently, increases your exposure to threats that in the past only concerned larger companies.

That’s why Cloud Security is more important than ever. Unleashing usage and productivity through the cloud mechanically increases your dependency on the cloud.

Without the cloud, nothing is possible. Therefore, Cloud Security becomes a priority issue, especially when it comes to data protection. Preventing data leakage and theft is essential to maintaining your customers' trust. By guaranteeing data protection, Cloud Security becomes an element of trust between you and your customers.

The challenges of Cloud Security

The availability of the cloud is a major issue for companies. In this context, the first challenge of Cloud Security is to guarantee maximum availability, particularly by protecting infrastructures from Distributed Denial of Service (DDoS) attacks. Analyzing traffic on cloud servers and detecting suspicious packets are practices essential to Cloud Security. The fight against data breaches, and by extension against data loss, are two other prominent challenges of Cloud Security.
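
As an illustrative sketch of the kind of traffic analysis involved (not a production DDoS defense), a sliding-window check can flag sources whose request rate is suspicious; the window and limit here are hypothetical:

```python
import time
from collections import defaultdict, deque

WINDOW_SECONDS = 10   # hypothetical observation window
MAX_REQUESTS = 100    # hypothetical per-source limit within the window

request_log = defaultdict(deque)  # source IP -> timestamps of recent requests

def is_suspicious(source_ip):
    """Flag a source whose request rate exceeds the allowed threshold."""
    now = time.monotonic()
    timestamps = request_log[source_ip]
    timestamps.append(now)
    # Evict timestamps that have fallen out of the sliding window.
    while timestamps and now - timestamps[0] > WINDOW_SECONDS:
        timestamps.popleft()
    return len(timestamps) > MAX_REQUESTS

# Example: a source hammering the server gets flagged.
for _ in range(150):
    flagged = is_suspicious("203.0.113.7")
print("suspicious:", flagged)
```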

Finally, the last key challenge of Cloud Security is user identity verification. As employees become increasingly mobile, they connect to cloud services from anywhere in the world, making identity verification more and more complex. This is why multi-factor authentication is highly recommended by cloud players.
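
For illustration, a time-based one-time password (TOTP), one common second factor, can be sketched with the pyotp library; this is a minimal example, not a complete MFA flow:

```python
import pyotp

# Enrolment: generate a per-user secret, shared once with the user's
# authenticator app (usually as a QR code built from this URI).
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print(totp.provisioning_uri(name="user@example.com", issuer_name="ExampleCloud"))

# Login: after the password check, verify the 6-digit code the user typed.
code = input("One-time code: ")
print("Second factor accepted" if totp.verify(code) else "Invalid or expired code")
```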

Anticipation, visibility, transparency, reactivity, and proactivity are the levers to be used on a daily basis to guarantee Cloud Security.

How pivoting to a SaaS model allowed 320 production releases in 6 months

After starting out as an on-premise data catalog solution, Zeenea made the decision to switch to a fully SaaS solution. A year and a half later, more than three hundred production releases have been carried out over the last six months, an average of almost three per day! We explain here the reasons that pushed us to make this pivot, the organization put in place to execute it, and the added value for our customers.

Zeenea’s beginnings: an on-prem’ data catalog

When Zeenea was created in 2017, it was an on-premise solution, meaning that the architecture was physically present within our clients' companies. This choice was made in response to two major issues: the security of a solution that accesses the customer's data systems is essential and must be guaranteed; and most of our customers' information systems relied on on-premise database management systems, which could not be accessed outside of these companies' internal networks.

This approach, however, constrained Zeenea's expansion and evolution. First, it required a lot of customer support for deployments. Second, several versions could be in production at different customers simultaneously, and it was complicated to deploy urgent fixes. Finally, newly developed product value only reached the customer's site at a late stage.

The strategic pivot to a 100% SaaS data catalog

Faced with these potential obstacles to the development of our data catalog, we naturally decided at the end of 2019 to make the switch to a fully SaaS solution. A year and a half later, we have just completed more than three hundred production releases over the past six months, an average of almost three per day. Here’s how we did it.

First, we addressed the initial security issue. We integrated security into our cloud practices right from the start of the project, and have launched a security certification process in this regard (SOC 2 and soon ISO 27001).

Then, we extracted from our architecture the only component that had to remain on-premise: the Zeenea scanner. From a technological point of view, we set up a multi-tenant SaaS architecture, splitting our historical monolith into several application components.

However, the biggest challenge did not lie in the technical aspects, but in the cultural and organizational ones…

The keys to our success: organization and acculturation to the SaaS model

We have built and consolidated our SaaS culture, mainly by orienting our recruitment towards profiles experienced in this field, and by organizing knowledge sharing efficiently.

To illustrate the cultural aspect, we distinguish, for example, finished developments from complete developments. At Zeenea, a development is considered finished when it is integrated into the code base, without any known bugs, with a level of security and engineering that conforms to the requirements we set for ourselves. A development is considered complete when it can be made available to our customers, so that the developed functionalities form a usable and coherent whole.

To support this distinction, we have implemented a feature toggle mechanism to manage the activation of fully developed features: a development is systematically put into production as soon as it is finished, and then activated for our customers once it is complete.
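
A feature toggle of this kind can be sketched as follows; the flag names and functions are hypothetical, and Zeenea's actual mechanism is managed per customer through configuration:

```python
# Finished code ships to production dark; complete features are switched on
# per customer. All names here are hypothetical.
FEATURE_FLAGS = {
    "new-search-engine": {"customer-a"},  # complete: activated for customer-a
    "lineage-graph": set(),               # finished but not complete: deployed, dark
}

def is_enabled(feature: str, customer_id: str) -> bool:
    return customer_id in FEATURE_FLAGS.get(feature, set())

def search(query: str, customer_id: str) -> str:
    if is_enabled("new-search-engine", customer_id):
        return f"new engine results for {query!r}"   # finished code, toggled on
    return f"legacy results for {query!r}"           # default path for everyone else

print(search("sales", "customer-a"))  # new engine
print(search("sales", "customer-b"))  # legacy path
```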

In terms of organization, we have set up Feature Teams: each team works on a given feature, across all its components. As soon as a feature is complete, it is delivered. Other features are deployed finished but incomplete, and remain deactivated.

The SaaS model and added value for our customers 

The first to benefit from the agility of the SaaS model are obviously Zeenea's customers. Functionalities are available more quickly, that is to say as soon as they are complete. Moreover, the deployment of a new functionality can be done at their convenience within two months after the feature toggle is made available. This allows for easy integration into the customer's context, notably by taking their user constraints into account. Finally, this ability to activate features allows us to demonstrate features in advance, or even, in some cases, to activate them in beta testing for our customers.

All this is obviously combined with the traditional advantages of a SaaS solution: automatic and frequent updates of minor evolutions or corrections, access to the solution from any browser, the absence of infrastructure at our customers’ sites allowing rapid scalability, etc. 

While the path from an on-premise model to a SaaS application has had many challenges, we are proud today to have met the challenge of implementing continuous deployment and to bring more and more added value to our customers.

We are always looking for new talent to join our feature teams, so if you want to join the Bureau des Légendes, the IT Crowd or the ACDC team, contact us.

The keys to a successful Cloud Migration

The recent COVID-19 pandemic has brought about major changes in work culture, and the Cloud has become essential by offering employees access to the company's data wherever they are. But why migrate? How to migrate? And for what benefits? Here is an overview.

Head in the clouds and feet on the ground, that’s the promise of the Cloud which, with the health crisis, has proven to be an essential tool for business continuity.

In a study conducted by Vanson Bourne at the end of 2020, more than 8 out of 10 business leaders (82%) said they had accelerated their decision to migrate their critical data and business functions to the Cloud after facing the COVID-19 crisis. 91% of survey participants say they have become more aware of the importance of data in the decision-making process since the crisis began.

Cloud and data. A duo that is now inseparable from business performance.

A reality that is not limited to a specific market: the enthusiasm for Cloud data migration is almost worldwide! The Vanson Bourne study highlights a shared awareness on an international scale, with striking figures:

  • United States (97%),
  • Germany and Japan (93%),
  • United Kingdom (92%).

Finally, 99% of Chinese executives are accelerating their plans to complete their migration to the Cloud. In this context, the question "why migrate to the Cloud" is unequivocally answered: if you don't, your competitors will do it before you.

The main benefits of Cloud migration

Ensuring successful Cloud data migration is first and foremost a question of guaranteeing the data's availability in all circumstances. This first benefit leads to many others! If data is accessible everywhere and at all times, a company is able to meet the demand for mobility and flexibility expressed by employees.

This requirement was fulfilled during the successive lockdowns and should continue as the return to normalcy finally seems possible. Fully operational employees at home, in the office, or in the countryside promise not only increased productivity but also a considerable improvement in the user experience. And HR benefits are not the only consequences of Cloud migration.

From a financial point of view, the Cloud opens the way to better control of IT costs. By shifting data spending from CAPEX to OPEX, you can improve the TCO (Total Cost of Ownership) of your information system and your data assets. Beyond a better experience and budget control, the Cloud also opens the way to optimized data availability.

Indeed, when migrating to the Cloud, your partners make commitments in terms of maintenance or backups that guarantee maximum access to your data. You should therefore pay particular attention to these commitments, which are referred to as SLAs (Service Level Agreements). 

Finally, by migrating data to the cloud, you benefit from the expertise and technical resources of specialized partners, who deploy resources far superior to those you could muster on your own.

How to successfully migrate to the Cloud

Data is, after human resources, the most valuable asset of a company.

This is one of the reasons why companies should migrate to the Cloud. But the operation must be carried out in the best conditions to limit the risk of data degradation, as well as temporary unavailability that would impact your business.

To do this, preparation is essential and relies on one prerequisite: the project does not only concern IT teams, but the entire company. 

Support, reassurance, training: the triptych essential to any change management process must be applied. Then, make sure you give yourself time. Avoid the Big Bang approach, which could irritate your teams and dampen their enthusiasm. Even if the Cloud migration of your data should go smoothly, put all the chances on your side by backing up your data.

Rely on redundancy to prepare for any eventuality, including (and especially!) the most unlikely. Once the deployment in the cloud is complete, ensure the quality of the experience for your employees. With rigorous long-term project management, you can easily identify whether you need to adjust your initial choices.

The scalability of the Cloud model is a strength that you should seize upon to constantly adapt your strategy!

COVID-19: The rise of digital transformations for enterprises

2020 marked a turning point in companies' digital transformation processes. The coronavirus and its attendant health measures, restrictions, and precautions have strongly impacted human societies as well as the business world. Let's take a look back at the data management challenges related to COVID-19.

With the closure of physical points of contact, the widespread use of remote work, and logistical tensions and uncertainties around the world throughout 2020, companies have had to reinvent codes, implement new methods, and develop new strategies. The challenge? To maintain the link between employees and customers, kept at a distance by more or less strict measures.

According to a study by Gartner, 69% of companies believe that the health crisis has accelerated their digital business initiatives. For 60% of them, digital transformation is a way to improve their operational efficiency. Another study, conducted by Pega Systems, reveals that 56% of companies have increased their digital transformation budgets, and 69% consider that the health crisis is leading them to become more empathetic to their customers. At the heart of these challenges: data management.

Accelerating digital transformations… with data

According to estimates from a study carried out at the end of 2020, the COVID-19 crisis has accelerated enterprises' digital strategies by an average of 6 years. In fact, 97% of business decision-makers believe that the pandemic has accelerated their digital transformation.

The same study reveals that in the face of the crisis, 95% of companies are looking for new ways to engage customers and 92% say that the transformation of digital communications is critical to meeting today’s business challenges.

Behind these findings lies a compelling need: to make the most of data. Beyond identifying new levers to engage customers and keep in touch with your audiences, it is first and foremost about understanding their needs. To show empathy, you need to know who you are talking to and through which channel you can interact with them effectively.

Data management is the foundation of this whole process of accelerating your digital transformation!

Data allows you to define your marketing strategies and the axes of your campaigns (essential to preserving business continuity in times of crisis). It also shapes your R&D priorities. In fact, the customer knowledge provided by data management allows you to take a data-driven approach to innovation.

The objective: to design and develop products and services that meet the expectations and needs of your target audience. A study by Solocal, published at the end of 2020, highlighted that for 81% of companies seeking to accelerate their digital transformation, the objective is to solicit and respond to customer feedback.

Certain sectors at the forefront of digital transformation

Integrating real-time data into business strategies, analyzing customer journeys, deploying predictive analytics solutions to accelerate commitments, or detecting weak signals in order to anticipate issues… The scope of data management is widening every day.

During the COVID-19 crisis, some business sectors had to completely reinvent themselves and, thanks to their data assets, managed to find levers to maintain business continuity.

For example, a recent study by the consulting firm QuantMetry, published last October, showed that 68% of companies have maintained or even increased their data-related budgets in 2020. During the health crisis, Uber for Business noted an explosion in demand for meal delivery. With the careful use of their data assets, the company managed, in just a few months, to design a new offer for marketing departments. The concept? To offer meal delivery vouchers not only to employees working from home, but also to corporate customers as a loyalty lever.

The offer, with a B2B2C strategy, is also positioned as an alternative to all those moments of sociability linked to professional events. These vouchers are also first-rate marketing tools because many indicators are related to them (type of use, type of meals ordered, geographical area…). These are all useful KPIs for Uber for Business, but also for the companies that use them to refine their customer retention strategies… or talent retention strategies.

If you are in the middle of digitally transforming your business, check out our tips on how to succeed in your transformation!

Best practices for succeeding in your digital transformation

Today, data impacts all sectors; all companies are confronted with data management challenges in one way or another. Despite this, most are still struggling to truly transform into data-driven organizations. One reason they do not succeed is that they often face complex and time-consuming information systems, which represent a real obstacle to their digital transformation.

Indeed, few organizations are able to truly find their enterprise data, especially if their data ecosystem includes different formats, sizes, and varieties of data. Moreover, data is difficult to interpret, and its quality hard to assess, when it is poorly documented, or simply not documented at all.

In this article, we share some best practices so that companies can find the keys to start their digital transformation through data discovery.

Rethink your corporate IS

For many years, data and data management challenges were reserved for "Tech Giants" such as GAFAM (Google, Apple, Facebook, Amazon, Microsoft). As web players working mainly with digital resources, their information systems and infrastructures were already designed and built around data. It was therefore more difficult for other market players to implement new models and strategies: they were at a disadvantage!

These other organizations were confronted with technologies accumulated over time, and it is obviously more complex to undertake a digital transformation in that situation.

A striking example can be found in the banking sector. While banks have been able to standardize data thanks to a succession of international directives, they have had to contend with new digital banks offering much more agile and efficient services. They realized that, to remain competitive, they had to change their strategic model to stay in the digital race.

Another example can be found in the automotive sector. Buyer behavior has gradually changed over time, and these changes have not gone unnoticed: demand has increased for "eco-friendly" travel through the use of bicycles, ride-hailing (VTC) services, and shared mobility such as carpooling. These facts show that the mobility market is changing. Demand is no longer based on the acquisition of new goods but rather on the mobility services made available to users. Thus, in order to remain competitive and meet the expectations of their users' new behaviors, automotive market players must diversify.

Getting the right data solutions

In order to meet the needs of data users, it is essential to equip your data teams with the right solutions. This means answering the question: which cross-functional technologies, spanning all the company's silos, need to be deployed in order to have an agile configuration? To do this, IT resources need to be organized to cover three main stages:

  1. data discovery,
  2. data preparation and transformation,
  3. data consumption by the company's different departments.

These three steps are essential to start the transformation into a data-driven enterprise.

On the one hand, they help implement appropriate security measures to prevent the loss of sensitive data and avoid devastating financial and reputational consequences for the company. On the other hand, they enable data teams to drill down into the data to identify the specific elements that reveal answers, and to find ways to present them.

All this with a data catalog

At Zeenea, we define a data catalog as being:

“A detailed inventory of all data assets in an organization and their metadata, designed to help data professionals quickly find the most appropriate data for any analytical business purpose.”
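
In code terms, a catalog entry can be pictured as a small metadata record; this dataclass is an illustrative sketch, not Zeenea's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One data asset plus the metadata that makes it findable and trustworthy."""
    name: str                     # e.g. "sales_orders"
    source: str                   # e.g. "snowflake://prod/analytics"
    owner: str                    # accountable data owner
    description: str = ""         # business documentation
    tags: list = field(default_factory=list)  # search keywords
    quality_score: float = 0.0    # e.g. a reliability score surfaced to users

entry = CatalogEntry(
    name="sales_orders",
    source="snowflake://prod/analytics",
    owner="data-team@example.com",
    description="One row per customer order, refreshed daily.",
    tags=["sales", "orders", "finance"],
    quality_score=0.92,
)
print(entry.name, entry.quality_score)
```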

With the help of a data catalog, both data teams and managers will be able to start their digital transformation based on their company’s data. 

Choosing Zeenea is choosing:

  • An overview of all of an enterprise's data assets through our connectors,

  • A Google-esque search engine that enables employees to intuitively search for a dataset, business term, or even a field from just a single keyword, and to narrow the search with various personalized filters (reliability score, popularity, type of document, etc.),

  • A collaborative application that allows enterprises to become acculturated to data thanks to collective knowledge, discussions, feeds, etc.,

  • Machine learning technology that notifies you and gives suggestions as to your catalogued data's documentation,

  • A dedicated user experience that allows data leaders to empower their data explorers to become autonomous in their data journeys.

Learn more about our data catalog 

Start accelerating your data initiatives now

If you would like more information, a free and personalized demo, or if you just want to say hello, do not hesitate to contact us and our sales team will answer you as soon as we receive your request 🙂.

How Total accelerated their digital transformation

Total, one of the seven "supermajor" oil companies, opened their Digital Factory earlier this year in Paris. The Digital Factory will bring together up to 300 different profiles, such as developers, data scientists, and other digital experts, to accelerate the Group's digital transformation.

More specifically, Total's Digital Factory aims to develop the digital solutions Total needs to improve availability and operating costs, and to offer new services to their customers. Their priorities are mainly centered around the management and control of energy consumption, the ability to extend their reach to new distributed energies, and the provision of more environmentally friendly solutions. Total's ambition is to generate $1.5 billion in value per year for the company by 2025.

During France’s Best Developer 2019 contest, Patrick Pouyanné, Chairman and Chief Executive Officer of Total, stated:

“I am convinced that digital technology is a critical driver for achieving our excellence objectives across all of Total’s business segments. Total’s Digital Factory will serve as an accelerator, allowing the Group to systematically deploy customized digital solutions. Artificial intelligence (AI), the Internet of Things (IoT) and 5G are revolutionizing our industrial practices, and we will have the know-how in Paris to integrate them in our businesses as early as possible. The Digital Factory will also attract the new talent essential to our company’s future.”

Who makes up the Digital Factory teams?

In an interview with Forbes this past October, Frédéric Gimenez, Chief Digital Officer and Head of the project, described how the teams would be structured within the Digital Factory.

As mentioned above, the team will have around 300 different profiles, all working using agile methodologies: managerial lines will be flattened, teams will have great autonomy and development cycles will be short in order to “test & learn” quickly and efficiently. 

Gimenez explains that there will be multiple teams in his Digital Factory:

  • Data Studio, which will consist of data scientists. Total's CDO (Chief Data Officer) will be in charge of this team, whose main missions will be to acculturate the enterprise to data and manage the data competences of the Digital Factory.
  • A pool of developers and agile coaches.
  • Design Studio, which will regroup UX and UI professionals. They will help generate creative ideas and will be involved not only at the analysis stage of Total's business projects but also during the customer journey stages.
  • A "Tech Authority" team, in charge of the security and architecture of their data ecosystem, in order to effectively transform their legacy into a digital environment.
  • A platform team, in charge of various data stores, such as their Cloud environment, their data lake, etc.
  • A Product & Value office, in charge of managing the Digital Factory portfolio, assessing the value of projects with the business, and analyzing all the use cases submitted to the Digital Factory.
  • An HR team and a general secretariat.
  • Product Owners who come from all over the world. They are trained in agile methods on arrival and then immersed in their project for 4 to 6 months. They then accompany the transformation when they return to their jobs.

These teams will soon be reunited in a 5,500 m² workspace in the heart of Paris's 2nd arrondissement, an open space favoring creativity and innovation.

How governance works at Total’s Digital Factory

Gimenez explained that the business lines are responsible for their Digital Factory use cases. The Digital Factory analyzes the eligibility of their use cases through four criteria:

  • Value brought during the 1st iteration and during its scaling up 
  • Feasibility (technology / data)
  • Customer appetite / internal impact
  • Scalability 

An internal committee at the Digital Factory then decides whether or not the use case is taken on, and the final decision is validated by Gimenez himself. To ensure good coordination with the business lines, the branches' digital representatives are also located in the Digital Factory. They are responsible for acculturating the business lines and piloting the generation of ideas, as well as for ensuring the consistency of their branch's digital initiatives with the Group's ambitions; Total calls them Digital Transformation Officers.

First success of Total’s Digital Factory

The Digital Factory started this past March and deployed the first squads in April, during the coronavirus lockdown in France. In the Forbes interview, Gimenez explained that 16 projects are in progress, with a target of 25 squads in permanent operation.

The first two digital solutions will be delivered by the end of this year:

  • A tool for Total Direct Energie to assist customers in finding the best payment schedule using algorithms and data
  • A logistics optimization solution based on IoT trucks for the Marketing and Services branch, which will be deployed in 40 subsidiaries.

In addition, Total managed to attract experts such as data scientists (despite still very limited communication channels, such as Welcome to the Jungle or LinkedIn) and retain them by offering a diversity of projects.

“We are currently carrying out a first assessment of what has worked and what needs to be improved; we are in a permanent adaptation process,” stated Gimenez.

Digital Factory in the future?

Gimenez ended the Forbes interview by saying that the main reason for his project's success is the general mobilization that everyone maintained despite the health crisis: “We received more use cases than we are able to deliver (50 projects per year to continuously feed our 25 squads)!”

Otherwise, Total has two major sets of KPIs:

  • measuring the proper functioning of the squads by examining the KPIs of their agile methodologies,
  • tracking the value generated.

Are you interested in unlocking data access for your company?

Are you in the manufacturing industry? Get the keys to unlocking data access for your company by downloading our new white paper “Unlock data for the manufacturing industry” 

Air France: Their Big Data strategy in a hybrid cloud context

Air France-KLM is the leading group in terms of international traffic departing from Europe. The airline is a member of the SkyTeam alliance, which consists of 19 airlines offering access to a global network of more than 14,500 daily flights to over 1,150 destinations around the world. In 2019, Air France represented:

  • 104.2 million passengers,
  • 312 destinations,
  • 119 countries,
  • 546 aircraft,
  • 15 million members enrolled in their “Flying Blue” loyalty program,
  • 2,300 flights per day.

At the Big Data Paris 2020, Eric Poutrin, Lead Enterprise Architect Data Management & Analytics at Air France, explained how the airline business works, what Air France’s Big Data structure started as, and how their data architecture is today in the context of a hybrid cloud structure.


How does an airline company work?

Before we start talking about data, it is imperative to understand how an airline company works from the creation of its flight path to its landing. 

Before planning a route, the first step for an airline such as Air France is to establish a flight schedule. Note that in times of health crisis, schedules are likely to change quite frequently. Once the flight schedule is set up, three totally separate flows must come together for a flight to depart at a given date and time:

  • the flow of passengers, which represents different forms of services to facilitate the traveler’s experience along the way, from buying tickets on their various platforms (web, app, physical) to the provision of staff or automatic kiosks in various airports to help travelers check in, drop off their luggage, etc.

  • the flow of crew management, with profiles adapted to the qualifications required to operate or pilot the aircraft, as well as the management of flight attendant schedules.

  • the engineering flow, which consists of getting the right aircraft with the right configuration at the right parking point.

However, Eric tells us that all this… is in an ideal world: 

“The “product” of an airline goes through the customer, so all of the hazards are visible. And, they all impact each other’s flows! So the closer you get to the date of the flight, the more critical these hazards become.”

Following these observations, 25 years ago now, Air France decided to set up a “service-oriented” architecture, which allows, among other things, the notification of subscribers in the event of hazards on any flow. These real-time notifications are pushed either to agents or passengers according to their needs: prevention of technical difficulties (an aircraft breaking down), climate hazards, prevention of delays, etc.

“The objective was to bridge the gap between a traditional analytical approach and a modern analytical approach based on omni-present, predictive and prescriptive analysis on a large scale” affirmed Eric.

Air France’s Big Data journey


The timeline

In 1998, Air France began their data strategy by setting up an enterprise data warehouse on the commercial side, gathering customer, crew and technical data that allowed the company’s IT teams to build analysis reports. 

Eric tells us that in 2001, following the SARS (Severe Acute Respiratory Syndrome) health crisis, Air France had to redeploy their aircraft after the ban on incoming flights to the United States. It was the firm's data warehouse that allowed them to find other sources of revenue, thanks to their machine learning and artificial intelligence algorithms. This way of working with data worked well for 10 years and even allowed the firm to overcome several other difficulties, including the tragedy of September 11, 2001, and the crisis of rising oil prices.

In 2012, Air France's data teams decided to implement a Hadoop platform in order to perform predictive or prescriptive analysis (depending on individual needs) in real time, as the data warehouse no longer met these new needs or the high volume of information to be managed. Within just a few months of implementing Hadoop, Kafka, and other new-generation technologies, the firm was able to obtain much "fresher" and more relevant data.
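
For illustration, publishing an operational event onto a Kafka topic looks like the following sketch (using the kafka-python client; the broker address, topic, and message schema are hypothetical, not Air France's actual code):

```python
import json
from kafka import KafkaProducer  # kafka-python package

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # hypothetical broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# An illustrative flight-disruption event; the schema is invented.
event = {"flight": "AF123", "type": "delay", "minutes": 25}
producer.send("flight-events", value=event)  # hypothetical topic
producer.flush()  # block until the event is actually delivered
```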

Since then, the teams have been constantly improving and optimizing their data ecosystem in order to stay up to date with new technologies and thus, allow data users to work efficiently with their analysis.

Air France’s data challenges

During the conference, Eric also presented the firm’s data challenges in the implementation of a data strategy:

  • Delivering a reliable analytics ecosystem with quality data,
  • Implementing technologies adapted for all profiles and their use cases regardless of their line of business,
  • Having an infrastructure that supports all types of data in real time. 

Air France was able to resolve some of these issues with the implementation of a robust architecture (which notably enabled the firm to withstand the COVID-19 crisis), the creation of dedicated teams, the deployment of applications, and security structures, particularly regarding the GDPR and other applicable regulations.

However, Air France-KLM has not finished working to meet their data challenges. With ever-increasing volumes of data and a growing number of data and business users, managing data flows across the enterprise's different channels and governing the data is constant work:

“We must always be at the service of the business, and as people and trends change, it is imperative to make continuous efforts to ensure that everyone can understand the data”.


Air France’s unified data architecture

The Unified Data Architecture (UDA) is the cornerstone of Air France's data strategy. Eric explains that it comprises four types of platforms:

The data discovery platform 

Separated into two different platforms, these are the applications of choice for data scientists and citizen data scientists. Among other things, they make it possible to:

  • extract "knowledge" from the data,
  • process unstructured data (text, images, voice, etc.),
  • have predictive analytics support to understand customer behaviors.

A data lake 

Air France's data lake is a logical instance accessible to all the company's employees, regardless of their profession. However, Eric specifies that the data is well secured: “The data lake is not an open bar at all! Everything is done under the control of the data officers and data owners.” The data lake:

  • stores structured and unstructured data,
  • combines the different data sources from various businesses,
  • provides a complete view of a situation, a topic or a data environment,
  • is very scalable.

“Real Time Data Processing” platforms 

To operationalize the data, Air France has implemented 8 real-time data processing platforms to meet the needs of each "high priority" business use case. For example, they have platforms for predictive maintenance, customer behavior knowledge, and process optimization on stopovers.

Eric confirms that when an event or hazard occurs, their platform is able to push recommendations in “real time” in just 10 seconds!

Data Warehouses 

As mentioned above, Air France had also already set up data warehouses to store external data, such as customer and partner data, as well as data from operational systems. These data warehouses allow users to query these datasets in complete security, and are an excellent communication vehicle for explaining the data strategy across the company's different business lines.


The benefits of implementing a Hybrid Cloud architecture

Air France's initial considerations regarding the move to the Cloud were:

  • Air France KLM aims to standardize its calculation and storage services as much as possible.
  • Not all data is eligible to leave Air France’s premises due to regulations or sensitive data.
  • All the tools already used in UDA platforms are available both on-premise and in the public cloud.

Eric says that a hybrid Cloud architecture would give the firm more flexibility to meet today's challenges:

“Putting our UDA on the Public Cloud would give greater flexibility to the business and more options in terms of data deployment.”

According to Air France, here is the checklist of best practices before migrating to a Hybrid Cloud:

  • check if the data has a good reason to be migrated to the Public Cloud,
  • check the level of sensitivity of the data (according to internal data management policies),
  • verify compliance with the UDA implementation guidelines,
  • verify data stream designs,
  • configure the right network connection,
  • for each implementation tool, choose the right level of service management,
  • for each component, evaluate the lock-in level and exit conditions,
  • monitor and forecast possible costs,
  • adopt a security model that allows Hybrid Cloud security to be as transparent as possible,
  • extend data governance in the Cloud.

Where is Air France today? 

It’s clear that the COVID-19 crisis has completely changed the aviation sector. Every day, Air France has to take the time to understand new passenger behavior and adapt flight schedules in real time, in line with the travel restrictions put in place by various governments. By the end of summer 2020, Air France will have served nearly 170 destinations, or 85% of their regular network. 

Air France’s data architecture has therefore been a key catalyst for the recovery of their airlines:

“a huge thanks to our business users (data scientists) who every day try to optimize services in real time so that they can understand how passengers are behaving in the midst of a health crisis. Even if we are working on artificial intelligence, the human factor is still an essential resource in the success of a data strategy”. 

Retail 4.0: How Monoprix migrated to the Cloud

An omni-channel leader with a presence in more than 250 cities in France, the French retail chain Monoprix offers a variety of innovative products and services every day with a single objective in mind: “making the good and the beautiful accessible to all”.

The company's stores combine food retailing with hardware, clothing, household items, and gifts. To give some stats on the firm, in 2020 Monoprix represents:

  • Nearly 590 stores in France,
  • 22,000 employees,
  • Approximately 100 stores internationally,
  • 800,000 customers per day,
  • 466 local partner producers.

With close to one million customers in store and more than 1.5 million users on their website each day, it's no secret that Monoprix has enormous amounts of data to manage! Whether it comes from loyalty cards, customer receipts, or online delivery orders, the company has to manage a huge amount of data in a variety of formats.

At Big Data Paris 2020, Damien Pichot, Director of Operations and Merchandise Flows at Monoprix, shared with us the company’s journey in implementing a data-driven culture thanks to the Cloud.  


Big Data at Monoprix

In response to the amount of data that was coming into Monoprix’s data systems every day, the company had implemented various technologies: an on-premise data warehouse for structured data and a data lake in the cloud, which was used to manage the semi-structured data coming from their websites. In addition, a lot of data also comes from partners or service providers, in the context of information exchanges and acquisitions.

Despite the fact that the architecture had been working well and fulfilling its role for many years, it was beginning to show its limitations and weaknesses: 

“To illustrate, every Monday our teams gather and analyze the profits made and everything that happened the previous week. As time went by, we realized that the number of users logging in to our information systems was increasing each week and we were reaching saturation. In fact, some of our employees had to get up at 5am to launch their queries, only to retrieve the results late that morning or early that afternoon,” explains Damien Pichot.

Another weakness of the company’s IT structure concerned its business users, more specifically those in marketing. They were beginning to develop analytical environments outside the control of the IT department, creating what is known as “shadow IT”. The Monoprix data teams were understandably dissatisfied, as they had no oversight of these business projects.

“The IT department represented within Monoprix was therefore not at the service of the business and did not meet its expectations”. 

After consulting the IT committee, Monoprix decided to end the contract for its large on-premise system. The new solution had to answer four questions:

  • Does the solution allow business users to be autonomous?
  • Is the service efficient / resilient?
  • Will the solution lower operating costs?
  • Will users have access to a single platform that will enable them to extract all the data from the data warehouse and the data lake in order to meet business, decision-making, machine learning and data science challenges? 

After careful consideration, Monoprix finally decided to migrate everything to the Cloud! “Even if we had opted for another big on-premise solution, we would have faced the same problems at some point. We might have gained two years, but that’s not viable in the long term.” 

Monoprix’s journey to the Cloud

Monoprix started this new adventure in the Cloud with Snowflake! Only a few months after its implementation, Monoprix quickly saw improvements compared to its previous architecture. Snowflake was also able to meet its needs in terms of data sharing, something the company had struggled with before, as well as robustness and data availability.

The first steps

During the conference, Damien Pichot explained that it was not easy to convince Monoprix teams that a migration to the Cloud was secure. They were reassured by Snowflake, which provides a level of security as high as that required by the pharmaceutical and banking industries in the United States.

To give themselves all the means possible to make this project a success, Monoprix decided to create a dedicated team, made up of numerous people such as project managers, integrators, managers of specific applications, etc. The official launch of the project took place in March 2019. 

Damien Pichot had organized a kickoff, inviting all the company’s business lines: “I didn’t want it to be an IT project but a company project, I am convinced that this project should be driven by the business lines and for the business lines”. 

Damien tells us that the day before the project was launched, he had trouble sleeping! Indeed, Monoprix is the first French company to embark on the total migration of an on-premise data warehouse to the Cloud! 


The challenges of the project 

The migration was done iteratively because of a strong technical legacy: everything needed to be reintegrated into a technology as modern as Snowflake. Indeed, Monoprix had big problems with its connectors: “We thought at the time that the hardest part of the project would be automating the data processing. But the most complicated part was replatforming our ETLs in a new environment. So we went from a 12-month project to a 15-month project.”

The new architecture 

Monoprix handles two types of data: structured and semi-structured. The structured data comes from its classic data warehouse, which contains data from the supply chain, marketing, customer transactions, etc.; the semi-structured data comes from website-related events. All of this now converges via ETLs into a single platform running on Azure with Snowflake. “Thanks to this new architecture in the Cloud, we can access the data we want via different applications,” says Damien.
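
To illustrate what querying both kinds of data from a single platform can look like, here is a minimal sketch using the snowflake-connector-python package. The table and column names (a transactions table, a web_events table with a raw VARIANT column) are hypothetical, not Monoprix’s actual schema; the colon syntax for traversing semi-structured JSON is standard Snowflake SQL.

```python
import snowflake.connector

# Placeholders must be replaced with real credentials.
conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="<warehouse>", database="<database>", schema="<schema>",
)

# Join structured warehouse rows (transactions) with semi-structured
# web events (raw JSON stored in a VARIANT column) in a single query.
query = """
    SELECT t.customer_id,
           t.amount,
           e.raw:page.url::string AS landing_page
    FROM transactions t
    JOIN web_events e
      ON e.raw:customer_id::number = t.customer_id
    LIMIT 10
"""
cur = conn.cursor()
try:
    for customer_id, amount, landing_page in cur.execute(query):
        print(customer_id, amount, landing_page)
finally:
    cur.close()
    conn.close()
```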


Conclusion: Monoprix is better in the Cloud

Since May 2020, Monoprix has been managing its data in the Cloud, and it’s been “life changing”. On the business side, there is less latency: queries that used to take hours now take minutes (and employees finally get to sleep in the morning!). Business analyses are also much deeper, with the possibility of running analyses over five years of history, which was not possible with the old IT structure. But the most important point is the ability to easily share data with the firm’s partners and service providers.

Damien proudly explains: “With the old structure, our marketing teams took 15 days to prepare the data and had to send thousands of files to our providers. Today, they connect in a few minutes and fetch the data themselves, without us having to intervene. That alone is a direct ROI.”

Data management is embracing Cloud technologies

Contemporary business initiatives, such as digital transformation, face an explosion of data volume and diversity. In this context, organizations are looking for more flexibility and agility in their data management.

This is where Cloud strategies come in…

Data management definition

Before we begin, let’s define what data management is. Data Management, as described by TechTarget is “the process of ingesting, storing, organizing and maintaining the data created and collected by an organization”. Data management is a crucial part of an enterprise’s business and IT strategy, and provides analytical help that drives overall decision-making by executives.

As mentioned above, data is seen as a corporate asset that can be used to make better and faster decisions, improve marketing campaigns, increase overall revenue and profits, and above all: innovate. As a result, organizations are seeing cloud technologies as a way to improve their data initiatives.

Cloud strategies are the new black in data management disciplines

It is an undeniable fact that Cloud service providers are becoming the new default platform for database management. This phenomenon provides data management teams with great advantages:

  • Cost-effective deployment: greater flexibility and more rapid configuration.
  • Consumption-based spending: pay for what you use and do not over-provision (a back-of-the-envelope comparison follows this list).
  • Easy maintenance: better control over the associated costs and investments.
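
As a back-of-the-envelope illustration of consumption-based spending, the sketch below compares paying for average actual usage with paying around the clock for over-provisioned peak capacity. All figures are invented for the example.

```python
# Compare consumption-based pricing with fixed, over-provisioned capacity.
HOURS_PER_MONTH = 730

def on_demand_cost(avg_utilized_instances: float, price_per_hour: float) -> float:
    """Pay only for what is actually consumed."""
    return avg_utilized_instances * price_per_hour * HOURS_PER_MONTH

def provisioned_cost(peak_instances: int, price_per_hour: float) -> float:
    """Pay for peak capacity around the clock, used or not."""
    return peak_instances * price_per_hour * HOURS_PER_MONTH

price = 0.10  # $/instance-hour, illustrative
print(f"on demand:   ${on_demand_cost(4.0, price):,.0f}/month")   # ~$292
print(f"provisioned: ${provisioned_cost(10, price):,.0f}/month")  # ~$730
```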

With this in mind, there is no doubt that data leaders perceive the cloud as a less expensive technology, which drives this choice even more.

Data leaders will embrace the cloud as an integral part of their IT landscape in the coming months and years. However, we strongly believe that the rate at which organizations migrate to the cloud will differ by organization size. Small and midsize organizations will migrate more quickly, while larger organizations will take months, even years, to migrate.

Thus, the Cloud is becoming the default option for all data management technologies. Many strategies are emerging, covering various deployment types and approaches. We have identified three main ones:

 

  • Hybrid Cloud: two or more separate Cloud infrastructures, private or public, that remain unique entities but are used together.
  • Multicloud: the use of more than one cloud service provider’s infrastructure, possibly alongside on-premises solutions.
  • Intercloud: data is integrated or exchanged between cloud service providers as part of a logical application deployment.

 

The Cloud is also seen as an opportunity for data analytics leaders

The increased adoption of cloud deployments for data management has important implications for data and analytics strategies. As data moves to the cloud, the analytics applications that consume it must follow.

Indeed, the emphasis on speed of value delivery has made cloud technologies the first choice for vendors developing new data management solutions and for enterprises deploying them. Thus, enterprises and data leaders are choosing next-gen data management solutions! They will migrate their assets by selecting applications that fit their future cloud strategies and by preparing their teams and budgets for the challenges ahead.

Data leaders who use analytics, business intelligence (BI), and data science solutions see the Cloud as an opportunity to:

  • Use a cloud sandbox environment to trial onboarding, usage, and connectivity, and to prototype an analytics environment before actually buying a solution.
  • Facilitate application access wherever you are and improve collaboration between peers.
  • Access new capabilities as they emerge, thanks to continuous delivery approaches.
  • Support heavy lifting throughout the analytics process with the cloud’s elasticity and scalability.

 

A data catalog, the new essential solution for cloud data management strategies

Data and analytics leaders will inevitably engage with more than one cloud, where data management, governance, and integration become more complex than ever before. Thus, data leaders must equip their organizations with metadata management solutions that help find and inventory data distributed across hybrid and multicloud ecosystems. Failure to do so will result in a proliferation of data silos, derailing data management, analytics, and data science projects.

Data management teams will have to choose the most relevant solution from the wide range of data catalogs on the market.

We like to define a data catalog as a way to create and maintain an inventory of data assets through the discovery, description and organization of distributed datasets.
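
As a minimal sketch of that definition, the Python below models a catalog as an inventory of entries that describe distributed datasets, wherever they live. Field names and values are illustrative, not taken from any particular product.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    name: str              # discovery: what the asset is called
    location: str          # e.g. "on-premise", "azure", "gcp"
    owner: str             # description: who is accountable for it
    description: str = ""  # business-friendly documentation
    tags: list[str] = field(default_factory=list)  # organization

catalog: dict[str, CatalogEntry] = {}

def register(entry: CatalogEntry) -> None:
    """Add or update an asset in the inventory."""
    catalog[entry.name] = entry

register(CatalogEntry("sales_transactions", "on-premise", "finance",
                      "Daily till receipts", ["sales", "pii"]))
register(CatalogEntry("web_events", "azure", "marketing",
                      "Clickstream from the website", ["web"]))

# Finding data across a hybrid estate becomes a simple lookup.
print([e.name for e in catalog.values() if "sales" in e.tags])
```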
If you are working on a data catalog project, you will find two kinds of players on the market:

  • On the one hand, fairly old players initially positioned in the Data Governance market. These players provide on-premise solutions with rich but complex offers, which are expensive, difficult, and time-consuming to deploy and maintain, and which are designed for cross-functional governance teams. Their value proposition focuses on control, risk management, and compliance.

  • On the other hand, suppliers of data infrastructure (Amazon, Google, Microsoft, Cloudera, etc.) or data processing solutions (Tableau, Talend, Qlik, etc.), for which metadata management is an essential building block that completes their offer. They propose much more pragmatic (and less costly) solutions, but these are often highly technical and limited to their own ecosystems.

We consider these alternatives insufficient. Here are some essential guidelines for choosing your future data catalog. It must:

  • Be a cloud data catalog, enabling competitive pricing and rapid ROI for your organization.
  • Offer universal connectivity, adapting to all systems and all data strategies (edge, cloud, multi-cloud, cross-cloud, hybrid).
  • Provide advanced automation for the collection and enrichment of data assets, their attributes, and their links (augmented catalog). Automatic feeding mechanisms, together with suggestion and correction algorithms, reduce the overall cost of the catalog and guarantee the quality of the information it contains.
  • Be strongly focused on user experience, especially for business users, to drive adoption of the solution.

 


To conclude, data management capabilities are becoming more and more cloud-first and in some cases cloud-only.

Data leaders who want to drive innovation in analytics will need to leverage cloud technologies across their data assets, from ingestion to transformation, without forgetting to invest in an efficient data catalog to find their way in an ever more complex data world.