• +34.664.85.09.74
  • michele.iurillo@synergo.es
The round tables of the 2023 edition of the Data Management Summit

Data management in the ESG era (Spain & Italy)

Driven by laws and regulations, environmental strategy, and environmental, social and governance (ESG) financing and funding requirements, the importance of ESG data management can no longer be ignored.

To ensure the future viability of organizations in all industries, effective ESG data management and disclosure is crucial. Yet some major challenges have been highlighted in different forums: lack of alignment of ESG data and performance standards, barriers to data sharing, unclear incentives for ESG data management and disclosure, and lack of coordination.

Opportunities lie in introducing ambitious and enforceable regulation, creating clear incentives among ecosystem actors, and making results transparent.

Some important gaps are:

ESG standards at the moment are inconsistent and ambiguous.

The incomplete and inconsistent nature of ESG standards complicates the use of ESG data. For example, energy labels are calculated analytically and do not necessarily describe actual energy performance.

Barriers to ESG data interoperability

Data does not flow freely throughout the ecosystem. Stakeholders are sometimes unwilling to share ESG data to mitigate the risk of others gaining a competitive advantage from their data. As a result, the ecosystem as a whole cannot benefit optimally from the available information. Privacy (e.g. GDPR) further complicates the exchange.

Where does data management need to focus? Undoubtedly on quality, governance, integration and interoperability. Most data infrastructures can govern data, but it is often a very manual process, and many of these initiatives are driven by IT programs. Instead, there is a need for a board-level mandate on data, as well as business-driven use cases for tools that automate the process.

Soon, even access to credit will be linked to ESG factors, so the industry cannot afford to miss this opportunity.

The Italian round table will be moderated by Sofia D’Alessandro (Intesa-Sanpaolo); the Spanish round table will be moderated by Gorka Santos Ortells (NTT Data).

From silos to data democratization: how to address it in non-mature enterprises (Spain & Italy)

A data silo is a repository of data controlled by one department or business unit and isolated from the rest of the organization. Siloed data is usually stored in a separate system and is often incompatible with other data sets. This makes it difficult for users in other parts of the organization to access and use the data. A department can create a data silo even in an organization that has robust data management processes in place. More often, however, data silos are a consequence of the way in which organizations are structured and managed as a whole, including their IT operations.

Especially when an organization is not mature, or when data governance is not fully integrated and embedded, there is often little traceability of data processes and little knowledge of the most important assets that rest in these silos.

To truly break down data silos, it may be necessary to change the culture of an organization. Efforts to do so may be part of the data strategy development process or a data governance initiative. In some cases, a change management program may be necessary to implement cultural changes and ensure that departments and business units adopt them.

The roundtable is intended to focus on concrete experiences and practices of how to arrive at cross-domain asset management with a focus on domains rather than departmental silos.

The round table will be facilitated by Marta Diaz from Adevinta and DAMA Spain in the Spanish edition and by Simona Di Felice in the Italian edition.

The citizen at the center of data: initiatives and experiences (Rome & Bilbao)

Providing services to citizens is at the core of what most public administrations do. Tasks such as paying taxes, renewing a driver’s license and applying for benefits are often the most tangible interactions citizens have with their government, so services are fundamental in shaping trust and perceptions of the public sector. These interactions ultimately involve data that needs to be moved and consolidated across different systems while ensuring privacy, quality and efficiency, because citizens expect more transparent, accessible and responsive services from the public sector, and those expectations keep rising. Many governments have tried to improve service delivery through online portals or “one-stop shops” such as centralized call centers, but they are still unable to meet public expectations: many citizens remain frustrated by cumbersome or confusing websites and often fail to complete their applications in full. Without data and process optimization, public administrations face increased costs associated with delivering services through multiple channels.

A “citizen journey” is the complete experience a person has when seeking a government service. The journey has a distinct beginning and end and, because it is often multi-touch and multi-channel, it is also cross-functional in nature. The citizen journey is based on how people think about their experience, not how government agencies think about it. Turning this perspective around is one of the main challenges: seeing the world as the citizen sees it, putting the citizen at the center, and using the capabilities of technology to be effective and efficient while mapping all the processes and tracking everything correctly.

According to a study by McKinsey, “Transforming service delivery is not easy, but there is a clear and proven roadmap for success. By taking a citizen-centric approach, leaders can better understand their citizens’ needs and translate them into targeted and effective improvements in service delivery. In this way, they can increase citizen satisfaction and reduce costs.” 

A good benchmark is Estonia. Out of the need for a national integration platform to reduce data exchange costs and stop data leakage from unsecured databases, the X-Road program was created and has become the backbone of e-Estonia, allowing the nation’s public and private sector information systems to connect and work in harmony. Ninety-nine percent of public services are accessible online 24 hours a day.

This roundtable will be facilitated by Gigi Beltrame in the Italian edition and by Saioa Leguinagoicoa in the Bilbao edition.

Challenges of data interoperability between public administrations (Rome & Bilbao)

The objective is to project the current methodologies, specifications and practices for information processing onto a larger scale, in order to achieve a fluid and continuous data exchange between administrations, industrial sectors and citizens that generates advantages and opportunities for the different actors involved, while always taking the necessary privacy and security considerations into account. This strengthening of administrative collaboration materializes in the different public sector data spaces, which enhance the value of data in the development of citizen-centric public programs, policies and services and reduce the bureaucratic burden that administrative processes place on economic operators and citizens.

What are the main barriers to data sharing? Do we set the barriers ourselves? How can we overcome them?

And in the design of public policies? What would it take to cross-reference protected data sets from different agencies?

Is it possible to have a large catalog of datasets from which to take what is needed? How should it be collected, and with what content? What data could be moved to a central warehouse, and what data would reside at its source?

Is all the transactional data that can be served through the EPI already served?

What is the future of the semantic interoperability center and why hasn’t the initiative taken off?

How can you ensure the quality of the data exchanged?

Are you planning (or have you already developed) analysis projects that cross information from different agencies? Have the results gained traction for new projects?

Is there a real management awareness of the importance of the data journey or is it an ICT issue? Does the company know the potential of its data or the external data it could obtain?

This roundtable will be moderated by Antonio Rotundo (Agid) in Italy and by Mario de Francisco (Anjana Data – DAMA España) in Bilbao.

The challenges of the IDMP for the pharmaceutical industry (Cancelled)

The International Organization for Standardization (ISO) Identification of Medicinal Products (IDMP) standards specify the use of standardized definitions for the identification and description of medicinal products for human use. The objective of these standards is to facilitate the reliable exchange of information on medicines in a robust and consistent manner, providing a common product “language” for stakeholders to use in their interactions.

The five standards include information on the regulated medicinal product; information on the regulated pharmaceutical product; units of measurement; dosage forms, units of presentation, routes of administration and packaging; and structured substance information. Taken together, these standards allow regulated pharmaceuticals to be uniquely defined, characterized and identified throughout their entire life cycle, from development to authorization and marketing.
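To make the scope of these five dimensions concrete, here is a minimal sketch of how a medicinal product record might be structured along IDMP-style lines. All class and field names are illustrative assumptions, not taken from the actual ISO schemas, which are far richer.

```python
from dataclasses import dataclass, field

# Hypothetical, simplified model loosely following the five IDMP
# dimensions; real ISO IDMP data elements are far more detailed.

@dataclass
class Substance:                    # structured substance information
    code: str
    name: str

@dataclass
class PharmaceuticalProduct:        # the administrable product
    dose_form: str                  # dosage form (e.g. "tablet")
    route: str                      # route of administration
    strength_value: float           # expressed in standard units of measurement
    strength_unit: str
    substances: list[Substance] = field(default_factory=list)

@dataclass
class MedicinalProduct:             # the regulated (authorized) product
    mpid: str                       # unique medicinal product identifier
    name: str
    packaging: str                  # unit of presentation / packaging
    pharmaceutical_products: list[PharmaceuticalProduct] = field(default_factory=list)

# Example record: one authorized product containing one administrable form.
product = MedicinalProduct(
    mpid="XX-0001",
    name="Examplol 500 mg tablets",
    packaging="blister pack of 20",
    pharmaceutical_products=[
        PharmaceuticalProduct(
            dose_form="tablet",
            route="oral",
            strength_value=500.0,
            strength_unit="mg",
            substances=[Substance(code="SUB-42", name="examplol")],
        )
    ],
)
print(product.mpid, product.pharmaceutical_products[0].dose_form)
```

The point of the sketch is the separation of concerns the standards impose: the regulated product, the pharmaceutical product, its substances, and its units and packaging are distinct but linked entities that together identify the product throughout its life cycle.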

IDMP will be a game-changer in the use of process and technology integration to improve patient safety. IDMP will offer companies the opportunity to make product data work more efficiently from a business standpoint through improvements in data quality and use.

This will be a revolution in the pharmaceutical industry much like what happened in banking with the advent of Basel regulations.

Is our pharmaceutical industry at the optimum level to meet this challenge in terms of data management and quality, and will it be possible to map all the processes as required by the regulator?

Master Data Challenges: Best Practices and Technology Barriers (Milan – Italy Only)

Master data is the core data set an organization uses to make strategic business decisions. Maintaining accurate, complete and consistent master data, however, can be a very complex undertaking that requires a high degree of technological expertise. Effective master data management calls for advanced technologies that can ensure data consistency, quality and security, as well as the ability to handle large volumes of data from multiple sources.

Here are some of the main technological barriers that can limit the effective management of master data:

Legacy systems: many organizations still run legacy systems that can be difficult to integrate with new technologies. This can make it hard for data to flow between systems and for information to be managed across platforms.

Diversity of data sources: master data may come from different sources, such as ERP applications, CRM, SCM, and so on. This may result in the need to unify data from different platforms in order to ensure consistency.

Scalability: increasing data volume can make master data management difficult. This requires technology that can handle large amounts of data and provide high performance.

Data security: master data often includes sensitive information about customers, transactions and business activities. This requires effective security measures to protect the information from unauthorized access.

Data quality: master data must be accurate and consistent. However, data quality can be affected by a number of factors, including human error and incomplete, duplicate or outdated data. This requires technologies that can ensure data quality and correct errors.
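The “diversity of sources” and “data quality” barriers above can be illustrated with a minimal sketch of consolidating customer master records from two source systems into a single set of golden records. The record fields, the normalization rule and the survivorship rule are all illustrative assumptions; real MDM platforms use far more sophisticated matching and survivorship logic.

```python
# Hypothetical sketch: merge customer master records from two source
# systems (e.g. a CRM and an ERP), deduplicating on a normalized key.

def normalize(name: str, email: str) -> tuple[str, str]:
    # Crude match key: collapse whitespace and case-fold both fields.
    return (" ".join(name.lower().split()), email.strip().lower())

crm_records = [
    {"name": "Acme  Corp", "email": "Info@Acme.example", "phone": None},
    {"name": "Globex", "email": "sales@globex.example", "phone": "555-0100"},
]
erp_records = [
    {"name": "acme corp", "email": "info@acme.example", "phone": "555-0199"},
]

golden: dict[tuple[str, str], dict] = {}
for record in crm_records + erp_records:
    key = normalize(record["name"], record["email"])
    if key not in golden:
        golden[key] = dict(record)
    else:
        # Survivorship rule: fill attributes missing in the golden
        # record with values found in later sources.
        for attr, value in record.items():
            if golden[key].get(attr) is None and value is not None:
                golden[key][attr] = value

print(len(golden))  # the two Acme variants collapse into one golden record
```

Even this toy example shows why the barriers are real: the match key, the survivorship rules and the handling of conflicting values are all policy decisions that must be agreed across source systems before any tooling can automate them.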

Many companies are becoming more mature in master data management, but there are still challenges to overcome to ensure effective data management. Awareness of the importance of master data is increasing, and companies are investing more and more resources to improve their performance in data management.

The Italian edition of the roundtable will be moderated by Gigi Beltrame.

Process Automation and artificial intelligence in Financial Services Industries: pipe dream or reality? (Milan – Italy Only)

The roundtable will focus on the intersection of process automation and artificial intelligence in the financial services industry. Participants will discuss the benefits and challenges of implementing automation and AI technologies, as well as the impact on jobs and skills in the industry. The event will bring together experts in finance, technology, and data science to share insights and perspectives on how these technologies can improve efficiency, reduce costs, and enhance customer experience. Key topics of discussion will include the integration of AI and automation into financial processes, the role of human oversight and decision-making, and the ethical considerations of these technologies. The roundtable aims to provide a platform for thought-provoking discussions and knowledge sharing among industry leaders and professionals.

We are surrounded by technologies that promise to structure our processes and answer all our questions. Vendors constantly propose new paradigms and new solutions, but how can all these promises actually be delivered on? Are we still afraid of middleware?

This roundtable will only be held in Italy and will be facilitated by Monica Ripoldi (BancoBPM).