5 questions to...
Massimiliano Ceceri
Interview with the Senior Manager of Data Visualization at Engineering.
Massimiliano Ceceri has been leading the Data Visualization team within the AI & Data Technology Business Line at Engineering Group since 2017. He currently also serves as Director of Delivery Management for AI & Data.
The team is organized by technology area (Power BI, Qlik, Knowage, Tableau, etc.) and operates across all markets, with a strong focus on technical excellence.
He brings over 30 years of experience in managing highly complex projects for large organizations, both nationally and internationally. He has deep knowledge of the banking sector, particularly in Information Technology and Operations.
Before joining Engineering, he was Head of Product Area in the Finance domain for more than 15 years. In that role he developed strong capabilities in managing large teams, shaping engagement models and commercial propositions, and overseeing client relationships, with a focus on the direction, governance, and control of project delivery.
In recent years, the exponential growth of data and the emergence of new tools for fast and effective analysis have profoundly changed how companies make decisions. Today, thanks to increasingly accessible storage systems, powerful computing capabilities, and natural language-based Business Intelligence tools, data is no longer the exclusive domain of analysts: it has become a shared resource across all business functions. This shift enables more informed decisions based on concrete evidence, enhancing the ability to identify trends, risks, and opportunities, and to quickly adapt business strategies.
Data-Driven Decision Making (DDDM) is built on three fundamental pillars: data quality and management, structured and repeatable analysis, and integrated decision-making systems that include dashboards, alerts, and human review. This approach makes it possible to promptly detect changes, tailor products and services to customer needs, and continuously monitor business performance. Moreover, DDDM does not aim to replace the experience or intuition of managers, but rather to enhance their contribution by reducing uncertainty and promoting more solid and reliable choices.
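As a purely illustrative sketch of that third pillar, the snippet below shows how a monitored KPI can trigger an alert that is routed to a human reviewer rather than acted on automatically; the metric names and thresholds are invented for the example.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Alert:
    kpi: str
    value: float
    threshold: float
    needs_human_review: bool

def check_kpi(kpi: str, value: float, threshold: float) -> Optional[Alert]:
    """Raise an alert when a KPI falls below its threshold.

    The alert is flagged for human review: DDDM supports the decision,
    it does not replace the manager who makes it.
    """
    if value < threshold:
        return Alert(kpi, value, threshold, needs_human_review=True)
    return None

# Hypothetical daily snapshot of business KPIs (lower than threshold = bad).
snapshot   = {"conversion_rate": 0.018, "nps": 41, "weekly_active_users": 11800}
thresholds = {"conversion_rate": 0.020, "nps": 35, "weekly_active_users": 12000}

for name, value in snapshot.items():
    alert = check_kpi(name, value, thresholds[name])
    if alert:
        print(f"ALERT: {alert.kpi} = {alert.value} "
              f"(threshold {alert.threshold}) -> route to an analyst for review")
```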
The centrality of data in business management is now a key competitive factor: those who adopt DDDM can optimize processes, overcome cognitive biases, and seize new growth opportunities.
Concepts such as Data-Driven Management (DDM) and Data-Driven Strategic Management (DDSM) are gaining increasing relevance, precisely because they combine the value of human insight with the power of data.
The so-called “Digital Divide” will become an increasingly decisive factor: the gap between companies that stand out through intelligent data use and those that remain anchored to outdated models will continue to widen. DDDM is therefore an essential tool for any organization aiming to lead in the digital era, enabling data-driven decisions that fully unlock business potential and deliver tangible results.
Effective data visualization is not merely a matter of aesthetics: it is a strategic tool for transforming numbers and information into insights that are immediately understandable by the end user. Following key best practices, along with using the right tools, is essential to ensure that data is accessible, relevant, and decision-oriented.
Every visualization should begin with a clear purpose and a deep understanding of the audience: knowing the needs of those who will consume the data allows for the selection of the most appropriate format and technology. Tools such as Tableau, Power BI, Looker Studio, Qlik Sense, and Knowage (Engineering’s open-source solution), to name a few, enable the creation of interactive dashboards that integrate with heterogeneous data sources and guide users through personalized analytical journeys.
Design should prioritize clarity: consistent colors, readable labels, white space, and a clean layout all enhance comprehension. Features like dynamic filters, tooltips, and drill-down capabilities help users explore data intuitively.
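A minimal sketch of these ideas, using Plotly Express as just one of many possible libraries (the dataset and column names are invented for illustration): hover tooltips enrich each bar without cluttering the layout, consistent colors identify each series, and clicking a legend entry toggles that series on and off, which acts as a lightweight built-in filter.

```python
import pandas as pd
import plotly.express as px

# Invented sample data: monthly revenue by business line.
df = pd.DataFrame({
    "month":   ["Jan", "Feb", "Mar", "Jan", "Feb", "Mar"],
    "line":    ["Banking", "Banking", "Banking", "Retail", "Retail", "Retail"],
    "revenue": [120, 135, 150, 80, 95, 90],
    "margin":  [0.21, 0.23, 0.25, 0.18, 0.19, 0.17],
})

fig = px.bar(
    df,
    x="month",
    y="revenue",
    color="line",                  # consistent colors per business line
    barmode="group",
    hover_data={"margin": ":.0%"}, # tooltip adds context without cluttering the chart
    labels={"revenue": "Revenue (k€)", "line": "Business line"},
    title="Monthly revenue by business line",
)
# Clicking a legend entry hides or shows that series: a lightweight, built-in filter.
fig.show()
```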
Guiding the eye with visual cues and building a coherent narrative are crucial; data storytelling tools such as Flourish, Observable, or StoryMapJS support this narrative dimension. Finally, continuous testing and feedback help refine visualizations, turning data into truly useful insights and enabling more informed decisions.
In conclusion, effective data visualization is the result of a thoughtful combination of tool selection, message clarity, and user experience design, with the ultimate goal of surfacing clear insights and supporting data-driven decision-making.
Artificial Intelligence is revolutionizing data visualization, transforming it from a descriptive tool into a powerful means of data discovery and storytelling. Through the integration of machine learning algorithms and natural language models, today’s analytics platforms can automatically generate relevant visualizations, uncover hidden correlations, and deliver real-time insights. This evolution, known as Augmented Analytics, is making data analysis more accessible and interactive, allowing users to “converse” with information in a simple and natural way.
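As a toy illustration of the "auto-insight" idea, and not a description of how any specific platform implements it, the sketch below scans an invented dataset for its strongest correlation between variables and proposes a chart for that pair: the essence of automatically surfacing a relevant view instead of asking the user to find it.

```python
import numpy as np
import pandas as pd
import plotly.express as px

# Invented operational dataset.
rng = np.random.default_rng(42)
marketing = rng.uniform(10, 100, 200)
df = pd.DataFrame({
    "marketing_spend": marketing,
    "web_sessions":    marketing * 35 + rng.normal(0, 150, 200),
    "support_tickets": rng.poisson(20, 200),
    "orders":          marketing * 0.8 + rng.normal(0, 10, 200),
})

# Rank variable pairs by absolute correlation and surface the strongest one.
corr = df.corr().abs()
pairs = corr.stack()
pairs = pairs[pairs.index.get_level_values(0) != pairs.index.get_level_values(1)]
x, y = pairs.idxmax()

print(f"Suggested insight: '{x}' and '{y}' move together (|r| = {corr.loc[x, y]:.2f})")
px.scatter(df, x=x, y=y, title=f"Auto-generated view: {y} vs {x}").show()
```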
However, this transformation also introduces new challenges: ensuring transparency in algorithmic processes, preventing bias in results, and maintaining a critical human role in interpreting information, essentially giving data a voice.
At Engineering, we are providing that voice through the native integration between Knowage, our open-source business intelligence platform, and EngGPT, the Italian Large Language Model developed by the Group for Private GenAI. This synergy combines the analytical power of data with the contextual understanding of AI, making visualizations smarter and more personalized.
Ultimately, AI does not replace human intelligence: it enhances it. It turns numbers into stories, supports more informed decisions, and makes data visualization more human, narrative, and inclusive.
Accessibility in data visualization is now essential for truly democratizing access to data. It’s no longer just about making information available: it’s about ensuring that everyone can understand and use it to make informed decisions. Achieving this goal means breaking down technical, linguistic, and cognitive barriers, and promoting an inclusive, data-driven culture.
The most effective solutions for making data visualization accessible involve designing inclusive visualizations, with particular attention to color contrast, the use of palettes suitable for colorblind users, clear and descriptive labels, and the provision of alternative text or data tables for those using screen readers. It’s crucial to offer simple and intuitive interactions, such as filters and tooltips that enrich the experience without complicating it, adapting to the diverse needs of users.
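Below is a minimal sketch of a few of these practices in Matplotlib, assuming an invented dataset: the Okabe-Ito colorblind-safe palette, varied markers so series remain distinguishable without color, descriptive titles and axis labels with units, and a plain data table exported alongside the figure as a screen-reader-friendly alternative.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Okabe-Ito palette: distinguishable under the most common forms of color blindness.
OKABE_ITO = ["#E69F00", "#56B4E9", "#009E73", "#D55E00"]
MARKERS = ["o", "s", "^"]

# Invented data: quarterly active users per channel (thousands).
df = pd.DataFrame(
    {"Web": [120, 140, 155, 170], "Mobile": [90, 115, 130, 160], "Branch": [60, 55, 50, 48]},
    index=["Q1", "Q2", "Q3", "Q4"],
)

fig, ax = plt.subplots()
for color, marker, column in zip(OKABE_ITO, MARKERS, df.columns):
    # Vary markers as well as colors, so series stay readable even without color.
    ax.plot(df.index, df[column], color=color, marker=marker, label=column)

ax.set_title("Active users per quarter, by channel (thousands)")
ax.set_xlabel("Quarter")
ax.set_ylabel("Active users (thousands)")
ax.legend(title="Channel")

fig.savefig("active_users.png")
# Screen-reader-friendly alternative: the same numbers as a plain data table.
df.to_csv("active_users_table.csv")
```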
The integration of AI-powered conversational interfaces represents a further step forward, enabling interaction with data through natural language and making access easier for users without advanced technical skills. Simplified dashboards and visual storytelling tools help translate complex information into understandable narratives, fostering knowledge sharing.
Finally, algorithmic transparency and maintaining human oversight are essential to avoid distortions and ensure a fair, open, and participatory information ecosystem. Democratizing data therefore requires technological innovation, thoughtful design, and a continuous commitment to inclusivity.
In today’s digital landscape, data sovereignty is a strategic priority. Organizations must strike a balance between the need for advanced, user-friendly insights and the imperative to ensure data security, governance, and regulatory compliance.
To meet this challenge, it is essential to adopt solutions that combine innovation with data protection, leveraging “trusted by design” architectures and Private GenAI models that keep processing and intelligence within secure, controlled environments.
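In practice, "keeping processing and intelligence within secure, controlled environments" often means calling a model hosted on the organization's own infrastructure rather than a public cloud API. The sketch below is purely hypothetical: the endpoint, payload, and response schema are invented for illustration and do not describe EngGPT's actual interface.

```python
import requests

# Hypothetical internal endpoint: the model runs on infrastructure the
# organization controls, so prompts and business data never leave the perimeter.
PRIVATE_LLM_URL = "https://llm.internal.example.com/v1/chat"

def summarize_dashboard(kpi_summary: str) -> str:
    """Ask the privately hosted model for a plain-language reading of a KPI summary."""
    response = requests.post(
        PRIVATE_LLM_URL,
        json={
            "prompt": f"Summarize these KPIs for a business audience:\n{kpi_summary}",
            "max_tokens": 200,
        },
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["text"]  # response schema is assumed for illustration
```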
Engineering stands out in this area by offering flexible solutions that integrate seamlessly with existing systems, along with continuous support in security and compliance. A concrete example is EngGPT, the Italian Large Language Model for Private GenAI developed by the Group, which enhances business data without exposing it to external platforms. It generates insights and language-based automations while fully respecting principles of data sovereignty and privacy-by-design.
In this way, organizations can harness the potential of AI to make more informed and competitive decisions, while maintaining full control, security, and trust over their data.