A Meta-Model-Driven Approach to Support Digitization Evolution with Maturity Models
Abstract
Digitization makes knowledge and information a centerpiece, since these connect all the dimensions affected by innovation, i.e., people, processes, organization, business, and technology. Maturity Models (MMs) support managers in this evolution. The paper provides a theoretical formalization of MMs to offer a practical knowledge contribution to increasing the information intensiveness of enterprises through digitization. The paper reviews the main MMs and presents their state of the art. Then, it defines a backbone structure common to MMs to abstract and describe their features through a meta-model. The resulting meta-model-driven approach guides companies in selecting an MM and, where needed, in designing a new one. The meta-model formalizes the two-level inputs, the process, and the output to align the company’s motivations with the MM features, resulting in the definition of an appropriate MM for the organization. A qualitative exploratory case study illustrates the approach and its results, providing guidelines for future actions.
1. Introduction
Enterprises are becoming smarter, i.e., intelligent, fast, and efficient, due to the combination of automation, information, connection, and programming (Frank et al., 2019). Nowadays, innovation is a pressing need on which companies’ survival depends (Honorato and Cristóvão de Melo, 2023), regardless of business size (Petzolt et al., 2022). Digitization changes technological and cultural paradigms, even exposing operators to new hazards and risks (Costantino et al., 2021). Smart transformation improves all the business functions and processes, e.g., manufacturing (Jagtap et al., 2021), logistics and supply chain (Loske and Klumpp, 2020), safety (Quatrini et al., 2020a), quality (Zaidin et al., 2018), and maintenance (Quatrini et al., 2020b). It is crucial to define guidelines that support business managers and identify and analyze specific critical areas (Yagiz Akdil et al., 2018), so that organizations can apply transformative solutions in a step-by-step manner, consistently with their needs (Ejaz, 2018). The transformation is not just about new technologies but a combination of processes, organization, business, and technology (Leng et al., 2021). Information is the element connecting these dimensions. The research proposes a theoretical formalization of Maturity Models (MMs) as practical knowledge to increase the intensiveness of information in enterprises through digitization. MMs are proven tools to guide companies, simple yet efficient solutions adopted in many fields for different purposes (Dikhanbayeva et al., 2022; Frank et al., 2018; Maier and Schmidt, 2017; Serenko et al., 2016). Among these purposes, MMs can measure the quality of processes with respect to a specific goal, the smart level of resources and competencies (Wendler, 2012), technologies (Bernabei et al., 2023; Pour et al., 2023), and business processes (Flechsig et al., 2022).
Literature has shown how some organizational contexts need more support in the transformation and how, when properly chosen, MMs are effective in providing that support (Silva et al., 2021). Nevertheless, both practical and theoretical challenges arise in leveraging MMs. Practically, organizations are often hindered from leveraging MMs because they are disoriented by the many existing MMs and unable to understand the MMs’ assessment potential in a structured way (Silva et al., 2021). Also, despite their popularity, critical voices emerge regarding the practical applicability of MMs: MMs lack a clear description and definition, making them theoretical constructs that are hardly applicable by users (Katja et al., 2020). This paper presents a conceptual perspective to analyze the main MMs acknowledged in the literature. Namely, it defines the backbone structure shared among the models and the way it is instantiated within each MM. This is the first step toward the subsequent meta-model-driven approach, which guides organizations in choosing the most appropriate MM for their evaluations, and eventually in the design of a new MM, considering the context in which the organizations operate and the assessment objectives, and comparing those features with the elements of the MMs’ backbone structure. This support is crucial considering the proliferation of MMs in the literature and the variety of their features. Theoretically, the paper contributes to the literature by bringing together the main MMs established in the literature, providing the MMs’ state of the art and identifying the backbone structure shared among the models. The presence of a backbone structure common to the MMs has not yet been investigated. This structure highlights common characteristics and properties of the MMs and guides researchers and practitioners in defining a suitability scope for each MM.
In Fig. 1, the steps involved in the research are summarized.

Fig. 1. The steps involved in defining the meta-model-driven approach before its application.
The research questions are as follows:
(1) Do MMs have a common basic structure?
(2) If such a structure exists, how does it fit within the models?
(3) What approach can guide an organization in selecting an existing MM or designing a new one?
The research first presents the state of the art of MMs. This initial step is crucial for analyzing relevant MMs and addressing the RQs. The remainder of the paper is structured as follows. Section 2 presents the research approach that guides the collection of MMs; Sec. 3 presents the analysis of the retrieved MMs and the resulting state of the art; Sec. 4 defines the backbone structure shared among MMs and how it differs across them. Then, the approach to guide organizations in selecting or designing an MM, i.e., the meta-model-driven approach, is presented in Sec. 5. A qualitative exploratory case study testing the meta-model-driven approach is presented in Sec. 6. The conclusions and suggestions for future research activities are summarized in Sec. 7.
2. Research Approach
MMs related to the digitization topic assess the maturity and readiness in relation to smart transformation (Ariffin and Ahmad, 2021; Chen et al., 2022). Also, those models provide guidelines to define actions, starting from the current state and guiding companies toward implementation (Hellweg et al., 2021). To determine whether MMs share common features or differ in specific aspects, the first step of the research focuses on the collection of relevant MMs. This involves identifying and synthesizing the key features or concepts contained in the MMs.
The aim is to:
− identify and summarize key characteristics or concepts included in MMs;
− map these characteristics according to a scheme that allows the MMs to be analyzed and a new MM to be developed where required.
Accordingly, a scoping review, i.e., a well-established method in the literature to synthesize knowledge, was conducted (Kastner et al., 2012). This allows identifying key concepts, mapping, and discussing characteristics, and analyzing gaps in the knowledge base (Davy et al., 2016; Harfield et al., 2015; Wagman et al., 2015). As MMs are a well-established concept in the literature, the research began by collecting a set of MMs for analysis from the current literature. The selection process included the identification of the most up-to-date systematic analysis of MMs focused on digitization evolution, readiness, and maturity within the manufacturing sector (Onyeme and Liyanage, 2023). Then, starting with the MMs present in that research, additional MMs that emerged in the literature on the topic have been retrieved, as summarized in Fig. 2 and detailed below.

Fig. 2. The scoping review process map.
Step 1.1
The literature review was initiated by querying the Scopus scientific database on 13 January 2024 using a query like that used for similar purposes (Silva et al., 2021).
TITLE-ABS-KEY (review* AND (“maturity model” OR “readiness model” OR “maturity assessment” OR “readiness assessment”) AND (“industry 4.0” OR “industrie 4.0” OR “i4”)).
The query investigates the Scopus database, i.e., the largest abstract and citation database of peer-reviewed literature. This choice is based on its relevance in academia. At the end of 2022, the RELX Annual Report (Relx, 2022) identifies Scopus as a leading source, an expertly curated database of abstracts and citations from 27,000+ journals from 7,000+ publishers, helping researchers discover global knowledge across all disciplines.
This query yielded 136 papers. Duplicates and papers whose full text was not available were eliminated, resulting in 126 papers. Systematic reviews were then selected (29 papers: 15 reviews and 14 conference reviews). Among these, consistent with the Inclusion and Exclusion Criteria of Step 1.1 shown in Table 1, the most up-to-date review was identified (Onyeme and Liyanage, 2023). This review contains an initial set of 19 MMs.
| Criteria | IC (Step 1.1) | IC (Step 1.2) | EC (Step 1.1) | EC (Step 1.2) |
|---|---|---|---|---|
| Literature Type | Review, Conference Review | All | Article, Conference Paper, Book Chapter, Editorial | None |
| Language | English | English | Other | Other |
| Timeline | All | All | — | — |
| Access | Available in full text | Available in full text | Access restricted | Access restricted |
| Key Research | Research on Industry 4.0 in manufacturing | Research on Industry 4.0 in manufacturing | Research not on Industry 4.0, or not in manufacturing | Research not on Industry 4.0, or not in manufacturing |
| Article Type | Review of existing MMs with/without proposition of a new MM | Proposition of a new MM | No review of existing MMs | No proposition of a new MM |
| Required Content | MM conceived as a tool to assess and define the organizational 4.0 level of maturity | MM conceived as a tool to assess and define the organizational 4.0 level of maturity | MM lacking a mechanism to assess and define 4.0 level of maturity | MM lacking a mechanism to assess and define 4.0 level of maturity |
Step 1.2
The documents assessed for screening in Step 1.1 (126 documents) were examined. From these, 88 documents were selected in line with the Inclusion and Exclusion Criteria of Step 1.2 shown in Table 1. Using the Backward and Forward Snowball Sampling Technique (Wohlin, 2014), a further 39 MMs were collected, not included in the initial set of 19 MMs.
This brings the final set to 58 MMs, analyzed to represent the state of the art of MMs (Sec. 3). Subsequently, the 58 MMs were examined in depth to uncover the presence of a basic backbone structure (Sec. 4). This structure, once identified, is analyzed to understand how it is instantiated within the different MMs. The different instantiations of the backbone structure lead to the characterization of the models: each model allows for different types of analysis and evaluation.
Table 1 summarizes the Inclusion and Exclusion Criteria (IC, EC) for the collection of the final set of MMs.
3. MMs’ State of the Art
To address RQ1, the collected MMs were screened to discover which types of information were always reported by the authors. The screening revealed a common evaluation logic based on a variable number of assessment areas. A maturity level is defined for each area using either a quantitative or qualitative procedure, and the levels are sometimes aggregated into an overall maturity index. Table 2 shows the number of assessment dimensions and maturity levels proposed by each model, along with an ID assigned to each MM. From this point forward, MMs are referred to by their assigned ID.
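The shared evaluation logic can be sketched in a few lines: a maturity level is assigned per assessment dimension and then aggregated, e.g., as the weighted average that some models (such as MM6) describe. The following is a minimal illustrative sketch; the dimension names, levels, and weights are hypothetical, not taken from any specific MM.

```python
def maturity_index(scores, weights=None):
    """Aggregate per-dimension maturity levels into one overall index.

    scores  -- dict mapping dimension name -> maturity level
    weights -- optional dict mapping dimension name -> weight;
               if omitted, a plain (unweighted) average is computed
    """
    if weights is None:
        weights = {dim: 1.0 for dim in scores}
    total_weight = sum(weights[dim] for dim in scores)
    return sum(scores[dim] * weights[dim] for dim in scores) / total_weight


# Illustrative dimensions and levels on a 1-5 scale
levels = {"Strategy": 3, "Technology": 4, "People": 2}
print(maturity_index(levels))  # unweighted overall maturity index
```

A model that privileges certain dimensions would simply pass a `weights` dictionary, leaving the aggregation rule unchanged.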
The distribution of the number of Assessment Dimensions (Fig. 3) shows that most MMs provide 3 or 4 dimensions and that, in general, this number ranges between 3 and 6. The number of Assessment Dimensions is always fixed a priori in the existing MMs.

Fig. 3. MM distribution per number of assessment dimensions.
Moreover, most MMs provide a 5-level assessment scale (Fig. 4). The Assessment scale may not always be represented by a predetermined number of levels (see other criteria). For instance, qualitative assessments may be provided, or only the upper and lower limits of the range within which a maturity score can be placed may be defined.

Fig. 4. MM distribution per number of assessment levels.
An analysis of the assessment dimensions and proposed scales shows that the development of an MM can be based on either organizational or technological aspects within the organization, or it can provide a benchmark assessment. The latter places the organization at a certain level of maturity and compares it with other organizations. Analyzing the evolution direction (Fig. 5), it emerges that most MMs provide an organizational assessment.

Fig. 5. MM distribution per evaluation direction.
Furthermore, the basic structures of the MMs were analyzed in detail. In Table 3, the assessment dimensions and the proposed assessment scale are detailed for each MM. Also, the direction of evolution is reported.
| ID | Assessment dimensions | Evolution direction | Assessment levels (Assessment scale) |
|---|---|---|---|
MM1 | Seven Dimensions: Digital Business Model; Digitalization of Product and Service; Digitalization of Value Chain; Data & Analytics Capability; Agile IT Architecture; Compliance, Security, Legal and Tax; Employees and Digital Culture | Benchmark | Four Stages: Digital Novice; Vertical Integrator; Horizontal Collaborator; Digital Champion |
MM2 | Six Dimensions: Organizational Strategy; Smart Factory; Smart Operations; Smart Products; Data-driven Services; Employees | Benchmark | Six Readiness Levels: Outsider; Beginner; Intermediate; Experienced; Expert; Top Performer |
MM3 | Four Dimensions: Information Infrastructure; Controls and Devices; Networks; Security Policies | Technology | Five Stages: Assessment of Existing OT/IT Network; Secure and Upgrade Network and Controls; Defined and Organized Working Data Capital (WDC); Define Analytics; Collaboration (external and internal) |
MM4 | Two Viewpoints: Product; Production | Organization | Five Stages: Preparation; Analysis; Creativity; Evaluation; Implementation. |
MM5 | Three Stages: Envision (Vision 4.0); Enable (Roadmap 4.0); Enact (Projects 4.0) | Organization | Five Levels: Initial; Managed; Defined; Transformed; Detailed Business Model |
MM6 | Nine Dimensions: Strategy; Leadership; Customers; Products; Operations; Culture; People; Governance; Technology | Organization | Undefined (Weighted average of all maturity items within its related dimension). |
MM7 | Four Dimensions: Vertical Integration; Horizontal Integration; Digital product development; Cross-sectional technology criteria | Technology | Five Stages: Basic digitization level; Cross-departmental digitization; Horizontal and vertical digitization; Full digitization; Optimized full digitization
MM8 | Four Viewpoints: Factory; Business; Products; Customer | Technology | Five Levels: Single-Station Automated Cells; Automated Assembly System; Flexible Manufacturing System; Computer-Integrated Manufacturing (CIM) System; Reconfigurable Manufacturing System |
MM9 | Three Phases Approach: Vendor Selection, Requirement Modeling and GAP Analysis. | Benchmark | Undefined (Percentage of compliance with industry standards). |
MM10 | Four Dimensions: Organizational Maturity; IT Maturity; Performance Management Maturity; Information Connectivity Maturity | Technology | Undefined (Computational method) |
MM11 | Multiple-Criteria Decision-Making (MCDM) | Technology | Five Levels: Checking; Monitoring; Control; Optimization; Autonomy |
MM12 | Five Aspect Dimensions: Asset Management; Data Governance; Application Management; Process Transformation; Organizational Alignment | Organization | Six Levels: Incomplete; Performed; Managed; Established, Predictable; Optimizing |
MM13 | Five Business Dimensions (divided into 28 Sub-Dimensions): Customers; Strategy; Technology; Operations; Organization & Culture | Benchmark | Three Maturity Groups: Early; Developing; Maturing |
MM14 | Four Dimensions: Process; Monitoring and Controlling; Technology; Organization | Organization | Five Maturity Levels: Initial; Managed; Defined; Integrated and Interoperability; Digital-oriented |
MM15 | Dimension: IoT Technologies | Technology | Eight Maturity Stages: 3.0 Maturity; Initial to 4.0 Maturity; Connected; Enhanced; Innovating; Integrated; Extensive; 4.0 Maturity |
MM16 | Six Major features: Data Storage and Compute; Service-oriented Architecture; Information Integration; Digital Twin; Advanced Analytics; Real-time Capabilities | Technology | Six Levels: Nonexistent IT Integration; Data and System Integration; Integration of Cross-Life-Cycle Data; Service-Orientation; Digital Twin; Self-Optimizing Factory |
MM17 | Three Broad Dimensions: Smart Products and Services; Smart Business Processes; Strategy; Organization | Benchmark | Four Levels: Absence; Existence; Survived; Maturity |
MM18 | Three Axes (containing 30 Maturity Items): Strategy, Maturity and Performance | Organization | Five Roadmap Stages: (0–4 Levels) |
MM19 | Four Key Areas With two Sub-Dimensions: Resources Information System; Organization Structure; Culture | Technology | Six Development Stages: Computerization; Connectivity; Visibility; Transparency; Predictability; Adaptability |
MM20 | Five Main dimensions (divided into five attributes): Decision-making; Data availability; Data quality; Data analysis and insight; Information use | Technology | Five Levels (1–5): Uninitiated; Awareness; Proactive adopting; Integral embracement; Completely embedded |
MM21 | Five Dimensions: Strategy; People; Processes; Technology; Integration | Organization | Five Digital Readiness Levels: (1 – 1.8]; (1.8 – 2.6]; (2.6 – 3.4]; (3.4 – 4.2]; (4.2 – 5] |
MM22 | Five Dimensions: Governance; Technology; Connectivity; Value creation; Competencies | Technology | Six Stages: None; Basic; Transparent; Aware; Autonomous; Integrated |
MM23 | Four Main Dimensions (divided into 21 subdimensions): Operations; Organization; Socio-Culture; Technology | Organization | Five Levels (1–5) and a combination of a single term and a brief statement for each level
MM24 | 10 Aspects (along six Strategic Factors): IT Infrastructure; Human Resource Management; R&D; Administration, Finance and Control; Procurement; Inbound Logistic; Operations; Outbound Logistic; Marketing & Sales; Post-sales Services | Organization | Five Levels: Absence of Digital Initiatives; Planned; Just Started; Under Development; Developed and Ongoing |
MM25 | Six Dimensions (with a variable number of subdimensions): Strategy and Organization; Smart Factory; Data Driven; Smart Operations; Smart Products; Employees | Benchmark | Five Maturity Levels: Outsider; Beginner; Intermediate; Experienced; Expert; Top Performer |
MM26 | Three Digitalization Scales: Strategy; Organization; Readiness | Benchmark | Four Levels: Beginners; Catching ups; Off-track; Leaders |
MM27 | Four Attributes: Alignment; Technology; Outcomes; Mindset | Organization | Four Maturity Phases: (0–3) |
MM28 | Four Dimensions: Culture; Technology; Organization; Insights | Benchmark | Four Segments: Skeptics; Adopters; Collaborators; Differentiators |
MM29 | Three Building Blocks (divided into 8 pillars and 16 dimensions): Process; Technology; Organization | Organization | Six-Level Band (different for each dimension) |
MM30 | Six Fields of Actions: Customer; Data; Value Proposition; Organization; Operations; Transformation. | Organization | Undefined (Insights and implications into digital transformation)
MM31 | Two Basic Dimensions (divided into 51 attributes): Organizational Capability; Digital Capability | Organization | 4-Point Scale (depending on the attribute) |
MM32 | Six Dimensions (divided into 26 sub-dimensions): Change; Smart Products; People; Production Process; Technology; Organizations | Organization | Six Level Scales: (0–5) |
MM33 | Six Dimensions: Strategy and Leadership; Company Culture and Organization; IT Infrastructure; Data Maturity; Process and Operations; Product | Organization | Three Digital Level: (1–3) |
MM34 | Five Dimensions: Culture; Ecosystem; Operations; Governance; Strategy | Organization | 5-Level Likert Scale |
MM35 | Five Dimensions (divided into 43 sub-dimensions): Business and Organization Strategy; Manufacturing and Operations; Technology Driven Process; Digital Support; People Capability | Organization | Three Stages Assessment: Ordinal scale; Ordinal scale; Likert Scale |
MM36 | Five Dimensions (divided into 34 items): Technology; Products and Services; Company Operations; Strategy and Organization; Human Resource Capability | Benchmark | Six Levels: Not Ready; Beginners; Intermediate; Experienced; Expert; Top Performer |
MM37 | Seven Dimensions (divided into 38 items): People and Culture; Industry 4.0 Awareness; Organizational Strategy; Value Chain and Process; Smart Manufacturing Technology; Product and Services Oriented Technology; Industry 4.0 Base Technology | Benchmark | Four Levels: Outsider; Digital Novice; Experienced; Expert |
MM38 | Three Key Principles: People; Process; Technology | Technology | Four Levels: Connected Technologies; Structured Data Gathering and Sharing; Real-Time Process Analytics and Optimization; Smart and Predictable Manufacturing |
MM39 | Three Dimensions (divided into 13 sub-dimensions): Factory of the Future; People and Culture; Strategy | Technology | Four Levels: Minimal; Development; Defined; Excellence |
MM40 | Four Criteria (divided into 13 sub-criteria): Strategy; Technology; Operations; Organization; Culture | Organization | Six Readiness Degrees: Embryonic; Initial; Primary; Intermediate; Advanced; Ready |
MM41 | Six Factors (divided into 36 items): Strategy; Governance; Methods, Information Technology; People; Culture | Organization | 7-Point Likert Scale |
MM42 | Three Dimensions (divided into 6 categories and 12 sub-categories): Environment; Competence; Operability | Organization | Three Stages: Not Implemented; Partially Implemented; Fully Implemented
MM43 | Six Digital Development Steps: Creation of Transformation Vision and Objectives; Assess the Organization Transformation Capability; Design the End User and Employee Experience; Review and Select Solutions and Vendors; Creation of Implementation Roadmap; Adjust Organization Culture and Infrastructure | Organization | Five Digital Levels: Unaware; Conceptual; Defined; Integrated; Transformed |
MM44 | Four Vertical Layers (divided into 23 dimensions): Strategy; Governance; Operations; Objects | Technology | Six Readiness Levels: (0–5) |
MM45 | Six Factors: Strategic Alignment; Governance; Methods; ICT; People; Culture | Organization | Five Levels: Initial; Repeatable; Defined; Managed; Optimizing |
MM46 | Five Dimensions: Strategy; Processes; Products and Services; Technologies; Personnel | Benchmark | Undefined (Computational method) |
MM47 | Three Dimensions (divided into seven perspectives): SCM; SCM and POM; POM | Organization | Five Levels: Nonexistent; Conceptual; Managed; Advanced; Self-optimized |
MM48 | Five Dimensions: Strategy; Technology; Production; Products; People. | Organization | Undefined (Average calculation: 0–5)
MM49 | Six Dimensions: Management & Leadership; Planning & Strategy; Smart Products & Machines; Business Process; Employee & Training; Customers Feedback. | Benchmark | Four Readiness Levels: Outsider; Novice; Survival; Professional. |
MM50 | 10 Dimensions: Strategy and organization; Smart factory; Smart operations; Smart products; Data-driven services; Employees; Marketing and customer access; Legal consideration; Culture. | Organization | Five Readiness Levels: Stranger; Novice; Intermediate; Learned; Specialist |
MM51 | Three Dimensions (divided into 12 axes): Technology; Organization; Environment. | Organization | Six Levels: (1–6) |
MM52 | Five Dimensions (divided into 13 IoT sub-dimensions): Organizational & Business; Data & Application; Technology & Infrastructure; Governance & Compliance; Requirements & Change Management. | Technology | Five Levels: Initial; Managed; Defined; Quantitatively Managed; Optimizing.
MM53 | Six Categories: Data, information, and knowledge; Process; Interactions; Infrastructure; Self-X; Measurement performances. | Technology | Five Maturity Levels: (1–5) |
MM54 | Three Dimensions (divided into 16 concepts): Operational Readiness; Organizational Readiness; Technological Readiness. | Organization | Undefined (Computational method - fuzzy number from 0 to 1) |
MM55 | Six Dimensions: Service; Operations; Quality; Products; Documented – Information – Big Data; Leadership and Strategy; Communication; Culture and Staff. | Organization | Five Maturity Levels: (1–5) |
MM56 | Three Dimensions (divided into 12 items): Technical enablers (smart products); Realization of value (smart service); Integration into business (product-service-system). | Technology | Undefined (Maturity scale: 0–100)
MM57 | Six Dimensions (divided into six areas each): Product; Process; Platform; People; Partnership; Performance. | Organization | Undefined (Qualitative assessment by firm and external consultant)
MM58 | Four Dimensions: Strategic Governance; Information and Technology; Digital Process Transformation; Workforce Management | Organization | Six Levels: Incomplete; Performed; Managed; Established; Predictable; Innovating. |
The table highlights that authors use very heterogeneous terminology to indicate assessment areas and maturity levels, despite all MMs sharing a similar structure. As also pointed out by other contributors, when designing new MMs, developers tend to invent new titles for common MM approaches to stand out from the plethora of already existing MMs (Katja et al., 2020).
4. MMs Backbone Structure
The state of the art of MMs revealed the presence of a common structure among MMs (RQ1), hidden behind the different labels the authors adopt to identify the same concepts (e.g., areas, dimensions, axes, items, categories or levels, stages, degrees, scales). The previous analyses showed that the 58 MMs share a basic three-element backbone structure, depicted in Fig. 6. Under a uniform terminology, the elements are the Assessment Dimension, the Assessment Scale, and the Assessment Matrix. The Assessment Matrix reflects the way in which every assessment dimension is described and broken down for each level of the assessment scale. This means that every dimension is articulated into specific concepts to provide a precise 4.0 assessment.

Fig. 6. Three-element backbone structure of MMs.
Assessment Dimensions
These dimensions reflect the assessment objective through the definition of the content areas. The main assessment objectives cover organizational and/or technological aspects, either with a specific focus, e.g., on a single business dimension, a single technology, or specific industrial sectors, or with a wider analysis spectrum. The analysis of MM dimensions revealed the presence of different assessment rationales, considering internal or external aspects of an organization.
− Internal technological aspects: focused on the technical solutions implemented, e.g., digital twin, assessing the coverage of the functional areas (e.g., MM1, MM16).
− Internal organizational aspects: focused on the management of the digital transformation, for example the strategy, asset management, and employee alignment (e.g., MM15, MM19).
− Internal mixed aspects: considering both technological and organizational areas of interest (e.g., MM2, MM17).
− Addition of external elements: evaluation of the context outside the company, such as suppliers, customers, and government policy (e.g., MM7, MM19).
Assessment Scale
The scale shows the levels available for the assessment, representing the levels related to each dimension, and the assessment technique, which can be qualitative or quantitative. Each assessment scale can be described by three main features.
Assessment range: The range within which the assessment result can be placed, i.e., the opposite ends of the scale, described by a Lower End and a Higher End. In some MMs, the first level is intended for newcomers to the transformation, while others assume knowledge and experience already acquired. When an organization conducts a maturity assessment, the Lower End of the scale should be able to capture its current status, and the Higher End should describe its potential evolutionary path in terms of maturity.
Assessment refinement: The level of detail through which the Assessment range is described, i.e., the number of levels into which the scale is articulated. The level of detail of the assessment depends on this number. A greater number of levels provides a more meaningful and precise assessment, but may be onerous to apply; a reduced number provides a less detailed assessment, but may be less onerous to implement.
Assessment evolution direction: The direction of the assessment. The scale may reflect technological maturity, i.e., the assessment evaluates specific technologies, already existing or to be implemented; organizational maturity, i.e., the assessment focuses on organizational aspects, such as strategic aspects or human resources; finally, benchmark maturity, i.e., the assessment measures the maturity level of the organization in relation to a wide-ranging external context.
Assessment Matrix
The matrix is defined by the intersection of the two previous elements: Assessment Dimensions and Assessment Scale. For each dimension or sub-dimension of the evaluation areas, it is required to define specific contents or objectives which, if present or reached, allow the achievement of a maturity level.
The Assessment Matrix analysis identified five topics, which are characteristics derived from cross-reading Assessment Dimensions and Assessment Scales. The analysis focused on the criteria required to achieve each level of maturity, the detailed content of each sub-dimension presented and the way in which the final score was provided. These topics represent recurring elements and contents, and therefore are analyzable in each MM. Practically, by answering the five questions below, organizations can determine whether the presented MM is compatible with their company context and assessment objectives.
Topic 1: Absence of digital initiatives
Does the MM feature a level of digitization that reflects the absence of 4.0?
To answer this, the minimum 4.0 level is considered, i.e., the digital level the organization is required to already possess to deploy the model.
Topic 2: Internal competencies training and improvement
Does the MM present dimensions to assess internal competencies? To answer this, it is analyzed whether internal competencies are included in the assessment, i.e., whether experience and knowledge about digital technologies, such as internal training and skill acquisition, are considered in the assessment proposed.
Topic 3: External competencies’ need and acquisition
Does the MM present dimensions to assess the need for external competencies?
To answer this, it is analyzed whether external competencies, i.e., the need to acquire skills from outside the organization, are considered in the assessment proposed.
Topic 4: Practical implementation
After the evaluation, does the MM provide practical guidelines for implementing a digital path for progress?
To answer this, it is analyzed whether the model provides practical guidelines for implementing digital transformation strategies after the assessment.
Topic 5: External factors
Does the MM include in the assessment aspects external to the organization, such as economic, social, environmental, or legislative ones?
To answer this, it is analyzed whether external factors are considered in the assessment dimensions proposed.
The 58 MMs were analyzed through a “Suitability analysis”, using the five topics listed (Table 4). This analysis offers a starting point for organizations looking to implement an MM: it measures how well each model responds to the described topics on a two-level scale (Yes/No). Organizations can use this information to make an initial selection of MMs, retaining only those consistent with their context and assessment needs.
The summary of the Suitability Analysis results is presented in Fig. 7. It shows that more than half of the models allow evaluations even in the absence of 4.0 initiatives, and that only a residual number of MMs do not include internal competencies in the assessment. However, most models do not account for the need to acquire external competencies. Less than half of the MMs provide practical guidelines or include external factors in the evaluation.

Fig. 7. Aggregate results of the suitability analysis.

Fig. 8. Three possible paths for MM definition.
An example of suitability analysis is provided for MM2 (Table 5).
| | Assessment including absence of digitization (4.0) | Assessment including internal competencies training and improvement | Assessment including external competencies need and acquisition | Assessment including practical guidelines provision | Assessment including external organization factors |
|---|---|---|---|---|---|
| Yes | MM2 | MM2 | | MM2 | |
| No | | | MM2 | | MM2 |
MM2 is applicable even when digital initiatives are absent, since the first Readiness Level of the model is “Outsider”. It assesses the maturity of internal competencies, as the model dedicates an entire assessment dimension to “Employees”; within this dimension, the model assesses the presence of appropriate training and continuing education. The possibility of acquiring from external sources the skills and competencies that the organization lacks is not mentioned, and the evaluation focuses exclusively on the preparation of the individuals in the organization. Once the assessment is complete, the model recommends practical steps for improvement. As an example, when the “Smart Factory” level is equal to 0 (Outsider), the model suggests to “look into the technical possibilities of integrating equipment and add at least the basic functionalities to the product requirements document for new equipment. It may also be advisable to check whether your current systems can be upgraded”. Finally, the entire assessment concerns organizational factors and efforts; external factors are not included.
Summarizing, every MM can be formalized by its assessment dimensions {ADk}, its assessment levels {ALh}, and its resulting assessment matrix {Dk⋅h}. Depending on its specific objectives, every MM covers one or more of the five topics. In the remaining part of the paper, the authors consider the five topics characterizing the assessment matrix as the starting point for the company to choose an MM, i.e., to verify whether the company’s context and assessment objectives are aligned with available MMs.
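The three-element backbone just formalized can be sketched as a small data structure. This is a minimal illustration, not the paper's notation: the class and field names, and the example dimensions, levels, and cell descriptors, are all assumptions for the sake of the sketch.

```python
from dataclasses import dataclass, field

# Minimal sketch of an MM's backbone: assessment dimensions {ADk},
# assessment levels {ALh}, and the assessment matrix {Dk,h} mapping
# a (dimension, level) pair to a descriptor. All names are illustrative.

@dataclass
class MaturityModel:
    dimensions: list                 # {ADk}, e.g. ["Strategy", "Employees"]
    levels: list                     # {ALh}, ordered from lowest to highest
    matrix: dict = field(default_factory=dict)  # (k, h) -> descriptor text

    def cell(self, dimension, level):
        """Return the descriptor D(k, h) for a dimension at a given level."""
        k = self.dimensions.index(dimension)
        h = self.levels.index(level)
        return self.matrix.get((k, h), "")

mm = MaturityModel(
    dimensions=["Strategy", "Employees"],
    levels=["Level 1", "Level 2"],
    matrix={(0, 0): "No strategy defined", (0, 1): "Strategy in design"},
)
print(mm.cell("Strategy", "Level 2"))  # Strategy in design
```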
5. Meta-Model Driven Approach
The analyses show the ever-increasing proliferation of MMs. Companies and practitioners need a systematic approach to comprehend the MMs’ potential (Silva et al., 2021) or meaning (Katja et al., 2020), and consequently their appropriateness for the assessment.
To answer RQ3, i.e., to define the approach guiding an organization in selecting an existing MM or designing a new one, a meta-model is designed. Such a meta-model represents a mechanism to create MMs, representing input and output relations by means of a uniform terminology. Of course, MM design demands expertise and knowledge in the specific organizational or technical field, to avoid designing useless, insignificant, or incorrect MMs.
Considering the state of the art of MMs, i.e., their three-element backbone, a meta-model-driven approach has been developed to guide companies in choosing the most suitable MM or in developing a new MM wherever needed. In line with the concept of meta-model, the input, process, and output are formalized and then depicted in Fig. 9.

Fig. 9. The overall Meta-model illustration.
Input
This approach needs the formalization of the company context and assessment objectives, that is, the company’s motivations for the assessment. It is based on real factors that typify each organization interested in the assessment.
The company context considers its experience in implementing digital technologies, to avoid adopting MMs for the smart manufacturing introduction when experience is high, and to avoid assessing the smart evolution when experience is low and no path is defined. Moreover, the context provides information in terms of needs, e.g., to improve a negative benchmark with competitors, to suggest practical actions, and to check internal coverage of data sharing. So, considering the company context, the assessment scale requirements and assessment objectives must be established. According to the previous sections, the company assessment objectives are important in choosing an MM: the MM’s assessment dimensions define the scope of the assessment, confining it to predefined areas, and each existing MM has a predefined number of evaluation areas. So, the formalization of the company assessment objectives guides the selection of MMs, considering the assessment dimensions proposed. Besides defining these areas, each organization must determine the following further aspects:
− Internal/External view: the assessment considers technological and/or organizational aspects internal to the organization, and/or also external aspects;
− Narrow/Wide focus: the assessment generates a less or more extended result in terms of technology, production process, enterprise, supply chain;
− One-spot/Ongoing assessment: the assessment leads to a snapshot of the maturity level, or foresees a recurrent application to trace the digital evolution.
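The three choices above can be encoded as a small record. This is an illustrative sketch only; the field names paraphrase the three aspects, and the example values are assumptions loosely inspired by the case study presented later (internal view, ongoing monitoring).

```python
from dataclasses import dataclass

# Illustrative encoding of the three assessment-objective choices each
# organization must make. Field names and example values are assumed.

@dataclass(frozen=True)
class AssessmentObjectives:
    view: str        # "internal", "external", or "both"
    focus: str       # "narrow" (technology/process) or "wide" (enterprise/supply chain)
    recurrence: str  # "one-spot" snapshot or "ongoing" monitoring

example = AssessmentObjectives(view="internal", focus="narrow",
                               recurrence="ongoing")
print(example.recurrence)  # ongoing
```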
The company’s motivation, articulated in Company context and Company assessment objectives, represents the meta-model’s first-level input. Those inputs guide the analysis and the formalization of the needed three-element backbone structure (second-level input). The meta-model connections underline that the company reviews the available MMs, considering the alignment of the backbone elements with the company’s context and objectives, in a compatibility analysis.
Process
Once the first- and second-level inputs are formalized and defined, three paths are possible for the MM definition (Process) as represented in Fig. 8. What distinguishes these paths is how different the new MM is from one or more existing MMs. In order of presentation, these paths exhibit an increasing level of diversity.
Choice: The structure and content of the assessment dimensions, assessment levels, and assessment matrix are inherited. That is, all three backbone elements of an existing MM are appropriate to the context and assessment objectives; consequently, such MM is selected;
Modification: At least one element among the content and structure of the assessment scale and/or assessment dimensions, but not all, is newly designed. That is, no existing MM has all three backbone elements appropriate to the context and assessment objectives;
Design: No element of structure or content is inherited: neither the assessment scale, nor the assessment dimensions, nor, consequently, the assessment matrix. The new MM is entirely designed since existing MMs are unsuitable.
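One way to read the three paths is as a function of how many backbone elements of the best-matching existing MM can be inherited. The rule below is an interpretation of the description above, not an algorithm prescribed by the paper.

```python
# Hedged sketch: decide the MM definition path from which backbone elements
# (assessment dimensions, assessment scale, assessment matrix) of an existing
# MM are appropriate and can be inherited.

def definition_path(inherit_dimensions, inherit_scale, inherit_matrix):
    inherited = sum([inherit_dimensions, inherit_scale, inherit_matrix])
    if inherited == 3:
        return "Choice"        # all three elements fit: select the MM as-is
    if inherited == 0:
        return "Design"        # nothing reusable: design a new MM ex novo
    return "Modification"      # partial reuse: redesign the missing elements

print(definition_path(True, True, True))     # Choice
print(definition_path(True, False, False))   # Modification
print(definition_path(False, False, False))  # Design
```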
Thus, existing MMs are procedurally analyzed regarding their structural elements; that is, MMs are selected whose structural elements are compatible with the company’s assessment needs and context. It goes without saying that, on the content level, the so-called “human in the loop” is always necessary. For example, it is necessary to verify that a dimension whose name resembles the required one also reflects what is expected in terms of content.
Output
This process leads to the definition of an appropriate MM (Output).
The meta-model leads to the meta-model driven approach which consists of five steps (Fig. 10).

Fig. 10. The five steps of the meta-model driven approach.
Firstly, the assessment motivation is formalized, which requires the definition of the company’s experience and objectives, with a focus on their technological and organizational priorities.
Subsequently, the suitability analysis of MMs is conducted. From Step 1, the company decides the expected level of interest in the five topics, identifying an initial set of MMs suitable to its context and assessment objectives. This step leads from the complete set {MMs} to the initial set {MMsinit}. Then, the backbone of the initial set is analyzed in terms of structure. By comparing the assessment objectives with the MMs’ assessment dimensions (see Table 3), it becomes clear where available MMs fit the company’s needs: inbound/outbound and technological/organizational elements guide the matching of assessment dimensions. So, within {MMsinit} a subset {MMsk} is identified, in which the MMs’ assessment dimensions are coherent with the company’s needs. Then, the company context guides the analysis of the assessment scale: the MMs are analyzed looking at the assessment range, assessment refinement, and evaluation directions (see Table 3), and these features guide the subsequent matching. So, within {MMsinit} a further subset {MMsh} is identified. Practically, the MMs that significantly match the Assessment Dimensions and/or the Assessment Scale survive.
The union of sets {MMsk} and {MMsh} represents the final set {MMsfinal}.
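The successive refinements of Step 2 can be sketched as set operations. The identifiers follow the paper's notation ({MMsinit}, {MMsk}, {MMsh}, {MMsfinal}), while the filter predicates and the toy MM names are illustrative stand-ins; note that, as stated above, the final set is the union (not the intersection) of the two subsets.

```python
# Sketch of the Step 2 set refinement: suitability screen, then two parallel
# structural filters, then the union {MMsfinal} = {MMsk} U {MMsh}.

def refine(mms, suitable, dims_match, scale_match):
    mms_init = {m for m in mms if suitable(m)}      # suitability analysis
    mms_k = {m for m in mms_init if dims_match(m)}  # assessment dimensions fit
    mms_h = {m for m in mms_init if scale_match(m)} # assessment scale fits
    return mms_k | mms_h                            # survivors of either filter

all_mms = {"MM1", "MM2", "MM3", "MM4"}
final = refine(
    all_mms,
    suitable=lambda m: m != "MM4",       # MM4 fails the five-topic screen
    dims_match=lambda m: m in {"MM1"},   # only MM1 matches the dimensions
    scale_match=lambda m: m in {"MM2"},  # only MM2 matches the scale
)
print(sorted(final))  # ['MM1', 'MM2']
```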
After the final set of MMs is retrieved, one of the three possible paths (see Fig. 8) guides the definition of the appropriate MM (i.e., choice, modification, or design). Depending on the path, the company has an MM ready to use, or it carries out modification and/or design activities, guided by the meta-model.
6. Application of the Meta-Model Driven Approach
The application of the meta-model-driven approach to a real manufacturing organization is illustrated, leading to the design of a new MM. Namely, a qualitative exploratory case study was conducted (Hew and Hara, 2007). The case study strategy is often described in the literature as the intensive study of a single case or a small number of cases based on observational data from a restricted object or process (Gerring, 2004). The following sections present the MM conceptualization, retracing the meta-model-driven approach steps, i.e., the design of an appropriate MM for the given company. Then, they present the application of the MM and its results, i.e., the MM operationalization.
6.1. MM conceptualization
To conceptualize the MM, a four-hour focus group was conducted. The researchers met with two company representatives responsible for innovation and R&D within the company. The meeting took place on the company’s premises where the production process of interest takes place, allowing the researchers to see it and understand its characteristic aspects at first hand. During this meeting, the meta-model-driven approach was presented. The meeting included open questions to the managers and operators of the company, with the aim of defining the inputs of the meta-model and selecting the process for the design or the selection of an MM. Researchers and company representatives thus defined and agreed on the first two elements of the MM, i.e., the Assessment Dimensions and the Assessment Scale. Furthermore, for each dimension, the aspects of interest were established, that is, the sub-dimensions to be investigated and the objectives to be completed for the achievement of each maturity level. The outcomes of this meeting are formalized hereafter.
The company’s motivation
Context
The organization operates in the space satellite composite sandwich panel manufacturing. This industry lacks innovation despite the potential benefits of smart technology. Sandwich panel production is intermittent: volumes are low and variety high, with low production rates and high flexibility (Eugeni et al., 2022). Nowadays, spacecraft MAIT (Manufacturing, Assembly, Integration and Testing) is mostly based on manual processes (Gaudenzi et al., 2020), and industrial solutions exist in limited stages, e.g., the optimization of installation provided by the Automated Potting Machine (APM). Cyber-Physical Systems (CPSs) can confer advantages for reducing cost overruns and delivery delays in product development and testing. CPSs empower modern systems, helping the diffusion of smart services and giving new opportunities and functionalities (Colabianchi et al., 2021). The case-study company embarked on a digitization path, starting from the sandwich panel process.
Visits to production facilities, combined with meetings with experts and employees operating there, gave a clear indication that the state of the art is far from a smart philosophy. Traditional and manual operations prevail. The company found shortcomings in smart terms: inefficiency or absence of information transfer within production steps; absence of analysis of the data gathered by sensors; high complexity and volume of data; absence of interconnectivity in the production system. Human judgement and subjectivity are decisive for the outcome of the analysis. The company evaluates a solution for Data Collection and Analysis based on a CPS architecture, integrating control, communication, and computation levels. CPSs can collect data about themselves and their environment, process and evaluate these data, and communicate with other systems. The company intends to implement this solution to provide real-time information sharing and visualization, internally across different business functions and employees, and then analyze it.
Assessment objectives
The organization intends to undertake a structured digital transformation path, based on a well-defined strategy and indicators to monitor its evolution. To this end, the organization intends to adopt an MM to understand the state of the system’s technological innovation and to monitor its evolution over time, verifying the alignment between planning and implementation. It then evaluates and monitors strategic aspects, i.e., the strategy, pilot initiatives, indicators, and a dedicated budget. Considering the budget, the focus is on evaluating the allocation and the activation timing of funds, since economic feasibility checks were already done. Other interesting aspects are the presence of data collection technologies and the type of analysis enabled. The organization wants to analyze real-time data and gain greater visibility of processes, easily sharing information internally and externally. So, information sharing is also assessed, considering its extent and security issues. As such technologies are novel for the organization, an assessment of the employees’ skill level is required, to align the needed skills with those possessed and to train employees. The assessment considers the current process and resources, and then the digital enhancements the data-driven approach generates. This motivation suggested the definition of four inbound characteristics, technical or organizational, i.e., the assessment dimensions: strategy, employees, data collection and analysis, information sharing.
Analysis of MMs
Suitability analysis
Considering the five topics retrieved, the organization is seeking a model reflecting:
• Topic 1: Absence of digital initiatives. Currently, the organization is a digital outsider, and the model must be consistent with this.
• Topic 2: Internal competencies training and improvement. The operators are of strong interest for the organization, which believes it has enough internal resources to face the innovation, if trained and informed.
The other topics represent non-primary issues. Currently, external expertise is not needed. Practical guidelines are not required, but rather a way to assess and track the technological and organizational evolution. The model is intentionally deployed to assess internal organizational and technological dimensions, not external issues (e.g., social or cultural ones). Only MMs addressing the topics relevant to the organization were selected (Table 6).
The suitability analysis performed on the Assessment Matrix of MMs generates the initial set of MMs.
Backbone analysis
The backbone elements of these MMs were analyzed by looking at their structure, as reported in Table 3.
Assessment dimensions. To identify matching MMs, the presence of dimensions such as Strategy, Data Collection and Analysis, Information Sharing, and Employees was researched within the MMs. Dimensions matching Strategy, Data Collection and Analysis, and Employees exist, while no MM directly assesses the abilities related to Information Sharing, as summarized in Fig. 11. Here, only MMs that survived the suitability analysis were considered. The following MMs present dimensions consistent with the Strategy dimension of interest:

Fig. 11. Second part of the step two of the meta-model-driven approach.
− Strategy: MM2,6,13,18,21,26,34,36,41,44,45,46,48
− Strategy and Organization: MM25,37,40
− Strategy and Leadership: MM33,55
− Business and Organization Strategy: MM35
− Organizational Strategy: MM37
− Strategic Alignment: MM45
− Planning and Strategy: MM49
With the same logic just explained, MM1,2,12,16,20,25,30,33,50,52,55 were selected, since they present dimensions consistent with the Data Collection and Analysis dimension of interest.
No MM has a dedicated dimension to Information Sharing.
Finally, MM1,2,6,13,19,21,23,24,25,28,32,33,34,35,36,37,38,40,41,42,45,46,48,49,50,55,57 were selected, since they present dimensions consistent with the Employee dimension of interest.
Then, the Assessment Scales were analyzed.
With respect to the Assessment range, the organization looks for a qualitative assessment where the first level describes the absence of 4.0 solutions, since the assessment aims to capture the current 4.0 state. Considering the Assessment evolution direction, the organization wants to assess and monitor the technological and organizational evolution; there is no interest in benchmark assessments.
These requirements led to the selection of 10 MMs, as summarized in Fig. 11. Namely, only the MMs with a technological or organizational evolution direction in Table 3 were analyzed. Among these, the MMs with a first qualitative level reflecting the absence of 4.0 were selected, i.e., MM15,15,19,22,24,40,42,43,47,50.
The final set of MMs
As a result, the set {MMsk} ∪ {MMsh} was retrieved, i.e., {MMsfinal}.
MM definition
Among the possible paths, the Modification path is pursued: the organization needs to investigate contents not found in the available MMs, although structurally similar assessment dimensions exist. That is, the structure of the assessment dimensions is partially inherited while the contents are newly designed, since the company wants to investigate specific issues different from those found in the existing assessment dimensions of MMs. These contents, reflected in the sub-dimensions of each assessment dimension, emerged from discussions with the organization and from the process analysis conducted during this conceptualization phase. In the following paragraphs, the definition of each dimension and sub-dimension is given.
The backbone element related to the Assessment Scale can be partially inherited, e.g., from MM24, which defines the following levels: Absence of Digital Initiatives; Planned; Just Started; Under Development; Developed and Ongoing. The level labels were terminologically modified to better reflect the lower maturity levels. In this way, the model captures the current smartness and defines attainable levels. So the company designed a new MM that inherits some features of existing MMs and defines others ex novo. The Modification path required a working group joining the authors and the company’s employees: the production manager and the Digital Transformation Officer. Two meetings were held: a first review of the three-element definition, and the subsequent final review and approval.
The new MM is hereafter described.
Assessment Dimensions
The assessment dimensions needed cover four areas, illustrated hereafter. The sub-dimensions of each assessment dimension are also listed.
Strategy
The Strategy dimension evaluates the extent to which innovative solutions are determined and formalized within the business strategy. A smart manufacturing transformation is a long and complex journey, and strategy implementation is a priority. This dimension investigates the presence of:
• a transformation strategy, in a long-term perspective, defining objectives and actions;
• pilot initiatives, i.e., experimentations of new technologies, working methods, or organizational models; such initiatives enable the acquisition of theoretical knowledge;
• indicators, to monitor the transformation strategy and the pilot initiatives;
• a budget, dedicated to the initiatives; the budget represents a dual-action tool, for both planning and control;
• general CPS technologies, needed for the strategy and the pilot initiatives, i.e., whether the technologies needed to implement the transformation exist; sometimes, although the technologies have been defined, companies still do not possess them.
Data Collection and Analysis
This dimension investigates the presence of:
• specific technologies, to ensure an increasing maturity logic: visibility and transparency on processes and resources, up to predictive and self-optimizing adaptability mechanisms.
It then assesses the extent of data collection within the initiatives, as digitized basic measures for efficiency and efficacy analysis, considering:
• data collection of process parameters;
• data collection of output parameters;
• the analyses that data collection provides, i.e., whether the company performs systematic analyses, uses data to understand what is happening, and acts proactively through analysis.
Information sharing
This dimension assesses the extent of information sharing and its temporal frequency: the sharing could occur at fixed times or continuously, with real-time systems. Namely, it assesses:
• internal sharing, among departments;
• external sharing, with customers and/or suppliers;
• smart products/services, i.e., the presence of smart products/services connected to the production environment, able to exchange information and influence its responses and activities. It therefore measures the smart product/service capabilities, i.e., monitoring, control, and optimization. These could also be IT-controlled, interacting with higher-level systems in the value chain, integrating products, production, and customers;
• IT security systems, either generic across the company or specifically designed for the initiatives.
Employees
This dimension assesses the company’s management of the competencies needed for the transformation, and the related training tools. Such a path involves two types of knowledge: knowledge about the technology itself, both technical and technological; and knowledge about the new work processes and methods, since new knowledge and skills may be required to continue working (e.g., safely). So, the dimension evaluates the level of:
• competencies on technologies;
• competencies on work and process methods;
and, finally, the existence of:
• technology training tools, i.e., plans and tools to acquire the missing technological competencies;
• process and work method training tools, i.e., plans and tools to acquire the missing process and work method competencies.
Assessment Scale
The model should assess the current organizational smartness and monitor its evolution. In line with the objective of structuring and monitoring a strategy, as well as designing and implementing digital initiatives in the given context, the five maturity levels are:
• Absence of digital initiatives;
• Design of digital initiatives: the previous points have been defined but not yet fully implemented;
• Starting of digital initiatives: the previous points are at an early stage of implementation;
• Development of digital initiatives: the previous points have been implemented and are currently experiencing further progress and improvement;
• Consolidation of digital initiatives: the previous points have been implemented, improved, and are regularly monitored to assess their effectiveness and efficiency.
The scale starting point reflects that the organization considers itself a smart manufacturing outsider, since no concrete 4.0 initiatives exist within the organization. So, the first and second levels of the model represent non-smart features, while factors reflecting the initial features of a digital development appear from level 3 onwards. The levels representing advanced digital readiness are merged into a single level, i.e., level 5, since the organization considers them far from being reached; this level incorporates all the features that are completely integrated into a smart context. Overall, the assessment scale puts emphasis on lower maturity levels, since 3 out of 5 levels define few, or no, innovative features. The range of maturity levels is extensive, fitting the objective of monitoring the transformation over time and employing the model on an ongoing basis.
At the end of the meeting with company representatives, the structure presented below, i.e., the Assessment Dimensions and the Assessment Scale (the first two backbone structure elements), was formalized and agreed upon (Fig. 12).

Fig. 12. The first two backbone elements of the new MM.
Assessment matrix
The dialogue between company representatives and researchers allowed the formalization of all the traits the organization intends to define and monitor in its digitization journey, as reported in the Assessment Dimensions section. With respect to these dimensions, the achievement of defined sub-goals is required to define each maturity level. The combination of the assessment dimensions and sub-dimensions with the Assessment Scale leads to the definition of the Assessment Matrix, presented in the questionnaire in Table 7. This combination required the formulation of a question for each sub-dimension; to define the possible answers to each question, the maturity levels defined by the Assessment Scale were considered. Each possible answer represents one of the five levels of maturity in each sub-dimension assessed. For example, item 1.4 in Table 7 (“Does the company have a dedicated budget for the digital transformation path strategy and the pilot initiatives?”) originates from the budget sub-dimension of the Strategy dimension, and the possible answers reflect the five levels of the Assessment Scale: maturity level 1 (Absence of digital initiatives) reflects a budget not allocated; level 2 (Design of digital initiatives), a budget allocated but not yet activated; level 3 (Starting of digital initiatives), a budget in the execution stage; level 4 (Development of digital initiatives), a budget in the controlling stage; and level 5 (Consolidation of digital initiatives), a budget in the assessment and review stage. The score for each dimension is based on the weighted average of the scores achieved in each question: an answer corresponding to the Absence of digital initiatives level is worth one point, up to the Consolidation of digital initiatives level, worth five points. The weighting of each question is an autonomous choice of the company.
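The weighted-average scoring described above can be sketched in a few lines. The scoring rule (levels 1 to 5, company-chosen weights) follows the text; the example answers for the Strategy questions and the equal weights are illustrative assumptions, not data from the case study.

```python
# Sketch of the per-dimension score: a weighted average of question scores,
# where 1 = Absence of digital initiatives and 5 = Consolidation of digital
# initiatives. Weights are the company's own choice; equal weights here.

def dimension_score(answers, weights):
    """answers: question id -> maturity level (1..5); weights: question id -> weight."""
    total_weight = sum(weights[q] for q in answers)
    return sum(answers[q] * weights[q] for q in answers) / total_weight

# Illustrative answers to the five Strategy questions (items 1.1-1.5):
strategy_answers = {"1.1": 2, "1.2": 1, "1.3": 1, "1.4": 2, "1.5": 2}
equal_weights = {q: 1.0 for q in strategy_answers}
print(round(dimension_score(strategy_answers, equal_weights), 2))  # 1.6
```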
Since the questionnaire is a self-assessment tool to support organizations, an a priori imposition of weights is not appropriate (Belar et al., 2003), considering that these depend on the company’s history, context, transformation strategies, and initiatives. Empirical studies show that self-assessment effectiveness is guaranteed when people are really engaged, able to monitor their progress, and able to collect feedback to guide revision (Andrade and Du, 2007). Also, the self-assessment technique can generate positive side effects. For example, it increases the ability to focus on the key elements of the mission over time, resulting in improved processes because they are directly observed and evaluated (Andrade and Valtcheva, 2009). More complex calculations are not considered because quantitative assessment is subject to a higher level of uncertainty than qualitative assessment (Van Der Sluijs et al., 2005; Wynne, 1992). The model does not provide the option “don’t know/no opinion”, since it could lead to several pitfalls (Krosnick et al., 2002): such an option lets respondents avoid forming an opinion, while the model wants the company to face its state of implementation in the transformation path. Moreover, all the motivations for the option “don’t know/no opinion” are missing: the questionnaire was designed together with the organization, which knows the content and is familiar with it.
LEVEL 1 | LEVEL 2 | LEVEL 3 | LEVEL 4 | LEVEL 5 |
---|---|---|---|---|
Absence of digital initiatives | Design of digital initiatives | Starting of digital initiatives | Development of digital initiatives | Consolidation of digital initiatives |
1. STRATEGY
1.1 Does the company have a digital transformation strategy? | ||||
Absent | In design stage | In implementation stage | In production stage | In assessment and review stage |
1.2 Are there pilot initiatives? | ||||
Absent | In design stage | In implementation stage | In production stage | In assessment and review stage |
1.3 Does the company use indicators to monitor the implementation of the digital transformation path strategy and/or the pilot initiatives? | ||||
Absent | In design stage | In implementation stage | In production stage | In assessment and review stage |
1.4 Does the company have a dedicated budget for the digital transformation path strategy and the pilot initiatives? | ||||
Budget not allocated | Budget allocated | Budget in execution stage | Budget in controlling | Budget in assessment and review stage |
1.5 What is the actual state of CPS technologies useful for the digital transformation path strategy and the pilot initiatives? | ||||
Not all technologies defined | All technologies defined | Some technologies defined and in use | All technologies defined and some in use | All technologies defined and in use |
2. DATA COLLECTION
2.1 Does the company have the data collection technologies needed for its digital transformation strategy and pilot initiatives? | ||||
Not all technologies defined | All technologies defined | Some technologies defined and in use | All technologies defined and some in use | All technologies defined and in use |
2.2 Does the company collect and analyze data from internal process parameters, involved in the digital transformation path? | ||||
Collection and analysis absent | Partial collection but absent analysis | Partial collection and partial analysis | Complete collection and partial analysis | Complete collection and analysis |
2.3 Does the company collect and analyze data from output process parameters, involved in the digital transformation path? | ||||
Collection and analysis absent | Partial collection but absent analysis | Partial collection and partial analysis | Complete collection and partial analysis | Complete collection and analysis |
2.4 What analysis do data collection technologies provide? | ||||
Ad hoc analysis | Ad hoc and systematic analysis on historical data | Ad hoc, systematic on historical data and real-time analysis | Ad hoc, systematic on historical data, real-time and predictive analysis | Ad hoc, systematic on historical data, real-time, predictive and proactive analysis |
3. INFORMATION SHARING
3.1 Is there an internal sharing of information (within the company, among departments) relevant to the digital transformation path? | ||||
Internal information sharing absent | Information internally shared at fixed times only between certain departments among those concerned | Information internally shared at fixed times between all the departments concerned | Information internally shared in real-time only between certain departments among those concerned | Information is internally shared in real-time between all the departments concerned |
3.2 Is there an external sharing of information (with customers and/or suppliers) relevant to the digital transformation path? | ||||
External information sharing absent | Information externally shared at fixed times only between certain stakeholders | Information externally shared at fixed times between all stakeholders | Information externally shared in real-time only between certain stakeholders | Information is externally shared in real-time between all stakeholders |
3.3 Is there sharing of information between the company’s smart manufacturing environment and smart products and/or services? If present, what are the capabilities of these innovative products/services? | ||||
Absent | Smart products/services in design stage | Present with monitoring (e.g., environmental conditions, usage, performances) and/or control (e.g., personalization of product features and experience) | Present with monitoring, control and optimization (e.g., algorithms for predictive diagnostics, proactive support) | Present with monitoring, control, optimization and independence (e.g., self-coordination with other products and systems, independent customization, self-diagnosis and support) |
3.4 Are there IT security systems within your transformation path? | ||||
Absent | In design stage | In production stage | Generic IT security systems | Specific IT security systems for the digital initiatives |
4. EMPLOYEES | ||||
4.1 What is the overall competencies level on CPS technologies in terms of smart manufacturing requirements? | ||||
Absent | Identified but not acquired | Early stage of acquisition | Acquired | Acquired, and regularly reviewed |
4.2 What is the overall competencies level on processes and work methods of the employees in terms of smart manufacturing requirements? | ||||
Absent | Identified but not acquired | Early stage of acquisition | Adequate and complete | Adequate, complete, and regularly reviewed |
4.3 Does the company have a training plan to acquire the missing technological competencies (e.g., seminars, knowledge transfer systems, coaching, etc.)? | ||||
Absent | In the design stage | In the implementation stage | Ongoing stage | In the assessment and review stage |
4.4 Does the company have a training plan to acquire the missing processes and work methods competencies (e.g., seminars, knowledge transfer systems, coaching, etc.)? | ||||
Absent | In the design stage | In the implementation stage | Ongoing stage | In the assessment and review stage |
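For readers who wish to operationalize such a questionnaire programmatically, the structure above (questions grouped by assessment dimension, each with five ordered maturity options) can be represented as a simple data structure. The following Python sketch is illustrative only: the question text and options are taken from item 3.4 of the matrix above, while the class and attribute names are our own, not part of the paper’s formalization.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Question:
    """One row of the Assessment Matrix: a question with five ordered maturity options."""
    qid: str          # question identifier, e.g. "3.4"
    dimension: str    # assessment dimension, e.g. "INFORMATION SHARING"
    text: str
    options: tuple    # exactly five options, ordered from maturity level 1 to level 5

    def score(self, answer: str) -> int:
        """Map the chosen option to its maturity level (1-5)."""
        return self.options.index(answer) + 1

# Item 3.4 of the matrix, encoded with the structure above
q34 = Question(
    qid="3.4",
    dimension="INFORMATION SHARING",
    text="Are there IT security systems within your transformation path?",
    options=("Absent", "In design stage", "In production stage",
             "Generic IT security systems",
             "Specific IT security systems for the digital initiatives"),
)

print(q34.score("In design stage"))  # prints 2: this option is maturity level 2
```

Encoding each question this way makes the mapping from a respondent’s answer to a numeric level explicit and reusable across all four assessment dimensions.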
6.2. MM operationalization
To fill out the questionnaire, the working group organized two further meetings. In the first, a comprehension tuning meeting, the final version of the MM, i.e., the Assessment Matrix, was illustrated and agreed upon, and the weights for each question within each assessment area were defined. As a first test, the team set equal weights; the choice of weights is typically left to the companies, which identify the assessment areas relevant to their purposes. The company then filled in the questionnaire autonomously. It selected five respondents, all fully aware of the questionnaire’s contents, who answered the questions together: the contact person for new digitization projects, a tutor for process equipment and training, two heads of department, and a head of production. In the second meeting, the company shared the answers and justified its choices, showing coherence between what was declared and the state of the company. The graphical representation of the results is presented hereafter; radar charts are a common tool in this kind of assessment (Rafael et al., 2020). The test results for each assessment dimension are discussed after the corresponding figure (Figs. 14–17), while a synthetic overview of the four assessment dimensions is provided in Fig. 13. Intermediate situations may occur, namely when the score obtained in one of the dimensions places the company between two levels. These situations do not generate confusion because the organization is interested in qualitative positioning.

Fig. 13. Four assessment dimensions synthetic overview.

Fig. 14. Strategy assessment dimension.

Fig. 15. Data collection and analysis assessment dimension.

Fig. 16. Information sharing assessment dimension.

Fig. 17. Employees assessment dimension.
Since the weights were set equal for all the questions, the scores are simple average values. The company does not rule out assigning different weights in the future; in that case, the scores would be recalculated as a weighted average.
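The scoring described above can be sketched in a few lines: with equal weights the dimension score reduces to the arithmetic mean of the per-question levels, and with company-defined weights it becomes a weighted average. The function name and the example level vector below are our own illustrations, not taken from the paper.

```python
def dimension_score(levels, weights=None):
    """Score of one assessment dimension as a weighted average of question levels (1-5).

    With weights=None all questions count equally, reproducing the plain
    average used in the test described above."""
    if weights is None:
        weights = [1.0] * len(levels)
    if len(weights) != len(levels):
        raise ValueError("one weight per question is required")
    return sum(l * w for l, w in zip(levels, weights)) / sum(weights)

# With equal weights, a Strategy score of 1.8 could arise, for example,
# from per-question levels such as these (hypothetical answers):
print(dimension_score([2, 2, 2, 1, 2]))  # prints 1.8

# A weighted variant, should the company later assign different weights:
print(dimension_score([1, 3], weights=[1, 3]))  # prints 2.5
```

Because the weighted form degenerates to the plain mean when all weights are equal, the company can switch weighting schemes later without changing the scoring procedure.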
Figure 13 shows that the organization is still immature in sharing information and in training employees. Currently, the strategy is in the design phase, and some data collection technologies allow for partial collection and analysis of process output data.
Figure 14 reports a Strategy average score of 1.8. The organization lies at an intermediate level between Absence and Design of digital initiatives. This result is coherent with the organization’s statements. The company has understood the potential of CPS technologies and has identified improvements for the transformation path. Nevertheless, the production process is composed of complex and long activities in which precision is a key element, and manual controls and rework operations are currently frequent. The organization expects CPS technologies to speed up operations by automating them, making it possible to avoid some of the controls and reworks that result from manual assemblies. The company has also understood that the transformation requires real-time data acquisition and analysis to predict anomalous process trends; it has therefore planned a strategy involving sensor technologies, such as structural technologies and sensing techniques. Having recognized this potential, pilot initiatives and budget definition are in the design phase, but specific indicators to monitor the implementation and progress of the strategy and of the pilot initiatives are still missing.
Figure 15 shows a Data collection and analysis average score of 2.25. The organization lies at an intermediate level between Design and Starting of digital initiatives. As observed by the researchers during the visit to the production facilities, and attested by the organization, data collection is partially enabled in the potting production step, mainly concerning measures of the output products. Data transfer to the company’s database is carried out only through USB sticks. Moreover, data are analyzed only when significant anomalies occur.
Figure 16 shows an Information Sharing average score of 1.5, placing the organization at an intermediate level between Absence and Design of digital initiatives. The declarations confirm that information sharing is incomplete, and that the transformational strategy does not yet include the introduction of Smart Products/Services, nor a new IT security system. Internal and external information is transmitted only among a few of the potentially interested actors, limiting data transparency. Real-time sharing is not available; sharing occurs at pre-determined moments or on demand. Information about the progress of panel design is not communicated in a timely manner, even though it would be significant for the production department and for the downstream phases (finalization, packaging, and shipping). External information sharing does not involve suppliers and customers, even though it seems feasible since they are few. Suppliers and customers cannot communicate in real time, and this sometimes causes information losses, reworks, and ineffectiveness. As an example, customer complaints are of interest to suppliers when the non-compliance originates from the raw material.
Figure 17 shows an Employees average score of 1.5, placing the organization at an intermediate level between Absence and Design of digital initiatives. Conversations with employees revealed that some of them had been informed about the future introduction of CPS. The functioning of these technologies was presented internally through informative videos showing their basic features and functionality. The organization aims to provide ongoing training, integrating it with interactive tools to give employees first-hand experience. New processes and working methods, which will change significantly, are not yet included. The questionnaire provided valuable insights for the organization, indicating that employees should also receive training on these new processes and working methods to involve them fully in the innovation process. As a result, the organization stated that it will include this training in its strategy.
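The qualitative positioning used throughout this discussion (e.g., "between Absence and Design" for a score of 1.5 or 1.8) amounts to locating a fractional score between two adjacent named levels. A minimal sketch follows, assuming that the five levels map to scores 1–5; only the first three level names (Absence, Design, Starting) appear in this section, so the last two entries below are placeholders, not the paper’s actual labels.

```python
def qualitative_position(score, level_names):
    """Translate a numeric dimension score into the qualitative positioning
    used in the discussion above. Levels are scored 1..len(level_names)."""
    if not 1 <= score <= len(level_names):
        raise ValueError("score outside the level range")
    lower = int(score)
    if score == lower:  # the score falls exactly on a level
        return level_names[lower - 1]
    return f"between {level_names[lower - 1]} and {level_names[lower]}"

# The last two names are hypothetical placeholders; the full Assessment
# Matrix defines the real labels for levels 4 and 5.
names = ["Absence", "Design", "Starting", "Level 4", "Level 5"]
print(qualitative_position(1.8, names))   # prints "between Absence and Design"
print(qualitative_position(2.25, names))  # prints "between Design and Starting"
```

This makes explicit why intermediate scores "do not generate confusion": the mapping to a qualitative interval is unambiguous for any score in range.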
7. Conclusions and Future Developments
The literature review and the state of the art indicate a steady proliferation of MMs, tools intended to measure and monitor digital evolution. However, some critical issues emerged. Although carefully designed MMs can support organizations in this regard, the research highlights implementation gaps, namely:
− MMs are either highly case-specific or strongly generalizing, and it is difficult for an organization to implement a ready-made MM when considering its specific organizational and technological context, as well as its assessment objectives;
− MMs are useful tools only if they are supported by an expert evaluation, allowing the result to be compared with the analyzed process, placed in context, and supplemented with expert and practical advice.
Regarding the first remark, it is clear that MMs lack the flexibility required to be applied in a wide range of contexts. While selecting an existing MM may be appropriate for a quick or shallow analysis, in most cases it does not provide accurate and useful assessments. To arrive at such assessments, organizations must either modify existing models or develop ad hoc ones. This indicates that, to use a MM effectively, it is necessary to analyze and formalize the aspects relating to the organizational context and the evaluation objectives, which are extremely case-specific. The consequence is that it remains challenging to compare the maturity levels of different organizations through MMs, even when they are compared using the same model: if the MM used for comparison is the same, it probably does not capture context-specific aspects, resulting in a shallow evaluation; on the other hand, the use of different MMs, based on different criteria and logics, makes meaningful comparisons even more difficult. Nevertheless, a MM remains a valuable tool for internal comparison: the same organization can assess and monitor its development in terms of digitization against a temporal benchmark, which effectively allows progress to be tracked over time.
Regarding the second consideration, the MM developed in the case study presented requires expert technical support both in the design phase and in the analysis of the results. Additionally, the analysis of the results necessitated comparison with experts within the evaluated organization. This is also relevant in terms of providing practical guidelines to support the innovation pathway. This research demonstrates that this aspect remains a significant limitation in the practical application of MMs, despite the tendency of research to overlook it.
With reference to the RQs presented, the presence of a common structure within MMs is shown (RQ1), hidden behind the different labels that authors adopt to identify the same concepts. As also pointed out by other contributors, when designing new MMs, developers tend to invent new labels for common MM approaches in order to stand out from the plethora of already existing MMs. Therefore, it is worth highlighting the common structure underlying these MMs (i.e., the three-backbone-structure depicted in Sec. 4). Analyzing this structure, it emerges that the definitions of the assessment dimensions, assessment scales, and assessment matrix are based on different rationales, which characterize MMs and make their assessment potential not wide enough to cover a broad range of purposes (RQ2). The formalization of these three dimensions, together with the company’s motivation, must guide the choice of MMs and their design.
To follow up on this evidence and respond to RQ3, a meta-model-driven approach was designed. Namely, first-level inputs (company context and assessment objectives) and second-level inputs (i.e., the constituent elements of the three-backbone-structure: assessment dimension, assessment scale, and assessment matrix), responding directly to the needs and to the organizational context, must be defined. Once these inputs are established, the process of defining an appropriate MM begins. This process may involve choosing an existing MM, modifying one or more existing MMs, or designing a MM ex novo. In the design of a MM, the choices are again driven, i.e., the inputs for the building of the MM must be accounted for.
Moreover, the MMs’ state of the art represents a useful theoretical and practical result for both businesses and researchers. Businesses can immediately gain visibility and comprehension of the existing MMs and of the distinctive features of each one. Researchers can better appreciate the state of the art on the topic, identifying common trends and existing gaps in terms of model development, and filling these by proposing new support tools. Furthermore, the state of the art is integrated into the meta-model-driven approach, providing further practical support. This support is evidenced by the Suitability Analysis, i.e., the analysis that offers a starting point for organizations looking to implement a MM. It measures how well each model responds to the topics that emerged within MMs. Organizations can use this information to make an initial selection of MMs, so that companies select only the MMs consistent with their context and assessment needs.
The innovative approach presented was tested on a real industrial case study. The ongoing discussion with the organization, both in the MM design phase and in the results analysis stage, allowed for a critical evaluation of the outcome of the test. The results obtained were satisfactory. The procedure proved efficient in identifying a set of MMs suitable for the assessment and, starting from their backbone structure, in guiding the design of a new MM that fits the organizational and technological context and the assessment objectives well. The scores obtained were representative of the company’s level of maturity, and the model provided insights and guidelines for future actions and for the revision of the company’s innovation strategy.
Future research will address the validation of the approach on a significant number of companies and its consolidation. Moreover, to continuously improve and upgrade the approach, consistent feedback should be provided by the companies and by the groups involved in the research. This will make it possible to adapt the approach to rapid technological development and to the resulting new company needs. The assessment of different realities will identify any potential critical areas of the approach and verify whether some industrial sectors need a better-tailored approach. The qualitative exploratory case study presented involved direct collaboration between the researchers and the company. At the end of the collaboration, business representatives declared that the methodology was clear and easy to implement, and that the results were helpful in defining the digitization path. The collaboration provided the company with the necessary clarifications on the methodological approach, while allowing the researchers to gain closer insight into the production processes evaluated; this direct dialogue ensured full adherence between the methodological principles and the result obtained. However, the need for such direct collaboration may represent a limitation in some contexts, and it would be worthwhile to analyze the effectiveness of the approach in the absence of direct contact between the researchers and the actors in the company. To address this, future developments will also test the method as a stand-alone tool, to identify potential areas for improvement and to ensure that it is unambiguous and comprehensive. This would also enable the organizations themselves to verify whether the approach remains self-explanatory, clear, and feasible. The resulting MMs could then be analyzed by the researchers to confirm their methodological soundness.
This could inherently improve companies’ ability to develop, analyze, and implement MMs, which is still an open challenge nowadays, while allowing researchers to investigate such capacity and give suggestions for improvement.
Funding
The authors declare no funding.
Declaration of Conflicts of Interest
The authors declare that they have no known competing interests.
ORCID
Margherita Bernabei https://orcid.org/0000-0002-0917-0845
Francesco Costantino https://orcid.org/0000-0002-0942-821X
Massimo Tronci https://orcid.org/0000-0002-6648-9350