The list of deliverables to date is as follows:
TITLE | DESCRIPTION |
D1.1 Specification of the use cases | This document outlines the industrial use cases agreed upon by the IoD project partners. The intention of these use cases is to build upon the goals of the project proposal to better define the scope of the project and pave the way for the final demonstrators and assessment results. Most importantly, the use cases are primarily defined as challenges: industry- or even company-specific difficulties that will form the basis for collaboration within the project. This allows the partners to build solutions together and to solve tangible problems that are related to the main problems to be investigated by the IoD project. The document starts with a short background and a capability model for DevOps. The latter is intended to support the definition of the context at each use case owner. Thereafter, each use case owner has provided a section that defines their overall context and describes the details of each of their use cases. |
D1.2 Requirements and Evaluation Criteria | The overall objective of WP1 is to define, specify and implement the project use cases along with the description of the user and technical requirements for the use cases. This document, “D1.2 IOD Requirements and Evaluation Criteria (version 1)”, is the first in a series of two and describes the functional and non-functional requirements along with the evaluation criteria in general terms. The work performed in D1.2 version 1 is based on the results of other deliverables: D1.1 Specification of the use cases, D5.1 Pain Points and D5.4 Key Measures for DevOps Tool Chains. Note that the requirements described in this version will be updated in the second version, where the list will be considered exhaustive. During this year, partners involved in WP1 will continue the work needed to ensure that the solution remains on track with the project vision. The results of this future work will be presented in the second version of the document, “D1.2 IOD Requirements and Evaluation Criteria (version 2)”; as a consequence, the list of requirements and evaluation criteria will be updated and prioritized. In this document, metrics and evaluation criteria for each requirement are proposed. |
D1.3 Requirements and Evaluation Criteria | The overall objective of WP1 is to define, specify and implement the project use cases along with the description of the user and technical requirements for the use cases. This document, “D1.3 IOD Requirements and Evaluation Criteria (version 2)”, is the second in a series of two and describes the functional and non-functional requirements along with the evaluation criteria in general terms. It is a revised version of the first document in the series, “D1.2 IOD Requirements and Evaluation Criteria (version 1)”. The work performed in D1.3 version 2 is based on the results of other deliverables: D1.1 Specification of the use cases, D5.1 Pain Points and D5.4 Key Measures for DevOps Tool Chains, and of course D1.2 IOD Requirements and Evaluation Criteria (version 1). It is important to note that almost all the information from the first version (D1.2) has been kept here, with some additions and revisions. In that respect, D1.3 is the final version of the requirements and the evaluation criteria of the IOD project. In this document, metrics and evaluation criteria for each requirement are proposed. |
D1.4 Intermediate use case detailed reports about demonstrators, assessment results | The aim of this deliverable is to describe the status of the use cases at an intermediate stage. This project covers a wide diversity of cases to show how DevOps technologies can be adapted and applied in several fields, generating similar benefits and contributing to the integration of such solutions in industry: Cyber Physical Systems (SAAB), (VESTEL), Marketing information management (JOT), IT Service Delivery (CONCATEL) and code tracking (TRC). This deliverable uses as input the use cases specified and described in D1.1 and the technical requirements and indicators defined in D1.2. The document covers all the use cases following a similar structure to facilitate alignment and comprehension. After a brief introduction of each case, the sections describe the back-end processes, mainly dealing with data processing and services integration. As the developed solutions and services are integrated under real conditions, supporting front-end developments are also needed for better monitoring and tracking. Finally, all the cases show the status of the main KPIs used to evaluate the improvement generated, comparing the starting values with the current ones and the expected goals. |
D1.5 Final use case detailed reports about demonstrators and assessment results | This document describes and outlines the results of the industrial use cases in the IoD project. The intention of these use cases was to build upon the goals of the project proposal to better define the scope of the project and pave the way for the final demonstrators and assessment results. Most importantly, the use cases were primarily defined as challenges: industry- or even company-specific difficulties that formed the basis for collaboration within the project. This allowed the partners to support each other with knowledge and solve tangible problems that were related to the main problems to be investigated by the IoD project. The document starts with a short background and a capability model for DevOps. The latter supported the definition of the context at each use case owner. Thereafter, each use case owner has provided a section that defines their overall context and describes the details of each of their use cases. |
D2.1 Cross-Cutting Lifecycle Services V1 | This deliverable consists of a set of videos created by the IoD solution providers contributing to WP2. These videos present the main features of lifecycle services and tools to be potentially experimented with or adopted by use case owners within the project. Such features encompass, for instance: Lifecycle Traceability (supporting data integration across the DevOps silos & tools), basic building blocks for enhancing process automation, global versioning and configuration management, core software components for visualization engines and Cloud-based dashboards, or software components for bridging the gap between the operational and development phases of the lifecycle. The main content of this deliverable consists of a set of videos provided as files that have been recorded and edited directly by the partners. Therefore, the current document only provides short descriptions of these videos and references to these files. |
D2.2 Cross-Cutting Lifecycle Services V2 | This deliverable consists of a set of videos created by the IoD solution providers contributing to WP2. These videos present the main features of lifecycle services and tools to be potentially experimented with or adopted by use case owners within the project. Such features encompass, for instance: Lifecycle Traceability (supporting data integration across the DevOps silos & tools), basic building blocks for enhancing process automation, global versioning and configuration management, core software components for visualization engines and Cloud-based dashboards, or software components for bridging the gap between the operational and development phases of the lifecycle. The main content of this deliverable consists of a set of videos provided as files that have been recorded and edited directly by the partners. Therefore, the current document only provides short descriptions of these videos and references to these files. D2.2 is the second iteration of deliverable D2.1, which was released before our Celtic Mid-Term Project Review. |
D2.3 Proof of Concept for a semi-automatic toolchain generation and configuration service – V1 | Delivered in the form of a software demo |
D2.4 Proof of Concept for a semi-automatic toolchain generation and configuration service – V2 | Delivered in the form of a software demo |
D3.1 Automated Testing tool for DevOps | This document deals with the integration of all the available automated tests and their feedback into a single DevOps loop and, depending on the situation, with capabilities to customize the execution of such tests with regard to: 1. controlling the execution of existing tests depending on the changes detected in the development and their dependencies; 2. automatically creating and executing new test cases. The introduction provides a general description of these integration problems. Thereafter, each partner involved in this approach describes their activities and results in a separate section. |
D3.2 Automated Quality Analyses of Natural Language Texts | This document presents the work performed and the results of the WP3 effort carried out by all the partners in this work package that have used Natural Language Processing (NLP) techniques for automating DevOps processes, applied to artefacts such as integration log files, test outputs, version control comments by developers, customer feedback, etc. and, in general, any kind of structured or unstructured information created as a result of a loop in the DevOps lifecycle. |
D3.3 DevOps for hardware dependant software Demonstrator | Delivered in the form of software demo |
D3.4 WP3 Intermediate Results | This document consists of a description of all the activities and tasks contributing to WP3 DevOps Services for Continuous Integration, Quality Control and Deployment. The content is in the form of images as well as links to videos. The intention of collecting these results is not only to provide a basis for the next version of the document, due in the final year of the project, but also to streamline the WP3 results across the partners. Part of the topics addressed in this deliverable will also be dealt with, from a more specific and detailed point of view, in deliverables D3.1 Automated Testing tool for DevOps, D3.2 Automated Quality Analyses of Natural Language Texts and D3.3 DevOps for hardware dependant software Demonstrator, to be released by the end-of-project milestone. The automated testing efforts have been focused on the use cases provided in WP1; the main result is that the integration of the heterogeneous tool ecosystem has in general been achieved, with some specific problems in use cases dealing with legacy parts and with non-deterministic failures in another. |
D4.1 Visualization Tools for Continuous Integration V1 (Software & Videos, RE) | The tasks underlying this deliverable aim to develop data collection methods and methodologies for monitoring continuous integration in DevOps environments. Moreover, several key performance indicators (KPIs) will be identified through earlier tasks related to the quality assessment of the integration, and data relevant to these KPIs will be collected. Decisions on data visualization techniques and the development of a dashboard will be made according to these data. The dashboard will be an interactive web-based tool for informing several stakeholders about the DevOps quality measures. Several visual analytics approaches will be embedded in the dashboard for a better user experience. This document provides an outlook on the visualisation tools for continuous integration by a subset of the IoD project partners, namely Enforma, Ericsson-Turkey, Ericsson-Sweden, JOT-IM, KTH, and REUSE. The content will be in the form of images as well as links to images and videos. The intention of collecting these results is not only to provide a basis for the next version of the document, due in the final year of the project, but also to streamline the visualisation results across the partners. |
D4.2 Visualization Tools for Continuous Integration V2 (Software & Videos, RE) | The tasks underlying this deliverable aim to develop data collection methods and methodologies for monitoring continuous integration in DevOps environments. Moreover, several key performance indicators (KPIs) will be identified through earlier tasks related to the quality assessment of the integration, and data relevant to these KPIs will be collected. Decisions on data visualization techniques and the development of dashboards will be made according to these data. The dashboards will be interactive web-based tools for informing several stakeholders about the DevOps quality measures. Several visual analytics approaches will be embedded in the dashboards for a better user experience. D4.1, the prequel to this D4.2, was already published in M16. Therefore, D4.2 contains only new content; material already reported in D4.1 is not repeated here. |
D4.3 Data Analytics and Machine Learning for DevOps V1 (Software, Videos, Report) | This version of the document is the first in a series of two that are part of Task 4.2 (Data Analytics / Machine Learning) and describes the development of different tools based on Machine Learning (ML) techniques that will be applied to the large amount of data generated from DevOps environments. This deliverable consists of software and reports to analyse, design and implement different services based on ML algorithms and AI techniques to provide better insight into data obtained through automatic or semi-automatic acquisition from within the DevOps life cycle. |
D4.4 Data Analytics and Machine Learning for DevOps V2 (Software, Videos, Report) | This version of the document is the last in a series of two that are part of Task 4.2 (Data Analytics / Machine Learning) and describes the development of different tools based on Machine Learning (ML) techniques that will be applied to the large amount of data generated from DevOps environments. This deliverable consists of software and reports to analyse, design, and implement different services based on ML algorithms and AI techniques to provide better insight into data obtained automatically or semiautomatically from within the DevOps life cycle. |
D5.1 Pain Points | This report focuses on WP5 Deliverable D5.1, the identification and documentation of Pain Points of classical agile development and operations when trying to apply DevOps. The results provide input to the further work in this work package as well as other work packages. The work of identifying and documenting Pain Points will continue as part of WP5, and this is the first report. Much of the work has been to define models and methods to be able to classify and map the Pain Points to WHERE the Pain Points occur in the process, and WHY they occur. For mapping WHERE the Pain Points occur, a DevOps practice model illustrating the different lifecycle phases of a DevOps process has been used. The model may be updated as we learn from open-ended questions used in the identification of Pain Points. For mapping WHY Pain Points occur, i.e., what the underlying deficiencies in the organisations are, a capability model has been developed that comprises elements such as governance, processes, and tools. This model will be elaborated and will also be part of the overall result from WP5 defining what constitutes a good organizational capability to perform an effective DevOps practice. |
D5.2 WP5 Standards and Methods | This document consists of a description of all the activities related to Task 5.1: Processes and methods for DevOps in large organizations and Task 5.2: Methods to Quantify and Classify DevOps Tool Chains by Business Value, which will eventually result in the approach for the definition of flexible standards and procedures for System Integration that support DevOps, considering the critical constraints. The objective of work package 5 is to provide process standards and methods as they are needed in a large organization to support a DevOps approach. Further, it should be clarified what the resulting requirements are for a tool chain supporting these adapted methods and processes. |
D5.3 Lessons Learned | Subsumed under D5.5 |
D5.4 DevOps KPIs (metrics) | This document presents an overview of general measurements that can be useful to apply in DevOps environments, regardless of which development process is used, and which can be applied agnostically to any application domain. |
D5.5 Lessons, Opportunities, Constraints and Risks | This document seeks to identify and evaluate generic potential opportunities, constraints and risks associated with the IoD demonstrations, use cases and the lessons learned from applying DevOps techniques in organizations. Specific objectives were to identify measurable and unmeasurable characteristics of DevOps tool chains, the constraints and bottlenecks organizations may face during DevOps implementations, and the risks that the measurements are misleading. In this document we also address the outcomes of applying different DevOps practices in organizations varying from electronics and telecom to defence and security. We list these practices as different lessons learned during the project. |
D5.6 Pain Points | This report, WP5 Deliverable D5.6, documents the identification and definition of Pain Points of classical agile development and operations when trying to apply DevOps. The work has been carried out in two rounds, one early in the project and one following the midterm, and the results have provided input to the further work in this work package as well as other work packages. Much of the work has been to define models and methods to be able to classify and map the Pain Points to WHERE the Pain Points occur in the process, and WHY they occur. For mapping WHERE the Pain Points occur, a DevOps practice model illustrating the different lifecycle phases of a DevOps process has been used. The model may be updated as we learn from open-ended questions used in the identification of Pain Points. For mapping WHY Pain Points occur, i.e., what the underlying deficiencies in the organisations are, a capability model has been developed that comprises elements such as governance, processes, and tools. This model also supports defining what constitutes a good enough organizational capability to perform an effective DevOps practice. |
D6.1 Dissemination and exploitation plan | This report is the version V0.1 of deliverable 6.1 “Dissemination and exploitation plan” of the IoD project. It captures the outcomes of different project tasks. The general principles and target audience of the dissemination strategy are discussed first. The actual strategy describes the methods that will be used for dissemination and exploitation. Finally, the last section summarizes the results of the dissemination activities. |
D6.2 Dissemination and exploitation plan | This report is the second version of the deliverable “Dissemination and exploitation plan” of the IoD project. It captures the outcomes of different project tasks. The general principles and target audience of the dissemination strategy are discussed first. The actual strategy describes the methods that will be used for dissemination and exploitation. Finally, the last section summarizes the results of the dissemination activities. |
D6.3 Dissemination and exploitation plan | This report is the third version of the deliverable “Dissemination and exploitation plan” of the IoD project. It captures the outcomes of different project tasks. The general principles and target audience of the dissemination strategy are discussed first. The actual strategy describes the methods that will be used for dissemination and exploitation. The second section summarizes the results of the dissemination activities. The last section presents the joint exploitation plans. |
D6.4 WP6 Demonstrations | This document contains the demonstrations that are carried out as project activities by the participants of the IoD Celtic+ project. Each partner demonstrated its own contributions to the project. Here we provide a list of demonstration descriptions that aims to form the baseline for the rest of the demonstrators. In the next versions of this document, we plan to include the demonstrators from the rest of the consortium and make it available for the final review. |
D6.5 WP6 Demonstrations | This document contains the demonstrations that have been carried out as project activities by the participants of the IoD Celtic-Next project. Each partner demonstrated its own contributions to the project. In this version of the document, we emphasize the fact that many partners collaborated within the scope of the project, and we provide the outcomes of such collaborations. Here we provide brief descriptions of the demonstrations and indicate when they took place as collaborations between several partners. |
D6.6 List of Publications | This document provides the list of publications resulting from the efforts of the project participants during the IoD project. Although the IoD project spans a timeline in which publication efforts were greatly affected by the cancellation of public events, fairs and conferences due to the COVID-19 pandemic, the participants of the project still showed interest in disseminating the project outcomes, either through virtual means or in other written forms. |
D7.1 Project Progress Report | This document compiles all the inputs provided by the partners in the Celtic-Next online reporting tool for the reporting period from the 1st of December 2018 to the 31st of May 2019. After a brief introduction of the reporting process put in place in our project, all partner reports are sorted by work package (WP). For each WP, a summary of the partners’ contributions is given, with a list of WP-related achievements and released deliverables for the reporting period. Finally, an overall subjective assessment of the status of the WP is given for each partner. |
D7.2 Project Progress Report | This document compiles all the inputs provided by the partners in the Celtic-Next online reporting tool for the reporting period from the 1st of June 2019 to the 30th of November 2019. After a brief introduction of the reporting process put in place in our project, all partner reports are sorted by work package (WP). For each WP, a summary of the partners’ contributions is given, with a list of WP-related achievements and released deliverables for the reporting period. Finally, an overall subjective assessment of the status of the WP is given for each partner. |
D7.3 Project Progress Report | This document compiles all the inputs provided by the partners in the Celtic-Next online reporting tool for the reporting period from the 1st of December 2019 to the 31st of May 2020. After a brief introduction of the reporting process put in place in our project, all partner reports are sorted by work package (WP). For each WP, a summary of the partners’ contributions is given, with a list of WP-related achievements and released deliverables for the reporting period. Finally, an overall subjective assessment of the status of the WP is given for each partner. |
D7.4 Project Progress Report | This document compiles all the inputs provided by the partners in the Celtic-Next online reporting tool for the reporting period from the 1st of June 2020 to the 30th of November 2020. After a brief introduction of the reporting process put in place in our project, all partner reports are sorted by work package (WP). For each WP, a summary of the partners’ contributions is given, with a list of WP-related achievements and released deliverables for the reporting period. Finally, an overall subjective assessment of the status of the WP is given for each partner. |
D7.5 Project Progress Report | This document compiles all the inputs provided by the partners in the Celtic-Next online reporting tool for the reporting period from the 1st of December 2020 to the 31st of May 2021. After a brief introduction of the reporting process put in place in our project, all partner reports are sorted by work package (WP). For each WP, a summary of the partners’ contributions is given, with a list of WP-related achievements and released deliverables for the reporting period. Finally, an overall subjective assessment of the status of the WP is given for each partner. |
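To illustrate the change-driven test execution described under D3.1 (running only the existing tests affected by a detected change and its dependencies), a minimal sketch follows. The module names, test names and the dependency mapping are entirely hypothetical, introduced for illustration only; they do not come from the deliverable itself, which should be consulted for the actual tooling.

```python
# Minimal, hypothetical sketch of change-driven test selection (D3.1 idea):
# given the set of files touched by a change, select only the tests that
# depend on those files. The mapping below would in practice be derived
# from build or coverage data; here it is hard-coded for illustration.

# Hypothetical mapping: source module -> tests that depend on it.
DEPENDENCIES = {
    "auth.py": ["test_login", "test_tokens"],
    "billing.py": ["test_invoices"],
    "ui.py": ["test_rendering"],
}

def select_tests(changed_files):
    """Return the sorted, de-duplicated set of tests affected by a change set."""
    selected = set()
    for path in changed_files:
        # Files with no known dependants trigger no extra tests here;
        # a real toolchain would fall back to a broader test run.
        selected.update(DEPENDENCIES.get(path, []))
    return sorted(selected)

print(select_tests(["auth.py", "ui.py"]))
# → ['test_login', 'test_rendering', 'test_tokens']
```

In a real pipeline the changed-file list would come from the version control system (e.g. the output of `git diff --name-only`), and the selected tests would be passed to the test runner.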