13th European Conference on Software Architecture (ECSA) - 9-13 September 2019, Paris France
ECSA Week in a nutshell
| Day | Program | Sessions |
|---|---|---|
| Monday, 9th September | Workshop day | SACBD, SAEroCon, DeMeSSA |
| Tuesday, 10th September | Workshop day | |
| Wednesday, 11th September | Main conference | K1, RS1, RS2, RS3, DS1, K2, DS2, DS3 |
| Thursday, 12th September | Main conference | K3, T1, IS1, K4, IS2, WS1, P1 |
| Friday, 13th September | Main conference | K5, RS4 |
[Full Paper] Amine El Malki and Uwe Zdun: Guiding Architectural Decision Making on Service Mesh Based Microservice Architectures
Microservices are becoming the de-facto standard for software development in the cloud and in service-oriented computing. Service meshes have been introduced as a dedicated infrastructure for managing a network of containerized microservices, in order to cope with complexity, manageability, and interoperability challenges, especially in large-scale microservice architectures.
Unfortunately, so far no dedicated architecture guidance exists for designing microservices and choosing among technology options in a service mesh. As a result, there is substantial uncertainty in designing and using microservices in a service mesh environment today. To alleviate this problem, we performed a model-based, qualitative in-depth study of existing practices in this field, in which we systematically studied 40 reports of established practices from practitioners. We modeled our findings in a rigorously specified, reusable architectural decision model, in which we identified 14 architectural design decisions with 47 decision outcomes and 77 decision drivers in total. We estimated the uncertainty in the resulting design space with and without the use of our model, and found that a substantial uncertainty reduction can potentially be achieved by applying our model.
[Full Paper] Evangelos Ntentos, Uwe Zdun, Konstantinos Plakidas, Daniel Schall, Fei Li and Sebastian Meixner: Supporting Architectural Decision Making on Data Management in Microservice Architectures
Today many service-based systems follow the microservice architecture style. As microservices are used to build distributed systems and promote architecture properties such as independent service development, polyglot technology stacks including polyglot persistence, and loosely coupled dependencies, architecting data management is crucial in most microservice architectures. Many patterns and practices for microservice data management architectures have been proposed, but are today discussed mainly informally in the so-called "grey literature": practitioner blogs, experience reports, and system documentations. As a result, the architectural knowledge is scattered across many knowledge sources that are usually based on personal experiences, inconsistent, and, when studied on their own, incomplete. In this paper we report on a qualitative, in-depth study of 35 practitioner descriptions of best practices and patterns on microservice data management architectures. Following a model-based qualitative research method, we derived a formal architecture decision model containing 325 elements and relations. Comparing the completeness of our model with an existing pattern catalog, we conclude that our architectural decision model substantially reduces the effort needed to sufficiently understand microservice data management decisions, as well as the uncertainty in the design process.
[Full Paper] Luís Nunes, Nuno Santos and António Rito Silva: From a Monolith to a Microservices Architecture: An Approach Based on Transactional Contexts
Microservices have become the software architecture of choice for business applications. Originating at Netflix and Amazon, they result from the need to partition both software development teams and executing components in order to foster, respectively, agile development and horizontal scalability. Currently, a large number of monolith applications are being migrated to a microservices architecture. This article proposes the identification of a business application's transactional contexts for the design of microservices. The emphasis is therefore on driving the aggregation of domain entities by the transactional contexts in which they are executed, instead of by their domain structural inter-relationships. Additionally, we propose a complete workflow for the identification of microservices, together with a set of tools that assist developers in this process. The comparison of our approach with another software architecture tool and with an expert decomposition in two case studies revealed high precision values, reflecting that accurate service candidates are produced, while the provided visualization perspectives facilitate the analysis of the decomposition's impact on the application business logic.
[Full Paper] Ken Power and Rebecca Wirfs-Brock: An Exploratory Study of Naturalistic Decision Making in Complex Software Architecture Environments
Architects always make decisions in some context, and that context shifts and changes dynamically. Different decision-making strategies are appropriate in different contexts. Architecture decisions are at times made under conditions of time pressure, high stakes, uncertainty, and insufficient information. At other times, decision-makers have sufficient time to reflect on the decision and consider alternatives. Understanding context is critical to choosing appropriate approaches to architecture decision making. Naturalistic Decision Making (NDM) explains how people make decisions under real-world conditions. This paper investigates NDM in software architecture and studies architecture decisions in their environment and decision-making context. The research approach includes a case study of large technology organizations, comprising a survey, multiple focus groups, and participant observation. Previous studies that touch on NDM in software architecture have mainly focused on decision-making processes, tools, or the development of decision models. This paper provides three contributions. First, we build on previous studies by other researchers to produce an in-depth exploration of NDM in the context of software architecture. We focus on Recognition-Primed Decision (RPD) making as an implementation of NDM. Second, we present an examination of the decisions made by experienced architects under conditions that can be considered naturalistic. Third, we provide examples and recommendations that help software architects determine when an NDM approach is appropriate for their context.
[Full Paper] Hasan Sozer: Evaluating the Effectiveness of Multi-level Greedy Modularity Clustering for Software Architecture Recovery
Software architecture recovery approaches mainly analyze various types of dependencies among software modules to group them and reason about the high-level structure of a system. These approaches employ a variety of clustering techniques. In this paper, we present an empirical evaluation of a modularity clustering technique used for software architecture recovery. We use five open source projects as subject systems for which the ground-truth architectures were known. This dataset was previously prepared and used for evaluating four state-of-the-art architecture recovery approaches and their variants, as well as two baseline clustering algorithms. We extended that empirical study with an evaluation of multi-level greedy modularity clustering. Results showed that the accuracy of this algorithm is comparable to that of the best-known algorithm so far, outperforming it in approximately half of the cases. In addition, it scales better to very large systems, for which it runs orders of magnitude faster than all the other algorithms.
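As a hedged illustration of the technique evaluated above (not the paper's implementation or dataset), greedy modularity clustering can be sketched with networkx on a tiny module-dependency graph; the module names and dependencies here are invented for illustration:

```python
# Sketch: greedy modularity clustering on a toy module-dependency graph.
# Module names and dependencies are invented for illustration only.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

G = nx.Graph()
G.add_edges_from([
    # a tightly coupled "ui" subsystem
    ("ui.view", "ui.ctrl"), ("ui.ctrl", "ui.model"), ("ui.view", "ui.model"),
    # a tightly coupled "db" subsystem
    ("db.conn", "db.pool"), ("db.pool", "db.query"), ("db.conn", "db.query"),
    # a single cross-subsystem dependency
    ("ui.model", "db.query"),
])

# Greedily merge clusters while modularity increases; each resulting
# community is a candidate architectural module in a recovery setting.
clusters = [set(c) for c in greedy_modularity_communities(G)]
```

On this toy graph the clustering recovers the two intended subsystems, splitting at the single cross-subsystem edge; the paper's "multi-level" variant applies such clustering hierarchically.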
[Short Paper] Matthias Galster, Fabian Gilson and Francois Georis: What Quality Attributes Can we Find in Product Backlogs? A Machine Learning Perspective
Automatically identifying quality attributes (e.g., security, performance) in agile user stories could help architects reason about early architecture design decisions before analyzing a product backlog in detail (e.g., through a manual review of stories). For example, architects may already get the "bigger picture" of potential architectural key drivers and constraints. Applying a previously developed method to automatically identify quality attributes in user stories, in this paper we investigate a) what quality attributes are potentially missed in an automatic analysis of a backlog, and b) how the importance of quality attributes (based on the frequency of their occurrence in a backlog) differs from that of quality attributes identified in a manual review of a backlog. As in previous work, we analyzed the backlogs of 22 publicly available projects including 1,675 stories. For most backlogs, automatically identified quality attributes are a subset of quality attributes identified manually. On the other hand, the automatic identification would usually not find more (and therefore potentially irrelevant) quality attributes than a manual review. We also found that the ranking of quality attributes differs between the automatically and manually analyzed user stories, but the overall trend of rankings is consistent. Our findings indicate that automatically identifying quality attributes can reduce the effort of an initial backlog analysis and still provide useful (even though high-level and therefore potentially incomplete) information about quality attributes.
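The paper's automatic identification method is not reproduced here; as a simplified, hypothetical sketch of the general idea, a keyword-based tagger over user stories (keyword lists and example stories invented) might look like:

```python
# Hypothetical keyword-based quality-attribute tagger for user stories.
# Keyword lists and example stories are invented for illustration; the
# paper applies a previously developed (and more sophisticated) method.
QA_KEYWORDS = {
    "security": {"encrypt", "password", "authenticate", "secure"},
    "performance": {"fast", "latency", "throughput", "response time"},
    "usability": {"easy", "intuitive", "accessible"},
}

def tag_story(story):
    """Return the set of quality attributes whose keywords occur in a story."""
    text = story.lower()
    return {qa for qa, words in QA_KEYWORDS.items()
            if any(w in text for w in words)}

backlog = [
    "As a user, I want my password stored encrypted.",
    "As an admin, I want the dashboard to load fast.",
]
tags = [tag_story(s) for s in backlog]
# Counting tag frequencies across the whole backlog would then give the
# occurrence-based "importance" signal discussed above.
```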
[Short Paper] Ivan Lujic and Hong-Linh Truong: Architecturing Elastic Edge Storage Services for Data-Driven Decision Making
In the IoT domain, a massive number of smart sensors, devices, and equipment produce a variety of data at unprecedented scale. To analyze the produced data at the right time for decision making, data analytics at the network edge has proved to be a promising solution. Nevertheless, unlike scalable cloud-based storage services, edge storage has limited capacity, posing a crucial challenge: maintaining only the most relevant data for edge analytics. Currently, these problems are addressed mostly from traditional cloud-based database perspectives, including storage optimization and resource elasticity, while data analytics approaches and system operations are investigated separately. To better support future edge analytics, in this work we analyze requirements and dependencies in edge data services together with edge data analytics support. We propose a novel, holistic approach for architecturing elastic edge storage services, featuring three aspects, namely data/system characterization (e.g., metrics, key properties), system operations (e.g., elasticity, data management), and data processing utilities (e.g., approximation, prediction). In this regard, we present seven principles for the architecture design and engineering of edge data services.
[Full Paper] Angelika Musil, Juergen Musil, Danny Weyns and Stefan Biffl: Continuous Adaptation Management in Collective Intelligence Systems
Collective Intelligence Systems (CIS), such as wikis and social networks, enable enhanced knowledge creation and sharing at organization and society levels. From our experience in R&D projects with industry partners and in-house CIS development, we learned that these platforms go through a complex evolution process. A particularly challenging aspect in this respect is the uncertainty that can appear at any time in the life-cycle of such systems. A prominent way to deal with uncertainty is adaptation, i.e., the ability to adjust or reconfigure the system in order to mitigate the impact of the uncertainty. However, there is currently a lack of consolidated design knowledge on CIS-specific adaptation and methods for managing it. To support software architects, we contribute an architecture viewpoint for continuous adaptation management in CIS, aligned with ISO/IEC/IEEE 42010. We evaluated the viewpoint in a case study with a group of eight experienced engineers. The results show that the viewpoint is well-structured, useful and applicable, and that its model kinds cover well the scope to handle different CIS-specific adaptation problems.
[Full Paper] Tobias Wägemann, Ramin Tavakoli Kolagari and Klaus Schmid: ADOOPLA -- Product-Line- and Product-Level PLA Optimization
Product lines of software-intensive systems have a great diversity of features and products, which leads to vast design spaces that are difficult to explore. In addition, finding optimal product line system architectures usually requires considering several quality trade-offs at once, involving both product-level and product-line-wide criteria. This challenge cannot be solved manually for anything but the smallest problems and can therefore benefit from automated support. In this paper, we propose ADOOPLA, a tool-supported approach for the optimization of product line system architectures. In contrast to existing work, in which product-level approaches only support product-level criteria and product-line-oriented approaches only support product-line-wide criteria, our approach integrates criteria from both levels in the optimization of product line architectures. Further, the approach can handle multiple objectives at once, supporting the architect in exploring the multi-dimensional Pareto front of a given problem. We describe the theoretical principles of the ADOOPLA approach and demonstrate its application to a simplified case study from the automotive domain.
[Full Paper] Michael Mayrhofer, Christoph Mayr-Dorn, Alois Zoitl, Georg Weichhart, Ouijdane Guiza and Alexander Egyed: Assessing Adaptability of Software Architectures for Cyber Physical Production Systems
Cyber physical production systems (CPPS) focus on increasing the flexibility and adaptability of industrial production systems, systems that comprise hardware such as sensors and actuators in machines as well as software controlling and integrating these machines.
The requirements of customised mass production imply that control and integration software needs to be adaptable without interrupting production. Software architecture plays a central role in achieving run-time adaptability. In this paper we describe five architectures that define the structure and interaction of software components in CPPS. Three of them are already well developed and used in the field. The remaining two we envision to overcome limitations of the first three architectures.
We analyse the architectures' ability to support adaptability based on Taylor et al.'s BASE framework. We compare the architectures and discuss how the implications of CPPS affect the analysis with BASE. We further highlight what lessons from "traditional" software architecture research can be applied to arrive at adaptable software architectures for cyber physical production systems.
[Full Paper] Rajitha Yasaweerasinghelage, Mark Staples, Hye-Young Paik and Ingo Weber: Optimising architectures for performance, cost, and security
Deciding on the optimal architecture of a software system is difficult, as the number of design alternatives and component interactions can be overwhelmingly large. Adding security considerations can make architecture evaluation even more challenging. Existing model-based approaches for architecture optimization usually focus on performance and cost constraints. This paper proposes a model-based architecture optimization approach that advances the state-of-the-art by adding security constraints. The proposed approach is implemented in a prototype tool by extending the Palladio Component Model (PCM) and PerOpteryx. Through a laboratory-based evaluation study of a multi-party confidential data analytics system, we show how our tool discovers distinct architectural design options on the Pareto frontier of cost and performance.
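As a small, hypothetical illustration of the Pareto-frontier idea mentioned above (not the PerOpteryx/PCM tooling itself), a dominance filter over invented (cost, response-time) candidates could be sketched as:

```python
# Sketch: Pareto-frontier filtering of architectural candidates by cost
# and response time (lower is better on both). Candidate names and
# numbers are invented for illustration.
def pareto_front(candidates):
    """Return names of candidates not dominated on (cost, response_time)."""
    front = []
    for name, cost, rt in candidates:
        dominated = any(
            c2 <= cost and r2 <= rt and (c2 < cost or r2 < rt)
            for _, c2, r2 in candidates
        )
        if not dominated:
            front.append(name)
    return front

candidates = [("A", 10, 5), ("B", 8, 7), ("C", 12, 6), ("D", 9, 9)]
front = pareto_front(candidates)  # A and B survive; C and D are dominated
```

A real optimization run would add a third (security) dimension to the dominance check, which is the kind of extension the paper contributes.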
[Full Paper] Martina De Sanctis, Romina Spalazzese and Catia Trubiani: QoS-based Formation of Software Architectures in the Internet of Things
Architecting Internet of Things (IoT) systems is very challenging due to the heterogeneity of connected objects and devices and their dynamic variabilities, such as mobility and availability. The complexity of this scenario is exacerbated when considering Quality-of-Service (QoS) constraints. Indeed, reasoning about multiple quality attributes, e.g., power consumption and response time, makes the management of IoT systems even more difficult, since it is necessary to jointly evaluate multiple system characteristics. The focus of this paper is on modelling and analysing QoS-related characteristics in IoT architectures. To this end, we leverage the concept of Emergent Architectures (EAs), i.e., a set of things temporarily cooperating to achieve a given goal, by intertwining EAs with QoS-related concepts. Our approach provides the automated formation of the most suitable EAs by means of a QoS-based optimisation problem. We developed an IoT case study, and experimental results demonstrate the effectiveness of the proposed approach.
[Full Paper] Cristian Camilo Castellanos Rodriguez, Boris Rainiero Perez Gutierrez, Carlos A. Varela, Maria Del Pilar Villamil and Dario Correal: A Survey on Big Data Analytics Solutions Deployment
There is widespread and increasing interest in big data analytics (BDA) solutions to enable data collection, transformation, and predictive analyses. The development and operation of BDA applications involve business innovation, advanced analytics, and cutting-edge technologies, which add new complexities to traditional software development. Although there is a growing interest in BDA adoption, successful deployments are still scarce (a.k.a. the "Deployment Gap" phenomenon). This paper reports an empirical study on BDA deployment practices, techniques, and tools in the industry from both the software architecture and data science perspectives, to understand the research challenges that emerge in this context. Our results suggest new research directions to be tackled by the software architecture community. In particular, competing architectural drivers, interoperability, and deployment procedures in the BDA field are still immature or have not been adopted in practice.
[Short Paper] Axel Busch, Dominik Fuchß, Maximilian Eckert and Anne Koziolek: Assessing the Quality Impact of Features in Component-based Software Architectures
In modern software development processes, existing software components are increasingly used to implement functionality instead of developing it from scratch. Reuse of individual components, or even more complex subsystems, leads to more cost-efficient development and higher software quality.
Subsystems often offer a variety of features whose use is associated with unclear effects on the quality attributes of the software architecture, such as performance. It is unclear, especially in the design phase, whether the quality requirements for the overall system can be met by using a certain feature of a particular subsystem. After initial selection, features must be incorporated in the target architecture.
Because there are many possibilities for placing the subsystem in the target system, many architectural candidates may result, which have to be evaluated semi-automatically in existing decision-support solutions. The approach presented here enables software architects to use software architecture models to automatically evaluate the quality effects of using individual features in an existing software architecture. The result helps to automatically evaluate design decisions regarding features and to decide whether their use is compatible with the quality requirements. We show the benefits of our approach using different decision scenarios driven by features and their placement alternatives. All scenarios are evaluated automatically, demonstrating how decisions can be made to best meet the requirements.
[Short Paper] Michael Striewe: Components and Design Alternatives in E-Assessment Systems
In the domain of e-learning and e-assessment, many different components are used to realize particular system features. Even for similar features using similar components, there are different ways of realization in terms of connection and integration. This paper presents results from a literature review and design-space exploration that result in a catalog of components and an overview of design alternatives.