Publications
List of Publications
Business Informatics Group, TU Wien
Generating automatic as-built BIM models in conventional tunnel construction lifecycle
Dzan Operta, Christian Huemer
Operta, D. (2022). Generating automatic as-built BIM models in conventional tunnel construction lifecycle [Diploma Thesis, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2022.97562
Risk-aware business process management using multi-view modeling: method and tool
Rafika Thabet, Dominik Bork, Amine Boufaied, Elyes Lamine, Ouajdi Korbaa, Hervé Pingaud
Keywords: Software, Information Systems, Consistency, Multi-view modeling, Risk-aware business process management, Meta-modeling
Abstract: Risk-aware Business Process Management (R-BPM) has been addressed in research for more than a decade. However, the integration of the two independent research streams is still ongoing, with a lack of research focusing on the conceptual modeling perspective. Such an integration results in increased meta-model complexity and a higher entry barrier, both for modelers in creating conceptual models and for addressees of the models in comprehending them. Multi-view modeling can reduce this complexity by providing multiple interdependent viewpoints that, all together, represent a complex system. Each viewpoint covers only those concepts that are necessary to separate the different concerns of stakeholders. However, adopting multi-view modeling discloses a number of challenges, particularly related to managing consistency, which is threatened by semantic and syntactic overlaps between the viewpoints. Moreover, the usability and efficiency of multi-view modeling have never been systematically evaluated. This paper reports on the conceptualization, implementation, and empirical evaluation of e-BPRIM, a multi-view modeling extension of the Business Process-Risk Management-Integrated Method (BPRIM). The findings of our research contribute to theory by showing that multi-view modeling outperforms diagram-oriented modeling in terms of usability and efficiency of modeling, and quality of models. Moreover, the developed modeling tool is openly available, allowing its adoption and use in R-BPM practice. Eventually, the detailed presentation of the conceptualization serves as a blueprint for other researchers aiming to harness multi-view modeling.
Thabet, R., Bork, D., Boufaied, A., Lamine, E., Korbaa, O., & Pingaud, H. (2021). Risk-aware business process management using multi-view modeling: method and tool. Requirements Engineering, 26(3), 371–397. https://doi.org/10.1007/s00766-021-00348-2
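The consistency problem the abstract describes, viewpoints that share concepts and must agree wherever they overlap, can be illustrated with a minimal sketch. The element and property names below are hypothetical and this is not e-BPRIM's actual mechanism, only the general idea of detecting conflicts in the semantic overlap between two views:

```python
# Two viewpoints of one model, each as {element name: properties}.
# "AssessRisk" appears in both views: it is part of their overlap.
process_view = {"AssessRisk": {"type": "Activity", "owner": "Clerk"}}
risk_view = {"AssessRisk": {"type": "Activity", "owner": "Manager"},
             "Fraud": {"type": "Risk", "likelihood": "high"}}

def overlap_conflicts(view_a: dict, view_b: dict) -> dict:
    """Return shared elements whose shared properties disagree."""
    conflicts = {}
    for name in view_a.keys() & view_b.keys():
        diff = {k: (view_a[name][k], view_b[name][k])
                for k in view_a[name].keys() & view_b[name].keys()
                if view_a[name][k] != view_b[name][k]}
        if diff:
            conflicts[name] = diff
    return conflicts

print(overlap_conflicts(process_view, risk_view))
# → {'AssessRisk': {'owner': ('Clerk', 'Manager')}}
```

A multi-view tool would run such a check on every change, so that editing an element in one viewpoint cannot silently contradict another.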
A technique for evaluating and improving the semantic transparency of modeling language notations
Dominik Bork, Ben Roelens
Keywords: Modeling and Simulation, Software, Modeling language notation, Concrete syntax, Semantic transparency, Empirical evaluation
Abstract: The notation of a modeling language is of paramount importance for its efficient use and the correct comprehension of created models. A graphical notation, especially for domain-specific modeling languages, should therefore be aligned to the knowledge, beliefs, and expectations of the targeted model users. One quality attributed to notations is their semantic transparency, indicating the extent to which a notation intuitively suggests its meaning to untrained users. Method engineers should thus aim at semantic transparency for realizing intuitively understandable notations. However, notation design is often treated poorly, if at all, in method engineering methodologies. This paper proposes a technique that, based on iterative evaluation and improvement tasks, steers the notation toward semantic transparency. The approach can be efficiently applied to arbitrary modeling languages and allows easy integration into existing modeling language engineering methodologies. We show the feasibility of the technique by reporting on two cycles of Action Design Research, including the evaluation and improvement of the semantic transparency of the Process-Goal Alignment modeling language notation. An empirical evaluation comparing the new notation against the initial one shows the effectiveness of the technique.
Bork, D., & Roelens, B. (2021). A technique for evaluating and improving the semantic transparency of modeling language notations. Software and Systems Modeling, 20(4), 939–963. https://doi.org/10.1007/s10270-021-00895-w
Evaluierungsmethoden digitaler Bausysteme -- Methoden und Werkzeuge für die Entwicklung und Etablierung softwaregestützter Prozesse auf Baustellen
Leopold Winkler, Christian Huemer
Abstract: The construction industry is being progressively shaped by the development of digital systems for the planning and execution of construction projects. Depending on their size, companies and clients rely either on in-house developments or on products available on the market. This article shows which options exist for evaluating the creation and introduction of digital systems in the construction industry, both qualitatively and quantitatively.
Winkler, L., & Huemer, C. (2021). Evaluierungsmethoden digitaler Bausysteme -- Methoden und Werkzeuge für die Entwicklung und Etablierung softwaregestützter Prozesse auf Baustellen. Bauaktuell, 5, 219–223. http://hdl.handle.net/20.500.12708/138374
Evaluating the arm TrustZone as an environment for rootkits : Analyzing the impact of a compromised secure world
Daniel Marth, Florian Fankhauser, Thomas Grechenig
Keywords: Arm TrustZone, rootkit, reverse engineering, memory manipulation
Abstract: Mobile devices such as smartphones carry an increasing amount of personal and confidential data. In order to protect sensitive services from malware, the Arm TrustZone logically divides the device into two so-called “worlds”. Critical services run in an isolated execution environment called the “secure world”, which has its own operating system (OS). The regular OS and its applications are located in the “normal world” and can use services provided by the secure world. While the secure world memory is protected from the normal world, the secure world has full access to the normal world memory. Implementations of the Arm TrustZone are vendor-specific and proprietary on currently relevant consumer devices. At the same time, security vulnerabilities have been discovered in all major implementations. In summary, the Arm TrustZone is isolated, proprietary, privileged, vulnerable, and widespread. These properties are perfect preconditions for hosting advanced malware such as rootkits. Usage of the Arm TrustZone as an environment for rootkits was suggested as early as 2013. Since then, to the best of our knowledge, no publications or implementations of rootkits utilizing the Arm TrustZone have been presented. A major challenge for a secure world rootkit is that no semantic interpretation of the normal world memory is available. Reverse engineering of kernel data structures at runtime is required to implement rootkit features. Invariants are used to reconstruct compilation-dependent or randomized symbol addresses. This work makes the following contributions: 1) Design of a rootkit architecture utilizing the secure world. 2) Proof-of-concept implementation of rootkit functions supporting multiple recent Linux kernel versions as the normal world OS and circumventing basic protection mechanisms. 3) Discussion of defensive techniques protecting the normal world from malware running in the secure world.
Reconstructing the internal structures of the kernel depends on the underlying implementation. Linux is an actively developed project, so kernel structures potentially change over time. Minor changes in the source code are compensated for by the rootkit implementation. The stability of the developed rootkit is demonstrated experimentally by testing it on various versions of the Linux kernel.
Marth, D. (2021). Evaluating the arm TrustZone as an environment for rootkits : Analyzing the impact of a compromised secure world [Diploma Thesis, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2021.88999
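The invariant-based reconstruction the abstract mentions can be illustrated with a toy sketch: scan a raw memory dump for a byte pattern that is stable across kernel builds, then read a field at a fixed offset from the match. This is illustrative only, not the thesis's implementation; the invariant string, the offset, and the fake dump layout below are all assumptions:

```python
import struct

# Assumed invariant: the init task's fixed process name ("comm" field).
INVARIANT = b"swapper/0\x00"
PID_OFFSET = -16  # hypothetical offset of the pid field relative to the match

def find_structure(memory: bytes, invariant: bytes) -> int:
    """Return the offset of the first occurrence of the invariant, or -1."""
    return memory.find(invariant)

# Fake "normal world memory": the pid (0) stored 16 bytes before the name.
dump = (b"\xaa" * 100          # unrelated memory
        + struct.pack("<q", 0)  # pid field (8 bytes, little-endian)
        + b"\x00" * 8           # padding
        + INVARIANT
        + b"\xbb" * 50)

off = find_structure(dump, INVARIANT)
pid = struct.unpack_from("<q", dump, off + PID_OFFSET)[0]
print(off, pid)  # → 116 0
```

A real rootkit faces the added difficulty that offsets vary with kernel version and configuration, which is why the thesis resorts to runtime reverse engineering rather than hard-coded layouts.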
Towards model-driven vertical integration : Using IEC 62264 and REA to facilitate real-time data exchange between manufacturing and enterprise systems
Philip Florian Polczer, Christian Huemer
Keywords: Model-Driven Engineering, Vertical Integration, IEC 62264, REA
Abstract: Computer-integrated manufacturing is about digitizing, automating, and connecting manufacturing processes. At the latest, the fourth industrial revolution, introducing the Internet of Things into manufacturing, led to the increasingly widespread practice of connecting the systems involved in manufacturing processes with each other. This integration often makes use of proprietary and customized solutions. Thus, it cannot be easily adapted when new systems are introduced or existing systems are updated. However, effective integration is an essential component of computer-integrated manufacturing. A way to solve this issue is to standardize the integration of systems by focusing on the manufacturing process instead of the systems. The integration should be independent of the underlying systems. In this thesis, we propose two different approaches for connecting systems with each other by making use of existing standards and model-driven engineering. In particular, we make use of the IEC 62264 and ISO/IEC 15944-4:2015 standards in order to provide an implementation of a standardized, model-driven integration of systems operating on different levels of a manufacturing process. In order to evaluate the two approaches, we implement an example manufacturing process and compare the approaches with each other. We show that standardized vertical integration can be achieved by abstracting from the systems and focusing on the process itself, making the integration independent of the underlying systems.
Polczer, P. F. (2021). Towards model-driven vertical integration : Using IEC 62264 and REA to facilitate real-time data exchange between manufacturing and enterprise systems [Diploma Thesis, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2021.73890
TPM 2.0 als Sicherheitsmaßnahme gegen Rootkits auf Linux-basierten Desktop-Systemen
Jasmin Marmsoler, Florian Fankhauser, Thomas Grechenig
Keywords: TPM 2.0, Rootkits, Secure Boot, Measured Boot, Linux boot process
Abstract: Rootkits are a kind of malware that compromise components below the operating system, for example the kernel, bootloader, and BIOS. Rootkits pose a major threat to computer security as they operate with elevated privileges and are often hard to detect from the operating system level. A computer system requires a secure basis and a chain of trust on all levels up to the operating system to increase security. To achieve this, every component of the boot process is measured before being loaded and executed; this method is known as measured boot. Secure Boot, on the other hand, executes only components with a valid signature or a valid hash. The Trusted Platform Module (TPM) is a cryptographic microcontroller located on the computer’s motherboard. It securely stores the measurements of the boot process and can attest to the components’ integrity. This also means that not only users but also remote entities can check the system state. The TPM is a passive module which is called by other components and software. This thesis describes a concept to prevent the execution of rootkits on Linux-based desktop systems through a boot process with TPM 2.0. The concept is a combination of a secure and a measured boot in which updating components should not break the attestation or the trustworthiness of the system.
Marmsoler, J. (2021). TPM 2.0 als Sicherheitsmaßnahme gegen Rootkits auf Linux-basierten Desktop-Systemen [Diploma Thesis, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2021.88000
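The measured-boot chain of trust described in the abstract rests on the TPM's PCR extend operation: a PCR can never be set directly, only extended, so the final value encodes the exact sequence of measured components. A minimal sketch of the hash chain (illustrative; real measurements go through the TPM hardware, and the stage names here are placeholders):

```python
import hashlib

def pcr_extend(pcr: bytes, component: bytes) -> bytes:
    """TPM-style extend: new PCR = H(old PCR || H(component))."""
    measurement = hashlib.sha256(component).digest()
    return hashlib.sha256(pcr + measurement).digest()

# Measured boot: hash each stage into the PCR before running it.
pcr = b"\x00" * 32  # SHA-256 PCRs start zeroed at platform reset
for stage in [b"bootloader", b"kernel", b"initramfs"]:
    pcr = pcr_extend(pcr, stage)

# The final value attests to the exact component sequence: a single
# tampered or reordered stage produces a different PCR, which is what
# lets a verifier detect a rootkit inserted into the boot chain.
print(pcr.hex())
```

Because each extend folds the previous PCR value into the new one, there is no way to "undo" a bad measurement, which is precisely what makes the attestation trustworthy.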
Trends in tunnel information modeling
Keywords: Tunnel Information Model, TIM, Building Information Model, BIM, Infrastructure Information Model, Digital Twin, Systematic Mapping Study
Abstract: Using Information Systems (IS) throughout the life cycle of constructed facilities paved the way for computer-aided engineering in the field of automation in construction. National digitization strategies and governmental mandates in particular accelerated the application of Building Information Modelling (BIM) throughout the life cycle of civil infrastructure facilities. In view of the potential benefits of information models for the tunnel life cycle, this study aims to identify the relevance of Tunnel Information Modelling (TIM) in the scientific community. Therefore, a design science methodology is applied in order to identify and classify relevant research work. The first part of the design science approach is a Systematic Mapping Study (SMS) following the guidelines by Petersen et al. [PVK15]. We conduct the literature classification in the field of TIM (i) to summarize research activities, (ii) to identify relevant publication trends, and (iii) to uncover possible blind spots in the scientific work. In the second part of the design science methodology, we develop a data mining artifact to automate the SMS by adopting the data analysis process model Cross Industry Standard Process for Data Mining (CRISP-DM). In order to evaluate the results of the software artifact, we use well-established classification metrics to measure how well the Data Mining (DM) approach may be used to automate the literature identification and classification and, therefore, provide a set of relevant, state-of-the-art scientific studies.
The main findings of this thesis are (i) that research effort in the area of TIM increased steadily from 2011 to 2019, (ii) that governmental digitization and environmental strategies are major drivers of the increasing number of TIM-related studies, (iii) that the majority of studies concentrate on the application of information models during the design phase of a tunnel project, (iv) that the usage of information models for continuous excavation receives more attention from the scientific community than its conventional counterpart, (v) that a combination of Okapi BM25 and supervised text classification models yields positive performance measures in identifying relevant studies in the domain of TIM, and (vi) that our term-based identification approach is not able to classify studies with regard to tunnel life cycle phases and excavation methods. The results of this thesis can assist researchers (i) in identifying trends and the state of the art of the research domain, (ii) in identifying open research issues, and (iii) in proposing new studies in the field of text classification.
Henglmüller, N. (2021). Trends in tunnel information modeling [Diploma Thesis, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2021.78181
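Okapi BM25, which the thesis combines with supervised classifiers to identify relevant studies, scores a document against query terms using term frequency, inverse document frequency, and document-length normalization. A minimal, self-contained scorer (the toy corpus and parameter values are illustrative; k1 = 1.5 and b = 0.75 are common defaults, and the IDF uses the non-negative log(1 + ...) variant):

```python
import math
from collections import Counter

def bm25_scores(query_terms, docs, k1=1.5, b=0.75):
    """Okapi BM25 score of each tokenized document against the query."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    df = Counter()                      # document frequency per term
    for d in docs:
        for t in set(d):
            df[t] += 1
    scores = []
    for d in docs:
        tf = Counter(d)
        s = 0.0
        for t in query_terms:
            if t not in tf:
                continue
            idf = math.log((N - df[t] + 0.5) / (df[t] + 0.5) + 1)
            # term-frequency saturation (k1) and length normalization (b)
            s += idf * tf[t] * (k1 + 1) / (tf[t] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

docs = ["tunnel information modeling bim lifecycle".split(),
        "bridge design structural analysis".split(),
        "bim tunnel excavation model".split()]
print(bm25_scores("tunnel bim".split(), docs))
```

In a literature-identification setting, each candidate abstract is a document and the domain terms form the query; documents scoring above a threshold are passed on to the supervised classifier.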
Monitoring mobile cyber-physical systems using model-driven engineering
Keywords: model-driven engineering, cyber-physical systems, MDE, SysML, domain-specific modeling
Abstract: Whether in factories, autonomous vehicles, or smart buildings, cyber-physical systems (CPS) are finding increasing use in society. Since CPS, unlike traditional software systems, have a direct influence on their physical environment, monitoring the execution is an essential part of such systems. Just like the actual system, the monitoring application must be designed, developed, and tested. Mobile CPS, in contrast to stationary CPS, bring the additional requirement that instances can dynamically join, leave, or fail during execution time. This dynamic behavior must also be considered in the monitoring application. In this thesis, a pipeline is presented for the model-driven development of CPS with mobile components, including a cockpit application for monitoring and interacting with such a system. The pipeline starts with the formulation of the system and the CPS it contains at an abstract level by the system architect, using a modeling language designed for the system architect's needs. In the next step, this model is transformed to SysML 2 to be further extended and specified by system engineers on a more technical level. In the last step of the pipeline, the SysML 2 model is used to generate code for the CPS devices, a system-wide digital twin, and the cockpit application mentioned above. This cockpit enables the operator to configure and apply the monitoring as well as the interaction with the system during runtime.
Fend, A. (2021). Monitoring mobile cyber-physical systems using model-driven engineering [Diploma Thesis, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2021.88541
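The last pipeline step, generating code from the system model, can be sketched in miniature. This is not the thesis's actual SysML 2 toolchain; the model structure, system name, and generated stub names below are invented purely to illustrate template-based generation from one model:

```python
# Hypothetical abstract system model, as it might look after the
# architect's formulation step (names are placeholders).
MODEL = {
    "system": "WarehouseBots",
    "devices": [
        {"name": "Bot1", "sensors": ["battery", "position"]},
        {"name": "Bot2", "sensors": ["battery"]},
    ],
}

def generate_monitor_stub(model: dict) -> str:
    """Emit one read-function stub per device sensor in the model."""
    lines = [f"# monitoring stub for {model['system']}"]
    for dev in model["devices"]:
        name = dev["name"].lower()
        for sensor in dev["sensors"]:
            lines.append(f"def read_{name}_{sensor}(): ...")
    return "\n".join(lines)

stub = generate_monitor_stub(MODEL)
print(stub)
```

The same model would drive further templates for the device code and the digital twin, which is what keeps the three generated artifacts consistent with each other.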
A data analysis and maintenance framework for planning tasks of railway infrastructures
Alexander Maximilian Wurl, Gerti Kappel
Keywords: Data Analytics, Data Integration, Industrial Data Quality, Data Analysis, Robust Regression, Feature Selection, Predictive Asset Management, Obsolescence Management, Digital Companion, Railway Infrastructure
Abstract: The increasing extension of railway networks worldwide entails highly complex settings of signaling systems. For maintenance planning tasks, decision support approaches are required, as analyzing data from differently structured sources may be extremely time consuming. Although existing data analytics techniques have proven able to extract and analyze large amounts of data in different fields of application, in the railway domain more advanced approaches are needed to tackle the interplay of complex industrial data settings. This includes aggregating data from different formats, identifying ambiguities, and visualizing the most important information. To support maintenance decisions, a modern approach has to be able to process data in a coherent and consistent manner while minimizing the time required for interventions. In this industrial thesis, we present a procedural approach that combines new techniques for processing configuration and operational data of signaling systems to support decision-making in maintenance tasks. Since configuration and operational data exhibit different data structures and formats, in the first step the approach ensures high-quality data integration into a data warehouse. As storing duplicate or contradicting information may have business-critical effects, an interactive technique provides an efficient process in which the user resolves ambiguous data classifications. Once data is stored in the data warehouse, information on hardware components and their properties, i.e., features, can be used as variables. This allows the following technique to build a regression model that estimates the quantity of components based on a set of input features, but also ensures the identification of relevant features related to a hardware component. The resulting regression model is combined with a stochastic model to predict the number of hardware components needed for existing and planned systems in the future.
Instead of showing plain prediction results, we propose advanced visualizations that help technical engineers quickly grasp all important information, including the uncertainty of the prediction. Extending the mere predictions, we propose the concept of a digital companion which prescriptively recommends maintenance actions in system configurations. All our findings have been evaluated in continuous collaboration with experts from the railway domain. Ultimately, the techniques developed have been integrated into the railway business, which confirms the relevance and usefulness of our work.
Wurl, A. M. (2021). A data analysis and maintenance framework for planning tasks of railway infrastructures [Dissertation, Technische Universität Wien]. reposiTUm. https://doi.org/10.34726/hss.2022.100321
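The core of the abstract's prediction step, a regression model estimating component quantities from configuration features, can be sketched with ordinary least squares on a single feature. The feature ("signals per station") and the numbers are invented for illustration; the thesis builds richer, robust multi-feature models:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + c (one configuration feature)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    c = my - a * mx
    return a, c

# Hypothetical data: signals installed per station vs. relays consumed,
# following roughly 2 relays per signal plus a base stock of 5.
signals = [10, 20, 30, 40]
relays = [25, 45, 65, 85]

a, c = fit_line(signals, relays)
print(round(a, 2), round(c, 2))  # → 2.0 5.0

# Predict the relay demand for a planned station with 50 signals.
print(a * 50 + c)  # → 105.0
```

With many features, the same idea generalizes to multivariate (and robust) regression, and the learned coefficients double as the feature-relevance information the abstract mentions.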