List of Publications

Business Informatics Group, TU Wien

TROPIC

Manuel Wimmer, Gerti Kappel, Johannes Schoenboeck, Angelika Kusel, Werner Retschitzegger, Wieland Schwinger

Handle: 20.500.12708/52797; Year: 2009; Issued On: 2009-01-01; Type: Publication; Subtype: Inproceedings

Abstract: Model transformation languages, the cornerstone of Model-Driven Engineering, often lack mechanisms for abstraction, reuse, and debugging. We propose a model transformation framework providing different abstraction levels together with an extensible library of predefined transformations and a dedicated runtime model in terms of Coloured Petri Nets for transformation execution and debugging.

Wimmer, M., Kappel, G., Schoenboeck, J., Kusel, A., Retschitzegger, W., & Schwinger, W. (2009). TROPIC. In Proceedings of the 24th ACM SIGPLAN conference companion on Object oriented programming systems languages and applications - OOPSLA ’09. ACM. https://doi.org/10.1145/1639950.1640013
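
The following sketch is not TROPIC's actual API; all names are invented for illustration. It shows the core idea of the runtime model in miniature: a transformation runs as a Coloured Petri Net in which tokens carry model elements and a transition maps source tokens to target tokens, so execution proceeds in observable firing steps, which is what makes debugging possible in this style.

from dataclasses import dataclass, field

@dataclass
class Place:
    name: str
    tokens: list = field(default_factory=list)  # each token carries a model element

@dataclass
class Transition:
    name: str
    inputs: list       # source places
    outputs: list      # target places
    fire_fn: callable  # maps one consumed token to one produced token

    def enabled(self):
        return all(p.tokens for p in self.inputs)

    def fire(self):
        # consume one token per input place, produce the mapped tokens
        consumed = [p.tokens.pop(0) for p in self.inputs]
        for out, tok in zip(self.outputs, consumed):
            out.tokens.append(self.fire_fn(tok))

# Example: a "Class2Table" rule expressed as a single transition of the net.
classes = Place("UML_Classes", tokens=[{"name": "Person"}, {"name": "Order"}])
tables = Place("DB_Tables")
c2t = Transition("Class2Table", [classes], [tables],
                 fire_fn=lambda c: {"table": c["name"].upper()})

while c2t.enabled():  # each firing is one observable execution step
    c2t.fire()
print(tables.tokens)  # [{'table': 'PERSON'}, {'table': 'ORDER'}]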

Right or Wrong? - Verification of Model Transformations using Colored Petri Nets

Manuel Wimmer, Gerti Kappel, Angelika Kusel, Werner Retschitzegger, Johannes Schönböck, Wieland Schwinger

Handle: 20.500.12708/52798; Year: 2009; Issued On: 2009-01-01; Type: Publication; Subtype: Inproceedings

Abstract: Model-Driven Engineering (MDE) places models as first-class artifacts throughout the software lifecycle, requiring the availability of proper transformation languages. Most of today's approaches use declarative rules to specify a mapping between source and target models which is then executed by a transformation engine. Transformation engines, however, most often hide the operational semantics of the mapping and operate on a considerably lower level of abstraction, thus hampering debugging. To tackle these limitations we propose a framework called TROPIC (Transformations on Petri Nets in Color) providing a DSL on top of Colored Petri Nets (CPNs) to specify, simulate, and formally verify model transformations. The formal underpinnings of CPNs enable simulation and verification of model transformations. By exploring the constructed state space of CPNs we show how predefined behavioral properties as well as custom state space functions can be applied for observing and tracking the origins of errors during debugging.

Wimmer, M., Kappel, G., Kusel, A., Retschitzegger, W., Schönböck, J., & Schwinger, W. (2009). Right or Wrong? - Verification of Model Transformations using Colored Petri Nets. In Proceedings of the 9th OOPSLA Workshop on Domain-Specific Modeling (DSM'09). Helsinki Business School. http://hdl.handle.net/20.500.12708/52798
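
As a rough illustration of the verification idea (a minimal sketch, not the paper's implementation), the snippet below exhaustively enumerates the reachable markings of a toy net via BFS and checks a behavioral property over the resulting state space; the net and the property are invented for illustration.

from collections import deque

# A toy net: one transition moving a token from place "src" to place "trg".
transitions = {
    "move": (("src",), ("trg",)),  # (input places, output places)
}

def successors(marking):
    """Yield (transition, new_marking) for every enabled transition."""
    for name, (ins, outs) in transitions.items():
        if all(marking[p] > 0 for p in ins):
            new = dict(marking)
            for p in ins:
                new[p] -= 1
            for p in outs:
                new[p] += 1
            yield name, new

def explore(initial):
    """BFS over reachable markings; returns the full state space."""
    seen = {tuple(sorted(initial.items()))}
    queue, states = deque([initial]), [initial]
    while queue:
        m = queue.popleft()
        for _, succ in successors(m):
            key = tuple(sorted(succ.items()))
            if key not in seen:
                seen.add(key)
                queue.append(succ)
                states.append(succ)
    return states

states = explore({"src": 2, "trg": 0})
# A behavioral property in the spirit of the paper: every source token is
# eventually consumed, i.e. some reachable marking has an empty "src" place.
assert any(m["src"] == 0 for m in states), "transformation never completes"
print(f"{len(states)} reachable markings, property holds")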

A Petri Net Based Debugging Environment for QVT Relations

Manuel Wimmer, Gerti Kappel, Johannes Schoenboeck, Angelika Kusel, Werner Retschitzegger, Wieland Schwinger

Handle: 20.500.12708/52805; Year: 2009; Issued On: 2009-01-01; Type: Publication; Subtype: Inproceedings

Abstract: In the Model-Driven Architecture (MDA) paradigm, the Query/View/Transformation (QVT) standard plays a vital role for model transformations. Especially the high-level declarative QVT Relations language, however, has not yet gained widespread use in practice. This is not least due to missing tool support in general and inadequate debugging support in particular. Transformation engines interpreting QVT Relations operate on a low level of abstraction, hide the operational semantics of a transformation, and scatter metamodels, models, QVT code, and trace information across different artifacts. We therefore propose a model-based debugger representing QVT Relations on the basis of TROPIC, a model transformation language utilizing a variant of Colored Petri Nets (CPNs). As a prerequisite for convenient debugging, TROPIC provides a homogeneous view on all artifacts of a transformation on the basis of a single formalism. Besides that, this formalism also provides a runtime model, thus making the previously hidden operational semantics of the transformation explicit. Using an explicit runtime model allows us to employ model-based techniques for debugging, e.g., using the Object Constraint Language (OCL) for simply defining breakpoints and querying the execution state of a transformation.

Wimmer, M., Kappel, G., Schoenboeck, J., Kusel, A., Retschitzegger, W., & Schwinger, W. (2009). A Petri Net Based Debugging Environment for QVT Relations. In 2009 IEEE/ACM International Conference on Automated Software Engineering. IEEE International Conference on Automated Software Engineering (ASE), Auckland, New Zealand. IEEE. https://doi.org/10.1109/ase.2009.99
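
To make the debugging idea concrete, here is a hypothetical sketch: because the transformation's state lives in an explicit runtime model, a breakpoint reduces to a condition queried against that model after each execution step. The OCL condition is approximated by a plain Python predicate; all names and structure are invented.

# The explicit runtime model: token counts per place plus a step counter.
runtime_model = {
    "places": {"UML_Classes": 2, "DB_Tables": 0},
    "step": 0,
}

# OCL flavour: context Place inv: self.tokens->size() > 0
breakpoints = [
    ("table produced", lambda m: m["places"]["DB_Tables"] > 0),
]

def run_transformation(model, max_steps=10):
    while model["places"]["UML_Classes"] > 0 and model["step"] < max_steps:
        # one execution step: move one element from source to target
        model["places"]["UML_Classes"] -= 1
        model["places"]["DB_Tables"] += 1
        model["step"] += 1
        for label, cond in breakpoints:
            if cond(model):
                print(f"breakpoint '{label}' hit at step {model['step']}: {model}")
                return  # hand control to the debugger here

run_transformation(runtime_model)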

Problem space and special characteristics of security testing in live and operational environments of large systems exemplified by a nationwide IT infrastructure

Christian Schanes, Florian Fankhauser, Thomas Grechenig, Michael Schafferer, Kai Behning, Dieter Hovemeyer

Handle: 20.500.12708/53068; Year: 2009; Issued On: 2009-01-01; Type: Publication; Subtype: Inproceedings

Abstract: The paper discusses foundations and requirements for testing security robustness aspects in operational environments while adhering to defined protection values for data. It defines the problem space and special characteristics of security testing in large IT infrastructures. In this area there are different environments with varying characteristics, e.g., regarding confidentiality of data. Common environments based on an existing IT project are defined. Testing in dedicated test environments is state of the art; however, this is sometimes not sufficient and testing in operational environments is required. Case studies showed many restrictions in the security test process, e.g., limited access for testers, which have to be addressed. The problems of testing in these operational environments are pointed out. Experiences and some current solution approaches for testing these special environments are shown (e.g., usage of disaster/recovery mechanisms).

Schanes, C., Fankhauser, F., Grechenig, T., Schafferer, M., Behning, K., & Hovemeyer, D. (2009). Problem space and special characteristics of security testing in live and operational environments of large systems exemplified by a nationwide IT infrastructure. In Advances in System Testing and Validation Lifecycle (pp. 161–166). IEEE. http://hdl.handle.net/20.500.12708/53068

Profitability Analysis of Workflow Management Systems

Horst Gruber, Christian Huemer

Handle: 20.500.12708/53082; Year: 2009; Issued On: 2009-01-01; Type: Publication; Subtype: Inproceedings

Abstract: Workflow technology promises an increase in efficiency in the execution of business processes. The technology is widely accepted, but often the high costs exceed the promised benefits. Thus, it is desirable to calculate the profitability prior to investing in workflow technology. After an investment into workflow management systems (WFMS), it has to be verified whether the expected benefits have been realized or not. In this paper we present a method that covers both the cost-benefit-ratio calculation specially customized for WFMS and the calculation of the realized savings. The profitability analysis is based on simple measurable performance indicators that consider the tangible calculation of costs as well as the quantitative and qualitative benefits. Long-term practical experience in implementing and operating workflow management supported the design of the method. The method presented in this paper has been successfully used in the IT company of a banking corporation.

Gruber, H., & Huemer, C. (2009). Profitability Analysis of Workflow Management Systems. In Proceedings of the 2009 IEEE Conference on Commerce and Enterprise Computing (CEC 2009) (pp. 233–238). IEEE Computer Society. http://hdl.handle.net/20.500.12708/53082
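
The paper's concrete indicators and formulas are not reproduced here, but a minimal, hypothetical cost-benefit calculation of the kind the abstract describes might look as follows (all figures and names invented):

def cost_benefit_ratio(costs, annual_benefits, years):
    """Total benefits over the horizon divided by total costs (> 1.0 = profitable)."""
    total_costs = costs["acquisition"] + costs["operation_per_year"] * years
    total_benefits = sum(annual_benefits[:years])
    return total_benefits / total_costs

costs = {"acquisition": 250_000, "operation_per_year": 40_000}
# Measurable indicators, e.g. saved processing time converted to money per year:
annual_benefits = [90_000, 120_000, 130_000]

ratio = cost_benefit_ratio(costs, annual_benefits, years=3)
print(f"cost-benefit ratio over 3 years: {ratio:.2f}")  # ~0.92 -> not yet profitable

Comparing the ratio calculated before the investment with one recomputed from realized savings afterwards corresponds to the verification step the abstract mentions.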

Verbesserung der Qualität in der Massenlehre durch die Moodle-Adaptierung TUWEL

Marion Scholz, Martina Seidl

Handle: 20.500.12708/84888; Year: 2009; Issued On: 2009-01-01; Type: Presentation; Subtype: Presentation

Abstract: At the Institute of Software Technology and Interactive Systems (IFS) of TU Wien, TUWEL, the Moodle adaptation of the Vienna University of Technology, has been successfully used for five semesters as a communication and content management system as well as an e-learning platform. TUWEL particularly eases and supports the administration of very large courses. In the following, we give an overview of the challenges of large-scale teaching supported by e-learning platforms and, using the course "Einführung in die Objektorientierte Modellierung" (Introduction to Object-Oriented Modelling, OOM) as an example, present how we meet these challenges through the use of TUWEL.

Scholz, M., & Seidl, M. (2009). Verbesserung der Qualität in der Massenlehre durch die Moodle-Adaptierung TUWEL. 6. Internationale Österreichische MoodleMoot.at, TU Wien, Austria. http://hdl.handle.net/20.500.12708/84888

Managing event streams for querying complex events

Szabolcs Rozsnyai, Christian Huemer, Stefan Biffl

Handle: 20.500.12708/10803; Year: 2008; Issued On: 2008-01-01; Type: Thesis; Subtype: Doctoral Thesis

Keywords: Complex Event Processing, Event Stream Processing, Event-Based Systems, Event-Driven Architectures
Abstract: Nowadays, business processes have evolved into networked workflows that are complex and executed in parallel with little human involvement to meet the needs of today's agile and adaptive business. Contemporary business requirements call for agility, flexibility, and service orientation. A simplified summary of this widely discussed and necessary business trend reduces to the demand that today's businesses have to adapt their processes and organizations faster than their competitors. Business organizations that are able to handle critical business events faster than their competitors will end up as winners in today's globalized and fast-paced business.
The pillars of such business models are loosely coupled, distributed, service-oriented or event-driven systems that generate huge amounts of events at various granularity levels. The lack of tracking of those events and of maintaining the causal relationships and traceability between them, as well as aggregating them into high-level events or correlating them, is a problem that is currently investigated by many research groups.
Event-based systems are increasingly gaining widespread attention for such classes of problems that require integration with loosely coupled and distributed systems for time-critical business solutions. The field of event-based or event-processing systems is a quite young area of research and is mainly influenced by the publish-subscribe paradigm and relational databases and, later on, by active and zero-latency data warehousing. A promising solution for these problems is Complex Event Processing (CEP). The term Complex Event Processing (CEP) was first introduced by David Luckham in his book The Power of Events and denotes a set of technologies to process large amounts of events, utilizing them to monitor, steer, and optimize the business in real time. A CEP system continuously processes and integrates the data included in events without any batch processes for extracting and loading data from different sources and storing it in a data warehouse for further processing or analysis. CEP solutions capture events from different sources and with different time orders, and take the various relationships between events into account.
The contributions of this dissertation are situated in the research area of event processing systems, with a special focus on CEP and on event processing and query languages. The results of this dissertation provide the research community as well as interested parties with a generic component model for event-based systems. The introduced model has been successfully evaluated through the implementation of the Event-Base. SARI-SQL is an integral part of the Event-Base, and its implementation was a major challenge in terms of retrieving events and their correlations in a reasonable time. The presented work places a special focus on a clean and expressive language design in order to encapsulate all the event-related entities.
Furthermore, an emphasis was placed on an efficient design of the query preparation and evaluation architecture that allows attaching different query optimizer strategies. With the introduced optimizer strategies, the performance of queries on single-value types (which applies in 80% of the cases) correlates directly with the underlying RDBMS performance constraints and thus creates only a small overhead. Future work on SARI-SQL includes efforts to optimize the strategies for handling nested attribute types of events. This includes query analysis procedures and execution planning strategies in order to reduce the number of in-memory post-evaluation operations.
The presented work is part of a long-term research effort aiming at designing and developing a comprehensive event analysis toolset that allows users to query and analyze large repositories of real-time and historical events from various sources. In addition, the goal is to consolidate and create a rich unified event model which can be supported by a wide range of event-based systems. A key focus of future research is also set on the visualization of events with respect to their temporal occurrence, their correlation with other events, and event clusters.

Rozsnyai, S. (2008). Managing event streams for querying complex events [Dissertation, Technische Universität Wien]. reposiTUm. https://resolver.obvsg.at/urn:nbn:at:at-ubtuw:1-26307
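
SARI-SQL's actual syntax and engine are not shown in the abstract; the following Python sketch merely illustrates, under invented names, the two operations the thesis revolves around: selecting events from an event base with a predicate and following causal/correlation links between events.

from dataclasses import dataclass

@dataclass
class Event:
    id: int
    type: str
    payload: dict
    caused_by: int | None = None  # causal link to an earlier event

events = [
    Event(1, "OrderPlaced", {"order": "A-17", "amount": 420}),
    Event(2, "PaymentFailed", {"order": "A-17"}, caused_by=1),
    Event(3, "OrderPlaced", {"order": "B-02", "amount": 90}),
]

def query(events, type_, where):
    """Roughly: SELECT * FROM events WHERE type = ... AND <predicate>."""
    return [e for e in events if e.type == type_ and where(e)]

def causes(events, event):
    """Walk the causal chain of an event back to its origin."""
    chain, by_id = [], {e.id: e for e in events}
    while event.caused_by is not None:
        event = by_id[event.caused_by]
        chain.append(event)
    return chain

for failure in query(events, "PaymentFailed", lambda e: True):
    print(failure, "caused by", causes(events, failure))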

Business process modelling - languages, goals, and variabilities

Birgit Korherr, Christian Huemer, Gerti Kappel

Handle: 20.500.12708/10872; Year: 2008; Issued On: 2008-01-01; Type: Thesis; Subtype: Doctoral Thesis

Keywords: Business Process Modelling, Goals, Measures, Languages, Variabilities, UML, Profile, Metamodel
Abstract: Over the last decade more and more companies have started to optimize their business processes in order to meet their business goals. They develop business process models defining which activities have to be executed in which order, under which conditions, by whom, and using which resources. For this purpose a lot of different approaches to business process modelling have been developed, which resulted in many different Business Process Modelling Languages (BPMLs). The definition of a business process has to cover many different aspects (e.g. control flow, organizational view, data view, etc.). A perfect business process modelling approach would address all the different aspects. Unfortunately, none of the existing approaches provides concepts for addressing all of these aspects.
Each of them concentrates on some aspects. The focus on certain aspects is mainly due to the different application areas, e.g. business engineering or software engineering. Although BPMLs are well established in industry and science, a comprehensive evaluation, or a framework for an evaluation to compare the different BPMLs, is still missing. Thus, it is the goal of this thesis to provide an evaluation framework for the comparison of BPMLs and to apply this framework in the evaluation of the currently most popular BPMLs. The resulting framework is based on a generic metamodel that captures all of the concepts appearing in any of the state-of-the-art BPMLs. On a high level this framework addresses the following views: Business Process Context Perspective, Behavioural Perspective, Functional Perspective, Informational Perspective, and Organisational Perspective. An evaluation based on this framework checks whether the aspects in each of these perspectives are supported by the concepts of each of the considered BPMLs. In the evaluation of this thesis, we used the following languages: UML 2 Activity Diagram, Business Process Modelling Notation, Event Driven Process Chain, IDEF3, Petri Net, and Role Activity Diagram. According to the evaluation we were able to identify three main problems in current BPMLs. The first problem is that the definition of the dependency between business processes and their supporting software systems is inadequately supported. In our approach we support the elicitation of requirements from business process models for the software systems to be developed by extending current BPMLs with software requirements and components to ensure business-goal-oriented software development. The second problem concerns the variability of similar, but well-distinguished software products within a software product line. These software products differ not only in their structural definition, but also in the process to create them. Today, variability modelling is a domain-specific modelling technique that is limited to the structural definition of similar software products. In our approach we extend the concepts of variability modelling to integrate the dynamic aspects into the UML. The resulting approach is based on a well-defined dependency between UML class diagrams and UML activity diagrams. The third problem is that current conceptual BPMLs do not provide explicit modelling means for process goals and their measures. The modelling of goals and their monitoring is a critical step in business process modelling. Hence, we extend the metamodels of UML 2 AD, EPC and BPMN with business process goals and performance measures. These concepts become explicitly visible in the corresponding models. Furthermore, a mapping of the performance measures onto the Business Process Execution Language (BPEL) enables their monitoring in an execution environment.

Korherr, B. (2008). Business process modelling - languages, goals, and variabilities [Dissertation, Technische Universität Wien]. reposiTUm. https://resolver.obvsg.at/urn:nbn:at:at-ubtuw:1-21515

Integration von crosscutting concerns in aspectWebML

Cornelia Tomasek, Andrea Schauerhuber, Manuel Wimmer, Gerti Kappel

Handle: 20.500.12708/10906; Year: 2008; Issued On: 2008-01-01; Type: Thesis; Subtype: Diploma Thesis

Keywords: aspectWebML, weaving, crosscutting concerns, modeling, model-driven software development, aspect-orientation
Abstract: Ubiquitous web applications follow the anytime/anywhere/anymedia paradigm, meaning they are individually adapted to the context in which they are used, such as time, location, and device. The implementation of such customizations is often complex, since a single change often affects not only a specific part of the web application but a number of locations spread across the various levels of a web application, namely content, hypertext, and presentation. In the domain of model-driven software engineering, most web modeling languages do not take this peculiarity of customizations into account. Typically, customization functionality is modeled on top of the core model and directly introduced at a number of locations. This approach, however, increases the complexity of maintenance and extensibility and thus leads to an inefficient development process. In aspectWebML, an extension of the web modeling language WebML, customizations are instead seen as so-called crosscutting concerns and treated with the help of special aspect-oriented concepts. These allow for a clear separation of core functionality and crosscutting concerns in models, whereby a reduction in the complexity of the models, an improvement in maintainability, as well as reusability of individual customizations can be accomplished.
Nevertheless, as part of the modeling process, this clear separation of core functionality and crosscutting concerns has to be complemented by an adequate mechanism that integrates the separated parts again into a whole model in order to reach the result of the modeling process.
The goal of this work is therefore the implementation of an algorithm to integrate crosscutting concerns in aspectWebML on the basis of the existing aspect-oriented concepts. The requirements for such an integration are analyzed and an appropriate technology is chosen to implement the 13 different possibilities for modeling customization scenarios in aspectWebML. The algorithm is then integrated into an existing modeling toolkit, which was created using EMF (Eclipse Modeling Framework) and enhanced with various functionalities to support modeling, and is finally tested against a number of different customization scenarios in a case study.

Tomasek, C. (2008). Integration von crosscutting concerns in aspectWebML [Diploma Thesis, Technische Universität Wien]. reposiTUm. https://resolver.obvsg.at/urn:nbn:at:at-ubtuw:1-18340
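
A hypothetical sketch of the weaving step the thesis implements: separately modelled crosscutting customizations are merged back into the base model at matching join points. aspectWebML's real concepts (and its 13 customization scenarios) are far richer; this only shows the pointcut-match/advice-insert mechanics under invented names.

base_model = {
    "pages": [
        {"name": "ProductList", "units": ["index"]},
        {"name": "ProductDetail", "units": ["data"]},
    ]
}

# An aspect = a pointcut (which pages to adapt) plus an advice (what to add).
mobile_aspect = {
    "pointcut": lambda page: page["name"].startswith("Product"),
    "advice": lambda page: page["units"].append("mobile_layout"),
}

def weave(model, aspect):
    """Apply the aspect's advice at every join point matched by its pointcut."""
    for page in model["pages"]:
        if aspect["pointcut"](page):
            aspect["advice"](page)
    return model

woven = weave(base_model, mobile_aspect)
print([p["units"] for p in woven["pages"]])
# [['index', 'mobile_layout'], ['data', 'mobile_layout']]

Keeping the customization in one aspect rather than scattered across pages is exactly the maintainability gain the abstract argues for: changing the adaptation means editing one advice, not every affected location.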

From Mining to Mapping and Roundtrip Transformations : a systematic approach to model-based tool integration

Manuel Wimmer, Werner Retschitzegger, Gerti Kappel

Handle: 20.500.12708/11032; Year: 2008; Issued On: 2008-01-01; Type: Thesis; Subtype: Doctoral Thesis

Abstract: Model-Driven Engineering (MDE) is gaining momentum in academia as well as in practice. A wide variety of modeling tools is already available, supporting different development tasks and advocating different modeling languages. In order to fully exploit the potential of MDE, modeling tools must work in combination, i.e., a seamless exchange of models between different modeling tools is crucial for MDE. Current best practices to achieve interoperability use model transformation languages to realize the necessary mappings between concepts of the metamodels defining the modeling languages supported by different tools. However, the development of such mappings is still done in an ad-hoc and implementation-oriented manner which simply does not scale for large integration scenarios. The reason for this is twofold:
first, various modeling languages are not based on metamodeling standards but instead define proprietary languages rather focused on notational aspects, and second, existing model transformation languages are too fine-granular to express mappings on a high level of abstraction and lack appropriate reuse mechanisms for already existing integration knowledge. This thesis proposes a comprehensive approach for realizing model-based tool integration which is inspired by techniques originating from the field of database integration, but employed in the context of MDE. For tackling the problem of missing metamodel descriptions, a semi-automatic approach for mining metamodels and models from textual language definitions is presented, representing a prerequisite for the subsequent steps, which are based on metamodels and models only. For raising the level of abstraction and enabling the reuse of mappings between metamodels, a framework is proposed for building, applying, and executing reusable mapping operators. To demonstrate the applicability of the framework, it is applied to define a set of mapping operators intended to resolve typical structural heterogeneities occurring between the core concepts used to define metamodels. Finally, for ensuring roundtrip capabilities of transformations, two approaches are presented for enriching existing, non-roundtripping transformations with roundtrip capabilities.

Wimmer, M. (2008). From Mining to Mapping and Roundtrip Transformations : a systematic approach to model-based tool integration [Dissertation, Technische Universität Wien]. reposiTUm. https://resolver.obvsg.at/urn:nbn:at:at-ubtuw:1-27869
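
As a rough, invented illustration of the reusable mapping operator idea: a generic operator that resolves one typical structural heterogeneity, namely a concept modelled as a plain attribute in one tool's metamodel but as a first-class referenced object in another's. Operator and metamodel names are hypothetical, not the thesis's actual framework.

def attribute_to_class(source_objs, attr, target_class, ref_name):
    """Lift attribute values into objects of their own and reference them."""
    targets, index = [], {}
    for obj in source_objs:
        value = obj.pop(attr)
        if value not in index:  # equal values map to one shared target object
            index[value] = {"_class": target_class, "value": value}
            targets.append(index[value])
        obj[ref_name] = index[value]
    return source_objs, targets

# Tool A stores the type as a plain attribute ...
elements = [{"name": "x", "type": "int"}, {"name": "y", "type": "int"}]
# ... tool B expects first-class Type objects referenced by each element.
elements, types = attribute_to_class(elements, "type", "Type", "typeRef")
print(types)  # one shared Type object for "int"

Because the operator is written once against generic metamodel concepts, it can be reused across integration scenarios instead of re-coding the same resolution in every ad-hoc transformation, which is the scalability argument the abstract makes.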