Theses
(Bachelor / Master / Studienarbeit)

Here we would like to offer some suggestions for student theses in the area of our research topics. Each topic can be tailored to the respective requirements for Bachelor's theses, Studienarbeiten, and Master's theses, as well as to the specific interests of the student. The assignment of individual topics to Master's or Bachelor's/Studienarbeit level is therefore only a rough classification and can be adjusted by narrowing or extending the topic where appropriate.
The topics cover, for example, areas such as security, UML, model-based development, and e-commerce. Where applicable, working on a thesis can be combined with a position as a student assistant (see here).

General information on the topics of our research group can be found in the presentation of the teaching area.

If you are interested, please contact the listed contact persons as soon as possible. Please include a CV and an overview of your studies so far, including a current transcript of grades, with your inquiry.

Examples of Completed Theses

Konzeption und Implementierung einer Schnittstelle zur Sicherheitsanalyse mit CARiSMA im Kontext von Serviceorientierten Architekturen (Bachelor)
Author: Beckmann, Andreas
Advisors: Jürjens, Jan; Pape, Sebastian

Berechnung einer optimalen Strategie in allgemeinen Risikoanalysen mit ARA (Master)
Author: Michel, Marcel
Advisors: Jürjens, Jan; Schmitz, Andreas

Erweiterung der Business Process Model and Notation für Return on Security Investment Analysen (Diplom)
Author: Thalmann, Dominik
Advisors: Jürjens, Jan; Pape, Sebastian

Mehrparteien-Signaturen für elektronische Dokumente im Bereich der Telemedizin (Master)
Author: Hartwecker, Jannic
Advisors: Jürjens, Jan; Deiters, Wolfgang

Entwicklung eines Eclipse-Plugins für Matlab-unterstützte Auswertungen des Return on Security Investment (Bachelor)
Author: Schnitzler, Theodor
Advisors: Jürjens, Jan; Schmitz, Andreas

Anwendung von Process-Mining-Ansätzen auf Log-Daten aus SAP-Systemen (Diplom)
Author: Blyufshteyn, Igor
Advisors: Jürjens, Jan; Rehof, Jakob

Formale Abbildung der regulatorischen Compliance auf Security Policies (Diplom)
Author: Bostan, Elena-Crina
Advisors: Jürjens, Jan; Hirsch, Martin

Current Topics

Open Topics

Maturity Models for Software Systems with a focus on Transparency and Acceptance in the Medical Domain (Master) #0225

Kontakt/Contact:
M.Sc. Veronika Vasileva (vvasileva@uni-koblenz.de)
Dr. Volker Riediger (riediger@uni-koblenz.de)

Maturity models are used to evaluate processes, organizations, and systems, and thereby contribute to improving, for instance, processes or quality characteristics; they also make the assessment more objective. In the field of medical processing of personal data, it is important that users accept medical data spaces so that these can be established successfully.

 Abstract   PDF 

Measurement of Transparency of Medical Data Spaces (Master) #0224

Kontakt/Contact:
M.Sc. Veronika Vasileva (vvasileva@uni-koblenz.de)
Dr. Volker Riediger (riediger@uni-koblenz.de)

Transparency is one of the key criteria for medical data spaces, yet different definitions of transparency exist in the literature. From the legal perspective, the transparency of medical data spaces must be ensured; from the stakeholder perspective, questions arise such as: how transparent are medical data spaces to their stakeholders?

 Abstract   PDF 

Transparency Requirements for Medical Data Spaces (Master) #0223

Kontakt/Contact:
M.Sc. Veronika Vasileva (vvasileva@uni-koblenz.de)
Dr. Volker Riediger (riediger@uni-koblenz.de)

Medical Data Spaces (MeDS) are virtual places that support the secure exchange and integration of medical and health-related data from different sources, thereby protecting the digital sovereignty of the data owners. Transparency is a legal requirement and also an important factor for stakeholders to participate in MeDS. In addition, acceptance has to be taken into account, because both criteria are interrelated.

 Abstract   PDF 

Collecting and Analysing Legal and Ethical Requirements for a Data Sharing Platform (Master) #0220

Kontakt/Contact:
M.Sc. Veronika Vasileva (vvasileva@uni-koblenz.de)
Dr. Volker Riediger (riediger@uni-koblenz.de)

The European Open Science Cloud (EOSC) is a data sharing platform for research data that enables secure and confidential data sharing within the EOSC ecosystem. In this thesis, initial legal and ethical requirements for the secure processing of research data within the EOSC ecosystem shall be defined.

 Abstract   PDF 

Resolving Conflicts between Security, Data-minimization and Fairness Requirements (Master) #0216

Kontakt/Contact:
Dr. Qusai Ramadan (qramadan@uni-koblenz.de)

Requirements are inherently prone to conflicts, and data protection requirements such as security, data-minimization, and fairness requirements are no exception. Conflicts need to be detected and resolved early, during business process modeling, to avoid the difficulty of detecting them in later stages of system development. Failure to recognize and resolve conflicts confuses stakeholders during system design and implementation, and may lead to a failed system. In previous work, we proposed a BPMN-based data protection engineering framework that detects conflicts between the requirements specified in BPMN models, based on a catalog of domain-independent anti-patterns. A limitation is that our approach currently does not support the resolution of conflicts. Although a fully automated process would be desirable, conflict resolution may require human intervention, a further challenging task that involves reasoning on the privacy impact of different resolution strategies.

 Abstract   PDF 
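
The anti-pattern-based detection described above can be illustrated with a minimal sketch; all identifiers and the two anti-patterns are invented for illustration and do not reflect the actual framework's catalog or API:

```python
# Hypothetical sketch: detecting requirement conflicts in an annotated
# BPMN model via domain-independent anti-patterns (illustrative names).

# BPMN tasks annotated with data protection requirements.
tasks = [
    {"id": "submit_claim", "requirements": {"anonymity", "non-repudiation"}},
    {"id": "review_claim", "requirements": {"integrity"}},
]

# Anti-pattern catalog: requirement pairs that conflict on the same task.
ANTI_PATTERNS = [
    ({"anonymity", "non-repudiation"},
     "Anonymity hides the actor's identity; non-repudiation requires proving it."),
    ({"anonymity", "accountability"},
     "Accountability needs identifiable actors."),
]

def detect_conflicts(tasks):
    """Return (task id, conflicting pair, explanation) for each match."""
    conflicts = []
    for task in tasks:
        for pair, why in ANTI_PATTERNS:
            if pair <= task["requirements"]:   # both requirements present
                conflicts.append((task["id"], tuple(sorted(pair)), why))
    return conflicts

for task_id, pair, why in detect_conflicts(tasks):
    print(f"{task_id}: {pair[0]} vs. {pair[1]} -- {why}")
```

A resolution step would then have to decide, per reported pair, which requirement to weaken — the part that, as noted above, may need human judgment.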

Model-based Realization of Resilience Techniques (Master) #0210

Kontakt/Contact:
Marco Ehl (mehl@uni-koblenz.de)

Resilience is the ability of a system to withstand adverse conditions and to recover in case of failure. We want to explore, categorize, support, and implement resilience techniques with model-based methods.

 Abstract   PDF 
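
As a concrete instance of one such resilience technique, here is a minimal retry-with-exponential-backoff sketch; the function names and parameters are illustrative:

```python
# A minimal sketch of one classic resilience technique: retry with
# exponential backoff. All names and parameters are illustrative.
import time

def with_retries(operation, attempts=3, base_delay=0.1):
    """Run `operation`; on failure, wait and retry, doubling the delay.
    Re-raises the last error once all attempts are exhausted."""
    delay = base_delay
    for attempt in range(1, attempts + 1):
        try:
            return operation()
        except Exception:
            if attempt == attempts:
                raise          # no attempts left: fail visibly
            time.sleep(delay)  # back off before the next try
            delay *= 2

# Example: a flaky operation that succeeds on the third call.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient failure")
    return "ok"

print(with_retries(flaky, attempts=3, base_delay=0.01))  # -> ok
```

A model-based approach, as targeted by this topic, would generate or configure such mechanisms from system models rather than hand-coding them.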

Data-driven Analytics within Computational Social Sciences (Bachelor / Master) #0209

Prof. Dr. Jan Jürjens (http://jan.jurjens.de)

The goal is the development of a distributed data science workflow (pipeline) system for the analysis of occupation-related data, for example data on occupational training.

 Abstract   PDF 

What Led to This Vulnerability? Trace-based Forensics (Bachelor / Master) #0208

Prof. Dr. Jan Jürjens (http://jan.jurjens.de)

Making software secure remains a significant challenge in industrial software engineering. This thesis aims to help address that challenge, focusing on the tracing of security requirements.

 Abstract   PDF 

Digital compliance checks for buildings (Bachelor / Master) #0205

Prof. Dr. Jan Jürjens (http://jan.jurjens.de)

For human safety, it is important that buildings comply with building regulations. The goal of this work is to facilitate the demonstration of such compliance by providing IT support for it.

 Abstract   PDF 

Impact of Blockchains on Companies' Organisation and New Business Models (Bachelor / Master) #0204

Prof. Dr. Jan Jürjens (http://jan.jurjens.de)

Blockchain is a technology that currently receives a high level of interest. The goal of this thesis is to investigate whether, and under what conditions, this interest is justified.

 Abstract   PDF 

Industry 4.0 System Security Analysis (Master) #0203

Kontakt/Contact:
Mahmood Al-Doori (mahmoodaldoori@uni-koblenz.de)

In Industry 4.0, software becomes a crucial part of the set of industrial assets. Since Industry 4.0 is a fairly recent initiative that makes many promises regarding the digitization of industry, its large-scale adoption needs to be examined from the software security perspective.

 Abstract   PDF 

Privacy-Aware Trusted Connectors for Cognitive Ports (Master) #0196

Kontakt/Contact:
Dr. Amirshayan Ahmadian (ahmadian@uni-koblenz.de)

In the context of the DataPorts project (https://dataports-project.eu), a data platform for the connection of cognitive ports will be developed, in which transportation and logistics companies around a seaport will be able to manage data like any other company asset, in order to create the basis for offering cognitive services. An overview of the DataPorts platform with its functionalities and components is provided in the figure in the full topic description. Ensuring security is an important aspect of the DataPorts data platform: a secure environment for exchanging data in a reliable and trustworthy manner has to be created, with access permits and contracts that allow data sharing and the exploration of new artificial intelligence and cognitive services.

 Abstract   PDF 

Privacy-Aware Systems in Industrial Ecosystems (Master) #0195

Kontakt/Contact:
Dr. Amirshayan Ahmadian (ahmadian@uni-koblenz.de)

A main problem for IT service providers is to avoid data breaches and to provide data protection. Article 25 of Regulation (EU) 2016/679 refers to Privacy by Design (PbD). PbD implies that the design of a system must be analyzed with regard to privacy preferences and, where necessary, improved to technically support privacy and data protection.

 Abstract   PDF 

Context detection and definition for requirements (Master) #0186

M.Sc. Katharina Großer (grosser@uni-koblenz.de)

Terms used in the phrasing of requirements can have very different meanings depending on their context of use. To avoid misunderstandings, the concepts referred to have to be clearly identified. This information can be provided by context-sensitive glossaries or domain models; currently, however, the links to the terms in a requirement have to be established manually. To automate this linking, the context of the requirement has to be taken into account, and these context dependencies could also be used for quality analysis.

 Abstract   PDF 
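
The idea of context-sensitive concept linking can be sketched as follows; the glossary, contexts, and function names are invented for illustration:

```python
# Hypothetical sketch of context-sensitive term linking: the same word
# maps to different glossary concepts depending on the requirement's
# context. All identifiers are illustrative.

GLOSSARY = {
    # (term, context) -> concept identifier
    ("bus",  "avionics"):  "Concept:DataBus",
    ("bus",  "logistics"): "Concept:Vehicle",
    ("user", "security"):  "Concept:Principal",
}

def link_terms(requirement_text, context):
    """Map each known term in the requirement to its concept, given the context."""
    links = {}
    for word in requirement_text.lower().split():
        word = word.strip(".,;")
        concept = GLOSSARY.get((word, context))
        if concept:
            links[word] = concept
    return links

req = "The bus shall transmit telemetry every 100 ms."
print(link_terms(req, "avionics"))   # {'bus': 'Concept:DataBus'}
print(link_terms(req, "logistics"))  # {'bus': 'Concept:Vehicle'}
```

The open research question is how to detect the context automatically rather than passing it in, as this sketch does.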

Implementation of a user-friendly editor for template-based requirements (Bachelor / Master) #0185

M.Sc. Katharina Großer (grosser@uni-koblenz.de)
M.Sc. Veronika Vasileva (vvasileva@uni-koblenz.de)
Dr. Volker Riediger (riediger@uni-koblenz.de)

There exist many different template systems that aim to enhance the quality and formality of requirements phrasing through predefined syntactic structures. Practice shows, however, that some experience with these systems is necessary to actually achieve the positive effects without producing too much overhead. Editors that guide the user while phrasing can support and accelerate the learning process and constructively ensure the satisfaction of quality constraints from the very beginning.

 Abstract   PDF 
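
A minimal sketch of the constructive support such an editor could provide: checking a sentence against a simple EARS-like template. The template and messages are illustrative and not tied to any specific template system:

```python
# Illustrative template-conformance check for requirement sentences.
import re

# "When <trigger>, the <system> shall <response>."
TEMPLATE = re.compile(r"^When .+, the .+ shall .+\.$")

def check(requirement):
    """Return a list of problems; an empty list means the sentence conforms."""
    problems = []
    if not TEMPLATE.match(requirement):
        problems.append("Sentence does not follow 'When <trigger>, "
                        "the <system> shall <response>.'")
    if " should " in requirement:   # weak modal verb
        problems.append("Use 'shall' instead of 'should'.")
    return problems

print(check("When the door opens, the controller shall switch on the light."))
# -> []
print(check("The light should turn on somehow."))
```

An actual editor would run such checks while typing and offer fixes, instead of reporting problems after the fact.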

Eliciting textual security and privacy requirements from secBPMN models (Master / Bachelor) #0178

M.Sc. Katharina Großer (grosser@uni-koblenz.de)

The Business Process Model and Notation (BPMN) is widely used to model business processes. It can thus serve as a communication basis during elicitation or as a semi-formal notation to document the respective requirements. The formalization of processes reveals hidden knowledge, e.g. unknown actors, and helps to identify responsibilities and system boundaries. Based on this hypothesis, developing software that fits the customers' needs should start with engineering their processes. Nevertheless, modeling or even just understanding business process models requires knowledge of the respective notation, which is not always given for all domain experts involved in requirements elicitation. Natural language requirements are comprehensible to all stakeholders without additional training; this is particularly important for stakeholders not involved in the development process itself but part of the project management, e.g. legal advisers. Furthermore, not all requirements can be expressed in process models. Especially for non-functional requirements there is only very limited support in state-of-the-art business process modelling. Therefore, most projects ultimately need to maintain an extended set of textual requirements consistent with the requirements expressed in the business process models. There exist methodologies that formalize the elicitation of functional requirements starting from BPMN process models. Using requirement templates leads to a set of well-formed, structured natural language requirements; this not only increases the quality of the textual requirements, it also enables a distinct mapping of elements between both notations and helps to ensure consistency. The secBPMN notation and its privacy extension annotate BPMN models with additional security and privacy constraints. The goal of this thesis is to extend the existing methodology to elicit privacy and security requirements from such annotated models.

 Abstract   PDF 
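The derivation step described above can be sketched as follows; the annotation names and requirement templates are invented for illustration and are not the actual secBPMN vocabulary:

```python
# Hypothetical sketch: a secBPMN-style security annotation on a task is
# turned into a templated natural-language requirement.

TEMPLATES = {
    "confidentiality":
        "The system shall ensure that data handled by task '{task}' "
        "is readable only by authorized actors.",
    "non-repudiation":
        "The system shall ensure that execution of task '{task}' "
        "cannot be denied by the executing actor.",
}

def derive_requirements(annotated_tasks):
    """One templated requirement per (task, annotation) pair."""
    return [TEMPLATES[a].format(task=t)
            for t, annotations in annotated_tasks
            for a in annotations if a in TEMPLATES]

model = [("Transmit diagnosis", ["confidentiality"]),
         ("Sign prescription", ["non-repudiation"])]
for r in derive_requirements(model):
    print(r)
```

Because each generated sentence records the task it came from, the mapping between model elements and textual requirements stays distinct, supporting the consistency goal mentioned above.
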
Reserved Theses

Analysis of Open Source Building Blocks for Data Spaces (Gaia-X, International Data Spaces and Data Spaces Support Centre) (Master) #0222

Kontakt/Contact:
M.Sc. Veronika Vasileva (vvasileva@uni-koblenz.de)
Dr. Volker Riediger (riediger@uni-koblenz.de)

Within the scope of the European Open Science Cloud (EOSC), an architecture for a secure data sharing platform, including sensitive data, shall be defined. For this purpose, different open-source building blocks shall be analysed and compared.

 Abstract   PDF 

Collecting and Analysing Technical Requirements for a Data Sharing Platform (Master) #0221

Kontakt/Contact:
M.Sc. Veronika Vasileva (vvasileva@uni-koblenz.de)
Dr. Volker Riediger (riediger@uni-koblenz.de)

The European Open Science Cloud (EOSC) is a data sharing platform for research data that enables secure and confidential data sharing within the EOSC ecosystem. In this thesis, initial technical requirements for the secure processing of research data within the EOSC ecosystem shall be defined.

 Abstract   PDF 

Integration of a clinical decision support system into a medical data space (Bachelor / Master) #0219

Kontakt/Contact:
Dipl.-Inf. Julian Flake (flake@uni-koblenz.de)

Clinical decision support systems (CDSS) aim to improve healthcare by suggesting diagnoses and therapies to physicians. Patient-specific assessments or decision recommendations are computed from a clinical knowledge base in combination with patient-specific clinical data. One advantage of CDSS is that they are able to analyse huge amounts of data that would otherwise be difficult to interpret or use. To improve the utility and interoperability of CDSS, it is necessary to exchange general clinical knowledge data and patient-specific data, as well as electronic health records, with other instances and stakeholders. The International Data Spaces (IDS) initiative defines standards, a reference architecture, and software components that allow for the sovereign exchange of data by establishing secure and trustworthy data spaces. IDS defines the concept of data space connectors, which are used to exchange data between data providers and data consumers in a shared data space.

 Abstract   PDF 
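
The connector concept can be illustrated with a small, purely hypothetical sketch of a provider-side usage-policy check; it is not based on the actual IDS reference implementation:

```python
# Hypothetical sketch: a data provider's connector releases a health
# record only if the consumer's request satisfies the usage policy.
# Policy fields and request shape are invented for the example.

POLICY = {
    "allowed_purposes": {"clinical-decision-support"},
    "allow_redistribution": False,
}

def release(record, request, policy=POLICY):
    """Return the record if the request complies with the usage policy."""
    if request["purpose"] not in policy["allowed_purposes"]:
        raise PermissionError("purpose not covered by usage policy")
    if request.get("redistribute") and not policy["allow_redistribution"]:
        raise PermissionError("redistribution not permitted")
    return record

record = {"patient": "anonymized-123", "observation": "blood pressure 120/80"}
ok = release(record, {"purpose": "clinical-decision-support"})
print(ok["observation"])
```

In a real data space, such checks are enforced by the connectors on both sides, so the data owner keeps sovereignty even after the data has left their system.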

Mapping ORM 2 T-Graph (Master) #0176

M.Sc. Katharina Großer (grosser@uni-koblenz.de)

The project "T-Reqs", a cooperation with the European Space Agency (ESA), develops techniques to improve requirements quality regarding precision, correctness, and completeness by means of ontology support. Ontologies are information models that define real-world concepts in a formal way readable by both humans and machines; in this way, knowledge of a specific domain can be made accessible to computer systems. The boundary to classical conceptual models is blurred. Object-Role Modelling (ORM) is a language and methodology for conceptual modelling. It is a dialect of Fact-Based Modelling (FBM), in which all facts are treated as attribute-free relations; it is based on first-order logic and set theory. ORM, in its current version ORM2, consists of a graphical notation and the textual Formal ORM Language (FORML). In the T-Reqs project, ORM is used to define a requirements meta-model capturing the structure and quality properties of requirements. Traditionally, ORM is used for the conceptual modelling of relational databases, and the mapping of ORM schemata to relational schemata is therefore well defined. Instances of an ORM schema populate a corresponding relational database, where derivation rules are implemented as views and triggers; complex constraints can also be implemented as triggers. Yet, relational databases have drawbacks, e.g. in handling complex constraints and derivations, and offer low flexibility: in particular, it is hard to generate schemata from instances. This is of special interest for T-Reqs, e.g. to generate domain knowledge models for requirements analysis from the captured requirements. Graph technologies such as T-Graphs provide very efficient data structures for queries and transformations and allow for modelling on different levels. A basic mapping from ORM to T-Graphs exists that translates the main concepts of ORM to graph schemas, but some more complex constraints, corner cases, and derivation rules for implicit knowledge are still missing.
The goal of this thesis is to extend the mapping in these areas and to integrate the extensions into the existing framework.

 Abstract   PDF 
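
The core of such a mapping can be sketched in a strongly simplified form; the data structures are invented and do not reflect the existing framework:

```python
# Much-simplified sketch of the mapping idea: each ORM object type
# becomes a vertex class, and each binary fact type becomes an edge
# class between the two participating object types.

def orm_to_graph_schema(fact_types):
    """Translate {predicate: (object_type_1, object_type_2)} into a
    minimal graph schema with vertex and edge classes."""
    vertex_classes, edge_classes = set(), []
    for predicate, (role1, role2) in fact_types.items():
        vertex_classes.update([role1, role2])
        edge_classes.append({"name": predicate, "from": role1, "to": role2})
    return {"vertices": sorted(vertex_classes), "edges": edge_classes}

# ORM fact types: "Requirement refines Requirement", "Requirement uses Term"
schema = orm_to_graph_schema({
    "refines": ("Requirement", "Requirement"),
    "uses":    ("Requirement", "Term"),
})
print(schema["vertices"])  # ['Requirement', 'Term']
```

The open work lies precisely in what this sketch omits: n-ary fact types, ORM constraints (uniqueness, mandatory roles, ring constraints), and derivation rules.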
Ongoing Theses

Model-Based Data Security and Sovereignty Analysis of Data Marketplaces (Master) #0218

Kontakt/Contact:
Prof. Dr. Jan Jürjens (http://jan.jurjens.de)
Dr. Qusai Ramadan (qramadan@uni-koblenz.de)

The era of profitable data accumulation has come to an end. The new paradigm emphasizes companies' capacity to effectively monetize their data. In response to this evolving landscape, a multitude of digital marketplaces have emerged to facilitate the trading and monetization of industrial data. Like traditional information systems, such systems need to implement technical and organizational measures to safeguard data security and uphold data sovereignty. However, ensuring data security and privacy in data marketplaces poses greater challenges than in traditional information systems: data marketplaces involve the sharing and exchange of data among multiple parties, creating a complex network that heightens the risks of data exposure and unauthorized access. Additionally, the complexity of controlling data monetization makes it difficult for data providers to effectively monitor and ensure compliance with data sovereignty policies. Aligned with the principles of security by design, which advocate integrating security analysis into the earliest phases of system development, the goal of this thesis is to apply and extend an existing model-based analysis approach to report on the data security and sovereignty of data marketplaces at the modeling phase of the system development life cycle. This thesis is developed in the context of the DATAMITE project (https://www.egi.eu/project/datamite/), which aims at delivering a modular framework to improve data trading and monetization.

 Abstract   PDF 

Spezifikation von Fairness-Anforderungen (Master) #0217

Kontakt/Contact:
Dr. Qusai Ramadan (qramadan@uni-koblenz.de)

Automated Decision-Making Software (DMS) has become responsible for sensitive decisions with far-reaching societal impact in many areas of our lives. However, the risk that an improperly developed DMS may lead to unlawful discrimination against persons has raised public and legal awareness of software fairness. For instance, Recital 71 of the European General Data Protection Regulation (GDPR) prescribes implementing technical and organizational measures appropriate to prevent discriminatory effects on natural persons on the basis of racial or ethnic origin. Furthermore, software fairness is stipulated by Article 22, which forbids decisions based on special categories of data as defined in Article 9, such as ethnicity and gender; these data are known as protected characteristics. Work in the software fairness field can be classified into three broad categories based on its goals. First, understanding fairness: works that provide (formalized or unformalized) fairness definitions and help to understand how discrimination can happen in our systems. Second, mitigating discrimination: works that aim at preventing discrimination; approaches in this direction focus mainly on tackling discrimination in different stages of AI-based software development, namely data pre-processing, in-processing, and post-processing methods. Third, discrimination detection: works that aim to test or verify whether a system is fair with respect to a certain fairness measure. Despite the availability of many approaches in the field of software fairness, we have not found an existing approach that supports the specification of fairness requirements in the requirements engineering phase of the system development life cycle. According to Brun et al., as with software security, fairness needs to be a first-class entity in the software engineering process.

 Zusammenfassung   PDF 

Supporting the Specification of Fairness Requirements (Master) #0217

Kontakt/Contact:
Dr. Qusai Ramadan (qramadan@uni-koblenz.de)

Automated Decision-Making Software (DMS) has become responsible for sensitive decisions with far-reaching societal impact in many areas of our lives. However, the risk that an improperly developed DMS may lead to unlawful discrimination against persons has raised public and legal awareness of software fairness. For instance, Recital 71 of the European General Data Protection Regulation (GDPR) prescribes implementing technical and organizational measures appropriate to prevent discriminatory effects on natural persons on the basis of racial or ethnic origin. Furthermore, software fairness is stipulated by Article 22, which forbids decisions based on special categories of data as defined in Article 9, such as ethnicity and gender; these data are known as protected characteristics. Work in the software fairness field can be classified into three broad categories based on its goals. First, understanding fairness: works that provide (formalized or unformalized) fairness definitions and help to understand how discrimination can happen in our systems. Second, mitigating discrimination: works that aim at preventing discrimination; approaches in this direction focus mainly on tackling discrimination in different stages of AI-based software development, namely data pre-processing, in-processing, and post-processing methods. Third, discrimination detection: works that aim to test or verify whether a system is fair with respect to a certain fairness measure. Despite the availability of many approaches in the field of software fairness, we have not found an existing approach that supports the specification of fairness requirements in the requirements engineering phase of the system development life cycle. According to Brun et al., as with software security, fairness needs to be a first-class entity in the software engineering process.
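The "discrimination detection" category mentioned above checks a system against a fairness measure. As a minimal, purely illustrative sketch (the measure and all data here are assumptions, not part of the thesis description), one common measure, demographic parity, can be computed as the gap between positive-decision rates across protected groups:

```python
def demographic_parity_gap(decisions, groups):
    """Difference in positive-decision rates between protected groups.
    decisions: list of 0/1 outcomes; groups: parallel list of group labels."""
    rates = {}
    for g in set(groups):
        idx = [i for i, gg in enumerate(groups) if gg == g]
        rates[g] = sum(decisions[i] for i in idx) / len(idx)
    # Gap between the highest and lowest positive-decision rate.
    return max(rates.values()) - min(rates.values())

# Toy data: group A gets positive decisions at rate 2/3, group B at 1/3.
gap = demographic_parity_gap([1, 0, 1, 1, 0, 0],
                             ["A", "A", "A", "B", "B", "B"])
```

A gap of zero would mean both groups receive positive decisions at the same rate; specifying *which* measure a system must satisfy, and with what threshold, is exactly the kind of fairness requirement the thesis aims to support.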

 Abstract   PDF 

Spezifikation von Datennutzungskontroll-Policies mit UML (Bachelor / Master) #0215

Dipl.-Inf. Julian Flake (flake@uni-koblenz.de)

Datenzugriffskontrolle ist ein bekanntes Konzept der IT-Sicherheit. Datennutzungskontrolle erweitert dieses Konzept um die Möglichkeit, weitere Regeln zu formulieren, etwa die Spezifikation von Aktionen, die ausgeführt werden müssen, nachdem der Zugriff auf bestimmte Daten erlaubt wurde. Eine Menge solcher Regeln wird zu einer Datennutzungs-Policy zusammengefasst. Um solche Policies automatisch durchsetzen zu können, müssen sie maschinenlesbar sein. Eine Möglichkeit, solche maschinenlesbaren Policies zu erzeugen, ist die Extraktion relevanter Informationen aus UML-Modellen, die mit sicherheitsrelevanten und weiteren für die Datennutzungskontrolle relevanten Informationen angereichert wurden.

 Zusammenfassung   PDF 

Specification of Data Usage Control Policies with UML (Bachelor / Master) #0215

Dipl.-Inf. Julian Flake (flake@uni-koblenz.de)

Data access control is a well-known concept in IT security. Data usage control extends this concept by enabling the expression of further rules, such as the specification of actions that need to be taken after access to usage-controlled data has been granted. A set of such rules is bundled into a data usage control policy. To allow for automatic enforcement of such policies, they need to be machine-readable. One possible way to create such machine-readable policies is to extract relevant information from UML models that have been enriched with additional information related to security and, in particular, to data usage control.
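As a rough sketch of the extraction idea, the following toy example (all element names, stereotype names, and policy fields are hypothetical; a real approach would operate on actual UML models and a concrete policy language) turns a stereotype annotation on a model element into a machine-readable policy object:

```python
# Hypothetical UML model element carrying a usage-control stereotype.
model_element = {
    "name": "PatientRecord",
    "stereotypes": {
        "UsageControl": {
            "allowed_purpose": "treatment",
            # Obligation: an action to perform after access is granted.
            "post_access_action": "notify_owner",
        }
    },
}

def extract_policy(element):
    """Extract the usage-control annotation into a policy dict (sketch)."""
    rules = element["stereotypes"].get("UsageControl", {})
    return {"target": element["name"], "rules": rules}

policy = extract_policy(model_element)
```

The `post_access_action` field illustrates what distinguishes usage control from plain access control: the policy also prescribes what must happen *after* access is granted.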

 Abstract   PDF 

Distributed Data Analytics with the Eclipse Dataspace Components (Bachelor / Master) #0206

Dr. Volker Riediger (riediger@uni-koblenz.de)
M.Sc. Veronika Vasileva (vvasileva@uni-koblenz.de)

Data spaces are a topic currently receiving a lot of interest in industry. A special focus is set on data sharing in the healthcare sector. In the context of the EU-funded project AI-NET-PROTECT4health, this thesis investigates the technological foundations for this topic.

 Abstract   PDF 

Konfigurationsmanagement für Anforderungen (Bachelor) #0202

Dr. Volker Riediger (riediger@uni-koblenz.de)
M.Sc. Veronika Vasileva (vvasileva@uni-koblenz.de)

Anforderungen an größere Projekte sind meist über viele untereinander abhängige Dokumente verteilt. Besonders in eingebetteten Systemen (engl. Embedded Systems), wie in der Luft- und Raumfahrt, führt dies zu einer hohen Komplexität von Abhängigkeiten zwischen Funktionalitäten, die über Hard- und Software auf verschiedene Subsysteme und ihre Spezifikationen verteilt sind. Diese Komplexität erfordert ein hohes Maß an Standardisierung. Für die Raumfahrt stellen die Standards der European Cooperation for Space Standardization (ECSS) wiederverwendbare Anforderungen zur Verfügung, welche in vielen Projekten angewendet werden. Für jedes dieser Projekte müssen diese Anforderungen angepasst (engl. tailored) werden, um den projektspezifischen Randbedingungen zu entsprechen. Änderungen an den Anforderungen dabei zu verwalten -- insbesondere bei wiederverwendeten Anforderungen -- erfordert eine Versionskontrolle, die unterschiedliche Änderungsarten (semantisch, redaktionell) repräsentieren kann und die verschiedene Konfigurationen von Standards berücksichtigt. Ziel ist es, die Konsistenz, Vollständigkeit und Korrektheit der gesamten Spezifikation sicherzustellen.

 Zusammenfassung   PDF 

Configuration Management for Requirements (Bachelor) #0202

Dr. Volker Riediger (riediger@uni-koblenz.de)
M.Sc. Veronika Vasileva (vvasileva@uni-koblenz.de)

Requirements for larger projects are usually distributed over several interrelated sources, as they mostly involve several cooperating partners. Embedded systems in particular, e.g. in aerospace, introduce complex dependencies between subsystems whose functions are distributed over software and hardware components. To handle this complexity, there is a strong need for standardization. For the space domain, the standards of the European Cooperation for Space Standardization (ECSS) contain reusable requirements which are applied in many similar projects. For each project, these requirements have to be tailored to the individual needs. Managing changes to requirements in this process -- especially for reused requirements -- requires version control that can represent different types of changes (semantic, editorial) and that takes into account different configurations of standards. The goal is to ensure the consistency, completeness, and correctness of the entire specification.
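To make the distinction between change types concrete, the following minimal sketch (all identifiers and the requirement text are invented for illustration; a real solution would build on an actual requirements management or version control system) records a requirement's history together with the kind of each change:

```python
from dataclasses import dataclass, field

@dataclass
class RequirementVersion:
    text: str
    change_kind: str  # "semantic" (meaning changed) or "editorial" (wording only)

@dataclass
class Requirement:
    req_id: str
    history: list = field(default_factory=list)

    def revise(self, text, change_kind):
        self.history.append(RequirementVersion(text, change_kind))

    def semantic_versions(self):
        # Only semantic changes require re-validation in projects
        # that reuse (tailor) this requirement.
        return [v for v in self.history if v.change_kind == "semantic"]

req = Requirement("REQ-001")  # hypothetical identifier
req.revise("The system shall log all telemetry.", "semantic")
req.revise("The system shall log all telemetry data.", "editorial")
```

Distinguishing the two change kinds is what lets tooling decide which downstream, tailored projects are actually affected by an update to a reused requirement.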

 Abstract   PDF 

Flexibilität in Prozess-Modellen (Master) #0201

Kontakt/Contact:
Marco Ehl (mehl@uni-koblenz.de)

We live in a constantly changing and interconnected world. A lockdown in an Asian city, changing fuel prices, a cargo ship stuck in a high-traffic channel, or a lack of qualified employees can lead to disruptions in established processes. The key mechanism to cope with these disruptions is flexibility. In particular, we want to look at processes that provide services or create products; such processes can, for example, be modeled in BPMN or in UML Activity Diagrams. We want to explore what types of flexibility exist by creating a taxonomy of flexibility and of flexibility-increasing methods. We then want to examine the applicability, implementation, and consequences of flexibility-increasing methods. Finally, we want to create a system that suggests flexibility-increasing methods for a given process model.

 Zusammenfassung   PDF 

Flexibility in Process Models (Master) #0201

Kontakt/Contact:
Marco Ehl (mehl@uni-koblenz.de)

We live in a constantly changing and interconnected world. A lockdown in an Asian city, changing fuel prices, a cargo ship stuck in a high-traffic channel, or a lack of qualified employees can lead to disruptions in established processes. The key mechanism to cope with these disruptions is flexibility. In particular, we want to look at processes that provide services or create products; such processes can, for example, be modeled in BPMN or in UML Activity Diagrams. We want to explore what types of flexibility exist by creating a taxonomy of flexibility and of flexibility-increasing methods. We then want to examine the applicability, implementation, and consequences of flexibility-increasing methods. Finally, we want to create a system that suggests flexibility-increasing methods for a given process model.
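The envisioned suggestion system could, in its simplest form, map detected properties of a process model to candidate flexibility-increasing methods. The sketch below is entirely hypothetical (the property names, the rule base, and the suggestions are invented to illustrate the idea, not results of the thesis):

```python
# Hypothetical rule base: detected process-model property -> suggested method.
RULES = {
    "single_supplier": "introduce alternative suppliers (redundant paths)",
    "strict_sequence": "relax ordering constraints (parallel gateways)",
    "manual_step": "automate the step or allow role substitution",
}

def suggest_methods(detected_properties):
    """Return flexibility-increasing methods for the detected properties."""
    return [RULES[p] for p in detected_properties if p in RULES]

suggestions = suggest_methods(["single_supplier", "manual_step"])
```

A real system would derive the properties by analyzing the BPMN or activity-diagram model itself and would draw its rules from the taxonomy the thesis sets out to build.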

 Abstract   PDF