Doctoral Dissertations - HR

Permanent URI for this collection: https://hdl.handle.net/20.500.11815/88

Recent

Results 1 - 20 of 76
  • Item
    Stress & cognitive function: Exploring the impact of PTSD and breast cancer on cognition and the potential benefit of bright light therapy
    (2025-02-01) Gudmundsdottir Aspelund, Snaefridur; Heiðdís B. Valdimarsdóttir and Birna Valborgar Baldursdóttir; Department of Psychology (RU); Sálfræðideild (HR); School of Social Sciences (RU); Samfélagssvið (HR)
    Cognitive function—the ability to learn, solve problems, and effectively use stored information—is essential for everyday activities. While many factors contribute to cognitive impairment, this Thesis specifically examines the often-overlooked influence of stress. Major life stressors (e.g., war exposure, sexual violence, cancer diagnosis and treatment) can cause posttraumatic stress disorder (PTSD), and affect biological (e.g., cortisol) and psychological (e.g., depressive symptoms) stress markers known to impair cognitive function. Given the significant negative impact of cognitive impairment on individuals and society, the overarching goals of this Thesis were to a) further investigate the relationship between life stressors and cognitive impairment, b) identify potential moderators of this relationship and c) explore whether circadian-stimulating bright light therapy (BLT) can mitigate the negative effects of life stressors on stress markers, and cognitive function. The three papers listed below highlight the main aims and findings of this Thesis. In Paper I, a multilevel random effect meta-analysis was conducted to investigate the relationship between PTSD and cognitive impairment, along with potential moderating factors. Literature search yielded 53 peer-reviewed relevant studies in which this relationship was examined. Age, study design, study population, neurocognitive outcome assessed, gender, study quality, type of PTSD measure, as well as the presence of comorbidities such as traumatic brain injury, depression, and substance use, were investigated as potential moderators. The results suggested that PTSD is associated with both cognitive impairment and neurocognitive disorder, compared to healthy controls (HC), demonstrating a consistent link that persisted across all examined moderators. Paper II examined cancer-related cognitive impairment (CRCI) among women undergoing the life stressor of breast cancer (BC) diagnosis. Previous studies primarily examined chemotherapy's effect on CRCI, while this study aimed to assess CRCI in women with BC before any treatment and explore potential associations with stress. A population-based study with 112 treatment-naïve women with BC and 67 HC was conducted. Cognitive function was assessed via neuropsychological assessment. Cognitive complaints and psychological stress markers (i.e., cancer-related stress (related to PTSD), depressive and anxiety symptoms) were measured with a self-report battery. Biological stress markers (i.e., cortisol and α-amylase) were collected from saliva. The findings revealed that treatment-naïve women with BC had greater impairments in processing speed and verbal memory, along with more frequent cognitive complaints than HC. Multilinear regressions showed that a) steeper α-amylase slope, younger age and lower overall cancer-related stress were associated with better overall cognitive performance, and b) greater depressive symptoms were associated with more frequent cognitive complaints. These findings indicate that CRCI can start before BC treatment and that stress could contribute to it. Paper III builds on the findings from Papers I and II, indicating that life stressors may contribute to cognitive impairment by affecting stress markers. Since the cancer itself and surgery can trigger stress responses and disrupt circadian rhythms, further increasing the risk of CRCI, Paper III explored whether BLT could mitigate the negative effects associated with BC (including BC surgery) on cognitive function and stress. 
A double-blind, randomized controlled trial was conducted with the same participants and measurements as in Paper II. Participants were randomly allocated to receive circadian-stimulating bright white light (BWL, N = 60) or non-circadian-stimulating dim white light (DWL, N = 57) for four weeks post-BC surgery. Linear regression and path analyses indicated that the BWL group reported significantly fewer cognitive complaints. Additionally, there were non-significant trends, with small to medium effects, for faster reaction times and fewer intrusive thoughts in the BWL group compared to the DWL group. Overall, the results of this Thesis emphasize the importance of monitoring cognitive function in individuals exposed to major life stressors and an early intervention when needed. Future studies should further investigate BLT's potential in ameliorating cognitive impairment and stress among individuals experiencing major life stressors and explore the underlying mechanisms.
  • Item
    Molecular dynamics and sensing with low-cost organic-inorganic nanostructures
    (2024-11-21) Brophy, Rachel; Andrei Manolescu and Halldór Guðfinnur Svavarsson; Iðn- og tæknifræðideild (HR); Department of Applied Engineering (RU); School of Technology (RU); Tæknisvið (HR)
    The organic-inorganic methylammonium lead iodide (MAPI) perovskite has become a prominent research topic as a low-cost solution for solar cell technology. However, this material has yet to be commercialized due to its limiting factor of a high degradation rate. One major cause of this degradation is ionic migration. In the first part of this thesis, the migration process is simulated using molecular dynamics. We find that the dominant diffusion corresponds to the migration of iodide vacancies. Our simulations indicate two ways to reduce the degradation rate of the MAPI perovskite due to this migration. The first is adding a compressive strain, which we prove causes the diffusion coefficient to decrease significantly. The second is adding a hydroxyl group (OH-) into the crystal structure, which replaces the iodide vacancy and stops the migration. Each mitigating factor is tested on a perfect crystal structure and a grain boundary-infused structure. Combining the MAPI material with silicon nanowires (SiNWs) prepared at our Nanophysics Center at Reykjavik University is an attempt to stabilize the MAPI material and boost the photovoltaic effect experimentally. This thesis demonstrates an excellent physical matching of such a hybrid structure. However, during this research, we found that the behavior of the silicon nanowires is more complex than expected. The second part of the thesis is dedicated to that behavior, specifically to the detection of organic molecules with silicon nanowires. SiNWs offer unique and versatile capabilities, mainly due to their large surface-to-volume ratio and the high sensitivity of their current-voltage characteristics. These wires, with diameters of only a few tens of nanometers, can enhance performance and miniaturization in various fields. A form of wet chemistry, the so-called metal-assisted chemical etching (MACE) process, allows for low-cost and highly repeatable fabrication of SiNWs. Utilizing this structure, we were able to create an ultra-sensitive NO2 gas sensor. Unlike many commercially available gas sensors, our SiNW-based sensors can operate under high relative humidity and can experimentally detect 20 parts per billion of NO2 gas through chemisorption.
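The abstract reports that compressive strain lowers the diffusion coefficient but does not state how that coefficient is extracted; a common estimator in molecular dynamics (one plausible choice, not confirmed by the thesis) is the Einstein relation applied to the mean-squared displacement of the migrating species, here the iodide vacancies, optionally combined with an Arrhenius fit over temperature:

```latex
D \;=\; \lim_{t\to\infty}\,\frac{\big\langle\, \lvert \mathbf{r}(t)-\mathbf{r}(0) \rvert^{2} \,\big\rangle}{6\,t},
\qquad
D(T) \;=\; D_{0}\,\exp\!\left(-\frac{E_{a}}{k_{B}T}\right).
```

In this picture, a compressive strain that raises the migration barrier E_a, or an OH- group that removes the mobile vacancy altogether, would reduce D, which is one way to read the trends reported above.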
  • Item
    Advancing human soft tissue pathology assessment using artificial intelligence in medical imaging
    (2024-12) Khatun, Zakia; Paolo Gargiulo, Francesco Tortorella, Co-advisor: Halldór Jónsson; Department of Engineering (RU); Verkfræðideild (HR); School of Technology (RU); Tæknisvið (HR)
    Artificial Intelligence (AI) has revolutionized various fields by automating complex tasks, uncovering patterns in large datasets, and making accurate predictions. Machine learning, particularly deep learning, plays a pivotal role in enabling AI to emulate human intelligence for tasks such as image segmentation, classification, and recognition. In the realm of medical imaging, modalities like Magnetic Resonance Imaging (MRI), Computed Tomography (CT), and Ultrasound are indispensable tools for diagnosing pathologies, detecting abnormalities, guiding treatment plans, and monitoring disease progression. The integration of AI with medical imaging offers the potential to enhance the accuracy and efficiency of these processes. In particular, AI’s application in the assessment of tendon-related conditions, such as tendinopathy, presents a promising avenue for improving patient care. Tendinopathy can significantly impact a patient’s quality of life, and early detection is crucial to optimize treatment outcomes. This thesis focuses on the development of advanced AI-driven methods for analyzing human soft tissue pathologies, with a primary emphasis on tendon segmentation, pathology detection (classification), and tendon reflex response assessment. By automating the analysis of tendons and other human soft tissues, these methods aim to reduce human error and variability, thereby enabling more consistent and reliable clinical decisions. Ultimately, the goal is to support earlier, more accurate diagnoses and interventions, leading to better patient outcomes and more personalized treatment strategies. This thesis begins with a study that analyzes MRI and CT scans from 47 participants to investigate the relationships between the tendons, cartilage, and muscles in the knee. This study has two primary objectives: first, to predict knee cartilage degeneration, and second, to predict patellar tendinopathy. For both objectives, predictions are made using features extracted solely from the patellar tendon and quadriceps, rather than directly from the cartilage itself. This approach explores the potential of using features from surrounding tissues as indirect predictors of knee-related pathologies. This study demonstrates that both knee cartilage degeneration and patellar tendinopathy can be predicted using these features from adjacent structures, highlighting the importance of surrounding tissues as potential indicators of pathology. Traditional machine learning models are employed to identify the most relevant features for each prediction task, highlighting their importance in the diagnosis of these conditions. This foundational research deepens our understanding of the interrelationships between knee soft tissues, contributing to more accurate diagnostic approaches in musculoskeletal health and enhancing clinical decision-making and treatment strategies. A central focus of this thesis is the development of an end-to-end tendon segmentation module. This system integrates a superpixel-based coarse segmentation step that serves as a foundation for the final, more precise segmentation. In this approach, the segmentation task is framed as a superpixel classification problem. To achieve this, two distinct approaches are developed: (1) Random Forest (RF) and Support Vector Machine (SVM) classifiers for superpixel categorization, and (2) a Graph Convolutional Network (GCN) for transforming superpixels into graph structures for node classification.
The RF and SVM classifiers demonstrate exceptional performance, achieving Area Under the Curve (AUC) scores of 0.992 and 0.987, respectively, with high sensitivity, indicating their effectiveness in accurately classifying superpixels. Although the GCN approach yields slightly lower performance, it showcases the potential of deep learning methods for improving segmentation by leveraging the structural relationships between superpixels. The findings suggest that both traditional machine learning and deep learning techniques offer promising avenues for advancing tendon segmentation, with superpixel-based methods offering a pathway to more reliable and automated segmentation in medical imaging. Another key component of this thesis is the development of an end-to-end tendon pathology detection module, utilizing the same MRI dataset. This module adopts a graph-based approach, where superpixels are treated as nodes and connected by edge relationships. Each MRI scan is transformed into a graph, with the task framed as a graph classification problem to determine the presence or absence of pathology. To achieve this, a Graph Echo State Network (GESN) is employed. Known for its ability to efficiently represent data without the need for iterative backpropagation, the GESN leverages both temporal and structural dependencies in the data, enhancing classification performance. In this study, the GESN outperforms traditional machine learning models, achieving a mean accuracy of 0.953 and sensitivity of 0.943. These results underscore the potential of the GESN to significantly enhance diagnostic accuracy, offering a powerful tool for early detection and clinical decision-making in tendon pathology assessment. Moreover, the GESN’s ability to handle complex, high-dimensional data suggests its broad applicability to other medical imaging tasks, further expanding its potential clinical utility. The final study of this thesis explores the impact of demographic factors, including age, height, weight, and gender, on reflex response times in healthy individuals. This analysis is based on electromyography (EMG) recordings from 40 participants. The results reveal that elderly individuals, particularly those who are taller, heavier, and male, exhibit delayed reflex onsets. Even after normalizing for height, older participants still demonstrate slower reflex responses. These findings highlight the role of demographic factors in neuromuscular reflexes, aiding in the diagnosis and early detection of related disorders. In conclusion, this research demonstrates the potential of AI, particularly superpixel based and graph-based models, to advance tendon pathology assessment and exploratory tendon reflex studies, leading to better patient outcomes and musculoskeletal health management.
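As an illustration of the superpixel-classification framing described above, the sketch below oversegments an image with SLIC, builds one feature vector per superpixel, and labels superpixels with a Random Forest. It is a minimal stand-in, not the thesis pipeline: the image, features, and labels are synthetic placeholders.

```python
# Minimal sketch of superpixel-based segmentation: SLIC oversegmentation
# followed by per-superpixel Random Forest classification.
# The image, features, and labels are synthetic stand-ins, not the thesis's MRI data.
import numpy as np
from skimage.segmentation import slic
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
image = rng.random((128, 128))                       # placeholder grayscale "slice"
mask = np.zeros_like(image, dtype=int)
mask[40:90, 50:100] = 1                              # placeholder "tendon" ground truth

# 1) Coarse oversegmentation into superpixels.
segments = slic(image, n_segments=200, compactness=0.1, channel_axis=None)

# 2) One feature vector and one majority-vote label per superpixel.
features, labels = [], []
for s in np.unique(segments):
    pix = image[segments == s]
    features.append([pix.mean(), pix.std(), pix.min(), pix.max()])
    labels.append(int(mask[segments == s].mean() > 0.5))

# 3) Superpixel classification with a Random Forest (an SVM would slot in the same way).
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features, labels)

# 4) Project superpixel predictions back to a pixel-level segmentation map.
pred = clf.predict(features)
segmentation = pred[np.searchsorted(np.unique(segments), segments)]
print("predicted tendon fraction:", segmentation.mean())
```

The same scaffolding admits the second approach the abstract mentions: treat the superpixels as nodes of a graph and classify them with a GCN instead of a per-superpixel classifier.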
  • Item
    Explaining intelligent game-playing agents
    (2024-10-26) Pálsson, Aðalsteinn; Yngvi Björnsson; Department of Computer Science (RU); Tölvunarfræðideild (HR); School of Technology (RU); Tæknisvið (HR)
    Artificial intelligence (AI)-based systems increasingly affect our daily lives. Such intelligent computer agents are becoming increasingly complex; for example, they employ learned machine learning models and extensive lookahead search, often exploring millions of possibilities. Unfortunately, as the complexity of those systems grows, it becomes more difficult to understand the rationale behind their decisions. In this dissertation, we researched how to incorporate explainability into the game-playing domain. We also evaluated to what extent the game-playing domain suits the development of explanations, and showed that using the game-playing domain enables quantification of many aspects that otherwise are costly or infeasible to assess. We used three games as our testbed: Sudoku, Breakthrough, and Chess. We primarily focused on model explainability, with a secondary focus on the search part of game-playing. For model explainability, we investigated three approaches for explaining the evaluation of the model: saliency maps, surrogate models and concept probing. First, we developed evaluation methods for saliency maps to understand the reliability of these methods and quantified the extent to which they can be interpreted in the way we are inclined to interpret them. Furthermore, we introduced a second explainability layer using a surrogate model, where we explain the explanation; this way, we can interpret, at a higher level, what a high saliency means for the explainability methods. Second, we developed methods to compare concept probing results and assess to what extent we can interpret the results as concept importance. We showed that concept probing results and concept importance are only moderately correlated. Third, we unveiled the concepts learned by the world-class chess-playing agent, Stockfish, and exposed differences between its neural network and hand-crafted methods. On the search front, we altered a search-based reasoning process to generate solutions that were more explainable to humans, using the domain of Sudoku puzzles as our testbed. It is a hybrid of a heuristic-based and a constraint-based solver that biases the search towards finding solutions easily explainable to humans.
  • Item
    Systematic guidelines for software modelling: an empirical study on enhancing modelling education and training
    (2024-09-23) Chakraborty, Shalini; Grischa Liebel; Department of Computer Science (RU); Tölvunarfræðideild (HR); School of Technology (RU); Tæknisvið (HR)
    Background: Software modelling holds significant promise for enhancing various aspects of software and systems engineering, including productivity and cost efficiency. Despite these advantages, its widespread adoption across the entire field remains limited. Extensive research has explored the reasons behind this limited adoption, uncovering issues such as subpar code generation, inadequate tool support, and a lack of guidance or training. Specifically, it has been suggested that engineers are reluctant to embrace modelling because it requires excessive effort and offers little usefulness, a view shaped by their educational background. Aim: Overall, our goal is to conduct an empirical investigation with university students, understanding their perception of software modelling during their university studies. We aim to explore students' challenges with modelling assignments, tools, and the modelling content taught in courses. Additionally, we want to understand which aspects of modelling students find beneficial for learning and carry with them into their future academic or industry careers. Finally, we aim to create systematic guidelines for software modelling to be used by both students and instructors. Method: To achieve our goal, we conducted several empirical studies with university students, teaching assistants, and instructors. We collected data through interviews, surveys, and observation studies. To evaluate the effectiveness of the systematic guidelines, we applied them to university courses where modelling was taught. Results: The results described in the thesis are twofold: first, we present university students' perceptions of modelling, and then we describe the guidelines we created based on that perception. Our results show that students recognise the benefits of modelling, such as using models for planning and group communication, but their understanding is hindered by unclear assignment expectations, irregular and insufficient feedback, and lack of experience with problem domains. Finally, we conclude our thesis with systematic guidelines that will help students enhance their modelling skills and knowledge, guiding them to apply this knowledge in real-world industry settings. Conclusion: Our results can potentially enhance education and training in software modelling, benefiting both academic settings and industrial environments. The modelling guidelines encourage students and instructors to follow a structured approach, starting from understanding a modelling problem to selecting a suitable modelling strategy based on the problem domain and tools and ultimately interpreting the resulting model. The guidelines also assist instructors in providing regular and systematic feedback on students' modelling efforts. The guidelines improve communication between students and the course by clearly outlining expectations and values for modelling assignments, which reflects the value of modelling in different future endeavours.
  • Item
    Advancing smart contract security : vulnerability characterization, classification, and automated detection
    (2024-10-03) Soud, Majd; Grischa Liebel, Mohammad Hamadqa; Department of Computer Science (RU); Tölvunarfræðideild (HR); School of Technology (RU); Tæknisvið (HR)
    Context: Smart contracts are computer programs deployed on the blockchain that hold significant cryptocurrency value. They automate transactions and asset transfers, eliminating the need for intermediaries. These contracts serve as the foundation of decentralized applications (DApps), driving blockchain growth. However, their value and role make them attractive targets for attackers, leading to financial losses estimated at around $6.45 billion. Objectives: This PhD dissertation aims to improve the security of smart contracts by systematically identifying, characterizing, and automatically detecting vulnerabilities in their code, hence advancing a more secure blockchain environment. Method: We employ repository mining, qualitative analysis, and data science techniques. Specifically, we utilize repository mining to extract Ethereum smart contract vulnerabilities from public coding platforms and vulnerability databases. Moreover, qualitative methods such as open card sorting are employed to characterize the extracted vulnerabilities. Finally, data science techniques, including machine learning algorithms, are applied to automatically detect and prioritize vulnerabilities early in the smart contract lifecycle. Results: Our research demonstrates the effectiveness of the mentioned methods in identifying, characterizing, and detecting vulnerabilities in smart contracts. Our findings reveal significant novel vulnerabilities in Ethereum smart contracts from the selected repositories. Through qualitative and quantitative analysis, we provide insights into the distribution of these vulnerabilities across several data sources, their characteristics, and their dimensions, including error sources and impacts. Furthermore, this PhD dissertation provides a unified and comprehensive taxonomy of smart contract vulnerabilities. We offer a framework and a suite of tools for automated vulnerability mining, classification, prioritization, and detection using data science techniques to improve the overall security of smart contracts. Finally, several mitigation strategies, quantitatively extracted from real-world smart contract code changes, are presented, along with recommendations and implications for researchers and practitioners. Conclusions: This PhD dissertation highlights the importance of securing Ethereum smart contracts by focusing on key vulnerability characteristics, automated prioritization, and detection to prevent cyberattacks. Despite encountering challenges, such as the need for unified data and extensive resources, this PhD dissertation offers valuable insights into smart contract security during the development process. Another key challenge was addressing logic vulnerabilities, which required a more advanced understanding of code semantics and proved particularly complex compared to syntactic vulnerabilities. Future work should focus on investing in advanced semantic analysis to effectively mitigate complex vulnerabilities. Our findings enable researchers, practitioners, and tool builders to better understand smart contract vulnerabilities and strengthen security policies and tools using our datasets, tools, and framework.
  • Item
    Autonomous causal generalization
    (2024-10) Sheikhlar, Arash; Kristinn R. Thórisson; Department of Computer Science (RU); Tölvunarfræðideild (HR); School of Technology (RU); Tæknisvið (HR)
    For any agent to effectively learn how to achieve its goals via interaction with environments, it must have causal reasoning capabilities. Causal reasoning enables an agent to predict actions’ consequences and hypothesize the necessary conditions for taking appropriate actions. Effective mechanisms for generalizing causal knowledge and reasoning lead to more adaptive and autonomous artificial intelligence (AI) systems. While the topic of generalization has been extensively researched in AI over the past few decades, the question of which mechanisms are required to enable causality-based agents to leverage their familiarity with tasks in order to generalize their knowledge when trying to achieve their goals has remained unaddressed. In this thesis, we present a cumulative learning-based generalization mechanism that allows AI agents to use their familiarity with experienced situations to guide their causal hypothesis generation and exploration processes, thereby making their task planning more flexible and well-informed. The mechanism also makes AI systems more flexible and imaginative by enabling them to exploit both good and bad analogies to invent fresh approaches to new tasks. Constructivism and causality serve as the primary foundations of this work’s methodology. To validate the proposed mechanism, we have implemented it in a control system aspiring to general machine intelligence (GMI), the Autocatalytic Endogenous Reflective Architecture (AERA), and tested it on multiple robot learning tasks as a proof of concept. We have also compared the extended AERA to another GMI-aspiring system, the Non-Axiomatic Reasoning System (NARS), in terms of their generalizability. The efficacy of the presented mechanism is demonstrated through empirical evidence and analytical evaluation.
  • Item
    Design and development of digital health platforms
    (2024) Schmitz, Lisa; Anna Sigríður Islind, Co-supervisor: Marta Kristín Lárusdóttir; Department of Computer Science (RU); Tölvunarfræðideild (HR); School of Technology (RU); Tæknisvið (HR)
    By providing equitable care, digital health platforms can lessen the pressure on global healthcare systems. Digital platforms are pieces of software that connect users, data, services, and systems. In industry, the use of digital platforms has brought advancements in optimizing and automating processes, which could likewise benefit healthcare. Digital health platforms have been utilized effectively in recent years to connect patients, healthcare experts, researchers, and study participants and provide general care, disease and symptom management, and palliative care. They further allow for the collection of data that enables the monitoring of patients and research participants. In care, sharing patient data between patients and healthcare experts facilitates more effective care, while in research, the data fosters the development of novel diagnostic and predictive algorithms and more effective treatments. For advancing care and research in these ways, a meaningful design of digital health platforms that engages users in the longitudinal collection of reliable data is vital. Current literature has focused on identifying aspects of meaningful digital health platforms through literature reviews, patient outcome analysis, and surveys with health experts. However, the design of a digital health platform is created during its design and development process. Still, research rarely examines digital health platforms from the insider view of a designer or a developer, and as a result, there is a lack of interpretation of the design and development processes of digital health platforms. This thesis addresses this gap and contributes to the information system literature by examining the design and development process of the Sleep Revolution digital health platform. The Sleep Revolution digital health platform was designed and developed with the goal of modernizing sleep healthcare and collecting data for the research and development of models and algorithms that have the potential to uncover patterns and provide novel insights. It connects researchers, sleep experts, and study participants to do so. A co-design process has been employed to design and develop the Sleep Revolution digital health platform. Co-design involves stakeholders such as end-users and health experts as active members in the design and development process. Research has highlighted promises and challenges of co-design and has criticized the failure to incorporate it beyond the initial ideation stage of the design of digital health platforms. Drawing from the experience of designing and developing the Sleep Revolution digital health platform from early ideation stages to active usage in sleep studies in multiple European countries, this thesis documents the learnings and challenges of a four-year action design research project that employed co-design through the insights gained from five papers. The included papers look into the design and development of both a mobile and web application for study participants, which are part of the Sleep Revolution digital health platform. Paper P1 follows the digitization process of an analog sleep diary involving expert and user feedback into a mobile application and derives design guidelines for similar mobile applications to record health-related data from the findings. Building on that research, paper P2 analyses user compliance with the digital sleep diary, and the results contribute to an understanding of improving the challenges associated with the analog version. 
This paper gives insights into the feasibility of collecting longitudinal sleep-related data with a mobile application and highlights compliance challenges. Paper P3 examines dignity affront responses that users experienced through digital nudging built into the mobile application. Through the formulation of design guidelines, this paper contributes a better understanding of digital nudging and the design of effective and dignity-preserving digital nudges. Concentrating on the technical side of the digital health platform, paper P4 investigates the modularization of its design through a design system, and it presents conclusions about achieving a clear structure of design systems. Paper P5 shifts the focus from the mobile application to the web application by addressing the digitalization of an in-laboratory tool for measuring cognitive functioning into an at-home web tool and frames the learnings from this research as design guidelines. While the five papers focus on different parts of the Sleep Revolution digital health platform, they all reflect various stages of its design and development process. Based on the findings of these five papers, along with supplementary insights gleaned from the design and development process of the Sleep Revolution platform, the thesis (i) examines how the continuous co-design process of the digital health platform addresses emerging challenges and (ii) presents a framework that conceptualizes three socio-technical key factors for engaging design of digital health platforms for reliable data collection. The thesis also presents the different dimensions of those key factors and how they are interconnected to inform future digital health platforms' design and development. Together, the findings contribute to the discourse on digital health platforms in the information systems literature with the conceptualization of engaging design and the examination of a continuous co-design process.
  • Item
    On learning stochastic models: from theory to practice
    (2024-11-14) Reynouard, Raphaël; Anna Ingólfsdóttir; Department of Computer Science (RU); Tölvunarfræðideild (HR); School of Technology (RU); Tæknisvið (HR)
    The field of model checking offers numerous tools for analysing stochastic models. This analysis provides a comprehensive understanding of the behaviours exhibited by the system represented in the model. Consequently, such analyses are of paramount importance for critical systems. Nonetheless, in certain application domains, the model is not readily accessible and needs to be acquired from partially-observable executions of the system under analysis. This thesis proposes to improve the learning of stochastic models, thereby facilitating the application of model checking to systems for which no model is currently available. This objective is realised via three discrete strategies: (i) formulating an active learning algorithm to learn Markov decision processes, (ii) devising a learning algorithm tailored for synchronised compositions of continuous-time Markov chains, and (iii) developing a library compatible with model checkers, streamlining the process of stochastic model acquisition and its seamless incorporation into the model checking procedure. The first two strategies focus on enhancing and extending the theoretical foundations of learning stochastic models, while the third one centers on facilitating the application of learning stochastic models and their integration into the model checking workflow.
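To ground the kind of object being learned here, the toy sketch below estimates a discrete-time Markov chain's transition probabilities by maximum likelihood from fully observed traces; the thesis addresses the harder cases of Markov decision processes, continuous-time Markov chains, and partially observable executions, so the state names and traces below are purely illustrative.

```python
# Minimal sketch: maximum-likelihood estimation of a Markov chain's
# transition matrix from fully observed traces (frequency counts).
# The thesis tackles the harder, partially observable setting; this
# toy example only illustrates the kind of model being learned.
from collections import defaultdict

traces = [
    ["init", "req", "ok", "req", "ok", "done"],
    ["init", "req", "err", "req", "ok", "done"],
    ["init", "req", "ok", "done"],
]

counts = defaultdict(lambda: defaultdict(int))
for trace in traces:
    for src, dst in zip(trace, trace[1:]):
        counts[src][dst] += 1

# Normalise counts row by row to obtain transition probabilities.
transition = {
    src: {dst: n / sum(dsts.values()) for dst, n in dsts.items()}
    for src, dsts in counts.items()
}
for src, dsts in transition.items():
    print(src, "->", dsts)
```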
  • Item
    Risk analysis applied to integrate safety and security into systems design
    (Háskólinn í Reykjavík, 2024-05) Björnsdóttir, Svana Helen; Páll Jensson; Verkfræðideild (HR); Department of Engineering (RU); Tæknisvið (HR); School of Technology (RU)
    The overall aim of this Ph.D. thesis is to contribute to the further development of the research area of risk analysis and risk management. It aims to bridge the gap between scientific research in this area and its practical application in industry and business, e.g., through the development of ISO standards. Industrial standards, notably ISO standards, are the tools organizations use to manage their risk, by following their guidance and complying with their requirements. Organizations confirm their compliance with these standards through certification, which means that they heavily depend on the quality of the ISO standards to enable them to effectively manage risk. In this thesis, the scientific foundation of ISO standards is analyzed, focusing on the guidance provided for key elements of risk management. The research also explores how well ISO standards are aligned with state-of-the-art risk management literature. The research reveals that the ISO standards lack uniformity in risk terminology and guidance on risk management, particularly for risk analysis. As a result, it is expected that risk management, and specifically the analysis of risk, is not executed satisfactorily. Therefore, it is hypothesized that certain flaws in risk management will be evident in practice. This is verified through six real-life case study examples. Part of this thesis work involved developing a two-step benchmarking model to assess the efficacy of ISO risk management systems with the aim of finding hidden risk issues and improvement opportunities. Furthermore, it is investigated whether risk analysis can be improved by using new and improved analysis techniques to identify hazards and threats. The thesis explores the application of recent analysis techniques that are based on systems theory to reinforce risk management systems based on ISO standards. Systems-Theoretic Accident Model and Processes (STAMP), and the derived Systems-Theoretic Process Analysis (STPA) and Systems-Theoretic Early Concept Analysis (STECA) are applied in real case studies and in an early phase of a major national infrastructure project to meet the safe-by-design engineering concept. The main contribution of this Ph.D. thesis is the identification of what is missing in ISO standards regarding risk management and the development of a two-step benchmarking model to assess the efficacy of ISO risk management systems. The research demonstrates how it is possible to improve risk identification and risk analysis with STAMP, STPA, and STECA techniques. To facilitate such analysis, a special STAMP/STPA software was developed as a part of this thesis work.
  • Item
    Learning about learning : unravelling interactions in higher education with learning analytics
    (2024-06) López Flores, Nidia Guadalupe; María Óskarsdóttir, Co-supervisor: Anna Sigríður Islind; Department of Computer Science (RU); Tölvunarfræðideild (HR); School of Technology (RU); Tæknisvið (HR)
    Learning is a multidimensional process that evolves and changes, influenced by several elements. The sudden shift in teaching modality when the pandemic hit implied changes in social interactions, digital platform use, and collaboration dynamics, potentially impacting the students' learning experience. This research, initially motivated by the unknown effect of the pandemic on teaching and learning practices, is grounded in the Learning Analytics (LA) research field, which focuses on analysing and understanding learning processes and the environments in which they occur. The dissertation takes the standpoint of interactions with the aim of furthering our understanding of how different types of interactions occurring in higher education inform and relate to students' learning strategies and behaviours. In this dissertation, quantitative and qualitative approaches, as well as a variety of data sources, are used to explore the research question: How and to what extent can the analysis of interactions be used to inform features and changes in undergraduates' learning strategies and behaviours? In educational contexts, interactions correspond to the various ways of communication and engagement that occur between learners, instructors, learning material, and technology. The dissertation thereby presents five chapters focused on exploring several interaction types in the context of higher education, including interactions between humans (student, instructor) and systems (digital ecosystem, content). The results presented in the chapters provide insights into the presence and evolution of study profiles; the relationship between usage of digital platforms and resources, assignment solving, and academic performance; a data-driven intervention on class schedules and its effect on students' learning activity and experiences; the design and adoption of an institutional programme for supporting instructors; as well as the dynamics of discussion forum interactions in different teaching modalities and their value and limitations for informing the identification of students at risk of failing. Three main contributions are highlighted in this dissertation. Firstly, by analysing different types of interactions in higher education, the dissertation provides an overview of how these interactions influence each other as well as their relationship with students' learning strategies and behaviours. Furthermore, these insights are helpful for informing the development and adoption of LA research on interactions, and they are illustrated in a conceptual framework integrating the dissertation findings, implications and recommendations. Secondly, the dissertation contributes to addressing shortcomings of LA research. It provides insights into students' behaviours and strategies, interactions, learning material usage, course improvements, and interventions in educational settings. Furthermore, practical recommendations regarding data, resources, and support are provided. Finally, by taking into consideration students, instructors, and digital ecosystems, the dissertation offers insights into the effect of the pandemic on teaching and learning practices in higher education.
  • Item
    Assessing neurophysiological conditions using a multimetric approach (BioVRSea)
    (2024-06) Jacob, Deborah; Paolo Gargiulo; Department of Engineering (RU); Verkfræðideild (HR); Institute of Biomedical and Neural Engineering (IBNE) (RU); School of Technology (RU); Tæknisvið (HR)
    Current diagnosis and longitudinal evaluation of many neurological disorders rely on subjective, questionnaire-based approaches rather than measured biomarkers of the disease. Deficits of postural control are frequently seen in such diseases and provide a route for more objective assessment. This thesis reports the work completed using the unique BioVRSea setup to assess those with a history of concussion and those with early-stage Parkinson’s Disease, using a combination of neurophysiological (electromyography - EMG, electroencephalography - EEG, heart rate) and centre of pressure (CoP) measurements. The BioVRSea experiment is a challenging postural control task triggered by a moving platform and a virtual reality environment, during which the neurophysiological measurements are taken. In the first paper, measurements were performed on 54 professional athletes who self-reported their history of concussion or non-concussion. Biosignals and CoP parameters were analyzed before and after the platform movements, to compare the net response of individual postural control. The results showed that BioVRSea discriminated between the concussion and non-concussion groups. In particular, EEG power spectral density in the delta and theta bands showed significant changes in the concussion group, and the right soleus median frequency from the EMG signal differentiated concussed individuals with balance problems from the other groups. Anterior–posterior CoP frequency-based parameters also discriminated concussed individuals with balance problems. In the second study, on Parkinson’s Disease, 11 early-stage Parkinson’s subjects and 46 healthy over-50s took part in the experiment. Significant differences were found between the two groups in electromyographic and centre of pressure measurements. Correlation analysis of the EMG signal indicated opposite correlations in skewness in the right soleus muscle. In the second Parkinson’s Disease study, 29 healthy and 9 early-stage Parkinson’s Disease subjects were assessed. The results of our work show significant differences in several biosignal features, particularly in the right tibialis anterior, the ellipse area associated with the centre of pressure changes, and the power spectral density changes in the alpha and theta bands of the EEG. This thesis shows the potential of BioVRSea as a quantitative means of developing a multi-metric signature capable of quantifying postural control and distinguishing healthy from pathological responses.
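As a small illustration of one measurement named above, the sketch below computes EEG power spectral density in the delta (0.5-4 Hz) and theta (4-8 Hz) bands with Welch's method; the signal, sampling rate, and window length are illustrative assumptions, not the thesis's acquisition settings.

```python
# Minimal sketch: band power of an EEG channel in the delta and theta bands
# via Welch's power spectral density. Signal and parameters are illustrative.
import numpy as np
from scipy.signal import welch

fs = 250.0                                   # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)                 # 60 s of data
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 6 * t) + 0.5 * rng.standard_normal(t.size)

freqs, psd = welch(eeg, fs=fs, nperseg=int(4 * fs))   # 4 s windows

def band_power(freqs, psd, lo, hi):
    """Integrate the PSD over [lo, hi) Hz with the trapezoidal rule."""
    sel = (freqs >= lo) & (freqs < hi)
    return np.trapz(psd[sel], freqs[sel])

print("delta power:", band_power(freqs, psd, 0.5, 4.0))
print("theta power:", band_power(freqs, psd, 4.0, 8.0))
```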
  • Item
    Runtime monitoring for asynchronous reactive components
    (2024-02-10) Attard, Duncan Paul; Adrian Francalanza, Luca Aceto, Anna Ingólfsdóttir; Department of Computer Science (RU); Tölvunarfræðideild (HR); School of Technology (RU); Tæknisvið (HR)
    Modern software is built on reactive principles, where systems are responsive, resilient, elastic, and message-driven. Despite the benefits they engender, these aspects make the correctness of reactive systems in terms of their expected behaviour hard to ascertain statically. This thesis investigates how the correctness of reactive systems can be ascertained dynamically at runtime. It considers a lightweight monitoring technique, called runtime verification, that circumvents the issues associated with traditional pre-deployment techniques. One major challenge of runtime verification lies in choosing a monitoring approach that does not impinge on the reactive aspects of the system under scrutiny. Such a goal is met only if the monitoring system is itself reactive. We propose a novel monitoring approach grounded on this precept. It treats the system as a black box, instrumenting monitors dynamically and in an asynchronous fashion, which is in tune with the requirements of reactive architectures. Our development approach is systematic, permitting us to directly map the constituent parts of our formal model to implementable modules. This gives assurances that the results obtained in the theory are preserved in the implementation. The first part of the thesis builds on established theoretical results. It lifts these results to a first-order setting to accommodate scenarios where systems manipulate data. We define an asynchronous instrumentation relation that decouples the operation of the system from that of its monitors. This definition forms the basis of our decentralised outline monitoring algorithm presented in the second part of the thesis. Our algorithm employs a tracing infrastructure to collect trace events as the system executes and uses key events as cues to instrument new monitors or terminate redundant ones dynamically. It accounts for the interleaving of events that arises from the asynchronous execution of the system and monitors, guaranteeing that events are analysed by monitors in the correct sequence and without gaps. Part three develops a runtime verification benchmarking framework that is tailored for reactive systems. The framework can generate models that faithfully capture the realistic behaviour of master-worker systems under typical load characteristics. Our tool collects different performance metrics suited to reactive applications, to give a multi-faceted depiction of the overhead induced by runtime monitoring tools. Part four of this thesis embarks on an extensive evaluation of our decentralised outline monitoring algorithm using the benchmarking tool developed in part three. The algorithm is compared against our implementation of inline and centralised monitoring---two prevalent methods used in state-of-the-art runtime verification tools. Apart from demonstrating that our monitoring algorithm is reactive, the experiments we conduct testify that it induces acceptable overhead that, in typical cases, is comparable to that of inlining. These results also confirm that centralised monitoring is prone to scalability issues, poor performance, and failure, making it generally inapplicable to reactive system settings. We are unaware of other comprehensive empirical runtime verification studies such as ours that compare decentralised, centralised, and inline monitoring.
  • Item
    Decision aids to assist Icelandic men with PSA testing and prostate cancer treatment decision-making
    (2023-09) Eiriksdottir, Valgerdur Kristin; Heiðdís B Valdimarsdóttir; Department of Psychology (RU); Sálfræðideild (HR); School of Social Sciences (RU); Samfélagssvið (HR)
    Prostate cancer (PC) is the second most common cancer among men globally and the most common cancer among Icelandic men. Early detection of PC is possible with a prostate-specific antigen (PSA) test. Before making a decision about PSA testing and before deciding which treatment to choose for localized PC, shared decision-making (SDM) is encouraged as many uncertainties are associated with those decisions and it is important that patients understand the pros and cons of all options before making a decision. Decision aids (DAs) have been found to enhance SDM by, for example, affecting patient involvement and patient-physician communication. While it is known that Icelandic men, newly diagnosed with PC, lack information about the pros and cons of different treatment options, no study to date has examined how much information Icelandic men receive about the pros and cons of PSA testing prior to undergoing PSA testing. Furthermore, DAs for PSA testing decision and PC treatment decision are not available in Icelandic. To address these limitations, the aims of the current Thesis were to; 1) establish the need for an Icelandic PSA testing DA, 2) translate and culturally adapt a pre-existing PSA testing DA for Icelandic men, and 3) develop, culturally adapt and extend an interactive DA to assist men, diagnosed with localized PC, to make a treatment decision. In Paper I, all Icelandic men diagnosed with PC from 2015 to 2020 were invited to participate in a quantitative study evaluating how much information men receive about the pros and cons of PSA testing prior to undergoing a PSA test. Participants were 471 men aged 51 to 95 (M = 71.9, SD = 7.3). In Paper II, a pre-existing DA for PSA testing decision was translated and culturally adapted, and usability was tested in a mixed-methods study, first in a qualitative study and then in a quantitative study. Ten men, aged 51 to 66 (M = 59.9, SD = 5.6) participated in the qualitative study using a semi-structured interview and a questionnaire. Minor modifications were made to the DA following the qualitative study, whereafter, a quantitative study was conducted among 135 men aged 50 to 70 years (M = 59.7, SD = 5.2) to evaluate the final version of the DA. In Paper III, a DA for localized PC treatment decision was culturally adapted, modified and extended. The usability of the DA was evaluated in a mixed-methods study, first in a qualitative study and then in a quantitative study. The qualitative study included semi-structured interviews and a usability scale and participants were 12 men, aged 58 to 80 years (M = 70.66, SD=6.58), diagnosed with PC. A thematic analysis of the interviews led to minor revisions of the DA. Then a quantitative evaluation of the usability of the final version of the DA was conducted among 11 newly diagnosed men with PC, aged 60 to 74 (M = 66.18, SD = 4.79). Findings from Paper I underscored the need for an Icelandic PSA testing DA as Icelandic men lack information before making a PSA testing decision. Half of the participants received information about the pros and cons of PSA testing, a third did not receive any information and 22.2% did not even know they were being tested. Additionally, more than 80% of the men reported none or little knowledge of PSA testing. 
The findings of Paper II demonstrated that participants found the translation and cultural adaptation of the DA for PSA testing decision to be successful, as they found the DA helpful and comprehensible and almost all participants said they would recommend it to others. The results of Paper III demonstrated that the DA for treatment decision for localized PC was well received by participants. Participants were satisfied with the DA and the realistic information on side effects that was presented. They found the information about the pros and cons of treatment options helpful, and all noted they would recommend the DA to others facing the same decision. Currently, a randomized clinical trial is being conducted to evaluate the effectiveness of the DA for localized PC treatment decision. The main results from the overall Thesis were that men do not receive adequate information about the pros and cons of PSA testing and that the DAs for PSA testing and localized PC treatment decisions were successfully modified. DAs have been shown to enhance SDM, be cost-effective, and have a minimal burden on the healthcare system. Therefore, the usage of DAs is likely to benefit both patients and healthcare providers of the Icelandic healthcare system.
  • Item
    Concussion history among Icelandic female athletes : mental health, cognition and possible concussion biomarkers
    (2022-12) Unnsteinsdóttir Kristensen, Ingunn; María Kristín Jónsdóttir; Sálfræðideild (HR); Department of Psychology (RU); Samfélagssvið (HR); School of Social Sciences (RU)
    Concussion symptoms are complex. They are non-specific to a concussion, and there is no gold standard for diagnosis and evaluation. For most, symptoms will resolve in days or weeks following a concussion. However, symptoms can become more serious, lasting for months or even years, considerably affecting quality of life. Long-lasting concussion symptoms can include worse mental health and cognitive function, impaired sleep, and ocular and vestibular problems. Sports are a significant risk factor for concussions. Previous concussions, medical history and background, age and gender are also factors influencing the prevalence and the sequelae of concussion and the progression of symptoms. Although women are underrepresented in the concussion literature, many studies have found that they are more at risk of sustaining a concussion and have more severe symptoms. All of the participants in this study were Icelandic female athletes, retired and still active. All had been playing at the highest level in their sport in Iceland. The aims of this Thesis were to 1) examine the usefulness of self-report of concussion history and test if different methods of obtaining self-report would affect the report given and the relationship with an outcome variable; 2) examine concussion history and symptoms among retired and still active female athletes and the relationship with mental health and cognitive abilities; 3) validate self-reported concussion history and symptoms by assessing physiological responses and physical markers in a virtual reality environment. Self-reported history varied according to the method used to elicit concussion history. This change indicates a lack of concussion knowledge and that detailed questioning might be preferable when asking for a self-report of concussion history. This change, and how groups were formed depending on concussion count, affected the relationship with current symptoms. History of concussion was connected to poorer impulse control, more current post-concussion symptoms and more problems with sleep, as well as more anxiety and depression symptoms. Retired athletes with a concussion history tended to have a worse outcome. When evaluating concussion symptoms and responses in a virtual reality environment, biological signals showed discriminative power when comparing those with and without a concussion history. This supports their use as possible biomarkers for concussion. A random forest algorithm predicted concussion history with over 90% accuracy. Overall, the findings support the use of self-report, within an appropriate framework, when assessing concussion history and symptoms among female athletes. However, the limitations of self-report and how they can affect results are also recognised. In addition, results suggest that concussion history is connected to worse mental health and poorer impulse control. The findings also highlight the use of a multimodal approach to concussion assessment and support the use of several biological measures as possible biomarkers for concussion. Results also underline the importance of including technology from different fields in concussion assessment.
  • Item
    Designing capital-ratio triggers for Contingent Convertibles
    (2024-02) Segal, Maxime; Sverrir Ólafsson; Department of Engineering (RU); Verkfræðideild (HR); School of Technology (RU); Tæknisvið (HR)
    Contingent Convertible (CoCo) bonds represent a novel category of debt financial instruments, recently introduced into the financial landscape. Their primary role is to bolster financial stability by maintaining healthy capital levels for the issuing entity. This is achieved by converting the bond principal into equity or writing it down once the minimum capital ratios are violated. CoCos aim to recapitalize the bank before it is on the brink of collapse, to avoid a state bailout at a huge cost to the taxpayer. Under normal circumstances, CoCo bonds operate as ordinary coupon-paying bonds and are converted into the issuer's equity only if capital ratios become insufficient. However, the CoCo market has struggled to expand over the years, and the recent tumult involving Credit Suisse and its enforced CoCo write-off has underscored these challenges. The focus of this research work is, on the one hand, to understand the reasons for this failure and, on the other hand, to modify the instrument's underlying design in order to restore its intended purpose: to act as a liquidity buffer, strengthening the capital structure of the issuing firm. The cornerstone of the proposed work is the design of a self-adaptive model for leverage. This model features an automatic conversion that does not hinge on the judgment of regulatory authorities. Notably, it allows the issuer's debt-to-assets ratio to remain within predetermined boundaries, where the likelihood of default on outstanding liabilities remains minimal. The pricing of the proposed instruments is difficult because the conversion is dynamic. We view CoCos essentially as a portfolio of different financial instruments. This treatment makes it easier to analyze their response to different market events that may or may not trigger their conversion to equity. We provide evidence of the model's effectiveness and discuss the implications of its implementation, in light of the regulatory environment and best market practices.
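To make the trigger mechanism above concrete, a schematic formalisation (not the thesis's exact model) of a capital-ratio trigger is that the bond converts, or is written down, at the first time the issuer's capital ratio crosses a contractual threshold:

```latex
\mathrm{CR}_t \;=\; \frac{E_t}{\mathrm{RWA}_t},
\qquad
\tau \;=\; \inf\{\, t \ge 0 \;:\; \mathrm{CR}_t \le c^{*} \,\},
```

where E_t is core equity, RWA_t risk-weighted assets, c^{*} the contractual trigger level, and \tau the conversion time; viewing the CoCo as coupon-paying debt before \tau and as equity (or a write-down) after \tau is one way to read the portfolio treatment the abstract describes.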
  • Item
    Decision usefulness of accounting information and compliance with accounting standards
    (2023-01) Claessen, Arni; Stefan Wendt; Department of Business Administration (RU); Viðskiptadeild (HR); School of Social Sciences (RU); Samfélagssvið (HR)
    The overall aim of this thesis is to investigate decision usefulness of financial accounting information and compliance with financial accounting standards. This thesis builds on three complementary studies, which explore the decision usefulness of fair value accounting information and compliance of private companies with financial accounting standards. The information perspective of accounting information provides the theoretical framework for the thesis. From this perspective, information content of financial information is useful if it has impact on the users of the accounting information. The decision usefulness of fair value accounting (FVA) is analysed for two important user groups of financial statements, equity analysts and investors. The first study uses a case study approach and interviews with equity analysts to examine the usefulness of judgemental fair value adjustments (Level 3) and the impact that the implementation of IFRS 13 Fair Value Measurement had on the relevance of disclosures and disclosure practice. The second study focuses on investors and uses an event study methodology and regression analysis to examine the association between judgemental fair value adjustments recognised in the income statements and stock price reaction for listed European real estate companies. It also probes if increased disclosures about fair value following the implementation of IFRS 13 increased the relevance of the FVA. The third study examines the level of compliance with national accounting standards and factors which may influence the compliance level for a sample of Icelandic private companies. The first study finds that equity analysts focus on cash-flow and do not incorporate Level 3 fair values as an input in their valuation. These results indicate that Level 3 fair value measurements or fair value disclosures have little relevance or information value for equity analysts. However, the fair value disclosures appear to have to some extent confirmative value as they provide analysts with comfort over their own fair valuation measurements and verify the credibility of management. The additional fair value disclosure requirements implemented with IFRS 13 have scant relevance for equity analysts. The results provide evidence that standard-setters, auditors and preparers of financial statements with significant Level 3 fair value adjustments should focus on predictive and forward-looking disclosures to evaluate future cash flows. Detailed disclosures about the management valuation process and sensitivity analysis have limited relevance for the equity analysts. On the other hand, from the investors´ perspective, the findings of the second study indicate that FVA are value relevant after implementation of IFRS 13 but not in the period before the implementation. In addition, the findings indicate that FVA recognised in semi-annual financial statements are more value relevant compared to annual accounts and positive FVA have more value relevance than negative FVA. While the first two studies focus on the usefulness of accounting information, the third study goes a step further and explores management intentions to provide useful accounting information by analysing compliance with accounting standards. This study expands the literature by using management incentive theories to investigate compliance with national accounting standards by private companies, whereas prior research has mainly focused on publicly listed companies. 
The research reveals an overall compliance level of 75%, which demonstrates poor compliance given that the study is based on mandatory disclosures, for which 100% compliance is required by law. Compliance is particularly low with mandatory disclosure requirements regarding investments in other companies, related party transactions and off-balance-sheet liabilities. The overall results support the concerns about lack of compliance that have been raised by authorities, analysts, credit institutions and other users of Icelandic financial statements. Even though the information asymmetry in private companies appears to be resolved to some extent through private communication with different stakeholders, public financial statements play an important role. These findings therefore have direct implications for policy makers and regulators, as they highlight the importance of improving the enforcement and monitoring of compliance with accounting regulation. Additionally, the study finds an association between compliance levels and company size, audit firm size and the sign-off date of the financial statements, whereas company age, leverage and family ownership do not appear to influence compliance levels.
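    The abstract does not state how the compliance level is measured or modelled. The sketch below illustrates one common approach in the disclosure-compliance literature, an unweighted (Cooke-style) disclosure index followed by a determinants regression of the kind described above; the column names and variable choices are hypothetical and do not reproduce the thesis's actual data or model.

```python
# Minimal sketch (hypothetical data layout, not the thesis's actual dataset):
# an unweighted disclosure-compliance index and a determinants regression.
import pandas as pd
import statsmodels.formula.api as smf

def compliance_index(items: pd.DataFrame) -> pd.Series:
    """Per-company compliance score: disclosed items divided by applicable items.
    Assumes columns: company_id, applicable (0/1), disclosed (0/1)."""
    applicable = items[items["applicable"] == 1]
    return applicable.groupby("company_id")["disclosed"].mean()

def determinants_regression(df: pd.DataFrame):
    """OLS of the compliance index on hypothetical determinants such as company
    size, audit firm size, sign-off lag, age, leverage and family ownership."""
    formula = ("compliance ~ log_assets + big_audit_firm + signoff_lag_days"
               " + company_age + leverage + family_owned")
    return smf.ols(formula, data=df).fit(cov_type="HC1")  # robust std. errors
```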
  • Verk
    Platform for decoupling experience managers and environments
    (2023-10) Mori, Giulio; David Thue; Stephan Schiffel; Tölvunarfræðideild (HR); Department of Computer Science (RU); Tæknisvið (HR); School of Technology (RU)
    Experience Management employs Artificial Intelligence technologies to enhance people's interactive application experiences by dynamically modifying the environment during the experience. In game-related research, there is a prevailing trend where each experience manager is tightly integrated with the specific environment it can manipulate. This integration makes it difficult to compare different managers within a single environment, or a single manager across multiple environments. In this dissertation, I propose a solution to this issue by introducing EM-Glue, an intermediary software platform that decouples experience managers from the environments they can modify. Prior to presenting the solution, I provide a comprehensive problem description and conduct a literature review to explore the current state of the field. Subsequently, I outline the platform's structural design, including a communication protocol that facilitates interaction between managers and environments, as well as the regular flow of communication between them. Additionally, I develop a use case to evaluate the effectiveness of the proposed solution. This involves an environment and two experience managers: the Camelot Wrapper, software I constructed to extend the interactive visualization engine Camelot and connect it to the platform as the environment; PaSSAGE, an existing experience manager adapted for use with the platform; and a random experience manager. The evaluation results demonstrate the platform's ability to decouple experience managers from environments, enabling future work to compare experience managers across multiple environments.
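    The abstract does not describe EM-Glue's actual message protocol or API. The following is only a minimal sketch of the decoupling idea, with hypothetical class and method names: an intermediary relays environment state to a manager and the manager's interventions back to the environment, so that neither side depends on the other's concrete implementation.

```python
# Minimal sketch of the decoupling idea only; class names, methods and message
# shapes are hypothetical and do not reproduce EM-Glue's actual protocol.
from dataclasses import dataclass, field
from typing import List, Protocol


@dataclass
class EnvironmentState:
    """Snapshot of the environment that is forwarded to the experience manager."""
    events: List[str] = field(default_factory=list)


class Environment(Protocol):
    def get_state(self) -> EnvironmentState: ...
    def apply(self, intervention: str) -> None: ...


class ExperienceManager(Protocol):
    def decide(self, state: EnvironmentState) -> List[str]: ...


class Intermediary:
    """Relays state and interventions so that a manager and an environment
    never reference each other's concrete types."""

    def __init__(self, env: Environment, manager: ExperienceManager) -> None:
        self.env = env
        self.manager = manager

    def step(self) -> None:
        state = self.env.get_state()               # environment -> intermediary
        for action in self.manager.decide(state):  # intermediary -> manager
            self.env.apply(action)                 # manager's intervention -> environment
```

    Under these assumptions, swapping PaSSAGE for a random manager would only require supplying another ExperienceManager implementation; the environment code is left untouched.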
  • Verk
    Charge and heat transport in semiconductor core-shell nanowires with temperature bias
    (2023-04-17) Rezaie Heris, Hadi; Sigurður Ingi Erlingsson; Department of Engineering (RU); Verkfræðideild (HR); School of Technology (RU); Tæknisvið (HR)
    In this dissertation, we calculate the electronic charge and heat transport generated by a temperature gradient and a chemical potential bias in tubular semiconductor nanowires in the presence of a magnetic field. We use the Landauer-Büttiker approach to calculate the charge current, electronic heat current, Seebeck coefficient, thermal conductivity, and figure of merit. We also study the influence of the cross-section shape and shell thickness on the electronic conduction properties of tubular nanowires. Charge and heat transport in tubular nanowires are then studied for different polygonal cross-sections and different numbers of impurities of varying strength. The effects of transverse geometry on the thermal conductivity of Si and Ge nanowires are investigated using molecular dynamics simulations with the LAMMPS software. We consider nanowires with different polygonal cross-sections, tubular (hollow) nanowires, and core/shell nanowires with Si/Ge and Ge/Si combinations. We also study the diffusion and phonon-drag thermopower in Si nanowires through numerical calculations with tight-binding models and quantum transport.
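    The abstract does not reproduce the transport expressions. For orientation, the conventional linear-response Landauer-Büttiker forms of the quantities named above are sketched below; these are standard textbook expressions and may differ from the thesis in sign conventions or degeneracy factors.

```latex
% Conventional Landauer-Büttiker expressions (textbook form; sign conventions
% and spin-degeneracy factors may differ from those used in the thesis).
\begin{align}
  I   &= \frac{2e}{h}\int \mathrm{d}E\,\mathcal{T}(E)\,\bigl[f_L(E)-f_R(E)\bigr],\\
  Q   &= \frac{2}{h}\int \mathrm{d}E\,(E-\mu)\,\mathcal{T}(E)\,\bigl[f_L(E)-f_R(E)\bigr],\\
  L_n &= \frac{2}{h}\int \mathrm{d}E\,\mathcal{T}(E)\,(E-\mu)^n
         \left(-\frac{\partial f}{\partial E}\right),\\
  G   &= e^2 L_0,\qquad
  S   = -\frac{1}{eT}\,\frac{L_1}{L_0},\qquad
  \kappa_e = \frac{1}{T}\!\left(L_2-\frac{L_1^2}{L_0}\right),\qquad
  ZT  = \frac{S^2 G\,T}{\kappa_e+\kappa_{\mathrm{ph}}}.
\end{align}
```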
  • Verk
    Agora: unified framework for crowd simulation research
    (2023-06) Diamanti, Michelangelo; Hannes Högni Vilhjálmsson; Department of Computer Science (RU); Tölvunarfræðideild (HR); School of Technology (RU); Tæknisvið (HR)
    Crowd simulation focuses on modeling the movements and behaviors of large groups of people. This area of study has become increasingly important because of its many applications in fields such as urban planning, safety, and entertainment. In each of these domains, the presence of virtual agents exhibiting realistic behavior greatly enhances the quality of the simulations. However, the inherently multifaceted and intricate nature of human behavior presents a unique challenge, necessitating the effective combination of multiple behavior models. This thesis introduces a novel theoretical framework for modeling human behavior in crowd simulations, addressing the unresolved issue of combining a plethora of behavior models that are often developed in isolation. The proposed framework decomposes human behavior into fundamental driving stimuli, which are then represented graphically through the heatmap paradigm. The agents' behavior is subsequently influenced by the heatmaps, which guide them toward attractive areas and steer them away from repulsive locations based on the encoded stimuli. A key advantage of this approach lies in the ability to combine heatmaps using well-defined color operations, effectively integrating different aspects of human behavior. Furthermore, the heatmap paradigm facilitates objective comparison of simulation output with real-world data, employing image similarity metrics to evaluate model accuracy. To realize this framework, the thesis presents a modular software architecture designed to support the various tasks involved in crowd simulation, emphasizing the separation of concerns for each task. This architecture comprises a collection of abstract modules, which are then implemented using appropriate software components to realize the underlying features, resulting in the Agora framework. To assess the ability of Agora to support the various tasks involved in crowd simulation, two case studies are implemented and analyzed. The first case study simulates tourists visiting Þingvellir National Park in Iceland, examining how their behavior is influenced by the visibility of the surrounding environment. The second case study employs Agora to model the thermal and density comfort levels of virtual pedestrians in an urban setting. The results demonstrate that Agora successfully supports the development, combination, and evaluation of crowd simulation models against real-world data. The authoring process, assisted by Agora, is significantly more streamlined than its native counterpart. The integration of multiple models is achieved by combining the heatmaps, resulting in plausible behavior, and model assessment is made convenient through the evaluator provided within the framework. The thesis concludes by discussing the implications of these findings for the field of crowd simulation, highlighting the contributions and potential future directions of the Agora framework.
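    The abstract does not specify the heatmap operations or the similarity metric Agora uses. The sketch below is a minimal illustration of the general idea, assuming a weighted-average blend of stimulus heatmaps and a mean-squared-error comparison against observed data; both choices are hypothetical, not Agora's actual color operations or evaluator.

```python
# Minimal sketch of the heatmap idea; the blending rule and similarity metric
# are illustrative assumptions, not Agora's actual operations.
import numpy as np

def combine_heatmaps(maps, weights):
    """Blend stimulus heatmaps (2D arrays in [0, 1]) by a weighted average."""
    blended = sum(w * m for w, m in zip(weights, maps)) / sum(weights)
    return np.clip(blended, 0.0, 1.0)

def heatmap_similarity(simulated, observed):
    """Compare a simulated density heatmap with observed real-world data using
    mean squared error (lower means more similar)."""
    return float(np.mean((np.asarray(simulated) - np.asarray(observed)) ** 2))

# Example: combine a 'visibility' map and a 'thermal comfort' map, then score
# the simulation output against an observed pedestrian-density map.
rng = np.random.default_rng(0)
visibility, thermal = rng.random((64, 64)), rng.random((64, 64))
combined = combine_heatmaps([visibility, thermal], weights=[0.7, 0.3])
observed_density = rng.random((64, 64))
print(heatmap_similarity(combined, observed_density))
```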