
Keynote Lectures

From Representation to Mediation: Modeling Information Systems in a Digital World
Jan Recker, University of Cologne, Germany

Big Data Integration
Philippe Cudré-Mauroux, University of Fribourg, Switzerland

Time and Processes
Johann Eder, Alpen-Adria Universität Klagenfurt, Austria

Architecting the Future of Software Engineering
Douglas Schmidt, Vanderbilt University, United States


From Representation to Mediation: Modeling Information Systems in a Digital World

Jan Recker
University of Cologne

Brief Bio
Prof. Dr. Jan Recker is an Alexander von Humboldt Fellow, Chaired Professor for Information Systems at the University of Cologne, and Adjunct Professor at the QUT Business School, Australia.
In his research he explores the intersection of technology, people, and work. He works with very large organizations, such as Woolworths, SAP, Hilti, Commonwealth Bank, Lufthansa, Ubisoft, and federal and state governments, as well as with very small organizations ("start-ups") in the consumer goods, information technology, and financial sectors. He tackles questions such as:
• How do small and large organizations deal with digital innovation and transformation?
• How do products and processes change through digitalization?
• How can digital solutions help build a sustainable future?
Jan's research in these areas draws on quantitative, qualitative, and mixed field methods. His research has appeared in leading information systems, management science, software engineering, project management, computer science, and sociology journals. He has also written popular textbooks on scientific research and data analysis, which are in use at over 500 institutions in over 60 countries. He ranks as one of the most published information systems academics of all time. In 2019, he was named the #1 business researcher under 40 by the German magazine Wirtschaftswoche, and in 2018 he became the youngest academic ever named an AIS Fellow.

The role of information systems is changing in an increasingly digitalized world. Does this mean that established conceptual modeling practices relevant to the analysis and design of systems must change as well? In this talk, I will answer this question with a definitive "yes". I will review the traditional assumptions behind the conceptual modeling of information systems and demonstrate how advances in digital technology increasingly challenge these assumptions. I will then present a new framework for conceptual modeling that is consistent with the emerging requirements of a digital world. The framework draws attention to the role of conceptual models as mediators between physical and digital realities. It identifies new research questions about grammars, methods, scripts, agents, and contexts that are situated in intertwined physical and digital realities. I will discuss several implications for conceptual modeling scholarship in systems analysis and design, relating to the need to develop new methods and grammars for conceptual modeling, to broaden the methodological array of conceptual modeling scholarship, and to consider new dependent variables.



Big Data Integration

Philippe Cudré-Mauroux
University of Fribourg

Brief Bio
Philippe Cudré-Mauroux is a Full Professor and the Director of the eXascale Infolab at the University of Fribourg in Switzerland. He received his Ph.D. from the Swiss Federal Institute of Technology EPFL, where he won both the Doctorate Award and the EPFL Press Mention in 2007. Before joining the University of Fribourg, he worked on information management infrastructures at IBM Watson (NY), Microsoft Research Asia and Silicon Valley, and MIT. He recently won the Verisign Internet Infrastructures Award, a Swiss National Center in Research award, a Google Faculty Research Award, as well as a €2 million grant from the European Research Council. His research interests are in next-generation Big Data management infrastructures for non-relational data and AI. Webpage: http://exascale.info/phil

Until recently, structured (e.g., relational) and unstructured (e.g., textual) data were managed very differently: structured data was queried declaratively using languages such as SQL, while unstructured data was searched using boolean queries over inverted indices. Today, we witness the rapid emergence of Big Data Integration techniques that leverage knowledge graphs to bridge the gap between different types of content and to integrate unstructured and structured information more effectively. I will start this talk by giving a few examples of Big Data Integration. I will then describe two recent systems built in my lab that leverage such techniques: ZenCrowd, a socio-technical platform that automatically connects Web documents to semi-structured entities in a knowledge graph, and Guider, a Big Data Integration system for the cloud.



Time and Processes

Johann Eder
Alpen-Adria Universität Klagenfurt

Brief Bio
Johann Eder is Full Professor of Information and Communication Systems in the Department of Informatics-Systems of the Alpen-Adria Universität Klagenfurt, Austria. From 2005 to 2013 he served as Vice President of the Austrian Science Fund (FWF). He has held positions at the Universities of Linz, Hamburg, and Vienna, and was a visiting scholar at AT&T Shannon Labs, NJ, USA.
The research interests of Johann Eder are information systems engineering, business process management, and data management for medical research. A particular focus of his work is the evolution of information systems and the modelling and management of temporal information and temporal constraints.
Johann Eder successfully directed numerous competitively funded research projects on workflow management systems, temporal data warehousing, process modelling with temporal constraints, application interoperability and evolution, information systems modelling, information systems for medical research, etc.
Johann Eder has published more than 190 papers in peer-reviewed international journals, conference proceedings, and edited books. He serves on the steering committees of CAiSE (as chair), ADBIS, and TIME. He has chaired or served on numerous program committees for international conferences and has acted as editor and referee for international journals.

Processes are ubiquitous for modeling dynamic phenomena in many areas, such as business, production, health care, and robotics. Many of these applications require temporal aspects to be dealt with adequately, and the very notion of a process already involves a notion of time: of being able to talk about before and after. Process models need to express temporal properties of the context or environment of processes, as well as temporal conditions that the process execution has to satisfy. Process models therefore contain constructs for durations, temporal constraints such as the allowed time between events, and deadlines. Furthermore, process models need a notion of correctness. We discuss different such notions, including satisfiability and controllability, together with techniques that can be employed to check these properties of process models at design time and of the execution of process instances at run time.



Architecting the Future of Software Engineering

Douglas Schmidt
Vanderbilt University
United States

Brief Bio
Dr. Douglas C. Schmidt is the Cornelius Vanderbilt Professor of Computer Science, Associate Provost for Research Development and Technologies, Co-Chair of the Data Science Institute, and a Senior Researcher at the Institute for Software Integrated Systems, all at Vanderbilt University. He is also a Visiting Scientist at the Software Engineering Institute (SEI) at Carnegie Mellon University (CMU).
Dr. Schmidt's research covers a range of software-related topics, including patterns, optimization techniques, and empirical analyses of frameworks and model-driven engineering tools that facilitate the development of mission-critical middleware for distributed real-time embedded (DRE) systems and mobile cloud computing applications. He has published more than 650 technical papers and books and has mentored and graduated over 40 Ph.D. and Master's students.

Software has become essential to modern life and controls functions as diverse as home appliances, aircraft flight control systems, mobile phones, health care systems, financial transactions, and digital learning. Individuals, organizations, markets, and governments are increasingly dependent on software. Despite advances in computing and software technologies, however, we continue to see headlines about failures related to software.
This presentation will cover investments and advances in research needed to address these issues. In particular, I will discuss how the current notion of software development will be replaced by one where the software pipeline consists of AI and humans collaborating at scale on continuously evolving systems. These software-enabled systems will be even more pervasive than today, with levels of trust commensurate with the importance of the decisions they make.
I will also discuss how software and humans need to become mutually trustworthy peers in socio-technical ecosystems, where human expressions of intent are reliably understood and socio-technical platforms enable collaboration at scale, leading to socially resilient, ethical, and unbiased behavior. Advances in compositional correctness and continuous reflection are also needed to allow intelligent systems to learn from experience and continuously improve.