Current projects
Students and members of the department can propose their own projects.
DeepStudy: Question Type Detection (SoSe 2021)
DeepStudy is a continuation of our project funded by the DAAD. In the first phase of the project we worked with an interdisciplinary team (Philosophy, Computer Science, Linguistics, Psychology) and researched questions, learning, and AI. In the second, base.camp phase we continue our focus on developing an AI model for generating questions.
DeepStudy is a learning tool for students who are eager to deepen their understanding and strengthen their memory. Given an input text, DeepStudy automatically generates questions. Questions are helpful because one can immediately check one's understanding. When students create their own questions, they might focus on text passages they already understand and leave out the hard ones. Yet these hard questions are the turning point in understanding and creating new ideas. By automating this process, students gain more time to study and also reflect on a wider range of topics.
In Natural Language Processing the task of Question Generation (QG) has not been explored as much as Question Answering, especially for the German language. With this project we can therefore guide future work in this domain. Because the field is largely unexplored, we pursue and compare two approaches. In the rule-based approach, which generates questions for detected sentences, the challenge lies in the linguistics: understanding how a sentence is built is crucial for turning it into a question. Sentences are detected with spaCy's Matcher and then passed to our own question generation functions. However, these question patterns can be quite repetitive, so balancing breadth and depth is important. To avoid writing many individual functions, we also take a machine learning approach. Here our main concern was having enough German data. We created our own German dataset by writing questions manually and by scraping the web. Combined with a newly released German dataset, this forms the foundation of the ML approach. By fine-tuning different German language models (such as BERT) we hope to build the new state-of-the-art German question generation model.
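The rule-based idea can be illustrated with a deliberately simplified, regex-based sketch. The actual project matches part-of-speech patterns with spaCy's Matcher; the single rule, the function name, and the example sentences below are hypothetical.

```python
import re

def generate_question(sentence):
    """Toy rule: turn a German copula sentence 'X ist Y.' into 'Was ist X?'.
    (The real pipeline uses spaCy's Matcher on POS patterns, not a regex.)"""
    match = re.match(r"^(?P<subject>[A-ZÄÖÜ][\w\s-]*?) ist (?P<rest>.+)\.$", sentence)
    if match is None:
        return None  # no rule applies to this sentence
    return f"Was ist {match.group('subject')}?"

# A sentence the rule covers yields a question; any other sentence yields None.
print(generate_question("Photosynthese ist die Umwandlung von Lichtenergie."))
print(generate_question("Die Zelle teilt sich."))
```

This also shows the repetitiveness problem mentioned above: every rule produces questions of one fixed shape, so breadth requires many such rules.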
Participants: B. Müller
Collaborative Development of Conversational Agents (WS21/22)
Dear students,
the WISTS group offers a new course for master students on "Collaborative Development of Conversational Agents" starting October 18th. This course focuses on the design, development, deployment, and evaluation of conversational agents (e.g., chatbots or voice assistants) for a given problem domain (e.g., customer service, team collaboration).
The course is a joint offering together with the Technische Universität Dresden (Prof. Brendel) and Universität des Saarlandes (Prof. Morana). Students will work on their projects collaboratively in mixed virtual teams with students from the other universities. UHH students can be credited 6 ECTS in "Freier Wahlbereich" as a base.camp project.
Course details and the application form can be found on the course website:
Collaborative Development of Conversational Agents: https://www.uni-saarland.de/lehrstuhl/morana/lehre/collaborative-development-of-conversational-agents-master.html
As we can only accept 15 students in total, please apply as soon as possible if you want to participate!
Looking forward to working with you.
Kind regards,
Eva Bittner
Prototype Development and Integration of a Modular Tactile Sensor Array for Robotic Grippers (WS 20/21)

This project aims to develop and integrate a prototype of a modular tactile sensor array for robotic grippers. While many grippers are able to control the force with which they grip an object, they are incapable of making assumptions about the texture or shape of the object. The work planned for this project is supposed to pave the way for experiments involving tactile measurements inside the gripper and for further evaluations of tactile sensor arrays. It features novelties such as wireless data transmission and a modular design, and it is easy to recreate.
Participant: Niklas Fiedler
Towards a Decentralized Trust-Based Reputation System for Citizen-Driven Marketplaces (SoSe 2020)
As part of the master project “Projekt Smart Cities” (InfM-Proj-VS), work started on implementing a marketplace for sharing self-generated data. This proposal describes why further effort within the base.camp is necessary for the functionality of the project. In a modern world where every residence is getting smarter and producing more data, the need for sharing that data is rising. Such data can be produced by, e.g., self-built weather stations, smartwatches, or other “smart” devices. Currently the data is only locally available to the individuals who produced it. The topic of the project was to develop a platform for sharing this self-generated data, independent of the type of data produced.
Participant: Fin Töter
drasyl (SoSe 2020)
Existing overlay networks almost always come as highly specialized, context-related bulk frameworks. This means that they handle nearly all aspects from communication, consistency checks, and (un-)marshalling of application-specific entities up to particular method calling models for Remote Method Invocation (RMI) support. Often this happens through a forced specification of a programming paradigm/model, such as the actor or agent model.

Besides, there are typical usage constraints such as the need for a network interface (which requires root privileges for installation) or additional overhead from multi-layer models that mimic IP or VLAN. These frameworks may be an optimal solution for some use cases, but they also mean limitations and overhead for projects that are based on a different model and do not need the additional features. All these approaches have in common that they have to solve the problem of communication between nodes in an optimal way. It is precisely this problem, and the gap between raw IP-based communication and context-related frameworks, that drasyl aims to close by offering a minimal, expandable, and transparent transport layer.
Participant: Kevin Röbert
Analysis of Comments on Online News Articles (SoSe 2019)
The shift from paper-based news sources towards online news hubs allows readers to voice their opinions and interact by using the comment function that is often provided alongside the news articles. In 2017 the German news website Spiegel Online (abbr. SPON) reported that usually around 8,000 comments are published in a single day. This increasingly large amount of semantically rich natural language data could, in various ways, be used to enhance interaction between users or to allow aggregated insight into the topics addressed in the comments. The task of this base.camp project is to collect the news articles and comments from the popular news website SPON and apply recently developed natural language models for the following two tasks:
Automated integration of user comments into news articles
Even though the number of users actively participating in discussions on news articles steadily increases, the comment sections are often clearly separated from the articles themselves. In the case of SPON, comments can be accessed either through forums or below articles, visually separated by links to social media and related articles. While the article itself has a topical structure that is easy to follow, comments are sorted chronologically, so comments concerning different sections of the article are mixed. This makes it harder for users to keep track of which aspects of the article are being discussed. The idea is to enhance the discourse between users by showing comments alongside the specific section of the article they concern. To achieve this, I want to automatically recognize strong semantic relatedness between comments and sections of the article, in the sense of aspect mining. The preliminary approach I would like to explore is for the model to learn the relation between comments and article sections from known relations between forum comments, i.e., users answering other users. My goal is to explore whether I can successfully transfer the learned comment-to-comment relationships onto comment-to-article relations. I plan to make the results available as a browser extension, customizing the user experience on the SPON news page.
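The core matching step, assigning a comment to its most related article section, can be sketched with plain bag-of-words cosine similarity. This is only a toy illustration; the project would use learned representations, and the example texts below are invented.

```python
import math
from collections import Counter

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = (math.sqrt(sum(v * v for v in a.values()))
            * math.sqrt(sum(v * v for v in b.values())))
    return dot / norm if norm else 0.0

def best_section(comment, sections):
    """Return the index of the article section the comment is most similar to."""
    c = Counter(comment.lower().split())
    scores = [cosine(c, Counter(s.lower().split())) for s in sections]
    return scores.index(max(scores))

sections = [
    "the chancellor announced a new climate policy",
    "opposition parties criticized the budget plan",
]
print(best_section("i think the climate policy is overdue", sections))  # 0
```

A browser extension could then render each comment next to the section index this function returns.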
Identification and classification of meta-comments
A special kind of comment is the meta-comment. This kind of comment not only addresses the content of the article but primarily focuses on how this content is covered, e.g. giving feedback on the journalist, moderators, or the news website. Previous work by Häring et al. has already explored this topic, applying both traditional and end-to-end machine learning approaches to group meta-comments into three classes. Subsequent manual content analysis has shown that a more fine-grained classification may be useful. My goal is to improve the meta-comment identification performance achieved by Häring et al. and to revise the classification structure. This new classification model will be based on the Bidirectional Encoder Representations from Transformers (abbr. BERT) architecture.
This project was proposed by Marlo Häring from the MAST research group and is actively supported by him and Prof. Dr. Maalej. It is related to the Forum 4.0 research project which aims to analyze, aggregate, and visualize the content and quality of comments in order to enable constructive participation.
Participant
Tim Pietz
V-connect-U (SoSe 2019)
We developed the app V-connect-U during a hackathon hosted by VSIS. Its concept is the sharing of any kind of good with other users of the app. Instead of monetary compensation, sharing goods shall invite social interactions between users. At the end of the hackathon, the app was in a presentable state with partially implemented core functionality. In this proposal, we present the current state of the app and describe the features we would like to develop as part of the base.camp, with the goal of finishing all core features and performing a field study to evaluate the usability of the app.
A moderate number of people carry objects like power banks with them every day but rarely come into a situation where they actually need them. Sometimes a group of friends will have a BBQ in a public park, but they might use only a fraction of their grill's energy before running out of food to grill. We think it would be great to share these commodities, which would otherwise be wasted.
Sharing commodities for free has some advantages over putting a price tag on them:
• Removes the hassle of picking a fair price or having to bargain
• Provides the good feeling of helping someone else out
• Allows people to easily start a conversation with strangers
Our Sharing App aims to have the following features:
• Creating share events (What, Where, When)
• Searching for share events through multiple filters (Location, Category, etc.)
• Signing in/out of events, either for the full duration of the event or for a set amount of time
• Rating the host and/or other participants after the event is over
Target Features
• Registration for new users (currently users have to be written directly into the database)
• Password functionality for users
• Fill the map-view with actual shares from the database
• Implement filtering logic for existing shares instead of showing all of them at all times
• Create a detail view for shares that can be accessed from list/map view
• Sign in/out of shares through the aforementioned detail view
• New subsection under the share section of the app to show users all of their created shares with editing/deletion functionality
• Rating of host and participants after a share event has been completed
MineRL Competition for Sample-Efficient Reinforcement Learning (SoSe 2019)
We are planning to take part in the MineRL competition. The goal of the competition is to foster the development of sample-efficient reinforcement learning algorithms while providing a level playing field for all participants regarding available resources. The competition is organized by various universities, sponsored by Microsoft, and is one of the eight competitions in the NeurIPS competition track. Participants will devise a general algorithm that will be evaluated in Minecraft, a popular video game. The target for the agent is to mine a diamond as quickly as possible, which requires various prior crafting and gathering steps. These prior steps, or subtasks, also give a reward and are specified by the organizers. The use of any domain-specific feature extraction or code is forbidden. For training, a dataset containing more than 60 million state-action pairs is provided by the organizers.
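A standard way to exploit such a demonstration dataset is behavioral cloning. As a heavily simplified sketch (the real competition uses pixel observations and neural networks; the discrete states and actions below are invented), a tabular variant simply imitates the most frequent demonstrated action per state:

```python
from collections import Counter, defaultdict

def fit_tabular_policy(pairs):
    """Behavioral cloning in its simplest form: for each (discrete) state,
    imitate the action the demonstrators chose most often."""
    actions_per_state = defaultdict(Counter)
    for state, action in pairs:
        actions_per_state[state][action] += 1
    return {s: counts.most_common(1)[0][0]
            for s, counts in actions_per_state.items()}

# Hypothetical state-action pairs, e.g. seeing a tree -> chopping it.
demos = [("tree_nearby", "chop"), ("tree_nearby", "chop"),
         ("tree_nearby", "walk"), ("stone_nearby", "mine")]
policy = fit_tabular_policy(demos)
print(policy["tree_nearby"])  # "chop" (2 of 3 demonstrations)
```

Sample efficiency then comes from using such imitation to bootstrap a reinforcement learner instead of exploring from scratch.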
Participants
Anton Wiehe, Florian Schmalzl and Brenda Vasiljevic Souza Mendes
Supervisor
Manfred Eppe
A Model-Based Retrospective and Prospective Inference Approach (WS 19/20)
Inspired by human event cognition, this project explores the use of model-based approaches to solve tasks more efficiently. By combining continuous optimization of upcoming future events with pondering about the past, an agent can flexibly plan with regard to the given task and context. Ongoing optimization of the internal state leads to potentially better event encodings, resulting in improved goal-oriented actions. To handle and predict sequences of events, recurrent neural networks in various shapes represent possible solutions. In addition, techniques like experience replay and model-predictive control could be used to build an architecture capable of adapting its internal reasoning to the given task. For the evaluation, the physics simulation MuJoCo and a goal-based robotics simulation are used to apply the concept to different environments. This project extends the Hindsight Experience Replay technique with a forward model to show improvements in learning performance compared to a purely model-free approach.
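The Hindsight Experience Replay technique that this project extends can be sketched in a few lines. States, goals, and the sparse reward are reduced to plain values here; the field names are illustrative, not the project's actual data structures.

```python
def her_relabel(episode, reward_fn):
    """Hindsight Experience Replay: relabel a failed episode as if the
    finally achieved state had been the goal all along, turning failures
    into useful learning signal."""
    achieved_goal = episode[-1]["achieved"]  # state actually reached at the end
    relabeled = []
    for step in episode:
        new_step = dict(step)
        new_step["goal"] = achieved_goal     # substitute the hindsight goal
        new_step["reward"] = reward_fn(step["achieved"], achieved_goal)
        relabeled.append(new_step)
    return relabeled

# Sparse reward: 0 if the achieved state equals the goal, else -1.
reward_fn = lambda achieved, goal: 0 if achieved == goal else -1

episode = [{"achieved": "A", "goal": "G", "reward": -1},
           {"achieved": "B", "goal": "G", "reward": -1}]
print(her_relabel(episode, reward_fn)[-1]["reward"])  # 0: the final step now succeeds
```

The project's extension would additionally use a learned forward model to predict future achieved states, rather than relying only on replayed experience.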
Participants
Frank Röder
Supervisor
Manfred Eppe
LAMP – Learning Analysis and Management Project (WS 19/20)
Continuous improvement of quality management in university teaching has been a focus topic in recent years, and its importance has increased accordingly, which is also the case for the Department of Informatics at the University of Hamburg. In this context, formative-adaptive tests based on the open-source platform OpenOLAT were introduced in selected modules; they are intended to deepen and interactively review the individual preparation and follow-up of the lecture content. Despite the potential of data analysis and of adapting lecture contents and exercises based on student results, the analysis options of these software solutions have so far been very limited and of low informative value. The LAMP base.camp project is intended to develop a software solution that provides advanced management and analysis options for these tests, enabling lecturers to gain specific insights into the learning progress of the students and to develop learning profiles that characterize their achievements. The solution aims to provide an option for compiling those tests and for creating process-mining-based analyses that characterize the difficulty of individual questions, thus creating an expandable platform that significantly increases teaching quality through the data-driven statements derived from those analyses.
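One of the planned analyses, characterizing the difficulty of individual questions, reduces to a simple per-question statistic over student attempts. A minimal sketch follows; the record fields are made up and do not reflect OpenOLAT's actual export format.

```python
def question_difficulty(attempts):
    """Difficulty of each question as the fraction of incorrect answers
    across all student attempts (1.0 means nobody solved it)."""
    totals, wrong = {}, {}
    for attempt in attempts:
        q = attempt["question_id"]
        totals[q] = totals.get(q, 0) + 1
        wrong[q] = wrong.get(q, 0) + (0 if attempt["correct"] else 1)
    return {q: wrong[q] / totals[q] for q in totals}

attempts = [
    {"question_id": "Q1", "correct": True},
    {"question_id": "Q1", "correct": False},
    {"question_id": "Q2", "correct": False},
    {"question_id": "Q2", "correct": False},
]
print(question_difficulty(attempts))  # {'Q1': 0.5, 'Q2': 1.0}
```

On top of such statistics, the planned process-mining analyses would additionally look at the order and timing of answers, not just correctness.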
Participants
Julien Scholz, Pascal Wiesenganger
Analysing and classifying toxic speech in online forums (SoSe 2020)
Manual moderation of comments in online forums without the assistance of any tools can be time-consuming. One use case of the deep learning model trained in this project could be to aid moderators in deciding whether a comment complies with or violates the rules. Comments gathered through crawling will be used to fine-tune the model for this specific task. In the end, the model will suggest whether a comment should be removed or approved, providing a reason together with a confidence level. The model will be based on approaches for distilling the knowledge of larger, heavier models such as BERT-Large (Devlin et al., 2019) into smaller, faster models to achieve near real-time classification of comments while retaining high accuracy.

Based on the Forum 4.0 project, the deep learning model trained in this project is going to assist moderators of online forums in real time, easing their decision to approve or remove a user comment. The goal of this project is to use the model to classify clear-cut comments that violate the rules; for comments that are borderline acceptable, the aim is to have the moderator decide. With this human-in-the-loop approach we strive to quickly deal with clear-cut cases and focus on the hard ones. The deep learning model could also be used in other cases, for example labeling comments as part of a study to see how many people are coronavirus sceptics. In the case of Spiegel Online, they receive roughly 300,000 comments in a month; at that volume, a comment arrives on average every 8.6 seconds. This leaves little time for the model to classify each comment in real time; even in this example, a state-of-the-art model would not be able to keep up with the speed at which comments are made. In this project we explore different implementations and uses of the state-of-the-art natural language processing model BERT (Devlin et al., 2019). This model is trained by looking at the given text sequence both left-to-right and right-to-left, thus achieving a better understanding of language context compared to previous models that only worked in one direction. Our goal is to find a distilled model of BERT that achieves a CPU inference time low enough for near real-time classification while still retaining high accuracy.
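The latency budget quoted above follows directly from the comment volume. A quick back-of-the-envelope check (assuming a 30-day month, which is how the 8.6-second figure comes out):

```python
comments_per_month = 300_000
seconds_per_month = 30 * 24 * 3600  # assuming a 30-day month

# Average interval between incoming comments: the time budget a
# single-threaded classifier has per comment to keep up in real time.
budget = seconds_per_month / comments_per_month
print(round(budget, 1))  # 8.6 seconds per comment
```

This makes the case for distillation concrete: the average gap is generous, but comment arrivals are bursty, so peak-hour throughput must be far better than 8.6 seconds per comment.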
Participant: Kevin Friedrich
EHIC2IoT (SoSe2020)
When doctors and laboratories communicate about samples, e.g. those taken from corpses, the documentation of test results is currently handled by hand-written notes and pages, which have to be physically transmitted to each party or typed into software manually. EHIC2IoT wants to help doctors, laboratories, and medical staff handle samples more efficiently so that measurements can be taken faster.

The basic idea for this project can be found here: https://devpost.com/software/ehic2qr. During the EUvsVirus Hackathon in 2020, students of the University of Hamburg came together to build an app for laboratories, especially in Hamburg, that deal with samples taken from people suspected of a coronavirus infection. The idea is to extend and adjust the EHIC2QR app so that it no longer covers only the single use case of supporting the medical procedure in a coronavirus case, but supports the general procedure of medical staff communicating with laboratories. Reducing the time and effort of this communication task is a crucial challenge, as it is a key factor in how fast infectious diseases can spread. Today, an infected person may be told only a week after testing whether the result was positive or negative. This is unacceptable, and that is why Dr. Armin Hoffmann from the UKE, Hamburg, and J. Harder and J. Khattar want to build this app.
Participants: Jan Harder, Johnvir Khattar
Mobile CRM Application for B2B Communication enabling Ordering, Messaging and Connecting between Suppliers and Retailers (SoSe 2019)
Introduction
Today, communication between a retailer and their supplier has no standardization regarding the communication channel. In fact, these business partners use individual forms of communication, which means that suppliers receive orders and messages through different channels. Communication in such cases is done by e-mailing, calling, or using messaging apps such as WhatsApp. From a supplier's perspective, receiving orders, complaints, and questions through different channels causes higher transaction costs and more errors than receiving the major part of the communication through one channel that is primarily designed for this information management task.
There are several possibilities to efficiently connect two businesses via standardization of communication, for example by using standards like EDI or by using a B2B web shop. Nevertheless, there is still the issue that not every business implements the same way of communicating, either because of the complexity of a specific channel or because of a long-term practice of using one specific channel without ever re-evaluating it.
Talking to business owners reveals another issue: too many things are done manually in the process of receiving orders. For example, when a supplier plans to deliver to one customer, it is common practice to manually contact all other customers located near that customer and ask them to place orders. There are several other situations in which the supplier has to manually identify and then contact customers to receive a certain reaction from them. This causes the transaction costs to be higher than they would be with automation.
Our solution
The goal of this project is to build and distribute a mobile-first B2B communication application so that suppliers and retailers of different goods can conduct their business communication through one central channel and automate basic tasks.
Access to the app is supposed to be exclusively limited to businesses with a business registration. After registering and gaining access to the account, every user is supposed to customize their user account by entering data relevant for business communication. Within the app, every user can connect an in-app shop to their account. This lets everyone in the network potentially see and shop at different suppliers on their mobile phones as soon as they are granted access by the specific shop owner. To illustrate, assume there is a global shop for every user, where the products of all connected suppliers are shown and can be filtered by user, product, etc. As soon as the shop owner enables access, a user can send orders, which are delivered as a PDF or similar to the recipient in the messaging display, who can easily send order confirmations back and export the order in different formats into their ERP system. As in other messaging apps, users can write messages to one another. The whole application has the objective of letting every user see and easily adjust the visibility of their personal information, shop, etc. Every user can easily manage and see all connected suppliers and connected retail customers. To support users, certain communication tasks will be automated, such as automatically determining which customers have to be contacted about purchasing orders by entering the transportation route of the supplier's vehicle. By implementing such functions, the users' processes will have faster lead times due to fewer errors.
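The planned route automation can be sketched as a simple proximity filter: given the waypoints of a delivery tour, find the customers close enough to be worth contacting. The coordinates, names, and distance threshold below are invented, and planar distance stands in for a proper geodesic calculation.

```python
import math

def customers_to_contact(route, customers, max_dist):
    """Return (sorted) names of customers whose location lies within
    max_dist of any waypoint on the planned delivery route.
    Uses planar distance for brevity; a real app would use geodesics."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    return sorted(
        name for name, loc in customers.items()
        if any(dist(loc, waypoint) <= max_dist for waypoint in route)
    )

route = [(0, 0), (5, 0), (10, 0)]          # waypoints of the delivery tour
customers = {"Bakery Schmidt": (5, 1),     # one unit off the route
             "Cafe Nord": (20, 20)}        # far away from every waypoint
print(customers_to_contact(route, customers, max_dist=2.0))  # ['Bakery Schmidt']
```

The app would then message the returned customers automatically instead of the supplier contacting each one by hand.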
Trash King
Trash King is an app designed to help clean the city of Hamburg. It should motivate citizens to report and collect trash in public environments. We display trash that has already been reported, but also provide the ability to report other trash. Additionally, there are trash areas that we believe get polluted frequently; those areas can be cleaned periodically by users. As a reward, we assigned points to each trash point and area and implemented a high score as a gamification aspect. We plan to integrate the "Peertrust" metric to prevent gaming of the system, as well as social functions. We are using the Melde-Michel city service API and plan to enhance the feedback between our app and the service.
Participants: F. Stiefel, L. van der Veen
Supervision: P. Kisters, Prof. Lamersdorf