Nabla

System for the automatic generation of practice exercises

With Nabla, any number of single-use tasks of a given type can be generated on the spot, randomly and individually, from a large pool. The user's solution is checked automatically, a model solution is likewise generated on the spot and displayed, and in the case of an incorrect answer, the user's mistakes are highlighted.
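The generate-check-display cycle described above can be sketched in a few lines. This is a minimal illustration only: the task type (a linear equation), the parameter ranges, and the function names are all assumptions for the sake of example, not Nabla's actual implementation.

```python
import random

def generate_task(seed):
    """Generate a one-off task a*x + b = c from a seed (reproducible)."""
    rng = random.Random(seed)
    a = rng.randint(2, 9)
    x = rng.randint(-10, 10)   # the intended solution
    b = rng.randint(-20, 20)
    c = a * x + b
    return {"prompt": f"Solve {a}*x + {b} = {c} for x.", "solution": x}

def check_answer(task, answer):
    """Check the user's answer; return (correct?, model solution)."""
    correct = (answer == task["solution"])
    return correct, task["solution"]

# A fresh seed yields a fresh one-off task; the same seed reproduces it,
# so a stored identifier suffices to regenerate and re-display a task.
task = generate_task(seed=42)
ok, model_solution = check_answer(task, answer=task["solution"])
```

Seeding the generator is one plausible way to make each "single-use" task both unique and reproducible from a stored identifier.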

Concrete example exercises for independent practice are a key foundation for understanding complex STEM topics, especially during the critical first semesters. Nabla addresses this need by generating practice exercises automatically, and it has been shown to lead to improved learning and exam performance.

Possible Uses

In principle, suitable types of exercises can be implemented in Nabla for the entire STEM field.

Example of a generated practice exercise in Nabla

In our experience, even complex exercise types that go far beyond mere calculation can be designed so that they assess understanding rather than rote calculation, and so that the difficulty of the randomly generated one-off exercises remains highly consistent.

The latter is particularly important when Nabla is used, as it is in our case, for individual exam appointments in which ad-hoc, personalised one-off exercises are generated for each participant. (Example: use in the course “Foundations of Computer Science II (GdI II)”)

Observations have shown that many students use Nabla independently to familiarise themselves with topics before they are even covered in the lecture. Although not specifically designed for this purpose, Nabla therefore appears to be both suitable and accepted as a tool for independent learning.

  • A range of exercise types from various courses have already been implemented in Nabla, and further types are in development.
  • All tasks that have ever been generated are stored permanently in anonymised form. For each task, a unique identifier is saved along with information on (1) whether the task was generated in test or practice mode and (2) whether it was solved correctly or not. These data allow for a wide range of useful statistical evaluations while ensuring complete anonymity.
  • Users can log in with their TU-ID and access the exercises they have generated themselves. By sharing the task ID, exercises can also be made accessible to others.
  • Anonymous use is possible. Registration via email address is planned.
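The anonymised storage described in the list above can be sketched as a small record type. The field names, the use of a random UUID as the task identifier, and the two-mode enum are assumptions for illustration, not Nabla's actual storage schema.

```python
import uuid
from dataclasses import dataclass
from enum import Enum

class Mode(Enum):
    TEST = "test"          # task generated in test (exam) mode
    PRACTICE = "practice"  # task generated in practice mode

@dataclass(frozen=True)
class TaskRecord:
    """Anonymised record of one generated task: no user data is stored."""
    task_id: str           # unique identifier for the task
    mode: Mode             # (1) test or practice mode
    solved_correctly: bool # (2) whether it was solved correctly

def new_record(mode, solved_correctly):
    """Create a record with a fresh random identifier."""
    return TaskRecord(task_id=str(uuid.uuid4()), mode=mode,
                      solved_correctly=solved_correctly)

record = new_record(Mode.PRACTICE, solved_correctly=True)
```

Because the identifier is random and no user fields exist on the record, aggregate statistics (e.g. correct-answer rates per mode) can be computed without any link back to individual users.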

Author: Prof. Dr. Karsten Weihe (Computer Science)

This tool is provided centrally by TU Darmstadt and has been checked from a data protection perspective.