All these exercises are supposed to be run either live in the lecture with instant feedback or as homework. Students can submit their solutions multiple times within the due date and use the (semi-)automatically provided feedback to improve their solution.
Artemis offers different exercise types that share the following properties:
Release Date: The date at which the exercise is released to the students. When an exercise does not have a release date, it is shown to the students immediately.
Due Date: The date until which students can submit their solutions.
Assessment Due Date: The date until the tutors should finish the assessment of the student submissions. All assessments prior to this date will be released on the assessment due date.
Points: You can achieve points in each exercise. Depending on the exercise configuration, these points are not included in the course score, count towards the course score, or are used as bonus points.
Manual assessment is available for all exercise types except quiz exercises.
The manual assessment begins after the deadline of an exercise has passed for all students and is double-blind. This means that the tutors do not know the names of the students they assess, and the students do not know the identity of the tutors.
Instructors can use the assessment training process to make the grading more consistent. They define a series of example submissions and assessments that the tutors must first read through.
Students have the option to rate the assessments of the tutors. They can also complain or ask for more feedback.
To keep track of the manual assessments, Artemis offers the assessment dashboard.
It shows the assessment progress of each exercise by showing the state of the exercise, the total number of submissions, the number of submissions that have been assessed, and the number of complaints and more feedback requests.
It also shows the average rating the students have given to each exercise.
Artemis also offers a way for instructors to monitor the tutors’ assessments. The first part of this is the tutor leaderboard, which is visible to all tutors. The tutor leaderboard shows the number of assessments each tutor has completed as well as the number of more feedback requests and accepted complaints about these assessments.
It also shows the average score the tutor has given and the average rating they received for their assessments.
Each exercise also has its own assessment dashboard that shows all of this information for a single exercise.
Automatic assessment is available for programming and quiz exercises.
For quiz exercises this is the only mode of assessment available. Artemis automatically grades students’ submissions after the quiz deadline has passed. See the section about Quiz exercise for more information about this.
For programming exercises, this is done via instructor-written test cases that are run for each submission either during or after the deadline. See the section about Programming Exercise for detailed information about this.
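As an illustration of such instructor-written test cases (a hedged sketch, not taken from Artemis: the function name, the sorting task, and the test names are made up, and the test framework depends on the exercise language), behavior tests for a hypothetical Python exercise could look like this:

```python
import unittest

# stand-in for the code a student would submit in their repository
def bubble_sort(values):
    result = list(values)
    for i in range(len(result)):
        for j in range(len(result) - 1 - i):
            if result[j] > result[j + 1]:
                result[j], result[j + 1] = result[j + 1], result[j]
    return result

class BehaviorTests(unittest.TestCase):
    """Instructor-written behavior tests that the CI server runs for each submission."""

    def test_sorts_numbers(self):
        self.assertEqual(bubble_sort([3, 1, 2]), [1, 2, 3])

    def test_handles_empty_input(self):
        self.assertEqual(bubble_sort([]), [])

    def test_does_not_modify_input(self):
        data = [2, 1]
        bubble_sort(data)
        self.assertEqual(data, [2, 1])
```

Each failing or passing test becomes one piece of automatic feedback for the submission; the tests would typically be run from the Test repository, e.g. via `python -m unittest`.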
Instructors can enable complaints for automatically graded programming exercises.
After receiving an assessment, students can complain once about the assessment of an exercise if the instructor enabled this option and the students think the assessment is erroneous.
When submitting a complaint, the student has to provide a written justification for the requested reevaluation.
A complaint leads to a reevaluation of the submission by another tutor. This tutor sees the existing assessment and the complaint reason. The tutor can then either accept or reject the complaint.
Only if the tutor accepts the complaint can they modify the assessment’s score.
The instructor can set a maximum number of allowed complaints per course; each complaint consumes one so-called token.
If the tutor accepts the complaint, the token is returned to the student.
This means a student can submit any number of complaints, as long as they are accepted.
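The token mechanics described above can be sketched as follows (an illustrative model only, not Artemis code; the class and method names are made up):

```python
class ComplaintTokens:
    """Illustrative model of Artemis' complaint tokens (not actual Artemis code)."""

    def __init__(self, max_complaints_per_course: int):
        self.tokens = max_complaints_per_course

    def submit_complaint(self) -> bool:
        # a complaint consumes one token; without tokens no complaint is possible
        if self.tokens == 0:
            return False
        self.tokens -= 1
        return True

    def resolve(self, accepted: bool) -> None:
        # an accepted complaint returns the token to the student,
        # so accepted complaints do not reduce the budget
        if accepted:
            self.tokens += 1


student = ComplaintTokens(max_complaints_per_course=3)
student.submit_complaint()       # tokens left: 2
student.resolve(accepted=True)   # token returned, tokens back to 3
student.submit_complaint()       # tokens left: 2
student.resolve(accepted=False)  # token is not returned, tokens stay at 2
```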
Another possibility after receiving an assessment is the More Feedback Request.
Unlike complaints, a More Feedback Request does not cost a token, but the tutor cannot change the score in response to it.
Warning
Sending a More Feedback Request removes the option to complain about the assessment entirely.
The score cannot be changed even if the tutor made a mistake during the first assessment and acknowledges this during the More Feedback Request.
Conducting a programming exercise consists of seven core steps (plus one optional step) distributed among instructor, Artemis and students:
1. Instructor prepares exercise: Sets up a repository containing the exercise code and test cases, build instructions on the CI server, and configures the exercise in Artemis.
2. Student starts exercise: Clicks on start exercise in Artemis, which automatically generates a copy of the repository with the exercise code and configures a build plan accordingly.
3. Student clones repository (optional): Clones the personalized repository from the remote VCS to the local machine.
4. Student solves exercise: Solves the exercise with an IDE of choice on the local computer or in the online editor.
5. Student uploads solution: Uploads the source code changes to the VCS by committing and pushing them to the remote server (or by clicking submit in the online editor).
6. CI server verifies solution: Verifies the student’s submission by executing the test cases (see step 1) and provides feedback on which parts are correct or wrong.
7. Student reviews personal result: Reviews the build result and feedback in Artemis. In case of a failed build, reattempts to solve the exercise (step 4).
8. Instructor reviews course results: Reviews the overall results of all students and reacts to common errors and problems.
The following activity diagram shows this exercise workflow.
Artemis and its version control and continuous integration infrastructure are independent of the programming language and thus support
teaching and learning with any programming language that can be compiled and tested on the command line.
Instructors have a lot of freedom in defining the environment (e.g. using build agents and Docker images) in which student code is executed and tested.
To simplify the setup of programming exercises, Artemis supports several templates that show how the setup works.
Instructors can still use those templates to generate programming exercises and then adapt and customize the settings in the repositories and build plans.
Support for a specific programming language template depends on the continuous integration system in use. The table below gives an overview:
Programming Language   Bamboo   Jenkins   GitLab CI
Java                   yes      yes       yes
Python                 yes      yes       no
C                      yes      yes       no
Haskell                yes      yes       no
Kotlin                 yes      yes       no
VHDL                   yes      no        no
Assembler              yes      no        no
Swift                  yes      yes       no
OCaml                  yes      no        no
Not all templates support the same feature set, and the supported features can also differ depending on the continuous integration system setup.
Depending on the feature set, some options might not be available during the creation of the programming exercise.
The table below provides an overview of the supported features.
If a feature is supported only on some continuous integration systems, the cell lists the supporting systems: B = Bamboo, J = Jenkins, G = GitLab CI (for example, "B, J" means supported with Bamboo and Jenkins, but not with GitLab CI).

Programming Language   Sequential  Static Code  Plagiarism  Package  Project  Solution Repo.  Testwise Coverage  Publish Build  Auxiliary
                       Test Runs   Analysis     Check       Name     Type     Checkout        Analysis           Plan URL       Repositories
Java                   B, J        B, J         B, J        yes      yes      no              yes                yes            yes
Python                 B           no           yes         no       no       no              no                 yes            yes
C                      no          B            yes         no       no      no              no                 yes            yes
C (FACT framework)     no          B            yes         no       no      no              no                 yes            yes
Haskell                B           no           no          no       no      B               no                 yes            yes
Kotlin                 yes         no           yes         yes      no      no              yes                yes            yes
VHDL                   no          no           no          no       no      no              no                 yes            yes
Assembler              no          no           no          no       no      no              no                 yes            yes
Swift                  no          yes          yes         yes      no      no              no                 yes            yes
OCaml                  no          no           no          no       no      yes             no                 yes            yes
Sequential Test Runs: Artemis can generate a build plan which first executes structural and then behavioral tests. This feature can help students to better concentrate on the immediate challenge at hand.
Static Code Analysis: Artemis can generate a build plan which additionally executes static code analysis tools.
Artemis categorizes the found issues and provides them as feedback for the students. This feature makes students aware of code quality issues in their submissions.
Plagiarism Checks: Artemis is able to automatically calculate the similarity between student submissions. A side-by-side view of similar submissions is available to confirm the plagiarism suspicion.
Package Name: A package name has to be provided for languages that require one (e.g. Java).
Solution Repository Checkout: Instructors are able to compare a student submission against the sample solution in the solution repository.
Testwise Coverage Analysis: Artemis can generate a build plan which additionally executes a testwise coverage analysis.
Artemis aggregates the recorded data into different metrics. This feature allows instructors to check which code in the solution submission is how often executed by the test cases.
Publish Build Plan: Artemis can display the URL to the build plan (e.g. on Bamboo or Jenkins) to the student. This will not work for the upcoming internal continuous integration system that has no graphical user interface.
Note
Only some templates for Bamboo support Sequential Test Runs at the moment.
Note
Static Code Analysis for C exercises is only supported for Bamboo at the moment.
Note
Testwise Coverage Analysis is only supported for Bamboo and exercises with regular test runs at the moment.
Note
Instructors are still able to extend the generated programming exercises with additional features that are not available in one specific template.
We encourage instructors to contribute improvements to the existing templates or to provide new templates. Please contact Stephan Krusche and/or create Pull Requests in the GitHub repository.
Guided Mode: The form for generating new programming exercises offers the option to switch to the guided mode by clicking the corresponding button.
This mode splits up the generation of a programming exercise into multiple steps. The following video shows an exemplary use of the guided mode.
Artemis provides various options to customize programming exercises:
Title: The title of the exercise. It is used to create a project on the VCS server for the exercise.
Instructors can change the title of the exercise after its creation.
Short Name: Together with the course short name, the exercise short name is used as a unique identifier for
the exercise across Artemis (incl. repositories and build plans). The short name cannot be changed after the
creation of the exercise.
Preview: Given the short name of the exercise and the short name of the course, Artemis displays a preview of the
generated repositories and build plans.
Auxiliary Repositories: Instructors can add auxiliary repositories with a name, checkout directory and description.
These repositories are created and added to the build plan when the exercise is created. Auxiliary repositories
cannot be changed after the creation of the exercise.
Note
Auxiliary repositories are checked out to the specified checkout directory during the automatic testing
of a student submission in case the checkout directory is set. This can be used e.g. for providing
additional resources or overwriting template source code in testing exercises.
Categories: Instructors can freely define up to two categories per exercise. The categories are visible to students
and should be used consistently to group similar kinds of exercises.
Difficulty: Instructors can give students information about the difficulty of the exercise.
Mode: The mode determines whether students work on the exercise alone or in teams. Cannot be changed after the exercise creation.
Team size: If Team mode is chosen, instructors can additionally give recommendations for the team size. Instructors/Tutors define the teams after
the exercise creation.
Programming Language: The programming language for the exercise. Artemis chooses the template accordingly.
Refer to the programming exercise features for an overview of the supported features for each template.
Project Type: Determines the project structure of the template. Not available for all programming languages.
With exemplary dependency: Adds an external Apache commons-lang dependency to the generated project as an example
how maven dependencies should be used with Artemis exercises. Only available for Java exercises.
Package Name: The package name used for this exercise. Not available for all programming languages.
Release Date: Release date of the exercise. Students will only be able to participate in the exercise after this date.
Automatic Tests: Every commit of a participant triggers the execution of the tests in the Test repository.
Excluded are tests that are specified to run only after the due date; this is only possible if Run Tests once after Due Date has been activated.
The tests that only run after the due date are chosen in the grading configuration.
Due Date: The deadline for the exercise. Commits made after this date are not graded.
Complaint on Automatic Assessment: This option allows students to write a complaint on the automatic assessment after the due date.
This option is only available if complaints are enabled in the course or the exercise is part of an exam.
Note
Students can still commit code and receive feedback after the exercise due date, if manual review and complaints are not activated.
The results for these submissions will not be rated.
Run Tests once after Due Date: Activate this option to build and test the latest in-time submission of each student on this date.
This date must be after the due date. The results created by this test run will be rated.
Manual Review: Instructors/Tutors have to manually review the latest student submissions after the automatic tests were executed.
Assessment Due Date: The deadline for the manual reviews. On this date, all manual assessments will be released to the students.
Example Solution Publication Date: The date when the solution repository becomes available to download for students. If left blank, example solutions are never published.
Should this exercise be included in the course / exam score calculation?
Yes: Instructors can define the maximum achievable Points and Bonus points for the exercise.
The achieved total points will count towards the total course/exam score
Bonus: The achieved Points will count towards the total course/exam score as a bonus.
No: The achieved Points will not count towards the total course/exam score.
Submission Policy: Configure an initial submission policy for the exercise.
The submission policy defines the effect that a submission has on the participation of one participant in a programming exercise.
You can choose between 3 different types of submission policies: None, Lock Repository, Submission Penalty. Those policies can be used
to limit how many times students can submit their code and receive feedback from automated tests. The feature and its configuration
are independent of the programming language settings and work in combination with static code analysis penalties.
Detailed information about the different types of policies and their respective setup can be found in the section
configuring submission policies.
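The effect of the three policy types can be sketched as follows (an illustrative model with made-up names; `limit` stands for the configured number of allowed submissions and `penalty_points` for the deduction per exceeding submission):

```python
from enum import Enum

class SubmissionPolicy(Enum):
    NONE = "none"
    LOCK_REPOSITORY = "lock_repository"
    SUBMISSION_PENALTY = "submission_penalty"

def handle_submission(policy, submission_count, limit, penalty_points):
    """Return (submission allowed, point deduction) for the next submission.

    Illustrative only; the actual enforcement happens inside Artemis.
    """
    if policy is SubmissionPolicy.NONE:
        # no restriction: every submission is graded normally
        return True, 0.0
    if policy is SubmissionPolicy.LOCK_REPOSITORY:
        # the repository is locked once the limit is reached
        return submission_count < limit, 0.0
    # SUBMISSION_PENALTY: further submissions stay possible but are penalized
    exceeding = max(0, submission_count + 1 - limit)
    return True, exceeding * penalty_points
```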
Note
Submission policies can only be edited on the Grading Page of the programming exercise after the initial exercise generation.
Enable Static Code Analysis: Enable static code analysis for the exercise.
The build plans will additionally execute static code analysis tools to find code quality issues in the submissions.
This option cannot be changed after the exercise creation. Artemis provides a default configuration for the static code analysis tools
but instructors are free to configure the static code analysis tools.
Refer to the programming exercise features to see which programming languages support static code analysis.
Max Static Code Analysis Penalty: Available if static code analysis is active.
Determines the maximum amount of points that can be deducted for code quality issues found in a submission as a percentage (between 0% and 100%) of Points.
Defaults to 100% if left empty. Further options to configure the grading of code quality issues are available in the grading configuration.
Note
Given an exercise with 10 points: if Max Static Code Analysis Penalty is 20%, at most 2 points will be deducted
for code quality issues from the points achieved by passing test cases.
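The cap described in the note can be expressed as a short computation (a sketch with made-up names, not Artemis code):

```python
def sca_deduction(issue_penalties, max_points, max_penalty_percent):
    """Points deducted for code quality issues, capped by Max Static Code Analysis Penalty.

    issue_penalties: point deduction of each individual code quality issue.
    """
    cap = max_points * max_penalty_percent / 100.0
    return min(sum(issue_penalties), cap)

# Example from the note: 10 points, 20% cap -> at most 2 points deducted,
# even though the issues alone would deduct 3 points
assert sca_deduction([1.0, 1.0, 1.0], max_points=10, max_penalty_percent=20) == 2.0
```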
Problem Statement: The problem statement of the exercise. Refer to interactive problem statement for more information.
Grading Instructions: Available if Manual Review is active. Create instructions for the manual assessment of the exercise.
Sequential Test Runs: Activate this option to first run structural and then behavior tests.
This feature allows students to better concentrate on the immediate challenge at hand.
Not supported together with static code analysis. Cannot be changed after the exercise creation.
Check out repository of sample solution: Activate this option to check out the solution into the ‘solution’ path.
This option is useful to compare the student’s submission with the sample solution. This option is not available for all programming languages.
Allow Offline IDE: Allow students to clone their personal repository and work on the exercise with their preferred IDE.
Allow Online Editor: Allow students to work on the exercise using the Artemis Online Code Editor.
Record Testwise Coverage: Activate this option to record the testwise coverage for the solution repository.
This option is only available for Java/Kotlin-exercises with non-sequential test runs.
Note
At least one of the options Allow Offline IDE and Allow Online Editor must be active.
Show Test Names to Students: Activate this option to show the names of the automated test cases to the students.
If this option is disabled, students will not be able to visually differentiate between automatic and manual feedback.
Publish Build Plan: Allow students to access and edit their personal build plan. Useful for exercises where students should
configure parts of the build plan themselves.
Click the corresponding button to create the exercise.
Result: Programming Exercise
Artemis creates the repositories:
Template: template code, can be empty; all students receive this code at the beginning of the exercise
Test: contains all test cases (e.g. based on JUnit) and optionally static code analysis configuration files; the repository is hidden from students
Solution: solution code, typically hidden from students; can be made available after the exercise
Artemis creates two build plans
Template: also called BASE, basic configuration for the test + template repository, used to create student build plans
Solution: also called SOLUTION, configuration for the test + solution repository, used to manage test cases and to verify the exercise configuration
Update exercise code in repositories
Alternative 1: Clone the 3 repositories and adapt the code on your local computer in your preferred development environment (e.g. Eclipse).
To execute the tests, copy the template (or solution) code into a folder assignment in the test repository and execute the tests (e.g. using mvn clean test).
Commit and push your changes.
Notes for Haskell: In addition to the assignment folder, the executables of the build file expect the solution repository checked out in the solution subdirectory of the test folder and also allow for a template subdirectory to easily test the template on your local machine.
You can use the following script to conveniently checkout an exercise and create the right folder structure:
#!/bin/sh
# Arguments:
# $1: exercise short name as specified on Artemis
# $2: (optional) output folder name
#
# Note: you might want to adapt the `BASE` variable below according to your needs

if [ -z "$1" ]; then
    echo "No exercise short name supplied."
    exit 1
fi

EXERCISE="$1"

if [ -z "$2" ]; then
    # use the exercise name if no output folder name is specified
    NAME="$1"
else
    NAME="$2"
fi

# default base URL to repositories; change this according to your needs
BASE="ssh://git@bitbucket.ase.in.tum.de:7999/$EXERCISE/$EXERCISE"

# clone the test repository
git clone "$BASE-tests.git" "$NAME" &&
# clone the template repository
git clone "$BASE-exercise.git" "$NAME/template" &&
# clone the solution repository
git clone "$BASE-solution.git" "$NAME/solution" &&
# create an assignment folder from the template repository
cp -R "$NAME/template" "$NAME/assignment" &&
# remove the .git folder from the assignment folder
rm -r "$NAME/assignment/.git/"
Notes for OCaml: The tests expect to be placed in a folder tests next to a folder assignment containing the submission to test and a folder solution with the solution repository.
You can use the following script to conveniently checkout an exercise and create the right folder structure:
#!/bin/sh
# Arguments:
# $1: exercise short name as specified on Artemis
# $2: (optional) output folder name
#
# Note: you might want to adapt the `BASE` variable below according to your needs

# short name of the course to pick exercises from
PREFIX=

if [ -z "$1" ]; then
    echo "No exercise short name supplied."
    exit 1
fi

# full name of the exercise to load
EXERCISE="$PREFIX$1"

if [ -z "$2" ]; then
    # use the exercise name if no output folder name is specified
    NAME="$1"
else
    NAME="$2"
fi

# default base URL to repositories; change this according to your needs
BASE="ssh://git@bitbucket.ase.in.tum.de:7999/$EXERCISE/$EXERCISE"

# clone the test repository
git clone "$BASE-tests.git" "$NAME/tests"
# clone the template repository
git clone "$BASE-exercise.git" "$NAME/template"
# clone the solution repository
git clone "$BASE-solution.git" "$NAME/solution"

# hardlink the various assignment interfaces to ensure they stay in sync
# the version in the solution repository is authoritative in case of conflict
rm "$NAME/template/src/assignment.mli"
rm "$NAME/tests/assignment/assignment.mli"
rm "$NAME/tests/solution/solution.mli"
ln "$NAME/solution/src/assignment.mli" "$NAME/template/src/assignment.mli"
ln "$NAME/solution/src/assignment.mli" "$NAME/tests/assignment/assignment.mli"
ln "$NAME/solution/src/assignment.mli" "$NAME/tests/solution/solution.mli"
To run the tests, a helper script is provided that can be run in either the solution or template folder.
It is possible to checkout additional student repositories next to the solution and template folder to run tests on them for manual grading.
Alternative 2: Open the exercise in Artemis (in the browser) and adapt the code in the online code editor.
You can change between the different repos and submit the code when needed
Alternative 3: Use IntelliJ with the Orion plugin and change the code directly in IntelliJ
Check the results of the template and the solution build plan
They should not have a failed status.
In case of a failed result, some configuration is wrong; please check the build errors on the corresponding build plan.
Hints: Test cases should only reference code that is available in the template repository. In case this is not possible, please try out the option Sequential Test Runs.
Optional: Adapt the build plans
The build plans are preconfigured and typically do not need to be adapted
However, if you have additional build steps or different configurations, you can adapt the BASE and SOLUTION build plan as needed
When students start the programming exercise, the current version of the BASE build plan will be copied. All changes in the configuration will be considered
Optional: Configure static code analysis tools
The Test repository contains files for the configuration of static code analysis tools, if static code analysis was activated during the creation/import of the exercise
The folder staticCodeAnalysisConfig contains configuration files for each used static code analysis tool
On exercise creation, Artemis generates a default configuration for each tool, which contains a predefined set of parameterized activated/excluded rules. The configuration files serve as a documented template that instructors can freely tailor to their needs.
On exercise import, Artemis copies the configuration files from the imported exercise
The following table depicts the supported static code analysis tools for each programming language, the execution mechanism used to run the tools, and the names of their respective configuration files:
Programming Language   Execution Mechanism          Supported Tools                 Configuration File
Java                   Maven plugins                Spotbugs                        spotbugs-exclusions.xml
                       (pom.xml or build.gradle)    Checkstyle                      checkstyle-configuration.xml
                                                    PMD                             pmd-configuration.xml
                                                    PMD Copy/Paste Detector (CPD)
Swift                  Script                       SwiftLint                       .swiftlint.yml
C                      Script                       GCC
Note
The Maven plugins for the Java static code analysis tools provide additional configuration options.
Note
GCC can be configured by passing the desired flags in the tasks. For more information, see GCC Documentation.
The build plans use a special task/script for the execution of the tools
Note
Instructors are able to completely disable the usage of a specific static code analysis tool by removing the plugin/dependency from the execution mechanism.
In case of Maven plugins, instructors can remove the unwanted tools from the pom.xml or build.gradle.
Alternatively, instructors can alter the task/script that executes the tools in the build plan.
PMD and PMD CPD are a special case as both tools share a common plugin. To disable one or the other, instructors must delete the execution of a tool from the build plan.
Adapt the interactive problem statement
Click the edit button of the programming exercise, or navigate into its edit view, and adapt the interactive problem statement.
The initial example shows how to integrate tasks, link tests and integrate interactive UML diagrams
Configure Grading
General Actions
Save the current grading configuration of the open tab
Reset the current grading configuration of the open tab to the default values. For Test Case Tab, all test cases are set to weight 1, bonus multiplier 1 and bonus points 0. For the Code Analysis Tab, the default configuration depends on the selected programming language.
Re-evaluate all scores according to the currently saved settings using the individual feedback stored in the database
Trigger all build plans. This leads to the creation of new results using the updated grading configuration
Two badges indicate whether the current configuration has been saved and whether the grading was changed. The following graphic visualizes how each action affects the grading page state:
Warning
Artemis always grades new submissions with the latest configuration but existing submissions might have been graded with an outdated configuration. Artemis warns instructors about grading inconsistencies with the Updated grading badge.
Test Case Tab: Adapt the contribution of each test case to the overall score
Note
Artemis registers the test cases defined in the Test repository using the results generated by the Solution build plan. The test cases are only shown after the first execution of the Solution build plan.
On the left side of the page, instructors can configure the test case settings:
Test Name: Name of the test case as defined in Test repository
Weight: The points for a test case are proportional to the weight (sum of all weights as the denominator) and are calculated as a fraction of the maximum points
Note
Bonus points for an exercise (implied by a score higher than 100%) are only achievable if at least one bonus multiplier is greater than 1 or bonus points are given for a test case
Bonus multiplier: Allows instructors to multiply the points for passing a test case without affecting the points rewarded for passing other test cases
Bonus points: Adds a flat point bonus for passing a test case
Visibility: Select the visibility of the feedback for this test case to students.
Always: Feedback associated with this test case is visible to students directly after the automatic grading process for their submission.
After Due Date: Feedback associated with this test case is visible to students only after the due date for this exercise has passed.
Tutors and Instructors are able to see the feedback before the due date.
If an individual due date is set for some students, the detailed feedback for these tests remains invisible to all other students until submission is no longer possible for any student.
Other students can, however, still see whether the tests passed or failed and receive points accordingly, even if the latest individual due date has not passed yet.
Warning
For manual assessments, all feedback details will be visible to the assessed student, even if the due date has not yet passed for others.
Tutors can start the manual assessment for all students with the regular due date as soon as it has passed.
Set an appropriate assessment due date in the exercise settings to make sure that students cannot pass test case details to students who are still working.
Never: Feedback associated with this test case is never visible to students even after the due date for this exercise has passed.
Tutors and Instructors are able to see the feedback before and after the due date, e.g. when manually assessing submissions.
Additionally, results of this test case are not considered in the student score calculation.
Is Active: Displays whether the test case is currently part of the grading configuration. The Show inactive test cases toggle controls whether inactive test cases are displayed
Passed %: Displays statistics about the percentage of participating students that passed or failed the test case
Note
Example 1: Given an exercise with 3 test cases, maximum points of 10 and 10 achievable bonus points. The highest achievable score is \(\frac{10+10}{10}*100=200\%\). Test Case (TC) A has weight 2, TC B and TC C have weight 1 (bonus multipliers 1 and bonus points 0 for all test cases). A student that only passes TC A will receive 50% of the maximum points (5 points).
Note
Example 2: Given the configuration of Example 1 with an additional bonus multiplier of 2 for TC A. Passing TC A accounts for \(\frac{2*2}{2+1+1}*100=100\%\) of the maximum points (10). Passing TC B or TC C accounts for \(\frac{1}{4}*100=25\%\) of the maximum points (2.5). If the student passes all test cases they will receive a score of 150%, which amounts to 10 points and 5 bonus points.
Note
Example 3: Given the configuration of Example 2 with additional bonus points of 5 for TC B. The points achieved for passing TC A and TC C do not change. Passing TC B now accounts for 2.5 points plus 5 bonus points (7.5). If the student passes all test cases they will receive 10 (TC A) + 7.5 (TC B) + 2.5 (TC C) points, which amounts to 10 points and 10 bonus points and a score of 200%.
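The score calculation from the examples above can be reproduced with a short computation (a sketch of the documented formula, not Artemis code; the function and parameter names are made up):

```python
def score_percent(test_cases, passed, max_points):
    """Score in percent of max_points.

    test_cases: dict mapping test case name -> (weight, bonus_multiplier, bonus_points).
    passed: names of the test cases the student passed.
    """
    total_weight = sum(w for w, _, _ in test_cases.values())
    points = 0.0
    for name in passed:
        weight, multiplier, bonus = test_cases[name]
        # points per test case: weighted fraction of max points times the
        # bonus multiplier, plus flat bonus points
        points += max_points * weight * multiplier / total_weight + bonus
    return 100.0 * points / max_points

# Example 2: TC A has weight 2 and bonus multiplier 2, TC B and TC C weight 1
tcs = {"A": (2, 2, 0), "B": (1, 1, 0), "C": (1, 1, 0)}
assert score_percent(tcs, ["A"], 10) == 100.0
assert score_percent(tcs, ["A", "B", "C"], 10) == 150.0

# Example 3: TC B additionally awards 5 flat bonus points
tcs["B"] = (1, 1, 5)
assert score_percent(tcs, ["A", "B", "C"], 10) == 200.0
```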
On the right side of the page, charts display statistics about the current test case configuration. If changes are made to the configuration, a preview of the updated statistics is shown.
Weight Distribution: The distribution of test case weights. Visualizes the impact of each test case for the score calculation
Total Points: The percentage of points given to students according to a specific test case. 100% in the chart represents full scores (100%) of all students
Code Analysis Tab: Configure the visibility and grading of code quality issues on a category-level
Note
The Code Analysis Tab is only available if static code analysis was activated for the exercise.
Code quality issues found during the automatic assessment of a submission are grouped into categories. Artemis maps the categories defined by the static code analysis tools to its own categories as follows:

Bad Practice: Code that violates recommended and essential coding practices
  Java: Spotbugs BAD_PRACTICE, Spotbugs I18N, PMD Best Practices
  C: GCC BadPractice
Code Style: Code that is confusing and hard to maintain
  Java: Spotbugs STYLE, Checkstyle blocks, Checkstyle coding, Checkstyle modifier, PMD Code Style
  Swift: Swiftlint (all rules)
Potential Bugs: Coding mistakes, error-prone code or threading errors
  Java: Spotbugs CORRECTNESS, Spotbugs MT_CORRECTNESS, PMD Error Prone, PMD Multithreading
  C: GCC Memory
Duplicated Code: Code clones
  Java: PMD CPD
Security: Vulnerable code, unchecked inputs and security flaws
  Java: Spotbugs MALICIOUS_CODE, Spotbugs SECURITY, PMD Security
  C: GCC Security
Performance: Inefficient code
  Java: Spotbugs PERFORMANCE, PMD Performance
Design: Program structure/architecture and object design
  Java: Checkstyle design, PMD Design
Code Metrics: Violations of code complexity metrics or size limitations
  Java: Checkstyle metrics, Checkstyle sizes
Documentation: Code with missing or flawed documentation
  Java: Checkstyle javadoc, Checkstyle annotation, PMD Documentation
Naming & Format: Rules that ensure the readability of the source code (name conventions, imports, indentation, annotations, white spaces)
  Java: Checkstyle imports, Checkstyle indentation, Checkstyle naming, Checkstyle whitespace
Miscellaneous: Uncategorized rules
  Java: Checkstyle miscellaneous
  C: GCC Misc
Note
For Swift, only the category Code Style can currently contain code quality issues. All other categories displayed on the grading page are dummies.
Note
The GCC SCA option for C does not offer categories by default. Artemis categorizes the issues during parsing based on the rule that produced them.
On the left side of the page, instructors can configure the static code analysis categories.
Category: The name of the category defined by Artemis
State:
INACTIVE: Code quality issues of an inactive category are not shown to students and do not influence the score calculation
FEEDBACK: Code quality issues of a feedback category are shown to students but do not influence the score calculation
GRADED: Code quality issues of a graded category are shown to students and deduct points according to the Penalty and Max Penalty configuration
Penalty: Artemis deducts the selected amount of points for each code quality issue from points achieved by passing test cases
Max Penalty: Limits the amount of points deducted for code quality issues belonging to this category
Detected Issues: Visualizes how many students encountered a specific number of issues in this category
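As a sketch of how the Penalty and Max Penalty settings of a graded category interact (the class and method names are illustrative and not part of the Artemis codebase): the deduction is the per-issue penalty times the number of issues, capped at the category's max penalty.

```java
// Illustrative sketch, not the actual Artemis implementation.
public class CategoryPenalty {

    // Deduct 'penalty' points per issue, capped at 'maxPenalty' for the category.
    static double deduction(int issueCount, double penalty, double maxPenalty) {
        return Math.min(issueCount * penalty, maxPenalty);
    }

    public static void main(String[] args) {
        // 4 issues at 0.5 points each would be 2.0 points,
        // but a max penalty of 1.5 caps the deduction at 1.5
        System.out.println(deduction(4, 0.5, 1.5)); // prints 1.5
    }
}
```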
Verify the exercise configuration
Open the page of the programming exercise
The template result should have a score of 0% with 0 of X passed or 0 of X passed, 0 issues (if static code analysis is enabled)
The solution result should have a score of 100% with X of X passed or X of X passed, 0 issues (if static code analysis is enabled)
Note
If static code analysis is enabled and issues are found in the template/solution result, instructors should improve the template/solution or disable the rule that produced the unwanted/unimportant issue.
Click on
Below the problem statement, you should see Test cases ok and Hints ok
On exercise import, Artemis copies the repositories, build plans, interactive problem statement and grading configuration from the imported exercise.
Open Course Management
Open
Navigate into Exercises of your preferred course
Import programming exercise
Click on Import Programming Exercise
Select an exercise to import
Note
Instructors can import exercises from courses in which they are registered as instructors
Artemis provides special options to update the assessment process
Recreate Build Plans: Create new build plans instead of copying them from the imported exercise
Update Template: Update the template files in the repositories. This can be useful if the imported exercise is old and contains outdated dependencies.
For Java, Artemis replaces JUnit 4 with Ares (which includes JUnit 5) and updates the dependencies and plugins to the versions found in the latest template. Afterwards, you might need to adapt the test cases.
Instructors are able to activate/deactivate static code analysis. Changing this option from the original value requires the activation of Recreate Build Plans and Update Template.
Note
Recreate Build Plans and Update Template are automatically set if the static code analysis option changes compared to the imported exercise. The plugins, dependencies and static code analysis tool configurations are added/deleted/copied depending on the new and the original state of this option.
Fill out all mandatory values and click on
Note
The interactive problem statement can be edited after finishing the import. Some options such as Sequential Test Runs cannot be changed on exercise import.
All tool categories and their rules are active by default except for the NOISE and EXPERIMENTAL categories.
Refer to the Spotbugs documentation for a description of all rules.
Artemis uses the following default configuration to detect code duplications for the category Copy/Paste Detection.
For a description of the various PMD CPD configuration parameters refer to the PMD CPD documentation.
<!-- Minimum amount of duplicated tokens triggering the copy-paste detection -->
<minimumTokens>60</minimumTokens>
<!-- Ignore literal value differences when evaluating a duplicate block.
     If true, foo=42; and foo=43; will be seen as equivalent -->
<ignoreLiterals>true</ignoreLiterals>
<!-- Similar to ignoreLiterals but for identifiers, i.e. variable names, method names.
     If activated, most tokens will be ignored so minimumTokens must be lowered significantly -->
<ignoreIdentifiers>false</ignoreIdentifiers>
For a description of the rules/warnings refer to the GCC Documentation.
For readability reasons the rule/warning prefix -Wanalyzer- is omitted.
Category (Tool/Artemis)             | Rules
------------------------------------|------------------------------------------------------------
Memory Management / Potential Bugs  | free-of-non-heap, malloc-leak, file-leak, mismatching-deallocation
Undefined Behavior / Potential Bugs | double-free, null-argument, use-after-free, use-of-uninitialized-value, write-to-const, write-to-string-literal, possible-null-argument, possible-null-dereference
Bad Practice / Bad Practice         | double-fclose, too-complex, stale-setjmp-buffer
Security / Security                 | exposure-through-output-file, unsafe-call-within-signal-handler, use-of-pointer-in-stale-stack-frame, tainted-array-index
Miscellaneous / Miscellaneous       | rules not matching any of the above categories
Note
GCC output can still contain normal warnings and compilation errors, which are also added to the Miscellaneous category.
It is usually best to disable this category, as it contains errors not related to static code analysis.
In other words, if a warning/error does not belong to the first four categories above, it is not an SCA issue (as of GCC 11.1.0).
The following sections explain the configuration options for submission policies:
The submission policy defines the effect that a submission has on the participation of one participant in a programming exercise.
A programming exercise might have no submission policy at all or one submission policy, but never more than one. Submission policies are initially specified
during the creation of a programming exercise and can later be adjusted in the grading configuration of the particular programming exercise.
Note
One submission is defined as one push by the participant to the exercise participation repository that triggers the
automatic tests, resulting in feedback for the participant. Automatic test runs triggered by instructors are not considered submissions.
Choosing the right submission policy configuration depends on the exercise and your teaching style.
In general, lock repository and submission penalty policies combat trial-and-error solving approaches.
1. None
When selecting no submission policy, exercise participants can submit their solution as often as they want until the deadline.
2. Lock Repository
Participants can submit a fixed number of times within the submission period of a programming exercise. Once the participant reaches the submission limit,
further participation in the exercise is prevented by locking the participation repository. The participant may still work on their solution locally, but
cannot submit it to Artemis to receive feedback.
With the example configuration shown in the figure above, participants can submit their solution 5 times and receive feedback 5 times.
After that, Artemis locks the participation repository, so the participant can no longer push their solutions to their repository.
Note
When locking the participation repository upon reaching the submission limit fails for any reason and the participant submits again,
Artemis attempts to lock the repository again and sets the newly generated result to .
3. Submission Penalty
Participants can submit as often as they want until the deadline, however, for each submission exceeding the submission limit,
the exceeding submission limit penalty is deducted from the participant’s score. The exceeding submission limit penalty must be
provided as a positive number of points.
With the example configuration shown in the figure above, participants can submit their solution 3 times regularly. For every
submission exceeding the limit of 3, 1.5 points are deducted from the participant’s score. The score cannot be negative.
For example, when the participant reaches 6 out of 12 points on the 4th submission, 1.5 points are deducted for the one submission exceeding
the limit of 3, resulting in a participation score of 4.5 instead of 6 out of 12. On the 5th submission, 3 points are deducted
for 2 submissions exceeding the limit. The student receives feedback that explains the deduction.
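The deduction described above can be sketched as follows (the class and method names are illustrative and not part of the Artemis codebase):

```java
// Illustrative sketch of the submission penalty calculation, not the
// actual Artemis implementation.
public class SubmissionPenalty {

    // score: points achieved by the tests on the latest submission;
    // limit: allowed number of submissions;
    // penalty: points deducted per submission exceeding the limit.
    static double penalizedScore(double score, int submissionCount, int limit, double penalty) {
        int exceeding = Math.max(0, submissionCount - limit);
        // The score cannot become negative.
        return Math.max(0.0, score - exceeding * penalty);
    }

    public static void main(String[] args) {
        // 4th submission, 6 of 12 points, limit 3, penalty 1.5 -> 4.5 points
        System.out.println(penalizedScore(6.0, 4, 3, 1.5)); // prints 4.5
    }
}
```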
After generating a programming exercise initially, submission policies can be updated and toggled on the grading page
of the programming exercise.
1. (De)activating Submission Policies
When the submission policy of a programming exercise is active, the button is displayed.
When the policy is inactive, the button is displayed instead. The active submission policy of an exercise can
be deactivated by pressing . When the policy is deactivated, Artemis will no longer enforce the policy. Repositories that were
locked due to the enforcement of the submission policy, get unlocked. Submission policies can be activated again by pressing .
When (de)activating a submission penalty policy, must be pressed in order to apply the effect.
2. Updating Submission Policies
Submission policies can be updated during the exercise by modifying the configuration and pressing . When updating a policy,
the effect of the former policy is removed and the effect of the new policy is applied. When the new policy is a submission penalty policy,
must be pressed to update the latest results of all participants.
3. Deleting Submission Policies
Submission policies can be deleted by selecting None as submission policy type and pressing . When deleting submission policies,
their effect on participations is removed. Every repository that is locked due to a lock repository policy is unlocked and the
participant can continue working on the exercise. When deleting a submission penalty policy, must be pressed in order
to revert the submission policy effect.
The following screenshot shows the online code editor with interactive
and dynamic exercise instructions on the right side. Tasks and UML
diagram elements are referenced by test cases and update their color
from red to green after students submit a new version and all test cases
associated with a task or diagram element pass. This allows the students
to immediately recognize which tasks are already fulfilled and is
particularly helpful for programming beginners.
The online editor displays only visible files to avoid showing, for example, the .git and .gitignore config files.
This also means that other config files, such as SwiftLint's .swiftlint.yml file, are not shown. Currently, those files
can only be accessed via your own IDE.
The following table lists the different types of repositories and gives an overview of the access rights different users have.
To gain these access rights, a user must assume the indicated role in the course the repository belongs to.
The different repository types are:
Base
This includes all repositories that are set up when the exercise is created (template repository, solution repository, tests repository, auxiliary repositories).
Student Assignment
A student’s assignment repository copied from the template repository. This includes team assignment repositories.
Teaching Assistant (TA) Assignment
An assignment repository created by a teaching assistant for themselves.
Instructor Assignment
An assignment repository created by an editor or instructor for themselves. Not available for exam exercises.
Student Practice
A student’s practice repository copied either from the template repository, or from their assignment repository. Can only be created after the due date of the exercise has passed. Not available for exam exercises.
Teaching Assistant (TA) Practice
A practice repository created by a teaching assistant for themselves. Not available for exam exercises.
Instructor Practice
A practice repository created by an editor or instructor for themselves. Not available for exam exercises.
Instructor Exam Test Run
A test run repository created by an editor or instructor for an exam. An instructor can create an exam test run before the start date of the exam to allow the instructor to test the exam from a student perspective before releasing it. This repository should be deleted before the exam is conducted.
The different roles are:
Student (S)
A student in the course.
Teaching Assistant (TA)
A teaching assistant (tutor) in the course.
Editor
An editor in the course.
Instructor (I)
An instructor in the course.
Note
Editors and instructors are included in the role “Instructor” in the table as both roles have the same access rights.
The different points in time are:
Before start
Before the exercise start date for a course exercise, before the exam start date for an exam exercise.
Working time
After the exercise start date for a course exercise, after the exam release date for an exam exercise, before the due date for a course exercise, before the exam end date for an exam exercise.
After due
After the due date for a course exercise, after the exam end date for an exam exercise.
Note
For the Instructor Exam Test Run Repository, the point in time “Before start” is the start date of the test run, and the point in time “After due” is the end date of the test run. Both are before the exam start date.
Read access (R) includes git fetch, git clone, and git pull, if you are using your local Git client to access the repository.
Write access (W) corresponds to git push if you are using your local Git client.
Repository type          | Role | Point in time | Access
-------------------------|------|---------------|-------
Base                     | S    | all           | none
                         | TA   | all           | R
                         | I    | all           | R/W
Student Assignment       | S    | Before start  | none
                         | S    | Working time  | R/W
                         | S    | After due     | R 1)
                         | TA   | all           | R
                         | I    | all           | R/W
TA Assignment            | S    | all           | none
                         | TA   | Before start  | R
                         | TA   | Working time  | R/W
                         | TA   | After due     | R
                         | I    | all           | R/W
Instructor Assignment    | S    | all           | none
                         | TA   | all           | R
                         | I    | all           | R/W 2)
Student Practice         | S    | Before start  | none
                         | S    | Working time  | none
                         | S    | After due     | R/W
                         | TA   | Before start  | none
                         | TA   | Working time  | none
                         | TA   | After due     | R
                         | I    | Before start  | none
                         | I    | Working time  | none
                         | I    | After due     | R/W
TA Practice              | S    | all           | none
                         | TA   | Before start  | none
                         | TA   | Working time  | none
                         | TA   | After due     | R/W
                         | I    | Before start  | none
                         | I    | Working time  | none
                         | I    | After due     | R/W
Instructor Practice      | S    | all           | none
                         | TA   | Before start  | none
                         | TA   | Working time  | none
                         | TA   | After due     | R
                         | I    | Before start  | none
                         | I    | Working time  | none
                         | I    | After due     | R/W
Instructor Exam Test Run | S    | all           | none
                         | TA   | all           | R
                         | I    | all           | R/W
1) Only valid for course exercises.
Students cannot read their repository for exam exercises after the due date.
2) The instructor can access the Instructor Assignment repository using the online editor either from the Edit in editor view accessed via the Course Management (-> Exercises -> Edit in editor) or from the Course Overview (clicking on the course card -> Open code editor).
After the due date of the exercise has passed, the instructor can push to the repository only via the online editor reached from the Course Management or using their local Git client.
The online editor accessible from the Course Overview will show that the repository is locked, as it does for all students taking part in the course.
Note
The Practice repositories as well as the TA assignment repository and the instructor assignment repository in the table above only exist for course exercises.
The following sections describe best practices for writing test cases.
The examples and explanations are specifically written for Java (using Ares/JUnit5), but the practices can also be generalized
for other programming languages.
These comments should contain information about what is tested specifically, which task from the problem statement
is addressed or which TODO (if there are numbered TODOs in the template), how many points the test is worth when
passed, and more if necessary. Make sure to keep the information consistent with the settings in Artemis, such as the weight of each test case.
/**
 * Tests that borrow() in Book successfully sets the available attribute to false
 * Problem Statement Task 2.1
 * Worth 1.5 Points (Weight: 1)
 */
@Test
public void testBorrowInBook() {
    // Test Code
}
Better yet, for manual correction, use these comments in the display name of the test. This allows the assessors, who execute the
tests in the IDE, to have more meaningful names displayed. The following example would make counting points easier.
@DisplayName("1.5 P | Books can be borrowed successfully")
@Test
public void testBorrowInBook() {
    // Test Code
}
Use Appropriate and Descriptive Names for Test Cases
After exercises and exams, test names will be used to create statistics. If the tests are called test1, test2, test3,
it will be hard to read those statistics. This is the same reason why you should not name your variables int a,
double b, String c. For example, if you want to test the method borrow in the class Book,
testBorrowInBook() would be an appropriate name for the test case.
@Test
public void testBorrowInBook() {
    // Test Code
}
If you have many tests in different (nested) classes that are not completely distinct, add the name of the tested
class to the test name to avoid having two tests with the same name. For example, if you test both add methods of a
LinkedList and an ArrayList, having the same test name will lead to errors in Artemis.
Clearer test names also make it easier to read and configure the grading in Artemis.
@Test
public void test_LinkedList_add() {
    // Test Code
}
Hint
For Java exercises: If all test methods are in a single class this is not necessary, because the Java compiler
won’t allow multiple methods with override-equivalent signatures.
Use Appropriate Timeouts for Test Cases
For regular test cases, using a @StrictTimeout(1) annotation is enough. This represents a strict timeout of one
second. The value type of the strict timeout annotation defaults to seconds. If you need a shorter timeout, you can use
@StrictTimeout(value = 500, unit = TimeUnit.MILLISECONDS). This annotation can also be added over a test class.
In that case the timeout applies individually to every test in that class.
@Test
@StrictTimeout(1)
public void testBorrowInBook() {
    // Test Code
}
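For illustration, a class-level variant could look like this (a sketch; the class and test names are made up, and the same Ares/JUnit 5 setup as above is assumed):

```java
// Applies a strict timeout of 5 seconds individually to every test in the class
@StrictTimeout(5)
public class BookTest {

    @Test
    public void testBorrowInBook() {
        // Test Code
    }

    @Test
    public void testReturnInBook() {
        // This test gets its own 5-second budget, not a shared one
    }
}
```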
Note
When defining timeouts, you should take into account that the tests are run on a Continuous Integration Server
(using build agents). The tests will most likely execute a lot faster on your local machine.
Avoiding Assert Statements
Instead, use conditional fail() calls to hide confusing information from the students. This could be considered
bad practice in regular testing, but it helps create fail messages that are less confusing, especially for beginners.
Additionally, this also hides information about test implementation details, if the specific inputs or expected outputs
should stay unknown to students.
@Test
public void testBorrowInBook() {
    Object book = newInstance("Book", 0, "Some title");
    invokeMethod(book, "borrow");
    assertFalse((Boolean) invokeMethod(book, "isAvailable"), "A borrowed book must be unavailable!");
}
If the student fails the test, Artemis will display something like org.opentest4j.AssertionFailedError: A borrowed book must be unavailable! ==> Expected <false> but was <true>.
The part after ‘==>’ should not be shown to the student, as it contains implementation details.
@Test
public void testBorrowInBook() {
    Object book = newInstance("Book", 0, "Some title");
    invokeMethod(book, "borrow");
    if ((Boolean) invokeMethod(book, "isAvailable")) {
        fail("A borrowed book is not available anymore!");
    }
}
This will display just the message ‘org.opentest4j.AssertionFailedError: A borrowed book is not available anymore!’, which, except for the first part, focuses on the actual error instead of test internals.
Write Tests that are as Independent of the Student’s Code as Possible
Students can and will break everything. Avoid direct code references and use reflective operations
instead. That way, if a student accidentally modifies the template such that the test code would normally not compile,
they still get more meaningful feedback than a plain build error.
@Test
public void testBorrowInBook() {
    Book book = new Book(0, "Some title");
    book.borrow();
    if (book.isAvailable()) {
        fail("A borrowed book must be unavailable!");
    }
}
The code above will lead to a build error if the student accidentally changes the Book class. Test code build errors
usually have cryptic fail messages, and students should not be confronted with such confusing error messages.
@Test
public void testBorrowInBook() {
    Object book = newInstance("Book", 0, "Some title");
    invokeMethod(book, "borrow");
    if ((Boolean) invokeMethod(book, "isAvailable")) {
        fail("A borrowed book must be unavailable!");
    }
}
The code above will lead to an error message like The class ‘Book’ was not found within the submission. Make sure to implement it properly.
The message is clear and tells the student exactly what is wrong with their code.
Check for Hard-Coded Student Solutions
It is possible that students hardcode values to pass a certain set of tests. Therefore, you should check whether this
is the case or not. This is especially important in an exam setting, so students don’t get awarded points for a
solution that does not fulfill the requirements described in the problem statement.
Avoid Relying on a Specific Order in which Students Solve the Tasks
Tests should successfully cover one aspect of the submission without requiring the implementation of a different part
of the exercise, even if those aspects are heavily coupled.
In this example, the student is supposed to expand the translate method first and after that the
runService method:
public String translate(String word, String language) {
    return switch (language) {
        case TranslationService.LANGUAGE_GERMAN -> translateToGerman(word);
        // TODO: Add a case for the French language
        default -> throw new IllegalStateException("Illegal language requested: " + language);
    };
}
public String runService(String serviceName, String parameter) {
    String result = null;
    if (serviceName.equals(TranslationService.SERVICE_NAME_TRANSLATION_GERMAN)) {
        result = translate(parameter, TranslationService.LANGUAGE_GERMAN);
    }
    // TODO: Add a case for the French language
    else {
        System.out.println("Can't offer service " + serviceName);
    }
    return result;
}
There are two separate tests, one testing the translation and the other one testing the runService method.
The test for runService must not assume that the translate method is already implemented correctly.
A possible solution for this problem could look like this:
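The original solution code is not included here; the following is a self-contained sketch of the technique, using plain Java stand-ins instead of the real exercise classes and Ares test utilities (the constants and names for the French case are assumptions):

```java
// Sketch: make the check of runService independent of the student's
// translate implementation by overriding translate in the test.
public class IndependentTestExample {

    // Minimal stand-in for the exercise class; constants are illustrative.
    static class TranslationService {
        static final String LANGUAGE_FRENCH = "french";
        static final String SERVICE_NAME_TRANSLATION_FRENCH = "translateFrench";

        // Possibly incomplete student implementation of the previous task
        String translate(String word, String language) {
            throw new IllegalStateException("not implemented yet");
        }

        String runService(String serviceName, String parameter) {
            if (serviceName.equals(SERVICE_NAME_TRANSLATION_FRENCH)) {
                return translate(parameter, LANGUAGE_FRENCH);
            }
            return null;
        }
    }

    // Checks only that runService delegates to translate with the French
    // language constant, regardless of how translate itself is implemented.
    static boolean runServiceDelegatesCorrectly() {
        TranslationService service = new TranslationService() {
            @Override
            String translate(String word, String language) {
                return word + "|" + language; // fake translation, always works
            }
        };
        String result = service.runService(
                TranslationService.SERVICE_NAME_TRANSLATION_FRENCH, "hello");
        return "hello|french".equals(result);
    }

    public static void main(String[] args) {
        System.out.println(runServiceDelegatesCorrectly()); // prints true
    }
}
```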
This test correctly checks whether the student added the case for the French language and called the appropriate method
with the appropriate parameters. Because the translation method was overridden, it doesn’t matter whether the student
has already completed the previous task or not.
Note
If you use this technique, you should have some way to deal with students who make the class or method final, either via the
problem statement or a test. Otherwise, students get compilation errors in the test code.
Catch Possible Student Errors
Handle possible student mistakes appropriately in the test case. For example, if a method of the student returns null
and the test does not handle this appropriately, a NullPointerException might be thrown, leading to a
cryptic fail message. A null check in the test case allows providing a clearer fail message to the student.
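A minimal sketch of such a defensive null check, using plain Java stand-ins (the student method and messages are made up for illustration, and the pattern would normally live inside a test with fail() instead of returned strings):

```java
public class NullCheckExample {

    // Stand-in for a student method that might erroneously return null
    static String studentGetTitle() {
        return null;
    }

    // Checks the title but guards against null first, so the student sees a
    // clear message instead of a cryptic NullPointerException stack trace.
    static String checkTitle() {
        String title = studentGetTitle();
        if (title == null) {
            return "getTitle() in Book must not return null!";
        }
        if (!title.equals("Some title")) {
            return "getTitle() in Book returns the wrong title!";
        }
        return "passed";
    }

    public static void main(String[] args) {
        System.out.println(checkTitle()); // prints the null-check message
    }
}
```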
Use Constant String Attributes to Represent the Base Package
Some courses use long package identifiers like de.tum.in.ase.pse. When instantiating objects with reflections,
the instantiation method usually takes the full canonical name of a class, which is de.tum.in.ase.pse.Book for example.
To avoid writing out this full canonical name all the time, you can add a constant String attribute representing the
base package name to the top of your test class.
private static final String BASE_PACKAGE = "de.tum.in.ase.pse.";

@Test
public void testBorrowInBook() {
    Object book = newInstance(BASE_PACKAGE + "Book", 0, "Some title");
    // Test Code
}
Use JUnit5 and Ares Features
More information can be found in the JUnit5 and
Ares documentation. The following list adds some
useful notes:
In combination with display names for both the tests and the nested classes, this allows structuring the grading with
tests, and the grouping is also helpful when executing the tests in the IDE. One example would be to structure the
tests by exercise sub-tasks or to group tests that all check a single, more complicated method. You can also achieve this
by using static nested classes instead of inner classes annotated with @Nested (decide depending on your scenario).
Define a custom, well-structured and predictable test execution order with @Order
If you test multiple cases in a single test (e.g. because you want “all or nothing” grading for those cases or simply
check with multiple inputs), you can use assertDoesNotThrow
to pass a message that is displayed in Artemis, in case an exception occurs in the student code.
If you want to test multiple assertions that are fairly independent in a single test (e.g. because you want
“all or nothing” grading for those cases or simply check with multiple inputs) you should consider if
assertAll
is what you need. This will execute all passed executables and aggregate the failures, which allows showing students
multiple wrong aspects of their solution directly.
If you have special needs, consider using Dynamic Tests
and/or write your own extension.
If you need to test tests, use the JUnit Platform Test Kit.
For providing wrong implementations that students need to test, consider
passing interface implementations (easiest and safe) → pure Java
passing mocked objects (flexible and safe, students don’t need to know) → EasyMock/Mockito
mocking single methods of tested objects (partial mock) or mocking constructors → one of the above plus PowerMock
Define your own Annotations
Custom annotations are an easy and powerful tool to keep your code readable. This example defines an annotation
that combines both the test and the strict timeout annotation.
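A sketch of such a composed annotation (the annotation name is made up; this assumes Ares’ @StrictTimeout supports being used as a meta-annotation, as JUnit Jupiter annotations do, and the Ares import path is an assumption):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;

import org.junit.jupiter.api.Test;
// import de.tum.in.test.api.StrictTimeout; // Ares (import path is an assumption)

// Every method annotated with @TimedTest behaves like @Test @StrictTimeout(10)
@Test
@StrictTimeout(10)
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface TimedTest {
}
```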
Jqwik allows testing with arbitrary inputs and shrinks failing inputs, resulting in excellent
counterexamples when student code fails (usually hitting exactly the edge case).
Eclipse Compiler and Best-Effort Compilation
Use the Eclipse Java Compiler for partial, best-effort compilation. This is particularly useful for exam exercises and
other more “dirty” programming work. It is also useful for testing complicated generics (you really don’t want to do
that with Java reflection). This causes compilation errors to be transformed into errors that are thrown where the code
does not compile. This is done at method and class level (essentially replacing the body of the class/method with
throw new Error("Unresolved compilation problems: ..."), which is then thrown whenever the class/method is used).
If you intend to write tests that exploit this, make sure that only the body of the test methods does not compile
(e.g. if a student didn’t implement something from a task). If your complete test class does not compile, this causes
the complete test class initialization to fail, which results in cryptic feedback in Artemis. Anything in the test class
that is not a method body or nested class body must compile. This includes method return types and parameter types,
and therefore also lambdas! You can avoid that by using, e.g., the Object class and casting inside / at the call site.
Use, e.g., a nested class for fields, field types, and methods with student class return/parameter types that could
potentially not compile. Because the nested class is a separate class that is loaded separately, the top-level test
class will still load successfully, and only methods using that nested class will fail, due to the error from the nested
class initialization.
You can choose to use the Eclipse compiler for both student and test code, or for test code only, depending on whether
you want to grade not fully compiling code.
Note
The Eclipse Compiler released under this Maven coordinate does not always support the latest Java version. You can
still compile the student code with the latest Java version and only the test code with the previous one.
The Reflection API is limited when it comes to constant attributes. Constant attributes are static final attributes
with a primitive or String type. Java inlines such attributes during compile-time, which makes it more or less
impossible to change the value during runtime.
Be careful with long output, arrays, or Strings. This might be unreadable in Artemis or even cut off after 5000 characters.
By default, the results of all unit tests are extracted and sent back to Artemis without any further manual interaction.
Only some custom setups might require a semi-automatic approach.
In the Jenkins CI system, the test case feedback is extracted from XML files in the JUnit format.
The Jenkins plugin reads all such files from a folder results in the top level of the Jenkins workspace.
The files resulting from the regularly executed unit tests are copied to this folder automatically.
To add additional custom test case feedback, another mechanism is provided: create a folder
customFeedbacks, also at the top level of the workspace.
In this folder, an arbitrary number of JSON files can be created.
Each one represents a single test case feedback and should have the format:
{
    "name": string,
    "successful": boolean,
    "message": string
}
name: This is the name of the test case as it will be shown, for example, on the ‘Configure Grading’ page.
It should therefore be a name that uniquely identifies the test case within this exercise, and it has to be non-null and not empty.
successful: Indicates if the test case execution for this submission should be marked as successful or failed.
Defaults to false if not present.
message: The message shown as additional information to the student.
Required for non-successful tests/feedback, optional otherwise.
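A complete example file could look like this (the file name, test name, and message are made up for illustration), e.g. customFeedbacks/sorting_check.json:

```json
{
    "name": "Custom Sorting Check",
    "successful": false,
    "message": "Your sorting implementation does not handle empty lists."
}
```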
On the Artemis home page, click on the Course Management button .
Navigate into Exercises of a specific course by clicking on the exercise button.
In the quiz exercises section, click on the Create Quiz button to open the following form to create the new quiz exercise.
Title: Provide a title for the quiz exercise (The red line means that this field is mandatory).
Categories: Type the category for the quiz exercise.
Difficulty: Select the difficulty level among Easy, Medium and Hard. It is possible to select No Level.
Duration: Provide the time in minutes and seconds for students to solve the quiz exercise.
Options: Choose between presenting the questions in random order or not.
Batch Mode: Batch Mode controls how students can participate in the quiz.
Synchronized: There is exactly one opportunity for all students to participate in the quiz.
Batched: There are multiple opportunities for students to participate in the quiz by using a password.
Individual: Students can participate in the quiz by themselves at any time while it is released.
Visible from: The date and hour when the quiz becomes visible to students.
Schedule Quiz Start: To establish the date and hour at which the quiz will be available for solving.
Start of working time: Set the time for the students to see the questions and start answering them. Students can start working on the quiz from this time until the duration ends.
Should this exercise be included in the course score calculation?:
Yes: the points will be included in the course score calculation.
No: the points will not be included in the course score calculation.
Bonus: the points will be considered as bonus points.
Questions: There are four ways to add questions to a quiz exercise.
Add Multiple-Choice Question
This kind of question is composed of a problem statement with multiple options.
Short question title: Provide a short title (Mandatory).
Points: Assign the point value for this question.
Scoring type:
All or Nothing
Proportional with Penalty
Proportional without Penalty
Present answer options in random order.
Single Choice Mode: When there is just one correct option. This disables the Scoring type (i.e., it is set to All or Nothing).
Delete icon: To delete the current question.
Edit View: Enables the text editor to write the quiz statement and its options, hints and explanations.
Edit bar: When the edit view is enabled, the format bar provides:
Style to the statement text
Correct Options [correct]
Incorrect Options [wrong]
Explanations [exp]
Hints [hint]
Text editor: The quiz statement can be developed with options, hints and explanations.
Preview View: Enables the student view.
Visual View: This view shows the question from the Edit view as a rendered question. The different parts of the question are editable and answer options can be added and removed.
It is possible to toggle between the Visual and the Edit view at all times and changes to the question will be respected. The following video shows an exemplary use of the Visual view.
Add Drag-And-Drop Question
This kind of question is composed of a problem statement, a background image, and drag and drop options.
Short question title: Provide a short title.
Points: Assign the point value for this question.
Scoring type:
All or Nothing
Proportional with Penalty
Proportional without Penalty
Present drag items in random order.
Delete icon: To delete the current question.
Edit View: Enables the text editor to write the question statement with explanations and hints.
Edit bar: When the edit view is enabled, the format bar provides:
Style to the statement text
Explanations [exp]
Hints [hint]
Text editor: The quiz statement can be developed with hints and explanations.
Upload Background: Select and upload a background image from your computer; the drag items are then dropped onto it.
Add Drag Items:
Text items: Type the options.
Image items: Upload images from your computer.
Preview View: Enables the student view.
Add Short-Answer Question
This kind of question is composed of a statement and spots that students fill out by typing the answers.
Short question title: Provide a short title.
Points: Assign the point value for this question.
Scoring type:
All or Nothing
Proportional with Penalty
Proportional without Penalty
Match Letter Case
Match Answers Exactly: This option moves the match slider to 100%.
Delete icon: To delete the current question.
Add Spot Option: To add the spot between the text to be filled out.
Add Answer Option: To provide the answer for each spot.
Text editor: The quiz statement can be developed with the spots and options.
Text View: Enables the text editor to write and edit the question statement.
Visual View: Enables the student view.
Add Existing Questions
This option allows importing existing questions from other quiz exercises, courses, exams, and files.
Source buttons:
From a course
From an exam
From a file
List picker to select a specific course, exam or file.
Search bar: Look for a question by typing its name or part of it.
Filter options according to the type of questions:
Drag and Drop Question
Multiple Choice Question
Short Answer Question
Apply filter button
List of questions with their title, short title, and type. In the Add column, it is possible to select all questions to be imported.
At the end of the list, click the Add selected Questions button to import all selected questions.
Footer: On the quiz creation page there is a footer with the following fields:
In the quiz exercises section, click on the Import a Quiz button.
The list of existing quizzes will appear.
The search bar: Allows you to look for a specific quiz by typing its name or part of it.
The list of quizzes: With their ID, title, course, and an indicator whether they are exam questions.
Clicking the Import button opens the quiz editor with the existing questions. Here it is possible to edit all parameters, just as in Create new Quiz Exercise.
If a quiz exercise is currently available, students can see it as a current exercise in the course overview on the Artemis home page or inside the course it belongs to.
The current exercise box will show:
The name of the quiz
The button to start the quiz
The category
The message if the quiz is active
The due date
To start the quiz, the student must press the Open quiz button.
If the quiz is set to start at a specific time and a student opens it earlier, they will see a message asking them to wait until the quiz starts and displaying the remaining time.
When the quiz starts, the student can see the questions and solve them.
The quiz page is composed of:
Number and title of the question
Points for solving that question
The quiz statement
Options:
Options with circles mean one choice could be correct.
Options with squares mean multiple options could be correct.
In the footer:
The number of questions and overall points.
Time left to complete the quiz.
Last time saved: The quiz will save all changes after they occur.
Connection status.
Submit button: To allow the student to submit the quiz before the time ends.
In the case of Drag and Drop questions, the items to be dragged and dropped in the spots will be available on the right side.
To submit and finish the quiz, the student must press the Submit button. When the quiz time is up, the answers are submitted automatically.
The assessment is automatic and the student can see the result of the overall quiz and of specific questions. In the case of MC questions, the solution will be displayed.
In the case of Drag and Drop questions, the solution is shown by clicking the Show Sample Solution button.
The following screenshot illustrates the first section of the form. It consists of:
Title: Title of an exercise.
Categories: Category of an exercise.
Difficulty: Difficulty of an exercise. (No level, Easy, Medium or Hard).
Mode: Solving mode of an exercise. This cannot be changed afterwards (Individual or Team).
Release Date: Date after which students can access the exercise.
Due Date: Date till when students can work on the exercise.
Assessment Due Date: Date after which students can view the feedback of the assessments from the instructors.
Inclusion in course score calculation: Option that determines whether or not to include exercise in course score calculation.
Points: Total points of an exercise.
Bonus Points: Bonus points of an exercise.
Diagram Type: Type of diagram that is used throughout an exercise.
Note
Fields marked in red are mandatory.
Note
The field Diagram Type determines the components that students/instructors can use while working on the exercise.
This option cannot be changed after creating the exercise.
For example: if the instructor selects class diagram as the diagram type, users (instructors/students) will only be able to use components of class diagrams throughout the exercise.
The following screenshot illustrates the second section of the form. It consists of:
Enable automatic assessment suggestions: When enabled, Artemis tries to automatically suggest assessments for diagram elements based on previously graded submissions for this exercise.
Problem Statement: The task description of the exercise as seen by students.
Assessment Instructions: Instructions for instructors while assessing the submission.
Note
If you are not clear about any of the fields, you can access additional hints by hovering over the icon for many of them.
The following screenshot illustrates the last section of the form. It consists of:
Example Solution: Example solution of an exercise.
Example Solution Explanation: Explanation of the example solution.
Example Solution Publication Date: Date after which the example solution is accessible for students. If you leave this field empty, the solution will only be published to tutors.
Once you are done defining the schema of an exercise, you can create it by clicking on the corresponding button.
You will then be redirected to Example Submissions for Assessment Training Page.
In this page, you can either Create Example Submission or Use as Example Submission for Assessment Training.
Example submissions can be used to assess the submissions of students semi-automatically.
Artemis uses those submissions to automatically apply the known assessment comments to similar model elements in other submissions as well.
Select Create Example Submission if you want to create an example submission from scratch.
Alternatively, after the exercise has already started, you can also use submissions submitted by students as example submissions. For that, click on the corresponding button.
Note
Artemis uses semi-automatic grading of modeling exercises using machine learning.
You can hence train the model by selecting the Use in Assessment Training checkbox while creating an example submission.
Alternatively, you can also import a modeling exercise from an existing one by clicking on Import Modeling Exercise.
An import modal will pop up, where you can select and import a previous modeling exercise from the list by clicking on the corresponding button.
Once you import one of the exercises, you will be redirected to a form similar to the Create new modeling exercise form, with all fields filled in from the imported exercise. You can now modify the fields as necessary to create a new modeling exercise.
When the exercise is released, students can work on it.
They can start the exercise by clicking the button.
Once they start the exercise, they will now have the option to work on it in an online modeling editor by clicking on the button.
The screenshot below depicts the online modeling exercise interface for students. They can read the Problem Statement, work on the online editor and also provide an explanation to their solutions, if needed.
When the due date is over you can assess the submissions.
To assess the submissions, first click on Assessment Dashboard.
Then click on Submissions of the modeling exercise.
You will then be redirected to Submissions and Assessments Page.
Click on the button of a specific student. You will then be redirected to the assessment page, where you can assess the submission of that student.
You can now start assessing the elements of the model by double-clicking them. Once you double-click an element, an assessment dialog appears where you can assign points, give feedback, and navigate through all other assessable components.
Alternatively, you can also assess the diagram by dragging and dropping assessment instructions from the Assessment Instructions section.
Feedback to the entire submission can also be added by clicking on the button.
Once you’re done assessing the solution, you can either:
Click the save button to save the incomplete assessment so that you can continue it later.
Click the submit button to submit the assessment.
Click the cancel button to cancel the assessment and release its lock.
The following sections describe the supported features and the process of creating a new file upload exercise.
Open Course Management.
Navigate into Exercises of your preferred course.
Click on Create a new file upload exercise.
Fill out all mandatory values and click on the create button.
The exercise-specific File Pattern defines which file types students can upload as their solution. The input field accepts all file endings without a leading dot, separated by commas (e.g., pdf,png,zip).
Result: File Upload Exercise.
Click the button of the file upload exercise to update the configuration and assessment instructions.
You can get an overview of the exercise by clicking on the title.
When the due date is over, you can assess the submissions. From the assessment dashboard, go to the exercise assessment dashboard of the file upload exercise.
There you can assess the students' submissions by first downloading the file and then adding feedback with points.
When creating an exercise, the instructor first needs to select the mode Team. This is only available during the creation of the exercise, as it cannot be changed later on.
The team size can also be configured, but it is just a recommendation that can be overridden when creating the actual teams.
Clicking the corresponding button opens the dialog shown below. To manually create a team, instructors must define the name, short name, and students of the team. Optionally, a tutor can also be assigned to the team.
To facilitate the process of creating teams, instructors can use the button to export the teams in an exercise to a file, which can then be imported in other exercises.
To import teams into an exercise, instructors can use the import button. This allows them to choose between importing from a file or importing directly from another exercise in the course. In both cases, instructors must choose whether they want to delete all existing teams or only create new teams, as shown below:
When working on a team exercise, students can work collaboratively using the live editors. This is available for modeling and text exercises.
The live editors show the status of all the team members, and allow students to simultaneously edit the same exercise:
The same team can be shared for multiple exercises. Viewing the page for a single team allows students, tutors, and instructors to get an overview of all the exercises for that team along with their current status. To access the team overview page, users can click on the corresponding button or on the team's short name.
Instructors can upload files, such as lecture slides, and partition the lecture’s content into individual lecture units.
Lecture units can consist of files, text, external links, videos or livestreams (e.g., lecture recordings).
To directly link the necessary knowledge to its application, regular course exercises can be added to the lecture as a unit, too.
Instructors can also define learning goals so that students can keep track of the knowledge they should have after working with those lecture materials.
On the course management page, clicking on Lectures opens the following page for managing lectures.
Instructors have three options for creating a lecture.
1. Create a new lecture from scratch by clicking on the corresponding button.
Lectures consist of a title, a description, and optionally a start and end date.
2. In addition to creating a new lecture in the default mode, instructors can switch to the guided lecture creation by clicking on the corresponding button.
This guided mode helps create a new lecture and add its contents through a multi-step process. The following video shows an exemplary use of the guided mode.
3. Alternatively, instructors can import a lecture from any other course in which they have at least editor access.
Clicking on the import button opens the import modal, where instructors can search for an existing lecture and import it.
Once a lecture is created, instructors can add attachments to it.
An attachment is a file (e.g., document, image) with a title and an optional release date.
Lectures can be divided into lecture units, which can be of the following types:
Text unit: A text with markup.
Exercise unit: An exercise from the same course.
Video unit: An embedded video stream or video from an external source.
Online unit: A link to an external website.
Attachment unit: A file that the student may download.
Students see all released lecture units on the lecture details page.
Clicking on a unit opens its contents.
Artemis shows a flag icon with a popover next to the unit if it is associated with a learning goal.
Students complete lecture units automatically (e.g., when they are opened) or manually by clicking the checkbox.
Instructors can create lecture units on the lecture unit management page.
After adding lecture units, instructors may edit or delete each one with the buttons to the right of the unit.
Using the arrow buttons, the order of the lecture units can be changed.
An exercise can be added as a unit to a lecture.
For the exercise unit, Artemis uses the title, release date, etc. of the exercise itself.
Students complete this unit when they participate in the exercise.
An online unit consists of a link to an external website, a name, and optionally a description and release date.
Artemis automatically pre-fills the title and description from the website’s metadata once the URL is set.
Students complete this unit once they navigate to the external website.
A video unit consists of a name, an embeddable video link, and optionally a description and release date.
Artemis can convert the website link from common video sources to an embeddable URL using the arrow button.
Students complete this unit when they watch the video for at least five minutes.
Instructors can create competencies, which are desired learning objectives, and link lecture units to them.
See Learning Analytics for more information.
The exam mode in Artemis tolerates issues with the Internet connection.
If you lose your connection, you can continue working on text, quiz, and modeling exercises, but you might get warnings that your solutions cannot be saved.
If your Internet connection recovers, Artemis will save your solution.
Artemis tries to save your solution every 30 seconds, when you navigate between exercises, and when you click the save or submit buttons.
Programming exercises have two modes.
Online code editor: can only be used when you are online.
Note
You have to click the submit button! Otherwise your solution will not be pushed to the VC server and no build will be triggered.
Local IDE: you only need to be online when you clone the repository and when you push your commits (i.e. submit your solution).
At the end of the online exam, you must be online within the given grace period and submit your exam; otherwise, it will not be graded.
If you reload the browser, the Welcome Screen opens and you must enter your name and confirm the checkbox again.
You should only reload if an error occurs that cannot be recovered otherwise!
Participate in ONE browser window and only one browser tab!
Working on the exam in multiple browser windows or tabs at the same time is not allowed! It leads to synchronization issues and is seen as suspicious behaviour that can be flagged as cheating.
Having multiple Artemis windows or tabs open is fine, as long as only one of them accesses the exam.
The welcome screen gives you an overview of all the important information you need about the exam.
Carefully read through the instructions.
Once you have read them, confirm that you will follow the rules by ticking the corresponding checkbox, sign with your full name, and click the start button.
Note
Your full name represents your signature. You can find your full name as registered on Artemis below the input field.
After you confirm, if the exam working time has started, the Exam Conduction screen will automatically appear.
Otherwise, you must wait until the exam begins. This won't take longer than 5 minutes. A popup will appear, notifying you how much time is left before the planned start.
Once the exam working time starts and you have confirmed your participation, the Exercise Overview screen will appear. This screen lists all exercises that are part of your exam with their respective amount of points, title and exercise type. The status column indicates the status of each exercise and whether you have a submission in them or not.
On the header, you will find the Exam Navigation Bar. You can use this bar to navigate between different exercises. For each exercise an icon will display your current status.
When there are unsaved or unsubmitted changes, the navigation bar marks the exercise accordingly.
When your changes are saved and submitted, the navigation bar marks the exercise as submitted.
A separate icon indicates that you have not started an exercise yet.
You can also navigate to the next exercise when you are done with one by clicking the corresponding button. This action saves and submits your changes and moves to the next exercise.
Warning
For programming exercises, there is no save button. You must manually press the submit button, otherwise your solution will not be graded!
On the header, you will also find a button to hand in early. If you press it, you will be sent to the exam End Screen.
The time left until the end of the exam is also shown next to the action buttons, or below, depending on your screen size.
Note
When the time is about to run out, the background of the timer will turn yellow to warn you.
Various question types can be included in quiz exam exercises. These are:
Multiple choice questions
Short Answer questions
Drag and Drop questions
All questions are listed in the main screen below one another.
To navigate between them you can either scroll or use the question overview on the left. When you click on one of the question representations, your view automatically scrolls to the respective question.
To submit your solution, press the submit button.
Note
Your submission will automatically be saved every 30 seconds.
The text exercise view is divided into two sections, the text editor, and the problem statement. The problem statement is docked to the right.
Note
On small screens, the problem statement is shown above the text editor.
If you want to focus only on the text editor, you can collapse the problem statement by pressing the arrow in the top right of the image below. This can be reverted by pressing the arrow again.
Note
You can also choose to resize the problem statement by dragging the outline box.
Within the editor you can type out your solution. The editor will automatically track your number of words and number of characters.
The modeling exercise view is divided into two sections, the modeling editor, and the problem statement. The problem statement is docked to the right.
Note
On small screens, the problem statement is shown above the modeling editor.
If you want to focus only on the modeling editor, you can collapse the problem statement by pressing the arrow. This can be reverted by pressing the arrow again.
Note
You can also choose to resize the problem statement by dragging the outline box.
Within the editor you can model your solution. Depending on the diagram type, you will find the available elements on the right side of the editor. Simply drag and drop them into the editing field.
When you click on a dropped element, you can configure it by setting its name, attributes, methods, etc.
To connect elements, you can drag one element's edge to another element. The editor will then automatically connect the two.
If you are unsure how to use the modeling editor, you can click the help button. It provides further information about how to use the modeling editor.
Note
If you need more space, you can work in fullscreen by clicking the fullscreen button. This mode uses your whole screen for the modeling exercise, giving you more space to model your solution. To exit fullscreen mode, click the button again.
Depending on your exam, programming exercises can come in three forms:
Online Code Editor + support for local IDE
Online Code Editor
Support for local IDE
If your exercise allows the use of the code editor, your screen will be divided into three sections, from left to right:
The file browser
The code editor
The instructions
The file browser displays the file structure of the assignment. You can access any file within the assignment. Artemis will display the selected file’s content in the code editor where you can edit it.
You can add new files and directories using the and buttons.
You can also rename and delete files and folders, so caution is advised.
The code editor allows you to edit the content of specific files. It shows the line numbers and will also annotate the appropriate line, if a compilation error occurs.
The instructions are docked to the right.
If you want to focus only on the code editor, you can collapse the instructions by pressing the arrow. This can be reverted by pressing the arrow again. Similarly, if you want to collapse the file browser, you can press the arrow above the file browser.
Note
You can also choose to resize any of the three sections by dragging the dividers between them.
When you press the refresh button, all unsaved changes in the online code editor are overwritten. Artemis auto-saves your changes in the code editor every 30 seconds.
When you press the submit button, your changes are pushed to the version control (VC) server and a build is started on the continuous integration (CI) server; the result indicator changes accordingly. You need to press submit first to get feedback on your submission's build status.
Warning
There is no auto-submit!
Participating in Programming Exercises with the online code editor and local IDE enabled
If your exercise allows the use of the local IDE, you will have access to the clone button.
When you click it, you can choose to clone the exercise via HTTPS or, if you have configured your private key, via SSH.
Note
You must link a public key to your account in advance if you want to use SSH.
To work offline follow these steps:
Clone the Exercise
Import the project in your IDE
Work on the code
Commit and push the code. A push is equivalent to pressing the submit button.
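The steps above map onto plain Git commands. A minimal, self-contained sketch follows; the local bare repository stands in for the Artemis VC server, and in a real exam you would copy the repository URL from the clone dialog instead:

```shell
# Stand-in for the Artemis VC server -- in the exam, use the URL from the clone dialog.
git init --bare /tmp/artemis-remote.git

# 1. Clone the exercise repository.
git clone /tmp/artemis-remote.git /tmp/exam-exercise
cd /tmp/exam-exercise
git config user.name "Student"
git config user.email "student@example.com"

# 2./3. Import the project into your IDE and work on the code.
echo "my solution" > solution.txt

# 4. Commit and push -- on Artemis, the push is what submits your solution.
git add .
git commit -m "Implement solution"
git push origin HEAD
```

Until the push succeeds, nothing has been submitted; committing locally alone is not enough.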
You are responsible for pushing/submitting your code. Your instructors cannot help you if you did not submit.
Your instructors can decide to limit the real-time feedback in programming exercises during the online exam.
In that case, you will only see if your code compiles or not:
A failed result means that your code does not compile!
A successful result means that your code compiles, but it provides no further information about your final score.
Warning
Edit a programming exercise EITHER in the online editor OR in your local IDE! Otherwise, conflicts can occur that are hard to resolve.
If you work in the online code editor and a merge conflict occurs, the file browser will display the conflict state.
You can use the button, which is then displayed instead of the submit button, to resolve the conflict within the online code editor.
This will reset your changes to the latest commit. Manual merging is not possible with the online code editor.
When you are finished with the exercises, or when the time runs out, you navigate to the End Screen.
This happens either when you click the button to hand in early or automatically when the exam conduction time is over.
Note
If you navigated to this screen manually, you have the option to return to the conduction by clicking the corresponding button.
In this screen you should confirm that you followed all the rules and sign with your full name, similar to the Welcome Screen.
You are given an additional grace period to submit the exam after the conduction is over. This additional time is added to the timer shown on the top right.
Warning
Your exam will not be graded, should you fail to submit!
Once you submit your exam, no further changes can be made to any exercise.
After you hand in, you can view the summary of your exam.
You always have access to the summary. You can find it by following the steps displayed in: Accessing the Exam.
Furthermore, you have the opportunity to export the summary as a PDF file by clicking the export button.
The summary contains an aggregated view of all your submissions. For programming exercises, it also contains the latest commit hash and repository URL so you can review your code.
Once the results have been published, you can view your score in the summary.
Additionally, if you are within the student review period, you have the option to complain about manual assessments. To do this, click the complaint button and explain your rationale.
A second assessor, different from the original one, will have the opportunity to review your complaint and respond to it.
The complaint response will become visible to you as soon as it has been assessed.
Again, you can export the summary, including your score, as a PDF file by clicking the export button. The PDF will also contain any complaints and complaint assessments.
Note
The results will automatically be updated, if your complaint was successful.
Complaining about the Assessment of a Text Exercise
The grades below the first passing grade are shown in red, and the passing grades are shown in green.
If the instructor defined a bonus configuration for your exam, you will also see your final grade with the applied bonus below your raw exam grade before bonus.
For more information about all the grading intervals, you can click the button to view all grade step boundaries with their bound inclusivity.
A square bracket [ or ] in the interval of a grade step means the bound is included in the current grade step, and a parenthesis ( or ) means it is excluded.
For example, if the grade step for 2.0 shows the percentage interval as [80-85) this means that a student achieving 80% has the grade 2.0, whereas a student achieving 85% receives the grade right above 2.0 (i.e. 1.7 if the default grading key is used).
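The bound notation can be read as a pair of inclusive/exclusive comparisons. A minimal sketch for the [80-85) step from the example above (the step boundaries are taken from that example, not from any specific grading key):

```shell
# [80-85): the square bracket includes 80, the parenthesis excludes 85.
pct=80
if [ "$pct" -ge 80 ] && [ "$pct" -lt 85 ]; then
  echo "within the 2.0 grade step"
else
  echo "outside the 2.0 grade step"
fi
```

With pct=80 this prints "within the 2.0 grade step"; with pct=85 the exclusive upper bound places the student in the next grade step instead.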
Exam Grading Key for a student receiving 135 points out of 150
During the exam creation and configuration, you can create your exam and configure it to fit your needs. Add exercises with different variants, register students, generate student exams and conduct test runs. For more information see 1.2 Create and Configure Exam.
Artemis supports two exam modes. The conduction of normal exams and test exams.
The normal exam mode is suitable for conducting end-of-semester exams. Students can view and work on the exam during the configured working time. Afterwards, you can perform a manual or automated evaluation of the students' submissions. The results will be published on the specified date.
Test exams give students a practice opportunity for the end-of-semester exam. The main difference is that you choose a working window within which students can freely start the exam. Students then have the configured working time to complete the exam. After submitting, students immediately receive the automated assessment for programming and quiz exercises. Manual correction of the submissions is not necessary.
When you click on the create button, you are presented with the Create Exam view. Here you can set the basic information such as title, examiner, etc.
You can choose between the exam and test exam mode.
The timeline of the exam is defined by the dates: visible from, start of working time, end of working time, release date of results, begin of student review, and end of student review.
The first three dates are mandatory when you create an exam. The rest can be set when required.
The grace period defines the amount of time the students have at their disposal to hand in their exam after the working time is over. This is set to 3 minutes by default.
Before the exam's assessment, you can choose the number of correction rounds. If you want two tutors to assess a student's exam successively, set the number to two here. This enables the second correction.
You can also define the number of exercises in the exam. You can leave this out initially; however, it must be set before you can generate the student exams. For more information, see 1.3 Exercise Groups.
Artemis will randomize the order of the exercises for each student if you activate randomize order of exercise groups.
Finally, you can fill out the exam start text and end text. Artemis will present these texts to the students during the exam conduction, at the start and end pages respectively.
Instead of creating a new exam, you can import an existing exam by clicking the import button from any course in which you are an instructor.
Artemis displays a list of all available exams. To select a specific exam for the import, click on the corresponding button.
You are now presented with the Import Exam view. All information except for the dates is copied from the exam you selected for the import. You can find more information regarding this view in the section create exam.
Additionally, you can select or deselect exercises which are imported alongside the exam. You can find more information regarding the exercise import in the section regarding the exercise group import.
Artemis exam mode allows you to define multiple exercise variants so that each student can receive a unique exam. Artemis achieves this through exercise groups. Exercise groups represent an individual exercise slot for each student exam. Within one exercise group you can define different exercises.
Artemis selects one exercise per exercise group randomly, to generate the individual student exams.
You can distinguish between mandatory exercise groups and non-mandatory exercise groups.
Artemis always includes mandatory exercise groups in the individual exam of a student.
Non-mandatory exercise groups can be left out if there are more exercise groups than the Number of Exercises defined in the exam configuration.
By default, every exercise group is mandatory. You can set the mandatory flag when you add an exercise group initially, or later by clicking on the exercise group.
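The selection rule described above can be sketched as follows. This is an illustrative model under stated assumptions, not Artemis’ actual implementation, and all names are made up:

```python
import random

# Sketch: mandatory groups are always included; optional groups fill the
# remaining slots; one random exercise is drawn per chosen group.
def draw_student_exam(exercise_groups, number_of_exercises, rng=random):
    """exercise_groups: list of (mandatory: bool, exercises: list)."""
    mandatory = [g for g in exercise_groups if g[0]]
    optional = [g for g in exercise_groups if not g[0]]
    slots_left = number_of_exercises - len(mandatory)
    chosen = mandatory + rng.sample(optional, slots_left)
    return [rng.choice(exercises) for _, exercises in chosen]

groups = [(True, ["text-A", "text-B"]),   # mandatory group: always included
          (False, ["quiz-A"]),            # optional groups: one of these two
          (False, ["modeling-A"])]        # fills the second slot
exam = draw_student_exam(groups, number_of_exercises=2)
print(exam)  # e.g. ['text-B', 'quiz-A']
```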
Artemis exam mode allows you to import one or more exercise groups from an existing exam.
The import process consists of two steps.
Step 1: Select Exam
When you click on , you can select one exam from which exercise group(s) should be imported.
To select one exam, click on .
Step 2: Select Exercises and Exercise Groups
In the next step you can select or deselect exercises which should be imported alongside the exercise groups.
You can also change the title and isMandatory of an exercise group, as well as the title (and short-name for programming exercises) for the individual exercises.
The title and short name of programming exercises must be unique. If you want to import an exercise group into the same course, you must change the title and short name before you can import the exercise group.
After you have started the import by clicking on , Artemis checks if the title and short name of the selected programming exercise(s) are unique. If they are not unique, a warning is displayed and you have to change the corresponding title and short name.
Note
Further changes to the individual exercises can be made after the import by editing the respective exercise.
Programming exercises are imported using their initial configuration. This import functionality cannot be used for changing the submission policy, for activating / deactivating the static code analysis or for creating new build plans. In this case, please import the exercises individually into the exercise groups.
Exercise groups can contain multiple exercises. For every student exam, Artemis will randomly select one exercise per exercise group.
Note
If you want all students to have the same exam, define only one exercise per exercise group.
To add exercises, navigate to the Exercise Groups of the exam. The header of each exercise group lists the available exercise types. You can choose between creating a new exercise or importing an existing one from your courses.
For exercise types text and modeling you can also define example submissions and example assessments to guide your assessor team.
Assessors will review the example submissions and assessments in order to familiarise themselves with the exercise and assessment instructions, before they can assess the real submissions.
1.4.1 Programming Exercises
Programming exercises have multiple special options to adjust their behaviour:
You can check the option to Allow Manual Assessment.
Note
If you do not set this flag, your assessors will not be able to manually assess the students’ submissions during the assessment process.
You can activate Run Tests once after Due Date. This will compile and run the test suite on all the student submissions once after the set date.
After you add a programming exercise you can configure the grading via .
In the Configure Grading screen, you can tweak the weight of the tests, the bonus multiplier, and add bonus points.
You can hide tests so that they are not executed during the exam conduction. Students cannot receive feedback from hidden tests during the exam conduction.
Note
If you hide all tests, the students will only be able to see whether their submission compiles during the conduction. Set Run Tests once after Due Date to a date after the exam end date to achieve this effect.
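To illustrate how test weights, bonus multipliers, and bonus points could interact, here is a hedged sketch; the exact formula Artemis uses may differ, and the data layout is invented for this example:

```python
# Illustrative sketch only: a passed test contributes its share of the weight,
# scaled by its bonus multiplier, plus any flat bonus points.
def grade_submission(tests, max_points):
    """tests: list of dicts with keys passed, weight, multiplier, bonus."""
    total_weight = sum(t["weight"] for t in tests)
    points = 0.0
    for t in tests:
        if t["passed"]:
            points += t["weight"] / total_weight * max_points * t["multiplier"]
            points += t["bonus"]
    return min(points, max_points)  # assumption: scores are capped at full marks

tests = [
    {"passed": True,  "weight": 2, "multiplier": 1.0, "bonus": 0},
    {"passed": True,  "weight": 1, "multiplier": 2.0, "bonus": 1},
    {"passed": False, "weight": 1, "multiplier": 1.0, "bonus": 0},
]
print(grade_submission(tests, max_points=8.0))  # 8.0 (9.0 uncapped, capped at max)
```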
To register students to the exam, navigate from the exam management to the Students page. Artemis offers two options to register students. You can:
Add students manually by searching via the search bar.
Bulk import students using a CSV file. You can do this by pressing the button.
Register every student in the course. You can do this by pressing the button.
Note
Just registering the students to the exam will not allow them to participate in the exam. First, individual student exams must be generated.
You can also remove students from the exam. When you do so, you have the option to also delete their participations and submissions linked to the user’s student exam.
A student exam represents the exam of an individual student. It consists of an individual set of exercises based on the configured exercise groups.
Student exams are managed via the Student Exams page.
Here you can have an overview of all student exams. When you press View on a student exam, you can view the details of the student, the allocated working time, their participation status, their summary, as well as their scores. Additionally, you will also be able to view which assessor is responsible for each exercise.
Note
You can change the individual working time of students from here. The screenshot Individual Working Time below shows where you can do that.
To generate student exams you must click on . This will trigger Artemis to create a student exam for every registered user.
This button will be locked once the exam becomes visible to the students. You cannot perform changes to student exams once the exam conduction has started.
If you have added more students recently, you can choose to .
This step creates a participation for each exercise for every registered user, based on their assigned exercises. It also creates the individual repositories and build plans for programming exercises. This action can take a while if there are many registered students, due to the communication between the version control (VC) and continuous integration (CI) servers.
On the Student Exams page, you can also maintain the repositories of the student exams. This functionality only affects programming exercises. You can choose to lock or unlock all student repositories.
Note
Artemis locks and unlocks the student repositories automatically based on the individual exam start and end date. These buttons are typically not necessary unless something went wrong.
Test runs are designed to offer the instructors confidence that the exam conduction will run smoothly. They allow you to experience the exam from the student’s perspective. A test run is distinct from a student exam and is not taken into consideration during the calculation of the exam scores.
You can manage your test runs from the Test Run page.
To create a new test run you can press . This will open a popup where you can select an exercise for each exercise group. You can also set the working time. A test run will have as many exercises as there are exercise groups; it does not consider the Number of Exercises set in the exam configuration.
Note
Exercise groups with no exercises are ignored.
Create test run popup with one exercise variant selected for each exercise group.
When you start the test run, you conduct the exam similar to how a student would. You can create submissions for the different exercises and end the test run.
An instructor can also assess their own test run submissions. To do this, you must have completed at least one test run. To navigate to the assessment screen of the test runs, click .
Test run conduction marked with the banner on the top left.
Note
Only the creator of the test run is able to assess its submissions.
You can view the results of the assessment of the test run by clicking on . This page simulates the Student Exam Summary where the students can view their submissions and the results once they are published.
Here, instructors can also use the complaint feature and respond to complaints to conclude the full exam timeline.
Note
You should delete test runs before the actual exam conduction takes place.
After you create an exam, the exam checklist appears at the top of the exam’s detail page.
The exam checklist helps you oversee and ensure every step of the exam is executed correctly.
You can track the progress of the steps mentioned in this document and spot missed steps easily.
Each row of the checklist includes the name of the task, a description and short summary where applicable, and a page column that navigates instructors to the relevant action.
Going through each task from the start up to the current one, and making sure the description column contains no warnings or errors, helps instructors conduct the exam smoothly.
The exam conduction starts when the exam becomes visible to the students and ends when the latest working time is over. Once the exam conduction begins, you can no longer change the exam configuration or the individual student exams. When the conduction starts, the students can access and start their exam. They can submit their solutions to the exercises within the given individual working time. Once a student submits the exam, they can no longer change their exercise submissions. For more information, see participating in the online exam.
The assessment begins as soon as the latest student exam working time is over.
During this period, your team can assess the submissions of the students and provide results.
Artemis executes the test suites for programming exercises automatically and grades these.
You can enhance the automatic grading with a manual review.
You can also trigger the automatic grading of the quiz exercises via the Manage Student Exams Screen.
If you want you can also enable the second correction feature for the exam.
Once the exam conduction is over and the latest individual working time has passed, your team can begin the assessment process.
This is done through the Assessment Dashboard.
Note
If the exam conduction is not over, you will not be able to access this page.
The assessment process is anonymized: Artemis hides personal student data from the assessors.
The Assessment Dashboard provides an overview over the current assessment progress per exercise. For each exercise, you can view how many submissions have already been assessed and how many are still left. The status of the student complaints is also displayed here.
Additionally, once the exam conduction ends, you can click on . This action will evaluate all student exam submissions for all quiz exercises and assign an automatic result.
Note
If you do not press this button, the students’ quiz exercises will not be graded.
After the exam conduction ends, you can click on . This action automatically grades all submissions of unsubmitted student exams with 0 points. Additionally, empty submissions are automatically graded with 0 points.
Note
If you do not press this button, the unsubmitted student submissions and the empty submissions will appear in the assessment dashboard of the exam, which leads to unnecessary effort during grading.
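The zero-point rule described above can be sketched as follows (the data layout is invented for illustration):

```python
# Sketch of the rule above: unsubmitted exams and empty submissions receive
# 0 points automatically, so they never reach the manual assessment queue.
def auto_evaluate(student_exams):
    """student_exams: list of dicts with keys submitted and submissions."""
    results = {}
    for exam in student_exams:
        for sub in exam["submissions"]:
            if not exam["submitted"] or not sub["content"]:
                results[sub["id"]] = 0  # graded automatically with 0 points
    return results

exams = [
    {"submitted": False, "submissions": [{"id": 1, "content": "partial work"}]},
    {"submitted": True,  "submissions": [{"id": 2, "content": ""},
                                         {"id": 3, "content": "real answer"}]},
]
print(auto_evaluate(exams))  # {1: 0, 2: 0}; submission 3 awaits manual assessment
```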
To assess a submission for an exercise, you can click on .
Your assessors must first complete the example submissions and assessments if you have attached those to the exercise; see 1.4 Add Exercises.
If there is a submission which has not been assessed yet, you can click . This will fetch a random student submission of this exercise which you can then assess.
Artemis grades programming exercises automatically. However, if the exercise allows a manual assessment, you can review and enhance the automatic results.
You can trigger Artemis to automatically grade quiz exercises via the Manage Student Exams Screen. Therefore, quiz exercises do not appear in the Assessment Dashboard.
Set the number of correction rounds of the exam to 2.
When the second correction is enabled, the assessment progress can be observed in the Assessment Dashboard.
There you can see the state of the individual correction rounds, and the state of the complaints.
You can toggle whether tutors can assess specific exercises in the second round. Disabling the second correction again does not affect already created second assessments.
Correction in the second round can be enabled/disabled anytime.
To assess a submission a second time, go to the exercise assessment dashboard. When the second correction is enabled, a button will be visible in the second correction round.
The new second assessment starts with all feedback copied from the first assessment. You can override these and add new feedback. This does not override the original result but saves a separate second result.
Within the second correction round review, instructors and tutors can see which feedback was created in which correction round. This is displayed as a badge at the bottom of every feedback item. The view can be enabled or disabled at any time during the second correction round review by pressing the button at the top of the page. The feature is currently available for text, modeling, and file upload exercises.
You can access each assessment of both rounds by going to the exam’s ->
Artemis also allows you to detect plagiarism attempts.
Artemis does this by analyzing the similarities between all student submissions and flagging those that exceed a given threshold. You can compare all flagged submissions side by side and confirm plagiarism attempts.
Instructors can download a CSV report of accepted and rejected plagiarism attempts for further processing on external systems.
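A naive version of such a similarity check can be sketched with Python’s difflib. Artemis uses dedicated detection tooling internally, so this is only an illustration of the flag-above-threshold idea:

```python
from difflib import SequenceMatcher
from itertools import combinations

# Illustrative sketch: compare every pair of submissions and flag pairs whose
# textual similarity exceeds the configured threshold.
def flag_similar_submissions(submissions, threshold=0.8):
    """submissions: dict mapping student id -> submission text."""
    flagged = []
    for (a, text_a), (b, text_b) in combinations(submissions.items(), 2):
        similarity = SequenceMatcher(None, text_a, text_b).ratio()
        if similarity >= threshold:
            flagged.append((a, b, round(similarity, 2)))
    return flagged

subs = {"s1": "int sum(int a, int b) { return a + b; }",
        "s2": "int sum(int x, int y) { return x + y; }",
        "s3": "print('hello world')"}
flagged = flag_similar_submissions(subs)
print(flagged)  # only the near-identical pair (s1, s2) is flagged
```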
To apply the plagiarism check, you must navigate to the individual exercise. This can be done by navigating to:
-> -> exercise-title
Detecting Plagiarism attempts on Modeling Exercises
At the bottom of the page you will find the option .
Optionally, you can create a grading key for your exam by clicking at the top of the exam’s detail page.
Defining a grading key allows Artemis to convert the exam score to a grade automatically; students are then able to see their own grades after the specified Release Date of Results.
Using a grading key also enhances the generated statistics so that the instructor is able to view grade distributions.
For an easy out-of-the-box configuration, you can click and then click Save.
By default, grades are defined as percentages of the total obtainable score. You can also display their point equivalent if you specify Maximum number of points for exam.
If you would like to define custom grade steps, you can use the button and modify the grade step intervals.
Note
Keep an eye out for the warnings at the bottom of the page to ensure that the grading key is valid.
The Inclusivity field allows you to decide which grade is assigned when a student’s score is exactly equal to a boundary value between two grades.
There are two grade types you can use: Grade and Bonus. The Grade type allows you to set a final grade for the exam with custom grade step names, while the Bonus type allows you to assign bonus points to each grade step so they can contribute to the grade of another course or exam.
Note
If the Grade Type is Grade, you should set the First Passing Grade.
For more fine-grained control, you can switch to Detailed editing mode and set grade step bounds manually.
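The boundary behavior controlled by the Inclusivity field can be illustrated with a small sketch; the grade step bounds and names here are invented:

```python
# Invented grade steps: (lower_bound_percent, upper_bound_percent, grade_name).
GRADE_STEPS = [
    (0, 40, "5.0"), (40, 55, "4.0"), (55, 70, "3.0"),
    (70, 85, "2.0"), (85, 100, "1.0"),
]

def grade_for(score_percent, lower_bound_inclusive=True):
    """Map a score to a grade, honoring the Inclusivity setting on boundaries."""
    for low, high, name in GRADE_STEPS:
        if lower_bound_inclusive:
            # A boundary score belongs to the higher grade step.
            if low <= score_percent < high or score_percent == 100 == high:
                return name
        elif low < score_percent <= high or score_percent == 0 == low:
            # A boundary score belongs to the lower grade step.
            return name
    raise ValueError("score outside all grade steps")

print(grade_for(55))                              # '3.0'
print(grade_for(55, lower_bound_inclusive=False)) # '4.0'
```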
The export and import buttons enable you to save the grading key as a CSV file and re-use it in other courses and exams.
You can specify the moment when Artemis publishes the results of the exam, see 1.2.2 Create Exam. This is usually when the exam assessment ends, but you can set it to any point in time. Once the results are published, the students can view their results on their summary page. You can also view the exam statistics on the exam Scores page and export the data as a CSV file for external platforms such as TUM Online, see 4.1 Exam Scores.
You can access the exam scores by clicking on . This view aggregates the results of the students and combines them to provide an overview over the students’ performance.
You can view the spread between different achieved scores, the average results per exercise, as well as the individual students’ results.
Additionally, you can choose to modify the dataset by selecting only include submitted exams or only include exercises with at least one non-empty submission.
Note
Unsubmitted exams are not eligible for the assessment process.
Review student performance using various metrics such as average, median and standard deviation.
Unsubmitted exams are not eligible for assessment and therefore appear as having no score. If a grading key exists, the corresponding students are assigned a special no-participation grade. It can also happen that an exercise is not part of any student exam; this occurs when Artemis selects a different exercise from the same exercise group for every student exam. Like unsubmitted exams, such exercises can skew the results and statistics of the exam. By excluding unsubmitted exams and exercises that were not part of the exam conduction, you can gain a more realistic overview of the students’ performance.
Review the students’ perceived difficulty of every exercise to improve future exams.
The exam scores can also be exported via . This is useful for uploading the results as a CSV file into university systems like TUM Online.
The exported CSV file includes the students’ name, username, email, registration number, their assigned exercises, their score for every exercise, overall exam points, overall exam score, grades (before bonus, if a bonus is configured), presentation score, submitted (yes/no), and passed (yes/no) values.
If a bonus is configured, the file also contains bonus grades and final grade.
If there is at least one plagiarism verdict in the exam, the file also contains plagiarism verdicts.
If there is at least one plagiarism verdict in the bonus source, the file also contains plagiarism verdicts in bonus course/exam.
The exported CSV file also contains the aggregated statistics of the exam conduction, such as the number of participations and the average score per exercise.
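Such an export could be produced with standard CSV tooling. The column names below are approximations of the fields listed above, not the exact headers Artemis emits:

```python
import csv
import io

# Sketch of a scores export with (approximated) columns from the list above.
def export_scores(rows):
    fields = ["name", "username", "email", "registration number",
              "overall exam points", "overall exam score", "grade",
              "submitted", "passed"]
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=fields)
    writer.writeheader()
    writer.writerows(rows)
    return buffer.getvalue()

csv_text = export_scores([{
    "name": "Ada Lovelace", "username": "ada42", "email": "ada@example.com",
    "registration number": "01234567", "overall exam points": 73.5,
    "overall exam score": "81.7%", "grade": "2.0",
    "submitted": "yes", "passed": "yes",
}])
print(csv_text)
```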
Optionally, you can publish the example solutions of text, modeling, file upload, and programming exercises to students with submissions after a desired date by setting the Example Solution Publication Date of the exam to a non-empty date.
All example solutions of these exercises are published according to this date set in the exam, as opposed to the course exercises which have their own individual example solution publication dates.
The example solution publication date can be empty; in this case, solutions are never published. This is the default value.
If set, the example solution publication date must be the same as or after Visible From and End of Working Time, if those are set.
During the review period, students have the opportunity to review the assessment of their exam. If they find inconsistencies, they can submit complaints about perceived mistakes made in the assessment. Students can provide their reasoning through a text message to clarify their objections. You can set the student review period in the exam configuration, see 1.2.2 Create Exam.
Students can submit complaints about their assessment in the Summary page.
During the student review, a complaint button will appear for every manually assessed exercise.
Students cannot submit complaints for automatically assessed exercises like quiz and programming exercises.
Students will be able to submit a complaint for programming exercises if the automatic result has been reviewed manually by an assessor. This is only possible if manual assessment is enabled for the programming exercise.
Note
If you have found a mistake in the automatic assessment of quiz and programming exercises, you can edit those and re-trigger the evaluation for all participants.
For more information on how students can participate in the student review and submit complaints, see student summary guide.
Artemis collects the complaints submitted by the students during the student review. You can access and review the complaints similar to the submissions from the Assessment Dashboard. Every assessor can evaluate a complaint about the assessment of their peers and either accept or reject the complaint. Artemis will automatically update the results of accepted complaints. You can view the updated scores immediately in the Scores page. There you can also export the updated data in CSV format, see 4.1 Exam Scores.
The complaints appear below the exercise submissions.
The original assessor of an assessment cannot respond to the complaint. A second assessor must review the complaint and respond to it.
Artemis tracks the progress of the complaint assessment and displays a progress bar in the Assessment Dashboard. This allows you to keep track of the complaint assessment and see how many open complaints are left.
Artemis provides several ways to enable the exam live statistics dashboard. Instructors can enable it on the exam creation page.
Checkbox to enable the exam live statistics during the exam configuration
In the exam live statistics dashboard, instructors can see the current status of this feature. In addition, they can then deactivate or activate it.
The exam live statistics status indicates that the exam live statistics is enabled
The exam live statistics status indicates that the exam live statistics is disabled
Admins can enable and disable this feature via the global feature toggle.
There are two ways to access the exam live statistics dashboard. First, the instructor can directly enter the dashboard’s corresponding URL. Second, they can navigate to the exam checklist and find the button there as well.
The exam conduction section of the exam checklist contains a button to navigate to the exam live statistics dashboard
The exam live statistics header displays the essential information about the exam. It includes the title of the exam, its start and end date, and the registered participants. Additionally, it presents the number of planned exercises and the exercise groups.
Below the short description of the dashboard, there are three sections. These sections allow the instructor to see the collected information in multiple variants without reloading or switching the page.
The header of the exam live statistics dashboard contains the exam title, the start and end date, the registered students, the number of exercises, and exercise groups. In addition, it describes the functionality of the dashboard.
This section provides a user-friendly overview of the exam live statistics dashboard. Instructors can summarize essential information in the overview section. In particular, it provides all the information on one page without the user having to switch between multiple pages. The overview contains the following main components:
The header of the box contains the title of the respective subsection. In addition, there is a short description of the exercise details that instructors can see. To get more details, instructors can click on the right arrow and navigate to the exercises section, which provides detailed statistics about the current progress of each exercise. The box contains multiple charts:
The exercise section is the first component of the overview section, which provides additional statistics about the current progress per exercise.
In the top left (A), the chart shows the current number of students grouped by exercise group. In addition, in the top right (B), a bar chart shows the current number of students per exercise. These numbers do not include students who are on the exam overview page or who are just about to hand in their exam.
At the bottom left (C), the diagram shows the number of initial navigations per exercise, providing the possibility for instructors to track how many students have viewed each exercise. Last, at the bottom right (D), instructors can see the number of first submissions per exercise, providing the option for instructors to get details on the individual progress per exercise.
The header of the box contains the title of the activity log section. In addition, there is a short description of the activities that instructors can see. Similar to the previous section, instructors can click on the arrow to get more details and navigate to the activity log section, which provides detailed statistics about the performed actions and the current progress of each student. The box also contains multiple charts:
The activity log section is the second component of the overview section, which provides additional details about the performed actions.
At the top left (A), the chart shows the current number of actions performed by all students during the exam. It provides an overview of how many interactions are performed during the exam. For example, whether there are more interactions at the beginning or more at the end. Additionally, on the top right (B), the bar chart shows the current average number of performed actions. Since students have different habits, and some switch or hand in assignments more often than others, it is interesting to see how they compare to the average.
Finally, at the bottom (C), the diagram shows the number of initial actions performed per category, which allows instructors to track precisely when and how many actions students performed. This makes it possible to observe, for example, how many students start the exam on time or delayed, or whether students often visit the hand-in-early page during the exam but then decide not to hand in.
Similar to the overview page, the exercise page contains the four previously mentioned diagrams. In addition to the charts, this page also contains a table that provides details for each exercise.
The table shows the exercise id, the corresponding group id, and the title of the exercise per column. In addition, it highlights the respective type with an icon. In order to provide important information in the most space-saving way, each column is clickable. Additionally, instructors can unfold the column and receive detailed information on the current progress per exercise via three charts.
Like the general exercise overview, instructors can see the respective initial navigations and first submissions per student. To follow the current activity per exercise, instructors can also see which students are currently at each exercise.
The table of exercises shows the first navigations, first submissions, and current participants per exercise.
Similar to the overview page, the activity log page contains the three previously mentioned diagrams. In addition to the charts, this page also contains a table that provides a log of the performed actions. The table shows the student exam id, the timestamp, and the action category per column. In addition, it provides different badges based on the category.
The table of actions shows the first timestamp, type, and additional details per action. Depending on the type of action, we display different details.
Since each start or restart of the exam creates a unique session, we show the assigned session id (A). This information is only available for the started exam actions. Furthermore, users can switch through exercises or to the exam overview page. For each switched exercise action, we show either the badge containing the exercise id or nothing, which means that the user switched to the overview page (B).
Each time a student saves the current exercise state, we display the associated exercise id and submission id. Depending on the details of the performed action, we show the Automatically badge. If the user or the system forces a save, we additionally indicate whether the save was successful (C). In some scenarios, the server may be unreachable during the exam, or the student may lose their internet connection. The connection updated actions contain the current connection status, which we display with two badges (D).
The activity log table contains details per action. We display different action detail badges containing relevant information depending on the received action.
The Orion plugin for IntelliJ supports students, teaching assistants (tutors), and instructors with the conduction of programming exercises. It integrates Artemis into IntelliJ and automates the download (clone) and upload (push) of exercises.
Orion is installed like every other plugin via IntelliJ’s marketplace. Alternatively, although usually not needed, it is possible to install builds directly from Orion’s GitHub repository. The installation process is described in the readme.
After installation, Orion provides a custom tool window at View -> Tool Windows -> Artemis. This window contains an integrated browser displaying Artemis.
At the top of the tool window are the following buttons:
A help button , which opens this documentation page.
A back button . Clicking it returns to the initially opened page, that is:
The exercise details page if an exercise is opened as a student.
The exercise edit in editor page if an exercise is opened as an instructor.
The assessment dashboard if an exercise is opened as a tutor, or, if a submission is downloaded, the assessment details page of that submission.
The Artemis home page if no Artemis project is opened.
If you opened this page in Orion, use the back button to return to Artemis.
Orion’s settings are at Settings -> Tools -> Orion. The settings include:
Artemis base url: Can be changed to switch to a specific Artemis instance. Defaults to https://artemis.cit.tum.de. Important: The url must not end with a /, otherwise it does not work!
Artemis exercise paths: Orion suggests storing newly downloaded exercises at default-path/course/exercise-name, with the default path depending on this setting.
Default commit message: The default message for each commit.
Change user agent: The user agent is sent to Artemis to identify Orion. Usually, no changes are required.
Restart the integrated browser: If Orion gets stuck, restarting the browser might solve the issue without restarting IntelliJ.
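The trailing-slash rule for the base URL setting can be enforced with a tiny helper. This is only an illustration, not part of Orion:

```python
# Minimal sketch of the base-URL rule: the value must carry a scheme and
# must not end with a '/', otherwise the setting does not work.
def normalize_base_url(url):
    """Strip any trailing '/' so the Orion base URL setting stays valid."""
    if not url.startswith(("http://", "https://")):
        raise ValueError("base URL must include the scheme")
    return url.rstrip("/")

print(normalize_base_url("https://artemis.cit.tum.de/"))  # https://artemis.cit.tum.de
```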
Instructors can set up programming exercises via Orion by performing the following steps:
The exercise needs to be created as described at the exercise creation of programming exercises, step 1 and 2.
After the creation, navigate to the instructor exercise overview using the integrated browser.
Each programming exercise provides a button to edit the exercise in Orion . The button is rightmost in the table and might require scrolling. Clicking it downloads the template, solution and test repository of the exercise.
Edit the repository files in IntelliJ.
To submit the changes, click . This commits and pushes all local changes to their respective repository.
The integrated browser displays the editor to update the problem statement.
To test the code locally, click , which copies the tests with the local template or solution (whichever was selected) into a new folder and executes them locally.
Tutors can assess programming exercises via Orion by performing the following steps:
Navigate to the assessment dashboard of the exercise using the integrated browser.
Click to automatically set up the assessment project.
After downloading or opening the project in IntelliJ, the submission overview is shown in the integrated browser. Each submission can be opened in Orion. To start a new submission, click . This downloads the submission files and overwrites the previous submission.
The student’s code is located in the directories assignment and studentSubmission (assignment contains the files that can be edited, studentSubmission contains an uneditable copy that can be assessed). The tests are in the directory tests.
Opening a file in either assignment or studentSubmission opens the editor with two available modes that can be switched using the tabs at the bottom of the editor.
In edit mode (“Text” tab), the files can be edited regularly, e.g. to try out fixes.
In assessment mode (“Assessment” tab), the student’s submission without the local changes is displayed in read-only mode. In this mode, assessment comments can be added, similar to the assessment in Artemis. Click the plus on the gutter on the left of the editor to add a new comment.
The integrated browser displays the problem statement, the assessment instructions, and the buttons to edit the general feedback.
Artemis enables students, tutors, and instructors to actively participate with its communication capabilities.
Various communication features allow students to engage with peers and ask all kinds of questions whereas moderators
(e.g., instructors and tutors) can provide general course information and answer content-related questions.
Communication can take place in different contexts, namely lectures, exercises, or the course itself. Course participants can also
message each other to communicate in private. Below, you can find more information on specific features and how to use them.
Artemis courses enable all communication features by default.
If you do not want to provide users with these features, you can disable them during course creation by unchecking the
respective checkbox (Enable communication features); this setting can also be changed afterwards.
Besides lecture- or exercise-related questions, Artemis offers a third post type: posts with course-wide topics, such as
Organization or TechSupport. These posts can only be created in the course communication overview, which is shown
in the screenshot below.
The Communication space of an Artemis course serves as overview for all posts in a course.
Hence, course-wide posts as well as exercise posts and lecture posts are listed.
Here, users can easily query, sort, and filter existing posts.
Users of a course can communicate in private via the Messages page (see image below). The page consists of a collapsible
Conversation sidebar on the left, where users can search for other participants of the current course and start a conversation
with them.
If the recipient is browsing another conversation when they receive a new message, an envelope icon appears in their
Conversation sidebar, next to the affiliated user who has sent the message. This way, users become aware of the new message
within that discussion.
Tutors and instructors have more restricted permissions on the Messages page compared to the course communication
overview. Messages in a conversation are exclusive to its participants and can only be edited or deleted by their respective
author.
Messages do not have titles. Users can continue a discussion around a specific message by clicking the message’s
“Reply in thread” button, which opens the Thread sidebar (see image below). The Thread sidebar is a collapsible sidebar
located on the right-hand side of the Messages page.
To build trust between users utilizing the system’s communication features, we prepend an icon to the author’s name in the
headers of posts and their replies. The icon we introduce differs according to the role of the author within the course
and reveals their highest authoritative role. Via tooltips that are shown as users hover over these icons (see images below),
the system displays a brief explanation of that specific role. This way, the system builds trust in the author, and readers
can tangibly confirm the origin of the shared information.
To foster interaction between users, we integrate the well-known emoji reaction bar.
Each user in the course can react to any post by using the emoji selection button.
The + emoji serves as the up-voting reaction, which influences the display order of posts.
Users can reference different course entities within their posts, such as other posts, course exercises, course lectures,
and attachments of these lectures. All references are prepended with icons unique to the reference’s type,
to help users distinguish them conveniently. In the image below, we see all possible types of references that can be created
in an Artemis post.
If users want to refer to other posts, they can use a simple pattern: the hashtag (#) combined with
the post identifier. A post’s identifier is appended to the post title (as seen in the screenshots above).
When clicking a post reference used in a post’s text, the user is navigated to the referenced post.
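For illustration, referenced post identifiers could be extracted from a post's text with a pattern like the following. This is a minimal sketch; the exact markup Artemis generates for references may differ, and the function name is hypothetical.

```python
import re

# Hypothetical pattern: '#' followed by a numeric post identifier.
POST_REF = re.compile(r"#(\d+)")

def referenced_posts(text):
    """Return the identifiers of all posts referenced via the #<id> pattern."""
    return [int(post_id) for post_id in POST_REF.findall(text)]

print(referenced_posts("See #12 and the follow-up in #345."))  # [12, 345]
```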
Users can refer to exercises of the current course, via the dropdown menu Exercise available on the posting markdown
editor (see image above). The following exercise types are prepended with unique icons to help distinguish the type of the
exercise being referenced.
Users can refer to lectures of the current course, via the dropdown menu Lecture available on the posting markdown
editor (see image above). Here, lecture attachments can be found in a nested structure.
To prevent duplicate questions from being posted, Artemis runs a duplication check during post creation.
We strongly recommend that users creating a post check the automatically provided list of similar posts to find out whether
their question has already been asked and, in the best case, resolved.
Marking a post as resolved will indicate to other users that the posted question is resolved and does not need any further input.
This can be done by clicking the check mark next to the answer post (see image below).
Note that only the author of the post and moderators can perform this action.
This is helpful for moderators to identify open questions, e.g., by applying the corresponding filter in the course overview.
It also highlights the correct answer for other students who have a similar problem and are searching for a suitable solution.
When creating a post, users can choose to add arbitrary tags.
Tagging a post further narrows down its purpose or content into precise, descriptive keywords that might follow a course-specific taxonomy.
Tutors can change the context (lecture, exercise, course-wide topic) in the edit mode of the post.
By changing the context, for example from a certain exercise to a course-wide topic, the post is automatically moved.
In the example at hand, the post will no longer be shown on the corresponding exercise page, but rather only in the course-wide
communication overview, associated with that course-wide topic.
By clicking the pushpin icon next to the reaction button of a post, a moderator can pin the post.
As a consequence, the post is displayed at the top of any post list to receive higher attention.
As a complement to pinning, i.e., highlighting posts, a moderator can archive posts and thereby put them at the bottom of a post list.
This can be achieved by clicking the folder icon next to the reaction button.
Moderators should be aware that this reduces the visibility of the post.
Instructors can create course-wide posts that serve as Announcements.
They target every course participant and have higher relevance than normal posts.
Announcements can be created in the course communication overview by selecting the topic Announcement.
As soon as the announcement is created, all participants who have not opted out of notifications will receive an email containing the announcement’s content.
Additionally, announcements visually differ from normal posts and are always displayed on top of the communication overview.
Artemis facilitates the coordination of tutorial groups in a course. Tutorial groups are a learning strategy where students teach and learn from each other in small groups (20-30). In this strategy, proficient students act as tutors and lead the groups. The tutor and the group members usually meet weekly either on campus or online. Students present their solutions to homework assignments or other tasks and receive feedback and suggestions from the tutor and their peers.
Note
The tutorial group overview page (used by students) displays dates in the student’s current time zone. The tutorial group management page (used by instructors and tutors to manage groups and their sessions) displays dates in the course time zone. This is helpful if the instructors or tutors travel and work across multiple time zones. A header at the top of the page shows the current time zone used for dates.
Before the tutorial group feature can be used, three configurations need to be set up:
Time zone information: This ensures that the tutorial group meeting times are displayed correctly for each student and instructor.
Default tutorial group period: This is the semester period when the groups usually meet. It is used to prefill the meeting period when creating a new tutorial group. The tutorial group period can be changed later on for each group individually.
Artemis managed tutorial group channels: This option allows Artemis to create and manage a dedicated channel for each tutorial group in the ‘Messages’ section of the course. This feature is only selectable if the course has the Messaging feature enabled in the course settings. If activated, tutorial group channels can still be managed manually but Artemis automatically performs some common tasks, such as:
Adding and removing students from the channel when they register or unregister for the tutorial group
Making the assigned tutor a moderator of the channel
Deleting the channel when the tutorial group is deleted
If not all of these configurations are done, the instructor will see a checklist page with the missing configurations. These configurations can also be changed later in the Global Configuration section of the tutorial group page. A link to this section is hidden behind the More... button.
Tutorial groups can be created manually or by importing a CSV file. Importing a CSV file is a convenient option if the tutorial groups and student assignments already exist in a campus management system (e.g. TUM-Online). This way, both the groups and the assignments (student to tutorial group) can be created at once.
The assigned tutor and the session schedule are the most important settings of a tutorial group. The tutor holds the sessions, tracks the number of attending students, and gives feedback to the students. The tutor can also register or unregister students and edit the sessions by cancelling or rescheduling them. The meeting schedule shows the regular times of the sessions during the semester. It is used to create the individual sessions automatically.
By clicking on the Holidays button, the instructor can define course-wide days on which no tutorial group sessions are held (no matter which tutorial group). If such a day overlaps with a tutorial group session, the session is automatically cancelled and the holiday is given as the reason.
Assigned tutors can manage their tutorial groups by navigating to the course’s Tutorial Groups page in the course administration section. The tutor can view the group’s details, register or unregister students, and edit the sessions by cancelling or rescheduling them. The tutor also has moderation rights in the tutorial group’s channel in the Messages section of the course if the Artemis managed tutorial group channels feature is enabled in the tutorial group settings.
The groups for which the tutor is responsible have a blue background.
Note
The instructor can perform the same actions as the tutor for all tutorial groups in the course. The tutor can only manage the tutorial groups that they are assigned to.
By clicking on the Registered Students button, the tutor can view the list of students who are registered for the tutorial group. From this list, the tutor can register or unregister students. Only users who are enrolled in the course can be registered for a tutorial group.
Note
An instructor can also import student registrations from a CSV file and export the list of registered students as a CSV file. Extra buttons for these actions are available on the Registered Students page for instructors.
By clicking on the Sessions button, the tutor can view the list of sessions that are scheduled for the tutorial group. From this list, the tutor can cancel or reschedule sessions. The tutor can also create new sessions by clicking on the Create New Tutorial Group Session button. Furthermore, the tutor can enter the number of attending students for each session. This information is used to calculate the utilization of the tutorial group.
Note
The utilization of a tutorial group is the average attendance divided by the capacity (if defined). The average attendance considers the last three sessions. If no attendance has been entered for a session, it is ignored and the calculation is performed with the remaining one or two sessions.
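The utilization calculation described above can be sketched as follows (function and parameter names are illustrative, not part of Artemis):

```python
def utilization(attendances, capacity):
    """Utilization of a tutorial group: average attendance of the last
    three sessions divided by the capacity (if one is defined).

    `attendances` holds attendance counts in session order; sessions
    without a recorded attendance are passed as None and ignored.
    """
    recent = [a for a in attendances[-3:] if a is not None]
    if not recent or capacity is None:
        return None  # undefined without attendance data or a capacity
    return (sum(recent) / len(recent)) / capacity

# Two of the last three sessions have recorded attendance; capacity is 25.
print(utilization([18, None, 20], 25))  # 0.76
```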
Students can access their tutorial groups for a specific course by navigating to the course’s Tutorial Groups page.
The view is split into two tabs that can be changed via the toolbar. In the Show my tutorial groups tab, students can see the groups for which they are registered. Each group is displayed as a card with the date of the next session in the center. At the bottom of the card, a link to the tutorial group’s communication channel is shown (if this feature is activated). On the right side of the page, all course-wide holidays on which no tutorial groups meet are displayed.
In the Show All Tutorial Groups tab, students can see the main information for all tutorial groups of the course, including the name, utilization, and responsible tutor. This page is useful if students want to switch to a less crowded group: they can pick one with low utilization and contact its tutor to get registered.
To see more details about a tutorial group (including an overview of all sessions), the student can click on the name of the tutorial group. This opens the detail page of the respective group. The session table is shown at the bottom of the page. By default, only the top row, which shows the next session, is displayed. The student can expand the whole table by clicking on the Show All Sessions button at the bottom.
Artemis allows tutors and exercise instructors to check assignment submissions from students for plagiarism.
With this feature, different types of assignments can be checked in Artemis, including programming assignments, modeling assignments, and text assignments.
Plagiarism checks are available for both course and exam exercises.
To perform the plagiarism check, the responsible tutors must initiate the checking process for a specific exercise.
First, we give an overview of different available features of the plagiarism check.
Next, we explain the plagiarism check workflows from the perspective of various Artemis users, using UML Activity Diagrams to visualize the process.
In this section, we give an overview of the available plagiarism check features. We explain the different configuration settings and possible user actions, and present the plagiarism result views, which become accessible after a plagiarism check has run.
Before starting the plagiarism check, the user can configure different settings to get the best possible results.
Similarity Threshold in % (minimum value 0, maximum value 100).
Ignore comparisons whose similarity is below this threshold.
A similarity of 0% means that there is no overlap between two submissions, a similarity of 50% means that about
half of the two submissions is identical, and a similarity of 100% means that two submissions are completely identical.
Minimum Score in % (minimum value 0, maximum value 100).
Consider only submissions with a score greater than or equal to this value.
Minimum Size.
Default: Consider only submissions whose size is greater than or equal to this value.
Modeling exercises: Consider only submissions that have at least as many modeling elements as the specified value.
Text exercises: Consider only submissions that have at least as many words as the specified value.
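The three settings act as filters on pairwise comparisons. The following is a minimal sketch under a simplified data model; all function and parameter names are hypothetical, not the Artemis implementation:

```python
def keep_comparison(similarity, score_a, score_b, size_a, size_b,
                    threshold=50, min_score=0, min_size=0):
    """Return True if a submission pair survives all three filters."""
    if similarity < threshold:              # Similarity Threshold in %
        return False
    if min(score_a, score_b) < min_score:   # Minimum Score in %
        return False
    if min(size_a, size_b) < min_size:      # Minimum Size (elements/words)
        return False
    return True

print(keep_comparison(72, 80, 90, 300, 250))  # True
print(keep_comparison(30, 80, 90, 300, 250))  # False: below the threshold
```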
The user can use different actions to run a plagiarism check, inspect the results or improve the performance.
Note
Plagiarism detection can take a long time for large courses and exams.
Detect plagiarism: The user can execute the plagiarism check using this action.
Rerun plagiarism detection: The user can rerun the plagiarism check by executing this action. This can be helpful to check for plagiarism with different settings, as described in the Settings section.
Clean up: The user can clean up the latest plagiarism check results for the selected exercise. This helps to keep the database slim and save storage capacity, and it should improve the overall system performance. To execute the clean-up action, the user must approve it by clicking OK in the dialog, as this action deletes all potentially approved or denied plagiarism cases.
Download: The user can download the plagiarism results in JSON or CSV format to open them in an editor of their choice for further analysis.
After the plagiarism check has been executed, the results can be inspected in different views.
Overview of the similarity distribution. This statistical overview shows the similarity distribution as a histogram based on the similarity percentage. The user can quickly analyze the distribution and adjust the plagiarism check settings accordingly.
Selected submission. When the user selects a submission from the submission list on the left side of the plagiarism result view, new actions become available to initiate the plagiarism check workflow. We provide further details on the workflow in the next section.
Tutors and instructors can execute plagiarism checks. They carefully review the automatically identified cases and make a first decision on whether to confirm or deny each found case.
We visualized the process in the following diagram.
Open the Exercise via:
For course exercises: Course Management → Course → Exercises → Your Exercise.
For exam exercises: Course Management → Course → Exams → Exam → Exercise Groups → Your Exercise.
Access the plagiarism tool as shown in the picture.
Run the Plagiarism Check.
Tutors and instructors can adjust the similarity threshold and other settings as described in the Settings section if deemed necessary (most of the time it is not; this depends on the formulation of the exercise and the number of possible solutions. A similarity of less than 50% typically means that students did not plagiarize).
Checking exercises with many students can take some time (sometimes multiple minutes) and might be resource-intensive. In such cases, we recommend running plagiarism checks at times when only a few users are actively using Artemis.
Start checking for plagiarism.
Review if the presented matches are actual plagiarism cases or not.
Depending on your decision, either deny or confirm the match as plagiarism.
Continue until the matches start to get “too different”.
Instructors can execute the same actions as tutors; additionally, they can make a final decision on a confirmed plagiarism case.
We visualized the process in the following diagram.
Open the plagiarism cases via:
For course exercises: Course Management → Course → Plagiarism Cases.
Review the current status of all confirmed plagiarism cases in the management overview. It provides the following information about each confirmed case:
In how many comparisons the confirmed case appears.
Whether the student has been notified.
Whether the student has responded.
The final verdict.
Export the confirmed cases as a CSV file to analyze them in another tool.
Select one confirmed case and navigate to the case detail view.
Notify the student about the potential plagiarism case and ask them for a statement.
Make a final verdict by selecting one of the three available options. The final verdict must be approved by clicking ‘Confirm’ in the dialog.
No plagiarism. The instructor can deem the accusation invalid and resolve the plagiarism case.
Warning. The instructor can write a warning message to the student and confirm the verdict in the dialog.
Point deduction in % (minimum value 0, maximum 100). Deduct exercise points and confirm the verdict in the dialog.
Plagiarism. If a grading key exists, the student receives the special plagiarism grade for the exam or course that the corresponding exercise belongs to. The next steps must be taken manually by contacting the responsible persons at the university to mark the student’s grade as “Unterschleif”.
After the student has been notified that the instructor requested a statement, they have one week to respond.
The process is visualized in the following diagram.
Open the confirmed plagiarism case from the notification via:
For course exercises: Course Overview → Course → Exercise → Plagiarism Case / Resolved Plagiarism Case.
For exam exercises: Course Overview → Course → Exams → Exam → Plagiarism Case / Resolved Plagiarism Case.
Artemis integrates different statistics for students to compare themselves to the course average.
It allows instructors to evaluate the average student performance based on exercises and competencies.
Note
To preserve the individual’s data privacy, Artemis calculates the data on the fly and only shows aggregated data to instructors so that it’s not possible to track an individual student within the learning analytics modules.
To give students a quick overview of their achieved points per exercise, a bar chart shows the student’s achieved points for every exercise, grouped by exercise type.
Competencies are overarching learning objectives that tie together various lectures and exercises.
In case competencies are defined, students can get an overview of their individual progress and confidence on the competencies tab.
The page lists all competencies with their title, description, and taxonomy.
Expanding the prerequisites section shows the student all competencies from previous courses the instructor has selected as a prerequisite for this course.
When clicking on a competency, a page opens and displays detailed statistics about the competency together with all linked lecture units and exercises.
The tripartite rings show the student’s advancement:
The blue ring describes the progress, the percentage of completed lecture units and exercises.
The green ring visualizes the confidence, the average score in all linked exercises in relation to the threshold required for mastering the competency (set by the instructor).
The red ring is a weighted metric of the student’s progress and confidence, which shows the overall advancement towards competency mastery.
A competency is considered mastered by a student when they have completed all linked learning objects (progress equals 100%) and have an adequate confidence level (average score greater than or equal to the mastery threshold).
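The mastery condition above can be expressed as a simple check (function and parameter names are illustrative, not part of Artemis):

```python
def is_mastered(progress, confidence, mastery_threshold):
    """A competency counts as mastered when all linked learning objects are
    completed (progress == 100%) and the average exercise score (confidence)
    reaches the instructor-defined mastery threshold.
    All values are percentages."""
    return progress >= 100 and confidence >= mastery_threshold

print(is_mastered(100, 85, 80))  # True
print(is_mastered(90, 95, 80))   # False: not all learning objects completed
```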
Artemis provides instructors with several different learning analytics, which are detailed in this section.
Key statistics such as the number of total assessments or the average student score are already displayed on the course’s main page.
More detailed as well as additional course statistics can be inspected by instructors when navigating to the course’s Statistics page.
On this page instructors can explore and evaluate all available course statistics such as the average points, number of submissions, number of active users, and much more.
All the statistics can be generated for different time frames.
Similar to the course statistics, instructors can get an overview of different learning metrics, such as the average score or the participation rate, for a specific exercise.
To get to this view, instructors can either click on one of the average score bars of the Course Statistics or click on the Statistics button that is displayed on each exercise overview page.
Competencies are overarching learning objectives that link together different course materials.
An instructor can view and edit all competencies of a course on the competency management page.
The table shows the title, description, and percentage of students that mastered the respective competency.
On a side note, instructors can also select competencies from previous courses they taught as prerequisites for this course.
Students can see these on the competencies page and during the course self-registration.
An instructor can create or edit competencies using the following form.
Besides a title and description, they can optionally set a taxonomy.
The mastery threshold describes the minimum average score required for a student to reach 100% confidence in this competency.
The current average score of all linked exercises shown on this page can be used as a basis for defining a reasonable threshold value.
Instructors can link competencies to lecture units on this page by first choosing a lecture and then selecting desired lecture units.
Alternatively, instructors can also link competencies to an exercise or lecture unit on the respective management page using the selection box shown below.
Users can change their preferences for different types of notifications and decide whether they want to receive emails, web notifications, or no notifications.
These settings can be found after opening the web notifications. The gear on the top left of the sidebar then leads to the settings.
In case you have cancelled a previous tour or want to do an already finished tour again, you can start any tour that
is available on your current page by opening your account menu and selecting “Continue Tutorial”:
You can run all tours as often as you’d like. Any changes you made to the tutorial course during your last run
will be reset automatically.
Note
To resume or restart a tour, you need to be on the page where you started it the last time. The menu will
not show resumable tours that are not available on the page you’re currently on.
By default, Artemis will choose the preferred theme according to your OS settings.
This is especially useful if you use a mechanism that changes your UI theme system-wide based on your ambient light or
the time of the day.
This setting will be disabled if you ever change the theme of Artemis manually. To re-enable the synchronized
behavior, enable this setting in the theme settings popover.
In the following example, the user has manually set the light theme, but the OS reports a preference of dark
themes, so the theme changes automatically when enabling the synchronization.
Artemis allows instructors to define grading keys for courses and exams.
When the grading key is defined, the total points obtained by a student from all exercises are calculated and mapped to a grade.
Grading Keys can be either:
Grade type, which maps the points to a letter or numeric grade with a First Passing Grade. This can be used, e.g., for final exams or for courses without exams that are graded solely by exercises.
Bonus type, which maps the points to a numeric value. This can be used, e.g., when the grade obtained in the given course or exam is not an end result but complements a final exam.
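Conceptually, a Grade type grading key maps a student's total points to a grade step. A minimal sketch, assuming grade steps are given as inclusive lower bounds in percent sorted from worst to best (the actual Artemis grade-step model also defines upper bounds; all names here are illustrative):

```python
def map_points_to_grade(points, max_points, grade_steps):
    """Map the total points a student achieved to a grade.

    grade_steps: list of (inclusive lower bound in %, grade) pairs,
    sorted from worst to best grade.
    """
    percentage = points / max_points * 100
    grade = grade_steps[0][1]  # fall back to the worst grade
    for lower_bound, step_grade in grade_steps:
        if percentage >= lower_bound:
            grade = step_grade
    return grade

steps = [(0, "5.0"), (50, "4.0"), (65, "3.0"), (80, "2.0"), (90, "1.0")]
print(map_points_to_grade(72, 100, steps))  # 3.0
```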
There are two configurable special grades that are automatically treated as failing grades:
Plagiarism Grade (default U) is assigned when a student has received a Plagiarism verdict in one or more exercises.
No-participation Grade (default X) is assigned according to the conditions below for courses and exams:
For a course, a student receives this grade if they do not start any exercise, i.e., the number of participations is 0.
For an exam, a student receives this grade if they do not submit the exam.
In bonus assignment calculations, these two special grades are treated as equivalent to receiving a total score of 0 from the corresponding course or exam.
Instructors can export and import grading keys in CSV format to reuse them inside or outside of Artemis.
If a grading key is defined, exporting student results includes the obtained grade information as well.
It is also possible to create or modify grading keys after an exam or course is over.
In that case student grades will be adjusted automatically.
During a semester, students can see their grade based on attainable points for a course.
Attainable points are the sum of the maximum points for all exercises assessed so far, independent of a student’s participation.
This means the students can track their relative performance during the semester without having to wait until all the exercises are conducted and assessed.
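The relative performance shown during the semester can be sketched as follows (function and parameter names are illustrative, not part of Artemis):

```python
def current_relative_score(achieved_points, attainable_points):
    """Relative performance during the semester (in %): points the student
    achieved divided by the points attainable from all exercises assessed
    so far (not the course maximum)."""
    if attainable_points == 0:
        return 0.0  # nothing has been assessed yet
    return achieved_points / attainable_points * 100

# 45 of the 60 points attainable so far
print(current_relative_score(45, 60))  # 75.0
```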
Note
The Grades section has more detailed information on how to read the boundaries of the grading keys.
Instructors can create bonus configurations for exams with Grade type grading keys by clicking the corresponding button on the exam detail page.
A bonus configuration maps the grade received from a bonus source, which can be a course or another exam, as an improvement to the final exam grade.
In order to configure a bonus, an instructor needs to choose appropriate values for the fields below:
Bonus strategy defines how the grade obtained from the bonus source will affect the final exam. Artemis currently supports 2 strategies:
Grades: First, calculates the target exam grade. Then, applies the bonus to that.
Points: First, applies the bonus to the student’s points. Then, calculates the final grade by matching the resulting points to the target exam’s grading key.
Discreteness (Only applicable if grades bonus strategy is selected) specifies how to combine the bonus grade with the exam grade. Currently only the first discreteness option is implemented.
Continuous: Applies bonus arithmetically to the student’s grade. Final grade can be any numeric value between the best and the worst grade step values (e.g. from 1.3 to 1.2).
Discrete: (Not available yet) Bumps the student’s grade to a better grade step. Final grade will be one of the grade steps in the target exam (e.g. from 1.3 to 1.0 or from B to A).
Calculation defines the sign of the operation to indicate if it is an addition or subtraction.
− (Default option for grades): Subtracts bonus from target exam’s grades/points. Prefer this when lower means better.
+ (Default option for points): Adds bonus to target exam’s grades/points. Prefer this when higher means better.
Bonus source is the course or exam determining the bonus amount. When calculating the final grade for a student, the grade they received from the bonus source is substituted into the Bonus parameter in the formula explained below. The dropdown lists courses and exams with Bonus type grading keys for which the current user is an instructor.
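The two bonus strategies can be sketched as follows. Function names, the sign convention, and the grading-key representation are assumptions for illustration, not the Artemis implementation:

```python
def apply_grades_bonus(exam_grade, bonus_grade, sign, best_grade):
    """Grades strategy with continuous discreteness: compute the exam grade
    first, then apply the bonus arithmetically. sign is -1 (subtraction,
    preferred when lower grades are better) or +1 (addition, preferred
    when higher grades are better)."""
    result = exam_grade + sign * bonus_grade
    # The final grade cannot exceed the best achievable grade.
    return max(result, best_grade) if sign < 0 else min(result, best_grade)

def apply_points_bonus(exam_points, bonus_points, grading_key):
    """Points strategy: add the bonus to the student's points first, then
    map the sum through the target exam's grading key (a callable here)."""
    return grading_key(exam_points + bonus_points)

# German-style grades where lower is better: exam grade 2.3 improved by 0.3
print(round(apply_grades_bonus(2.3, 0.3, -1, 1.0), 1))  # 2.0
```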
The bonus configuration page has a wizard mode in which the options initially appear one by one to guide new users through the process.
When an instructor opens the bonus configuration page for an exam without a bonus, Artemis displays the options in wizard mode. Artemis shows the grade steps and max points of the selected grading key below the dropdown as a reminder to the instructor.
When the instructor is editing an already saved bonus configuration, Artemis hides the explanations inside tooltips and only shows them on hover. Also, Artemis presents all options at once to provide a compact view that is quicker to navigate for users who are already familiar with the bonus configuration.
After the instructor has chosen values for all the fields above, Artemis generates the bonus calculation formula along with five examples so that instructors can verify the bonus configuration before saving. Artemis tries to generate the examples using a heuristic with the following conditions:
the exam points are in ascending order,
the bonus source student points are in descending order,
the first example shows that the bonus is not applied when the exam grade is a failing grade,
the final example shows that the final grade cannot exceed the maximum grade.
The last row of examples enables instructors to enter arbitrary exam points and bonus source points to try out custom examples and test the bonus configuration manually.
Artemis calculates the resulting values in the example table when the instructor types the desired value in the corresponding number input field and then clicks outside of the current input.
Courses contain exercises, exams and lectures.
Different roles can be attached to different sets of course participants: Instructors, Editors, Tutors and Users.
You can find a detailed listing of the different roles and their corresponding permissions here.
Administrators can create new courses via the corresponding buttons in the header.
Administrators can then specify options in the creation form; instructors can later change most of these values:
Title: Name of the course that is shown to users. Can be changed later.
Short Name: Unique identifier of the course. Can not be changed later as it is included in resources of external systems such as build plans.
Color: Color of the course that will be used e.g. in the .
Course Description: Description of the course that will e.g. be shown in the Course Registration table and in the header of the course (once a user clicks on the course in their dashboard).
Course Organizations: Users and Courses can be grouped into different organizations (e.g. based on the email-address of users). Users will then only see courses from their organizations, allowing multiple organizations to use the same Artemis environment without interfering with each other.
If no course organization is set, all users can interact with the course.
Start/End: Start and End Date of the course. Users can only self-register to a course between the Start and End Date.
Test Course: Whether this course should be used for testing. Upcoming exams and exercises within Test Courses are not displayed to users.
Semester: The semester in which the course is conducted. Courses are grouped by semesters to simplify the search in Artemis environments with a large number of courses.
Maximum number of points for course: This number is used for the grading key calculation. Updating the maximum number will recalculate the grading key, if it has been set up before.
Amount of decimal places used for calculating the scores: Specify the granularity for score calculations. You might have to increase this value if you create exercises with a small number of maximum points and a large number of tests.
Default Programming Language: The programming language that should be preselected when a new programming exercise is created in this course.
Customize group names: Allows specifying custom group names that should be used in external user management systems. Default group names (based on the course short name) will be used if not specified. Note: Only administrators can change this setting. Instructors can not change it.
Enable complaints: Whether complaints should be enabled in this course. Details regarding complaints can be found here.
Enable more feedback requests: Whether more feedback requests should be enabled in this course. Details regarding more feedback requests can be found here.
Enable postings by students: Whether the posting sections should be enabled for lectures and exercises. More details regarding this can be found here.
Online Course: Whether this course should be part of an external online course using the LTI interface (e.g. on edX). This requires additional setup by administrators. Note: Online Course and Student Course Registration Enabled are mutually exclusive.
Student Course Registration Enabled: Whether students should be able to register themselves. If enabled, students can register for the course by clicking in their . An optional confirmation message can be specified; it will be shown to students during registration and can be used e.g. to inform students about examination rules. Note: Online Course and Student Course Registration Enabled are mutually exclusive.
Presentation Score: Whether students have to hold presentations in this course, e.g. to be eligible for an exam bonus. The required minimum number of presentations can be defined if this option is enabled. Note that you can define for every exercise whether it should be eligible for the presentation score within the exercise settings.
After a course has been created, it can be accessed within the . Courses are grouped by the semesters (or in the Semester: Unset group if no semester is specified). Test Courses are also in a separate group.
Artemis is a scalable system that supports large courses: Some courses at TUM have been conducted using Artemis with more than 2000 students.
Depending on your setup and the number of users - as well as other requirements such as availability - it might be necessary for you to scale your Artemis installations.
Different aspects of the Artemis infrastructure can be scaled as listed in this document.
The build system is responsible for providing feedback to students when they work on programming exercises.
Especially during lectures and before deadlines, a lot of students work simultaneously on their submissions, causing high load on the build system.
The build system must be scaled adequately in order to provide feedback within a reasonable time.
Artemis supports scaling to provide high availability as well as improved performance.
This is especially important if you plan to conduct exams using Artemis.
A Single Server Setup offers a simple installation and can be used for testing purposes and a small (100) to medium (500) number of concurrent users.
A Multi Server Setup uses a load balancer to distribute requests between the different instances of the Artemis Application Server.
Environments can easily be scaled to support more than 1000 concurrent users.
All instances share the same database (and filesystem) and can be added/removed from the environment during runtime.
Details regarding the scaling of Artemis can be found in the corresponding section of the documentation.
Depending on your setup and the number of users, you might also be required to scale/optimize other parts of the Artemis infrastructure, e.g. the database or used third-party systems.
Please refer to the corresponding documentations for detailed information.
Note that support for Kubernetes is currently being added.
Artemis extends the basic Markdown syntax to support Artemis-specific features. This Artemis flavored Markdown is used to format text content across the platform using an integrated markdown editor.
The markdown editor contains a formatting toolbar at the top, allowing users to format text without learning Markdown syntax.
In addition, images can be uploaded and included by either dragging and dropping them into the editor field or by clicking at the footer of the editor, which brings up the file selection dialog.
The user can switch to a preview of the formatted content by clicking on the Preview button.
Markdown is also supported in the context of communicating with other users. Here, the Markdown syntax is extended to allow users to reference other posts, lectures or exercises.
In this guide you learn how to set up the development environment of
Artemis. Artemis is based on JHipster,
i.e. Spring Boot
development on the application server using Java 17, and TypeScript
development on the application client in the browser using
Angular and Webpack. To get an overview of the
used technology, have a look at the JHipster Technology stack
and other tutorials on the JHipster homepage.
You can find tutorials on how to set up JHipster in an IDE (IntelliJ IDEA
Ultimate is recommended) on
https://jhipster.github.io/configuring-ide. Note that the Community
Edition of IntelliJ IDEA does not provide Spring Boot support (see the
comparison
matrix).
Before you can build Artemis, you must install and configure the
following dependencies/tools on your machine:
Java
JDK:
We use Java (JDK 17) to develop and run the Artemis application
server which is based on Spring
Boot.
MySQL Database Server 8, or PostgreSQL:
Artemis uses Hibernate to store entities in an SQL database and Liquibase to
automatically apply schema transformations when updating Artemis.
Node.js: We use Node LTS (>=18.14.0 < 19) to compile
and run the client Angular application. Depending on your system, you
can install Node either from source or as a pre-packaged bundle.
Npm: We use Npm (>=9.4.0) to
manage client side dependencies. Npm is typically bundled with Node.js,
but can also be installed separately.
( Graphviz: We use Graphviz to generate graphs within exercise task
descriptions.
It’s not necessary for a successful build,
but it is necessary for production setups, as errors will otherwise show up at runtime. )
( A version control and build system is necessary for the programming exercise feature of Artemis.
There are multiple stacks available for the integration with Artemis, e.g. Bamboo + Bitbucket + Jira or Jenkins + GitLab. )
The required Artemis schema will be created / updated automatically at startup time of the server application.
Artemis supports MySQL and PostgreSQL databases.
Download and install the MySQL Community Server (8.0.x).
You have to run a database on your local machine to be able to start Artemis.
We recommend starting the database in a Docker container. You can run the MySQL Database Server
using e.g. docker compose -f docker/mysql.yml up.
If you run your own MySQL server, make sure to specify the default character-set
as utf8mb4 and the default collation as utf8mb4_unicode_ci.
You can achieve this e.g. by using a my.cnf file in the location /etc.
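For example, a minimal my.cnf enforcing these defaults might look like the following sketch (adjust sections and paths to your MySQL installation):

```
[mysqld]
character-set-server = utf8mb4
collation-server = utf8mb4_unicode_ci

[client]
default-character-set = utf8mb4
```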
For the development environment the default MySQL user is ‘root’ with an empty password.
(In case you want to use a different password, make sure to change the value in
application-local.yml (spring > datasource > password) and in liquibase.gradle (within the ‘liquibaseCommand’ as argument password)).
If you have problems connecting to the MySQL 8 database using an empty root password, you can try the following command
to reset the root password to an empty password:
mysql -u root --execute "ALTER USER 'root'@'localhost' IDENTIFIED WITH caching_sha2_password BY ''";
Warning
Empty root passwords should only be used in a development environment.
The root password for a production environment must never be empty.
No special PostgreSQL settings are required.
You can either use your package manager’s version, or set it up using a container.
An example Docker Compose setup based on the official container image is provided in src/main/docker/postgresql.yml.
When setting up the Artemis server, the following values need to be added/updated in the server configuration (see setup steps below) to connect to PostgreSQL instead of MySQL:
This example assumes that the database is called Artemis.
You might have to update this part of spring.datasource.url as well if you chose a different name.
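As a rough sketch, the relevant overrides in application-local.yml could look like this. The keys follow standard Spring Boot datasource configuration; host, port, and credentials are placeholders you must adapt:

```
spring:
  datasource:
    url: jdbc:postgresql://localhost:5432/Artemis
    username: postgres
    password: <your password>
    driver-class-name: org.postgresql.Driver
```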
To start the Artemis application server from the development
environment, first import the project into IntelliJ and then make sure
to install the Spring Boot plugins to run the main class
de.tum.in.www1.artemis.ArtemisApp. Before the application runs, you
have to change some configuration options.
You can change the options directly in the file application-artemis.yml in the folder
src/main/resources/config. However, you have to be careful that you do not
accidentally commit your password. Therefore, we strongly recommend creating a new file
application-local.yml in the folder src/main/resources/config, which is ignored by default.
You can override the following configuration options in this file.
artemis:
  repo-clone-path: ./repos/
  repo-download-clone-path: ./repos-download/
  bcrypt-salt-rounds: 11  # The number of salt rounds for the bcrypt password hashing. Lower numbers make it faster but more unsecure and vice versa.
                          # Please use the bcrypt benchmark tool to determine the best number of rounds for your system. https://github.com/ls1intum/bcrypt-Benchmark
  user-management:
    use-external: true
    password-reset:
      credential-provider: <provider>  # The credential provider which users can log in through (e.g. TUMonline)
      links:  # The password reset links for different languages
        en: '<link>'
        de: '<link>'
    external:
      url: https://jira.ase.in.tum.de
      user: <username>  # e.g. ga12abc
      password: <password>
      admin-group-name: tumuser
    ldap:
      url: <url>
      user-dn: <user-dn>
      password: <password>
      base: <base>
  version-control:
    url: https://bitbucket.ase.in.tum.de
    user: <username>  # e.g. ga12abc
    password: <password>
    token: <token>  # VCS API token giving Artemis full Admin access. Not needed for Bamboo+Bitbucket
    ci-token: <token from the CI>  # Token generated by the CI (e.g. Jenkins) for webhooks from the VCS to the CI. Not needed for Bamboo+Bitbucket
  continuous-integration:
    url: https://bamboo.ase.in.tum.de
    user: <username>  # e.g. ga12abc
    token: <token>  # Enter a valid token generated by bamboo or leave this empty to use the fallback authentication user + password
    password: <password>
    vcs-application-link-name: LS1 Bitbucket Server  # If the VCS and CI are directly linked (normally only for Bitbucket + Bamboo)
    empty-commit-necessary: true  # Do we need an empty commit for new exercises/repositories in order for the CI to register the repo
    # Hash/key of the ci-token, equivalent e.g. to the ci-token in version-control
    # Some CI systems, like Jenkins, offer a specific token that gets checked against any incoming notifications
    # from a VCS trying to trigger a build plan. Only if the notification request contains the correct token, the plan
    # is triggered. This can be seen as an alternative to sending an authenticated request to a REST API and then
    # triggering the plan.
    # In the case of Artemis, this is only really needed for the Jenkins + GitLab setup, since the GitLab plugin in
    # Jenkins only allows triggering the Jenkins jobs using such a token. Furthermore, in this case, the value of the
    # hudson.util.Secret is stored in the build plan, so you also have to specify this encrypted string here and NOT the actual token value itself!
    # You can get this by GETting any job.xml for a job with an activated GitLab step and your token value of choice.
    secret-push-token: <token hash>
    # Key of the saved credentials for the VCS service
    # Bamboo: not needed
    # Jenkins: You have to specify the key from the credentials page in Jenkins under which the user and
    #          password for the VCS are stored
    vcs-credentials: <credentials key>
    # Key of the credentials for the Artemis notification token
    # Bamboo: not needed
    # Jenkins: You have to specify the key from the credentials page in Jenkins under which the notification token is stored
    notification-token: <credentials key>
    # The actual value of the notification token to check against in Artemis. This is the token that gets sent with
    # every request the CI system makes to Artemis containing a new result after a build.
    # Bamboo: The token value you use for the Server Notification Plugin
    # Jenkins: The token value you use for the Server Notification Plugin and is stored under the notification-token credential above
    authentication-token: <token>
  git:
    name: Artemis
    email: artemis@in.tum.de
  athene:
    url: http://localhost
    base64-secret: YWVuaXF1YWRpNWNlaXJpNmFlbTZkb283dXphaVF1b29oM3J1MWNoYWlyNHRoZWUzb2huZ2FpM211bGVlM0VpcAo=
    token-validity-in-seconds: 10800
Replace all entries of the form <...> with proper values, e.g. your TUM
Online account credentials to connect to the given instances of JIRA,
Bitbucket and Bamboo. Alternatively, you can connect to your local JIRA,
Bitbucket and Bamboo instances. It’s not necessary to fill all the
fields; most of them can be left blank. Note that there is additional
information about the setup for programming exercises provided:
Note
Be careful that you do not commit changes to application-artemis.yml.
To avoid this, follow the best practice when configuring your local development environment:
Create a file named application-local.yml under src/main/resources/config.
Copy the contents of application-artemis.yml into the new file.
Update configuration values in application-local.yml.
By default, changes to application-local.yml will be ignored by git so you don’t accidentally
share your credentials or other local configuration options. The run configurations contain a profile
local at the end to make sure the application-local.yml is considered. You can create your own
configuration files application-<name>.yml and then activate the profile <name> in the run
configuration if you need additional customizations.
If you use a password, you need to adapt it in
gradle/liquibase.gradle.
This setup is recommended for production instances as it registers Artemis as a service and e.g. enables auto-restarting
of Artemis after the VM running Artemis has been restarted.
As an alternative, you can take a look at the section below about
deploying Artemis as a Docker container.
For development setups, see the other guides below.
This is a service file that works on Debian/Ubuntu (using systemd):
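A minimal unit file might look like the following sketch; the user, paths, and active Spring profiles are placeholders that must be adapted to your installation:

```
[Unit]
Description=Artemis
After=syslog.target network.target

[Service]
User=artemis
WorkingDirectory=/opt/artemis
ExecStart=/usr/bin/java \
  -Djava.security.egd=file:/dev/./urandom \
  -jar Artemis.war \
  --spring.profiles.active=prod,bamboo,bitbucket,jira,artemis
SuccessExitStatus=143
Restart=always

[Install]
WantedBy=multi-user.target
```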
The following parts might also be useful for other (production) setups, even if this service file is not used:
-Djava.security.egd=file:/dev/./urandom: This is required if repositories are cloned via SSH from the VCS.
The default (pseudo-)random-generator /dev/random is blocking which results in very bad performance when using
SSH due to lack of entropy.
The file should be placed at /etc/systemd/system/artemis.service and after running sudo systemctl daemon-reload,
you can start the service using sudo systemctl start artemis.
You can stop the service using sudo service artemis stop and restart it using sudo service artemis restart.
Logs can be fetched using sudo journalctl -u artemis -f -n 200.
You can find the latest Artemis Dockerfile at docker/artemis/Dockerfile.
The Dockerfile has multiple stages: A builder stage,
building the .war file, an optional external_builder stage to import a pre-built .war file,
a war_file stage to choose between the builder stages via build argument, and a runtime stage with minimal
dependencies just for running Artemis.
The Dockerfile defines three Docker volumes (at the specified paths inside the container):
/opt/artemis/config:
This can be used to store additional configuration of Artemis in YAML files.
The usage is optional and we recommend using the environment files for overriding your custom configurations
instead of using src/main/resources/application-local.yml as such an additional configuration file.
The other configurations like src/main/resources/application.yml, … are built into the .war file and
therefore are not needed in this directory.
Tip
Instead of mounting this config directory, you can also use environment variables for the configuration as
defined by the
Spring relaxed binding.
You can either place those environment variables directly in the environment section,
or create an .env-file.
When starting an Artemis container directly with the Docker-CLI, an .env-file can also be given via the
--env-file option.
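For illustration, entries in such an .env-file follow the Spring relaxed-binding naming scheme (uppercase, dots become underscores, dashes are removed); the values shown here are placeholders:

```
SPRING_DATASOURCE_URL=jdbc:mysql://db:3306/Artemis
SPRING_DATASOURCE_PASSWORD=<password>
ARTEMIS_REPOCLONEPATH=/opt/artemis/data/repos
```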
To ease the transition of an existing set of YAML configuration files into the environment variable style, a
helper script can be used.
/opt/artemis/data:
This directory should be used for any data (e.g., local clone of repositories).
This is preconfigured in the docker Java Spring profile (which sets the following values:
artemis.repo-clone-path, artemis.repo-download-clone-path,
artemis.course-archives-path, artemis.submission-export-path, and artemis.file-upload-path).
/opt/artemis/public/content:
This directory will be used for branding.
You can specify a favicon, imprint.html, and privacy_statement.html here.
The Dockerfile assumes that the mounted volumes are located on a file system with the following locale settings
(see #4439 for more details):
Java Remote Debugging can be enabled in the Docker containers via Java environment variables.
Java Remote Debugging allows you to use your preferred debugger connected to port 5005.
For IntelliJ you can use the Remote Java Debugging for Docker profile being shipped in the git repository.
With the following Java environment variable you can configure the Remote Java Debugging inside a container:
This is already preset in the Docker Compose Artemis-Dev-MySQL Setup.
For issues at startup you might have to suspend the java command until a debugger has connected.
This is possible by setting suspend=y.
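A typical JDWP agent configuration for this purpose looks like the following; the exact environment variable consumed by the image (e.g. JAVA_TOOL_OPTIONS or _JAVA_OPTIONS) depends on your setup:

```
JAVA_TOOL_OPTIONS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=*:5005"
```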
Run the server via a run configuration in IntelliJ
The project comes with some pre-configured run / debug configurations that are stored in the .idea directory.
When you import the project into IntelliJ the run configurations will also be imported.
The recommended way is to run the server and the client separately. This provides fast rebuilds of the server and hot
module replacement in the client.
Artemis (Server): The server will be started separately from the client. The startup time decreases significantly.
Artemis (Client): Will execute npm install and npm run serve. The client will be available at
http://localhost:9000/ with hot module replacement enabled (also see
Client Setup).
Artemis (Server & Client): Will start the server and the client. The client will be available at
http://localhost:8080/ with hot module replacement disabled.
Artemis (Server, Jenkins & GitLab): The server will be started separately from the client with the profiles
dev,jenkins,gitlab,artemis instead of dev,bamboo,bitbucket,jira,artemis.
Artemis (Server, Athene): The server will be started separately from the client with the athene profile enabled
(see Athene Service).
Run the server with Spring Boot and Spring profiles
The Artemis server should start up by running the main class
de.tum.in.www1.artemis.ArtemisApp using Spring Boot.
Note
Artemis uses Spring profiles to segregate parts of the
application configuration and make it only available in certain
environments. For development purposes, the following program arguments
can be used to enable the dev profile and the profiles for JIRA,
Bitbucket and Bamboo:
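Based on the profile names used by the run configurations described above, the program argument would look like this:

```
--spring.profiles.active=dev,bamboo,bitbucket,jira,artemis
```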
Text Assessment Analytics is an internal analytics service used to gather data regarding the features of the text
assessment process. Certain assessment events are tracked:
Adding new feedback on a manually selected block
Adding new feedback on an automatically selected block
Deleting feedback
Clicking to resolve feedback conflicts
Clicking to view origin submission of automatically generated feedback
Hovering over the text assessment feedback impact warning
Editing/Discarding automatically generated feedback
Clicking the Submit button when assessing a text submission
Clicking the Assess Next button when assessing a text submission
These events are tracked by attaching a POST call to the respective DOM elements on the client side.
The POST call accesses the TextAssessmentEventResource which then adds the events in its respective table.
This feature is disabled by default. We can enable it by modifying the configuration in the file:
src/main/resources/config/application-artemis.yml like so:
You should be able to run the following
command to install development tools and dependencies. You will only
need to run this command when dependencies change in package.json.
npm install
To start the client application in the browser, use the following
command:
npm run serve
This compiles TypeScript code to JavaScript code, starts the hot module
replacement feature in Webpack (i.e. whenever you change a TypeScript
file and save, the client is automatically reloaded with the new code)
and will start the client application in your browser on
http://localhost:9000. If you have activated the JIRA profile (see
above in Server Setup) and if you have configured
application-artemis.yml correctly, then you should be able to log in
with your TUM Online account.
Hint
In case you encounter any problems regarding JavaScript heap memory leaks when executing npm run serve or
any other scripts from package.json, you can adjust a
memory limit parameter
(node-options=--max-old-space-size=6144) which is set by default in the project-wide .npmrc file.
If you still face the issue, you can try to set a lower/higher value than 6144 MB.
Recommended values are 3072 (3GB), 4096 (4GB), 5120 (5GB) , 6144 (6GB), 7168 (7GB), and 8192 (8GB).
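For example, to lower the limit to 4 GB, the line in the project-wide .npmrc would read:

```
node-options=--max-old-space-size=4096
```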
There are several variables that can be configured when using programming exercises.
They are presented in this separate section to keep the ‘normal’ setup guide shorter.
artemis.repo-clone-path
Repositories that the Artemis server needs are stored in this folder.
This e.g. affects repositories from students who use the online code editor or
the template/solution repositories of new exercises, as they are pushed to the VCS after modification.
Files in this directory are usually not critical, as the latest pushed version of these repositories are
also stored at the VCS.
However, changes that are saved in the online code editor but not yet committed will be lost when
this folder is deleted.
artemis.repo-download-clone-path
Repositories that were downloaded from Artemis are stored in this directory.
Files in this directory can be removed without loss of data, if the downloaded repositories are still present
at the VCS.
No changes to the data in the VCS are stored in this directory (or they can be retrieved by performing
the download-action again).
artemis.template-path
Templates are available within Artemis.
The templates should fit most environments, but there might be cases where one wants to change the templates.
This value specifies the path to the templates which should overwrite the default ones.
Note that this is the path to the folder where the templates folder is located, not the path to the
templates folder itself.
Templates are shipped with Artemis (they can be found within the src/main/resources/templates folder in GitHub).
These templates should fit well for many deployments, but one might want to change some of them for special deployments.
As of now, you can overwrite the jenkins folder that is present within the src/main/resources/templates folder.
Files that are present in the file system will be used, if a file is not present in the file system,
it is loaded from the classpath (e.g. the .war archive).
We plan to make other folders configurable as well, but this is not supported yet.
The build process in Jenkins is stored in a config.xml-file (src/main/resources/templates/jenkins)
that shares common steps for all programming languages (e.g. triggering a build when a push to GitLab occurred).
It is extended by a Jenkinsfile that is dependent on the used programming language which will be included
in the generic config.xml file.
The build steps (including the used Docker images, the checkout process, the actual build steps,
and the reporting of the results to Artemis) are included in the Jenkinsfile.
A sample Jenkinsfile can be found at src/main/resources/templates/jenkins/java/Jenkinsfile.
Note that the Jenkinsfile must start either
with pipeline (there must not be a comment before pipeline, but there can be one at any other position,
if the Jenkinsfile syntax allows it)
or the special comment // ARTEMIS: JenkinsPipeline in the first line.
The variables #dockerImage, #testRepository, #assignmentRepository, #jenkinsNotificationToken and
#notificationsUrl will automatically be replaced
(for the normal Jenkinsfile, within the Jenkinsfile-staticCodeAnalysis, #staticCodeAnalysisScript is also replaced).
You should not need to touch any of these variables, except the #dockerImage variable,
if you want to use a different agent setup (e.g. a Kubernetes setup).
The Docker image used to run the maven-tests already contains a set of commonly used dependencies
(see artemis-maven-docker).
This significantly speeds up builds as the dependencies do not have to be downloaded every time a build is started.
However, the dependencies included in the Docker image might not match the dependencies required in your tests
(e.g. because you added new dependencies or the Docker image is outdated).
You can cache the maven-dependencies also on the machine that runs the builds
(that means, outside the docker container) using the following steps:
Adjust the agent-args and add the environment block.
You have to add permissions to the folder (which will be located at the $HOME folder of the user that jenkins uses),
e.g. with sudo chmod 777 maven-cache-docker -R.
Note that this might allow students to access shared resources (e.g. jars used by Maven), and they might be able
to overwrite them.
You can use Ares to prevent this by restricting the resources
the student’s code can access.
This section describes how to set up a programming exercise environment
based on Bamboo, Bitbucket and Jira.
Please note that this setup will create a deployment that is very
similar to the one used in production but has one difference:
In production, the builds are performed within Docker containers that
are created by Bamboo (or its build agents). As we run Bamboo in a
Docker container in this setup, creating new Docker containers within
that container is not recommended (e.g. see this
article). There
are some solutions where one can pass the Docker socket to the Bamboo
container, but none of these approaches work quite well here as Bamboo
uses mounted directories that cause issues.
Therefore, a check is included within the BambooBuildPlanService that
ensures that builds are not started in Docker agents if the development
setup is present.
Before you start the docker compose, check if the bamboo version in the
build.gradle (search for com.atlassian.bamboo:bamboo-specs) is
equal to the bamboo version number in the docker compose in
docker/atlassian.yml
If the version numbers are not equal, adjust them.
Further details about the docker compose setup can be found in docker
Execute the docker compose file e.g. with
docker compose -f docker/atlassian.yml up -d.
Error Handling: It can happen that there is an overlap with other
Docker networks:
ERROR: Pool overlaps with other one on this address space. Use the
command docker network prune to resolve this issue.
Make sure that Docker has enough memory (~ 6 GB). To adapt it, go to Settings → Resources.
In case you want to enable Swift or C programming exercises, refer to the readme in
docker
By default, the Jira instance is reachable under localhost:8081, the
Bamboo instance under localhost:8085 and the Bitbucket instance
under localhost:7990.
Get licenses for Bamboo, Bitbucket and Jira Service Management.
Bamboo: Select Bamboo (Data Center) and not installed yet
Bitbucket: Select Bitbucket (Data Center) and not installed yet
Jira: Select Jira Service Management (formerly Service Desk) (Data Center) and not installed yet
Provide the just created license key during the setup and create an admin user with the same credentials
in all 3 applications.
Bamboo:
Choose the H2 database
Select the evaluation/internal/test/dev setups if you are asked
Put the admin username and password into application-local.yml at artemis.version-control.user
and artemis.continuous-integration.user.
Jira:
On startup select I'll set it up myself
Select Built In Database Connection
Create a sample project
Bitbucket: Do not connect Bitbucket with Jira yet
Make sure that Jira, Bitbucket and Bamboo have finished starting up.
(Only Linux & Windows) Make sure that xdg-utils
is installed before running the following script.
xdg-utils for Windows users
An easy way to use the xdg-utils on Windows would be to install them on the linux-subsystem,
which should be activated anyways when running Docker on Windows.
For the installation on the subsystem the above linked explanation can be used.
Make sure to execute the script from the subsystem.
Execute the shell script atlassian-setup.sh in the
docker/atlassian directory (e.g. with
./docker/atlassian/atlassian-setup.sh). This script creates
groups, users and assigns the user to their respective group.
In addition, it configures disabled application links between the 3 applications.
Enable the created application
links
between all 3 applications (OAuth Impersonate). The links should open automatically after the shell script
has finished. If not, open them manually:
The script (step 3) has already created the required users and assigned them to their respective group in Jira.
Now, make sure that they are assigned correctly according to the following test setup:
users 1-5 are students, 6-10 are tutors, 11-15 are
editors and 16-20 are instructors. The usernames are artemis_test_user_{1-20}
and the password is again the username. When you create a course in Artemis
you have to manually choose the created groups (students, tutors, editors,
instructors).
Go to Jira → User management → Jira user server → Add application →
Create one application for bitbucket and one for bamboo → add the
IP-address 0.0.0.0/0 to IP Addresses
Go to Bitbucket and Bamboo → User Directories → Add Directories →
Atlassian Crowd → use the URL http://jira:8080 as Server URL →
use the application name and password which you used in the previous
step. Also, you should decrease the synchronisation period (e.g. to 2
minutes). Press synchronise after adding the directory; the users and
groups should now be available.
Give the test users User access on Bitbucket: On the Administration interface (settings cogwheel on the top),
go to the Global permissions. Type the names of all test users in the search field (“Add Users”) and give them
the “Bitbucket User” permission. If you skip this step, the users will not be able to log in to Bitbucket or
clone repositories.
In Bamboo create a global variable named
SERVER_PLUGIN_SECRET_PASSWORD, the value of this variable will be used
as the secret. The value of this variable should then be stored in
src/main/resources/config/application-local.yml as the value of
artemis-authentication-token-value.
You can create a global variable from settings on Bamboo.
Download the
bamboo-server-notification-plugin
and add it to Bamboo. Go to Bamboo → Manage apps → Upload app → select
the downloaded .jar file → Upload
Approve the agent and edit the IP address in a development setup to *.*.*.* as the Docker container doesn’t
have a static IP address.
Generate a personal access token
While username and password can still be used as a fallback, this option is already marked as deprecated and will
be removed in the future.
Personal access token for Bamboo:
Log in as the admin user and go to Bamboo → Profile (top right corner) → Personal access tokens →
Create token
Insert the generated token into the file application-local.yml in the section continuous-integration:
artemis:
    continuous-integration:
        user: <username>
        password: <password>
        token: # insert the token here
Personal access token for Bitbucket:
Log in as the admin user and go to Bitbucket → Your profile image (top right corner) → Manage account →
HTTP access tokens → Create token
Insert the generated token into the file application-local.yml in the section version-control:
artemis:
    version-control:
        user: <username>
        password: <password>
        token: # insert the token here
Add an SSH key for the admin user
Artemis can clone/push the repositories during setup and for the online code editor using SSH.
If the SSH key is not present, the username + token will be used as fallback
(and all git operations will use HTTP(S) instead of SSH).
If the token is also not present, the username + password will be used as fallback (again, using HTTP(S)).
You first have to create an SSH key (locally), e.g. using ssh-keygen
(more information on how to create an SSH key can be found e.g. at ssh.com
or at atlassian.com).
The list of supported ciphers can be found at Apache Mina.
It is recommended to use a password to secure the private key, but it is not mandatory.
Please note that the private key file must be named id_rsa, id_dsa, id_ecdsa or id_ed25519,
depending on the ciphers used.
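For example, an Ed25519 key pair can be generated like this (the folder path is an example; adjust it to your setup):

```shell
# Generate an Ed25519 key pair in a dedicated folder (path is an example).
# Use a non-empty passphrase instead of "" to protect the key (recommended).
mkdir -p ~/.ssh/artemis
ssh-keygen -t ed25519 -f ~/.ssh/artemis/id_ed25519 -N ""
```

This produces id_ed25519 (private key) and id_ed25519.pub (public key) in the chosen folder.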
You now have to extract the public key and add it to Bitbucket.
Open the public key file (usually called id_rsa.pub (when using RSA)) and copy its content
(you can also use cat id_rsa.pub to show the public key).
Navigate to BITBUCKET-URL/plugins/servlet/ssh/account/keys and add the SSH key by pasting the content of
the public key.
<ssh-private-key-folder-path> is the path to the folder containing the id_rsa file (but without the filename).
It will be used in the configuration of Artemis to specify where Artemis should look for the key and
store the known_hosts file.
<ssh-private-key-password> is the password used to secure the private key.
It is also needed for the configuration of Artemis, but can be omitted if no password was set
(e.g. for development environments).
Modify src/main/resources/config/application-local.yml to include the correct URLs and credentials:
repo-clone-path: ./repos/
repo-download-clone-path: ./repos-download/
bcrypt-salt-rounds: 11  # The number of salt rounds for the bcrypt password hashing. Lower numbers make it faster but more unsecure and vice versa.
                        # Please use the bcrypt benchmark tool to determine the best number of rounds for your system. https://github.com/ls1intum/bcrypt-Benchmark
user-management:
    use-external: true
    external:
        url: http://localhost:8081
        user: <jira-admin-user>
        password: <jira-admin-password>
        admin-group-name: instructors
    internal-admin:
        username: artemis_admin
        password: artemis_admin
version-control:
    url: http://localhost:7990
    user: <bitbucket-admin-user>
    password: <bitbucket-admin-password>
    token: <bitbucket-admin-token>  # step 10.2
    ssh-private-key-folder-path: <ssh-private-key-folder-path>
    ssh-private-key-password: <ssh-private-key-password>
continuous-integration:
    url: http://localhost:8085
    user: <bamboo-admin-user>
    password: <bamboo-admin-password>
    token: <bamboo-admin-token>  # step 10.1
    vcs-application-link-name: LS1 Bitbucket Server
    empty-commit-necessary: true
    artemis-authentication-token-value: <artemis-authentication-token-value>  # step 7
Also, set the server URL in src/main/resources/config/application-local.yml:
server:
    port: 8080  # The port of artemis
    url: http://172.20.0.1:8080  # needs to be an ip
    # url: http://docker.for.mac.host.internal:8080  # If the above one does not work for mac try this one
    # url: http://host.docker.internal:8080  # If the above one does not work for windows try this one
In addition, you have to start Artemis with the profiles bamboo,
bitbucket and jira so that the correct adapters will be used,
e.g.:
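Such a start command could look like the following (a sketch; the exact set of additional profiles, e.g. dev and scheduling, depends on your environment):

```shell
# Run Artemis from the repository root with the Bamboo, Bitbucket and Jira
# adapters enabled via Spring profiles.
./gradlew bootRun --args='--spring.profiles.active=dev,bamboo,bitbucket,jira,scheduling'
```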
This section describes how to set up a programming exercise environment
based on Jenkins and GitLab. Optional commands are in curly brackets {}.
The following assumes that all instances run on separate servers. If you
have one single server, or your own NGINX instance, just skip all NGINX
related steps and use the configurations provided under Separate NGINX
Configurations
If you want to set up everything on your local development computer,
ignore all NGINX related steps. Just make sure that you use
unique port mappings for your Docker containers (e.g. 8081 for
GitLab, 8082 for Jenkins, 8080 for Artemis)
In order to use Artemis with Jenkins as Continuous Integration
Server and Gitlab as Version Control Server, you have to configure
the file application-prod.yml (Production Server) or
application-artemis.yml (Local Development) accordingly. Please note
that all values in <..> have to be configured properly. These values
will be explained below in the corresponding sections. If you want to set up a local environment, copy the values
below into your application-artemis.yml or application-local.yml file (the latter is recommended), and follow
the Gitlab Server Quickstart guide.
artemis:
    course-archives-path: ./exports/courses
    repo-clone-path: ./repos
    repo-download-clone-path: ./repos-download
    bcrypt-salt-rounds: 11  # The number of salt rounds for the bcrypt password hashing. Lower numbers make it faster but more unsecure and vice versa.
                            # Please use the bcrypt benchmark tool to determine the best number of rounds for your system. https://github.com/ls1intum/bcrypt-Benchmark
    user-management:
        use-external: false
        internal-admin:
            username: artemis_admin
            password: artemis_admin
        accept-terms: false
        login:
            account-name: TUM
    version-control:
        url: http://localhost:8081
        user: root
        password: artemis_admin  # created in Gitlab Server Quickstart step 2
        token: artemis-gitlab-token  # generated in Gitlab Server Quickstart steps 4 and 5
        ci-token: jenkins-secret-token  # pre-generated or replaced in Automated Jenkins Server step 3
    continuous-integration:
        user: artemis_admin
        password: artemis_admin
        url: http://localhost:8082
        empty-commit-necessary: true
        secret-push-token: AQAAABAAAAAg/aKNFWpF9m2Ust7VHDKJJJvLkntkaap2Ka3ZBhy5XjRd8s16vZhBz4fxzd4TH8Su  # pre-generated or replaced in Automated Jenkins Server step 3
        vcs-credentials: artemis_gitlab_admin_credentials
        artemis-authentication-token-key: artemis_notification_plugin_token
        artemis-authentication-token-value: artemis_admin
        build-timeout: 30
    git:
        name: Artemis
        email: artemis.in@tum.de
jenkins:
    internal-urls:
        ci-url: http://jenkins:8080
        vcs-url: http://gitlab:80
    use-crumb: false
server:
    port: 8080
    url: http://172.17.0.1:8080  # `http://host.docker.internal:8080` for Windows
In addition, you have to start Artemis with the profiles gitlab and
jenkins so that the correct adapters will be used, e.g.:
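A sketch of such a start command (the additional profiles dev and scheduling are typical for local development; adjust them to your environment):

```shell
# Spring Boot also picks up active profiles from the environment variable.
SPRING_PROFILES_ACTIVE=dev,gitlab,jenkins,scheduling ./gradlew bootRun
```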
For a local setup on Windows you can use http://host.docker.internal followed
by the chosen ports as the version-control and continuous-integration url.
Make sure to change the server.url value in application-dev.yml
or application-prod.yml accordingly. This value will be used for the
communication hooks from GitLab to Artemis and from Jenkins to Artemis.
In case you use a different port than 80 (http) or 443 (https) for the
communication, you have to append it to the server.url value,
e.g. 127.0.0.1:8080.
When you start Artemis for the first time, it will automatically create
an admin user.
Note: Sometimes Artemis does not generate the admin user which may lead to a startup
error. You will have to create the user manually in the MySQL database and in GitLab. Make sure
both are set up correctly and follow these steps:
Use the tool mentioned above to generate a password hash.
Connect to the database via a client like MySQL Workbench
and execute the following query to create the user. Replace artemis_admin and HASHED_PASSWORD with your
chosen username and password:
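A sketch of such a query, wrapped in a mysql client call (the table and column names below are assumptions based on Artemis' default JHipster schema; verify them against your actual database before running):

```shell
# Sketch: create an admin user directly in the Artemis database.
# Table/column names assume the default JHipster schema -- verify first.
mysql -u root Artemis <<'SQL'
INSERT INTO jhipster_user (login, password_hash, first_name, last_name, email,
                           activated, lang_key, created_by, created_date)
VALUES ('artemis_admin', 'HASHED_PASSWORD', 'Artemis', 'Administrator',
        'admin@localhost', 1, 'en', 'system', NOW());
-- Grant the admin and user roles via the authority mapping table.
INSERT INTO jhipster_user_authority (user_id, authority_name)
SELECT id, 'ROLE_ADMIN' FROM jhipster_user WHERE login = 'artemis_admin';
INSERT INTO jhipster_user_authority (user_id, authority_name)
SELECT id, 'ROLE_USER' FROM jhipster_user WHERE login = 'artemis_admin';
SQL
```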
4. Create a user in Gitlab (http://your-gitlab-domain/admin/users/new) and make sure that the username,
email, and password are the same as the user from the database:
The following steps describe how to set up the GitLab server in a semi-automated way.
This is ideal as a quickstart for developers. For a more detailed setup, see
Manual Gitlab Server Setup.
In a production setup, you have to at least change the root password (by either specifying it in step 1 or extracting
the random password in step 2) and generate random access tokens (instead of the pre-defined values).
Set the variable GENERATE_ACCESS_TOKENS to true in the gitlab-local-setup.sh script and use the generated
tokens instead of the predefined ones.
Start the GitLab container defined in docker/gitlab-jenkins-mysql.yml by running
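Such a command could look like the following (a sketch; the service name gitlab and the GITLAB_ROOT_PASSWORD variable are assumptions based on the compose file described above):

```shell
# Start only the GitLab service from the compose file; the environment
# variable sets the root password (remove it to get a random password).
GITLAB_ROOT_PASSWORD=artemis_admin docker compose -f docker/gitlab-jenkins-mysql.yml up -d gitlab
```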
If you want to generate a random password for the root user, remove the part before docker compose from
the command. GitLab passwords must not contain commonly used combinations of words and letters.
The file uses the GITLAB_OMNIBUS_CONFIG environment variable to configure the Gitlab instance after the container
is started.
It disables prometheus monitoring, sets the ssh port to 2222, and adjusts the monitoring endpoint whitelist
by default.
Wait a couple of minutes since GitLab can take some time to set up. Open the instance in your browser
(usually http://localhost:8081).
You can then login using the username root and your password (which defaults to artemis_admin,
if you used the command from above).
If you did not specify the password, you can get the initial one using:
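A sketch of such a command (assuming the compose service name gitlab; GitLab Omnibus stores the generated password in this file for a limited time):

```shell
docker compose -f docker/gitlab-jenkins-mysql.yml exec gitlab cat /etc/gitlab/initial_root_password
```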
Insert the GitLab root user password in the file application-local.yml (in src/main/resources) and insert
the GitLab admin account.
If you copied the template from above and used the default password, this is already done for you.
This script can also generate random access tokens, which should be used in a production setup. Change the
variable $GENERATE_ACCESS_TOKENS to true to generate the random tokens and insert them into the Artemis
configuration file.
GitLab provides no possibility to set a user's password via API without forcing the user to change it afterwards
(see Issue 19141).
Therefore, you may want to patch the official GitLab Docker image
using the following Dockerfile:
FROM gitlab/gitlab-ce:latest
RUN sed -i '/^.*user_params\[:password_expires_at\] = Time.current if admin_making_changes_for_another_user.*$/s/^/#/' /opt/gitlab/embedded/service/gitlab-rails/lib/api/users.rb
This Dockerfile disables the mechanism that sets the password to the expired state after it is changed via API.
If you want to use this custom image, you have to build the image and replace all occurrences of
gitlab/gitlab-ce:latest in the following instructions by your chosen image name.
Pull the latest GitLab Docker image (only if you don’t use your custom gitlab image)
Run the image (and change the values for hostname and ports). Add
-p 2222:22 if cloning/pushing via SSH should be possible. As
GitLab runs in a Docker container and the default port for SSH (22)
is typically used by the host running Docker, we change the port
GitLab uses for SSH to 2222. This can be adjusted if needed.
Make sure to remove the comments from the command before running it.
docker run -itd --name gitlab \
    --hostname your.gitlab.domain.com \   # Specify the hostname
    --restart always \
    -m 3000m \                            # Optional argument to limit the memory usage of Gitlab
    -p 8081:80 -p 443:443 \               # Alternative 1: If you are NOT running your own NGINX instance
    -p <some port of your choosing>:80 \  # Alternative 2: If you ARE running your own NGINX instance
    -p 2222:22 \                          # Remove this if cloning via SSH should not be supported
    -v gitlab_data:/var/opt/gitlab \
    -v gitlab_logs:/var/log/gitlab \
    -v gitlab_config:/etc/gitlab \
    gitlab/gitlab-ce:latest
Wait a couple of minutes until the container is deployed and GitLab
is set up, then open the instance in your browser.
You can get the initial password for the root user using
docker exec gitlab cat /etc/gitlab/initial_root_password.
We recommend renaming the root admin user to artemis. To rename
the user, click on the image on the top right and select Settings.
Now select Account on the left and change the username. Use the
same password in the Artemis configuration file
application-artemis.yml
letsencrypt['enable'] = true                                       # GitLab 10.5 and 10.6 require this option
external_url "https://your.gitlab.domain.com"                      # Must use https protocol
letsencrypt['contact_emails'] = ['gitlab@your.gitlab.domain.com']  # Optional
nginx['redirect_http_to_https'] = true
nginx['redirect_http_to_https_port'] = 80
Reconfigure GitLab to generate the certificate.
# Save your changes and finally run
gitlab-ctl reconfigure
If this command fails, try using
gitlab-ctl renew-le-certs
Login to GitLab using the Artemis admin account and go to the profile
settings (upper right corner → Preferences)
(Optional, only necessary for local setup) Allow outbound requests to local network
There is a known limitation for the local setup: webhook URLs for the
communication between GitLab and Artemis and between GitLab and Jenkins
cannot include local IP addresses. This option can be deactivated in
GitLab on <https://gitlab-url>/admin/application_settings/network →
Outbound requests. Another possible solution is to register a local URL,
e.g. using ngrok, to be available over a domain
the Internet.
Adjust the monitoring-endpoint whitelist. Run the following command
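A sketch of such a command, appending the relevant settings to gitlab.rb inside the container (the monitoring_whitelist and gitlab_shell_ssh_port keys are standard GitLab Omnibus options; the container name gitlab follows the setup above):

```shell
# Whitelist the monitoring endpoint and set the SSH port, then reconfigure.
docker exec -it gitlab bash -c "
  echo \"gitlab_rails['monitoring_whitelist'] = ['0.0.0.0/0']\" >> /etc/gitlab/gitlab.rb &&
  echo \"gitlab_rails['gitlab_shell_ssh_port'] = 2222\" >> /etc/gitlab/gitlab.rb &&
  gitlab-ctl reconfigure"
```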
This will disable the firewall for all IP addresses. If you only want to
allow the server that runs Artemis to query the information, replace
0.0.0.0/0 with ARTEMIS.SERVER.IP.ADDRESS/32
If you use SSH and use a different port than 2222, you have to
adjust the port above.
Disable prometheus.
As we encountered issues with the Prometheus log files not being deleted and therefore filling up the disk space,
we decided to disable Prometheus within GitLab.
If you also want to disable prometheus, edit the configuration again using
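A sketch of such a command (prometheus_monitoring['enable'] is the standard GitLab Omnibus switch; the container name gitlab follows the setup above):

```shell
# Disable Prometheus in GitLab and apply the change.
docker exec -it gitlab bash -c "
  echo \"prometheus_monitoring['enable'] = false\" >> /etc/gitlab/gitlab.rb &&
  gitlab-ctl reconfigure"
```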
Artemis can clone/push the repositories during setup and for the online code editor using SSH.
If the SSH key is not present, the username + token will be used as fallback (and all git operations will use
HTTP(S) instead of SSH).
You first have to create an SSH key (locally), e.g. using ssh-keygen (more information on how to create an SSH
key can be found e.g. at ssh.com or
at gitlab.com).
The list of supported ciphers can be found at Apache Mina.
It is recommended to use a password to secure the private key, but it is not mandatory.
Please note that the private key file must be named id_rsa, id_dsa, id_ecdsa or id_ed25519,
depending on the ciphers used.
You now have to extract the public key and add it to GitLab.
Open the public key file (usually called id_rsa.pub (when using RSA)) and copy its content (you can also
use cat id_rsa.pub to show the public key).
Navigate to GITLAB-URL/-/profile/keys and add the SSH key by pasting the content of the public key.
<ssh-key-path> is the path to the folder containing the id_rsa file (but without the filename). It will
be used in the configuration of Artemis to specify where Artemis should look for the key and store
the known_hosts file.
<ssh-private-key-password> is the password used to secure the private key. It is also needed for the
configuration of Artemis, but can be omitted if no password was set (e.g. for development environments).
Note that upgrading to a major version may require following an upgrade path. You can view supported paths
here.
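Before starting the new container, a typical sequence is the following (a sketch; the container name gitlab and the image name follow the setup above):

```shell
# Keep the old container as a fallback, then fetch the new image.
docker stop gitlab
docker rename gitlab gitlab_old
docker pull gitlab/gitlab-ce:latest
```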
Start a GitLab container just as described in Start-Gitlab and wait for a couple of minutes. GitLab
should configure itself automatically. If there are no issues, you can
delete the old container using docker rm gitlab_old and the old
image (see docker images) using docker rmi <old-image-id>.
You can also remove all old images using docker image prune -a
The following steps describe how to deploy a pre-configured version of the Jenkins server.
This is ideal as a quickstart for developers. For a more detailed setup, see
Manual Jenkins Server Setup.
In a production setup, you have to at least change the user credentials (in the file jenkins-casc-config.yml) and
generate random access tokens and push tokens.
1. Create a new access token in GitLab named Jenkins and give it api and read_repository rights. You can
either do it manually or using the following command:
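A sketch of such a command using the Rails runner inside the container (the token value jenkins-gitlab-token is an example matching the template above; recent GitLab versions may additionally require an expires_at, which is included here):

```shell
# Create a personal access token for the root user via the Rails runner.
docker compose -f docker/gitlab-jenkins-mysql.yml exec gitlab gitlab-rails runner "
  token = User.find_by_username('root').personal_access_tokens.create(
    scopes: [:api, :read_repository], name: 'Jenkins',
    expires_at: 365.days.from_now)
  token.set_token('jenkins-gitlab-token')
  token.save!"
```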
Jenkins is then reachable under http://localhost:8082/ and you can login using the credentials specified
in jenkins-casc-config.yml (defaults to artemis_admin as both username and password).
You need to generate the ci-token and secret-push-token.
The application-local.yml must be adapted with the values configured in jenkins-casc-config.yml:
artemis:
    user-management:
        use-external: false
        internal-admin:
            username: artemis_admin
            password: artemis_admin
    version-control:
        url: http://localhost:8081
        user: artemis_admin
        password: artemis_admin
        ci-token: # pre-generated or replaced in Automated Jenkins Server step 3
    continuous-integration:
        user: artemis_admin
        password: artemis_admin
        url: http://localhost:8082
        secret-push-token: # pre-generated or replaced in Automated Jenkins Server step 3
        vcs-credentials: artemis_gitlab_admin_credentials
        artemis-authentication-token-key: artemis_notification_plugin_token
        artemis-authentication-token-value: artemis_admin
Open the src/main/resources/config/application-jenkins.yml and change the following:
Again, if you are using a development setup, the template in the beginning of this page already contains the
correct values.
Run the following command to get the latest Jenkins LTS Docker image.
docker pull jenkins/jenkins:lts
Create a custom docker image
In order to install and use Maven with Java in the Jenkins container,
you have to first install Maven, then download Java and finally
configure Maven to use Java instead of the default version.
You also need to install Swift and SwiftLint if you want to be able to
create Swift programming exercises.
To perform all these steps automatically, you can prepare a Docker
image:
Create a Dockerfile with the content found here <docker/jenkins/Dockerfile>.
Copy it into a file named Dockerfile, e.g. in
the folder /opt/jenkins/ using vim Dockerfile.
Now run the command docker build --no-cache -t jenkins-artemis .
This might take a while because Docker will download Java, but this
is only required once.
If you run your own NGINX or if you install Jenkins on a local development computer, then skip the next steps (4-7)
Create a file increasing the maximum file size for the nginx proxy.
The nginx-proxy uses a default file limit that is too small for the
plugin that will be uploaded later. Skip this step if you have your
own NGINX instance.
The NGINX default timeout is pretty low. For the plagiarism check and for unlocking student repos for the exam, a higher
timeout is advisable. Therefore, we write our own nginx.conf and load it into the container.
Run the NGINX proxy docker container; this will automatically set up
all reverse proxies and force HTTPS on all connections. (This image
will also set up proxies for all other running containers that have
the VIRTUAL_HOST and VIRTUAL_PORT environment variables.) Skip this
step if you have your own NGINX instance.
The NGINX proxy needs another Docker container to generate
Let's Encrypt certificates. Run the following command to start it (make
sure to change the email address). Skip this step if you have your
own NGINX instance.
Run Jenkins by executing the following command (change the hostname
and choose which port alternative you need)
docker run -itd --name jenkins \
    --restart always \
    -v jenkins_data:/var/jenkins_home \
    -v /var/run/docker.sock:/var/run/docker.sock \
    -v /usr/bin/docker:/usr/bin/docker:ro \
    -e VIRTUAL_HOST=your.jenkins.domain -e VIRTUAL_PORT=8080 \  # Alternative 1: If you are NOT using a separate NGINX instance
    -e LETSENCRYPT_HOST=your.jenkins.domain \                   # Only needed if Alternative 1 is used
    -p 8082:8080 \                                              # Alternative 2: If you ARE using a separate NGINX instance OR you ARE installing Jenkins on a local development computer
    -u root \
    jenkins/jenkins:lts
If you still need the old setup with Python & Maven installed locally, use jenkins-artemis instead of
jenkins/jenkins:lts.
Also note that you can omit the -u root, -v /var/run/docker.sock:/var/run/docker.sock and
-v /usr/bin/docker:/usr/bin/docker:ro parameters if you do not want to run Docker builds on the Jenkins controller
(but e.g. use remote agents).
Open Jenkins in your browser (e.g. localhost:8082) and setup the
admin user account (install all suggested plugins). You can get the
initial admin password using the following command.
# Jenkins highlights the password in the logs, you can't miss it
docker logs -f jenkins
or alternatively
docker exec jenkins cat /var/jenkins_home/secrets/initialAdminPassword
Set the chosen credentials in the Artemis configuration
application-artemis.yml
Note: The custom Jenkins Dockerfile takes advantage of the
Plugin Installation Manager Tool for Jenkins
to automatically install the plugins listed below. If you used the Dockerfile, you can skip these steps and
the installation of the Server Notification Plugin.
The list of plugins is maintained in docker/jenkins/plugins.yml.
You will need to install the following plugins (apart from the
recommended ones that got installed during the setup process):
Timestamper for adding the
time to every line of the build output (Timestamper might already be installed)
Pipeline for defining the
build description using declarative files (Pipeline might already be installed)
Note: This is a suite of plugins that will install multiple plugins
Pipeline Maven to use Maven within the pipelines. If you want to
use Docker for your build agents, you may also need to install
Docker Pipeline.
Matrix Authorization Strategy Plugin for configuring permissions
for users on a project and build plan level (Matrix Authorization Strategy might already be installed).
The plugins above (and the pipeline-setup associated with it) got introduced in Artemis 4.7.3.
If you are using exercises that were created before 4.7.3, you also have to install these plugins:
Please note that this setup is deprecated and will be removed in the future.
Please migrate to the new pipeline-setup if possible.
Multiple SCMs for combining the
exercise test and assignment repositories in one build
Post Build Task for preparing build
results to be exported to Artemis
Xvfb for exercises based on GUI
libraries, for which tests have to have some virtual display
Choose “Download now and install after restart” and check the
“Restart Jenkins when installation is complete and no jobs are running” box
Artemis needs to receive a notification after every build, which
contains the test results and additional commit information. For that
purpose, we developed a Jenkins plugin that can aggregate and POST
JUnit formatted results to any URL.
You can download the current release of the plugin
here
(Download the .hpi file). Go to the Jenkins plugin page (Manage
Jenkins → Manage Plugins) and install the downloaded file under the
Advanced tab under Upload Plugin
Create a new access token in GitLab named Jenkins and give it
api rights and read_repository rights. For detailed
instructions on how to create such a token follow Gitlab Access
Token.
Copy the generated token and create new Jenkins credentials:
Kind: GitLab API token
Scope: Global
API token: your.copied.token
Leave the ID field blank
The description is up to you
Go to the Jenkins settings Manage Jenkins → Configure System. There
you will find the GitLab settings. Fill in the URL of your GitLab
instance and select the just created API token in the credentials
dropdown. After you click on “Test Connection”, everything should
work fine. If you have problems finding the right URL for your local docker setup,
you can try http://host.docker.internal:8081 for Windows or http://docker.for.mac.host.internal:8081 for Mac
if GitLab is reachable over port 8081.
Copy the generated ID (e.g. ea0e3c08-4110-4g2f-9c83-fb2cdf6345fa)
of the new credentials and put it into the Artemis configuration file
application-artemis.yml
GitLab has to notify Jenkins build plans if there are any new commits to
the repository. The push notification that gets sent here is secured by
a token generated by Jenkins. In order to get this token, you have to do
the following steps:
Create a new item in Jenkins (use the Freestyle project type) and
name it TestProject
In the project configuration, go to Build Triggers → Build when a
change is pushed to GitLab and activate this option
Click on Advanced.
You will now have a couple of new options here, one of them being a
“Secret token”.
Click on the “Generate” button right below the text box for that
token.
Copy the generated value, let’s call it $gitlab-push-token
Apply these changes to the plan (i.e. click on Apply)
Perform a GET request to the following URL (e.g. with Postman)
using Basic Authentication and the username and password you chose
for the Jenkins admin account:
If you have xmllint installed, you can use this command, which will output the secret-push-token from
steps 9 and 10 (you may have to adjust the username and password):
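Assuming the local defaults from above (Jenkins on localhost:8082, admin credentials artemis_admin, and the Freestyle project named TestProject), such a command could look like:

```shell
# Fetch the job config and extract the encrypted secret token via XPath.
curl -s -u artemis_admin:artemis_admin \
    http://localhost:8082/job/TestProject/config.xml \
  | xmllint --xpath "//secretToken/text()" -
```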
Copy the secret-push-token value in the line
<secretToken>{secret-push-token}</secretToken>. This is the encrypted value of the gitlab-push-token
you generated in step 5.
Now, you can delete this test project and input the following values
into your Artemis configuration application-artemis.yml (replace
the placeholders with the actual values you wrote down)
In a local setup, you have to disable CSRF, otherwise some API endpoints will return HTTP status 403 Forbidden.
This is done by executing the following command:
docker compose -f docker/<Jenkins setup to be launched>.yml exec -T jenkins dd of=/var/jenkins_home/init.groovy < docker/jenkins/jenkins-disable-csrf.groovy
The last step is to disable the use-crumb option in application-local.yml:
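The relevant fragment in application-local.yml could look like this (matching the template at the top of this page):

```yaml
jenkins:
    use-crumb: false
```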
In order to upgrade Jenkins to a newer version, you need to rebuild the Docker image targeting the new version.
The stable LTS versions can be viewed through the changelog
and the corresponding Docker image can be found on
dockerhub.
Open the Jenkins Dockerfile and replace the value of FROM with jenkins/jenkins:lts.
After running the command docker pull jenkins/jenkins:lts, this will use the latest LTS version
in the following steps.
You can also use a specific LTS version.
For example, if you want to upgrade Jenkins to version 2.289.2, you will need to use the
jenkins/jenkins:2.289.2-lts image.
If you’re using docker compose, you can simply use the following command and skip the next steps.
You can remove the backup container if it’s no longer needed:
docker rm jenkins_old
You should also update the Jenkins plugins regularly due to security
reasons. You can update them directly in the Web User Interface in the
Plugin Manager.
Go to Manage Jenkins > Manage Nodes and Clouds > master
Configure your master node like this (adjust the number of executors, if needed). Make sure to add the docker label.
An alternative way of adding a build agent that will use Docker (similar to the remote agents below) but running
locally is to use the jenkins/ssh-agent Docker image.
You might want to run the builds on additional Jenkins agents, especially if a large number of students use
the system at the same time.
Jenkins supports remote build agents: The actual compilation of the students’ submissions happens on these
other machines, but the whole process is transparent to Artemis.
This guide explains setting up a remote agent on an Ubuntu virtual machine that supports docker builds.
Add a new user to the remote machine that Jenkins will use: sudo adduser --disabled-password --gecos "" jenkins
Add the jenkins user to the docker group (this allows the jenkins user to interact with Docker):
sudo usermod -a -G docker jenkins
Generate a new SSH key locally (e.g. using ssh-keygen) and add the public key to the .ssh/authorized_keys
file of the jenkins user on the agent VM.
Validate that you can connect to the build agent machine using SSH and the generated private key and validate that
you can use docker (docker ps should not show an error)
Log in with your normal account on the build agent machine and install Java: sudo apt install default-jre
Add a new secret in Jenkins, enter the private key you just generated and add the passphrase, if set:
Add a new node (select a name and select Permanent Agent):
Set the number of executors so that it matches your machine’s specs: This is the number of concurrent builds
this agent can handle. It is recommended to match the number of cores of the machine,
but you might want to adjust this later if needed.
Set the remote root directory to /home/jenkins/remote_agent.
Set the usage to Only build jobs with label expressions matching this node.
This ensures that only docker-jobs will be built on this agent, and not other jobs.
Add a label docker to the agent.
Set the launch method to Launch via SSH and add the host of the machine.
Select the credentials you just created and select Manually trusted key Verification Strategy
as Host key verification Strategy.
Save it.
Wait for some moments while Jenkins installs its remote agent on the agent’s machine.
You can track the progress using the Log page when selecting the agent. System information should also be available.
Change the settings of the master node to be used only for specific jobs.
This ensures that the docker tasks are not executed on the master agent but on the remote agent.
Artemis supports user management in Jenkins as of version 4.11.0. Creating an account in Artemis will also create an
account on Jenkins using the same password. This enables users to login and access Jenkins. Updating and/or deleting
users from Artemis will also lead to updating and/or deleting from Jenkins.
Unfortunately, Jenkins does not provide a REST API for user management, which presents the following caveats:
The username of a user is treated as a unique identifier in Jenkins.
It’s not possible to update an existing user with a single request.
We update by deleting the user from Jenkins and recreating it with the updated data.
In Jenkins, users are created on an on-demand basis.
For example, when a build is performed, its change log is computed, and as a result commits from users
whom Jenkins has never seen may be discovered and accounts created for them.
Since Jenkins users may be re-created automatically, issues may occur such as 1) creating a user, deleting it,
and then re-creating it and 2) changing the username of the user and reverting back to the previous one.
Updating a user will re-create it in Jenkins and therefore remove any additionally saved Jenkins-specific
user data such as API access tokens.
Artemis takes advantage of the Project-based Matrix Authorization Strategy plugin to support build plan
access control in Jenkins.
This enables specific Artemis users to access build plans and execute actions such as triggering a build.
This section explains the changes required in Jenkins in order to set up build plan access control:
Navigate to Manage Jenkins -> Manage Plugins -> Installed and make sure that you have the
Matrix Authorization Strategy plugin installed
Navigate to Manage Jenkins -> Configure Global Security and navigate to the “Authorization” section
Select the “Project-based Matrix Authorization Strategy” option
In the table make sure that the “Read” permission under the “Overall” section is assigned to
the “Authenticated Users” user group.
In the table make sure that the “Administer” permission is assigned to all administrators.
You are finished. If you want to fine-tune permissions assigned to teaching assistants and/or instructors,
you can change them within the JenkinsJobPermission.java file.
This section describes how to set up a programming exercise environment
based on GitLab CI and GitLab.
Note
Depending on your operating system, it might not work with the predefined values (host.docker.internal).
Therefore, it might be necessary to adapt these with e.g. your local IP address.
This section describes how to set up a development environment for Artemis with GitLab and GitLab CI.
The same basic steps as for a GitLab and Jenkins setup apply, but the steps that describe generating tokens for Jenkins can be skipped.
For a production setup of GitLab, also see the documentation of the GitLab and Jenkins setup.
Go to http://host.docker.internal/-/profile/personal_access_tokens and generate an access token with all scopes.
This token is used in the Artemis configuration as artemis.version-control.token.
Allow outbound requests to local network
For setting up the webhook between Artemis and GitLab, it is necessary to allow requests to the local network.
Go to http://host.docker.internal/admin/application_settings/network and allow the outbound requests.
More information about this aspect can be found in the GitLab setup instructions (step 12).
You should now find the runner in the list of runners (http://host.docker.internal/admin/runners)
Note
Adding a runner in a production setup works the same way.
The GitLab administration page also contains alternative ways of setting up GitLab runners.
All variants should allow passing the configuration options tag-list, run-untagged, locked, and access-level, similarly to the Docker command above.
If these options are omitted, Artemis might not use this runner to run the tests for exercise submissions.
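The Docker command referenced above is not included in this excerpt. As an illustration, a Docker-based runner registration that passes these options could look like the following sketch; the volume path and the registration token are placeholders, and the tag name artemis is an assumption to be adapted to your setup:

```shell
# Hypothetical example: register a GitLab runner with the options Artemis expects.
# <registration-token> must be taken from the GitLab admin runners page.
docker run --rm -v /srv/gitlab-runner/config:/etc/gitlab-runner gitlab/gitlab-runner register \
    --non-interactive \
    --executor docker \
    --docker-image docker:latest \
    --url "http://host.docker.internal/" \
    --registration-token "<registration-token>" \
    --tag-list "artemis" \
    --run-untagged "true" \
    --locked "false" \
    --access-level "not_protected"
```

Afterwards, verify on the admin runners page that the tag list and the other options were applied.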
Make sure that the database is empty and contains no data from previous Artemis runs.
Generate authentication token
The notification plugin has to authenticate to upload the test results.
Therefore, a random string has to be generated, e.g., via a password generator.
This should be used in place of notification-plugin-token value in the example config below.
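For example, assuming the openssl CLI is available, a sufficiently random token can be generated like this (any good password generator works equally well):

```shell
# Generate a 64-character hexadecimal token to use as the notification-plugin-token.
openssl rand -hex 32
```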
Configure Artemis
For local development, copy the following configuration into the application-local.yml file and adapt it with the values from the previous steps.
artemis:
    user-management:
        use-external: false
        internal-admin:
            username: artemis_admin
            password: gHn7JlggD9YPiarOEJSx19EFp2BDkkq9
        login:
            account-name: TUM
    version-control:
        url: http://host.docker.internal:80
        user: root
        password: password # change this value
        token: gitlab-personal-access-token # change this value
    continuous-integration:
        build-timeout: 30
        artemis-authentication-token-value: notification-plugin-token # change this value
    git:
        name: Artemis
        email: artemis.in@tum.de
server:
    url: http://host.docker.internal:8080
Note
In GitLab, the password of a user must not be the same as the username and must fulfill specific requirements.
Therefore, there is a random password in the example above.
Start Artemis
Start Artemis with the gitlab and gitlabci profiles.
The Athene service is running on a dedicated machine and is addressed via
HTTP. We need to extend the configuration in the file
src/main/resources/config/application-artemis.yml like so:
The Apollon conversion service is running on a dedicated machine and is addressed via
HTTP. We need to extend the configuration in the file
src/main/resources/config/application-artemis.yml like so:
On the first startup, there might be issues with the text_block table.
You can resolve the issue by executing ALTER TABLE text_block CONVERT TO CHARACTER SET utf8mb4 COLLATE utf8mb4_unicode_ci;
in your database.
One typical problem in the development setup is that an exception occurs during the database initialization.
Artemis uses Liquibase to automatically upgrade the database schema
after the data model has changed.
This ensures that the changes can also be applied to the production server.
In case you encounter errors with Liquibase checksum values:
Run the following command in your terminal / command line: ./gradlew liquibaseClearChecksums
You can manually adjust the checksum for a breaking changelog: UPDATE `DATABASECHANGELOG` SET `MD5SUM` = NULL WHERE `ID` = '<changelogId>'
If you are using a machine with limited RAM (e.g. ~8 GB RAM) you might have issues starting the Artemis Client.
You can resolve this by following the description in Using the command line
When setting up Bamboo, Bitbucket, and Jira at the same time within the same browser,
you might receive the message that the Jira token expired.
You can resolve the issue by using another browser for configuring Jira,
as there seems to be a synchronization problem within the browser.
When you create a new programming exercise and receive the error message
The project <ProgrammingExerciseName> already exists in the CI Server. Please choose a different short name! and you have double-checked that this project does not exist within
the CI Server Bamboo, you might have to renew the trial licenses for the Atlassian products.
Update Atlassian Licenses
You need to create new Atlassian Licenses, which requires you to retrieve the server id and navigate to the license editing page after
creating new trial licenses.
Bamboo: Retrieve the Server ID and edit the license in License key details (Administration > Licensing)
Bitbucket: Retrieve the Server ID and edit the license in License Settings (Administration > Licensing)
Jira: Retrieve the Server ID (System > System info) and edit the JIRA Service Desk License key
in Versions & licenses
When you push new code to Bitbucket (from your local machine or from the online code editor) for a Java or Kotlin exercise and no result is displayed in Artemis,
check the corresponding Bamboo build plan. If the plan failed and it says “No failed test found. A possible compilation error occurred.”, then check the logs.
If it says ./gradlew: Permission denied, then go to the build plan configuration (Actions -> Configure plan -> Default Job) and add chmod +x gradlew to the “Tests” script before the ./gradlew clean test line.
If it says Execution failed for task ':compileJava'. > invalid source release: 17, then change the ./gradlew clean test command in the build configuration to ./gradlew clean test -Dorg.gradle.java.home=/usr/lib/jvm/java-17-oracle (pointing to the Java installation that you added as a server capability).
There are certain scenarios, where a setup with multiple instances of the application server is required.
This can e.g. be due to special requirements regarding fault tolerance or performance.
Artemis also supports this setup (which is also used at the Chair for Applied Software Engineering at TUM).
Multiple instances of the application server are used to distribute the load:
A load balancer (typically a reverse proxy such as nginx) is added that distributes the requests
to the different instances.
Note: This documentation focuses on the practical setup of this distributed setup.
More details regarding the theoretical aspects can be found in the Bachelor’s Thesis
Securing and Scaling Artemis WebSocket Architecture, which can be found here:
pdf.
Artemis uses a cache provider that supports distributed caching: Hazelcast.
All instances of Artemis form a so-called cluster that allows them to synchronize their cache.
You can use the configuration argument spring.hazelcast.interface to configure the interface on which Hazelcast
will listen.
One problem that arises with a distributed setup is that all instances have to know each other in order
to create this cluster.
This is problematic if the instances change dynamically.
Artemis uses a discovery service to solve the issue (named JHipster Registry).
{{artemis_eureka_urls}} must be the URL where Eureka is reachable,
{{artemis_ip_address}} must be the IP under which this instance is reachable and
{{artemis_eureka_instance_id}} must be a unique identifier for this instance.
You also have to set the value jhipster.registry.password to the password of the registry
(which you will set later).
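As an illustration, the three placeholders and the registry password fit into the instance configuration roughly as follows. The key names follow the standard JHipster/Eureka configuration; treat the exact layout as an assumption and compare it with the configuration template of your Artemis version:

```yaml
eureka:
    client:
        service-url:
            defaultZone: "{{artemis_eureka_urls}}"
    instance:
        prefer-ip-address: true
        ip-address: "{{artemis_ip_address}}"
        instance-id: "{{artemis_eureka_instance_id}}"
jhipster:
    registry:
        password: "the-registry-password"   # must match the password set on the registry
```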
Note that Hazelcast (which requires Eureka) binds to 127.0.0.1 by default to prevent other instances
from forming a cluster without manual intervention.
If you set up the cluster on multiple machines (which you should do for a production setup),
you have to set the value spring.hazelcast.interface to the IP address of the respective machine.
Hazelcast will then bind to this interface rather than 127.0.0.1,
which allows other instances to establish connections to the instance.
This setting must be set on every instance, with the IP address adjusted accordingly.
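A minimal sketch of this per-instance setting; the IP addresses below are examples and must be replaced with each machine’s own address:

```yaml
# Instance 1 (application-prod.yml) — example IP
spring:
    hazelcast:
        interface: "192.168.1.10"
---
# Instance 2 (application-prod.yml) — example IP
spring:
    hazelcast:
        interface: "192.168.1.11"
```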
# Common configuration shared between all applications
configserver:
    name: Artemis JHipster Registry
    status: Connected to the Artemis JHipster Registry
jhipster:
    security:
        authentication:
            jwt:
                secret: ''
                base64-secret: THE-SAME-TOKEN-THAT-IS-USED-ON-THE-ARTEMIS-INSTANCES
eureka:
    client:
        service-url:
            defaultZone: http://admin:${jhipster.registry.password}@localhost:8761/eureka/
WebSockets should also be synchronized (so that a user connected to one instance can perform an action
which causes an update to users on different instances, without having to reload the page - such as quiz starts).
We use a so-called broker for this (named Apache ActiveMQ Artemis).
Adjust the configuration of the broker: sudo vim /opt/broker/broker1/etc/broker.xml
<?xml version='1.0'?>
<configuration xmlns="urn:activemq"
               xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
               xmlns:xi="http://www.w3.org/2001/XInclude"
               xsi:schemaLocation="urn:activemq /schema/artemis-configuration.xsd">
    <core xmlns="urn:activemq:core"
          xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
          xsi:schemaLocation="urn:activemq:core ">
        <name>0.0.0.0</name>
        <journal-pool-files>10</journal-pool-files>
        <acceptors>
            <!-- STOMP Acceptor. -->
            <acceptor name="stomp">tcp://0.0.0.0:61613?tcpSendBufferSize=1048576;tcpReceiveBufferSize=1048576;protocols=STOMP;useEpoll=true;heartBeatToConnectionTtlModifier=6</acceptor>
        </acceptors>
        <connectors>
            <connector name="netty-connector">tcp://localhost:61616</connector>
        </connectors>
        <security-settings>
            <security-setting match="#">
                <permission type="createNonDurableQueue" roles="amq"/>
                <permission type="deleteNonDurableQueue" roles="amq"/>
                <permission type="createDurableQueue" roles="amq"/>
                <permission type="deleteDurableQueue" roles="amq"/>
                <permission type="createAddress" roles="amq"/>
                <permission type="deleteAddress" roles="amq"/>
                <permission type="consume" roles="amq"/>
                <permission type="browse" roles="amq"/>
                <permission type="send" roles="amq"/>
                <!-- we need this otherwise ./artemis data imp wouldn't work -->
                <permission type="manage" roles="amq"/>
            </security-setting>
        </security-settings>
        <address-settings>
            <!-- default for catch all -->
            <address-setting match="#">
                <dead-letter-address>DLQ</dead-letter-address>
                <expiry-address>ExpiryQueue</expiry-address>
                <redelivery-delay>0</redelivery-delay>
                <!-- with -1 only the global-max-size is in use for limiting -->
                <max-size-bytes>-1</max-size-bytes>
                <message-counter-history-day-limit>10</message-counter-history-day-limit>
                <address-full-policy>PAGE</address-full-policy>
                <auto-create-queues>true</auto-create-queues>
                <auto-create-addresses>true</auto-create-addresses>
                <auto-create-jms-queues>true</auto-create-jms-queues>
                <auto-create-jms-topics>true</auto-create-jms-topics>
            </address-setting>
        </address-settings>
    </core>
</configuration>
Service configuration: sudo vim /etc/systemd/system/broker1.service
The last (and also easiest) part to configure is the file system:
You have to provide a folder that is shared between all instances of the application server (e.g. by using NFS).
You then have to set the following values in the application config:
Where {{artemis_repo_basepath}} is the path to the shared folder
The file system stores (as its name suggests) files; these include, for example, submissions to file upload exercises,
repositories that are checked out for the online editor, course icons, etc.
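The concrete values referenced above are not shown in this excerpt. Based on typical Artemis configuration files, they look roughly like the following sketch; the key names and subfolder names are assumptions to verify against your Artemis version:

```yaml
artemis:
    repo-clone-path: {{artemis_repo_basepath}}/repos/
    repo-download-clone-path: {{artemis_repo_basepath}}/repos-download/
    file-upload-path: {{artemis_repo_basepath}}/uploads/
    submission-export-path: {{artemis_repo_basepath}}/exports/
```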
Artemis uses scheduled tasks in various scenarios: e.g. to lock repositories on due date, clean up unused resources, etc.
As we now run multiple instances of Artemis, we have to ensure that the scheduled tasks are not executed multiple times.
Artemis uses two approaches for this:
Tasks for quizzes (e.g. evaluation once the quiz is due) are automatically distributed (using Hazelcast)
Tasks for other exercises are only scheduled on one instance:
You must add the Scheduling profile to exactly one instance of your cluster.
This instance will then perform scheduled tasks whereas the other instances will not.
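For example, assuming the profile is named scheduling (verify the name for your Artemis version), the active profiles of exactly one instance could look like this, while all other instances omit the last entry:

```yaml
spring:
    profiles:
        active: prod,bamboo,bitbucket,jira,artemis,scheduling
```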
You have to change the nginx configuration (of Artemis) to ensure that the load is distributed between all instances.
This can be done by defining an upstream (containing all instances) and forwarding all requests to this upstream.
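A sketch of such an nginx configuration; the instance addresses and the server name are placeholders, and the TLS details are omitted:

```nginx
# Distribute requests across all Artemis instances.
upstream artemis {
    server 192.168.1.10:8080;   # example: instance 1
    server 192.168.1.11:8080;   # example: instance 2
}

server {
    listen 443 ssl;
    server_name artemis.example.com;   # placeholder

    location / {
        proxy_pass http://artemis;
        # WebSocket support is required for live updates:
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
    }
}
```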
We DON’T support the usage of the Compose standalone binary (docker-compose command) as its installation
method is no longer supported by Docker.
We recommend the latest version of Docker Desktop or Docker Engine and Docker CLI with Docker Compose Plugin.
The minimum version for Docker Compose is 1.27.0, as this is the first version that supports the
latest Compose file format.
Hint
Make sure that Docker Desktop has enough memory (~6 GB). To adapt it, go to Settings -> Resources.
Check that all local network ports used by Docker Compose are free (e.g., that you haven’t started a local MySQL server
when you want to start a Docker Compose instance of mysql)
Run docker compose pull && docker compose up in the directory docker/
Run docker compose down in the directory docker/ to stop and remove the Docker containers
Tip
The docker compose pull command is only necessary as an extra step the first time;
otherwise Artemis will be built from source, since you don’t have an Artemis image locally yet.
For Arm-based Macs, Dev boards, etc. you will have to build the Artemis Docker Image first as we currently do not
distribute Docker Images for these architectures.
Overview of the Artemis Docker / Docker Compose structure
The easiest way to configure a local deployment via Docker is a deployment with a docker compose file.
In the directory docker/ you can find the following docker compose files for different setups:
artemis-dev-mysql.yml: Artemis-Dev-MySQL Setup containing the development build of Artemis and a MySQL DB
artemis-prod-mysql.yml: Artemis-Prod-MySQL Setup containing the production build of Artemis and a MySQL DB
atlassian.yml: Atlassian Setup containing a Jira, Bitbucket and Bamboo instance
(see Bamboo, Bitbucket and Jira Setup Guide
for the configuration of this setup)
gitlab-gitlabci.yml: GitLab-GitLabCI Setup containing a GitLab and GitLabCI instance
gitlab-jenkins.yml: GitLab-Jenkins Setup containing a GitLab and Jenkins instance
(see Gitlab Server Quickstart Guide for the configuration of this setup)
monitoring.yml: Prometheus-Grafana Setup containing a Prometheus and Grafana instance
mysql.yml: MySQL Setup containing a MySQL DB instance
nginx.yml: Nginx Setup containing a preconfigured Nginx instance
postgresql.yml: PostgreSQL Setup containing a PostgreSQL DB instance
There is also a single docker-compose.yml in the directory docker/ which mirrors the setup of artemis-prod-mysql.yml.
This should provide a quick way, without manual changes necessary, for new contributors to start up an Artemis instance.
If the documentation just mentions running docker compose without a -f <file.yml> argument, it is
assumed that you are running the command from the docker/ directory.
For each service being used in these docker compose files a base service (containing similar settings)
is defined in the following files:
artemis.yml: Artemis Service
mysql.yml: MySQL DB Service
nginx.yml: Nginx Service
postgresql.yml: PostgreSQL DB Service
gitlab.yml: GitLab Service
jenkins.yml: Jenkins Service
For testing mails or SAML logins you can append the following services to any setup with an artemis container:
mailhog.yml: Mailhog Service (email testing tool)
saml-test.yml: Saml-Test Service (SAML Test Identity Provider for testing SAML features)
Base services (compose file with just one service) and setups (compose files with multiple services)
should be located directly in docker/.
Additional files like configuration files, Dockerfile, …
should be in a subdirectory with the base service or setup name (docker/<baseserviceorsetupname>/).
Everything related to the Docker Image of Artemis (built by the Dockerfile) can be found
in the Server Setup section.
All Artemis related settings changed in Docker compose files are described here.
The artemis.yml base service (e.g. in the artemis-prod-mysql.yml setup) defaults to the latest
Artemis Docker image tag in your local Docker registry.
If you want to build the checked-out version, run docker compose build artemis-app before starting Artemis.
If you want a specific version from the GitHub container registry, change the image: value to the desired image
for the artemis-app service and run docker compose pull artemis-app.
See the Debugging with Docker section for detailed information.
In all development docker compose setups like artemis-dev-mysql.yml Java Remote Debugging is enabled by default.
To keep the documentation short, we will use the standard form of docker compose COMMAND from this point on.
You can also use the following commands with the -f docker/<setup to be launched>.yml argument pointing
to a specific setup.
app container:
docker compose exec artemis-app bash or, if the container is not yet running:
docker compose run --rm artemis-app bash
mysql container:
docker compose exec mysql bash or directly into mysql: docker compose exec mysql mysql
Docker is a platform for developing, shipping and running applications.
In our case, we will use it to build the images which we will deploy.
It is also needed by k3d to create a cluster; the cluster nodes are deployed as Docker containers.
Docker Hub is a service provided by Docker for finding and sharing container images.
A DockerHub account is needed to push the Artemis image which will be used by the Kubernetes deployment.
k3d is a lightweight wrapper to run k3s which is a lightweight Kubernetes distribution in Docker.
k3d makes it very easy to create k3s clusters especially for local deployment on Kubernetes.
Windows users can use choco to install it. More details can be found in the link under Other Installation Methods
kubectl is the Kubernetes command-line tool, which allows you to run commands against Kubernetes clusters.
It can be used to deploy applications, inspect and manage cluster resources, and view logs.
To be able to deploy Artemis on Kubernetes, you need to set up a cluster.
A cluster is a set of nodes that run containerized applications.
Kubernetes clusters allow for applications to be more easily developed, moved and managed.
With the following commands, you will set up one cluster with three agents as well as Rancher
which is a platform for cluster management with an easy to use user interface.
IMPORTANT: Before you continue make sure Docker has been started.
Set environment variables
The CLUSTER_NAME, RANCHER_SERVER_HOSTNAME and KUBECONFIG_FILE environment variables need to be set
so that they can be used in the next commands.
If you don’t want to set environment variables you can replace their values in the commands.
What you need to do is replace $CLUSTER_NAME with “k3d-rancher”, $RANCHER_SERVER_HOSTNAME with “rancher.localhost”
and $KUBECONFIG_FILE with “k3d-rancher.yml”.
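For macOS/Linux, setting the variables could look like this (Windows users would use $env: instead):

```shell
# Names used by the following k3d/kubectl commands.
export CLUSTER_NAME="k3d-rancher"
export RANCHER_SERVER_HOSTNAME="rancher.localhost"
export KUBECONFIG_FILE="k3d-rancher.yml"
```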
With the help of the command block below, you can create a cluster with one server and three agents,
i.e. four nodes in total.
Your deployments will be distributed almost equally among the four nodes.
Using k3d cluster list you can see whether your cluster is created and how many of its nodes are running.
Using kubectl get nodes you can see the status of each node of the newly created cluster.
You should also write the cluster configuration into the KUBECONFIG_FILE.
This configuration will be later needed when you are creating deployments.
You can either set the path to the file as an environment variable or replace it with “<path-to-kubeconfig-file>”
when needed.
For macOS/Linux:
k3d cluster create $CLUSTER_NAME --api-port 6550 --servers 1 --agents 3 --port 443:443@loadbalancer --wait
k3d cluster list
kubectl get nodes
k3d kubeconfig get $CLUSTER_NAME > $KUBECONFIG_FILE
export KUBECONFIG=$KUBECONFIG_FILE
For Windows:
k3d cluster create $env:CLUSTER_NAME --api-port 6550 --servers 1 --agents 3 --port 443:443@loadbalancer --wait
k3d cluster list
kubectl get nodes
k3d kubeconfig get ${env:CLUSTER_NAME} > $env:KUBECONFIG_FILE
$env:KUBECONFIG=($env:KUBECONFIG_FILE)
Install cert-manager
cert-manager is used to add certificates and certificate issuers as resource types in Kubernetes clusters.
It simplifies the process of obtaining, renewing and using those certificates.
It can issue certificates from a variety of supported sources, e.g. Let’s Encrypt, HashiCorp Vault, Venafi.
In our case, it will issue self-signed certificates to our Kubernetes deployments to secure the communication
between the different deployments.
Before the installation, you need to add the Jetstack repository and update the local Helm chart repository cache.
cert-manager has to be installed in a separate namespace called cert-manager so one should be created as well.
After the installation, you can check the status of the installation.
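The installation commands themselves are not included in this excerpt. A typical cert-manager installation via Helm follows the steps above like this; the chart version flag is omitted here and may be required in practice, so verify against the cert-manager documentation:

```shell
# Add the Jetstack repository and refresh the local chart cache.
helm repo add jetstack https://charts.jetstack.io
helm repo update

# cert-manager lives in its own namespace.
kubectl create namespace cert-manager

# Install the chart, including its custom resource definitions.
helm install cert-manager jetstack/cert-manager \
    --namespace cert-manager \
    --set installCRDs=true

# Check the status of the installation.
kubectl get pods --namespace cert-manager
```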
Rancher is a Kubernetes management tool that allows you to create and manage Kubernetes deployments
more easily than with the CLI tools.
You can install Rancher using Helm - the package manager for Kubernetes.
It has to be installed in a namespace called cattle-system and
we should create such a namespace before the installation itself.
During the installation, we set the namespace and the hostname on which Rancher will be accessible.
Then we can check the installation status.
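A typical Rancher installation with Helm follows this pattern; treat the exact chart repository and flags as assumptions to verify against the Rancher documentation for your version:

```shell
# Add the Rancher chart repository and refresh the cache.
helm repo add rancher-latest https://releases.rancher.com/server-charts/latest
helm repo update

# Rancher must be installed in the cattle-system namespace.
kubectl create namespace cattle-system

# Install Rancher with the hostname chosen earlier.
helm install rancher rancher-latest/rancher \
    --namespace cattle-system \
    --set hostname=$RANCHER_SERVER_HOSTNAME

# Check the installation status.
kubectl -n cattle-system rollout status deploy/rancher
```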
You will be notified that the connection is not private.
The reason is that the Rancher deployment uses a self-signed certificate from an unknown authority (‘dynamiclistener-ca’).
It is used for secure communication between internal components.
Since it’s your local environment this is not an issue and you can proceed to the website.
If you can’t continue using the Chrome browser, you can try with another browser, e.g. Firefox.
You will be prompted to set a password which later will be used to log in to Rancher.
The password will often be used, so you shouldn’t forget it.
Then you should save the Rancher Server URL, please use the predefined name.
After saving, you will be redirected to the main page of Rancher, where you see your clusters.
There will be one local cluster.
You can open the workloads using the menu, there will be no workloads deployed at the moment.
Create a new namespace in Rancher
Namespaces are virtual clusters backed by the same physical cluster. Namespaces provide a scope for names.
Names of resources need to be unique within a namespace, but not across namespaces.
Usually, different namespaces are created to separate deployment environments, e.g. development, staging, and production.
For our development purposes, we will create a namespace called artemis.
It can be done easily using Rancher.
Navigate to Namespaces using the top menu of Rancher
Select Add Namespace to open the form for namespace creation
Put artemis as the namespace’s name and select the Create button
The Artemis image will be stored and managed in DockerHub. Kubernetes will pull it from there and deploy it afterwards.
After you log in to your DockerHub account you can create as many public repositories
as you want.
To create a repository you need to select the Create repository button.
DockerHub:
Then fill in the repository name with artemis and use the Create button to create your repository.
The username in DockerHub is called Docker ID.
You need to set your Docker ID in the artemis-deployment.yml resource so that Kubernetes knows
where to pull the image from.
Open the src/main/kubernetes/artemis/deployment/artemis-deployment.yml file and edit
template:
    spec:
        containers:
            - image: <DockerId>/artemis

and replace <DockerId> with your Docker ID in DockerHub
To run Artemis, you need to configure its User Management, Version Control, and Continuous Integration.
You can run it either with Jira, Bitbucket, and Bamboo, or with Jenkins and GitLab.
Make sure to configure the src/main/resources/config/application-artemis.yml file with the proper configuration
for User Management, Version Control and Continuous Integration.
You should skip setting the passwords and tokens, since the Docker image that we are going to build would
otherwise include those secrets.
You can refer to chapter Add/Edit Secrets for setting those values.
If you want to configure Artemis with Bitbucket, Jira, and Bamboo, you can set up a connection to existing staging or
production deployments.
If you want to configure Artemis with local user management and no programming exercises, continue with
Configure Local User Management.
If you want to run with local user management and no programming exercise setup, follow these steps:
1. Go to the src/main/resources/config/application-artemis.yml file, and set use-external in
the user-management section to false.
If you have created an additional application-local.yml file as it is described in the
Setup documentation, make sure to edit this one.
Another possibility is to add the variable directly in src/main/kubernetes/artemis/configmap/artemis-configmap.yml.
data:
    artemis.user-management.use-external: "false"
2. Remove the jira profile from the SPRING_PROFILES_ACTIVE field in the ConfigMap found at
src/main/kubernetes/artemis/configmap/artemis-configmap.yml
Now you can continue with the next step, Build Artemis
ConfigMaps are used to store configuration data in key-value pairs.
You can change the current Spring profiles used for running Artemis in the
src/main/kubernetes/artemis/configmap/artemis-configmap.yml file by changing SPRING_PROFILES_ACTIVE.
The current ones are set to use Bitbucket, Jira and Bamboo.
If you want to use Jenkins and GitLab please replace bamboo,bitbucket,jira with jenkins,gitlab.
You can also change prod to dev if you want to run in development profile.
Kustomization files declare the resources that will be deployed in one place and with their help we can do
the deployment with only one command.
Once you have your Artemis image pushed to Docker you can use the kustomization.yml file in src/main/kubernetes
to deploy all the Kubernetes resources.
You can do it by executing the following command:
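The command itself is missing from this excerpt. Given a kustomization.yml in src/main/kubernetes, the usual kustomize-based apply would be the following; this is an assumption based on standard kubectl usage, not a quote from the original docs:

```shell
# Apply all resources declared by the kustomization file to the cluster.
kubectl apply -k src/main/kubernetes
```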
It may take some time but in the end, you should see that all the workloads have Active status.
In case there is a problem with some workloads you can check the logs to see what the issue is.
You will get the same “Connection is not private” issue as you did when opening https://rancher.localhost/.
As said before this is because a self-signed certificate is used and it is safe to proceed.
It takes several minutes for the application to start.
If you get a “Bad Gateway” error it may happen that the application has not been started yet.
Wait several minutes and if you still have this issue or another one you can check out the pod logs
(described in the next chapter).
Open the workload whose logs you need to check.
There is a list of pods. Open the menu for one of the pods and select View Logs.
A pop-up with the logs will be opened.
If the Artemis application is successfully deployed but there is an error while trying to run the application,
the reason is most likely related to the Artemis yml configuration files.
One of the common errors is related to a missing server.url variable.
You can fix it by adding it as an environment variable to the Artemis deployment.
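For illustration, such an environment variable could be added to the container spec of the Artemis deployment like this; the variable name and the URL value are hypothetical and must be adapted to your deployment:

```yaml
# Fragment of artemis-deployment.yml (hypothetical values)
spec:
    containers:
        - name: artemis
          env:
              - name: SERVER_URL
                value: "https://rancher.localhost/"   # placeholder; use your Artemis URL
```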
The main application is stored under /src/main and the main folders are:
resources - script, config files and templates are stored here.
config - different configurations (production, development, etc.) for application.
liquibase - contains the master.xml file where all the changelogs from the changelog folder are specified.
When you want to make changes to the database, you will need to add a new changelog file here.
To understand how to create a new changelog file, you can check existing changelog files or read the documentation: https://www.liquibase.org/documentation/databasechangelog.html.
java - Artemis Spring Boot application is located here. It contains the following folders:
config - different classes for configuring database, Sentry, Liquibase, etc.
domain - all the entities and data classes are located here (the model of the server application).
exception - store custom types of exceptions here. We encourage creating custom exceptions to help other developers understand what problem exactly happened.
This can also be helpful when we want to provide specific exception handling logic.
security - contains different POJOs (simple classes that don’t implement/extend any interface/class and don’t have annotations) and component classes related to security.
repository - used to access or change objects in the database. There are several techniques to query database: named queries, queries with SpEL expressions and Entity Graphs.
service - represents the controller of the server application. Add the application logic here. Retrieve and change objects using repositories.
web - contains two folders:
rest - contains REST controllers that act as the view of the server application. Validate input and security here, but do not include complex application logic
websocket - contains controllers that handle real-time communication with the client based on the Websocket protocol. Use the MessagingTemplate to push data to the client or to notify the client about events.
All variables, methods and classes should use camelCase style; the only difference is that the first letter of a class should be capitalized. Most importantly, use intention-revealing, pronounceable names.
One method should be responsible for only one action, it should do it well and do nothing else. Reduce coupling, if our method does two or three different things at a time then we should consider splitting the functionality.
There is no standard pattern for method length among the developers. Someone can say 5, in some cases even 20 lines of code is okay. Just try to make methods as small as possible.
Avoid code duplication. If we cannot reuse a method elsewhere, then the method is probably bad and we should consider a better way to write this method. Use Abstraction to abstract common things in one place.
Encapsulate the code you feel might change in future.
Make variables and methods private by default and increase access step by step by changing them from a private to package-private or protected first and not public right away.
Classes, methods or functions should be open for extension and closed for modification (open closed design principle).
Program to the interface and not to the implementation: use interface types for variables, method return types, and method argument types, just like using a superclass type to store an object rather than the subclass type.
The purpose of an interface is to facilitate polymorphism; a client should not implement an interface method if it’s not needed.
Type inference of variables - var vs. actual type:
Variables with primitive types like int, long, or also String should be defined with the actual type by default.
Types which share similar functionality but require different handling should also be explicitly stated, e.g. Lists and Sets.
Variable types which are untypically long and would decrease readability when writing can be shortened with var (e.g. custom DTOs).
Default packages are not allowed. It can cause particular problems for Spring Boot applications that use the @ComponentScan, @EntityScan or @SpringBootApplication annotations since every class from every jar is read.
All variables in the class should be declared at the top of the class.
If a variable is used only in one method then it would be better to declare it as a local variable of this method.
Methods should be declared in the same order as they are used (from top to bottom).
More important methods should be declared at the top of a class and minor methods at the end.
Write performant queries that can also deal with more than 1000 objects in a reasonable time.
Prefer one query that fetches additional data instead of many small queries, but don’t overdo it. A good rule of thumb is to query not more than 3 associations at the same time.
Think about lazy vs. eager fetching when modeling the data types.
Use nested queries only if it is inevitable, and try to use as few tables as possible.
Simple datatypes: immediately think about whether null should be supported as additional state or not. In most cases it is preferable to avoid null.
Use Datetime instead of Timestamp. Datetime occupies more storage space than Timestamp; however, it covers a greater date range, which justifies its use in the long run.
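The lazy vs. eager modeling advice above could look like this in a JPA entity (a sketch only; the Exercise, Course, and StudentParticipation names and mappings are illustrative, not the actual Artemis domain model):

```java
@Entity
public class Exercise {

    // large collection that is rarely needed: fetch lazily and load it
    // explicitly (e.g. via JOIN FETCH in a query) only where it is actually used
    @OneToMany(mappedBy = "exercise", fetch = FetchType.LAZY)
    private Set<StudentParticipation> participations;

    // small association that is needed almost every time the exercise is loaded
    @ManyToOne(fetch = FetchType.EAGER)
    private Course course;
}
```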
Only write comments for complicated algorithms, to help other developers better understand them. We should only add a comment, if our code is not self-explanatory.
Utility methods can and should be placed in a class named for specific functionality, not “miscellaneous stuff related to project”. Most of the time, our static methods belong in a related class.
Spring Boot favors Java-based configuration.
Although it is possible to use Spring Boot with XML sources, it is generally not recommended.
You don’t have to put all your @Configuration into a single class.
The @Import annotation can be used to import additional configuration classes.
One of the flagship features of Spring Boot is its use of Auto-configuration. This is the part of Spring Boot that makes your code simply work.
It gets activated when a particular jar file is detected on the classpath. The simplest way to make use of it is to rely on the Spring Boot Starters.
11. Keep your @RestControllers clean and focused
RestControllers should be stateless.
RestControllers are by default singletons.
RestControllers should not execute business logic but rely on delegation.
RestControllers should deal with the HTTP layer of the application.
RestControllers should be oriented around a use-case/business-capability.
Route naming conventions:
Always use kebab-case (e.g. “…/exampleAssessment” → “…/example-assessment”).
The routes should follow the general structure list-entity > entityId > sub-entity … (e.g. “exercises/{exerciseId}/participations”).
Use plural for a route’s list-entities (e.g. “exercises/…”), use singular for a singleton (e.g. “…/assessment”), use verbs for naming remote methods on the server (e.g. “…/submit”).
Specify the key entity at the end of the route (e.g. “text-editor/participations/{participationId}” should be changed to “participations/{participationId}/text-editor”).
Use consistent routes that start with courses, exercises, participations, exams or lectures to simplify access control. Do not start routes with other entity names.
When defining a new route, all subroutes should be addressable as well, e.g. your new route is “exercises/{exerciseId}/statistics”, then both “exercises/{exerciseId}” and “exercises” should be addressable.
If you want an alternative representation of the entity that e.g. sends extra data needed for assessment, then specify the reason for this alternative route at the end of the route, for example “participations/{participationId}/for-assessment”.
Additional notes on the controller methods:
The REST Controller's route should end with a trailing “/” and not start with a “/” (e.g. “api/”); the individual endpoint routes should not start and not end with a “/” (e.g. “exercises/{exerciseId}”).
Use the …ElseThrow alternatives of all Repository and AuthorizationCheck calls whenever applicable; this increases readability (e.g. findByIdElseThrow(...) instead of findById(...) followed by a null check).
POST should return the newly created entity.
POST should be used to trigger remote methods (e.g. “…/{participationId}/submit” should be triggered with a POST).
Verify that API endpoints perform appropriate authorization and authentication consistent with the rest of the code base.
Always use @PreAuthorize to only allow certain roles to access the method.
Perform additional security checks using the AuthorizationCheckService.
Check for other common weaknesses, e.g., weak configuration, malicious user input, missing log events, etc.
Never trust user input and check if the passed data exists in the database.
Verify the consistency of user input by e.g. checking ids in body and path to see if they match, comparing course in the RequestBody with the one referenced by id in the path.
Check user input for consistency first, then check authorization. If, for example, the ids of the course in body and path don’t match, the user may be an INSTRUCTOR in one course but just a USER in another, which may lead to unauthorized access.
REST Controller should only handle authentication, error handling, input validation and output creation, the actual logic behind an endpoint should happen in the respective Service or Repository.
Handle exceptions and errors with a standard response. Errors are very important in REST APIs. They inform clients that something went wrong, after all.
Always use different response status codes to notify the client about errors on the server, e.g.:
Forbidden - the user is not authorized to access the controller.
Bad Request - the request was wrong.
Not Found - can’t find the requested data, or it should not be accessible yet.
Some of you may argue with this, but by favoring constructor injection you can keep your business logic free from Spring. Not only is the @Autowired annotation optional on constructors, you also get the benefit of being able to easily instantiate your bean without Spring.
Use setter-based DI only for optional dependencies.
Avoid circular dependencies; for such cases, try a combination of constructor- and setter-based DI.
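A minimal sketch of constructor injection with an optional setter-based dependency (ExampleService, ExampleRepository, and OptionalDependency are hypothetical names):

```java
@Service
public class ExampleService {

    private final ExampleRepository exampleRepository;

    private OptionalDependency optionalDependency;

    // constructor injection: @Autowired is optional on a single constructor,
    // and the class can be instantiated without Spring (e.g. in a plain unit test)
    public ExampleService(ExampleRepository exampleRepository) {
        this.exampleRepository = exampleRepository;
    }

    // setter-based injection only for a truly optional dependency
    @Autowired(required = false)
    public void setOptionalDependency(OptionalDependency optionalDependency) {
        this.optionalDependency = optionalDependency;
    }
}
```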
Don’t write code when you are tired or in a bad mood.
Optimization vs. readability: always write code that is simple to read and understandable for developers, because the time and resources spent on hard-to-read code cost much more than what we gain through optimization.
Commit messages should describe both what the commit changes and how it does it.
ARCHITECTURE FIRST: writing code without thinking about the system’s architecture is useless, in the same way as dreaming about your desires without a plan for achieving them.
Always use the least possible access level, prefer using private over public access modifier (package-private or protected can be used as well).
Previously, we used transactions rather arbitrarily; now we want to avoid @Transactional. Transactions can kill performance, introduce locking issues and database concurrency problems, and add complexity to our application. Good read: https://codete.com/blog/5-common-spring-transactional-pitfalls/
Define a constant if the same value is used more than once. Constants make later changes much easier: instead of searching for every place where the value is used, you only need to change it in one place.
Facilitate code reuse. Always move duplicated code to reusable methods. IntelliJ is very good at suggesting duplicated lines and even automatically extracting them. Also don’t be shy to use Generics.
Always qualify a static class member reference with its class name and not with a reference or expression of that class’s type.
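A short sketch of this rule (the class and constant are made up for illustration):

```java
public class StaticAccessExample {

    static final int MAX_RETRIES = 3;

    static int maxRetries() {
        // qualify the static member with its class name ...
        return StaticAccessExample.MAX_RETRIES;
        // ... not with an instance: 'new StaticAccessExample().MAX_RETRIES'
        // compiles, but obscures that the member is static
    }

    public static void main(String[] args) {
        System.out.println(maxRetries()); // prints 3
    }
}
```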
Prefer using primitive types to classes, e.g. long instead of Long.
Use ./gradlew spotlessCheck and ./gradlew spotlessApply to check the Java code style and to fix it automatically.
Don’t use .collect(Collectors.toList()). Instead use only .toList() for an unmodifiable list or .collect(Collectors.toCollection(ArrayList::new)) to explicitly create a new ArrayList.
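Both preferred variants side by side (a sketch; .toList() requires Java 16+):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class CollectExample {

    // .toList() returns an unmodifiable list
    static List<String> unmodifiableNames() {
        return Stream.of("ada", "bob").map(String::toUpperCase).toList();
    }

    // explicitly collect into a new ArrayList when the caller needs to modify the list
    static List<String> modifiableNames() {
        return Stream.of("ada", "bob").map(String::toUpperCase)
                .collect(Collectors.toCollection(ArrayList::new));
    }

    public static void main(String[] args) {
        System.out.println(unmodifiableNames()); // prints [ADA, BOB]
        modifiableNames().add("CAROL");          // fine: the result is a plain ArrayList
    }
}
```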
Query parameters for SQL must be annotated with @Param("variable")!
Do not write
@Query(""" SELECT r FROM Result r LEFT JOIN FETCH r.feedbacks WHERE r.id = :resultId """)Optional<Result>findByIdWithEagerFeedbacks(LongresultId);
but instead annotate the parameter with @Param:
@Query(""" SELECT r FROM Result r LEFT JOIN FETCH r.feedbacks WHERE r.id = :resultId """)Optional<Result>findByIdWithEagerFeedbacks(@Param("resultId")LongresultId);
The string name inside must match the name of the variable exactly!
We prefer to write SQL statements all in upper case. Split queries onto multiple lines using the Java text block notation (triple quotation marks):
    @Query("""
            SELECT r
            FROM Result r
                LEFT JOIN FETCH r.feedbacks
            WHERE r.id = :resultId
            """)
    Optional<Result> findByIdWithEagerFeedbacks(@Param("resultId") Long resultId);
SQL statements which do not contain sub-queries are preferable as they are more readable and have a better performance.
So instead of:
@Query(""" SELECT COUNT (DISTINCT p) FROM StudentParticipation p WHERE p.exercise.id = :#{#exerciseId} AND EXISTS (SELECT s FROM Submission s WHERE s.participation.id = p.id AND s.submitted = TRUE """)longcountByExerciseIdSubmitted(@Param("exerciseId")longexerciseId);
you should use:
@Query(""" SELECT COUNT (DISTINCT p) FROM StudentParticipation p JOIN p.submissions s WHERE p.exercise.id = :#{#exerciseId} AND s.submitted = TRUE """)longcountByExerciseIdSubmitted(@Param("exerciseId")longexerciseId);
Functionally both queries extract the same result set, but the first one is less efficient as the sub-query is calculated for each StudentParticipation.
21. REST endpoint best practices for authorization
To reject unauthorized requests as early as possible, Artemis employs a two-step system:
PreAuthorize and Enforce annotations are responsible for blocking users with wrong or missing authorization roles without querying the database.
The AuthorizationCheckService is responsible for checking access rights to individual resources by querying the database.
Because the first method without database queries is substantially faster, always annotate your REST endpoints with the corresponding annotation. Always use the annotation for the minimum role that has access.
The following example makes the call only accessible to ADMIN and INSTRUCTOR users:
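A minimal sketch of such an endpoint (the route, method, and DTO names are hypothetical; because the roles are hierarchical, annotating the minimum role INSTRUCTOR also admits ADMIN users):

```java
// minimum role INSTRUCTOR: ADMIN users pass as well because roles are hierarchical
@PreAuthorize("hasRole('INSTRUCTOR')")
@GetMapping("courses/{courseId}/statistics")
public ResponseEntity<CourseStatisticsDTO> getCourseStatistics(@PathVariable long courseId) {
    // ...
}
```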
Artemis distinguishes between six different roles: ADMIN, INSTRUCTOR, EDITOR, TA (teaching assistant), USER and ANONYMOUS.
Each role has all the access rights of the roles following it; e.g. ANONYMOUS has almost no rights, while ADMIN users can access every page.
The table contains all annotations for the corresponding minimum role. Different annotations get used during migration.
    Minimum Role    Endpoint Annotation
    ------------    -------------------
    ADMIN           @EnforceAdmin
    INSTRUCTOR      @PreAuthorize("hasRole('INSTRUCTOR')")
    EDITOR          @PreAuthorize("hasRole('EDITOR')")
    TA              @PreAuthorize("hasRole('TA')")
    USER            @PreAuthorize("hasRole('USER')")
    ANONYMOUS       @PreAuthorize("permitAll()")
If a user passes the pre-authorization, the access to individual resources like courses and exercises still has to be checked. For example, a user can be a teaching assistant in one course, but only a student in another.
However, do not fetch the user from the database yourself (unless you need to re-use the user object), but only hand a role to the AuthorizationCheckService:
    // If we pass 'null' instead of a user here, the service will fetch the user object
    // and check if the user has at least the given role and access to the resource
    authCheckService.checkHasAtLeastRoleForExerciseElseThrow(Role.INSTRUCTOR, exercise, null);
To reduce duplication, do not add explicit checks for authorization or existence of an entity but always use the AuthorizationCheckService:
The course repository call takes care of throwing a 404 Not Found exception if no matching course exists. The AuthorizationCheckService throws a 403 Forbidden exception if the user with the given role is unauthorized. Afterwards, delegate to a service or repository method. The code becomes much shorter, cleaner, and more maintainable.
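Put together, such a controller method might look like the following sketch (the route, the service, and the exact AuthorizationCheckService method name are assumptions for illustration):

```java
@PreAuthorize("hasRole('TA')")
@GetMapping("courses/{courseId}/example-data")
public ResponseEntity<ExampleDataDTO> getExampleData(@PathVariable long courseId) {
    // throws a 404 Not Found if no course with the given id exists
    Course course = courseRepository.findByIdElseThrow(courseId);
    // throws a 403 Forbidden if the user is not at least a TA in this course
    authCheckService.checkHasAtLeastRoleInCourseElseThrow(Role.TEACHING_ASSISTANT, course, null);
    // delegate the actual business logic to a service
    return ResponseEntity.ok(exampleDataService.getExampleData(course));
}
```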
22. Assert using the most specific overload method
When expecting results, use assertThat for server tests. That call must be followed by another assertion statement, such as isTrue(). It is best practice to use the most specific assertion statement available rather than always expecting boolean values.
This gives better error messages when an assertion fails and improves code readability. However, be aware that not all methods can be used for assertions like this.
If you can’t avoid using isTrue use the as keyword to add a custom error message:
assertThat(submission.isSubmittedInTime()).as("submission was not in time").isTrue();
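Where a specific overload exists, prefer it over a boolean check. The AssertJ examples below are sketches with hypothetical submissions, text, and result variables:

```java
// instead of: assertThat(submissions.size() == 3).isTrue();
assertThat(submissions).hasSize(3);

// instead of: assertThat(text.contains("failed")).isTrue();
assertThat(text).contains("failed");

// instead of: assertThat(result.getScore() == 100.0).isTrue();
assertThat(result.getScore()).isEqualTo(100.0);
```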
Write meaningful comments for your tests.
These comments should contain information about what is tested specifically.
    /**
     * Tests that borrow() in Book successfully sets the available attribute to false
     */
    @Test
    void testBorrowInBook() {
        // Test Code
    }
Use appropriate and descriptive names for test cases. This makes it easier for other developers to understand what you actually test without looking deeper into it.
This is the same reason why you should not name your variables int a, double b, String c, and so on. For example, if you want to test the method borrow in the class Book, testBorrowInBook() would be an appropriate name for the test case.
    @Test
    void testBorrowInBook() {
        // Test Code
    }
Try to follow the best practices for Java testing:
Write small and specific tests by heavily using helper functions, parameterized tests, AssertJ’s powerful assertions, not overusing variables, asserting only what’s relevant and avoiding one test for all corner cases.
Write self-contained tests by revealing all relevant parameters, insert data right in the test and prefer composition over inheritance.
Write dumb tests by avoiding the reuse of production code and focusing on comparing output values with hard-coded values.
KISS > DRY (“Keep it simple, Stupid!” and “Don’t repeat yourself!”)
Invest in a testable implementation by avoiding static access, using constructor injection, using Clocks and separating business logic from asynchronous execution.
It’s possible to write tests that check how many database calls are performed during a REST call. This is useful to ensure that code changes don’t lead to more database calls,
or at least to remind developers in case they do. It’s especially important for commonly used endpoints that users access multiple times or every time they use Artemis.
However, we should consider carefully before adding such assertions to a test as it makes the test more tedious to maintain.
An example of how to track how many database calls are performed during a REST call is shown below. It uses the HibernateQueryInterceptor, which counts the number of queries.
For ease of use, a custom assert assertThatDb was added that allows you to do the check in one line. It also returns the original result of the REST call, which allows you to add any other assertions to the test, as shown below.
Do not use the @SpyBean or @MockBean annotation unless absolutely necessary, or possibly in an abstract superclass. If you want to see why in more detail, take a look here.
Basically, every time @MockBean appears in a class, the ApplicationContext cache gets marked as dirty, hence the runner will clean the cache after the test-class is done and restarts the application context.
This leads to a large overhead, which tends to make the tests take a lot more time.
Here is an example of how to replace a @SpyBean. We wanted to test an edge case that is only executed if an IOException is thrown. We did this by mocking the service method and making it throw an exception.
As mentioned above, we should really avoid this.
Instead, we can use static mocks. When we look deeper into the export() method, we find that there is a call to Files.newOutputStream(..).
Now, instead of mocking the whole service, we can just mock the static method, like this:
    class TestExport extends AbstractSpringIntegrationBambooBitbucketJiraTest {
        // No beans used anymore

        @Test
        @WithMockUser(username = "instructor1", roles = "INSTRUCTOR")
        void testExportAll_IOException() throws Exception {
            MockedStatic<Files> mockedFiles = mockStatic(Files.class);
            mockedFiles.when(() -> Files.newOutputStream(any(), any())).thenThrow(IOException.class);
            request.postWithResponseBodyFile("/api/file-upload-export/" + fileUploadExercise.getId(), HttpStatus.BAD_REQUEST);
            mockedFiles.close();
        }
    }
You should notice here that we can avoid the use of a bean and also test deeper: instead of mocking the uppermost method, we only throw the exception at the place where it could actually happen. It is very important to close the static mock again at the end of the test.
For a real example where a SpyBean was replaced with a static mock look at the SubmissionExportIntegrationTest.java in here.
Be aware that Buttons navigate only in the same tab while Links provide the option to use the context menu or a middle-click to open the page in a new tab. Therefore:
Buttons are best used to trigger certain functionalities (e.g. <button (click)='deleteExercise(exercise)'>...</button>)
Links are best for navigating on Artemis (e.g. <a [routerLink]='getLinkForExerciseEditor(exercise)' [queryParams]='getQueryParamsForEditor(exercise)'>...</a>)
If you use icons next to text (for example for a button or link), make sure that they are separated by a newline. HTML renders one or multiple newlines as a space.
Use labels to caption inputs like text fields and checkboxes.
Associated labels help screen readers to read out the text of the label when the input is focused.
Additionally they allow the label to act as an input itself (e.g. the label also activates the checkbox).
Make sure to associate them by putting the input inside the label component or by adding the for attribute in the label referencing the id of the input.
Use arrow functions over anonymous function expressions.
Always surround arrow function parameters with parentheses.
For example, x => x + x is wrong but the following are correct:
    (x) => x + x
    (x, y) => x + y
    <T>(x: T, y: T) => x === y
Always surround loop and conditional bodies with curly braces. Statements on the same line are allowed to omit braces.
Open curly braces always go on the same line as whatever necessitates them.
Parenthesized constructs should have no surrounding whitespace.
A single space follows commas, colons, and semicolons in those constructs. For example:
    for (var i = 0, n = str.length; i < 10; i++) { }
    if (x < 10) { }
    function f(x: number, y: string): void { }
Use a single declaration per variable statement (i.e. use var x = 1; var y = 2; over var x = 1, y = 2;).
else goes on the same line as the closing curly brace.
Use 4 spaces per indentation.
We use prettier to style code automatically and eslint to find additional issues.
You can find the corresponding commands to invoke those tools in package.json.
Modern garbage collectors improve on this algorithm in different ways, but the essence is the same: reachable pieces of memory are marked as such and the rest is considered garbage.
Unwanted references are references to pieces of memory that the developer knows he or she won’t be needing
anymore but that for some reason are kept inside the tree of an active root. In the context of JavaScript, unwanted references are variables kept somewhere in the code that will not be used anymore and point to a piece of memory that could otherwise be freed.
Number 2: Using the experimental leak detection feature from jest
    --detectLeaks      **EXPERIMENTAL**: Detect memory leaks in tests.
                       After executing a test, it will try to garbage collect the
                       global object used, and fail if it was leaked
                                                       [boolean] [default: false]
    --runInBand, -i    Run all tests serially in the current process (rather than
                       creating a worker pool of child processes that run tests).
                       This is sometimes useful for debugging, but such use cases
                       are pretty rare.
The ideal schema for routes is that every variable in a path is preceded by a unique path segment: /entityA/:entityIdA/entityB/:entityIdB
For example, /courses/:courseId/:exerciseId is not a good path and should be written as /courses/:courseId/exercises/:exerciseId.
Doubling textual segments like /lectures/statistics/:lectureId should be avoided and instead formulated as /lectures/:lectureId/statistics.
When creating a completely new route you will have to register the new paths in navbar.ts. A static/textual url segment gets a translation string assigned in the mapping table. Due to our code-style guidelines any - in the segment has to be replaced by a _. If your path includes a variable, you will have to add the preceding path segment to the switch statement inside the addBreadcrumbForNumberSegment method.
    const mapping = {
        courses: 'artemisApp.course.home.title',
        lectures: 'artemisApp.lecture.home.title',
        // put your new directly translated url segments here
        // the index is the path segment in which '-' have to be replaced by '_'
        // the value is the translation string
        your_case: 'artemisApp.cases.title',
    };

    addBreadcrumbForNumberSegment(currentPath: string, segment: string): void {
        switch (this.lastRouteUrlSegment) {
            case 'course-management':
                // handles :courseId
                break;
            case 'lectures':
                // handles :lectureId
                break;
            case 'your-case':
                // add a case here for your :variable which is preceded in the path by 'your-case'
                break;
        }
    }
In order to configure the content of the tooltips in the chart, declare a ng-template with the reference #tooltipTemplate
containing the desired content within the selector. The framework dynamically recognizes this template. In the example above,
the tooltips are configured in order to present the percentage value corresponding to the absolute value represented by the bar.
Depending on the chart type, there is more than one type of tooltip configurable.
For more information visit https://swimlane.gitbook.io/ngx-charts/
In order to manipulate the content of the data label (e.g. the text floating above a chart bar), the framework provides a [dataLabelFormatting] property in the
HTML template that can be assigned to a method. For example:
The method is passed to the framework itself and executed there. This means that at runtime it does not have access to global variables of the component it is originally implemented in.
If this access is necessary, create a (readonly) variable assigned to this method and bind it to the component: readonly bindFormatting = this.formatDataLabel.bind(this);
Some design properties are not directly configurable via the framework (e.g. the font-size and weight of the data labels).
::ng-deep is useful in these situations, as it allows changing some of these properties by overwriting them in
a corresponding style sheet. Adapting the font-size and weight of data labels would look like this:
Warning
::ng-deep breaks the view encapsulation of the rule. This can lead to undesired and flaky side effects on other pages of Artemis.
For more information, refer to the Angular documentation.
Therefore, only use ::ng-deep if it is absolutely necessary. To limit the potential for side effects, add a :host in front of the selector.
In order to make the chart responsive in width, bind it to the width of its parent container.
First, annotate the parent container with a reference (in the example #containerRef).
Then, when configuring the dimensions of the chart in [view], insert containerRef.offsetWidth instead
of a specific value for the width.
There are two ways to keep axis labels and axis ticks translation-sensitive if they contain natural language:
Axis labels are passed directly as property in the HTML template. Simply insert the translation string together with the translate pipe:
For some chart types, the framework derives the ticks of one axis from the name property of the passed data objects.
So, these names have to be translated every time the user switches the language settings.
In this case, inject the TranslateService to the underlying component and subscribe to the onLangChange event emitter:
    constructor(private translateService: TranslateService) {
        this.translateService.onLangChange.subscribe(() => {
            this.updateXAxisLabel(); // a method re-assigning the names of the objects to the translated string
        });
    }
Ensure that the layout of your page or component shrinks accordingly and adapts to all display sizes (responsive design).
Prefer using the .container class (https://getbootstrap.com/docs/5.2/layout/containers/) when you want to limit the page width on extra-large screens.
Do not use the following for this purpose if it can be avoided:
<divclass="row justify-content-center"><divclass="col-12 col-lg-8"><!-- Do not do this --></div></div>
There are different tools available to support client testing. We try to limit ourselves to Jest as much as possible. We use NgMocks for mocking the dependencies of an Angular component.
A component should be tested in isolation without any dependencies if possible. Do not simply import the whole production module. Only import real dependencies if it is essential for the test
that the real dependency is used. Instead, use mock pipes, mock directives and mock components that the component under test depends upon. A very useful technique is writing stubs for child components. This has the benefit of being able to test the interaction with the child components.
More examples on test speed improvement can be found in the following PR.
Services should be mocked if they simply return some data from the server. However, if the service has some form of logic included (for example, converting dates to dayjs instances),
and this logic is important for the component, do not mock the service methods; instead, mock the HTTP requests and responses from the API. This allows us to test the interaction
of the component with the service and, in addition, test that the service logic works correctly. A good explanation can be found in the official Angular documentation.
    import { HttpClientTestingModule, HttpTestingController } from '@angular/common/http/testing';

    describe('SomeComponent', () => {
        beforeEach(() => {
            TestBed.configureTestingModule({
                imports: [HttpClientTestingModule],
            });
            ...
            httpMock = injector.get(HttpTestingController);
        });

        afterEach(() => {
            ...
            httpMock.verify();
            jest.restoreAllMocks();
        });

        it('should make get request', fakeAsync(() => {
            const returnedFromApi = { some: 'data' };
            component.callServiceMethod().subscribe((data) => expect(data.body).toEqual(returnedFromApi));
            const req = httpMock.expectOne({ method: 'GET', url: 'urlThatMethodCalls' });
            req.flush(returnedFromApi);
            tick();
        }));
    });
Do not use NO_ERRORS_SCHEMA (Angular documentation). This tells Angular to ignore attributes and unrecognized elements; prefer using component stubs as mentioned above.
Calling jest.restoreAllMocks() ensures that all mocks created with Jest get reset after each test. This is important if they get defined across multiple tests. This will only work if the mocks were created with jest.spyOn. Manually assigning jest.fn() should be avoided with this configuration.
It is preferable to test a component through the interaction of the user with the template. This decouples the test from the concrete implementation used in the component.
For example, if you have a component that loads and displays some data when the user clicks a button, you should query for that button, simulate a click, and then assert that the data has been loaded and that the expected template changes have occurred.
Do not remove the template during tests by making use of overrideTemplate(). The template is a crucial part of a component and should not be removed during test. Do not do this:
    describe('SomeComponent', () => {
        let someComponentFixture: ComponentFixture<SomeComponent>;
        let someComponent: SomeComponent;

        beforeEach(() => {
            TestBed.configureTestingModule({
                imports: [],
                declarations: [SomeComponent],
                providers: [],
            })
                .overrideTemplate(SomeComponent, '') // DO NOT DO THIS
                .compileComponents()
                .then(() => {
                    someComponentFixture = TestBed.createComponent(SomeComponent);
                    someComponent = someComponentFixture.componentInstance;
                });
        });
    });
Name the variables properly for test doubles:
    const clearSpy = jest.spyOn(someComponent, 'clear');
    const getNumberStub = jest.spyOn(someComponent, 'getNumber').mockReturnValue(42); // This always returns 42
Spy: Doesn’t replace any functionality but records calls
Mock: Spy + returns a specific implementation for a certain input
Stub: Spy + returns a default implementation independent of the input parameters.
Try to make expectations as specific as possible. If you expect a specific result, compare to this result and do not compare to the absence of some arbitrary other value. This ensures that no faulty values you didn’t expect can sneak into the codebase without the tests failing. For example, toBe(5) is better than not.toBeUndefined(), which would also pass if the value wrongly changes to 6.
When expecting results, use expect for client tests. That call must be followed by another assertion statement, such as toBeTrue(). It is best practice to use the most specific expect statement available rather than always expecting boolean values. It is also recommended to extract as much as possible from the expect statement.
Once you have minimized the expect statement, use the verification function that provides the most meaningful error message in case the verification fails. You can use verification functions from core Jest or from Jest Extended.
For situations described below, only use the uniform solution to keep the codebase as consistent as possible.
Artemis ships with two themes: Light Mode and Dark Mode. It is important to ensure that all UI components look
consistent and good in both themes. Therefore, please follow the rules and hints below.
In general, keep in mind: All UI changes need to be verified in both themes!
We use different colors for our UI elements in both themes. These colors are passed into Bootstrap, so if you use
default components such as buttons, cards, forms, etc., they will be automagically themed correctly.
For your custom components and custom stylesheets, please follow this strict global rule:
Do not hard code any color values in your component stylesheets, templates or TypeScript files!
Most likely, any colors hard-coded in your component stylesheets will look bad in either the light mode or dark mode.
So, you either need to specify different colors for both themes, or you could just use default colors, which is preferred.
Warning
Pull Requests with hard-coded colors in component files will not be merged.
We want to avoid further color fragmentation in the future.
You need a good justification to not use default or already provided colors, be it derived or completely custom ones.
Please check your available options in this order:
Use global default colors and Bootstrap classes (preferred)
For most use-cases, using one of the pre-provided colors is the way to go. Really think deeply whether you need
a custom color.
Check out the top of _default_variables.scss to see the available default colors. While you should not
use black, gray-XXX, or white in your components, as that would equal a hard-coded, not theme-aware color, you should re-use signal colors, base colors, and the pre-provided ‘colorful’ colors.
All variables in this file are globally available in the application as native CSS variables.
For example, $danger can be accessed in all SCSS files using var(--danger).
bs-body-color
The default font color; will be black in light mode and white in dark mode.
CSS: var(--bs-body-color)
bs-body-bg
The body background color; will be something bright in light mode and something darker in dark mode.
It can be used for smaller boxes in main content, as it’s usually pretty distinguishable from the
background color used in Artemis’ primary content area.
CSS: var(--bs-body-bg)
artemis-dark
A dark color that is typical for some of Artemis’ UI elements, for example the navbar background.
CSS: var(--artemis-dark)
primary
A blue-ish default color to indicate a primary action. Also used as link color. Use this to indicate
the primary next step for the user (or one of them).
CSS: var(--primary)
For text: <span class="text-primary">
For buttons: <button class="btn btn-primary">
secondary
A gray color. Use this for secondary action buttons and hint texts.
CSS: var(--secondary)
For text: <span class="text-secondary">
For buttons: <button class="btn btn-secondary">
success
A green color indicating a successful operation, state, or safe action.
CSS: var(--success)
For text: <span class="text-success">
For buttons: <button class="btn btn-success">
danger
A red color indicating a failed operation, error-state, or dangerous action.
CSS: var(--danger)
For text: <span class="text-danger">
For buttons: <button class="btn btn-danger">
warning
An orange color indicating a partly failed operation, a warning, or an unsafe, yet not ultra-dangerous action.
CSS: var(--warning)
For text: <span class="text-warning">
For buttons: <button class="btn btn-warning">
info
A teal-ish color indicating an informational element.
CSS: var(--info)
For text: <span class="text-info">
For buttons: <button class="btn btn-info">
There are more theme-aware colors to choose from; please see the variables file.
If you need to design entire boxes using one of the signal colors, you should use alert boxes.
Either add one of the Bootstrap alert classes to your box, such as alert alert-danger, or use our globally
defined colors:
with XXX being one of: info, danger, warning, success, or neutral.
If you need to separate something from the background, try to use the bg-light class which should work in both themes.
Define your own colors for each theme
If the options above don’t suit your use case, you can define your own color variables.
These colors must be theme-aware, so you have to select a good color for both themes and add them to each
theme’s stylesheet.
Tip
Artemis uses white in light mode and $neutral-dark in dark mode
as background for the main content area, cards, etc.
For $neutral-dark, a few lightened default options exist: $neutral-dark-l-5 to $neutral-dark-l-20, in steps of 5.
Therefore, if you need to separate something from the background, choose one of gray-XXX for light mode and a lightened option of $neutral-dark in dark mode.
Keep this in mind while you select the colors to use.
Let’s go through it step by step. Say you want to give a box a special background color.
Define a class for it in your component’s SCSS file, and use a new, unique variable name as value:
It might happen that you need to modify a global style rule in one of the themes, for example if you’re using an external library whose styles need to be overridden.
Each theme has its own file to which custom global styles can be added: theme-dark.scss and theme-default.scss.
For styles that should be applied in both themes, use global.scss.
There will be occasions where you need to know in your components which theme is currently applied.
The ThemeService will provide this information where needed.
For example, you could add a reactive flag to your component that indicates whether or not the current theme is dark:
From there, you can do whatever you need to do to change the behavior of your component based on the theme.
Alternatively, you can execute any actions directly in the subscribe() block.
The service fires an event containing the current theme immediately when you subscribe, and one more
event for each theme change from then on until you unsubscribe.
You can get the current theme using themeService.getCurrentTheme() at any time.
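These subscription semantics can be sketched in plain Java. Note that this is not the actual Angular/RxJS ThemeService; all class and method names below are illustrative stand-ins that mimic the described behavior (emit the current value on subscribe, then one event per change until unsubscribed):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

// Illustrative sketch only; NOT the real ThemeService (which is Angular/RxJS based).
public class ThemeBusSketch {
    private String currentTheme = "LIGHT";
    private final List<Consumer<String>> subscribers = new ArrayList<>();

    // Subscribers immediately receive the current theme, then one event per change.
    public Runnable subscribe(Consumer<String> subscriber) {
        subscribers.add(subscriber);
        subscriber.accept(currentTheme);
        return () -> subscribers.remove(subscriber); // handle to unsubscribe
    }

    public void applyTheme(String theme) {
        currentTheme = theme;
        subscribers.forEach(s -> s.accept(theme));
    }

    public static List<String> demo() {
        ThemeBusSketch bus = new ThemeBusSketch();
        List<String> seen = new ArrayList<>();
        Runnable unsubscribe = bus.subscribe(seen::add); // receives "LIGHT" immediately
        bus.applyTheme("DARK");                          // receives "DARK"
        unsubscribe.run();
        bus.applyTheme("LIGHT");                         // no longer observed
        return seen;
    }

    public static void main(String[] args) {
        System.out.println(demo()); // [LIGHT, DARK]
    }
}
```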
Additionally, it’s possible to change the theme programmatically. However, this should be rare: Usually, the user decides which theme
they want to use by using the theme switching component in the navbar. Any use of this must therefore be justified and
survive a detailed review.
The cost of retrieving and building an object’s relationships far exceeds the cost of selecting the object. This is especially true for relationships where it would trigger the loading of every child through the relationship hierarchy. The solution to this issue is lazy fetching (lazy loading). Lazy fetching allows the fetching of a relationship to be deferred until it is accessed. This is important not only to avoid the database access, but also to avoid the cost of building the objects if they are not needed.
In JPA, lazy fetching can be set on any relationship using the fetch attribute. The fetch type can be set to either LAZY or EAGER as defined in the FetchType enum. The default fetch type is LAZY for all relationships except OneToOne and ManyToOne, but in general it is a good idea to make every relationship LAZY. The EAGER default for OneToOne and ManyToOne exists for implementation reasons (it is easier to implement), not because it is a good idea.
We always use FetchType.LAZY, unless there is a very strong case to be made for FetchType.EAGER.
Note
Additional effort to use FetchType.LAZY does not count as a strong argument.
A relationship is a reference from one object to another. In a relational database relationships are defined through foreign keys. The source row contains the primary key of the target row to define the relationship (and sometimes the inverse). A query must be performed to read the target objects of the relationship using the foreign key and primary key information. If there is a relationship to a collection of other objects, a Collection or array type is used to hold the contents of the relationship. In a relational database, collection relations are either defined by the target objects having a foreign key back to the source object’s primary key, or by having an intermediate join table to store the relationship (containing both objects’ primary keys).
In this section, we depict common entity relationships we use in Artemis and show some code snippets.
OneToOne: A unique reference from one object to another. It is also the inverse of itself. Example: one Complaint has a reference to one Result.
OneToMany: A Collection or Map of objects. It is the inverse of a ManyToOne relationship. Example: one Result has a list of Feedback elements. For ordered OneToMany relations, see ordered collections.
ManyToMany: A Collection or Map of objects. It is the inverse of itself. Example: one Exercise has a list of LearningGoal elements, and one LearningGoal has a list of Exercise elements. In other words: many exercises are connected to many learning goals and vice versa.
For OneToMany, ManyToOne, and ManyToMany relationships, you must not forget to mark the associated elements with @JsonIgnoreProperties(). Without this, the object serialization process gets stuck in an endless loop and throws an error. For more information, check out the examples listed above and see: Jackson and JsonIgnoreType.
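The endless loop can be illustrated with a plain-Java sketch. The classes below are stand-ins, not the real Artemis entities, and Jackson is not involved; a hand-written serialize() simply follows the cycle the same way a naive serializer would:

```java
// Illustrative sketch of why bidirectional references need @JsonIgnoreProperties:
// serializing a cycle naively recurses forever. Class names are stand-ins only.
public class SerializationCycleSketch {
    static class Result { Feedback feedback; String serialize() { return "Result(" + feedback.serialize() + ")"; } }
    static class Feedback { Result result; String serialize() { return "Feedback(" + result.serialize() + ")"; } }

    public static boolean serializationOverflows() {
        Result result = new Result();
        Feedback feedback = new Feedback();
        result.feedback = feedback;
        feedback.result = result; // the cycle: Result -> Feedback -> Result -> ...
        try {
            result.serialize();
            return false;
        } catch (StackOverflowError e) {
            return true; // the recursion never terminates on its own
        }
    }

    public static void main(String[] args) {
        System.out.println(serializationOverflows()); // true
    }
}
```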
Lazy relationships
Lazy relationships in Artemis may require some additional special handling to work correctly:
Lazy OneToOne relationships require the additional presence of the @JoinColumn annotation and only work in one direction.
They can only lazily load the child of the relationship, not the parent. The parent is the entity whose database table owns the foreign key.
E.g., You can lazily load ProgrammingExercise::solutionParticipation but not SolutionProgrammingExerciseParticipation::programmingExercise, as the foreign key is part of the exercise table.
Lazy ManyToOne relationships require the additional presence of the @JoinColumn annotation.
Lazy OneToMany and ManyToMany relationships work without further changes.
Entity relationships often depend on the existence of another entity — for example, the Result-Feedback relationship. Without the Result, the Feedback entity doesn’t have any meaning of its own. When we delete the Result entity, our Feedback entity should also get deleted. For more information see: jpa cascade types.
CascadeType.ALL Propagates all operations mentioned below from the parent object to the child object.
CascadeType.PERSIST When persisting a parent entity, it also persists the child entities held in its fields. This cascade rule is helpful for relationships where the parent acts as a container to the child entity. If you do not use this, you have to ensure that you persist the child entity first, otherwise an error will be thrown. Example: The code below propagates the persist operation from parent AnswerCounter to child AnswerOption. When an AnswerCounter is persisted, its AnswerOption is persisted as well.
CascadeType.MERGE If you merge the source entity into the database (save/update/synchronize it), the merge is cascaded to the target of the association. This rule applies to existing objects only. Use this type to always merge/synchronize the existing data in the table with the data in the object. Example below: whenever we merge a Result into the database, i.e. save the changes on the object, the Assessor object is also merged/saved.
CascadeType.REMOVE If the source entity is removed, the target of the association is also removed. Example below: propagates remove operation from parent Submission to child Result. When a Submission is deleted, the corresponding Result is also deleted.
CascadeType.REFRESH If the source entity is refreshed, it cascades the refresh to the target of the association. This is used to refresh the data in the object and its associations. This is useful for cases where there is a change which needs to be synchronized FROM the database.
If you want to create a @OneToMany or @ManyToMany relationship, first think about whether it is important for the association to be ordered. If you do not need the association to be ordered, always go for a Set instead of a List. If you are unsure, start with a Set.
Unordered Collection: A Set comes with certain advantages such as ensuring that there are no duplicates and null values in your collection. There are also performance arguments to use a Set, especially for @ManyToMany relationships. For more information see this stackoverflow thread. E.g.:
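The duplicate-free guarantee itself can be illustrated in plain Java, with no JPA involved; the strings below stand in for child entities:

```java
import java.util.ArrayList;
import java.util.LinkedHashSet;
import java.util.List;
import java.util.Set;

// Plain-Java illustration (no JPA) of why a Set is the safer default:
// duplicates are silently ignored, while a List keeps them.
public class SetVsListSketch {
    public static int[] countAfterDuplicateAdd() {
        Set<String> feedbackSet = new LinkedHashSet<>();
        feedbackSet.add("feedback-1");
        feedbackSet.add("feedback-1"); // duplicate: ignored by the Set

        List<String> feedbackList = new ArrayList<>();
        feedbackList.add("feedback-1");
        feedbackList.add("feedback-1"); // duplicate: kept by the List

        return new int[] { feedbackSet.size(), feedbackList.size() };
    }

    public static void main(String[] args) {
        int[] sizes = countAfterDuplicateAdd();
        System.out.println(sizes[0] + " " + sizes[1]); // 1 2
    }
}
```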
Ordered Collection: When you want to order the collection of objects of the relationship, always use a List. It is important to note that there is no inherent order in a database table. One could argue that you can use the id field for ordering, but there are edge cases where this can lead to problems. Therefore, for an ordered collection, always annotate it with @OrderColumn. An order column indicates to Hibernate that we want to order our collection based on a specific column of our data table. By default, the column name it expects is tablenameS_order. For ordered collections, we also recommend that you annotate them with CascadeType.ALL and orphanRemoval = true. E.g.:
Hibernate will take care of the ordering for you, but you must create the order column in the database. This column is not created automatically!
With ordered collections, you have to be very careful with the way you persist the objects in the database. You must first persist the child object without a relation to the parent object. Then, you recreate the association and persist the parent object. Example of how to correctly persist objects in an ordered collection:
// ProgrammingAssessmentService
List<Feedback> savedFeedbacks = new ArrayList<>();
result.getFeedbacks().forEach(feedback -> {
    // cut association to parent object
    feedback.setResult(null);
    // persist the child object without an association to the parent object.
    // IMPORTANT: Use the object returned from the database!
    feedback = feedbackRepository.save(feedback);
    // restore the association to the parent object
    feedback.setResult(result);
    savedFeedbacks.add(feedback);
});
// set the association of the parent to its child objects which are now persisted in the database
result.setFeedbacks(savedFeedbacks);
// persist the parent object
return resultRepository.save(result);
org.hibernate.LazyInitializationException: could not initialize proxy – no Session caused by FetchType.LAZY. You must explicitly load the associated objects from the database before trying to access them. Example of how to eagerly fetch the feedbacks with the result:
// ResultRepository.java
@Query("select r from Result r left join fetch r.feedbacks where r.id = :resultId")
Optional<Result> findByIdWithEagerFeedbacks(@Param("resultId") Long id);
JpaSystemException: null index column for collection caused by the @OrderColumn annotation:
There is a problem with the way you save the associated objects. You must follow this procedure:
Save the child entity (e.g., Feedback) without a connection to the parent entity (e.g., Result).
Add back the connection of the child entity to the parent entity.
Save the parent entity.
Always use the returned value after saving the entity, see: feedback = feedbackRepository.save(feedback);
In order to use Criteria Builder and benefit from Specifications, we need to adjust the Repository.
Metamodel: The metamodel is used to refer to the columns of a table, in an object-oriented way. For this, each entity needs to have a corresponding metamodel class. (Artemis already fulfills this requirement)
JpaSpecificationExecutor: To execute Specifications and generate SQL statements, we need to extend the JpaSpecificationExecutor interface in our Spring Data JPA Repository.
Defining the initial Specification: To generate a query with multiple Specifications, we can use the and() method for concatenation. However, the first Specification must always be passed to the where() method.
Defining Specifications: A specification is a functional interface with a single method. This method has three parameters - a root, a query and a criteria builder. You don’t need to specify these arguments manually because they are provided during chaining.
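The chaining idea can be sketched with java.util.function.Predicate as a stand-in for Spring's Specification interface. This is an analogy, not the actual Spring Data API; Specification.where(first).and(second) composes conditions in the same way:

```java
import java.util.function.Predicate;

// Plain-Java analogy for Specification chaining: one initial condition,
// with further conditions and()-ed onto it.
public class SpecChainSketch {
    public static Predicate<String> combined() {
        Predicate<String> notEmpty = s -> !s.isEmpty();
        Predicate<String> shortEnough = s -> s.length() <= 10;
        // analogous to Specification.where(notEmpty).and(shortEnough)
        return notEmpty.and(shortEnough);
    }

    public static void main(String[] args) {
        System.out.println(combined().test("artemis")); // true
        System.out.println(combined().test(""));        // false
    }
}
```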
Different joins are available (e.g. Join, ListJoin, SetJoin, CollectionJoin, …) - please choose the right one.
If we want to join from X to Y, we need to define the column and the join type. Note that when the join type is not specified, an inner join is created by default.
Join<X, Y> join = root.join(X_.COLUMN, JoinType.LEFT);
We can define custom on clauses to specify the join condition.
Sub-queries are usually fine unless they are dependent sub-queries (also known as correlated sub-queries).
Dependent Sub-Query:
In an SQL database query, a correlated sub-query is a sub-query (a query nested inside another query) that uses values from the outer query. A dependent sub-query can cause performance problems, because it typically needs to be run once for each row in the outer query; e.g. if your outer query has 1000 rows, the sub-query will be run 1000 times.
Independent Sub-Query:
An independent sub-query is a sub-query that can be run on its own, without the main (sub-)query. Therefore, an independent sub-query typically only needs to be evaluated once.
You can find additional information on dependent sub-queries and how to identify them here.
public static Specification<User> getAllUsersMatchingCourses(Set<Long> courseIds) {
    return (root, query, criteriaBuilder) -> {
        Root<Course> courseRoot = query.from(Course.class);
        Join<User, String> group = root.join(User_.GROUPS, JoinType.LEFT);
        // Select all possible group types
        String[] columns = new String[] { Course_.STUDENT_GROUP_NAME, Course_.TEACHING_ASSISTANT_GROUP_NAME,
                Course_.EDITOR_GROUP_NAME, Course_.INSTRUCTOR_GROUP_NAME };
        Predicate[] predicates = Arrays.stream(columns).map((column) -> criteriaBuilder.in(courseRoot.get(column)).value(group)).toArray(Predicate[]::new);
        // The course needs to be one of the selected
        Predicate inCourse = criteriaBuilder.in(courseRoot.get(Course_.ID)).value(courseIds);
        group.on(criteriaBuilder.or(predicates));
        query.groupBy(root.get(User_.ID)).having(criteriaBuilder.equal(criteriaBuilder.count(group), courseIds.size()));
        return criteriaBuilder.in(courseRoot.get(Course_.ID)).value(courseIds);
    };
}
We can simply return null, since specifications/predicates that are null are ignored when combining multiple specifications (e.g., specification.and(otherSpecification)) or when constructing a predicate from it.
The first term is a main feature of Artemis and uses code highlighting, e.g. “Programming exercises:”.
Possible feature tags are: Programming exercises, Modeling exercises, Text exercises, Quiz exercises, File upload exercises, Lectures, Exam mode, Assessment, Communication, Notifications, Tutorial groups. More tags are possible if they make sense.
If no feature makes sense, and it is a pure development or test improvement, we use the term “Development:”. More tags are also possible if they make sense.
Everything else belongs to the General category.
The colon is not highlighted.
After the colon, there should be a verbal form that is understandable by end users and non-technical persons, because this will automatically become part of the release notes.
The text should be short, not capitalized (except the first word), and should include the most important keywords. Avoid repeating the feature name if possible.
We generally distinguish between bugfixes (the verb “Fix”) and improvements (all kinds of verbs) in the release notes. This should be immediately clear from the title.
Good examples:
“Allow instructors to delete submissions in the participation detail view”
“Fix an issue when clicking on the start exercise button”
“Add the possibility for instructors to define submission policies”
If the pull request doesn’t have any activity for at least 7 days, the stale bot will mark the PR as stale.
The stale status can simply be removed by adding a comment or a commit to the PR.
After the PR is marked as stale, the bot waits another 14 days before closing the PR (21 days in total).
Adding activity to the PR will remove the stale label again and reset the stale timer.
To prevent the bot from adding the stale label to the PR in the first place, the no-stale label can be used.
This label should only be utilized if the PR is blocked by another PR or the PR needs help from another developer.
Inclusive, diversity-sensitive language is important because language influences our thoughts and assessments of reality
and actively contributes to equality.
It is our responsibility to design software that is aware of diverse users, contributors, and developers.
We therefore provide guidelines for inclusive, diversity-sensitive, and appreciative language used for code, documentation, and user interaction.
Note
This page is not meant to be an exhaustive reference.
It rather describes some general guidelines and illustrates some best practices to follow.
The English language is used for code, documentation, and user interaction.
Best practices
Avoid using terms that have social history: Terms that can have historical significance or impact in regards to race, ethnicity, national origin, gender, age, mental and physical ability, sexual orientation, socioeconomic status, religion, and educational background.
Avoid using idioms and jargon: These can exclude people who don’t have particular specialized knowledge, and many idioms don’t translate from country to country. Additionally, they sometimes have origins in negative stereotypes.
Write inclusive examples: Try to avoid using examples in the documentation that are culturally specific to a particular country, and be sure to use diverse names.
Examples - Common terms and recommendations for replacements:
Socially-charged language: Language that has historical or social roots, often assuming one classification as dominant over another.
Master, slave → primary/main, secondary/replica
Owner, master → lead, manager, expert
Blacklist → deny list, exclusion list, block list, banned list
Whitelist → allow list, inclusion list, safe list
Native feature → core feature, built-in feature
Culture fit → values fit
Gendered language: Language that either assumes the gender of the users and developers, or that makes assumptions of gender.
Man hours → labor hours, work hours
Manpower → labor, workforce
Guys (referring to a group) → folks, people, engineers/artists
The German language is mainly used for user interaction.
Best practices (written in German for demonstration purposes)
Nutze neutrale Formulierungen: Wenn möglich ist es empfehlenswert, neutrale Formulierungen zu finden. Neben passiven Formen, wie z.B. Studierende, können auch aktive mit „Mensch“ bzw. „Person“ etc. verwendet werden. Das spart „der/die“ und inkludiert alle Menschen unabhängig davon, welche Eigenschaften/Kategorien diese Personen sonst noch mitbringen.
Beschreibe Handlungen (aktiv oder passiv):
„Automatisch generierte Ergebnisse werden am Ende des Bearbeitungszeitraums erstellt.“ Statt: „Die Studierenden erhalten am Ende des Bearbeitungszeitraums ein automatisch generiertes Ergebnis.“
„Die Anmeldung für die Klausur ist bis zum 15.12.2020 möglich”. Statt „Jeder muss sich bis zum 15.12.2020 für die Klausur anmelden.“
Sprich Personen direkt an:
„Bitte melde dich bis zum 15.12.20 für die Lehrveranstaltung an“. Statt „Jeder muss sich bis zum 15.12.2020 für die Lehrveranstaltung anmelden.“
Rücke die Organisation in den Vordergrund:
„das Lehrstuhlpersonal“, „Beschäftigte des Lehrstuhls“, statt „die Lehrstuhlmitarbeiter/ die Lehrstuhlmitarbeiterinnen“
„das Organisationsteam“, statt „die Organisatoren, die Organisatorinnen“
Verwende die neutrale Pluralform:
„die Studierenden“, statt „die Studenten/ die Studentinnen“
„die Lehrenden“, statt „die Dozenten/ die Dozentinnen“
Verwende die neutralen Pronomina:
„niemand“, statt „keiner“
„alle die“, statt “jeder”
Examples - Terms replaced in the Artemis user interface
The following sources were used when creating these guidelines.
They provide detailed information on this topic and therefore can and should serve as further orientation.
The following diagram shows the top-level design of Artemis which is decomposed into an application client
(running as Angular web app in the browser) and an application server (based on Spring Boot).
For programming exercises, the application server connects to a version control system (VCS) and
a continuous integration system (CIS).
Authentication is handled by an external user management system (UMS).
While Artemis includes generic adapters to these three external systems with a defined protocol that can be instantiated
to connect to any VCS, CIS, or UMS, it also provides three concrete implementations for these adapters to connect to:
VCS: Atlassian Bitbucket Server
CIS: Atlassian Bamboo Server
UMS: Atlassian JIRA Server (more specifically Atlassian Crowd on the JIRA Server)
The following UML deployment diagram shows a typical deployment of Artemis application server and application client.
Student, Instructor and Teaching Assistant (TA) computers are all equipped equally with the Artemis application client
being displayed in the browser.
The Continuous Integration Server typically delegates the build jobs to local build agents within
the university infrastructure or to remote build agents, e.g. hosted in the Amazon Cloud (AWS).
The Artemis application server uses the following (simplified) data model in the MySQL database.
It supports multiple courses with multiple exercises.
Each student in the participating student group can participate in the exercise by clicking
the Start Exercise button.
Then a repository and a build plan for the student (User) will be created and configured.
The initialization state helps to track the progress of this complex operation and allows recovering from errors.
A student can submit multiple solutions by committing and pushing changes to the provided example code
into the version control system, or by using the user interface.
The continuous integration server automatically tests each submission, and notifies the Artemis application server,
when a new result exists.
In addition, teaching assistants can assess student solutions and “manually” create results.
Please note that the actual database model is more complex. The UML class diagram above omits some details for
readability (e.g. lectures, student questions, exercise details, static code analysis, quiz questions, exam sessions,
submission subclasses, etc.).
The following UML component diagram shows more details of the Artemis application server architecture and its
REST interfaces to the application client.
To prevent accidental irreversible database modifications, we use Liquibase to prepare changes when developing that can be confirmed in the review process.
liquibaseClearChecksums: Use this whenever Liquibase detects inconsistencies between the database changelog and the XML changelog
liquibaseDiffChangeLog: Generates a new changelog from the current database state. This command does not appear to work on all operating systems, and you might have to add a changelog manually.
The changelog manifest lies in src/main/resources/config/liquibase/master.xml, which imports all single changelog files. To create a new change, you have to do the following:
Get the current time in the format YYYYMMDDHHmmss.
Create a new file in /changelog named <formatted-time>_changelog.xml and include this file at the bottom of master.xml, like every other file.
Add your changelog to your newly created file. Take other changes and the Liquibase documentation as inspiration.
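The formatted time from step 1 can be produced programmatically, for example with this small Java helper (a sketch, not part of Artemis):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

// Small helper sketch (not part of Artemis) that produces a changelog file name
// in the required <formatted-time>_changelog.xml form (time as YYYYMMDDHHmmss).
public class ChangelogNameSketch {
    public static String fileNameFor(LocalDateTime time) {
        return time.format(DateTimeFormatter.ofPattern("yyyyMMddHHmmss")) + "_changelog.xml";
    }

    public static void main(String[] args) {
        // prints something like 20240305143000_changelog.xml
        System.out.println(fileNameFor(LocalDateTime.now()));
    }
}
```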
All executed entries are saved in the table databasechangelog. If you delete an entry from this table, the corresponding change gets executed again.
Test your changes locally first, and only commit changes you are confident that work.
Before deploying any database changes to a test server, ask for official permission from the project lead. If the changes don’t get approved, manual rollbacks can be necessary, which are avoidable.
Make sure to add your name to the changeSet in your file as well as your formatted time as the id. Refer to other changes for further help.
All executed entries are saved in the table migration_changelog. If you delete entries, they get executed again.
Test your changes locally first, and only commit changes you are confident that work.
Before deploying any database changes to a test server, ask for official permission from the project lead. If the changes don’t get approved, manual rollbacks can be necessary, which are avoidable.
If queries fail due to the authorization object being null, call SecurityUtils.setAuthorizationObject() beforehand to set a dummy object.
A guided tutorial can be created by instantiating a GuidedTour object.
This object has the mandatory attributes settingsKey, the identifier for the tutorial which will be stored in the
database, and steps, which is an array that stores all tutorial steps.
A tutorial can have different types of tutorial steps:
TextTourStep: tutorial step with only text content
ImageTourStep: tutorial step with text content and embedded image
VideoTourStep: tutorial step with text content and embedded video
UserInteractionTourStep: tutorial step which requires a certain interaction from the user to proceed
to the next step.
ModelingTaskTourStep: tutorial step with text content and modeling task for the Apollon editor
that is assessed for the step
AssessmentTaskTourStep: tutorial step with text content and a tutor assessment task for example submissions
(currently only implemented for text assessments).
In this example, the GuidedTour object is created and assigned to the constant exampleTutorial,
which one can use to embed the tutorial to a component of choice.
TextTourStep: The mandatory fields are headlineTranslateKey and contentTranslateKey.
ImageTourStep: The ImageTourStep extends the TextTourStep and has imageUrl as an additional mandatory
attribute.
VideoTourStep: The VideoTourStep extends the TextTourStep and has videoUrl as an additional mandatory
attribute.
UserInterActionTourStep: The UserInterActionTourStep extends the TextTourStep and is used
to include interaction tasks for the user during the tour step.
It has the additional mandatory attribute userInteractionEvent,
which defines the interaction type, and the optional attribute triggerNextStep.
ModelingTaskTourStep: The ModelingTaskTourStep extends the UserInterActionTourStep and
has modelingTask as an additional mandatory attribute.
AssessmentTaskTourStep: The AssessmentTaskTourStep extends the UserInterActionTourStep and
has assessmentTask as an additional mandatory attribute.
There are many optional attributes that can be defined for a tour step. These attributes and their definition
can be found in the abstract class TourStep.
Below, you can find a list of attributes that are used more often:
highlightSelector: For the highlightSelector you have to enter a CSS selector for the HTML element that you
want to highlight for this step.
For better maintainability of the guided tutorials, it is strongly advised to create new selectors
with the prefix guided-tour within the DOM and use it as the highlight selector.
orientation: We can define an orientation for every tour step individually.
The tour step orientation is used to define the position of the tour step next to the highlighted element.
highlightPadding: This attribute sets the additional padding around the highlight element.
userInteractionEvent: Some steps require user interactions, e.g. certain click events,
before the next tour step can be enabled.
The supported user interactions are defined in the enum UserInteractionEvent.
pageUrl: If you want to create a multi-page tutorial, i.e. a tutorial that guides the user
through multiple component pages, then you have to use this attribute.
The pageUrl should be added to the first tutorial step of every page, and if the URL contains identifiers
such as course or exercise ids, these numbers should be replaced with the regex (\d+)+.
An example of multi-page tutorials can be found in the tutor-assessment-tour.ts file.
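The id replacement described above can be sketched as follows (the helper name is made up; only String.replaceAll from the Java standard library is used):

```java
// Sketch of rewriting a concrete URL into the pattern form described above,
// replacing numeric ids with the regex (\d+)+. The class name is hypothetical.
public class PageUrlPatternSketch {
    public static String toPattern(String url) {
        // In the replacement string, "\\\\d" yields a literal backslash-d
        return url.replaceAll("\\d+", "(\\\\d+)+");
    }

    public static void main(String[] args) {
        System.out.println(toPattern("/course-management/42/programming-exercises/7"));
        // /course-management/(\d+)+/programming-exercises/(\d+)+
    }
}
```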
In order to allow internationalization, the values for the attributes headlineTranslateKey,
subHeadlineTranslateKey, contentTranslateKey and hintTranslateKey reference the text snippets
which are stored in a JSON translation document.
Further attributes that need translations are videoUrl for VideoTourStep and taskTranslateKey
for the modelingTask in the ModelingTaskTourStep.
One JSON document that is used for the translations of guided tutorials is the file guidedTour.json.
There are multiple service methods to embed a guided tutorial in an application component file.
We use the GuidedTutorialService in the component through dependency injection and invoke the fitting method
to enable the tutorial for the component:
The enableTourForCourseOverview method is used when the tutorial should be enabled for a certain course in
a component, which displays a list of courses (e.g. overview.component.ts).
It returns the course for which the tutorial is enabled, if available, otherwise null.
The enableTourForCourseExerciseComponent method is used when the tutorial should be enabled for a certain course
and exercise in a component, which displays a list of exercises for a course (e.g. course-exercises.component.ts).
It returns the exercise for which the tutorial is enabled, if available, otherwise null.
The enableTourForExercise method is used when the tutorial should be enabled for a certain exercise
(e.g. course-exercise-details.component.ts).
It returns the exercise for which the tutorial is enabled, if available, otherwise null.
The mapping of guided tutorials to certain courses and exercises is configured in the application-dev.yml and
application-prod.yml files.
The yaml configuration below shows that the guided tutorials are only enabled for the course with the short name
artemistutorial.
The configuration for tours shows a list of mappings tutorialSettingsKey → exerciseIdentifier.
The exerciseIdentifier for programming exercises is the exercise short name, otherwise it’s the exercise title.
The optional course-group-students property is used to automatically add the given tutorial’s course group to
all newly created users.
This functionality can be extended to users with instructor or teaching assistant roles by adding the optional
course-group-instructors and/or course-group-tutors properties.
In this case, newly created users with instructor or teaching assistant roles will be assigned to their
respective tutorial course groups.
Through Jest client tests it is possible to start the guided tutorials and go through all the tutorial steps while
checking for the highlight selectors.
An example test suite for the courseOverviewTour can be found in the overview.component.spec.ts file.
Deployment via Bamboo. Only for branches of the ls1intum/Artemis repository, no forks.
Guide available in the Artemis Developer Confluence Space: Deploying changes to test server.
Pull requests on GitHub can be deployed to TS5, including forks.
To invoke a deployment, you need to be part of the @ls1intum/artemis-developers GitHub team.
During deployment, the workflow passes through the following states: waiting for build to finish, deployment waiting for approval, review deployment, and deployment done.
Start the deployment by reviewing the Build & Deploy action.
(Refer to the GitHub documentation “Reviewing deployments”.)
TS5 is locked to a pull request using the lock:artemistest5 label.
The workflow applies the lock label automatically on deployment.
Remove the label from the PR once the test server is free to use by other developers.
General Structure of Programming Exercise Execution
Artemis uses Docker containers to run programming exercises. This ensures that the students’ code does not have
direct access to the build agents’ hardware.
To reduce the time required for each test run, these Docker containers already include commonly used dependencies
such as JUnit.
The Cypress test suite contains system tests verifying the most important features of Artemis.
System tests test the whole system and therefore require a complete deployment of Artemis first.
In order to prevent as many faults (bugs) as possible from being introduced into the develop branch,
we want to execute the Cypress test suite whenever new commits are pushed to a Git branch
(just like the unit and integration test suites).
To accomplish this we need to be able to dynamically deploy multiple different instances of Artemis at the same time.
An ideal setup would be to deploy the whole Artemis system using Kubernetes.
However, this setup is too complex at the moment.
The main reason for the complexity is that it is very hard to automatically set up Docker containers for
the external services (Jira, Bitbucket and Bamboo) and connect them directly with Artemis.
Therefore, the current setup only dynamically deploys the Artemis server and configures it to connect to
the prelive system, which is already properly set up in the university data center.
Every execution of the Cypress test suite requires its own deployment of Artemis.
The easiest way to accomplish this is to deploy Artemis locally on the build agent, which executes the Cypress tests.
Using docker compose we can start a MySQL database and the Artemis server locally on the build agent and
connect it to the prelive system in the university data center.
Artemis Deployment on Bamboo Build Agent for Cypress
In total there are three Docker containers started in the Bamboo build agent:
MySQL
This container starts a MySQL database and exposes it on port 3306.
The container automatically creates a new database ‘Artemis’ and configures it
with the recommended settings for Artemis.
The Cypress setup reuses the already existing
MySQL docker image
from the standard Artemis Docker setup.
Artemis
The Docker image for the Artemis container is created from the already existing
Dockerfile.
When the Bamboo build of the Cypress test suite starts, it retrieves the Artemis executable (.war file)
from the Artemis build plan.
Upon creation of the Artemis Docker image the executable is copied into the image together with configuration files
for the Artemis server.
The main configuration of the Artemis server is contained in the
application.yml file.
However, this file does not contain any security relevant information.
Security relevant settings like the credentials to the Jira admin account in the prelive system are instead passed to
the Docker container via environment variables.
This information is accessible to the Bamboo build agent via
Bamboo plan variables.
The Artemis container is also configured to
depend on
the MySQL container and uses
health checks
to wait until the MySQL container is up and running.
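The dependency described above can be expressed in Docker Compose roughly as follows; the service names and the health-check command are illustrative assumptions, not the exact Artemis setup:

```yaml
services:
    mysql:
        image: mysql:8
        healthcheck:
            # report the container as healthy once the MySQL server answers pings
            test: ['CMD', 'mysqladmin', 'ping', '-h', 'localhost']
            interval: 5s
            timeout: 3s
            retries: 30
    artemis:
        image: artemis
        depends_on:
            mysql:
                # only start Artemis once MySQL is up and running
                condition: service_healthy
```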
Cypress
Cypress offers a variety of docker images
to execute Cypress tests.
We use an image which has the Cypress operating system dependencies and a Chrome browser installed.
However, Cypress itself is not installed in
these images.
This is convenient for us because the image is smaller and the Artemis Cypress project requires
additional dependencies to fully function.
Therefore, the Artemis Cypress Docker container is configured to install all dependencies
(using npm ci) upon start. This also installs Cypress itself.
Afterwards the Artemis Cypress test suite is executed.
The necessary configuration for the Cypress test suite is also passed in via environment variables.
Furthermore, the Cypress container depends on the Artemis container and is only started
once Artemis has been fully booted.
Bamboo webhook
The Artemis instance deployed on the build agent is not publicly available to improve the security of this setup.
However, in order to get the build results for programming exercise submissions, Artemis relies on a webhook from Bamboo
that sends POST requests to Artemis.
To allow this, an extra rule has been added to the firewall allowing only the Bamboo instance in the prelive system
to connect to the Artemis instance in the build agent.
Timing
As mentioned above, we want the Cypress test suite to be executed whenever new commits are pushed to a Git branch.
This has been achieved by adding the
Cypress Github build plan
as a child dependency
to the Artemis Build build plan.
The Artemis Build build plan is triggered whenever a new commit has been pushed to a branch.
The Cypress build plan is only triggered after a successful build of the Artemis executable.
This does imply a delay (about 10 minutes on average) between the push of new commits and the execution
of the Cypress test suite, since the new Artemis executable first has to be built.
NOTE: The Cypress test suite is only automatically executed for internal branches and pull requests
(which requires access to this GitHub repository), not for external ones.
In case you need access rights, please contact the maintainer Stephan Krusche.
Automatic flaky test detection based on changed code
In addition to our regular Cypress execution, we also run a special experimental build plan that attempts to detect
flaky tests based on the changed code. To do this, we have some special Docker configurations that are specific to this
build plan.
Docker Image Extensions
We extend the existing Dockerfile to create the Docker image for the Artemis
container. For the flaky test detection build plan, we need to change the Artemis startup and add the unzip
dependency. For this purpose, we have a special Dockerfile that extends the original one and adds these changes. The
Dockerfile can be found here. To build it, the regular image
first has to be built and tagged with artemis:coverage-latest.
Additionally, we need Java in the Cypress container for the flaky test detection, so we have a special Dockerfile for
the Cypress container that extends the original one and adds the Java installation. This Dockerfile can be found
here.
Docker Compose Changes
The Docker Compose file for the flaky test detection is located
here. This file includes some overrides for the regular
Docker Compose file. The main differences are that we use the extended Dockerfiles for the Artemis and Cypress
containers, and we also change the Cypress startup command to include our coverage analysis. To use the overrides,
you can run the following command: docker compose -f cypress-E2E-tests.yml -f cypress-E2E-tests-coverage-override.yml up.
This setup allows us to run the flaky test detection build plan in parallel with the regular Cypress build plan. If
there is no overlap between the changed code and the files covered by failed tests, we label plan executions with the
suspected-flaky label.
There is another build plan on Bamboo which executes the Cypress test suite.
This build plan
deploys the latest Artemis executable of the develop branch on an already configured test environment (test server 3)
and executes the Cypress test suite against it.
This build plan is automatically executed every 8 hours and verifies that test server 3 is working properly.
Artemis Deployment on test environment for Cypress
The difference in this setup is that the Artemis server is deployed on a separate environment which already contains
the necessary configuration files for the Artemis server to connect to the prelive system.
The Docker image for the Cypress container should be exactly the same as the Cypress image used in
the docker compose file for the deployment on a Bamboo build agent.
The Artemis Dockerfile as well as the MySQL image are already maintained because they are used in
other Artemis Docker setups.
Therefore, only Cypress and the Cypress Docker image require active maintenance.
Since the Cypress test suite simulates a real user, it makes sense to execute the test suite with
the latest Chrome browser.
The Cypress Docker image we use always has a specific Chrome version installed.
Therefore, the
docker-compose file
as well as the
build plan configuration for the Cypress tests on test server 3
should be updated every month to make sure that the latest Cypress image for the Chrome browser is used.
We are dedicated to providing great software support for excellent teaching.
To achieve that, we develop Artemis as free and open-source software.
The core ideas of free software have always been “use”, “study”, “share”, and “improve”.
With Artemis, we embrace all of those aspects:
Artemis is used by over ten universities and thousands of students.
Artemis is part of active research and is constantly improved by students as part of university courses or
master’s/bachelor’s theses.
Studying the Artemis source code helps our students to improve their software engineering skills
while working on a real, actively used product.
The foundation of learning and teaching is sharing knowledge.
Artemis is used to conduct courses at universities and, at the same time, used as an education research platform.
In an nginx proxy, you can define a fallback page that is shown when Artemis is not reachable.
Add the special location and error_page directives to the server section for Artemis as shown below.
Place the webpage that should be shown in case of Artemis being unreachable (in this case /srv/http/service-down.html) somewhere readable by the system user that runs nginx.
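A minimal sketch of such a configuration; the path and the rest of the server block are placeholders that have to be adapted to your existing Artemis proxy setup:

```nginx
server {
    # ... existing Artemis proxy configuration ...

    # serve the static fallback page when the Artemis upstream is unreachable
    error_page 502 /service-down.html;

    location = /service-down.html {
        root /srv/http;
        internal;  # only reachable via error_page, not directly
    }
}
```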
Users can register a new account on the start page based on the regex defined in allowed-email-pattern.
If no email pattern is defined, any email address can be used.
Upon registration, users receive an email to activate their account.
You can find more information on how to configure the email server in the official JHipster documentation.
Artemis supports user login and registration using SAML2 / Shibboleth.
The SAML2 feature is intended for use with Artemis’ internal user management and primarily serves as a registration mechanism.
With the help of this feature it is possible to store not only the login, name and email, but also the student’s matriculation number directly in the database.
For each user who registers in the system for the first time, a “normal” Artemis user is created and the data is taken from the attributes of the Shibboleth request.
The feature is activated by the saml2 profile.
If you use a reverse proxy, you have to redirect the following endpoints to the Artemis server: /login and /saml2.
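With nginx, the forwarding of those two endpoints could look like the following sketch; the upstream address is a placeholder assumption:

```nginx
# forward the SAML2-related endpoints to the Artemis server (placeholder upstream)
location /login {
    proxy_pass http://localhost:8080;
}
location /saml2 {
    proxy_pass http://localhost:8080;
}
```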
If you activate the SAML2 feature, an additional login button is shown (you can set the text of the button as you like):
The workflow of the SAML2 feature is shown in the following picture:
The SAML2 library of Spring Boot is used to create a second security filter chain.
The new (and old) security filter chain is presented in the following figure:
The feature is configured by the application-saml2.yml file.
You can configure multiple identity providers.
In addition, the SAML2 feature allows deciding whether a user can obtain a password (see “info.saml2.enable-password”).
This app password allows using the connected services such as the VCS and CI as usual with the local user credentials.
You can see the structure of the saml2 configuration in the following:
saml2:
    # Define the patterns used when generating users. SAML2 attributes can be substituted by surrounding them with
    # curly brackets, e.g. username: '{user_attribute}'. Missing attributes are replaced with an empty string.
    # This enables the definition of alternative attribute keys when using multiple IdPs, e.g. username: '{uid}{user_id}'.
    # User template pattern:
    username-pattern: '{first_name}_{last_name}'
    first-name-pattern: '{first_name}'
    last-name-pattern: '{last_name}'
    email-pattern: '{email}'
    registration-number-pattern: '{uid}'
    lang-key-pattern: 'en'  # can be a pattern or fixed to en/de
    # It is also possible to extract only parts of the attribute values.
    # For each attribute key, exactly one regular expression can optionally be defined that is used to extract only
    # parts of the received value. The regular expression must match the whole value. It also has to contain a named
    # capture group with the name 'value'.
    # E.g. when receiving 'pre1234post' from the SAML2 service in the 'uid' example below, only '1234' will be used
    # when replacing '{uid}' in one of the user attributes defined above.
    value-extraction-patterns:
        #- key: 'registration_number'
        #  value_pattern: 'somePrefix(?<value>.+)someSuffix'
        #- key: 'uid'
        #  value_pattern: 'pre(?<value>\d+)post'
    # A list of identity providers (IdP). The metadata location can be a local path (or classpath) or a URL.
    # If your IdP does not publish its metadata, you can generate it here: https://www.samltool.com/idp_metadata.php
    identity-providers:
        #- metadata: https://idp_host/.../metadata
        #  registration-id: IdPName
        #  entity-id: artemis
        #  cert-file: # path to the certificate for encryption/signing (optional, can be left blank)
        #  key-file: # path to the key for encryption/signing (optional, can be left blank; must be a PKCS#8 file!)
        # Multiple IdPs can be configured:
        # - metadata: <URL>
        #   registrationid: <id>
        #   entityid: <id>

# String used for the SAML2 login button, e.g. 'Shibboleth Login'
info.saml2.button-label: 'SAML2 Login'
# Sends an e-mail to the new user with a link to set the Artemis password. This password allows login to Artemis and
# its services such as GitLab and Jenkins, so that users can use password-based Git workflows.
# Enables the password reset function in Artemis.
info.saml2.enable-password: true
Artemis distinguishes between six different roles: ADMIN, INSTRUCTOR, EDITOR, TA (Teaching Assistant), USER and ANONYMOUS.
The roles are sorted in descending order. An INSTRUCTOR has at least all the permissions that an EDITOR has.
An ADMIN has no restrictions, while an ANONYMOUS user has hardly any rights.
ADMIN: can access all features that Artemis provides (includes features regarding the server administration, e.g. server health checks, user management and creating new courses)
INSTRUCTOR: can access all features related to the content of a course (includes creating and deleting exercises/exams/lectures and monitoring scores and submissions)
EDITOR: can create and edit the content of a course but cannot delete content and monitoring other course participants is limited
TEACHING ASSISTANT: can assess student submissions and view course content before the release date
USER: participates as a Student in courses, can view course content after the release date
ANONYMOUS: role before login
When changing the access rights of a user, the respective user must log off and then log on again for the changes to take effect.
In the following, the respective permissions are illustrated. If a subordinate role has all permissions, or a role does not have any access rights, it is not explicitly displayed.
Do not use pgloader to convert the database from MySQL to PostgreSQL.
This results in a database schema that is not compatible with future migrations.
PgLoader converts constraint names into all-lowercase.
The Liquibase migrations assume that the constraints keep their original names, which contain the case-sensitive prefix FK.
Start Artemis at least once in version 6.0.0 or greater to make sure the current database schema is PostgreSQL-compatible.
Stop Artemis.
Create a database backup using mysqldump --all-databases > Artemis.sql.
This dump is called Artemis.sql in the following steps.
Copy the docker-compose.yml file into the same directory as the Artemis.sql database dump
and run the following commands to convert the Artemis.sql dump into Artemis.pg.sql that is usable by PostgreSQL.
docker-compose.yml with helper containers for MySQL and PostgreSQL.
Commands to transform the MySQL dump into a PostgreSQL one.
#!/usr/bin/env bash

# start the temporary MySQL and Postgres containers
docker compose up -d

# import database dump into MySQL
docker compose exec -T mysql mysql < Artemis.sql

# use pgloader to transfer data from MySQL to Postgres
docker run --rm --network=artemis-db-migration docker.io/dimitri/pgloader pgloader mysql://root@mysql/Artemis postgresql://root@postgres/Artemis

# dump the Postgres data in a format that can be imported into the actual database
docker compose exec -T postgres pg_dump -Ox Artemis > Artemis.pg.sql

# clean up
docker compose down
Note
Alternatively, you could use some temporary database on your PostgreSQL instance that can be deleted afterwards to migrate the data directly from your production MySQL into there.
Use this temporary PostgreSQL database to create the Artemis.pg.sql dump that can be imported into the production database after merging with the proper schema.
In that case, the pgloader command in the steps above should work similarly without the --network flag and with adapted database connection URLs.
For pg_dump, add the necessary flags to connect to your database in addition to -Ox.
Update the Artemis config to connect to an empty new PostgreSQL database (see Connecting Artemis to PostgreSQL).
Start Artemis, wait until it has finished starting up and created the schema, and stop it again.
Warning
Use the same version V that was connected to MySQL before.
Dump the schema Artemis has created on the PostgreSQL server in the previous step using
pg_dump -Ox Artemis > empty.pg.sql
Now the database schema as created by Artemis (empty.pg.sql) and the one containing the actual data migrated from MySQL (Artemis.pg.sql) need to be merged.
Use the following script like python3 ./merge.py > merged.pg.sql to create the merged database dump.
#!/usr/bin/env python3
"""Merges two database dumps

- empty.pg.sql
- Artemis.pg.sql

created from an Artemis database where `empty.pg.sql` contains a fresh DB
schema as created by the first start of Artemis from a new database, and
`Artemis.pg.sql` is a dump from an Artemis database that was converted from
MySQL to PostgreSQL using pgloader.

It is merged so that the schema definitions are taken from `empty.pg.sql` and
the actual data comes from `Artemis.pg.sql`. The script assumes the order of
operations in the dumps: first the schema is created, then data is inserted,
and finally foreign key constraints and indices are added.

Both the empty database dump and the original MySQL data must come from an
_identical_ version of Artemis. Otherwise, the data to be inserted might not
match the schema definition.
"""

from pathlib import Path
from typing import Iterator


def _fix_schema(line: str) -> str:
    if line.startswith("COPY artemis."):
        return line.replace("COPY artemis.", "COPY public.", 1)
    if line.startswith("SELECT"):
        old = "SELECT pg_catalog.setval('artemis."
        new = "SELECT pg_catalog.setval('public."
        return line.replace(old, new, 1)
    return line


def _extract_data(data_file_path: Path) -> None:
    with open(data_file_path, encoding="utf-8") as data_file:
        copy_found = False
        for line in data_file:
            if not copy_found and line.startswith("COPY "):
                copy_found = True
            if copy_found and line.startswith("ALTER TABLE "):
                break
            if copy_found:
                print(_fix_schema(line), end="")


def _merge_files(*, schema_file_path: Path, data_file_path: Path) -> None:
    with open(schema_file_path, encoding="utf-8") as schema_file:
        schema_file_iter: Iterator[str] = iter(schema_file)

        for line in schema_file_iter:
            if line.startswith("COPY "):
                break
            print(line, end="")

        _extract_data(data_file_path)

        alter_table_found = False
        for line in schema_file_iter:
            if line.startswith("ALTER TABLE "):
                alter_table_found = True
            if alter_table_found:
                print(line, end="")


def main() -> None:
    print("-- ensure fresh schema")
    print("drop schema if exists public cascade;")
    print("create schema public;")
    print()
    _merge_files(
        schema_file_path=Path("empty.pg.sql"),
        data_file_path=Path("Artemis.pg.sql"),
    )


if __name__ == "__main__":
    main()
Import the merged database dump merged.pg.sql into the production PostgreSQL database using psql < merged.pg.sql.
Warning
The schema public of the target database will be deleted and completely overwritten when importing.
Currently, Jenkins doesn’t allow updating the user details. We would have to delete the user and create it again with the new information.
We can only do this if we have the password, which we don’t have unless the admin provides a new one, as the passwords in our database are hashed.
If you update the user details in Artemis you have the following options if you use Jenkins:
Update the user manually in Jenkins. As admins rarely update the users themselves, this wouldn’t be too much work.
Leave it as it is. You have to decide for yourself whether the user details in Jenkins are important.
Provide a password. Not recommended! You would need to provide the user with the new password which is itself a security issue and then the user has to change the password which might not be done reliably.
Important to know:
Group/permission updates are always delegated. They are separate from user details.
If users change their password after a non-delegated user change, the user change is automatically applied when the Jenkins user is recreated.
Jan Philip Bernius, Stephan Krusche, and Bernd Bruegge. Machine learning based feedback on textual student answers in large courses. Computers and Education: Artificial Intelligence, June 2022. doi:10.1016/j.caeai.2022.100081.
Stephan Krusche. Semi-automatic assessment of modeling exercises using supervised machine learning. In 55th Hawaii International Conference on System Sciences, HICSS '22, 1–10. ScholarSpace, January 2022. URL: http://hdl.handle.net/10125/79439.
Gerhard Hagerer, Laura Lahesoo, Miriam Anschütz, Stephan Krusche, and Georg Groh. An analysis of programming course evaluations before and after the introduction of an autograder. In 19th International Conference on Information Technology Based Higher Education and Training, ITHET '21, 1–9. IEEE, November 2021. doi:10.1109/ITHET50392.2021.9759809.
Jan Philip Bernius, Stephan Krusche, and Bernd Bruegge. A machine learning approach for suggesting feedback in textual exercises in large courses. In 8th ACM Conference on Learning @ Scale, L@S '21, 173–182. Association for Computing Machinery (ACM), June 2021. doi:10.1145/3430895.3460135.
Anne Münzner, Nadja Bruckmoser, and Alexander Meschtscherjakov. Can i code? user experience of an assessment platform for programming assignments. In 2nd International Computer Programming Education Conference, volume 91 of ICPEC '21, 18:1–18:12. Schloss Dagstuhl – Leibniz-Zentrum für Informatik, May 2021. doi:10.4230/OASIcs.ICPEC.2021.18.
Jan Philip Bernius. Toward computer-aided assessment of textual exercises in very large courses. In 52nd ACM Technical Symposium on Computer Science Education, SIGCSE '21, 1386. Association for Computing Machinery (ACM), March 2021. doi:10.1145/3408877.3439703.
Jan Philip Bernius, Anna Kovaleva, Stephan Krusche, and Bernd Bruegge. Towards the automation of grading textual student submissions to open-ended questions. In 4th European Conference of Software Engineering Education, ECSEE '20, 61–70. Association for Computing Machinery (ACM), June 2020. doi:10.1145/3396802.3396805.
Stephan Krusche, Nadine von Frankenberg, Lara Marie Reimer, and Bernd Bruegge. An interactive learning method to engage students in modeling. In 42nd International Conference on Software Engineering, Software Engineering Education and Training, ICSE-SEET '20, 12–22. ACM, June 2020. doi:10.1145/3377814.3381701.
Jan Philip Bernius, Anna Kovaleva, and Bernd Bruegge. Segmenting student answers to textual exercises based on topic modeling. In 17th Workshop on Software Engineering im Unterricht der Hochschulen, SEUH '20, 72–73. CEUR-WS.org, February 2020. URL: http://ceur-ws.org/Vol-2531/poster03.pdf.
Christopher Laß, Stephan Krusche, Nadine von Frankenberg, and Bernd Bruegge. Stager: simplifying the manual assessment of programming exercises. In 16th Workshop on Software Engineering im Unterricht der Hochschulen, SEUH '19, 34–43. CEUR-WS.org, February 2019. URL: http://ceur-ws.org/Vol-2358/paper-03.pdf.
Jan Philip Bernius and Bernd Bruegge. Toward the automatic assessment of text exercises. In 2nd Workshop on Innovative Software Engineering Education, ISEE '19, 19–22. CEUR-WS.org, February 2019. URL: http://ceur-ws.org/Vol-2308/isee2019paper04.pdf.
Stephan Krusche and Andreas Seitz. Increasing the interactivity in software engineering moocs - A case study. In 52nd Hawaii International Conference on System Sciences, HICSS '19, 1–10. ScholarSpace, January 2019. URL: https://hdl.handle.net/10125/60197.
Stephan Krusche and Andreas Seitz. Artemis: an automatic assessment management system for interactive learning. In 49th ACM Technical Symposium on Computer Science Education, SIGCSE '18, 284–289. ACM, February 2018. doi:10.1145/3159450.3159602.
Stephan Krusche, Nadine von Frankenberg, and Sami Afifi. Experiences of a software engineering course based on interactive learning. In 15th Workshop on Software Engineering im Unterricht der Hochschulen, SEUH '17, 32–40. CEUR-WS.org, February 2017. URL: http://ceur-ws.org/Vol-1790/paper04.pdf.
Stephan Krusche, Andreas Seitz, Jürgen Börstler, and Bernd Brügge. Interactive learning: increasing student participation through shorter exercise cycles. In 19th Australasian Computing Education Conference, ACE '17, 17–26. ACM, January 2017. doi:10.1145/3013499.3013513.
8. Comments
Only write comments for complicated algorithms, to help other developers better understand them. We should only add a comment if our code is not self-explanatory.