Once installation has completed successfully, you will want to run your contest! Configuring DOMjudge to run a contest (or a number of them, in sequence) involves the steps described below.
DOMjudge stores and retrieves most of its data from the MySQL database. Some information must be filled in beforehand; other tables will be populated by DOMjudge itself.
You can use the jury web interface to add, edit and delete most types of data described below. It is advisable to keep phpMyAdmin handy in case of emergencies, or for general database operations like import and export.
This section describes the meaning of each table and what you need to put into it. Tables marked with an `x' are the ones you have to configure with contest data before running a contest (via the jury web interface or e.g. with phpMyAdmin); the other tables are used automatically by the software:
  | clarification     | Clarification requests/replies are stored here.
x | configuration     | Runtime configuration settings.
x | contest           | Contest definitions with start/end time.
  | event             | Log of events during contests.
x | judgehost         | Computers (hostnames) that function as judgehosts.
  | judging           | Judgings of submissions.
x | language          | Definition of allowed submission languages.
x | problem           | Definition of problems (name, corresponding contest, etc.).
  | submission        | Submissions of solutions to problems.
x | team              | Definition of teams.
x | team_affiliation  | Definition of institutions a team can be affiliated with.
x | team_category     | Different category groups teams can be put in.
  | team_unread       | Records which clarifications are read by which team.
x | testcase          | Definition of testdata for each problem.
  | scoreboard_jury   | Cache of the scoreboards for public/teams and for
  | scoreboard_public | the jury separately, because of the possibility of
  |                   | score freezing.
A longer description (including fields) now follows for each table that has to be filled in manually.
This table contains configuration settings and is a work in progress. The entries are simply stored as name, value pairs.
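For illustration, such a pair could be inserted directly with SQL; the setting name below is purely invented, so check which names your DOMjudge version actually uses:

    -- Hypothetical example of a name, value pair (setting name invented):
    INSERT INTO configuration (name, value) VALUES ('example_setting', '42');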
The contests that the software will run, e.g. a test session and the live contest. cid is the reference ID and contestname is a descriptive name used in the interface. activatetime, starttime and endtime are required fields and specify when this contest is active and open for submissions. The optional freezetime and unfreezetime control scoreboard freezing. For a detailed treatment of these, see the section Contest milestones.
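As a sketch, a minimal contest row could be inserted directly with SQL (all values below are examples; the jury web interface is the usual way to do this, and column types may differ per DOMjudge version):

    -- Example contest, visible an hour before submissions open:
    INSERT INTO contest (cid, contestname, activatetime, starttime, endtime)
    VALUES (1, 'Test Session', '2010-01-23 09:00:00',
            '2010-01-23 10:00:00', '2010-01-23 15:00:00');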
List here the hosts that will be judging the submissions. hostname is the (short) hostname of a judge computer. active indicates whether this host should judge incoming submissions. polltime is an internally used variable to detect whether a judgedaemon is running on the host.
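For example, a judgehost could be registered with a statement like the following (assuming the active flag is stored as 0/1; the hostname 'judge1' is an example):

    -- Register one judge computer and mark it active:
    INSERT INTO judgehost (hostname, active) VALUES ('judge1', 1);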
Programming languages in which to accept and judge submissions. langid is a string of maximum length 8, which references the language. This reference is also used to call the correct compile script (lib/judge/compile_c.sh, etc.), so when adding a new language, check that these match. name is the displayed name of the language; extension is the internally used extension for that language, which has to match the first extension as listed in the global configuration file. allow_submit determines whether teams can submit using this language; allow_judge determines whether judgehosts will judge submissions in this language. The latter can for example be set to no to temporarily hold judging when a problem occurs with the judging of a specific language; after resolution of the problem it can be set to yes again. time_factor is the relative factor by which the timelimit is multiplied for solutions in this language; Java, for example, is/was known to be structurally slower than C/C++.
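A sketch of adding a language, assuming the flag columns take 0/1 and that a matching compile script lib/judge/compile_java.sh exists:

    -- Java, judged with twice the standard timelimit:
    INSERT INTO language (langid, name, extension,
                          allow_submit, allow_judge, time_factor)
    VALUES ('java', 'Java', 'java', 1, 1, 2.0);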
This table contains the problem definitions. probid is the reference ID and cid is the contest ID this problem is (only) defined for: a problem cannot be used in multiple contests. name is the full name (description) of the problem. allow_submit determines whether teams can submit solutions for this problem. Non-submittable problems are also not displayed on the scoreboard; this can be used to define spare problems, which can then be added to the contest quickly. allow_judge determines whether judgehosts will judge submissions for this problem; see also the explanation for the language table. timelimit is the time limit in seconds within which solutions for this problem have to run (taking into account the time_factor per language). special_run, if not empty, defines a custom run program run_<special_run> to run compiled submissions for this problem, and special_compare, if not empty, defines a custom compare program compare_<special_compare> to compare output for this problem. The color tag can be filled with a CSS colour specification to associate with this problem; see also the section Scoreboard: colours.
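As an example sketch (values invented; special_run and special_compare are left at their defaults, assumed empty):

    -- A problem in contest 1 with a 5 second timelimit:
    INSERT INTO problem (probid, cid, name,
                         allow_submit, allow_judge, timelimit, color)
    VALUES ('hello', 1, 'Hello World', 1, 1, 5, 'magenta');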
Table of teams: login is the account/login name of the team (which is referenced in other tables as teamid) and name is the displayed name of the team. categoryid is the ID of the category the team is in; affilid is the affiliation ID of the team.
ipaddress is the IP address of the team. This is used to automatically identify the team in the web interface and to check the origin of a submission. A value of NULL results in the team being unable to submit or view its team page, unless they first authenticate via password or command line submission. passwd is an MD5 hash of a one-time password teams can use to authenticate and register their IP address; it can be set under Administrator Functions: Generate Passwords.
The hostname field is automatically filled in when team data is added or changed, based on a reverse DNS lookup of the IP address. If this cache becomes inaccurate for some reason, it can be refreshed under the Administrator functions on the main page.
members contains the names of the team members, separated by newlines, and room is the room where the team is located; both are for display only. comments can be filled with arbitrary useful information and is only visible to the jury.
The timestamp teampage_first_visited indicates when/whether a team visited its team web interface.
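A sketch of adding a team (the category and affiliation IDs must reference existing rows; all values are examples):

    -- One team with a preconfigured workstation IP:
    INSERT INTO team (login, name, categoryid, affilid, ipaddress, room)
    VALUES ('team001', 'Example University 1', 2, 'exampleuni',
            '10.0.0.17', 'A1.23');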
affilid is the reference ID and name the name of the institution. country should be the 2-character ISO 3166-1 alpha-2 abbreviation of the country and comments is a free-form field that is displayed in the jury interface.
Both for the country and the affiliation, a logo can be displayed on the scoreboard. For this to work, the affilid must match a logo picture located in www/images/affiliations/<affilid>.png and country must match a (flag) picture in www/images/countries/<country>.png. All country flags are present there, named with their 2-character ISO codes; see also www/images/countries/README. If either file is not present, the respective ID string will be printed instead.
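For example (values invented; for a logo, a matching www/images/affiliations/exampleuni.png would then be needed):

    -- An affiliation in the Netherlands:
    INSERT INTO team_affiliation (affilid, name, country, comments)
    VALUES ('exampleuni', 'Example University', 'NL', 'two teams this year');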
categoryid is the reference ID and name is a string: the name of the category. sortorder is the order in which this group must be sorted on the scoreboard, where a higher number sorts lower and equal numbers sort depending on score.
The color is again a CSS colour specification used to discern different categories easily; see also the section Scoreboard: colours.
The visible flag determines whether teams in this category are displayed on the public/team scoreboard. This feature can be used to remove teams from the public scoreboard by assigning them to a separate, invisible category.
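A sketch of such an invisible category (assuming the visible flag is stored as 0/1):

    -- Category that sorts below regular teams and is hidden from
    -- the public scoreboard:
    INSERT INTO team_category (categoryid, name, sortorder, color, visible)
    VALUES (5, 'Company guests', 9, 'silver', 0);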
The testcase table contains testdata for each problem. id is a unique identifier, input and output contain the testcase input/output, and md5sum_input and md5sum_output contain their respective MD5 hashes, which the judgehosts use to check whether their cached versions are still up to date. probid is the corresponding problem and description an optional description for this testcase. See also the section on providing testdata.
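Testcases are normally uploaded via the web interface, but as a sketch a row could also be created directly, letting MySQL compute the hashes (note that \n inside a MySQL string literal is a newline):

    -- Testcase for problem 'hello'; hashes computed from the data itself:
    INSERT INTO testcase (probid, description, input, output,
                          md5sum_input, md5sum_output)
    VALUES ('hello', 'sample', 'dummy input\n', 'Hello world!\n',
            MD5('dummy input\n'), MD5('Hello world!\n'));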
The contest table specifies timestamps for each contest that mark specific milestones in the course of the contest.
The triplet activatetime, starttime and endtime define when the contest runs and are required fields (activatetime and starttime may be equal).
activatetime is the moment when a contest first becomes visible to the public and teams (potentially replacing a previous contest that was displayed before). Nothing can be submitted yet and the problem set is not revealed. Clarifications can be viewed and sent.
At starttime, the scoreboard is displayed and submissions are accepted. At endtime the contest stops. New incoming submissions will be stored but not processed; unjudged submissions received before endtime will still be judged.
freezetime and unfreezetime control scoreboard freezing. freezetime is the time after which the public and team scoreboard are not updated anymore (frozen). This is meant to make the last stages of the contest more thrilling, because no-one knows who has won. Leaving them empty disables this feature. When using this feature, unfreezetime can be set to automatically `unfreeze' the scoreboard at that time. For a more elaborate description, see also section Scoreboard: freezing and defrosting.
The scoreboard, results and clarifications will remain visible to teams and the public after a contest has ended, until the activatetime of a later contest passes.
All events happen at the first moment of the defined time. That is: for a contest with starttime "12:00:00" and endtime "17:00:00", the first submission will be accepted at 12:00:00 and the last one at 16:59:59.
The following ordering must always hold: activatetime <= starttime < (freezetime <=) endtime (<= unfreezetime). No two contests may overlap: at any moment in time there is at most one active contest.
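Continuing the example contest from before, a freeze during the last hour that automatically unfreezes an hour after the end would satisfy this ordering (times are examples):

    -- 09:00 <= 10:00 < 14:00 <= 15:00 <= 16:00
    UPDATE contest
    SET freezetime   = '2010-01-23 14:00:00',
        unfreezetime = '2010-01-23 16:00:00'
    WHERE cid = 1;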
The jury system needs to know which team it is dealing with.
The IP address of a workstation is the primary means of authentication. The system assumes that someone coming from a specific IP is the team with that IP listed in the team table. When a team browses to the web interface, this is checked and the appropriate team page is presented. The submit client uses the same method: the IP is checked to determine the origin of a submission.
There are three possible ways of configuring team IP addresses.
Before the contest starts, when entering teams into the database, add the IP that each team will have to that team's entry. When the teams arrive, everything will work directly and without further configuration (except when teams switch workplaces). If possible, this is the recommended modus operandi, because it's the least hassle just before and during the contest.
Supply the teams with a password with which to authenticate. Beforehand, generate passwords for each team in the jury interface. When the test session (or contest) starts and a team connects to the web interface from an unknown IP, it will be prompted for a username and password. Once supplied, the IP is stored and the password is not needed anymore.
This is also a secure option, but requires a bit more hassle from the teams, and maybe from the organisers, who have to distribute pieces of paper.
Note: the web interface will only allow a team to authenticate itself once. If an IP is set, any further authentication attempt will be refused (to avoid trouble with lingering passwords). In order to fully re-authenticate a team, the IP address needs to be unset; see the example below. You might also want to generate a new password for this specific team. Furthermore, a team must explicitly connect to the team interface, because with an unknown IP, the root DOMjudge website will redirect to the public interface.
This is only possible with the Dolstra protocol. The advantage is that no prior mapping needs to be configured, but the disadvantage is that the team interface cannot be viewed until at least one submission has been made; there are also more constraints on the system. See the section on the Dolstra protocol for details.
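As a sketch of the re-authentication step mentioned above, the stored IP of a single team could be cleared directly in the database (login 'team001' is an example):

    -- Clear the IP so this team can authenticate again by password:
    UPDATE team SET ipaddress = NULL WHERE login = 'team001';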
Testdata is used to judge the problems: when a submission run is given the input testdata, the resulting output is compared to the reference output data. If they match exactly, the problem is judged to be correct.
The database has a separate table named testcase, which can be manipulated from the web interface. Under a problem, click on the testcase link; there the files can be uploaded. The judgehosts cache a copy based on the MD5 sum, so if you need to make changes later, re-upload the data in the web interface and it will automatically be picked up.
For problems with a special compare script, things are a bit different: testdata should still be provided as above, but the correctness depends on the output of the custom compare script. Please check the documentation in judge/compare_program.sh when using this feature.
Once everything is configured, you can start the daemons. They all run as a normal user on the system. The needed root privileges are gained by the setuid-root programs only when required.
If the daemons have started without any problems, you've come a long way! Now to check that you're ready for a contest.
First, go to the jury interface: http://www.your-domjudge-location/jury. Look under all the menu items to see whether the displayed data looks sane. Use the config-checker under `Admin Functions' for some sanity checks on your configuration.
Go to a team workstation and see if you can access the team page and if you can submit solutions.
Next, it is time to submit some test solutions. If you have the default Hello World problem enabled, you can submit some of the example sources from the doc/examples directory. They should give `CORRECT'. You can also try some (or all) of the sources under tests. Use make check to submit a variety of tests; this should work when the submit client is available and the default example problems are in the active contest. There is also make stress-test, but be warned that these tests might crash a judgedaemon. The results can be checked in the web interface; each source file specifies the expected outcome with some explanations. For convenience, there is also a script check-judgings; this will automatically check whether submitted sources from the tests directory were judged as expected. Note that a few sources have multiple possible outcomes: these must be verified manually.
When all this worked, you're quite ready for a contest. Or at least, the practice session of a contest.
Before running a real contest, you and/or the jury will want to test the jury's reference solutions on the system.
There is no special feature for testing the jury's solutions under DOMjudge. The simplest approach is to submit these solutions as a special team. This method requires a few steps and some care to prevent a possible information leak of the problem set. It is assumed that you have completely configured the system and contest and that all testdata is provided. To submit the jury solutions, the following steps have to be taken:
Furthermore, you should make sure that the team you submit the solutions as is in a category which is set to invisible, so that it does not show up on the public and team scoreboards. The sample team "DOMjudge" could be used, as it is in the "Organisation" category, which is not visible by default.