
12th International Semantic Web Conference (ISWC) Evaluation Paper Deadline

Deadline: 2013-05-10, 23:59 (Pacific/Honolulu time)

Call for Evaluation Papers
http://iswc2013.semanticweb.org/content/call-evaluation-papers

The Semantic Web and Linked Open Data have been active areas of research for several years. For the second time, the 12th edition of the International Semantic Web Conference is running the Evaluation Track. Its goal is to consolidate research material and to gain new scientific insights and results by providing a venue for in-depth experimental studies of significant scale. It aims to promote experimental evaluation in Semantic Web/Linked Open Data research, creating a research cycle between theory and experiments as in other sciences (e.g., physics).

A typical evaluation paper focuses on verifying an existing method by applying it to a specific task and reporting the outcome of an experiment on an established or new dataset. Papers that propose new algorithms and architectures should continue to be submitted to the regular Research Track. Besides quantitative evaluation (e.g., accuracy, precision, speed, or similar measures), a qualitative error analysis should be included that discusses cases where the method succeeds and where it fails.
Papers in this track can fit in different categories:

* Comparative evaluation studies comparing a spectrum of approaches to a particular problem and, through extensive experiments, providing a comprehensive perspective on the underlying phenomena or approaches.
* Analyses of experimental results providing insights on the nature or characteristics of studied phenomena, including negative results.
* Result verification, focusing on verifying or refuting published results and, through the renewed analysis, helping to advance the state of the art.
* Benchmarking, focusing on datasets and algorithms for comprehensible and systematic evaluation of existing and future systems.
* Development of new evaluation methodologies, and their demonstration in an experimental study.

Public availability of experimental datasets is highly encouraged.

In addition, papers in this track will be judged specifically on the basis of their:

* Precise description of controlled experimental conditions
* Reproducibility
* Applicability range (broad-coverage results being preferable to narrow applicability)
* Validity of the evaluation methodology (dataset size, significance tests, etc.)

Special attention will be paid to reproducibility. Hence, experimental settings must be described in enough detail that the results can be independently reproduced, counter-experiments can be designed, and subsequent work can improve on the presented results.

The experimental approach is based on systematic scientific methods used in many disciplines. The page http://explorable.com provides a good collection of methods, experiment design strategies, and resources.

Important Dates (same as Research Track)

* Abstracts: May 1, 2013, 11:59pm Hawaii time
* Full Paper Submission: May 10, 2013, 11:59pm Hawaii time
* Author Rebuttals: June 17-19, 2013
* Notifications: July 3, 2013
* Camera-Ready Versions: August 5, 2013
* Conference: October 21-25, 2013