LAST CALL: Evaluation of Semantic Technologies - IWEST 2010 (deadline: September 1, 2010)

[apologies for multiple postings]



Call for Papers

International Workshop on Evaluation of Semantic Technologies (IWEST 2010)

Full-day Workshop held at the 9th International Semantic Web Conference
(ISWC2010), Shanghai, China




*Deadline extended to September 1st*


The consistent evaluation of semantic technologies is critical not only for
future scientific progress but also for their widespread industrial
adoption. Such evaluation needs to address development qualities (such as
correctness or robustness) as well as deployment qualities (such as
interoperability).

While the community is evolving towards a more thorough evaluation of
semantic technologies, the field's inherent dynamism makes this a difficult
task: as previous evaluation methods and techniques become obsolete, new
ones must be developed as quickly as the technologies themselves evolve.



The IWEST workshop's aims are twofold: (1) to discuss the current trends and
future challenges in evaluating semantic technologies, and (2) to support
communication and collaboration with the goal of aligning the community's
various evaluation efforts and accelerating innovation in the associated
fields, as the TREC benchmarks have done for information retrieval and the
TPC benchmarks for database research.


In line with the goals of the workshop, we will also incorporate the results
of the 1st International Evaluation Campaign for Semantic Technologies, a
wide-ranging evaluation campaign organised by the Semantic Evaluation At
Large Scale (SEALS) Initiative.





High quality papers are invited from researchers interested in all aspects
of formal evaluation and benchmarking with reference to semantic
technologies ('Regular Papers'). In addition, we also invite papers from
participants of the 1st International Evaluation Campaign for Semantic
Technologies ('Evaluation Campaign Papers').


We invite contributions describing benchmarking approaches applied to
semantic technologies including, but not limited to: 

* Ontology Engineering Tools

* Semantic Search Tools

* Semantic Web Services

* Ontology Matching

* Storage and Reasoning


We encourage full papers (max 12 pages), short papers (max 6 pages) and
short demo papers (max 2 pages) describing significant work in progress,
late-breaking results, or ideas and challenges for the domain. Submissions
must be in PDF and formatted in the style of the Springer Publications
format for Lecture Notes in Computer Science (LNCS). For details on the
LNCS style, see Springer's Author Instructions.

Regular Papers and Evaluation Campaign Abstracts should be submitted no
later than 23:59 GMT on September 1st, 2010.

Full Evaluation Campaign Papers should be submitted no later than 23:59 GMT
on September 24th, 2010. Please note that submission of an Evaluation
Campaign Paper is contingent upon submission of an Evaluation Campaign
Abstract by September 1st, 2010.


Accepted papers will be published in the workshop proceedings. 


Important Dates



Regular Papers:                         Sep 1, 2010 (extended)

Evaluation Campaign Papers (Abstract):  Sep 1, 2010 (extended)

Evaluation Campaign Papers (Paper):     Sep 24, 2010 (extended)

Notifications:                          Oct 1, 2010

Camera-ready Versions:                  Oct 15, 2010

Workshop:                               Nov 8, 2010


Organizing Committee 



Prof. Dr. Asunción Gómez-Pérez (Main contact)

Universidad Politécnica de Madrid, Spain


Prof. Fabio Ciravegna

University of Sheffield, UK


Prof. Dr. Frank van Harmelen

VU University, Amsterdam, Netherlands


Dr. Jeff Heflin

Lehigh University, USA


Evaluation Campaign Committee 



Dr. Raúl García Castro

Universidad Politécnica de Madrid, Spain


Dr. Stuart Wrigley

University of Sheffield, UK


Dr. Zhisheng Huang

VU University, Amsterdam, Netherlands


Received on Tuesday, 31 August 2010 15:37:42 UTC