W3C home > Mailing lists > Public > semantic-web@w3.org > March 2009

Call for Journal Papers: Special Issue Evaluation Aspects of Semantic Search Applications (IJMSO, ISSN:1744-2621)

From: Darijus Strasunskas <darijuss@gmail.com>
Date: Mon, 23 Mar 2009 09:46:04 +0100
Message-ID: <3fc48d900903230146s5ec4a1a5j7b5466539372f00f@mail.gmail.com>
To: public-semweb-ui@w3.org, semanticweb@yahoogroups.com, semantic-web@yahoogroups.com, semantic-web@w3c.org
Apologies for cross-posting.

                     Call for Journal Papers

                       Special Issue on
>>> Evaluation Aspects of Semantic Search Applications <<<
International Journal of Metadata, Semantics and Ontologies
                by Inderscience (ISSN: 1744-2621)



Nowadays, the Web is one of the dominant information sources for
learning and acquiring new knowledge. However, finding relevant
information remains a huge challenge. To address this problem,
significant research effort has been devoted to enhancing
linguistics- and statistics-based search with added semantics. In
recent years, many approaches to semantic search have emerged, and
most of them make use of ontologies. Some approaches rely on semantic
annotation, adding extra metadata; some enhance the clustering of
retrieved documents by topic or semantically enrich queries; others
develop powerful query languages for ontologies.

This progress, together with the sparse evaluations of semantic
search tools that exist so far, suggests a promising prospect for
improving the performance of traditional information retrieval (IR)
systems. However, the results do not indicate whether this
improvement is optimal, which makes it difficult to benchmark
different semantic search systems. Moreover, the majority of IR
evaluation methods are based mainly on the relevance of the retrieved
information, while the additional sophistication of semantic search
tools adds complexity to the user interaction needed to reach
improved results. Standard IR metrics such as recall and precision
therefore do not suffice on their own to measure user satisfaction,
given the complexity of, and effort needed to use, semantic search
systems. There is a need to investigate which ontology properties can
further enhance search performance, to assess whether this
improvement comes at the cost of interaction simplicity and user
satisfaction, and so on.

Furthermore, evaluation methods based on recall and precision do not
indicate the causes of variation between different retrieval results.
Many other factors influence the performance of ontology-based
information retrieval, such as query quality, ontology quality, the
complexity of user interaction, and the difficulty of a search topic
with respect to the retrieval, indexing, searching, and ranking
methods used. A detailed analysis of how these factors and their
interactions affect the retrieval process can help to dramatically
improve retrieval methods and processes.

On the other hand, semantic search systems depend on correct
information being specified in the ontology at the appropriate level
of granularity and precision. An important body of work already
exists in the field of ontology quality assessment. However, most
ontology evaluation methods are generic quality evaluation frameworks
that do not take the application of the ontology into account. There
is therefore a need for task- and scenario-based quality assessment
methods that, in this particular case, would target and optimize
ontology quality for use in information retrieval systems.

In order to promote more efficient and effective use of ontologies in
IR, there is a need to analyse the quality- and value-adding aspects
of ontologies in this domain, to summarize use cases, and to identify
best practices. Several issues have been raised by current research,
such as the annotation workload, scalability, and the balance between
expressive power and reasoning capability. A holistic evaluation
approach should assess performance from both technological and
economic viewpoints. Demonstrating the value created by
semantics-based systems is important in order to show that the
benefits of the new technology will outweigh its costs.

The aim of this special issue of the International Journal of
Metadata, Semantics and Ontologies is to present new and challenging
issues in semantic search and to examine how the solutions can be
evaluated, compared and systematised. Submissions dealing with
ontology quality aspects and their impact on IR results, evaluation
of the usability of semantic search systems, analysis of user
behaviour, new evaluation methods enabling thorough and fine-grained
analysis of the technological and/or financial performance of
semantic search, and related topics are therefore strongly
encouraged.


Original, high-quality submissions that focus on different evaluation
aspects of semantic search are invited. The topics of interest are as
follows:

- Evaluation of Semantic Search systems:
 * Evaluation of information retrieval efficiency and effectiveness
 * Scalability assessment
 * Assessment of annotation quality/labour-load
 * Evaluation and benchmarking techniques and datasets
- Ontology quality aspects in Semantic Search:
 * Ontology quality evaluation
 * Ontology utility in semantic search
 * Ontology maintenance
- Evaluation of human-computer interaction:
 * Query interpretation and refinement
 * User acceptance of semantic technology
 * Usability evaluation
 * Interaction modes in semantic search
- Business value:
 * Ratio of semantics-processing cost to retrieval utility
 * Incentives for annotation and interaction
 * Costs of maintenance of semantic search solutions
 * Value of Information


All submissions will be double-blind refereed. Submitted papers should
not have been previously published nor be currently under
consideration for publication elsewhere. (N.B. Conference papers may
only be submitted if the paper was not originally copyrighted and if
it has been completely re-written).

All papers are refereed through a peer review process. A guide for
authors, sample copies and other relevant information for submitting
papers are available on the Author Guidelines page
(http://www.inderscience.com/mapper.php?id=31).

You may send one copy in the form of an MS Word file attached to an
e-mail (details in Author Guidelines) to:

>>> eassa09@gmail.com <<<

with a copy to Editorial Office, email:

>>> ijmso@inderscience.com <<<


Guest Editors:

- Darijus Strasunskas (Dept. of Industrial Economics & Technology
 Management, NTNU, Norway)
- Stein L. Tomassen (Dept. of Computer & Information Science, NTNU,
 Norway)
- Jinghai Rao (AOL, China)

Contact at: >>> eassa09@gmail.com <<<


- Xi Bai (University of Edinburgh, UK)
- Robert Engels (ESIS, Norway)
- Sari E. Hakkarainen (Finland)
- Monika Lanzenberger (Vienna University of Technology, Austria)
- Kin Fun Li (University of Victoria, Canada)
- Federica Mandreoli (University of Modena e Reggio Emilia, Italy)
- Gabor Nagypal (disy Informationssysteme GmbH, Germany)
- Iadh Ounis (University of Glasgow, UK)
- Marta Sabou (The Open University, UK)
- Sergej Sizov (University of Koblenz-Landau, Germany)
- Amanda Spink (Queensland Univ. of Technology, Australia)
- Peter Spyns (Vrije Universiteit Brussel, Belgium)
- Heiko Stoermer (University of Trento, Italy)
- Nenad Stojanovic (FZI Karlsruhe, Germany)
- Victoria Uren (The Open University, UK)
- Csaba Veres (University of Bergen, Norway)


Important dates:

May 17, 2009            Submission of abstracts
May 31, 2009            Full paper submission
August 15, 2009         Notification of acceptance/rejection
September 27, 2009      Submission of revised versions
November 15, 2009       Final camera-ready submission
First half of 2010      Publication


Received on Monday, 23 March 2009 08:46:42 GMT
