- From: Yves Raimond <yves.raimond@gmail.com>
- Date: Wed, 13 Sep 2006 13:17:19 +0100
- To: "Raphaël Troncy" <Raphael.Troncy@cwi.nl>
- Cc: g.tummarello@gmail.com, "MMSem-XG Public List" <public-xg-mmsem@w3.org>, "mark sandler" <mark.sandler@elec.qmul.ac.uk>
Hello,

I am currently working in one of the labs that was involved in the SIMAC project (Centre for Digital Music, Queen Mary, University of London). Outside the SIMAC project, we have been doing some work to integrate music analysis technologies and semantic web technologies. Some of the outputs of this work may be useful for the music use case.

* We developed a music production ontology, in which we are able to express most of the musical features we can extract (segmentation, beat tracking, source separation, instrument recognition, and so on). After some thought, we completely stopped trying to express things in an MPEG-7 way in our ontologies. Treating the multimedia material as a `top-level' entity caused many ontological problems, considerably limited the overall expressiveness, and added almost no value. I know this is subject to debate, so I won't go into details here :-)

The ontology is available here: http://purl.org/NET/c4dm/music.owl

You might also want to check the following paper:
Abdallah, Raimond, Sandler - An Ontology-based Approach to Information Management for Music Analysis Systems, AES120, 2006

* We developed a framework that makes it easy to wrap multimedia analysis algorithms so that they directly feed an ontology, or even use semantic web knowledge *inside* the algorithm. This is still under active development; I will present a paper on it at SAMT.

Moreover, another European project, co-ordinated by us, has just started: EASAIER (Enabling Access to Sound Archives through Integration, Enrichment and Retrieval). It will try to apply semantic web technologies to create a large-scale audio archive. http://www.easaier.org/

If you think we can be helpful to you, please contact me either by email or at SAMT.

Best regards,

Yves Raimond
PhD student
Centre for Digital Music
Queen Mary, University of London

2006/9/1, Raphaël Troncy <Raphael.Troncy@cwi.nl>:
>
> Hi Giovanni,
>
> A useful resource I think for the use case you coordinate. The SIMAC
> (acronym for Semantic Interaction with Music Audio Contents) project was
> funded by the EU (Jan 2004 - March 2006), http://www.semanticaudio.org/
>
> It seems that Xavier SERRA (Universitat Pompeu Fabra) is the right
> person to talk about the results of the SIMAC project.
> He has proposed a contribution "Semantic Interaction with Music Audio
> Content" in the next IST event. You may contact him to better know what
> they have done in this project and the potential overlap with the use
> case.
>
> Best.
>
> Raphaël
>
> --
> Raphaël Troncy
> CWI (Centre for Mathematics and Computer Science),
> Kruislaan 413, 1098 SJ Amsterdam, The Netherlands
> e-mail: raphael.troncy@cwi.nl & raphael.troncy@gmail.com
> Tel: +31 (0)20 - 592 4093
> Fax: +31 (0)20 - 592 4312
> Web: http://www.cwi.nl/~troncy/
Received on Thursday, 14 September 2006 09:49:15 UTC