Re: Safe manipulation of RDF data (from semantic-web)

Ok, I'm not getting any output then, so it could be some kind of
connection issue. The output I sent is all there is, the last line being

18-Sep-2019 11:45:44.880 INFO [main] 
org.apache.catalina.startup.Catalina.start Server startup in 343896 ms



On 18/09/2019 14:54, Martynas Jusevičius wrote:
> The logger is configured to write to stdout, so you should be able to
> see the logs in the container output when log4j.properties is mounted. Is
> that not the case?
>
> If you run docker run with -d, then I think you can use docker logs
> <container name> to see the stdout log.
> https://docs.docker.com/engine/reference/commandline/logs/
>
> This is some of the log that I see when I run the Wikidata example:
>
> 1 * Client out-bound request
> 1 > GET https://query.wikidata.org/bigdata/namespace/wdq/sparql?query=BASE%20%20%20%20%3Chttp%3A%2F%2Flocalhost%3A8080%2F%3E%0APREFIX%20%20bd%3A%20%20%20%3Chttp%3A%2F%2Fwww.bigdata.com%2Frdf%23%3E%0APREFIX%20%20wdt%3A%20%20%3Chttp%3A%2F%2Fwww.wikidata.org%2Fprop%2Fdirect%2F%3E%0APREFIX%20%20wikibase%3A%20%3Chttp%3A%2F%2Fwikiba.se%2Fontology%23%3E%0A%0ACONSTRUCT%20%0A%20%20%7B%20%0A%20%20%20%20%3Fentity%20%3Chttps%3A%2F%2Fgithub.com%2FAtomGraph%2FProcessor%2Fblob%2Fdevelop%2Fexamples%2Fwikidata%23year%3E%20%3Fyear%20.%0A%20%20%7D%0AWHERE%0A%20%20%7B%20SELECT%20%20%3Fentity%20%3Fyear%0A%20%20%20%20WHERE%0A%20%20%20%20%20%20%7B%20BIND%28month%28now%28%29%29%20AS%20%3FnowMonth%29%0A%20%20%20%20%20%20%20%20BIND%28day%28now%28%29%29%20AS%20%3FnowDay%29%0A%20%20%20%20%20%20%20%20%3Fentity%20%20wdt%3AP569%20%20%3Fdate%0A%20%20%20%20%20%20%20%20FILTER%20%28%20%28%20month%28%3Fdate%29%20%3D%20%3FnowMonth%20%29%20%26%26%20%28%20day%28%3Fdate%29%20%3D%20%3FnowDay%20%29%20%29%0A%20%20%20%20%20%20%20%20SERVICE%20wikibase%3Alabel%0A%20%20%20%20%20%20%20%20%20%20%7B%20bd%3AserviceParam%0A%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20%20wikibase%3Alanguage%20%20%22en%22%0A%20%20%20%20%20%20%20%20%20%20%7D%0A%20%20%20%20%20%20%20%20BIND%28year%28%3Fdate%29%20AS%20%3Fyear%29%0A%20%20%20%20%20%20%7D%0A%20%20%20%20LIMIT%20%20%20100%0A%20%20%7D%0A
> 1 > Accept: application/n-quads,application/rdf+thrift,text/trig
> 1 * Client in-bound response
> 1 < 403
> 1 < Transfer-Encoding: chunked
> 1 < X-Cache: cp3033 int, cp3030 pass
> 1 < Server: Varnish
> 1 < Server-Timing: cache;desc="int-local"
> 1 < Connection: keep-alive
> 1 < X-Client-IP: 87.72.251.196
> 1 < Date: Wed, 18 Sep 2019 11:48:33 GMT
> 1 < X-Varnish: 308872225, 719836281
> 1 < Strict-Transport-Security: max-age=106384710; includeSubDomains; preload
> 1 < X-Cache-Status: int-local
> 1 < Set-Cookie:
> WMF-Last-Access-Global=18-Sep-2019;Path=/;Domain=.wikidata.org;HttpOnly;secure;Expires=Sun,
> 20 Oct 2019 00:00:00 GMT
> 1 < Set-Cookie:
> WMF-Last-Access=18-Sep-2019;Path=/;HttpOnly;secure;Expires=Sun, 20 Oct
> 2019 00:00:00 GMT
> 1 < Vary: Accept-Encoding
> 1 < X-Analytics: https=1;nocookies=1
> 1 < Age: 0
> 1 < Content-Type: text/html; charset=utf-8
> 1 <
> <!DOCTYPE html>
> <html lang="en">
> <meta charset="utf-8">
> <title>Wikimedia Error</title>
> <style>
> * { margin: 0; padding: 0; }
> body { background: #fff; font: 15px/1.6 sans-serif; color: #333; }
> .content { margin: 7% auto 0; padding: 2em 1em 1em; max-width: 640px; }
> .footer { clear: both; margin-top: 14%; border-top: 1px solid #e5e5e5;
> background: #f9f9f9; padding: 2em 0; font-size: 0.8em; text-align:
> center; }
> img { float: left; margin: 0 2em 2em 0; }
> a img { border: 0; }
> h1 { margin-top: 1em; font-size: 1.2em; }
> .content-text { overflow: hidden; overflow-wrap: break-word; word-wrap: bre
> 11:48:33,899 ERROR SPARQLClient:177 - Query request to endpoint:
> https://query.wikidata.org/bigdata/namespace/wdq/sparql unsuccessful.
> Reason: Forbidden
> 11:48:33,948 DEBUG OntologyProvider:183 - Loading sitemap ontology
> from URI: https://github.com/AtomGraph/Processor/blob/develop/examples/wikidata#
>
> Which suggests that Wikidata's SPARQL endpoint responds with an error
> -- this has been on and off today. Although the LDT is processed
> correctly.
>
> On Wed, Sep 18, 2019 at 1:40 PM Mikael Pesonen
> <mikael.pesonen@lingsoft.fi> wrote:
>>
>> Sorry how do I access the log file? I'm not able to find it with docker cp.
>>
>> Mikael
>>
>>
>> On 18/09/2019 13:27, Martynas Jusevičius wrote:
>>> This is the file:
>>> https://raw.githubusercontent.com/AtomGraph/Processor/master/src/main/resources/log4j.properties
>>>
>>> On Wed, Sep 18, 2019 at 12:26 PM Mikael Pesonen
>>> <mikael.pesonen@lingsoft.fi> wrote:
>>>> Could I please have a log4j.properties file too. I have no experience in
>>>> java logging...
>>>>
>>>> Mikael
>>>>
>>>> On 18/09/2019 13:20, Martynas Jusevičius wrote:
>>>>> Oops, I'll CC the list again :)
>>>>>
>>>>> Could you try mounting log4j.properties as well, like in the example?
>>>>> That should give you debug output in the container.
>>>>>
>>>>> Also a good idea to validate the ontology's Turtle syntax (or any RDF
>>>>> config really) after any changes, just to be sure. There's a handy
>>>>> online tool: http://ttl.summerofcode.be
>>>>>
>>>>> BTW you can also do docker pull atomgraph/processor, I just released
>>>>> an updated version that allows killing the container gracefully with
>>>>> Ctrl+C.
>>>>>
>>>>> On Wed, Sep 18, 2019 at 12:12 PM Mikael Pesonen
>>>>> <mikael.pesonen@lingsoft.fi> wrote:
>>>>>> Thanks! Now I think I have the correct command line
>>>>>>
>>>>>> sudo docker run --rm -p 8090:8090 \
>>>>>>     -e ENDPOINT="https://query.wikidata.org/bigdata/namespace/wdq/sparql" \
>>>>>>     -e GRAPH_STORE="https://query.wikidata.org/bigdata/namespace/wdq/service" \
>>>>>>     -e ONTOLOGY="https://github.com/AtomGraph/Processor/blob/develop/examples/wikidata#" \
>>>>>>     -v "/home/text/cases/nimisampo/proxy/wikidata.ttl":"/usr/local/tomcat/webapps/ROOT/WEB-INF/classes/org/wikidata/ldt.ttl" \
>>>>>>     -v "/home/text/cases/nimisampo/proxy/location-mapping.n3":"/usr/local/tomcat/webapps/ROOT/WEB-INF/classes/custom-mapping.n3" \
>>>>>>     atomgraph/processor
>>>>>>
>>>>>> When testing I get
>>>>>>
>>>>>>     >> curl -v http://localhost:8090/birthdays
>>>>>>
>>>>>> *   Trying ::1...
>>>>>> * Connected to localhost (::1) port 8090 (#0)
>>>>>>     > GET /birthdays HTTP/1.1
>>>>>>     > Host: localhost:8090
>>>>>>     > User-Agent: curl/7.47.0
>>>>>>     > Accept: */*
>>>>>>     >
>>>>>> * Recv failure: Connection reset by peer
>>>>>> * Closing connection 0
>>>>>> curl: (56) Recv failure: Connection reset by peer
>>>>>>
>>>>>>
>>>>>> Do you have any idea what could be the issue? In the docker output there
>>>>>> seem to be no errors:
>>>>>>
>>>>>> @prefix lm: <http://jena.hpl.hp.com/2004/08/location-mapping#> .
>>>>>>
>>>>>> [] lm:mapping
>>>>>>     [ lm:name "https://www.w3.org/ns/ldt#" ; lm:altName "com/atomgraph/processor/ldt.ttl" ] ,
>>>>>>     [ lm:name "https://www.w3.org/ns/ldt/core/domain#" ; lm:altName "com/atomgraph/processor/c.ttl" ] ,
>>>>>>     [ lm:name "https://www.w3.org/ns/ldt/core/templates#" ; lm:altName "com/atomgraph/processor/ct.ttl" ] ,
>>>>>>     [ lm:name "https://www.w3.org/ns/ldt/named-graphs/templates#" ; lm:altName "com/atomgraph/processor/ngt.ttl" ] ,
>>>>>>     [ lm:name "https://www.w3.org/ns/ldt/document-hierarchy/domain#" ; lm:altName "com/atomgraph/processor/dh.ttl" ] ,
>>>>>>     [ lm:name "https://www.w3.org/ns/ldt/topic-hierarchy/templates#" ; lm:altName "com/atomgraph/processor/tht.ttl" ] ,
>>>>>>     [ lm:name "http://rdfs.org/sioc/ns#" ; lm:altName "com/atomgraph/processor/sioc.owl" ] ,
>>>>>>     [ lm:name "http://rdfs.org/ns/void#" ; lm:altName "com/atomgraph/processor/void.owl" ] ,
>>>>>>     [ lm:name "http://www.w3.org/2011/http#" ; lm:altName "com/atomgraph/processor/http.owl" ] ,
>>>>>>     [ lm:name "http://www.w3.org/2011/http" ; lm:altName "com/atomgraph/processor/http.owl" ] ,
>>>>>>     [ lm:name "http://www.w3.org/2011/http-statusCodes#" ; lm:altName "com/atomgraph/processor/http-statusCodes.rdf" ] ,
>>>>>>     [ lm:name "http://www.w3.org/2011/http-statusCodes" ; lm:altName "com/atomgraph/processor/http-statusCodes.rdf" ] ,
>>>>>>     [ lm:name "http://www.w3.org/ns/sparql-service-description#" ; lm:altName "com/atomgraph/processor/sparql-service.owl" ] ,
>>>>>>     [ lm:name "http://xmlns.com/foaf/0.1/" ; lm:altName "com/atomgraph/processor/foaf.owl" ] ,
>>>>>>     [ lm:name "http://spinrdf.org/sp#" ; lm:altName "etc/sp.ttl" ] ,
>>>>>>     [ lm:name "http://spinrdf.org/sp" ; lm:altName "etc/sp.ttl" ] ,
>>>>>>     [ lm:name "http://spinrdf.org/spin#" ; lm:altName "etc/spin.ttl" ] ,
>>>>>>     [ lm:name "http://spinrdf.org/spin" ; lm:altName "etc/spin.ttl" ] ,
>>>>>>     [ lm:name "http://spinrdf.org/spl#" ; lm:altName "etc/spl.spin.ttl" ] ,
>>>>>>     [ lm:name "http://spinrdf.org/spl" ; lm:altName "etc/spl.spin.ttl" ] .
>>>>>>
>>>>>> ... html for a github page ...
>>>>>>
>>>>>> 18-Sep-2019 09:57:42.303 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log Server
>>>>>> version:        Apache Tomcat/8.0.52
>>>>>> 18-Sep-2019 09:57:42.305 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log Server
>>>>>> built:          Apr 28 2018 16:24:29 UTC
>>>>>> 18-Sep-2019 09:57:42.306 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log Server
>>>>>> number:         8.0.52.0
>>>>>> 18-Sep-2019 09:57:42.306 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log OS
>>>>>> Name:               Linux
>>>>>> 18-Sep-2019 09:57:42.306 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log OS
>>>>>> Version:            4.4.0-148-generic
>>>>>> 18-Sep-2019 09:57:42.306 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log
>>>>>> Architecture:          amd64
>>>>>> 18-Sep-2019 09:57:42.307 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log Java
>>>>>> Home:             /usr/lib/jvm/java-8-openjdk-amd64/jre
>>>>>> 18-Sep-2019 09:57:42.307 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log JVM
>>>>>> Version:           1.8.0_171-8u171-b11-1~deb9u1-b11
>>>>>> 18-Sep-2019 09:57:42.307 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log JVM
>>>>>> Vendor:            Oracle Corporation
>>>>>> 18-Sep-2019 09:57:42.307 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log
>>>>>> CATALINA_BASE:         /usr/local/tomcat
>>>>>> 18-Sep-2019 09:57:42.308 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log
>>>>>> CATALINA_HOME:         /usr/local/tomcat
>>>>>> 18-Sep-2019 09:57:42.308 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log Command line
>>>>>> argument:
>>>>>> -Djava.util.logging.config.file=/usr/local/tomcat/conf/logging.properties
>>>>>> 18-Sep-2019 09:57:42.308 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log Command line
>>>>>> argument: -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager
>>>>>> 18-Sep-2019 09:57:42.308 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log Command line
>>>>>> argument: -Djdk.tls.ephemeralDHKeySize=2048
>>>>>> 18-Sep-2019 09:57:42.309 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log Command line
>>>>>> argument: -Djava.protocol.handler.pkgs=org.apache.catalina.webresources
>>>>>> 18-Sep-2019 09:57:42.309 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log Command line
>>>>>> argument: -Dignore.endorsed.dirs=
>>>>>> 18-Sep-2019 09:57:42.309 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log Command line
>>>>>> argument: -Dcatalina.base=/usr/local/tomcat
>>>>>> 18-Sep-2019 09:57:42.309 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log Command line
>>>>>> argument: -Dcatalina.home=/usr/local/tomcat
>>>>>> 18-Sep-2019 09:57:42.310 INFO [main]
>>>>>> org.apache.catalina.startup.VersionLoggerListener.log Command line
>>>>>> argument: -Djava.io.tmpdir=/usr/local/tomcat/temp
>>>>>> 18-Sep-2019 09:57:42.310 INFO [main]
>>>>>> org.apache.catalina.core.AprLifecycleListener.lifecycleEvent Loaded APR
>>>>>> based Apache Tomcat Native library 1.2.16 using APR version 1.5.2.
>>>>>> 18-Sep-2019 09:57:42.310 INFO [main]
>>>>>> org.apache.catalina.core.AprLifecycleListener.lifecycleEvent APR
>>>>>> capabilities: IPv6 [true], sendfile [true], accept filters [false],
>>>>>> random [true].
>>>>>> 18-Sep-2019 09:57:42.314 INFO [main]
>>>>>> org.apache.catalina.core.AprLifecycleListener.initializeSSL OpenSSL
>>>>>> successfully initialized (OpenSSL 1.1.0f  25 May 2017)
>>>>>> 18-Sep-2019 09:57:42.387 INFO [main]
>>>>>> org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler
>>>>>> ["http-apr-8080"]
>>>>>> 18-Sep-2019 09:57:42.394 INFO [main]
>>>>>> org.apache.coyote.AbstractProtocol.init Initializing ProtocolHandler
>>>>>> ["ajp-apr-8009"]
>>>>>> 18-Sep-2019 09:57:42.398 INFO [main]
>>>>>> org.apache.catalina.startup.Catalina.load Initialization processed in 451 ms
>>>>>> 18-Sep-2019 09:57:42.432 INFO [main]
>>>>>> org.apache.catalina.core.StandardService.startInternal Starting service
>>>>>> Catalina
>>>>>> 18-Sep-2019 09:57:42.432 INFO [main]
>>>>>> org.apache.catalina.core.StandardEngine.startInternal Starting Servlet
>>>>>> Engine: Apache Tomcat/8.0.52
>>>>>> 18-Sep-2019 09:57:42.446 INFO [localhost-startStop-1]
>>>>>> org.apache.catalina.startup.HostConfig.deployDescriptor Deploying
>>>>>> configuration descriptor /usr/local/tomcat/conf/Catalina/localhost/ROOT.xml
>>>>>> 18-Sep-2019 09:57:43.893 INFO [localhost-startStop-1]
>>>>>> org.apache.jasper.servlet.TldScanner.scanJars At least one JAR was
>>>>>> scanned for TLDs yet contained no TLDs. Enable debug logging for this
>>>>>> logger for a complete list of JARs that were scanned but no TLDs were
>>>>>> found in them. Skipping unneeded JARs during scanning can improve
>>>>>> startup time and JSP compilation time.
>>>>>> 18-Sep-2019 09:57:43.915 INFO [localhost-startStop-1]
>>>>>> org.apache.catalina.startup.HostConfig.deployDescriptor Deployment of
>>>>>> configuration descriptor
>>>>>> /usr/local/tomcat/conf/Catalina/localhost/ROOT.xml has finished in 1,469 ms
>>>>>> 18-Sep-2019 09:57:43.917 INFO [main]
>>>>>> org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler
>>>>>> ["http-apr-8080"]
>>>>>> 18-Sep-2019 09:57:43.925 INFO [main]
>>>>>> org.apache.coyote.AbstractProtocol.start Starting ProtocolHandler
>>>>>> ["ajp-apr-8009"]
>>>>>> 18-Sep-2019 09:57:43.927 INFO [main]
>>>>>> org.apache.catalina.startup.Catalina.start Server startup in 1528 ms
>>>>>>
>>>>>>
>>>>>> Mikael
>>>>>>
>>>>>>
>>>>>> On 18/09/2019 10:58, Martynas Jusevičius wrote:
>>>>>>> Hi :)
>>>>>>>
>>>>>>> thanks for reporting, I've now fixed the links. Docker Hub reuses
>>>>>>> README from GitHub, so GitHub-relative URLs don't work.
>>>>>>>
>>>>>>> I also added a link to the Wikidata example:
>>>>>>> https://github.com/AtomGraph/Processor/tree/master/examples
>>>>>>> It contains these two files:
>>>>>>>
>>>>>>> - wikidata.ttl is the LDT ontology of the example application. It
>>>>>>> contains LDT templates, in your case :PersonItem for example.
>>>>>>> The base URI (i.e. the namespace) is totally up to you, but keep in
>>>>>>> mind the URI of the ldt:Ontology resource, because that is what is
>>>>>>> specified as -e ONTOLOGY.
>>>>>>> In the Wikidata example, the ontology URI is
>>>>>>> https://github.com/AtomGraph/Processor/blob/develop/examples/wikidata#
>>>>>>> (with the trailing hash).
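>>>>>>>
>>>>>>> For illustration, a minimal sketch of how that ontology resource could
>>>>>>> be declared in wikidata.ttl (the real file of course also contains the
>>>>>>> templates and may declare additional imports):
>>>>>>>
>>>>>>> @prefix ldt: <https://www.w3.org/ns/ldt#> .
>>>>>>>
>>>>>>> # this URI (with the trailing hash) is the value passed as -e ONTOLOGY
>>>>>>> <https://github.com/AtomGraph/Processor/blob/develop/examples/wikidata#>
>>>>>>>     a ldt:Ontology .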
>>>>>>>
>>>>>>> - location-mapping.n3 - this is a config file for Jena:
>>>>>>> https://jena.apache.org/documentation/notes/file-manager.html#the-locationmapper-configuration-file
>>>>>>> In RDF we want to use URI namespaces and not file paths. This file
>>>>>>> provides a mapping between the two, and when an application attempts
>>>>>>> to read a mapped URI, Jena actually reads it from the file it is
>>>>>>> mapped to.
>>>>>>> In the example,
>>>>>>> https://github.com/AtomGraph/Processor/blob/develop/examples/wikidata#
>>>>>>> is mapped to org/wikidata/ldt.ttl - and that is the file we mount
>>>>>>> under /usr/local/tomcat/webapps/ROOT/WEB-INF/classes/ in the
>>>>>>> container.
>>>>>>>
>>>>>>> So you do need location-mapping.n3 as well, unless your LDT ontology
>>>>>>> is resolvable from its URI. In the example the ontology URI is made up
>>>>>>> and there is no document behind it, so Jena would fail to read it if
>>>>>>> there were no mapping.
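>>>>>>>
>>>>>>> For illustration, the custom mapping entry for the example ontology
>>>>>>> would look roughly like this (a sketch; compare with the mapping dump
>>>>>>> the Processor prints at startup):
>>>>>>>
>>>>>>> @prefix lm: <http://jena.hpl.hp.com/2004/08/location-mapping#> .
>>>>>>>
>>>>>>> [] lm:mapping
>>>>>>>     # reading the ontology URI actually reads the mounted local copy
>>>>>>>     [ lm:name "https://github.com/AtomGraph/Processor/blob/develop/examples/wikidata#" ;
>>>>>>>       lm:altName "org/wikidata/ldt.ttl" ] .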
>>>>>>>
>>>>>>> I think it would be easiest for you to reuse the example as it is,
>>>>>>> having copies of wikidata.ttl and location-mapping.n3 on your machine
>>>>>>> (you can probably skip log4j.properties) that you mount using -v.
>>>>>>> Then you can make changes in wikidata.ttl. Try to replace the
>>>>>>> :BirthdaysTemplate and query with your own and see if it works. Worry
>>>>>>> about namespaces and filenames later :)
>>>>>>>
>>>>>>> And the -e ENDPOINT and -e GRAPH_STORE values have to be replaced
>>>>>>> with your own URLs, of course.
>>>>>>>
>>>>>>> Martynas
>>>>>>>
>>>>>>> On Tue, Sep 17, 2019 at 3:24 PM Mikael Pesonen
>>>>>>> <mikael.pesonen@lingsoft.fi> wrote:
>>>>>>>> Hi again!
>>>>>>>>
>>>>>>>> I'm reading the instructions at https://hub.docker.com/r/atomgraph/processor. There are some broken links at the top.
>>>>>>>>
>>>>>>>> docker run --rm \
>>>>>>>>         -p 8080:8080 \
>>>>>>>>         -e ENDPOINT="https://query.wikidata.org/bigdata/namespace/wdq/sparql" \
>>>>>>>>         -e GRAPH_STORE="https://query.wikidata.org/bigdata/namespace/wdq/service" \
>>>>>>>>         -e ONTOLOGY="https://github.com/AtomGraph/Processor/blob/develop/examples/wikidata#" \
>>>>>>>>         -v "/c/Users/namedgraph/WebRoot/Processor/src/main/resources/log4j.properties":"/usr/local/tomcat/webapps/ROOT/WEB-INF/classes/log4j.properties" \
>>>>>>>>         -v "/c/Users/namedgraph/WebRoot/Processor/examples/wikidata.ttl":"/usr/local/tomcat/webapps/ROOT/WEB-INF/classes/org/wikidata/ldt.ttl" \
>>>>>>>>         -v "/c/Users/namedgraph/WebRoot/Processor/examples/location-mapping.n3":"/usr/local/tomcat/webapps/ROOT/WEB-INF/classes/custom-mapping.n3" \
>>>>>>>>         atomgraph/processor
>>>>>>>>
>>>>>>>> What is the purpose of wikidata.ttl and where can I find it? Can location-mapping.n3 be left out if it's not custom?
>>>>>>>>
>>>>>>>> So this would work?
>>>>>>>>
>>>>>>>> docker run --rm \
>>>>>>>>         -p 8080:8080 \
>>>>>>>>         -e ENDPOINT="https://query.wikidata.org/bigdata/namespace/wdq/sparql" \
>>>>>>>>         -e GRAPH_STORE="https://query.wikidata.org/bigdata/namespace/wdq/service" \
>>>>>>>>         -e ONTOLOGY="https://github.com/AtomGraph/Processor/blob/develop/examples/wikidata#" \
>>>>>>>>         -v "/c/Users/namedgraph/WebRoot/Processor/examples/wikidata.ttl":"/usr/local/tomcat/webapps/ROOT/WEB-INF/classes/org/wikidata/ldt.ttl" \
>>>>>>>>         atomgraph/processor
>>>>>>>>
>>>>>>>>
>>>>>>>> Mikael
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On 17/09/2019 14:57, Martynas Jusevičius wrote:
>>>>>>>>
>>>>>>>> Hi Mikael,
>>>>>>>>
>>>>>>>> the template URI on its own is irrelevant here, it could be a blank
>>>>>>>> node resource. It becomes important when one intends to reuse
>>>>>>>> templates, e.g. extend them or reference them, possibly from another
>>>>>>>> LDT ontology.
>>>>>>>>
>>>>>>>> Yes, it is the ldt:match that holds the URI template that the request URI
>>>>>>>> is matched against. I have expanded the explanation here:
>>>>>>>> https://github.com/AtomGraph/Processor/wiki/Linked-Data-Templates#templates
>>>>>>>>
>>>>>>>> As for the agent ID, one option to pass a value to the LDT template is
>>>>>>>> using template parameters:
>>>>>>>> https://github.com/AtomGraph/Processor/wiki/Linked-Data-Templates#parameters
>>>>>>>>
>>>>>>>> Then if a request URI is
>>>>>>>> https://resource.lingsoft.fi/286c384d-cd5c-4887-9b85-94c0c147f709?agent=123456,
>>>>>>>> a variable binding (?agent, "123456") is applied to the query string
>>>>>>>> from ldt:query, before it is executed.
>>>>>>>> This might or might not work for your use case.
>>>>>>>>
>>>>>>>> Martynas
>>>>>>>>
>>>>>>>> On Tue, Sep 17, 2019 at 1:43 PM Mikael Pesonen
>>>>>>>> <mikael.pesonen@lingsoft.fi> wrote:
>>>>>>>>
>>>>>>>> Hmm, still mixing up things. So ldt:match has to match the resource URI.
>>>>>>>>
>>>>>>>> On 17/09/2019 12:13, Mikael Pesonen wrote:
>>>>>>>>
>>>>>>>> Ok, now I got it: the template address has to be the same as the resource URI.
>>>>>>>> Just one question: in our case, https://base/{uuid}, how should we
>>>>>>>> forward the agent id (access level) to the template for utilizing ACL?
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On 17/09/2019 11:40, Martynas Jusevičius wrote:
>>>>>>>>
>>>>>>>> Thanks Mikael,
>>>>>>>>
>>>>>>>> the example makes it clearer.
>>>>>>>>
>>>>>>>> So the URI template for all persons (and I guess all resources in
>>>>>>>> general?) is "/{uuid}", if we take https://resource.lingsoft.fi as the
>>>>>>>> base URI. Which means that you could not match two different LDT
>>>>>>>> templates for different types of persons.
>>>>>>>>
>>>>>>>> Then my suggestion with using a single template with a query that
>>>>>>>> references the ACL graph still stands. Let me know if you need help
>>>>>>>> setting it up in Processor.
>>>>>>>>
>>>>>>>> Martynas
>>>>>>>>
>>>>>>>> On Mon, Sep 16, 2019 at 11:27 AM Mikael Pesonen
>>>>>>>> <mikael.pesonen@lingsoft.fi> wrote:
>>>>>>>>
>>>>>>>> Here is a sample data:
>>>>>>>>
>>>>>>>> <https://resource.lingsoft.fi/286c384d-cd5c-4887-9b85-94c0c147f709>
>>>>>>>>               a                        foaf:Person ;
>>>>>>>>               vcard:family-name        "Pesonen" ;
>>>>>>>>               vcard:fn                 "Mikael Pesonen" ;
>>>>>>>>               vcard:given-name         "Mikael" ;
>>>>>>>>               vcard:hasEmail           <https://resource.lingsoft.fi/cf9b02b7-bd0d-486e-b0d9-da1464e27d2e> ,
>>>>>>>>                                        <https://resource.lingsoft.fi/5c04aa23-6c42-44a1-9ac9-69ee255ac170> ;
>>>>>>>>               vcard:hasGender          vcard:Male ;
>>>>>>>>               vcard:hasInstantMessage  <https://resource.lingsoft.fi/4aa01d37-744c-4964-a794-d997aa376584> ;
>>>>>>>>               vcard:hasPhoto           <https://resource.lingsoft.fi/8f4a4ddd-43c2-4e27-8ed7-996dd00e939c> ;
>>>>>>>>               vcard:hasTelephone       <https://resource.lingsoft.fi/3755ed0c-81b7-430e-92a0-16fc80ba41b4> ;
>>>>>>>>               org:basedAt              <https://resource.lingsoft.fi/b48a0820-6921-43fc-a346-e72397265bbe> ;
>>>>>>>>               org:memberOf             <https://resource.lingsoft.fi/810dfbff-e6fb-458a-b27d-3726a27e5109> ;
>>>>>>>>               foaf:account             <https://resource.lingsoft.fi/2f0aa772-f845-4f43-b607-dc65ff66b9aa> .
>>>>>>>>
>>>>>>>> <https://resource.lingsoft.fi/cf9b02b7-bd0d-486e-b0d9-da1464e27d2e>
>>>>>>>>               a                         vcard:Email , vcard:Work ;
>>>>>>>>               rdfs:label                "***@lingsoft.fi" ;
>>>>>>>>               vcard:hasValue            <mailto:***@lingsoft.fi> .
>>>>>>>>
>>>>>>>>
>>>>>>>> So most of the person's values are resources, and every resource has an id
>>>>>>>> of the form https://resource.lingsoft.fi/<UUID>.
>>>>>>>>
>>>>>>>>
>>>>>>>> Mikael
>>>>>>>>
>>>>>>>>
>>>>>>>> On 15/09/2019 01:02, Martynas Jusevičius wrote:
>>>>>>>>
>>>>>>>> I meant the first and the ACL examples as alternatives, but yes, you
>>>>>>>> can combine the approaches as well. Again, it depends mostly on your URIs
>>>>>>>> - are you able to change their pattern?
>>>>>>>>
>>>>>>>> I think it would help if you could show some RDF data that represents
>>>>>>>> your case (does not have to be the actual person data :)) Either paste
>>>>>>>> inline or as a Gist if it's larger.
>>>>>>>>
>>>>>>>> Re. ACL, we use a filter in our LinkedDataHub platform that checks ACL
>>>>>>>> access before the actual LDT request is invoked. And if query results
>>>>>>>> need to depend on the access level, we reference the ACL dataset as I
>>>>>>>> showed in the example.
>>>>>>>>
>>>>>>>> Martynas
>>>>>>>>
>>>>>>>> On Fri, Sep 13, 2019 at 3:55 PM Mikael Pesonen
>>>>>>>> <mikael.pesonen@lingsoft.fi> wrote:
>>>>>>>>
>>>>>>>> Looking at your first example, it looks like both that and this ACL
>>>>>>>> version would work?
>>>>>>>>
>>>>>>>> So as with your first example:
>>>>>>>>
>>>>>>>> /person/basic_access/{id}
>>>>>>>> --
>>>>>>>>
>>>>>>>> :BasicPersonAccessItem a ldt:Template ;
>>>>>>>>            ldt:match "/person/basic_access/{id}" ;
>>>>>>>>            ldt:query :ConstructBasicPerson ;
>>>>>>>>
>>>>>>>> ----
>>>>>>>> /person/admin_access/{id}
>>>>>>>> --
>>>>>>>> :AdminPersonAccessItem a ldt:Template ;
>>>>>>>>            ldt:match "/person/admin_access/{id}" ;
>>>>>>>>            ldt:query :ConstructFullPerson ;
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> And this acl example
>>>>>>>>
>>>>>>>> /person/{agent}/{id}
>>>>>>>> --
>>>>>>>> :PersonAccessItem a ldt:Template ;
>>>>>>>>            ldt:match "/person/{agent}/{id}" ;
>>>>>>>>            ldt:query :ConstructPerson ;
>>>>>>>> ...
>>>>>>>>
>>>>>>>>
>>>>>>>> The ACL example sure is more refined, since you can define the access
>>>>>>>> levels in the ACL data.
>>>>>>>>
>>>>>>>>
>>>>>>>> On 13/09/2019 16:25, Martynas Jusevičius wrote:
>>>>>>>>
>>>>>>>> Well, if you only have one kind of person resource with a single URI
>>>>>>>> pattern, then you cannot select (match) different LDT templates.
>>>>>>>> That is because an LDT template maps one URI pattern to one SPARQL
>>>>>>>> command. The matching process is not looking into the SPARQL query
>>>>>>>> results at all, only at the request URI and the application's LDT
>>>>>>>> ontology.
>>>>>>>>
>>>>>>>> I think you can solve this with a single query though. What we do is
>>>>>>>> provide the URI of the requesting agent as a query binding, e.g. the
>>>>>>>> ?agent variable. Something like
>>>>>>>>
>>>>>>>> :ConstructPerson a sp:Construct ;
>>>>>>>>            sp:text """
>>>>>>>> PREFIX  foaf: <http://xmlns.com/foaf/0.1/>
>>>>>>>> PREFIX  acl:  <http://www.w3.org/ns/auth/acl#>
>>>>>>>>
>>>>>>>> CONSTRUCT
>>>>>>>>          {
>>>>>>>>            ?this a foaf:Person .
>>>>>>>>            ?this foaf:name ?name .
>>>>>>>>            ?this ?p ?o .
>>>>>>>>          }
>>>>>>>> WHERE
>>>>>>>>          {   { ?this  a                     foaf:Person ;
>>>>>>>>                       foaf:name             ?name
>>>>>>>>              }
>>>>>>>>            UNION
>>>>>>>>              { GRAPH <acl>
>>>>>>>>                  { ?auth  acl:accessTo  ?this ;
>>>>>>>>                         acl:agent ?agent .
>>>>>>>>                  }
>>>>>>>>                ?this  ?p  ?o
>>>>>>>>              }
>>>>>>>>          }
>>>>>>>>            """ ;
>>>>>>>>            rdfs:isDefinedBy : .
>>>>>>>>
>>>>>>>> The idea is that the person query always returns "basic" properties,
>>>>>>>> and adds all properties *only* if the agent ?agent has an
>>>>>>>> authorization to access the requested resource ?this.
>>>>>>>> This approach requires that the query has access to the ACL data,
>>>>>>>> which I have indicated here as GRAPH <acl>. The actual pattern for the
>>>>>>>> authorization check will probably be more complex, of course.
>>>>>>>> It also requires that the authentication mechanism can provide
>>>>>>>> the URI
>>>>>>>> of the agent.
>>>>>>>>
>>>>>>>> I hope I got what you meant :)
>>>>>>>>
>>>>>>>> On Fri, Sep 13, 2019 at 2:58 PM Mikael Pesonen
>>>>>>>> <mikael.pesonen@lingsoft.fi> wrote:
>>>>>>>>
>>>>>>>> Ah, I might have explained our case a bit vaguely. I just meant that in
>>>>>>>> the RDF data we have one kind of person resource, and depending on the
>>>>>>>> access rights in the application, you are allowed to see different
>>>>>>>> portions of that person's data.
>>>>>>>> A basic user sees only the name, for example, and an admin user is
>>>>>>>> allowed to see all data. This is handled by selecting a different
>>>>>>>> template for the basic user and the admin, right?
>>>>>>>>
>>>>>>>>
>>>>>>>>
>>>>>>>> On 13/09/2019 15:52, Martynas Jusevičius wrote:
>>>>>>>>
>>>>>>>> Mikael,
>>>>>>>>
>>>>>>>> this is related to hierarchical URIs:
>>>>>>>> http://patterns.dataincubator.org/book/hierarchical-uris.html
>>>>>>>>
>>>>>>>> In your case, the question is how you have organized the
>>>>>>>> collections/items of basic and admin persons in your dataset.
>>>>>>>>
>>>>>>>> One option is that both "basic persons" and "admin persons" belong to
>>>>>>>> the same collection and have a single URI pattern: /persons/{id}
>>>>>>>> In this case you cannot tell if resource /persons/12345 is a "basic
>>>>>>>> person" or an "admin person" just from its URI. You need to dereference
>>>>>>>> it and then look into RDF types and properties.
>>>>>>>>
>>>>>>>> Another option is that you treat them as belonging to separate
>>>>>>>> collections, for example: /persons/{id} and /admins/{id}
>>>>>>>> In this case you can easily tell if a resource is a "basic person" or
>>>>>>>> an "admin person" already from its URI.
>>>>>>>>
>>>>>>>> Linked Data Templates are best suited for this second case, where the
>>>>>>>> URI space is subdivided into hierarchies based on entity types. That
>>>>>>>> makes it easy to define URI templates that match precisely the set of
>>>>>>>> resources that you want.
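>>>>>>>>
>>>>>>>> For example, a rough sketch of two templates matching those two
>>>>>>>> collections (hypothetical names, queries omitted):
>>>>>>>>
>>>>>>>> :PersonItem a ldt:Template ;
>>>>>>>>     ldt:match "/persons/{id}" .   # matches only "basic person" documents
>>>>>>>>
>>>>>>>> :AdminItem a ldt:Template ;
>>>>>>>>     ldt:match "/admins/{id}" .    # matches only "admin person" documents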
>>>>>>>>
>>>>>>>> Does it make it clearer?
>>>>>>>>
>>>>>>>> On Fri, Sep 13, 2019 at 2:08 PM Mikael Pesonen
>>>>>>>> <mikael.pesonen@lingsoft.fi> wrote:
>>>>>>>>
>>>>>>>> Hi Martynas,
>>>>>>>>
>>>>>>>> thank you for the examples, GET seems clear now.
>>>>>>>>
>>>>>>>> Good point about the person / document. We will probably end up with
>>>>>>>> three kinds of resources: the actual object, an admin record (who last
>>>>>>>> modified it, etc.), and a web page or another document about the object.
>>>>>>>>
>>>>>>>> Just one question: what did you mean by
>>>>>>>>
>>>>>>>> "If you cannot distinguish "basic person" from "admin person"
>>>>>>>> by their
>>>>>>>> URIs"?
>>>>>>>>
>>>>>>>>
>>>>>>>> We are not quite there yet with updates, so we might have
>>>>>>>> questions
>>>>>>>> later about those.
>>>>>>>>
>>>>>>>> Br,
>>>>>>>> Mikael
>>>>>>>>
>>>>>>>> On 11/09/2019 18:45, Martynas Jusevičius wrote:
>>>>>>>>
>>>>>>>> Hi Mikael,
>>>>>>>>
>>>>>>>> thanks for reaching out.
>>>>>>>>
>>>>>>>> There is more information on LDT in the AtomGraph Processor
>>>>>>>> wiki, more
>>>>>>>> specifically:
>>>>>>>> https://github.com/AtomGraph/Processor/wiki/Linked-Data-Templates
>>>>>>>>
>>>>>>>>
>>>>>>>> The matching is based on URIs: the relative request URI is matched
>>>>>>>> against the ldt:match values of templates in the ontology.
>>>>>>>>
>>>>>>>> Then, from the matching template (if there is any), the SPARQL command
>>>>>>>> is retrieved using either ldt:query or ldt:update (depending on the
>>>>>>>> HTTP request method).
>>>>>>>>
>>>>>>>> To address your example, templates and queries could look
>>>>>>>> like this:
>>>>>>>>
>>>>>>>> :BasicPersonItem a ldt:Template ;
>>>>>>>>              ldt:match "/person/basic/{id}" ;
>>>>>>>>              ldt:query :ConstructBasicPerson ;
>>>>>>>>              rdfs:isDefinedBy : .
>>>>>>>>
>>>>>>>> :ConstructBasicPerson a sp:Construct ;
>>>>>>>>              sp:text """
>>>>>>>>              PREFIX  foaf: <http://xmlns.com/foaf/0.1/>
>>>>>>>>
>>>>>>>>              CONSTRUCT
>>>>>>>>              {
>>>>>>>>                  ?this a foaf:Person ;
>>>>>>>>                      foaf:name ?name .
>>>>>>>>              }
>>>>>>>>              WHERE
>>>>>>>>              {
>>>>>>>>                  ?this a foaf:Person ;
>>>>>>>>                      foaf:name ?name .
>>>>>>>>              }
>>>>>>>>              """ ;
>>>>>>>>              rdfs:isDefinedBy : .
>>>>>>>>
>>>>>>>> :AdminPersonItem a ldt:Template ;
>>>>>>>>              ldt:match "/person/admin/{id}" ;
>>>>>>>>              ldt:query :ConstructAdminPerson ;
>>>>>>>>              rdfs:isDefinedBy : .
>>>>>>>>
>>>>>>>> :ConstructAdminPerson a sp:Construct ;
>>>>>>>>              sp:text """
>>>>>>>>              CONSTRUCT WHERE
>>>>>>>>              {
>>>>>>>>                  ?this ?p ?o
>>>>>>>>              }
>>>>>>>>              """ ;
>>>>>>>>              rdfs:isDefinedBy : .
>>>>>>>>
>>>>>>>> The "basic person" query retrieves only name and type, the "admin
>>>>>>>> person" query retrieves all properties.
>>>>>>>> This example requires that basic and admin person resources can be
>>>>>>>> differentiated by their URIs, i.e. "/person/basic/{id}" vs
>>>>>>>> "/person/admin/{id}".
>>>>>>>>
>>>>>>>> It also assumes that persons are documents (since they can be
>>>>>>>> dereferenced over HTTP), which is not kosher re. httpRange-14 [1]. A
>>>>>>>> better solution would have separate resources for persons (e.g. using
>>>>>>>> hash URIs such as #this) and explicitly connect them to documents
>>>>>>>> using an RDF property. We use foaf:primaryTopic/foaf:isPrimaryTopicOf.
>>>>>>>> But this is a whole topic on its own.
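>>>>>>>>
>>>>>>>> As a rough sketch of that separation, though (hypothetical URIs, just
>>>>>>>> for illustration):
>>>>>>>>
>>>>>>>> @prefix foaf: <http://xmlns.com/foaf/0.1/> .
>>>>>>>>
>>>>>>>> # the document (dereferenceable over HTTP)
>>>>>>>> <https://example.org/person/basic/123>
>>>>>>>>     a foaf:Document ;
>>>>>>>>     foaf:primaryTopic <https://example.org/person/basic/123#this> .
>>>>>>>>
>>>>>>>> # the person, a separate hash-URI resource connected to the document
>>>>>>>> <https://example.org/person/basic/123#this>
>>>>>>>>     a foaf:Person ;
>>>>>>>>     foaf:isPrimaryTopicOf <https://example.org/person/basic/123> .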
>>>>>>>>
>>>>>>>> If you cannot distinguish "basic person" from "admin person" by their
>>>>>>>> URIs, you could also have a template that matches both and maps to a
>>>>>>>> single query. The question is then whether you can differentiate which
>>>>>>>> properties to return using a single query.
>>>>>>>>
>>>>>>>> Does this help?
>>>>>>>>
>>>>>>>>
>>>>>>>> [1] https://www.w3.org/2001/tag/group/track/issues/14
>>>>>>>>
>>>>>>>> Martynas
>>>>>>>> atomgraph.com
>>>>>>>>
>>>>>>>> On Wed, Sep 11, 2019 at 11:21 AM Mikael Pesonen
>>>>>>>> <mikael.pesonen@lingsoft.fi> wrote:
>>>>>>>>
>>>>>>>> Hi Martynas,
>>>>>>>>
>>>>>>>> we have a proprietary implementation now:
>>>>>>>>
>>>>>>>> A js/React app generates a custom JSON out of form data. That is sent
>>>>>>>> (with a template id) to an also-custom proxy, which converts the JSON
>>>>>>>> into SPARQL using pre-made templates. The SPARQL is then run against
>>>>>>>> Apache Jena.
>>>>>>>>
>>>>>>>> Now we would like to replace all the custom bits with one or more
>>>>>>>> standards.
>>>>>>>>
>>>>>>>> Is it possible to have any kind of templates with LDT? For example
>>>>>>>> "person_basic" and "person_admin", where the admin one contains more
>>>>>>>> properties of a person?
>>>>>>>>
>>>>>>>> I'm still having trouble understanding how the SPARQL template is
>>>>>>>> selected with LDT.
>>>>>>>>
>>>>>>>> Br,
>>>>>>>> Mikael
>>>>>>>>
>>>>>>>> On 10/09/2019 15:50, Martynas Jusevičius wrote:
>>>>>>>>
>>>>>>>> Hey Mikael,
>>>>>>>>
>>>>>>>> we have a simple example here:
>>>>>>>> https://github.com/AtomGraph/Processor#example
>>>>>>>>
>>>>>>>> Do you have some specific use case in mind? If you can
>>>>>>>> share it, I can
>>>>>>>> probably look into it.
>>>>>>>>
>>>>>>>> There is a Community Group for Linked Data Templates which
>>>>>>>> includes a
>>>>>>>> mailing list: https://www.w3.org/community/declarative-apps/
>>>>>>>>
>>>>>>>> Martynas
>>>>>>>> atomgraph.com
>>>>>>>>
>>>>>>>> On Tue, Sep 10, 2019 at 1:27 PM Mikael Pesonen
>>>>>>>> <mikael.pesonen@lingsoft.fi> wrote:
>>>>>>>>
>>>>>>>> In the example there is the GET request
>>>>>>>>
>>>>>>>> GET
>>>>>>>> /people/Berners-Lee?g=http%3A%2F%2Flinkeddatahub.com%2Fgraphs%2Fc5f34fe9-0456-48e8-a371-04be71529762
>>>>>>>> HTTP/1.1
>>>>>>>>
>>>>>>>>
>>>>>>>> Often you want to query different amounts of data depending on the
>>>>>>>> case. Sometimes, for example, the person's name is enough; other times
>>>>>>>> you want all the triples (DESCRIBE).
>>>>>>>> How do you specify the context here?
>>>>>>>>
>>>>>>>> BTW is there a dedicated forum for discussing Linked Data
>>>>>>>> Templates?
>>>>>>>>

-- 
Lingsoft - 30 years of Leading Language Management

www.lingsoft.fi

Speech Applications - Language Management - Translation - Reader's and Writer's Tools - Text Tools - E-books and M-books

Mikael Pesonen
System Engineer

e-mail: mikael.pesonen@lingsoft.fi
Tel. +358 2 279 3300

Time zone: GMT+2

Helsinki Office
Eteläranta 10
FI-00130 Helsinki
FINLAND

Turku Office
Kauppiaskatu 5 A
FI-20100 Turku
FINLAND

Received on Wednesday, 18 September 2019 12:28:43 UTC