This is Paola from the AIKR Community Group, the Artificial Intelligence Knowledge Representation Community Group at the W3C. And it's June; finally I am getting around to showing something. I've got something to show, finally.

So, about these slides, very quickly. They are meant to provide an introduction, because a few people are new to this work and may still not have quite grasped what is being done and why. And myself as well: as the goalposts were shifting and the ideas behind this work kept shifting, I haven't been sure what I was doing for quite a while. So this is just to say: this is what we're doing. Also, there are a few people who would like to contribute, and I thought we should make it easier for them. And there have been a couple of questions that I hope these slides will help to answer.

Requesting expressions of interest: this is very important. People have been asking, do we have funding, why don't we have funding? And I wasn't able to be very clear. So what can start now is opening up a proposal. I'm hoping people will come on board who are capable of articulating the budgeting and all of those things, and of handling the bureaucracy involved; maybe institutional partners or, you know, backers of all sorts. It is also going to be a way of opening up feedback, so people can see what has been done in a few slides and finally be able to give feedback on it and contribute.

People who have been in the CG, the Community Group, know most of what has been going on. You just search for W3C AIKR CG, you get to the landing page, and from there to the wiki. Mostly it has been discussions.
But there have been quite a lot of discussions, and quite a lot of updates. The main task was to try to understand what is going on in AI, because that has not been easy. Around the time when we started is the time when symbolic AI and knowledge representation were being nailed into the coffin by the Turing Award laureates. At that time I, and a few other people, started getting nervous and saying, where is this going? Soon after that came AI explainability, the risks, and it all became a mayhem of what is going to happen and how we are going to fix it. A lot of people have jumped on the AI bandwagon wherever it fits, still not understanding AI, which is important. I don't make any claims that I understand AI, but at least from this time onwards it is documented on the mailing list that I have been paying attention, at least to whatever I could understand.

Out of the need to understand machine learning, KR became involved. When we couldn't figure out what was happening inside machine learning, the black box, we needed to call upon knowledge representation to figure it out. At the time, people outside the niche of the KR community still weren't sure, and we were able to communicate what it was. So out of the need to understand machine learning, KR was needed. But I don't think anybody knew exactly how it was going to be used to control the AI risk. We still don't know, but we are working on it, and this is that progress. It took us a few years.

So, what is KR? We know that; you can read up on it.
There are excellent educational materials; I don't have to tell you. You could say, okay, I'm going to school, I've been studying KR for years, I've been a researcher, I'm a professor in KR. There are a few such people, not many, but a few. And as you can see from the educational resources, which we are collecting on the wiki, there are a lot of courses in knowledge representation. Some of them are excellent, and they are accessible. And there are books. So one can say, okay, I'm going to study KR.

But that doesn't mean that people who know KR, the professors or even the experts, know how to use it to address the current AI concerns. Nobody can do anything in absolute terms, because AI is evolving, how do you say, organically. Nobody has control over it: the algorithms are out there, the computational power is there. And a lot of wonderful things are happening, a lot of useful applications.

So we know that we need KR for accessibility and for safety in general; we are bridging towards machine learning. KR has typically been used to program machines: we wanted an intelligent function, and we needed KR to tell the computer how to execute it. Now the opposite is happening: the computer is executing an intelligent function, and we need KR to figure out how it is doing it. I think this is very exciting to see, for me, and this is what makes it worth it, really.
So, in order to communicate what KR is, which is such a huge domain, we need this vertical, because there is so much of it. And we also need to explain the role of KR in explainability, which becomes a bit circular: we have to open the black box, and we have to use KR to create AI that is reliable, safe, ethical, unbiased, and all of that. And this is the other very exciting bit: using KR in the creation of agents. This is non-trivial, because KR is a huge subject. You can point people to KR educational resources, and there are books and exercises and lectures they can follow. But I don't know anybody who can guarantee a safe, ethical AI system, whether they have studied KR or not.

Still, we need to be able to summarize what KR is, or which concepts, tools and methods in KR can be leveraged to guarantee AI safety, because even experts in KR may not be familiar with the entire domain or with the state of the art. The typical experts I've spoken with were experts fifty years ago, and things have changed since: AI has changed, computer science has changed. Using KR to make the black box transparent may require advancing the state of the art in many fields, KR but also logic, and this is another exciting area. So at least we have to start by mapping the knowledge domain for KR. Ultimately, we are building an ontology for knowledge representation.
The scenario, the important thing to keep in mind, is that AI, whatever it is going to be, and there is so much here I cannot describe it all, is actually driving and influencing human evolution: cognition, information, belief, decision, politics, economics. AI is not just something by itself; it is something very fundamental in contemporary human evolution and in social technologies. So KR can help to mitigate existential risks. To be honest, I have a whole talk on this: systemic aberration. What is happening now is that things are being done which are not being described properly, and things are being said that are not being thought through. There is a lot of confusion from mislabeling. And we can only use knowledge representation to help us disambiguate and get a grip on what is going on. If you want, I can share the link if you cannot click it; if you're interested, that's about an hour-long talk.

If the standards being developed to ensure AI safety, and I have sifted through a few AI standards on the hub, are missing out on this essential AI safety concept, then the risk of critical failures in AI safety increases. I hope this is fairly logical and clear; it has taken a lot of effort. There is a lot of money and noise in AI safety; nonetheless, the safety standards are missing out on essential safety concepts. Therefore, I think we need to leverage KR to get out of this loop.
So, I am mapping the KR domain, simply because when people ask me what KR is, I want to point to something that can be processed cognitively by a human and, obviously, by the machine. But at the moment KR is all over the place, so I have divided it into subdomains. I am saying that the knowledge representation domain is made up of upper and foundational ontologies, knowledge representation languages and formalisms, domain ontologies (oops, spelling mistake here; I can never remember what ODDS stands for, domain, ontology domain? Basically these are domain ontologies, and there is a whole field there which I am looking at, when I can remember what it is), and AI safety standards. This is where I found the critical issues, some critical issues, which are my worry, and I think they should be your worry too. And we have a lot happening in knowledge representation learning and ontology-driven agents, very exciting stuff. We cannot conceptualize all of this massive field without a bit of sweat. So this is what we are doing now, and I very much appreciate your attention and the concentration you are applying to follow.

Now, this is the point where there was a bit of a question. Peter Rivett said, oh, you're saying that you're going to do something that can make the system reliable, and he wasn't quite sure this was something to be taken seriously, or feasible. So I thought, okay, we cannot have any AI ethical safety standards unless we understand the notion of reliability engineering. So I am intersecting these two here, and I am happy to look at it later. So today we are starting from this: we are modeling.
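Just to make this concrete, here is a rough, illustrative sketch of one way these subdomains could be written down as a machine-readable concept scheme, using Python and rdflib. This is only an illustration, not the group's deliverable: the namespace, identifiers, and labels below are placeholders I have made up, and the list simply mirrors the subdomains on the slide.

# A minimal sketch, assuming placeholder names: encode the proposed KR
# subdomains as a SKOS concept scheme so both humans and machines can read it.
from rdflib import Graph, Namespace, Literal
from rdflib.namespace import RDF, SKOS

AIKR = Namespace("https://example.org/aikr/")  # placeholder namespace, not an agreed URI

g = Graph()
g.bind("skos", SKOS)
g.bind("aikr", AIKR)

scheme = AIKR["KRDomain"]
g.add((scheme, RDF.type, SKOS.ConceptScheme))
g.add((scheme, SKOS.prefLabel, Literal("Knowledge Representation domain", lang="en")))

# Subdomains as listed on the slide (labels are illustrative)
subdomains = [
    "Upper and foundational ontologies",
    "KR languages and formalisms",
    "Domain ontologies",
    "Knowledge representation learning",
    "Ontology-driven agents",
    "AI safety standards",
]
for label in subdomains:
    concept = AIKR[label.replace(" ", "_")]
    g.add((concept, RDF.type, SKOS.Concept))
    g.add((concept, SKOS.prefLabel, Literal(label, lang="en")))
    g.add((concept, SKOS.inScheme, scheme))
    g.add((scheme, SKOS.hasTopConcept, concept))

print(g.serialize(format="turtle"))

Something as small as this already gives people a single artifact to point at and argue about, which is the point of the mapping exercise.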
These are the subdomains: each of these is a subdomain of the knowledge representation domain, which is huge. And today we are just focusing on this one, because I think we should take this step by step. So, first question: are the subdomains adequate? This is a nice little evaluation task for anyone who has a spare neuron. Upper ontologies, knowledge representation languages, domain ontologies, knowledge representation learning, and all of these: is there anything in these subdomains that shouldn't be there, that doesn't belong? Or is there anything fundamentally missing? Please let me know.

So what are we doing? What I have done so far is that I have taken all the upper, top-level ontologies that I could put my hands on. I have identified the sources of knowledge, which were textbooks and papers and repositories, and extracted terms; in particular, I have extracted the classes, I think. So now it is a matter of evaluating, and then we are going to develop a definition for each class or category.
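To give an idea of what that class-extraction step can look like in practice, here is a hypothetical sketch using rdflib. The file names are placeholders, and the real harvesting also covered textbooks and papers, so treat this purely as an illustration of the ontology part: it collects everything declared as an owl:Class and records which sources each term came from.

# A hypothetical sketch of the extraction step: pull declared classes and
# their labels out of a handful of upper-ontology files (placeholder names).
from rdflib import Graph
from rdflib.namespace import RDF, RDFS, OWL

sources = ["upper_ontology_a.ttl", "upper_ontology_b.ttl"]  # placeholder file names

terms = {}
for path in sources:
    g = Graph()
    g.parse(path, format="turtle")  # assuming Turtle files; rdflib also reads RDF/XML etc.
    for cls in g.subjects(RDF.type, OWL.Class):
        label = g.value(cls, RDFS.label)
        # Fall back to the local name when no rdfs:label is present
        name = str(label) if label else str(cls).rsplit("/", 1)[-1].rsplit("#", 1)[-1]
        terms.setdefault(name, set()).add(path)

# Candidate vocabulary: term -> which source ontologies mention it
for name, found_in in sorted(terms.items()):
    print(name, "\t", ", ".join(sorted(found_in)))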
We will also streamline the duplicates, clean it up, boil it down a little bit, and make a list. When I say list, it looks like I am doing nothing, you know, but I am making a bag of words, which is a fundamental thing (I give a tiny sketch of that step below).

You can click here, and if you cannot, I am going to send the link separately; it is an upper ontology vocab. Typically I would put this in an open spreadsheet, open for anyone to access, but there have been people who have taken advantage of that openness. We don't know who they are; there is no way to track who accessed what resource, maybe using it without credit. To some extent it doesn't matter, because ultimately these are going to be published and licensed for everyone to use; they are going to be open access. But in the meantime some people could just use this work to do their own thing and not acknowledge it, and this has unfortunately happened. So here is the upper ontology vocab. If you would like to look at it or provide feedback in any way, please ask, and if you cannot see it, please request access. I will send the vocab to the people who are already working on this; two or three people have already provided feedback on the terms, and they will receive it, probably in an email, with a request not to share it outside the Community Group at this stage. Please check the current list of terms and annotate it.
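And here is the tiny sketch of the bag-of-words step I mentioned, nothing more than an assumed illustration: it normalizes the harvested strings, folds obvious duplicates together by lowercasing, and counts how often each surface form occurs. The sample input is invented; in practice it would be the terms extracted from the sources.

# A small, assumed illustration of the "boil it down to a list" step.
from collections import Counter

raw_terms = [
    "Entity", "entity", "Continuant", "Occurrent",
    "Process ", "process", "Quality", "Entity",
]  # placeholder input; in practice this comes from the extraction step

bag = Counter(t.strip().casefold() for t in raw_terms if t.strip())

# One row per distinct term with its frequency, ready to paste into the shared sheet
for term, count in bag.most_common():
    print(f"{term}\t{count}")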
In the Excel sheet, you may be able to create a column with your name, and then in that column you could write something for each term, something like "add", or "this doesn't belong", whatever. So feel free to annotate each term, or to create an annotation that applies to the entire vocabulary, or even just to email. As for the terms: there are about 200 terms now. Is there anything to be added? Anything that should be deleted? How can this upper ontology vocabulary, this upper ontology, be used? Is it useful? Is it complete? How can it be used? Can you make some use cases?

Eventually this is going to be published, and it is going to constitute the first step towards the publication of a series of vocabularies which, overall, will represent the knowledge representation domain, and probably the basis for an ontology of knowledge representation. So if anyone is interested in working on the list, please get in touch. Thank you so much for listening, and I'll see you in the group.

Wow, this is being transcribed already. Look at the machine, this is the machine, it's doing it. The machine is opening it. Bye.