0:00:16 | So in this talk we are going to talk about how we extract causal relations from text, and what we can build from them. |
0:00:32 | So we come from industry; it is an IT-services-providing industry, and it has a lot of text data. |
0:00:47 | We are in the process of trying to exploit those text datasets; that is what we are trying to do: to exploit that data to extract relevant information. |
0:01:00 | So maybe the first question here would be: what exactly is the problem, and what easy or difficult questions can be addressed? |
0:01:13 | So here is an introduction to the problem, that is, what kind of information extraction we are talking about. |
0:01:23 | So causal relationships, as we all know, are relations between two portions of a sentence, or multiple portions of sentences. |
0:01:36 | We are interested in this class of relationships, which means we extract causes and effects from text data. |
0:01:48 | The first thing I will do is try to actually support what I said, that this is important to industry. |
0:01:57 | There are large numbers of causal relations which can be extracted from text in different domains: a company recalls tyres due to a faulty component; the recall of cars, and then a company going bankrupt, has been all over the news. |
0:02:16 | Why this is important is that when these kinds of ripples happen in one industry or one particular organization, organizations will know, from past experience: here comes an anomaly which can potentially be difficult for me. |
0:02:41 | That is the kind of predictive system we are talking about building for industry: one which is driven not only by structured data, the rules for demand forecasts, et cetera, but which also uses a lot of the information that is there in text. |
0:03:02 | The second one is an example which comes from utilities and industries which are always bothered about safety regulations. |
0:03:13 | There are safety agencies which publish reports about every kind of safety incident that has happened at a manufacturing plant, or a construction site, or a mine, and so on. |
0:03:31 | Each one of these reports actually gives a broad outline, from the regulating agencies, of what kinds of issues occurred, what they led to, what kinds of problems and what kinds of human casualties there were. |
0:03:48 | If the cause-effect pairs in these reports could be automatically extracted, the kind of knowledge base we could build from these reports would be very important for the future. |
0:04:05 | And a third one, very prominently: we have a lot of reports coming in on adverse drug effects, such as "serious adverse effect was observed in patients with heart disease due to high dosage of drug X". |
0:04:23 | These are reported in the medical literature in formal language, but adverse drug effects are also reported on social media, which is noisy text, and so on. |
0:04:37 | All of these have serious implications, because there are regulatory agencies who keep track of all these issues that are reported, and they get into investigating them and checking whether they are really attributable to the drug or not, and so on. |
0:05:00 | So I am just giving some examples to motivate why this actually became a problem for us to work on. |
0:05:09 | So essentially we are interested in detecting such causal relations for analytical and predictive applications, as I said. |
0:05:20 | A straightforward application is to build early warning systems: as causal statements are automatically detected, one can check them against a causal knowledge base for that domain, incrementally update that knowledge base, and then generate warning signals. |
0:05:51 | Okay, so with that, let us get into the complexity of the problem, why it is a hard problem. |
0:05:59 | There are different kinds of causal relations; we saw some a moment ago, and here are a few more. |
0:06:05 | "XYZ files for bankruptcy for mounting financial troubles": this is a causal relation where the effect is on the left-hand side (a company files for bankruptcy) and the cause is on the right (mounting financial troubles). |
0:06:24 | Then there are cases where causality is not explicitly marked, as over here: "drive cautiously over icy roads". |
0:06:31 | But we know that if you do not drive cautiously over icy roads, it can lead to a particular effect: the car skids and there is an accident. |
0:06:43 | We should be able to infer that; this is the kind of inference and reasoning which may also have to be done. |
0:06:54 | In an explicit one, it is once again "... has been caused by ...": the cause and the effect are both there, and the connective tells us which is which. |
0:07:03 | And then there is an issue which is important: sometimes the cause is not mentioned in the sentence at all. Over here, "the operation has found that ..." gives only the effect; the cause is not mentioned and has to be inferred. |
0:07:18 | The more complicated ones: there can also be multiple causal relations in a single sentence. |
0:07:26 | Take this data point about a car model: the engine stalling and the engine starting issues that were reported are actually the cause for the model's recall, and the recall in turn, of course, has financial implications. |
0:07:46 | So here there is a chain: the issue is the cause of the recall, which ultimately leads to the financial loss. |
0:08:04 | Okay, so let us get into the kind of work that has been done. Most often it is rule-based kinds of approaches that have been applied. |
0:08:18 | The adverse drug effect dataset, for example, has been there for quite some time, and a lot of the work on it is rule based, which of course has its own problems. |
0:08:27 | As for machine learning approaches, a lot of people have started using them in many situations; the problem there, of course, is the lack of training data. |
0:08:40 | And causal sentences can be mighty complex, so rule-based approaches do not always give us a hundred percent correct results. |
0:08:49 | So, for our task: because these problems come from multiple domains, and no single dataset works with a single set of rules, we wanted to have an annotated dataset built from whatever we could get from multiple domains. |
0:09:08 | For this task we have proposed a linguistically informed bidirectional LSTM baseline. We annotated all the sentences, |
0:09:21 | where each word of a sentence is finally labeled as either a cause, or an effect, or a causal connective, or none of these. |
0:09:37 | The Bi-LSTM thus gives us the cause portions and the effect portions: the consecutive words that are marked as cause, and the consecutive words that are marked as effect, are grouped together. For our domains we then built causal graphs, and for the causal graphs we applied clustering to the cause and effect phrases. |
0:09:58 | So the pipeline has four steps. The first step is annotation: we did that annotation ourselves. Then we went on to the second box, classification, and then to building the causal graph, as sketched below. |
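A minimal sketch of the span-grouping step just described, assuming a per-token tag set of C (cause), E (effect), CC (causal connective) and N (none); the tag names and the example sentence are illustrative, not the talk's exact annotation scheme:

```python
# Consecutive tokens carrying the same cause/effect/connective tag are
# grouped into a single phrase; "N" (none) tokens are dropped.
def group_spans(tokens, tags):
    """Group consecutive identically-tagged tokens into phrases."""
    spans, current, current_tag = [], [], None
    for tok, tag in zip(tokens, tags):
        if tag == current_tag:
            current.append(tok)
        else:
            if current_tag in ("C", "E", "CC"):
                spans.append((current_tag, " ".join(current)))
            current, current_tag = [tok], tag
    if current_tag in ("C", "E", "CC"):
        spans.append((current_tag, " ".join(current)))
    return spans

tokens = "XYZ files for bankruptcy due to mounting financial troubles".split()
tags   = ["E", "E", "E", "E", "CC", "CC", "C", "C", "C"]
print(group_spans(tokens, tags))
# [('E', 'XYZ files for bankruptcy'), ('CC', 'due to'),
#  ('C', 'mounting financial troubles')]
```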
0:10:12 | Okay, so these are the resources. We created a total of five annotated datasets; some of them were already available, and I will just talk about that, but we changed the annotations a bit. |
0:10:23 | The first one is recall reports, of the kind I was talking about: car recall reports, along with financial news and information about companies, et cetera. We picked up about four thousand five hundred sentences from these reports; the average sentence length is quite high in this case. |
0:10:44 | The second is the SemEval dataset, which was also a fit for this task; about thirteen hundred of those sentences |
0:10:58 | we actually re-annotated ourselves. Why did we re-annotate? In the original annotation, only single words were marked as the cause and the effect, whereas when we looked at it, we saw that a single word as the cause is often not right. |
0:11:16 | So we did our own annotation of it, and we validated it by checking that the originally marked word was a part of our cause or effect phrase. |
0:11:28 | Then there is a collection of BBC news as well, which has a few thousand sentences; there too the average length of the sentences is quite high. |
0:11:39 | Then there is the ADE dataset, which is noisy messages from Twitter and social media; it has about three thousand messages related to drugs, shared by drug users. |
0:11:51 | And then one more, which is similar drug-related events, but ones which come up in news and not in formal reports. |
0:12:02 | Okay, so this annotation mechanism was followed for each dataset. First, because the sentences could be complex and there could be conjunctions in them, we used OpenIE, which is by the University of Washington, to actually break them down into their multiple clauses, and then we set three annotators on them. |
0:12:26 | Each annotator was asked to mark the portions of the sentences, as you can see over here, as either cause, effect, causal connective, or none, where C is of course the cause and E the effect. |
0:12:46 | Sometimes multiple cause-effect pairs come from the same sentence, from the multiple clauses into which OpenIE has broken it, and therefore these are numbered as well: for one sentence, the first cause portion would carry the subscript one, and the next cause portion the subscript two. |
0:13:07 | And these are some examples. Once again, when you have a very complex sentence like this, OpenIE breaks it into two components, and then each of them is marked: so here this is C1 and E1, and here C2 and E2, and so on. |
0:13:35 | Similarly, a single sentence can contain several causal relations; in this case also you will see C1, E1, C2, E2, and so on. One way to represent this is sketched below. |
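Purely for illustration (this is not the authors' actual file format), the numbered annotation for a sentence with two cause-effect pairs could be represented like this; the example sentence and field names are hypothetical:

```python
# Each cause-effect pair extracted from the clauses of one sentence gets
# its own subscript id, so C1 pairs with E1, C2 with E2, and so on.
annotation = {
    "sentence": "The engine stalled because of a fuel leak, and the car "
                "was recalled because of the stalling reports.",
    "pairs": [
        {"id": 1, "cause": "a fuel leak",
         "effect": "The engine stalled", "connective": "because of"},
        {"id": 2, "cause": "the stalling reports",
         "effect": "the car was recalled", "connective": "because of"},
    ],
}

for p in annotation["pairs"]:
    print(f"C{p['id']}: {p['cause']!r}  ->  E{p['id']}: {p['effect']!r}")
```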
0:13:46 | So this is the model. Based on the annotation of the sentences given by the annotators, we now train a deep learning model. |
0:14:00 | We call it linguistically informed because we also use a lot of linguistic information in the training, not just the word vectors. The word vectors themselves, of course, are obtained from the original corpus. |
0:14:17 | Then we have a rich feature space which we build by using a lot of information that comes from standard linguistic tools: the part-of-speech tags; the universal dependency relations between the words; also, for every word, whether it is the head word of that particular dependency; and whether it is at the beginning, inside, or end of a phrase, for which we have taken the verb and noun phrase positions from the parse trees. |
0:14:50 | We have also utilized the WordNet hierarchy, especially because in many situations, as is evident from the adverse-drug-effect examples, even unnamed and unknown words can stand in a causal relationship. |
0:15:08 | For the head words we have taken into account, over here, whether it is an entity, whether it is a group, whether it is a phenomenon, and so on, and also, for the words in the original remark, their synonyms. |
0:15:24 | That is how we make this a very linguistically informed model, and each one of these features is a one-hot encoding. |
0:15:37 | Finally, all of this information is fed into a bidirectional LSTM, because, as we saw, cause-effect relationships do not follow one standard structure: the effect can precede the cause, and the cause can be anywhere in the sentence. So we use a bidirectional LSTM |
0:16:05 | to finally get the labeling of each particular word, scoring whether it is a cause, an effect, or a causal or non-causal connective, by passing it through a set of hidden layers and finally a softmax layer to take the label with the highest probability. A sketch of this architecture follows. |
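As a rough sketch only (the exact layer sizes, feature dimensions and training details are not specified here, so every dimension below is an assumption), the architecture described above might look like this in PyTorch:

```python
# Pre-trained word vectors are concatenated with one-hot linguistic
# features (POS, dependency relation, head/phrase-position flags, WordNet
# class, ...), fed to a bidirectional LSTM, then classified per token.
import torch
import torch.nn as nn

NUM_LABELS = 4  # cause, effect, causal connective, none

class LinguisticallyInformedBiLSTM(nn.Module):
    def __init__(self, word_dim=300, feat_dim=120, hidden=128):
        super().__init__()
        self.bilstm = nn.LSTM(word_dim + feat_dim, hidden,
                              batch_first=True, bidirectional=True)
        self.classifier = nn.Sequential(
            nn.Linear(2 * hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, NUM_LABELS),
        )

    def forward(self, word_vecs, ling_feats):
        # word_vecs:  (batch, seq_len, word_dim)   pre-trained embeddings
        # ling_feats: (batch, seq_len, feat_dim)   concatenated one-hot features
        x = torch.cat([word_vecs, ling_feats], dim=-1)
        h, _ = self.bilstm(x)
        return self.classifier(h)  # per-token logits over the 4 labels

model = LinguisticallyInformedBiLSTM()
logits = model(torch.randn(2, 10, 300), torch.randn(2, 10, 120))
labels = logits.argmax(dim=-1)  # (2, 10): highest-probability label per token
```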
0:16:23 | okay |
---|
0:16:25 | this is from is |
---|
0:16:27 | sentence |
---|
0:16:28 | a portion of the sentence model |
---|
0:16:31 | effect |
---|
0:16:32 | a portion of the sentence map task force |
---|
0:16:35 | sometimes we get on the calls are only if a we don't get the course |
---|
0:16:40 | and connectives correctly sometimes that's and that's may not need as we saw |
---|
0:16:44 | because it is it can be implicit the causality and so one |
---|
0:16:51 | Now on to our second problem. As I mentioned, extracting the causal relations was just the first part of the task; we want to use these relations to build a causal graph for industrial applications. |
0:17:05 | And here comes another problem that we had: the same information is expressed in different ways in different reports, by different companies, and so on. |
0:17:25 | So first we had to cluster the causes as well as the effects into groups in order to build our causal graph; we could not possibly have a very complex graph where every raw cause and effect phrase is its own node with just a relationship between them. |
0:17:45 | So here are some examples. You can see that all of these could potentially be grouped into what is called a fuel ignition problem. |
0:17:55 | These are all expressed differently in different reports by different writers, in their own language. They could be the same event or actually different events: one could be for one model of one car, another one for another model of another car, and so on. But all of them amount to a fuel ignition problem, |
0:18:20 | and then there is what it leads to, how it manifests in the car: if there is a fuel ignition problem, the car may stall at random, or something else goes wrong with the car. |
0:18:35 | Similarly, here, if you see, these are all engine-stalling problems, which are phrased in multiple different ways, and these are fire risks. As I was mentioning, if there is some kind of fuel ignition problem, it could lead to stalling, or it could lead to engine damage, and it could also lead to a fire. In fact, these are all from real data which has been reported. |
0:19:05 | For all such causes and effects, we wanted to cluster the similar ones together. Once again we exploited the same word vectors that we had, but there were some more issues to be taken care of, so we proposed utilizing unigrams and bigrams, with word vectors for the bigrams too. |
0:19:33 | We used these to compute the similarity between the words from two different cause phrases or two different effect phrases, |
0:19:47 | and then we do standard k-means clustering, where k was determined by checking whether a particular cause or effect phrase belonged more to its own cluster or fitted better in another cluster. |
0:20:06 | That does help: for the particular domain I was discussing, the car recall reports, the number of clusters became twenty-one, and these are phrases, not just single words. A sketch of this step follows. |
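A sketch of the clustering step under stated assumptions: each cause or effect phrase is embedded by averaging its word vectors (the unigram version; the talk also uses bigram vectors), then k-means is run, with k picked by a silhouette-style check of whether phrases sit better in their own cluster than in another. The random vectors here are stand-ins for the real word embeddings:

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

rng = np.random.default_rng(0)
vectors = {w: rng.normal(size=50) for w in
           "fuel ignition problem engine stall fire risk leak".split()}

def embed(phrase):
    """Average the vectors of the phrase's known words (unigram version)."""
    words = [w for w in phrase.lower().split() if w in vectors]
    return np.mean([vectors[w] for w in words], axis=0)

phrases = ["fuel ignition problem", "ignition problem", "engine stall",
           "stall", "fire risk", "fuel leak", "engine fire", "leak"]
X = np.stack([embed(p) for p in phrases])

# Pick k by the best silhouette score, then cluster the phrases.
best_k = max(range(2, 6),
             key=lambda k: silhouette_score(
                 X, KMeans(n_clusters=k, n_init=10,
                           random_state=0).fit_predict(X)))
clusters = KMeans(n_clusters=best_k, n_init=10, random_state=0).fit_predict(X)
print(best_k, list(zip(phrases, clusters)))
```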
0:20:18 | And finally, with the clusters in place, we built the causal graph, shown here with the sample phrases that I was showing. |
0:20:29 | So there are nodes like engine fire, and starting difficulty or failure to start; we have also seen unintended acceleration, mainly due to fuel leaks; we also see that a fuel problem also leads to fire, and unintended acceleration may be due to fire, and so on. |
0:20:51 | So this is the graph we got from the information after clustering. |
0:20:59 | In fact, there is one more thing: for a causal relation in a given dataset, with the cause belonging to one cluster and the effect belonging to another cluster, we draw this particular link between the clusters. |
0:21:19 | For the time being, a very simple reliability measure is what we have assigned to each link; there needs to be more work to actually compute a proper probability. It is simply the number of times the cause and the effect have been observed together, over the number of times that the cause was observed in the repository. A toy version of this computation follows. |
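A toy sketch of the simple edge-reliability measure described: co-occurrence count of a cause cluster and an effect cluster, divided by the count of the cause cluster alone. The (cause, effect) observations below are hypothetical:

```python
from collections import Counter

# Hypothetical (cause_cluster, effect_cluster) observations from extraction.
observations = [
    ("fuel ignition problem", "engine stalling"),
    ("fuel ignition problem", "fire"),
    ("fuel ignition problem", "engine stalling"),
    ("engine stalling", "recall"),
]

pair_counts = Counter(observations)
cause_counts = Counter(cause for cause, _ in observations)

def edge_reliability(cause, effect):
    """Times (cause, effect) seen together / times cause seen at all."""
    return pair_counts[(cause, effect)] / cause_counts[cause]

print(edge_reliability("fuel ignition problem", "engine stalling"))  # 2/3
```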
0:21:41 | So, based on this: as I mentioned, we had five different datasets which we had annotated, and we did two types of experiments. |
0:21:50 | In one approach we combined all five datasets together, and whatever number of sentences we got, we divided them into training, validation and testing data, with five-fold cross-validation. |
0:22:11 | In the other experiment, we trained the model using one dataset and tried to see how it performed on each dataset separately. Both protocols are sketched below. |
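A sketch of the two evaluation protocols just described, with hypothetical dataset names (the actual corpora are the five described earlier in the talk):

```python
from sklearn.model_selection import KFold

# Placeholder dataset names; each value would be a list of annotated sentences.
datasets = {"recall_reports": [...], "semeval": [...], "bbc": [...],
            "ade_tweets": [...], "drug_news": [...]}

# Protocol 1: pool everything, then five-fold cross-validation.
pooled = [s for sents in datasets.values() for s in sents]
for train_idx, test_idx in KFold(n_splits=5, shuffle=True,
                                 random_state=0).split(pooled):
    pass  # train on the train_idx sentences, evaluate on the test_idx ones

# Protocol 2: train on one dataset, evaluate on each of the others.
for train_name, train_sents in datasets.items():
    for test_name, test_sents in datasets.items():
        if test_name != train_name:
            pass  # train on train_sents, report the score on test_name
```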
0:22:22 | So here are the results for the first setting, where we mixed up all the data. As you can see, for the recall reports, the BBC data, et cetera, these are the performances. |
0:22:36 | The baselines we have used are a simple rule-based system, CRFs, a word-vectors-only Bi-LSTM, and the linguistically informed Bi-LSTM. In some cases the CRFs give better results; in most of the cases the linguistically informed model does better. |
0:22:55 | The reason is also very obvious: CRFs take good care of named entities, the drug names, et cetera; the positional features are very good in a CRF, which is what actually gives it better performance than our word and semantic features. The same thing can be observed for the ADE bar. |
0:23:18 | Connectives, however, are a different story: the causal discourse connectives are all standard English words and have nothing to do with drug names, domain-specific features, et cetera. So in this case our model gives the better performance in retrieving the causal connectives, irrespective of the domain. |
0:23:39 | Very similar things are seen here also. As we mentioned, in this setting one dataset is used for training, and on the others we test the performance. |
0:23:52 | The best cross-dataset performance comes from training on the BBC data, most likely because the BBC text is good, clean English, so most of the cleanly written sentences follow from it. |
0:24:15 | ADE, as usual, performs poorly across domains, probably because there is a lot of domain specificity involved in it, which the model cannot learn when the training comes from another dataset. |
0:24:32 | So, for the conclusion: the first three steps are what we have done over here, and the last one is future work: a better characterization of the events, which may need more data. |
0:24:48 | Naming an event such as engine stalling alone is good, but what was the torque, et cetera? Just naming a particular event is not enough; for applying it to a real scenario you need more of the context, such as the prior conditions under which it happened. |
0:25:11 | So we are working towards a more complex categorization of events, and also on composite events: most of the time there are composite event chains, so how do we characterize those in the graph? That is what we are working towards. Thank you. |
0:26:19 | [Audience question, inaudible.] |
0:26:35 | Okay. So in the labeling we don't consider these issues, because within a sentence a portion is simply marked as either cause or effect. All these issues come in when building the causal graph, because what is an effect in one sentence may be the cause in another. |
0:26:57 | [Audience question, inaudible.] |
0:27:05 | Yes. |
0:27:06 | Okay, so that is exactly why, when it is a complex sentence, we use OpenIE: the sentence would be broken down, so there would be E1 in one clause and C1 in another one. |
0:27:23 | Yes. [Audience question, inaudible.] |
0:28:05 | I mean, definitely, that is where we would like to go. Because we dealt specifically with cause and effect, where the arguments involved were just the cause and the effect, we were trying to keep it simpler. |
0:28:20 | But we do see that, definitely, the first step forward is a much richer set of relations, and then to build on it. |
0:28:56 | I take that point. [Audience question continues, inaudible.] |
0:29:23 | Yes, definitely we need to do that. The only issue was, you know, that we wanted to restrict ourselves to a very small set of relations so as not to lose focus. But definitely, there is much more to do. |