0:00:18 | Well, I will try to be fast, because there is already a lot scheduled for you. |
---|
0:00:27 | It is my pleasure to introduce this project. It is a European project. |
---|
0:00:36 | The name of the project is Robust and Safe Mobile Co-operative Autonomous Systems, and the abbreviation is… |
---|
0:00:51 | So, at the beginning I will introduce the project itself, its goals and structure, and the demonstrators. |
---|
0:01:00 | Then I will present our scientific developments and achievements in this work at our university, |
---|
0:01:09 | and finally I will describe how we integrate our results into the project demonstrators. |
---|
0:01:17 | The main project goal is the production of advanced, robust, cognitive reasoning-enabled practical robotic systems at reduced cost; that is the most important thing. |
---|
0:01:31 | The main idea of the project is to design reusable building blocks collected in a knowledge base: |
---|
0:01:42 | a knowledge base with independent building blocks that are reusable, so that any new robotic application will be cheaper to develop than it is at present. |
---|
0:02:01 | So, based on the state of the art, |
---|
0:02:06 | the project goal is to specify objectives and develop novel solutions covering all crucial robotics topics, |
---|
0:02:15 | such as hardware performance, sensing and perception, |
---|
0:02:22 | modeling, reasoning, decision making, and validation and testing. |
---|
0:02:28 | The designed solutions are used to develop a knowledge base |
---|
0:02:34 | and tools for rapid development and standardized testing, |
---|
0:02:44 | and a methodology for working with them. |
---|
0:02:50 | One of the key outputs, as I said, is the development of the knowledge base, or rather a knowledge-based framework. |
---|
0:03:02 | So the first two points, state-of-the-art methods and new features, are used to fill this knowledge base and to design a methodology for working with this knowledge. |
---|
0:03:22 | The modular solutions will then be integrated into the demonstrators from the different industrial partners. |
---|
0:03:44 | So these are examples of application domains |
---|
0:03:50 | that are covered by industrial partners in this project. |
---|
0:03:57 | Maybe I should first show you the partners: there are twenty-seven partners, which is quite a large number. |
---|
0:04:05 | The project is focused on cooperation with industrial partners, so besides universities and national research centres, a high proportion of the project members are industrial representatives. |
---|
0:04:23 | So, coming back to this slide. |
---|
0:04:29 | Ground and aerial mobile robots, |
---|
0:04:38 | operating independently or in cooperation, are promising solutions for the accomplishment of surveillance, security, and inspection tasks. |
---|
0:04:55 | Examples might be monitoring the borders or the seashore, or inspecting infrastructures. |
---|
0:05:06 | I will talk about one of these domains in detail later, because that is where we integrate our solutions. |
---|
0:05:19 | In the industrial manufacturing of wooden parts and products, process steps like cutting, surface finishing, and assembly of the parts are usually carried out by CNC machines in current practice. |
---|
0:05:42 | Manual operation of the machines is still necessary, because the machines are mostly not able to work with small parts, so the objective of this project is also to solve these problems. |
---|
0:06:00 | Then there is a set of further application domains that I can skip, because they were covered in detail in the previous presentation. |
---|
0:06:16 | So, in this structure, this is where our research at the university fits. |
---|
0:06:29 | Our topics are driven by the requirements of the project applications, |
---|
0:06:34 | where online, that is, real-time sensor data processing is a key requirement. |
---|
0:06:41 | Our proposal includes the development of novel methods, the optimization of existing ones, and acceleration in hardware. |
---|
0:06:59 | The methods we have developed and the results of our research are based on sensor fusion, usually for the tasks of robot localisation and object detection, that is, perception. |
---|
0:07:25 | For robot localisation, we developed a method that fuses two existing methods. |
---|
0:07:32 | One is a SLAM mapping method that processes laser scans; it has very good precision, but the model of the environment, as you can see, is quite coarse, so it is hard to get any knowledge from it. |
---|
0:07:50 | The other method is based on data from a Kinect; as we show, such data can create a much more informative model, but the precision is worse. |
---|
0:08:06 | If we fuse those two methods, we are able to combine the precision of the first with the model quality of the second, |
---|
0:08:20 | so the final solution is both precise and produces a model of higher quality. |
---|
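A minimal sketch of the fusion idea described above, assuming each source provides a position estimate with a scalar variance: the precise estimate (laser SLAM) and the less precise one (Kinect) are combined by inverse-variance weighting. This is an illustrative stand-in, not the project's actual algorithm; all numbers are made up.

```python
import numpy as np

def fuse_estimates(pose_a, var_a, pose_b, var_b):
    """Fuse two independent position estimates by inverse-variance weighting:
    the more certain estimate (smaller variance) receives the larger weight."""
    w_a, w_b = 1.0 / var_a, 1.0 / var_b
    fused = (w_a * pose_a + w_b * pose_b) / (w_a + w_b)
    return fused, 1.0 / (w_a + w_b)  # fused variance is smaller than either input

# Illustrative values: a precise laser-SLAM estimate and a noisier Kinect one.
laser_pose, laser_var = np.array([2.0, 1.0]), 0.01
kinect_pose, kinect_var = np.array([2.2, 1.1]), 0.09
pose, var = fuse_estimates(laser_pose, laser_var, kinect_pose, kinect_var)
```

With these variances the laser estimate carries 90% of the weight, so the fused pose stays close to it while still using the Kinect information.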
0:08:32 | For object detection, |
---|
0:08:35 | we experimented with and realised several methods; I will briefly introduce two of them, where we achieved nice results that were published. |
---|
0:08:48 | All those detectors are then used in our system and fused together for improving the robustness of the final solution and for more precise modeling of the environment. |
---|
0:09:08 | So, the first one is a segmentation of the video signal by tracking of local features. |
---|
0:09:21 | In comparison with existing methods, which focus more on precision and stability, our method, as I said, focuses on speed and also on working in an online manner. |
---|
0:09:36 | The existing offline methods use the whole data to produce motion tracks and then cluster them in a nice way; but we cannot use data from the future in online processing, so this was a problem that we needed to solve. |
---|
0:09:57 | We actually came up with a method that is able to run online and in real time with comparable precision but many times higher speed, so the computational cost is very low. |
---|
0:10:20 | This method is able to segment while the robot moves; it segments objects that move in the video stream relative to each other. |
---|
0:10:32 | So it does not necessarily mean that the object itself has to move: if there is an object far from some background, we are able to segment it. Actually, I should not say "we": this is computed. In the picture you can see the segmentation; it does not know what kind of object it sees. |
---|
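A toy sketch of this style of online motion segmentation, assuming we only have feature positions in the previous and current frame: features whose frame-to-frame displacement vectors are similar are grouped together, with no look-ahead into future frames. The greedy grouping and the tolerance are illustrative choices, not the published method.

```python
import numpy as np

def segment_by_motion(prev_pts, curr_pts, tol=1.0):
    """Greedily group tracked features whose frame-to-frame displacement
    vectors are similar; features moving together get the same label.
    Uses only the current and previous frame, so it can run online."""
    disp = curr_pts - prev_pts
    labels = -np.ones(len(disp), dtype=int)
    centers = []                      # representative displacement per group
    for i, d in enumerate(disp):
        for g, c in enumerate(centers):
            if np.linalg.norm(d - c) < tol:
                labels[i] = g
                break
        else:                         # no similar group found: start a new one
            labels[i] = len(centers)
            centers.append(d)
    return labels

# Four tracked features, two moving right and two nearly static:
prev = np.array([[0, 0], [10, 0], [0, 10], [10, 10]], float)
curr = prev + np.array([[5, 0], [5, 0.2], [0, 0], [0.1, 0]])
labels = segment_by_motion(prev, curr)
```

The two translating features end up in one group and the two static ones in another, which is exactly the "moving relative to each other" criterion from the talk.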
0:10:50 | The second method I want to introduce is a processing method for depth data. |
---|
0:10:59 | The idea is based on the fact that in indoor scenes many objects are planar, so the method segments planes and, through them, objects. |
---|
0:11:17 | Again, the focus was on computational efficiency: here we also have slightly better precision than existing methods, but we are many times faster than the others. |
---|
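A minimal illustration of segmenting the dominant plane in depth data, using a plain RANSAC plane fit on a synthetic point cloud. This is a generic textbook technique standing in for the project's actual method; the iteration count, threshold, and scene are all illustrative.

```python
import numpy as np

def ransac_plane(points, iters=200, thresh=0.05, rng=None):
    """Fit the dominant plane in an Nx3 point cloud with RANSAC.
    Returns (unit normal n, offset d) with n.p + d ~ 0 for inliers,
    plus the boolean inlier mask."""
    rng = rng or np.random.default_rng(0)
    best_count, best_mask = -1, None
    for _ in range(iters):
        a, b, c = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(b - a, c - a)
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue                      # degenerate (collinear) sample
        n /= norm
        d = -n.dot(a)
        mask = np.abs(points @ n + d) < thresh
        if mask.sum() > best_count:
            best_count, best_mask, best_n, best_d = mask.sum(), mask, n, d
    return best_n, best_d, best_mask

# Synthetic indoor scene: a floor plane z = 0 plus scattered clutter.
rng = np.random.default_rng(1)
floor = np.column_stack([rng.uniform(0, 4, 300), rng.uniform(0, 4, 300), np.zeros(300)])
clutter = rng.uniform(0, 4, (60, 3))
n, d, mask = ransac_plane(np.vstack([floor, clutter]))
```

The recovered normal points along the z-axis and the inlier mask covers the floor points, i.e. the plane is "segmented out" of the cloud; the remaining points would then be treated as objects.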
0:11:37 | This is the last area of research that we do here for the project; it belongs to the validation and verification part of the project, where we develop some common features. |
---|
0:11:53 | Usually, when robotic systems are developed and need to be verified, there is a need for some simulation process, and usually we generate an image, actually an ideal rendered image. |
---|
0:12:11 | So what we are trying to do is to somehow simulate the real situation: we are introducing distortions, noise, and many other effects, chromatic aberration, lens flare, to make the rendered example, basically, not perfect. |
---|
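The kind of degradation described here can be sketched as follows, assuming a grayscale image stored as a NumPy array. A single radial-distortion coefficient plus Gaussian sensor noise is a simplified stand-in for the full list of effects mentioned in the talk (chromatic aberration, for instance, could be approximated by using a slightly different coefficient per colour channel).

```python
import numpy as np

def degrade(image, k1=0.1, noise_sigma=5.0, rng=None):
    """Degrade an ideal rendered grayscale image so it resembles a real
    camera frame: resample through a simple radial (barrel) distortion of
    the pixel grid, then add Gaussian sensor noise."""
    rng = rng or np.random.default_rng(0)
    h, w = image.shape
    yy, xx = np.mgrid[0:h, 0:w].astype(float)
    cx, cy = (w - 1) / 2, (h - 1) / 2          # image centre
    xn, yn = (xx - cx) / cx, (yy - cy) / cy    # normalised coordinates
    r2 = xn**2 + yn**2
    # radial distortion: sample the ideal image at radially displaced positions
    xs = np.clip((xn * (1 + k1 * r2)) * cx + cx, 0, w - 1).astype(int)
    ys = np.clip((yn * (1 + k1 * r2)) * cy + cy, 0, h - 1).astype(int)
    out = image[ys, xs].astype(float)
    out += rng.normal(0, noise_sigma, out.shape)   # sensor noise
    return np.clip(out, 0, 255)

ideal = np.full((64, 64), 128.0)   # a perfectly flat rendered image
real = degrade(ideal)
```

The verified perception pipeline is then run on the degraded frames instead of the perfect renders, so it is tested under conditions closer to a real camera.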
0:12:46 | So that was our research from the scientific point of view; now to how we apply and integrate our solutions into the demonstrators. |
---|
0:12:52 | We cooperate with an industrial partner that develops AGVs, automatic guided vehicles; these are laser-guided vehicles that physically move goods |
---|
0:13:15 | and act as a link between different machines in big warehouses, moving the pallets, |
---|
0:13:23 | and they do it autonomously, controlled by one central unit |
---|
0:13:31 | that oversees the whole situation in the warehouse. So we have solved two tasks with them. |
---|
0:13:44 | One of them is obstacle avoidance; this is crucial for their solution, |
---|
0:13:51 | because when two AGVs meet somewhere, or when one AGV fails because of some malfunction and stays in the corridor, the path cannot be used, |
---|
0:14:10 | so we need to detect such an obstacle |
---|
0:14:15 | and avoid it. |
---|
0:14:22 | The second task is the application of truck loading, which means that those AGVs need to get into the truck, load the products there, and leave again. |
---|
0:14:39 | The problem is that the current positioning solution for this framework is based on a laser navigation positioning system that works only inside the warehouse. |
---|
0:14:52 | When the robot leaves the warehouse, it loses its position and does not know where it is. |
---|
0:15:00 | So the goal is to somehow measure and perceive the localisation of the AGV outside the warehouse, inside the truck, where there are also some constraints. |
---|
0:15:19 | So this is the setting in which we did our trials. |
---|
0:15:27 | In this example you can see a series of frames; the robot's primary sensor is a camera, |
---|
0:15:35 | and here it is manually driven. |
---|
0:15:38 | It is an example of how an obstacle should be detected. |
---|
0:15:47 | Then we determine what type the obstacle is: |
---|
0:15:51 | whether it is a person, another robot, |
---|
0:15:55 | or some other object, because according to this information we can say whether we can avoid it or |
---|
0:16:01 | not, for safety reasons: for example, a person is not |
---|
0:16:06 | allowed to be simply driven around, |
---|
0:16:10 | while an obstacle such as a pallet can be avoided in the warehouse. |
---|
0:16:19 | For this we designed a structure that I will present here. Its input data |
---|
0:16:26 | are detections in some form, |
---|
0:16:32 | in the image domain or in other sensor data. |
---|
0:16:39 | In our experiments we use |
---|
0:16:43 | RGB data and geometric constraints on the environment, which |
---|
0:16:51 | let us project the detected objects into the environment. |
---|
0:17:00 | Alright. |
---|
0:17:05 | Having information about the detected objects, we can classify what type each one is and whether it is a dangerous situation or just a warning situation. |
---|
0:17:19 | Based on this information, the AGV can decide whether it wants to avoid the object; for that we have a planning module that computes the path and provides the information to the AGV. |
---|
0:17:41 | Here we only do the perception and modeling; we do not do any controlling of the AGV. |
---|
0:17:47 | So we only provide measurements and, let's say, analysis and proposals of what the action might be, but not any controlling. |
---|
0:18:04 | Here is an example: |
---|
0:18:05 | if there is an obstacle in front of the AGV, it is classified: the object type and |
---|
0:18:13 | position are determined, and the situation is stated as a danger or just a warning. |
---|
0:18:19 | According to this information we use something like decision making: if it is a |
---|
0:18:25 | person and it is too close, the robot must stop and wait; if it is another robot |
---|
0:18:32 | and it is far, all we need to do is slow down. So this |
---|
0:18:37 | table describes the decision making, and the planning module also computes |
---|
0:18:43 | a path in case the avoidance |
---|
0:18:47 | procedure is executed. |
---|
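The decision table described above might look like the following sketch. The obstacle classes, distance thresholds, and action names are illustrative guesses for exposition, not the demonstrator's real rules.

```python
def decide(obstacle_type, distance, stop_dist=1.0, slow_dist=3.0):
    """Toy decision table: map a classified obstacle and its distance (in
    metres) to an action for the AGV. Thresholds are illustrative only."""
    if obstacle_type == "person":
        # a person must never be driven around: stop and wait when close
        return "stop" if distance < slow_dist else "slow_down"
    if obstacle_type == "robot":
        if distance < stop_dist:
            return "stop"
        return "slow_down" if distance < slow_dist else "continue"
    # generic static obstacle (e.g. a lost pallet): plan a path around it
    return "avoid" if distance < slow_dist else "continue"

actions = [decide("person", 2.0), decide("robot", 5.0), decide("pallet", 2.0)]
```

Keeping the rules in one small, explicit table like this makes the safety-critical cases (person close by) easy to audit separately from the path planner.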
0:18:51 | So this is our system for the obstacle avoidance task. |
---|
0:18:56 | The second task, as I said, is for the AGV to get into the truck and |
---|
0:19:01 | localise itself in the truck, using a different localisation method than the one used in the warehouse. |
---|
0:19:10 | As you can see, there is limited visual information inside the truck, so |
---|
0:19:14 | we base our solution on laser measurements. |
---|
0:19:22 | So we designed a simple model of the truck, and we measure key points of the truck, |
---|
0:19:31 | and we provide the measurements |
---|
0:19:33 | to the AGV; the AGV then uses this localisation information in a similar way as it uses the warehouse positioning system. |
---|
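As a hedged illustration of getting a pose from laser hits (not the actual system): if the laser points are known to lie on one straight truck wall, a least-squares line fit gives the robot's heading error relative to the wall and its perpendicular distance to it. The frame convention and test values are assumptions for this sketch.

```python
import numpy as np

def wall_pose(scan_xy):
    """Estimate the robot's heading relative to a straight truck wall and
    its perpendicular distance to it, from 2D laser hits on the wall.
    scan_xy: Nx2 points in the robot frame (x forward, y to the left)."""
    x, y = scan_xy[:, 0], scan_xy[:, 1]
    A = np.column_stack([x, np.ones_like(x)])
    (a, b), *_ = np.linalg.lstsq(A, y, rcond=None)  # fit wall line y = a*x + b
    heading = np.arctan(a)                # angle between robot x-axis and wall
    lateral = abs(b) * np.cos(heading)    # perpendicular distance to the wall
    return heading, lateral

# Synthetic wall: rotated about 2.9 degrees, 1.5 m to the robot's left.
xs = np.linspace(0.5, 3.0, 20)
wall = np.column_stack([xs, np.tan(0.05) * xs + 1.5])
heading, lateral = wall_pose(wall)
```

A real system would fit both side walls and the back wall of the truck model, but each fit reduces to this same line-to-pose step.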
0:19:46 | So, to summarise: |
---|
0:19:51 | we have developed methods |
---|
0:19:53 | for sensor data processing. |
---|
0:19:56 | Aiming at fast computation, |
---|
0:19:59 | we used optimization methods and |
---|
0:20:03 | acceleration in hardware. |
---|
0:20:06 | The results of those methods are reusable in other demonstrators as well. |
---|
0:20:20 | We created an experimental platform, on which you have seen our experiments here, |
---|
0:20:31 | and most of our results are integrated into the demonstrator |
---|
0:20:36 | use cases. |
---|
0:20:37 | Thank you very much for your attention. |
---|