0:00:34 | The overwhelming majority of work in compressive sensing constructs measurement matrices that are sub-Gaussian: the entries are Gaussian, or binary in some fashion, or partial Fourier matrices and the like. That is the space that has been explored. |
0:00:57 | So the question we ask is: what happens if the matrices we use in compressive sensing are not sub-Gaussian? There are some interesting statistical and geometrical characteristics that emerge out of those properties. |
0:01:17 | First we will motivate and derive some of the characteristics that arise when these matrices are not sub-Gaussian. We will talk about stable random processes, which are one class of random variables that are not sub-Gaussian; then we will go through some RIP-equivalent characteristics of these compressive measurements and show some computer simulations. |
0:01:46 | In compressive sensing, the fundamental property, the restricted isometry property (RIP), shows how dimensionality reduction is achieved while the information in the space is preserved. Several people in this session have talked about information preservation in the reduced space. |
0:02:08 | With typical sub-Gaussian measurements you have an isometry: you start in the L2 space, and when you take the projections you have a near-isometric preservation in the L2 space as well. But the matrix has to be Gaussian, or something like that. |
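(As a baseline illustration, not from the talk: a minimal sketch of the sub-Gaussian case, where a Gaussian matrix scaled by 1/sqrt(K) approximately preserves L2 distances. All names and sizes here are arbitrary choices.)

```python
import numpy as np

rng = np.random.default_rng(5)
n, K = 2000, 400
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)

# Scaled Gaussian projection: a near-isometry on L2 (Johnson-Lindenstrauss flavor).
A = rng.standard_normal((K, n)) / np.sqrt(K)
print("original  ||x1 - x2||_2   =", np.linalg.norm(x1 - x2))
print("projected ||A(x1 - x2)||_2 =", np.linalg.norm(A @ (x1 - x2)))
```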
0:02:34 | Chartrand and colleagues generalized this idea: what if I look at this as an isometry still on the L2 space, but do the regularization in an Lp norm? You can then get more sparse characteristics in the reconstruction. |
0:02:56 | Using that framework, what we will do is generalize it to a compressive scenario where R is no longer sub-Gaussian; in fact it may be very impulsive, with heavy-tailed characteristics. We will show that the isometry that is preserved is no longer on the L2 space: an Lp metric on the measurements is tied to an L-alpha metric on the signal, and the two are very distinctly related. |
0:03:31 | The key point is that p will be less than alpha, which is at most two, and alpha can go from one to two. |
0:03:40 | For stable distributions, alpha determines the degree of Gaussianity, if you will: alpha equal to two means Gaussian, and the lower the alpha, the more impulsive the characteristics. And of course the RIP also gives you the probability of reconstruction and the number of measurements you need. |
0:04:17 | Now, the geometric interpretation. In compressive sensing, typically, if you have the L2 distance in the original space, you are going to preserve that L2 distance in the projected space. In the dimensionality-reduction literature there is a lot of interest in doing dimensionality reduction in norms that are not L2, maybe the L1, which may be more attractive for the application. And, for example, in the image processing literature it is well known that L2-based norms are fragile, and you may want to go to other, more robust ones. |
0:05:04 | So that was the motivation: can we get something other than L2 distance preservation in this compressive sense? In this case, for instance, we are talking about the L1 norm in the original space being preserved in the projections as an Lp norm. Associated with this dimensionality reduction you then have the restricted isometry property, and in the reconstruction you would like to get the signal back. |
0:05:33 | The result that we will work toward is basically this result here, which is in essence similar to the traditional RIP, except that instead of the L2 norm we have an Lp norm on the measurements and an L-alpha norm on the signal. |
0:05:49 | What do you gain if you use this projection rather than the orthodox L2 one? One: you preserve distances in spaces that are not L2. But there are other properties of interest as well. We know that when you process data with norms that are lower than L2, the processing is a lot more robust; that addresses one of the points raised earlier: if you have outliers in your observations that are not Gaussian, then your reconstruction is going to be much more robust than what you would get with Gaussian projections. Using Lp, you also preserve more of the sparsity in the reconstruction. And, as I said, the distance preservation happens in a different space. |
0:06:48 | So, when we talk about non-sub-Gaussian projections, the key is this: when you generate the random matrix, rather than the entries obeying a Gaussian distribution, they are going to follow an alpha-stable distribution, which has heavier tails than the Gaussian. For instance, as you change the parameter: alpha equal to two is this black curve, which is the Gaussian; alpha equal to 1.5 gives the blue curve; and so on. As you lower the alpha you get heavier and heavier tails in the entries used to generate the projection matrix. |
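(An illustrative sketch, not from the talk, of how heavy the tails get as alpha decreases; it assumes SciPy's `levy_stable` distribution with its default parameterization, and the threshold 5.0 is an arbitrary choice.)

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)

# Symmetric alpha-stable (beta = 0) entries for candidate projection matrices.
# alpha = 2 recovers a Gaussian; smaller alpha gives heavier tails.
for alpha in (2.0, 1.5, 1.2, 1.0):
    entries = levy_stable.rvs(alpha, 0.0, size=100_000, random_state=rng)
    tail = np.mean(np.abs(entries) > 5.0)  # empirical tail probability P(|X| > 5)
    print(f"alpha={alpha:3.1f}  P(|X| > 5) ~ {tail:.5f}")
```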
0:07:27 | The characteristic function of alpha-stable distributions has this shape: for the symmetric case, phi(t) = exp(-gamma |t|^alpha). It is a class of distributions whose characteristic function is very compact and very nice, but whose densities are not available in closed form, which makes it a very interesting class. If you put alpha equal to two, that is the characteristic function of a Gaussian. In general, the characteristic function is the easiest way to characterize the alpha-stable distributions. |
0:07:56 | What is also very interesting is that stable distributions have a stability property. Much like the Gaussian has the central limit theorem, stable distributions have a generalized central limit theorem: an alpha-stable law, if you will. If you let these X_i be alpha-stable with some dispersion, and you take a linear combination of a bunch of them, the output will also be alpha-stable with the same parameter alpha, and its dispersion acts like an Lp-type metric. For instance, if all of the X_i have dispersion one and you combine all N of them, the output Y, the weighted sum, will have dispersion given by the L-alpha norm of the weights. So the dispersion parameter captures the L-alpha norm of the data. |
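(A numerical check of this stability property, my own sketch rather than the paper's: a weighted sum of i.i.d. symmetric alpha-stable variables should again be alpha-stable, with scale given by the L-alpha norm of the weights, which we can verify against the characteristic function exp(-(||a||_alpha |t|)^alpha). The weights and sample sizes are arbitrary.)

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(1)
alpha = 1.5
a = rng.standard_normal(50)  # arbitrary mixing weights
X = levy_stable.rvs(alpha, 0.0, size=(100_000, 50), random_state=rng)
Y = X @ a                    # linear combination of SaS variables

scale = np.sum(np.abs(a) ** alpha) ** (1 / alpha)  # predicted scale = ||a||_alpha
for t in (0.1, 0.5, 1.0):
    ecf = np.mean(np.exp(1j * t * Y)).real          # empirical characteristic function
    predicted = np.exp(-(scale * abs(t)) ** alpha)  # SaS characteristic function
    print(f"t={t}: empirical {ecf:.4f}, predicted {predicted:.4f}")
```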
0:09:12 | As an example: if you have Cauchy data, if you do Cauchy random projections, the output will be Cauchy with dispersion given by the L1 norm. The Cauchy is the special case alpha equal to one of the stable distributions. |
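(A quick check of the Cauchy special case, again an illustration rather than the paper's experiment: every Cauchy projection of a fixed x is Cauchy with scale ||x||_1, and since the median of |Cauchy(gamma)| equals gamma, the sample median of |y| recovers the L1 norm.)

```python
import numpy as np

rng = np.random.default_rng(2)
n, K = 1000, 5000
x = np.zeros(n)
x[rng.choice(n, 10, replace=False)] = rng.standard_normal(10)  # sparse test vector

R = rng.standard_cauchy((K, n))  # Cauchy (alpha = 1) projection matrix
y = R @ x                        # each y_k is Cauchy with scale ||x||_1

print("||x||_1      =", np.sum(np.abs(x)))
print("median |y_k| =", np.median(np.abs(y)))  # consistent estimate of the scale
```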
0:09:28 | Now, when you work with alpha-stable distributions, since they are heavy-tailed, you cannot use second-order moments or second-order statistics: the tails are so heavy that the means and variances are not well defined. What you have to use instead are fractional lower-order moments. |
0:09:51 | If X has a stable distribution, then the expected value of |X|^p, when you compute these fractional moments for p less than alpha, is related to the dispersion raised to p over alpha. Therefore, if you take the Lp norm, to the p-th power, of a sum of a lot of these terms, its expected value is another constant, depending on p and alpha, times the dispersion, i.e. the L-alpha norm of the variables. This is what we use for the analysis needed for the RIP result we are going to go through. |
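(A sketch verifying the fractional-moment relation empirically, under SciPy's scale parameterization where the dispersion is scale^alpha: for p < alpha, E|X|^p is proportional to scale^p, so doubling the scale should multiply the moment by 2^p. The choices alpha = 1.5 and p = 0.5 are arbitrary.)

```python
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(3)
alpha, p = 1.5, 0.5  # fractional order p must satisfy p < alpha

m = {}
for sigma in (1.0, 2.0):
    X = levy_stable.rvs(alpha, 0.0, scale=sigma, size=400_000, random_state=rng)
    m[sigma] = np.mean(np.abs(X) ** p)  # fractional lower-order moment E|X|^p

print("moment ratio:", m[2.0] / m[1.0])  # should be close to 2**p
print("2**p        :", 2 ** p)
```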
0:10:32 | So here is the RIP we are after. We have a random matrix, but unlike the usual way you generate compressive random matrices with Gaussian entries, we generate matrices whose entries are alpha-stable, with alpha between one and two. We will show that if K, the number of rows of the matrix, is larger than a dimension of order S log(N/S), very similar to the traditional RIP, where S is the sparsity, then this distance characterization will be preserved with a given probability. |
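(An empirical look at this distance preservation, an illustrative sketch and not the paper's experiment: with Cauchy (alpha = 1) projections, the median of |R(x1 - x2)| tracks the L1 distance ||x1 - x2||_1 across random sparse pairs, and its spread shrinks as K grows. Dimensions and sparsity here are arbitrary choices.)

```python
import numpy as np

rng = np.random.default_rng(4)
n, S = 500, 8

def sparse_vec():
    v = np.zeros(n)
    v[rng.choice(n, S, replace=False)] = rng.standard_normal(S)
    return v

for K in (50, 200, 1000):
    R = rng.standard_cauchy((K, n))  # one Cauchy projection matrix per K
    ratios = []
    for _ in range(200):
        d = sparse_vec() - sparse_vec()  # difference of two S-sparse vectors
        # Robust scale estimate of the projected difference vs. true L1 distance.
        ratios.append(np.median(np.abs(R @ d)) / np.sum(np.abs(d)))
    ratios = np.array(ratios)
    print(f"K={K:4d}  ratio mean {ratios.mean():.3f}  std {ratios.std():.3f}")
```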
0:11:18 | To prove this we use steps similar to those in the traditional RIP construction, except that we use these different tools, the fractional moments. The procedure is the usual probabilistic approach: show that for a fixed S-sparse x the RIP bound holds; then show that the RIP is achieved for any S-sparse x in R^N; and then for any submatrix of R. It is very similar to the traditional RIP derivations, so we will just sketch how we do this. |
0:12:06 | The first lemma tells us that, with S being the sparsity and B a K-by-S submatrix of the projection matrix, this bound will hold with a given probability; these are constants that we will derive. The probability is related to the fractional moments we discussed a minute ago. |
0:12:42 | If R is the projection matrix, each of the entries of y = Rx is a linear combination of the entries of x with alpha-stable weights. Therefore each entry of the projection is alpha-stable with zero mean, and its dispersion involves the L-alpha norm of x. The Lp norm of y to the p is just the sum of the |y_i|^p, and you can then take the expected value of the projection, which again is a constant, depending on p and alpha, times the L-alpha norm of the vector. |
0:13:46 | So we have the expected value, and then we have the variance, which you can bound similarly. With the mean and the variance you try to get this bound, in order to get the probability that the distances will be preserved. |
0:14:05 | Then, if you approximate the distribution of this Lp norm by an inverse Gaussian, you obtain a bound that lets you relate how likely these distances are to fall within a given epsilon-ball, if you will, as a function of the parameters of the distribution, the mean and sigma we just reviewed on the previous slide. A Chernoff bound then provides the probability, which is a function of K, telling you, for a given probability of being within the ball, how many projections you need: the number of compressive sensing measurements. |
0:14:55 | We then generalize that, not just for a given x but for any arbitrary x, so that the Lp norm is preserved; that just changes the probability, and you have to be a bit more careful with the bounds on the distances. Then, for any submatrix, you just change the constants that enter the proof, and finally you manipulate the constants to put everything into this form, which is what we do. |
0:15:28 | At the same time, this gives the minimum number of measurements needed to attain the RIP, which again is a function of S, log N, and alpha. |
0:15:40 | Here is an example of what happens when you do these projections. You have the dimensionality reduction, you have matrices that satisfy this distance preservation, and then you want to do the reconstruction. But in the reconstruction you are projecting with these non-sub-Gaussian entries, and therefore you cannot use traditional compressive sensing algorithms, like L2-L1 methods or other methods that rely on L2 distances. Even the greedy algorithms would fail, because of the impulsive nature of the measurements. |
0:16:27 | In this example we are getting compressive measurements y, where R contains the alpha-stable projections with alpha equal to 1.2, and the noise is also alpha-stable with alpha 1.2; this is its density function. The number of measurements K is four hundred, and S is the sparsity. The reconstruction algorithm is not developed in this paper but in a different paper; it does not use an L2 data-fitting term, but rather Lorentzian-based metrics to do the data fit. |
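(The actual reconstruction algorithm is from a separate paper; the following is only a minimal proximal-gradient sketch in that spirit, assuming a Lorentzian data-fit term sum_i log(1 + r_i^2 / gamma^2) plus an L1 penalty. The function name, step size, gamma, and lambda are all ad hoc choices, not the authors' settings.)

```python
import numpy as np

def lorentzian_cs(R, y, gamma=1.0, lam=0.1, step=1e-3, iters=2000):
    """Sparse recovery with a Lorentzian data-fit term (illustrative sketch).

    Approximately minimizes  sum(log(1 + ((y - R x) / gamma)**2)) + lam * ||x||_1
    by proximal gradient descent (gradient step + soft threshold).
    """
    x = np.zeros(R.shape[1])
    for _ in range(iters):
        r = y - R @ x
        # Gradient of the Lorentzian term: bounded influence 2r/(gamma^2 + r^2),
        # so large (impulsive) residuals cannot dominate the update.
        grad = -R.T @ (2 * r / (gamma**2 + r**2))
        x = x - step * grad
        x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0.0)  # soft threshold
    return x

# Hypothetical usage with alpha-stable measurements:
# x_hat = lorentzian_cs(R, y)
```

The bounded influence function is what makes this fit robust to the impulsive measurements, unlike an L2 residual, whose gradient grows without bound on outliers.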
0:17:19 | You can then see that the original data, the blue circles, is well recovered by the reconstruction. If you use an L1-based recovery instead, you will recover the sparsity pattern, but you will have a lot of errors, the stars. |
0:17:38 | You can also test the number of measurements you require: again, for this method we have a theoretical K that you need so that the reconstruction, the inversion, succeeds within the bounds of the RIP we derived. |
0:18:02 | So, in summary: what if you don't use sub-Gaussian measurements? We showed that you can still preserve distances, not in L2 but in other norms; at the same time, the measurements are a lot more robust against noise, particularly impulsive noise; and you can use the sparsity-inducing norms that you want in the reconstruction. Questions? |
0:18:53 | [Inaudible audience question.] |
0:19:06 | You are using the Chernoff bounds on the fractional moments, so you are working close to the Lp. You could also use approximations that are not Gaussian, putting in an inverse Gaussian or other bounds; that is true, because you are applying them to the Lp-type variables. |
0:19:42 | Yes. |
0:20:13 | What happens is this: you generate your matrices using a stable distribution, so the entries are not Gaussian; you generate the projections and you get a vector of measurements. But these measurements are linear combinations of alpha-stable variables, so they are not sub-Gaussian, let alone Gaussian. The inversion algorithms therefore cannot use traditional L2 data-fitting terms with a regularizer; you have to use norms that are robust, perhaps like the ones mentioned earlier, the L1 or something more robust still. This example illustrates that data-fitting norms that are more robust, like the Lorentzian, work. |
0:21:18 | Yes, that would also work; you would get a good result if you did that. The algorithm we used is of Lorentzian type, but we could derive an algorithm that uses an Lp norm with p less than one. |
0:21:43 | [Inaudible audience question.] Well, the L1 is the approximation for the L0, right. |
0:21:55 | We haven't really run that experiment. There are two terms, right: there is the data-fitting term, which is the one where you would use, say, 0.6, and there is the sparsity term, which could be the L0 or even the L1. If you combined both of them, the data-fitting norm at 0.6 and the L0, it would do very well; an Lp with 0.6 rather than one might also be fine. But the algorithm we tried was one that was somewhat tractable, so that we could get down to what was presented here. |
0:22:33 | What this shows you is that if you use L2 data fitting, you are going to get a lot of spurious results, because of the structure of the projections, which are not sub-Gaussian. |
0:22:53 | Yes, that's a good question. We explored the fundamental characteristics of such matrices; how to generate them and how to use them in practice is something still to be explored. Certainly you could try things: Gaussian mixtures, for example, are easy and a very good approximation of non-sub-Gaussian matrices. But in general, doing this in practice is an open question. |
0:23:25 | Okay, thank you. |