0:00:14 | thanks first to Emil and Joe for organising this very interesting session |
---|
0:00:23 | so I will be talking to you about reconstruction algorithms under sparsity constraints, and that relates to the lot of activity going on in signal processing around compressed sensing |
---|
0:00:34 | and I will give you some illustrations in MRI and in bioluminescence tomography |
---|
0:00:42 | and in doing so I will also give you kind of the state of the art for iterative reconstruction algorithms that use regularization constraints by way of sparsity |
---|
0:01:01 | so first I will just discuss a little the variational approach to image reconstruction, where we need to minimize some functional into which we introduce sparsity constraints |
---|
0:01:14 | then I will review some of the standard algorithms, in particular iterative soft-thresholding (ISTA), and the fast version of that (FISTA), a kind of multi-step algorithm which gives you sort of the state of the art coming from convex analysis |
---|
0:01:32 | and then in this talk I want to discuss our variation of ISTA, which is sort of tuned to the problem and uses step adaptation to gain some speed, because the goal here really is to get to the speed of the conventional linear techniques that have been in use for many years |
---|
0:01:52 | and then the application to parallel MRI and bioluminescence tomography |
---|
0:01:58 | so here is just the setup |
---|
0:02:00 | we have some object x that we would like to image, for example a proton density or some absorption map |
---|
0:02:10 | and then we have the scanner here; we will just assume we have a linear measurement system, which is the usual assumption in MRI and X-ray, so it is an integral operator, but since here we have discretized the problem we can represent it by a big fat matrix H that represents all our physics knowledge of the problem |
---|
0:02:31 | and then at the output here we have some modeling imperfections and noise, and all that I represent by b |
---|
0:02:52 | okay, so as I was saying, we have noise, and so our imaging model here looks very easy: it is just this imaging matrix applied to the object that we would like to recover, plus some noise |
---|
0:03:04 | and then in the variational approach you just set up this kind of criterion here, where you have a data-consistency requirement: you are reconstructing x, you are simulating your measurements, and you would like this difference with respect to the true measurements to be very small |
---|
0:03:23 | and then here you should add some regularization constraint, and this is the art of reconstruction: to choose the right regularization |
---|
0:03:34 | and classically here, for example, if we put a quadratic term, let us say with some suitable operator, that is Tikhonov regularization |
---|
0:03:45 | but now, with compressed sensing, all this was actually started by the observation that many natural images can be represented with very few large coefficients in some basis |
---|
0:03:58 | in particular wavelet bases, and in fact this happens very significantly with medical images as well; so here is some MRI image, and I am just showing different transformations, and you can get a pretty good signal restitution by keeping only a very small percentage of large coefficients, and so this is really what is used to sort of justify the use of wavelets |
---|
0:04:29 | so here the idea is to use a regularization where you impose an l1 penalty on the wavelet coefficients; here W would represent, let us say, my wavelet transform; the l1 norm promotes sparsity, and what is good is that it is convex, so we have some guarantee of existence of the solution, and it will also favour sparse solutions |
---|
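The criterion just described (a quadratic data term plus an l1 penalty on the transform coefficients) and the shrinkage operator it leads to can be sketched as follows; this is a minimal numpy illustration of the talk's formulation, not the speaker's code, and the names `soft_threshold` and `cost` are mine:

```python
import numpy as np

def soft_threshold(x, t):
    """Proximal operator of t * ||.||_1: shrinks each coefficient toward zero by t."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def cost(x, H, y, lam, W):
    """l2 data-consistency term plus l1 penalty on the transform coefficients Wx."""
    residual = y - H @ x
    return 0.5 * np.sum(residual ** 2) + lam * np.sum(np.abs(W @ x))
```

When H is unitary (the case mentioned below), minimizing this cost reduces to one soft-thresholding in the transform domain.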
0:04:55 | okay, so here is sort of the grandfather of those sparse reconstruction algorithms |
---|
0:05:01 | and what you want to do is minimize this: you have this l2 term and this sparsity constraint; here I am just assuming x is the representation in the wavelet basis, and I am using an augmented system matrix that sort of contains the wavelet transform followed by the imaging device |
---|
0:05:24 | then you have two very special cases, very easy and completely classical; first, if you put lambda to zero, it is just a least-squares problem, so the exact solution is one matrix calculation; unfortunately this inverse may be ill-conditioned, and you would probably not even want to compute it |
---|
0:05:44 | but then there is sort of the grandfather of our algorithms here, which is a gradient descent on this data term, and this is sort of the backbone of many iterative algorithms; this works, but you have to stop it after a certain number of iterations |
---|
0:06:06 | but now the other interesting case is when H is unitary, like a wavelet transform; in that case you can in fact solve this problem exactly by just going to the wavelet domain, doing a soft-thresholding, and going back |
---|
0:06:21 | now, the iterative soft-thresholding algorithm was probably discovered by many people, but Figueiredo and Nowak were among the very first, using an EM formulation |
---|
0:06:35 | and so it is kind of startling: if you just chain those two very simple operations, remarkably, this solves what looks like a very difficult problem; and then Daubechies, Defrise and De Mol in 2004 were able to actually prove convergence, and so that was a little revolution in the field |
---|
0:06:57 | although, I must say, the algorithm is maybe not so usable in that form |
---|
0:07:04 | now if you just look at the structure, it is actually a fixed-point algorithm: what you have is a rule which is a gradient-descent step followed by a pointwise soft-thresholding, and that gives you the new update; and then there is this nice mathematics that guarantees convergence; unfortunately it is very slow: it has been shown that the cost decreases like one over n, where n is the number of iterations |
---|
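The fixed-point iteration just described can be sketched in a few lines of numpy; for simplicity this sketch imposes sparsity directly on x (i.e. W is taken as the identity, which is my simplification), and uses the classical step 1/L with L the Lipschitz constant of the gradient:

```python
import numpy as np

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def ista(H, y, lam, n_iter=200):
    """ISTA: gradient step on 0.5*||y - Hx||^2, then pointwise soft-thresholding."""
    L = np.linalg.norm(H, 2) ** 2          # Lipschitz constant: squared spectral norm of H
    x = np.zeros(H.shape[1])
    for _ in range(n_iter):
        grad = H.T @ (H @ x - y)           # gradient of the quadratic data term
        x = soft(x - grad / L, lam / L)    # shrinkage (proximal) step
    return x
```

For a unitary H this converges in one step to the wavelet-domain soft-thresholding solution mentioned earlier.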
0:07:30 | but then, building on many, many years of convex optimization, there was this proposition in 2009 of FISTA, which is a kind of over-relaxation variation on this scheme |
---|
0:07:45 | essentially it is an update that builds not only upon the previous estimate but takes the two previous iterates, x_n and x_(n-1), constructs a kind of extrapolation of what the correct update should be, and then applies the ISTA step to that |
---|
0:08:12 | and remarkably this FISTA algorithm gives you a one-over-n-squared rate, so it is one order faster, and that gives you a fast enough algorithm to apply this kind of technique |
---|
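The two-point extrapolation just described can be sketched as follows; this is the standard FISTA recursion in a minimal numpy form (again with W taken as the identity, my simplification), not the speaker's code:

```python
import numpy as np

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def fista(H, y, lam, n_iter=200):
    """FISTA: extrapolate from the two previous iterates, then apply the ISTA step."""
    L = np.linalg.norm(H, 2) ** 2
    x_prev = x = np.zeros(H.shape[1])
    t = 1.0
    for _ in range(n_iter):
        t_next = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        z = x + ((t - 1.0) / t_next) * (x - x_prev)   # extrapolation (momentum) point
        x_prev = x
        x = soft(z - H.T @ (H @ z - y) / L, lam / L)  # ISTA step at the extrapolated point
        t = t_next
    return x
```

The only extra cost per iteration over ISTA is the vector extrapolation, which is why the O(1/n^2) rate comes essentially for free.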
0:08:25 | now, what we have been proposing here is some variation, and we call it FWISTA: fast weighted iterative soft-thresholding algorithm; and the idea is pretty simple: instead of a step size, a scalar damping factor for the gradient, we use a diagonal matrix |
---|
0:08:47 | and actually we can also add some flexibility on the side of the regularization, with a slightly more general regularization matrix |
---|
0:08:57 | usually ISTA is derived using a majorize-minimize argument: you construct a quadratic bound on the cost functional, and then you are always trying to go to the bottom of this quadratic bound; now you have more flexibility here: by introducing this matrix you can in fact construct a better quadratic bound that is tailored to your problem, and so hopefully this should allow us to go faster |
---|
0:09:29 | and then actually we can do some math here, and we obtain this kind of inequality on the cost; now if you look at it superficially, for a mathematician it does not look so much better, because we still have one over n or one over n squared; but the important thing is that instead of a uniform norm we have a weighted one, and this norm enters the constant; and it means that clever engineering can really bring those constants down quite significantly, and then the algorithm can be much faster |
---|
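The weighted variant can be sketched by replacing the scalar 1/L with a componentwise step 1/d_i. For the quadratic bound of the majorize-minimize derivation to remain valid, the diagonal matrix must dominate H^T H; the choice below (d_i = the i-th absolute row sum of H^T H, which makes D - H^T H diagonally dominant and hence positive semidefinite) is one classical safe option, an assumption of this sketch and not necessarily the weighting the speaker's group uses:

```python
import numpy as np

def soft(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def weighted_ista(H, y, lam, n_iter=200):
    """ISTA with a diagonal step matrix: each coordinate gets its own step 1/d_i."""
    A = H.T @ H
    d = np.sum(np.abs(A), axis=1)            # diagonal majorizer of H^T H (safe choice)
    x = np.zeros(H.shape[1])
    for _ in range(n_iter):
        grad = A @ x - H.T @ y
        x = soft(x - grad / d, lam / d)      # componentwise step and threshold
    return x
```

The same weighting drops into the FISTA recursion unchanged, which is the FWISTA combination described in the talk.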
0:10:04 | so for the applications here I just want to show you two examples; by the way, I forgot to acknowledge my collaborators, who did all the work; the first one is with Matthieu Guerquin-Kern, who is working on parallel MRI, actually in collaboration with the guy who invented parallel MRI, Klaas Pruessmann from ETH Zurich |
---|
0:10:28 | so here the idea is just to have a bunch of coils; and for those who need a little basics about MRI: in MRI we essentially measure samples of the Fourier transform, but the coils also have this sensitivity function, so it is your object x multiplied by a sensitivity function that is specific to your coil, and then you take the Fourier transform of all that; in parallel MRI you just put several coils, so you can do the measurements in parallel, and hopefully this allows you to reconstruct images using fewer samples |
---|
0:11:06 | and so now, how can we exploit the structure of the problem? we really need to use those degrees of freedom that are left, which is this diagonal weighting matrix |
---|
0:11:15 | so if you look at parallel MRI, what we have for each coil is this system matrix, which is just a diagonal sensitivity matrix followed by a Fourier transform; and now the problem is that this sensitivity can vary a lot across your coils |
---|
0:11:33 | and it is pretty inhomogeneous, and that makes the problem not so well conditioned; because for many years MRI actually meant an inverse Fourier transform, a unitary Fourier transform, the most favorable imaging inverse problem; but with the coils it becomes much less well conditioned |
---|
0:11:56 | so now the idea of sensitivity-based preconditioning is to try to invert with respect to the sensitivity, to try to get a better-conditioned system matrix in these preconditioned variables |
---|
0:12:17 | and now if you think of it in the wavelet domain, you can do almost the same thing: you just have to transpose your sensitivity into the wavelet domain, and the intuition is just that wavelets are more or less local, so you would just average the sensitivity over the support of the wavelets, and then get this kind of preconditioning matrix here, and that gives you a much better, preconditioned version of the gradient |
---|
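The averaging-over-wavelet-supports idea can be illustrated in a toy 1-D, one-level Haar setting, where each coefficient's support is just a pair of adjacent samples; this is my simplified reading of the construction, not the authors' exact preconditioner:

```python
import numpy as np

def sensitivity_weights(sens_maps):
    """Per-coefficient weights for a one-level 1-D Haar transform.

    sens_maps: (n_coils, n_samples) array of coil sensitivities.
    Each Haar coefficient covers two adjacent samples, so its weight is the
    mean combined (sum-of-squares over coils) sensitivity energy on that support.
    """
    energy = np.sum(np.abs(sens_maps) ** 2, axis=0)   # combined coil energy per sample
    return 0.5 * (energy[0::2] + energy[1::2])        # average over each Haar support
```

The reciprocals of these weights would then populate the diagonal step matrix of the weighted iteration, giving larger steps where the coils see the object weakly.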
0:12:47 | so here are some examples; I could also have shown you results with real data, but here we are just using the Shepp-Logan phantom, just to compare the performance of the algorithms; we took a pretty realistic setup here, with four receiving coils, some noise, and a certain number of radial trajectories, and we are using a reconstruction in the Haar domain, with a two-level Haar transform; so this would be the traditional ISTA, the standard iterative soft-thresholding |
---|
0:13:24 | and so this is the evolution of the cost function, and this is the error with respect to the solution, actually the exact solution of the optimization problem; and you see this kind of typical evolution of the classical algorithm: it is not very fast |
---|
0:13:45 | and then if you do FISTA it is much faster: this is the one-over-n-squared type of performance |
---|
0:13:52 | but now if you do this preconditioning, as we propose in this work, we get more or less the same slope, but we get there earlier; and actually in practice this makes a huge difference, because we may be happy with this kind of performance here, and it means we can get there with a fraction of the number of iterations |
---|
0:14:17 | and the same here; maybe just to show you some examples of reconstruction, so here I am just showing the evolution of the solution with the number of iterations: ISTA, the iterative soft-thresholding, gets there very slowly; FISTA gets there very much faster; and if we have those appropriate weights, we get there even faster |
---|
0:14:38 | here we chose the setting such that we get this quality with this number of iterations; and here is what ISTA gives, and actually ours is better quality than that, so we can get there four or five times faster in this particular case; so why not go four or five times faster? anyway, all those methods are optimizing the same cost function |
---|
0:15:10 | okay, so the second application was bioluminescence tomography, and this is a much harder inverse problem: it is very, very badly conditioned; here you have some luminescent markers within the specimen, and then you have light detectors around the object of interest |
---|
0:15:39 | and here you have an optical setup, but essentially the light diffuses; in fact you have the diffusion equation, so you can solve this diffusion equation using the Green function, and this will give you a system matrix, so once again a linear model, but extremely badly conditioned because of this Green function; the Green functions actually look like that: they decay like one over the norm of r to the power of two, and the resulting problem happens to be very ill-conditioned |
---|
0:16:15 | so the idea now, due to my second co-author, was to try to use this flexibility of the preconditioning to get a much better conditioning by just multiplying by this diagonal matrix; and the insight is that if you weight this system matrix to make its energy roughly constant, it will produce a much better conditioning |
---|
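One plausible reading of this "constant energy" weighting is a column-wise normalization of the system matrix, which maps naturally onto the diagonal step matrix of the weighted iteration; the sketch below illustrates that reading and is my assumption, not necessarily the authors' exact construction:

```python
import numpy as np

def column_energy_preconditioner(H, eps=1e-12):
    """Diagonal preconditioner equalizing the energy of the columns of H.

    The Green-function matrix has column norms that vary over orders of
    magnitude (the ~1/||r||^2 decay); dividing each variable by its column
    l2 norm flattens this disparity.
    """
    d = np.sqrt(np.sum(np.abs(H) ** 2, axis=0)) + eps  # per-column l2 norm
    return H / d, d                                    # H @ diag(1/d), plus the scales

# toy example with a strong scale disparity between the two columns
H = np.array([[1.0, 100.0],
              [2.0, 100.0]])
Hp, d = column_energy_preconditioner(H)
```

After normalization each column has unit energy, and the condition number of the toy matrix drops substantially, which is the effect being exploited here.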
0:16:41 | and so this results in this type of setup for the reconstruction algorithm; and in this case we also have some experiments using synthetic phantoms, just to demonstrate that it really works, with just two luminescent point sources; so here we are imposing sparsity in the spatial domain, because we have very small sources |
---|
0:17:07 | and here is a comparison of the three algorithms, by the way with a physically realistic modeling of the optical parameters: the number of sources, detectors, et cetera, and also some noise; and so here ISTA is of course slow, FISTA is much faster, and our algorithm has the same rate as FISTA but gets there much faster, and in this case we have a tenfold speed-up, essentially using the same algorithm, just with this clever preconditioning |
---|
0:17:39 | here are some results as well: here is the source, and this is the evolution of the standard algorithm; since the problem is very badly conditioned it is very, very slow, and it would take a long time to get there; this is FISTA, which goes faster, and this is the weighted FISTA; and essentially here we also set it up so that you have a correspondence, so the benefit is roughly ten times fewer iterations for a comparable image |
---|
0:18:10 | so this makes it much more practical, and that really gets me to the end here; so the conclusions: in principle we can hope to get better quality by taking advantage of all those methods based on sparsity, and it has been demonstrated in the field that you get better reconstructions; now we can also match the speed |
---|
0:18:31 | and this is achieved through two kinds of acceleration strategies: the multi-step update and the adaptive weights; and so at present we are within a small factor, or even a little less, of the best linear techniques, which are conjugate-gradient based, and so now it becomes very applicable |
---|
0:18:52 | now, the know-how here comes from signal processing: it is not a canned algorithm, and it requires your expertise to find the right adaptive step and conditioning so that the algorithm will work fast |
---|
0:19:03 | and finally, generality |
---|
0:19:06 | because it is transposable to many modalities: as Emil was saying, X-ray tomography, MRI; in the lab we are also working on phase-contrast tomography, and also microscopy and other modalities |
---|
0:19:20 | so that is it, and thank you for your attention |
---|
0:19:28 | we have time for maybe one quick question |
---|
0:19:36 | just one: you compared how wavelet sparsity performs against something like total variation; can your technique be used with a total-variation constraint? |
---|
0:19:45 | yeah, actually that is a very good question; we actually have a paper on that, which hopefully should come out in a journal; there is a trick with wavelets, because wavelets are not completely shift-invariant; in fact, if you just use a conventional wavelet transform, total variation works a little better |
---|
0:20:09 | but now, if you allow for shifting the wavelets, something that is called cycle spinning, we show that we can get as good as, or even a little better than, total variation; and it is much faster to iterate with the wavelet formulation than with total variation and all its inner iterations |
---|
0:20:29 | but of course people are also working on total variation, and there are also fast versions of total variation; I just think it is easier to get fast algorithms with this sort of synthesis formulation rather than the analysis formulation |
---|
0:20:47 | so the good news is that with wavelets we can get there, and we have documented it with MRI |
---|
0:20:57 | okay, so once again, thanks |
---|