0:00:13 | Good afternoon. My name is Daniel Weller, and today I'll be talking to you about combining compressed sensing with parallel MRI, and we're going to compare uniform and random Cartesian undersampling patterns in k-space.
0:00:26 | So first let me introduce magnetic resonance imaging. Magnetic resonance imaging is a very versatile and growing imaging modality, used in medicine, in spectroscopy, and elsewhere. In particular, we can get images of all kinds of organs, of full bodies, et cetera. For example, we have a three-D volume of a brain, a T1-weighted image, here.
0:00:51 | However, despite all the advancements in MRI acquisition over the last thirty-some-odd years, scan time remains an issue for many types of acquisitions. For instance, this image took between eight and ten minutes to acquire on a three-tesla magnet.
0:01:05 | Now, if we can get a faster acquisition, we can lower the cost of imaging for our subjects/patients, we can increase the comfort of those subjects because they're not in the scanner as long, and we can possibly also improve the tradeoffs and increase the quality of our image reconstructions.
0:01:22 | To that end, we propose a method called SpRING, which combines the parallel imaging method known as GRAPPA with compressed sensing/sparsity, and we use that to recover images from accelerated, that is, undersampled, data.
0:01:37 | Now, we can investigate several different undersampling strategies. In compressed sensing, the conventional wisdom is that random undersampling is required; however, we show that, with the addition of parallel imaging, in certain situations uniform undersampling is sufficient.
0:01:56 | So let's talk about k-space quickly. K-space basically refers to the Fourier transform domain in which the signals are actually acquired. We can normally take an inverse Fourier transform, as shown here, to take the k-space data and recover the images, or, in the three-D case, a volume.
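As a quick illustration of that relationship, here is a minimal numpy sketch (a hypothetical 2-D example, not the speaker's code): fully sampled Cartesian k-space is inverted back to an image with an inverse FFT.

```python
import numpy as np

# Stand-in for the true object; real acquisitions are 3-D, multi-coil volumes.
image = np.random.rand(256, 256)

kspace = np.fft.fftshift(np.fft.fft2(image))    # what the scanner measures
recon = np.fft.ifft2(np.fft.ifftshift(kspace))  # inverse FFT recovers the image

assert np.allclose(recon.real, image)           # exact when fully sampled
```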
0:02:11 | Now, we're going to sample k-space using a raster-scanning approach, so we're going to get a Cartesian volume here, where we essentially have a readout direction as the raster-scanning direction, sometimes called the frequency-encode direction. We're going to assume that this direction is perpendicular to the axial slice plane, along the so-called transverse axis. Within the axial slice plane, we're going to acquire, in two dimensions, a set of lines, so we end up with a grid of points in the plane.
0:02:43 | We're going to undersample these points, because it takes too much time to acquire the whole volume. For instance, if we undersample by two in both directions, we can reduce the total scan time by a factor of four. But we would get an aliased image if we just took the inverse Fourier transform, because the undersampling effectively reduces our field of view, so we get overlap in the reconstruction.
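To make that folding concrete, here is a minimal sketch (a toy single-coil example with made-up data) of uniform 2x undersampling along one axis:

```python
import numpy as np

def undersample_uniform(kspace, r):
    """Keep every r-th phase-encode line and zero the rest (acceleration R = r)."""
    mask = np.zeros(kspace.shape[0], dtype=bool)
    mask[::r] = True
    return kspace * mask[:, None]

image = np.zeros((256, 256))
image[96:160, 96:160] = 1.0  # toy object
aliased = np.fft.ifft2(undersample_uniform(np.fft.fft2(image), 2))
# |aliased| shows the object overlapped with a copy shifted by half the FOV:
# uniform undersampling shrinks the effective field of view and folds the image.
```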
0:03:10 | Now, to deal with this, there are two general classes of methods that have been proposed over the years; there are probably others too, but I'm going to focus on these. First of all, there's parallel imaging. In parallel imaging, essentially, we have a multiple-receiver setup, such as the thirty-two-channel coil shown here. The thirty-two-channel coil basically gets images from a whole array of coils, and each coil gets a slightly differently weighted image due to its spatial position in the array.
0:03:38 | GRAPPA is a method that basically takes all this undersampled data in k-space and uses a small block of additional calibration lines, called ACS lines, to calibrate a kernel, which is then used to fill in all the missing k-space, and that undoes the aliasing.
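A heavily simplified sketch of that calibration idea (1-D undersampling at R = 2, a single kernel, illustrative shapes and names; real GRAPPA also uses a neighborhood along the readout direction):

```python
import numpy as np

def calibrate_grappa_kernel(acs, r=2):
    """Fit weights mapping acquired neighbor lines (all coils) to a missing line.

    acs: (ncoil, nacs, nx) fully sampled calibration (ACS) region.
    Returns w: (ncoil, 2*ncoil) least-squares kernel for R = 2.
    """
    ncoil, nacs, nx = acs.shape
    sources, targets = [], []
    for ky in range(0, nacs - r, r):
        # Pretend lines ky and ky+r are acquired; the line between is "missing".
        src = np.concatenate([acs[:, ky, :], acs[:, ky + r, :]])  # (2*ncoil, nx)
        tgt = acs[:, ky + 1, :]                                   # (ncoil, nx)
        sources.append(src)
        targets.append(tgt)
    S = np.hstack(sources)  # (2*ncoil, npoints)
    T = np.hstack(targets)  # (ncoil, npoints)
    w, *_ = np.linalg.lstsq(S.T, T.T, rcond=None)
    return w.T

# Synthesis: a missing line is estimated as w @ (stacked acquired neighbors).
```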
0:03:56 | Now, compressed sensing works in a slightly different way. Instead of designing a specialized observation model, we're designing a prior for our image: our image is sparse in some domain. For instance, the brain images I showed before we could consider approximately sparse, or compressible, in a domain like the four-level 9/7 DWT.
0:04:17 | Now, the Fourier transform is nice because it provides incoherent sampling; it allows us to undersample in a random fashion. Therefore, using the sparsity, the incoherent/random sampling, and an algorithm like ℓ1-magic or others, adapted to the complex-valued data we have here, we can do a compressed sensing reconstruction.
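As a rough illustration of one such reconstruction, here is a minimal ISTA sketch (soft thresholding with sparsity assumed in the image domain itself, rather than the 9/7 DWT the talk mentions; single coil, made-up names):

```python
import numpy as np

def soft_threshold(x, t):
    """Complex soft thresholding: the proximal operator of t * ||x||_1."""
    mag = np.abs(x)
    return np.where(mag > t, (1 - t / np.maximum(mag, 1e-12)) * x, 0)

def cs_recon(y, mask, lam=0.01, iters=200):
    """ISTA for min_x 0.5 * ||M F x - y||^2 + lam * ||x||_1.

    y: measured k-space with zeros at unsampled points; mask: boolean pattern.
    """
    x = np.zeros_like(y)
    for _ in range(iters):
        resid = mask * np.fft.fft2(x, norm="ortho") - y   # data mismatch
        x = x - np.fft.ifft2(mask * resid, norm="ortho")  # gradient step (L = 1)
        x = soft_threshold(x, lam)                        # enforce sparsity
    return x
```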
0:04:38 | Now, more abstractly, we can think about this problem, at least in a simple presentation, as a multichannel sampling problem where we essentially acquire incomplete data. We have the image that we desire, the MRI image; this image is multiplied in the image domain by a bunch of these coil sensitivities, those weightings I mentioned in the parallel imaging setup, and then it's sampled in the Fourier transform domain. Note that the samples lie at the same Fourier transform coordinates for all the coils.
0:05:12 | Then we assume the data is perturbed by some amount of additive white noise. We assume the noise is white across frequencies, but there is actually some correlation across coils, so we can measure this noise covariance using a simple, fast pre-scan.
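In symbols, the model described is roughly $\mathbf{y}_p = \mathbf{K}\mathcal{F}(C_p\,\mathbf{x}) + \mathbf{n}_p$ for coil $p$, with $\mathbf{K}$ the sampling operator, $\mathcal{F}$ the Fourier transform, and $C_p$ the coil sensitivity (my notation, not the slide's). A sketch of the covariance estimate from a noise-only pre-scan:

```python
import numpy as np

def noise_covariance(noise_samples):
    """Estimate coil noise covariance from a noise-only pre-scan.

    noise_samples: (ncoil, nsamples) complex data acquired with no excitation.
    Returns the (ncoil, ncoil) sample covariance Lambda ~ E[n n^H].
    """
    n = noise_samples - noise_samples.mean(axis=1, keepdims=True)
    return n @ n.conj().T / n.shape[1]
```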
0:05:24 | Now, let's look back at parallel imaging for a second to motivate the SpRING work. Parallel imaging methods such as GRAPPA work great at low undersampling factors; in fact, today, clinically, we can do acceleration factors of between two and four and still get high-quality images in many applications. However, as we further accelerate our scanning, to say nine, sixteen, twenty-five, or thirty-six times undersampling, our GRAPPA results clearly degrade, and we see the noise essentially blow up. This noise amplification is the well-known g-factor phenomenon in parallel imaging.
0:05:58 | Now, there's an additional form of error that I want to mention because it's important to this talk, and that is residual aliasing. You can't really see it in these images, but you can imagine that, due to the limitations of our coils, the unaliasing from the GRAPPA kernels is not going to be exact if our coils don't have enough spatial variability to duplicate the high-frequency complex exponentials we need to synthesize the frequency shifts in k-space.
0:06:23 | Now, we know the noise here is not sparse, and therefore promoting sparsity can denoise our images. Compressed sensing can also undo incoherent aliasing: if we have aliasing, unlike what we saw before, from random undersampling, compressed sensing can undo that as well. So that motivates using SpRING to improve this GRAPPA result using compressed sensing, or sparsity.
0:06:49 | So we have two possible general types of undersampling frameworks, among many others. Uniform undersampling we motivate by the prototypical GRAPPA method, which expects uniform undersampling: GRAPPA can take uniformly undersampled data and unalias it. Basically, if you have capital-P coils, it can, in theory, unalias up to an acceleration factor of R = P; beyond that, however, you get coherent artifacts.
0:07:21 | Now, if we shift to a random undersampling setting, applying GRAPPA isn't as straightforward, and we'll address that shortly, but we can still, in theory, unalias frequency shifts up to capital P. Of course, with random undersampling we may have areas that are very sparsely sampled, and there this may not be sufficient; however, coherent artifacts that cannot be reconstructed exactly typically become a non-sparse, incoherent form of error.
0:07:46 | Now, compressed sensing: if we apply it with uniform undersampling, we're violating one of the basic requirements of compressed sensing, at least in theory, and we get very limited unaliasing for that reason; however, compressed sensing, or rather regularizing with sparsity, is still very capable of denoising images. If we instead combine compressed sensing with random undersampling, we get the resolution of aliasing as predicted by the CS theoretical bounds and so on. So then the question becomes: with SpRING applied with these different sampling patterns, what gives us the best combination of unaliasing and denoising for our application?
0:08:19 | Oh, and just to be complete, I want to mention that in between there are various other kinds of sampling patterns we could explore as well: there's jittered undersampling, Poisson-disc sampling, which is a favorite in the well-known state-of-the-art ℓ1-SPIRiT method, other undersampling densities, combinations of these; the list goes on and on.
0:08:39 | Now, the SpRING method, as a general overview: essentially, it balances fidelity to the GRAPPA solution, the first part of the equation below, against joint sparsity of the solution, the second part of the equation below, while preserving the acquired data. The reason we preserve our acquired data, as opposed to taking, say, a Bayesian view of things, is that we don't want to remove any information or somehow "clean up" our data in a way that's not really realistic. Therefore, we're only going to modify the unacquired k-space.
0:09:10 | So this optimization problem here is over the full set of k-space, but because we preserve the acquired data and only fill in the missing k-space, we can use that to simplify the optimization problem into a constrained one, optimizing only over the null space of the sampling matrix K.
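Based on that description, the objective has roughly the following form (my reconstruction from the talk's wording, not the slide verbatim):

$$\hat{\mathbf{y}} = \arg\min_{\bar{\mathbf{y}}:\,\mathbf{K}\bar{\mathbf{y}}=\mathbf{K}\mathbf{y}} \;\lambda \left\lVert \mathbf{W}\big(\bar{\mathbf{y}} - G(\mathbf{y})\big) \right\rVert_2^2 \;+\; \Phi\!\big(\boldsymbol{\Psi}\,\mathcal{F}^{-1}\bar{\mathbf{y}}\big),$$

where the constraint $\mathbf{K}\bar{\mathbf{y}} = \mathbf{K}\mathbf{y}$ preserves the acquired samples, and the remaining notation is defined next.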
0:09:28 | Just a little bit of other notation here: we have our observed data y; our GRAPPA solution G(y), which can be precomputed ahead of time since it depends only on the data; a low-resolution coil-combination weighting W, which we use in a heuristic way to weight the GRAPPA fidelity; and a tuning parameter λ that we can set based on our confidence in the GRAPPA result. We also have a joint sparsity penalty function Φ, which could be the ℓ1 norm, or could be something else that approximates the ℓ0 norm in our analysis.
0:10:02 | "'cause" we know the elves your penalty T is very hard to uh |
---|
0:10:06 | compute with this |
---|
0:10:07 | where a have my |
---|
0:10:09 | so we can use the convex all one norm which is |
---|
0:10:12 | nice to a C properties or we want to maybe get a bit closer to the L zero penalty we |
---|
0:10:17 | use |
---|
0:10:17 | a home we topic you'd way should with the non-convex convex penalty |
---|
0:10:20 | function |
---|
0:10:21 | such as the cushy penalty function |
---|
0:10:23 | which has this kind of a logarithmic increase as we increase the you which we just say school fish |
---|
0:10:29 | of are sparse transform representation |
---|
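A sketch of that penalty (one common form of the Cauchy penalty; the talk does not give the exact parameterization used):

```python
import numpy as np

def cauchy_penalty(u, sigma=1.0):
    """Non-convex Cauchy penalty: grows like log(|u|) for large |u|,
    approximating the l0 "count" penalty better than |u| does."""
    return np.log1p(np.abs(u) ** 2 / sigma ** 2)

# Homotopy continuation idea: solve a sequence of problems, starting close
# to the easy convex (l1-like) case and gradually sharpening the penalty
# toward l0, warm-starting each solve from the previous solution.
```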
0:10:32 | Now, we enforce joint sparsity over all the coils by applying the penalty function, whatever it is, not to the individual coefficients but to the ℓ2 norm of each coefficient across all the coils. So we actually have this hybrid penalty function, and by applying it we essentially enforce joint sparsity across the coils: the sparsity is considered to come from the object we're scanning, and not from some artifact of the coil sensitivities, say, not having enough signal in an area.
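Concretely, a sketch of that hybrid penalty (illustrative shapes; coefficients indexed coil-first):

```python
import numpy as np

def joint_sparsity_penalty(coeffs, penalty):
    """Apply a scalar penalty to the l2 norm taken across coils.

    coeffs: (ncoil, ncoeff) sparse-transform coefficients of each coil image.
    penalty: scalar function such as np.abs (l1) or the Cauchy penalty above.
    """
    joint_mag = np.linalg.norm(coeffs, axis=0)  # l2 across coils, per coefficient
    return penalty(joint_mag).sum()
```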
0:11:07 | Now, various strategies exist for extending the SpRING method, which was originally proposed for the uniform undersampling case; essentially, we're going to look at extending the GRAPPA method and then using SpRING on top of it. Iterative methods such as iterative GRAPPA or SPIRiT can handle arbitrary sampling, but they are iterative in nature and therefore computationally intensive. Direct methods already exist for radial and spiral trajectories, and we draw motivation from those to extend GRAPPA to arbitrary Cartesian undersampling using a direct method with locally adapted kernels: basically, for each block of k-space we have separate GRAPPA kernels, derived using the ACS data.
0:11:50 | Because this is a direct method, we can still compute G(y), the GRAPPA result, ahead of time and reuse it throughout the SpRING computations, so we're not really increasing the computational complexity of the SpRING method itself. However, because it is direct, and we're not folding the compressed sensing into the GRAPPA reconstruction as it happens, it may not be quite as robust as an iterative method; at least for this demonstration, we're willing to accept that tradeoff.
0:12:18 | We divide k-space into a bunch of blocks of some average acceleration size, choosing R_y and R_z here appropriately, and we reconstruct each block with a kernel defined for its local sampling pattern. For instance, if we look at the central block here, we reconstruct it using only the central block and its neighbors. We can take the point marked in red as the target we want to reconstruct; we essentially use its local sampling pattern as a template, and trace that pattern across the ACS region to gather the fits used to calibrate the kernel.
0:12:53 | That's what I'm demonstrating here, as you can see. We can also take another target point, with a different pattern, there on the left, and it still gets its own kernel.
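A structural sketch of that per-block calibration (illustrative; kernel extent and fitting details are omitted, and `fit_kernel` stands in for a least-squares calibration like the one sketched earlier):

```python
import numpy as np

def calibrate_blockwise(acs, block_patterns, fit_kernel):
    """One GRAPPA-like kernel per distinct local sampling pattern.

    block_patterns: dict mapping block index -> boolean mask of acquired points
    in and around that block. fit_kernel(acs, mask) slides the pattern across
    the ACS region and solves for the kernel weights.
    """
    kernels = {}
    for block, mask in block_patterns.items():
        key = mask.tobytes()          # reuse kernels for repeated patterns
        if key not in kernels:
            kernels[key] = fit_kernel(acs, mask)
        # Each block is later reconstructed with kernels[key].
    return kernels
```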
0:13:05 | Now, let me show some reconstructions. To evaluate uniform versus random undersampling, we first turn to the extremely sparse yet popular Shepp-Logan phantom, and we "multi-channel-ize" it, pardon me if that's not a word: essentially, we construct an eight-channel Shepp-Logan phantom using Biot-Savart-law-based B1 sensitivities. We add some noise, and we actually simulate the noise covariance using the measure of noise covariance proposed in Roemer's 1990 paper, with white noise added at minus thirty dB.
0:13:40 | Then, essentially, we apply the SpRING method, using both the ℓ1 norm and the Cauchy penalty, with both uniform and random undersampling.
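A sketch of adding coil-correlated noise at a target level (Cholesky coloring of white noise; the minus-thirty-dB scaling convention here is my assumption, not the talk's):

```python
import numpy as np

def add_correlated_noise(kspace, cov, level_db=-30.0, rng=np.random.default_rng(0)):
    """Perturb multi-coil k-space with noise of covariance `cov` across coils.

    kspace: (ncoil, ny, nx) complex data; cov: (ncoil, ncoil) Hermitian PSD.
    """
    ncoil = kspace.shape[0]
    L = np.linalg.cholesky(cov)  # L @ white has covariance proportional to cov
    white = (rng.standard_normal((ncoil, kspace[0].size))
             + 1j * rng.standard_normal((ncoil, kspace[0].size)))
    n = (L @ white).reshape(kspace.shape)
    scale = np.linalg.norm(kspace) / np.linalg.norm(n) * 10 ** (level_db / 20)
    return kspace + scale * n
```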
0:13:49 | Here I'm showing what the Shepp-Logan phantom looks like for one MRI channel, and what that channel looks like when undersampled by a factor of four. Now, if we undersample uniformly and take the GRAPPA result, this is with an eight-channel coil, accelerating at a factor of approximately sixteen once the ACS data is accounted for. Essentially, we know a priori that GRAPPA by itself is not going to work, and we can see significant aliasing artifacts there on the left.
0:14:19 | However, applying SpRING, and this is with uniform undersampling, mind you, we are able to balance the GRAPPA result with some additional sparse regularization, and we actually get rid of a number of those coherent artifacts, just by virtue of the fact that the Shepp-Logan phantom is so sparse. Using the Cauchy penalty function, which asserts sparsity more strongly than ℓ1 by itself, we can further reduce the coherent artifacts, and we actually get a reasonably good-looking image.
0:14:49 | Using random undersampling improves things a little bit further on the SpRING side. However, the GRAPPA result, because of the incoherent artifacts, is visually very unpleasing; still, even with all that noise, adding the regularization, that is, SpRING, does help matters, and we see again that with the Cauchy penalty we get a high-quality image.
0:15:14 | Now, turning to real data: we acquired thirty-two-channel T1-weighted MPRAGE brain data at three tesla, and we extracted only a small subset of the coils from that, really to bring out the aliasing. When we did the reconstructions, you can see that aliasing still remained in the SpRING reconstruction with uniform undersampling; however, when we turned to random undersampling, those coherent artifacts were nowhere to be found, but we got blurring, because we had to turn up the acceleration so much; it's twenty-five, which is very high.
0:15:49 | In conclusion, I want to point out that for the artificial Shepp-Logan phantom, the Cauchy penalty function is effective at both denoising and undoing the aliasing, even with uniform undersampling. For real data, sparsity with uniform undersampling doesn't really mitigate the aliasing that much, though it does denoise the image. Random undersampling does increase the ability of CS to resolve aliasing for mostly sparse real images, which correlates with the conventional intuition. However, we have to consider whether aliasing is really significant in real images, and I just want to show one more slide on that, if that's okay.
0:16:21 | So basically, we go back and use all thirty-two channels of the data now, with a more reasonable but still aggressive undersampling, and we see that in the result it's mainly noise amplification that we're seeing, not really residual aliasing, at this level. So SpRING even with uniform undersampling is fairly helpful, and we actually don't see that much additional improvement from random undersampling.
0:16:42 | I'd just like to acknowledge my co-authors and funding, and also Fa-Hsuan Lin for providing the Biot-Savart B1 simulator, which you can download. Thank you very much.
0:16:55 | Thank you.
0:16:56 | Are there any questions? Yes.
0:17:03 | Did the model take into account the RF coil bias field artifacts? Did you explicitly model them somewhere and try to deal with that, or not?
0:17:13 | Okay, so we're operating at three tesla, and there the B1 bias is not as significant as what we see at seven tesla, for example, so for the most part we just ignored both types of artifacts in the model. We essentially used a set of coils that have very good B1-plus and B1-minus performance, so our results were very clean to start with, if you look at our original image; and after just some basic normalization to take care of the attenuation in the middle, we were able to deal with all of that just by using a high-quality coil. But we would expect that if our coils weren't quite as good and we did see those kinds of artifacts, we would have to adapt our method to account for them.
0:18:01 | Thank you. OK, we need to keep to the time, so we'll move on.