0:00:13 | — I was speak-... very... so — |
---|
0:00:18 | I was just thinking: at the first conference presentation I ever gave, I remember I spent the whole session hammering all the speakers with questions before my presentation, and of course after my presentation they all hammered me back. |
---|
0:00:34 | So my plan is to speak the whole time, go five minutes over so there's no time for questions, and then leave very quickly. |
---|
0:00:42 | OK, so I guess we're going to get started here. You know, being a professor, I can basically fill whatever time is required — I have like fifty backup slides, so I can do this in an hour, I can do it in twenty minutes, I can do it in five. |
---|
0:00:54 | So, this is joint work with my PhD student, who is at National Instruments now — work that he did for his PhD, actually, about a year ago. |
---|
0:01:05 | Somehow, once students graduate, it becomes a lot harder to get them to actually submit the work on time, for some reason. |
---|
0:01:11 | So anyway, I'm happy to be here presenting it. The title is "Manifold Predictive Coding for Limited Feedback Multiuser MIMO Systems," and I'm glad about that, especially after the previous presentation, because it gives me the nice introduction I was expecting to get in this session — I don't have to tell you what a multiuser MIMO setup is. |
---|
0:01:30 | So, the system that I'm going to consider in this talk is a multiuser MIMO communication system with limited feedback. |
---|
0:01:36 | I'm going to make the main assumption — the one I guess we just found out from the previous talk isn't so good — which is that the channel quality information is perfect, and I'm going to focus on quantizing the direction information. |
---|
0:01:49 | So the multiuser MIMO setup is: we're going to do zero-forcing precoding at the transmitter, we have single-antenna receivers, and the objective at each user is to quantize its channel direction vector and send it back over the limited feedback link. |
---|
0:02:04 | The transmitter puts all of those together to design the transmit beamformers, and everything works just like in the previous talk. So this is the setup that we're considering. |
---|
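To make that setup concrete, here is a minimal numpy sketch of the zero-forcing, limited-feedback downlink being described. This is not the authors' code: the antenna and user counts, the equal power split, and the perfect-quantizer placeholder are assumptions for illustration only.

```python
# Minimal sketch of the setup: K single-antenna users feed back (quantized)
# channel directions; the transmitter zero-forces on those directions.
import numpy as np

rng = np.random.default_rng(0)
Nt, K, snr = 4, 4, 10.0                       # tx antennas, users, linear SNR (assumed)

H = (rng.standard_normal((K, Nt)) + 1j * rng.standard_normal((K, Nt))) / np.sqrt(2)
h_dir = H / np.linalg.norm(H, axis=1, keepdims=True)   # true channel directions

h_hat = h_dir.copy()                          # placeholder for the limited-feedback step

F = np.linalg.pinv(h_hat)                     # zero-forcing beamformers (Nt x K)
F /= np.linalg.norm(F, axis=0, keepdims=True)

P = snr / K                                   # equal power allocation
G = np.abs(H @ F) ** 2                        # |h_k^H f_j|^2
sinr = P * np.diag(G) / (1.0 + P * (G.sum(axis=1) - np.diag(G)))
print("sum rate [bps/Hz]:", np.sum(np.log2(1.0 + sinr)))
```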
0:02:14 | The basic challenge here is as follows. If you've been following the field for a long time, you know that we started off doing this with single-user beamforming, where we were looking at codebooks on the order of three, four, five, six bits for up to four antennas — in 3GPP it ended up taking four bits. |
---|
0:02:33 | So you don't need that much feedback there, which is great. |
---|
0:02:36 | Now the problem is, when you try to leverage this work for multiuser MIMO, you find out that, generally speaking, to achieve a constant gap from the sum rate you would achieve with perfect channel state information, you need a codebook that scales with the number of users and also scales with the SNR. |
---|
0:02:58 | The implication is that you tend to need high resolution; otherwise you're going to be interference limited. Exactly like the nice plot in the previous talk: at some point, at high SNR, you don't want to do multiuser MIMO at all, and the reason is that you don't have enough resolution, so you become quantization-interference limited and performance saturates. |
---|
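As a rough, back-of-the-envelope illustration of that scaling (this is the widely quoted rule of thumb from the random-vector-quantization analysis of zero-forcing, not a result derived in this talk):

```python
def feedback_bits(nt: int, snr_db: float) -> float:
    """Approximate per-user feedback bits needed to keep a roughly constant gap
    from perfect-CSI zero-forcing; rule-of-thumb scaling, not an exact number."""
    return (nt - 1) * snr_db / 3.0

for snr_db in (0, 10, 20, 30):
    print(f"{snr_db:2d} dB -> ~{feedback_bits(4, snr_db):.1f} bits/user")
```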
0:03:19 | We're also seeing this right now in base station coordination, where people are applying multiuser MIMO to distributed antenna settings, and we need just as much feedback there, if not more. |
---|
0:03:31 | So the main point here is that as we look at more sophisticated transmission techniques, we really need higher-resolution codebooks — bigger codebooks — and that becomes a problem because you need more feedback. |
---|
0:03:41 | One of the other issues is that, in addition to quantization error, we also have delay and mobility. If you have a large codebook and the channel is mobile, you have to send a lot of bits back very quickly, which means the rate the feedback link has to support is very high; otherwise the transmitter won't know much about the channel, and this is a problem. |
---|
0:04:03 | So we're going to propose a predictive coding framework that will allow us to take advantage of correlation in the channel, so that when the channel is not moving very quickly we can get effectively larger codebook sizes than we otherwise would have. |
---|
0:04:18 | I'll explain a little bit about the key idea and mention some prior work. The predictive coding concept — for anyone working in source coding, this would probably be the first thing you would do. |
---|
0:04:31 | The idea is as follows: if you have a source evolving over time, what you do is find a good predictor — this might be an MMSE predictor, for example, or if you have a favorite you can plug it in — and use that predictor to predict the next value of the source. Then, instead of quantizing the new value of the source, you quantize the error between the predicted value and the value that actually arrives. |
---|
0:04:56 | This has been used with great success in speech and image coding — it's used all over the place. The main things you need are a predictor, a way to quantize the error, a way to compute the difference between the predicted value and the true one, and a way to track your predicted sequence. |
---|
0:05:17 | The decoder essentially implements the same predictor: it takes the quantized error updates, keeps running the prediction, and keeps updating its reconstruction over time. |
---|
0:05:31 | So for example, with a traditional first-order, one-step predictor, you might have two weights: you take a linear combination of the previous two symbols, that becomes your predicted value, and you can optimize those weights in the MMSE sense. |
---|
0:05:46 | If you want to design the codebook, probably the typical way to do it is with the Lloyd algorithm; there are also structured techniques, tree-structured and so on. This is well known and widely deployed — people have been doing it for at least twenty or thirty years. |
---|
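Here is a hedged sketch of that classical predictive-quantization loop in the ordinary Euclidean setting, just to fix the structure before moving to the manifold version. The AR(1) source, the single predictor tap, and the 3-bit uniform error quantizer are illustrative choices, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(1)
n, a = 200, 0.95
x = np.zeros(n)
for t in range(1, n):                        # slowly varying AR(1) source
    x[t] = a * x[t - 1] + np.sqrt(1 - a**2) * rng.standard_normal()

def quantize(e, step=0.1, bits=3):           # simple uniform quantizer for the error
    levels = 2 ** bits
    return step * np.clip(np.round(e / step), -(levels // 2), levels // 2 - 1)

x_hat = np.zeros(n)                          # reconstruction (known at both ends)
for t in range(1, n):
    pred = a * x_hat[t - 1]                  # predict from the previous reconstruction
    err_q = quantize(x[t] - pred)            # quantize only the prediction error
    x_hat[t] = pred + err_q                  # encoder and decoder update identically

print("mean squared error:", np.mean((x - x_hat) ** 2))
```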
0:06:01 | Now, the problem here is as follows. In the limited feedback beamforming case, the information we want to quantize is a normalized vector, and that normalized vector is phase invariant — it's actually represented by a point on the Grassmann manifold. |
---|
0:06:16 | In this case the space of beamforming vectors can be considered as points on G(n,1), the set of one-dimensional subspaces of an n-dimensional space, and you can draw them like points on a sphere — that's more of an illustration, not exactly correct, but it gives you the idea. |
---|
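A small illustration (my own, not from the paper) of what that representation means in code: a beamforming direction is a unit vector defined only up to a complex phase, and the chordal distance used later in the talk ignores that phase.

```python
import numpy as np

def chordal_dist(x, y):
    """Chordal distance between the lines (subspaces) spanned by unit vectors x, y."""
    return np.sqrt(max(0.0, 1.0 - abs(np.vdot(x, y)) ** 2))

rng = np.random.default_rng(2)
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
x /= np.linalg.norm(x)

print(chordal_dist(x, x))                    # 0.0: same line
print(chordal_dist(x, np.exp(1j * 0.7) * x)) # still ~0.0: the phase does not matter
```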
0:06:35 | So effectively, the problem we have in limited feedback beamforming is that we want to do predictive quantization of a source that lives on this manifold. That's the basic idea. |
---|
0:06:48 | Now let's figure out what the problem is — why are we talking about this now, after we've been doing limited feedback for eight or nine years? |
---|
0:06:56 | Well, the problem is as follows. First of all, we have a subspace source going into the coder, and we need to generate the error. The problem with working on a manifold — especially the Grassmann manifold here — is that simple operations like adding up two points are not necessarily well defined. |
---|
0:07:15 | For example, if I'm looking at those two lines there, what does it mean to add those two lines? You don't get another line; or if we're adding points on a sphere, you get a point that's not on the sphere once you're done, if you add them in Euclidean space. |
---|
0:07:28 | So you need to do something special just to add these things up, and the net of that is that even the concept of an error is actually not obvious. |
---|
0:07:38 | You also need to predict things on the subspace: if you really want to take this manifold structure into account, we shouldn't be predicting in Euclidean space, we should be predicting on this manifold itself. So we need a manifold predictor — but then, what is an MMSE manifold predictor? What does multiplying by weights even mean? These things are not very well defined. |
---|
0:07:59 | And that's why it has been very hard to come up with even extremely simple examples of doing predictive coding here: these operations are hard. What that means is that in the predictive coder, generating the error is not straightforward, quantizing the error is not straightforward, predicting the sequence is not straightforward, and updating the predictor is not straightforward. |
---|
0:08:24 | Each of these blocks needs something new. |
---|
0:08:27 | There has been a lot of work on this — certainly from my group and others. Everybody wants to exploit the temporal correlation of the channel; it's the right thing to do. We know that multiuser MIMO doesn't work well when the channel varies quickly, and this temporal correlation is just sitting there begging to be exploited. |
---|
0:08:45 | In trying to do that, the earliest work I'm aware of is from around 2002, with some nice papers using gradient-based updates. |
---|
0:08:52 | There are dynamic codebook approaches, where you adaptively select a subset of a codebook depending on how fast the channel is moving: if it's very slow you end up with a very directional codebook matched to the correlation, and if it's moving fast you end up with more of a Grassmannian codebook. |
---|
0:09:08 | There are progressive or successive refinement techniques, where you zoom in on the channel estimate depending on how fast or slow it's changing — that's more of a tree-structured kind of quantization. |
---|
0:09:19 | There has been some work using Euclidean prediction: you can use a Euclidean predictor and then quantize, and depending on how you do it, sometimes that actually works very well, but sometimes you lose the phase invariance — the structure we were trying to exploit on the Grassmann manifold in the first place — so that can be a problem. |
---|
0:09:36 | Probably the best approach I'd seen is a differential approach, which is related to something that has actually been proposed in the 3GPP standard, where essentially you quantize the difference between the last vector and the next vector. That approach works reasonably well, but it doesn't really use anything stochastic. And then there are also some rotation- and compression-based techniques. |
---|
0:10:00 | So the main message here is that there is definitely a lot of work on this topic, but there isn't really a comprehensive framework for solving the problem I just described: predictive quantization on the Grassmann manifold. |
---|
0:10:12 | So what I'm going to do is tell you about our approach to solving this problem. Fortunately, we do have a solution — the fully general solution, for general manifolds and many past points, is still an open problem, but I'll tell you about the solution we're proposing for the case where we use two points. |
---|
0:10:30 | What I'm going to do is build up and explain some of the mathematical concepts we need. I've got the equations here, but I'm really going to focus on the pictures, because the point is to get the intuition of what's happening, and then I'll tell you how we use them. |
---|
0:10:45 | Operating on these kinds of manifolds, you can define several concepts. One of them is this notion of a tangent vector: if I have two points, x1 and x2, that live on the manifold, I can define a tangent vector — call it e — which is a vector pointing in the direction of x2 from x1, and it is orthogonal to x1. |
---|
0:11:11 | Remember, x1 is a point on the Grassmann manifold that we can represent as a unit vector. So e is a vector that's orthogonal to x1, but it's not a unit vector: it has a length, and that length depends on the chordal distance between x1 and x2. |
---|
0:11:26 | There are some nice papers — by Edelman and co-authors, for example — with very general descriptions of the notions of tangents and geodesics, but because everything there is so general, it's actually very hard to pick out what you need. One of the contributions here is that we simplified everything down for the beamforming case; it turns out the equations simplify dramatically, and there's actually a lot of intuition in them. |
---|
0:11:52 | Basically, you can decompose the tangent vector into two pieces: one is the length, which is the arc length between x1 and x2, and the other is the unit tangent direction. Both are functions of the inner product between x1 and x2, and of the chordal distance. |
---|
0:12:09 | So we're going to use the tangent vector to give us a notion of error. |
---|
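A sketch of that tangent ("log map") computation for the G(n,1) beamforming case as I understand it from the talk: phase-align x2 to x1, project onto the orthogonal complement of x1, and scale by the arc length. Treat it as an illustration rather than the paper's exact expressions.

```python
import numpy as np

def log_map(x1, x2, eps=1e-12):
    """Tangent vector at x1 pointing toward x2 on the Grassmannian of lines."""
    a = np.vdot(x1, x2)                              # x1^H x2
    z = x2 * np.exp(-1j * np.angle(a))               # remove the irrelevant phase
    theta = np.arccos(np.clip(abs(a), 0.0, 1.0))     # arc length between the two lines
    p = z - x1 * abs(a)                              # component orthogonal to x1
    norm_p = np.linalg.norm(p)
    if norm_p < eps:                                 # same line: zero tangent
        return np.zeros_like(x1)
    return theta * (p / norm_p)                      # length times unit tangent direction
```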
0:12:15 | OK, the second concept we use is the geodesic. The geodesic is the curve between x1 and x2 that is the shortest path between the two. You can write an equation for that curve as a function of x1 and x2, but it's more convenient to write it as a function of x1 and the tangent vector. |
---|
0:12:35 | You can write it like this: x1 times the cosine of the tangent length, plus the unit tangent direction times the sine of it. Setting t = 0 gives you x1, and t = 1 gives you x2. It turns out the two terms are orthogonal — you can see that because the tangent vector is orthogonal to x1 — so you get this nice orthogonal decomposition. |
---|
0:12:56 | We're going to use this geodesic to get something that looks like an addition. |
---|
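And the matching geodesic ("exp map") sketch — the addition-like operation: start at x1 and move a fraction t along the tangent. With e = log_map(x1, x2), t = 0 returns x1 and t = 1 lands on the line spanned by x2. Again an illustrative sketch, not the paper's code.

```python
import numpy as np

def exp_map(x1, e, t=1.0, eps=1e-12):
    """Move from x1 along tangent e for a fraction t of its length."""
    theta = np.linalg.norm(e)                    # length of the tangent vector
    if theta < eps:
        return x1.copy()
    d = e / theta                                # unit tangent direction, orthogonal to x1
    return x1 * np.cos(theta * t) + d * np.sin(theta * t)

# sanity check: abs(np.vdot(exp_map(x1, log_map(x1, x2)), x2)) should be ~1
```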
0:13:01 | Now, the third thing we need is prediction: we want to figure out where the sequence of points on the Grassmann manifold is going, and we want some function with parameters we could optimize over. For this work we did something very simple: we use the concept of parallel transport. |
---|
0:13:18 | Parallel transport is essentially a way of taking the tangent vector at x1 and mapping it over to x2. Remember that a tangent vector has to be orthogonal to the point it's attached to, so you have to do this mapping in such a way that orthogonality is maintained at the new point. |
---|
0:13:39 | You can do that, and it turns out the result looks something like the negative of the previous tangent vector. So we're going to use the parallel transport concept to define a predicted value. |
---|
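A sketch of the parallel transport step under the same simplifications: carry a tangent vector v, defined at x1, along the geodesic generated by e, so that the result is again orthogonal to the new base point. When v = e this is the "keep going in the same direction" operation the predictor uses below.

```python
import numpy as np

def parallel_transport(x1, e, v, t=1.0, eps=1e-12):
    """Transport tangent v (at x1) along the geodesic with tangent e, by fraction t."""
    theta = np.linalg.norm(e)
    if theta < eps:
        return v.copy()
    d = e / theta                                    # unit direction of the geodesic
    along = np.vdot(d, v)                            # part of v along the geodesic
    v_perp = v - d * along                           # part of v untouched by the transport
    d_new = -x1 * np.sin(theta * t) + d * np.cos(theta * t)
    return d_new * along + v_perp                    # orthogonal to the new base point
```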
0:13:52 | OK, so now I'll explain how we use these mathematical operations to do predictive quantization. The first step is to generate the error. Suppose we have a predicted sequence, denoted x-tilde, and a state, x-hat; the state is known at both the transmitter and the receiver, and the predicted value is computed from the state. |
---|
0:14:14 | Then we take the difference between the predicted value and the new observation: we compute the tangent vector that takes us from the predicted value to the new observation, and that error tangent — generated from the new value — is what we treat as the error. |
---|
0:14:38 | The second step is to quantize the tangent vector. Now, the first reaction — at least mine, looking at this problem — is: great, it's another Grassmannian quantization problem. The problem is that it's not actually a Grassmannian quantization problem, because the tangent vector is not unitary: it has structure — it's orthogonal to x1 — but otherwise it lives in the space orthogonal to that point, so it's a slightly different quantization problem. |
---|
0:15:00 | So we're going to quantize that tangent vector. In this paper, what we did is decompose the tangent into two pieces: one that is like a Grassmannian quantization of the direction, and one that is like a gain quantization of the magnitude — kind of like a quality and then a direction — and we propose a codebook for that. That's how we quantize the tangent. |
---|
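Here is a hedged sketch of that two-part error quantizer: one codebook for the unit tangent direction (constrained to the complement of the base point), one small grid for the magnitude. The random direction codebook and the uniform magnitude grid are stand-ins for the codebooks actually designed in the paper.

```python
import numpy as np

def quantize_tangent(e, base, b_dir=6, b_mag=3, max_mag=np.pi / 2, rng=None):
    """Quantize tangent e (attached at `base`) as magnitude codeword x direction codeword."""
    rng = rng or np.random.default_rng(3)
    theta = np.linalg.norm(e)
    if theta < 1e-12:
        return np.zeros_like(e)

    # Direction codebook: random unit vectors living in the complement of `base`.
    n = base.shape[0]
    C = rng.standard_normal((2 ** b_dir, n)) + 1j * rng.standard_normal((2 ** b_dir, n))
    C -= np.outer(C @ base.conj(), base)             # project out the base direction
    C /= np.linalg.norm(C, axis=1, keepdims=True)

    d = e / theta
    d_hat = C[np.argmax(np.abs(C @ d.conj()))]       # closest codeword, phase-invariantly
    d_hat = d_hat * np.exp(1j * np.angle(np.vdot(d_hat, d)))   # re-align the phase

    # Magnitude codebook: a uniform grid on [0, max_mag].
    grid = np.linspace(0.0, max_mag, 2 ** b_mag)
    theta_hat = grid[np.argmin(np.abs(grid - theta))]
    return theta_hat * d_hat
```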
0:15:24 | The third step is the state update: this is how we actually add the quantized error onto the predicted value, and we use the geodesic for that. We take x-tilde, add the parallel-transported quantized error vector, take a full step along the geodesic, and that gives the updated state. So this is the state update, where we add the error on, and this is the predicted value. |
---|
0:15:56 | For the prediction, what we do is take the difference between the two previous state vectors — the tangent between them — and then essentially keep going in the same direction: we add that onto x-hat, take a step, and arrive at the predicted value. |
---|
0:16:11 | I should point out that in this paper we've simplified everything down, and we take a full step. You can probably see that taking a full step may not be the right thing to do, and so in some of our other work we have actually optimized the step size — you can optimize it over time, and there are some neat things related to that; I presented that result at ITA. |
---|
0:16:34 | So that is our proposed predictor, and that's basically the idea: take the geometric tools for doing things on the Grassmann manifold, use these equations to get notions of prediction, error generation, quantized error, and state update, and then use all of that to quantize a correlated sequence over time. |
---|
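Putting the pieces together, here is an end-to-end sketch of the encoder/decoder loop as I understand it from this talk (full step size, two-point predictor), reusing the log_map, exp_map, parallel_transport, and quantize_tangent sketches above. In a real system only the quantized tangent would be fed back; everything else is recomputed identically at both ends.

```python
import numpy as np

def gpc_track(directions, quantizer):
    """Track a sequence of unit-vector channel directions with predictive coding."""
    x_prev = directions[0].copy()                # assume the first direction is known
    x_hat = directions[0].copy()                 # shared state (transmitter & receiver)
    errors = []
    for x_new in directions[1:]:
        # Predict: carry the last state-to-state tangent forward ("keep going").
        e_state = log_map(x_prev, x_hat)
        x_tilde = exp_map(x_hat, parallel_transport(x_prev, e_state, e_state))

        # Encode: tangent from the prediction to the new observation, then quantize it.
        e_q = quantizer(log_map(x_tilde, x_new), x_tilde)

        # Update: both ends move along the geodesic by the quantized tangent (full step).
        x_prev, x_hat = x_hat, exp_map(x_tilde, e_q)
        errors.append(np.sqrt(max(0.0, 1.0 - abs(np.vdot(x_hat, x_new)) ** 2)))
    return x_hat, errors
```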
0:16:55 | For the simulations in this paper we used an autoregressive model for the channel variation, which is rather standard. You could do better if you used a Clarke/Gans-type fading model, which is band-limited and has better prediction properties, so this is actually more of a worst case. |
---|
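A sketch of that first-order autoregressive channel evolution (the correlation coefficient here is an assumed value, not one quoted in the talk):

```python
import numpy as np

def ar1_directions(nt=4, steps=500, a=0.99, rng=None):
    """Normalized directions of a channel evolving as h[t] = a*h[t-1] + sqrt(1-a^2)*w[t]."""
    rng = rng or np.random.default_rng(4)
    h = (rng.standard_normal(nt) + 1j * rng.standard_normal(nt)) / np.sqrt(2)
    out = [h / np.linalg.norm(h)]
    for _ in range(steps - 1):
        w = (rng.standard_normal(nt) + 1j * rng.standard_normal(nt)) / np.sqrt(2)
        h = a * h + np.sqrt(1 - a**2) * w
        out.append(h / np.linalg.norm(h))
    return out

# e.g. x_final, chordal_errors = gpc_track(ar1_directions(), quantize_tangent)
```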
0:17:10 | We're going to consider the sum rate, so we use this for multiuser MIMO, and we compare against random vector quantization, which is essentially a good way of generating a fixed codebook of different sizes. |
---|
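And a sketch of the random vector quantization (RVQ) baseline: a fixed codebook of 2^B random unit vectors with memoryless nearest-codeword quantization, which is what the predictive scheme is compared against.

```python
import numpy as np

def rvq_codebook(nt=4, bits=9, rng=None):
    rng = rng or np.random.default_rng(5)
    C = rng.standard_normal((2 ** bits, nt)) + 1j * rng.standard_normal((2 ** bits, nt))
    return C / np.linalg.norm(C, axis=1, keepdims=True)

def rvq_quantize(h_dir, codebook):
    """Memoryless quantization: pick the codeword with the largest |inner product|."""
    return codebook[np.argmax(np.abs(codebook @ h_dir.conj()))]
```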
0:17:22 | The first result — I just wanted to point out that using this algorithm does indeed result in effectively higher resolution. I forget the exact parameters of this simulation, but this is basically a nine-bit codebook. The sequence is varying over time, we keep quantizing it, and the chordal distance fluctuates, because sometimes your quantized value is close to the true channel and sometimes it's far. |
---|
0:17:48 | And then this is our approach down here, also with nine bits — we're getting down to about a factor of five lower here. Because we're tracking while things change over time, we don't converge to zero, since the channel keeps varying. |
---|
0:18:03 | But this is just to show you that it does actually decrease the error, in terms of the average chordal distance. |
---|
0:18:10 | Now let's look at the sum rate comparison. The blue curve here is perfect CSI with four users. By the way, we don't have any user selection here: we didn't select the best four users, we just picked four users randomly, and they have the same average SNR — they're sitting on the same circle, if you like. So the blue curve is what we're trying to achieve: perfect CSI. |
---|
0:18:37 | And this other curve — kind of a mustard-looking color — is a Grassmannian-codebook result. It's kind of a strange thing, but it turns out there aren't really good Grassmannian codebooks for large codebook sizes; with random vector quantization you can get more or less a Grassmannian codebook — essentially a very good codebook, that's how the geometry works out — so this actually is a good fixed codebook; you probably won't do much better than that. |
---|
0:19:03 | And here you see the standard result: at high SNR you flatten out because of the residual interference. |
---|
0:19:07 | Here you can see the different Doppler-times-symbol-period products: 0.01 is a very slow channel, and this one is a reasonably fast channel. What you can see is that with nine bits of feedback, in an effectively slow channel, we're able to get very good performance, tracking that ideal sum rate. |
---|
0:19:30 | With a somewhat fast channel we still get a little bit of improvement, but the performance improvement is not as large. |
---|
0:19:39 | The nice thing is that this means every user can essentially be updating their own quantizer. In this plot all the users have the same channel profile, but the algorithm is flexible enough that every user essentially runs their own adaptive algorithm and predictor; the base station gets the CSI, and if a user's channel happens to be slow they have better channel information, and if it's fast it gets worse. |
---|
0:19:59 | So it does actually track quite well, and this also includes a five-millisecond feedback delay, which is a standard assumption in 3GPP. So even with delay that we're not explicitly accounting for, we can still track the sum rate under some reasonable assumptions. |
---|
0:20:18 | So that's essentially it. What I've done in this talk is propose a manifold predictive coding framework, and I think, at least from a source coding perspective, the picture is nice and the story is nice. |
---|
0:20:32 | Now, there are a lot of limitations. First, we're only using the previous two points. I'd like to use more than that — clearly, if I had figured out how to do this with three or five previous points, I would have shown you that. It's hard, because these tangent concepts are based on two points; if you have three, you have to do something else. We have some ideas, but it's proving to be very difficult. |
---|
0:20:55 | My student, in his dissertation, has some extensions of this to Stiefel manifolds. We also did some work with limited feedback MIMO-OFDM, where you have to do a whole sequence of Grassmannian quantizations, and the concept works there as well, but the feedback requirements become really high. So I'm not sure this is exactly the right solution for higher dimensions yet, but I think it's a good idea. |
---|
0:21:19 | And then I showed you the multiuser MIMO results. Since we developed this, another student of mine, who has graduated, has enhanced this for multi-cell MIMO, where we actually predict the channels from interfering base stations using a similar concept. She was actually able to come up with, frankly, a much better predictor, and that result is at ICC. |
---|
0:21:40 | And I have another result where we've also come up with a better predictor for interference alignment, and that's going to appear shortly. So I think the concept is good, and I still think there's a lot of work to be done here. |
---|
0:21:54 | OK, so that's it — I apologize for not taking enough time... |
---|
0:22:02 | Oh — I took exactly the right amount of time. |
---|
0:22:09 | [Audience member:] Well... I need to get back to the... |
---|
0:22:23 | Doesn't this method depend on the model you assume for the time correlation? |
---|
0:22:29 | It's not really very dependent on that. In the journal version — which we actually just submitted recently, and it should be on arXiv today or tomorrow — we have simulations with other temporal correlation models, like multi-tap AR models, and it still works well. So we're not really exploiting any specific knowledge about the channel correlation. |
---|
0:22:52 | You could probably come up with a correlated channel model that makes this a more favorable process, but we're not using — we don't actually have the correlation function anywhere in the algorithm. |
---|
0:23:04 | Right — I mean, we'd like to have that, but we're not exploiting it. |
---|
0:23:07 | [Audience:] So you're not exploiting it at all? |
---|
0:23:10 | Right, right. |
---|
0:23:11 | I mean, the statistics really only come in when we quantize the magnitude of the tangent vector. If the channel is varying slowly, then when we quantize — and we quantize the magnitude and the direction separately — the magnitude of the tangent vector varies slowly and we take a small step; when it varies quickly, we take a larger step. |
---|
0:23:39 | So that's the only place where the correlation comes in. Now, in the other paper I mentioned, where we adapt the step size, the statistics do come in, but not in an explicit way. |
---|
0:23:57 | There's one in the back. |
---|
0:23:59 | [Audience question, partly inaudible:] Do these Grassmannian points correspond to the best representation of the information? Do you have to stick to unit vectors — to two separate terms — or could you just represent the direction by some vector that's not normalized? |
---|
0:24:37 | Yeah, so the question is whether you need the unit-length vectors or not. Remember that this vector lives on the Grassmann manifold: it's unit length, and it's also phase invariant. Those two properties cut down the degrees of freedom of the vector: a complex vector with n entries has 2n real parameters; with unit norm you have 2n − 1; with phase invariance you have 2n − 2. |
---|
0:25:00 | The whole reason we work on the Grassmann manifold is so that we quantize less information, and you can show, from the point of view of capacity or probability of error, that all you need is this normalized direction information. |
---|
0:25:13 | Now, if you want to quantize the magnitude as well, with the CQI, then you're actually back up to 2n − 1, and if you wanted to combine those together, you could — you would end up with something different. But you would still have the phase invariance, so it wouldn't be like a random Gaussian vector; it would live on a different manifold. |
---|
0:25:38 | [Audience, partly inaudible:] Right, but does that really matter in this context? You add something at each step, and the norm of the result — you don't really care about the length, right? |
---|
0:26:02 | OK, so I can give you another answer to this question. One thing you could do is this: the Grassmann manifold is a Riemannian manifold, so it's locally Euclidean. So in fact you could use kind of a standard Euclidean predictor, add things up — you'd get a point that's not on the manifold — and then, if you need a unit norm, you could project back onto the manifold. And indeed, that works if you're operating in a local enough region. |
---|
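A sketch of that alternative — predict with an ordinary Euclidean extrapolation, then project back onto the manifold by renormalizing. The phase alignment before subtracting is my own addition, to keep the extrapolation from mixing in the irrelevant phase.

```python
import numpy as np

def euclidean_predict(x_prev, x_curr):
    """Linear extrapolation off the manifold, followed by projection (renormalization)."""
    x_prev = x_prev * np.exp(1j * np.angle(np.vdot(x_prev, x_curr)))  # phase-align first
    x_pred = x_curr + (x_curr - x_prev)                               # extrapolate
    return x_pred / np.linalg.norm(x_pred)                            # back onto the sphere
```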
0:26:30 | What we found — and the analysis in the journal version of this paper essentially makes a small-angle assumption — is that if you have more variation, the problem is that that "add and project" operation gets you further away from this geodesic kind of addition. So I think there actually is a case for doing that too; it just depends. |
---|
0:26:56 | And the approach we've taken in the ICC paper actually has a better predictor that doesn't predict on the Grassmann manifold, but we do the updates on the Grassmann manifold. So I think your point is well taken. |
---|
0:27:11 | OK. So — coffee break. Right. |
---|