0:00:14 | I'm Dustin Mixon, from the Program in Applied and Computational Mathematics at Princeton University; I'm Robert Calderbank's student. |
---|
0:00:24 | I'm going to be talking about some work I did this past summer with Chris Quinn, a grad student at Urbana-Champaign, his adviser Negar Kiyavash, and also Matt Fickus at the Air Force Institute of Technology. |
---|
0:00:42 | So first we'll talk about the problem at hand. Say I have a file that I want to share with a bunch of my friends, but I want to be able to determine if any of my friends decide to leak this file to other people. |
---|
0:01:02 | So let's say I notice one of the copies of the file is in the wrong person's hands. I can look at that copy, and if I had put a distinct fingerprint on it, notice where the fingerprint came from and identify my traitor. |
---|
0:01:21 | Now let's suppose my friends are smarter, and a group of them, a group of colluders, decides to collaborate and make a forgery with their copies. That's going to make it harder for me to identify who the traitors are. |
---|
0:01:42 | But let's say the way they forge is that they take a convex combination of their copies and then add some random noise on top of that. |
---|
0:01:54 | Well, my goal is still to identify the colluders, and the talk is geared toward this question of how you do that. |
---|
0:02:02 | So the first thing you might do is isolate the noisy combination of fingerprints: you take the forgery and subtract off the host signal, and what's left is a convex combination of fingerprints plus some noise. |
---|
0:02:18 | If you view your fingerprints as columns of a short, fat matrix F, then you multiply that by a vector alpha which has mostly zero entries (the nonzero entries are the alpha_k's, with k in the coalition script-K), and you add the noise vector epsilon. |
---|
0:02:45 | When you pose it in this matrix-vector format, you basically want to recover the support of alpha, the locations of its nonzero entries, given the measurement: y minus s equals F alpha plus epsilon. |
---|
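A minimal sketch of this forgery model in code. The dimensions, the random stand-in for the fingerprint matrix F, and the Dirichlet choice of convex weights are all illustrative assumptions, not choices made in the talk:

```python
import numpy as np

rng = np.random.default_rng(0)
M, N, K = 64, 256, 3                     # signal dimension, number of users, coalition size

# Stand-in fingerprint matrix: a short, fat matrix with unit-norm columns.
F = rng.standard_normal((M, N))
F /= np.linalg.norm(F, axis=0)

# The coalition picks K users and convex weights alpha_k (nonnegative, summing to one).
coalition = rng.choice(N, size=K, replace=False)
alpha = np.zeros(N)
alpha[coalition] = rng.dirichlet(np.ones(K))

# Forgery minus host signal: convex combination of fingerprints plus noise epsilon.
sigma = 0.05
epsilon = sigma * rng.standard_normal(M)
y_minus_s = F @ alpha + epsilon          # this is what the detector gets to see
```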
0:03:00 | So let's first look at the noiseless case, where epsilon is zero. Then you have this pictorial view of the matrix-vector product. The thing on the left is what we have available to us: the forgery minus the host signal. |
---|
0:03:18 | The columns of the short, fat matrix are our fingerprints, and the sparse vector on the right is alpha; the white entries are the zero entries, and the colored entries are the alpha_k's in our convex combination. |
---|
0:03:35 | So when we pose the problem this way, it's appropriate to consider compressed sensing as a solution alternative. What compressed sensing offers is machinery that allows us not only to identify the support of alpha but actually to recover the values of its nonzero entries. |
---|
0:03:59 | But this assumes sparsity in the support, meaning there are not too many colluders in the coalition. So if we assume that our coalition is small enough, we will be able to use compressed sensing ideas to find the perpetrators. |
---|
0:04:21 | So to go a little more in depth on the machinery offered by compressed sensing, let's talk about the restricted isometry property. This is a property of short, fat matrices which says that the matrix F acts as a near-isometry on sufficiently sparse vectors. |
---|
0:04:42 | So if your vector x has no more than K nonzero entries, then the squared length of Fx is between one minus delta and one plus delta times the squared length of x. |
---|
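Written out, the (K, delta) restricted isometry property being described is the two-sided bound:

```latex
(1-\delta)\,\|x\|_2^2 \;\le\; \|Fx\|_2^2 \;\le\; (1+\delta)\,\|x\|_2^2
\qquad \text{whenever } \|x\|_0 \le K .
```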
0:04:57 | The importance of the restricted isometry property is reflected in this result by Candès and Tao, which states that if your matrix is RIP, then you can find any K-sparse vector x by minimizing the one-norm over all vectors that map to the same measurement Fx. |
---|
0:05:20 | Normally you would have to find the sparsest vector in the preimage, but because our short, fat matrix satisfies RIP, we are able to view this problem as being equivalent to this l1 minimization, which we can solve by linear programming. |
---|
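A minimal sketch of that equivalence, recovering a sparse vector in the noiseless case by recasting the l1 minimization as a linear program. The reformulation with auxiliary variables t bounding |x_i|, and the use of scipy's linprog, are standard illustrative choices and not the authors' own code:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(F, y):
    """Solve min ||x||_1 subject to F x = y as a linear program."""
    M, N = F.shape
    # Variables are [x; t] with |x_i| <= t_i; minimize sum(t).
    c = np.concatenate([np.zeros(N), np.ones(N)])
    A_ub = np.block([[ np.eye(N), -np.eye(N)],     #  x - t <= 0
                     [-np.eye(N), -np.eye(N)]])    # -x - t <= 0
    b_ub = np.zeros(2 * N)
    A_eq = np.hstack([F, np.zeros((M, N))])        # F x = y
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=[(None, None)] * N + [(0, None)] * N)
    return res.x[:N]

# Noiseless example: alpha_hat = basis_pursuit(F, F @ alpha)
# suspected_support = np.nonzero(alpha_hat > 1e-6)[0]
```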
0:05:37 | So the moral of this slide is: if your fingerprints are columns of an RIP matrix, then you can find alpha and identify the perpetrators in the noiseless case just by using linear programming. |
---|
0:05:54 | But we didn't assume the noiseless case; we assumed noise. So let's talk about the noise. |
---|
0:06:02 | The results on using linear programming to perform this recovery problem in the presence of noise are somewhat limited: if epsilon is small, then linear programming still works, but it only works if epsilon is small. |
---|
0:06:25 | So let's start walking away from linear programming, because we care about noise. |
---|
0:06:30 | Let's look instead at focused detection, where we care whether a particular user m is in the coalition. |
---|
0:06:39 | So look at the set of all fingerprint combinations where the weights are equal and the coalition size is no more than K, and break that set up into the subset in which m is a member of the coalition and the subset in which m is not a member of the coalition. That gives you the guilty set for m and the not-guilty set for m. |
---|
0:07:05 | And we can talk about the distance between these two sets as being the minimum distance between points in one set and points in the other set. |
---|
0:07:12 | We can actually bound this distance from below, provided the fingerprints come from the columns of an RIP matrix. |
---|
0:07:24 | And so this really illustrates that there is some resilience to noise with RIP fingerprints: provided the noise is below half of this lower bound, you'll be able to determine whether a given user is guilty. |
---|
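The geometric reasoning behind that resilience claim, written with assumed notation (G_m and H_m for the guilty and not-guilty sets of user m): if the noise is smaller than half the separation between the two sets, the observation cannot end up closer to the wrong set.

```latex
d(G_m, H_m) \;=\; \min_{u \in G_m,\; v \in H_m} \|u - v\|_2 ,
\qquad
\|\epsilon\|_2 \;<\; \tfrac{1}{2}\, d(G_m, H_m)
\;\Rightarrow\;
\text{the noisy combination stays closer to its own set than to the other.}
```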
0:07:46 | So RIP appears to be great for fingerprint design. So how do we build an RIP matrix? |
---|
0:07:57 | The typical construction uses random entries: if all the entries are independently drawn Gaussian random variables, then the resulting matrix is RIP with high probability. |
---|
0:08:10 | But it's only with high probability. There is some probability that you'll fail, and you won't know it, because checking that a matrix satisfies RIP is hard. |
---|
0:08:22 | So this is why we instead look for deterministic constructions of RIP matrices, and the key tool in doing this is the Gershgorin circle theorem. |
---|
0:08:36 | It states that for a diagonal matrix the eigenvalues are the diagonal entries, and for a nearly diagonal matrix the eigenvalues will be close to the diagonal entries. How close? That is determined by the size of the off-diagonal entries. |
---|
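A tiny numerical illustration of the Gershgorin circle theorem as it is used here; the matrix and the perturbation size are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

# A nearly diagonal symmetric matrix: a small perturbation of diag(1, 2, 3).
E = 0.05 * rng.standard_normal((3, 3))
A = np.diag([1.0, 2.0, 3.0]) + (E + E.T) / 2

# Gershgorin: every eigenvalue lies within radius_i of some diagonal entry A[i, i],
# where radius_i is the sum of the absolute off-diagonal entries in row i.
radii = np.sum(np.abs(A), axis=1) - np.abs(np.diag(A))
for lam in np.linalg.eigvalsh(A):
    assert any(abs(lam - A[i, i]) <= radii[i] for i in range(3))
```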
0:08:53 | So when trying to demonstrate that a matrix is RIP, you're really trying to say something about the eigenvalues of all the submatrices of the Gramian. |
---|
0:09:06 | So you take your matrix of fingerprints, assume the fingerprints have unit norm, and define the worst-case coherence mu to be the size of the largest inner product between distinct fingerprints. Then for each K we can analyze what the smallest delta is for which F is (K, delta)-RIP. |
---|
0:09:27 | The way you do that is you want to find out how far away from one the eigenvalues of each of the sub-Gramians are. So you subtract the identity from the sub-Gramian, which centers it about zero as opposed to about one, and then the maximum distance is the spectral norm of this difference. |
---|
0:09:51 | And using the Gershgorin circle theorem, this is bounded above by K minus one times mu. Why? Because there are K minus one off-diagonal entries in each row, and mu is the size of the largest off-diagonal entry. |
---|
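A minimal sketch of that argument, reusing a random unit-norm matrix as a stand-in for the fingerprints (the bound holds for any matrix with unit-norm columns, not just the frames discussed in the talk): for any size-K column subset, the spectral deviation of the sub-Gramian from the identity is at most (K - 1) times the worst-case coherence.

```python
import numpy as np

rng = np.random.default_rng(2)
M, N, K = 64, 256, 3
F = rng.standard_normal((M, N))
F /= np.linalg.norm(F, axis=0)          # unit-norm columns

# Worst-case coherence: largest off-diagonal entry of the Gramian in magnitude.
G = F.T @ F
mu = np.max(np.abs(G - np.diag(np.diag(G))))

# Gershgorin-style bound on a K-column sub-Gramian.
S = rng.choice(N, size=K, replace=False)
sub_gram = F[:, S].T @ F[:, S]
deviation = np.linalg.norm(sub_gram - np.eye(K), ord=2)   # spectral norm
assert deviation <= (K - 1) * mu + 1e-12
```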
0:10:06 | So I just demonstrated to you that F is (K, delta)-RIP whenever delta is bigger than K minus one times the worst-case coherence. |
---|
0:10:20 | It would be nice if the worst-case coherence were small, because that would give us the best RIP parameter for utility. |
---|
0:10:31 | How small can the worst-case coherence be? The Welch bound tells us that the worst-case coherence is bounded below by this square root, and there exist constructions which actually achieve this lower bound; those are called equiangular tight frames. |
---|
0:10:50 | So for equiangular tight frames, mu is equal to the Welch bound, and using that equality together with the inequality from the previous slide, we know that equiangular tight frames are (K, delta)-RIP whenever delta exceeds this quotient. |
---|
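A small sketch checking the Welch bound, using the regular simplex (M + 1 unit vectors in M dimensions with pairwise inner product -1/M) as a concrete equiangular tight frame. The factorization of the centering matrix below is one standard way to realize the simplex and is not specific to this talk; the Q&A at the end notes that simplex codes are a special case of equiangular tight frames.

```python
import numpy as np

def simplex_etf(M):
    """Return an M x (M+1) equiangular tight frame (the regular simplex)."""
    N = M + 1
    # Normalized Gram matrix of the simplex: ones on the diagonal, -1/M off it.
    G = (N / M) * (np.eye(N) - np.ones((N, N)) / N)
    w, V = np.linalg.eigh(G)
    keep = w > 1e-10                       # G has rank M
    return np.sqrt(w[keep])[:, None] * V[:, keep].T

M = 7
F = simplex_etf(M)
N = F.shape[1]
G = F.T @ F
mu = np.max(np.abs(G - np.diag(np.diag(G))))
welch = np.sqrt((N - M) / (M * (N - 1)))   # Welch lower bound on coherence
print(mu, welch)                           # both equal 1/M for the simplex
```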
0:11:13 | Because equiangular tight frames achieve the Welch bound, and because the worst-case coherence is what naturally comes out of the Gershgorin argument used to demonstrate RIP, equiangular tight frames happen to be the state of the art for deterministic RIP constructions. |
---|
0:11:31 | And it's relatively easy to build an equiangular tight frame. I wrote a paper recently with Matt Fickus called "Steiner equiangular tight frames"; if you Google "Steiner ETFs" it will show you how to construct these guys, but I'm not going to talk about that today. |
---|
0:11:50 | I just want to leave you with the fact that equiangular tight frames appear to be particularly well suited as fingerprint codes, both because they're RIP and because they're deterministic and easy to build. So let's zoom in on this particular subclass of RIP matrices in the context of focused detection. |
---|
0:12:11 | in the context of focused detection |
---|
0:12:15 | so with focused detection |
---|
0:12:17 | we take are noisy |
---|
0:12:19 | a |
---|
0:12:20 | convex combination |
---|
0:12:22 | a fingerprints |
---|
0:12:24 | and we compare |
---|
0:12:25 | to all the fingerprints and our dictionary |
---|
0:12:30 | the fingerprints |
---|
0:12:32 | that looks the most like or noisy combination |
---|
0:12:36 | we uh |
---|
0:12:39 | we say belong to |
---|
0:12:41 | guilty |
---|
0:12:42 | suspects |
---|
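A minimal sketch of this kind of detector, assuming a correlation statistic against each dictionary fingerprint and a hypothetical threshold tau; the exact test statistic and threshold used in the talk may differ.

```python
import numpy as np

def focused_detection(F, observed, tau):
    """Flag user m as a guilty suspect if its fingerprint correlates strongly
    with the observed signal (forgery minus host)."""
    scores = F.T @ observed               # correlation with each fingerprint
    return np.nonzero(scores > tau)[0]    # indices of suspected colluders

# Example, reusing the model from earlier:
# suspects = focused_detection(F, y_minus_s, tau=0.2)
```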
0:12:46 | This process incurs certain false positive and false negative probabilities. The reason there are probabilities to talk about is that epsilon is modeled as additive white Gaussian noise with variance sigma squared. |
---|
0:13:05 | So the false positive probability is the probability that we declare a user guilty when he is actually not, and vice versa for the false negative. |
---|
0:13:19 | What I'm going to talk about is the worst case of these probabilities. So for the worst-case false positive, for each coalition you look at all the users not in the coalition, take the maximum false positive probability, and then maximize that over all of the coalitions. |
---|
0:13:39 | But for the false negative it's a different story, because we only want to catch at least one colluder, namely the most vulnerable. So instead of maximizing over all members of the coalition, we minimize the false negative probabilities over the members of the coalition, and then maximize that over all coalitions. |
---|
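In symbols, with assumed notation (script K ranging over coalitions of size at most K, and per-user error probabilities written as functions of the user and the coalition):

```latex
P_{\mathrm{FP}}^{\mathrm{worst}}
  = \max_{\mathcal{K}} \; \max_{m \notin \mathcal{K}} \; P_{\mathrm{FP}}(m; \mathcal{K}),
\qquad
P_{\mathrm{FN}}^{\mathrm{worst}}
  = \max_{\mathcal{K}} \; \min_{m \in \mathcal{K}} \; P_{\mathrm{FN}}(m; \mathcal{K}).
```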
0:14:04 | So with these, we actually have a result that says: for fingerprints that form an equiangular tight frame, these worst-case false positive and false negative probabilities satisfy these inequalities involving the Q function. I'd like to go over the inequalities in a bit more detail. |
---|
0:14:24 | The first one, an upper bound on the false positive probability, is independent of the colluders' choice of alpha. Remember, alpha is the vector of weights they chose for their convex combination of fingerprints in the forgery. |
---|
0:14:46 | Regardless of what coefficients they use, they won't be able to frame an innocent user, in the sense that this upper bound is out of the colluders' hands. So that's kind of a fun interpretation. |
---|
0:15:00 | And then in the second inequality, we notice that the upper bound is maximized when the coefficients in the collusion are equal. |
---|
0:15:11 | So the way we interpret this is that the colluders, who don't want to get caught, have the best chance of not getting caught under focused detection if their weights are equal. This comes out of the fact that focused detection seeks the most vulnerable colluder, so when the weights are equal, the most vulnerable colluder is least vulnerable. |
---|
0:15:36 | So this talk was basically a presentation selling compressed sensing for fingerprint design. Compressed sensing ideas like the restricted isometry property appear to be useful in fingerprint design and in thinking of ways of identifying colluders. |
---|
0:15:58 | And equiangular tight frames, being deterministic, RIP, and easy to construct, are well suited for fingerprinting. In designs using them, focused detection appears to work well, and I'd like to think more about different ways of detecting colluders using equiangular tight frame fingerprints. |
---|
0:16:21 | This concludes my talk; I'm happy to take any questions. |
---|
0:16:29 | We have time for questions. |
---|
0:16:41 | In compressed sensing, you want to recover the whole vector, whereas here you want to settle for just catching one colluder. So there can be a mismatch between the two? |
---|
0:16:55 | Right. So another thing that I could do, instead of just catching the most vulnerable, is catch the top K most vulnerable. |
---|
0:17:12 | Or, when I set up the focused detection, I set a threshold tau and decide that a user is guilty if the test statistic exceeds tau. I won't necessarily have only one guilty suspect with that process. |
---|
0:17:32 | But the theorem statement that I gave was based on the most vulnerable colluder, the worst-case scenario, where you're only guaranteed to get one. |
---|
0:17:46 | But agreed, compressed sensing is more versatile: it's intended to capture the entire support, to recover the entire vector even. So there's definitely more work to be done to incorporate compressed sensing ideas and model selection ideas into the fingerprinting problem. |
---|
0:18:15 | Great. Did you compare this with, or see the difference from, a previous paper by the same authors, the one based on simplex codes? Because the setups seem to be the same, and the detector is a linear correlation. |
---|
0:18:36 | Yeah. Negar published a result saying that simplex codes are optimal provided the number of users is at most the signal dimension plus one. |
---|
0:18:53 | And the utility of this solution is that the number of users is not so constrained: it can be larger than the signal dimension plus one, and that's what equiangular tight frames allow for. |
---|
0:19:12 | In fact, constructions exist for a number of users scaling on the order of the square of the signal dimension, so that's much more versatility than what the simplex codes offer. |
---|
0:19:34 | And to be more specific, simplex codes happen to be a special case of equiangular tight frames. So I'm glad you mentioned that; it really puts this work in its proper context. |
---|
0:19:51 | Other questions? OK. So thank you so much. |
---|