0:00:15 | Next is improvements on the bottleneck features themselves, work by [unintelligible]
0:00:21 | [unintelligible]
0:00:26 | and if you have followed their work over the past few years, you will have seen they have
0:00:32 | made great gains in using deep bottleneck features for LID.
0:00:38 | So this particular paper
0:00:41 | extends some work that was published, I think, last year. In
0:00:47 | that work he was using bottleneck features, taken from the bottleneck layer,
0:00:53 | to extract i-vectors. What he is doing here is basically
0:01:01 | taking out the GMM and putting a mixture of factor analysers in its
0:01:06 | place, and what this does is it allows a
0:01:10 | single step to do the analysis, the feature reduction,
0:01:14 | and the combination. It also unlocks some efficiency gains that allow them to
0:01:20 | explore doing something like SDC with the bottleneck features, that is, concatenating or extending
0:01:28 | the context
0:01:30 | in time, which appears to work quite well.
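The context extension described above can be sketched roughly as stacking each bottleneck frame with its neighbouring frames. This is only an illustration of the general idea, not the paper's exact scheme: the feature dimensionality (40) and the symmetric ±3-frame window are assumptions, and the paper may use SDC-style shifted deltas rather than plain concatenation.

```python
import numpy as np

def stack_context(feats, left=3, right=3):
    """Concatenate each frame with its left/right context frames.

    feats: (T, D) array of frame-level features.
    Returns a (T, D * (left + right + 1)) array; edges are handled
    by repeating the first/last frame (edge padding).
    """
    T, _ = feats.shape
    padded = np.pad(feats, ((left, right), (0, 0)), mode="edge")
    return np.hstack([padded[i:i + T] for i in range(left + right + 1)])

# Hypothetical example: 100 frames of 40-dim bottleneck features.
bn = np.random.randn(100, 40)
ctx = stack_context(bn, left=3, right=3)
print(ctx.shape)  # (100, 280)
```

With a ±3 window each output frame is 7 stacked frames, so the 40-dim features become 280-dim before i-vector extraction.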
0:01:34 | The tests are done on LRE09 with the six most highly confused languages,
0:01:39 | and he has got some improvement gains. As you will see if you come to the
0:01:44 | poster, the improvement is less
0:01:47 | for three seconds and larger for the longer utterances, which is not really surprising. But
0:01:52 | if you are interested, it is poster number eleven.