Notes on the WiDance Paper

For my undergraduate thesis I dug myself a hole by choosing my old research direction, so I am back to reading about CSI. Once again the goal is to reproduce a paper by Kun Qian of Tsinghua; unlike PADS, which I worked through before, WiDance is harder by several orders of magnitude.

Introduction

I will not walk through what every paragraph of the paper says; instead I want to revisit it through my own understanding and through the structure of parts of the source code. Figure 4 in the paper captures the main front-end pipeline, whose purpose is to pick out suitable data and extract the Doppler frequency shift. Figure 6 then models the Doppler shift to achieve gesture recognition. The whole implementation splits into these two parts. The data format and environment setup in this post are the same as in the paper.

Data Extraction

Following the paper's setup, the sampling rate is 1024 Hz, and the csi_get_all function extracts amplitudes and timestamps from the raw CSI for the later stages. I used to think the CSI was simply the matrix read by read_bf_file, but it actually has to be post-processed with get_scaled_csi. The phase transformation in PADS also needs this step; without it the results differ noticeably.

function [cfr_array, timestamp, rssi_array] = csi_get_all(filename)
csi_trace = read_bf_file(filename);
timestamp = zeros(length(csi_trace), 1);
cfr_array = zeros(length(csi_trace), 90);
rssi_array = zeros(length(csi_trace), 3);
for k = 1:length(csi_trace)
    csi_entry = csi_trace{k};                           % the k-th packet
    csi_all = squeeze(get_scaled_csi(csi_entry)).';     % estimate the channel matrix H
    csi = [csi_all(:,1); csi_all(:,2); csi_all(:,3)].'; % flatten the 30x3 CSI (one TX antenna) into a 1x90 row
    timestamp(k) = csi_entry.timestamp_low;
    cfr_array(k,:) = csi;
    rssi_array(k,:) = [csi_entry.rssi_a, csi_entry.rssi_b, csi_entry.rssi_c];
end

Setting Up the Filters

Although the paper presents antenna selection first, the code actually constructs the Butterworth filters first. The filtering strips out both the high-frequency and low-frequency components: a 40 Hz low-pass and a 4 Hz high-pass together act as a band-pass over the Doppler band of interest.

half_rate = samp_rate / 2;
uppe_orde = 6;
uppe_stop = 40;
lowe_orde = 3;
lowe_stop = 4;
[lu,ld] = butter(uppe_orde,uppe_stop/half_rate,'low');
[hu,hd] = butter(lowe_orde,lowe_stop/half_rate,'high');
data_wind = samp_rate * 2;
iter_star = 1:samp_rate:n_data-data_wind+1;
freq_bins = [0:samp_rate/2-1 -samp_rate/2:-1]'/samp_rate;
time_bins = (1:length(iter_star) * (0.5*data_wind)) / samp_rate;
freq_sele = freq_bins <= uppe_stop / samp_rate & freq_bins >= -uppe_stop / samp_rate;
freq_bins = freq_bins(freq_sele,:);
freq_time_prof = zeros(length(freq_bins), length(time_bins),length(file_path_list));

There are quite a few tricks in this setup. First a data window of twice the sampling rate (a 2 s window) is defined; then the frequency selector freq_sele keeps only the bins inside the ±40 Hz stop band; the freq_time_prof array allocated afterwards holds the later time-frequency analysis. iter_star is the generator of window start indices: successive windows start every samp_rate samples, so each 2 s slice taken from the raw data overlaps its neighbor by half.
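For readers more at home in Python, the same filter construction and window layout can be sketched with scipy. The 1024 Hz rate, the cutoffs, and the 2 s window come from the code above; the capture length `n_data` is a made-up example value.

```python
import numpy as np
from scipy.signal import butter

samp_rate = 1024
half_rate = samp_rate / 2

# 6th-order low-pass at 40 Hz and 3rd-order high-pass at 4 Hz,
# mirroring the two MATLAB butter() calls
lu, ld = butter(6, 40 / half_rate, btype='low')
hu, hd = butter(3, 4 / half_rate, btype='high')

# 2-second windows starting every 1 s, i.e. 50% overlap
n_data = 10 * samp_rate          # hypothetical 10-second capture
data_wind = samp_rate * 2
iter_star = np.arange(0, n_data - data_wind + 1, samp_rate)
```

As in the MATLAB version, `butter` returns numerator/denominator coefficient arrays of length order + 1, and `iter_star` lists one window start per second of data.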

Antenna Selection

Before selecting antennas, the code computes the channel power, which here is simply the element-wise product of the CSI with its own conjugate.

chan_data = chan_data_list{1};
chan_powe = chan_data .* conj(chan_data);
ante_pairs = zeros(2, length(iter_star));

Then, following the paper, the code picks the antenna with the largest amplitude and smallest variance, and the one with the smallest amplitude and largest variance, and computes the conjugate product of their CSI streams. On reflection this step is quite clever: a complete gesture consists of the motion plus its recovery, which travel in opposite directions, so their Doppler shifts are negatives of each other. Through the conjugate product, the pair can be viewed as one larger movement in a single direction, which amplifies the weak Doppler shift and makes the Doppler signature much more visible in the spectrogram.
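A toy numpy experiment (with a hypothetical 10 Hz shift) illustrates the amplification: multiplying one stream by the conjugate of a stream carrying the opposite-sign shift concentrates the energy at twice the frequency.

```python
import numpy as np

fs = 1024
t = np.arange(fs) / fs
f_d = 10.0  # hypothetical Doppler shift in Hz

s1 = np.exp(2j * np.pi * f_d * t)    # stream with a +f_d shift
s2 = np.exp(-2j * np.pi * f_d * t)   # mirrored stream with a -f_d shift
prod = s1 * np.conj(s2)              # conjugate product, as in part_diff

spec = np.abs(np.fft.fft(prod))
peak_freq = np.fft.fftfreq(fs, d=1 / fs)[np.argmax(spec)]
# the spectral peak sits at 2 * f_d: the two shifts add up
```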

%% Select antenna pair.
part_data = chan_data(iter_star(ii):iter_star(ii)+data_wind-1,:);
part_powe = chan_powe(iter_star(ii):iter_star(ii)+data_wind-1,:);
part_vari = sqrt(var(part_powe,1));
part_mean = mean(part_powe,1);
part_vari_ante = mean(reshape(part_vari.', n_subc, n_ante));
part_mean_ante = mean(reshape(part_mean.', n_subc, n_ante));
part_indi_ante = part_mean_ante ./ part_vari_ante;
[~, maxi] = max(part_indi_ante);
[~, mini] = min(part_indi_ante);
ante_pair = [mini, maxi];
ante_pairs(:,ii) = ante_pair.';
part_diff = part_data(:,(ante_pair(1)-1)*n_subc+1:ante_pair(1)*n_subc) ...
.* conj(part_data(:,(ante_pair(2)-1)*n_subc+1:ante_pair(2)*n_subc));

Filtering and Time-Frequency Analysis

Only after the antenna pair has been selected are the previously defined filters applied. PCA then compresses the 30 subcarrier streams down to their first principal component, and the tftb time-frequency toolbox (tfrsp) performs the time-frequency analysis.

%% Filter out noises and interferences.
for jj = 1:n_subc,
part_diff(:,jj) = filter(lu, ld, part_diff(:,jj));
part_diff(:,jj) = filter(hu, hd, part_diff(:,jj));
end

pca_coef = pca(part_diff);
part_diff = part_diff * pca_coef(:,1);
part_wind = floor(0.5*samp_rate+1:1.5*samp_rate);
part_spec = tfrsp(part_diff, part_wind, samp_rate, tftb_window(samp_rate/8+1, 'gauss'));
freq_time_prof(:,(ii-1)*samp_rate+1:ii*samp_rate,1) = part_spec(freq_sele, :);

The spectrogram can be plotted with mesh(time_bins, freq_bins, freq_time_prof).
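Lacking the tftb toolbox, the PCA-then-spectrogram step can be approximated in Python with an SVD and scipy's stft; the 12 Hz tone and the noise level below are made up purely for illustration.

```python
import numpy as np
from scipy.signal import stft

samp_rate, n_subc = 1024, 30
rng = np.random.default_rng(0)
t = np.arange(2 * samp_rate) / samp_rate

# Synthetic conjugate-product stream: a common 12 Hz tone on all 30
# subcarriers, plus complex noise
part_diff = (np.exp(2j * np.pi * 12 * t)[:, None]
             + 0.1 * (rng.standard_normal((len(t), n_subc))
                      + 1j * rng.standard_normal((len(t), n_subc))))

# First principal component via SVD, playing the role of pca_coef(:,1)
centered = part_diff - part_diff.mean(axis=0)
_, _, vh = np.linalg.svd(centered, full_matrices=False)
comp = centered @ vh[0].conj()

# Two-sided STFT in place of tfrsp (the input is complex)
f, tt, Z = stft(comp, fs=samp_rate, nperseg=samp_rate // 4,
                return_onesided=False)
peak = f[np.argmax(np.abs(Z).mean(axis=1))]   # strongest frequency bin
```

The strongest bin of the resulting time-frequency profile recovers the injected 12 Hz tone, which is what the mesh plot above makes visible for real gestures.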

Motion Detection

Before motion detection, the data are first smoothed with the smooth function. The code below computes, for each time bin, the normalized spectrum together with its frequency centroid, its spread, and a smoothed energy profile.

prof_cent = zeros(length(time_bins),length(file_path_list));
prof_vari = zeros(length(time_bins),length(file_path_list));
prof_ener = zeros(length(time_bins),length(file_path_list));
% powe_cent = zeros(length(time_bins),length(file_path_list));
norm_freq_time_prof = zeros(size(freq_time_prof));
% norm_powe_freq_time_prof = zeros(size(powe_freq_time_prof));

for kk = 1:length(file_path_list),
norm_freq_time_prof(:,:,kk) = abs(freq_time_prof(:,:,kk)) ./ repmat(sum(abs(freq_time_prof(:,:,kk)),1), size(freq_time_prof(:,:,kk),1), 1);
% norm_powe_freq_time_prof(:,:,kk) = abs(powe_freq_time_prof(:,:,kk)) ./ repmat(sum(abs(powe_freq_time_prof(:,:,kk)),1), size(powe_freq_time_prof(:,:,kk),1), 1);
prof_cent(:,kk) = freq_bins.' * norm_freq_time_prof(:,:,kk);
% powe_cent(:,kk) = powe_freq_bins.' * norm_powe_freq_time_prof(:,:,kk);

prof_vari(:,kk) = sqrt(sum((repmat(freq_bins, 1, length(time_bins)) - repmat(prof_cent(:,kk).', length(freq_bins), 1)).^2 ...
.* norm_freq_time_prof(:,:,kk), 1));
prof_ener(:,kk) = smooth(sum(freq_time_prof(:,:,kk),1), samp_rate/8+1, 'moving');
end
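The centroid and spread computed above reduce to weighted first and second moments of each spectrogram column. A minimal numpy sketch, using a made-up Gaussian-shaped profile centered at +10 Hz:

```python
import numpy as np

freq_bins = np.arange(-40.0, 41.0)      # Doppler bins in Hz
# Toy profile: energy concentrated around +10 Hz, constant over 5 frames
power = np.exp(-0.5 * ((freq_bins - 10) / 3.0) ** 2)
prof = np.tile(power[:, None], (1, 5))

# Normalize each column, then take the first and second moments,
# as prof_cent and prof_vari do
norm = prof / prof.sum(axis=0, keepdims=True)
cent = freq_bins @ norm                 # frequency centroid per frame
vari = np.sqrt((((freq_bins[:, None] - cent) ** 2) * norm).sum(axis=0))
```

For this toy profile the centroid lands at the 10 Hz peak and the spread at the 3 Hz width, matching the Gaussian that generated it.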

Motion detection applies a threshold to the smoothed spread indicator; as the paper says, each gesture lies between a pair of peaks (or valleys). Runs of samples shorter than 0.25 s are then flipped to suppress spurious transitions.

%% Motion Detection...
moti_indi = smooth(mean(prof_vari,2),samp_rate/2+1,'moving');
dete_thre = 0.015;
time_thre = [0.25, 0.25];
moti_valu = [1, 0];
moti_inde = moti_indi < dete_thre;

for jj = 1:length(time_thre),
curr_thre = time_thre(jj);
curr_valu = moti_valu(jj);
ii = 1;
while ii <= length(moti_inde),
while ii <= length(moti_inde) && moti_inde(ii) ~= curr_valu,
ii = ii + 1;
end
if ii > length(moti_inde),
break;
end
kk = ii + 1;
while kk <= length(moti_inde) && moti_inde(kk) == curr_valu,
kk = kk + 1;
end
if kk - ii < curr_thre * samp_rate,
moti_inde(ii:kk-1) = ~curr_valu;
end
ii = kk;
end
end
moti_indx = find(moti_inde);
note_inte = floor(note_inte * samp_rate);
note_indx = moti_indx(ll);
%note_indx = indx_list(ll);
note_end = moti_indx(end);
note_segm = zeros(n_note, 2);
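The nested while-loops above implement a simple run-length cleanup: any run of a given value shorter than a time threshold is flipped. A Python sketch of the same idea (the mask and the sample thresholds below are made up for the example):

```python
import numpy as np

def fill_short_runs(mask, value, min_len):
    """Flip runs of `value` shorter than `min_len` samples."""
    mask = mask.copy()
    i, n = 0, len(mask)
    while i < n:
        if mask[i] != value:
            i += 1
            continue
        k = i
        while k < n and mask[k] == value:
            k += 1                           # scan to the end of the run
        if k - i < min_len:                  # run too short: flip it
            mask[i:k] = not value
        i = k
    return mask

moti = np.array([0, 1, 1, 1, 1, 0, 0, 1, 0, 0, 0, 0], dtype=bool)
moti = fill_short_runs(moti, True, 2)   # drop 1-sample blips
moti = fill_short_runs(moti, False, 3)  # fill 1-2 sample gaps
```

In the MATLAB code the two passes use `time_thre * samp_rate` (0.25 s each) as `min_len`.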

Motion Segmentation

Segmentation can really be viewed as part of motion detection: it slices out all the packets that contain a gesture and splices the slices together.

%% Segmentation...
segm_lowb = 0.8;
segm_higb = 1.2;
for ii = 1:n_note,
note_segm(ii,1) = note_indx;
if ii < n_note,
curr_ener = mean(prof_ener(note_indx:min(note_indx+2*note_inte-1,size(prof_ener,1)),:), 2);
[~, segm_poin] = min(curr_ener(floor(note_inte*segm_lowb)+1:floor(note_inte*segm_higb)));
note_segm(ii,2) = note_indx+floor(note_inte*segm_lowb)+segm_poin-1;
% plot(time_bins(note_indx:min(note_indx+2*note_inte-1,size(prof_ener,1))), curr_ener);
% hold on;
% plot(time_bins(note_indx-1+floor(note_inte*segm_lowb)+segm_poin), curr_ener(floor(note_inte*segm_lowb)+segm_poin), '*');
% hold off;
note_indx = note_indx+floor(note_inte*segm_lowb)+segm_poin;
else
curr_ener = mean(prof_ener(note_indx:note_end,:), 2);
note_segm(ii,2) = note_end;
% plot(time_bins(note_indx:note_end), curr_ener);
end
end
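The boundary search above can be sketched on a toy energy profile (the dip location is made up): the minimum is only sought inside 0.8x to 1.2x of the nominal gesture interval, which keeps the segmenter from drifting too far from the expected gesture length.

```python
import numpy as np

samp_rate = 1024
note_inte = 2 * samp_rate            # nominal gesture interval: 2 s
segm_lowb, segm_higb = 0.8, 1.2

# Toy energy profile with a quiet dip around samples 2100-2300
prof_ener = np.ones(4 * samp_rate)
prof_ener[2100:2300] = 0.1

note_indx = 0
lo = int(note_inte * segm_lowb)      # search window: 0.8x ... 1.2x interval
hi = int(note_inte * segm_higb)
win = prof_ener[note_indx + lo:note_indx + hi]
segm_poin = int(np.argmin(win))
boundary = note_indx + lo + segm_poin   # next gesture boundary
```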

Gesture Recognition

Recognition is the crux of the whole paper; strictly speaking, it is direction recognition. Two receivers are deployed so the Doppler shift can be computed at both simultaneously, and the gesture direction is recognized by comparing the Doppler shifts the gesture induces at the two receivers. The method needs no training and can estimate direction on the fly. As the paper describes, recognition happens in two stages. Taking the first quadrant as an example: in theory the three classes would be 0°, 45°, and 90°, but in practice this needs correction, because real gestures are never that precise, so the paper partitions the quadrant as 0-30°, 30-60°, and 60-90°. Partitioned this way, the full 360° splits into 8 sectors (0-30° and -30-0° form one sector). The second stage then uses the data from both devices to resolve the sector.
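The decision rule just described can be mirrored in Python. Here `firs_half`/`seco_half` stand in for the summed Doppler centroids of the first and second halves of the gesture at the two receivers; the angle-to-sector mapping follows the if/elseif ladder in the MATLAB recognition code.

```python
import math

def classify_sector(firs_half, seco_half):
    """Map half-gesture Doppler sums (Rx1, Rx2) to one of 8 direction sectors."""
    # Angle from the relative magnitudes at the two receivers
    tant = math.degrees(math.atan2(abs(firs_half[0]) + abs(seco_half[0]),
                                   abs(firs_half[1]) + abs(seco_half[1])))
    # Sign of the first half gives the direction seen by each receiver
    dire = (firs_half[0] < 0, firs_half[1] < 0)
    if tant > 60:                       # motion mostly along the Rx1 axis
        note = 7 if dire[0] else 3
    elif tant < 30:                     # motion mostly along the Rx2 axis
        note = 5 if dire[1] else 1
    else:                               # diagonal: both signs matter
        note = {(True, True): 6, (True, False): 8,
                (False, True): 4, (False, False): 2}[dire]
    return note % 8 + 1                 # same final remapping as the MATLAB code
```

For example, a gesture with strong positive Doppler at Rx1 and weak Doppler at Rx2 falls in the `tant > 60` branch and lands in sector 4.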

%% Recognition..
for kk = 1:size(prof_cent,2),
prof_cent(:,kk) = smooth(prof_cent(:,kk), samp_rate/4+1, 'moving');
% powe_cent(:,kk) = smooth(powe_cent(:,kk), samp_rate/4+1, 'moving');
end
prof_cent(~moti_inde,:) = 0;
prof_cent_tota = sum(abs(prof_cent),2)/2;
prof_cent_tota = smooth(prof_cent_tota, samp_rate/8+1, 'moving');
% powe_cent(~moti_inde,:) = 0;
% powe_cent_tota = sum(abs(powe_cent),2)/2;
% powe_cent_tota = smooth(powe_cent_tota, samp_rate/8+1, 'moving');

note_list = zeros(1,n_note);
tx = [1.6,1.6].';
rx = [0,1.6;1.6,0].';
init_loc = [0.8,0.8].';
tant_valu = zeros(1,n_note);
dire_valu = zeros(2,n_note);
peak_dist = 0.1;

for ii = 1:n_note,
curr_cent = prof_cent(note_segm(ii,1):note_segm(ii,2),:);
curr_cent_tota = prof_cent_tota(note_segm(ii,1):note_segm(ii,2),:);
% curr_powe_cent = powe_cent(note_segm(ii,1):note_segm(ii,2),:);
% curr_powe_cent_tota = powe_cent_tota(note_segm(ii,1):note_segm(ii,2),:);
curr_cent = curr_cent(moti_inde(note_segm(ii,1):note_segm(ii,2)),:);
curr_cent_tota = curr_cent_tota(moti_inde(note_segm(ii,1):note_segm(ii,2)),:);
midd_poin = floor(length(curr_cent_tota) / 2);
firs_half = sum(curr_cent(1:midd_poin,:),1);
seco_half = sum(curr_cent(midd_poin+1:end,:),1);
for kk = 1:length(file_path_list);
if firs_half(kk) > seco_half(kk),
curr_cent(1:midd_poin,kk) = abs(curr_cent(1:midd_poin,kk));
curr_cent(midd_poin+1:end,kk) = -abs(curr_cent(midd_poin+1:end,kk));
else
curr_cent(1:midd_poin,kk) = -abs(curr_cent(1:midd_poin,kk));
curr_cent(midd_poin+1:end,kk) = abs(curr_cent(midd_poin+1:end,kk));
end
end
curr_cent = curr_cent - repmat(mean(curr_cent, 1),size(curr_cent,1),1);
firs_half = sum(curr_cent(1:midd_poin,:),1);
seco_half = sum(curr_cent(midd_poin+1:end,:),1);
% if (mark_list(ii) >= 1 && mark_list(ii) <= 2) || (mark_list(ii) == 8),
% firs_half(2) = firs_half(2);
% seco_half(2) = seco_half(2);
% elseif (mark_list(ii) >= 4 && mark_list(ii) <= 6),
% firs_half(1) = firs_half(1);
% seco_half(1) = seco_half(1);
% end
tant_valu(ii) = atan2(abs(firs_half(1))+abs(seco_half(1)), abs(firs_half(2))+abs(seco_half(2))) / pi * 180;
dire_valu(:,ii) = firs_half < 0;

if tant_valu(ii) > 60,
if dire_valu(1,ii) == 1,
note_list(ii) = 7;
else
note_list(ii) = 3;
end
elseif tant_valu(ii) < 30,
if dire_valu(2,ii) == 1,
note_list(ii) = 5;
else
note_list(ii) = 1;
end
else
if dire_valu(1,ii) == 1 && dire_valu(2,ii) == 1,
note_list(ii) = 6;
elseif dire_valu(1,ii) == 1 && dire_valu(2,ii) == 0,
note_list(ii) = 8;
elseif dire_valu(1,ii) == 0 && dire_valu(2,ii) == 1,
note_list(ii) = 4;
else
note_list(ii) = 2;
end
end
note_list(ii) = mod(note_list(ii), 8) + 1;
% if tant_valu(ii) > 45,
% if dire_valu(1,ii) == 1,
% note_list(ii) = 8;
% else
% note_list(ii) = 4;
% end
% else
% if dire_valu(2,ii) == 1,
% note_list(ii) = 6;
% else
% note_list(ii) = 2;
% end
% end

% if dire_valu(1,ii) == 1 && dire_valu(2,ii) == 1,
% note_list(ii) = 7;
% elseif dire_valu(1,ii) == 1 && dire_valu(2,ii) == 0,
% note_list(ii) = 1;
% elseif dire_valu(1,ii) == 0 && dire_valu(2,ii) == 1,
% note_list(ii) = 5;
% else
% note_list(ii) = 3;
% end
fprintf('index:%d... tant:%f... dire: %d,%d... mark: %d... pred: %d\n', ii, tant_valu(ii), dire_valu(:,ii), mark_list(ii), note_list(ii));

Summary

The above is a brief walk through the whole pipeline and its code. Some parts I did not cover in detail, and some I have not fully understood; corrections are welcome. The complete code follows. It depends on a few other helper functions, and since the full dataset is too large, I have packaged the code together with a subset of the data and uploaded it to Baidu Cloud.

%% Load CSI data.
data_path = 'data/final/';
resu_path = 'experiment/all_2.mat';

%% Total Training
training_list = {'P1-A1-I2-G1','P1-A1-I2-G2','P2-A1-I2-G1','P2-A1-I2-G2',...
'P3-A1-I2-G1','P3-A1-I2-G2', 'P4-A1-I2-G1','P4-A1-I2-G2',...
'P5-A1-I2-G1','P5-A1-I2-G2','P1-A2-I2-G1','P1-A2-I2-G2',...
'P1-A3-I2-G2'};
training_inde = {2540,3000,2300,2150,...
2890,2800,2500,2400,...
3400,3000,2300,1150,...
2050};

%% Total Testing
testing_list = {'P1-A1-I2-G3','P1-A1-I2-G4', 'P2-A1-I2-G3','P2-A1-I2-G4', ...
'P3-A1-I2-G3','P3-A1-I2-G4', 'P4-A1-I2-G3','P4-A1-I2-G4', ...
'P5-A1-I2-G3','P5-A1-I2-G4', 'P1-A2-I2-G3','P1-A2-I2-G4', ...
'P1-A3-I2-G3'};
indx_list = [3200,3230, 2250,2900, ...
3000,3000, 2350,2000, ...
2550,1850, 2500,2100, ...
1900];

prefix_list = {'P1-A1-I2-G1','P1-A1-I2-G2','P1-A1-I2-G3','P1-A1-I2-G4', ...
'P2-A1-I2-G1','P2-A1-I2-G2','P2-A1-I2-G3','P2-A1-I2-G4', ...
'P3-A1-I2-G1','P3-A1-I2-G2','P3-A1-I2-G3','P3-A1-I2-G4', ...
'P4-A1-I2-G1','P4-A1-I2-G2','P4-A1-I2-G3','P4-A1-I2-G4', ...
'P5-A1-I2-G1','P5-A1-I2-G2','P5-A1-I2-G3','P5-A1-I2-G4', ...
'P1-A2-I2-G1','P1-A2-I2-G2','P1-A2-I2-G3','P1-A2-I2-G4', ...
'P1-A3-I2-G2','P1-A3-I2-G3'};

%% Data Processing.
carr_freq = 5.825e9;
wave_spee = 299792458;
n_subc = 30;
n_ante = 3;
wave_leng = wave_spee / carr_freq;
n_note = 50;
all_tant = [];
all_mark = [];
all_dire = [];
all_resu = [];
confusion = zeros(8,8);
type_summ = zeros(1,8);
for ll = 1:length(prefix_list),
note_inte = 2;
prefix = prefix_list{ll};

file_path_list = {[data_path, 'R1-',prefix,'.dat'], [data_path, 'R2-',prefix,'.dat']};
ground_truth_path = [data_path, prefix, '.mat'];
save_path = [data_path, 'MED-', prefix, '.mat'];
ground_truth = load(ground_truth_path, 'mark_list');
mark_list = ground_truth.mark_list;

if ~exist(save_path),
chan_data_list = cell(1,length(file_path_list));
for kk = 1:length(file_path_list),
file_path = file_path_list{kk};
[chan_data, time_stam] = csi_get_all(file_path);
if kk == 1,
% samp_rate = length(time_stam) / (time_stam(end)-time_stam(1)) * 1e6;
% samp_rate = 2^(round(log2(samp_rate)));
samp_rate = 1024;
n_data = size(chan_data, 1);
elseif n_data > size(chan_data, 1);
n_data = size(chan_data, 1);
end
chan_data_list{kk} = chan_data;
end

%% Set up filters.
half_rate = samp_rate / 2;
uppe_orde = 6;
uppe_stop = 40;
lowe_orde = 3;
lowe_stop = 4;
[lu,ld] = butter(uppe_orde,uppe_stop/half_rate,'low');
[hu,hd] = butter(lowe_orde,lowe_stop/half_rate,'high');

data_wind = samp_rate * 2;
iter_star = 1:samp_rate:n_data-data_wind+1;
freq_bins = [0:samp_rate/2-1 -samp_rate/2:-1]'/samp_rate;
time_bins = (1:length(iter_star) * (0.5*data_wind)) / samp_rate;
freq_sele = freq_bins <= uppe_stop / samp_rate & freq_bins >= -uppe_stop / samp_rate;
freq_bins = freq_bins(freq_sele,:);
% powe_freq_sele = freq_bins >= 0 & freq_bins <= uppe_stop / samp_rate;
% powe_freq_bins = freq_bins(powe_freq_sele,:);
freq_time_prof = zeros(length(freq_bins), length(time_bins),length(file_path_list));
% powe_freq_time_prof = zeros(length(powe_freq_bins), length(time_bins),length(file_path_list));

for kk = 1:length(file_path_list),
%% Processing data streams.
chan_data = chan_data_list{kk};
chan_powe = chan_data .* conj(chan_data);
ante_pairs = zeros(2, length(iter_star));
for ii = 1:length(iter_star),
%% Select antenna pair.
part_data = chan_data(iter_star(ii):iter_star(ii)+data_wind-1,:);
part_powe = chan_powe(iter_star(ii):iter_star(ii)+data_wind-1,:);
part_vari = sqrt(var(part_powe,1));
part_mean = mean(part_powe,1);
part_vari_ante = mean(reshape(part_vari.', n_subc, n_ante));
part_mean_ante = mean(reshape(part_mean.', n_subc, n_ante));
part_indi_ante = part_mean_ante ./ part_vari_ante;
[~, maxi] = max(part_indi_ante);
[~, mini] = min(part_indi_ante);
ante_pair = [mini, maxi];
ante_pairs(:,ii) = ante_pair.';
part_diff = part_data(:,(ante_pair(1)-1)*n_subc+1:ante_pair(1)*n_subc) ...
.* conj(part_data(:,(ante_pair(2)-1)*n_subc+1:ante_pair(2)*n_subc));

%% Filter out noises and interferences.
for jj = 1:n_subc,
part_diff(:,jj) = filter(lu, ld, part_diff(:,jj));
part_diff(:,jj) = filter(hu, hd, part_diff(:,jj));
end
% for jj = 1:n_subc*3,
% part_powe(:,jj) = filter(lu, ld, part_powe(:,jj));
% part_powe(:,jj) = filter(hu, hd, part_powe(:,jj));
% end

%% PCA analysis.
pca_coef = pca(part_diff);
part_diff = part_diff * pca_coef(:,1);
part_wind = floor(0.5*samp_rate+1:1.5*samp_rate);
part_spec = tfrsp(part_diff, part_wind, samp_rate, tftb_window(samp_rate/8+1, 'gauss'));
freq_time_prof(:,(ii-1)*samp_rate+1:ii*samp_rate,kk) = part_spec(freq_sele, :);

% powe_pca_coef = pca(part_powe);
% part_powe = part_powe * powe_pca_coef(:,1);
% powe_part_spec = tfrsp(part_powe, part_wind, samp_rate, tftb_window(samp_rate/8+1, 'gauss'));
% powe_freq_time_prof(:,(ii-1)*samp_rate+1:ii*samp_rate,kk) = powe_part_spec(powe_freq_sele, :);
end
freq_time_prof(:,:,kk) = abs(freq_time_prof(:,:,kk));
% powe_freq_time_prof(:,:,kk) = abs(powe_freq_time_prof(:,:,kk));
end
save(save_path, 'freq_time_prof'); % powe_freq_time_prof is commented out above, so saving it here would error
else
load(save_path);
samp_rate = 1024;
uppe_stop = 40;
freq_bins = [0:samp_rate/2-1 -samp_rate/2:-1]'/samp_rate;
freq_sele = freq_bins <= uppe_stop / samp_rate & freq_bins >= -uppe_stop / samp_rate;
freq_bins = freq_bins(freq_sele,:);
% powe_freq_sele = freq_bins <= uppe_stop / samp_rate & freq_bins >= 0;
% powe_freq_bins = freq_bins(powe_freq_sele);
time_bins = (0:size(freq_time_prof,2)-1)/samp_rate;
end

prof_cent = zeros(length(time_bins),length(file_path_list));
prof_vari = zeros(length(time_bins),length(file_path_list));
prof_ener = zeros(length(time_bins),length(file_path_list));
% powe_cent = zeros(length(time_bins),length(file_path_list));
norm_freq_time_prof = zeros(size(freq_time_prof));
% norm_powe_freq_time_prof = zeros(size(powe_freq_time_prof));

for kk = 1:length(file_path_list),
norm_freq_time_prof(:,:,kk) = abs(freq_time_prof(:,:,kk)) ./ repmat(sum(abs(freq_time_prof(:,:,kk)),1), size(freq_time_prof(:,:,kk),1), 1);
% norm_powe_freq_time_prof(:,:,kk) = abs(powe_freq_time_prof(:,:,kk)) ./ repmat(sum(abs(powe_freq_time_prof(:,:,kk)),1), size(powe_freq_time_prof(:,:,kk),1), 1);
prof_cent(:,kk) = freq_bins.' * norm_freq_time_prof(:,:,kk);
% powe_cent(:,kk) = powe_freq_bins.' * norm_powe_freq_time_prof(:,:,kk);

prof_vari(:,kk) = sqrt(sum((repmat(freq_bins, 1, length(time_bins)) - repmat(prof_cent(:,kk).', length(freq_bins), 1)).^2 ...
.* norm_freq_time_prof(:,:,kk), 1));
prof_ener(:,kk) = smooth(sum(freq_time_prof(:,:,kk),1), samp_rate/8+1, 'moving');
end

%% Motion Detection...
moti_indi = smooth(mean(prof_vari,2),samp_rate/2+1,'moving');
dete_thre = 0.015;
time_thre = [0.25, 0.25];
moti_valu = [1, 0];
moti_inde = moti_indi < dete_thre;

for jj = 1:length(time_thre),
curr_thre = time_thre(jj);
curr_valu = moti_valu(jj);
ii = 1;
while ii <= length(moti_inde),
while ii <= length(moti_inde) && moti_inde(ii) ~= curr_valu,
ii = ii + 1;
end
if ii > length(moti_inde),
break;
end
kk = ii + 1;
while kk <= length(moti_inde) && moti_inde(kk) == curr_valu,
kk = kk + 1;
end
if kk - ii < curr_thre * samp_rate,
moti_inde(ii:kk-1) = ~curr_valu;
end
ii = kk;
end
end
moti_indx = find(moti_inde);
note_inte = floor(note_inte * samp_rate);
note_indx = moti_indx(ll);
%note_indx = indx_list(ll);
note_end = moti_indx(end);
note_segm = zeros(n_note, 2);

%% Segmentation...
segm_lowb = 0.8;
segm_higb = 1.2;
for ii = 1:n_note,
note_segm(ii,1) = note_indx;
if ii < n_note,
curr_ener = mean(prof_ener(note_indx:min(note_indx+2*note_inte-1,size(prof_ener,1)),:), 2);
[~, segm_poin] = min(curr_ener(floor(note_inte*segm_lowb)+1:floor(note_inte*segm_higb)));
note_segm(ii,2) = note_indx+floor(note_inte*segm_lowb)+segm_poin-1;
% plot(time_bins(note_indx:min(note_indx+2*note_inte-1,size(prof_ener,1))), curr_ener);
% hold on;
% plot(time_bins(note_indx-1+floor(note_inte*segm_lowb)+segm_poin), curr_ener(floor(note_inte*segm_lowb)+segm_poin), '*');
% hold off;
note_indx = note_indx+floor(note_inte*segm_lowb)+segm_poin;
else
curr_ener = mean(prof_ener(note_indx:note_end,:), 2);
note_segm(ii,2) = note_end;
% plot(time_bins(note_indx:note_end), curr_ener);
end
end

%% Recognition..
for kk = 1:size(prof_cent,2),
prof_cent(:,kk) = smooth(prof_cent(:,kk), samp_rate/4+1, 'moving');
% powe_cent(:,kk) = smooth(powe_cent(:,kk), samp_rate/4+1, 'moving');
end
prof_cent(~moti_inde,:) = 0;
prof_cent_tota = sum(abs(prof_cent),2)/2;
prof_cent_tota = smooth(prof_cent_tota, samp_rate/8+1, 'moving');
% powe_cent(~moti_inde,:) = 0;
% powe_cent_tota = sum(abs(powe_cent),2)/2;
% powe_cent_tota = smooth(powe_cent_tota, samp_rate/8+1, 'moving');

note_list = zeros(1,n_note);
tx = [1.6,1.6].';
rx = [0,1.6;1.6,0].';
init_loc = [0.8,0.8].';
tant_valu = zeros(1,n_note);
dire_valu = zeros(2,n_note);
peak_dist = 0.1;

for ii = 1:n_note,
curr_cent = prof_cent(note_segm(ii,1):note_segm(ii,2),:);
curr_cent_tota = prof_cent_tota(note_segm(ii,1):note_segm(ii,2),:);
% curr_powe_cent = powe_cent(note_segm(ii,1):note_segm(ii,2),:);
% curr_powe_cent_tota = powe_cent_tota(note_segm(ii,1):note_segm(ii,2),:);
curr_cent = curr_cent(moti_inde(note_segm(ii,1):note_segm(ii,2)),:);
curr_cent_tota = curr_cent_tota(moti_inde(note_segm(ii,1):note_segm(ii,2)),:);
midd_poin = floor(length(curr_cent_tota) / 2);
firs_half = sum(curr_cent(1:midd_poin,:),1);
seco_half = sum(curr_cent(midd_poin+1:end,:),1);
for kk = 1:length(file_path_list);
if firs_half(kk) > seco_half(kk),
curr_cent(1:midd_poin,kk) = abs(curr_cent(1:midd_poin,kk));
curr_cent(midd_poin+1:end,kk) = -abs(curr_cent(midd_poin+1:end,kk));
else
curr_cent(1:midd_poin,kk) = -abs(curr_cent(1:midd_poin,kk));
curr_cent(midd_poin+1:end,kk) = abs(curr_cent(midd_poin+1:end,kk));
end
end
curr_cent = curr_cent - repmat(mean(curr_cent, 1),size(curr_cent,1),1);
firs_half = sum(curr_cent(1:midd_poin,:),1);
seco_half = sum(curr_cent(midd_poin+1:end,:),1);
% if (mark_list(ii) >= 1 && mark_list(ii) <= 2) || (mark_list(ii) == 8),
% firs_half(2) = firs_half(2);
% seco_half(2) = seco_half(2);
% elseif (mark_list(ii) >= 4 && mark_list(ii) <= 6),
% firs_half(1) = firs_half(1);
% seco_half(1) = seco_half(1);
% end
tant_valu(ii) = atan2(abs(firs_half(1))+abs(seco_half(1)), abs(firs_half(2))+abs(seco_half(2))) / pi * 180;
dire_valu(:,ii) = firs_half < 0;

if tant_valu(ii) > 60,
if dire_valu(1,ii) == 1,
note_list(ii) = 7;
else
note_list(ii) = 3;
end
elseif tant_valu(ii) < 30,
if dire_valu(2,ii) == 1,
note_list(ii) = 5;
else
note_list(ii) = 1;
end
else
if dire_valu(1,ii) == 1 && dire_valu(2,ii) == 1,
note_list(ii) = 6;
elseif dire_valu(1,ii) == 1 && dire_valu(2,ii) == 0,
note_list(ii) = 8;
elseif dire_valu(1,ii) == 0 && dire_valu(2,ii) == 1,
note_list(ii) = 4;
else
note_list(ii) = 2;
end
end
note_list(ii) = mod(note_list(ii), 8) + 1;
% if tant_valu(ii) > 45,
% if dire_valu(1,ii) == 1,
% note_list(ii) = 8;
% else
% note_list(ii) = 4;
% end
% else
% if dire_valu(2,ii) == 1,
% note_list(ii) = 6;
% else
% note_list(ii) = 2;
% end
% end

% if dire_valu(1,ii) == 1 && dire_valu(2,ii) == 1,
% note_list(ii) = 7;
% elseif dire_valu(1,ii) == 1 && dire_valu(2,ii) == 0,
% note_list(ii) = 1;
% elseif dire_valu(1,ii) == 0 && dire_valu(2,ii) == 1,
% note_list(ii) = 5;
% else
% note_list(ii) = 3;
% end
fprintf('index:%d... tant:%f... dire: %d,%d... mark: %d... pred: %d\n', ii, tant_valu(ii), dire_valu(:,ii), mark_list(ii), note_list(ii));

% [peas,locs,wids,pros] = findpeaks(curr_cent_tota, 'SortStr', 'descend', 'MinPeakDistance', peak_dist * samp_rate, 'WidthReference', 'halfheight');
% if length(locs) == 1,
% if curr_cent_tota(1) > curr_cent_tota(end),
% locs = [locs; 1];
% peas = [peas; curr_cent_tota(1)];
% else
% locs = [locs; length(curr_cent_tota)];
% peas = [peas; curr_cent_tota(end)];
% end
% elseif length(locs) == 0,
% locs = [1; length(curr_cent_tota)];
% peas = [curr_cent_tota(1); curr_cent_tota(end)];
% end
%
% subplot(4,1,1);
% mesh(time_bins(note_segm(ii,1):note_segm(ii,2)), freq_bins, norm_freq_time_prof(:,note_segm(ii,1):note_segm(ii,2),1));view(0,90)
% ylim([-0.04,0.04]);
% subplot(4,1,3);
% mesh(time_bins(note_segm(ii,1):note_segm(ii,2)), freq_bins, norm_freq_time_prof(:,note_segm(ii,1):note_segm(ii,2),2));view(0,90)
% ylim([-0.04,0.04]);
% subplot(4,1,2);
% plot(time_bins(note_segm(ii,1):note_segm(ii,2)),curr_cent(:,1)); hold on; plot(time_bins(locs(1:2)+note_segm(ii,1)-1), curr_cent(locs(1:2),1), '*');hold off;
% subplot(4,1,4);
% plot(time_bins(note_segm(ii,1):note_segm(ii,2)),curr_cent(:,2)); hold on; plot(time_bins(locs(1:2)+note_segm(ii,1)-1), curr_cent(locs(1:2),2), '*');hold off;
end
all_tant = [all_tant, tant_valu];
all_mark = [all_mark, mark_list];
all_dire = [all_dire, dire_valu];
all_resu = [all_resu, note_list];
for uu = 1:8,
type_summ(uu) = type_summ(uu) + sum(mark_list == uu);
for vv = 1:8,
confusion(uu,vv) = confusion(uu,vv)+sum(mark_list == uu & note_list == vv);
end
end
end

for uu = 1:8,
for vv = 1:8,
confusion(uu,vv) = confusion(uu,vv) / type_summ(uu);
end
end
accuracy = (sum(diag(confusion).' .* type_summ)) / sum(type_summ);

boxplot(all_tant, all_mark);
% save(resu_path, 'all_tant', 'all_mark', 'all_resu', 'all_dire', 'accuracy', 'confusion', 'type_summ');

Finally, happy graduation to myself, and I hope grad school in the second half of the year goes just as smoothly.