yk2011

Credits: 473
Uploaded files: 5
Downloads: 1
Registered: 2011-08-20 16:07:03

Upload list
WinMain.zip - A small project built around the WinMain function, intended to help newcomers learn C++. 2011-08-20 16:54:26, downloaded 4 times
C-Languige-programe.zip - C language program design examples; a popular topic. 2011-08-20 16:51:52, downloaded 3 times
sova0.rar - This function implements the Soft Output Viterbi Algorithm (SOVA) in trace-back mode. Input: rec_s, the scaled received bits, rec_s(k) = 0.5 * L_c(k) * y(k), where L_c = 4 * a * Es/No is the channel reliability value and y is the received bits (see the scaling sketch after this list); g, the encoder generator matrix in binary form, with g(1,:) for feedback and g(2,:) for feedforward; L_a, the a priori information about the info bits, i.e. the extrinsic information from the previous component decoder; ind_dec, the index of the component decoder (=1: component decoder 1, whose trellis is terminated to the all-zero state; =2: component decoder 2, whose trellis is not perfectly terminated). Output: L_all = log( P(x=1|y) / P(x=-1|y) ). Frame size is info + tail bits. 2011-08-20 16:41:48, downloaded 17 times
trellis.zip - Sets up the trellis for a given code generator g, where g is in binary matrix form, e.g. g = [1 1 1; 1 0 1]. next_out(i,1:2): trellis output (systematic bit, parity bit) when input = 0 and state = i, with next_out(i,j) = -1 or 1; next_out(i,3:4): trellis output (systematic bit, parity bit) when input = 1 and state = i. next_state(i,1): next state when input = 0 and state = i; next_state(i,2): next state when input = 1 and state = i; state indices run over 1,...,2^m. last_out(i,1:2): trellis output (systematic bit, parity bit) when input = 0 and state = i, with last_out(i,j) = -1 or 1; last_out(i,3:4): the same when input = 1. last_state(i,1): previous state that comes to state i when the info bit = 0; last_state(i,2): previous state that comes to state i when the info bit = 1. (See the trellis construction sketch after this list.) 2011-08-20 16:37:59, downloaded 8 times
turbo_sys_demo.zip - This script simulates the classical turbo encoding-decoding system with parallel concatenated convolutional codes. Two rate-1/2 RSC (Recursive Systematic Convolutional) component encoders are assumed. The first encoder is terminated with tail bits; the (info + tail) bits are scrambled and passed to the second encoder, while the second encoder is left open without tail bits of its own. Random information bits are modulated into +1/-1 and transmitted through an AWGN channel. Interleavers are randomly generated for each frame. The Log-MAP algorithm without quantization or approximation is used. By making use of ln(e^x + e^y) = max(x, y) + ln(1 + e^(-|x - y|)), the Log-MAP can be simplified with a look-up table for the correction function (see the sketch after this list). If the approximation ln(e^x + e^y) ≈ max(x, y) is used, it becomes MAX-Log-MAP. 2011-08-20 16:35:19, downloaded 53 times
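
As context for the rec_s scaling mentioned in the sova0.rar entry, here is a minimal Python sketch of that input preparation. The function and parameter names (scale_received, fading_amp, EsN0) and the BPSK/AWGN example are illustrative assumptions, not part of the uploaded MATLAB code.

```python
import numpy as np

def scale_received(y, EsN0, fading_amp=1.0):
    """Scale received BPSK samples for a soft-input decoder.

    L_c = 4 * a * Es/No is the channel reliability value, and
    rec_s(k) = 0.5 * L_c * y(k), as stated in the sova0 description.
    (fading_amp and EsN0 are illustrative parameter names.)
    """
    L_c = 4.0 * fading_amp * EsN0
    return 0.5 * L_c * np.asarray(y)

# Example: +1/-1 symbols through an AWGN channel at Es/No = 1 (0 dB)
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 10)
symbols = 2 * bits - 1                              # map {0,1} -> {-1,+1}
noisy = symbols + rng.normal(0, np.sqrt(0.5), 10)   # sigma^2 = No/2 with Es = 1
rec_s = scale_received(noisy, EsN0=1.0)
```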
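The trellis.zip entry lists four lookup tables (next_out, next_state, last_out, last_state). Below is a rough Python sketch of how the forward tables could be built for a rate-1/2 RSC encoder with g(1,:) as feedback and g(2,:) as feedforward. The state ordering, register convention, and 0-based indexing are assumptions for illustration and may differ from the uploaded MATLAB file.

```python
import numpy as np

def build_trellis(g):
    """Build next_state / next_out tables for a rate-1/2 RSC encoder.

    g: binary generator matrix, g[0] = feedback taps, g[1] = feedforward taps,
    e.g. g = [[1, 1, 1], [1, 0, 1]]. Outputs use the -1/+1 convention
    (systematic bit, parity bit), matching the description above.
    """
    g = np.asarray(g)
    m = g.shape[1] - 1                      # encoder memory
    n_states = 2 ** m
    next_state = np.zeros((n_states, 2), dtype=int)
    next_out = np.zeros((n_states, 4), dtype=int)

    for s in range(n_states):
        # state as a bit vector, most recent register bit first
        reg = [(s >> (m - 1 - i)) & 1 for i in range(m)]
        for u in (0, 1):
            # feedback bit: input XOR feedback taps over the register
            a = (u + sum(g[0, 1:] * reg)) % 2
            # parity bit: feedforward taps over (feedback bit, register)
            parity = (g[1, 0] * a + sum(g[1, 1:] * reg)) % 2
            shifted = [a] + reg[:-1]        # shift-register update
            next_state[s, u] = sum(b << (m - 1 - i) for i, b in enumerate(shifted))
            next_out[s, 2 * u:2 * u + 2] = (2 * u - 1, 2 * parity - 1)
    return next_state, next_out

next_state, next_out = build_trellis([[1, 1, 1], [1, 0, 1]])
```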
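The turbo_sys_demo.zip entry cites the identity ln(e^x + e^y) = max(x, y) + ln(1 + e^(-|x - y|)). The Python sketch below shows that max* operation, the MAX-Log-MAP approximation, and a tabulated correction term; the 8-entry table granularity is an arbitrary illustrative choice, not taken from the uploaded script.

```python
import numpy as np

def max_star_exact(x, y):
    """Log-MAP: ln(e^x + e^y) = max(x, y) + ln(1 + exp(-|x - y|))."""
    return max(x, y) + np.log1p(np.exp(-abs(x - y)))

def max_star_approx(x, y):
    """MAX-Log-MAP: drop the correction term, keep only max(x, y)."""
    return max(x, y)

# The correction term depends only on |x - y|, so it can be tabulated
# (the 8-entry table below is an arbitrary illustrative granularity).
diffs = np.linspace(0.0, 5.0, 8)
table = np.log1p(np.exp(-diffs))

def max_star_lut(x, y):
    """Log-MAP with a lookup table for the correction function."""
    idx = min(int(abs(x - y) / 5.0 * 7), 7)
    return max(x, y) + table[idx]

print(max_star_exact(1.0, 0.2), max_star_approx(1.0, 0.2), max_star_lut(1.0, 0.2))
```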

Recent downloads

Favorites