I am trying to run the following LDPC_Encoder standalone test. Is that possible?
I get the following error about the dataset TbCbsUncoded:
sbaker@ubuntusim12:~/zodiacArtemis2/zodiac/trunk/Prototypes/Artemis2/cuda_test_3/build/cuPHY/examples/ldpc_encode$ ./ldpc_encode /home/sbaker/HDF5/standalone_LDPC_encoder_test/mat_gen_data_1_1144_752_0.hdf5
AERIAL_LOG_PATH unset
Using default log path
Log file set to /tmp/ldpc_encode.log
HDF5-DIAG: Error detected in HDF5 (1.10.7) thread 1:
  #000: …/…/…/src/H5D.c line 298 in H5Dopen2(): unable to open dataset
    major: Dataset
    minor: Can't open object
  #001: …/…/…/src/H5Dint.c line 1429 in H5D__open_name(): not found
    major: Dataset
    minor: Object not found
  #002: …/…/…/src/H5Gloc.c line 420 in H5G_loc_find(): can't find object
    major: Symbol table
    minor: Object not found
  #003: …/…/…/src/H5Gtraverse.c line 848 in H5G_traverse(): internal path traversal failed
    major: Symbol table
    minor: Object not found
  #004: …/…/…/src/H5Gtraverse.c line 624 in H5G__traverse_real(): traversal operator failed
    major: Symbol table
    minor: Callback failed
  #005: …/…/…/src/H5Gloc.c line 376 in H5G__loc_find_cb(): object 'TbCbsUncoded' doesn't exist
    major: Symbol table
    minor: Object not found
H5Dopen(): Unable to open file TbCbsUncoded
terminate called after throwing an instance of 'hdf5hpp::hdf5_exception'
  what(): HDF5 Error
Aborted (core dumped)
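One quick way to see which dataset names the file actually contains (and whether TbCbsUncoded exists under some other group path) is to list them with h5py. This is just a sketch; the filename in the usage comment is the file from the command above:

```python
import h5py

def list_datasets(path):
    """Walk an HDF5 file and return the full paths of all datasets,
    printing each name with its shape and dtype along the way."""
    names = []
    with h5py.File(path, "r") as f:
        def visit(name, obj):
            if isinstance(obj, h5py.Dataset):
                names.append(name)
                print(f"{name}: shape={obj.shape}, dtype={obj.dtype}")
        f.visititems(visit)
    return names

# Usage (substitute the file passed to ldpc_encode):
# names = list_datasets("mat_gen_data_1_1144_752_0.hdf5")
# print("TbCbsUncoded" in names)
```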
If I input the GPU_test_input file to the standalone ldpc_encode example, I get the following:
./ldpc_encode "./GPU_test_input/ldpc_BG-1_Zc-36_C-1_R-0.89.h5"
AERIAL_LOG_PATH set to /home/sbaker/AERIAL_LOG
Log file set to /home/sbaker/AERIAL_LOG/ldpc_encode.log
10:39:02.159077 WRN 46088 0 [NVLOG.CPP] Using /home/sbaker/zodiacArtemis2/zodiac/trunk/Prototypes/Artemis2/cuda_test_3/cuPHY/nvlog/config/nvlog_config.yaml for nvlog configuration
10:39:02.260586 ERR 46088 0 [AERIAL_CUPHY_EVENT] [CUPHY.PDSCH_TX] ERROR: the wrongly structured reference output: 2448 vs. 2376
Note that the GPU input file does include TbCbsCoded
Dataset 'TbCbsCoded'
    Size: 2376x1
I think there is some issue with the generated data. Do you agree?
Can you please provide a spec for the expected HDF5 data file input for both the encoder and the decoder?
For the standalone error-correction LDPC decoder, this is what I get from the command-line help:
-i input_filename Input HDF5 file name, which must contain the following datasets:
sourceData: uint8 data set with source information bits
inputLLR: Log-likelihood ratios for coded, modulated symbols
inputCodeWord: uint8 data set with encoded bits (optional)
(Initial bits are sourceData. No puncturing assumed.)
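As an experiment while waiting for an official spec, one could try building such an input file with h5py. This is a sketch based only on the dataset names in the help text above; the dtypes (uint8 for bits, float32 for LLRs), shapes, and LLR sign convention are my assumptions, not a confirmed cuPHY format:

```python
import numpy as np
import h5py

def write_decoder_input(path, source_bits, llrs, codeword=None):
    """Write a candidate ldpc_decode input file.

    Dataset names are taken from the tool's --help output; dtypes and
    layout are guesses and may need adjusting against a known-good file.
    """
    with h5py.File(path, "w") as f:
        f.create_dataset("sourceData", data=np.asarray(source_bits, dtype=np.uint8))
        f.create_dataset("inputLLR", data=np.asarray(llrs, dtype=np.float32))
        if codeword is not None:  # optional per the help text
            f.create_dataset("inputCodeWord", data=np.asarray(codeword, dtype=np.uint8))
```

Comparing h5dump output for a known-good GPU_test_input file against a file written this way should reveal any dtype or shape mismatches.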
Is the input LLR before or after de-rate-matching?
This is an example I would like to try in standalone testing:
TrBlk size = 966920 (966896 info bits + 24 bit CRC)
Number of CBs = 115
CB size = 8408 + 24 bit CB CRC = 8432 (115*8408=966920)
n_cb = 25344
Zc = 384
BG = BG1
rv = 0
Qm = 8
numFillerBits = 16
Rate-matched size Er = 9088 for CBs 0-14, 9120 for CBs 15-114
If I want to decode just the first CB:
sourceData 8432 = the CB plus CRC
inputLLR 9088 ? = rate matched size
BG = 1
mb = 44 ?
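For what it's worth, the numbers above are self-consistent with the TS 38.212 segmentation and rate-matching formulas. A quick check in Python (the layer count NL = 4 below is my inference from the 15/100 Er split, not something stated in the example):

```python
import math

# TS 38.212 Section 5.2.2 code-block segmentation for BG1.
B = 966920                        # transport block + 24-bit TB CRC
Kcb = 8448                        # max code block size for BG1
L = 24                            # per-CB CRC length (since B > Kcb)
C = math.ceil(B / (Kcb - L))      # number of code blocks
Bp = B + C * L                    # total bits after adding CB CRCs
Kp = Bp // C                      # bits per CB incl. CB CRC
Kb = 22                           # BG1 systematic columns
Zc = 384                          # smallest lifting size with Kb*Zc >= Kp
F = Kb * Zc - Kp                  # filler bits
N = 66 * Zc                       # circular buffer size n_cb for BG1
print(C, Kp, F, N)                # 115 8432 16 25344

# Rate matching (Section 5.4.2.1): the Er split 15 x 9088 / 100 x 9120
# is consistent with Qm = 8 and NL = 4 layers (NL is an assumption).
Qm, NL = 8, 4
G = 15 * 9088 + 100 * 9120        # total rate-matched bits
Gp = G // (NL * Qm)
E_low = NL * Qm * (Gp // C)       # Er for the first CBs
E_high = NL * Qm * math.ceil(Gp / C)
n_low = C - Gp % C                # how many CBs get E_low
print(E_low, E_high, n_low)       # 9088 9120 15
```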
Thanks for your post above. HDF5 tools such as h5ls and h5dump are useful; with them we can guess at the GPU input format.
I understand there is no support for this standalone testing. If so, do you advise that we stop this activity and find some other way to run the encoder in a different fixture?
“The pyAerial library provides a Python-callable bit-accurate GPU-accelerated library for all of the signal processing CUDA kernels in the NVIDIA cuBB layer-1 PDSCH and PUSCH pipelines. In other words, the pyAerial Python classes behave in a numerically identical manner to the kernels employed in cuBB because a pyAerial class employs the exact same CUDA code as the corresponding cuBB kernel: it is the CUDA kernel but with a Python API.”