Thread: Encoding a data structure based on TLV encoding

  1. #1
    Registered User
    Join Date
    Dec 2012
    Posts
    13

    Encoding a data structure based on TLV encoding

    I have to encode the parameters present inside a structure based on X.690 encoding.
    Suppose my structure is:
    Code:
    struct Data_Struct
       {
          parameter1
          parameter2
          parameter3
       }
    Some or all of these parameters may hold valid data; say, parameter1 and parameter3 do.
    In that case I am supposed to encode only parameter1 and parameter3 using TLV encoding.
    Do I have to follow a sequential procedure like this?


    Code:
    Check whether parameter 1 is present
        If present, find the tag of the parameter from a lookup table and encode it
    Check whether parameter 2 is present
        If present, find the tag of the parameter from a lookup table and encode it
    Check whether parameter 3 is present
        If present, find the tag of the parameter from a lookup table and encode it

    As the procedure is repetitive, can I modularize it? What would be the best way to do that? Is there any way to access the parameters sequentially? How can I establish a relationship between a parameter and its tag? The length of the value is variable.
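    For illustration, that repeated check-and-encode step can be made table-driven. Everything in the sketch below (the presence flags, the tag values 0x81-0x83, the encode_tlv helper) is an assumption added for the example, not part of the original struct:
    Code:
     #include <cstddef>
     #include <vector>

     // Assumed layout: each parameter carries a presence flag (added only for this sketch)
     struct Data_Struct {
        int parameter1; bool parameter1_present;
        int parameter2; bool parameter2_present;
        int parameter3; bool parameter3_present;
     };

     // Hypothetical helper: append one Tag-Length-Value triple for an int value
     void encode_tlv(unsigned char tag, int value, std::vector<unsigned char>& out)
     {
        out.push_back(tag);                         // T
        out.push_back(4);                           // L: 4 value bytes, simplified
        for (int shift = 24; shift >= 0; shift -= 8)
           out.push_back(static_cast<unsigned char>((value >> shift) & 0xFF)); // V
     }

     void encode(const Data_Struct& d, std::vector<unsigned char>& out)
     {
        // lookup table relating each parameter to its (assumed) tag and presence flag
        const struct Row { bool present; unsigned char tag; int value; } table[] = {
           { d.parameter1_present, 0x81, d.parameter1 },
           { d.parameter2_present, 0x82, d.parameter2 },
           { d.parameter3_present, 0x83, d.parameter3 },
        };
        for (std::size_t i = 0; i < sizeof table / sizeof table[0]; ++i)
           if (table[i].present)
              encode_tlv(table[i].tag, table[i].value, out);
     }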
    Last edited by Sajas K K; 02-14-2013 at 12:24 AM.

  2. #2
    Registered User
    Join Date
    Oct 2006
    Posts
    3,445
    My recommendation is to put the struct(s) that need to be encoded in their own header file, and then write a program that parses that header file and generates the encoder/decoder automatically.
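    A rough sketch of that idea: read the header line by line and emit one encode call per struct member. The header name, the "type name;" member syntax it expects, and the helpers it writes calls to (Buffer, encode_tlv, tag_of, the _present flags) are all assumptions for illustration.
    Code:
     #include <fstream>
     #include <sstream>
     #include <string>

     int main()
     {
        std::ifstream header("Data_Struct.h");      // header containing the struct
        std::ofstream gen("encode_generated.cpp");  // encoder being generated
        std::string line;
        bool inside = false;

        gen << "void encode_Data_Struct(const Data_Struct& d, Buffer& out)\n{\n";
        while (std::getline(header, line)) {
           if (line.find('{') != std::string::npos) { inside = true;  continue; }
           if (line.find('}') != std::string::npos) { inside = false; continue; }
           if (!inside) continue;

           std::istringstream fields(line);
           std::string type, name;
           if (fields >> type >> name) {            // expects members like "int parameter1;"
              if (!name.empty() && name[name.size() - 1] == ';')
                 name.erase(name.size() - 1);
              gen << "    if (d." << name << "_present)\n"
                  << "        encode_tlv(tag_of(\"" << name << "\"), d." << name << ", out);\n";
           }
        }
        gen << "}\n";
     }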

  3. #3
    Registered User C_ntua's Avatar
    Join Date
    Jun 2008
    Posts
    1,853
    I'd say create a TLV class first, and define the available encodings as an enum whose values equal the linked tags.
    Code:
     #include <vector>

     enum encoding { ASCII = 0x01, binary = 0x02 };

     class TLV
     {
        unsigned char tag;                  // one byte identifying the type
        std::vector<unsigned char> value;   // the value as a byte array
     public:
        TLV(unsigned char t, int length, const char* s)
           : tag(t), value(s, s + length) {}

        unsigned char get_tag() const { return tag; }
        const std::vector<unsigned char>& get_value() const { return value; }
     };
    Then create a parser; let's assume that you have a string literal as input:
    Code:
     // input: a hex string of concatenated TLVs, e.g. "0102...": tag, length, value bytes
     void parser(const char* input, std::vector<TLV>& output) {
        ...
     }
    Finally, you just need to encode depending on the tag. You can do so like this:
    Code:
     std::vector<TLV> tlvs;
     parser("010231310203112233", tlvs);
     for (std::size_t i = 0; i < tlvs.size(); ++i) {
         if (tlvs[i].get_tag() == ASCII) {
            .... // treat the value bytes as ASCII text
         } else if (tlvs[i].get_tag() == binary) {
            ... // keep the value as raw bytes
         }
     }
    //this could potentially output
    //parameter_1 = "11"
    //parameter_3 = 0x112233
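    For completeness, one possible body for that parser, using the TLV class above. It assumes (as the example input and output suggest) that each TLV is written as two hex digits of tag, two of length, then 2*length hex digits of value; hex_byte is a helper invented for this sketch.
    Code:
     #include <cstddef>
     #include <cstdlib>
     #include <string>
     #include <vector>

     static unsigned char hex_byte(const char* p)    // convert two hex digits to one byte
     {
        const char buf[3] = { p[0], p[1], '\0' };
        return static_cast<unsigned char>(std::strtoul(buf, 0, 16));
     }

     void parser(const char* input, std::vector<TLV>& output)
     {
        const std::string s(input);
        std::size_t i = 0;
        while (i + 4 <= s.size()) {
           const unsigned char tag = hex_byte(&s[i]);        // T
           const unsigned char len = hex_byte(&s[i + 2]);    // L
           std::vector<char> value;                          // V
           for (unsigned char b = 0; b < len && i + 4 + 2 * b + 1 < s.size(); ++b)
              value.push_back(static_cast<char>(hex_byte(&s[i + 4 + 2 * b])));
           output.push_back(TLV(tag, static_cast<int>(value.size()),
                                value.empty() ? "" : &value[0]));
           i += 4 + 2 * static_cast<std::size_t>(len);       // advance to the next TLV
        }
     }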

