Thread: Best way to convert std::vector<unsigned char> to std::vector<char>?

  1. #1
    Registered User
    Join Date
    Nov 2006
    Posts
    184

    Best way to convert std::vector<unsigned char> to std::vector<char>?

    If I have a vector of unsigned chars, what's the best way to pass it to a function that only takes a vector of chars? What about a vector of signed chars?

    In the conversion, is there any risk of losing data?

  2. #2
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    Quote Originally Posted by 6tr6tr View Post
    If I have a vector of unsigned chars, what's the best way to pass it to a function that only takes a vector of chars? What about a vector of signed chars?

    In the conversion, is there any risk of losing data?
    You have to copy the data to a vector of signed chars. There's no good way to just twiddle the type. However, there is no risk of data loss. Only the type changes -- the bit patterns remain the same.

    A custom vector class could be written that allows magical conversion to and from signed or unsigned, but you can't do it with std::vector<T>.
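    A minimal sketch of that copy, assuming the receiving function looks something like the hypothetical takesChars below:
    Code:
    #include <vector>

    // Hypothetical stand-in for a function that only accepts a vector of char.
    void takesChars(const std::vector<char>& data) { (void)data; }

    void passAlong(const std::vector<unsigned char>& src)
    {
        // Copy element by element; each unsigned char is converted to char.
        // On the usual two's-complement platforms the bit patterns are unchanged.
        std::vector<char> copy(src.begin(), src.end());
        takesChars(copy);
    }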

  3. #3
    C++まいる!Cをこわせ!
    Join Date
    Oct 2007
    Location
    Inside my computer
    Posts
    24,654
    But then, seeing as it only differs by type, how about casting it to an appropriate vector type? If the values are in the signed range, it should be fine, but I'm guessing it's sort of undefined behavior...
    Quote Originally Posted by Adak View Post
    io.h certainly IS included in some modern compilers. It is no longer part of the standard for C, but it is nevertheless included in the very latest Pelles C versions.
    Quote Originally Posted by Salem View Post
    You mean it's included as a crutch to help ancient programmers limp along without them having to relearn too much.

    Outside of your DOS world, your header file is meaningless.

  4. #4
    The larch
    Join Date
    May 2006
    Posts
    3,573
    If you can change the function that you want to pass to, how about making it templated?

    Otherwise, how about using a vector of char to begin with?
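    A minimal sketch of the template idea, assuming you are free to change the receiving function (processBytes is a hypothetical name):
    Code:
    #include <cstddef>
    #include <vector>

    // Templated on the element type, so the same function accepts
    // std::vector<char>, std::vector<signed char> or std::vector<unsigned char>
    // without any copying or conversion.
    template <typename CharT>
    std::size_t processBytes(const std::vector<CharT>& data)
    {
        return data.size();  // placeholder body
    }

    // Usage:
    //   std::vector<unsigned char> u;
    //   std::vector<char> c;
    //   processBytes(u);
    //   processBytes(c);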
    I might be wrong.

    Thank you, anon. You sure know how to recognize different types of trees from quite a long way away.
    Quoted more than 1000 times (I hope).

  5. #5
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    Quote Originally Posted by Elysia View Post
    But then, seeing as it only differs by type, how about casting it to an appropriate vector type? If the values are in the signed range, it should be fine, but I'm guessing it's sort of undefined behavior...
    Uh... That's not possible. Okay, it is, but it's sick and wrong.

  6. #6
    Registered User
    Join Date
    Nov 2006
    Posts
    184
    Thanks. So I can't do the following?

    Code:
    std::vector<char> newOne = std::vector<char>( oldUnsignedCharVector.begin(), oldUnsignedCharVector.end() );

  7. #7
    Officially An Architect brewbuck's Avatar
    Join Date
    Mar 2007
    Location
    Portland, OR
    Posts
    7,396
    Quote Originally Posted by 6tr6tr View Post
    Thanks. So I can't do the following?

    Code:
    std::vector<char> newOne = std::vector<char>( oldUnsignedCharVector.begin(), oldUnsignedCharVector.end() );
    That seems like it should work, since there is a conversion available between unsigned char and char...

  8. #8
    Registered User
    Join Date
    Nov 2006
    Posts
    184
    Quote Originally Posted by brewbuck View Post
    That seems like it should work, since there is a conversion available between unsigned char and char...
    Cool, thanks! I assume then it'll also work for a signed char?

  9. #9
    C++ Witch laserlight's Avatar
    Join Date
    Oct 2003
    Location
    Singapore
    Posts
    28,413
    So I can't do the following?
    That should be okay, since each individual unsigned char element is converted to char, rather than the whole std::vector<unsigned char> being cast to std::vector<char>. In other words, you are copying the data into a vector of chars.
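    For what it's worth, a small demonstration of that element-wise conversion (the exact output for the second value depends on whether char is signed on your platform):
    Code:
    #include <iostream>
    #include <vector>

    int main()
    {
        std::vector<unsigned char> src = {65, 200};

        // Range constructor: each element is converted to char individually.
        std::vector<char> dst(src.begin(), src.end());

        // Where char is a signed 8-bit type, 200 typically comes out as -56:
        // the same bit pattern, just interpreted as a signed value.
        std::cout << static_cast<int>(dst[0]) << ' '
                  << static_cast<int>(dst[1]) << '\n';   // e.g. "65 -56"
    }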
    Quote Originally Posted by Bjarne Stroustrup (2000-10-14)
    I get maybe two dozen requests for help with some sort of programming or design problem every day. Most have more sense than to send me hundreds of lines of code. If they do, I ask them to find the smallest example that exhibits the problem and send me that. Mostly, they then find the error themselves. "Finding the smallest program that demonstrates the error" is a powerful debugging tool.
    Look up a C++ Reference and learn How To Ask Questions The Smart Way

  10. #10
    Cat without Hat CornedBee's Avatar
    Join Date
    Apr 2003
    Posts
    8,895
    Quote Originally Posted by 6tr6tr View Post
    Thanks. So I can't do the following?

    Code:
    std::vector<char> newOne = std::vector<char>( oldUnsignedCharVector.begin(), oldUnsignedCharVector.end() );
    Yes, that works, but you can also do this:
    Code:
    std::vector<char> newOne( oldUnsignedCharVector.begin(), oldUnsignedCharVector.end() );
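    The same range-constructor approach should also cover the signed char case asked about earlier; a minimal sketch:
    Code:
    #include <vector>

    void example(const std::vector<unsigned char>& oldUnsignedCharVector)
    {
        // Each element is converted from unsigned char to signed char on copy.
        std::vector<signed char> asSigned(oldUnsignedCharVector.begin(),
                                          oldUnsignedCharVector.end());
        (void)asSigned;  // silence unused-variable warnings in this sketch
    }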
    All the buzzt!
    CornedBee

    "There is not now, nor has there ever been, nor will there ever be, any programming language in which it is the least bit difficult to write bad code."
    - Flon's Law
