Thread: Convert.ToDecimal vs (decimal)

  1. #1
    Registered User | Join Date: Aug 2007 | Location: Gloucester, England | Posts: 11

    Convert.ToDecimal vs (decimal)

    Hello everyone,

    First post

    Does anyone know the difference between the following two ways to convert one type to another?

    Code:
    decimal myVar = Convert.ToDecimal(1.23);
    and

    Code:
    decimal myVar = (decimal)(1.23);
    Using the Stopwatch class I ran each of the above statements through a loop 10,000 times, and Convert.ToDecimal was about 40 times slower...
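    A minimal sketch of that kind of timing loop (this harness is my reconstruction, not Pete's original code):

    ```csharp
    using System;
    using System.Diagnostics;

    class CastBenchmark
    {
        static void Main()
        {
            const int iterations = 10_000;
            decimal sink = 0m;

            // Time Convert.ToDecimal: a method call that converts the double 1.23 at run time.
            var sw = Stopwatch.StartNew();
            for (int i = 0; i < iterations; i++)
                sink = Convert.ToDecimal(1.23);
            sw.Stop();
            Console.WriteLine($"Convert.ToDecimal: {sw.ElapsedTicks} ticks");

            // Time the cast: (decimal)1.23 is a constant expression the compiler can fold.
            sw = Stopwatch.StartNew();
            for (int i = 0; i < iterations; i++)
                sink = (decimal)1.23;
            sw.Stop();
            Console.WriteLine($"(decimal) cast:    {sw.ElapsedTicks} ticks");

            // Use the result so the loops can't be optimized away entirely.
            Console.WriteLine(sink);
        }
    }
    ```

    Note that such a micro-benchmark mostly measures call overhead and constant folding, not the conversion itself.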

    Regards

    Pete

  2. #2
    Sweet | Join Date: Aug 2002 | Location: Tucson, Arizona | Posts: 1,820
    I am honestly not sure, but maybe one is a compile-time conversion and the other is a run-time conversion.

  3. #3
    Registered User | Join Date: Aug 2007 | Location: Gloucester, England | Posts: 11

    Compiling

    Thanks, you're right. A friend has told me that the (decimal)(1.23) is compiled at runtime, whereas the Convert.ToDecimal method is determined at compile time.

    Perhaps the performance of Convert.ToDecimal improves once built then.

    Thanks.

    Pete

  4. #4
    Sweet | Join Date: Aug 2002 | Location: Tucson, Arizona | Posts: 1,820
    I doubt Convert.ToDecimal is compile time, since it is a function call. I was talking about (decimal). The compiler requires that explicit cast because it wants to make sure you really want that value to be a decimal.

  5. #5
    Technical Lead QuantumPete's Avatar
    Join Date
    Aug 2007
    Location
    London, UK
    Posts
    894
    Quote Originally Posted by Pete_O View Post
    [...] is compiled at runtime [...]
    I don't think there is such a thing. An expression is evaluated either at compile time *or* at run time. And prog-bman is right: Convert.ToDecimal() is a function call, hence it is evaluated at run time. Whereas (decimal) is a cast, which tells the compiler how to interpret an expression; for a constant like (decimal)1.23 it can be evaluated entirely at compile time.

    QuantumPete
    "No-one else has reported this problem, you're either crazy or a liar" - Dogbert Technical Support
    "Have you tried turning it off and on again?" - The IT Crowd
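    One way to see the difference QuantumPete describes (my own illustration, not from the thread): a cast of a literal is a constant expression, so it can initialize a const field, while a method call never can:

    ```csharp
    using System;

    class ConstDemo
    {
        // Legal: (decimal)1.23 is a constant expression, folded at compile time.
        const decimal ViaCast = (decimal)1.23;

        // A method call is always evaluated at run time, so this line
        // would NOT compile if uncommented:
        // const decimal ViaConvert = Convert.ToDecimal(1.23);

        static void Main()
        {
            // Both routes produce the same value; only *when* the work happens differs.
            decimal viaConvert = Convert.ToDecimal(1.23);
            Console.WriteLine(ViaCast == viaConvert); // True
        }
    }
    ```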

  6. #6
    Registered User | Join Date: Sep 2007 | Location: Adelaide, Australia | Posts: 9
    As mentioned above, one is a function call where the other is a cast. However, they are designed for slightly different things.

    For a numeric type such as an int or a double, Convert.ToDecimal just performs the same conversion a cast would - the reason it is slower in your test is the extra function-call overhead, plus the fact that the compiler folds (decimal)1.23 to a constant at compile time, so the cast version does no conversion work at all in the loop.

    For a string, however, a cast to decimal won't even compile. Convert.ToDecimal(string), on the other hand, will try to parse the string.

    The rule of thumb is: if you know you can cast, then cast. If you're unsure about the type being passed in (it could be a string, or an int, or who knows), then use Convert.
