Thread: command line and convergence

  1. #1
    Registered User
    Join Date
    Mar 2011
    Posts
    2

    command line and convergence

    <<< split from command line arguments >>>

    I need help please, i need to write a program that accepts a single integer as a command line argument, validate and the perfoms process described below until it converges and reports how many iterations it took to convrge and what is the converged value.

    the process is :
    take any three digit number and arrange its digits to descending order and ascending arder and subtract the smaller number from the bigger number. Repeat with the result until the resulting number stops changing (converges). for example

    using 132:
    321 - 123 = 198
    981 - 189 = 792
    972 - 279 = 639
    963 - 369 = 594
    954 - 459 = 495
    954 - 459 = 495
    you see it brings the same result now (converges)

  2. #2
    Banned
    Join Date
    Aug 2010
    Location
    Ontario Canada
    Posts
    9,547
    We don't do homework for you. But some of us will help you if you run into problems doing your homework.

    Work it as far as you can on your own. If you get stuck, post your code and ask specific questions ... Also see: << !! Posting Code? Read this First !! >>

  3. #3
    Registered User
    Join Date
    Mar 2011
    Posts
    2
    ohh sorry bra, i did something you know!!!

    this is my program:
    Code:
    # include <stdio.h>
    
    int main(int argc, char *argv[])
    {
        if (argc == 1)
        {
                 printf("No arguments were entered.\n");
                 return -1;
        if (argc <= 0)
        {
                 printf("It is zero or negative.\n");
                 system("pause");
                 return -1;
        }
        if (argc < 3)
        {
                 printf("It is not a 3 digit number.\n");
                 system("pause");
                 return -1;
        }
        int a, b, c, d;
        int i = 0;
        a = 1;
        b = 3;
        c = 2;
        
        for (i=0;i<1;i++)
        {
            d = a*100 + b*10 + c;
            printf("%d\n",d);
        }
        
        void sort (int d[], int n)
        {
             int i, j, temp;
             
             for (i=1;i<n-1;++i)
                 for (j=i+2;j<n;++j)
                     if ( d[i]>d[j])
                     {
                          temp = d[i];
                          d[i] = d[j];
                          d[j] = temp;
                     }
                     if ( d[i]<d[j]
                     {
                          temp = d[i];
                          d[i] = d[j];
                          d[j] = temp;
                     }
        }
        long difference = 0;
        int i;
        int answer;
        
        for (i=1;i<argc;i++)
        {
            printf("%s: %d\n",argv[i],atoi(argv[i]))
            difference-= atoi(argv[i]);
        }
        
        
        return -1;
        system("pause");
    }
    i don't understand which loop to use for taking the answer and arranging it again
    PLEASE someone help me

Popular pages Recent additions subscribe to a feed

Similar Threads

  1. Iterations and convergence
    By WOPR in forum C Programming
    Replies: 1
    Last Post: 12-22-2005, 11:23 AM