So I am trying to write a program that converts roman numerals into decimal numbers.
I so far have come up with:
Code:
#include <stdio.h>
#include <ctype.h> // importing the tolowerfunction
//variables
int decimal, total;
char numeral[];
//prototypes
int decimal_to_roman(char input);
int roman_to_decimal(char letter);
int roman_str_to_decimal(char input)
{
int i;
for (i=0; i< strlen(input); i++)
{
total += roman_to_decimal(numeral[i]);
}
}
int roman_to_decimal(char letter)
{
switch(tolower(letter))
{
case 'm':
return 1000;
case 'd':
return 500;
case 'c':
return 100;
case 'l':
return 50;
case 'x':
return 10;
case 'v':
return 5;
case 'i':
return 1;
default:
break;
}
}
int main(void)
{
printf("Please enter the first number: ");
scanf("%s", &numeral);
roman_str_to_decimal(numeral);
printf("%d\n", total);
//printf("%c\n", numeral[2]);
}
but each time I compile it, it times out as if it were hitting an infinite loop. I have a feeling that I am not passing an individual character to the roman_to_decimal function but am unsure why.