Here is my code. It tests my method for generating pseudorandom numbers for my diamond-square terrain generator. It was meant as a proof of concept, to see whether the numbers it produces are random enough for my purposes.
Code:
#include <stdio.h>

int main() {
    unsigned int seed, r, h, i; // Declare all our delicious unsigned ints.
    printf("Please input the seed, the range, the decrement (to the range), and the initial.\n"); // Just so the user knows what to input where.
    scanf("%d, %d, %d, %d", &seed, &r, &h, &i); // Get the parameters from the user.
    int v = i + (((seed * (seed % 10) + 5) % r) - (r / 2)); // Create the number we'll be printing, and give it a believable value.
    int n;
    for (n = 0; n <= 50; n++) {
        // To get the nth digit of integer i, use ((i % (10^n)) / (10^(n - 1))).
        printf("%d ", v);
        h++;
        v = i + (((seed * ((seed % (10^n)) / (10^(n - 1))) + n) % (r - h)) - ((r - h) / 2)); // Generate a new v, with a smaller range than last time.
    }
}
Here is the error Xcode gave me. Notice that I declared v and n as plain ints rather than unsigned ints; I did this to avoid problems that might have come from comparing incompatible data types (although I didn't think that was the problem). The code builds, and gets as far as the scanf, which I complete with glee... and then it gives me some bull about EXC_ARITHMETIC. What's that supposed to be?