I am trying to debug some confusing behavior in a parallelized code. One possible source of error is that the author uses long ints for array indices, but for some reason casts one of them to an int for a function call, inside which it is again used as an array index. Printing the value showed that, sure enough, it was larger than INT_MAX before the cast.
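Here is a minimal sketch of the pattern, with made-up names since I can't post the real code; the actual function and array are different, but the narrowing cast is the same idea:

```c
#include <limits.h>
#include <stdio.h>

/* Hypothetical stand-in for the real function: it takes the index as an int. */
static double get_element(const double *data, int i)
{
    return data[i];
}

int main(void)
{
    /* Index computed as a long; in the real run it ends up above INT_MAX. */
    long idx = (long)INT_MAX + 1000;
    printf("long index before cast: %ld\n", idx);

    /* This is the cast I'm asking about. */
    int narrowed = (int)idx;
    printf("int index after cast:   %d\n", narrowed);

    /* In the real code this narrowed value is then passed along, roughly:
       get_element(big_array, narrowed);  */
    return 0;
}
```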
Is there a standard way a cast like this is handled, or is it undefined? I am getting different results from run to run, which suggests something undefined is happening, but I am not sure whether this cast is the cause. Would the cast itself behave deterministically?