This comment came up in another thread:
The sentiment above, from a student, illustrates this completely. Should students learn the idiosyncrasies of C and get bogged down in them, or would they be better served learning algorithms, meticulousness, attention to detail, deductive logic, systematic approaches, and a more general approach to solving problems with computers? Perhaps C isn't the best tool for introducing programming.
I'm an old geezer/old-timer... I was taught programming using BASIC (which, incidentally, was designed as a teaching tool). In high school we learned how to approximate Pi using series, and to solve area and volume problems (early methods equivalent to integrals). For example: what's the most efficient shape for a cylinder, to minimize surface area for a given volume? In other words, how would you design a pop can if material costs were the same for the sides and the top?
Language idiosyncrasies were few, if any. Simple IF, PRINT, FOR/NEXT. I don't recall anyone needing remedial help with syntax issues. More likely they were asking why the results were off... or why one had to convert from degrees to radians.
C doesn't seem appropriate because there are too many "pointer" issues, memory allocation, integer vs. float surprises, plus bracket and indenting distractions; all of it misses the point of making computer-literate people entirely.
Maybe it's because we used to call it "computer science" in the mid-'70s. Where's the science nowadays in tracking down some segmentation fault or core dump?
That's my opinion.