C should be taught like any other language. One should understand the historical context in which a language was born and why it gained popularity. A lot of "vintage" C code is indeed bad because elementary software engineering principles were not widely understood back then. Pascal code of the same vintage isn't significantly better, either. We've come a long way in collectively understanding the importance of modularity, cohesion and coupling, in balancing complexity, and in learning that clever isn't always better in the long run.
C still is, and will continue to be for the foreseeable future, the lingua franca, the least common denominator. Acknowledging how C comes with a very... interesting set of tradeoffs that make it uniquely well-suited for certain purposes and at the same time incredibly dangerous is a worthwhile proposition if one is aiming to truly understand C development.
True, but a good C course should explain exactly those concepts and what kinds of problems (in relation to assembly and other HLLs) C was meant to solve at the time.
I would highlight the following:
* Structured programming support, which means nested loops and conditionals without a primary need for "goto" jumps, enabling a sense of "depth" that is missing in the "flat" world of assembly
* Expression-oriented syntax, meaning that operators (even those having a side-effect) return a value of a certain type, and can be nested, again enabling recursive program structure versus a flattened one
* Global symbol allocation and resolution, which means that a programmer uses names rather than addresses to refer to global variables and functions
* Abstraction over function calling conventions, which enables the programmer not to worry about function prologues and epilogues and the order of pushing arguments on the stack or in registers
* Automatic storage management, meaning that a function-scope local variable is used by the programmer with its name and the compiler decides whether to put it in a register or at a certain offset in the stack frame
* Rudimentary integer-based type system that distinguishes between a scalar and a fixed-size collection of scalars laid out sequentially (arrays), plus special integers called "pointers" supporting a different set of operations (dereferencing to a certain type, and adding or subtracting other integers from them, without any safety guarantees whatsoever)
Nothing more, nothing less. Not understanding these foundations is the source of major pain.
The expression-orientation of C really sets it apart from even its successors. Declarations look like expressions; nearly everything does. You can see in C code from the time how heavily expressions are used. C++, Java, etc. all added a bunch of new keywords to the syntax that make them much more like Pascal or other 'normal' languages. The culture of those languages leans much more towards statements as well.
Macro Assemblers, which were never a thing in UNIX, do offer support for structured programming, see MASM, TASM, or going back to the days C was born, something like HLASM on IBM mainframes.
Additionally many of the C features had already been sorted out in JOVIAL, NEWP, PL/I, BLISS among others about a decade before C was born.
C was solving the issues of the UNIX v3 design; that is all.
Plenty of languages can be used to teach low level programming concepts.
> C still is, and will continue to be for the foreseeable future, the lingua franca, the least common denominator.
In the context of platform ABIs, sure. The widespread stabilization and ossification of C ABIs is a boon for the rest of the ecosystem but it's entirely at the expense of the C language/stdlib. Hence the performance advantage of projects like fmtlib.
Notwithstanding its ubiquity C is in many ways "The Sick Man of Asia". Every major C compiler is written in C++ with tooling heading the same way. The dominance of C++ in the heterogeneous space has accelerated this trend and spread it to many HPC libraries. Even foundational bits such as Microsoft's UCRT or llvm-libc are written in C++.
On the current trajectory C will become the next Fortran, i.e. a widely used language which is nonetheless unable to support itself.
A little. While I have heard it (from native speakers and non-natives alike), it's less common. I also find it confusing because "least" can be interpreted as "lowest", or it can be interpreted in the context of the following word, "common". "Least common" (meaning infrequent, or rarest) gives a meaning that is different from and misleading compared to "lowest", which is why the phrasing sounds weird to me.
I think so? To me, "least" seems to be used for things like patience, distance, tidiness, etc., whereas "lowest" seems to be used more for money or countable things.