Anyone who works in a technical field should know some rudimentary coding. The challenge is that the actual coders keep changing the language. I mean FORTRAN 77 still gets the job done.
There are things FORTRAN can do, and things it can't. One thing it was never designed for is implementing operating systems or compilers. And it can't easily do direct memory access, so if you're building an embedded system that needs low-level access to the hardware, you still end up writing stubs in C or assembler to interface with your Fortran code.
In my professional career I've straddled the fence between hardware (electronics, analog/digital/RF) and software (operating systems, data acquisition, and real-time numerical algorithms) for about 30 years now. I've long since gotten past the idea that any one language is "the best" for everything, but with C you can go right from the bare metal up to abstract object-oriented concepts, so it ends up being my tool of choice most of the time.
One newish thing that has turned out to be pretty useful is Jupyter Notebook, an interactive Python scripting tool. It runs in a browser on pretty much any platform (Mac/Windows/Linux), and it lets me figure out Python hooey interactively before committing functions to a standalone script.
Less is more.