On efficiency (what’s a nanosecond, what’s a microsecond)
I heard Marc Andreessen predict that programmers will need to become very efficient at programming again (at about 17:30) in this very interesting interview with Kevin Kelly.
https://a16z.com/2019/12/12/why-we-should-be-optimistic-about-the-future/
If we do not get more efficient at programming, things might get stuck.
Another interesting perspective was already provided by Grace Hopper, the lady who invented COBOL, amongst other things.
See this video: How long is a nanosecond?
This all reminds me of a small test we did recently to check the resource consumption of programming languages, by writing just a very small Hello World program: one in COBOL, one in Java and one in Groovy.
The following summarizes how many CPU seconds these programs needed to run:
COBOL: 0.01 msec (basically it was unmeasurable).
Java: 1 second.
Groovy: 3 seconds.
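For readers who want to repeat a comparison like this, here is a minimal sketch of how CPU seconds of a child process can be measured. It is an illustration only: it spawns the Python interpreter as a stand-in for the COBOL, Java and Groovy binaries, since those toolchains may not be installed; substitute your own compiled Hello World commands.

```python
import os
import subprocess
import sys

def cpu_seconds(cmd):
    """Run a command and return (stdout, CPU seconds consumed by the child).

    os.times() accumulates the user and system CPU time of waited-for
    child processes, so the delta around subprocess.run() (which waits)
    gives the child's CPU usage on POSIX systems.
    """
    before = os.times()
    result = subprocess.run(cmd, capture_output=True, text=True)
    after = os.times()
    cpu = (after.children_user - before.children_user) + \
          (after.children_system - before.children_system)
    return result.stdout.strip(), cpu

# Stand-in for e.g. ["./hello-cobol"], ["java", "Hello"] or ["groovy", "hello.groovy"]
out, cpu = cpu_seconds([sys.executable, "-c", "print('Hello World')"])
print(f"{out} ({cpu:.2f} CPU s)")
```

Note that for short-lived programs like these, most of the measured time is startup overhead (JVM launch, Groovy's runtime compilation), which is exactly what the comparison above exposes.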
And that is only looking at programming language efficiency. Much more could be gained by looking at application architectures: microservices architectures, especially when applied radically, are incredibly inefficient compared to traditional tightly coupled applications in C, COBOL or even Java.
Of course I do not want to advocate stovepipe applications; history has proven their maintenance issues to be prohibitive. But a more balanced architecture with more of an eye for efficiency seems inevitable.