Abstraction

  • Post category:Principles
  • Reading time:2 mins read

The abstraction principle in software systems allows us to create more generically usable software components.

Abstraction removes details from the problem space and thus makes it easier to think about problems and solutions. This principle lets us focus on problems by reducing complexity.

Abstraction makes software components more reusable: code for specific problems is abstracted to code that applies to a broader class of problems.

The process of abstraction, or generalization, removes specific attributes or functions and replaces these with general counterparts. The abstract components can be made specific through parameters or inheritance.
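A minimal sketch in Python of both mechanisms (all names are hypothetical, for illustration only):

# Abstraction through parameters: the sort criterion is no longer
# hard-coded, so one function serves a broader class of problems.
def sort_records(records, key):
    return sorted(records, key=key)

# Abstraction through inheritance: the general algorithm lives in the
# base class; a subclass fills in the specific detail.
class Exporter:
    def export(self, records):
        return "\n".join(self.format(r) for r in records)

    def format(self, record):
        raise NotImplementedError      # the abstract step

class CsvExporter(Exporter):
    def format(self, record):          # the specific counterpart
        return ",".join(str(v) for v in record)

For example, sort_records(people, key=lambda p: p.age) applies the general function to one specific problem.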

Abstraction is also a modeling technique that allows logical grouping of software structures so they can be better designed, realized, and maintained. Through this grouping, a complex software solution is methodically chopped up into smaller pieces that can be handled better in all phases of a software system lifecycle.

Examples of abstraction in software systems:

  • The OSI networking model abstracts communication protocols into functional layers.
  • Messaging middleware solutions abstract the connectivity logic from the application.
  • Picasso's bull, as a non-software analogy: the series of lithographs in which Picasso reduces a bull, step by step, to a few essential lines.

Exchange an MVS dataset with Windows

  • Post category:Utilities
  • Reading time:2 mins read

I have discussed in another article how you can convert data from one code page to another.

I have also described a method to copy UNIX files to MVS datasets.

This article summarizes a method to copy an MVS dataset to Windows, keeping the record structure and converting from EBCDIC to the UTF-8 code page.

This job copies the MVS dataset to a UNIX file, telling cp to mark the record boundaries with the CR character (Windows typically uses CRLF as its line separator, while UNIX uses LF). The dataset 'YOUR.TEST.PS' below should be a PS (physical sequential) dataset or a PDS(E) member.

//STEP1    EXEC PGM=BPXBATCH                                         
//STDOUT   DD SYSOUT=*                                               
//STDERR   DD SYSOUT=*                                                
//STDIN    DD DUMMY    
//* Values in STDENV below are kept but have no meaning for this function                                              
//STDENV   DD *                                                      
JAVA_HOME=/usr/lpp/java/J8.0_64                                      
PATH=/usr/lpp/mqm/web/bin:/bin:/usr/sbin                      
LIBPATH=/usr/lpp/mqm/java/lib                                 
//STDPARM  DD   *                                                    
SH cp -v -F cr "//'YOUR.TEST.PS'" /your/unixdir/tst.txt 

You can now convert the EBCDIC data (here code page IBM-037) to UTF-8 (CCSID 1208) as follows:

iconv -f 37 -t 1208 /your/unixdir/tst.txt  > /your/unixdir/utf8-tst.txt

Now you can transfer the file to Windows. Transfer it in binary mode, otherwise another unwanted code page conversion will happen.
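For illustration, a minimal sketch of such a binary transfer, assuming the z/OS FTP server is reachable and the transfer is run from the Windows side with Python's ftplib (host name, credentials, and paths are placeholders):

from ftplib import FTP

# Connect to the z/OS host and pull the converted file.
with FTP("your.zos.host") as ftp:
    ftp.login("userid", "password")
    # retrbinary switches the session to binary (TYPE I), so the FTP
    # server performs no additional code page conversion in transit.
    with open(r"C:\temp\utf8-tst.txt", "wb") as out:
        ftp.retrbinary("RETR /your/unixdir/utf8-tst.txt", out.write)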

The myth of zero data loss ransomware recovery

  • Post category:Uncategorized
  • Reading time:3 mins read

My proverbial neighbor asked me some time ago whether he could have a zero data loss ransomware recovery solution for his IT shop. He is not a very technical guy, yet he is responsible for the IT in his department, and he is wise enough to seek advice on such matters. The man next door could very well be your boss, prompted by a salesperson from your software vendor.

What is a zero data loss ransomware recovery solution?

A ransomware recovery solution is a tool that provides you the ability to recover your IT systems from an incident in which a ransomware criminal has encrypted a crucial part of your IT systems. A zero data loss solution promises to provide such a recovery without the loss of any data. The promise of zero data loss must be approached with the necessary skepticism. A zero data loss solution requires you to be able to decrypt the encrypted data with the keys that the ransomware criminal offers to give you for a nice sum of money. To get these keys you have two options:

  1. Pay the criminal and hope he will send you the keys.
  2. Create the keys yourself. This would require some highly advanced algorithm, possibly a tool based on quantum computing technology. This is a fantasy, of course. The first person to know about a practical application of such technology would be the ransomware criminal himself, and he would have applied it in his encryption tooling.

So getting the keys is not an option, unless you are in a position to save up a lot of money, or to find an insurer that will carry your ransomware risk, although I expect that will come at an excruciating premium.

The next best option is to recover your data from a point in time just before the ransomware attack. This requires a significant investment in advanced backup technology and complex recovery procedures, while giving you little guarantee about the state to which your systems can be recovered. And, to set the expectations: it will come with the loss of all data that the ransomware criminal managed to encrypt. We cannot make it more beautiful.

Programming languages and what’s next

  • Post category:Programming
  • Reading time:7 mins read

My review of the programming languages I learned during my years in IT.

BASIC

On the Texas Instruments TI-99/4A.

Could do everything with it. Especially in combination with PEEK and POKE. Nice for building small games.

Impossible to maintain.

GOTO is unavoidable.

Assembler

In various variants.

Z80, 6802, PDP-11, System/390.

Fast, furious, unreadable, unmaintainable.

Algol 68

Loved this language. REF!

I have only seen it run on the DEC-10. Mainly used in academic environments (in the Netherlands?).

Pascal

Well. Structured. Pretty popular in the early 90s. 

Again: was this ever widely adopted?

COBOL

Old. Never programmed extensively in it, just for year 2000.

Totally readable.

Funny (ridiculous) numbering scheme.

Supposedly GOTO is necessary in some cases, which I do not believe.

Smalltalk

Beautiful language.

Should have become the de facto OO programming language but failed for unclear reasons.

Probably because it was way ahead of its time with its OO basis.

Java

Totally nitty gritty programming language.

Productivity is based on frameworks, and no one knows which ones to use.

Never understood why this language was so widely adopted, besides its openness and platform independence.

Should never have become the de facto OO programming language but did so because Sun made it open (good move).

Far too many frameworks needed. J(2)EE adds more complexity than it resolves.

Always upgrade issues. (Proud programmer: We run the application in Java! Fed up IT manager: Which Java?)

Rexx

Can do everything quickly.

But nothing in a structured way.

Ugly code. Readable but ugly.

Some very very strong concepts.

Php

A hodgepodge of programming concepts and HTML.

Likely high programmer productivity if you maintain a strict discipline of programming standards. A stark danger of creating an unmaintainable crap mix of HTML and PHP.

Python

Nice structured language.

Difficult to set up and reuse.

Can be productive if nitty gritty setup issues can be overcome.

Ruby (on Rails or off-track)

Nice, probably the most elegant OO language. Still too nitty gritty for my taste. I like it, though.

I would start with this language if I had to start today.

What is next

Visual programming? Clicking building blocks together?

In programming we should maybe separate the construction of applications from the coding of functions (or objects, or whatever you call the lower-level blocks of code).

Programming complex algorithms (efficiently) will probably always remain a craft for specialists.

Constructing applications from the pieces should be brought to a higher level.

The industry (well, the software-selling industry) is looking at microservices, but that gives operational issues and becomes too distributed.

We need a way to build a house from software bricks and doors and windows and roof elements.

Probably we need more standards for that. 

Another bold statement.

AI systems “programming” themselves is nonsense (I have not seen a shred of evidence). 

AI systems are stochastic systems.

Programming is imperative.

In summary, up to today you cannot build software without getting into the nitty gritty very quickly.

It's like building a house, but having to find your own trees and rocks first to cut wood and bricks from.

And then construct your own nails and screws.

A better approach to that would help.

What do you think is the programming language of the future? What need should it address?

The Internet of Everything – from toilet seats to human bodies

  • Post category:General
  • Reading time:3 mins read

I walked into the restroom. A mechanic stood at the sink fixing something. I saw him holding a toilet seat. He was fooling around with the wiring of the apparatus. Then he replaced some electronic components and rewired the seat.

Toilet sensors

It never occurred to me that even toilets could be usefully equipped with electronic features. I asked the mechanic. He explained that the toilets in the building are all connected to the Internet. If there is something wrong with the antiseptic fluid produced by the toilet, it starts calling out for help. He told me that the towel dispenser was also connected to the Internet, so that when it runs out, a maintenance operator is called in. Makes sense.

Never has technology done so much to improve The Loo.

To cell sensors

So all things will be supplied with sensors. And it looks like these sensorized things are getting smaller and smaller, reaching the nano scale.

Sensors are getting so small that they can flow through our blood and mend our bodies. And maybe fix cancer cells in the future. Or detect issues with blood vessels. Or measure the chemistry in our bodies. They can be injected into plants to protect them from diseases. Or be used in constructions to measure stability at smaller scales than we had ever assumed possible. Possibilities beyond imagination.

Neb sensors surveilling the body 

Imagine what it would mean if we could instrument every cell we like. I would like a surveillance team of bots swimming through my body, like the Nebuchadnezzar in The Matrix flows through the sewers and tunnels of the abandoned cities.

To signal when my internals run out of supplies.

The Lindy effect and technology

  • Post category:Modernization
  • Reading time:1 mins read

Stuff that has been around for x years can be expected to be around for another x years.

That is what the Lindy effect tells us.

Read about Lindy in Nassim Taleb’s Skin in the Game.

This informs how we should approach legacy technology: 

  • Maintain it motherly, or
  • Decommission it aggressively

The Lindy effect also informs us how to approach the adoption of new technology: with care.

Simple, complex, quality

  • Post category:General
  • Reading time:1 mins read

How to set an incentive to create/buy simple solutions.

The problem is that complex solutions are perceived as better than simple solutions.

“It can’t be that simple”.

And complex solutions have more features. 

And new technologies make complex solutions even more attractive (reverse grandmother and Lindy effect), and intellectually more interesting. 

We can wrap a complex solution and new technology in Newspeak.

A solution based on existing technology can’t beat that.

But simpler solutions can beat complex ones on quality: fit for purpose. Simpler means cheaper, easier to design and develop, and easier to use and maintain.

Managing the open source software complexity with platforms?

  • Post category:Uncategorized
  • Reading time:2 mins read

The last couple of days I was working on a new setup for software development. I was surprised (actually somewhat irritated) by the efforts needed to get things working.

The components I needed did not seem to work together: Eclipse, the PHP plugin, the Git plugin, the HTML editor.

The same happened earlier when setting up for a Python project and some APIs (one based on Python 2, the other on Python 3).

I am still trying to think through what the core problem is. So far I can see that the components and the platform are designed to integrate, but the tools all depend on small open source components under the hood, and those turn out to be incompatible between the tools.

Maybe there should be a less granular approach to these things, and we should move to (application) platforms. Instead of picking components from GitHub while building our software, we would get an assembled platform of components. Somebody, or rather, some body, would assemble and publish the open source platforms periodically, say every 6 months.

Status quo discomfort

  • Post category:General
  • Reading time:1 mins read

A thought:

The status quo should feel more uncomfortable than the uncertainty of the future.

Best practices, theories, grandmother

  • Post category:General
  • Reading time:2 mins read

Best practices stem from the practical, not from the theoretical. 

A theory explains reality. The current theory explains reality best. A theory is valid as long as there is no theory explaining reality better.

Best practices are ways of doing things. The practice is based on years of experience in the real world. Grandmother told us how she did it. It is not theory. It is not proven formally, by mathematics. It is proven by action and results.

Best practices are perennial. They change very infrequently. Theories change frequently.

In IT, best practices are independent of technologies. Examples are: separation of concerns, layering, encapsulation, decoupling.
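To make a couple of these concrete, a minimal sketch in Python (hypothetical names, illustration only) of encapsulation and decoupling:

# Encapsulation: callers depend on the store's interface, not on how
# or where the data is kept.
class OrderStore:
    def __init__(self):
        self._orders = {}              # internal detail, hidden from callers

    def add(self, order_id, order):
        self._orders[order_id] = order

    def get(self, order_id):
        return self._orders[order_id]

# Decoupling: the billing logic only needs something with a get() method,
# so the storage technology can change without touching this function.
def bill(store, order_id):
    return store.get(order_id)["amount"] * 1.21   # assuming 21% VAT

Swap OrderStore for a database-backed version with the same interface and bill() keeps working. No theory is needed to see why that is useful.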

Best practices exist for a reason: they work.

A theory may explain why they work. But it is not necessary.

Best practices have been around for years. They were not invented half a year ago. What was invented half a year ago may be a theory, more often than not a theory about the applicability of technologies.

I think we need to question “new best practices”.

Instead we must rely on grandmother’s wisdom. 

*All of this very likely inspired by (or rather, stolen from) Nassim Taleb’s Antifragile writings and the Lindy effect.