Computer compatibility and the web
Since the inception of computers, things have been breaking and changing. The two are not synonymous, although early work was so exploratory that I suspect things broke with more regularity, and the two seemed linked. At least I hope so, because otherwise it means horrible things for "progress".
I don't want us to still be using today's laptops in 20 years, although I suspect not much would be lost in doing so, and that much would mature, creating deeper understanding where that kind of longevity is possible.
This post is my exploration, and more than anything a plea, to increase compatibility and accessibility for the greater good.
The earliest computing was plagued by incompatibilities between systems from different vendors, and to some extent between systems from the same vendor. As when learning or exploring anything totally new, you don't know what you don't know.
The reasons for this were varied and rich. At its simplest, computers had less capacity because miniaturization was also in its infancy, and engineering had not yet made it cost-effective to throw billions of bits into commodity hardware.
The rise of virtual machines
There are two types of virtual machine: system and process. One is concerned with providing a compatible environment for hardware, and the other with providing a compatible environment for a single piece of software. Both have been in active development since the 1960s, but in my opinion they really came to the fore in the 1990s.
Process virtual machines allowed code to be compiled once and deployed across operating systems and instruction sets. There was a cost to this flexibility, but the payoff was standardized backwards compatibility at some level.
This wasn't free of effort. Each platform would still need development to port the runtime, or managed environment, to a new OS or machine architecture; but it enabled non-breaking changes.
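Python is itself an example of a process virtual machine, so a minimal sketch of the idea can be written in it directly: the source below compiles to the same bytecode on every platform, and all the porting effort lives in the interpreter rather than in the program.

```python
import platform

def greet(name):
    # This function compiles to identical CPython bytecode on every OS;
    # the per-platform work was done once, when the runtime was ported.
    return "hello, {}".format(name)

# The runtime reports which host it was ported to...
print(platform.system())
# ...but the program behaves identically everywhere.
print(greet("web"))
```

The same trade-off the post describes is visible here: the interpreter absorbs the incompatibility so the program doesn't have to.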
It could be argued that all modern x86 PCs ship with a sort of compatibility layer akin to a virtual machine. I won't cover that in this article, but it is a fascinating topic to google. The crux is that we found ways to ensure some degree of managed backwards compatibility, because it is incredibly expensive to re-tool every 3-6 months, or even every 18 months.
Why am I on about this?
Recently, the protocols that define the internet have been changing. This affects my job. As more and more people and organizations compete for mind-space in arguably the very language and concepts we compose to do our jobs, I do start to worry.
There are only so many things I can consider. This manifested last year in me not considering streaming responses for a REST API receiving payloads of < 100KB. You don't need to stream 100KB; there is no benefit whatsoever. Yet because a specification existed, two people mutilated some software to support the use-case.
It's not that I don't think others are also worrying, or that I believe we have entered a golden age where nothing should change; but as I age, my capacity for change, and my appetite for more of what I've already experienced, tend to diminish.
Not using streaming APIs outside of streaming use-cases keeps the software footprint small and the surface area manageable. In short, you'll get more done if you pick which specifications to support.
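To make that surface-area point concrete, here is a sketch (using an in-memory buffer to stand in for an HTTP response body, since the actual API and client are not specified): for a body that fits comfortably in memory, both paths return identical bytes, but the streaming one carries a loop, a chunk-size parameter, and partial-state handling that the use-case never needed.

```python
import io

PAYLOAD = b"x" * (100 * 1024)  # a ~100KB response body

def read_buffered(resp):
    # For small payloads: one read, one buffer, minimal surface area.
    return resp.read()

def read_streaming(resp, chunk_size=8192):
    # The streaming path adds chunk management and a reassembly step,
    # all for a body that fits comfortably in memory anyway.
    chunks = []
    while True:
        chunk = resp.read(chunk_size)
        if not chunk:
            break
        chunks.append(chunk)
    return b"".join(chunks)

# Both produce the same bytes; only one carries extra moving parts.
assert read_buffered(io.BytesIO(PAYLOAD)) == read_streaming(io.BytesIO(PAYLOAD))
```

Streaming earns its complexity when the body is unbounded or too large to hold; below that threshold it is pure surface area.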
This is the balancing act any seasoned professional developer must face.
- You need to embrace new things
- You need to maintain older things
- You need to evaluate when "what you do" has changed.
How this impacts the web
We have choices when we innovate on the internet. Difficult choices, but choices nonetheless.
When a security vulnerability is discovered, or an inability to serve 2,000 images efficiently over HTTP, we get to choose: do we break backwards compatibility to move to a new binary format created largely inside an internet advertiser that seeks to keep serving hundreds and thousands of files per request, or do we put a stick in the ground and say, "Actually, I'd rather not continue to subsidize or normalize your empire of greed."
Perhaps I can put that another way. If one stakeholder is creating an unreasonable amount of unnecessary work, then perhaps it's not your ability to do the work that needs to increase exponentially.
I talk to people all the time about their problems developing for and using computers, and usually the internet, or network-enabled software. It's just better to do things together.
- You can get things done faster.
- You can reduce risk.
- You can reason about smaller parts of larger issues.
I'd guess that at least half the issues, and certainly more than half of those recorded, are foot-gun scenarios. People have pointed, aimed, and fired, then come to me about a hurt foot. My answers generally start by trying to get them to recognize the foot-gun's existence. This isn't always easy in a world where persistence is often mistaken for expertise, and long-form, deep understanding requires persistence and dedication, as well as consistency. Now they need to do three times the thinking.
What can we do instead
Some of my biggest peeves on the internet are a lack of accessibility; the over-use of solutions that shift problems to third parties without deep thought; and an abundance of marketing masquerading as knowledge.
- I'm unlikely to fix the last one; it seems a permanent fixture of human history.
- The second seems to require people to be regulated, or tricked, into doing the right thing.
- Because fixes for the second and last are not generally available, the first becomes problematic.
- The first is also an ongoing and generally remedial process.
So, I'd like to appeal to my future self, and any other readers, to come up with ideas for how to get people to act in society's shared interests. Consider the overall cost, risks, and direction of that new feature you need to install Babel and transpile for; that killer feature you need ReactJS, or the latest PHP or Python, for.
I was quite surprised today when someone told me they were coding on a PHP 7.3 runtime, but restricting themselves to PHP 5.3 language features. If they have the latest runtime, I guess it's just a lot more work.
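A Python analogue of that discipline (my example, not theirs): run on a current interpreter, but deliberately avoid syntax newer than an older language level, so the same file stays deployable on hosts that haven't caught up.

```python
# Runs on current Python 3, but deliberately avoids newer syntax
# (f-strings, the walrus operator, match statements), in the same
# spirit as targeting PHP 5.3 features on a PHP 7.3 runtime.
def describe(count):
    # str.format() instead of an f-string keeps this valid on
    # interpreters far older than the one it happens to run on.
    return "served {} request(s)".format(count)

print(describe(3))
```

The cost is exactly what the post says: more work and more self-restraint today, in exchange for a wider range of environments that can run the result.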
If we can stop breaking backwards compatibility unnecessarily, we may be able to keep the web the web, and allow others who've not yet caught up, or even started, to get involved, participate, and succeed. We may be able to explore alternative avenues, which none of us has trodden, and which yield fantastic results. I'm ultimately not asking you to be Einstein; but consider not trying to be an Olympic athlete either.