Monday, 7 April 2025

Fare Thee Well, Skype!

It was with a fair bit of wistfulness that I read the news that Skype would be officially decommissioned by Microsoft on the 5th of May, just next month. Hard to believe that just a few short years ago, I was between jobs and speaking to interviewers on Skype during the COVID-19 pandemic.

Skype is dead soon.

Skype first appeared in 2003 as a telecommunications software application, when I was still at my Desktop Support job. Back then, my Indian boss amusingly referred to it as "Sky Pee". The setup was smooth and easy - installing it was a joy - and with this software we could talk to people all over the world instead of paying hideously expensive phone charges. The user experience was simply one of the smoothest around. I remember thinking that even just the chat feature was a lot more pleasant to use than Microsoft Messenger's.

Somewhere along the way, in 2011, Skype was acquired by Microsoft. Fourteen years later, Microsoft has decided that it is no longer tenable to continue maintaining Skype.

Why did Skype fail?

Most people have heard of the Microsoft BSOD - the Blue Screen of Death. Well, it appears what we have here is the Microsoft Kiss of Death. It may seem a little unfair to Microsoft. After all, all they really did was acquire Skype. And yes, perhaps they tinkered with Skype's inner workings just a little. And... you know what, you're right, this is all on Microsoft.

I think COVID-19 provided an even greater opportunity for Skype to shine. In those dark days, the need to avoid close physical proximity made video calls a necessity. Ironically, however, COVID-19 was also one of the factors that hastened Skype's downfall, because it provided that very same opportunity for Skype's competitors to shine.

Left behind in the race.

And shine they did.

Google Hangouts, Zoom, Slack, Microsoft Teams. WhatsApp. Instagram. Even TikTok. All of them provided video conferencing features (Instagram and TikTok to a limited degree) and it was clear that Skype was being quickly outpaced. And since Microsoft was understandably more concerned with supporting Teams, Skype had become an afterthought.

And like most afterthoughts, Skype eventually faded into obscurity. Software has to be relevant in order to survive. And in order to be relevant, software has to evolve. Under Microsoft, Skype was dead in the water. 

Goodbye, Skype!

We had great memories together. Countless memorable conversations were had in your pale blue-and-white interface. There should be a special place reserved for Skype in the annals of software history - arguably the first of its kind.

The Skype's the limit,
T___T

Friday, 4 April 2025

Deemphasis on quality is what drives Artificial Intelligence

As the progress of Artificial Intelligence marches on, I find myself needing to reiterate a point I may have made earlier, simply because it cannot be overstated.

While doing some reading on the internet, I came across this thread and quickly became immersed in the arguments and counter-arguments presented. Something stood out to me: the assertion that LLMs are incapable of genuinely writing software and will always churn out code that is inferior to what is produced by human developers who actually understand what they're doing.

Code written by machines.

It was a fascinating discussion, not least because of the developers who came out to say how useful they found A.I in their work. But in all this back-and-forth, there's one thing which none of them seem to have mentioned. Namely, the importance of quality.

LLM-generated software can work. However, here's the thing software developers know that laypeople don't - working software is the lowest possible bar to meet. This is like calling myself a "good person" just because I've never molested a child. Of course software has to work. But properly-crafted software does not only work; it is also maintainable, clean, extensible and testable. It is robust and won't break after a few changes.

That's quality software.
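
To make that concrete, here's a rough sketch in Python - the scenario and function names are entirely mine, purely for illustration. Both versions "work" on the happy path; only the second one rejects nonsensical input and comes with a test to keep future changes honest.

# Version 1: it "works" - right up until it doesn't.
def discount(price, pct):
    return price - price * pct / 100

# Version 2: the same idea, written to be maintained.
def apply_discount(price: float, percent: float) -> float:
    """Return the price after applying a percentage discount.

    Rejects nonsensical input instead of silently returning a
    negative or inflated price.
    """
    if price < 0:
        raise ValueError("price cannot be negative")
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price * (1 - percent / 100)

# A test pins the behaviour down so future changes can't quietly break it.
def test_apply_discount():
    assert apply_discount(200.0, 25) == 150.0

Nothing fancy, but the second version is the one you can hand to a teammate (or to your future self) without an apology.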

Is it true? Do LLMs really produce rubbish?

I think people who aren't tech workers, even some tech workers themselves, really underestimate just how defective all software is. Software these days does not exist in a vacuum. It's built on layers of existing software platforms, and connected to other pieces of software, each with its own set of flaws and vulnerabilities.

The end result is that a staggering amount of software in the world today is held together by duct tape and prayers. And that's only a slight exaggeration.

Duct tape and prayers, yo.

What has that got to do with LLMs, you ask? Well, what do you think LLMs are trained on? That's right - existing software. The crap code we've all committed at some point or other, if we're all being honest. And LLMs are gobbling all of it up, warts and all. If rubbish is the input, what are LLMs likely to produce as output?

That's right - more rubbish.

So what if it's true?

The world at large is made up of non-technical people. They are perfectly happy for software to just work, and give zero shits about maintainability, code cleanliness and all those high-minded ideals. They aren't the ones who have to deal with software on that level.

So what if the LLM-generated software is generic bug-ridden rubbish? It works half the time. Maybe even three-quarters of the time. It's good enough until the next LLM-generated software comes along. The important thing is that we meet those business needs now.

When you're hungry enough, this looks good.

Let's use food as an analogy. Why do we eat? If we disregard luxuries such as pleasure and nutrition, at the core of it, we eat because the human body is designed to starve (and die!) if it goes without food for too long. So, what's the difference between a double-tier cheeseburger with fries slapped together by a fast food chef, and a gourmet meal lovingly prepared by culinary genius Gordon Ramsay? Forget nuances in taste, empty calories, cholesterol, salt and all that jazz. Both do the job of staving off starvation. One is cheap and produced in minutes. The other needs significantly more time and money, plus you have to put up with that smug bastard Gordon Ramsay. Which do the majority of people choose most of the time? It's a no-brainer; the cheeseburger wins.

Let's examine another example. What about art? Some collectors or connoisseurs will pay a hefty sum for certain pieces or works from certain artists. But what if they weren't interested in art? What if they could only afford cheap imitations? What if their purpose was simply to cover the walls with something, anything? Would cheap A.I-generated art, a simulation of the genuine article, suffice? I think we all know the answer to that.

Finally...

The reality is that quality doesn't matter. At least, not as much as most of us think it should. Especially in software. Questions like "will it accommodate future changes?" and "have all security holes been closed?" and "what happens if we give it unexpected input?" have been replaced by "does it generally work?", "is it too costly?" and "is it ready right now?". Artificial Intelligence didn't cause this; years of business culture did. But that's also a big reason Artificial Intelligence isn't going away - because the world is addicted to quick and dirty solutions.


Your quality tech personality,
T___T