
Monday, 8 September 2025

Five Tech Support Horror Stories

The early years of my career were spent in tech support. As with any other job, there were good days and there were bad days. After my third year on the job, the bad days started to outnumber the good. It all seems hilarious in hindsight, but there were days when the things on this list made me question my career choices.

Until one day it all came to a head and I decided I'd had enough, and started over in web development.

Sometimes I get together with friends who are still in tech support, and we trade horror stories about the users we've had to help. These are some of the stories that get 'em, every time.

1. Plugging in

This is actually a fairly common one, but let's start small. You get called to a user's desk because the desktop computer refuses to turn on no matter how many times they press the On/Off button. And they even checked that the main switch was on. Judging from the light, it was.

Not plugged in.

However, upon closer examination, it turned out that the cord wasn't plugged in. Yes, you read that right - the power was on, but the plug was only halfway into the socket and needed to penetrate another two inches before the computer could actually benefit from that power source.

Sound stupid? Welcome to my life at that time, buddy.

2. Opening Excel

Another alarmingly commonplace occurrence was getting called into the office of some hotshot executive who was encountering an issue opening a MS PowerPoint file in his (or sometimes, her) MS Excel application.

Now, if you're still scratching your head and wondering why that's a problem, reread the preceding sentence. MS PowerPoint file. MS Excel application.

Just a bad fit.

I dunno, that was the early 2000s, and attempting stuff like that smacked of trying to fit a square peg into a round hole. It was amusing the first couple of times, and then it got old real fast.

3. Infinite scroll

This one was so cartoonish it was almost amusing. I got a panicked call to a user's desk because her MS Excel spreadsheet was scrolling endlessly downwards on her screen and she couldn't understand why. It conjured up images of hacking, a malfunctioning monitor and whatnot.

The truth was even funnier.

Held down the ENTER key.

I got there, and the first thing I did was remove the heavy binder from her keyboard, which had been pressing down on the ENTER key and causing MS Excel to react as though some user was holding down that key.

4. Email Signature

This particular incident did not happen during my years of Desktop Support, but rather during my fledgling years as a web developer. However, it made me more determined than ever to never go back to Desktop Support.

A user had asked me to help set up her email signature because she had no clue how to use MS Outlook. I obliged, because I know Microsoft software functionality can sometimes be hidden in the darnedest places. After I got into the interface and input the standard company email signature template, I asked her to type her name into the box and click the SAVE button.

Yay! We're now
qualified to type
our own names!

Guess what she told me?
"You should do it. You have an IT Degree."


That level of entitlement was staggering. What was she implying, that she needed an IT Degree to type in her own goddamn name? What foolishness was this? This wasn't a competence issue. This was an attitude issue, and there's no place for that nonsense in any work environment. Hopefully this woman has since retired. At the very least, she's someone else's problem.

5. Emails

This is also a fairly common complaint among grunts, not just tech grunts - people feeling like they're entitled to your time outside of office hours.

I remember having a dinner appointment with someone, and Human Resources asking me to stay back because they needed me to, wait for it, retrieve emails between three of the staff from the email server backups. Staff they were planning to terminate, so they needed evidence of wrongdoing as leverage.

A whole bunch of
DVD backups.

Basically, nothing was on fire. They just needed me to help cover their asses. Hours later, as I was retrieving yet another batch (back then, stuff like that was stored on DVDs), HR asked me: "I'm sorry, did you have something on tonight?"

Seriously, lady, if the answer was "yes" would it have made a difference? If not, how about just shutting the fuck up? You know what's worse than people who don't care? People who don't care and try to act (badly) like they do.

Phew!

I wouldn't say any one incident turned me off Desktop Support. Even without incidents like these, it can be a repetitive grind that wears on the soul. But these were the war stories I shared with the guys. And their reactions suggested that such occurrences weren't at all unheard of. Some of their stories were even more unbelievable than mine.

No, you don't need an IT Degree to read this,
T___T

Saturday, 19 July 2025

Five Reasons to learn Web Development in 2025

Recent events, such as the rise of generative A.I, have made tech work a little less attractive than it used to be. Web development, in particular, has suffered. That's probably because a large chunk of web development is automatable; even before A.I came on the scene, there were numerous tools such as Content Management Systems and Low-Code development platforms.

Thus, web development being automated by A.I was par for the course.

Robots writing websites.

Still, not all is lost. While web development might have lost much of its luster, there are still good, strong reasons to pick it up in one's tech career. Unlike the tech reporters and HR executives who write listicles like these, I have actually been a web developer. I speak from experience, my dudes. What follows are some of the most compelling reasons I had, in no particular order of importance, for going down this path.

1. No complicated installation

Ever tried to learn a language like PHP or Java? Every single one of these languages requires you to set up some kind of compiler or interpreter environment. PHP typically requires a web server like Apache. Java needs the Java Runtime Environment. You can write all the code you want, but until the code gets compiled or interpreted by the environment you have to install and set up, you're not getting even a Hello World program done.

All you need is a browser.

HTML, CSS and JavaScript, however, do not. All of them already run in any major browser - Firefox, Chrome, and so on. In effect, the environment is right there for you.

This is not to say that you will never need to do any complicated installation. You will need to when you pick up a server-side language, and maybe databases, and definitely for the NodeJS style of development. But for the basic building blocks - again, HTML, CSS and JavaScript? Even the slightly more advanced stuff? Nope, not even a little bit. That is a lot more than you could ever say about other programming languages or platforms.
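
Don't believe me? Save something like this as a .html file (the filename is whatever you like) and open it in any browser. That's HTML, CSS and JavaScript all running, with zero installation:
<!DOCTYPE html>
<html>
<head>
    <style>
        /* CSS: style the greeting */
        p { color: green; font-family: sans-serif; }
    </style>
</head>
<body>
    <p id="greeting"></p>
    <script>
        // JavaScript: write into the page
        document.getElementById("greeting").innerHTML = "Hello World!";
    </script>
</body>
</html>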

2. Good skill spread

When you learn web development, you learn HTML, CSS and JavaScript as a base starting point. That's already a good spread right there.

HTML and CSS are where you learn front-end and possibly even design. When you learn JavaScript, in addition to all the things you pick up when learning any programming language - operators, arrays, branching and iterative logic - you also learn asynchronous operations and DOM manipulation.
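
As a small taste, this is what basic DOM manipulation looks like (the element IDs here are made up for illustration):
<button id="btnGreet">Greet</button>
<p id="message"></p>
<script>
    // When the button is clicked, change the paragraph's text
    document.getElementById("btnGreet").addEventListener("click", function () {
        document.getElementById("message").textContent = "Hello from JavaScript!";
    });
</script>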

A good spread of tools.

That's not to say that other tech disciplines don't have their own unique perks. But where skill spread is concerned, web development wins. I don't think anything else even comes close.

Once you get past the basic toolset of HTML, CSS and JavaScript, back-end programming and databases will come into play. It's never just web development. Even if you are averse to the thought of being a humble web developer for the rest of your career, there are far worse places to start.

3. Resources

Now, when I say "resources", I don't just mean documentation, references and learning materials, though there's plenty of that, yes. Web development is not special in that regard, because just about every other tech discipline also boasts plenty of learning resources and a community dedicated to helping each other learn.

A good learning
community.

Though, in this case, web development has something extra.

You see, every humble HTML page on the internet can have its source viewed and played with in the browser, reverse engineered, and so on. Every URL on the internet is potentially a resource for learning, much like how I learned to cobble together JavaScript widgets decades ago.

In contrast, you can't just take any desktop application and reverse-engineer the code, because the code has already been compiled and is no longer readily human-readable.

4. Ready use case

Often, when learning a programming language, it's helpful to be able to use newly-acquired skills to build something, so as to really hammer home the muscle memory. Something both relevant and useful, preferably. Not that Hello World programs don't have their place, but if one wishes to level up, better use cases are the order of the day.

And with web development, those use cases are almost too easy to find. Web development creates web pages, at the minimum. And after that, at varying levels of complexity, web applications. One does not have to stretch too far to find something worth building... and because it already exists, you know that it is both worth building and possible to build.

Applying what you learn.

My larger point is that what you learn can readily be applied - not just in creating and editing websites, but in general software development. This also means the skillset significantly improves your chances of landing a job. In this day and age, web developers are perhaps not nearly as in demand as they were a decade ago, or paid nearly as well, but the skillset goes beyond just web development.

For example, a lot of existing software already leverages things like REST API endpoints. These are basically URLs, which are pretty much the backbone of the web. REST is an almost inescapable part of the whole web development deal. Ergo, if you deal in web development, at some point you are going to be dealing with REST endpoints, which overlap with a large part of software development regardless of discipline.
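
For instance, here's a sketch of calling a REST endpoint from JavaScript - the URL is just a placeholder:
// fetch() is built into all modern browsers - no libraries needed
fetch("https://api.example.com/users/42")
    .then(function (response) { return response.json(); }) // parse the JSON body
    .then(function (user) { console.log(user.name); })
    .catch(function (err) { console.error("Request failed:", err); });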

Or consider mobile development. In case you weren't aware, a large chunk of mobile tech - hybrid apps in particular - is basically HTML, CSS and JavaScript.

I could go on... but do I really need to?

5. No gatekeeping

In the legal profession, there's the Bar Exam. In the medical profession, there's the Medical Regulatory Authority. In tech? Other than job interviews, which exist in almost every industry, there's almost no gatekeeping. Even the requirement for Degrees or Diplomas is not really a hard one.

When I say "no gatekeeping", I don't mean that nobody tries to gatekeep. The fact is that many people try to gatekeep, but it just doesn't work because to gatekeep, one needs a unified set of standards. It's almost impossible to establish said standards in a landscape as varied as tech, whose goalposts shift constantly.

The gatekeeper.

And while this inability to gatekeep exists in many areas of tech, nowhere is it more pronounced than in web development. HTML, CSS and JavaScript are fairly stable at this point, but these are just the base technologies. Their offshoots - frameworks, libraries and the like - keep springing up like mushrooms. And when you consider databases and backend programming languages, the possibilities multiply even more.

All in all, one could come into web development at any time and still be relatively fresh and relevant. No one can stop you from making and publishing web pages and applications, not in the same way they can stop you from practising law. You don't need a license to write code, so nobody can revoke it.

Some clarifications

The reasons stated here are relative to those for choosing other tech fields. Why, for instance, pick web development when you could go for Data Analytics or cybersecurity? These are reasons specific to web development.

I was inspired to compile this list because there are a lot of vague, generic and - to be brutally honest - trite lists out there on the web that extol the virtues of web development. Hopefully this is a better list.

<html>Bye for now,</html>
T___T

Saturday, 12 April 2025

Buttons or Divs? What to use, and when

With the power of CSS, HTML is significantly more visually versatile than it was at its inception more than three decades ago. Especially with divs. You can make divs appear as anything - paragraphs, block quotes and images. In extreme examples, you could even render entire paintings using many, many divs.

A huge variety of shapes,
especially rectangular.

The humble div tag, coupled with CSS, is no longer just a rectangle on the browser. Using properties such as transform, border-radius, width and height, among others, a web developer can achieve a myriad of looks.

And this manifests quite frequently in buttons. Previously, I discussed whether button or input tags would be preferable; today we make a separate comparison between divs and buttons.

Divs as buttons

Making divs look like buttons is simple enough. How about behavior? Well, JavaScript accomplishes that fairly easily.

One is a button and the other is a div.
<button>This is a button</button>
<div style="cursor:pointer; background-color:rgb(230, 230, 230); border:1px solid rgb(100, 100, 100); border-radius: 3px; font-family: sans-serif; font-size: 12px; width: 8em; padding: 0.2em; text-align: center">
This is a div
</div>


But they can both be made to perform certain actions on a click.
<button onclick="alert('I am a button');">This is a button</button>
<div onclick="alert('I am a div');" style="cursor:pointer; background-color:rgb(230, 230, 230); border:1px solid rgb(100, 100, 100); border-radius: 3px; font-family: sans-serif; font-size: 12px; width: 8em; padding: 0.2em; text-align: center">
This is a div
</div>


Depending on the browser, you should see no appreciable difference between these.


How about submitting a form? Well, a button usually does this.
<form id="frmTest">
    <button>Submit</button>
</form>


But if you want a div to do this, all you really need is a bit more code.
<form id="frmTest">
    <div onclick="document.getElementById('frmTest').submit()">Submit</div>
</form>
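
One caveat, though: calling submit() directly skips HTML5 validation and any onsubmit handlers on the form. If you need those to fire, modern browsers offer requestSubmit(), which behaves like an actual button click:
<form id="frmTest">
    <!-- Unlike submit(), requestSubmit() triggers validation and the submit event -->
    <div onclick="document.getElementById('frmTest').requestSubmit()">Submit</div>
</form>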


Definitely possible, but should we?

Visually, there's not a lot of difference. In fact, styling divs to look like buttons could even iron out the differences in button rendering between browsers. For example, take the code written earlier.

This is how it looks on Chrome.


This is how it looks on Safari. See? There's no visual change in the div we styled, but the button looks remarkably different.


However, not everything is about the visual. Especially not to the visually impaired. The button tag and the div tag read differently, semantically. On a screen reader, the button tag immediately stands out as a control to be clicked, while a div is semantically no different from any other div.

That is the greatest and most significant difference. For those of us who aren't blind, it is understandably difficult to imagine perceiving anything other than in visual terms, since a large part of what people like us perceive is in the visual medium.
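
If a div absolutely must act as a button, ARIA attributes and a keyboard handler can claw back some of the lost semantics. Here's a minimal sketch - though even with all this, the real button tag remains the better choice:
<div id="divButton" role="button" tabindex="0">This is a div</div>
<script>
    var divButton = document.getElementById("divButton");
    // role="button" tells screen readers this is a control;
    // tabindex="0" makes it reachable via the keyboard
    divButton.addEventListener("click", function () {
        alert("I am a div");
    });
    // Real buttons also respond to ENTER and SPACE - replicate that here
    divButton.addEventListener("keydown", function (e) {
        if (e.key === "Enter" || e.key === " ") {
            e.preventDefault();
            divButton.click();
        }
    });
</script>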

Conclusion

The internet was not made only for people like myself. The internet was meant as an equalizer where information access was concerned. Not only did it mean that the average person now had access to information that was not readily available in the past; it also meant that people with visual disabilities were supposed to be able to access that information.

And that access could be compromised if code was written with visual intent in mind rather than semantic intent.


T___T

Thursday, 6 February 2025

Tech Democratization in the Era of Artificial Intelligence

There's a term that I have been seeing more of late: Tech Democratization. More precisely, I first heard it while watching this speech by Nvidia CEO Jensen Huang.



"And so I think that democratizing, activating every region, activating every country to join the A.I advance, is probably one of the more important things, rather than convincing everybody it's too complicated, it's too dangerous, it's too mystical and only two or three people in the world should be allowed to do that."


To be honest, the first time I heard the term, I was perturbed and more than a little offended. Like what, tech needs to be democratized? As opposed to it being a dictatorship right now? However, after researching the term a bit, it's not as acrimonious as one might think.

So, what is Tech Democratization exactly?

Tech Democratization describes a state of tech where creation of software (web portals, applications and such) can be carried out without relying on the expertise of technically qualified people. In short, laypeople would no longer need tech professionals for these things.

At least, that's the promise of Tech Democratization. It brings the power to create to the masses, instead of keeping it in the hands of the elite few. More importantly, it is the power to create without needing a technical foundation acquirable only through years of training. As a concept, Tech Democratization is not new. One could even argue that it started all the way back in the 60s, when high-level languages were created so that people like me wouldn't need to deal with Assembly, thus opening up possibilities for a less hardcore breed of nerd.

Building without the
need for expertise.

How about frameworks? Frameworks were created back in the day (and are still in use now) to reduce repetition in coding, and thus reduce the amount of code developers needed to write for a typical application. Now that scaffolding code no longer needed to be written from scratch, application programming was surely opened up to a lot more people.

Web development came up with content management systems and even entire website builders to allow the man in the street to create websites without needing to know any HTML, CSS or JavaScript. Low-Code platforms bring this to a new level, allowing laypeople to create simple applications without having much of a technical foundation at all.

From this point of view, A.I is merely the latest in a long line of innovations that outsource the repetitive parts of writing code to the system, so that the human can focus on the creative bits.

The Scariness of Tech Democratization

A.I is a tool made by actual software engineers. Even if non-technical people can now use this tool to create software that would previously be out of their reach, this still does not make them engineers. It sounds obvious when I say it that way, but it is human nature to get carried away by one's own perceived ability.

The danger is that non-technical people using A.I as a tool to generate code will start thinking that they can replace actual developers. It does not matter whether or not this is true. What matters is that these laypersons, having not actually encountered software development in all its glorious and terrible complexity, will create a few working apps and start wondering what the big deal is, precisely because they are unburdened by that knowledge.

How much should
non-techies control?

Imagine a world in which software that runs air traffic routes or maintains banking transactions is created with the help of A.I by people who don't actually understand software development, but are given the ability to act as if they do. Are you afraid yet? No? Perhaps you should be.

Me? I'm at the tail end of my career. Professionally, there's nothing I could lose to A.I that I don't already plan on giving up in a few years. I don't fear losing my job to A.I. It's a tiny consequence compared to the terrible implications I just outlined. And that's all I'm doing - outlining those implications. In no way do I think Tech Democratization is a bad thing.

Besides, it would be hypocritical of me to start decrying Tech Democratization now. I, too, am a beneficiary of Tech Democratization. Were it not for the significant effort put in by engineers better than myself, I would not have the tools - databases, operating systems and programming languages - I use so readily today. I recognize and acknowledge my privilege, and would never presume to place myself in the same bracket as the tech professionals who made those tools.

There's a stark difference between developers benefitting from such tools and laypersons doing the same. Developers generally have the technical foundation, in some cases to the point where those tools merely help them do things they already know how to do, but faster. Laypersons, on the other hand, would be hapless infants without those tools, at least where accomplishing those specific technical tasks is concerned. That, I think, is an important distinction.

In conclusion

Look, I'm not some sort of tech elitist or wannabe gatekeeper. I'm just not. But it's one thing to allow the general public the power to write software. It's quite another to consider them on the same level as people who have been doing this for years (or, in the case of Jensen Huang, claim that "everyone is now a programmer"), just because there are tools to help them accomplish the same goals. That is laughable.

Faster than Phelps.

It is absurdity on the level of me claiming to be as qualified an athlete as Michael Phelps just because I can outpace him on a jetski. Or claiming to be as qualified as my family doctor just because I can read up on medical symptoms on WebMD.

Tech Democratization has value. Like all things with value, however, one has to know exactly how far to take it. Speeding up the work of developers or automating away routine tasks is a great idea. Giving every schmuck the tools to write simple apps? Also a pretty good idea. Allowing them to think that software development is actually that easy? There are not enough words in the dictionary to describe what a horrible idea that would be.

Democracy now!
T___T

Saturday, 7 December 2024

Shout-out to Lance Storm of Storm Wrestling!

2001. Picture a year when the Internet was a vastly different place. Broadband had just taken off, rapidly eliminating those charming dialup modems. Websites, however, hadn't caught up that quickly; the vast majority (or at least, those I visited) were still designed as though they had to abide by network bandwidth restrictions. Clunky graphics, clumsy animation, that sort of thing.

And it was during this time that I watched pro wrestling. In the 90s and early 2000s, pro wrestling was filled with all manner of colorful characters, and one such character was the Canadian pro wrestler Lance Evers, ring name Lance Storm. He had (and still has) his own website, Storm Wrestling.

There's a storm
coming.

It was a cozy spot on the Internet where Lance Storm discussed wrestling with fans and had his own little community. I was all for it, and even joined his book club!

But mostly, Mr Evers served as an inspiration for a young I.T Degree graduate and burgeoning web developer.

How was Storm Wrestling an inspiration?

Simple. No matter how crappy one might think the website was - mostly by today's standards; it was definitely cool back then - the fact remained that Lance Storm was not in the tech business. He was a pro wrestler.

And nevertheless, he registered a domain name and made a website. Again, a shitty website by today's standards, but still! Come on!

A relic of bygone times.

What kind of excuse did I, an actual web developer, have for not having my own website? None. Nada. Just pure laziness. Lack of motivation.

Registering a domain name and paying for hosting aren't extremely expensive. Honestly, if you wouldn't invest that much in your own career, why should any prospective employer?

No, I didn't get my own website right away. A lot still had to happen before I got off my ass to do shit. But it did kick-start the thinking process. And for that, I'll always be grateful.

Epilogue

It has been more than twenty years. And now I do have my own website and domain name. This has opened countless doors. Storm Wrestling inspired Teochew Thunder. Pretty poetic, wouldn't you say?


Stealing your thunder,
T___T

Friday, 1 November 2024

Why people should (and shouldn't) hire older software developers

Seven years ago, I wrote this post on my 40th birthday. Today, with my 47th birthday just a few days away, here are some more thoughts.

On my website, I described myself as "an aging software developer". Some people have told me that this could reflect negatively on me. They're mistaken, but I forgive them - they aren't from the tech sector and don't know better. You see, when I say the word "aging", this is not me being humble. This is me flexing the fuck out.

Just showin' off.

But please, hear me out. I swear, I'm not going to pull out lazy clichés like "older programmers are more experienced, more mature, have more gravitas, etc" not just because they're lazy clichés, but also because they're not true. And if I have to explain why they're not true, perhaps this is not a conversation you're ready for.

And because I enjoy being contrary even against myself, I will follow that up by explaining why older devs aren't necessarily the best choice. In the same spirit, I will avoid the stereotypes of being inflexible, slow and outdated. Again, those are lazy clichés and we should rise above them.

Why older devs are nothing to sniff at

You see, software technology is an industry which demands constant reinvention. As a result, past a certain number of years in the business, older devs tend to go "fuck this constant re-learning. I'm gonna go drive a cab or something".

In short, it's an industry where you find few old folks. And you know what they say about being wary of old men in a trade where men die young. Well, programmers aren't dying per se, but they're certainly quitting once they hit a certain age, because, if one was just doing it for the money to begin with, at that point it's just not rewarding anymore.

And because software technology is an industry which demands constant reinvention, it almost goes without saying that anyone who's had to stick around for that long, has gone through quite a bit of that. Me personally, I went from desktop support to web portals, to commercial websites, to web and mobile applications, to having to shoulder the duties of an entire infocomm department all by my lonesome. Sure, one could say people who have survived that long in the industry are merely lucky, but very few people are that lucky.

Adaptability is key.

So, if you needed someone who could adapt to the constant change that defines this industry, who would you choose?

An enthusiastic youth with lots of potential to learn, grow and evolve, who theoretically should be able to adapt? Or an older programmer who has actually evolved over and over through the years and survived to tell the tale?

The conventional wisdom, of course, is to go for the proven product rather than the one with potential - in theory. Yes, some of us older folks can be rigid and stuck in our ways, but the nature of this industry weeds such people out fairly quickly. You're left with the people who are adaptable enough to survive this industry (because we have!), and in this day and age, that's no small thing.

Again, no argument would be complete without presenting the other side. And there are plenty of compelling reasons why the modern employer might not want an older developer.

Why you should avoid older devs

Older software developers, unless they totally mismanaged their wealth, tend to have money. The industry pays well, and even a mediocre dev like myself might be earning more than Middle Management at an SME. As such, you're not going to get one for cheap. We more than likely don't need the peanuts you're reserving for code monkeys.

Peanuts, anyone?

Manipulation. No matter how noble an employer thinks they are, an organization is an organization, and there's always something of that sort going on to one degree or another. Against manipulation, many older devs have developed, if not outright immunity, then at least a discomforting degree of resistance.

Past a certain age, most older devs already have whatever we ever wanted out of life. We're there. We're comfortable. We're not hungry and desperate the way the younger ones probably are. We're not going to kill ourselves for "exposure", or submit to opportunistic lowballing (c'mon, we all know it happens) just to pad a resume. And, cold as it may sound, that is an absolute negative for a certain kind of employer, because it makes us less open to manipulation.

Older devs do come with experience, and part of that experience is security. We're generally zen about the fact that we'll probably die and be forgotten. We've come to terms with the realization that if we were really destined to do anything truly exceptional, statistically we would have done it decades ago. Most of us are too tired, or have done too much, to feel that we need to prove a damn thing. And if you're the sort of employer who likes to make your staff jump through flaming hoops to prove themselves, again, that kind of security and self-assuredness makes us unappealing as employees.

All in all, older developers aren't cheap, and we're not desperate or hungry enough to run through walls at your command. And I can't in good conscience paint that as a positive for employers.

In a nutshell

For the right kind of employer, older software developers are an exceptional resource.

Also, consider this - with the news that Artificial Intelligence is going to make programmers obsolete, whether it's true or not, this is going to impact the number of young developers available on the market. Because if younger programmers think that this career path is no longer viable, they're just going to leave, and who could blame them?

Us older programmers? We're dug in, and we have little to lose. Chew on that!

Old but gold,
T___T

Thursday, 22 August 2024

Singapore's Smoking Samsui Woman Controversy Through a Web Developer's Eyes

The story I'm about to talk about today is pretty old, at least in the news cycle context. It happened more than a month back, and the incident is fast fading from public consciousness. It's taken me at least that long to come to terms with it, and to understand why certain elements of the case offended me on such a profound level.

To anyone who doesn't want to beat this particular dead horse, I more than understand; go play with another ball of wool and watch some funny cat videos while I unpack this.

What happened

The owner of a building in Chinatown had hired an artist, one Sean Dunston, to paint a mural on the exterior wall. Dunston decided to paint a tribute to some of the pioneers of Singapore, the Samsui women. To that end, he produced a work covering the wall: a young, slender woman in the blue work clothes and red hat of the quintessential Samsui uniform, smoking a cigarette.

From Google Maps

The Urban Redevelopment Authority (URA) issued an order to the owner of the building to remove the cigarette from the mural, as the depiction of smoking on a public-facing wall contravened the Singapore Government's stand on advertisements for tobacco products and smoking.

The URA also cited feedback from an unnamed member of the public. In Sean Dunston's original Instagram post, this was an excerpt from their correspondence.
We wish to bring to your attention that URA has received feedback on the mural from a member of public. The feedback received is as follows - "We find this mural offensive and is disrespectful to our samsui women. The woman depicted in this mural looks more like a prostitute than a hardworking samsui woman."

Cringe, man. Massive cringe. More on that later.

The URA then followed up by saying that the mural would have to be changed, or the restaurant which operated in this building would have its temporary license revoked.

Reactions

Reactions were a heady mixture of scorn and disbelief that anyone could mistake a simple picture of a female laborer sitting down and enjoying a cigarette for the come-hither soliciting pose of a prostitute. There were those who opined that the cigarette, while possibly too modern (in those times, it was more likely to be of the hand-rolled variety), was absolutely a feasible historical representation of the quintessential Samsui woman.

No artistic freedom?

There were those who expressed concerns about artistic freedom in Singaporean society.

And yes, those who actually thought that the Samsui woman absolutely did look like a prostitute. This weirdness thankfully seemed to be at a minimum, though I have suspicions that these were basically just civil servants in subtle damage control mode trying to make the URA's instruction seem reasonable.

The Association of Women For Action And Research (AWARE), perhaps feeling the need to appear relevant, also weighed in with some garbage about perpetuating "a male gaze", and then backtracked later. 

What really offended me, and why

Now, the actual opinion of the Samsui woman looking like a prostitute, though I'm thoroughly dumbfounded by that, wasn't by itself offensive. I'm not offended by the fact that people have opinions, and that often these opinions don't match mine. That's a fact of life, and, where subjective topics such as art and morality are concerned, par for the course.

What did offend me was that the civil servant who drafted the URA's response, saw fit to include the feedback of that one member of the public, and acted as though this inclusion was ample justification for their decision. Who was this one anonymous person whose opinion held so much weight? Was it their own opinion which they then passed off as some fictional other person's? Was it their mother-in-law? Some random Karen passer-by? Our newly minted Prime Minister Lawrence Wong? Our President, Tharman Shanmugaratnam? Which individual, out of Singapore's population of over 5 million, has enough clout to speak for all of us on this matter? We'll never know, because the URA also decided not to disclose their identity.

Our mysterious art critic.

The fact that the URA gave such a nonsensical reason as justification, is insulting; even more insulting than if they had not deigned to give a reason at all. It smacked of the good old heavy-handed, "I have the authority to do this, just take it and be grateful I even bothered to give you a reason, peasants."

This irks me due to similar experiences I've had in web development. Clients somehow seem to think that the fact that they pay for a website entitles them to say things that don't make any kind of logical sense. Buddy, you paid for a website. You didn't pay for the ability to redefine reality.

"This color scheme makes me feel moody. This needs to be more vibrant." It's been decades, but I don't think the technology to create a website based on subjective feelings, exists in any part of the world.

"I don't like this. Let's redo it." Pretty sure that in order to get anywhere, we'll need something a lot more concrete than "I don't like it". But for some mystifying reason, people confuse web developers with mind readers.

"I asked my friend what he thought and he didn't like the layout." Again, who is this "friend"? The imaginary childhood kind? Or at least some authority in the field of web design? Also, doesn't it occur to them that the website isn't actually meant for them (or their friends), but their target audience?

The URA response was a great example of the kind of out-of-touch correspondence that reeks of arrogance. The sort of arrogance that genuinely expects not to be ridiculed for providing such a laughable response. The sort of arrogance that comes from having precious little experience in justifying oneself, or in offering anything more convincing than "because I said so". The sort of arrogance so cartoonish that it borders on comedy.

The similarities of this case to my previous experiences as a web developer - vague subjective objections, feedback from an anonymous, unrelated person cited as justification, and all - were pretty triggering by themselves. At the same time, I fervently hope that this isn't the quality we can expect of our civil servants.

Many Asians have older relatives that expect to be taken seriously (even when they talk nonsense) simply because they're older, dammit. The URA, similarly, seems to be afflicted with the kind of characters that expect to be taken seriously (even when they talk nonsense) simply because they're the URA, dammit.

So yes, I'm dismayed at the URA's response. I'm also annoyed that it took Dunston raising a stink on Instagram and the subsequent public outcry, for the URA to get their collective heads out of their asses. 

Epilogue

The mural of the Samsui Woman was allowed to remain unchanged; however, the owner of the building was fined a modest 2000 SGD for contravening the regulations. This seems like a good compromise.

And hopefully, the URA has learned a PR lesson from this; or better still, properly educated the civil servant in question. Singapore can do better; indeed, if we want to continue punching above our weight, we need to.

Your moural authority,
T___T

Monday, 29 April 2024

What's the best programming language for beginners?

What should be a programmer's first language?

Often, this is not a deliberate choice, more a matter of circumstance. Perhaps a programmer started on QBasic as a kid. Perhaps, like me, the programmer picked it up in school and was made to learn C++ or Java. I would even venture to say very few programmers start out saying, "I'm going to begin my programming journey with the best, most modern language out there!" To begin with, how would they know what constitutes the "best" programming language, or even what a good programming language is? You don't know what you don't know, after all.

Let's learn!

Also, how suitable a programming language is really depends on what your purpose is. Do you want to crunch data? Write games? Build web portals?

So, obviously, before we can sort out what the "best" programming language is, perhaps it is more useful to figure out what a programmer's first programming language should be. After all, this choice opens different doors.

For that, we should probably look at a few factors. I can only draw upon languages I've coded in before as examples, obviously.

Syntax

For this, we look at the simplicity of writing code in that language: how intuitive the syntax is, and how few extra keystrokes are needed to get something done.

For this, my obvious picks would be Python or Ruby. Fewer semi-colons and curly brackets. Anyone picking either Python or Ruby as a first language, I suspect, would be less likely to give up out of frustration.

Code syntax.

C# and Java are probably among the worst for this. Too many namespaces. Overly verbose syntax. I haven't had many projects that needed to be done in either of these languages, and for that I'm truly grateful.

But the one that takes the proverbial cake, in terms of being cumbersome, would be PHP. The function naming conventions are wildly inconsistent, and in addition to semi-colons and curly brackets, it also requires a dollar sign preceding every variable name.

As far as syntax goes, the group of programming languages with visually similar styles - PHP, Java, JavaScript, C and its variants - is such that if you pick one up, the learning curve for the others, at least for the simple stuff, isn't so steep. Thus, even if PHP is hideously cumbersome in parts, it benefits from being visually similar to the other languages mentioned.

Transferable skills

This measures the underlying principles or skills one picks up while programming in a certain language that will remain useful when transitioning to another stack. It could be something as simple as similar syntax (as mentioned earlier) or strong data typing.

Or it could be something so ubiquitous that no matter what stack you're in, you're almost certainly going to encounter it. A great example of this would be SQL. No matter what kind of programming you go into, the chances of you needing to access a database at some point, are pretty good. I wouldn't go so far as to say that this makes SQL the number one choice for a programmer's first language, but it makes a great case for being a programmer's second, at least.

Data processing.

For a similar reason, if one wanted to go into web development, it's almost impossible not to have to deal with HTML, CSS and JavaScript, but learning JavaScript as a first language just for that, is questionable. As a second language, definitely.

Statically-typed, class-based languages such as Java and C# would be good choices as a first programming language for learning concepts such as Object-Oriented Programming (PHP and JavaScript implement OOP as well, but in a way that's a little odd). Python would be the choice if one wanted a strong foundation in data structures. However, Python's lack of static typing might work against it as a first language.

Ease of setup

One of the most daunting tasks in learning a language is setting up the environment for it. Java requires the JRE (Java Runtime Environment). PHP typically requires a web server like Apache. Python and Ruby, well, you get the idea.

For this, the most fuss-free option has to be JavaScript, hands down.

All major browsers
run JavaScript.

JavaScript runs on all major browsers. You don't need to install anything you probably don't already have. If you're reading this, unless you're reading it on your phone, you have a desktop browser which you can run JavaScript in. It's almost zero setup. Just write your code, and run it in the browser.

For a Windows environment, VBScript is also almost zero effort. All you need to do is write your script, save it with extension "*.vbs" and you can run it by double-clicking it! Unfortunately, I'm not sure how useful knowing VBScript is.

In essence, in the cases of both JavaScript and VBScript, the environment has already been set up for the programmer.

Popular support

No developer plies their trade without consulting a reference of some sort. Textbooks are a valid source, but for the most updated material, we turn to the internet. Portals and forums provide most of it, with each language having large communities of programmers ready to provide support.

Learn by reading, or
by community.

Which programming language has the largest, or best, communities?

That's hard to say. All communities have their fair share of toxic losers who waste little time being condescending and acting like gatekeepers. At the same time, each community also boasts genuine people. I'm not going to waste time detailing how large each programming language's community is relative to others. Suffice to say, they're all (OK, mostly) large enough to be useful.

Generally, the longer they have been around, the larger they tend to be.

Conclusion

Whatever the choice turns out to be, ultimately, there will be benefits. No matter what language someone chooses for their first programming experience, the fact remains that they are still engaging in the activity of programming. It's a start. Some starts are better than others, sure, but it's a step. And hopefully the first of many.

Mind your (first) language!
T___T

Tuesday, 16 April 2024

How worried should software developers be about Devin AI? (Part 2/2)

On the subject of Artificial Intelligence taking over software development, there's both good news and bad news. Actually, most of it is bad. But let's start with the good.

Your edge against Artificial Intelligence

Computer processors have grown faster over time. That's a bit of an understatement; processing speed and power have increased at an astonishing rate over the past few decades. Faxes used to take twenty minutes to travel the world; now email performs the same function in seconds, or less.

But however fast computers are now, what they are capable of is, at best, accessing and processing information millions of times faster than humans. Their ability to create new things is an illusion - they create what appear to be new things using existing content as input. The creativity factor is probably zero. And mathematically, zero multiplied by millions is still zero. Thus, no matter how fast computers get, they aren't any closer to true creativity than they were decades ago.

Machines are always
significantly faster.

Human beings, on the other hand... while it's difficult to objectively measure creativity, I think it's safe to say that the creativity factor of the most brilliant minds on the planet is probably above zero. Therefore, no matter how slow human beings are compared to computers, we still have that edge.

I've also mentioned before that machines aren't capable of loving their work. They aren't capable of being motivated by things like pride and passion. All that requires flesh and blood. So, for whatever it's worth, that is one thing that no A.I can ever replicate. For the simple reason that whatever A.I is capable of, is what humans have been able to define, just performed at significantly higher speeds. No human has ever been able to successfully formulate love, passion and pride. Subsequently, no A.I is capable of those things.

Artificial Intelligence's edge against you

One may think that A.I's lack of pride works against it. But this also means A.I doesn't have an ego. A.I is not programmed to give up out of frustration, or to refuse to learn because its non-existent pride forbids it. A.I is relentless, and keeps going. And because A.I is programmed to learn, at some point it will write better code than any human.

If, as a software developer, you have predicated your entire career on your ability to write clean, beautiful, well-documented and nicely structured code, you have spectacularly missed the point. A software developer's job is not coding. Your job is to solve problems and provide business value. Sometimes, that involves writing code. And if there are people who can write code as well as or better than you, Devin AI can trawl the internet and learn from their code; subsequently, Devin AI can code better than you, faster than you, and with a lot less effort.

Whose code is better?
Who cares?

Is it true that A.I can code better than the average software developer? That's the wrong question to ask. The correct question is, how badly do employers want it to be true?

We can argue until the cows come home, about the qualities human software developers bring - passion, pride, possibly better code. But none of it matters. When your bosses ask you about the progress of a project, do they ask how clean or beautiful the code is? No, they ask how soon it will be ready for production. The sad fact of the matter is, code quality is an engineering concern, and business people primarily care about profits. So even if human beings were truly able to write better code, business owners would probably still be more forgiving of whatever flaws A.I produced, as long as it didn't affect the bottom line.

One could argue that bad code does affect the bottom line. But again, how much would it matter to the customer base?

You could say you would support human-created art over computer-generated art. But when it comes right down to it, would you be able to tell the difference? Similarly, would the average consumer be able to tell if it was a human who wrote the code, or A.I? Would the average consumer even care as long as shit worked to an acceptable degree?

Twenty years ago, I was a web developer. I made database-driven websites for a living. Then came website builders that automated everything I was doing, and put the power of website creation squarely in the hands of non-technical people. Thankfully, I had already moved on to bigger things before this happened. Would these website builders truly be able to outdo the creativity of the human mind? Maybe not. Would it matter if they didn't? Would the average web surfer be able to tell the difference, or even care? How creative or cutting-edge do we truly need websites to be?

Think about all the writers whose work A.I is generating new content based on. Or filmmakers who may be going out of a job once A.I can replicate their work and create seemingly-new work imitating their style. Unless users have consumed enough media, books and films to distinguish A.I generated content from the "real" thing, there is going to be a market. And the machines can churn out more of this stuff quicker than humans ever can. Imagine the profits. By that point, would those profiting care about authenticity? Would consumers care enough to make a difference?

Therefore, it's no longer even about who can do the better job. It's about who can do an acceptable job, for much cheaper.

In summary

To answer the question in the title, how afraid should software developers be?

I don't want to underestimate the power of A.I. At the same time, though, let's not get carried away. Either way, I'm at the sunset of my career and I have just about no skin in the game. If A.I takes over, great. If it doesn't, also great. Either way, I doubt I'll be losing much sleep over it.

This is your world now, kids. Enjoy.

Keep calm and code on,
T___T

Monday, 18 March 2024

Profanity-laced Content Over The Years

Long-time readers of this blog will have noticed that whenever certain impolite words are used in a blogpost, I take care to tag the post with the term "Profanity Alert". I've been called many things (and some of them are even true) but let it never be said that I am not in the habit of giving fair warning.

Through the years, I have entertained the suspicion that I use entirely too many vulgarities to communicate my thoughts, and I have striven to do better, especially where this blog is concerned. It's not a moral issue for me, more a professional one. And perhaps a desire to challenge myself by expressing ideas in a less vitriolic manner.

Wash your mouth!

Thus, imagine my chagrin when I ran some data analytics on tag usage on this blog, and found that "Profanity Alert" was the frontrunner in terms of appearances.

What. The. Fuck?! (Yes I know, this doesn't help my case at all)

The findings

In 2015, things weren't so bad. I was mostly in techie mode. In fact, the tag "Profanity Alert" was not even in the Top 5. I was all business.


Things changed in 2016. "Profanity Alert" was not only in the Top 5, it was numero uno. At this point, I should have suspected a dangerous trend there.


It didn't get better the following year. "Profanity Alert" was still the most frequently used tag. Tags like "Web Development" came a close second. And honestly, it wasn't even that close.


There was a little improvement in 2018. The tag was still in top position, but it was tied with "Web Development". Which I guess meant that I was at least spending as much time talking about things that really mattered, without swearing.


2019 continued in the same vein. "Profanity Alert" and "Life as an Economic Digit" were tied in first place, with "Web Development" running close behind.


2020 was almost a copy of 2019, as far as the top few were concerned. "War Stories" took the place of "Life as an Economic Digit". Now "Web Development" was nowhere to be seen in the Top 10, and "Singapore" was the runner-up in terms of frequency. It marked a change to me talking more about local affairs.


It was in 2021 where I made a concerted effort to stop swearing excessively. To my pleasant surprise, "Profanity Alert" no longer appeared in the Top 10 after having been a mainstay for years. Go, me!


It did not last, alas; for in 2022, "Profanity Alert" slithered back to the Numero Uno spot. Imagine my consternation! Bad habits do, indeed, die hard. It was time to stop being complacent.


2023 looked OK, as I made another effort to limit my swearing. It met with some success, for now my posts centered around Social Media. Thank you Elon Musk and Mark Zuckerberg, for giving me so much material to focus on!


2024 has gotten off to a promising start thus far. I have not used enough profanities in any posts to push the "Profanity Alert" tag to prominence. A little discipline seems to have done the trick.




Moving forward

I hope to continue making progress. While I have nothing against swearing in principle, I think overusing it ultimately detracts from what this is supposed to be - a tech blog. There are certainly better ways to express myself than constantly dropping F-bombs.

Friends Ultimately Choose Kindness!
T___T

Sunday, 10 March 2024

Ten Professional Hazards Of Writing Code For A Living

Human beings are creatures of habit. Any profession which entails long, sustained periods of doing things or thinking a certain way will have an influence on its practitioners. Years of being a programmer have changed me. In all fairness, some of these traits already existed way before I wrote my first Hello World program, though they were certainly magnified after that.

Here are some of the ways my job as a software developer has shaped me as a person. Some of these may make me look like an utter moron. I assure you that I am of at least average intelligence. It's just the job.

1. Counting from zero

Probably the most stereotypical quirk ever: "programmers tend to count from zero". Yeah, right. The stereotype exists because in almost all programming languages, array indexing does begin from zero.

Do I count from zero? No, that's silly. I don't, for instance, look at a cluster of five apples and tell you there are four. Any programmers who do, I suspect, are just trying too hard for geek cred.

There are five apples, not four.

However, it does happen if I'm looking at rows of data. If I see an MS Excel worksheet with data all the way to number 100, I am going to think there are 101 rows of data. It could be because spreadsheets are usually the way I visualize arrays in my head (especially two-dimensional arrays), so my brain subconsciously treats spreadsheets as arrays.

Thus, while this stereotype is way overblown, there's some truth to it.
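
In JavaScript terms, that instinct looks something like this:
// An array "numbered" up to 100 is actually 101 elements, indexed 0 through 100
var rows = new Array(101);
console.log(rows.length);    // 101 - the habit in a nutshell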

2. Being easily startled

When deep in the throes of writing code, programmers tend to concentrate really hard. I am no exception. There have been plenty of times when my brain is on a one-track highway through some programming logic when someone interrupts me by speaking in my ear or tapping me on the shoulder.

Violently startled.

And sometimes, my reaction can be... absurdly violent. Spilled coffee, sudden jumps, things like that.

This doesn't happen all that much these days. For one, I work from home. Very few people get to interrupt me. And even if they do, the fact is, I simply don't concentrate as hard as I used to.

3. Being very literal

One of the hallmarks of writing code is being drawn into the true or false dichotomy. Thus, there's this tendency to interpret everything as a "yes" or "no" question. To be fair, sometimes the question is literally phrased this way. But that's only as a manner of speaking; often, the asker means to ask a bigger question.

No capacity for nuance.

And because of this tendency, I tend to miss it.

I remember being at an interview where the hiring manager asked me if I knew how the PayPal IPN works. To me, the answer was obvious. Yes, I do. It was only after an awkward pause, and the interviewer rephrased his question, that I understood that he actually expected me to demonstrate how it works.

There were other interviews where I was asked similar questions where I simply said "Yes" or "No". But honestly, I also feel that if you're going to ask a programmer questions like that, you should also be prepared for replies like that.

4. Single-mindedness

This next one does not apply to all techies or even all programmers. But there's a fair number of programmers who tend to take a very insular view when writing code. This can be useful when you're writing a single-use method that's as small as possible. Not so much when you're trying to design an entire system.

Laser focus... or just
tunnel vision.

And in real life, that can have negative consequences too. I've gotten into trouble with the wife for doing the correct thing... at the wrong time, or under the wrong conditions. Because the correct thing, in isolation, is the correct thing. But few things in life exist in isolation.

You don't want to wash the windows when it's raining. Or flush the toilet before taking a piss. All this is putting the cart before the horse, which can happen when you consider only the task at hand and fail to take the larger context into account.

5. Overusing techspeak

Tech work comes with a whole lot of terminology, some of which we're sometimes guilty of assuming non-techies understand. After all, techspeak has become increasingly mainstream over the last decade as technology has become more entrenched in everyday life.

This does not automatically mean that everyone knows what SSL is (despite a staggering majority of websites employing it). Or that a gateway doesn't always mean the entrance to a building or enclosure. Or that when we refer to something as a "string literal", we actually just (kind of) mean "text".

No, not that kind of string.

And yes, I'm just as guilty as other tech geeks.

We're not trying to sound smart when we break out the tech jargon, cross my heart. Mostly because we are smart, and know it. No, it's largely because our work practically demands that we speak in what could almost be a different language.

6. Expecting output

As mentioned before, I tend to interpret things very literally. However, this also applies in reverse.

As a consequence of writing functions which expect very specific data types in a very specific order as input, and produce output of an equally specific data type, this tendency manifests when I ask questions of other people: I expect a certain output, and only that output. When I ask for A, and people give me B, C and D in addition to A, I tend to get annoyed.

A lot of irrelevant information.

It's human nature; people want to be helpful. But what usually ends up happening is that they give you a lot of information that has nothing to do with what you wanted in the first place.

7. Cynical

Expecting the worst, or not believing it when things go well, is something ingrained in developers who do a lot of testing, which, let's face it, is pretty much all devs at this point.

Does your code ever run perfectly the first time? And when it did, were you happy or wary?

Glass is always half-empty.

Unless it's something really simple that would have been expected to go right, sometimes when things go too smoothly, I get worried.

My time running code has taught me that when no warnings or errors are reported, it's rarely because I did everything right, especially if it's the first run. No, there are probably bugs and gotchas, and they're well-hidden!

8. The How 

One more quirk of being a web developer is that one of the first questions we tend to ask is the How question. It's a professional hazard that may be shared by engineers in general.

Even when I'm asking something like "why did I get an error 500 on this API endpoint call?", it usually gets phrased as "how the hell did I get an error 500 on this API endpoint call?"

Examining how things work.

That's relevant, I promise. Because even in my early days as a web developer, I was curious about how things worked. I would find stuff on the web, wonder how it was done, and then I would make my own, just to prove I could. This also applies to other things in my life, non-tech related. I'd look at something, figure out the how, and make my own. I'm not sure if I'm this way because I was a web dev, or I became a web dev because I'm this way.

9. Lack of sympathy

I've been accused of having a less than exemplary bedside manner when people complain about their shitty careers or lack of opportunities in the job market. I've also been less than impressed with people who claim to envy the fact that I work in tech.

Heart of stone.

Really? Do people want to potentially spend days sifting through code, making changes that may end up creating problems bigger than the one they're trying to solve? Would they enjoy having to constantly remind their less tech-savvy colleagues that software development is not witchcraft and still has to conform to logic? Do they want to go through the slog I did, in order to get where I am now?

I know I don't. I simply happen to enjoy writing code enough to not overly mind the rest of the bullshit I have to put up with. People want the perceived better pay and job security of tech, but the job itself would drive them nuts.

And I have no sympathy precisely because I know I'm not better than the average schmuck. In the sense that, the moment I'm of no use, I'll be put out to pasture like anyone else. Look at the mass layoffs happening in Big Tech if you're not convinced. Job security? What a joke, right?

10. Flowcharting

Agile practices include making use of a Kanban Board system. The use of said system is good for training an organized mind, and feeds into the software development practice of breaking up a huge problem into smaller, more manageable tasks.

The good old
divide-and-conquer.

And this has translated over to my daily life, to things like chores and personal projects. Even long-term goals like language learning. Everything is obsessively broken up into smaller sub-components, then managed.

The wife thinks I'm certifiably insane. She doesn't understand why mopping the kitchen floor and mopping the living room floor are two separate tasks, and have to be scheduled. Honey, I'm just a software dev. It's who I am.

What a hazardous list!

Are you a programmer by trade, and if so, how many of these apply to you?

How many apples do you see?
T___T