OK Computer. (I’m it with the IT crowd).

For most of my adult working life I have worked with computers. I am not unique in this; since the early 80s, computers have played an ever greater role in our lives, perhaps to the point where we could not function without them – certainly not at the level we do now. Whether, in a broader sense, that is a good thing for our species is not a debate I wish to enter into here. On a purely personal level, our increasing reliance on computers has been very beneficial.


From the moment I first entered ‘58008’ into Richard Burn’s electronic calculator and turned the device through 180 degrees, I was captivated by the possibilities of technology. I feel I must state here that I am not someone who gets particularly excited by technology in and of itself – I didn’t own a computer until I was well into my thirties – but I do like what you can do with it.


After my early flirtation with digital porn it was some time before I moved on to work on a real computer. As is typical in these stories of ye olde computer worlde, the machine I first got to programme took up the whole of the upper floor of the building in which it was housed and had all the processing power of a late-90s Nokia phone. I was studying business at college and one of the modules required us to get to grips with something called Data Processing (DP). Long before anyone had heard of the term ‘Information Technology’ (IT), the systems that controlled and recorded flight bookings or just-in-time stock levels were fed data overnight via cards with holes punched through them. This was referred to as ‘batch processing’, and it still goes on to an extent, but most of what we now term computing is a real-time process and the Human Machine Interface (HMI) is a touch screen as opposed to a punch card. At its most basic level, though, the actual computing is principally the same – just speeded up a bit (well, a lot).


It seems I have an aptitude for computing – well, that’s what British Gas (BG) inferred from my results in the test they set me when I applied to join their ranks of computer programmers. This came about because I was the data entry clerk in the job I held at the time. I suspect I was given that role because I was the youngest person in the office and because I was totally crap at every other function in which a clerk should have at least a basic competence. It was a bit boring, but I discovered that you could get the computer to produce some reports, and my boss was so impressed he sent me on a couple of courses, and the next thing I knew I was programming computers at Headquarters (HQ). Well, strictly speaking, they never let me do any actual programming on the mainframe beyond the stuff we did in training. They insisted instead that I work with something called a Personal Computer (PC). I was livid. All my friends were doing the ‘real’ work and I was messing around with spreadsheets (SS) and word processing (WP). Had I been even remotely interested in the actual technology I would have realised that I had been placed in what would become the dominant force in IT. But I wasn’t and I didn’t. I was, however, quite good at getting the PC to do stuff that people liked, and then I found that I was alright at getting PCs to ‘talk’ to one another, and the rest, as they say, is history.


I invented the Internet (I).


I didn’t invent the I; that was the US (United States) military, if memory serves me correctly. I could check – it will be out there on the World Wide Web (WWW); everything is. I did use one of the precursors: I was a CompuServe member and regularly used the service to download software updates for the hardware components of the PCs we supported. You could also search forums for solutions to common problems and generally communicate with the other ‘geeks’. Then came free File Transfer Protocol (FTP) sites that mirrored one another around the world, supplying the drivers and software updates that ultimately replaced the paid services CompuServe offered. The trick was always to go for the site with the shortest address, and so Imperial College London was a particularly popular site – ic.ac.uk (‘ickackuck’, as it was known) – but there were many others if that one wasn’t available. Then came the WWW, browsers, blogging, YouTube, Social Media (SM) and smartphones. In the future we will have a maturing of Virtual Reality (VR) and Artificial Intelligence (AI), and through the power of Big Data (BD) everyone who wants to know everything about you will do, and, worryingly perhaps, much of it you won’t know yourself.


What are we doing with all this ‘tech’ that is literally at our fingertips? Well, generally, it seems, we are doing whatever the gargantuan corporations behind the technologies want us to do. In the same way in which we only use a fraction of our brain’s capacity, we are using these powerful computers we all carry around with us to complete the same mundane and repetitive tasks day in and day out, barely scratching the surface of their true capabilities. Why on Earth (E) would we do that?


I have just finished reading a remarkable book by Jaron Lanier called ‘You Are Not A Gadget’. Lanier has a life-long passion for technology and has worked for many of the top US tech giants; however, it is to the use of the products these companies have developed that he takes exception.

“Something went wrong around the start of the 21st Century. The crowd was wise. Social Networks replaced individual creativity. There were more places to express ourselves than ever before… yet no one really had anything to say.”


This is a quote from the book that captures Lanier’s main complaint – there have never been more ways in which we can express our creativity, our uniqueness, and there has never been so little original content created. The technologies that are intended to open up the space for expression are actually closing it down, and this, he explains, is because we become bound by particular technologies that take hold in the specific areas in which they operate. One of the examples he uses is the Musical Instrument Digital Interface (MIDI).
MIDI was developed in the early 80s by a keyboard player called Dave who was looking for a way to link his synthesisers together and achieve some different sounds. Hence the technology was developed around the way in which notes are played on a keyboard – key up, key down, how hard the key is struck and for how long, etc. It doesn’t address the way in which a violin plays notes, but then it was never intended to. The problem is that it was adopted quite quickly and broadly across the tech world and became ubiquitous. People wrote software to drive it and interfaces for their computers so that they could produce sounds according to its rules. Eventually it was so embedded and enmeshed in both the products and the minds of their creators that it became almost impossible, without extensive redevelopment costs, to replace it with something better suited to the task. It became locked in and, as liberating as its introduction might have been, it was now equally restrictive.

In my own personal experience the same thing happened with a piece of software called Lotus 123 (123). 123 was an extremely powerful SS application capable of not just tabulating data but the programmable manipulation of cells and their content. There seemed to be nothing beyond its capabilities, especially once what-you-see-is-what-you-get (WYSIWYG) printing was added on. I worked for people who used 123 for literally every computing task they faced. If it wasn’t capable of something, the users merely adapted their practices and expectations so that it seemed as if it was; if 123 couldn’t do it, they ignored the task or massaged it until the result was close enough to what they wanted. The same thing still happens with MIDI. Because devices that support the MIDI standard are capable of producing sounds similar enough to what the composer is after, composers are willing to accept its deficiencies because it makes their lives easier.
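To give a concrete flavour of just how keyboard-shaped MIDI’s vocabulary is, here is a minimal sketch (mine, in Python – not from Lanier’s book) of what a MIDI note actually looks like on the wire. A note is just a pair of three-byte messages, note-on and note-off, carrying a key number and a strike velocity, each squeezed into seven bits:

```python
# A minimal sketch of MIDI's note model (standard library only).
# Status byte 0x90 = note-on, 0x80 = note-off; the low nibble is the channel.
# A note is reduced to: which key, how hard it was struck (0-127),
# and when it started and stopped.

def note_on(note: int, velocity: int, channel: int = 0) -> bytes:
    """Build a three-byte MIDI note-on message."""
    return bytes([0x90 | channel, note & 0x7F, velocity & 0x7F])

def note_off(note: int, channel: int = 0) -> bytes:
    """Build a three-byte MIDI note-off message (release velocity 0)."""
    return bytes([0x80 | channel, note & 0x7F, 0])

# Middle C (key 60), struck fairly hard:
print(note_on(60, 100).hex())  # -> '903c64'
print(note_off(60).hex())      # -> '803c00'
```

That, give or take pitch-bend and a handful of controller messages, is the whole vocabulary: a violinist’s slide between two pitches, or the swell of a bowed note, simply has no place in it – which is exactly the kind of lock-in Lanier is describing.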


Fast forward (FF) to today and we see the same thing happening all over again, only this time the technology is under the umbrella term of Social Media (SM). I am referring here to the ‘big 3’ – Facebook, Twitter and Instagram (FTI). I appreciate there are others, but they are largely niche to my mind – applications like Snapchat are the domain of a particular (in this instance generational) group. FTI, on the other hand, seem to be used cross-generationally and regardless of social standing. It seems to me that the SM platform people use the most is simply the one they prefer, and they will post content there regardless of whether it is the most appropriate place for it. Instagram, for example, is essentially a way of sharing images. Twitter is more word based, and its punchy style encourages short bursts of social, political or emotional commentary. Facebook merges many things but at heart is a way of keeping in touch with distant friends and loved ones.
A significant factor in this is the audience. All of these platforms encourage you to make ‘friends’ with people, to follow and to be followed. Hence the broader your circle of devotees on one particular platform, the more widely your thoughts and ideas are shared. There is even more discouragement from straying from your SM home app by virtue of linking your accounts together, so that a post on Instagram will automatically appear on Facebook (FB).


There is a downside to this, and it links back to the software lock-in we looked at earlier. If we get creative at all – and there is scope to do so on these platforms – then we will tend to do so with the audience in mind, and this will certainly affect the subject we choose and the tone with which we treat it. Is that important? I think it is, because those ‘likes’ are a bit addictive, and eventually we will strive so hard for them that we will avoid honest, original content in favour of something more palatable and safe. Eventually we will abandon creativity altogether and just repost, re-share and rehash what is already there.
How are we to avoid this? What can we do to ensure that we retain some creative element in our digital engagement with the world?


One answer, and perhaps the most obvious, is to stop using SM altogether. Drastic, I know, but for many SM is an addiction and cold turkey might well be the best way to rid oneself of the dependency. If that is too much, then another good thing to try is to periodically switch platforms and take time off one or more of them. If you’re an FB person, then try Instagram for a bit. Or you could do what I have recently done with Twitter: I have an account, but I have no followers and I follow no one. This permits me to see what is trending and browse through the commentary with no vested interest and no commitment to comment. I also like to swap between an iPhone and an Android device just so that I don’t fall into habits. The other thing I have done is to set up a website for myself on which I publish music, words, pictures and so on, but in a way that I want to, and if I want to change the way these things are presented then I am free to do so. I have no means of measuring who looks at my work and there is no space for comments or feedback of any kind. I find this frees me up to create the content I want to without the external pressures that posting on SM imposes.*


I appreciate that this approach might not be for everyone, but it is an honest attempt on my part to examine my relationship with the technology I use and, much more importantly, my relationship with myself and those around me.


You might like to try it yourself.

* Since I wrote this I have succumbed to sharing my website content via Facebook. I did run the website in the way I have described here, and it was liberating to an extent, but I really want to share my work and I have very few options for doing so outside of SM. Please don’t judge me too harshly x