Many, many moons ago, I dabbled in iOS development. At the time, everybody seemed to be jumping on board, but I was held back not only by a fear that it would prove a passing fad (how wrong I was), but also by finding the whole process nightmarishly difficult. I'd approached it all wrong: not understanding the concepts of object-oriented programming, I thought I could just jump in and somehow it would all fall into place, and, like a house of cards, eventually it all came crashing down.

Since then, I've watched with intrigue as numerous friends have entered this shiny land of the App Store, and today, it was the turn of one such friend: Andrew J Clark.

Most of you will know him from The Menu Bar, but I first 'met' Andrew when he was putting together his precursor to TMB: I Like This... Podcast. I now recall a very in-depth discussion about the technicalities of podcasting, mic technique, and goodness knows what else. But it wasn't any of that which stayed with me; in fact, I'd forgotten that conversation had happened at all until right this moment. What did stick with me, however, was his innate confidence and an instinctive tenacity that was immediately compelling. Fast-forward to today, and that particular cocktail of talent and creativity comes to us again, this time in the shape of Numerical.

Should calculators look delicious?

"You mean I read all that for a review on another calculator app?"

Well yes, actually, you did. You see, I imagine it's pretty easy to pass off a calculator app as somewhat inferior to 'real iOS development', whatever that means. The staple of iOS programming textbooks the world over, it is likely the first app that many have pieced together, bashing out example code without really understanding what it does, what all those square braces are for, or quite why you need at least fifty lines of code to add 2 and 2 together (sorry kids, that's just Objective-C for you). It's at this point that many, like me, reevaluate the way they're going to make their fortune, and burn the textbook in the backyard for fear that it might crawl into bed one night and strangle them in their sleep. Others, however, don't heed the warnings and release the dirge they created anyway; thus the App Store is bloated with a billion fakealators you wouldn't trust to spell 'boobies', let alone use to actually add numbers together.

So when Andrew asked me to beta test his first iOS app, it was with a modicum of trepidation that I discovered he actually wanted to release his calculator... Was I right to be wary?

First impressions

Launching Numerical for the first time reveals the app's rather beautiful icon centre-stage, a nod to what is to come, followed by a fade to the main window of the app itself. As you'd expect from a calculator app, it makes use of every pixel on the screen, with the top quarter given over to output and the bottom three quarters, there or thereabouts, reserved for input, and interestingly, proportionally identical to iOS's Calculator. Helvetica Neue Ultra Light is the typeface of choice, again, in sync with Apple.

I know which looks more akin to iOS7 to me.

But there the similarities end.

What's immediately noticeable is not the functions of the calculator but the stunning colour scheme Andrew has chosen. Awash with a gorgeous gradient (I'd hazard a very bad guess at #fc5c41 through #de3fa0), it helps not only to attract a certain intrigue to the app (should calculator apps look delicious?), but also to visually separate input from output; it's immediately clear what is where. Gradients came and went in the early 2000s, when back-bedroom Photoshoppers plastered everything imaginable in the damned things, yet Apple, with its penchant for revitalising times past and making them cool again, has made a statement with eye-popping gradients in iOS7, and so here, I can see the appeal immediately. In fact, in comparison to Numerical, you suddenly realise how out of place Apple's own offering actually looks: a relic of a time past itself when compared with the zing and pop throughout the rest of the OS.

Intuitively appreciative of the user

Well, it adds things up, which is a good start, I guess.

On first launch, the output display is filled with a mini-tutorial that introduces the gestures you can employ to navigate the app, as well as some neat touches, such as swiping down to start a new equation with the currently displayed answer and swiping up to save the entire equation to a history page. It's touches like these that demonstrate a thoughtful approach to how calculators currently work, and their foibles. Every day I type out an equation, write down the answer, clear the memory, and then type out a new equation with that answer. With Numerical, all of that nonsense is no more.

And it's around that point it hits you that Andrew has quite frankly lost his mind.

You see, I've never been good at maths (we spell it with an 's' here in the UK; god knows why, ask the Queen). That said, I like to think that I have at least a basic understanding of how an equation works. At least the last time I checked, what happens on one side of an equation has to be equally matched with what happens on the other. So yes, contrary to popular belief, 2 + 2 will equal 4 and 3 - 3 will equal, err...


Anyway, the point is this: in Numerical, there is no way to type in an equals sign and round out the equation. It simply doesn't exist (believe me, I checked). And rather than calling out the insanity of its creator, as I so unashamedly just did, that's actually the genius of this app: Andrew has eradicated redundancy, removed complication, and, put simply, thought about the user. Sure, an equation has to be able to equal, but why should the user have to worry about that? It is the part of an equation that must happen in every instance, so why can't the app take care of that repetitive burden?

Numerical does just that. Add two numbers together, and by the power of witchcraft, the answer appears up top. Plus, take the equals sign out of the equation (see what I did there?) and you suddenly have room for keys that would normally be left out, or otherwise relegated to a different view.

But aspects of this app's intrinsic usability don't stop there. Depending on the part of the equation you're at, certain keys will grey out, visually cueing you for what you need to input next. Just tapped an operator? The operators grey out. Just tapped an opening parenthesis? Well duh, the closing one isn't needed yet. Just hit the decimal point? Come on, what kind of idiot types two points in succession?

One thing infuriates me about traditional calculators: unless you spend more on a scientific version that is both cumbersome and, outside a nuclear test site, frankly unneeded, you have no idea what you've already pressed. Make one mistake, and it's game over: you must start from scratch. With Numerical, in one iPhone-sized display, you have both answers and equations displayed prominently, one above the other, seemingly without any sacrifice in usability or legibility. For me, a guy with fat fingers and a terrible memory, who should probably wear glasses, this feature is a winner.

Packaged with the main view of the app is a secondary window (swipe from the left edge) that lists all your previously completed equations. Traditionally of course, your Casio would have room for one answer in memory, but it had no way of storing how that number was reached: again, Numerical comes to the rescue. It's the small things.

With every up...

So there's a but coming, right? Well, yes and no: none of this is major.

First impressions count. Everybody remembers the intro videos from their Macs of old. Ken Segall once talked eloquently about how much time, money, and effort Apple spent on getting those very few seconds absolutely perfect, and, when you compare the duration of those videos to the total usage time you'll get out of your Mac, you can see right there exactly what Apple thinks is important, and why their products have the appeal that they do. I'm not saying Numerical should have a million-dollar video to talk you through the features of a calculator app, absolutely not, but it does seem like the tutorial is perhaps unfinished business. It's minor, but compared with the elegance of the rest of the app, it does seem a little out of place. Oh, and a thing only a language student such as myself would say: why are 'Left', 'Right', 'Up', 'Down' and 'History Page' capitalised mid-sentence? I know, I'm splitting hairs.

Secondly, to sounds. I'll state up front: I've never been a fan. I turn off key clicks on every device I own; if an app has an option to silence sounds, I'll take it; and if a website employs any aural output, big or small, I'll click away. I hate the shutter sound when you press the power button on your iPhone, the swoosh when you send email from Mail, and that interminable system sound when you power on your Mac. To date, Tweetbot is the only exception to that rule. I have no idea why its sounds seem to resonate with me when others don't, but it seems like it will stay that way. It's not that the sounds in Numerical are offensive, they're just not for me, and, with no ability to turn them off from within the app, the only recourse is to silence my phone, which is not a solution that fits every scenario. I can understand that aural feedback may be desired, or even required, but with a display that gives you all the insight into key presses that you need, I do wonder if it's a step too far. (Of course, I'm acutely aware that I entirely railroad accessibility by making this point, but even then, there are better ways to point users in the right direction.)

And finally, to Helvetica. Within this review, I've already stated that Andrew has turned the design cues of iOS7 to his advantage with this app, and here I am using the exact same argument against him over the use of iOS7's system font. Well, yes and no. With the gradient, Andrew took it from one section of iOS7 (system apps) and applied it in a different situation (within the app itself). That I like. It calls out the aesthetic and uses it in new and unanticipated ways. With Helvetica, however, and especially with Helvetica Neue Ultra Light, it just seems a bit, well, 'samey'. Don't get me wrong, there are some great touches here: Andrew took the time to replace Ultra Light's standard comma and period with much more visually distinguished variants (a trick that Apple also employs in Calculator itself). But as much as I am an ardent fan of Helvetica, it is just... Helvetica.

A calculator without equal

So, adding it all up (even I winced at that one), it's clear that Numerical is head and shoulders above what the App Store already has to offer. Every element of design, in how it looks, and how it works, has been carefully considered, and there is a plethora of intuitive touches that make using the app so simple and so joyful.

Andrew has clearly thought out the schema of standard calculators and ameliorated current functionality with a superset of design elements that come straight from the heart of somebody who understands modern UI/UX principles, so much so that Numerical features proudly on my first screen of apps and will be replacing the system calculator for all tasks from this point forward.

Sure, there could be (tiny, incy-wincy) improvements, and in a 2.0 maybe everything from more scientific functionality through to variations on colour schemes could become options, but hell, what do I know? This is, after all, Andrew's first app; all you've got to do is look at Numerical and you see into the mind of somebody who 'gets it'.

And of course, there's at least a smidgen of envy in that he persevered when I did not. Goddamn clever-arsed, Australian, podcasting, iOS developing, s...

Numerical is available now in the App Store for £1.99.


Recently, Zac and I had a to-ing and fro-ing about the most insipid sensation to hit technology for some time: wearables.

And so it continued. Following that, Zac wrote an article.

As always, Zac wrote incredibly eloquently about this newfangled technology. Personally, however, I can't help thinking that, much like 'smartphone', 'post-PC', or 'innovation', even the terminology that has been chosen to represent this next big thing is wholeheartedly ridiculous on the tongue and unmitigatedly egregious in meaning.

All that aside, however, I disagree when Zac posits that this technology is going to be useful. If you recall Episode 44 of Too, in which Steven and I discussed watch technology in detail, you'll remember my apathy towards these devices. The episode centred on a phenomenon to which we are all slaves: information overload. We discussed the fact that if these watches did reach critical mass, then rather than being useful in assisting us to move forward in life, they would simply create yet more channels for annoying and, frankly, unnecessary notifications to infect our minds. If you think about all the notifications you currently receive, make an honest assessment of how many of them you actually require, need, or give a damn about, and then overlay a further, much more intrusive channel of delivery in the form of one of these watches, it's not difficult to understand why I find the notion of this technology to be largely misconceived.

Of course, I appreciate that if the tech did mature, notifications would not be their only purpose, but the fact still remains that even if it did mature, notifications would never not be a part of this tech, and that is reason enough never to go anywhere near it.

Having said that, purpose is but a sideshow in comparison to the primary reason why I believe this technology will fail in its current form: personalisation.

Up until now, consumer technology has taken the form of an item that is separate from its user. Your desktop is a box that lives in your home, your laptop a box that you carry in your bag, and your phone a box that lives in your pocket. All of these boxes are other to their user; we interact with them, but they are not part of us.

With the current paradigm, uniformity is perfectly acceptable, and usually desirable. If there's a guy by the coffee-shop window tapping away on his MacBook, then the moment you take out your own MacBook, an implied connection has been made between you; you are subscribing to the same image, brand, or way of life. You can associate. Whether consciously or otherwise, you are marketing the fact that you conform to a certain image. And let's be honest, us Apple fans are the worst for it.

When you visit a store for clothes, shoes, glasses, or any other paraphernalia you may attach to your body, the statement you are making is not one of uniformity, but one of individuality. By buying this coat over that one, or that shirt over this one, or by wearing those glasses instead of these, or by wearing your hair in this way rather than that, you are making a statement of individuality, you are creating a separation between you and them. What you decide to wear defines who you are, colours your personality, and provides visual cues to your inner idiosyncrasies and characteristics. And that is where wearables fail.

Watches and glasses cannot be uniform: nobody wants to wear the same thing as the person next to them. Uniforms bind us in the workplace. They create a connection within a group of people to link them to a company. But nobody wears their uniform outside work.

Much in the same vein, I do not want to wear a watch, or a pair of glasses, or a pair of shoes, or a wristband, or anything else that is so-called wearable technology, if that thing makes me look like everybody else. And until technology that can be worn can be made to personalise me rather than divest me of human individuality, then it will always fall short.

Apple, more than any other technology company, is not known for personalisation; in fact, more often than not it goes out of its way to preserve its perfect, beautiful image of uniformity. From its products to its website to its persona to its marketing: all of it subscribes to the same uniformity. In fact, all technology companies make their products with at least a nod towards uniformity. And that is why I find it hard to believe that these companies can shift to a new paradigm for technology creation; a fundamental reassessment of their current design principles would have to manifest itself before this technology could compete with traditional products in these categories. Put simply, in the wearable space, more choice is desired, not less.

So for now, though I am likely in the minority when I associate myself with this view, for me wearables are nothing more than a hugely misguided fad.


In this week's episode of Too, Steven asked me: "So is technology your religion?" to which I replied (and I paraphrase): "We're all waiting for 2007 again. We're all waiting for our next miracle."

Religion, in the traditional sense of the word, and I, are not good bedfellows. I've tried many times to 'enforce' religion on myself, and every time it has been an exercise in utter futility. I'm not certain what I thought I may be able to achieve, but attempting to reason myself out of my own reasoning was certainly something to behold; I imagine any hypothetical ventures into dancing, singing, or playing the flute, would look pretty much the same.

So when Steven asked me about religion, I thought we were doomed to discuss a huge phenomenon in the world of which I know barely anything. (Which wouldn't matter, thinking about it, given we nearly always talk about things of which we know barely anything anyway.)

As it happened, I conjured up a half-decent retort about how 2007 was our modern-day equivalent to the parting of the sea, and how, as a result, every day since then until the next 2007 would feel to us like it wasn't a 'big' week.

You see, at the time, I treated Steven's question with a level of flippancy which totally undermined his perfectly reasonable, and ever so articulately articulated question. And ever since, it's been playing on my mind.

I do believe that belief is something most people crave. And when I say that, I say it in the most literal of senses: I don't think it matters one iota what that belief actually amounts to, so long as the physical act of believing remains intact. The trouble with belief, of course, is that most people would find themselves unable to agree with that point, such is its potency for binding people to their chosen brand of faith. But irrespective of that, it seems to me that believing is simply a measure of the human condition. So if you believe in a god, great. If you believe that wearing a tinfoil hat will save you from a supposed alien invasion, good for you. And if, like me, you believe that Jobs was a genius such as Beethoven, da Vinci, or, it seems, most of the population of Ancient Greece, then knock yourself out.

And so, when Steven posed that question, as asinine as all my responses sounded, in hindsight, I stand by what I said: technology is undoubtedly a religion.

We have our idols. We have our temptations. We have our leaders and our prophets. We have our book of hymns and our book of words. We have our churches and our synagogues. We have our followers and our sheep and our blind believers. We have our uniforms and our badges. We have our crazy ones (and not in a good way) and we have our moderates. And we have our wars, boy, do we have our wars.

In technology, every aspect of religion is captured and covered.

Think about it. All of us are guilty of filling virtual page after virtual page with tweets and posts and articles about how Apple's vision of a black rectangular slab of metal is far superior to Samsung's. Or HTC's. Or Microsoft's. Tell me, what makes that different from the incessant nitpicking over one version of scripture versus the next? It all amounts to slight divergences built up into standpoints, built up into communities, built up into societies, built up into countries of people, literally ready to defend their version of what is good about their version of what is better about their version of what is best. And to the bitter end.

On reflection then, when I think about it: who am I to act with derision over those that believe in God? My god is just as unbelievable. My god is capable of the same level of propaganda and pretence and pathos. And my god I would defend nonetheless.

Seems like religion did find me after all.

The future is not that far away

When we look back over the entire achievements of humanity, which ones will we pick out as those that really changed the game?

For every person reading this, that list would be different. Sure, there will be commonality, but no two lists will be the same. Everybody holds different attributes dear to their hearts. Everybody has a checklist of what they admire in a person, and what they find unacceptable. That dividing line will be wildly different from person to person; even within the same person, that line will fall in different places on different subjects. The variety of life allows that to be the case.

So for me, when I look back, and consider what, as a body of life, this species has managed to achieve, I lurch deferentially to those that have given us advancements in science, technology, and the arts. For me, those areas are visceral, imperative, and entirely fundamental to the progression of humanity. And within those areas, there have been some truly remarkable accomplishments.

What I often wonder though, is if those people had not done those things, if history had taken a different path, would those things have eventually happened, and thus our lives would be largely the same as they have transpired to be, or would the here and now be a different here and now entirely?

Of course, we'll never know, but what I'm driving at is this: were those achievements only possible by those people at that time in those situations, or is it that actually, we're all capable of anything, no matter what?

If art illustrates a world of the impossible, science and technology blur the lines of what is possible. What we would never have believed as reality just years past, we now take for granted every day. In that regard, the whole is most definitely larger than the sum of its parts; what we have put in motion is bigger than any of us, and all of us. And so, we are witnessing a change to the way that humanity thinks, and the way that it operates. Feasibility is but a matter of time, not a matter of skill.

In effect, if we can all create the future, then either we're all visionaries, or none of us are. And either way, the premise of 'the visionary' becomes meaningless.

There are many individuals throughout our history that have been given the moniker of 'visionary'. Allegedly they have some supernatural power that we mere mortals do not. They can grab the future, so it goes, and pull it into the present. They can shape our lives after the model of their world. And don't get me wrong, people can inflect history. Some people are more successful at it than others, and in some instances, it is only with the passing of time that the inflection becomes apparent, but in a world where anything is possible, even 'predicting' the future, then there is nothing super about that power at all.

You see, art, to start with, and now more so science and technology, have given all of us those powers. They have equipped us to further this species beyond the scope of what it was likely ever designed to create. All we have to do is do.

So when I think of who I would claim to be a visionary, I don't embrace that process with any specific humans in mind. I look back over our entire achievements and give praise to the fact that we've managed to do what we've managed to do, that we've progressed to the point that we've progressed, and that right here, right now, we stand, as we have stood everyday, on the brink of something special.

The future is not that far away.

Language #6

That joy from the first cup of tea in the morning. That feeling of achievement following a run. That delight garnered when all your homescreen apps fly into place. That feeling of warmth after being out in the cold. That place you go when your favourite song plays. That smile it brings when you find something you lost. That inner feeling when you've been kissed. That breath of fresh air when the worry lifts.

For me, there's another: that revelling emotion when you speak to someone in a foreign language for the first time, and they understand entirely what you mean.

I think it's fair to say that glossophilia is deeply entrenched in my world. All this technological and philosophical cruft that I gabble on about everyday is but a sideshow in comparison to my love for language. If technology were earth and philosophy the moon, language would be the universe. And so today, I've decided to add Language #6 to the list.

What's leftfield about this love is that I seemingly have no understanding of its origin. My parents don't adore language, nor do my grandparents. I don't have polyglots in the family, and, without wanting to sound too glib, I'm English; when it comes to the practice of learning language, the last things for which this nation is known are understanding, humility, and graciousness. And yet, in the face of such adversity, it appears my craving has taken hold nonetheless.

It goes deeper though. Sure, the mechanics of taking in the words, committing them to memory, and slowly building up a bank of words that turn into phrases that turn into conversation that turns into language is enthralling, as is the act of communicating with others via this newfound skill. Evidently, however, I see this linguistic influence pervade so many other areas of my life. Emails can take me many minutes, sometimes hours, to craft. Does it resonate? Is the intonation correct? How does it flow? If the pace changes, what does that imply? How can I simplify without omitting? Will this word, or that sentence, appeal to the reader? It goes on and on. You can hear verbal tics when I speak too. There's phrasing that I latch onto, evolve, and repeat. Variations on 'yes' are my current hang-up. Listen to any episode of Too and you'll hear 'sure' and 'correct' many, many times in place of their three-lettered alternative. In my writing, very rarely do I use 'to get', as it is almost always redundant. Try it: for any sentence in which you use 'to get', replace it with a verb more specialised, more meaningful. It won't be a difficult task. I'm a huge fan of the Oxford comma, and I often start sentences with conjunctions knowing full well the 'rules' say it's not allowed. Oh, and you'll notice in my tweets that laconisms are a particular favourite. Like this one. And this.

There's a beauty that can be derived from language. Not just in the way that it sounds, but also through the way that it looks: I find typography delicious. I dislike the typeface we use on the DYHAMB? site (which won't be around for much longer), and I lust over new and old fonts as much as all those crazy kids lined up outside Apple stores this weekend have been lusting over a gold one or a silver one or a grey one. You'll notice a heavy emphasis on typography in our show cover art. I can't help myself. A photograph is sometimes prescient, but a well-chosen word, in a considered font, can evoke a picture in the mind that is unique to every reader.

So, given the aesthetic value I place on language, it may come as a shock that this time round, I'm picking up German. It's not renowned for its romanticism, of course, nor is it much considered a 'language of use' given its relatively small user base, but German is the one I'm going after next all the same. For more of my life than not, I've been learning European languages. My schooling revolved around them, university too. So when I left, I rebelled, I think, and that's when the love for Japanese entered. And then Apple happened to me, followed by a wider appreciation for technology, and it was only natural, I guess, that once I understood that even the web was based on language, I would start tinkering with it. And now, full circle, I'm back closer to home. Things have a funny way of doing that, don't you think?

Ever since a trip to Berlin, German culture and language have started to take on precise and exact meaning for me. Anticipating Germany is a difficult thing. The obvious elements of the nation's history come to mind, but when you're there, all of that dissolves. Well, it doesn't really. Almost every street corner has a reference, or indeed a relationship, to that very different time. But what inspired me so much is the truly passionate way in which this country has come back from the brink, time after time, more resilient, more determined, and more impressive than the last. Today's Berlin is a hotchpotch of cultures. It's taken all the best elements of pre-war Berlin and reimagined them for today's times. There's something bohemian about the place, something visceral and urgent and demanding, but also calming, sympathetic, and inviting.

Sure, I've only just scratched the surface, but that itch needs scratching some more. And learning German is the key.

And so: that joy from the first cup of tea in the morning; that feeling of achievement following a run; that delight garnered when all your homescreen apps fly into place; that feeling of warmth after being out in the cold; that place you go when your favourite song plays; that smile it brings when you find something you lost; that inner feeling when you've been kissed; that breath of fresh air when the worry lifts; that revelling emotion from learning some more words, and opening some more doors.

Universe: A Prelude

The speed at which technology iterates in the here and now is breathtaking. Six years ago, iPhone hadn't been invented; now look what it can do. Moore's Law may be slightly less stable than it was in yesteryear, but the basic principle applies: the rate of technological advancement is unquenchable, undeterred, and unstoppable. Whether or not you envisage that the singularity will indeed be the next logical evolution of humanity, it is perfectly understandable to be awestruck by what technology is doing right now, and what it will be able to do in the future.

But roll back the tape for a moment. It's 1750, and the tea trade is just getting going. Tea is being planted all across China, and Europeans, with their boats and their superiority complexes, want it. Tea costs £100 per pound. It takes months to get it from one side of the world to the other. And when it gets here, only the rich can afford it. Fast-forward to the present day, and you can buy 200 teabags for an eighth of the minimum hourly wage. It took a long time to get to this. It took us six years to get to iPhone 5s.

Extrapolate the revolutions of that timeline over a long enough period, and it's easy to begin to envisage a situation where steps forward in our abilities, technological or otherwise, are so momentary that the very essence of those quantum leaps is forgotten before they've even been born. At that point, our ability to capture those moments of sheer brilliance is muted. To all intents and purposes, technology is irrelevant.

In a world where computing power can double, triple, or quadruple in magnitude faster than you can blink, it's clear that to our children's children's children, what we call out as 'innovation' now, will be laughable then. God knows what they'll be able to achieve, but with that power and that capability within such a short time frame, technology will all but disappear from consciousness. It will be invisible to the naked eye, it will be imperceptible but omnipresent, never showing its face, but always there when you need it (and when you don't).

This scares people. In that world, power is infinitely more dangerous than it is now, they say. Control of others is control of all. Ability is overshadowed by desire. Individual conquest will outweigh societal improvement.

But, just think about this: in a world where technology is hidden, where it develops and builds and evolves at speeds imperceptible to humans, wouldn't it be true that Apple's current vision would be all but complete: that technology disappears in place of experience?

And if technology has all but disappeared, and experience is but technology, how do we discern where one ends and the other begins? In fact, if technology would be all-encompassing, all-pervasive, would reality be but a fiction?

Join Steven and me, as we debate one simple thing: "Is the universe really what we think it is?" We hope you can join us.

(Episode 36 of Too will be out tomorrow. You can subscribe in iTunes, totally free. Or join us on Twitter.)


It all started a long time back. I tried to create an iPhone app. It was an unmitigated disaster. I had no more idea of what coding was or did or how to use it than I did of flying to the moon. And there I was, jumping directly into Obj C and wondering why my feet never touched the bottom. I sank like a rock.

At the same time, I found myself looking at webpages like I'd never viewed them before. Rather than just taking in the content, I was mentally attempting to break them down into their component parts, understand their relationship with the browser, the user, and the code. I had no idea this was what I was doing, but it slowly, surreptitiously, overcame me.

And then one day, back when books were still made out of paper, I found myself in the always-paltry IT section of the bookshop. I saw a hefty tome advertising that I'd be able to code for the web like a pro after just six hours, so I avoided that one and bought a real book on HTML and CSS.

It took me about a week to get HTML. And about three days to get CSS. And then it took me about six months to put anything decent together. Half decent. Maybe.

That’s the power of words.

Coding is the best thing I ever decided to teach myself. No, seriously. It blows me away that we have the ability to adapt the language that we speak into something that can create things. Words build towers. It's stunning to think about it. And now that I've got a few more languages nearly, almost, not quite under my belt, I find myself thinking about things I could build. Things I could have a go at doing. I've never thought like this before.

In fact, it's changed my entire thought process. I'm still rambling and inefficient and entirely illogical at times, but when I give an issue some thought now, I come at it as you would come at a puzzle. I see the parts, see how they move, and then try to work out how it was built. Or how to replicate its action. I apply ifs and elses, trues and falses, loops and functions, and all that good stuff that code is so eloquent at articulating. Of course, this black-and-white approach is terrifyingly inappropriate for many situations, but still, I can't help myself. It's therapeutic to think things through in logical, causal steps. My thoughts seem to coalesce more than they've ever done before.

I taught myself all of this code so that I could build a CMS to power Do You Have A Mountain Bike?. And I continue to build the CMS to prove to myself that I can code.

The truth of the matter is, I may never do anything more than create that internal system that only a handful of people will ever see or use, but you know what, it's given me back my creative impetus to just think about doing stuff. And that's the first step to doing anything, so who knows what might come next. And to boot, I now think almost sanely about things before jumping.

That's the power of words. I don't feel like I'm sinking anymore.

Rhaudri's birthday

So, I released some new-old photos onto RLT today. I say new-old as I took these some time ago, but have just never got round to getting them on here.

So here they are. Just click to the right-side of the gallery to scroll through the photos.

An analytical analysis of the analysts

It astounds me to read the drivel that pours out of some people's mouths. Here we are, just a day from the latest Apple earnings call, and already the torrents have begun. In fact, they never stopped, but you can't help thinking that it's around these calls that the dirge intensifies, the miasma shrouds all possible escape routes, and suddenly people become hysterically hellbent on being the most hyperbolic, sensationalist or just plain asinine.

And I'm fed up with it. It bores me senseless.

People need to grow up. They need to take a breather before they open their mouths, and they certainly need to stop writing, or using words in any way, shape or form.

Right now. Immediately.

The reason I say that is simple: I'm certain that should this gibberish continue, the entire world will implode under a colossal weight of analytical claptrap.

The fact that these cowboys call themselves 'analysts' is a stain on the real analysts that actually take pride in their work, do more research than spending ten seconds on Wikipedia, and take the time to put forward a respectable and well-reasoned argument. If analysing is but plucking an arbitrary number out of the air and judging an entire company based on its complete and utter randomness, then Jesus H Christ, who the hell isn't an analyst? How do these people get the air time? Which idiot is paying them to make up this stuff?

To my mind, Apple receives scrutiny of an intensity that no other company on the planet has ever experienced. It'd be startling, I think, to work out just how many times the word 'Apple', with a capital 'A', comes up in conversations, on the web, in papers and on TV each and every single day. Put that in one of your fancy analytical graphs, lined up against the other players in the field, and that might actually be a piece of analysis worth looking at.

I often wonder why Apple has a spotlight constantly shone upon its aluminium exterior. Why is it that Apple has been singled out as the one to hate, the one that's doomed, that one that can't innovate, or change, or do anything at all apart from nothing but then it can't even do that because nothing is as bad as doing something but something would be seen as weakness but so would nothing...

Just stop it.

And the reason for this incessant, rabid frenzy? Think about it: it's the company with a fairytale beginning, that went to the brink of bankruptcy and back, and then became the most profitable, wealthiest company on the planet. It has had a visionary at its helm, and now another man of undeniable flair and resolve. It employs more talent per square foot than most do globally. It has changed the lives of millions. It operates as a start up but commands every stage it occupies. It does what it pleases, heeding the warnings and catcalls of nobody. It plays its own game that others find impossible to understand or to emulate. It invents things and it copies things better than everybody else. And it has sex appeal when a tech company certainly should not be sexy.

Let me put it another way: you see a guy or gal walking down the street who you think looks like they have it all, whatever that might be to you. Your first feeling is admiration, in all its brevity, followed by a sustained period of jealousy, envy and bitterness. 'Why do they get all the good luck?', you ask yourself.

And that's where we are with Apple. That's why the amount of articles claiming yet again that Apple is 'doomed' would likely fill an entire quadrant of the universe, because these 'analysts' are almost foaming at the mouth as they will something, anything, everything to go wrong at Apple. Every chink becomes a headline, every minor mishap becomes a mountain.

I can't have everything, so neither can Apple.

Jealousy is a foul, toxic emotion. It breeds insecurity and pushes people to do things they would otherwise never have thought themselves capable of doing. That's why these 'analysts' are making up this stuff, scrambling to denounce Apple's future (and its past, and its present). And that's why they can't bear to see Apple succeed.

Here's an idea: do some research, write something truthful, and be happy with your lot in life. Some people, and indeed companies, rise to the top, others don't. It might not happen a lot, but it certainly can happen. And that's exactly what Apple has done. Now get over it.


I'm having a Bowie day. 'Changes' is, without a doubt, my favourite song of all time.

As Bowie says:

Still don't know what I was waiting for

And my time was running wild

A million dead-end streets and

Every time I thought I'd got it made

It seemed the taste was not so sweet

So true. So much of life is like this. I find myself trying so hard, trying anything, everything, just to prove to myself that I can do it. I'm not alone, I feel.

But then, once you've accomplished it, it never feels the same again, does it? Expectation, anticipation; that nervous state of being nearly-but-not-quite is so much better than the actual thing. At the time it feels like hell but once you've made it, bah, who cares, it's time to move onto the next 'nearly'.

He goes on:

I watch the ripples change their size

But never leave the stream

Of warm impermanence

So the days float through my eyes

But still the days seem the same

Things might look like they change, like they evolve, like they iterate, but actually, nothing ever changes. This fits squarely into my theory of innovation: there is no such thing. People and companies make all this noise about changing the world, and sure they do, but not through anything new. Altered, adapted, transmogrified maybe, but never innovated.

And finally:

Pretty soon now you're gonna get a little older

Time may change me

But I can't trace time

I said that time may change me

But I can't trace time

Again, personal change is but an illusion. Time rolls on, things bubble and spark and ripple, but actually, on the face of it, it's not us or the thing we've created that changes anything, rather it's the environment, the culture that it manifests that changes. Things around us change, but we ultimately stay the same.

Bowie was a genius.

Match point

This week, I've been utterly absorbed by the action at Wimbledon. Apart from F1, it's the only mainstream sport that piques my interest, so it gets headline billing in the Taylor household. Since moving, I've finally got a study, one with an awesome desk no less, and instead of investing in a truly terrible, low-end TV to perch on the end of it, I've dug out my old 2009 iMac and, along with the magic of BBC iPlayer Live, have set it up as a tennis-viewing machine for the next fortnight. It works a treat.

But one thing is bothering me.

You see, if you've been following my progress over the past few weeks or months, both in this place and on Twitter, you'll note that 'redundancy', 'wastage' and 'ornamentation' have been getting something of a bad press. I started with the English language and have since ridden roughshod over language in general, iOS, coding, architecture, design, formalities and space exploration, and will eventually no doubt, as so many things tend to do, end with the meaning of life itself.

This week however, with Wimbledon in full swing, it's the turn of the tennis scoring system. You see, you don't have to be an aficionado of the sport to realise that scoring in tennis is something of an anomaly. And, in an environment where archaic and seemingly indefensible peculiarities of the past that linger on well beyond their sell-by date are all but outlawed, I zeroed in on it almost immediately.

For starters, describing what constitutes a tennis match is almost as Byzantine as the scoring system itself. In football, a match is a tale of two halves: you attack one end of the pitch for the first half and the other for the second. In baseball, the teams swap between batting and fielding, inning after inning. In F1, a lap is one circuit of the track. When swimming, you get to the end of the pool and then you come back.

In tennis, not so. In fact, before you can even begin to describe movement on court, we need some understanding as to why that movement happens. And it's actually with some sense of logicality that you start off, like in many sports, by acquiring points. When you have enough of those, you've played a game. So far, so good. But then, waving goodbye to that aforementioned logicality, when you've got enough of those, you've played a set. Then, and only then, when you've got enough of those, you've played a match. Following so far?

Well, let's shake it up a bit. Back to movement. So, in tennis, one player serves first. Somebody has to. S/he will stand on the right side of the baseline of the court and her/his opponent will stand diagonally opposite on the left. When that point is played, they swap to the other end of their respective baselines. When that point is played, they swap back, and so on and so forth. After the first, third and fifth games, players swap sides of the court altogether, and then this merry dance starts all over again. Oh, and just to make it doubly clear, if a set ends and the total number of games played is even, then the players play the first game of the next set before changing ends. Yup.

Oh, I almost forgot: the scoring.

In most sports, if a player performs an action that results in the winning of a point, the scoring of a goal, or the completion of a run, then, much in the same way that we increment a counter in a loop in programming, we start at zero and each point, goal or run adds one to that score. And so the beat goes on.

In tennis, not so. Logically, the score starts at 0-0. You would naturally assume therefore, that the next point won would change the score to 1-0 or 0-1. Wrong. In tennis, the first point is 15, the second is 30 and the third is 40.

But hang on a minute. Whoa, whoa, whoa. What the...

Yes, you heard me. We have to look to 12th Century France for the answer. You see, back then, Rolex hadn't been invented yet, and so those fancy scoring boards you see at the Majors weren't quite yet ready to roll, notwithstanding the fact electricity was a pretty long way off too... A clock of a different nature, however, was in play. In keeping score, French tennis players would use a clock face, the idea being that you would move around the dial by 90 degrees for each point scored. Starting at 0, or 12 o'clock, you would then move to the fifteen minute position for your first point, the thirty minute position for your second, the forty-five minute position for your third and back to the top for a win.

But hang on a minute. Whoa, whoa, whoa. What the...

Yes, you spotted it. The scoring system is 0-15-30-40, not 0-15-30-45. And it's right about now that one of the greatest peculiarities in tennis of them all arrives. You see, in most sports, when whatever it is that denotes the end of the game happens, be it the number of laps completed, the elapsing of an arbitrary time, or the number of goals scored; when that end arrives, whoever has the most points, or whoever is in front, wins. It doesn't matter by how much, or by how many, or by how far, it just matters that one player or team is in front of the other(s) in some way.

In tennis, not so. You see, in tennis, to win a game, when you get to 40, the next point scored must make you clear of your opponent by at least two points. Which means, if the score is 40-40 and a player wins the next point, then play continues. The player who gets that point is said to have the advantage, therefore the score becomes 40-A or A-40. If that player scores again, thus being two points clear, they've won the game. If the other player wins the point though, the score actually reverts to 40-40 and play continues once more, bouncing back and forth between these states.

Yes. The score. Actually. Reverses.
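If you think in code (as I've been learning to), that whole dance, reversal and all, condenses into a few lines. This is purely my own illustrative sketch; the function, labels and player names are made up, not any official notation:

```python
# An illustrative sketch of scoring a single game of tennis.
# points_a / points_b are raw points won, counted the sane way: 0, 1, 2, 3...
LABELS = {0: "0", 1: "15", 2: "30", 3: "40"}

def game_score(points_a, points_b):
    """Translate raw points won into the call you'd hear on court."""
    if points_a >= 3 and points_b >= 3:
        if points_a == points_b:
            return "deuce"  # 40-40, however many times we've bounced back here
        if abs(points_a - points_b) == 1:
            return "advantage " + ("A" if points_a > points_b else "B")
    # A game is won only when a player has at least four points AND is two clear.
    if max(points_a, points_b) >= 4 and abs(points_a - points_b) >= 2:
        return "game " + ("A" if points_a > points_b else "B")
    return LABELS[points_a] + "-" + LABELS[points_b]
```

So `game_score(3, 3)` is deuce, winning the next point gives you the advantage, and losing the one after that really does revert the call straight back to deuce.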

Back to the clock. So, having to be two points clear means 45 isn't going to work, as another fifteen minutes takes you back to the top of the clock. So the French, we can only assume to make their theory fit, adjusted the final score to the 40 minute position, with advantage becoming the 50 minute position and winning taking you back to the top of the clock; thus a score of 40, rather than 45, was born. Phew...

But bear in mind that all of that is just to win one game. Remember, you then have to win a series of games to win a set and a series of sets before you've won the match. In fact, you have to win six games before you can take a set. But only if you're two games clear of your opponent. 6-4 is a win, but 6-5 is not. So what happens then? Well, much like the clock, we make it fit by way of an exception, of course. Either Player A wins the next game, taking them to 7-5 (thus two games clear), or Player B wins, taking them to 6-6, and a tiebreak is forced.
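The set logic, exception and all, pins down just as neatly. Again, a purely illustrative sketch of my own (and one that ignores final sets, which play by their own rules):

```python
# An illustrative sketch of when a (non-final) set is decided.
# games_a / games_b are games won by each player.
def set_state(games_a, games_b):
    """Classify a set score: won outright, forced to a tiebreak, or still going."""
    leader = max(games_a, games_b)
    margin = abs(games_a - games_b)
    if leader >= 6 and margin >= 2:
        return "set won"             # 6-4, 6-3... or 7-5 via the 6-5 exception
    if leader == 7 and margin == 1:
        return "set won (tiebreak)"  # 7-6: the tiebreak settled it
    if games_a == 6 and games_b == 6:
        return "tiebreak"            # 6-6 forces the tiebreak
    return "in progress"             # includes 6-5: at least one more game, please
```

6-4 wins, 6-5 plays on, 7-5 wins, and 6-6 forces the tiebreak. One exception stacked on another.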

And just when you thought things couldn't get any weirder, any more nonsensical or any more goddamn crazy, tennis does something you would never expect. Like an articulated lorry that veers into your path, or a loose paving slab that splashes water all over your clothes, tennis takes this particular opportunity to throw in a curveball. For in a tiebreak, 0-15-30-40 retires and 0-1-2-3 takes centre stage. The scoring system changes. The mind boggles...

So that's it, right? I mean, surely it couldn't get any more bizarre?

Well, in most sports there is a finite amount of time or length of travel that passes before the game is called to a halt. Football is ninety minutes, plus stoppage and/or injury time. Swimming is set by the distance required to travel to meet the criterion of that race. In F1, it's usually the number of laps, but if two hours elapses first, the race is stopped and the car in front wins.

In tennis, not so. You see, when you get to that 40-40 state, the bouncing back between 40-A, 40-40 and A-40 can continue indefinitely. As long as it takes until one player is two points clear. And you remember that tiebreak? Well, again, you have to be two points clear of your opponent to win a tie. But there isn't an upper limit to the number of points played or the amount of time elapsed before the tiebreak is called, oh no. Instead, tennis players must march on, regardless of the waning light or the screaming of their legs or their loved ones wondering what the hell happened to them. In fact, the longest match in tennis history was played between John Isner and Nicolas Mahut at Wimbledon and lasted eleven hours and five minutes, with the fifth set alone taking eight hours and eleven minutes to complete. The score was 70-68.

I think I've made my point.

So, if the multilayered, multidimensional, multi-freaking-mad scoring system laid out thus far wasn't enough to comprehend, well here goes with a final round. You see, as well as ditching convention in favour of some crazy clock nonsense, what we actually call those points isn't all that clear cut either. 0-0 is not 'zero-zero', 30-0 is not 'thirty-zero' and 40-40 is not 'forty-forty'. Try 'love-love', 'thirty-love' and 'deuce', respectively.

Again, we look to the French [1].

The etymology of 'love' is largely unknown, but popular belief tends to lean towards the French word for egg, l'œuf (pronounced more or less like lohf), as a 2D representation of an egg is largely the same shape as a zero. 'Deuce' is again French and thankfully more understandable etymologically; its usage, however, is not. You see, 0-0 is 'love-love', 15-15 and 30-30 are 'fifteen-all' and 'thirty-all' respectively, and so only when the score is 40-40 do we refer to a tied score as 'deuce'.

If anybody happens to have made it thus far, congratulations. Pat yourself on the back. Take a breather. Don't play a game, a set, or a match of tennis, whatever you do.

In seriousness though, believe it or not, my love of tennis has never been stronger. As a game, as a sport, and as a pitting of two individuals against each other, there isn't much that can beat it. Both the brute force of boxing and the intellectuality of chess are required in order to overcome your foe. It is a classic game of cat and mouse.

But please, if it's the last thing we ever do, let's rid ourselves of this insanely complex scoring system. Sure, I totally get the etymology, the genesis, the source of how it came to pass. But out with the old and in with the sensible.

There has to be a better way.

[1] Though lawn tennis is thought of as a quintessentially British affair, the genesis of the sport predates Wimbledon by a 'mere' eight centuries, all the way back to 12th Century France.

The New Order

This week there has been a lot of chatter. A lot. And it's not over yet, not by a long shot. As we were treated to announcement after announcement at the opening keynote of this year's WWDC, the tweets started piling up: it's too flat, it's too bright, I hate that name, the cars were ridiculous. A never-ending torrent that has still got more mileage than the remainder of the Formula 1 season; the only logical solution was to mute certain words from my Twitter stream altogether, at least until next week. And even then, it might be too soon.

So, paradoxically, here I am, adding to that chatter. I guessed though that rather than spreading the infection that is Twitter's latest pandemic, I might squirrel away my commentary over here, in a place where if you want to find it, you can come and get it.

This week on Too, as seems to be our way of late, in an episode of the same title, Steven and I avoided direct talk about the keynote itself, and headed directly for the meaning and philosophy, namely: does this keynote signify change at Apple? (It seems much more wholesome than discussing whether the curvature of the new icons in iOS 7 is 'right' or not, doesn't it?)

The short answer of course, is: it's surely too early to tell. With less than a week since the announcements, who could honestly say? And yet, I think it was fairly obvious that change was afoot in Cupertino, and, more importantly, Apple explicitly wanted those changes to be known.

Sure, it came across most obviously in the stark change of tack with iOS 7. But, for me, it was infinitely more sophisticated than simply a design choice: for me, this keynote signified the drawing of a line under the Apple of pre-keynote, and a direct statement of intent that this is a different Apple, under a different guise, post-keynote.

I can already hear the naysayers condemning me for what they may think is a sensationalist slant on events, so allow me to qualify my thoughts. I'm not saying that Tim woke up one day and decided that it was time to pack away Steve in a box for posterity's sake, never to be seen or talked of again. Absolutely not. Rather, this keynote signified that the past, the history of Apple, though never forgotten, was but exactly this: a superlative culture of historical importance that would always remain imbued in the ethos of Apple, but a history that would no longer flavour or influence a future Apple. Steve very famously told Tim that he should do what Tim thought was right, and not think about what Steve would have done; as usual, Steve had foresight beyond his years. He knew that mimicry and a clinging to the past would not be in the best interest of Apple. And now, for the first time, we are seeing Tim carve out a new groove, a slightly altered tempo, if you will, for the company. What happens from here on in, is truly at the discretion of Tim and his team.

And for me, that is the most exciting phenomenon to occur to Apple since iPhone.

I know that this very matter divides opinion, sorts those averse to change from those willing it to happen. And for a long time, my heart was squarely in the former. It could not and would not let go of what Steve had created. He left and the thing fell apart; for a die-hard Apple supporter, at the time of Steve's passing, that was a clear indication to me that Steve and Apple were one and the same, and to remove one element would inexorably cause the demise of both.

I was wrong. But Steve wasn't.

So, here we are. With purpose and pace and precision, in just two hours, Apple stated its intent to the world. And for me, that statement was clear: Apple will continue to do whatever it takes to be Apple. If there were anything that lived on from Steve's era, that ethos is surely it.

We live in exciting times. I cannot wait to see what comes next.

OS X Helvetica

My one and only prediction for this year's Dub Dub: OS X adopts Helvetica as its base operating font.

There, I said it.

UPDATE, 20/06/13: I was so drastically wrong. This time. It's still going to happen, mark my words.


Oftentimes, you’ll read tweets in which people moot a particular grammatical faux pas they have had to commit in order to fit their message into 140 characters. Perhaps an apostrophe is removed, or ‘and’ is supplanted with an ampersand, or hyphens are disembowelled from their associated prefixes. It’s almost like we have a nervous tic, as if we have to ensure that everybody around us is perfectly aware that actually, we do understand the subjunctive and its proper usage, thank you very much.

Well I say bollocks to the lot of it.

Having studied both English and a number of its foreign accompaniments for the majority of the life that I can comfortably remember, and having lustfully looked upon the linguistic leadership of those more lexically periphrastic than me, you would think that I would be firmly and squarely in this puritanical movement. I am not. Whenever I hear somebody correcting somebody else on their incorrect use of a split infinitive, or a sentence that ends in a preposition, or indeed the omission of an apostrophe where there should be one (or more likely the addition of an apostrophe where there shouldn’t), I wince and cower and balk, hoping to Christ that the proselytiser isn’t going to draw me into their cult.

It shames me that we even debate language in this way. For some unfathomable reason, linguistic aptitude, whatever that means, has become yet another badge that we can pin on our uniform to distinguish us from them. If you have bad language, they cry, then you must be bad of mind, bad of talent, bad of bloody everything.

Well, it’s simply not true. Some believe that language is a constant, that its construction and its laws are potbound, immovable, everlasting, as if somehow languages are impervious to environmental and cultural change. If we can appreciate that everything else in the world is entirely mutable, then why do we have this misanthropic misunderstanding that language must be retained in its current state?

At its base, language is the arbiter of communication. It allows Person A to convey a message to Person B, and Person B to respond to that message. And so, this leads me to my biggest sticking point: if both parties fully understand the interchange, what difference does it make that the words are in the wrong order, or spelled incorrectly, or lack grammatical coherence? What viable argument can there be to halt that communication, and point out the inaccuracy of how that message was relayed, if in fact understanding was retained?

English is a convoluted and pedantic language. It has peculiarities that to the uninitiated can be entirely inexplicable. The inscrutable way in which parts have been bolted on top of other parts, like a wavering tower of Jenga pieces, makes understanding its complexities as difficult as wading through treacle. And yet, as people attempt to traverse the minefield that is the English language, we find it appropriate to introduce snipers, tanks and torpedoes to the battlefield, just in case anybody might actually be on for making it to the other side.

Of course, language is about so much more than just communication. In whatever form it takes, it can be worked, much like anything else, into a beautiful, artful masterpiece. It allows us to inform others of our personality, of our idiosyncrasies, of our independent and unique perspectives. Deriding and debasing those that use it in a different way to you is like saying Beethoven was wrong because he didn’t play like Rachmaninov.

Language changes. It evolves. It moves with the times. And if you’re one of those that hates it when others drop aitches from the beginning of words, or use the imperfect past instead of the perfect, or don’t differentiate between the use of ‘less’ or ‘fewer’, ‘there’ or ‘their’, then seriously, ask yourself this: Why haven’t you changed too?

The yoyo

I love philosophy. You see, I say that, and actually, I have no idea what I mean by it. The first paragraph of Philosophy 101 is about as far as I’ve got. Watching The Matrix doesn’t make you an expert. Skimming Simulacra and Simulation doesn’t make you Plato. Being able to pronounce Nietzsche doesn’t connect you with the inner workings of the universe.

That aside, there are some deep, deep questions within the subject that enthral me. Captivate me. Things I’ll always wonder. Things I’ll never understand.

There was a pretty dark period in my life, when I was doing stupid things, taking alarmingly unnecessary risks. To get away from the dementedness of it all, sometimes I’d find myself sat alone, literally trying to piece together the meaning of life. God only knows why. It never occurred to me that these sessions with myself were probably enabling rather than abating, but that’s a story for another day. It was a battle with my own conscience, a fight that quite frankly, I was always going to lose. I was looking through some boxes full of old papers and notepads recently, and happened across pages and pages of insane notes. I clearly didn’t have a clue, but at the time I remember ‘discovering’ a theory I thought explained everything. That insatiable need to have meaning, reason, understanding; it was deafening. I look back and wonder why I put myself through it.

Ever since, philosophy has left an indelible mark on my mind. It’s fascinating to me that there are incredibly intelligent people on this planet that are still debating theories created thousands of years ago, theories that will never, ever, be proven. And yet, human nature continues to walk the line, pushing and pulling, like Houdini and his straitjacket. Only this time, there’s no escaping.

My thoughts on philosophy are primitive at best. But I have them, and every now and again, I catch myself running down the track, my mind unravelling like a yoyo, all the way to the limit of the string. And it’s at that point, when the string is taut and the tension unbearable, the yoyo spinning its incessant carousel, that I believe I’ve finally crossed the Rubicon. And then, snap! The yoyo comes flying back, the tension dissipating, the thought gone, the meaning unexplained all over again.

I’m acutely aware that writing about philosophy is likely to be a death knell. I appreciate why it switches people off, why they back away from the nutcase dreaming about reality in the corner, why something that is largely unexplainable is too much for most to bear, why when there are tangible problems to solve here on planet Earth, dreaming about the cosmos seems like folly. But to me, it’s the greatest vortex ever created. It draws me in. Its allure is inescapable. I love going another round with a foe that is bigger and stronger and better than me, and knowing, all along, that it’s going to win. That I’m going to come away with more questions than answers.

That’s power. That’s insurrection. That’s beautiful.

Do something great

When you look back through history, who do you regard as heroes, idols, those that in your mind made something happen? Of course for everybody, the list will be entirely different; so many factors lay foundations for one individual’s perception of what ‘great’ looks like, it’s hard to make a list that in some way won’t be disputed by the person sat next to you. A disagreement ensues, where those who contend your assertions immediately look to the negatives, using them as an argument for why you are wrong.

Like everybody, I have those that I hold in high esteem, but actually, when I boil it down to the atoms of what ‘great’ looks like, it’s less about the people themselves, and more about the qualities that they embody. For me, people carry too much baggage; undoubtedly every great person has a string of misdemeanours they would rather you forget. So to avoid falling into a lengthy debate over why your top ten is different to mine, here I choose to focus on one particular quality instead.

‘Modern management’ frustrates me with its constant reliance on committee. Whenever a big issue takes to the stage, it is a very rare occurrence when an individual stands up for that in which they believe, and asserts that actually, this is the way it’s going to be, no questions, no debate. Instead, you have stagnation, indifference, and back-pedalling, disguised and veiled as ‘an opportunity to let everybody have a voice’, to let the people speak, to engender community and social decision making, to get ‘buy-in’ from those the issue will affect.

The scenario is one of incentives: a decision needs to be made on how to get a workforce to be more productive. There are two possible approaches: in the first, the leader makes a decision, informs the troops, and onwards the team moves. In the second, the manager opens up the floor, gives time to debate amongst the troops, and a solution is collectively agreed upon. So what just happened? In the first instance, a leader earmarked a problem, applied a pathway to resolution, and informed his or her team of how to proceed. In the second instance, a manager earmarked a problem and then allowed every voice to interpret that problem for themselves, from their own perspective, before eventually setting out to solve an entirely different problem from the one they started with.

How many committees do you revere? How many groups do you remember? How many congregations caught your attention?

The disparity may seem inconsequential, but it can have huge effects on what happens next. The whole reason a leader is in place is to lead. And the reason they are better equipped to do that from their position in the organisation is not because they are in charge and you are but a minion, but because they see a holistic problem that needs a holistic solution. As soon as the gates are opened to everybody, that problem is lost as each voice decides for themselves what is important, and what would work best for them, as an individual. At that point, the leader is no longer a leader, they are but a manager. And to boot, they managed that situation badly.

Of late, ushering it in on a ticket of improving staff engagement or reducing voter apathy, leaders think that should those ‘affected’ have an opinion, should they be heard, then they will feel more inclined to come on board. What they seem to forget, however, is the whole reason why they raised the issue in the first place. In the above example, worker productivity falls off the slate and instead it’s about how that individual can make their life easier, make their environment customised for their needs, how things can be better for them. Sure, that’s great for that person, but for that person alone. And to hell with the team, or the mission, or the whole reason why you’re there in the first place.

Now open your history books and decide who is ‘great’ to you. How many committees do you revere? How many groups do you remember? How many congregations caught your attention?

Who you’ll remember are individuals, single people who stood up for their values, took the initiative, and for better or for worse, moved forward on an issue. Most likely they were divisive, most likely they had as many detractors as they did supporters, but such is life. Leadership comes with a price: not everybody will like what you have to say, and not everybody will want to support you. But simply managing comes with an even bigger price: indecision, flatlining and a total lack of direction.

Yes, the former will likely cause you enemies, but nothing got solved by a committee.

People don’t change

The desire to evolve, to change; I think it’s simply a part of me that I’m going to have to live with. Some people seem to be immune to the need to mix things up, swap things around, get a new perspective, yet I have never been good at staying in one place for too long. I guess that’s why an interest in technology is such a perfect fit for me. This microcosm is frenetic, it doesn’t stand still for long, and that allows me to flit from one thing to the next without much friction.

What this type of perspective prevents me from doing, however, is developing something into a more long-term project. And that, that right there, can be debilitating. It’s not that I have a short attention span, far from it. What I do have though is an incessant need to learn something new, try out something different. If you’ve visited Do You Have A Mountain Bike? more than once within the last year, then chances are you’ll have seen at least two of the three redesigns. That’s not because I particularly disliked the ones that went before it, though I’m never satisfied, it has to be said, but rather I see something that looks incredible elsewhere and I want to adopt it, iterate on it, assimilate some of its beauty into my own work.

Do You Have A Mountain Bike? is probably a bad example, simply due to the fact that web design is relatively new to me, and I’m still trying to find a footing in terms of what I like and appreciate in modern web design. I’m still trying things out, I’m still trying to find a ‘style’, and of course that is a problem-layered-on-top-of-a-problem with a mind that wanders all too soon.

And then I realise that the all grown up part has already arrived, and nothing has seemingly changed.

So maybe the fact that I’ve lived in six different apartments in ten years is a better way to conceptualise what I’m trying to say. Maybe the twenty pairs of jeans in a bottom drawer that never see the light of day and have perhaps been worn once each, if that, would do it. There’s a chance the plethora of magazine subscriptions, or saved URLs, or unused notepads, or gigabytes of music that I hoard, hoard, hoard, only to drop in the next breath; maybe that illustrates what’s going on here. I could go on.

Whenever that spark of interest hits me, it seems that it has to be new and exciting and fresh. To some, that will sound trite; others will probably call me a hipster (and already have); and others still will see me as fickle or unable to make a commitment. I see it as something that gives me great pleasure. And something that makes ‘making a go of it’ really very hard.

If we skip back to Do You Have A Mountain Bike? for a moment, then at least that’s something that seems to be staying around. Although it is of course very new, it’s lasted much longer than I ever thought it would. And with a study wall covered in mountains of sticky notes containing possible ideas for what to do next, and with a to-do list overflowing with tasks that I need to complete to take Do You Have A Mountain Bike? forward, it’s both surprising, and pleasing, that at least for once, something is being sustained.

I always say to myself that when I’m all grown up, my focus will crystallise and suddenly, like the wool being taken off my eyes, I’ll look upon a landscape in which I finally want to build a house. And then I realise that the all grown up part has already arrived, and nothing has seemingly changed.

House always says that people don’t change. I tend to agree. If I’m stuck like this for the rest of my days, then at least one thing is for certain: I’m never going to get bored.