<p>Duncan Anderson's Blog: some random musings from someone who enjoys mobile technology. You can contact me on <a href="http://uk.linkedin.com/in/duncansanderson/">LinkedIn</a> or <a href="https://twitter.com/duncsand">Twitter</a>. IBM pays my salary, but the opinions I express here are my own, not necessarily those of Big Blue.</p>
<h2>Why the smartphone is more like a car than an MP3 player</h2>
<p>Clayton Christensen is one of the most celebrated business thinkers and writers. His seminal writings on the subject of <a href="http://www.claytonchristensen.com/key-concepts/">“Disruptive Innovation”</a> helped to explain the forces of commoditisation. In them, he argued that most companies focus on the top of their market, where the profit comes from. This makes them vulnerable to cheaper, but “good enough”, upstarts innovating at the bottom of the market. This is a good thesis for explaining the consumer electronics market, where it has been proven true again and again. But I don’t think it helps explain the nature of the smartphone market.</p>
<p>Superficially, Christensen’s explanation seems to neatly define why a company like Apple is vulnerable. After all, Apple is the very definition of a successful company focussing on the top of the market. They are therefore doomed, as the narrative goes. But I don’t think this is the case - and the easiest way to explain why is to look not at consumer electronics but at cars.</p>
<p>For if Christensen were universally right, a company like BMW shouldn’t exist. Toyota, Skoda and Nissan long ago proved there are cheaper, reliable, and functional alternatives at half the cost of a BMW. So why is BMW not just still here, but thriving?</p>
<p>I believe that BMW exists because its product exhibits a number of attributes that make it immune to the forces of commoditisation:</p>
<ol>
<li>Innovation means the product is continuously being reinvented, not just refined. Electric drivetrains, safety changes, self-driving and entertainment innovations all mean the car of tomorrow is nothing like the car of today. As a result, consumer expectations of “good enough” are constantly advancing.</li>
<li>Consumers highly value the fashion, brand and aspirational social status of the products. These things ensure “good enough” isn’t good enough for anyone who can afford to pay a little more.</li>
<li>The product is essential to consumers’ lives and is used on a constant basis - meaning it’s easier for consumers to justify an upgrade or investment, because of the critical nature of the product.</li>
</ol>
<p>Once an MP3 player has enough storage capacity, it’s hard to imagine how it can be “better”. But a car isn’t an MP3 player. So we always want the next innovation because it improves our lives. Or sometimes we don’t need it, but we’ll still buy it because that new metalwork looks “oh, so stylish”. In these ways BMW ensures its continued place as an innovator at the top of the market. </p>
<p>Despite its position in the consumer electronics industry, the smartphone is more like a car than an MP3 player. The definition of a modern smartphone is continuously changing, manufacturers have successfully worked out how to exploit fashion and brand and the product is at the centre of how we live our lives.</p>
<h3>Continuous product redefinition</h3>
<p>Some have fallen into the trap of thinking a smartphone is a phone - and is already “good enough”. Like an MP3 player, it’s hard to imagine how different or better a telephone needs to be. But the last thing a smartphone is, is a telephone.</p>
<p>My teenage daughter has removed the phone app from the home-screen of her iPhone. Where I see a phone icon in the bottom-left corner, she has a messaging app. When I questioned her, she very logically protested that “nobody my age uses a phone”. Her iPhone is a computer, not a telephone. Notice how the <a href="https://www.apple.com/pr/library/2007/01/09Apple-Reinvents-the-Phone-with-iPhone.html">original iPhone press release</a> talks about the iPhone being a “Breakthrough Internet Communications Device”.</p>
<p>Whilst the power of the original iPhone seemed adequate for the time, the processor in today’s models delivers <a href="http://www.extremetech.com/computing/189787-apples-a8-soc-analyzed-the-iphone-6-chip-is-a-2-billion-transistor-20nm-monster">50x the performance of that device</a>. It is hard to imagine using that original iPhone today - it would be <em>far</em> too limiting and slow for anything remotely serious. But more significantly, that processor innovation has enabled entirely new uses for the smartphone - some of the things we now use it for weren’t even conceived of in 2007. </p>
<p>When we judge the level of innovation in the smartphone market, we shouldn’t look at individual product releases. That’s like saying “is the 2015 BMW 520i sufficiently different from the 2014 520i to make people upgrade?” (which of course it isn’t). Rather like the car market, we can only see the true level of innovation across product cycles. So “is the iPhone 6s sufficiently better than the iPhone 4 to tempt upgraders?” is perhaps a better question to judge the level of innovation.</p>
<p>One data point for how the definition of a good smartphone is changing is that people now talk seriously of the smartphone replacing the camera. Indeed, the <a href="http://petapixel.com/2014/12/15/chart-shows-badly-digital-camera-sales-getting-hammered-smartphones/">compact camera market is already succumbing</a> to this threat. The camera on the original iPhone was just a toy, but the latest models can be considered <a href="http://petapixel.com/2015/03/01/20-of-apples-favorite-photos-shot-with-the-iphone-6/">serious cameras in their own right</a>.</p>
<p>We can now edit videos, play high-end games, compose music, write documents, edit images and many other tasks on our smartphones. Product innovation is changing what a smartphone is, not just refining its capabilities. </p>
<p>And there is much yet to come - who wouldn’t pay for a smartphone with a week-long battery life, a screen that’s as easy to read as a book in sunlight, or that provides dynamic haptic feedback that makes the on-screen keyboard feel “real”? The smartphone of tomorrow will be radically different to that of today - and we’ll almost certainly find we <em>need</em> those innovations.</p>
<h3>Fashion, brand and social status</h3>
<p>We value the design, style and quality of our cars, and we do the same with our smartphones. For an essential item we use all the time, many are willing to spend the money to get a better model. Or to trade-up when our current model starts to look a little tatty around the edges. </p>
<p>Many upgrade their cars every few years. We don’t have to - for although the seats may be a little dirty and the paintwork not as shiny as it once was, our old car is frequently still perfectly functional. But “perfectly functional” isn’t good enough when we can afford something better. The same is true with smartphones. We change our smartphones for much the same reasons we change our cars. The new model looks nicer, is faster, has more features. We might not need it, but we want it.</p>
<p>Whilst some don’t see the appeal of a “premium” product, that attitude is a relatively niche one. Whatever the rights and wrongs, there’s undeniably a healthy market associated with aspiration and style.</p>
<p>But aspirational brands and products shouldn’t be mistaken as purely superficial. If that were the case, consumers would soon catch on. A BMW is a genuinely good car and very well engineered. The fit-and-finish is excellent and it drives superlatively. That quality allows the brand to have a consistent appeal over many years.</p>
<p>An iPhone’s engineering is similarly great. Like a BMW’s engine, its <a href="http://www.anandtech.com/show/8514/analyzing-apples-a8-soc-gx6650-more">processor design is industry leading</a> and the physical engineering of its case is top class. Even if you value other attributes of competing products, it’s hard to ignore the quality and engineering inherent in the design. And so there’s a substance to its position as a “premium” product that allows it to hold that position consistently. </p>
<p>There may be cheaper products that are perfectly functional. But “perfectly functional” isn’t the aspiration for many. Like cars, smartphone brands and products have become aspirational. </p>
<h3>Critical role</h3>
<p>Our smartphones are how we communicate with both loved ones and colleagues, they are how we navigate, they entertain us with games, we listen to our music collection on them and they are how we consume the day’s news events. Without a smartphone, most of us would be lost in the modern world.</p>
<p>If you’re one of those modern city-dwellers who’s calculated that a combination of public transport and Uber is more cost-efficient than car ownership, I bet I couldn’t prise your smartphone from your hand. And if I did, how would you know the times of the trains or how to book an Uber?</p>
<p>We have only to watch the human misery of mass migration on the news to notice one common and consistent theme. Those who have left their homes in hope of a better life carry one thing with them: their smartphone. The smartphone is their link to reality amongst a sea of misery. It helps people keep in contact with loved ones in the most trying of circumstances and helps them navigate treacherous routes to safety. If you’re going to leave a war-torn country and travel half-way around the globe, your smartphone (and its charger) is your most prized possession.</p>
<p>The smartphone isn’t just a critical communication device for the chattering classes, it’s become essential to people of all backgrounds. </p>
<h3>But…</h3>
<p>There’s one enormous difference between a new BMW and a new iPhone. The finance payments on a new BMW are hundreds of pounds every month, but a new iPhone can be purchased outright for just one of those monthly payments. As such, the iPhone becomes the “affordable luxury” for not just the elite, but the masses. </p>
<p>Much like the car, we have a product in the smartphone that is essential to people’s lives, is constantly being redefined through innovation and appeals to the sensibilities of fashion and style. I think it’s reasonable to consider the smartphone market much more like the car market than the consumer electronics one. That’s why premium smartphones continue to exist and why people continue to buy them. The smartphone is much more like a car than an MP3 player.</p>
<h2>Taking the blue pill</h2>
<p>I previously blogged about <a href="http://duncan-anderson.blogspot.co.uk/2015/10/breaking-rules.html">breaking rules and why it should be a skill we all learn and perfect</a>. </p>
<p>In this post I’d like to present the counterpoint to my previous argument. Today I shall be arguing why rule breaking is a bad thing.</p>
<p>Breaking rules puts you on the outside. It excludes you from the rule-makers’ club. This makes it tough for you to influence that club. For throwing small stones from the outside achieves little when the windows are reinforced. Pebbles are useless, you need boulders. And you probably don’t have boulders.</p>
<p>Instead of futile pebble throwing, work out how to get on the inside. Work out how to infiltrate the system, and then influence it from the inside - this can be much more effective.</p>
<p>Bide your time, hold your tongue. Hide your opinions when they are strong. Keep yourself to yourself until you’re on the inside. Don’t give your hand away. Have patience. </p>
<p>Breaking rules can have the effect of offending the rule makers. It puts their backs up. People who might otherwise have supported you might turn away - making it more difficult to influence them. </p>
<p>Showing anger or emotion can be a sign of weakness. It can be used against you - “he’s unstable, he’s letting his emotions get the better of him”. Better not to display that anger, so you appear reasonable. The rule makers love reasonable people, people who don’t threaten their position. </p>
<p>Rule breaking is also the more difficult path personally. You’ll have a tough time, feel on the outside, feel not one of the club. That can be tough emotionally. Sometimes it's better to play the system to get on. </p>
<p>Gaining influence and position is the objective. For without influence and position it’s hard to change things. This is why politicians sometimes compromise their principles in order to broaden their support - for a politician can only really change things if they get elected. To be elected on a manifesto of doing half what you would like is better than not to be elected at all.</p>
<p>Once you’ve been elected or gained entry to the rule-makers’ club, that’s when you can start changing things. Perhaps. Maybe you’ll just bide your time a little more to consolidate your position. You’re a new girl/boy in the club, after all - it would not be prudent to make a noise until your support is broad. Perhaps you can make some small changes whilst you prove yourself. Next year might be the time to do something more dramatic. Perhaps.</p>
<p>Do you still want to take the red pill? Perhaps the blue one is looking more attractive? Does the blue pill feel like you’re selling out? Or maybe you feel you can manage the conflicts associated with it? Maybe the blue pill is the way to get things done? Life is complicated.</p>
<h2>Breaking rules</h2>
<p>We should all follow the rules, right? </p><p>Countries have laws, companies have policies, societies have rules. They all amount to the same thing - an attempt to place some order on chaos.</p><p>If the rules are wrong, we work to change them, we don't break them. For if we just ignore them, chaos would reign. No organisation can allow any random person to ignore its rules. This is how civilisation and business were built, right?</p><p>Actually, no.</p><p>Civilisation was built by people breaking the rules. Women got the vote because people were <a href="https://en.wikipedia.org/wiki/Women%27s_suffrage#United_Kingdom">willing to break the rules</a> of the day. Slavery was abolished because people broke rules. </p><p>Maybe we shouldn’t compare the epic struggles for human rights with the struggles associated with a startup business. That might be going too far. Or is it? There are some similar principles at work - the overriding need for change.</p><p>Uber cannot request new rules to support its new form of taxi service. To do so would probably take decades, as a combination of inertia and vested interests work to frustrate any change. No, it sometimes needs to break or skirt around those rules in order to launch its business in a new city. In so doing, it creates evidence of demand for its service that pressures authorities into making their rules match the new reality. Change doesn't happen by people asking politely, because the polite request is likely to be squashed. 
To get change you need to create a lot of noise.</p><p>It turns out that most rules are put in place by old rich people. I say that provocatively. I don't necessarily mean <em>really</em> rich people, just people who have something to lose. Those with a certain investment in the way things are. That immediately means anyone who’s worked their way into a position of influence - they owe their role in society to the rules within which they've worked. They're probably a bit older than most because you have to work your way through a system to become the custodian of the rules. Rules are hardly ever made by the people at the bottom - the poor, the young, those without influence or education. No, rules are made by the old rich people. And there’s a problem with that.</p><p>Old rich people are susceptible.</p><p>Some of them just can't envisage a world that's different. They’ve spent their lives getting to where they are with a certain mental model. It's not their fault, but they just struggle to see that things could or should be organised differently. They have become set in their ways and are blind to the need for change.</p><p>Some of them just don't want to change because it's not in their own interest.</p><p>And some of them are perhaps easily influenced. There are a lot of vested interests lobbying to keep things a certain way. A little naivety can make an individual susceptible to the wrong influence. A friendly handshake, a nice meal, a feeling of friendliness - these and other flatteries can influence minds. In contrast, the uncouth protestors, the angry, the disrupters - they're dangerous, not one of us. This is why “political lobbyist” is a job - subtle influence works. It’s especially effective when you have money - for the nice meal, the day at Ascot and the like don’t come cheap.</p><p>There's one other thing about the rule setters. They're very good at coaching their successors, at creating a system whereby others earn their role through effort. 
Small incentives of prestige ensure there's always a new generation to take over - but one which fundamentally thinks the same way. The status quo can be maintained.</p><p>So for those who want to fundamentally change things, it's a tough gig. That's why they often don't play by the rules. For if they did, nothing would happen. Or it would happen at a glacial pace, which is the same thing.</p><p>That's why companies like Uber sometimes sail a bit close to the wind. It's why some argue that Google plays fast and loose with the copyright of newspapers and books. It's why technology companies sometimes get sued - for to build a new product without treading on the toes of any existing providers is sometimes impossible. It's why the phrase “it's better to ask for forgiveness than permission” is a mantra of those wanting to change things.</p><p>Change often doesn't happen within the rules. The rules are put there to keep things the way they are. Maybe that's not the explicit objective of those setting them, but it's the inevitable outcome. Old rich people just don't change things quickly or easily.</p><p>This is why the rule setters are often outraged to see rule breaking. The outrage is genuine, but sometimes misplaced. For rule breaking is a critical part of change. It has to happen, or things stagnate. It's why some companies lose markets - their system of rules and politics prevents them from recognising the need for change. BlackBerry and Nokia, I'm thinking of you.</p><p>But of course breaking any and every rule will just get you into trouble and result in chaos. So, a key life skill is learning which rules to break and when. Which ones will be overlooked if the change turns out to be good? This can often be a finely balanced evaluation. Go too far and you're sacked or put in prison. Don't go far enough and nothing changes. </p><p>I wonder why our schools and universities don't address this challenge? 
These institutions seem to embody a “the rules are here for a reason” attitude. But I'm not sure that helps in passing on the skill of rule breaking.</p><p>After all, had some people not been willing to break the rules, women might never have got the vote, slavery might never have been abolished and we’d be standing in the rain waiting half an hour for a taxi.</p><p>So perhaps we should all stop before venting at a rule breaker. Perhaps their form of rule breaking isn't so outrageous if we stop to think about it? Perhaps they're ushering in a form of positive change that can't happen without some rule breaking.</p><p>Rather than criticising those who break the rules, maybe we should consider rule breaking as part of a broader system? A system that tolerates <em>some</em> rule breaking in order to remain flexible to change. Exactly which rules we can break is the big life lesson we all need to learn.</p><p>By the way, you have watched the ‘<a href="http://www.warnerbros.com/matrix">The Matrix</a>’ trilogy, haven't you? For ‘The Matrix’ is the embodiment of a system that survives <em>precisely because</em> it has learnt the need to allow rule breaking. </p><blockquote><p>“You take the blue pill, the story ends. You wake up in your bed and believe whatever you want to believe. 
You take the red pill, you stay in wonderland, and I show you how deep the rabbit hole goes.” (The Matrix)</p></blockquote><p>I salute the rule breakers, the ones who take the red pill.</p>
<h2>App craftsmanship</h2>
<p>Bespoke artisan furniture; we can all appreciate it, even if we can’t afford it. There’s a place for Ikea, mass production and flat-pack, but there’s also a place for craft. </p>
<p>It’s similar with software. There’s a phrase <a href="https://codeascraft.com/about/">“code is craft”</a>. The mobile app revolution is perhaps one of the best examples of that philosophy.</p>
<p>We’re not talking business cases, global delivery, multi-platform development or sprawling requirements here. We’re talking software developed by a handful of people, or maybe even a single person, building the absolute best thing they can. We’re talking small nuggets of software perfection. </p>
<p>I can think of two good examples of this: <a href="http://tapbots.com/tweetbot/">Tweetbot</a> (Twitter client) and <a href="http://netnewswireapp.com">NetNewsWire</a> (RSS reader). Both of these apps are something rather special. They are also the two most-used apps on my iPhone. Every day I use them to keep abreast of the world around me. And they are both little nuggets of perfection - created by software artisans.</p>
<p>The UI design of these apps is subtle. They are both designed to fit within iOS in a respectful way. They don’t attempt to reinvent the wheel, but use standard UI capabilities, extending only where useful. Branding is exhibited through subtle use of colour, icon design and animation. Nothing is overt, everything feels natural - as if it had been designed by Jony Ive himself. </p>
<p>Discreet might be another way of describing this kind of design philosophy. Being discreet and subtle sometimes takes a lot more effort and thought than being brash or overt.</p>
<p>Take the way that NetNewsWire subtly colours the navigation bar with the theme of the news article’s source. We don’t see a brash logo, but rather just a subtle colouring. As a result, everything feels consistent. But articles are also differentiated in a discreet way. </p>
<p>And take the way that everything in Tweetbot scrolls at 60fps. Consistently smooth inertia scrolling at 60fps takes design effort. That effort isn’t obvious, but it’s there. This kind of attention to detail is needed if you want an app to be something special. It’s not easy to measure and justify the effort involved. But it matters if you want to delight your users. </p>
<p>Sweating these details matters - because it creates something a little special. And when something’s a bit special, it’s obvious - in the same way that a hand-built piece of furniture is obviously special.</p>
<p>I use NetNewsWire and Tweetbot not just because they get the job done, but because I appreciate the thought that has been poured into them. Using apps like this is <em>pleasurable</em>. </p>
<p>In contrast, some other apps annoy me every time I use them. Their brash colours, their clumsy styling, their inept attempts to re-interpret standard UI controls for no purpose, their stuttery scrolling. Some exceptional examples have even failed to update for iPhone 6/6+ screen sizes, continuing to operate in zoom-mode a year later - hardly an attempt to delight users.</p>
<p>Much has been said and written about the economic difficulties of making a business in app development, with so many free or £0.69 apps around. But I gladly coughed up the ‘premium’ price for Tweetbot and NetNewsWire. </p>
<p>When I say ‘premium’, I mean the price of a couple of cappuccinos, which perhaps puts things into perspective. I can drink coffee for a few days, or I can buy an app I’ll use every day for the next couple of years, at which point I’ll gladly pay again for the upgraded version. </p>
<p>There’s a role for something better and that role has value. Many of us still appreciate craftsmanship and are willing to pay for it.</p>
<p>I cannot afford to furnish my house with bespoke furniture. But luckily, software craftsmanship is cheap. If you can afford a Starbucks, you can afford a good app. So when you see a <em>great</em> app that costs a few pounds, compare it to the value you get from a coffee. Apps like Tweetbot and NetNewsWire are enormous bargains.</p>
<h2>Apple Pay - the first novel payment system that actually works for users?</h2>
<p>I used Apple Pay last week for the first time. It’s really excellent and a huge contrast to other mobile payment solutions I’ve used or looked at. I thought it worth a little analysis of what’s going on with Apple Pay because there are some interesting lessons here.</p>
<p>To start things off, it’s worth recapping why other mobile payments solutions have struggled to succeed. I’ve previously been quite dismissive of those solutions because they all have at least one of the following drawbacks:</p>
<ol>
<li>They aren’t secure. e.g. Barclays’ <a href="http://www.bpay.co.uk/home#">bPay</a> is very convenient, but relies on a £20 transaction limit because there is no way to authenticate a transaction. I can’t accept the idea of so little security on a banking transaction, regardless of the imposed limits. bPay is just a digital version of cash - if you lose the physical bPay device, you quite possibly lose all the money loaded onto it.</li>
<li>They are actually <em>less</em> convenient than our normal European chip+pin solutions. Unlocking a phone, finding and launching an app and entering a secondary security code is a surprisingly common sequence of steps. Or loading money from your bank account into a special new “mobile wallet” that comes with yet another passcode to remember. These approaches are <em>less</em> convenient than a plastic card. And the £20 transaction limit on UK contactless payments pretty much restricts its use to a coffee and a sandwich - hardly convenient.</li>
<li>There’s no critical mass of acceptance at retailers. If I can’t actually use a new payment method to buy the things I need, it’s pretty useless. There’s a profusion of interesting “fintech” payment solutions - but you have to <em>really</em> try hard to find a retailer that accepts them and there’s no realistic prospect of that changing.</li>
</ol>
<p>I think that Apple Pay looks like it might be a winner - because it ranks well on all three of my problem fronts:</p>
<ol>
<li>It’s secure - you can’t pay without authenticating with TouchID. And TouchID is almost certainly <em>more</em> secure than a 4-digit PIN code that anyone can steal by looking over your shoulder.</li>
<li>It’s convenient - just hold the phone whilst touching TouchID. No codes to enter, no buttons to press, no apps to find. It’s about as simple as it’s possible to be. Because TouchID means the Apple Pay transaction is secure, some UK retailers are already <a href="http://techcrunch.com/2015/07/14/most-uk-apple-pay-retailers-cap-at-20-but-pret-and-bills-show-transactions-can-be-limitless/">removing the country’s £20 contactless limit</a> for Apple Pay.</li>
<li>It’s gaining critical mass. Apple Pay works at the growing number of standard contactless points that are being rolled out across retailers. Because UK banks had already invested in NFC contactless features in debit/credit cards, Apple Pay exploits that industry standard point-of-sale infrastructure. And in the USA the launch timing was perfect, as industry rushed to renew point-of-sale infrastructure in order to increase security in the light of scandals like the <a href="http://lifehacker.com/target-hacked-credit-cards-and-private-data-for-40-mil-1486402421">Target hack</a>.</li>
</ol>
<p>It’s still early days for Apple Pay (especially in my native UK where it’s only just launched), but it’s the <em>only</em> novel payment solution I’ve seen that has a chance of meeting my three criteria. It has to be secure, it has to be simpler than chip+pin and it has to have a demonstrable likelihood of critical mass acceptance by retailers.</p>
<p>I don’t care about how novel it is or isn’t, if it saves retailers a few fractions of a penny or how it might disrupt the world. The <em>only</em> thing that will make a new payment mechanism successful, is if the experience is better for users. If it’s not, people won’t use it and everything else is irrelevant.</p>
<p>Of course Apple Pay will always be a relatively niche solution because its maximum market size is a subset of iPhone users. But American Express seems to have survived for many years on a similar basis. </p>
<p>The way that Apple Pay is integrated deeply into iOS is a critical part of the overall user experience. Others could use the TouchID API to build fingerprint authentication into their payment solutions, but that would still require the user to unlock the phone and find/launch the appropriate app before starting the payment process. In contrast, Apple Pay’s deep integration into the iPhone and iOS requires none of this, just needing the phone to be placed near the payment terminal.</p>
<p>This experience could only be created by an organisation who has the ability to influence not just the banking ecosystem (Apple had to galvanise the banking industry into delivering on the promise of <a href="http://usa.visa.com/download/merchants/encryption-tokenization-09182013-public.pdf">Tokenisation</a> as part of the Apple Pay solution), but also the phone hardware and operating system design.</p>
<p>Let’s look at what was needed to bring Apple Pay to fruition (this is far from a complete list, I am sure):</p>
<ul>
<li>Bring TouchID to market, ensuring the availability of a seamless authentication solution</li>
<li>Wait for a critical mass of TouchID-enabled phones to be in customers’ hands, so that when Apple Pay launches there are enough people with TouchID-enabled phones to use it</li>
<li>Influence Visa and Mastercard to implement Tokenisation</li>
<li>Get the retail banks on board to implement their end of the solution - at the time of writing I can count over 380 banks currently supporting Apple Pay, so no mean feat</li>
<li>Strike deals with payment processors like <a href="https://stripe.com/apple-pay">Stripe</a> to ensure that Apple Pay can also be used for in-app payments</li>
<li>Integrate Apple Pay deeply into iOS so the end-to-end user experience is simpler than anything else</li>
<li>In the USA, sign up a broad array of retail partners who will implement the required point-of-sale infrastructure to support Apple Pay</li>
<li>In the UK, start influencing retailers who’ve already implemented contactless payments to remove the £20 limit for Apple Pay transactions</li>
<li>And since it’s Apple, do most of this in secret</li>
</ul>
<p>Let us not forget that Apple had to <a href="http://www.reuters.com/article/2012/07/27/us-authentec-acquisition-apple-idUSBRE86Q0KD20120727">purchase Authentec for $356m</a> in order to obtain the underlying technology needed to create TouchID in the first place. On top of that were the presumably significant investment costs to perfect and bring to market that nascent technology. If we then add in the costs of creating Apple Pay, it’s easy to see that it might have cost well in excess of $0.5bn to make Apple Pay happen. I don’t think that is the sort of money that banks are typically investing in their innovation programmes. </p>
<p>Apple Pay is the fruition of a <em>very</em> ambitious goal, the creation of new innovations like Tokenisation and TouchID across several industries and a <em>very</em> complex set of coordinated deliveries over several years. I think it’s fair to say that Apple isn’t playing at this.</p>
<p>For those who ask “why could the banks not have created their own successful mobile payment solution?”, the answer is simple: they can’t. Without the ability to influence the way the mobile OS and hardware interact with payments, it’s not possible to engineer the required user experience. This is why we’ve seen a proliferation of small-scale “experiments” that people like me sneered at, rather than anything significant. </p>
<p>Companies like Apple are also able to justify huge investments that banks would really struggle with. They can do that because they see mobile payments not as an end in itself, but as just another jigsaw-piece in the success of their wider mobile platform. And a company like Apple sees this on a global scale, whereas nearly all banks are regional players when it comes to retail payments.</p>
<p>The cost for traditional industries who wish to become serious digital players is higher than I think many acknowledge. Take Facebook’s urgent need to make the transition from desktop browser to mobile app - acquiring Instagram for $1bn and WhatsApp for $19bn. Or BMW, Mercedes and Volkswagen purchasing Nokia’s HERE mapping division for $2.8bn. Or Adidas purchasing Runtastic for $240m. Whether purchasing mature and successful businesses (Instagram, WhatsApp, HERE) or raw technology (Authentec), the costs and business risks involved are often significant.</p>
<p>I find Apple’s ability to control not just <em>its</em> components of the solution, but also to influence and lead the required wider consortium, perhaps the most interesting lesson here. Vast arrays of retailers and banks, together with Mastercard and Visa, all needed to be influenced and coordinated to make Apple Pay happen. Banks would typically go about this kind of thing with armies of highly paid consultants and a heavy-weight “project office”. I don’t think that is how Apple did it - I’ve certainly not heard of those consultant armies. Intriguing.</p>
<p>Overall, it seems that creating success in mobile payments needs high ambition, deep pockets and an ability to influence all parts of the ecosystem involved - with that in mind, Apple Pay is a rather remarkable achievement. I’m not sure I see how banks can do this kind of thing on their own - which is an extraordinary statement about the role of the banking and technology industries.</p>Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com2tag:blogger.com,1999:blog-7114254501166964003.post-29050159744081338662015-07-29T13:07:00.000-07:002015-07-31T10:34:31.609-07:00Might "chat" become the universal User Interface?<p>Google’s home page has a single box: “what are you looking for?”. We type our questions, and receive answers. </p>
<p>In contrast, most other home pages are cluttered with images, adverts and complex menus. We have to hunt for what we’re looking for amongst the trivia, the distractions and the irrelevant. Often our answer is buried several pages into a site - so we have the added distraction of working out how to navigate to what we want. </p>
<p>The web was a brilliant invention for publication, but it’s a frustrating one when we just want an answer or to get something done. </p>
<p>Google originally became popular not because its search was especially better than others, but because its clean and simple home page loaded quickly, in an era when many of us were still using dial-up modems. </p>
<p>Fast, clean and simple, the “search box as universal home page” launched a $430bn company.</p>
<p>If search as a UI works for Google, why should it not be extended for a bank, an insurance company or a retailer? Of course our needs with those organisations are frequently more complex than simple search. So the simple search box needs to be extended to allow a dialogue. We ask questions, we get answers without distractions. </p>
<p>Your retailer might ask “What are you looking for?” and we might type “I’m looking for men’s casual shirts”. And Boom!, there you are, straight to the men’s casual shirt page. </p>
<p>Or your insurance company might ask “How can I help you?” and we might reply with “I’d like a car insurance quote”. The chat would then enter a natural language dialogue where we are asked a series of further questions to gather the required information. All of this can be done through a single chat dialogue - just in the way that we message our friends and relatives in iMessage, Facebook or WhatsApp. </p>
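The pattern behind this kind of dialogue is usually called slot filling: the system keeps asking questions until every required piece of information has been supplied. Here is a minimal Python sketch of the idea. The slot names, prompts and quote wording are all invented for illustration; a production system (Watson, say) would use natural-language understanding rather than fixed prompts.

```python
# A minimal slot-filling chat loop. Each required "slot" has a question;
# the dialogue asks them in turn until all slots are filled.

REQUIRED_SLOTS = {
    "registration": "What is your car's registration?",
    "postcode": "What is your postcode?",
    "age": "How old are you?",
}

def quote_dialogue(answer):
    """Drive a car-insurance quote chat, where 'answer' supplies the
    user's reply to each question (e.g. from a messaging app)."""
    slots = {}
    for name, question in REQUIRED_SLOTS.items():
        slots[name] = answer(question)
    return f"Thanks! Your quote is ready for {slots['registration']}."

# Example: scripted replies standing in for a real chat user
replies = iter(["AB12 CDE", "SW1A 1AA", "42"])
print(quote_dialogue(lambda q: next(replies)))
```

Passing the reply function in means the same dialogue can be driven by a web chat widget, a messaging platform, or a test script.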
<p>Rather than searching in vain for the “how to change your password” page and navigating several pages of a website, we would just type “how do I change my password” and get straight to the information we need.</p>
<p>Of course chat needs to be more than simple text. A chat dialogue can have embedded pictures, videos, maps, links and can cause an app to do things like display a popup. It’s not that there’s <em>only</em> chat, but that chat might be the centre of the user experience.</p>
<p>The dominance of social media apps in the modern age means that a “chat interface” is now our most frequently used interface with a computer. All of us, from teenager to grandparent, have a common and universal understanding of how chat works. </p>
<p>Indeed <a href="http://dangrover.com/blog/2014/12/01/chinese-mobile-app-ui-trends.html">evidence from China</a> is that chat interfaces for brands of all sizes are now expected and commonplace. The Chinese appear comfortable adopting this style of interaction not just with friends, but also businesses.</p>
<p>Of course there is one drawback to using “chat as the universal UI” – and that’s that it needs a human on the other end. It’s all very well us chatting to our bank or our local builder’s merchant, but that needs an army of trained people to respond…</p>
<p>That is where cognitive technologies like <a href="http://www.ibm.com/smarterplanet/us/en/ibmwatson/">IBM Watson</a> come in. Because Watson understands natural language and is able to hold a conversation, it can automate these chat-based interfaces. Instead of chatting with a human operator at a business, that business trains Watson to handle the chat on its behalf.</p>
<p>We’re already seeing these automated “virtual assistants” in our phones. Siri, Google Now and Cortana prove that no smartphone worth its salt can now come to the market without one. Some may sniff at the idea of chatting to a computer in our phones, but <a href="http://googleblog.blogspot.co.uk/2014/10/omg-mobile-voice-survey-reveals-teens.html">more than half</a> of US teens now use “voice search” daily. </p>
<p>Riding on the popular familiarity of assistants like Siri, we’re beginning to see a new wave of “conversational” apps where the primary interface is a chat one. </p>
<ul>
<li><a href="https://nativeapp.com">Native</a> is a travel concierge app, with a chat interface</li>
<li><a href="http://www.vida.com">Vida</a> is a life and health coach app, with a chat interface</li>
<li><a href="https://x.ai">x.ai</a> is a personal assistant who schedules meetings for you, through a chat interface</li>
<li><a href="https://luka.ai">Luka</a> is an app that finds and books places to eat, through a chat interface</li>
</ul>
<p>One of the most popular calendar apps on iOS is <a href="http://flexibits.com/fantastical-iphone">Fantastical</a>. A big reason for its popularity is because you can use natural language to interact with it. Type “I want a meeting with Fred at 10pm tomorrow” and it creates an appropriate calendar entry for you. </p>
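Turning a phrase like that into a calendar entry can be approximated with a simple pattern match. The toy sketch below handles just one fixed phrasing and is purely illustrative; Fantastical’s actual parser is far more general than this.

```python
import re
from datetime import datetime, timedelta

# Toy natural-language event parser. Handles one fixed pattern:
#   "meeting with <name> at <hour><am|pm> tomorrow"
PATTERN = re.compile(
    r"meeting with (?P<who>\w+) at (?P<hour>\d{1,2})(?P<ampm>am|pm) tomorrow",
    re.IGNORECASE,
)

def parse_event(text, today=None):
    """Return a calendar entry dict for a matching phrase, else None."""
    m = PATTERN.search(text)
    if not m:
        return None
    hour = int(m.group("hour")) % 12          # 12am -> 0, 12pm -> 12
    if m.group("ampm").lower() == "pm":
        hour += 12
    day = (today or datetime.now()) + timedelta(days=1)
    start = day.replace(hour=hour, minute=0, second=0, microsecond=0)
    return {"title": f"Meeting with {m.group('who')}", "start": start}

event = parse_event("I want a meeting with Fred at 10pm tomorrow",
                    today=datetime(2015, 5, 11))
# event["title"] is "Meeting with Fred"; event["start"] is 10pm on 2015-05-12
```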
<p>Simplicity sells and chat is the simplest UI imaginable – instantly recognisable to all and more efficient than complex menus and mice. I think we’re seeing the start of a big trend with chat interfaces because of that simplicity.</p>Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com0tag:blogger.com,1999:blog-7114254501166964003.post-24713878987942070052015-05-11T12:56:00.001-07:002015-05-12T07:42:54.349-07:00A couple of weeks on: what I've done and not done with my Apple Watch<p>I’m a bit of a sucker for a big Apple launch and an entirely new device like the Watch intrigued me greatly. I’m sure I didn’t surprise many when I was the proud owner of a 42mm Apple Watch Sport on launch day. However, despite my excitement, I was unsure of what to expect. I entered ownership as an experiment and with a few anxieties (most notably battery life). I might have been excited, but I wasn’t blind. So how have I got on? </p>
<p>If my personal experience of the way I’m questioned about Apple Watch is anything to go by, a lot of people are very intrigued by the device. I thought it worth capturing my experiences now that I’ve got past the initial novelty stage and begun to settle into a bit of a rhythm in its use.</p>
<p>I’ve structured this post around the three big questions that I’m consistently asked about Apple Watch:</p>
<ul>
<li>What can you do with it and why might I need one?</li>
<li>Is the battery life good enough?</li>
<li>Does it suck you further into the digital realm?</li>
</ul>
<p>So, here we go with those questions.</p>
<br>
<h2>What can you do with it and why might I need one?</h2>
<p>Lots of people have asked me why they might need an Apple Watch.</p>
<blockquote>
<p>“It’s about desire, not necessity. Convenience, fun and style are not needs. They’re wants. And people will gladly pay for what they want. The iPad faced similar misguided criticism. How many times did you hear or read someone say of the iPad, ‘Why would anyone who already has a phone and a laptop need an iPad?’ That was the wrong question, because almost no one needed an iPad. The right question was ‘Why would someone who has a phone and a laptop <em>want</em> an iPad?’ ” <a href="http://daringfireball.net/2015/04/watch_apple_watch">John Gruber, Daring Fireball</a></p>
</blockquote>
<p>As Gruber says, "why do I need one?" is the wrong question. Nobody <em>needs</em> an Apple Watch. But I believe that lots of people might <em>want</em> one when they begin to see what it can do. Let’s not kid ourselves – mostly the watch is about convenience and desire, not need. But these are mostly the same desires that drove the smartphone revolution. We chose iPhones over BlackBerrys, not because we <em>needed</em> iPhones, but because we <em>wanted</em> them.</p>
<p>So what have I done, and not done, with my Apple Watch?</p>
<p>I’ve checked in at security with my British Airways boarding pass on my watch. Note, I had to resort to using my iPhone at the actual gate because the scanning machines aren’t big enough to get a wrist underneath. I had fun joking with the BA staff about the size of my wrist though!</p>
<p>I’ve hailed a London cab from my wrist using the Hailo app. You can do a very similar thing with Uber as well - the watch app finds your location and calls a car to where you are. My cab driver was a bit taken aback when I explained how I’d called him - he didn’t know the watch app existed and was very excited to tell his grandchildren that he’d been hailed by an Apple Watch!</p>
<p>I’ve discovered that I need to do more exercise using the Activity app. I had a Fitbit before and quite liked the way it tracked my steps and encouraged me to walk more. But a Fitbit only counts steps, it doesn’t know if you’re walking briskly and therefore exercising, or just dawdling. Apple Watch has made it painfully obvious that I need to exercise, not just walk. It logs my heart rate every 10 mins and I can view that in the Health app on my iPhone. I’ve already started walking energetically to meet the exercise targets on my watch. It’s actually changing my behaviour, for the better. Health and activity might be the ‘killer app’.</p>
<p>After realising that my heart rate from my Apple Watch is logged in the Health app on my iPhone every ten minutes, I geeked out about this data before wondering “what is a healthy heart rate?”. DuckDuckGo (my chosen search engine) to the rescue, it appears that a <a href="http://www.mayoclinic.org/healthy-lifestyle/fitness/expert-answers/heart-rate/faq-20057979">resting rate of 60–100 beats/minute</a> is considered the normal range – the lower the healthier. Luckily I fit nicely within that range. I’m now wondering if I could get my resting rate down by doing more exercise – a new project.</p>
<p>When close to my daily exercise target, I do confess to having run energetically on the spot for a couple of minutes in order to attain my daily award badge. Gamification seems to work!</p>
<p>I’ve given a Keynote presentation by remote controlling the slides on my watch. My iPad was connected to the projector and I could go forward/back on my watch - pretty cool! MS PowerPoint also has a similar Apple Watch app.</p>
<p>I’ve responded to a text message whilst out walking in the street and without brandishing my iPhone. Apple Watch provides some useful stock answers (which you can tailor) that mean you can easily respond whilst walking. I also used Siri to dictate a reply, which seems incredibly accurate - I could swear it’s more accurate than on iPhone.</p>
<p>I’ve made my daughter reply to a text message because of the funky big animated emojis you can send from Apple Watch. Getting a teenager to react isn’t always easy - Apple Watch helps.</p>
<p>I’ve found a nice restaurant to eat at with a friend by using the Foursquare app on my watch (there's also TripAdvisor for a similar function) - this shows the best places in your current vicinity, so it’s ideal for on-the-go, last-minute dining choices.</p>
<p>I’ve answered a phone call on my watch when my iPhone was in another room of the house and I’d likely not got to it in time. I wouldn’t do that in the street, but it’s very convenient in the right environment.</p>
<p>I’ve changed the music track playing on my iPhone whilst leaving it in my pocket on the train.</p>
<p>I’ve listened to a news article that I’d clipped to Instapaper for later reading. Instapaper is a great ‘read later’ option for caching articles for reading when you may not have phone reception. The Apple Watch app converts the text to speech and reads the clipped articles for you. It sounds strange, but is actually much more practical than you might think - close your eyes and just listen whilst travelling, rather than get tired reading. </p>
<p>I’ve propped my iPhone up and used the camera app on my Apple Watch to remotely take a photograph that included me in it.</p>
<p>I’ve used the maps app to get directions whilst walking. Instead of having my watch barking instructions at me in the street, Apple Watch taps you gently and discreetly on the wrist to indicate left/right turns. </p>
<p>I've found the time of the next train home using the CityMapper app, which is programmed with my home address - so all I needed to do was to tap the ‘get me home’ button on the watch. It then plans a route from my current location and tells me what time train I should aim for.</p>
<p>I've silenced an incoming call by covering the screen with my hand - such a human and natural way to silence something, but a delight to find that it works the way you might think it should.</p>
<p>I've realised that I can also turn the watch’s screen off by covering it with my hand. </p>
<p>I've become more aware of the day, by having sunrise and sunset continuously displayed on the watch face (keen photographers will appreciate that in order to be aware of ‘golden hour’ where the light is perfect for photography).</p>
<p>I was informed that the Duchess of Cambridge had given birth to a daughter the moment the news hit the BBC, through the wonders of the BBC Apple Watch app and notifications. More interestingly, for me at least, I got notified of developments during the recent UK General Election - all consumable by a simple glance and so very convenient when on-the-go.</p>
<p>I get notified when I have a retweet, reply, mention or new follower on Twitter. Perhaps not earth-shattering, but interesting and more timely if you’re a Twitter addict.</p>
<p>I’ve recently invested in the OmniFocus “getting things done” app for my iOS devices. With OmniFocus also on my watch, I can see my to-dos and mark them complete there. This is outstandingly useful - I focus on getting things done and just tap the big round button on my watch when I complete my tasks. I like this a lot.</p>
<p>I <em>do</em> make sure that the notification settings on my iPhone don’t result in Apple Watch notifications becoming intrusive. Specifically, I only get notified of emails from people on my VIP list. It’s tough to get on that list, and easy to get removed from it if you irritate me. As a result, Apple Watch is not a new way for <em>anyone</em> to gain my attention - only the special VIP candidates get that privilege. </p>
<p>I <em>don’t</em> tend to read emails on the watch, although I can - I hate “doing email” at the best of times, so squinting at a tiny screen just brings a whole new form of pain to the medium that I can do without. Instead of working through my inbox, I tend to just glance at the emails from my VIP list that I get alerts for. My “email on watch” experience is therefore a hugely refined version of my more general-purpose email experience on devices with a bigger screen.</p>
<p>I <em>haven’t so far</em> got to test the freehand-message-drawing or heart-beat-messaging – you need a friend with an Apple Watch to make that work. So far my wife hasn’t relented and bought one, but I detect a wavering. So maybe I’ll be heart-beat messaging her in the near future.</p>
<p>I <em>haven’t</em> played any games on my watch. Seriously, games on a watch and with a screen that small? They exist, but I don’t think it makes any sense and I’m not much of a gamer anyway.</p>
<p>The rich variety of things that Apple Watch makes more convenient or just <em>more fun</em> is what makes the device for me. I might argue that no one use, except for perhaps the health stuff, is on its own a killer-app. But taken as a whole, I now find it impossible to imagine going without it. As a consequence I do use my iPhone less, preferring the simplicity of the watch. There are days that, when I do use my iPhone, I’m struck by how positively enormous the screen is in comparison. I think I’m also a bit more aware of how much light is emitted by a smartphone when in a darker environment - the watch is much more discreet, given the small size of its screen and that it uses a black background.</p>
<br>
<h2>Is the battery life good enough?</h2>
<p>This is the one thing that <em>really</em> concerned me about Apple Watch. None of us are used to watches that need regular charging, and it was hard to tell if more ‘enthusiastic’ use was going to make charge-anxiety ‘a thing’. I didn’t fancy the idea of carrying a charger with me, even less so the anxiety over where my next ‘fix’ of power might come from. I imagined this to be the one thing that might make Apple Watch impractical.</p>
<p>As it turns out, battery life has been a complete non-issue for me. Apple Watch <em>always</em> gets me comfortably through a long day, even with my ‘enthusiastic’ use when it was novel and everyone wanted a demo. The past week I’ve gone to bed with between 30% and 60% battery-life remaining, after long ~18 hour days. I’ve learnt that it doesn’t matter how much I fiddle with Apple Watch, I’m not going to run out of charge.</p>
<p>I have my Apple Watch charger set up next to the bed. Every night I take the watch off my wrist and place it on my bedside table, in the same way that I would with an analogue watch. The only difference is I snap the magnetic charger to the back. Because I don’t need to charge at any other time, the charger never moves around the house and the routine is set. I charge every night, the charger never moves from the bedside, and I’m guaranteed an easy full day without any battery-life anxiety. </p>
<p>The only time I’ve broken this routine was on the evening of the UK General Election. I had taken my watch off whilst watching TV into the early hours and forgotten that I’d left it on the sofa, without charging overnight. In the morning I placed it back on my wrist, without charging, and spent a whole day using it. By the time I got home from work I was down to just under 20% remaining power. So it’s possible to get two days of use without charging, at a stretch.</p>
<p>As a result of my experience, I don’t carry a charge cable with me – I don’t need it. Apple Watch has a ‘power reserve’ mode which cuts back function to preserve basic time-keeping, but I’ve never even been tempted to try it – battery-life has been such a non-issue for me. It’s possible to add battery-life as one of the ‘complications’ that are displayed on the watch face, but I’ve not done so because I’ve become so uninterested in how much power remains. </p>
<br>
<h2>Does it suck you into the digital realm?</h2>
<p>This is a great question and a very valid concern. Many of us worry that smartphones can sometimes become a barrier to human relationships. Google Glass seemed to go a whole stage further, coining the term ‘glasshole’ for the type of people who used it. Does Apple Watch have a similar effect? My initial conclusion is that it does not. People seem genuinely interested rather than annoyed by it. Its tiny screen limits its ability to suck you in. Most of all, the software design choices seem to heavily favour short snacking rather than extended use.</p>
<blockquote>
<p>“The Apple Watch is best used as that: a watch. It’s something you check for a second or two and then put away. And in 2015, it’s nice to have a watch that can do more than simply tell time. We can carry it around with us everywhere we go, and it springs to life when it receives a notification: a text, an email, a tweet, a Facebook message. Those notifications don’t always need a response, but they are important to glance at, just like the time.” <a href="http://uk.businessinsider.com/apple-watch-review-2015-5?r=US">Steve Kovach</a>.</p>
</blockquote>
<p>The Watch has a concept of ‘glances’, which many apps provide - these are a way to quickly see the essence an app needs to convey, without providing further function. A glance is just that - you glance at it, in the same way you would glance at a mechanical watch. I think it works. A glance takes a second of your time, no more.</p>
<p>Glances are perfect for a watch and make it quick and discreet to use. They are the antidote to browsing a Facebook news feed, or a Twitter timeline. Those things “suck you in” and are an activity in and of themselves. But browsing a timeline or feed on a watch is impractical because of its tiny screen. It makes no sense to spend minutes doing something on an Apple Watch – everything about it is designed to get you away from the watch as soon as possible. In that way it’s the polar-opposite of the smartphone. So no, this is not a device that sucks you into the digital realm. If anything, it frees you from that. </p>
<p>I find myself glancing at my watch for information, rather than turning on my phone and getting distracted by the rich experience it provides. For example, I might get a notification of someone favouriting a tweet on my watch. All I do is view it, smile, and dismiss it. On the phone I’d probably end up looking in my Twitter client (Tweetbot) and having a quick look at my timeline, before… But on the watch, I glance and move on with life. </p>
<br>
<h2>Should you buy one?</h2>
<p>You almost certainly don’t need an Apple Watch - most of what it does, you can do with a smartphone. But it <em>is</em> more convenient, more discreet, and fun. I think it’s less of a distraction than a smartphone and a better way of communicating in many situations. It doesn’t suck you in and doesn’t rule your life in the way that a smartphone sometimes can. It’s beautifully designed and a delight to use. The battery easily lasts a day (or two) and isn’t something I’ve needed to worry about. There’s a wide variety of third-party apps that make the experience much more complete. The health function seems to be a big deal and has the capacity to change behaviour, which if you think about it is profound.</p>
<p>I like Apple Watch a lot – I think we’re witnessing the start of something big.</p>
Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com4tag:blogger.com,1999:blog-7114254501166964003.post-81590745689342012242015-04-25T00:09:00.000-07:002015-04-27T05:17:20.353-07:00Some initial thoughts on the Apple Watch experience<p>My watch arrived towards the end of the day yesterday [that's launch day, if you're reading in the future], so I naturally spent all evening playing with it. It’s too early to draw any firm conclusions, so I’ll wait a week-or-two before penning anything definitive. However, there are some clear initial impressions that I think won’t change. </p>
<ol>
<li><p>The sport straps are wonderful<br>
I spent ages prevaricating over which strap to get and how much I wanted to pay. In the end I decided that, since the Watch’s utility was unknown at this stage, I’d stick with the cheapest and see how it goes. In reality the Sport strap is incredibly comfortable and very natural. It feels more in-tune with the modern aesthetics of Watch than the leather options. I’m really happy this was the right choice.</p></li>
<li><p>You need to prune notifications<br>
Watch can alert you whenever you have an email, a tweet, a diary entry, a Periscope, etc. Every app that notifies on your iPhone can give you a light tap on the wrist to gain your attention. In theory this can make you more productive. In reality there’s a big risk that it shackles you to the digital world and work. I’ve turned off a lot of the notifications so that I’m still a free man. I’ve set notifications only for emails from my VIP list - if you’re not on my VIP list then I don’t get notified about your email. I just don’t feel a need to be “more efficient” in everything I do. I want to be able to switch-off and tune-out. Pruning the notifications to ones that <em>I choose</em>, rather than everything, is a good way to ensure that is possible. </p></li>
<li>Apps make the experience<br>
On setting up my Watch I had 47 apps available as a result of those I already had installed on my iPhone. That’s quite a few - I’ve not installed them all and am still working through them one-by-one. Already I can see some stand-out apps that transform the utility of Watch. Just as with iPad, the value of the device is made not by what Apple installs on it, but by the third-party apps. My favourite ones so far include:
<ul>
<li>TripAdvisor, which very quickly shows the top things to do around where you are now. Wandering around a new place and want something to eat? Just raise your wrist and use the app.</li>
<li>Foursquare, which does a very similar thing to TripAdvisor, but on a different data set. I’m not sure which is the best yet, but I find it interesting that Foursquare has already gained my attention when I’d all but forgotten it on my iPhone.</li>
<li>CityMapper, which has a brilliant travel guide capability. These guys have really thought things through. I press a button “Get me home” and it shows me directions to the local station and finds the next trains to my registered home location – utterly brilliant.</li>
<li>Uber, for getting a cab and tracking its arrival.</li>
</ul></li>
<li><p>The best apps provide a subset of the iPhone experience<br>
Watch isn’t about doing everything an iPhone app does – it’s about convenience and quick access. So, developers that have worked out the essence of their app and provide a very streamlined experience for the core use, are the ones that seem to work well. TripAdvisor is a good example – it’s not about doing all the research and looking for different options, the Watch app just focusses on what is near to where you are now.</p></li>
<li><p>App quality is higher than I expected<br>
Some of the apps show some rough edges – it was tough for developers to get the experience right when they’d never seen the app running on a physical device until yesterday. On the whole though, the vast majority are useful and high-quality – and I’m sure those with minor issues will see those ironed out in the coming weeks. Given the very limited testing opportunities most developers have had, I’m incredibly impressed with what I see.</p></li>
<li><p>Some things just don’t work on the small screen<br>
Take email or browsing a Twitter timeline. You can do it, but frankly the screen is <em>far</em> too small for it to make any sense. I very much doubt I’ll be using Watch for things like that – it’s easier to get the phone out and use that. I’m also not convinced of using it to phone people. Sure, you can. But using a Watch to phone means that you’re either broadcasting the conversation to everyone around you, or you need bluetooth headphones. Maybe I’m wrong, but talking to your watch, so that everyone can hear the conversation, has elements of “glasshole” about it for me.</p></li>
<li><p>The quick-access button for contacts is a master stroke<br>
If you press the long button it takes you to your contacts so you can send a message, phone (I think not), etc. But it’s not your whole contact list because that would be very hard work to navigate on the wrist (my address book has hundreds of entries). Instead, it’s your favourites – so again, the Watch usage is for messaging the important and frequent contacts, not trying to do everything. As a result, it carves a niche for itself (very rapid and easy access to your important contacts) that might be devalued if it presented a scrolling list of 500 contacts.</p></li>
<li><p>An un-Apple-like level of customisation<br>
This thing has <em>so</em> many ways to customise things – the level of tinkering you can do is even slightly daunting initially and I’m still getting to grips with what I can change and how I want things setup. </p></li>
<li><p>Health & fitness is intriguing but will take time to explore<br>
I dallied with a Fitbit for a while, but fell out of the routine of charging it. Already Watch is telling me my heart rate. I have no clue what to do about this, but plan to delve into the fitness side of things over the next week and see if it might get me more active.</p></li>
<li><p>Siri comes alive<br>
I never really used Siri on the iPhone. But on the watch you need to – e.g. to compose a message. It works <em>really</em> well and suddenly I find I’m using Siri a lot. The whole idea of natural language interaction suddenly makes so much more sense when you have a tiny screen that prohibits typing.</p></li>
</ol>
<p>Overall I’m very impressed so far. Time will tell where the utility is, but I’m fascinated by this idea of “essence of iPhone”. The vibrancy of the iOS development community has ensured that many of the best iPhone apps have a “mini” experience on Watch. So “essence of iPhone” doesn’t mean “essence of the Apple experience” but also an "essence of third-party apps" experience. </p>Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com0tag:blogger.com,1999:blog-7114254501166964003.post-343944889939970742015-04-16T22:48:00.000-07:002015-04-16T22:53:39.514-07:00Seeking inspirationI meet with many different people in my job. I go to many meetings – some good, some not so good. Increasingly I’ve noticed a huge variance between the best and worst. The more I've noticed this, the more I've thought about why it is – and have come to believe there are two fundamentally different environments in which we work.<br />
<br />
<br />
<h2>
Type-A environments</h2>
The best meetings, environments and teams are inspirational. They are characterised by people hungry for new ideas and facts. Those people are always incredibly sharp, operating at their peak. They are enthusiastic and energetic. You feel that energy as soon as you walk into the room.<br />
<br />
There’s a pace and urgency to do things. <br />
<br />
There are no pre-determined answers at the start of these meetings – the outcome is determined by the discussions, with opinions being formed around the ideas and facts as they emerge. <br />
<br />
No idea is off-limits, and diversity of thought is celebrated. <br />
<br />
Problems aren’t expressed as road-blocks, but instead people seek insight by asking questions like “this might be a challenge, how could we overcome it?” <br />
<br />
Seniority is irrelevant – your grade or experience is not a factor; the only thing that matters is the value you bring to a particular discussion. Typically nobody even knows the grades of those involved, it’s so trivial a matter. <br />
<br />
Discussion focusses on ideas, not on politics. <br />
<br />
These environments are incredibly empowering – entering them is like attaching jump-leads to the brain. Ideas bounce and feed off each other, with new possibilities continuously emerging.<br />
<br />
<br />
<h2>
Type-B environments</h2>
Conversely, there are environments that are somewhat different. The people working here always seem to have a reason not to change anything, or not to do anything. <br />
<br />
Type-B environments are characterised by people who seem to know what the “right” answer is, and are only interested in data that supports that pre-determined view. If contrary data-points or opinions are expressed, the disapproving response makes it plain to everyone involved that such views are not appreciated. <br />
<br />
Dissenting opinions are suppressed not via debate, but by disapproval and the unspoken threat of excommunication. <br />
<br />
The environment is very low-energy – and with very little emotion. <br />
<br />
The priority seems to be to only do the “right” thing, no matter how long it takes to decide what that might be - time is not a priority. <br />
<br />
“Conventional wisdom” is used to raise roadblocks that ensure new ideas are squashed without any real exploration of how their perceived drawbacks might be mitigated. <br />
<br />
Grade and seniority are critical factors, with a clear hierarchy that expects “respect” – the political complexities of navigating that hierarchy often dominate discussions. <br />
<br />
Type-B environments and teams suck all of the energy and creativity out of the air. They are profoundly unempowering and depressing.<br />
<br />
<br />
<h2>
Contrasting Type-A and Type-B</h2>
The contrast between these two environments is enormous and obvious within 30 seconds of entering a room. It’s impossible not to notice immediately which you are in. The impact is dramatic - one team gets things done, the other finds excuses. I wish all leaders would look hard at the culture and environment they foster – because that leadership, or rather a lack of it, is what allows Type-B environments to exist.<br />
<br />
Diversity is often a factor – teams of people of a similar age, gender, profile and background tend to form a collective “assumed view of the world” that prevents alternative opinions from emerging. In contrast, where diversity of background exists, diversity of thought thrives.<br />
<br />
Early in my career I was once asked to attend a board-meeting of a significant public company. Part way through the meeting the Managing Director stopped the meeting, pointed at me and said “this guy is here for a reason, I want to know what he has to say.” It was a defining moment – for my junior rank and lack of grey hairs had, until this point, meant I was continuously talked over by my elders and supposed betters. We need more leaders like this – people hungry for ideas who are willing to break conventions to seek them.<br />
<br />
Can we please have some more diverse teams and empowering environments? I get up in the morning for meetings like that. I have to take intravenous caffeine for the other type, just to get through them.Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com0tag:blogger.com,1999:blog-7114254501166964003.post-5291346887435438142015-03-24T02:46:00.000-07:002015-03-28T05:34:34.238-07:00Demystifying Cognitive ComputingIs "Cognitive Computing" a huge computing revolution that ushers in an era of thinking machines?<br />
<br />
If you listen to Professor Stephen Hawking and some of the press commentary, you might be forgiven for thinking that the <a href="http://www.bbc.co.uk/news/technology-30290540" target="_blank">age of science-fiction is upon us</a>. However, the reality is a little more mundane, much less scary, but no less exciting. The science-fiction talk frequently disguises the practical reality, which is that any app developer can start to use cognitive capability for very small tasks – no matter their understanding of the technology or the depth of their pockets. It's this more down-to-earth view of the subject that I'd like to address in this post, hopefully dispelling some of the confusion and mystique that sometimes surrounds the topic.<br />
<br />
In my university days we didn't talk about Cognitive Computing, we talked about Artificial Intelligence. The two phrases describe broadly similar concepts, but cognitive is perhaps better because it does less to conjure up images of sci-fi "thinking" machines. Instead, the subject implies a set of capabilities that are slightly less fanciful – things like identifying the subject of a passage of text, or picking out the names of people in a paragraph. It also includes more ambitious capabilities, like the ability to converse in full natural language. But the term is very broad and does not always imply things that you would necessarily think of as "intelligence".<br />
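To see why even the "less fanciful" capabilities are genuinely hard, consider a naive attempt at picking people's names out of a paragraph. This toy sketch is my own illustration, not how any real service works – it simply treats runs of capitalised words as names, and immediately shows why that isn't good enough:

```python
import re

def naive_names(text):
    """Naive sketch: treat consecutive capitalised words as names.
    Real cognitive services do far better -- this toy stumbles on
    sentence-initial words, titles, place names, and so on."""
    return re.findall(r"\b[A-Z][a-z]+(?: [A-Z][a-z]+)+\b", text)

print(naive_names("Yesterday Jane Smith met Alan Turing in London."))
# ['Yesterday Jane Smith', 'Alan Turing'] -- wrongly absorbs "Yesterday"
# and can't tell that "London" is a place, not part of a person's name.
```

Resolving those ambiguities requires statistical models trained on large bodies of text, which is exactly the gap that cognitive services fill.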
<br />
At its most basic, cognitive computing allows the analysis of data types other than the traditional structured records in a database. These might be sentences of natural language, voice recordings or images – all things that until very recently we would have considered a ‘blob’ of data, but a ‘blob’ with which we could do very little. Cognitive computing gives us the ability to peer inside the blob and to start doing interesting things with it – to parse sentences, recognise the subject of images, translate speech, etc. <br />
<br />
It can sometimes be hard, even for humans, to understand the true meaning of ambiguous natural language, or to be absolutely certain that two photographs are of the same person. We often hear people express this in terms like “I think that’s <em>probably</em> Jane”. <br />
<br />
When a computer tries to match faces, it turns out that we can be a little more specific about its confidence. By using the level of evidence found to support a suggested answer, we can calculate a probability score. This approach is very useful with many cognitive functions – images, natural language, speech, etc. We can then use this calculated probability to make decisions – for example, when IBM's Watson <a href="https://www.youtube.com/watch?v=lI-M7O_bRNg" target="_blank">played the Jeopardy! quiz show</a>, it had a threshold that ensured it didn’t answer questions when its confidence was too low – because if you answer and get the question wrong, you lose money. <br />
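The thresholding idea itself is simple to sketch. Here is a minimal, hypothetical illustration – the candidate answers and confidence scores are invented, and in a real system they would be computed from the weight of supporting evidence:

```python
def choose_answer(candidates, threshold=0.5):
    """Return the best-scored candidate answer, or None when confidence
    is too low to risk answering (the behaviour Watson used on Jeopardy!,
    where a wrong answer loses money)."""
    if not candidates:
        return None
    best, confidence = max(candidates, key=lambda pair: pair[1])
    return best if confidence >= threshold else None

# Invented (answer, confidence) pairs for illustration only.
print(choose_answer([("Toronto", 0.14), ("Chicago", 0.83)]))  # Chicago
print(choose_answer([("Toronto", 0.14)]))                     # None: stay silent
```

The interesting engineering lives in producing those probability scores; once you have them, acting on them is a one-line decision.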
<br />
Because the level of confidence in a cognitive function is really important, techniques like “machine learning” are often deployed to increase confidence over time. A machine can "learn" by taking feedback from users of the system on its accuracy. If users can tell a system when it is right or wrong, it gives that system the ability to use this feedback to adjust its confidence levels and approaches to problem solving. Or, a machine might learn in an "unsupervised" way by discovering patterns in data. Sometimes machine learning is also used to build a knowledge base by discovering data - for example, allowing a computer program to traverse links in Wikipedia and build a database of celebrities. In these ways, we can build computer systems that are more accurate.<br />
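The feedback loop can be sketched in miniature. This is an assumption-laden toy – a running right/wrong tally standing in for real machine learning – but it shows the shape of the idea: user feedback nudges the system's confidence up or down over time:

```python
class FeedbackLearner:
    """Toy sketch: adjust a confidence estimate from user feedback.
    Starts from a neutral prior and shifts as users mark answers
    right or wrong. Real systems adjust far richer models than this."""

    def __init__(self):
        self.right = 1  # pseudo-counts acting as a neutral prior
        self.wrong = 1

    def feedback(self, was_correct):
        if was_correct:
            self.right += 1
        else:
            self.wrong += 1

    @property
    def confidence(self):
        return self.right / (self.right + self.wrong)

learner = FeedbackLearner()
for ok in [True, True, False, True]:  # simulated user feedback
    learner.feedback(ok)
print(round(learner.confidence, 2))   # 0.67
```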
<br />
Machine Learning hasn’t typically been needed in the past, because traditional computer applications that work on numeric and structured data are dealing with certainties – it’s not that we <em>think</em> 1+1 is 2, we <em>know</em> it is. In this context there is no need to learn and become more accurate because the system is already precise in its judgement. Some things we have done in the past – like writing a program to parse web links – have just inherited a more sophisticated label. And some things you might not immediately realise are using machine learning, are – for example, the way that Google guesses your words when you type in search keywords.<br />
<br />
Sometimes cognitive function brings about radical new types of apps – for example, Watson Oncology Advisor, which is helping to <a href="http://www.mskcc.org/cancer-care/watson-oncology">treat cancer and save lives</a>. Some of the systems being built are very ambitious and aim to democratise knowledge by consuming large quantities of written documents that can be queried using natural language.<br />
<br />
But cognitive function can also be used in a more bite-sized way – the capabilities being seamlessly blended into very useful, but far less ambitious, apps. For example, the popular mobile app <a href="https://getpocket.com/signup.php?src=homepage">Pocket</a> uses cognitive services to <a href="http://go.alchemyapi.com/hs-fs/hub/396516/file-2128832688-pdf/Case_Studies/AlchemyAPI_CaseStudy_Pocket.pdf">accurately categorise and discover interesting content</a> from the web, saving it to the user’s mobile device for later reading. The use of cognitive capabilities is both subtle and invisible to the app’s user. Nobody would think that Pocket is "thinking" or that it is revolutionary – but it <i>is</i> useful. It provides a streamlined experience, with articles being automatically tagged without the user needing to suggest or type those tags themselves. We don’t always need to change the world in order to exploit cognitive capabilities.<br />
<br />
Apps like Pocket are made possible because we can deploy cognitive capabilities into a cloud and hide the underlying complexity behind a very easy-to-use developer <a href="http://apievangelist.com/" target="_blank">API</a>. App builders get to concentrate on <em>how to use cognitive function</em> rather than <em>the engineering required to build cognitive function</em>. In effect, we get to democratise access to the underlying cognitive service. This simplification for app builders is a good thing, like all simplification, because it sets our minds free from the chains of complexity to dream of new possibilities.<br />
<br />
I notice this shift towards how to use cognitive function, rather than how to build it, all the time in my conversations around the topic – the discussions are almost exclusively in the “how can we use this”, rather than the “how does this work” camp.<br />
<br />
Hiding cognitive function behind a cloud API is particularly important, because some of the computing systems needed have a high degree of complexity. Sophisticated software architectures abound, as do unusual physical infrastructures that exploit graphics processors (GPUs) for their ability to perform high-speed parallel calculations. This is a specialised area of technology and one, thankfully, that app builders do not need to worry about thanks to APIs.<br />
<br />
The provision of cognitive services as APIs is also important because it often brings a “pay per use” charging model. Instead of large up-front investments in complex infrastructures, developers can start small and pay on a usage basis. Often the providers of these APIs offer a free tier sufficient to support the development process of a new app. “Starting small” might even mean “zero cost” in the initial stages. This low entry cost is perfect for fostering innovation and small experiments. Because the up-front costs are so low, ideas that might otherwise be strangled by red tape can be allowed to grow. And in cognitive computing this is important, because the things we are doing are often novel. The ideas need a little space to prove their worth before the full rigour of ROI and Business Cases is imposed upon them.<br />
<blockquote class="tr_bq">
<i>“The level of computational resources required for us to get to the required scale of natural language processing functionality would be cost-prohibitive.”</i> Jonathon Morgan of <a href="http://crisis.net/" target="_blank">CrisisNET</a>, on why they <a href="http://go.alchemyapi.com/hs-fs/hub/396516/file-2162382309-pdf/Case_Studies/AlchemyAPI_CaseStudy_CrisisNET.pdf">chose to use cognitive APIs</a> rather than build their own natural language processing.</blockquote>
So the “API-ification” of cognitive computing both hides the underlying complexity of the systems and also removes the need for large up-front investments - empowering ordinary developers to start using cognitive capabilities in their apps. Rather than the topic being a complex and mysterious one, it is instead very simple. It takes just a few lines of code in any programming language to access a cognitive API, using industry standard REST API concepts and technologies. <br />
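To show just how few lines are involved, here is a sketch using only Python's standard library. The endpoint URL, API key and response shape are hypothetical placeholders – substitute your chosen provider's real values – but the REST mechanics are exactly this simple:

```python
import json
import urllib.parse
import urllib.request

# Hypothetical endpoint and key -- substitute your provider's real values.
API_URL = "https://api.example.com/v1/sentiment"
API_KEY = "your-api-key"

def build_request(text):
    """Build a REST request asking a (hypothetical) cognitive API
    for the sentiment of a piece of text."""
    params = urllib.parse.urlencode(
        {"apikey": API_KEY, "text": text, "outputMode": "json"})
    return urllib.request.Request(API_URL + "?" + params)

req = build_request("I love this new watch")
# With a real endpoint and key, sending it is one more line:
# result = json.loads(urllib.request.urlopen(req).read())
# and the JSON response might carry something like a sentiment label
# and a confidence score for your app to act on.
```

Nothing here requires any knowledge of how the service works internally – the developer deals only in plain text in and JSON out.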
<br />
IBM’s Watson Group is pioneering the development of cognitive APIs at the <a href="http://www.ibm.com/smarterplanet/us/en/ibmwatson/developercloud/">Watson Developer Cloud</a>. There are a lot of services now available, some perfected and some still being perfected in an open Beta programme. These APIs can be used to build new classes of cognitive apps, but they can also be used in much more subtle ways to augment apps and make them just a little more natural and easy to use. But either way, you don’t need to know <em>anything</em> about cognitive computing, or have deep pockets, to use the APIs. Don't let industry jargon and mystique stop you from exploring the potential.<br />
<br />
<br />Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com0tag:blogger.com,1999:blog-7114254501166964003.post-58089347862763262412015-03-15T04:10:00.000-07:002015-03-16T14:09:23.716-07:00Why I will be in the Q to purchase an Apple WatchI’ve decided I’m getting an Apple Watch. My current watch has broken (no, really) and I’ve tried using my iPhone for several months as a timekeeping device, but it just doesn’t work for me. I hear the stories of millennials not having watches, but clearly I’m not one of them. <br />
<br />
The way that things are designed and made is important to me. If I need something, I’ll often wait and prevaricate until the right thing comes along – rather than buy something quickly to “do a turn”. The world is full to the brim with badly designed and manufactured objects – they all depress me. When you find something that is just right, it’s an immense pleasure – like finding a rare jewel. So, for me, the sheer care that has gone into the Apple Watch <a href="http://atomicdelights.com/blog/a-glimpse-at-how-the-apple-watch-is-made">manufacturing processes</a> is very appealing. I have yet to see one in the flesh, but this looks like something that’s a bit special. And judging by the reactions of <a href="https://www.youtube.com/watch?v=khRagAb_S0U">those who have actually touched one</a>, I may well be right. <br />
<br />
Most importantly, unlike the industry’s early attempts at smart watches, the Apple Watch isn’t a big black plastic macho hulk. I respect what Pebble did with their watch, but its physical design is just not something I would consider. I could make similar comments about most of the other competition – there are signs that the market is waking up, but most have a long way to go to match Apple’s attention to detail.<br />
<br />
I appreciate that Apple offer two sizes for different sized wrists. I appreciate the variety of case metals and strap choices. And no, I’m not buying the rose gold version! It’ll be aluminium or stainless steel for me – I’ll make my choice when I get to see them in the flesh. And I’ll be choosing a strap material and colour to suit my personal style. Choosing an Apple Watch is more like tailoring than off-the-peg clothing.<br />
<br />
I think the price is about right. If you’re an iPhone owner then the entry price of £299 is almost certainly less than you paid for your phone (if you bought sim-free like I did). It’s not cheap, but it <em>is</em> easily within reach for the target market. I’ve had two Fitbits (one broke) – together already 2/3 of the price of an Apple Watch. And an Apple Watch does <em>much</em> more than a Fitbit, with a <em>great</em> deal more attention to style and design.<br />
<br />
But what really fascinates me about the Apple Watch is its utility – what will I use it for? My dallying with Fitbits fizzled out – I kept forgetting to charge it or put it on my wrist. I like the idea of a fitness band, but their single-use nature means they aren’t central enough to my life for me to focus on. Can the Apple Watch become essential in the way that my iPhone has?<br />
<br />
I think the only way to make a device essential, and therefore train me to consistently wear it and remember to charge it, is for it to perform a multitude of important functions. Each on their own might not be enough to make it essential, but the combination does.<br />
<br />
I <em>can</em> see the combination of Time, Health, Communication, Travel and Payment making this device essential in a similar way that my iPhone has become. Time will tell (no pun intended), of course, but I can see the utility.<br />
<br />
<h2>
Time</h2>
I need a new watch to tell the time and Apple Watch does that. It has power-reserve mode, so even if the battery should be nearly depleted, it still works as a timepiece. Job done.<br />
<br />
<h2>
Health</h2>
I want to track my activity, but with a device that does more than just that. The Apple Watch tracks activity, but captures my heart-beat as well – which I find intriguing. Its haptic feedback engine can lightly tap me on the wrist when I’ve been sitting for too long – nothing dramatic, but something useful (maybe a good excuse to force a break in long business meetings that are running on). It does what my Fitbit did, but it does more.<br />
<br />
<h2>
New forms of Communications</h2>
This is the one that completely fascinates me. The Watch includes entirely new ways to <a href="http://www.apple.com/uk/watch/new-ways-to-connect/" target="_blank">communicate without words</a>. The ability to send my heart-beat to another, to sketch messages that are drawn on another’s watch, or to send a walkie-talkie style voice memo – these are all novel forms of communication. <br />
<br />
I can tap a code on my watch that the haptic feedback engine then uses to silently tap against my friend’s wrist. The haptic tapping is truly silent and discreet – in a way that “silent buzzing” mode on a phone is not. I can think of <em>so</em> many meetings that are going to be made better by the ability to silently communicate amongst attendees: “three taps means this presenter is boring, let’s find an excuse to get out of here” ;-)<br />
<blockquote class="tr_bq">
"If I'm at a party and a guy is being super creepy, I can double-tap it and that'll be like 'come and save me, right now'! Girl-code taken to the next level!". <a href="https://www.youtube.com/watch?v=khRagAb_S0U" target="_blank">First reaction</a> of a woman using an Apple Watch.</blockquote>
Or, as <a href="http://mashable.com/2014/09/09/apple-watch-sexting/">Mashable</a> put it, “sharing your heartbeat over Apple Watch is the new sexting”. Maybe, maybe not – but it feels like these new, more intimate (what is more intimate than something next to your skin?), forms of communication might be significant in a similar way that SMS was all those years ago.<br />
<br />
One thing is for sure – I don’t think I’ll be using the Watch for either phone calls or emails. It can do both, but I’m just not interested in using it for either. For me, its ability to communicate in <em>new</em> ways is far more interesting than its ability to perpetuate existing forms of communication.<br />
<br />
<h2>
Apps</h2>
Many iPhone app developers are adapting their apps to present a simplified Watch interface. As an amateur app developer I can speak from personal experience that it <em>is</em> relatively easy to adapt an existing app. The Watch as an app platform is a guaranteed success, even before its launch, simply because there are <em>so many</em> developers and apps that can easily be adapted.<br />
<br />
A couple of examples have already caught my attention. I often use Uber. When I do so I stand in the street, iPhone in hand, watching the progress of my Uber car on a map. Uber are a launch partner for the watch and their new app allows me to leave my phone in my pocket and view the map on my wrist – much preferable.<br />
<br />
Trip Advisor is a very useful tool for finding places to visit. I’m not a fan of its mobile apps – they aren’t particularly well built and rely too heavily on reusing existing web content, rather than providing a truly excellent user experience. But they are an Apple Watch launch partner – and you can’t reuse web content on the device, so it looks like we might be in for a treat: a genuinely useful travel advisor on our wrists, with a native UI.<br />
<br />
There will be many more examples – the ones that will catch on will be those that capture an “out and about” usage, where leaving your iPhone tucked away in a pocket is a valuable convenience. The sheer volume of Watch Apps we’re likely to see means I will have plenty of things to play with.<br />
<br />
<h2>
Apple Pay</h2>
You can use Apple Watch with Apple Pay. We don’t have Apple Pay in the UK yet, but all the indications are that it’ll be with us by the end of the year. Apple Pay features in the “out and about” usage category for me – I don’t particularly want to pull my iPhone out to pay, so the ability simply to extend my wrist looks interesting. I know this isn’t going to change my life and it’s hardly a hardship to fish in your pocket – but I suspect it’ll become <em>so</em> convenient we’ll wonder how we managed in the dark ages of wallets and pockets!<br />
<br />
<br />
None of these uses is, on its own, a killer reason to own an Apple Watch. But, I <em>can</em> see their combination making the watch essential by sheer volume of different uses. Being essential means I might remember to charge it and to put it on my wrist – for me that will be a key test. If it’s only semi-useful then it’ll quickly fall into disuse, but I don’t think that will be the case. <br />
<br />
Early <a href="http://techcrunch.com/2015/03/06/the-apple-watch-is-time-saved">anecdotal reports</a> from those who have used the watch are fascinating:<br />
<blockquote class="tr_bq">
“People that have worn the Watch say that they take their phones out of their pockets far, far less than they used to. A simple tap to reply or glance on the wrist or dictation is a massively different interaction model than pulling out an iPhone, unlocking it and being pulled into its merciless vortex of attention suck. One user told me that they nearly “stopped” using their phone during the day; they used to have it out and now they don’t, period.”</blockquote>
Adopting the Watch will be an experiment, in the same way that ownership of an iPhone was in the early days. Many people will hold off, sceptical of the newcomer, in the same way that they held onto their Blackberries and professed the essential nature of a physical keyboard. But I’m not sceptical – I’m excited in the same way that I was excited by my first iPhone. I was in the queue to purchase the iPhone 3G on launch day and never looked back. The Watch is cheaper than that iPhone was and looks like it’s had an order-of-magnitude more love poured into its physical design and manufacture. Only time will tell if these guesses are correct – but I’m placing my bets on the Apple Watch being a significant inflection in the way that we use technology.<br />
<br />Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com0tag:blogger.com,1999:blog-7114254501166964003.post-89860180966638544622014-12-21T06:46:00.001-08:002014-12-21T06:49:31.691-08:00Reviewing 2014 Mobile PredictionsAbout this time last year I wrote a <a href="http://duncan-anderson.blogspot.co.uk/2013/12/unlucky-for-some-13-mobile-expectations.html">blog entry outlining my predictions for mobile technology in 2014</a>. I said at the time that I might review my predictions and here we are – a year later. So, how did I do? <br />
<br />
<h2>
1. Smartphones get smarter, with more real-world sensors</h2>
This prediction wasn’t exactly difficult. Every new phone continues to support an expanded set of environmental and communication sensors. We saw Apple’s iPhone 6 include new “Sensor Pixels” to provide the Phase Detection autofocus normally associated with DSLRs, a barometric air-pressure sensor and new NFC communications for Apple Pay. We can all see the way that smart-watches are including personal health sensors, made possible by their close proximity to our skin – the Apple Watch, for example, includes movement sensors and a heart-rate monitor. There have been many articles speculating on other health sensors that might be built into mobile devices – detecting chemicals or perspiration in our skin, blood flow, etc. There doesn’t seem to be any let-up in the real-world sensing our mobile devices acquire.<br />
<br />
SCORE: YES!<br />
<br />
<h2>
2. Smartphones get smarter, with more artificial intelligence</h2>
<a href="https://www.apple.com/ios/siri/">Apple’s Siri</a> and <a href="http://www.google.com/landing/now/">Google Now</a> both continue to get cleverer. This year they were joined by <a href="http://www.microsoft.com/en-us/mobile/campaign-cortana/">Microsoft’s Cortana</a>. Having natural speech technology and artificial intelligence now appears to be table-stakes for a smartphone platform.<br />
<br />
My personal usage of these services is probably best described as erratic. Interestingly, however, <a href="http://dangrover.com/blog/2014/12/01/chinese-mobile-app-ui-trends.html" target="_blank">app trends in China</a> seem to indicate that “chat” interfaces are now de rigueur for all sorts of apps. If this trend crosses to the West, then automated intelligence is going to be a critical component. <br />
<br />
My employer, IBM, launched its new <a href="http://www.ibm.com/smarterplanet/us/en/ibmwatson/">Watson</a> division this year. A $1bn investment in artificial intelligence (we call it “Cognitive Computing” now, but it’s the same thing) isn’t to be sniffed at. Most Watson apps tend to be surfaced through a mobile interface. And Watson isn’t about fulfilling the trivial whims of a smartphone owner; it’s about much more serious things. Like <a href="http://www.mdanderson.org/newsroom/news-releases/2013/ibm-watson-to-power-moon-shots-.html">diagnosing and curing cancer</a>.<br />
<br />
SCORE: YES!<br />
<br />
<h2>
3. Mobile extends to the Home</h2>
Google bought <a href="https://nest.com/" target="_blank">Nest</a>, validating the importance of the “smart” home.
Nest saw a bunch of competitors emerge with their own “smart” thermostats, including <a href="http://yourhome.honeywell.com/home/Products/Thermostats/7-Day-Programmable/Lyric.htm">Honeywell</a>, <a href="https://www.hivehome.com/">Hive</a> and <a href="https://tado.com/gb/heatingcontrol-savings">Tado</a> to name a few.
Apple incorporated <a href="https://developer.apple.com/homekit/">HomeKit</a> into iOS8, providing a common framework to manage smart home devices from your iPhone or iPad. We seem to be getting pretty close to the connected home being a reality not just for the geek squad, but for the average consumer.<br />
<br />
SCORE: YES!<br />
<br />
<h2>
4. Mobile extends to the person</h2>
Android Wear devices trickled onto the market, including the much talked about <a href="http://www.engadget.com/2014/09/12/moto-360-review/">Moto 360</a>. They’ve so far failed to make a market impact and I’ve never seen one, so maybe the jury is out on how much consumers really <em>want</em> or <em>need</em> this kind of capability.<br />
<br />
The <a href="http://www.apple.com/watch">Apple Watch</a> announcement was one of the biggest and most anticipated from the fruity company. Can it succeed where others are struggling? It’s rumoured to make its debut around February 2015, so it’s not long to wait now. <br />
<br />
For me the killer app for these devices might be a new form of very <a href="https://www.apple.com/watch/new-ways-to-connect/">personal communication</a>. The haptic feedback in the Apple Watch that lightly vibrates a message against your skin, mimicking that tapped out by your friend on their watch, is a peculiarly intimate form of communication. Or perhaps not quite as intimate as transmitting a pulsing heart rate to your lover? Time will tell, but there seems high potential for new forms of intimate communication here.<br />
<br />
SCORE: YES!<br />
<br />
<h2>
5. Fashion and style become increasingly important</h2>
Apple have been on a mission this year to woo us with the fashion status of the Apple Watch. Articles in <a href="http://www.vanityfair.com/online/daily/2014/09/apple-pay-watch-iphone-launch">Vanity Fair</a> and events at <a href="http://techcrunch.com/2014/09/30/apple-rubs-elbows-with-the-fashion-elite-for-paris-apple-watch-event/">Paris Fashion Week</a> – they sure want us to see the watch as a fashion accessory. The Apple Watch is probably the first electronic gadget that’s been designed and marketed so explicitly with fashion in mind. <br />
<br />
The watch comes in four different case metals, two different face sizes and can be customised with a huge profusion of straps in different colours and materials. When you then add the array of electronic faces and software customisations, it’s clear the Apple Watch is without doubt the most customisable and personal piece of gadgetry we’ve yet seen. <br />
<br />
This is the first serious piece of fashion-conscious design and marketing we’ve seen in the tech-world - and if the rest of the industry does what it usually does when Apple leads, we’re going to see much more of this. Indeed, the most talked-about Android Wear watch is the Moto 360 - not because of its capabilities, but because of its looks.<br />
<br />
Personally? I’m tempted by an Apple Watch, 38mm Stainless Steel Case with Brown Modern Buckle. Maybe Santa will pay an unseasonal visit in the spring!<br />
<br />
SCORE: YES!<br />
<br />
<h2>
6. Bluetooth Low Energy (BLE) powers new forms of mobile interaction</h2>
I’m a big fan of BLE-enabled iBeacon - I think it has the potential to bring about very interesting new forms of interaction between mobile devices and their physical environments. But we’ve not seen much of it in our day-to-day lives so far. <br />
<br />
I’m told there’s an Oxford Street app (one of London’s main shopping districts) that sends you marketing alerts as you walk past shops – but why would you want that? In the USA the NFL is using iBeacon <a href="http://appleinsider.com/articles/14/02/01/nfl-brings-ios-compatible-beacon-tech-to-new-york-for-super-bowl">within its stadiums</a>. And of course Apple itself uses iBeacon within its stores. But so far, I’d argue, the usage of this technology is relatively invisible to most of us and therefore its potential remains untapped. <br />
<br />
We now have iBeacon devices being manufactured by high-profile organisations like Qualcomm’s <a href="https://www.gimbal.com/">Gimbal</a> spinoff and <a href="http://estimote.com/">Estimote</a>. But it’s taken a year for this manufacturing capability to mature. The way that iBeacon apps might be used is also taking a little time to mature beyond the simple ‘marketing alerts’ that few of us want or need. However, I think we’re getting there – it’s just taking a little time for all of the components to fall into place.<br />
<br />
SCORE: Maybe<br />
<br />
<h2>
7. Processor innovation shifts from performance to low power usage</h2>
I kind of got this one wrong. I thought phones were powerful enough, so the emphasis would shift to battery life (because that’s what we all complain about).<br />
<br />
However, I misread the forces. Screen pixel density and size have been increasing - and more pixels mean more processor power is needed to shift them around. We’ve still been on the “need more power” part of the maturity curve.<br />
<br />
Perhaps more importantly, phones are no longer phones. If smartphones were to stay as relatively simple devices then we would have all the power we need - but they won’t. They are morphing into pocket computers with more real-world sensors and capabilities than we ever imagined.<br />
<br />
Phones are becoming games machines - Apple’s 64-bit A8 processor means that iPhones are beginning to rival dedicated consoles for their graphical capabilities. <br />
<br />
And the rumour sites continually mention tantalising snippets of information about <a href="http://9to5mac.com/2014/12/09/apple-3d-ios-interface-motion-sensing-gestures-3d-mapping-primesense/">3D user interfaces</a>, <a href="http://allthingsd.com/20131124/apple-confirms-acquisition-of-3d-sensor-startup-primesense/">3D real-world sensors</a>, <a href="http://www.cnet.com/news/rumor-apple-to-add-tactile-feedback-to-iphone/#!">haptic displays</a> and <a href="http://www.tuaw.com/2014/11/20/john-gruber-next-gen-iphone-may-feature-the-biggest-camera-jum/">DSLR-quality cameras</a>. Such capabilities surely drive increasing demands on processors, due to the sheer quantity of data that needs to be processed.<br />
<br />
As excitement over the novel tablet form factor seems to have slackened off slightly, we're seeing a focus on software capabilities, i.e. "yes, OK, I like the device, but what do I <em>do</em> with it?" Microsoft Office on the iPad was an important inflection point. And <a href="http://www.pixelmator.com/">Pixelmator</a> on the iPad rivals Photoshop - which was unthinkable when the first iPad emerged. But serious apps like this need all the processor power they can get.<br />
<br />
My mistake in this prediction was in categorising smartphones and tablets as an effectively static category with static capabilities. However, it's now clear that it's a moving category - as the devices become more powerful, consumers and industry are finding more ways to exploit that power. The iPad Air 2 is <a href="http://daringfireball.net/2014/10/ipad_air_2">faster</a>, processor-speed-wise, than a three-year-old MacBook Air - and if we're going to be editing documents, photos and videos it needs to be.<br />
<br />
There is, however, an interesting dynamic that’s going to hit us soon. The yearly reduction in die-size of microprocessors has been a central part of the increase in power that we’ve seen over the years. As processors become smaller, they also get faster. Today’s Apple A8 chip is based on 20nm technology and next year’s A9 is rumoured to be based on 14nm technology. After that 10nm is predicted, but then it starts getting tricky. Below 10nm the experts say that we’re nearing the edges of physics - to reduce even further has a great deal of <a href="http://semiengineering.com/will-7nm-and-5nm-really-happen/">uncertainty and requires the invention of new materials</a>. The level of deep science required to continue the <a href="https://en.wikipedia.org/wiki/Moore%27s_law">Moore’s Law</a> path is enormous, which is why <a href="http://www.eweek.com/servers/ibm-to-spend-3-billion-to-research-the-future-of-chips-systems.html">IBM is investing $3bn in microprocessor research</a>. Whether we hit a bump in the road, or if the science keeps ahead of us, is a little uncertain – it’ll certainly be interesting to see what plays out over the next few years.<br />
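The back-of-envelope arithmetic behind those node steps: in the idealised Moore’s Law picture, transistor density scales with the inverse square of the feature size, so each shrink doubles the transistor budget, give or take. (Treat this as the textbook scaling rule only - modern node names have drifted away from physical dimensions, so the real A8-to-A9 gain won’t match these numbers exactly.)

```python
def density_gain(old_nm, new_nm):
    """Idealised transistor-density gain from a die shrink:
    density scales as 1 / (feature size)^2."""
    return (old_nm / new_nm) ** 2

# The node progression mentioned above: 20nm -> 14nm -> 10nm.
for old, new in [(20, 14), (14, 10)]:
    print(f"{old}nm -> {new}nm: ~{density_gain(old, new):.2f}x transistor density")
```

Each step lands close to the classic 2x-per-node doubling - which is exactly the cadence that gets hard to sustain once the physics runs out below 10nm.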
<br />
SCORE: NO<br />
<br />
<h2>
8. Innovation emphasis shifts from hardware to software</h2>
I was naive in predicting that processor speeds would stop increasing and I think I was naive on this one for similar reasons. The idea that my iPhone is as good as I ever need it to be for a phone ignores that my iPhone is going to do a whole lot more as hardware capabilities improve. Screens with haptic feedback, 3D sensing of the real world, possibly even 3D screens (if Nintendo can do it with the <a href="https://www.nintendo.com/3ds/">3DS</a> then I’m sure we’re going to get it on phones before long).<br />
<br />
Yes, software is an engine of innovation. Apps like <a href="http://www.pixelmator.com/ipad/">Pixelmator</a>, which provides a Photoshop-like experience on the iPad, transform what we can do with mobile devices. Without this software-led innovation we'd be stuck making phone calls and browsing the odd website. But we're not. When <a href="http://www.truthnyc.com/HolidayWonders/" target="_blank">entire ad campaigns are filmed on an iPhone</a>, we know software innovation is at the heart of things. The app revolution is central to what has driven the popularity of mobile devices. But so is hardware – and I’m willing to predict that rather than a tailing off of innovation, we might even see an increase. Haptic feedback is in the Apple Watch, so how long before it hits the iPhone, for example?<br />
<br />
SCORE: NO<br />
<br />
<h2>
9. Backend As A Service gets serious</h2>
In the IT industry there’s one sure-fire way to tell when a technology is getting serious – when my employer, IBM, gets involved. IBM’s release of <a href="http://bluemix.net/">Bluemix</a> in the spring of 2014 was one such move. Bluemix includes mobile backend-as-a-service (MBaaS) capabilities and is seen as an incredibly strategic move within the company. Most mobile apps now need to interact with some server-side capabilities – so making the job of mobile app developers easier and simplifying server-side programming is critical. I can’t think of a better way of validating this prediction than the evidence of Bluemix. <br />
<br />
Whether it’s Amazon, Facebook’s <a href="https://parse.com/">Parse</a> or IBM’s <a href="http://bluemix.net/">Bluemix</a> – any app worth its salt needs a back-end.<br />
<br />
SCORE: YES!<br />
<br />
<h2>
10. Mobile continues to eat the PC market</h2>
Yes, the PC market continues to <a href="http://www.cio-today.com/article/index.php?story_id=0110014L2KU3">stagnate</a>, with further sales declines year-over-year. There’s little let-up for PC manufacturers and price competition is relentless. Innovation certainly seems more obvious in mobile, where sales volumes continue to increase. The very personal nature of mobile devices means those volumes are unlikely to slack off until every human on the planet has one. Facebook’s Mark Zuckerberg even thinks he <a href="http://www.independent.co.uk/life-style/gadgets-and-tech/news/facebook-mark-zuckerbergs-hunt-for-five-billion-new-friends-8778024.html">can make that happen</a> with <a href="http://internet.org/">internet.org</a>.<br />
<br />
SCORE: YES!<br />
<br />
<h2>
11. The platform wars are over</h2>
I’ve noticed a more balanced commentary in the market as 2014 has progressed – fewer ill-informed “my phone is better than yours” discussions. I’m not sure the fanboys have quite declared détente yet, but most normal people have. iOS and Android own the market and it seems pretty clear that position isn’t going to change in the foreseeable future. Android’s higher sales figures are balanced by Apple’s focus on the more affluent (and therefore commercially influential) part of the market.<br />
<br />
In some (but not many) price-sensitive markets Microsoft's loss-leader strategy with Windows Phone has led to reasonable market share for the platform. But I find it hard to see how this success can be replicated around the globe profitably.<br />
<br />
For a given market it's obvious what the important mobile platforms are – mostly that's iOS and Android. But most of us have got bored of talking about it. If you still think your preferred mobile platform is going to take over the world – get over it!<br />
<br />
SCORE: YES!<br />
<br />
<h2>
12. (Small) Cameras replaced by phones</h2>
The market for compact cameras appears to have collapsed. The cameras in our phones are now as good as many of us need for casual snaps. A perusal of the most popular cameras on photo-sharing site Flickr, I think, provides all the <a href="https://www.flickr.com/cameras/">evidence</a> that we need.<br />
<br />
We’re also just beginning to see the <a href="http://techcrunch.com/2014/04/16/google-camera-app-brings-lens-blur-background-defocus-to-any-kitkat-android-devices/">emulation of depth-of-field lens blur</a> in smartphone software – so even that arty shallow-focus effect of high-end DSLRs is now possible. <br />
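The core trick behind that software lens blur is easy to sketch: derive (or let the user paint) a mask separating subject from background, then blur only the background pixels. Here’s a toy version on a grayscale image using a naive box blur - purely illustrative, since the real phone implementations derive proper depth maps and use far better blur kernels:

```python
def box_blur(img, radius=1):
    """Naive box blur over a 2D grayscale image (list of lists of numbers)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out


def fake_lens_blur(img, mask, radius=1):
    """Keep pixels where mask == 1 (the subject) sharp; blur the rest."""
    blurred = box_blur(img, radius)
    return [
        [img[y][x] if mask[y][x] else blurred[y][x] for x in range(len(img[0]))]
        for y in range(len(img))
    ]
```

The subject stays pixel-for-pixel identical while the background gets smeared - the same compositing idea, writ very small, as the shallow-focus effect described above.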
<br />
Whichever way you look at it, it seems pretty certain that smartphone cameras and very clever image manipulation software are rendering the point-and-shoot camera obsolete. Maybe they will one day threaten DSLRs as well, who knows? <br />
<br />
SCORE: YES!<br />
<br />
<h2>
13. Tension between cloud convenience and corporate/government snooping grows</h2>
Apple even went as far as building an <a href="https://www.apple.com/privacy/">entire website</a> dedicated to privacy, making their approach to the subject a selling point of the iOS platform. And shortly afterwards, Google announced that Android 5.0 Lollipop would follow iOS 8’s lead and enable encryption by default. Many internet companies have also moved rapidly to increase the <a href="http://www.datacenterknowledge.com/archives/2013/09/09/google-boosts-encryption-between-data-centers/">level of encryption</a> used in their cloud data centres. <br />
<br />
Of course, these moves <a href="http://www.pcpro.co.uk/news/security/391474/gchq-tech-companies-in-denial-over-terrorists">incensed the security agencies</a>, with the UK’s Robert Hannigan (head of spy headquarters GCHQ) writing an <a href="http://www.ft.com/cms/s/2/c89b6c58-6342-11e4-8a63-00144feabdc0.html#axzz3MWL1KlIt">article</a> in the Financial Times complaining about it. Quite how this will all work out is very unclear – but the tech companies certainly seem to have caught on to the public mood of concern about recent spying revelations. <br />
<br />
SCORE: YES!<br />
<br />
<h2>
Summary</h2>
I make this 10.5/13, if I award myself 1/2 mark for the “maybe”. Not bad. It’s certainly interesting to see how things have worked out and where some of the trends are heading. I don’t think I’d change the categories here for next year’s predictions and I’ve given a perspective on what I think will happen in 2015. I wonder what I’ll be saying in another year’s time?

Breaking (photography) Rules (2014-09-18)

<p>As a keen amateur photographer, I used the luxury of a recent holiday to explore the more creative side of my photography. In recent years I've focussed on learning the science of photography, but this year I've been trying to be a bit more creative. What I've found has surprised me, so I thought this might be something worth sharing, together with my experiences of migrating towards the iPad for my image processing. A common theme in my discoveries has been that of breaking what I had previously thought of as rules. I quite like breaking rules - it's much more fun than following them! So, this post is organised around the rules I've been trying to break.</p><h2>You need a computer to process photographs</h2><p>I'm a keen amateur photographer. Anyone who knows what this means will also know the profusion of “stuff” the hobby brings with it. Lenses, filters, tripod, laptop, the list goes on. The amount of bulk is a real problem. If you're not careful, a holiday can become about your stuff, not about the holiday experiences themselves.</p><p>For these reasons I moved entirely to Olympus micro-four-thirds a couple of years ago. The system's cameras are smaller and the lenses dramatically so. The size and weight savings over the more traditional Canon/Nikon/Sony options are significant, and image quality is more than most of us need. 
Less stuff, less weight and bulk are things I value greatly. So, m4/3 is the format I've decided to go with.</p><p><a href="https://www.flickr.com/photos/dubswede/12760429823/" title="40d vs omd e-m5 by peteSwede, on Flickr"><img src="https://farm4.staticflickr.com/3706/12760429823_f7a64d75f2.jpg" id="blogsy-1411073929480.4795" class="" alt="40d vs omd e-m5" width="500" height="334"></a></p><p>Having replaced a bulky camera system with a lighter option, the idea of replacing a bulky computer with a lighter iPad is also very appealing. </p><p>There are other advantages to the iPad. The laptop brings too many distractions - I might be tempted to 'tinker' with things and, before long, half the day is spent on a screen. Holidays are about getting out and seeing places and experiencing things with the family, not sitting behind a computer. The iPad is more task-oriented, and it feels less “worky” and more in tune with the needs of a holiday. Fewer distractions and more space in the car. But could an iPad replace a computer for the comparatively high-end needs of a mobile photographer?</p><p>An Apple SD Card Adapter allows you to import photos from your SD card, storing them on the iPad. For my summer vacation this year I left the computer at home and <em>only</em> used an iPad to store and manipulate images for two weeks.</p><p>My workflow was to shoot during the day, upload to the iPad in the evening, perform any photo manipulation on the iPad and then upload my chosen images to Flickr. Using Flickr sorts the wheat from the chaff and also ensures my best images are backed up to the cloud - so they aren't vulnerable to an opportunist thief. If I'm staying somewhere without wifi, I just wait to do the Flickr upload until I have a network. My 64GB iPad has sufficient storage space to hold all my photography for a two-week holiday, so it's no big deal if I can't do an upload. 
On return from vacation, I plug the iPad into my computer and upload all images into my usual photo library, which currently happens to be Apple's Aperture.</p><p>This workflow worked <em>really</em> well for me and I thoroughly recommend it. Ditch the computer whenever you're away from home and start using that iPad! It acted as both a cache for my photos and a mobile editing studio, and I no longer feel a need to carry a laptop for photographic purposes at all.</p><h2>JPG is for amateurs, real photographers only use RAW images</h2><p>All cameras manipulate the raw data that comes off their sensors to create a pleasing JPG image. Many professionals prefer to use the RAW files their cameras produce, performing their own chosen manipulations rather than accepting the defaults chosen by camera manufacturers. By editing the RAW file, you have more latitude in changes to exposure and white balance, so this makes sense. </p><p>However, modern camera JPG engines are sometimes very sophisticated and difficult to replicate manually. I use an Olympus camera and their JPG engine is generally considered to be amongst the very best. I often find that, after playing with a RAW file, I actually prefer the camera's JPG version. I personally think we're going to see much more of this, as the processing algorithms become more sophisticated and are applied selectively to parts of an image.</p><p>I've developed a workflow of shooting in RAW+JPG mode, so I have both versions to choose from. For most of my images, I perform minor tweaks to the JPG version. This is quicker and easier in the field than playing with a RAW file, if I don't need the extra latitude. It also means that things like Art Filters and Crop Mode, which are only applied to the JPG version by my camera, are retained. 
For specific images that warrant extra effort, I dip into the RAW file - but only doing so occasionally saves me <em>lots</em> of screen time and effort - something I like to avoid.</p><p>Here's an example of a camera JPG of St Michael's Mount. I'm not sure what I need to do to this that I need RAW for - the JPG seems perfectly good to me.</p><p><a href="https://www.flickr.com/photos/duncsand/15042622046/" title="image by Duncan Anderson, on Flickr"><img src="https://farm4.staticflickr.com/3862/15042622046_c0852c378e.jpg" id="blogsy-1411073929545.189" class="" alt="image" width="500" height="375"></a></p><h2>The iPad doesn't do RAW processing</h2><p>Yes it does. I use Apple's Lightning adapter to import images from an SD card - this imports both JPG and RAW images. The iPad only shows the JPG versions, or a blown-up embedded JPG from the RAW file if you don't shoot JPG+RAW. However, the RAW is there and you just need an app that can see it (most apps can't). I use <a href="https://itunes.apple.com/gb/app/photogene-4/id363448251?mt=8">PhotoGene</a> to view and edit those RAW files on the iPad.</p><p>I had considered using Adobe's Lightroom iPad app. However, bizarrely, it doesn't read RAW files. I say bizarrely, because Lightroom on a computer is <em>known</em> as a RAW editor. So, it's a bit strange that it's a JPG editor on the iPad. Maybe they have plans for version 2.</p><p>PhotoGene is a pretty full-function RAW editor. It runs on both iPad and iPhone - I find it surprisingly good for mobile image processing. 
Here's a processed shot of St Michael's Mount, generated from a RAW file on my iPad by PhotoGene.</p><p><a href="https://www.flickr.com/photos/duncsand/14883734919/" title="image by Duncan Anderson, on Flickr"><img src="https://farm6.staticflickr.com/5574/14883734919_3232f6bf27.jpg" id="blogsy-1411073929490.8242" class="" alt="image" width="500" height="375"></a></p><p>Whilst PhotoGene can't hope to match the sophistication of something like Lightroom, I found it more than adequate for my needs. It has all the usual exposure, levels, curves, highlights, shadows, etc. adjustments you might be familiar with.</p><h2>You need a computer to do serious image manipulation</h2><p>Lightroom and its computer-tethered compatriots are great. They provide a vast amount of flexibility. But if you're anything like me, it's sometimes <em>too much</em> flexibility; the options are daunting. And I find 'touching up' an on-screen image with a mouse to be <em>very</em> tricky. Hand-eye co-ordination for brushing in exposure changes, for example, is something I've always struggled with.</p><p><a href="https://www.flickr.com/photos/rogermdl/5668025754/" title="iPad by Roger Martins, on Flickr"><img src="https://farm6.staticflickr.com/5186/5668025754_9fcb32c869.jpg" id="blogsy-1411073929494.6548" class="" alt="iPad" width="500" height="230"></a></p><p>On the iPad you literally paint directly onto the image with your finger. If you want a darker sky, just swipe your finger over it. Both iPhoto and PhotoGene, two of the apps I have used, have this feature. PhotoGene is much more sophisticated, with blur effects, intensity control, etc. But both allow quick swipes to make exposure, sharpening and saturation changes. I find this approach much more natural than using a mouse on a computer. I strongly suspect that, as the software gets more sophisticated, this is the future of photo re-touching. iPad and photo re-touching are a match made in heaven. 
So, although PhotoGene is not Lightroom, it has its own unique advantages because it sits on an iPad with its touch screen.</p><h2>In-camera art filters aren't for serious picture-taking</h2><p>I used to think this. After all, these are playthings, surely? If you really want to mess around with an image, then you can do that later in your chosen editing suite.</p><p>However, I've come to believe that composing live and using the art filter as you are viewing a scene is much more creative. It means the image you create is an artistic representation of your reaction to it whilst you're viewing it, rather than a more mechanical approach created later on a computer. Artfully manipulating a photograph later on means you have <em>so many</em> options that they become daunting - most of the time I don't bother. However, doing it 'in the field' means you can create an image without spending hours in Photoshop (which I detest doing). It's faster, more creative and impulsive. It means I can create images I would never otherwise create.</p><p>Here's an example using the 'Key Lime' effect in my Olympus camera. It's of Mousehole harbour in Cornwall. It was a bit of a grey day and the colours were quite washed-out in the normal image. Using the art filter enabled me to create a pleasing image on a day that, otherwise, was only resulting in rather average pictures.</p><p><a href="https://www.flickr.com/photos/duncsand/15030818981/" title="image by Duncan Anderson, on Flickr"><img src="https://farm4.staticflickr.com/3870/15030818981_2922121ac1.jpg" id="blogsy-1411073929544.2515" class="" alt="image" width="500" height="375"></a></p><h2>It's better to under, rather than over, expose</h2><p>The logic to this rule is that if you over-expose, you end up with 'burnt-out' areas in the image - areas of white where no image data exists. When this happens, no amount of processing can recover the burnt-out sections of the image. 
However, a small amount of under-exposure can often be corrected as data still exists to be manipulated. </p><p>The problem with this rule is that it doesn't recognise that white areas of an image can be used to creative effect. For example, this image was created using the Olympus camera's B&W Grainy Film art filter, with about 3 stops of over-exposure dialled in. It results in a silhouette effect. I've framed the composition to ensure large areas of white sky dominate. </p><p><a href="https://www.flickr.com/photos/duncsand/15062810001/" title="St Michael's Mount by Duncan Anderson, on Flickr"><img src="https://farm4.staticflickr.com/3837/15062810001_07ac455b36.jpg" id="blogsy-1411073929506.582" class="" alt="St Michael's Mount" width="500" height="281"></a></p><p>For me, burnt-out areas of an image are a creative tool, not a mistake.</p><h2>Good pictures are all about the science of shutter speed, ISO and aperture</h2><p>I've come to realise there are three, all equally important, parts to good photographs:</p><ul><li>Mastering the camera controls - shutter speed, aperture, ISO, etc.</li><li>Mastering composition - understanding the impact of leading lines, the rule of thirds, etc.</li><li>Becoming an artist - creating images that have an emotional impact.</li></ul><p><a href="https://www.flickr.com/photos/nayukim/3969530649/" title="Camera lens and aperture by nayukim, on Flickr"><img src="https://farm3.staticflickr.com/2605/3969530649_39fa067a33.jpg" id="blogsy-1411073929548.0862" class="" alt="Camera lens and aperture" width="500" height="375"></a></p><p>This last one, of being an artist, is the most difficult. The first two can be learnt fairly easily and quickly. But they can lead to 'formulaic' images. I'm only just starting this journey, but it seems to me that great images have an artistic flair to them. So, the science is necessary, but ultimately a subsidiary consideration. You need to be able to control a camera, but not be bound by it. 
</p><p>It might be that rules help you make good images, but I am beginning to learn that you need to break them in order to make great images. Breaking rules is so much more fun than following them!</p>

Some thoughts on Apple Pay (2014-09-11)

This week Apple launched its <a href="https://www.apple.com/apple-pay/">“Apple Pay”</a> mobile payment service. To those familiar with NFC-equipped Android phones this might seem a “me too” offering. However, on closer inspection there are some radical differences between this NFC implementation and anything we’ve seen before. This fascinates me, so I thought it worth sharing some thoughts on why Apple Pay might be significant. I have summarised the interesting points I’ve seen into six key areas:<br />
<ul>
<li>Authorisation & Financial Limits</li>
<li>Security & Privacy</li>
<li>In-App Payments</li>
<li>Business Model</li>
<li>How Apple Pay relates to Bitcoin</li>
<li>Innovation or not?</li>
</ul>
I’ll take these one at a time and hopefully you will be able to follow my logic!<br />
<br />
<h2>
Authorisation and Financial Limits</h2>
NFC implementations tend to impose a financial limit on every payment transaction. In the UK this is typically £20 per transaction. In other jurisdictions different limits apply - for example, in the USA Google Wallet has a daily limit. These limits are imposed because NFC transactions mostly (always?) require no user authorisation - lose your NFC-equipped card or phone and anyone can spend your money. In these NFC implementations, everyone involved has agreed to make payments very easy by removing the need for signatures or PIN codes. Of course this is a massive security hole, so the limits ensure thieves only get away with a limited amount of our money.<br />
<br />
It’s possible to design an NFC implementation that requires the user to enter a PIN; indeed, I was involved in the outline design of such a solution for a bank a couple of years ago. However, there is <em>significant</em> complexity in doing this. Furthermore, the user experience of “tap, unlock phone, find wallet app, enter PIN to authorise” destroys the original value of speed and simplicity that made NFC so attractive in the first place. A secure phone-based NFC wallet looked <em>slower</em> and <em>more</em> difficult to use than a plastic card and PIN - hardly something likely to see market acceptance on a wide scale. Hence, everyone went with the unsecured solution and transaction limits; easier to use, but fairly limited in value. <br />
<br />
Now, a £20 limit is OK if the only things I want to buy are sandwiches and coffees. I don’t know about you, but my idea of a wallet includes the ability to be <em>slightly</em> more profligate than that. And a £20 limit means I <em>still</em> need to carry my plastic cards with me. It ensures the mobile wallet is incapable of replacing my physical wallet and this is a significant barrier to adoption.<br />
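The rule structure described above boils down to a tiny piece of decision logic: an unauthenticated tap is only honoured below the floor limit, while an authenticated one needn’t be capped. Here’s a sketch of that logic - illustrative only, not any card scheme’s actual authorisation rules:

```python
CONTACTLESS_LIMIT_GBP = 20  # the typical UK per-transaction floor limit

def authorise_tap(amount_gbp, biometric_verified=False):
    """Decide whether a contactless tap should be approved.

    An unauthenticated tap (plain NFC card or phone) is capped at the
    floor limit, because anyone holding the device could make it.
    A tap authorised by a verified fingerprint needs no such cap.
    """
    if biometric_verified:
        return True
    return amount_gbp <= CONTACTLESS_LIMIT_GBP
```

So `authorise_tap(4.50)` buys the coffee, `authorise_tap(350)` is declined, and `authorise_tap(350, biometric_verified=True)` goes through - the whole point of putting TouchID in the loop.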
<br />
The big thing about Apple Pay is that it neatly solves this problem. By using TouchID for authorisation, the transaction limits can be removed without interfering with the simplicity of a tap. Every Apple Pay transaction is authorised by a biometric fingerprint, so not only can we buy expensive things, but the level of security actually <em>increases</em> over our traditional plastic cards. <br />
<br />
In the USA the reliance on signatures for plastic card authorisation is a well-understood exposure. In Europe we’ve moved to chip-and-PIN, which is more secure - but it’s still notoriously easy for a thief to capture a 4-digit PIN by peering over our shoulders. Neither signatures nor PIN codes are particularly secure and, as a result, fraud in the industry is estimated to be a multi-billion-dollar problem.<br />
<br />
Any authorisation method can be defeated with enough effort and fingerprints can be stolen. We went through all of this when TouchID was first launched a year ago. <a href="http://duncan-anderson.blogspot.co.uk/2013/10/smartphone-security-and-touch-id.html">I blogged about it at the time</a>. We heard the nay-sayers warning of fingerprints being stolen off the glass screens of our phones, or of violent thieves amputating our fingers. The fact that we’ve seen <em>no</em> reports of any such James Bond-type activity actually occurring tells me that, whilst theoretically possible, it’s just too hard to justify the effort involved. So, TouchID seems a remarkably easy and secure method of authorising a payment transaction. I’ve used TouchID for a year on my iPhone and can report virtually faultless performance - it hardly ever fails to work first time. Extending this brilliant technology into payment authorisation seems an ideal solution and ensures security without introducing friction to the payment process.<br />
<br />
In other NFC implementations the NFC radio is always active, which seems to be the only way to ensure a swift tap is all that’s needed to make a payment. There has been some controversy about this, because of the concern of ‘accidental’ payments. Some have <a href="http://www.bbc.co.uk/news/business-22545804">claimed this happens</a>, <a href="http://tomorrowstransactions.com/2013/05/not-just-contactless-but-ms-contactless/">others that this is practically impossible</a>. Without wishing to assess the rights and wrongs, it is interesting that with Apple Pay a transaction can only occur whilst you are holding the TouchID button - so the risk of “accidental” transactions would appear to be virtually eliminated. From what I can see, the combination of NFC payments and TouchID is a match made in heaven.<br />
<br />
Apple Watch can also make payments - so how does authorisation work, given it doesn’t have TouchID? The solution is novel and intriguing. When you put Apple Watch on your wrist, you enter a PIN code. The watch then uses sensors on its back to detect that it is in constant contact with your skin. If the watch is taken off, you need to re-enter the PIN code to make a purchase. I would never have thought of that solution - it’s brilliant! <br />
<br />
<h2>
Security & Privacy</h2>
One of the big challenges with traditional payment solutions is that they always rely on the integrity of retailers. When we make a credit card payment we hand over our card to a retailer, who is perfectly capable of stealing the information written on it if they are so inclined. It’s really bizarre that the supposedly “secret” information is in plain sight on that card. A few years ago, when card-not-present transactions started to take off, the industry augmented the card number with the CVV code to increase security. But the CVV code is written in plain sight on the back of the card - presumably on the back because it was thought that thieves wouldn’t think to turn the card over! We’ve all heard the warnings to beware of restaurants who take our card away to process a transaction, because that allows them to easily clone the details - and with good reason.<br />
<br />
Many mobile payment solutions work by emulating that plastic card model - our wallet passes the same credit card numbers over to the retailer’s till. Models that rely on giving our “secret” details to a retailer and allowing them to debit our account are all inherently insecure. Any system that surrenders information that can be used over and over again to process fraudulent transactions has a massive fault-line through the middle of it.<br />
<br />
With Apple Pay, Apple is exploiting the new capability of <a href="http://usa.visa.com/clients-partners/technology-and-innovation/visa-token-service/index.jsp" target="_blank">“Tokenisation”</a>. This is a very new concept only just provided by the likes of Visa and Mastercard. It works by using a mathematical algorithm to create a one-time token. That token is what is passed to the retailer and used to complete a transaction, but the token can only be used once. So, even if a fraudster were to intercept the transaction and steal the token, it’s useless because it can’t be used for any future transactions. My colleague Richard Brown <a href="http://gendal.wordpress.com/2014/09/10/a-simple-explanation-of-how-apple-pay-works-probably-its-all-about-tokenization/">goes into more details about exactly how this works in his blog</a>. Suffice to say that this approach is very neat in that it removes the inherent risk in transactions that work by ‘pushing’ payment credentials to a retailer. Particularly significant is that Apple never stores our payment credentials - so there’s no need to worry about the security (or otherwise) of “the cloud” in this case.<br />
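The one-time-token idea can be sketched in a few lines. This is a loudly hypothetical toy - the real tokenisation schemes from Visa and Mastercard involve token vaults, cryptograms and the card networks, none of which is reproduced here - but it shows why an intercepted token is worthless: the verifier accepts each token at most once.

```python
import hashlib
import hmac

def make_token(secret: bytes, counter: int) -> str:
    """Derive a single-use payment token from a device secret and a counter."""
    return hmac.new(secret, str(counter).encode(), hashlib.sha256).hexdigest()[:16]


class TokenVerifier:
    """Toy stand-in for the network's token service: each token redeems once."""

    def __init__(self, secret: bytes):
        self.secret = secret
        self.used = set()

    def redeem(self, token: str, counter: int) -> bool:
        expected = make_token(self.secret, counter)
        if token != expected or token in self.used:
            return False  # forged, stale or replayed token
        self.used.add(token)
        return True
```

A stolen card number works forever; a stolen token here fails the moment it is replayed, which is the whole security argument in miniature.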
<blockquote>
“It’s the most secure combination of technology that we’ve ever deployed,”<br />
– <a href="http://online.wsj.com/articles/whys-apples-tap-to-pay-beats-credit-cards-1410394757">James Anderson, group head of mobile product development at credit card processor MasterCard</a></blockquote>
Much of the thinking for technology-enabled payments has revolved around the capture of data associated with those transactions and how that data might be mined for, usually, advertising purposes. Knowing how much we spend, and where we spend it, is very valuable information. However, for some (many?) of us this has become increasingly disconcerting. Revealing such details and subjecting ourselves to targeted advertising is an increasingly unappealing practice. “If you’re not paying for it, you’re the product” is a cry <a href="http://www.youtube.com/watch?v=ldhHkVjLe7A">eloquently made</a> by <a href="https://twitter.com/aral" target="_blank">@aral</a>. <br />
<br />
The Apple Pay commitments on privacy are very illuminating on this point:<br />
<blockquote>
“With Apple Pay, your payments are private. Apple doesn’t store the details of your transactions so they can’t be tied back to you. Your most recent purchases are kept in Passbook for your convenience, but that’s as far as it goes. Since you don’t have to show your credit or debit card, you never reveal your name, card number or security code to the cashier when you pay in store. This additional layer of privacy helps ensure that your information stays where it belongs. With you.”<br />
– <a href="https://www.apple.com/apple-pay/">Apple</a></blockquote>
With Apple Pay, we are seeing an explicit commitment <em>not</em> to collect our data or exploit it. For those concerned about the drift away from privacy in technology circles, this will be reassuring. This is a big difference in philosophy from some other providers. <br />
<br />
<h2>
In-App payments</h2>
Apple has a reputation for being ‘closed’, whatever that means. However, Apple Pay appears to be remarkably open. The scheme allows third-party payment API providers to use Apple Pay as the payment mechanism. This is very nice, because it means we can all write mobile apps that take payments, with a choice of provider. We’ve already seen announcements from <a href="https://stripe.com/apple-pay">Stripe</a>, <a href="http://www.payeezy.com/content/iospayments/index.html">Payeezy</a>, <a href="http://www.authorize.net/solutions/merchantsolutions/merchantservices/applepay/">Authorize.net</a>, <a href="https://secure.paymentech.com/developercenter/mobilesdk/ios/?WT.mc_id=adc001_sdk">Chase</a>, <a href="http://www.cybersource.com/applepayments/">Cybersource</a> and <a href="http://www.tsys.com/sdk/">TSYS</a>.<br />
<br />
Further, Square’s Jack Dorsey has also tweeted that Square will be accepting Apple Pay in its mobile point-of-sale solutions.
<br />
<blockquote class="twitter-tweet" lang="en">
Our millions of sellers will be able to accept any form of payment that comes across the counter, including Apple Pay!<br />
— Jack (@jack) <a href="https://twitter.com/jack/status/509413534862544896">September 9, 2014</a></blockquote>
<script async="" charset="utf-8" src="//platform.twitter.com/widgets.js"></script><br />
<h2>
Business Model</h2>
There have been a lot of reports circulating in the past week stating that Apple has negotiated a reduction in the “Interchange Rate” on card transactions. The interchange rate is essentially the fee charged for processing a card transaction. As consumers we’re blissfully unaware of this because it’s the <em>retailer</em> who pays, rather than us. Reductions in the interchange rate are <em>very</em> interesting. How much reduction is being applied, and where that money ends up, might tell us a lot. The latest report suggests that Apple is actually <a href="http://www.bloomberg.com/news/2014-09-10/apple-said-to-reap-fees-from-banks-in-new-payment-system.html">receiving a fee</a> from banks for processing Apple Pay transactions; this is <em>super</em> interesting.<br />
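To get a feel for the sums involved, here’s a back-of-envelope calculation. Both figures are assumptions on my part - interchange rates vary by card type and region, and the 0.25 percentage-point reduction is only a rumour:<br />
<br />
```python
# All figures are illustrative assumptions, not confirmed rates.
purchase = 100.00          # a £100 card transaction
interchange_rate = 0.015   # assume a typical interchange fee of ~1.5%
reduction = 0.0025         # the rumoured 0.25 percentage-point cut

standard_fee = purchase * interchange_rate
reduced_fee = purchase * (interchange_rate - reduction)
saving = standard_fee - reduced_fee

print(f"Fee at the standard rate: £{standard_fee:.2f}")  # £1.50
print(f"Fee at the reduced rate:  £{reduced_fee:.2f}")   # £1.25
print(f"In play, per £100 spent:  £{saving:.2f}")        # £0.25
```
<br />
Twenty-five pence per £100 sounds trivial, but multiplied across the card spending of millions of iPhone owners it becomes a serious sum - which is why where that money ends up matters so much.<br />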
<br />
Other suggestions have indicated that Apple is taking some (all?) of the fraud risk. Presumably banks would be happy to pay a fee in order to eliminate their fraud exposure. There’s certainly something interesting going on to get so many banks onboard so quickly. Chase is even <a href="http://www.loopinsight.com/2014/09/10/chase-sends-out-mass-apple-pay-mailing/">mailing</a> its customers touting the benefits of Apple Pay. Banks would be unlikely to be this enthusiastic if they felt that Apple was parking its tanks on their lawn. They must be happy with the solution.<br />
<br />
Quite what is happening here we might never know in detail, given the secrecy involved. However, it does seem there is some form of rebate or fee involved and possibly some risk-sharing. I blogged about <a href="http://duncan-anderson.blogspot.co.uk/2014/09/speculating-on-apple-mobile-payment.html">the implications</a> of this when the rumours first appeared. <br />
<br />
If, as now might seem to be the case, Apple is receiving some form of fee for each transaction, this has <em>huge</em> implications. This potentially means that Apple is building a revenue stream from our usage of iPhones. If successful, this could allow them to subsidise the purchase price of phones. Given that the big benefit Android phones have is a lower purchase price, this gives Apple a way to eliminate that competitive advantage. I have no idea if they really are receiving a meaningful fee or if they actually plan to use this to offset iPhone acquisition costs - I’m just speculating. Apple are cautious - no doubt they will be looking to see how Apple Pay takes off, and see revenues become concrete, before deciding how to use this ‘fund’. But I think they have strategic options they didn’t have last week.<br />
<br />
<h2>
How does Bitcoin relate to Apple Pay?</h2>
The payments world is currently obsessed with Bitcoin. However, Bitcoin isn't really analogous to plastic cards, and thus to the current Apple Pay solution, because:<br />
<ul>
<li>It's not provided by trusted or recognised brands familiar to average consumers,</li>
<li>There is no way to resolve the loss of a password,</li>
<li>There are no procedures or solutions to remove the risk of financial loss in the event of a hack or stolen password.</li>
</ul>
This means that Bitcoin is far more like cash than plastic cards - if you lose it, it's your problem. There is no governance, no help desk and no anti-fraud procedures - but neither is there for cash.<br />
<br />
We are beginning to see a strategy develop amongst the larger financial players of establishing trusted brands, electronic payment methods and wallets on top of the existing card infrastructure, in order to ease consumers into the idea of more novel forms of payment. Once these have been established, Bitcoin (or other electronic currencies) can be added as an additional payment form - much like we carry both credit cards and cash today. For example Stripe, who power a lot of in-app payments, first started by exploiting cards but are now working on <a href="https://stripe.com/bitcoin" target="_blank">easing Bitcoin in</a> as an alternative payment method. This kind of approach legitimises novel currencies like Bitcoin for average consumers. It’s easy to see that, once Apple Pay has achieved traction and consumer trust/acceptance, more novel payment mechanisms like Bitcoin could similarly be added. <br />
<br />
“Doing Bitcoin” straight off would be very un-Apple. It’s much more like them to first establish trust in Apple Pay by using existing recognisable brands and payment mechanisms. Only once that is established would more novel approaches be considered. I see Apple Pay as buying a strategic option to potentially adopt novel payment mechanisms or currencies in the future. Whether they will or not I have no insight - but I think they just bought themselves a future option.<br />
<br />
<h2>
Apple Pay - an Innovation or not?</h2>
I do a lot of work with customers around Innovation. I’m always careful to distinguish between “Invention” and “Innovation”, for they are different concepts. <br />
<br />
Invention is the creation of entirely new things. This typically involves deep research and science. It’s more often the case that the initial releases of inventions are not well tuned to the marketplace. Technology needs to mature, consumers’ minds need time to adapt to new concepts, and manufacturing costs need to drop to affordable levels. Invention is necessary to create a pipeline of new products, but those products are more often than not created from second-generation inventions, where the “newish” but “not entirely new” concepts and inventions are pieced together. <br />
<br />
The <a href="http://www.oed.com/view/Entry/96311?redirectedFrom=innovation#eid">Oxford English Dictionary</a> defines Innovation as “the introduction of novelties; the alteration of what is established by the introduction of new elements or forms.” I like this definition because it succinctly defines what Innovation is and why it isn’t the same as Invention.<br />
<br />
Whilst none of the individual elements of Apple Pay is new, the combination of NFC, Tokenisation, TouchID and Apple Watch’s skin sensors together creates something that has never been seen before. This is certainly novel, it certainly introduces new value, and it looks very probable that we are seeing the establishment of a new model for payments. I think this easily counts as Innovation.<br />
<br />
To see what others think about Apple Pay and its implications, this <a href="http://fortune.com/2014/09/09/apple-pay-what-the-analysts-are-saying/">handy summary by Fortune</a> is very useful. Hopefully my thoughts align with those with only a slightly higher profile than my humble self ;-)<br />
<br />
<br />
<h4>
Update 11th Sept 2014</h4>
<div>
As a result of some useful prompts from @BillyBambrough I have revised the section on Bitcoin in order to better represent that it is more analogous to cash, than to credit cards. This is an important point, because trying to fix Bitcoin to make it work like a credit card would likely destroy what makes it so unique. Perhaps there is a future role for such novel currencies in the same way that we all carry cash in addition to credit cards today. My thanks to Billy for his helpful intervention to sharpen my thinking on this important topic!</div>
Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com0tag:blogger.com,1999:blog-7114254501166964003.post-72707655543298856282014-09-08T12:08:00.001-07:002014-09-08T12:24:08.837-07:00Speculating on an Apple Mobile Payment scheme<p>For those that have been hiding under a stone, tomorrow (Tuesday) is the announcement date for the next iPhone. But it's not the iPhone that interests me (well, ok, the new iPhone interests me a <em>little</em>). There's something much more fascinating that we're going to find out.</p><p>Apple are heavily rumoured to be launching a new mobile payments service as part of the announcement. VISA, MasterCard, American Express and a variety of American banks are heavily hinted to have already signed up. As part of this service, they are said to have agreed to a 0.25% reduction in the 'interchange rate', or transaction processing fee. This is <em>huge</em>, not least because this kind of discount is unprecedented.</p><p>What fascinates me is what happens to this 0.25%, because the answer has <em>big</em> implications for business models, profit and take-up.</p><p> </p><h2>Retailer discounts</h2><p>It might be that the 0.25% is a reduction in the fees that retailers pay for accepting a card transaction. That would be significant because, for retailers operating on often very slim margins, it would mean they were heavily incentivised to support and prefer iPhone payment. This means they might easily be persuaded to make any necessary PoS technology upgrades. They are also likely to publicise their acceptance of iPhone payments. This could have big implications for uptake - something at which all competitors have singularly failed. Very rapidly, we might see Apple becoming the leading mobile payments supplier.</p><p> </p><h2>Increased profit</h2><p>Alternatively, Apple might keep the 0.25% for themselves. What might that mean?</p><p>Well, it might lead to a dramatic improvement in Apple's profit margin. 
A company taking 0.25% of retail transactions is a profit machine like no other. Stock prices would possibly soar.</p><p> </p><h2>Subsidies</h2><p>Alternatively, Apple might keep its profit margin static. Its gross profit margin has been consistently in the mid-30% range over many years, so this is clearly a margin they feel comfortable with. It's not as if Apple is a company in need of cash; it seems to have more of a problem spending the money it already has. </p><p>Instead, Apple might choose to use the 0.25% to subsidise the purchase price of iPhones. Cost is rapidly becoming the defining competitive advantage that competing Android handsets have over the iPhone. If a payments-subsidised iPhone were free or discounted, this might have a <em>huge</em> impact on Apple's market share.</p><p> </p><h2>Disclaimer</h2><p>I have no inside information on what is going on. But it strikes me that a company playing with 0.25% of a large number of retail payments has got to be doing something significant from a business model perspective. </p><p>Tomorrow evening (UK time) I will be looking for hints (or even explicit statements) about the funding and business model behind any new payments service. This might turn out to be much more interesting than any shiny new phones (or even iWatches).</p><p>I might be wrong. There might be other dynamics at play here that I'm unaware of. 
I'm only guessing and I'm fallible.</p><p> </p><p> </p><div style="text-align: right; font-size: small; clear: both;" id="blogsy_footer"><a href="http://blogsyapp.com" target="_blank"><img src="http://blogsyapp.com/images/blogsy_footer_icon.png" alt="Posted with Blogsy" style="vertical-align: middle; margin-right: 5px;" width="20" height="20" />Posted with Blogsy</a></div>Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com0tag:blogger.com,1999:blog-7114254501166964003.post-38487806737485637922014-06-12T05:18:00.001-07:002014-06-12T05:40:25.735-07:00iOS8 for #fintech<p>Apple's <a href="http://www.apple.com/apple-events/June-2014/">announcement</a> of <a href="http://www.apple.com/ios/ios8/">iOS8</a> at its recent developers conference was one of the biggest OS announcements it has made since the original iPhone. Many of the new OS capabilities are aimed at developers, with over 4,000 new APIs, new developer tooling and even a completely new programming language. </p><p>There was so much new content, it's hard to see the wood for the trees. I thought it might be interesting to interpret some of the announcements and ask “what does this mean for #fintech (the world of finance technology and innovation)?” I've pulled together some of the more significant announcements that seemed to me to have some relevance, and provided some commentary below about what they might mean.</p><p>Of course, being new OS function, the new capabilities are currently available in beta for natively coded apps only. Other development approaches (e.g. frameworks that build hybrid apps) will need to be updated to support the new features. But even for those apps, calling out to native code is a possibility to exploit some of the new iOS8 capabilities.</p><h2>TouchID API access</h2><p>This one was obvious, and a biggie. Now that we can programmatically access the TouchID fingerprint sensor, we can use it to authenticate the user in any app. Begone, complex passwords! 
</p><p>API access means that TouchID can play a role in sophisticated authentication scenarios. For example, a bank may choose to retain its existing authentication method for its mobile app, but use TouchID as an additional security mechanism for money transfers. The use of password and fingerprint together is a <em>very</em> secure combination. Apple have used TouchID for convenience. That is a choice, but it can also play a role as an <em>additional</em> layer in high security situations. </p><p>The TouchID implementation is quite elegant, because it keeps all the security processing separate from both iOS itself and also your app. It means that apps never get access to fingerprints: the details of those remain locked away in the Secure Enclave. This maintains the user's trust in the integrity of the system and of TouchID itself.</p><p>Of course this is great and has obvious applications in banking, finance and payments. <a href="http://www.tuaw.com/2014/06/06/paypal-already-exploring-integrating-touch-id-into-their-mobile/">PayPal are already investigating TouchID use</a>, and I think it's a fair bet that it will start cropping up in many, many apps once iOS8 ships. </p><h2>Widgets</h2><p>The ability to place widgets in Notification Centre brings many possibilities.</p><p>For example, what about a widget that shows your account balance, maybe with TouchID authentication for the super paranoid? I'm not sure we need authentication though - many banks happily send account balances and more over unsecured SMS channels. Either way, the ability to embed function outside of a specific app brings all sorts of possibilities for convenient access to information.</p><h2>Inter-app communication</h2><p>The announcement of 'actions' that allow function from one app to be accessed in another brings some intriguing possibilities. Apple demonstrated a Bing language translation inline within a Safari web page - very impressive. </p><p>How might the #fintech world use this? 
Well, the data we have is often transaction data - so function that works on that is the most obvious choice. Maybe a third-party service that translates the sometimes cryptic transacting party name into something more readable? (I used to have a regular debit to "BDMLConnect" - which turned out to be Admiral Insurance after I disputed its authenticity). Or maybe there are opportunities to integrate with third-party 'spend analysis' services? Looked at another way, a banking app could expose a currency calculator, so when you see a price in another app (or website) the bank could translate that into your chosen currency's value.</p><h2>Virtual Currencies</h2><p>Apple garnered a fair degree of criticism for banning Bitcoin apps from its App Store. However, at WWDC it updated its <a href="https://developer.apple.com/appstore/resources/approval/guidelines.html">App Store Guidelines</a> to state “Apps may facilitate transmission of approved virtual currencies provided that they do so in compliance with all state and federal laws for the territories in which the app functions.” Exactly what this means is unclear, but many are betting this deliberate change signals a softening of the fruit company's stance towards Bitcoin and its ilk.</p><h2>Swift and Playgrounds</h2><p>The introduction of an entirely new programming language was very unexpected. However, <a href="https://itunes.apple.com/us/book/swift-programming-language/id881256329?mt=11">Swift</a> looks like a well thought out language with some clever attributes. It has the readability and simplicity of JavaScript, but the robustness and speed of Objective-C. In fact, Apple says Swift is <em>faster</em> than Objective-C.</p><p>However, the big thing about Swift is how much faster it is to write code in Swift than in other languages. 
Within a day of Swift being released, a <a href="http://techcrunch.com/2014/06/04/a-developer-cloned-flappy-bird-using-apples-new-programming-language-swift-in-a-matter-of-hours/">clone of the famous Flappy Bird game had been released</a>.</p><p>Swift is impressively easy to learn. Compared to Objective-C, it's a massive increase in simplicity. Where Objective-C is clever but complex, Swift is simple and straightforward. There is also much less 'boilerplate' code in Swift. New programmers are going to be effective much, much quicker. And less, simpler code equates to more maintainable code with fewer bugs.</p><blockquote><p>“The line of code that's the fastest to write, that never breaks, that doesn't need maintenance, is the line you never had to write.” — Steve Jobs.</p></blockquote><p>Equally impressive is the new interactive 'playgrounds' feature for Swift that's built into the Xcode tooling. A playground interprets and runs code as you write it. Write 'println("hello world")' and you see 'hello world' immediately. Write a line of code to draw a shape, and see that shape immediately. Or view a timeline of a variable, seeing how it changes over time in a 'for' loop. This immediacy and interactivity makes Swift a dramatic breakthrough in simplicity. Rather than waiting to compile and run your code to see the results, you see those results as you type. Xcode and Swift promise real productivity breakthroughs. Quicker coding and more reliable code are benefits to <em>all</em> domains, not just #fintech.</p><h2>New WebKit framework & API</h2><p>Some portray the position of HTML-based and hybrid apps as some form of competition against Apple-backed native apps. However, I think that is an entirely misguided characterisation. It's often forgotten that Steve Jobs originally <a href="http://www.cultofmac.com/125180/steve-jobs-was-originally-dead-set-against-third-party-apps-for-the-iphone/">resisted the provision of native app development, instead pushing HTML apps</a>. 
Apple is nothing if not pragmatic on this point - if they wanted to ban hybrid apps, they could easily have done so; but they have not. </p><p>Hybrid apps were made possible through Apple's UIWebView API, which allows embedded web content in a native app. <em>All</em> hybrid apps use this API and are completely dependent upon it. One of the App Store rules is that web content can only be displayed with this official API, which is why even Google's Chrome browser on iOS uses Apple's web rendering engine, rather than Google's.</p><p>However, there have been some significant issues with the UIWebView API. Firstly, it does not support the more advanced JavaScript engine that Safari uses, relying on an older and much slower implementation. This means that all hybrid apps are slower than Apple's Safari. This includes Chrome, which is forced to use the old, slow JavaScript engine. </p><p>The second problem with UIWebView is the bridge between native Objective-C and JavaScript in the web view. Typically there is a need to pass data from the native side of a hybrid app into the UIWebView control and pass results back again. There's a standard bridge into the web control, but none to receive the results of a JavaScript call back into Objective-C. Instead, hybrid frameworks are forced to use a giant hack that relies on intercepting the user's web taps and response HTML. It works, but is very messy.</p><p>In iOS8, both of these problems are fixed. Apple is introducing a completely new web API that supports both the latest JavaScript engine and returning the results of JavaScript calls back to native code. It's a big improvement and puts to rest the lie that Apple is anti this style of app. The new API also supports smooth 60fps hardware-accelerated scrolling, so the stuttery scrolling that made many hybrid apps appear so clunky is now banished. 
I think this new API will make it much easier to embed web content into native apps, and to do so in a way that is imperceptible to users. Apple is making web content a first-class citizen. Sadly the new web API will require some engineering work in the hybrid frameworks to switch over to it, so we might not get to see the advantages immediately, but they will come.</p><p>The new improvements to web support don't mean that all apps should use HTML and JavaScript. Native code still executes an order of magnitude faster (which might be important not just for speed, but also for restraining battery usage), the development tools have some advantages (e.g. Apple has specific tools for diagnosing animation framerate issues and memory leaks that are much harder to identify with web content) and there is a very rich ecosystem of developers and experience. New innovations like Swift help keep native code a great choice.</p><p>The reality is that native code has many advantages, mainly in UI consistency, ease of adopting new OS capabilities, speed and easy exploitation of the very rich native app development ecosystem. However, hybrid apps also have their advantages, particularly in corporate environments where the reuse of componentry across web, iOS and other mobile devices is important to support efficient delivery of the same function to multiple environments.</p><p>IMHO it's not about native <em>or</em> hybrid/HTML technologies, but about the appropriate use of both to build the best user experience. Many of Apple's own apps use embedded WebKit HTML content - iBooks uses it to render the text of books, Mail uses it to lay out HTML email and even Messages uses it to render conversations with your friends. An intelligent use of native and HTML technologies, where the choices are guided by user experience rather than religious zeal, is the best approach. 
In my experience, good mobile developers don't talk about native versus hybrid apps; they talk about the appropriate use of all of the tools available to them, to best solve the user experience challenge that's presented. Anyone who starts the conversation with an assumption that there is only one way is best ignored.</p><h2>Interior positioning</h2><p>iOS8 brings a new interior positioning capability. Just as GPS allows navigation outside, Apple now allows clever use of the M7 motion processor and RF signals to report position when the user is <em>inside</em> a building. This is important, because GPS needs line-of-sight to a satellite, which is impossible when there's a ceiling above you and walls around you. In iOS8 we have interior navigation, as well as exterior navigation. </p><p>This means that maps and location-tracking of users inside buildings are now possible. Maybe this could be used to help guide a customer to an ATM in a shopping mall. Or to help guide the visually impaired to the right place in a bank branch. Or a payment app might use this capability to guide customers to receipt points in a store. I have to admit it though, I <em>really</em> want the real-life Pacman game that I'm sure <em>someone</em> is going to make.</p><h2>Per-app battery drain information</h2><p>The addition of information about per-app battery usage is a great feature to help users diagnose apps that are causing battery issues. It's been suggested that the Facebook app has at times been a particular issue, but this has been hard to track down, until now.</p><p>That is now fixed, with simple data being reported on a per-app basis for all to see. And it makes it very obvious to users which are the quality apps and which are the poorly coded battery hogs.</p><p>This means we need to be more conscious of battery usage. 
An app that churns the battery, has runaway code, makes poor use of background resources or leaves the GPS turned on when not being used will now be obvious to the user. Users are very likely to simply delete apps that show up as battery hogs, so developers will need to make sure they aren't at the top of the list of battery eaters on customers' devices.</p><h2>Location-based app suggestions</h2><p><a href="http://appleinsider.com/articles/14/06/03/apples-ios-8-uses-ibeacon-tech-brings-location-aware-app-access-to-lock-screen">Location-based app suggestions</a> are a great way of making users aware of apps that add value at a particular physical location. A discreet outline of a suggested app icon is placed on the lock screen when a user is in the vicinity. So, when a customer is near a bank branch, the bank's app might be suggested on the lock screen. </p><p>What is interesting is that the suggestion is made even if the app isn't installed, so this capability might act as another way of encouraging users to download #fintech apps, in the right circumstances.</p><p>Exactly how Apple makes these suggestions isn't completely clear yet - I suspect it's using the same underlying location-based app popularity algorithms as the App Store “Near Me” tab. If this is true, and app usage at a geolocation is the key, it implies some interesting possibilities. For example, <a href="http://www.flypay.co.uk/">Flypay</a> is a mobile payment service used in <a href="http://www.wahaca.co.uk/">Wahaca</a> Mexican restaurants in the UK. If Flypay is the most popular app in those restaurants (possible) does this mean it will pop up on the lock screen of all iOS8 users in Wahaca restaurants? If it does, this could be an incredible way of generating interest in #fintech apps.</p><h2>CloudKit</h2><p>CloudKit was a big surprise. Apple has never been known for its cloud capabilities, so to see them offer cloud services to developers raised a few eyebrows. 
CloudKit is similar in concept to many of the existing “back end as a service” (BaaS) options like Parse (now owned by Facebook), BlueMix (IBM), and many others. These services provide many of the server-based features that mobile developers need, but without the usual complexity of finding a hosting provider, provisioning an OS, installing databases and middleware, etc. </p><p>CloudKit and its competitors provide things like pre-configured server-based databases to centrally store information, push-messaging servers, etc. This can dramatically simplify app development and means that you don't need to worry about all the server technical gubbins that used to distract you from building the actual app.</p><p>CloudKit isn't as rich in breadth as something like IBM's BlueMix, but does have some unique Apple twists that have some intriguing possibilities. It uses AppleIDs as user accounts - and allows some aspects of discoverability. For example, it's possible to find out who else in a user's address book is also using your app — assuming those participating allow sharing. This implies function that we would normally look to Facebook for. This is interesting, because I know quite a few people who don't use Facebook, but do have AppleIDs. I wonder if there might be a class of app that uses AppleIDs <em>and</em> Facebook to get a more complete social view? It's fascinating that Apple isn't building a Facebook, but might be allowing others to do so by leveraging the CloudKit APIs.</p><p>Augment this with some sophisticated change-notification and delta-download capabilities, and it feels like CloudKit might enable some new types of function. </p><p>Obviously CloudKit is very Apple-centric, so other BaaS services are much more appropriate in multi-device situations. And those other options have many more capabilities. But CloudKit does hint at new possibilities that are unique to the Apple ecosystem. 
Whilst many #fintech companies need to be cross-platform, maximising the attraction of a service on iOS can also be an important business priority. These strategies don't need to conflict; it's perfectly possible to use a cross-platform BaaS option, with CloudKit augmenting where it has unique value. In other words, there is nothing to stop devs using both CloudKit <em>and</em> a competitor to get the best of both worlds. At the end of the day, your choice comes down to business strategy and market focus decisions.</p><h2>Interactive & Silent Notifications</h2><p>Notifications prior to iOS8 were pretty basic, but now you can include actions in them. So, instead of just having an 'ok' button in a notification, you can have a number of buttons that initiate different actions.</p><p>How about a notification that your bank account balance has just gone over your overdraft limit? Instead of 'ok' (huh, how is that ok?), the bank can offer a 'transfer from savings' button in the notification. When the user presses that button, the app is launched and makes the transfer (after using TouchID to authenticate, of course). This prevents the user needing to find the app, find the 'transfer money' option in the app, enter source and destination accounts and choose an amount. Instead, the whole process can be automated, with the user just pressing one button.</p><p>Another interesting feature of notifications (which was actually introduced in iOS7) is the ability to send an app 'silent notifications'. These are invisible to the user and wake the app up in the background, where it can then connect to the server and download data. The silent notification lets the app know there has been an event and there is data to retrieve. </p><p>Silent notifications are perfect for #fintech - i.e. send the app a silent notification when a transaction occurs. 
Then, the app connects silently in the background, downloads the transaction details and stores them, encrypted of course, in a local database. This means the user's device has all the transaction data cached locally and is always bang up-to-date. Such an implementation would make a banking app incredibly responsive and fast - most of them today are quite tedious to use because of the need to download data from the server all the time.</p><h2>Summary</h2><p>The overriding impression I'm left with from iOS8 is one of potential and possibilities. Taking advantage of this potential is going to need some lateral thinking. Quite how the #fintech world can exploit it, only time will tell. Using TouchID is an absolute no-brainer - I will be astounded if we don't see a rash of TouchID-enabled apps. Some of the other features have perhaps subtler but further-reaching potential. There are definite possibilities for those who choose to open their minds.</p><p> </p><div style="text-align: right; font-size: small; clear: both;" id="blogsy_footer"><a href="http://blogsyapp.com" target="_blank"><img src="http://blogsyapp.com/images/blogsy_footer_icon.png" alt="Posted with Blogsy" style="vertical-align: middle; margin-right: 5px;" width="20" height="20" />Posted with Blogsy</a></div>Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com2tag:blogger.com,1999:blog-7114254501166964003.post-81545814064144139152014-06-04T12:55:00.000-07:002014-06-14T09:14:09.368-07:00Why coding is decidedly not a commodity<p>I’ve lost count of the number of times I’ve heard that supposedly “coding is a commodity” in large corporates. In such organisations this has been the mantra over the past few years, supporting a shift of such “commodity” skills to low-cost offshore locations.</p><p>It’s a deeply depressing mantra, because it dismisses an incredible skill as being of little value, and in quite a condescending manner. 
It also creates a massive skills-pipeline issue: how are we supposed to grow management and IT architecture skills if those entering the industry don’t have a chance to cut their teeth on coding? With this approach we are in danger of breeding future management types who have no experience of what software engineering is. I’ve always believed you can’t manage what you don’t understand, so this was never going to work.</p><p>My experience is that the “coding is a commodity” mantra inevitably results in large and bloated teams of average-skilled people (or worse). Because man-day rates are so low, there’s little incentive not to add more people to your project. But of course adding more people rapidly results in a coordination nightmare. And when the focus is on cheap people, it’s hard to get really good people. This is an unpleasant place to be.</p><p>Thankfully this “coding is a commodity” mantra is a bankrupt one. Even if it were true at one point (which I do not believe), things are changing – and changing rapidly. I’m happy to report that, IMHO, we now have ample evidence to put to bed forever the argument that coding skills are a commodity.</p><p>The first piece of evidence we have is the incredible success of Silicon Valley. It’s an over-used phrase, but software really <em>is</em> eating the world. Major corporations like Facebook and Google have forged new and very lucrative business models through incredible software innovations. <br>This software isn’t written by low-cost commodity skills; it’s forged by some of America’s brightest minds. Top universities like Stanford are hunting grounds for the internet companies. Google is well known for only <a href="http://www.google.com/about/careers/lifeatgoogle/hiringprocess/">employing</a> the very brightest. 
Heck, competition for software skills is so intense that many companies entered into illegal <a href="http://www.siliconvalley.com/apple/ci_25824396/workers-tech-no-poaching-case-likely-get-4">‘no poaching’</a> agreements to try to calm down the overheated Silicon Valley software jobs market. Top-class coding skills are a career again.</p><p>But it’s not just the internet companies where this is true. Some of our most cherished traditional industries are being reinvented by software. As <a href="http://www.kpcb.com/insights/2012-internet-trends">Mary Meeker</a> so eloquently puts it, we’re seeing the “re-imagination of nearly everything”.<br></p><ul><li>Encyclopedias are going out of print</li><li>Printed photos are being replaced by on-screen digital equivalents</li><li>Newspapers are being replaced by internet browsing</li><li>Diaries are being replaced by social media</li><li>Scrapbooks are being replaced by Pinterest</li><li>Magazines are being digitised by the likes of Flipboard</li><li>Books are being revolutionised by Kindle</li><li>Music and videos are now more commonly downloaded than bought in an HMV store</li><li>Maps are being replaced by sat-nav, which in turn is being replaced by mapping apps on smartphones</li></ul><p>Everywhere we look, industries are being turned on their head by technology - much of it software. This doesn’t feel like a commodity to me. And it’s not only happening in wacky West-Coast American technology companies; every industry is affected, or will be shortly.</p><p>The second piece of evidence I’d like to call upon is my own personal experience. It seems everyone wants to talk to “the guy that wrote the code”. Jaded corporate types are tired of talking to spin-doctors; they want the real techies. Real techies are generally assumed to be coders. If those coders can spin up a few charts to explain their points, then all the better. 
My experience is that many of them can, and they often do a much more credible job than the professional PowerPoint jockeys.</p><p>Now, more than ever, it seems that people want to listen to the doers, not the professional talkers.</p><p>The third piece of evidence I’d like to discuss is the emergence of a new approach to software development. In the “coding is a commodity” world, projects were often large and sprawling. Large and sprawling implies lots of people, and lots of people implies lots of cost. In such environments it’s perhaps not surprising that the man-day rate was king, because the pennies (or cents) soon add up. </p><p>However, we are seeing an increasing proliferation of a very different software architecture. This is an architecture that supports small, multi-disciplinary teams working together in an agile fashion. Typically they deliver new function in 2–3 months, rather than 2–3 years. Small teams sit close to those who understand the business requirements, taking a multi-disciplinary approach rather than handing off to large, faceless specialist centres. These kinds of approaches are gaining ground and catching the attention of even those who led the charge of coding commoditisation.</p><p>However, the complexity of the large core systems that run many big companies remains. This new approach only becomes viable when that complexity can be hidden and development against it made ‘self service’. The moment we need a meeting with a specialist core-systems team is the moment this new model starts to break. Hiding complexity and enabling self-service development are therefore critical in this new world.</p><p>There’s an increasing consensus around how we make this happen. Typically this requires a combination of:<br></p><ul><li>A set of well-structured Application Programming Interfaces (APIs) that hide the complexity of the legacy function. 
New teams code to these APIs and don’t need to understand anything of the internal structure of the underlying application code.</li><li>A self-service development platform that enables development teams to immediately gain access to the infrastructure and tools they need to build new software systems. This is sometimes referred to as “platform as a service” (PAAS), and for mobile developers the phrase “back-end as a service” (BAAS) is also gaining ground.</li></ul><h2>Application Programming Interfaces (APIs)</h2><p>APIs are clever because they present a programming interface that hides what happens underneath them. Historically we’ve <em>tried</em> to do this, but it has often been tricky to gain the value promised. </p><p>Sometimes this failure was because of a lack of standardisation on the technology, or because the technology was overly complex and difficult to use. Other times it was because the “self service” aspects of the modern API movement were missing. In all cases it was because it was just too damned hard.</p><p>The modern API movement resolves these issues. It brings an industry consensus on a very simple-to-use technology set that supports modern and simple APIs. By moving to a standard technology set that everyone understands, and one that is easy to pick up rapidly for those who don’t, we avoid the complexity that previous attempts suffered from. And the addition of self-service capabilities like discussion forums, code samples, API explorers, etc. means it’s easy for anyone to work out how to use an API without needing to pick up the phone or attend a meeting.</p><p>APIs aren’t just about the API itself - a wider ecosystem is required in order to get the ‘self service’ development benefit. 
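A minimal sketch of what coding to such a REST/JSON API looks like from a consuming team’s point of view; the endpoint, token and response fields are all hypothetical:

```python
import json
import urllib.request

# Hypothetical REST call: an OAuth bearer token goes in the header,
# and JSON comes back. Endpoint, token and field names are invented
# purely for illustration.
request = urllib.request.Request(
    "https://api.example.com/v1/accounts/12345678",
    headers={"Authorization": "Bearer <oauth-token>",
             "Accept": "application/json"},
)
# In a real app: body = urllib.request.urlopen(request).read()
# Here we use a canned response, as the gateway might return it:
body = '{"account": {"id": "12345678", "balance": 1024.50, "currency": "GBP"}}'
balance = json.loads(body)["account"]["balance"]
print(balance)  # 1024.5
```

Everything the developer needs is in the JSON contract; nothing about the legacy system behind the gateway leaks through.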
This only comes when three components are integrated, namely:<br></p><ol><li>APIs that:<ul><li>Adopt the industry consensus on API technology, namely REST (architectural style), OAuth (security) and JSON (data structure)</li><li>Are clear, logical and simply structured - aimed at easy adoption by developers, rather than intellectual cleverness</li></ul></li><li>An API Gateway that provides the run-time infrastructure to support the APIs, including:<ul><li>Applying security policies</li><li>Enforcing volume limits, protecting the underlying service</li><li>Providing data and analytics to help manage and plan the API service</li></ul></li><li>A Developer Portal that includes:<ul><li>Clear self-service documentation</li><li>Code samples in popular programming languages to illustrate the static documentation</li><li>An ‘API Explorer’ that allows developers to interactively play with the API in real time, inputting calls and receiving returns in a way that allows them to rapidly validate their understanding of the API</li><li>Developer forums that allow questions to be asked and answers provided, all recorded for posterity and for the next developer to learn from</li></ul></li></ol><p>It’s common to find API implementations that focus only on (1). Without (2) and (3), it’s not surprising that the authors are disappointed at their failure to encourage a new agile development culture. Only by addressing <em>all</em> of the aspects of a true API, in the way that popular internet companies like Facebook have, can this new world be ushered in. However, when done well, it becomes possible to exploit small, local, multi-disciplinary teams that rapidly deliver new capability.</p><h2>Platform as a Service (PAAS)</h2><p>I’m not the biggest fan of acronyms like PAAS, which often seem designed to obscure rather than inform. However, PAAS has gained some industry momentum, so I use it despite my reservations. 
PAAS describes an important capability that massively simplifies the life of developers: the self-service provision of the software and hardware infrastructure necessary to support their application code.</p><p>As a software developer I need access to things such as an application server that executes my code, a database to store and retrieve data, a user-registration service that provides userid/password capabilities, a messaging service, an analytics service, etc. Building such IT infrastructures can be a complex and lengthy process. I’ve seen big companies take up to six months to do this, during which time the development team is on hold, waiting for the capabilities it needs to start work. </p><p>PAAS services provide a website where a developer can choose the infrastructure services her new application requires. This isn’t about low-level geekery. I don’t choose my operating system or versions of software - I only choose the type of service I require, and all that low-level stuff is hidden. The provision of those services is then automated, with a live development environment being provided within a matter of seconds or minutes. Capabilities like IBM’s <a href="https://ace.ng.bluemix.net/">BlueMix</a> typify this radical new approach to supporting developers. The ability to eliminate all the typical corporate meetings and delays and get working immediately can be a massive boon to productivity.</p><h2>APIS + PAAS = Freedom</h2><p>When we put a well-engineered API together with a responsive PAAS environment, development teams are set free. They become self-sufficient and empowered to do what they do best: build software. In many big companies the delays, complexities and endless meetings are what made development so slow and costly. By removing that cost and delay we make small, local teams an economic reality. 
Small teams that are close to those who understand the business are nearly always more responsive than large and remote teams.</p><p>Software is redefining what is possible in business, and the software engineers who make that possible are far from being a commodity. When an API+PAAS strategy is put in place, it empowers a new generation of developers who, sitting close to the business, are best placed to build solutions that redefine what that business is capable of. And once again the IT industry has a natural career structure that enables those at the top to start their careers by doing stuff, rather than just talking about it.</p><h2>Evidence</h2><p>So how do we know this all works? We have the evidence in front of our eyes. This combination of APIs and PAAS is exactly what companies like Facebook have used to popularise third-party services and apps that build on their capabilities. Because Facebook has such enormous volume, it cannot afford to have meetings with every app developer. Its API is successful precisely because it has been built to be self-service.</p><p>When Facebook updates its platform it doesn't hold a meeting with the tens of thousands of developers that interface with it. That's clearly impossible, so it has engineered a solution that makes those meetings unnecessary. This is a pretty powerful concept - thousands of apps interlinked without any coordination, discussions or meetings. 
If Facebook can do this, why can't a bank, an insurance company or a retailer?</p><p> </p>Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com2tag:blogger.com,1999:blog-7114254501166964003.post-50234086230933031132014-05-31T04:17:00.001-07:002014-05-31T04:25:09.739-07:00The shifting sands of careers<p>I remember careers advice at school. One of our tasks was to fill in a form indicating our likes, interests and skills. This form was then processed by one of the early computers (hey, this was back in the '80s). Some weeks later, we were presented with a set of suggestions for potential careers. I distinctly remember two of the suggestions given to me: transport planner and astronaut.</p><p>A transport planner would probably have had me working for British Rail at the time, so that was a good escape. Quite how I was supposed to become an astronaut, living in that well-known space-pioneering country England, was not explained to me. Needless to say, the impression these recommendations made on an 18-year-old student was not of the positive variety!</p><p>More realistic career suggestions would have taken one of two forms. Careers at the time were heavily influenced by the concept of gaining entry to “one of the professions”. To become an Accountant, a Banker, an Engineer, etc, was considered a rite of passage for the middle classes and a life-changing achievement for the lower classes. Either way, this was what you were supposed to do. As an early-stage computer hacker I was probably something of an enigma to the careers master, who knew little about either computers or the careers they were going to create. 
Looking back, my disdain for his advice was probably obvious. </p><p>It occurred to me today that the world of careers advice must be dramatically different from that of my day.</p><p>In my day the purpose of careers advice was to help you find a “job”, where “job” meant working for someone else. I'm not so sure that working for someone else is how I will be advising my daughter in a few years' time. It's an option, but only an option. </p><p>Today we have tools that empower us to take on creative tasks that only five years ago were restricted to big companies. Whether it be movie producer, musician, programmer, designer or author, both the tools of production and the tools of distribution are rapidly being democratised.</p><p>In an age when cameras that record 4K video (a higher resolution than most movies you've watched) are available for only £1299 (Panasonic GH4), and the software used in Hollywood for movie editing (Apple's Final Cut Pro) is only £199, we can all be movie producers. Sites like YouTube and Vimeo ensure we can distribute, and even make money from, our creative attempts.</p><p>It's the same in virtually any creative field; for a couple of thousand pounds it's possible to acquire a pretty reasonable hardware and software setup that allows the creation of professional music, apps, artistry or design. And self-publishing through app stores, online book stores, YouTube, SoundCloud or any number of alternatives ensures your work can become visible and earn money.</p><p>My wife recently registered a company: £15 and 10 minutes online is all it took. She's been inundated with banks asking her to open an account ever since. Even the tools of business itself are being democratised, it seems.</p><p>I don't have any illusions about the difficulty of making a realistic income from these sources; it's tough. However, I do find it amazing how easy it has become to create and sell content that rivals the highest available quality. 
This is no amateur hour: for a modest investment you can now get top-notch quality.</p><p>When advising a young person on careers today, it seems to me that “entrepreneur” is a very realistic career choice. For young people with no financial commitments and an idea, why would they not take advantage of this? Try whilst you can, I say. If you fail, well, you've probably learnt more, and faster, than most of us in regular 'careers'. Initiative, business understanding, an ability to learn and recover from failure, a creative drive: these skills are highly valued in most big businesses. </p><p>I think we might be the first generation watching young people seriously question the “working for someone else” ethic. </p><p>Would I recommend every young person go it alone? No. But I would recommend that anyone who has an idea or interest, and is motivated to do something tough, give it a try. You're only young once, so grab whatever chances come your way. </p><p>So how will regular companies deal with this change and attract young people in the future? Responsibility, trust, empowerment, honesty, authenticity: these are the words that all company cultures need to learn. Are they words that describe those cultures today, I wonder? Somehow I doubt that they are in many cases. Sure, the mission statements and executive presentations will say otherwise, but the truth is often somewhat different, as Dilbert has so accurately observed.</p><p>This implies to me that things will need to change. They will need to change not only because the expectations and requirements of employees are shifting, but because all companies need the skills and ideas that inspired and motivated young people can bring. 
Can the Dilbert culture of traditional companies survive, I wonder?</p><p> </p><p> </p>Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com0tag:blogger.com,1999:blog-7114254501166964003.post-26615295336732710032014-04-05T13:55:00.000-07:002014-04-05T13:56:39.319-07:00Privacy and Big Data<p>This has been a hard blog post to write. I’ve drafted it several times, each time pausing before posting and then dismissing the draft a few days later in frustration.</p>
<p>It’s hard to tread the right line when discussing privacy. On the one hand it’s easy to come across as some form of rabid revolutionary. On the other, the significance and enormity of some of the implications are too easy to downplay. I hope I’ve got the tone right; it’s an important topic.</p>
<p>My interest in this topic started with a series of talks I gave about Big Data and Cognitive Computing. For those not of a technical persuasion, these techno terms mean:</p>
<ul>
<li>Big Data is the technology that allows massive volumes of data to be collected and processed economically.</li>
<li>Cognitive Computing is an emerging technology that aims to build computer systems that better emulate the human brain, in some ways appearing to ‘think’ like we do. By necessity, a computer that appears to think must know a lot, so Big Data is an underpinning technology in cognitive systems.</li>
</ul>
<p>I had some great feedback on the talks, but time and again the discussion they provoked rapidly focused on the issue of privacy. Or rather, on the risk that these technologies will rob us of our privacy. The reaction was consistent and strong; people are concerned about a loss of privacy. They view it as something of importance and do <em>not</em> think it’s OK to give away their data. This reaction got me thinking and researching. What I found perplexed me and led to this blog post.</p>
<p>I guess in the technical world we’ve become used to the fact that our data is being exploited by others. But the penny dropped for me only recently that most people do <em>not</em> realise how much data they are giving away or the way it’s being used. When I point this out to them, they are almost always shocked. It’s worrying that private data should be collected and used without the knowledge or permission of those whom the data is about.</p>
<p>So, what have I noticed that concerns me and my audiences?</p>
<br>
<h2>Security Agencies</h2>
<p>The revelations around national security agencies spying on their own public are well documented. </p>
<p>If we heard that governments were opening all of our snail mail and reading it, there would be justifiable outrage. But from what we hear of their interception of electronic mail, browsing history and messaging, there seems little real difference. In fact, I might consider electronic surveillance <em>worse</em> than the interception of physical mail, because of its insidious nature.</p>
<p>Technology has made this possible on a vast scale and with near invisibility. But just because this is possible doesn’t mean it is right. It’s always possible to argue that collecting more data means security agencies can better target their efforts; but does this mean there should be no limits on their data-collection powers? I don’t think so. </p>
<p>Of course I expect security agencies to spy on those suspected of wrong-doing. But to collect data indiscriminately on <em>all</em> of us brings uncomfortable parallels with the East German <a href="http://en.wikipedia.org/wiki/Stasi">Stasi</a>. In 1978 a young Englishman moved to Berlin. Fifteen years later, after the fall of communism, Timothy Garton Ash returned to look at a file that had been compiled on him by the Stasi. It contained a meticulous record of his life in Berlin, and he recounted this story in <a href="http://www.amazon.co.uk/File-Timothy-Garton-Ash/dp/1848870884/">‘The File’</a>. Is this our future as well?</p>
<p>I dislike this idea, not because of some idealistic world-view, but because I think it has the potential to be a threat to even the most innocent. </p>
<p>There’s a psychological phenomenon called “<a href="http://en.wikipedia.org/wiki/Confirmation_bias">Confirmation Bias</a>”: we all have a tendency to interpret information in a way that confirms our existing beliefs. We might think we’re objective, but our brains are hard-wired in a way that often makes us far from it. We have a tendency to ignore facts that refute our beliefs, focussing instead on those that appear to support them.</p>
<p>Confirmation Bias can be particularly dangerous when there is a lot of information at hand. This is because it becomes easier to assign guilt to innocent people by selectively choosing fragments of information that support a rogue theory. If you think this unlikely, it’s worth studying some real case studies of the <a href="http://boingboing.net/2014/02/09/a-reason-to-hang-him-mass.html">wrongful imprisonment of innocent individuals</a> precisely because of confirmation bias. </p>
<p>Security agencies even appear to have <a href="http://www.bbc.co.uk/news/technology-26367781">“private” webcam images</a> from millions of citizens. Exposing yourself to a webcam arguably might not be the best idea in the world, but it’s certainly not illegal. </p>
<p>If you capture enough private data then it becomes inevitable that you’re going to find something potentially incriminating on quite a large proportion of the population – whether that be pornographic webcam images, tittle-tattle about others, private admissions of minor motoring offences or the download of illegally ripped music. The low-level transgressions that many otherwise innocent citizens commit are probably quite extensive.</p>
<blockquote>
<p>“We are all capable of believing things which we know to be untrue, and then, when we are finally proved wrong, impudently twisting the facts so as to show that we were right.” George Orwell</p>
</blockquote>
<p>Even completely innocent activities can, when a culprit is being sought, take on new and sinister implications. A photograph of a smiling man with a bowl of mince and a meat cleaver might be an image of an amateur chef, or a sinister picture of a mass-murderer. Support for a fringe politician might be a sign of innocent eccentricity, or of a political extremist with violent tendencies. If you suspect guilt, then these items might be used to confirm it – but they might just as easily be of no consequence whatsoever. </p>
<p>Having too much information risks making potential criminals of us all. It is not safe to assume “I’ve got nothing to fear because I’ve done nothing wrong”.</p>
<blockquote>
<p>“If you give me six lines written by the hand of the most honest of men, I will find something in them which will hang him”, Cardinal De Richelieu.</p>
</blockquote>
<p>As I write, shocking revelations about the <a href="https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/287030/stephen_lawrence_review_summary.pdf">corrupt practices</a> around the dreadful Stephen Lawrence case are emerging. It appears that something more than just individual rogue officers has been at fault over a period of decades. With such a backdrop, it’s hard not to be suspicious of assurances from those in positions of power. </p>
<p>As citizens, we have a duty to ask what data is being collected, under what circumstances it can be used, by whom and, most importantly, <em>with what independent oversight</em>. For without that independent oversight, how are we to believe anything we are told?</p>
<p>That is not to say that targeted data collection, with appropriate oversight and controls, can’t be justified. But who controls this and decides what is appropriate and what is not? So far the revelations I’ve read do not appear to have been balanced by any form of democratic oversight that ensures what is being done is appropriate. The answers to questions about oversight appear confused and murky. In such circumstances we have to assume there is ineffective oversight, or rather no oversight that we have any visibility of or influence over. If such oversight isn’t visible and we can’t influence it, then it’s not effective.</p>
<br>
<h2>Public Health records</h2>
<p>There’s an initiative to pool anonymised public health records in the UK called <a href="http://www.nhs.uk/NHSEngland/thenhs/records/healthrecords/Pages/care-data.aspx">care.data</a>. The idea is that this data can then be analysed for the benefit of citizens. On the surface this seems a noble endeavour. </p>
<p>However, there are a number of issues with this initiative, namely:</p>
<ul>
<li>Whilst the data is anonymised, it’s possible people might be re-identified by matching those anonymised records with other data.</li>
<li>It’s proposed that data will be sold to certain qualifying third parties, for example drug companies. This raises the question of who will have access to our (anonymised) health records and what they might do with them (see previous point).</li>
<li>Although there was a half-hearted attempt to inform the public, this was so badly managed that it ended up being via a leaflet distributed with junk mail. Most of us dumped it in the bin without noticing what it was.</li>
</ul>
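<p>The first concern, re-identification, is worth making concrete. Here is a toy linkage attack in Python; all the data is invented, but the join is exactly the technique real re-identification studies rely on:</p>

```python
# Toy linkage attack: "anonymised" records re-identified by joining on
# quasi-identifiers (postcode district, birth year, sex).
# All data below is invented for illustration.
anonymised = [
    {"postcode": "SW1A", "birth_year": 1971, "sex": "F", "diagnosis": "asthma"},
    {"postcode": "LS2",  "birth_year": 1984, "sex": "M", "diagnosis": "diabetes"},
]
public_register = [  # e.g. an electoral roll or a marketing list
    {"name": "Jane Smith", "postcode": "SW1A", "birth_year": 1971, "sex": "F"},
]

def key(record):
    """The quasi-identifiers shared by both datasets."""
    return (record["postcode"], record["birth_year"], record["sex"])

names = {key(person): person["name"] for person in public_register}
reidentified = [(names[key(rec)], rec["diagnosis"])
                for rec in anonymised if key(rec) in names]
print(reidentified)  # [('Jane Smith', 'asthma')]
```

<p>No names or NHS numbers were ever in the “anonymised” data; the combination of a few innocuous attributes was enough.</p>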
<p>However, the theoretical concerns around care.data ended up being overtaken by real-life events when it emerged that similar hospital data had been sold to a consulting company, which had then <a href="http://www.theguardian.com/society/2014/mar/03/nhs-england-patient-data-google-servers">uploaded it to Google’s cloud servers</a>, based in the USA, for analysis. </p>
<p>Now, I work for an IT company that strictly forbids the use of such third-party cloud services because they are deemed a privacy risk to its business. And given recent NSA revelations, we have to assume that this data, now that it exists in the US, is in that government’s hands. I have no idea what they might do with it, but it’s available.</p>
<p>There’ve been other troubling revelations about how our health data is being treated. For example, this same hospital data was <a href="http://www.telegraph.co.uk/health/nhs/10659147/Patient-records-should-not-have-been-sold-NHS-admits.html">sold to life insurance actuaries</a> so that they could calculate the probability of death following a particular hospital procedure. The purpose, of course, was to “better” calculate health insurance premiums. The custodian of the data, the Health and Social Care Information Centre, has admitted that this particular sale broke its rules and it is “investigating”. </p>
<blockquote>
<p>“patients’ medical records contain secrets, and we owe them our highest protection. Where we use them – and we have used them, as researchers, for decades without a leak – this must be done safely, accountably, and transparently.” Ben Goldacre</p>
</blockquote>
<p>As <a href="http://www.theguardian.com/commentisfree/2014/feb/28/care-data-is-in-chaos">Ben Goldacre so ably argues</a>, this saga of institutional incompetence (for that’s what it surely is) is so troubling because the value of using analytics on our health data for the public good is so great. To see this value so comprehensively undermined by a loss of public trust is indeed upsetting. Let’s be honest here: if public trust is lost, such initiatives rapidly become politically impossible.</p>
<br>
<h2>Social Media</h2>
<p>My third area of concern over privacy is the way that various Social Networking and large Internet companies are grabbing our data and using it without our knowledge or permission.</p>
<p>The lack of knowledge or permission is the critical point here. It’s OK for our data to be used if we give informed consent. But I would contend that very, very few people really understand what data they are giving away on the internet. Nobody is making the collection and use of data either explicit or obvious to users; presumably because they don’t want to scare people off.</p>
<p>Let’s look at some real examples of what I’m talking about:</p>
<ul>
<li>What proportion of Google Mail customers realise that Google is reading the content of all their emails? It’s using that data to build a profile of you and decide what adverts to serve you. Why is this any different from the Post Office opening your letters and deciding what junk mail to include with your post?</li>
<li>Who realises that Google stores all of your web searches, and Facebook all of your Graph Searches, again for profiling purposes? All of those embarrassing medical problems you googled are retained for posterity - did you know that?</li>
<li>Does anyone realise that if you leave your Facebook or Google+ account logged in, Facebook or Google tracks your movement across the web? That means they know what websites you visit, and they build a profile of you based on those websites. This might explain some of the adverts you see appearing on your screens.</li>
<li>Are you conscious that if you take a photograph with your smartphone it probably includes embedded tags giving the latitude/longitude of the location it was taken at? Take a picture at home and send it to someone and you’ve just told them precisely where you live.</li>
<li>Who really understands that every computer in the world has an <a href="http://en.wikipedia.org/wiki/Ip_address">IP address</a> that uniquely identifies it, and that when you visit any Internet location your IP address is provided to that site? Further, it’s trivial to <a href="http://www.infosniper.net">map an IP address to an approximate location</a>. But our web browsers also disclose other information about our computer - the operating system, the browser version, installed plugins, etc. Only a small number of people will have an identical computer configuration in a given geographic location. An IP address along with some intelligent <a href="https://panopticlick.eff.org/browser-uniqueness.pdf">browser fingerprinting</a> can uniquely identify who we are before we’ve consciously divulged <em>any</em> information.</li>
</ul>
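<p>To make the fingerprinting point concrete, here is a minimal, purely illustrative sketch (in Python) of the core idea: hash together a handful of attributes that every browser discloses automatically. The attribute names and values below are invented for illustration; real fingerprinting, as the EFF paper linked above describes, uses many more signals.</p>

```python
import hashlib

def browser_fingerprint(attributes):
    """Combine browser-disclosed attributes into a single stable hash.

    Each attribute on its own is innocuous, but the combination is
    often unique enough to recognise a machine across visits.
    """
    # Sort keys so the same set of attributes always yields the same hash
    canonical = "|".join(f"{k}={attributes[k]}" for k in sorted(attributes))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# Hypothetical values of the kind a browser reveals without being asked
visitor = {
    "user_agent": "Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36",
    "screen": "1280x800x24",
    "timezone": "UTC+0",
    "plugins": "Flash 11.9;QuickTime 7.7;Java 1.7",
    "fonts": "Arial,Helvetica,Times New Roman",
}

print(browser_fingerprint(visitor))  # same machine, same hash, every visit
```

<p>Keep those attributes constant and a site can recognise a returning machine with no cookies at all; change any one of them (install a new plugin, say) and the hash changes.</p>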
<p>Who reads those impenetrable “terms of service” where the real privacy rules of those free services are described? Here’s an extract from Google. </p>
<blockquote>
<p>“Your Content in our Services: When you upload or otherwise submit content to our Services, you give Google (and those we work with) a worldwide licence to use, host, store, reproduce, modify, create derivative works (such as those resulting from translations, adaptations or other changes that we make so that your content works better with our Services), communicate, publish, publicly perform, publicly display and distribute such content.”</p>
</blockquote>
<p>Do you still want to store files in Google Drive knowing that you’re giving away such liberal rights? But how many people actually read and understand this stuff? I’d wager very, very few.</p>
<p>There’s nothing illegal about any of these things. If you agreed to T&Cs without reading them, it’s your lookout. But are these big internet companies really doing the right thing by hiding what they do in long legalese that they know hardly anybody reads and even fewer understand? </p>
<p>Do these companies not have a duty of care to be more open and actively educate us? Their products are so polished, surely they could put a little effort into better communicating their privacy implications? Maybe they could even make it clear what their business model is - i.e. “We’re offering this service to you for free, but in return we will collect the information you provide to us and use it to target adverts.”</p>
<p>By being so obscure in their approach, are these companies doing what’s right or just what they think they can get away with? For I can only assume the reluctance to be open is because they are scared of how customers might react - there can be no other logical reason.</p>
<br>
<h2>The threat of data centralisation</h2>
<p>There’s also a good reason why we need to be aware and alert to how our personal data is collected and used. The more this data is collected and centralised, the higher the risk from the “bad guys”. Even if you’re OK with what Google or Facebook does, when you put lots of interesting data together you present an attractive target for hackers. Many high profile and respectable companies have been hacked and had data like this stolen. It’s <em>never</em> safe to assume that because the organisation hosting the data is “respectable” that nothing untoward will happen. </p>
<br>
<h2>What’s legal or what’s right?</h2>
<p>This is the crux of my concern across security and government agencies and commercial companies. They all appear to be doing what they think they can get away with. Little of what I’ve mentioned here seems to be “what’s right” by citizens. For if it were right, we’d be informed about it and we’d probably be asked permission. There would certainly be effective democratic institutions where we could influence what was happening. </p>
<p>Instead, the collection and analysis of this data is done surreptitiously and in an underhand manner. If you’re a security agency then you need to be underhand to a certain degree - but do you really need to collect data on <em>all</em> of us? Do you really need to gather embarrassing personal factoids on people who will never present a threat? And if you’re not dealing with security data then you have no excuse at all. We might live in a democracy, but when all major political parties support the status quo, how are we to influence this?</p>
<p>Just because something might be possible and legal, doesn’t mean it should be done. </p>
<p>As some of the protagonists of the banking crisis have found, society has a habit of finding ways to punish miscreants, even if they haven’t actually broken any laws. “Fred the Shred” (the CEO of one of the major UK banks impacted by the financial crisis) might not be in jail, but he was subjected to a sustained campaign of public humiliation by the UK press; it’s rumoured that he’s no longer invited to polite dinner parties, his pension is much reduced and he was humiliatingly stripped of his knighthood.</p>
<br>
<h2>But does privacy matter anymore?</h2>
<p>When even Mark Zuckerberg takes to phoning the President of the United States to “<a href="http://www.news.com.au/technology/online/zuckerberg-phones-obama-over-nsa-surveillance-concerns/story-fnjwnfzw-1226854682036">express frustration</a>” with its snooping on private individuals, you know that someone is rattled. </p>
<p>The more revelations we see, the more people start to withdraw. They stop using Facebook, or they reduce the information they share. My daughter’s school puts a great deal of effort into educating its students on the perils of “over-sharing”. Despite Facebook’s success, I know more people who don’t use it (or who use it minimally) than those who are active users. I was told recently of a bank that studied the proportion of its customers with social media profiles; they estimated that 60% didn’t have one and, amongst those who did, the vast majority were inactive. I suspect the silent majority are more concerned about privacy than those within the “tech bubble” realise.</p>
<p>People have a limit to what they will tolerate. If that limit is breached then consequences result. I’ve seen some argue that privacy no longer exists and we should give up trying to pretend it does. My personal view is that this is a broken argument; I’ve never met anyone who really believes this. I’ve met a few people who work in the technology industry who espouse it, but is this really their opinion or are they acting as a corporate mouthpiece to further their career? In other words, if the commercial interests of their employer were deeply intertwined with the protection of privacy, would they argue differently (I think they might)? </p>
<p>Certainly the reaction to my Big Data and Cognitive Computing talks has been unanimous; nobody argued against the need for privacy in the subsequent discussions. Everyone expressed concern.</p>
<p>I am forced to conclude that privacy <em>is</em> still important. At least it should be our choice as citizens; nobody has the right to take it away without our agreement. Governments and some large internet companies seem to be trying to hide what they are doing and surreptitiously make the removal of privacy the norm.</p>
<p>After all, most of us care about the risk of identity theft, and we worry that we might live in a surveillance society. Most of us are inherently private people and care that our medical records are treated with the utmost respect. Life is not just about the efficiencies that might be gained through relaxing privacy. We care about privacy; it’s what makes us individuals. </p>
<br>
<h2>What should be done?</h2>
<p>At this point you might be thinking we need to stop the bus. However, I’m personally of the view that it’s impossible to stop change. The emergence of Big Data and Cognitive Computing technologies is impossible to rewind. There are benefits to the technology for sure. But without a better balancing of the privacy risks, there’s a terrible risk of public dissent. People can stop using social media. People can protest on the streets and force a new political reality. The press can mount very effective campaigns that bring about change. Once a tipping point is reached, things start happening. Despite what some think, the little people run things. </p>
<p>The collection and processing of private data on a large scale is a relatively new practice, so it’s perhaps not surprising that we should first enter a “Wild West” phase. But if this is to be anything other than a flash in the pan, there needs to be much more user control and transparency around how our data is used. </p>
<p>It’s not good enough for commercial companies and government agencies to continue the “trust us, we have your best interests at heart” line; from what I can see, we have good reason <em>not</em> to trust them. There are enough examples of those in positions of power abusing the trust they’ve been given for us to be suspicious.</p>
<p>So what should these data-gathering organisations do?</p>
<p>They need to:</p>
<ol>
<li><p><em>Build a culture that respects customers, that does what is right and not what is expedient.</em> When “doing the right thing” might sometimes be at odds with perceived short-term revenue objectives, such a culture needs to be robust. Robust cultures can only be driven from the very top, so strong leadership that sets the right tone is paramount. A leadership that is dominated by immediate commercial pressures, or one driven from lower down an organisation, is unlikely to be strong enough to counterbalance those pressures. And it’s about leading by doing, not through words. History is filled with examples of companies that have said one thing and done another. Employees pick up on unspoken cues and take the action they <em>think</em> is wanted. If employees don’t see the leadership bolstering a company’s policy with specific and continuous actions, they might start to think the policy is just lip-service and not important.</p></li>
<li><p><em>Consider setting up independent ethics committees to oversee the use of private data.</em> I do mean <em>independent</em>. An ethics committee made up of those whose job is processing that data won’t have the distance to make the right judgement calls. We need ethics committees that are able to say “no” even when that might be against immediate commercial interests. Some level of independent scrutiny might help to counter-balance the strength of expediency that’s so prevalent in both commercial and governmental organisations.</p></li>
<li><p><em>Be open about what you are doing and, wherever possible, obtain informed consent.</em> Those who process data that any reasonable person might consider to be ‘private’ need to make it very clear to the user <em>what</em> data is being collected and <em>how</em> it will be used. Very few internet companies are doing this today.</p></li>
<li><p><em>Give users clear options that give them control of their data and of how it will be used.</em> Default settings should favour privacy over sharing and organisations should sell the benefit of sharing to persuade users to change those settings. It’s not enough to provide hidden-away controls, or verbose privacy policies that no regular person will ever read. Policies and controls need to be accessible, clear, obvious and written in <a href="http://www.plainenglish.co.uk">plain English</a>.</p></li>
<li><p><em>Take security very, very seriously.</em> Too many large organisations have had private data stolen by hackers because they didn’t put a high enough focus on protecting their customers’ data. Those who process private data have a moral <em>and</em> legal duty to keep that data secure. In today’s world, top-class security skills are required to discharge that responsibility. That costs money, but not doing it costs more. Reputational damage alone for a security breach can be enormous, but the current <a href="http://www.bloomberg.com/video/target-had-17m-net-expenses-on-data-breach-fWlIkmYQSmW18qiESRsuFQ.html">bill for Target’s recent hack runs to $61m</a> and is still rising. </p></li>
</ol>
<p>The veteran UK politician <a href="http://politicalgates.blogspot.co.uk/2011/04/ask-powerful-five-questions-tony-benns.html">Tony Benn was famous for his 5 questions about power and democracy</a>. If knowledge (and by implication data) is power, asking these questions of those who collect and process our private data might be illuminating.</p>
<ol>
<li>What power have you got?</li>
<li>Where did you get it from?</li>
<li>In whose interests do you exercise it?</li>
<li>To whom are you accountable?</li>
<li>How can we get rid of you?</li>
</ol>
<p>I think it’s reasonable that those who own data should have influence over those who process it. That means I want it to be easier to understand what data Facebook or Google are collecting on me and how they use it. It means I want to know how government security agencies are democratically accountable and how I can influence those who oversee them. It means I want control over how my health data is processed. Resolution of these issues would be a sign that we’ve exited the “Wild West” phase of data processing, but I fear that in 2014 we are still firmly in the centre of the gold rush.</p>Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com0tag:blogger.com,1999:blog-7114254501166964003.post-20488364678164205152014-01-25T01:32:00.001-08:002014-01-26T01:19:12.721-08:00Remembering my first Mac experienceMy first experience of the Mac was at university. We had a whole lab full of the original 512K Macs. <br />
<br />
These machines had one floppy drive and no hard drive. You booted them from an Operating System floppy, which took an age. If you wanted to use MacWrite or MacPaint you then needed to put that floppy in the drive. Frequently the machine would request the OS floppy back because it needed some bit of code for the OS. We used to spend a lot of time swapping floppies in and out.<br />
<br />
Only this didn’t matter at all.<br />
<br />
Everyone was in complete awe of the Mac. It was unlike anything else at the time.<br />
<br />
Back in the late ’80s there were two types of computer: big VAX machines that took up an entire room and that you experienced through a “green screen” terminal, and PCs. The VAX terminals were slow and the screens were dreadfully blurred, displaying only green ASCII characters. The PCs were almost as bad; they were faster, but were still attached to poor-quality character-based displays. Nobody had heard of the word “font”, as the concept of changing a font size was just as alien as changing the font itself.<br />
<br />
Printers were just as character-based as the displays. Everyone’s printouts looked the same. A word processor might have been more flexible than a typewriter, but the printed output looked the same. <br />
<br />
Both the terminals and the PCs were the ugliest pieces of utilitarian design you could imagine. Massive, heavy, square, industrial-looking things. You could easily imagine some East German communist factory creating those monsters.<br />
<br />
And then there was the “Mac Lab” as it was called in our University. It was populated with machines that were so different from the world I just described, that they seemed to be a gift from an alien species.<br />
<br />
The Macs had these incredible high-resolution black-and-white displays. The refresh rate was insanely high, so they looked and felt like a piece of paper. In contrast, all the other displays flickered away with their eerie green glow.<br />
<br />
The Macs were small, portable, with a handle on the top. They were cute. Someone had clearly designed them to be “human”. The scale was different from everything else, and <em>everyone</em> loved them.<br />
<br />
They had this novel attachment called a “mouse”. We’d never seen one of those before. We moved the mouse and a little arrow moved across this high-resolution bit-map display. Awesome!<br />
<br />
The Mac taught us the word “font” and everyone became crazy for a while. Helvetica, Times New Roman, Bold, Outline, Underlined - the ability to produce a document styled in your own way was such a huge contrast to everything else. And when you printed on the attached “imageWriter” dot-matrix printers, your print looked just the way it did on the screen; either beautiful or slightly wild if you’d got carried away with the font menu. <br />
<br />
The Mac even sounded different. Every other computer beeped in an electronic “make a noise” kind of way. The sounds on the Mac were crafted by a genius. They sounded like notes from a musical instrument - well formed, with rising and lowering pitch. The noise massaged your ears, rather than assaulting them.<br />
<br />
We wrote documents in MacWrite, painted pictures in MacPaint, wrote programs in Pascal. And we loved it. Using a Mac didn’t seem like work. In contrast to all our other assignments using terminals and PCs, these Macs felt like they were from a different planet. <em>Everything</em> was different; the size, the design, the screen quality, the colour of the screen, the high-resolution display, the mouse, the ability to change fonts, the bit-mapped printouts that matched the display.<br />
<br />
When we used a Mac we all knew we were looking into the future. The Mac was such an enormous change, we knew the computing world would be turned on its head. And it was. At my first job after university we were still using DOS PCs and terminals. A couple of years later I moved companies and got a PC running Windows 3.1. To be honest, it was dreadful. If you were used to DOS, Windows 3.1 was attractive. But to a Mac user it was obviously a facade over the DOS that still lurked beneath. Still, it was a start.<br />
<br />
Windows 95 was the first time the PC started to feel like it was really learning from the Mac. Of course the machines themselves were still big and ugly, but the software was catching up. Yet this was 10 years after the Mac had arrived. It took 10 years for the rest of the industry to start to compete. I think that gives some feeling for how far advanced the Mac really was when it launched in 1984.<br />
<br />
The style, sense of design and the cuteness have never been copied though. There was something so human about that first Mac. It’s a puzzle that, whilst all the component parts might have been copied, they never quite add up. <br />
<br />
A set of ideas, crafted into a single whole, created an entirely new vision for the industry. Nobody cost-justified every design decision on the original Mac. It was obvious that everything about it had been created with a single purpose; to make a statement. Things can, and will, be different. This was the start. It doesn’t matter what computer you use today; they are all derivatives of that 1984 Mac. Thirty years later, we’re all Mac users really.<br />
<br />
Oh, one more thing!<br />
<br />
Even the floppy was different. It was rigid, 3.5 inches and fitted in a pocket. All the other machines of the day had "floppy" 5.25 inch monsters that were fragile and difficult to carry around.Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com1tag:blogger.com,1999:blog-7114254501166964003.post-27859551309201026072013-12-29T12:01:00.000-08:002014-12-21T11:45:55.612-08:00Unlucky for some: 13 mobile expectations for 2014<i>Note: I've just posted a new blog entry, a year later, providing some commentary on how these predictions have fared. The new entry is <a href="http://duncan-anderson.blogspot.co.uk/2014/12/reviewing-2014-mobile-predictions.html" target="_blank">here</a>.</i><br />
<br />
<br />
The beginning of a new year is often a time for looking forward and of expectation for the future. I thought it would be fun to jot down my own personal expectations for mobile technology in the year ahead. Note, these are expectations rather than predictions. I'm not trying to crystal-ball gaze 10 years into the future, just clarify the trends we can already see emerging and that will shape 2014. Perhaps this time next year I might review these expectations and mark my homework!<br />
<br />
<h2>
1. Smartphones get smarter, with more real-world sensors</h2>
We often forget the incredible range of sensors in today’s smartphones; the iPhone in my pocket includes a pedometer, compass, accelerometer, GPS, fingerprint sensor, Bluetooth, Wi-Fi, 3G & 4G, camera and microphone. This is a veritable Swiss Army knife of sensors. <br />
But I don’t think the addition of more sensors is finished. Apple’s acquisition of <a href="http://www.primesense.com/">PrimeSense</a>, a leading maker of 3D sensors like those used in the Xbox, might indicate our phones are about to get such capabilities. A 3D scanner for the real world in your phone? It’s hard to imagine what this might mean, but I bet it will be interesting. One thing is for sure: the addition of new sensors is part of the competitive battle in the industry, and smartphones are only going to get better at understanding our world. 2014 means more sensors, more ways of interacting, more ways of understanding the environment and context in which mobile devices are used.<br />
<br />
<h2>
2. Smartphones get smarter, with more artificial intelligence</h2>
Apple’s Siri and Google’s Now both try to anticipate your needs and interact in a natural-language manner. But both of these services are relatively crude when compared to IBM’s <a href="http://www-03.ibm.com/innovation/us/watson/index.html">Watson</a>, a computer that beat the world’s best players of the US general-knowledge TV quiz show Jeopardy! <br />
The journey that Siri started is one that will end with an intelligent assistant that can converse in natural language and knows pretty much everything. And every year will likely see incremental progress towards that goal. I fully expect Siri and Google Now to incrementally improve in 2014 and beyond.<br />
<br />
<h2>
3. Mobile extends to the Home</h2>
<a href="http://www.nest.com/">Nest</a> is reinventing the home control market. Heating control and smoke/CO sensors have never been this stylish and easy to use. I control and monitor my Nest Protect via an app on my iPhone. Nest have launched an <a href="https://nest.com/developer/">API</a> initiative; their ambition is to mobile-enable the home. Heating temperature, CO and Fire sensors are only the start.<br />
Of course home control technology has existed for a long time, so nothing here is new technologically speaking. But what Nest is doing is important; it’s making previously obscure, geeky technology available to regular people. Nest is available in John Lewis (a UK department store), it’s child’s play to install and it is being marketed directly at non-techy consumers.<br />
The timing could be right here, as consumers get more conscious about reducing energy consumption and managing their bills. More intelligent management of the home promises significant energy savings; in the US Nest promises average <a href="https://nest.com/thermostat/saving-energy/">savings</a> of $173/year with their heating control.<br />
<br />
<h2>
4. Mobile extends to the person</h2>
We’ve also seen manufacturers try the “smart watch” concept, but with very little market success so far. It seems to me that the “smart watch” needs to be more than a remote control for the smartphone, so there needs to be a “killer app” that we’ve not yet seen.<br />
What’s unique about wearable tech is that it’s attached to the body, which makes a variety of health sensors possible. <a href="http://www.fitbit.com/uk">Fitbit</a>, <a href="http://store.nike.com/gb/en_gb/pw/nike-fuelband-se/d4d">Nike</a>, et al have proved the concept and market acceptance of wristbands to track movement for health purposes. <br />
I’ve noticed with interest the increasing band of friends and relatives who’ve invested in a FitBit. Conversely, none of my associates have purchased or expressed interest in a <a href="https://getpebble.com/">Pebble</a>, <a href="http://www.samsungmobilepress.com/2013/09/04/GALAXY-Gear-1">Samsung Gear</a>, <a href="http://toq.qualcomm.com/">Qualcomm Toq</a> or <a href="http://www.sonymobile.com/gb/products/accessories/smartwatch">Sony Smartwatch</a>. Too bulky, too geeky, and “what’s the point?” seem to be the general response.<br />
A small, discreet and fashionable device with a multitude of health sensors could provide the “killer app” that existing “smart watches” have been looking for. I wonder if a successful “smart watch” might not be more like a Fitbit on steroids than a Samsung Gear?<br />
<br />
<h2>
5. Fashion and style become increasingly important</h2>
At last tech has got style. Nobody launches a smartphone without an eye to design, style and even fashion any more. This is the reality of mainstream consumer electronics, rather than niches for geeks. Fit, finish and build quality are table-stakes with mobile; manufacturers live in fear of snarky reviews from The Verge. <br />
Things like colour, shape, size, thinness and construction materials are increasingly used to attract consumers and we can only expect more of this in an intensely competitive marketplace. As mobile technology extends into the home and about the person, then we have even more reason to believe that fashion is important.<br />
<br />
<h2>
6. Bluetooth Low Energy (BLE) powers new forms of mobile interaction</h2>
All modern smartphones, be they iPhone, Android, Blackberry or Windows, now support Bluetooth Low Energy (BLE). It’s unrelated to the old Bluetooth, with its pairing and battery-sucking inconvenience. BLE devices can run for weeks or months on the smallest of batteries, and because no pairing is needed, connecting is seamless and usability is greatly improved. <br />
Most of the health tracking wristbands, like FitBit, use BLE to sync with your smartphone in the background. New uses like Apple’s iBeacons allow for micro-location detection, where an app on your phone can perform an action when it detects a given iBeacon signal. BLE also allows detection of proximity - so an app knows not just that it can detect a BLE signal, but can also estimate how close or far away it is from the source of that signal.<br />
PayPal has already launched a <a href="https://www.paypal.com/us/webapps/mpp/beacon?locale.x=en_US">mobile payments service using BLE technology</a>. I have a game on my iPhone that’s a kind of “hide and seek” for smartphones, using BLE iBeacon technology to signal the seeker’s hotness or coldness to the hiding phone. A bar in London’s Shoreditch uses iBeacon technology to <a href="http://techcrunch.com/2013/12/04/ibeacons-used-to-deliver-location-based-access-to-ios-newsstand-publications/">deliver free content</a> to devices in the immediate vicinity. Popular in-store rewards app <a href="http://www.shopkick.com/shopbeacon">Shopkick</a> is integrating iBeacon with its service. <br />
I believe we’re just at the start of a profusion of BLE-based innovation - for both data connection/syncing and micro-location purposes. Given the widespread acceptance by all mobile device and OS manufacturers, BLE, and not NFC, is likely to be the dominant near-field wireless technology.<br />
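<p>For the curious, the proximity estimation mentioned above can be sketched with the standard log-distance path-loss model. To be clear, this is a textbook approximation rather than Apple’s actual iBeacon algorithm: a beacon broadcasts a calibrated “measured power” (its expected signal strength at one metre), and the receiving phone compares that against the strength it actually observes.</p>

```python
def estimate_distance(rssi, measured_power=-59, path_loss_exponent=2.0):
    """Rough distance estimate (in metres) from a BLE beacon's signal.

    rssi               -- signal strength seen by the phone, in dBm
    measured_power     -- calibrated RSSI at 1 metre, broadcast by the beacon
    path_loss_exponent -- 2.0 in free space; higher indoors with obstacles
    """
    return 10 ** ((measured_power - rssi) / (10 * path_loss_exponent))

# Signal exactly at the calibrated 1-metre strength
print(estimate_distance(-59))  # 1.0
# A signal 20 dB weaker is roughly 10 metres away (in free space)
print(estimate_distance(-79))  # 10.0
```

<p>In practice RSSI is noisy, which is why beacon APIs tend to report coarse proximity buckets (“immediate”, “near”, “far”) rather than precise distances.</p>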
<br />
<h2>
7. Processor innovation shifts from performance to low power usage</h2>
Apple’s shift to 64-bit and the ARMv8 instruction set for its A7 mobile processor makes it so powerful that it’s hard to tax an iPhone 5s or iPad Air. No doubt the rest of the industry will follow this direction in 2014. At this point our mobile devices are <a href="http://daringfireball.net/2013/10/the_ipad_air">more powerful</a> than a three-year-old laptop.<br />
I wonder if the next period of processor innovation, after 64-bit and ARMv8, might focus more on battery life than on ultimate performance? Who doesn’t look back at the week-long battery life of old Nokias with fondness? <br />
So far the only phones with a longer battery life have been giant phones with giant batteries, for which the physical bulk limits demand. Some may like "phablets", but I'm averse to the suspicious bulge they create in a man's trouser pocket. Just as Intel’s Haswell processors enabled Apple to tout a 12+ hour battery life for its svelte MacBook Air computers, so a similar focus on mobile processor efficiency could benefit users greatly and be a source of competitive advantage.<br />
<br />
<h2>
8. Innovation emphasis shifts from hardware to software</h2>
I love the hardware, but the truth is that better screens, faster processors and the like make little difference to my use of mobile devices. However, a good app can be transformational. Apps empower my devices to do new things and are the driver behind them increasingly supplanting traditional computers.<br />
I’m talking about Apps and usage scenarios like:<br />
<ul>
<li>Writing presentations and presenting directly from an iPhone or iPad (via a VGA adapter) with <a href="https://itunes.apple.com/us/app/keynote/id361285480">Keynote</a>.</li>
<li>Constructing blog posts (like this one) with a new generation of Markdown-based editors including <a href="https://itunes.apple.com/gb/app/editorial/id673907758">Editorial</a>, <a href="https://itunes.apple.com/gb/app/ia-writer/id775737172">Writer Pro</a> and <a href="https://itunes.apple.com/gb/app/byword/id482063361">Byword</a>.</li>
<li>Performing advanced photographic retouching, including RAW processing, in <a href="https://itunes.apple.com/gb/app/photogene-for-ipad/id363448251">Photogene</a>.</li>
<li>Managing my life with <a href="https://itunes.apple.com/gb/app/clear/id493136154">Clear</a>, the only todo list manager that’s ever got me to use it for more than a week.</li>
<li>Helping my daughter to build a stop-frame animation for homework with <a href="https://itunes.apple.com/gb/app/smoovie-stop-motion-animation/id424224789">Smoovie</a>.</li>
<li>Constructing diagrams for work with <a href="https://itunes.apple.com/gb/app/sketchbook-express/id404243625">Sketchbook</a>.</li>
<li>Organising ideas with <a href="https://itunes.apple.com/gb/app/corkulous/id499778467">Corkulous</a>.</li>
<li>Keeping track of projects on Github with <a href="https://itunes.apple.com/gb/app/ioctocat-github-for-iphone/id669642611">iOctocat</a>.</li>
</ul>
This is not about media consumption; good apps transform mobile devices into very efficient content creators. <br />
It’s notable that many of these apps are from very small or independent developers. Whilst large companies can afford to create and offer free apps as a kind of “loss leader”, these small developers cannot. The ability of a given mobile ecosystem to sustain a viable living for small independent developers is therefore critical.<br />
<br />
<h2>
9. Backend As A Service gets serious</h2>
Mobile apps are remarkably quick to develop. The development time is typically measured in months rather than years. A very respectable app can be developed by a very small team in just <a href="http://fueled.com/blog/how-much-does-it-cost-to-develop-an-app/">3 – 6 months</a>. <br />
But mobile apps are increasingly connected to, and dependent on, large server infrastructures for syncing and access to services and data. These systems typically have much longer development timescales and use different technologies from mobile apps.<br />
This is why many app developers are using “back end as a service” (BAAS) providers like <a href="https://www.parse.com/">Parse</a> (recently acquired by Facebook), <a href="https://www.stackmob.com/">StackMob</a> (acquired by PayPal), <a href="http://www.kinvey.com/">Kinvey</a> (still indie) and a profusion of others. These services vastly simplify interacting with and building server-based logic to support mobile apps. <br />
For big companies entering the world of mobile, the truth is that building a mobile app is trivial; the costs involved are small in comparison with typical corporate budgets. The hard, and expensive, bit is changing the back-end systems to allow access from that mobile app. I think we’re going to see the concepts and approaches popularised by BAAS providers creep into big companies in response to these challenges. <br />
<br />
<h2>
10. Mobile continues to eat the PC market</h2>
With limited funds to spend on technology, many are choosing to stretch their PC upgrade cycle and invest the cash saved in a far more exciting mobile gadget. I can only see this trend continuing. <br />
The simplicity, instant on, long battery life, low weight, silent nature (no fans, whirring hard drives or clickety keyboards) and fashionable styling seem to be winning fans. Those who miss a physical keyboard can augment with a Bluetooth one.<br />
At the same time, new app-enabled ways of doing things are emerging. For example, Editorial on the iPad is such a brilliant text editor precisely because it does <em>not</em> ape Microsoft Word. <br />
By reinventing for the mobile age, rather than copying the way we did things in the PC era, apps are helping mobile to supplant the PC. I wrote this blog on an iPhone on a crowded train, on an iPad on the couch, with the data synced in the cloud - in places where a PC was simply not an option. New ways of doing things, new ways of looking at old problems, rather than replicating the PC, are where mobile is succeeding.<br />
<br />
<h2>
11. The platform wars are over</h2>
It’s a peculiar feature of computing technology that the various platforms attract irrational and intense support. We saw it with Windows, Linux and Mac in the PC era. And we’re seeing it again with Win-Mobile, Android, iOS and Blackberry in the mobile era. <br />
My personal choice is strongly for iOS, but I can see the benefits of different platforms - Android with its cheaper devices and ‘openness’ (whatever that might mean), Blackberry for the keyboard and security-conscious corporate die-hards, Win-Mobile for its plucky upstart “let’s try something different” approach.<br />
What I don’t understand, or appreciate, is those who feel a need to insult anyone who has made a different choice. Sadly, mobile technology seems to be fertile hunting ground for such individuals. But I think the mobile platform battle is over. Both iOS and Android won. The fan-boys should disarm.<br />
Blackberry’s market share is collapsing at a frightening rate - it’ll be a herculean task for them to recover from where they find themselves today. Win-Mobile shows small signs of life, but only in some regions. In important major markets like the USA, it’s essentially nowhere. Murmurs of “alternative” mobile OSes like “Firefox OS” or “Sailfish” remain exactly that: murmurs. <br />
Until Apple has secured the high-end, there is no reason for it to shift focus to lower-priced devices. It’s a little-understood fact that the fruit company is still in its rollout phase for the iPhone, only very recently signing distribution deals with Japan’s DoCoMo and China Mobile (which alone secures access to an incredible 700m subscribers). The opening of massive new markets like these is almost guaranteed to secure healthy iPhone sales for the immediate future.<br />
I see no reason to suspect that we’ll end 2014 with market share statistics wildly different to today; namely Android dominating the low-end, Apple the high-end and everyone else struggling for relevance. <br />
Incidentally, global market share is of little relevance to anyone but journalists, for two reasons:<br />
<ul>
<li>In the USA the iPhone and Android are neck-and-neck for share. In Spain, Android dominates massively. In Japan 3 in every 4 phones purchased are iPhones. And in some, although few, markets Win-Mobile has relevance. So share varies massively by country and region.</li>
<li>The usage of different devices also varies enormously. <a href="http://www-01.ibm.com/software/marketing-solutions/benchmark-hub/boxingday.html">IBM’s analytics</a> consistently report iOS generating three times the online shopping of Android, despite Android having greater device market share. Listening to two colleagues discussing the merits of the <a href="http://www.tesco.com/direct/hudl/">Tesco Hudl</a> as a cheap device to give to kids, I wondered if this might help to explain such discrepancies (kids not being known for their internet shopping prowess). I have yet to see a presentation at work being driven from a Hudl, but iPads proliferate. It’s not what device you have, but how you use it, that matters.</li>
</ul>
<br />
<h2>
12. (Small) Cameras replaced by phones</h2>
For many, the cameras on top-end smartphones are beginning to compete very effectively with cameras as stand-alone devices. We reached the point where megapixel counts on smartphones were sufficient a couple of years ago. With 8+ megapixels there’s really no need for any more resolution. I’m a keen photographer, but don’t feel a need to purchase a carry-around camera when my iPhone’s camera is this good. There are even the <a href="https://www.ippawards.com/index.html">iPhone Photography Awards</a>. <br />
But mobile technology isn’t threatening more serious cameras. Early attempts at merging phone and camera, like Samsung’s Galaxy NX, are very unconvincing. More serious picture taking still requires hands-on control of shutter speed, aperture, ISO, white-balance, etc. And this means lots of physical buttons. Buttons that your fingers can sense blind whilst you’re looking at, or through, the viewfinder. This is not a good fit for a big touchscreen-focussed device. Creative control of depth of field (i.e. those arty photos where only a part of the picture is in focus) requires a sensor many times larger than a smartphone's. The laws of physics dictate that a large sensor needs correspondingly large optics. So we aren't going to see DSLRs being replaced by smartphones any time soon, if ever.<br />
I see mobile cannibalising “point and shoot” cameras, but not DSLRs. Canon, Nikon, Olympus, Sony: your advanced camera sales are safe.<br />
<br />
<h2>
13. Tension between cloud convenience and corporate/government snooping grows</h2>
Consumers are increasingly using cloud services like Dropbox and iCloud. The ease of use and freedom from backup issues is a real benefit. And yet we continue to hear revelation after revelation about how both governments and large companies are tapping into our personal data. This tension will likely remain unresolved, but increasingly obvious, throughout 2014. <br />
Are we happy to be the product rather than the customer, where companies like Amazon and Google give us free or subsidised services in order to get access to our data? Data that they make money from by using it to sell services to other companies? <br />
I don’t think there’s widespread understanding of the nature of the <a href="http://reformcorporatesurveillance.com/">bargain</a> we’re entering into when we Google something (Google retains a record of every search you do). Who really understands that if you leave your Facebook account logged in, Facebook is tracking you as you move around other websites? <br />
It strikes me the current situation is only tenable because most people don’t appreciate what data they are giving away. If they did, would they give their permission? Is the value obtained really equal to the value of the data we are giving away and the associated risks (e.g. identity theft)? I’m not sure it is. There is a big unresolved tension here, right at the centre of the new mobile-cloud-centric technology world we are building.<br />
<br />
<h2>
Blogging Style (15 November 2013)</h2>
One of the motivations of blogging, for me, is to better communicate ideas. I spend my day job working with large corporate companies, where I sometimes wonder if the concept of communication has been lost. Blogging is a way of re-discovering an ability to truly engage with the reader and communicate ideas. It’s a personal journey and one where the ability to write needs to be re-learnt.<br />
<br />
In the business world, reports are often written using a dry and unemotional language that masks their meaning. By failing to engage with the reader, the writer's points are lost; it sometimes feels like the words are being thrown into the empty vacuum of space. For the writer there is meaning, but the reader has to concentrate so hard that they often lose concentration and fail to understand that meaning.<br />
<br />
The sins of business writing, I suggest, include:<br />
<ul>
<li>Long complex sentences that do little to communicate, but everything to tire the reader and hide meaning.</li>
<li>Obscure words and phrases that only a small subset of readers really understand. Often these seem designed to prove the intelligence of the author, rather than to aid the communication of ideas.</li>
<li>Acronyms that create a sense of exclusion to those unfamiliar.</li>
<li>Stilted, dry and formal prose that fails to excite or engage the reader.</li>
<li>A lack of emotion or personal opinion, with everything expressed in a disengaged third-party form. Often I’m left wondering “does the author actually believe this, or are they just going through the motions?”</li>
</ul>
It has become commonplace for business writing to be in the <a href="http://www.quickanddirtytips.com/education/grammar/first-second-and-third-person">third-person</a> format. I do wonder if this has a lot to do with the unreadability of many reports. It has a tendency to detach the writer from the content and leads to a lack of emotion and belief. <br />
<br />
I believe we should be making use of the more personal first-person format, where the writer's opinions and beliefs are more easily expressed. It’s unusual in the business world, but perhaps misguidedly so. We are humans attempting to communicate with other humans; beliefs and opinions are part of that. We are not robots, so why do we write like robots when we get to work?<br />
<br />
What has fascinated me about blogging is its use of a very informal and conversational style of language. It's a style designed to emphasise communication. And it's the way we talk, not the way business people often write. I happen to think it's a much better way of communicating ideas.<br />
<br />
Did you notice the way I wrote that last paragraph?<br />
<br />
Short, punchy sentences. Sentences starting with, horror of horrors, “and”. The use of “I” to express a personal opinion. These and other stylistic flourishes, I hope, make my blog posts a little more readable than the average work document.<br />
<br />
Can we learn from blogging and use these lessons to help us communicate better in the business world? I think we can. I suggest, dear reader, that you consider the following:<br />
<ul>
<li>Make sentences short. If a sentence is more than two lines, it probably needs breaking up. Not every sentence needs “and” and commas in it; be punchy.</li>
<li>Don’t use obscure words or phrases that demonstrate your educational excellence, but which only confuse readers.</li>
<li>Avoid acronyms that aren’t in common usage. If you need to explain what an acronym means, maybe you shouldn’t be using it?</li>
<li>Consider writing as yourself, expressing some opinions and maybe even a little emotion. Do you really need to write in the third-person? The more personal first-person format is better at engaging with the reader on an emotional level. I believe that readers better understand what they are reading when they connect with the author’s emotions, not just their words.</li>
<li>When you feel yourself using “sophisticated” English, consider replacing your choice of obscure words with something a little simpler. The “Plain English Campaign” <a href="http://www.plainenglish.co.uk/">website</a> has a handy word lookup tool that suggests simpler versions of words. I suggest you review the tool; you might be surprised how many words you are using that have simpler alternatives.</li>
</ul>
Maybe we should all write our corporate communications with a little more simplicity and personal conviction. Let's put the robots back in the cupboard and celebrate our humanness!<br />
<br />
<h2>
40 lessons of life (18 October 2013)</h2>
This post collects together a set of things I've learnt and wished I'd known earlier in life. The lessons of life are hard-won and it seems to me that we aren't always taught the right rules when we are young. I've written these lessons down in the hope that others might find them useful. They might better be described as beliefs, in the spirit of at least one of them, which suggests that we all have slightly different perceptions of reality. It's possible you might not agree with them all. Please take them in the spirit in which they are meant: as things that have come to me over time, shared in case they help, not as a set of rigid rules that must be followed. <br />
These thoughts reflect my personal philosophy and approach to life. You may differ from me. It's ok for there to be different approaches and beliefs; it's what makes the human race so dynamic. <br />
So, without further ado, let me start with the first of my 40 lessons…<br /><br />
<h2>
1. Your dreams can't come true if you don't dream</h2>
It seems obvious really, doesn't it? Why settle for mediocrity when, just maybe, you could achieve something great? Dream away and work out what you want to come true. Just make sure that once you have a dream, you think hard about how to make it happen - because the chances are that it's more achievable than you might think.<br />
<blockquote>
“To accomplish great things, we must not only act, but also dream; not only plan, but also believe”– Anatole France, poet</blockquote>
<h2>
2. But don't just dream, do something</h2>
If you have a dream, get out there and make it happen. Don't sit on your hands waiting for something to drop into your lap. It won't, you need to go and take the initiative to make it happen.<br />
<blockquote>
“Often the difference between a successful man and a failure is not one’s better abilities or ideas, but the courage that one has to bet on his idea, to take a calculated risk, and to act”–Maxwell Maltz, cosmetic surgeon</blockquote>
<h2>
3. Better to define yourself by what inspires you, than what frustrates you</h2>
Sometimes we think those little sarcastic comments, funny jokes, etc, are an effective way of handling things that annoy or frustrate us. But sometimes we can overdo it. Nobody wants to be around a cynic. We all love inspirational people, so I try to think positively. For people and things that annoy me, I try to ignore them. Why waste precious time on irritations and annoyances? Instead, I try to focus on the things that inspire. I've learnt not to tweet every annoyance of my day; a tweet stream of petty wifi problems, bureaucratic challenges and PC crashes isn't how I want to be remembered. Finding things to love, focussing on what inspires, and spending energy on the positives in life, rather than the things that deflate, seems to me a far better approach.<br />
<blockquote>
“The glow of one warm thought is to me worth more than money”– Thomas Jefferson </blockquote>
<h2>
4. Do what you love and you'll be successful</h2>
Some decide to pursue riches early in life, entering careers and professions not through love, but because they are lucrative. Beware of such attractions - they have a habit of fulfilling in the short-term, but ultimately lead to an empty soul. Early life decisions set you on a path - make sure it's a path you can live with and that satisfies. I've always found that my best work is done when I love the topic, my worst when I have no interest. Bear this in mind - try to choose careers and jobs that interest and satisfy you. Avoid those that lead to disinterest and emptiness - you're unlikely to succeed at them, regardless of how lucrative they might seem.<br />
<blockquote>
“Instead of wondering when your next vacation is, maybe you should set up a life you don't need to escape from.”– Seth Godin</blockquote>
<h2>
5. Ignore what other people want or expect from you</h2>
The expectations of others can be very limiting - either because they think you should do things that don't inspire you, or because they think you're not capable of something you want to do. I believe we should cast off the expectations and limitations placed on us by others. It's your life, not theirs. Find your inner voice - your dreams, not the dreams of those around you.<br />
<blockquote>
“Do not fear to be eccentric in opinion, for every opinion now accepted was once eccentric”– Bertrand Russell</blockquote>
<h2>
6. There is no such thing as a job for life</h2>
The middle class was built by our Victorian ancestors, many of whom got on in life by being entrepreneurs. The “safe” job didn't exist for them, so they often built their own businesses. Somewhere in the last 200 years we've become anaesthetised to risk and come to rely on safe jobs with large companies. The fact that “a job for life” is a phrase that's sometimes used, says it all. Don't kid yourself; there is no such thing. Increasingly, large companies are restructuring. In a modern Western capitalist society corporations look to their shareholders' interests. The paternalistic leanings of early 20th Century companies are long gone. You are best to remember this - if your employer's priorities or finances shift, their past loyalty to you as an employee can change rapidly.<br />
I've learnt to keep one eye on the future and change - to be prepared for that change and to make sure I maintain skills that are attractive outside of any one employer. Be aware of the possibility of the <a href="http://www.strikemag.org/bullshit-jobs/">bullshit job</a>. If your job isn't really needed, someone will probably notice, given time. If you are doing a job that's not really needed, your skills are probably stagnating. It's best to take corrective action before it's taken for you!<br />
<blockquote>
“The biggest mistake that you can make is to believe that you are working for somebody else. Job security is gone. The driving force of a career must come from the individual. Remember: Jobs are owned by the company, you own your career!” – Earl Nightingale</blockquote>
<h2>
7. It's never too late to start again</h2>
Sometimes we take the wrong path and find ourselves unfulfilled in our choices. Sometimes we make mistakes and regret our choices. We can feel trapped in our circumstances. But it's <em>always</em> possible to start again and change paths. It just takes a little courage and determination.<br />
I try to view life like a series of experiments - if one fails, I try to have the courage to try another. Anyone can learn new skills, find new opportunities and redefine themselves. Stepping away from the day-to-day and asking “what do I need to do to get on the <em>right</em> path” and making it happen, is something worth finding the time to do.<br />
<blockquote>
“So what do we do? Anything. Something. So long as we just don't sit there. If we screw it up, start over. Try something else. If we wait until we've satisfied all the uncertainties, it may be too late”– Lee Iacocca, businessman</blockquote>
<h2>
8. Be flexible</h2>
The best way to keep a job when change happens is to stay flexible. Being open to new ideas, being positive, being helpful, doing what's needed not what's in the job description. People like people who can help them, so helping others achieve is never a waste of time.<br />
<blockquote>
“The measure of intelligence is the ability to change.”– Albert Einstein</blockquote>
<h2>
9. Learning is the foundation of advancement, but can take many forms</h2>
When we need to change paths, obtaining new skills and knowledge are what opens those new paths to us. We can learn through either education or experience. Both are legitimate.<br />
Education is more about training the mind and less about the actual facts or knowledge; I've probably never explicitly used anything I learnt at university, but it did teach me a lot that's been useful in forming my approach to problems.<br />
Experience is more direct; at the coal-face you learn exactly, but only, what you need. Making things yourself, trying, succeeding, failing, learning lessons through the school of hard knocks are some of the most effective ways of learning.<br />
For many, taking 3 years out of work to attend university is difficult - with family, location or economic ties restricting us. Today it's possible to learn in new ways. We can choose how and when we study and learn, fitting around our commitments. Online sites like <a href="http://www.codeacademy.com/">Codecademy</a>, <a href="http://www.codeschool.com/">Code School</a> or <a href="http://www.futurelearn.com/">FutureLearn</a> help you to learn programming. Some of the world's top universities are now offering their courses on online learning platforms like <a href="http://www.coursera.org/">Coursera</a>, <a href="http://www.edx.org/">edx</a> or <a href="http://www.apple.com/uk/education/iTunes-u">Apple's iTunes U</a>. And finding like-minded individuals, meeting up, taking part and learning from a community is also becoming much easier. Online sites like <a href="http://www.blogger.com/meetup.com">Meetup</a> and <a href="http://www.eventbrite.com/">Eventbrite</a> help you to find, take part in, or even build, communities and events.<br />
Only you can decide how it's best to learn, but never feel you can't learn something new or that the old traditional ways are the only ones. No matter your age or commitments, you have options your ancestors could only dream of. Make sure you take advantage of those options. Don't squander the luxury of living in the time that we do, with the options we have.<br />
<blockquote>
“Education is the most powerful weapon which you can use to change the world.”– Nelson Mandela</blockquote>
<h2>
10. Anyone can build a reputation</h2>
Today knowledge isn't quite enough - you need a reputation for having that knowledge. Fortunately, it's easier than ever to build such reputations.<br />
A bit of a techy? Why not contribute to open source projects on <a href="http://www.github.com/">GitHub</a>? <a href="http://blog.linkedin.com/2011/03/08/github-linkedin">Connect</a> your GitHub account to your LinkedIn profile, so people on LinkedIn see your GitHub activity on your résumé. Contribute to technical questions on <a href="http://stackoverflow.com/">Stack Overflow</a> and build an online reputation for solving problems. Get a <a href="http://careers.stackoverflow.com/cv/get-one">Careers 2.0</a> account and let employers find you through your online technical activity on GitHub and Stack Overflow; for programmers, GitHub and Stack Overflow are rapidly democratising the search for technical talent.<br />
Fancy yourself as a bit of a graphic designer? Then why not create some concepts and upload them to <a href="http://dribbble.com/">Dribbble</a>, where aspiring designers gain an immediate platform to advertise their skills and gain jobs. Perhaps you think you're a photographer? In which case why not use the social sharing of sites like <a href="http://www.500px.com/">500px</a> or <a href="http://www.flickr.com/">Flickr</a>, or create a Tumblr blog like <a href="http://kevinruss.tumblr.com/">Kevin Russ</a>?<br />
Have an opinion or want to gain a reputation? Why not start a blog and tweet on your chosen topic - you can build a reputation and following on almost anything, even your favourite <a href="http://www.katespuddings.co.uk/">puddings</a>. An aspiring chef? Why not start a <a href="http://emwilco.wordpress.com/">supper club</a> and build a reputation amongst your local foodie set through Twitter?<br />
It doesn't matter what your interest is, you can become renowned for it in a way our ancestors never could. Why would you not take that opportunity?<br />
<blockquote>
“The most important thing for a young man is to establish a credit… a reputation, character.” – John D. Rockefeller </blockquote>
<h2>
11. If it can go wrong, it will go wrong</h2>
Everything that shouldn't happen has happened, so take care. Chernobyl and Fukushima are classic examples of highly engineered safety systems circumvented by the simplest of things: human stupidity. The operators of Chernobyl turned off critical safety systems while testing the reactor. Fukushima was designed to withstand an earthquake of 8.6 magnitude because anything larger was deemed “impossible”. Fukushima was hit by a quake of magnitude 9.1. If you want proof of why we should never trust the “trust us, it can't go wrong” brigade, read <a href="http://www.theguardian.com/world/2013/sep/20/usaf-atomic-bomb-north-carolina-1961">this</a>. Things happen that should not be possible because humans have a habit of denying the possibility of things they think improbable.<br />
<blockquote>
“It is impossible to make anything foolproof because fools are so ingenious.”– Murphy's Second Corollary</blockquote>
<h2>
12. Admit your mistakes</h2>
Everyone makes mistakes. The best way of dealing with them is to admit they are mistakes and move on. If you don't admit them, you'll carry the baggage and be unable to emotionally learn. Until you admit the mistake it will feel big; once you admit it, it begins to shrink in size and impact.<br />
<blockquote>
“Admit your errors before someone else exaggerates them. ”– Andrew V. Mason, M.D.</blockquote>
<h2>
13. Reputations are founded on authenticity</h2>
Be authentic. Live life in a way that you won't regret. Do and be seen to do the right things. Those who get on in life by doing the wrong things or by hurting others, are <em>always</em> noticed. They build a certain reputation, whether they realise it or not.<br />
It is human nature to be more contemplative in our later years; nobody wants to look back on a life and feel regret. We all intuitively know what is right and wrong, so have the courage to stand up for right and make sure your older self will feel satisfaction and pride, rather than regret, when you look back.<br />
<blockquote>
“The way to gain a good reputation is to endeavor to be what you desire to appear.” –Socrates</blockquote>
<h2>
14. Anyone can build a business</h2>
It takes ten minutes to register a company. We can all build and sell things for virtually no financial outlay. If you're crafty you can sell creations on <a href="http://etsy.com/">Etsy</a>. Most developer tools are free and for only £59 you can distribute your applications globally in Apple's App Store. If your idea needs more substantial investment to make it happen, crowd-funding sites like <a href="http://www.kickstarter.com/">Kickstarter</a>, <a href="http://www.blogger.com/www.indiegogo.com">Indiegogo</a> or even Donald Trump's <a href="http://fundanything.com/">Fundanything</a> provide new ways of gaining that funding. Never have there been greater opportunities for those with a talent or an idea to build a business and income around it.<br />
<blockquote>
“It's kind of fun to do the impossible.”– Walt Disney</blockquote>
<h2>
15. But building a real reputation or business is hard</h2>
A reputation or business of substance takes a lot of work. Many stumble on the path. Reputations might be short-lived or not widely-based. Small businesses often fail or struggle to meet their funding goals. Don't kid yourself it's easy - it's not. Those that succeed report that it's tough and requires a lot of work and perseverance.<br />
<blockquote>
“A dream doesn't become reality through magic; it takes sweat, determination and hard work.”– Colin Powell</blockquote>
<h2>
16. With a lot of hard work and focus, anything is possible</h2>
Doing something amazing might be hard work, but whenever I've dedicated myself and put the effort in, it's paid off. The bane of modern life is our butterfly-like tendency to flit from subject-to-subject. But greatness never happens through flitting; you need to focus on one thing hard and make it happen. Things might not work out exactly how you plan and you need to spot the opportunities to adjust your dream along the way. But with sheer hard work you can take control of your own destiny and surprise your more cautious friends. Every single example of amazing achievement brings with it an equally gruelling story of hard work, focus, dedication and sacrifice. If your dream matters, then investing effort and focussing is the best way to make it happen. There are no shortcuts, no route to success without hard work.<br />
<blockquote>
“Opportunity is missed by most people because it is dressed in overalls and looks like work.”– Thomas Edison</blockquote>
<h2>
17. Inspire others to help you</h2>
We all need help if we are to achieve something great. People who are manipulated, threatened and forced to help do so begrudgingly and with the minimum of effort. Those who are inspired do so with pleasure, enthusiasm and with disregard for the effort involved. One group is many times more effective than the other; I'll leave it to you to work out which.<br />
If you are struggling to get people to help you, consider how you can inspire them, rather than dreaming up ways to manipulate them. If your approach to getting help is to dictate or threaten, you're probably not a leader. A leader's job is to inspire greater things and support people when they struggle. Step back and think about why you might not be inspiring those who you wish to lead. More threatening approaches might work for a short period, but you will surely lose support in the longer term. People have free will and will tend to seek experiences where they are inspired, rather than manipulated, so the manipulative approach is at best only effective in the short term. The inspirational and supportive approach is very different.<br />
<blockquote>
“If your actions inspire others to dream more, do more and become more, you are a leader.”– John Quincy Adams, US President 1825–1829</blockquote>
<h2>
18. Surround yourself with people who believe it's possible</h2>
Achieving things is hard. You need people who support your ambitions and believe in you. Sceptics will undermine the confidence that's so necessary. You won't change the sceptics' opinions. Scepticism is a state of mind, not a reaction to a particular idea. Those who specialise in dreaming up reasons things will fail aren't reacting to your idea; they do this for <em>every</em> idea that doesn't fit their world view.<br />
So, try to surround yourself with optimists, those that see the possibilities and can carry you through the bad times. Don't waste time on the sceptics, the haters and the pessimists - they'll only drag you down. You won't change their minds, so don't waste your effort trying to.<br />
<blockquote>
“The worst enemy of creativity is self-doubt”– Sylvia Plath, poet</blockquote>
<h2>
19. Doers or talkers</h2>
There are those that are good at doing things and those that are good at talking about doing things. As a general rule we tend to be impressed by the latter, when the former is what we really need. It's why the interview process is so hard - by its very nature it tests your ability to talk about doing, not actually do. Of course what we often really want is the intersection of the two groups - those good at doing and good at talking about doing. Unfortunately that group is very small - and it's hard to discern if you're talking to a member of that group, or just a member of the 'good at talking about doing' group. So build your team of helpers in life from the doers group. Try not to be swayed by the talkers; the likelihood is that they can only talk, not do.<br />
<blockquote>
“Talkers are usually more articulate than doers, since talk is their specialty”–Thomas Sowell, economist</blockquote>
<h2>
20. You can't manage what you don't understand</h2>
If you don't understand something, you don't know the right questions to ask, the pitfalls that you could fall into or the alternatives that might exist. You are, in effect, entirely at the mercy of the people you are supposed to be managing - their opinions and biases can go unchallenged simply because you don't know how to challenge them. To manage something you need to first understand it and if you understand it then you stand a fighting chance of being capable of managing it. So before you try to manage something, invest the time to understand it first. Don't kid yourself that managing is possible without that investment. Lots of people do, most of them fail.<br />
<blockquote>
“Never invest in a business you can’t understand”– Warren Buffett</blockquote>
<h2>
21. There are no facts, only opinions</h2>
Once upon a time people considered the world to be flat. This was a fact, pure and simple…until compelling evidence emerged that it was, in fact, an opinion. Anything we consider to be a fact can be reversed if additional evidence comes to light. I'm always slightly suspicious of those presenting an argument with such conviction that they eliminate any possibility for a contrary opinion. A fact can only be so on the balance of probability and current evidence.<br />
<blockquote>
“Everything we hear is an opinion, not a fact. Everything we see is a perspective, not the truth.”– Marcus Aurelius, Roman Emperor 161—180</blockquote>
<h2>
22. Don't reduce your risks to the point where success means nothing; try to do something great</h2>
If your choices are very low risk, then success will have little value. Don't set your life up such that success is meaningless. Strive for something great, so that when you achieve it, it's worthwhile. Even getting halfway up Everest is a great achievement; walking to the shops, not so much.<br />
<blockquote>
“It seems to be a law of nature, inflexible and inexorable, that those who will not risk cannot win”–John Paul Jones, sailor</blockquote>
<h2>
23. Take risks before they become too risky</h2>
To achieve great things we need to take risks. The best time to take risks is when we're young and have fewer commitments. Losing your job or having your business fail while you're living at home with your parents is not a disaster; the same failure when you have a family and a mortgage to support is rather more of one. Whilst we're young we have a golden age in which to take those risks - don't miss those chances.<br />
Failing at something and not bothering to try have the same outcome. The biggest risk you can take in life is to die never having bothered to try.<br /><br />
<h2>
24. Most risks are probably not half as risky as they might seem</h2>
Those who take risks might just succeed. Even if you don't, you've demonstrated a resilience of character, have experienced life in a way that the plodders never have. I've never regretted making a decision that seemed risky.<br />
The personal attributes of those who try, even if they fail, are enormously valued: go-getters, risk-takers, change agents, optimists. Even if you fail, you'll have more experience, knowledge and skills than if you had never tried, so you will likely have enhanced your employability in the process.<br />
In many ways risks aren't half as risky as they seem when it's you that's taking the risk. Think about failure, about what you'll have learnt, and how it differentiates you from others; even in failure you'll be worth more than you are today, so is it really such a big risk?<br />
<blockquote>
“Take risks: if you win, you will be happy; if you lose, you will be wise”–Anonymous</blockquote>
<h2>
25. You can make your risks less risky; there's no need for recklessness</h2>
Read Eric Ries's “The Lean Startup” and apply its principles: test your riskiest assumptions early and cheaply, before you've bet everything on them.<br />
<blockquote>
“Take calculated risks. That is quite different from being rash”–General George Patton</blockquote>
<h2>
26. Sometimes it's just the wrong time to take risks, or it's just not for us; that's OK</h2>
Sometimes it's not the right time for us to change the world because we have other things going on in our lives. There are good reasons for not pushing the boundaries, or not pushing them right now. The point is not that everyone should change things all of the time, but that you should be conscious of what makes you tick. Be aware of your choices and be explicit in your decisions. Whatever you decide can be the right thing. Don't feel you should be someone else; be happy with who you are and make the decisions that fit your personality and will make you happy.<br />
<blockquote>
“You do things when the opportunities come along. I've had periods in my life when I've had a bundle of ideas come along, and I've had long dry spells. If I get an idea next week, I'll do something. If not, I won't do a damn thing”–Warren Buffett </blockquote>
<h2>
27. The really successful don't play by the rules; they change them</h2>
At school we are trained to abide by the rules, to be good citizens and not to dissent. The hard truth is that some of our more rebellious classmates went on to be fabulously successful. That's because, to be really successful, you often need to change the rules rather than abide by them. If you want to change the world you need to be prepared to shake things up, to provoke, not to accept the rules. Our training in school might make for obedient citizens, but it's entirely the wrong training for life, where a bit of dissent is often what's needed to break through fuzzy thinking, conventional wisdom and mundaneness.<br />
Here's the conundrum that nobody ever tells you: society needs obedience to the rules to be stable, but it also needs rebels to push new ideas forward. It can't afford for everyone to be a rebel, because that way chaos ensues, but it also can't afford for everyone to follow the rules, because that way society stagnates. Look at examples of people who've achieved something great - they are often tetchy individuals, obsessed, maniacal, maybe even a little crazy. Steve Jobs was well known for his disregard for convention and rules, a trait shared by many high achievers. Apple's inspirational <a href="http://www.youtube.com/watch?v=nmwXdGm89Tk&sns=em">Think Different</a> ad sums it up for me: here's to the crazy ones, because “those who dare to think that they can change the world are the ones who do”. Nobody told my 16 year-old self that a healthy disregard for authority and rules was a good thing, but it turns out that it is.<br />
<blockquote>
“There are no rules for good photographs, there are only good photographs”–Ansel Adams</blockquote>
<h2>
28. You can't achieve anything by keeping everyone happy</h2>
There is a weird part of our psyche that doesn't want to upset others. But you can't achieve anything if you never upset anyone. Builders often excuse a mess with “you can't make an omelette without breaking an egg”, and they are right. Brave decisions, decisions that are meaningful and create change, are going to upset someone somewhere, so get over it. Learn that doing things offends people, and that very offence is validation that you are doing something meaningful, because if nobody cared, nobody would be offended.<br />
<blockquote>
“To me, consensus seems to be the process of abandoning all beliefs, principles, values and policies. So it is something in which no one believes and to which no one objects.”– Margaret Thatcher</blockquote>
<h2>
29. Leaders are people like us</h2>
You think it's intimidating to speak to a CEO, the Prime Minister (or a President), a celebrity? It's not. They are just people, like us. They were each once a child. They go to the toilet like the rest of us. When they take their clothes off they probably look just as ridiculous as we do. So treat them like ordinary people: tell them when they are being stupid, ask them why, challenge them. Most people in positions of power are surrounded by fawners and, in my experience, often find a new perspective stimulating. And those that don't? Well, they're probably not worth wasting your time on.<br />
We used to be told to respect authority. However, we've seen through a series of scandals how those in authority have a tendency to lose their grip on reality. Every great politician ends up losing their grip on power because they can no longer see how the common person experiences things. Margaret Thatcher and the Poll Tax, Tony Blair and Iraq - the list of leaders who once intuitively understood the nation's mood, but somehow lost that skill by becoming too insulated, is endless. The financial crisis came about because a small group of leading bankers created their own version of reality, insulated from the truth, and led the world's economy off the edge of a cliff. Historical sexual scandals in public life are only now emerging because too many people were willing to blindly follow, rather than question and challenge authority earlier.<br />
We cannot afford to blindly trust authority. It's all of our responsibility to maintain a healthy disregard for that authority, making sure it doesn't get locked into its own insular view of the world. So, don't blindly follow. Instead, question, challenge and provoke. Those in authority are typically handsomely rewarded; make them work for their living!<br />
<blockquote>
“I love argument, I love debate. I don't expect anyone just to sit there and agree with me, that's not their job.”–Margaret Thatcher</blockquote>
<h2>
30. Clothes hide who we really are</h2>
I used to think it important to “look the part”, to dress appropriately. But I've learnt that it's far less important than I thought. Most serious people who want to get things done really don't care what you look like; it's your ideas and skills they want. Trust me, good ideas and skills are sufficiently scarce that nobody serious has the luxury of ignoring you because of your trousers. Those that do are perhaps revealing something about their own lack of focus on results; if they are more interested in you paying them respect through your dress than in achieving something great with your skills, then they have lost the plot.<br />
Every time I've seen someone dressed unconventionally who has ideas and skills, their capabilities have rapidly eclipsed any concern about how they look. Sometimes we need to look the part so that less intelligent people take the time to listen to us. But, often, our clothes have no influence when we have something of value to offer.<br />
<blockquote>
“Beware of all enterprises that require a new set of clothes”– Henry David Thoreau, author</blockquote>
<h2>
31. Even quirky people can be successful</h2>
Steve Jobs was an oddball. Andy Murray is the first Brit to win Wimbledon in shorts, yet is awkward when being interviewed. But eccentricity and uniqueness don't hold you back. You don't have to be a perfect, “conventional” person to be effective. Socially awkward people are often brilliant and effective too. Apple's '<a href="http://www.youtube.com/watch?v=nmwXdGm89Tk">Think Different</a>' ad reminds us about the misfits, the rebels, the round pegs in square holes, the people who see things differently. These are the ones who change things, not middle-aged men in suits.<br />
Introverts are just as effective as extroverts, if not more so. Many things of historical significance were invented or created by individuals working on their own. As Susan Cain says, <a href="http://t.co/6oScT7FiIB">“There's zero correlation between being the best talker and having the best ideas.”</a> You do <em>not</em> need to be loud, confident, or fit any of the other assumptions about leaders, to be one. Many individuals and businesses today appear to associate leadership with extrovert character traits, but they are wrong. One of the world's greatest leaders, Gandhi, was an introvert. Leaders lead through ideas, not through bullying, shouting, demanding or manipulating.<br />
<blockquote>
“Just because it's something original, eccentric or you're not used to it; doesn't mean it's wrong.”– Sandra Chami Kassi, Lebanese author</blockquote>
<h2>
32. Most things of value emerge from solitude, not from group thinking</h2>
There is an inverse relationship between the value of the output and the number of people involved in producing it; if more than three people are involved in a meeting, my experience is that the meeting is unlikely to result in much. Committees and boards are, by their very nature, incapable of innovation. Our cognitive biases mean that deep thought is best done in isolation or in very small groups. The more people involved, the greater the pressure to conform and avoid ideas that might appear daft to the rest of the group; but those 'daft' ideas are often the ones that lead to breakthroughs when they aren't squashed by group dynamics. There is much psychological analysis of <a href="http://en.wikipedia.org/wiki/Group_think">groupthink</a> that backs up these assertions; my experience is that it's all true.<br />
<blockquote>
“There is nothing in the world I loathe more than group activity, that communal bath where the hairy and slippery mix in a multiplication of mediocrity.”–Vladimir Nabokov, novelist.</blockquote>
<h2>
33. Listen and learn</h2>
The <a href="https://www.google.com/search?q=dunning%20kruger%20effect">Dunning-Kruger effect</a> describes how people tend to overrate their ability when they know little about a subject, whereas those who know more tend to underrate theirs. Combine this effect with dominant extroverts and you can see how stupidity can take hold. So, always seek out knowledge, and listen and learn from those who know more than you do about a subject. If you are confident in your opinions, consider that you may be over-confident. If you are hesitant in your opinions, consider that you may be more experienced than those who are louder than you.<br />
<blockquote>
“One of the painful things about our time is that those who feel certainty are stupid, and those with any imagination and understanding are filled with doubt and indecision.”–Bertrand Russell</blockquote>
<h2>
34. With age comes wisdom, but with wisdom comes caution</h2>
As we age we gain wisdom; something that only comes through experience. However, increased wisdom means increased caution. As we foresee the pitfalls we stop doing things. We step back to assess before jumping in. The audacity of youth slips away from us.<br />
The world needs those with wisdom, but it also needs the naivety of youth - many great things appear ill-advised to the wise, and without youthful ignorance we wouldn't reach for the seemingly impossible. Listen to wisdom, but don't let it constrain you.<br />
I've noticed that teams built only by the wise, or led by them, will tend to exhibit great caution. Companies that rely too heavily on their elder colleagues will blunt their potential. They won't make mistakes, but they might just not do anything either. It's why startups make breakthroughs, established businesses less so. My plea is that you should find a way of balancing wisdom with youthful enthusiasm.<br />
<blockquote>
“If you are too careful, you are so occupied in being careful that you are sure to stumble over something”–Gertrude Stein, writer</blockquote>
<h2>
35. Your mind only ages if you let it</h2>
An aged mind relies on experience and wisdom instead of curiosity. Instead of trying to understand something, we quickly fit it to something approximating a previous experience and pronounce an answer. Who hasn't noticed the irritating elderly relative who is always saying what the answer is before even understanding the question?<br />
Wisdom means we have a larger pool of experience to draw on, but it also sometimes leads us to rely too heavily on that experience. The wise sometimes have a tendency to shortcut their understanding of the problem and go straight to the answer. By relying too heavily on previous experience, we eliminate new ideas and perspectives. We fall back on previous answers, never really searching for new ones.<br />
Psychologists have a name for these tendencies: <a href="http://www.psychologytoday.com/blog/the-power-prime/201305/cognitive-biases-are-bad-business">cognitive bias</a>, and it's something we can consciously avoid. By checking our perceived wisdom, consciously engaging with the problem at hand and searching out new perspectives, we can offset the tendencies that emerge with age. But it requires conscious effort. If you aren't consciously checking what you think is the wise part of your brain, you're probably relying too heavily on it. Accept some humility; your assumptions might be wrong, so stop and think before pronouncing an answer. Above all, be curious and open to new ideas.<br />
<blockquote>
“Stay hungry. Stay foolish.”–Steve Jobs, <a href="http://www.youtube.com/watch?v=Hd_ptbiPoXM&sns=em">Stanford commencement address</a></blockquote>
<h2>
36. Not everything of value can be counted</h2>
Some very important things are very, very hard, if not impossible, to measure. We shouldn't try to force numbers onto them. Sometimes we need to make decisions based on what we know the truth to be, even if it's hard to quantify.<br />
<blockquote>
“Not everything that can be counted counts, and not everything that counts can be counted.”– Albert Einstein </blockquote>
<h2>
37. Things are never as important as they seem at the time</h2>
I've never looked at anyone's educational qualifications when assessing them for a job. I'm interested in their experience, personal qualities and potential, not some arbitrary mark at a point in time. Exam grades get you on the first rung of the ladder, but very quickly the grades become meaningless and nobody will ask about them ever again. <br />
Exams, meetings, decisions, they all seem so important at the time. But, given a sense of time and perspective, none of them are half as important as we might have thought at the time. So, relax a little. De-stress. Know that, even if you fail, that with time it will almost certainly not be the disaster it might seem to be.<br />
<blockquote>
“One of the symptoms of an approaching nervous breakdown is the belief that one’s work is terribly important.” – Bertrand Russell, philosopher</blockquote>
<h2>
38. Life isn't about money</h2>
It's not about making more money. It's about experiencing life, trying things, exhausting the possibilities, doing things you and others remember, making an impact, leaving footsteps in the snow for those who come after you. What does that mean for you? One thing it shouldn't mean is having more stuff. Doing great things does not always equate with earning vast amounts of cash or being at the top of a corporate structure chart. Earning enough cash is important; earning more cash is not.<br />
Do you want to die owning a big house and a smart car, but having spent 40 hours a week for 40 years working on things that don't matter? Or do you want to do something important, even if the house is smaller and the car scruffier? Maybe you won't have to make the choice, but if you did?<br />
<blockquote>
“Being the richest man in the cemetery doesn't matter to me. Going to bed at night saying we've done something wonderful, that's what matters to me.”– Steve Jobs</blockquote>
<h2>
39. But life <em>is</em> about having fun and making a difference</h2>
The only possible reason for existing, it seems to me, is to make a difference and to have fun doing it. If you believe that (how could you not?), are you having fun now?<br />
Do you laugh at yourself, or do you take yourself too seriously? Serious is bad, laughter is good! A sense of humour can carry you through difficult times. And laughing at your own stupidity can help you accept your mistakes, learn from them and move on. The serious guys and gals have a tendency not to acknowledge missteps and hence fail to learn from them. So, don't take yourself too seriously; you are human, you make mistakes, you're pretty ridiculous really, so see the funny side of it!<br />
<blockquote>
“Anyone who takes himself too seriously always runs the risk of looking ridiculous; anyone who can consistently laugh at himself does not.”– Vaclav Havel</blockquote>
<h2>
40. And finally…</h2>
Everything I've said might be true, but knowing that doesn't make it any easier to do anything about it. I've written these things down because I believe them to be true, but I'm as bad as anyone at not living by them day-to-day. I hope that by capturing these thoughts and making them explicit, rather than vague beliefs in the subconscious, they become easier to consciously do something about. And I hope that you, too, find them of some use.<br />
<blockquote>
“I am glad that I paid so little attention to good advice; had I abided by it I might have been saved from some of my most valuable mistakes.” –Edna St. Vincent Millay, poet.</blockquote>
Written with joy in <a href="https://itunes.apple.com/gb/app/editorial/id673907758">Editorial</a> for iPad.<br />
<br />
<div id="blogsy_footer" style="clear: both; font-size: small; text-align: right;">
<a href="http://blogsyapp.com/" target="_blank"><img alt="Posted with Blogsy" height="20" src="http://blogsyapp.com/images/blogsy_footer_icon.png" style="margin-right: 5px; vertical-align: middle;" width="20" />Posted with Blogsy</a></div>Duncan Andersonhttp://www.blogger.com/profile/03877209756077142249noreply@blogger.com0tag:blogger.com,1999:blog-7114254501166964003.post-86262733303715123892013-10-07T11:08:00.001-07:002013-10-07T13:31:52.159-07:00Smartphone Security and Touch ID<p>My door at home has two locks on it. I have a “Yale” lock that's there for convenience; when I shut the door it locks automatically. When I'm in the house this is sufficiently secure. But when I go away I also put the key in my “Chubb” lock and double-lock the door. The Chubb lock is more secure, but very inconvenient to use if I'm in and out of the house a lot. I also have a burglar alarm. It's a bit of a pain because I need to enter a code to disarm it. My home security is a series of barriers. It's not perfect; if a professional thief wants to deactivate my alarm and pick my locks, I'm sure (s)he can. But for most burglars my security is good enough to keep me secure. My security is also graduated - the most secure parts of my system are the most inconvenient. There's a trade-off between security and convenience. I often choose convenience over security; setting the alarm and double-locking the door when I'm doing some gardening would seem overkill and definitely a right pain-in-the-neck, so I just use the Yale lock in such situations.</p>
<p>It seems to me that security on smartphones, and IT in general, is similar to my house security, namely:</p>
<ol>
<li>I can get super-secure, but it's at the risk of convenience. Super-secure often means super-inconvenient.</li>
<li>If security is too inconvenient I'm likely to not use it.</li>
<li>It's impossible to prevent a determined thief with a lot of resources, so it's about appropriate security and manageable risk, not about ultimate security.</li>
<li>The security of a system should be assessed on the basis of the end-to-end nature of the security systems, not of the hackability of one individual item in that system.</li>
<li>The average person in the street does not need, cannot afford, and would find the intrusiveness of the Bank of England's security unacceptable.</li>
</ol>
<p>There has been quite a bit of discussion around Apple's Touch ID fingerprint sensor. It's novel to see biometric identification used in a high-volume consumer electronics device, so it's natural for the industry to explore the implications of this. However, I've noticed a fair degree of confusion about both how Touch ID works and what its purpose is.</p>
<h2>Touch ID is about convenience, not super-security</h2>
<p>In Apple's introduction of Touch ID they focused on it as a way of providing security with increased convenience. The presentation highlighted the fact that nearly 50% of smartphone users do not have a passcode lock because of the inconvenience. Touch ID was presented as a solution offering security with convenience. It was not positioned as some James Bond-style super-secure breakthrough. Apple's target is always the mass market of consumers, not niche subsets, and Touch ID is very clearly trying to help <em>average</em> consumers increase their level of smartphone security. </p>
<h2>Human nature</h2>
<p>Human nature is that we will try to avoid things we perceive as an inconvenience. People write their PINs on Post-it notes in their wallets. Would you believe the <a href="http://www.datagenetics.com/blog/september32012/">most common PIN is 1234</a>? Or that <a href="http://reviews.cnet.com/8301-19512_7-20072635-233/do-you-password-protect-your-iphone-poll/">nearly 50%</a> of consumers choose not to activate a passcode on their phone? It doesn't matter what the implications are; the evidence is that if being secure is too much of a barrier, or just a barrier at all, a lot of people will find a way around it and “blow the consequences”. </p>
<p>When designing a security system we need to be aware of basic human behaviour. Just as my house has a level of security I would never use when gardening, so it is with smartphones. People cannot remember super-secure passwords, they cannot remember lots of passwords, and entering them on a smartphone keyboard is a pain in the neck. As a result, the industry is starting to find ways of increasing the convenience of security, to make it easier for people to use so that they don't circumvent the mechanisms offered. This is the problem I think Touch ID is trying to solve - convenience - not providing a high-end security solution. As a user of Touch ID I can say with total confidence that it really is a revolution in convenient security. It's so easy to use you almost don't realise it's doing anything; it's security that gets out of the way.</p>
<h2>Existing solutions are far from perfect</h2>
<p>4-digit passcodes are easy to guess, and many people use memorable dates, drastically reducing the number of possible choices from the maximum. </p>
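<p>The arithmetic behind that reduction is easy to check. Here's a quick sketch (my own illustrative figures, not from any study cited above): a random 4-digit passcode has 10,000 possibilities, but a date-based one has only a few hundred.</p>

```python
# A 4-digit passcode allows 10^4 equally likely combinations -- but only
# if it's chosen at random. A memorable date collapses the keyspace.

full_keyspace = 10 ** 4  # 0000 through 9999

# Codes of the form MMDD: one entry per valid day in each month
# (including 29 Feb) gives 366 possibilities.
days_per_month = (31, 29, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31)
mmdd_codes = sum(days_per_month)

# Codes that are a plausible birth year, e.g. 1930-2013 (illustrative range).
year_codes = 2013 - 1930 + 1

print(full_keyspace)   # 10000
print(mmdd_codes)      # 366
print(year_codes)      # 84
print(round(full_keyspace / mmdd_codes))  # a MMDD code is ~27x easier to brute-force
```

<p>In other words, a thief who suspects a date-based code needs only a few hundred guesses rather than ten thousand, which is why the "memorable date" habit matters.</p>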
<p>Passcodes are also very easy to capture by watching someone type them in.</p>
<p>We even leave smudges on our smartphone screens that <a href="https://www.usenix.org/legacy/events/woot10/tech/full_papers/Aviv.pdf">allow thieves to easily guess our passcodes</a>. </p>
<p>So, the current state of smartphone security is far from perfect and new innovations need to be judged against that imperfect state, not an assumed nirvana.</p>
<h2>Touch ID is only one part of a bigger security system</h2>
<p>Focussing on Touch ID only misses the point of Apple's approach to security. Touch ID sits in a wider security ecosystem and that makes the system more secure than any one component in it.</p>
<p>Firstly, it's important to understand that Touch ID complements a passcode rather than replacing it. With Touch ID enabled you still need a passcode, and the phone requires you to enter that passcode:</p>
<ol>
<li>If you've not entered your passcode in the last 48 hours and try to use the phone, you will be required to enter your passcode rather than use Touch ID.</li>
<li>If you fail Touch ID verification five times, Touch ID is disabled and you are required to enter your passcode.</li>
<li>After a reboot of the phone, Touch ID is disabled until you enter your passcode.</li>
</ol>
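<p>Those three fallback rules amount to a small piece of decision logic. As a sketch (the thresholds come from the rules listed above; the class and function names are my own invention, not Apple's API), the policy might look like this:</p>

```python
from dataclasses import dataclass

PASSCODE_TIMEOUT_HOURS = 48  # rule 1: passcode required after 48h without one
MAX_FAILED_MATCHES = 5       # rule 2: five failed matches disable Touch ID

@dataclass
class DeviceState:
    hours_since_passcode_entry: float
    failed_touch_id_attempts: int
    rebooted_since_passcode_entry: bool

def touch_id_allowed(state: DeviceState) -> bool:
    """Return True if fingerprint unlock may be offered; otherwise the
    passcode must be entered instead (which also resets these conditions)."""
    if state.rebooted_since_passcode_entry:                         # rule 3
        return False
    if state.hours_since_passcode_entry >= PASSCODE_TIMEOUT_HOURS:  # rule 1
        return False
    if state.failed_touch_id_attempts >= MAX_FAILED_MATCHES:        # rule 2
        return False
    return True

# Example: a freshly rebooted phone must fall back to the passcode.
print(touch_id_allowed(DeviceState(1.0, 0, True)))   # False
print(touch_id_allowed(DeviceState(1.0, 0, False)))  # True
```

<p>The point of the design is that the fingerprint is only ever a convenience layer: whenever any doubt arises - time, failures, or a reboot - the system falls back to the stronger credential.</p>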
<p>Apple also provides a secondary security layer in the form of 'Find my Phone'. This system is designed for situations where you have lost your phone. It allows you to log on to the iCloud website and locate your phone on a map. There are many fascinating stories of people who have retrieved their phones from thieves after using this service, so it has real utility. You can also remotely put a phone into <a href="http://www.macobserver.com/tmo/article/ios-6-using-lost-mode">lost mode</a> through Find my Phone. Once in lost mode:</p>
<ol>
<li>Touch ID is disabled and you must enter a passcode. </li>
<li>You set a message and phone number to call that are displayed on the phone's screen.</li>
<li>The phone is locked to only be capable of calling the number you specify.</li>
</ol>
<p>In iOS7 this is further strengthened with <a href="http://support.apple.com/kb/HT5818">Activation Lock</a>, which means a thief needs your Apple ID and password (different to your phone's passcode) before they can:</p>
<ol>
<li>Turn off 'find my phone'</li>
<li>Erase your phone</li>
<li>Reactivate or use your phone</li>
</ol>
<p>The security of Activation Lock was met with initial scepticism, but it hasn't been circumvented yet and appears to be effective. In fact, the New York police were so impressed that they've been <a href="http://www.theverge.com/2013/9/22/4758534/nypd-promotes-ios-7-activation-lock-to-reduce-apple-picking-theft">handing out flyers</a> encouraging iPhone users to upgrade to iOS7 in order to use Activation Lock and hopefully reduce phone theft.</p>
<p>Find my Phone and Activation Lock make it a trivial matter to securely disable the phone, and Touch ID, if it is lost. Given the way we all carry our phones with us all the time, there's a good chance that, in the worst circumstances, you'll notice your phone has gone missing and be able to deactivate it before the thief creates and uses any 'fake' fingerprint. </p>
<p>The security of the end-to-end system, taking into account the way passcodes are used with Touch ID, Find my Phone and Activation Lock, is much greater than any single aspect of security. It's a really good example of security system engineering and a rare one. Most security solutions seem to have been engineered in isolation, rather than as a part of some larger scheme.</p>
<p>Can aspects of this system be hacked? Maybe. If they are hacked, will Apple respond and prevent the hack? Almost certainly. Although reports have emerged of the possibility of making 'fake' fingerprints, there are good reasons to believe this isn't a big risk. Firstly, the faker needs a good-quality fingerprint, not a smudged one, to work from. Secondly, the process is pretty laborious and takes a good number of hours. Thirdly, the secondary barriers of Find my Phone and Activation Lock limit the potential exposure. Most of us would have put the phone into lost mode, disabling Touch ID in the process, before anyone could apply any fakery. It's possible to circumvent the system with some luck and a degree of skill, but it's probably still easier to threaten physical violence to obtain a passcode.</p>
<p>Time will tell, but the indications are that we have a very effective end-to-end security system, one designed to give most of us real convenience. </p>
<p>Those who are nervous about Touch ID can simply opt out. Its use is entirely optional and there is no compulsion to activate it.</p>
<h2>Biometric <em>and</em> passwords?</h2>
<p>Some have called for Apple to allow passwords to augment Touch ID, i.e. that the user should always need to enter a passcode <em>and</em> pass fingerprint verification. The theory is that the combination of passcode and fingerprint would be <em>very</em> hard to crack. This may be true, but it misses the point of Touch ID. Apple's focus is to increase the convenience of security, not to provide the highest possible level of security. Apple could, in theory, offer the option to use both, but I personally think it's unlikely. Doing so would divert attention from the cause of convenience, and Apple is well known for focussing on delighting the broad mass of consumers rather than small niches. So no, Touch ID is not about super-high-end security; it's about increasing the convenience of security to a level that is appropriate for most of us.</p>
<h2>Mobile payments increase the need for convenience</h2>
<p>We have seen an enormous proliferation of mobile payments solutions, but none of them has achieved critical mass yet. I would argue that some of them never can, because they are <em>more</em> cumbersome than using a plastic card. Anything which requires a confirmation on the phone will need many of us (i.e. those who do have a passcode set) to enter our phone's passcode, find and launch the app, and then enter some app-specific passcode to authenticate the purchase. This is clearly not viable in a high-volume retail environment. There is no chance my mother, or even my wife, could be bothered. </p>
<p>The inconvenience of authentication is why NFC implementations have no security at all, relying instead on a transaction limit of £15 (in the UK). But a £15 limit confines NFC to a niche of sandwich and coffee shop purchases, rather than being a true next-generation payment solution. We are, in effect, stuck between two extremes: the completely insecure and the secure-but-so-inconvenient-few-can-be-bothered. Touch ID, and no doubt the similar implementations that will follow, presents an interesting possibility for authentication that is “good enough” and super-convenient. Apple is rightly being cautious with Touch ID's potential; today it can only be used to purchase content from Apple's own online stores. No doubt, as the technology beds down, it will be opened up for others to use.</p>
<p>Before you start stressing about the potential to copy a fingerprint and use it to buy things, reflect on the fact that we still rely on signatures to secure high-value cheques. Our credit cards are secured only by 4-digit PINs that are super-easy for someone to read by looking over our shoulders. The requirement is not for absolute security; it's for convenient security that is “good enough”, and a fingerprint is likely more secure than either a signature or a 4-digit PIN. I, for one, think there is great potential here and am excited to see where this might take us. </p>
<p> </p><div style="text-align: right; font-size: small; clear: both;" id="blogsy_footer"><a href="http://blogsyapp.com" target="_blank"><img src="http://blogsyapp.com/images/blogsy_footer_icon.png" alt="Posted with Blogsy" style="vertical-align: middle; margin-right: 5px;" width="20" height="20" />Posted with Blogsy</a></div>
<h1>It's time to ramp up the ambition in mobile payments</h1>
<p>Duncan Anderson, 24 July 2013</p>
<p>The world is full of novel mobile payments solutions, many of which, in my opinion, are doomed to failure. The reason is simple: they aren't ambitious enough. Most of these so-called innovations replicate today's payments model and complexity. Very few transform the underlying business model, and many actually increase end-to-end complexity rather than reduce it.</p>
<p>Let's look at today's retail payments model. We could characterise it as the retailer saying "please give me the keys to your safe, so I can go and take what you owe me." We blithely hand over our bank credentials, together with authority for the retailer to debit money from our accounts, every time we pay at the point of sale. Precisely because this is so inherently insecure, the industry has had to liberally pepper the resultant payments with encryption, PIN codes and complex, obscure technology. The whole <a href="http://www.scl.org/site.aspx?i=ed26289" target="_blank" title="">merchant acquirer infrastructure</a> grew up to connect our banks with the retailers' banks and process the resultant transactions. The reason for this? Only the retailer had connectivity and the ability to connect into the banking infrastructure. Before smartphones we had no choice; the only way of making things work was by handing control to the retailer. It was complex, but it was the only way.</p>
<p>Nearly all NFC contactless payments solutions attempt to use that existing retail payments infrastructure to process the payment. In effect they are layering NFC on top of the current solution. If you've ever delved into the arcane complexity of SIM-based NFC, you'll quickly come to the view that it cannot reasonably be described as anything other than "very complex". Complexity increases costs. And most NFC pilots limit transaction values to £15, which is slightly ridiculous when looked at in the cold light of day. Why layer new technology on top of yesterday's infrastructure, business model and high cost base?</p>
<p>With the advent of smartphones we all have our own portable bank connectivity in our pockets. No longer do we need to hand over the keys to the safe; we can transfer money ourselves without losing control. So how about we redesign our payments future with this in mind? Instead of consumers handing over authority for a retailer to debit their account, what if the retailer communicated the transaction value to our phone, and the consumer executed the transfer themselves? In effect we just say to the retailer, "tell me how much I owe you and I'll transfer the money on my smartphone". This models retail payments on the way business payments work: the supplier gives us an invoice, then we make the payment ourselves. We can now do this in a retail context because we've got a smartphone with connectivity to our bank. </p>
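<p>As a thought experiment, the "tell me what I owe you" flow might look something like the following Python sketch. Every name here is invented for illustration - there is no real payments API behind it - but it captures the structural shift: the retailer only ever publishes an invoice, and the money is pushed from the customer's side.</p>

```python
import uuid
from dataclasses import dataclass
from typing import Optional

# Illustrative model only - invented names, not a real payments API.
@dataclass
class Invoice:
    merchant_account: str  # where the money should be sent
    amount_pence: int
    reference: str         # lets the retailer reconcile the payment

def merchant_creates_invoice(amount_pence: int) -> Invoice:
    # Step 1: the till says "here's how much you owe me" (e.g. over NFC).
    # Nothing secret changes hands - the total is on the till display anyway.
    return Invoice("merchant-sort-code/account-no", amount_pence, str(uuid.uuid4()))

def customer_pays(invoice: Invoice, authorised: bool) -> Optional[dict]:
    # Step 2: the customer's own banking app executes the credit transfer.
    # The retailer never receives the customer's account credentials.
    if not authorised:  # e.g. the fingerprint check failed
        return None
    return {"to": invoice.merchant_account,
            "amount_pence": invoice.amount_pence,
            "reference": invoice.reference}

payment = customer_pays(merchant_creates_invoice(350), authorised=True)
assert payment is not None and payment["amount_pence"] == 350
```

<p>Notice the direction of authority: at no point does the retailer hold anything that would let it debit the customer's account, which is precisely what removes the need for most of today's protective machinery.</p>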
<p>There is work to do on the smartphone user experience to make it quick and easy. For example, we probably want the retailer to be able to initiate the flow on the phone wirelessly. And we need to simplify the way we execute and authorise the payment on the phone: if we have to stand in a shop unlocking our phones, finding an app and typing in passcodes, it'll never be efficient enough. Some form of simple biometric authentication, like fingerprint recognition, is probably needed before this becomes quick and easy enough to work efficiently. The user experience might need assistance from the phone OS vendors to optimise things, but we're not too far away from the components falling into place. </p>
<p>NFC could be used to transmit the invoice to the phone, and we don't need to worry about encryption or the silly £15 limit of today's NFC implementations - when the till shows the total due on its display for all to see, it's hardly confidential. Although I've been sceptical about NFC in the past, it's not the technology that's at fault, it's the way everyone has been using it.</p>
<p>Using this approach, the fraud risks reduce dramatically because we never hand a third party authority to debit our account. At the same time, we eliminate the whole point-of-sale, merchant acquirer, PIN code and encryption complexity in one stroke. Reduced complexity = reduced cost.</p>
<p>In my view there is a huge opportunity to radically simplify retail payments and reduce costs - reducing the industry's own costs so that it can reduce the cost to retailers. Look at any breakdown of <a href="http://www.process4less.co.uk/" target="_blank" title="">card transaction costs</a> - the ridiculous maze of obscure charges is ripe for simplification: setup fees, transaction fees, terminal rental fees, minimum monthly service charges. It's impossible even to work out what today's charges are without specifying a particular business scenario.</p>
<p>In my opinion the smartphone is at the centre of an impending transformation, because it enables a radically simpler business model for transactions. But banks have so far been unwilling to disrupt their own industry - we continuously see contactless pilots that layer new complexity on the old rather than taking an axe to yesterday's complexity. At some point the light will dawn and something more radical will emerge, something that strips out entire layers of complexity and cost along the lines I've suggested. The way we pay in a retail environment will change dramatically, and retailers will encourage that shift because their costs will fall. Many retailers operate on wafer-thin margins, so a reduction in card processing charges would be a huge benefit to them. </p>
<p>The future is not about using smartphones for payments because it's cool technology that makes us feel like we're in a sci-fi movie. It's about disrupting an industry and reducing charges to its customers. That must mean a radical simplification of the way we pay and a consequent reduction in industry costs.</p>
<p>Many current mobile payment solutions layer additional complexity and cost onto yesterday's model and will fail for that very reason. In my opinion, the successful solutions are likely to sidestep the merchant acquirer complexity altogether. One promising example is <a href="http://www.zapp.co.uk/" target="_blank" title="">Zapp</a> from VocaLink - by avoiding merchant acquirers and routing retail payments over the UK's Faster Payments network, a very different industry economics might emerge. I'm intrigued by Zapp and its refreshing willingness to rethink the future. I'm sure it will have its challenges - not least that the biometric authentication needed to make the user experience efficient enough in a retail context is only just emerging - so I suspect uptake might initially be slower than we'd like. But it's definitely an interesting direction.</p>
<p>In summary, my view is that the future of payments belongs to those who radically simplify, strip out cost and reduce the burden on retailers. I suggest this means that we are going to move from a "here are the keys to my safe, please go and take what I owe you" model, to a "tell me what I owe you and I'll transfer it to your account myself" model.</p>
<p>With thanks to @gendal for the safe analogy.</p>