An Objectivist Christmas Carol

  One of my odd holiday season traditions is to re-read Charles Dickens’ festive classic, “A Christmas Carol”.

After today’s re-run of this tradition, I felt – as I often do – that it’s not quite the appeal to a religious/social-democratic way of life it’s often portrayed as. Is there something of the libertarian – if not the Objectivist – in Ebenezer Scrooge by the end of the story?

Turns out, I’m not the only one who’s had the same thoughts. Here’s a comprehensive look at this idea from the very thoughtful Robert Davidson on the Rebirth of Reason site.

Davidson makes some thought-provoking points; this one really caught my eye:

Dickens argues here for an integrated rational, full-faceted individual who is as comfortable in the counting house as he is with spiritual values and the fulfillment and happiness they provide. The spirit of Christmas is a metaphor for the integrated life. Dickens describes Christmas as “the only time I know of in the long calendar of the year, when men and women seem by one consent to open their shut-up hearts freely, and to think of other people below them as if they really were fellow passengers to the grave, and not another race of creatures bound on other journeys.” 

I often think of that powerful scene in which the pre-transformed Scrooge responds to the two gentlemen soliciting charity for the needy: “Are there no prisons? Are there no work-houses?…I help to support the establishments I have mentioned: they cost enough: and those who are badly off must go there.”

Is that really the attitude of an Objectivist? Or is it more the attitude of someone who relies on the state to perform the functions of charity and forbearance? If you want a large state to redistribute wealth, look after the poor, and support us from cradle to grave, then you’ll be the sort of person who wants taxes to provide those services and those institutions. But if you’re someone who believes in self-reliance and thinks that charity should be precisely that – charity – then you can’t support Scrooge’s sentiment.

Anyway, I’m jotting this out on my phone on Christmas Eve night, so maybe I’m not thinking it through.

Either way, wherever you are, whatever you’re doing, happy holidays.

The Rise, Fall and Eventual Rise Again of eBooks

It was only about five years ago that the world – and I – decided that print books were going the way of vinyl records. In the mid-2000s, the technology that makes e-ink screens possible finally became viable for mass production.

Soon after, Amazon released the Kindle, and ebooks went mainstream. Between 2008 and 2011, ebook sales rose 1,260 percent in the US alone. Game over. Independent bookshops, chains and printers stood in fear, waiting for the death knell.

But it never came. It was a close-run thing, though. Sales were skyrocketing, and in the US, the collapse of the bookstore chain Borders (which filed for bankruptcy in 2011) seemed to signal the very end.

Then the numbers went the other way, and paper-based books slowly moved back into the mainstream. People like me said that by this year – 2015 – ebooks would have overtaken print sales. But it didn’t happen. There was something of a plot twist to this story that I never saw coming. Book stores – including the independents – are stronger and more vibrant today than at any time before 2010. The American Booksellers Association says it has 1,712 member stores today, compared to 1,660 in 2010. Today, ebooks occupy about 20 percent of the market. That’s about the same market share as in 2012. What happened?

I’ve heard a lot of publishers (and authors who have bought this line too) say it’s simply because readers prefer “real” books. And so digital is at 20 percent, and will stay at 20 percent. The market has spoken. I don’t quite buy this. I think there are two reasons why ebook sales have slumped: one short(er)-term reason to do with a temporary technology disruption from another market, and a longer-term reason to do with corporatism on the part of the big traditional book-publishing industry.

Let’s look at the first of those. The first mainstream ebook reader in the US, the Amazon Kindle, cost hundreds of dollars when it was first released in the American market. But it sold well. As is pretty much always the case with technology, the prices quickly came down and the features improved. But it’s just an e-ink screen, right? So the improvements were incremental; the real push was to lower the cost. Today in the UK, the basic Kindle (which is much better than the first-generation model ever was) will set you back just £59. Adjusted for inflation, that’s a heck of a drop compared to the first model released in 2007. Most ebooks were also cheaper than their hardback versions, and usually the paperback editions too. Makes sense really. I mean, there’s not a lot of cost involved in the mass distribution of a file that’s typically only a couple of megabytes in size, compared to the printing and distribution of a paper-based product. Amazon made big gains with its $9.99 price guarantee for bestsellers (which, because publishers didn’t have the big costs associated with mass printing and distribution, meant that they actually made more money from sales of the cheaper ebook versions).

Then a bit of marketplace disruption occurred. In 2010, Steve Jobs revealed Apple’s iPad. “The Kindle’s been great,” he told the enthralled audience at the keynote, revealing the tablet to the world for the first time, “but now we’re gonna take it further.” Stephen Fry, recording his first impressions of the iPad, couldn’t help but write “…poor Kindle.” Tablets had been around for decades, but the iPad was the first tablet computer to capture the imagination of the mainstream. It was a big success, and dozens of rival manufacturers brought out their own tablets (including Amazon, with their Kindle Fire range).

Suddenly, in 2010, millions of customers faced a choice. Buy a Kindle (or other e-reader) for, say, $250, or an iPad for $499. Yeah, the iPad is more expensive, but it can do a lot more than an e-reader, which is, after all, a uni-tasking device. And the iPad can read books too. Jobs gave a demo of iBooks, and even Amazon produced a Kindle app, so you could read your purchases on the device. Most people, at the time, weren’t going to buy both devices given the prices, so they bought one. And the one they bought was the iPad – or one of its other, often cheaper, Android- or Microsoft-based rivals.

But there was a problem. Reading a book on a bright computer screen – like an iPad’s – is not the same as reading it on an e-ink screen. The e-ink screen looks like, well, a page. Just printed text on paper. A regular screen is like staring at your laptop. After a while, holding a bigger, heavier, glaring screen to read a text-based book (like a novel or biography) just put people off. So they stopped buying ebooks and, rather than buying an ebook reader, moved back to paper. Once bitten, twice shy.

I think this is a short-term issue. But, judging by how slowly the book industry moves, short-term might be 15-25 years. Based on current pricing, I think the business model of the Kindle could end up being that Amazon gives it away for free (“get a free e-ink Kindle for every 5 ebooks you buy!”). So people can have loads of them, all over the house. If you drop one or leave it on the bus, no matter. You can get another for next to nothing, and remote-wipe the one you’ve lost or damaged. This ‘free’, ubiquitous attitude will slowly bring people back to ebooks. The rise of people – some of whom are very talented – self-publishing through Kindle Direct Publishing, Barnes & Noble’s platform, Google, or iBooks via iTunes Producer can also play a part, as we see more and more cheap and readily available work. Think about it: the beauty of this is that even if you’re a first-time self-published author, you’re able to sell as many books (with no upfront risk or cost) as John Grisham – a really exciting and revolutionary thing. Getting your work noticed by the public, especially with lots of people releasing utter garbage, remains a challenge.

The second problem I see is a trickier one that could stop things moving forward for a century or more. This is corporatism on the part of the major book publishers. Once the ebook reader arrived, they could see that, with nimble, smart, savvy new writers (think E.L. James et al) leading the way, publishing a book entirely by yourself could soon become the “done” thing, even for well-established writers. If Stephen King publishes a book as a hardcover for $19.99, he might see $3 of it. If he were to publish it himself (paying for an editor, cover designer, etc. himself), he could sell it for, say, $5 and still make roughly the same $3 off every sale, regardless of how many copies sell, with no risk of an overly ambitious print run. And at that price, he’d shift many more books.
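
To make that arithmetic concrete, here’s a rough back-of-the-envelope sketch in Python. The royalty rates – roughly 15% on a traditional hardcover and 70% on a self-published ebook – are my illustrative assumptions, not figures from any actual contract.

```python
# Rough per-copy earnings comparison: traditional hardcover vs self-published ebook.
# The royalty rates below are illustrative assumptions, not figures from any real contract.

def per_copy_earnings(cover_price: float, royalty_rate: float) -> float:
    """Author's take on a single sale at the given cover price and royalty rate."""
    return cover_price * royalty_rate

hardcover = per_copy_earnings(19.99, 0.15)  # assumed ~15% traditional hardcover royalty
self_pub = per_copy_earnings(5.00, 0.70)    # assumed ~70% self-publishing royalty

print(f"Hardcover at $19.99: about ${hardcover:.2f} per copy")        # ~$3.00
print(f"Self-published at $5.00: about ${self_pub:.2f} per copy")     # ~$3.50
```

Either way the author pockets roughly the same per copy; the difference is the price the reader pays, and therefore how many copies are likely to shift.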

The big book publishers saw this as a scary future, one to be avoided if possible. Amazon’s $9.99 Kindle bestsellers deal in the US is over, and the publishers are in charge again now. And they’re charging much more for their ebooks than they were a few years ago (making them less competitive and less attractive to readers), while also doing all they can to lower the cost of print-book production through innovation and economies of scale. Hachette expanded their Indiana warehouse by 218,000 square feet last year. Penguin Random House have coughed up $100 million to expand and update their warehouse operations, with 365,000 square feet added in 2014 to their (already huge) warehouse in Crawfordsville, Indiana, doubling its size. The boys and girls at Simon & Schuster are set to do the same to their distribution facility in New Jersey: it’s going to be 200,000 square feet larger.

Why the big investment? Because they can put a stranglehold on this business. As long as people mostly buy print books, the big publishers remain in charge as the gatekeepers, getting their percentage of every copy sold. And because of these expansions and distribution improvements, it’s now often cheaper to buy the paperback version of a book than the ebook version.

I hope this doesn’t last, but I’m not optimistic. I really like publishers, especially the ones I’ve mentioned above. But I don’t like what they’re doing here. I envisaged a future in which the big publishers represent new talent (and established talent), using their incredible editorial, marketing and promotional skills to be champions of quality. Just because “anyone” can self-publish wouldn’t mean they should. There would be a big market – a demand – for publishers who burrow and forage, looking for the best talent out there and bringing it to our attention. Yes, the margins could be lower for publishers on a per-book basis, but not having to gamble on the size of a print run would mean the risk is lower too. And they could invest more time not in building ever-bigger warehouses, but in nurturing more and more talent.

They’d be so important in that brave future. But I fear (and hope I’m wrong) that they could keep things the way they are for the next century and more, before the number of talented self-published writers tilts the playing field.

 

But don’t forget, you can buy all of my books – both in print and digitally – here! (Sorry, couldn’t resist the chance to cheapen this article with a plug!)

Economic Festive Cheer

It’s Christmas Eve, and as we stagger toward the end of another year, I thought I’d point out this upbeat assessment about where we’re heading in the UK job-wise.

The Institute of Economic Affairs has published this interesting piece putting to rest many of the myths about the free-market system. Maybe not one to take to bed after too much mulled wine, but it does point out that, as I often say on this blog, things ARE getting better.

A few points from the piece’s author, Christopher Snowdon:

    • Wages – Over a century’s worth of growth has led to a steady rise in wages across the board. Despite perennial claims that the poor get poorer under capitalism, government figures show that in the UK average real wages have doubled for full-time workers and come close to doubling for part-time workers since 1975. The percentage of full-time workers earning above the national minimum wage has also increased, 98% earning at least £6.19 per hour in 2013 compared to only 55% earning more than this amount in 1975 in real terms.
    • Income – Between 1977 and 2011/12, the incomes of the poorest 20% of individuals rose by 93% in real terms. The recent recession saw the incomes of the richest fifth of households hit hardest, their disposable income falling by over 5% in real terms between 2007/08 and 2012/13. According to ONS figures, during the same period average incomes of the poorest fifth rose by over 3% in real terms – the provision of state benefits cushioning declining pay.
    • Inequality – Rising income inequality and relative poverty are often mentioned by critics of capitalism. Neither offer a meaningful measurement of whether or not the poor are better off. Having peaked in 1990, income inequality in Britain has been declining ever since. Despite real disposable incomes of the poorest fifth of households rising by 50% between 1975 and 2005, the number living below the relative poverty threshold increased from 13% to 15%. Reductions in inequality and relative poverty typically coincide with periods of general impoverishment which harm the poor.
    • Social mobility – Social mobility in Britain has not ground to a halt, mobility remaining broadly constant in relative and absolute terms for at least 100 years. The majority of those that are born poor move swiftly up the income ladder, almost all becoming wealthier than their parents. Intelligence and ability play an important role in determining individual progression.
    • Working hours – Average working hours for British employees continue to fall. According to OECD figures, over half of UK workers are working less than 40 hours a week and fewer than 12% work more than 50 hours a week. Only those on high incomes have experienced an increase in their working week.
    • Economic growth – Sceptics of further economic growth should bear in mind the benefits to be had from ongoing prosperity. Between 1965 and 2000, average incomes worldwide have doubled – contributing to improved living standards and a substantial reduction in poverty. Aside from job creation and the boost to wages, further economic growth is vital in order to afford increasingly burdensome welfare spending.

In 2000, the average person in full-time employment was clocking up 37.7 hours a week. By 2011, that was down to 36.4 hours, with fewer than 12 percent of us working more than 50 hours a week. Back in 1992, it was 38.1 hours. It’s progress. Slow, but getting there.

Yes, things are getting better in the workforce. It doesn’t always feel like it, but I think we’re heading in the right direction.

Happy holidays.

The Future of Android and iOS

Right now, Android has an 80% share of the world smartphone market, compared to iOS’s 15%. However, if only one of them were to still be around (in some form) in ten years, I’d bet on iOS. Here’s why:

There’s no doubt that Android is winning the market share war at the moment. But Apple has never really been a company that cares about market share, and that seems to have served them pretty well over the years.

But the real war to win – for sustainability if nothing else – is the financial one, and here, Android is failing in a way that might eventually seal its doom.

The great thing about the free market: around half of the Fortune 500 companies that were on that list when I was born around 30 years ago are not there now. Either they’ve vanished into oblivion, or they no longer appear there because they’re owned by other, larger companies. And about half of the Fortune 500 companies today didn’t even exist thirty years ago. As Bob Dylan once told us in The Times They Are A-Changin’, “the first one now will later be last.”

It’s with this fluid market dynamic in mind that I look at Android today.

The market share of Google’s mobile OS is very similar to the share the Symbian platform held in the mid-noughties, before a certain Californian fruit company unveiled the iPhone and changed everything.

Symbian, not unlike Android, could be used on all sorts of phones. It became the most widely-licensed mobile platform in history. Nokia (the ones really pushing it), Motorola, Ericsson, Samsung and most of the rest all used it. It went from the number one smartphone OS in 2005 to as dead as Dillinger in 2009.

Android, as even the name suggests, was the child of Symbian. Developed by Google, it was originally designed to be the “new, improved” successor to the world’s most popular mobile OS. But once iOS (or iPhone OS, as it was initially known) was shown to the world, Google went back to the drawing board, making Android a touch-screen-only experience.

Desperate to find an OS that could rival Apple’s offering, dozens of manufacturers dived into Android, including Korean giant Samsung. In 2009, 80% of Samsung’s smartphones ran Windows Mobile and 20% ran Symbian; they announced that around a third of their 2010 phones would be Android. With Google’s open-source OS, they could add a “makeover” (in their case, TouchWiz) and differentiate themselves in the market. In short, as we’ve seen, Samsung’s plan was to be another Apple. They wanted their OS, apps and phones to replicate that “feel” that Apple have, without the R&D. Just cram in some extra features, and take market share away from the new upstart, iOS.

Herein lies the problem. Because Android is free to add to any smartphone (even Apple could build Android phones if they were so inclined; you don’t even need to ask Google’s permission), lots of cheap phones are made with a version of it running. Many of these cheap smartphones – particularly the ones in the Far East – aren’t even used as smartphones. People just make calls and send and receive texts.

The only really premium smartphone company running Android on their phones is still Samsung, and it’s only there that Google makes any serious money. Google gets paid when Android users use Google search and other Google products. This revenue, like most of Google’s, comes primarily from advertising. Their other revenue source is the 30% commission from the Google Play store, just like Apple’s commission from the App Store.

The best estimate is that 80% of the search/app revenue Google makes from Android comes from Samsung smartphone users, and 90% of all Google Play sales come from them too. Google may have many companies using their OS, but most of those companies’ users don’t even use their phones as smartphones – just like with Android’s father, Symbian. Despite Android coming out later than iOS, it actually represents the continuation of the Symbian (and Java Mobile) status quo. iOS was – and in many respects remains – the plucky little upstart. It just so happens to be the upstart that also makes most of the money. Even Google make more money from iOS users than they do from those using Android phones.

And it gets worse for Google. This year, Samsung are releasing smartphones running the Tizen operating system – another open system, not massively different from Android, but one that Samsung pretty much controls. The version they’ve got running in beta is so similar to their TouchWiz Android phones that most users won’t be able to tell them apart. Many say that Samsung are tired of having to wait for Google to update their OS – and Samsung even have to add Knox, their own security layer, because they can’t rely on Google to provide anything secure enough.

The transition for Samsung from Android to Tizen will be slow – and potentially unsuccessful – but once they fully move over (if customers are happy with it), Google will be left with a bunch of also-ran companies like Motorola, who don’t make them any money at all. I can see Google choosing to walk away from Android at that point, especially if they continue to make money from their potentially game-changing Chromebooks. After all, why waste money on the sphere that’s dying when you can spend more creative energy on the bit of the business that’s really working well?

Then Samsung can duke it out in the premium market with Apple. But they still have a big fight on their hands. In 2013, 150 million iPhones were sold, while Samsung only sold 100 million – that’s from their entire lineup of premium Galaxy S and Note phones. Plus, with Tizen slowly coming into the market during the great transition, Apple can use that time to point out their other great strength relative to the rest of the market: a lack of fragmentation, which is good for end-users and developers.

In the same time that it took iOS 7 to go from a 0% to a 90% install base across devices currently in use, Android’s KitKat has gone from 0% to just under 5%. So iOS developers can build using the latest and greatest APIs, knowing full well that the vast majority of their customers can make use of them. Plus – or maybe in part because of this – iOS users are more likely to spend money on (or in) apps. The latest figures suggest that even though iOS has 15% of the smartphone market, it takes 74% of app revenue, with Android at 20% and “other” at 6%.

No one really knows what the future holds for the smartphone market. Maybe Tizen will be a failure and Samsung will stick with Android. Maybe the whole world will suddenly decide not to bother with iOS any more. Maybe Microsoft has something game-changing up its sleeve.

But, judging by the state of the market today, I’d bet on the Californian fruit company.

In the Land of the Free, Disruption is King

There’s a very human fear of technology ruining our lives.

“It’ll replace us,” we say, when a new-fangled bit of technology comes along. But is there any real truth behind that anxiety?

It’s our instinctive reaction. But that doesn’t make it the right one.

I can still recall, from when I was young, a (serious) news report commenting – with furrowed brows – on how, shock horror, some children as young as 8 “had access to the internet.”

That was it. It wasn’t that they were doing something wrong, or even that they were unsupervised. It was the fear that they were using new technology. I posted a great video here a while ago featuring the founder of Wired magazine, who says that in the early-to-mid ’90s mainstream newspapers were still writing articles along the lines of “The Internet – Threat or Menace?” And though that precise type of hysteria about the web has died down, general fear-peddling about new advances hasn’t ceased at all.

The truth is that technological change is part of an observed phenomenon I’ve written about before: “creative destruction”.

In my great-grandparents’ day, someone used to ride down the road on a horse and cart, selling blocks of ice so that the local housewives throughout the city could preserve meat and other things for longer. Eventually a bunch of clever American and Japanese people came up with an electrical device called a refrigerator, which has become a feature of pretty much every house everywhere.

Now the ice-block salesman’s job has gone. Indeed, so have the jobs of the people who made the ice. But seriously, are we worse off because of it? For that matter, are they? Did those people who found themselves out of work lose out long term?

In the short term it must have been hard, but surely with their expertise in the freezing business, they were well-placed to sell and support those buying the new appliances. And today: well, those people are long gone, and their descendants are just like the rest of us – with better jobs, better (inflation-adjusted) wages, more buying power, higher standards of living, and even more leisure time than their great-grandparents could have dreamed of.

The smart and thoughtful Roger Bootle has raised some interesting points about robots and artificial intelligence in the workforce. His article in The Telegraph, though too negative for my money, is nonetheless well worth a read.

We can’t begin to imagine the hardship of living in an early hunter-gatherer society. With life expectancy at around 25, things were, to put it mildly, pretty hard. The agricultural revolution must have rendered thousands of hunter-gatherer “jobs” redundant. But everything got better. The industrial revolution must have rendered millions of agricultural jobs redundant. But everything got better.

This technological revolution is, well, yes, rendering many jobs redundant. But the wisest among us shouldn’t worry too much about it. I’ve been made redundant. It’s an awful feeling. But for those who do find themselves out of work, the stats show that it virtually always leads to better things. And the world gets more efficient, and standards of living around the globe continue to rise.

In the land of the free, disruption is king. Long may it last.

The NHS: Britain’s State Religion

“The NHS is Britain’s national religion,” said the then Prime Ministerial hopeful David Cameron before the last election. The phrase was meant to show he understood how close we hold the NHS to our collective heart, and that he wasn’t going to “tamper” with it too much.

He’s right that we hold it dear, and he’s right that we in the UK treat it as a religion, but he’s way off if he thinks this is a good thing.

A religion is a belief system that operates on faith – without any evidence. Indeed, often the absence of evidence is a requirement. Even evidence to the contrary merely serves to boost the congregation’s faith and prompt them to proclaim their beliefs all the more loudly.

The overwhelming evidence is that Britain, with its national health service, has some of the worst healthcare outcomes in the Western world. Almost everyone can offer an anecdote about how their Aunty Mabel received great treatment and how kind the nurses were, but it’s just that – an anecdote. The fact is, even lucky old Aunty Mabel would have received better treatment in Singapore, or Germany, or many other countries.

I have big problems with the healthcare system in the US. But those problems are the SAME as the ones facing Britain. The narrative in the UK is that you either have our post-war NHS system, or you have an “evil” “private” system like America’s. But what about the other systems, many of which, unlike both the UK and US models, are fairly free-market solutions?

The thing is, though, pretty much all the ills of the American system come down to the fact that it’s not a free-market system when compared with, say, the cellphone market or the grocery business. If the grocery industry were run like the US healthcare system, millions and millions of Americans would go to bed hungry every night, and more than a million every year would starve to death. But luckily, the comparatively free-market grocery “system” in America means that the problems with diet over there are down to over-consumption (type 2 diabetes, heart disease and obesity), not starvation. I appreciate there are hungry people in the US, but I think we’re all smart enough to understand the problem in context. Tens of millions in the US will not go hungry tonight.

The US government contributes about 75 cents of every dollar spent on healthcare. There are anti-free-market rules preventing you from buying coverage in a different state from the one you live in, and if you do buy insurance (or what they laughably call insurance, but which is really a system for paying for everything in advance, not just insuring yourself against unforeseen problems), you’re forced to pay for coverage of things that are irrelevant to you. But all that is for another post.

Basically, healthcare in the UK and the US faces the same problem: the distance between the customer and the seller. If the majority of us had to buy our own health services and goods directly, prices would fall and quality would rise to degrees we can’t imagine now. That’s what happens in every other comparatively “free-market”, capital-intensive, zero-marginal-cost business or service. The problem in both systems is that the state stands between consumer and service.

But you can’t make that argument in the UK. Because our Aunty Mabel said those nurses were so nice to her. Even when she went in for a chesty cough and contracted the norovirus on the ward. They were lovely. And the only alternative is the evil private American system where people die on the street because they can’t afford healthcare, right?

Amen.

The Secret to World Peace

Summed up better by libertarian magicians Penn & Teller than almost anyone else:

Yup. That.