Thursday, December 16, 2010
It’s the season of sharing
In the late sixties, when sharing was taken to the extreme, the hippies in Amsterdam introduced the “white bicycles” program. Around 2,000 bikes, painted white to distinguish them from regular ones, were declared to be collectively owned and available for use on short trips. Obviously there was too much temptation to transfer ownership from the collective to the individual, and the program became a failure. But in the last couple of years the same concept has been successfully resurrected in many European cities. This time they are applying technology, such as credit-card-activated locks, to prevent unauthorized use. Zipcar in the US and Greenwheels in the Netherlands have taken the white bike concept to cars. These companies see themselves as car sharing rather than rental companies. They operate on a membership basis and offer easy access to cars, which are parked at designated spots in major cities. Members can reserve them online or pick up any available one. A card is used to access the vehicle, which is equipped with GPS for tracking.
In the same sixties a new computing concept was introduced: time sharing. So-called “service bureaus” provided businesses with computing power to run their applications. This approach brought down the very high processing cost of dedicated computers. However, with the introduction of the microchip, the emergence of mini-computers and PCs and the gradual decline of cost, time sharing faded into oblivion. Computer power got distributed and everyone rushed out to buy and build their own software. Now this has come full circle.
Most businesses still own their Customer Relationship Management software. It took a major investment in time and resources to get Siebel or an equivalent running, typically millions of dollars and project durations of well over a year. Many of these projects failed to deliver the expected results. In the meantime, thousands of companies have found out that it makes more sense to share the software and hardware required to run a CRM application and pay for usage only. Salesforce has recycled the time sharing concept. While doing this, Salesforce has become one of the fastest-growing software companies, with a market capitalization of over $18 billion. Similarly Google is signing up tons of new enterprise users for its web-based Google Apps, replacing Microsoft’s clunky alternatives.
The Cloud is the big story: business applications will move to the Web and computing power will be obtained from providers on a pay-as-you-go basis. Sound familiar? Computing power will go the way of electric power: it will be provided by utilities. You will end up buying computer services instead of servers. Today, around half of the total cost of ownership of business applications is spent on IT infrastructure, such as servers and networks. The average server runs at around 15% capacity and consumes energy at half of its optimal efficiency. With ever-increasing network speeds and capacity, computing power is moving back to the data center. Actually to giant data centers, each housing thousands of servers. In the last couple of years companies like Microsoft, Google, HP and Amazon have been on a construction spree. By aggregating computing demand across companies, industries and geographies they increase server utilization to around 80%. By bundling purchasing power, the cost per server is driven down. Power utilities have been around for over 110 years. In 1907 utilities produced 40% of the power; by 1930 this was 90%. On Internet time, we can expect this to happen much faster with computing power.
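Those utilization numbers translate directly into cost. A back-of-the-envelope sketch in Python, using a hypothetical server cost of $1 per hour (the figures are illustrative, not vendor pricing):

```python
# An idle server costs roughly the same to own and power as a busy one,
# so the effective price of the work actually done scales with 1/utilization.
def cost_per_useful_hour(server_cost_per_hour: float, utilization: float) -> float:
    return server_cost_per_hour / utilization

in_house = cost_per_useful_hour(1.00, 0.15)  # typical enterprise server at 15%
utility = cost_per_useful_hour(1.00, 0.80)   # aggregated data center at 80%

print(f"in-house: ${in_house:.2f} per useful hour")  # $6.67
print(f"utility:  ${utility:.2f} per useful hour")   # $1.25
```

Before any purchasing leverage or energy savings, aggregation alone makes each useful compute hour roughly five times cheaper.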
Sharing has been around forever. Many people still rent instead of buy a home, lease instead of buy a car and we used to rent videos from Blockbuster (before it went bust). What is new is the fact that the Web enables sharing at a very low threshold, in a personalized way and with an amazing ease of use. According to The Economist: “access often matters more than ownership…technology will make sharing more and more efficient.”
Happy holidays!
Thursday, December 17, 2009
Who's in control?
Given the state of technology we should take a fresh look at privacy and ownership of personal data. Let’s start by laying down the first principle: you own your personal data and you may grant others (like Facebook, the government, credit card companies, your doctor) the right to use this data for purposes that you agree with. The second principle is that, if you opt for it, you should be notified when someone accesses your data. Today 75% of the population in the USA has broadband Internet (85% in the Netherlands) and by 2013 mobile phone penetration is expected to be 100%. This doesn’t seem to be much of a barrier to implementing low-cost notification by SMS or email.
Your health record falls under your personal data. You own your record and are responsible for maintaining it. It is your life. That doesn’t mean that you have to type in whatever you have just learned from your doctor at your last visit. You should delegate this to trusted sources, like your general practitioner, hometown pharmacy or community hospital. I have tried to follow the discussion around the electronic patient dossier in the Netherlands and it was obvious why a centralized, government-driven approach will not work. In the Netherlands the government still plays a paternalistic role (they call it “father state”) with cradle-to-grave support of its citizens. Times they are a-changing. Many people just don’t trust the government to create a working central system and guarantee the privacy of their data. Fortunately there are viable alternatives.
While establishing a standard for electronic patient records may not be the hardest part (there are many working examples in the market, like the Continuity of Care Document), the security of these records is slightly more complicated. We should look at that treasure trove of free options: the open source world. There is a well-established standard for user identification and authentication, OpenID, which is supported by companies like Google, IBM, PayPal and Yahoo. If it’s good enough for the largest online companies in the world, it’s probably good enough for this purpose. We can also leverage standards that help ensure that data is stored in encrypted format at locations that are certified to be genuine. A “security provisioning service” can facilitate the enrollment of users and the assignment of “read and write” rights to the records. A directory service can contain a list of approved medical staff.
One of the weak links remains that currently most of the information is password-protected. This can be strengthened by one-time passwords sent to a registered mobile, or generated by a dedicated device. A robust system may require the option of biometric authentication to uniquely identify the user. The government could provide authentication services through public devices. The latter can be done by setting up identification and authentication centers (for instance in post offices) and possibly leveraging the major banks’ ATM networks (the government owns most of the big banks anyway). One might even consider iris recognition for highly secure transactions like reporting identity theft.
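As a concrete illustration of the one-time password idea, here is a minimal HMAC-based OTP (HOTP, as standardized in RFC 4226) using only Python’s standard library; real deployments add counter resynchronization, rate limiting and secure key storage on top of this:

```python
import hashlib
import hmac
import struct

def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    """HMAC-based one-time password (RFC 4226)."""
    msg = struct.pack(">Q", counter)                      # 8-byte counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                            # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 4226 test vector: each press of the token advances the counter.
print(hotp(b"12345678901234567890", 0))  # 755224
```

Time-based variants (TOTP) replace the counter with the current 30-second interval; that is what authenticator apps on mobile phones use.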
In bustling, chaotic India the government has just embarked on a huge program to provide each of its 1.2-odd billion citizens (nobody knows exactly how many) with a unique identification number. They use fingerprints as the authentication mechanism. In the Netherlands fingerprinting is now mandatory to obtain a passport. If India can deal with this problem on the scale of billions, then the Netherlands should be able to handle 16 million users.
A decentralized approach using data and security standards will be much easier to implement than any centralized approach that requires huge databases and complex integration. In the banking world, decentralized solutions based on standards have worked well for the last twenty years or so. You can withdraw money at virtually any ATM around the world. This is not because there is one centralized ATM network with a huge database, but because there are established standards for data exchange, security and settlements between the financial institutions that hold the data of their clients. Once you enter your card and PIN, the ATM sends encrypted identification and authentication information over the network to your bank, which returns an approval (if you have sufficient balance), prompting the ATM to dispense cash and your bank to debit your account.
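The request-and-verdict pattern can be sketched in a few lines of Python. This is a toy model, not a real payment protocol: the bank identifiers, cards and balances are invented, and real networks add encryption, settlement and much more:

```python
import hashlib

def pin_hash(pin: str) -> str:
    return hashlib.sha256(pin.encode()).hexdigest()

# Each bank holds only its own clients' data (illustrative records).
BANKS = {
    "NL-BANK": {"card-1234": {"pin": pin_hash("0000"), "balance": 500}},
    "US-BANK": {"card-9876": {"pin": pin_hash("1111"), "balance": 50}},
}

def atm_withdraw(bank_id: str, card: str, pin: str, amount: int) -> str:
    """The ATM never sees your balance: it routes the request to the
    issuing bank (encoded on the card) and acts on the bank's verdict."""
    account = BANKS[bank_id].get(card)
    if account is None or account["pin"] != pin_hash(pin):
        return "DECLINED: authentication failed"
    if account["balance"] < amount:
        return "DECLINED: insufficient funds"
    account["balance"] -= amount           # the bank debits the account...
    return f"APPROVED: dispense {amount}"  # ...and the ATM dispenses cash

print(atm_withdraw("NL-BANK", "card-1234", "0000", 100))  # APPROVED: dispense 100
print(atm_withdraw("US-BANK", "card-9876", "1111", 100))  # DECLINED: insufficient funds
```

The same shape, with health records instead of balances and read/write grants instead of withdrawals, is what a decentralized patient record exchange would look like.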
Similarly, the general practitioner, pharmacy or emergency room should be able to access relevant parts of your record over the Internet, if you have authorized them to do so. They can add observations, treatment information and prescriptions to your record to maintain a holistic view. If you would like to support the progress of science, you can allow certain qualified institutions to use your medical data, stripped of your personal information, for analytical purposes. They will need to notify you when they do this and for which purpose.
A combination of private and public initiatives is the way forward. Once standards are agreed, there could be multiple systems that a patient can choose from to store her record. For instance, the patient can delegate this to her community hospital, because she trusts them to look out for her interests. She can authorize her local pharmacy to access relevant parts of her record and update this when she picks up her prescription. This can be totally browser-based without the need for extensive system integration or an overbearing government. The user manages her demographic data and authorized organizations can be notified of any updates, like change of employer and insurance or change of address.
This approach puts the onus of ownership and control on the user, limits the role of the government to support of standardization and certification and leaves the implementation to private organizations that have the biggest stake in making this a success.
Wednesday, December 9, 2009
The Chasm
Large software projects remain unpredictable. The root causes of projects running way over budget and behind schedule can typically be found on the fault line between business and technology. If CIOs continue to behave like true cost center managers and keep holding on to a technology view of the world, IT will remain neutral at best and a value drain at worst. The technology providers that can provide relief are few and far between, as they tend to suffer from the “hammer syndrome”, i.e. they see every problem as a technology problem that can be killed by throwing more abbreviations at it. SOA, BI and BPM are all helpful tools, but only if they are used in the proper business context. “The business hasn’t given us clear requirements; they keep changing their minds all the time.” Statements like these indicate a chasm between business and IT. In 2008 the Hackett Group published a report showing that companies where the IT discipline is woven into the fabric of the organization have 40% higher margins than peers where it isn’t. According to Faisal Hoque of the BTM Corporation, the convergence of business and IT is a key driver for growth and profitability. This implies that every IT person understands the business value of the application she is working on and every executive has a grasp of the importance of technology for the company direction. Key technology decisions are business decisions. IT is managed using financial models that continuously measure the value added to the organization. Applications are grouped in portfolios and directed on value, cost and risk.
In the last couple of years we have seen some dramatic changes in the IT world. A new generation of applications, built for the Web and utilizing open source and collaborative development models, is allowing companies to implement flexible applications much faster and cheaper. Applications like Salesforce.com are shared among multiple companies (so-called multi-tenant systems) and offer a web-based platform to integrate with existing systems. Applications that “live in the Cloud”, as this is referred to, eliminate the need to manage infrastructure. In general it doesn’t make sense for most companies to own and manage their own technology infrastructure. They lack the expertise and scale to do this effectively. This is better left to specialists like HP and IBM or new players like Amazon.
Web technologies allow organizations to interact and exchange knowledge effectively. Business analysts, architects and project managers will remain close to the business, but there is no need to have developers, testers and configuration managers within the walls of your organization. Using collaborative development tools and methodologies it is possible to get the right talent at the right moment, independent of location and time-zone. When the project is well managed and scope and specifications are under control it doesn’t matter if the code is being cut in Bangor, Maine or Bangalore, India.
To sum it up, there is a plethora of wonderful new models and tools to make IT more efficient. In the end it’s only going to be effective if the barrier between business and IT disappears and IT becomes a true enabler of business value.
Thursday, October 22, 2009
Facebook knows more about you than you do
So you have added hundreds of people to your Facebook account, from your closest friends to vague acquaintances or even people you have never met. You interact with them on topics that interest you, you post, tweet and twitter to make sure everyone knows you exist or just for the fun of it. You play some games or do quizzes; the ones that compare you to animals or famous people. Innocent stuff, right? Maybe.
Facebook knows your employer, education, sexual orientation, friends and interests. You use FB on your iPhone as well, and thanks to the GPS in your phone everyone is aware of your whereabouts. And those quizzes you take? They are neatly packaged psychological tests that try to glean information about you, taking profiling to the next level. Using behavioral analytics, all this data is compared with that of three hundred million people (a user base the size of the entire population of the US) to find patterns. Knowing who is like you, they can now fairly accurately predict what you will do next. In essence they create a model of you and use it to target advertising. Moreover, they can aggregate information over all these users and apply “social sensors” to see if tastes or moods are changing among certain groups. This is powerful knowledge… People who are more cynical than me can probably come up with some horror scenarios.
According to the ACLU of Northern California, most people don’t know that Facebook’s default privacy settings allow full access to a user’s information. It is actually worse: every time one of a user’s friends takes a quiz, the quiz has access to that user’s profile information. Of course that has not gone unnoticed, and some users sued FB, alleging that the social networking site violates several state laws aimed at protecting consumers’ privacy. According to the WSJ, the complaint accuses FB of failing to compensate its users for harvesting their personal data and of violating laws that protect consumers from having information they upload to the site shared with third parties, such as advertisers. Hopefully Facebook will strengthen the privacy of personal data, but it seems that the floodgates have opened.
Interestingly, when I talk to my kids they are not that concerned. They have grown up with Facebook and actually use it as a tool to position themselves, to basically promote the “brand me”. They have become incredibly clever at manipulating the way they are perceived through their Facebook presence. This of course raises the question of how much of what they post is real. As Daniel Henninger writes in an excellent article in today’s WSJ: “it's getting harder to know what's real and unreal in a world that always seems to be slipping slightly out of focus.” Probably the word “reality” itself lost its meaning a while back, when the Dutch Endemol group launched the first reality show, aptly named “Big Brother”.
Tuesday, September 22, 2009
Netflix found a cheap fix for innovation
Netflix started its business as a DVD rental company a couple of years back. They cleverly combine a net-based ordering system with a well-thought-out physical distribution approach for the flicks on DVD. You select your movies from their website and put them in a queue. Every time you send a DVD back, by dropping it in a pre-addressed, pre-paid envelope in your mailbox, you get a new one. No late fees. Great for people like me who want to bring DVDs on a trip or just forget about them. Last year they introduced movies online, to watch either on your PC or TV via a small Wi-Fi-connected box. Guess what? They actually have old Dutch favorites like Soldier of Orange and countless other non-mainstream movies.
When you log on, the first screen that comes up is “Movies you’ll love”. I have rated around 400 movies so far (nowhere near the guy who rated 5,000 movies in one day). The ratings vary from 1 to 5 and I generously bestowed Ben Hur, Citizen Kane and The Godfather with the maximum number of stars. My ratings are continually compared with those of people who have similar tastes, and by applying some complex algorithm Netflix predicts which movies I might be interested in. They show both the overall rating and the one they believe I will give to a movie I am considering watching. Interestingly these two are rarely the same, and my ratings are consistently close to their predictions.
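Netflix does not publish its algorithm, but the “people with similar tastes” idea can be illustrated with a minimal user-based collaborative filter: predict my rating for an unseen movie as a similarity-weighted average of the ratings given by users who rate other movies the way I do. The ratings below are invented for illustration:

```python
import math

ratings = {
    "me":    {"Ben Hur": 5, "Citizen Kane": 5, "The Godfather": 5},
    "alice": {"Ben Hur": 5, "Citizen Kane": 4, "The Godfather": 5, "Casablanca": 5},
    "bob":   {"Ben Hur": 1, "Citizen Kane": 2, "The Godfather": 1, "Casablanca": 2},
}

def cosine(u: dict, v: dict) -> float:
    """Cosine similarity over the movies both users have rated."""
    shared = set(u) & set(v)
    if not shared:
        return 0.0
    dot = sum(u[m] * v[m] for m in shared)
    nu = math.sqrt(sum(u[m] ** 2 for m in shared))
    nv = math.sqrt(sum(v[m] ** 2 for m in shared))
    return dot / (nu * nv)

def predict(user: str, movie: str) -> float:
    """Similarity-weighted average of other users' ratings for `movie`."""
    num = den = 0.0
    for other, theirs in ratings.items():
        if other == user or movie not in theirs:
            continue
        sim = cosine(ratings[user], theirs)
        num += sim * theirs[movie]
        den += sim
    return num / den if den else 0.0

print(round(predict("me", "Casablanca"), 2))  # 3.54
```

Real systems mean-center the ratings first (otherwise even a user with opposite tastes, like “bob” here, gets a positive weight) and operate at vastly larger scale, but the weighted-neighbor structure is the same.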
A while back Netflix invited anyone with a beautiful mind to participate in a contest with $1 million in prize money, to come up with an algorithm that would improve the accuracy of their current system by more than 10%. To make this possible they made millions of data records available and tested the submitted algorithms against the actual ratings submitted by customers. According to BusinessWeek, the winning team, which includes scientists from AT&T Research, Yahoo’s Israel lab, and computer scientists from Austria and Canada, blended more than 700 different statistical models into their formula. The next contest will be for algorithms that predict the popularity of new movies, based on your rental history, demographics and other profile attributes.
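The “blending” the winning team used can be shown at toy scale: combine two imperfect predictors with a weight chosen to minimize squared error on held-out ratings. The numbers are made up, and the real teams blended hundreds of models with far more sophisticated methods:

```python
def mse(pred, truth):
    """Mean squared error between predictions and actual ratings."""
    return sum((p - t) ** 2 for p, t in zip(pred, truth)) / len(truth)

truth = [5, 3, 4, 1]            # actual held-out ratings
model_a = [4.5, 3.5, 4.5, 2.0]  # one predictor, biased high
model_b = [5.5, 2.0, 3.0, 0.5]  # another, noisier predictor

best_w, best_err = 0.0, float("inf")
for i in range(101):            # grid-search the blend weight
    w = i / 100
    blend = [w * a + (1 - w) * b for a, b in zip(model_a, model_b)]
    err = mse(blend, truth)
    if err < best_err:
        best_w, best_err = w, err

# The blend (MSE ~0.04) beats either model alone (~0.44 and ~0.63):
# the models' errors partly cancel each other out.
print(best_w, round(best_err, 3))
```

This is why merging competing teams paid off in the contest: predictors that make different mistakes are worth more together than apart.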
Netflix has stared into the future and seen what is happening in the media world. As content proliferates and the lines between user and producer blur, it will get harder and harder to separate the wheat from the chaff. While the studios will continue to generate blockbusters with big stars and maybe even come up with an original plot instead of a comic book rewrite, we can expect more and more interesting low-budget movies for smaller audiences. The movie business is even worse than the fashion industry: the few hits have to make up for multiple misses. But Netflix has something the studios don’t: a thorough understanding of their customers’ tastes and interests down to the individual level. Soon they will have a way to predict whether a movie will make it or flop. And they paid only $1 million to get there, while the big studios keep shooting in the dark.
Netflix has understood that they don’t need to hire armies of PhDs in statistical analysis to get what they need. They just “crowdsource” their innovation. According to the New York Times, thousands of teams from 186 countries made submissions. The winning group is a merger of smaller teams that initially competed against each other. A Survivor-like situation emerged, with individuals and groups trying to form alliances to beat the others.
Google, Amazon and Netflix run highly profitable, multi-billion-dollar businesses based on a simple principle: attract as many users as possible and have them interact on your site, gain a detailed understanding of their needs and interests, and analyze this against huge quantities of data on their peers to create the best value propositions. There are some lessons here.
Wednesday, September 16, 2009
Nothing endures but change
It is a cliché to state that globalization has been a tremendous force of change over the last few decades. Never before have economies, markets and supply chains in different parts of the world been so entwined. This has led to unprecedented growth of the world economy. It allowed China to lift around 500 million people out of poverty since the country opened up in the late seventies. It turned India from a socialist, autarkic backwater into an economic power and helped another 122(!) countries grow more than 4% in 2007.
But there is a flip side: the resulting interdependence has brought complexity and instability. The butterfly effect applies: “small variations of the initial condition of a dynamical system may produce large variations in the long term behavior of the system”.
When the financial markets collapsed, the repercussions were felt around the globe, and as a consequence most western countries will experience a shrinking economy in 2009. The causes are manifold. There were the financial managers, whose short-term decisions were guided by multi-million-dollar bonuses. Then the compulsive consumers on a spending spree, high on cheap credit, supported by the huge trade deficits with China. Lax regulation, opacity and complexity of markets also played a role. All was well until the bubble burst and the walls came crumbling down. For a brief period Europe was basking in schadenfreude, and then governments quickly had to step in to rescue their local banks. Iceland’s financial products, too good to be true, turned out to be just that. Within a couple of weeks this pristine country’s stock exchange lost 90% of its value.
I guess by now everyone realizes that financial markets operate 24 hours a day, around the globe. They have become more dynamic and complex and evolve faster than most people can grasp. Nassim Taleb, who writes in The Black Swan (published before the crash) about what to do with a world we don’t understand: “Globalization creates interlocking fragility, while reducing volatility and giving the appearance of stability. In other words it creates devastating Black Swans. We have never lived before under the threat of a global collapse.” Some of the world’s leading financial gurus, including Alan Greenspan and Warren Buffett, have openly admitted that they failed to see what was happening. Buffett states in his letter to shareholders: “I made some errors of omission, sucking my thumb when new facts came in that should have caused me to re-examine my thinking and promptly take action.” On the theoretical front, Nobel Prize winner Paul Krugman puts it as follows: “much of the past 30 years of macroeconomics was spectacularly useless at best, and positively harmful at worst.”
Other industries may not show the same dynamics as financial services, but an acceleration of change is happening in IT, healthcare, automotive and media.
The IT industry has always been prone to change. What happened to Wordstar, Visicalc, Wang and DEC, just to name a few of the hundreds of brands that disappeared? Given the steady onslaught of open source and cloud services, where will the companies go that sell expensive software with outrageous maintenance contracts and the need for armies of specialists to install and keep it running? Successful product companies like Apple have focused on design and online services. They don’t own any factories. Google has built their business model around getting huge numbers of users and mining their data on the largest imaginable scale. Those providing software services can no longer depend on their proprietary knowledge, as there is currently more and better technology information in the public domain than in any of these companies. Those without a substantial workforce in emerging economies like India, China, Philippines or Argentina will soon find themselves out of work.
Oil prices have been fluctuating wildly. Last summer, in a matter of weeks, the price at the pump doubled. At the same time the discussion on global warming moved to the foreground. Gas guzzlers like the Hummer suddenly lost their coolness. Although years in the making, the decline of the Detroit Big Three became clear to all. Ford managed to fight its way back to profitability, but Chrysler and GM had to be bailed out by the government. They didn’t act quickly and resolutely enough. Western car markets have shrunk by almost 20%. Meanwhile the car markets in India and China were alive and well. Tata launched a revolutionary $2,500 car that competes not only with low-end cars but also with motorbikes. And countries like Australia and Israel are working on an electric car infrastructure. This is an industry in transformation.
The media industry is struggling. CD sales are in terminal decline. Though paid-for downloads are increasing steadily, more and more kids are listening to personalized Internet radio and are no longer interested in owning songs. In the last year hundreds of newspapers went bankrupt, losing the battle with their online competition. Open source models are appearing to compete with traditional publishing and Google is putting millions of books online. YouTube is replacing television as the most important entertainment medium. Time to rethink business models in this industry.
Most companies react to change with reorganizations or mergers and acquisitions. It is harder to adapt the company’s business model and processes to new customer needs, competition or regulation. One of the reasons is that the key processes are supported by software applications and a technology infrastructure that resist change. Business models also have the tendency to assume a linear world, rather than one in which sudden events and disruptions become the rule rather than the exception.
The fashion industry always had to deal with fickle consumer tastes that could change on a dime. For most apparel companies it is hit or miss. In contrast, the Spanish company Zara has built their business model around customer insight and agility. According to the Harvard Business Review: “Zara has developed a superresponsive supply chain. The company can design, produce, and deliver a new garment and put it on display in its stores worldwide in a mere 15 days. Such a pace is unheard-of in the fashion business, where designers typically spend months planning for the next season.” Zara operates 1,500 stores in 71 countries. They aim to be as close as possible to the customer. Every day they collect sales data from all of their stores. These data are analyzed and related to inventories and other operational data. Slow moving inventory in one store can be moved to a fast moving store. They use a team-based approach in which designers and product managers work closely together to continually evolve their clothing lines, based on the information they receive about sales as well as input from store managers. Most products have very short life cycles. This gives their clothes a level of exclusivity and forces the consumer to buy today, because it may be gone tomorrow. Accurate forecasting is not required: they adjust as they go along. The company has a vertically integrated supply chain, keeping half of the work in-house, and a network of partners and subcontractors. Its supply chain is built around speed of operation. The CEO of Zara: “you need to have five fingers touching the factory and five touching the customer.”
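The daily reallocation described above can be sketched as a simple rule: compute each store’s days of cover at today’s sales rate and move units from stores holding too much to stores holding too little. The store names, sales and stock figures below are invented, and Zara’s actual logistics are far more involved:

```python
# Illustrative only: "slow store -> fast store" rebalancing for one garment.
stores = {
    "Madrid":    {"sold_today": 12, "stock": 3},   # selling fast, almost out
    "Amsterdam": {"sold_today": 1,  "stock": 20},  # barely moving, overstocked
}

def rebalance(stores: dict, days_cover: int = 3) -> list:
    """Move units from stores holding more than `days_cover` days of
    stock (at today's sales rate) to stores holding less."""
    surplus = {s: d["stock"] - d["sold_today"] * days_cover
               for s, d in stores.items()}
    givers = [s for s, x in surplus.items() if x > 0]
    takers = [s for s, x in surplus.items() if x < 0]
    moves = []
    for g in givers:
        for t in takers:
            qty = min(surplus[g], -surplus[t])
            if qty > 0:
                moves.append((g, t, qty))
                surplus[g] -= qty
                surplus[t] += qty
    return moves

print(rebalance(stores))  # [('Amsterdam', 'Madrid', 17)]
```

The point is not the rule itself but the loop around it: fresh sales data every day means the forecast never has to be right, only the correction fast.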
Like Zara, which captures customer information in every store every day, companies that have a strong engagement with their customers tend to do better. They react faster and have more influence on customer needs. Many consumer goods and apparel companies that sell their products through distributors have to make assumptions about their clients’ needs, as they lack a direct channel and timely data. But current technology allows these companies to get in touch with their clients directly, without the need to invest in a large retail network. They should set up highly interactive portals to promote information exchange between their stakeholders. Clients in different markets are prompted to share experiences. Forums on product customization or service improvement allow insight into customer needs. Events, both online and offline, can be organized to rally the fan base. All this interaction can be analyzed to glean information on current and future customer needs. While this will not generate the next breakthrough product, it gives input on product improvement and generally strengthens the ties between supplier and consumer. It will give the company an early warning system for changing behavior.
Successful companies constantly monitor the environment and assess the potential impact of changes on the company’s business. One of the largest investment banks has created online dependency maps of the companies they invested in, similar to the ones you can find at News Dots. It visualizes the most recent topics in the news as a giant network, highlighting “hot areas” that may require action. So when GM got in the news about its issues, it immediately showed which companies had substantial subcontracting relationships. New technologies that search large quantities of data and find patterns and associations can provide early warning of impending change. Obviously, dealing with that change in the right way is a matter of leadership and organizational agility. More on that topic in my next blog.
Wednesday, August 26, 2009
Open Source Ingrained in Ingres
He comments: “Most concepts that are fine in theory don’t work in practice. With open source it is the other way around. It goes against all organizational principles, but turns out to be highly effective.” He and his team have developed a network of thousands of users, developers and contributors. They created a lively community committed to keeping the Ingres products at the forefront of technology. The business model has evolved from a traditional licensing model, which aims to lock in clients and charge them annual maintenance fees, to one of implementation, optimization and support. Ingres technology is freely available as a download and the company makes money selling a support subscription to users who are running mission-critical workloads.
Roger is particularly excited about “Open Innovation”. They started working with the Amsterdam-based Centrum Wiskunde & Informatica (Centre for Mathematics and Computer Science) on the next generation of database servers. Their starting point was that the architecture of database software was based on 20-year-old technology and that the explosion of stored data, with its 50-100% annual growth, required fresh thinking. CWI and Ingres invited a community of scientists to work on a project dubbed “Vectorwise Computing”. In essence they redesigned the software to take advantage of the very large number of transistors packed on today’s Intel chips and to resolve the fact that memory has become a major bottleneck. The global team, of which many members have never met face-to-face, is led by two of Ingres’s top engineers and two CWI scientists, alongside leading researchers from around the world. Ingres ensures that the end result is a commercially viable product, available to the development community as open source.
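This is not Ingres or CWI code, but the core idea behind vectorized execution can be shown in a few lines: instead of paying the interpreter’s per-tuple overhead on every row, an operator processes a whole block of values at once, the access pattern that modern CPUs, caches and SIMD units reward:

```python
from array import array

prices = array("d", [10.0, 20.0, 30.0, 40.0])
qty = array("d", [1.0, 2.0, 3.0, 4.0])

# Tuple-at-a-time: per-row interpretation overhead is paid on every iteration.
total_row = 0.0
for p, q in zip(prices, qty):
    total_row += p * q

# Vectorized: one multiply pass over the whole block, then one sum pass.
# (In a real engine each pass is a tight loop over a contiguous array,
# which the compiler can turn into SIMD instructions.)
products = array("d", (p * q for p, q in zip(prices, qty)))
total_vec = sum(products)

assert total_row == total_vec == 300.0
```

Python cannot show the speedup itself, but the restructuring is the point: the second form trades many tiny interpreted steps for a few bulk operations over contiguous memory.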
For their internal technology needs, Roger decided to move away from in-house developed, managed and operated systems. He felt that the company needed a flexible application base that would allow them to rapidly integrate the acquisitions they had planned and to scale with the volume of business. His direction was to use “Software as a Service” where viable. The obvious advantages of this approach are short implementation cycles, flexible cost with low upfront investment and minimal operational management. The potential downside is lock-in and dependency on the provider. He is increasingly looking for open source software that runs in the Cloud, to get the benefits of the service and avoid the risks of vendor dependency. Currently his core business applications are obtained as a service: CRM from Salesforce.com and ERP from Intacct.
Since Roger has actively lived the tremendous innovation and opportunities spawned by the open standards and open source movement in the Internet space, he is a staunch believer in applying the same concepts to one of the biggest issues facing us: global warming. He is working with a number of organizations to accelerate the speed of innovation in green technology.
Open Source is becoming a phenomenon that extends well beyond software. Its collaboration model with large groups of people around the globe contributing on the Internet to common solutions, is making inroads in education, healthcare and now green technology. The organizations that know how to tap into these opportunities by leveraging web tools and engaging key stakeholders will set themselves apart.