This week’s NY Times mentioned that badly needed legislation is being considered to tighten the security of personal data. Clearly we need safeguards for the protection of personal data. However, it is time to take it a step further.
Given the state of technology, we should take a fresh look at privacy and ownership of personal data. Let’s start by laying down the first principle: you own your personal data, and you may grant others (like Facebook, the government, credit card companies, your doctor) the right to use this data for purposes that you agree with. The second principle is that, if you opt for it, you should be notified when someone accesses your data. Today 75% of the population in the USA has broadband Internet (85% in the Netherlands), and by 2013 mobile phone penetration is expected to reach 100%. That is hardly a barrier to implementing low-cost notification by SMS or email.
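To make that second principle concrete, here is a minimal sketch in Python of what an access-notification hook could look like. All names (PersonalRecord, send_notification) are illustrative, not an existing API; a real system would hand the message off to an SMS or email gateway.

```python
# Minimal sketch of the notification principle: every read of personal
# data is logged and, if the owner opted in, triggers an alert.
from dataclasses import dataclass, field
from datetime import datetime, timezone


@dataclass
class AccessEvent:
    accessor: str       # who read the data, e.g. "insurer-x"
    purpose: str        # the purpose the owner agreed to
    timestamp: datetime


@dataclass
class PersonalRecord:
    owner_contact: str                  # phone number or email address
    data: dict = field(default_factory=dict)
    notify: bool = True                 # principle two is opt-in
    log: list = field(default_factory=list)

    def read(self, accessor: str, purpose: str) -> dict:
        event = AccessEvent(accessor, purpose, datetime.now(timezone.utc))
        self.log.append(event)          # audit trail of every access
        if self.notify:
            send_notification(self.owner_contact, event)
        return self.data


def send_notification(contact: str, event: AccessEvent) -> None:
    # Placeholder: in practice this hands off to an SMS or email gateway.
    print(f"To {contact}: {event.accessor} accessed your data "
          f"for '{event.purpose}' at {event.timestamp:%Y-%m-%d %H:%M}.")
```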
Your health record falls under your personal data. You own your record and are responsible for maintaining it. It is your life. That doesn’t mean you have to type in whatever you have just learned from your doctor at your last visit. You can delegate this to trusted sources, like your general practitioner, hometown pharmacy or community hospital. I have tried to follow the discussion around the electronic patient dossier in the Netherlands, and it was obvious why a centralized, government-driven approach will not work. In the Netherlands the government still plays a paternalistic role (they call it “father state”), with cradle-to-grave support of its citizens. Times they are a-changing. Many people simply don’t trust the government to build a working central system and guarantee the privacy of their data. Fortunately there are viable alternatives.
While establishing a standard for electronic patient records may not be the hardest part (there are working examples in the market, like the Continuity of Care Document), securing these records is more complicated. We should look at that treasure trove of free options: the open source world. There is a well-established standard for user identification and authentication, OpenID, which is supported by companies like Google, IBM, PayPal and Yahoo. If it’s good enough for the largest online companies in the world, it’s probably good enough for this purpose. We can also leverage standards that ensure data is stored in encrypted form at locations that are certified to be genuine. A “security provisioning service” can facilitate the enrollment of users and the assignment of read and write rights to the records. A directory service can contain the list of approved medical staff.
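A rough Python sketch of how such a provisioning service and directory could fit together. The directory entries and class names are made up for illustration; a production system would sit behind OpenID authentication and store the grants encrypted.

```python
# Sketch of a "security provisioning service": the patient grants and
# revokes read/write rights, checked against a directory of approved
# medical staff. All identifiers here are illustrative.
APPROVED_DIRECTORY = {"dr.jansen@hospital.example", "desk@pharmacy.example"}


class ProvisioningService:
    def __init__(self):
        self.grants = {}   # (patient_id, staff_id) -> set of rights

    def grant(self, patient_id: str, staff_id: str, rights: set) -> None:
        # Only staff listed in the directory service can receive rights.
        if staff_id not in APPROVED_DIRECTORY:
            raise PermissionError(f"{staff_id} is not approved medical staff")
        self.grants.setdefault((patient_id, staff_id), set()).update(rights)

    def revoke(self, patient_id: str, staff_id: str) -> None:
        self.grants.pop((patient_id, staff_id), None)

    def may(self, patient_id: str, staff_id: str, right: str) -> bool:
        return right in self.grants.get((patient_id, staff_id), set())


service = ProvisioningService()
service.grant("patient-1", "dr.jansen@hospital.example", {"read", "write"})
print(service.may("patient-1", "dr.jansen@hospital.example", "read"))  # True
```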
One of the weak links remains the fact that currently most of the information is password protected. This can be strengthened with one-time passwords sent to a registered mobile phone, or generated by a dedicated device. A robust system may require the option of biometric authentication to uniquely identify the user. The government could provide authentication services through public devices, for instance by setting up identification and authentication centers (in post offices, say) and possibly leveraging the major banks’ ATM networks (the government owns most of the big banks these days anyway). One might even consider iris recognition for highly secure transactions like reporting identity theft.
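One-time passwords are not exotic: the HMAC-based variant standardized in RFC 4226 fits in a dozen lines of Python using only the standard library. This is a sketch of the algorithm itself, not of any particular vendor’s token.

```python
# HMAC-based one-time password (HOTP, RFC 4226), the mechanism behind
# most OTP tokens and SMS codes. Standard library only.
import hashlib
import hmac
import struct


def hotp(secret: bytes, counter: int, digits: int = 6) -> str:
    # HMAC-SHA1 over the big-endian 8-byte counter, per the RFC.
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


# Server and device share the secret and a counter; both compute the
# same code, and the counter moves on, so each password is valid once.
print(hotp(b"shared-secret", counter=42))
```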
In bustling, chaotic India the government has just embarked on a huge program to provide each of its roughly 1.2 billion citizens (nobody knows exactly how many) with a unique identification number, using fingerprints as the authentication mechanism. In the Netherlands fingerprinting is now mandatory to obtain a passport. If India can deal with this problem at the scale of a billion-plus, the Netherlands should be able to handle 16 million users.
A decentralized approach using data and security standards will be much easier to implement than any centralized approach that requires huge databases and complex integration. In the banking world, decentralized solutions based on standards have worked well for the last twenty years or so. You can withdraw money at virtually any ATM around the world. This is not because there is one centralized ATM network with a huge database, but because there are established standards for data exchange, security and settlement between the financial institutions that hold the data of their clients. Once you enter your card and PIN, the ATM sends encrypted identification and authentication information over the network to your bank, which returns an approval (if you have sufficient balance), prompting the ATM to dispense the money and your bank to debit your account.
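The pattern is easy to caricature in code. In this toy Python model (all classes hypothetical, and with the encryption of the real protocol omitted), the ATM holds no account data at all; it merely routes the request to the institution that does.

```python
# Toy model of the decentralized pattern: the terminal never sees the
# account database, it only routes an authorization request to the
# institution that holds the data. Encryption is omitted for brevity.
class IssuingBank:
    def __init__(self, accounts):
        self.accounts = accounts            # card number -> (pin, balance)

    def authorize(self, card_number, pin, amount):
        stored_pin, balance = self.accounts.get(card_number, (None, 0))
        if pin != stored_pin or amount > balance:
            return False
        self.accounts[card_number] = (stored_pin, balance - amount)
        return True                         # the debit happens at the source


class ATM:
    def __init__(self, network):
        self.network = network              # card prefix -> issuing bank

    def withdraw(self, card_number, pin, amount):
        bank = self.network[card_number[:4]]    # standards route the request
        if bank.authorize(card_number, pin, amount):
            return f"Dispensing {amount}"
        return "Declined"


bank = IssuingBank({"4321-0001": ("1234", 500)})
atm = ATM({"4321": bank})
print(atm.withdraw("4321-0001", "1234", 100))   # Dispensing 100
```

Replace “bank” with “hospital” and “balance” with “patient record” and the same routing idea applies to health data.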
Similarly, the general practitioner, pharmacy or emergency room should be able to access relevant parts of your record over the Internet, if you have authorized them to do so. They can add observations, treatment information and prescriptions to your record to maintain a holistic view. If you would like to support the progress of science, you can allow certain qualified institutions to use your medical data, stripped of your personal information, for analytical purposes. They should be required to notify you of each such use and its purpose.
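In code, “relevant parts, stripped of personal information” amounts to a whitelist. A minimal Python sketch, with made-up field names (bsn is the Dutch citizen service number):

```python
# Sketch of sharing for research: keep only the fields in scope and
# drop anything identifying. Field names are illustrative.
RESEARCH_SCOPE = {"diagnosis", "treatment", "year_of_birth"}


def research_view(record: dict) -> dict:
    # Whitelisting is safer than blacklisting: unknown fields stay out.
    return {k: v for k, v in record.items() if k in RESEARCH_SCOPE}


record = {"name": "J. de Vries", "bsn": "123456789",
          "diagnosis": "type 2 diabetes", "treatment": "metformin",
          "year_of_birth": 1962}
print(research_view(record))
# {'diagnosis': 'type 2 diabetes', 'treatment': 'metformin', 'year_of_birth': 1962}
```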
A combination of private and public initiatives is the way forward. Once standards are agreed, there could be multiple systems that a patient can choose from to store her record. For instance, the patient can delegate this to her community hospital, because she trusts them to look out for her interests. She can authorize her local pharmacy to access relevant parts of her record and update this when she picks up her prescription. This can be totally browser-based without the need for extensive system integration or an overbearing government. The user manages her demographic data and authorized organizations can be notified of any updates, like change of employer and insurance or change of address.
This approach puts the onus of ownership and control on the user, limits the role of the government to supporting standardization and certification, and leaves the implementation to private organizations that have the biggest stake in making this a success.
Wednesday, December 9, 2009
The Chasm
Some IT organizations think they have the wherewithal to do everything in-house. They manage their own projects, hire some consultants and run their own datacenters. Around 80% of the IT budget is allocated to keeping existing systems running, which has little or no impact on company growth. Managing legacy applications is labor intensive, as it typically involves a Babylon of languages, applications and systems. The fact that this hodgepodge functions at all can be called a miracle. If a business process changes or a new product is introduced to the market, the IT department takes months, sometimes years, to bring about the necessary changes. There are too many dependencies, and complexity is too high, for the application base to adapt rapidly. The transition to ERP packages such as SAP and Oracle did not resolve the flexibility issue. Implementing SAP is seen by many as “pouring super expensive concrete into the foundation of the corporation”: rich in functionality, high in cost, but agility is not a characteristic. Most organizations are better served by simplification and modernization of their existing systems than by the addition of new features (this also applies to Microsoft Office).
Large software projects remain unpredictable. The root causes of projects running way over budget and over time can typically be found on the fault line between business and technology. If CIOs continue to behave like true cost center managers and keep holding on to a technology view of the world, IT will remain neutral at best and a value drain at worst. The technology providers that can provide relief are few and far between, as they tend to suffer from the “hammer syndrome”: they see every problem as a technology problem that can be killed by throwing more abbreviations at it. SOA, BI and BPM are all helpful tools, but only if they are used in the proper business context. “The business hasn’t given us clear requirements; they keep changing their minds all the time.” Statements like this indicate a chasm between business and IT. In 2008 the Hackett Group published a report showing that companies where the IT discipline is woven into the fabric of the organization have 40% higher margins than peers where it is not. According to Faisal Hoque of the BTM Corporation, the convergence of business and IT is a key driver of growth and profitability. This implies that every IT person understands the business value of the application she is working on, and every executive has a grasp of the importance of technology for the company’s direction. Key technology decisions are business decisions. IT is managed using financial models that continuously measure the value added to the organization. Applications are grouped in portfolios and managed on value, cost and risk.
In the last couple of years we have seen some dramatic changes in the IT world. A new generation of applications, built for the Web and utilizing open source and collaborative development models, is allowing companies to implement flexible applications much faster and cheaper. Applications like Salesforce.com are shared among multiple companies (so-called multi-tenant systems) and offer a web-based platform to integrate with existing systems. Applications that “live in the Cloud”, as this is euphemistically referred to, negate the need to manage infrastructure. In general it doesn’t make sense for most companies to own and manage their own technology infrastructure. They lack the expertise and scale to do this effectively. This is better left to specialists like HP and IBM or new players like Amazon.
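The multi-tenant idea itself is simple enough to show in a few lines of Python (the schema here is invented for illustration): one shared application and database, with every query scoped to the calling company’s tenant id.

```python
# Sketch of multi-tenancy: one shared table, strict tenant scoping.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE leads (tenant_id TEXT, name TEXT)")
db.executemany("INSERT INTO leads VALUES (?, ?)",
               [("acme", "Lead A"), ("globex", "Lead B")])


def leads_for(tenant_id: str) -> list:
    # The tenant filter is mandatory on every query; that single
    # convention keeps companies' data apart in a shared system.
    rows = db.execute("SELECT name FROM leads WHERE tenant_id = ?",
                      (tenant_id,))
    return [name for (name,) in rows]


print(leads_for("acme"))   # ['Lead A']; acme never sees globex's data
```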
Web technologies allow organizations to interact and exchange knowledge effectively. Business analysts, architects and project managers will remain close to the business, but there is no need to have developers, testers and configuration managers within the walls of your organization. Using collaborative development tools and methodologies, it is possible to get the right talent at the right moment, independent of location and time zone. When the project is well managed, and scope and specifications are under control, it doesn’t matter whether the code is being cut in Bangor, Maine or Bangalore, India.
To sum it up, there is a plethora of wonderful new models and tools to make IT more efficient. In the end it’s only going to be effective if the barrier between business and IT disappears and IT becomes a true enabler of business value.