Monthly Archives: August 2010

The Top 50 Cloud Bloggers

I’m happy to learn that I’ve made Cloud Computing Journal’s list of the Top 50 Bloggers in Cloud Computing.

What, Me Worry?

According to Yahoo! News, Infrastructure Services Users Worry Less About Security. The article references a Yankee Group study which found that, although security remains a top barrier slowing the adoption of cloud services in the enterprise, most companies that have adopted Infrastructure-as-a-Service (IaaS) worry less about security once they begin using the technology.

Once they’ve made the leap into the cloud, the article suggests, users conclude that the security issues aren’t as significant as they had been led to believe. I find myself in partial agreement with this; the industry has created a level of hysteria around cloud security that isn’t necessarily productive. Taking pot shots at the security model in the cloud is pretty easy, and so many do—regardless of whether their aim is true (and for many, their aim is not).

Nevertheless, my interpretation of these results is that they are uncovering less a phenomenon of confidence genuinely earned and more a case of misplaced trust. The article makes an interesting observation about the source of this trust:

Twenty-nine percent of the companies in the survey viewed system integrators as their most trusted suppliers of cloud computing. But among early adopters of IaaS, 33 percent said they turn to telecom companies first.

Do you remember, back around the turn of the century, when large-scale PKI was first emerging? The prevailing wisdom was that state-sponsored PKI should be administered by the national post office, because this organization above all was perceived as trustworthy (as well as being centralized and a national responsibility). Hong Kong, for instance, adopted this model. But in general, the postal-run PKI model didn’t take hold, and today few federal post services are in the business of administering national identity. Trust doesn’t transfer well, and trust with letters and packages doesn’t easily extend to trust with identity.

Investing generalized trust in the telcos reminds me of the early PKI experience. The market is immature, and so too are our impressions of it. Truthfully, I think that the telcos will be good cloud providers—not because I have an inherent trust in them (I actively dislike my cell provider on most days), but because the telcos I’ve spoken to that have engaged in cloud initiatives are actually executing extremely well. Nevertheless, I don’t think I should trust them to secure my applications. This is ultimately my responsibility as a cloud customer, and because I can’t reasonably trust any provider entirely, I must assume a highly defensive stance in how I secure my cloud-resident applications.

I hope my provider is good at security, but I need to assume it is not, and prepare myself accordingly.


My New Book, Cloud Computing: Principles, Systems and Applications, is Now Available

I’m happy to announce that I have a chapter in a new cloud computing textbook from Springer. The book is called Cloud Computing: Principles, Systems and Applications. The chapter I wrote is Technologies for Enforcement and Distribution of Policy in Cloud Architectures. If you click on the link you should be able to preview the abstract and the first few pages online.

The editors of the book are Dr. Nick Antonopoulos, who is Professor and Head of the School of Computing at the University of Derby, UK and Dr. Lee Gillam, who is a Lecturer in the Department of Computing at the University of Surrey, UK. I participated on the review committee for the text, and Drs. Antonopoulos and Gillam have pulled together an excellent compilation of work. Although this book is intended as an academic work of primary interest to researchers and students, the content is nevertheless very timely and relevant for IT professionals such as architects or CTOs.

Lately much of my writing has been for a commercial audience, so it was nice to return to a more academic style for this chapter. I’ve carefully avoided book commitments for the last few years, but the opportunity to publish in a Springer book, a publisher I’ve always considered synonymous with serious scientific media, was just too good to pass up. Every book project proves to be more work than the author first imagines, and this was no exception (something to which my family will attest). But I’m very happy with the results, and I hope that this text proves its value to the community.

Why Health Care Needs SOA

A recent article in DarkReading offers a powerful argument as to why the health care sector desperately needs to consider Service Oriented Architecture (SOA). In her piece Healthcare Suffers More Data Breaches Than Financial Services So Far This Year, Erika Chickowski cites a report indicating that security breaches in health care appear to be on the rise this year, with that sector reporting over three times more security incidents than the financial services industry.

I worked for many years in a busy research hospital, and frankly this statistic doesn’t surprise me; health care has all of the elements that lead to the perfect storm of IT risk. If there is one sector that could surely benefit from adopting SOA as a pretext to re-evaluate security as a whole, it is health care.

Hospitals and the health care ecosystem that surrounds them are burdened with some of the most heavily siloed IT I have ever seen. There are a number of reasons why this is so, not the least of which is politics that often appear inspired by the House of Borgia. But the greatest contributing factor is the proliferation of single-purpose, closed and proprietary systems. Even the simplest portable x-ray machine has a tremendously sophisticated computer system inside of it. The Positron Emission Tomography (PET) systems that I worked on included racks of what at the time were state-of-the-art vector processors used to reconstruct massive raw data sets into understandable images. Most hospitals have been collecting systems like this for years, and are left with a curiosity cabinet of samples representing different brands and extant examples of nearly every technological fad since the 1970s.

I’m actually sympathetic to the vendors here because their products have to serve two competing interests. They need to package the entire system into a cohesive whole with minimal ins and outs to ensure it can reasonably pass the rigorous compliance testing required of new equipment. The more open a system is, the harder it is to control the potential variables—a truism in the security industry as well. Even something as simple as OS patching needs to go through extensive validation because the stakes are so high. The best way to manage this is to close up the system as much as reasonably possible.

In the early days of medical electronics, the diagnostic systems were very much standalone and this strategy was perfectly sound. Today, however, there is a need to share and consolidate data to potentially improve diagnosis. This means opening systems up—at least to allow access to the data, which when approached from the perspective of traditional, standalone systems, usually means a pretty rudimentary export. While medical informatics has benefited some from standardization efforts, the medical systems still generally reduce to islands of data connected by awkward bridges—and it is out of this reality that security issues arise.

Chickowski’s article echoes this, stating:

To prevent these kinds of glaring oversights, organizations need to find a better way to track data as it flows between the database and other systems.

SOA makes sense in health care because it allows for effective compartmentalizing of services—be they MRI scanners, lab results, or admission records—that are governed in a manner consistent with an overall security architecture. Good SOA puts security and governance up front. It provides a consistent framework that avoids the patchwork solutions that too easily mask significant security holes.

A number of forward-looking health providers have adopted a SOA strategy with very positive results. Layer 7 Technologies partnered with the University of Chicago Medical Center (UCMC) to build a SOA architecture for securely sharing clinical patient data with their research community. One of the great challenges in any medical research is to gather sample populations that are statistically significant. Hospitals collect enormous volumes of clinical data each day, but often these data cannot be shared with research groups because of issues in compliance, control of collection, patient consent, etc. UCMC uses Layer 7’s SecureSpan Gateways as part of its secure SOA fabric to isolate patient data into zones of trust. SecureSpan enforces a boundary between clinical and research zones. In situations where protocols allow clinical data to be shared with researchers, SecureSpan authorizes its use. SecureSpan even scrubs personally identifiable information from clinical data—effectively anonymizing the data set—so that it can be ethically incorporated into research protocols.
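To make the idea of scrubbing concrete, here is a minimal sketch of what de-identification at a trust boundary might look like. This is purely illustrative—it is not SecureSpan's actual implementation, and the field names (`name`, `ssn`, `patient_id`, and so on) are assumptions for the example. The one subtlety worth showing is that instead of simply deleting the patient identifier, a salted one-way hash preserves the ability to link records from the same patient within a study without revealing who that patient is:

```python
import hashlib

# Direct identifiers to strip before records cross into the research zone.
# Field names here are hypothetical; a real deployment would be driven by
# policy (e.g., the HIPAA Safe Harbor identifier list).
PII_FIELDS = {"name", "address", "phone", "ssn", "date_of_birth"}

def scrub(record, salt="study-42"):
    """Return a copy of the record with direct identifiers removed.

    The patient ID is replaced with a salted one-way hash, so records
    from the same patient can still be correlated within a study, but
    the original identity cannot be recovered from the research data.
    """
    clean = {k: v for k, v in record.items() if k not in PII_FIELDS}
    pid = str(record.get("patient_id", ""))
    digest = hashlib.sha256((salt + pid).encode("utf-8")).hexdigest()
    clean["patient_id"] = digest[:16]
    return clean

if __name__ == "__main__":
    record = {
        "patient_id": "MRN-001",
        "name": "Jane Doe",
        "ssn": "000-00-0000",
        "glucose_mg_dl": 92,
    }
    print(scrub(record))  # clinical value survives; identifiers do not
```

In practice a gateway would apply this kind of transformation declaratively, as policy, rather than in application code—but the principle is the same: the clinical measurement crosses the boundary, the identity does not.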

The UCMC use case is a great example of how SOA can be a protector of information, promoting the valuable use of data while ensuring that only the right people have access to the right views of that information.

To learn more about this use case, take a look at the detailed description available on the Layer 7 Technologies web site.