
What We Should Learn From the Apple SSL Bug

Two years ago a paper appeared with the provocative title “The Most Dangerous Code in the World.” Its subject? SSL, the foundation for secure e-commerce. The world’s most dangerous software, it turns out, is a technology we all use on a more or less daily basis.

The problem the paper described wasn’t an issue with the SSL protocol, which is a solid and mature technology, but with the client libraries developers use to start a session. SSL is easy to use but you must be careful to set it up properly. The authors found that many developers aren’t so careful, leaving the protocol open to exploit. Most of these mistakes are elementary, such as not fully validating server certificates and trust chains.
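
To make the pitfall concrete, here is a minimal sketch (mine, not the paper's) of careful TLS setup using libcurl, one of the client libraries the paper examined. The endpoint URL is illustrative and error handling is kept to a minimum.

```c
/* A minimal sketch of careful TLS configuration with libcurl.
 * The URL is illustrative; error handling is kept to a minimum. */
#include <curl/curl.h>
#include <stdio.h>

int main(void) {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl) return 1;

    curl_easy_setopt(curl, CURLOPT_URL, "https://api.example.com/orders");

    /* Both checks must stay on. One mistake the paper documents is setting
     * CURLOPT_SSL_VERIFYHOST to 1 (or TRUE) in the belief that it turns on
     * hostname checking, when in older libcurl only the value 2 actually
     * verifies that the certificate matches the host. */
    curl_easy_setopt(curl, CURLOPT_SSL_VERIFYPEER, 1L);
    curl_easy_setopt(curl, CURLOPT_SSL_VERIFYHOST, 2L);

    CURLcode res = curl_easy_perform(curl);
    if (res != CURLE_OK)
        fprintf(stderr, "request failed: %s\n", curl_easy_strerror(res));

    curl_easy_cleanup(curl);
    curl_global_cleanup();
    return (res == CURLE_OK) ? 0 : 1;
}
```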

Another dramatic example of the pitfalls of SSL emerged this past weekend, when Apple issued a warning about an issue discovered in its own SSL libraries on iOS. The problem seems to come from a spurious "goto fail" statement that crept into the source code, likely the result of a bad copy/paste. Ironically, fail is exactly what this extra code did: clients using the library failed to completely validate server certificates, leaving them vulnerable to exploit.
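
For illustration only, here is a small, self-contained C sketch of that failure mode; the names are mine, not Apple's. The duplicated, unconditional goto skips the remaining verification steps, yet the error variable still holds the success code from the last completed step, so the routine reports success anyway.

```c
/* Illustrative only: a toy model of the "goto fail" failure mode.
 * Names and structure are hypothetical, not Apple's actual source. */
#include <stdio.h>

static int check_step(int ok) { return ok ? 0 : -1; }  /* 0 means success */

static int verify_signature(void) {
    int err;

    if ((err = check_step(1)) != 0)
        goto fail;
    if ((err = check_step(1)) != 0)
        goto fail;
        goto fail;                      /* spurious duplicate: always taken */
    if ((err = check_step(0)) != 0)     /* the failing check never runs */
        goto fail;

fail:
    return err;   /* returns 0, i.e. "verified", despite the skipped check */
}

int main(void) {
    printf("verify_signature() = %d\n", verify_signature());  /* prints 0 */
    return 0;
}
```

A compiler warning for unreachable code, or a test that actually exercises the final check, would have flagged the problem.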

The problem should have been caught in QA; obviously, it wasn't. The lesson to take away here is not that Apple is bad—they responded quickly and efficiently, as they should—but that even the best of the best sometimes make mistakes. Security is just hard.

So if security is so hard, and people will always make mistakes, how should we protect ourselves? The answer is to simplify. Complexity is the enemy of good security because complexity masks problems. We need to build our security architectures on basic principles that promote peer-reviewed validation of configuration as well as continuous audit of operation.

Despite this very public failure, it is safe to rely on SSL as a security solution, but only if you configure it correctly. SSL is a mature technology, and it is unusual for problems to appear in its libraries. But this weekend's event does highlight the uncomfortable degree of trust we necessarily place in third-party code. Obviously, we need to invest our trust carefully. But we also must recognize that bugs happen, and the real test is how effectively we respond when exploits appear and patches become available. Simple architectures work in our favour when the zero-day clock starts ticking.

On Monday at the RSA Conference, CA Technologies announced the general availability of our new SDK for securing mobile transactions. We designed this SDK with one goal: to make API security simpler for mobile developers. We do this by automating authentication and the setup of secure connections with API servers. If developers are freed from tedious security programming, they are less likely to get something wrong—however simple the configuration may appear. In this way, developers can focus on building great apps instead of worrying about security minutiae.

In addition to offering secure authentication and communications, the SDK also provides secure single sign-on (SSO) across mobile apps. Say the word SSO and most people instinctively picture one browser authenticating across many web servers; this common use case defined the term. But SSO can also be applied to the client apps on a mobile device. Apps are isolated from one another on iOS and Android, and sharing information between them, such as an authentication context, is challenging. Our SDK does this automatically and securely, providing a VPN-like experience for apps without the very negative user experience of mobile VPNs.

Let me assure you that this is not yet another opaque, proprietary security solution. Peel back the layers of this onion and you will find a standards-based OAuth+OpenID Connect implementation. We built this solution on top of the SecureSpan Gateway’s underlying PKI system and we leveraged this to provide increased levels of trust.

If you see me in the halls of the RSA Conference, don’t hesitate to stop me and ask for a demo. Or drop by the CA Technologies booth where we can show you this exciting new technology in action.


RSA Conference 2014 Preview And A Special CA Technologies/Layer 7 Event

Despite all our advances in communications—from social networking, to blogs, to actual functional video meetings—the trade conference is still a necessity. Maybe not as much for the content, which makes the rounds pretty fast regardless of whether you attend the show or not, but for the serendipitous meetings and social networking (in the pre-Facebook sense).

I find something comforting in the rhythm and structure a handful of annual conferences bring to my life. The best ones stay rooted in one location, occurring at the same time, year after year. They are as much defined by time and place as topic.

If it’s February, it must be San Francisco and the RSA conference. I’ve attended for years, and despite the draw from the simultaneous Mobile World Congress in Barcelona, RSA is a show I won’t skip. But I do wish MWC would bump itself a week in either direction so I could do both.

As everyone knows, this year the press made much ado of a few high-profile boycotts of the conference and the two alt-cons, Security B-sides and TrustyCon, that sprang up in response. But I think it's important to separate RSA the company from RSA the conference. The latter remains the most important security event of the year.

Every year, one theme rises above the rest. I'm not referring to the "official" theme, but to the trends that appear spontaneously in the Valley. The theme this year should be security analytics. The venture community has put this idea on an aggressive regime of funding injections, so we should expect an entertaining gallery of results, good and bad. Either way, we will learn something, and it would be a poor move to bet against this sector's future.

I'm also expecting 2014 to bring some real SDN traction. Traditional security infrastructure is low-hanging fruit that vendors too often miss. RSA is where SDN for security will finally get its long-awaited debut.

MWC may be the premier event for mobile, but most mobile security companies cover both shows, and CA is no exception. At RSA we're showcasing our new Mobile Access Gateway (MAG). It features SDKs for iOS, Android, and JavaScript that make enterprise authentication simple for mobile developers. As a bonus, the SDK offers cross-app SSO, which means users sign on just once, from any authorized app. You should definitely come by the CA Technologies booth and have a look. And if you do see me at the show, be sure to ask me about the integrated PKI—surely one of the coolest unsung features under the SDK's hood.

CA and Layer 7 will host an afternoon event on Monday, Feb 24 at the nearby Marriott Marquis, and you are invited. You may recall we've held a few of these before, but this year we have a very special guest: Forrester analyst Eve Maler, who will talk about Zero Trust and APIs. It is a great way to kick off RSA 2014, and we'll even give you a nice lunch. Who could refuse that?

To join us, sign up here.

New eBook: Five Simple Strategies For Securing Your APIs

Recently I wrote about the excitement I feel working within CA. This company is full of talented people, and when you draw on their capabilities, amazing stuff happens. Here in R&D we have some innovative solutions underway that are a tangible result of CA and Layer 7 working well together. I can’t reveal these yet, but you can see the same 1+1=3 equation at work in other groups throughout the organization.

Here is a good example. It’s an eBook we’ve assembled to help managers and developers build more secure APIs. The material started with a presentation I first delivered at a recent RSA show. We updated this with best practices developed by real customers facing real challenges. The content is solid, but what I love is the final product. It’s accessible, easy to digest, and the layout is fantastic. Half the battle is delivering the message so that it’s clear, approachable and actionable. This is just what we delivered. And best of all, it’s free.

The last year has been a difficult one in security. The Snowden affair made people talk about security; this, at least, is good and the dialog continues today. But if 2013 was a year of difficult revelation, 2014 is going to be about back-to-basics security.

APIs offer tremendous business value to enterprise computing. But they also represent a potential threat. You can manage this risk with a solid foundation and good basic practices, but you need to know where to start. This is the theme of our new eBook. It offers simple guidelines, not tied to a particular technology. You should apply these whenever you deploy APIs.

I hope you find this eBook useful. As always, I’d love to hear your feedback.

Download: Five Simple Strategies For Securing Your APIs.

Cisco and the Internet of Everything

John Chambers, CEO of Cisco, just published a good blog entry about the potential for change brought by universal connectivity, not just of our mobile gadgets, but of pretty much everything. Much has been made lately of the so-called "Internet of Things" (IoT); Cisco is upping the scope, going so far as to make the bold estimate that 99.4% of objects still remain unconnected. This of course is great fodder for late-night talk show hosts. I'll leave that softball to them and focus instead on some of the more interesting points in Chambers' post and the accompanying white paper.

It strikes me that there might be more to Cisco's Internet of Everything (#IoE) neologism than just a vendor's attempt to put its brand on a still-maturing technology trend. Internet of Everything sounds so much better than the common alternative when you append Economy to the end, and this is how it first appears in Chambers' post. That matters, because adding economy in the same breath is an acknowledgement that this is less marketing opportunism than a recognition that, like mobility, the IoE is a potentially great catalyst for independent innovation. In fact, Cisco's paper really isn't about technology at all, but is instead an analysis of the market potential in each emerging sector, from smart factories to college education.

It is exactly this potential for innovation—a new economy—that is exciting. The combination of Mobile+APIs was so explosive precisely because it paired a technology with enormous creative potential (APIs) with an irresistible business impetus (access to information outside the enterprise network). The geeks love enabling tools, and APIs are nothing if not enabling; mobile just gives them something to build.

The IoE, of course, is the ultimate business driver—and with APIs as the enabler, it represents opportunity of staggering proportions. Like mobile before it—and indeed, social web integration before that—the IoE will come about precisely because the foundation of APIs already exists.

It is here that I disagree with some IoT pundits who advocate specialized protocols to optimize performance. No thank you; it isn't 1990, and opaque binary protocols no longer work for us except when streaming large data sets (I'm looking at you, video).

Security in the IoE will be a huge issue, and on this topic Cisco has this to say:

IoE security will be addressed through network-powered technology: devices connecting to the network will take advantage of the inherent security that the network provides (rather than trying to ensure security at the device level).

I agree with this, because security coding is still just too hard and too easy to implement wrong. One of the key lessons of mobile development is that we need to make it easy for developers to enable secure communications automatically. Take security out of the hands of developers, put it in the hands of dedicated security professionals, and trust me, the developers will thank you.

As IoE extends to increasingly resource-constrained devices, the simpler we can make secure development, the better. Let application developers focus on creating great apps, and a new economy will follow.

(ISC)² Webinar – Identity is the New Perimeter: Identity and BYOD

Join me and Tyson Whitten from CA Technologies as we deliver a webinar about security in the BYOD world. The title of our talk is Identity and BYOD, and we are honored to be presenting as part of the International Information Systems Security Certification Consortium (ISC)² security series.

This webinar will take place on Oct 25, 2012 at 1pm ET/10am PT. We will delve deeply into the issues created by the Bring Your Own Device (BYOD) movement in the enterprise, and discuss what you can do to manage the associated risk.

You can sign up on the (ISC)² website.

Why I Still Like OAuth

That sound of a door slamming last week was Eran Hammer storming out of the OAuth standardization process, declaring once and for all that the technology was dead, and that he would no longer be a part of it. Tantrums and controversy make great social media copy, so it didn’t take long before everyone seemed to be talking about this one. In some quarters, you’d hardly know the London Olympics had begun.

So what are we to really make of all this? Is OAuth dead, or at least on the road to Hell as Eran now famously put it? Certainly my inbox is full of emails from people asking me if they should stop building their security architecture around such a tainted specification.

I think Tim Bray, who has vast experience with the relative ups and downs of technology standardization, offered the best answer in his own blog:

It’s done. Stick a fork in it. Ship the RFCs.

Which is to say, sometimes you just have to declare a reasonable victory and deal with the consequences later. OAuth isn't perfect, nor is it easy; but it's needed, and it's needed now, so let's all forget the personality politics and just get it done. And hopefully, right across the street from me here in Vancouver, where the IETF is holding its meetings all this week, this is exactly what will happen.

In the end, OAuth is something we all need, and this is why this specification remains important. The genius of OAuth is that it empowers people to perform delegated authorization on their own, without the involvement of a cabal of security admins. And this is something that is really quite profound.

In the past we've been shackled by the centralization of control around identity and entitlements (a fancy term which really just describes the set of actions your identity is allowed to perform, such as writing to a particular file system). This has led to a status quo in nearly every organization that is maintained first because it is hard to do otherwise, but also because this control equals power, which is something rarely surrendered without a fight.

The problem is that centralized identity administration can never scale effectively. With OAuth, we can finally scale authentication and authorization by leveraging the user population itself, and this is the one thing that stands a chance of shattering the monopoly of central Identity and Access Management (IAM). OAuth undermined the castle, and the real noise we are hearing isn't infighting over the spec but the sound of enterprise walls falling down.

Here is the important insight of OAuth 2.0: delegated authorization also solves the basic security sessioning problem of all apps running over stateless protocols like HTTP. Think about this for a minute. The basic web architecture calls for complete re-authentication on every transaction. This is dumb, so we have come up with all sorts of security context tracking mechanisms, using cookies, proprietary tokens, and so on. The problem with many of these is that they don't constrain entitlements at all; a cookie is as good as a password, because it simply maps back to an original act of authentication.

OAuth formalizes this process but adds in the idea of constraint with informed user consent. And this, ladies and gentlemen, is why OAuth matters. In OAuth you exchange a password (or other primary security token) for a time-bound access token with a limited set of capabilities to which you have explicitly agreed. In other words, the token expires fast and is good for one thing only. So you can pass it off to something else (like Twitter) and reduce your risk profile, or—and this is the key insight of OAuth 2.0—you can just use it yourself as a better security session tracker.
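
To make that concrete, here is a minimal libcurl sketch of an OAuth 2.0 token request. The endpoint, client id, credentials, and scope are all hypothetical; it uses the resource owner password grant purely for brevity, and a real deployment would typically prefer the authorization code flow and parse the JSON response rather than letting it print to stdout.

```c
/* A hypothetical sketch of exchanging a password for a scoped,
 * time-bound OAuth 2.0 access token. Endpoint and credentials are
 * illustrative only. */
#include <curl/curl.h>
#include <stdio.h>

int main(void) {
    curl_global_init(CURL_GLOBAL_DEFAULT);
    CURL *curl = curl_easy_init();
    if (!curl) return 1;

    /* Step 1: the token request. The JSON response carries access_token,
     * token_type, and expires_in -- the time bound and the narrow scope
     * the user consented to. */
    curl_easy_setopt(curl, CURLOPT_URL, "https://auth.example.com/oauth2/token");
    curl_easy_setopt(curl, CURLOPT_POSTFIELDS,
                     "grant_type=password"
                     "&username=alice&password=secret"
                     "&client_id=demo-app"
                     "&scope=read:status");
    CURLcode res = curl_easy_perform(curl);

    curl_easy_cleanup(curl);
    curl_global_cleanup();

    if (res != CURLE_OK) {
        fprintf(stderr, "token request failed: %s\n", curl_easy_strerror(res));
        return 1;
    }

    /* Step 2 (not shown): call the API with the header
     *   Authorization: Bearer <access_token>
     * until the token expires. The password itself is never sent again,
     * and the token is only good for the scope agreed to above. */
    return 0;
}
```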

The problem with OAuth 2.0 is that it's surprisingly hard to get to this simple idea through the explosion of protocol that began with OAuth 1.0a. Both specs too quickly reduce to an exercise in swim-lane diagram detail, which ironically runs counter to the movement toward the simple and accessible that drives the modern web. And therein lies the rub. OAuth is more a victim of poor marketing than of bad specsmanship. I have yet to see a good, simple explanation of why, followed by how. (I don't think OAuth 1.0 was well served by the valet key analogy, which distracts from too many important insights.) As it stands today, OAuth 2.0 makes the Kerberos specs seem like grade-school primer material.

It doesn't have to be this way. OAuth is actually deceptively simple; it is the mechanics that remain potentially complex (particularly those of the classic 1.0a three-legged scenario). But the same can be said of SSL/TLS, which we all use daily with few problems. What OAuth needs is a set of dead-simple (but nonetheless solid) libraries on the client side, and equally simple and scalable support on the server. This is a tractable problem, and it is coming. OAuth also needs much better explanation, so that people can understand it quickly.

Personally, I agree in part with Eran Hammer’s wish buried in the conclusion of his blog entry:

I’m hoping someone will take 2.0 and produce a 10 page profile that’s useful for the vast majority of web providers, ignoring the enterprise.

OAuth absolutely does need simple profiling for interop. But don’t ignore the enterprise. The enterprise really needs the profile too, because the enterprise badly needs OAuth.

Upcoming RSA Conference Talk: Hacking’s Gilded Age—How APIs Will Increase Risk and Chaos

I’m going to be speaking about API security at next week’s 2012 RSA conference. I gave this talk the provocative title Hacking’s Gilded Age—How APIs Will Increase Risk and Chaos. It’s scheduled for Friday, March 2, 2012 at 10:10am in room 302.

Here’s the long form of the abstract, which gives a little more detail of what I’m going to cover in the talk than the short abstract that’s online:

This session will explore why APIs (which are largely RESTful services) are fundamentally different from conventional web sites, despite the fact that they share common elements such as the HTTP protocol. Web sites abstract back-end applications behind a veneer of HTML that should—if it is well designed—constrain capability and thus limit an organization's security exposure. APIs, in contrast, are a more explicit interface leading directly into applications. They often self-document their intent, and thus provide a hacker with important clues that may reveal potential attack vectors, from penetration to denial of service. Because of this, APIs require a much more sophisticated model for access control, confidentiality around parameters, integrity of transactions, attack detection, throttling, and auditing.

But aside from the technological differences, there are cultural differences in the web development community that considerably increase the risk profile of using APIs. Many API developers have a background in web site development, and fail to understand why APIs demand a more rigorous security model than the web sites they were trained on. In a misguided attempt to promote agility, convenience is often chosen over precaution and rigor. The astonishingly rapid rise of RESTful services over SOAP, OAuth over SAML, API keys over certificates, and SSL (or nothing) over WS-Security is a testament to fast and informal prevailing over complex and standardized.

Nevertheless, it is certainly possible to build secure APIs, and this session will demonstrate specifically how you can spearhead a secure and scalable API strategy. For every bad practice, we will offer an alternative pattern that is simple-but-secure. We will explicitly show how the API community is dangerously extending some web paradigms, such as avoiding general use of SSL or not protecting security tokens, into the API world where the cost of failure is far greater. And finally, we will prescribe a series of directives that will steer developers away from the risky behaviors that are the norm on the conventional web.

I hope you can attend. And if you do, please come up after the talk and say hello.

See you next week in San Francisco.

Security in the Clouds: The IPT Swiss IT Challenge

Probably the best part of my job as CTO of Layer 7 Technologies is having the opportunity to spend time with our customers. They challenge my assumptions, push me for commitments, and take me to task for any issues; but they also flatter the whole Layer 7 team for the many things we do right as a company. And for every good idea I think I have, I probably get two or three great ones out of each and every meeting with the people who use SecureSpan to solve real problems on a daily basis.

All of that is good, but I've learned that if you add skiing into the mix, it becomes even better. Layer 7 is fortunate to have an excellent partnership with IPT, a very successful IT services company out of Zug, Switzerland. Each year they hold a customer meeting up in Gstaad, which I think surely gives them an unfair advantage over their competitors in countries less naturally blessed. I finally managed to draw the long straw in our company, and was able to join my colleagues from IPT at their annual event earlier this January.

Growing up in Vancouver, with Whistler practically looming in my backyard, I learned to ski early and ski well. Or so I thought, until I had to try to keep up with a crew of Swiss who surely were born with skis on their feet. But being challenged is always good, and I can say the same for what I learned from my Swiss friends about technology and its impact on the local market.

The Swiss IT market is much more diverse than people outside of it may think. Yes, there are the famous banks; but it is also an interesting microcosm of the greater European market—albeit run with a natural attention to detail and extraordinary efficiency. It's these local challenges that shape technology needs and lead to different areas of emphasis.

SOA and Web services are very mature and indeed are pushed to their limits, but the API market is still in its very early stages. The informal, wild west character of RESTful services doesn’t seem to resonate in the corridors of power in Zurich. Cloud appears in patches, but it is hampered by very real privacy concerns, and this of course represents a great opportunity. Secure private clouds are made for this place.

I always find Switzerland very compelling and difficult to leave. Perhaps it's the minuscule drop of Swiss ancestry I can claim. But more likely it's just that I think the Swiss have got this life thing all worked out.

Looking forward to going back.

The Future Is A Story About Mobile Computing

Earlier today CNET published an interview with Marc Andreessen, in which the Netscape founder and influential VC outlines his personal vision for where tech is heading in the near future. His new tagline, from a piece he wrote for the New York Times, is “software is eating the world”, a blunt reference to how software increasingly appears out of nowhere to utterly consume a traditional practice or business model—be this in commerce, the social realm, or just about everywhere.

Andreessen asserts that this effect will only accelerate in the future because of the explosion we are experiencing in mobile computing:

Most of the people in the world still don’t have a personal computer, whereas in three to five years, most people in the world will have a smartphone…. If you’ve got a smartphone, then I can build a business in any domain or category and serve you as a customer no matter where you are in the world in just gigantic numbers–in terms of billions of people.

This new scale of mobile is something we're only beginning to see, but it is becoming clear that the change it brings will be profound. Mobile computing is very interesting to Layer 7; watch out for some interesting new developments coming out of our labs early in the new year.

I discovered a similar indicator of mobile interest using Google Insights for Search. Pete Soderling and Chris Comerford, from Stratus Security Technologies, gave an excellent talk back in 2010 at the RSA show about REST security. They illustrated how the zeitgeist around distributed computer communications was changing over time by comparing search volume for "SOAP Security" (blue line) and "REST Security" (red line):

Try this out for yourself here.

What struck me about this was not that REST came up so fast—you'd have to be living under a rock to have missed that one—but that the two approaches have been tracking roughly evenly over the last year. This mirrors our own experience at Layer 7, where we support both SOAP and REST security equally. We see similar patterns of interest coming from our customers.

What is even more interesting is what happens when you add “Mobile Security” (yellow line) to the mix:

Try it here.

The future, indeed, will be written from a handheld device.

Clouds Down Under

When I was young I was fascinated by the idea that the Coriolis effect—the concept in physics which explains why hurricanes rotate in opposite directions in the southern and northern hemispheres—could similarly be applied to everyday phenomena like water disappearing down a bathtub drain. On my first trip to Cape Town many years ago I couldn't wait to try this out, only to realize in my hotel bathroom that I had never actually gotten around to checking which direction water drains in the northern hemisphere before I left. So much for the considered rigor of science.

It turns out of course that the Coriolis effect, when applied on such a small scale, becomes negligible in the presence of more important factors such as the shape of your toilet bowl. And so, yet another one of popular culture’s most cherished myths is busted, and civilization advances ever so slightly.

Something that definitely does not run in the opposite direction south of the equator is cloud computing, though to my pleasant surprise, the conferences down under do take a turn for the better. I've just returned from a trip to Australia, where I attended the 2nd Annual Future of Cloud Computing in the Financial Services, held last week in both Melbourne and Sydney. What impressed me is that most of the speakers were far beyond the blah-blah-blah-cloud rhetoric we still seem to hear so much of, and focused instead on their real, day-to-day experiences with using cloud in the enterprise. It was as refreshing as a spring day in Sydney.

Greg Booker, CIO of ANZ Wealth, opened the conference with a provocative question. He simply asked who in the audience was in the finance or legal departments. Not a hand came up in the room. Now bear in mind this wasn’t Microsoft BUILD—most of the audience consisted of senior management types drawn from the banking and insurance community. But obviously cloud is still not front of mind for some very critical stakeholders that we need to engage.

Booker went on to illustrate why cross-department engagement is so vital to making the cloud a success in the enterprise. ANZ uses a commercial cloud provider to serve up most of its virtual desktops. Periodically, users would complain that their displays appeared rendered in foreign languages. Upon investigation, ANZ discovered that although the provider had deployed storage in-country, some desktop processing took place on a node in Japan, making this something of a grey area in terms of compliance with export restrictions on customer data. To complicate matters further, the provider could not make any changes until the next maintenance window—an event that happened to be weeks away. IT cannot meet this kind of challenge alone. As Randy Fennel, General Manager of Engineering and Sustainability at Westpac, put it succinctly, "(cloud) is a team sport."

I was also struck by a number of insightful comments the participants made about security. Rather than being shut down by the challenges, they adopted a very pragmatic approach and got things done. Fennel remarked that Westpac's two most popular APIs happen to be balance inquiry, followed by their ATM locator service. You would be hard pressed to think of a pair of services with more radically different security demands; this underscores the need for highly configurable API security and governance before such services go into production. He added that security must be a built-in attribute, one that must evolve with a constantly changing threat landscape or be left behind. This thought was echoed by Scott Watters, CIO of Zurich Financial Services, who added that we need to put more thought into moving security into applications. On all of these points I agree, with the addition that security should sit close to apps but remain loosely coupled in a configurable policy layer, so that over time you can easily address evolving risks and ever-changing business requirements.

The entire day was probably best summed up by Fennel, who observed that "you can't outsource responsibility and accountability." Truer words have not been spoken at any conference, north or south.