Open Source and Cloud Computing
1 August 2008
These are some comments I have on Tim O’Reilly’s insightful post about open source and cloud computing.
There are interesting thoughts in the post about clouds becoming monolithic and about how control of data by a few privileged companies will drive the development of the services that access and manipulate our information. Entry into the market becomes more difficult for smaller organizations with new and better ideas. This is all probably true. One of the main contributing factors is that we all let it happen. Most people are not technical and are primarily concerned with an application performing some function adequately for their needs. If it happens to be a service built and hosted by a monopoly, most people don’t care. At least not until they grow weary of the application, perceive that there may be better alternatives, and want them. So the evolution of monopolies with monolithic systems arises from organizations pushing their services for profit (which is fine) and from the majority of service users being concerned only with their own satisfaction. Open source, open APIs, and standards don’t solve this problem.
Open source does make it easier for those who are technically savvy to build new software systems and services. It doesn’t solve the issue of easily publishing services for wide usage. Nor does it solve the problem of gaining access to the network, server, and storage resources those services may need to use. If you have a choice of services representing these resources, you begin to solve the problem. If those services can be discovered dynamically, rather than referenced as static locations, you have an even better solution. The process becomes:
- I decide I want a type of service (maybe storage)
- I look up what my choices might be
- I discover which ones are available
- and I select one.
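The four steps above can be sketched in code. This is a minimal illustration, not a real system: the registry, the service records, and the availability flag are all hypothetical stand-ins for whatever directory such a platform would actually provide.

```python
# A sketch of the lookup / discover / select steps.
# All names and endpoints here are made up for illustration.
from dataclasses import dataclass
import random

@dataclass
class Service:
    name: str
    kind: str          # e.g. "storage", "compute"
    endpoint: str
    available: bool    # in practice this would come from a health check

# Hypothetical registry contents.
REGISTRY = [
    Service("acme-store", "storage", "https://acme.example/store", True),
    Service("bulk-bin",   "storage", "https://bulk.example/bin",   False),
    Service("fast-cpu",   "compute", "https://fast.example/run",   True),
]

def lookup(kind):
    """Step 2: list my choices for a type of service."""
    return [s for s in REGISTRY if s.kind == kind]

def discover(candidates):
    """Step 3: keep only the ones that are actually available."""
    return [s for s in candidates if s.available]

def select(candidates):
    """Step 4: pick one (here at random; a real platform might
    rank by cost, latency, or reputation)."""
    return random.choice(candidates) if candidates else None

chosen = select(discover(lookup("storage")))
print(chosen.name if chosen else "no service found")
```

The point of the sketch is that the caller never hard-codes a location; it asks for a *kind* of service and lets the platform resolve the rest.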
Standards don’t necessarily help out either. Many of the existing protocols are sufficient for communication and data transfer. Standard APIs satisfy groups of service providers that may share resources and software. But if everyone uses the same standard, doesn’t the standard itself become monolithic and antiquated once it no longer serves the needs of, or provides access to, newly emerging technologies? Having multiple standards and options is usually a better alternative. I wish I could credit the original author of this quote, which has been around for at least 15 years (a version of it is often attributed to Andrew S. Tanenbaum): “The great thing about standards is that there are so many to choose from.”
So the answer to keeping monolithic organizations from squeezing out small companies’ new ideas is not the use of open source and standards (although open source is beneficial). The answer lies in creating a platform that executes on compute resources within the internet and allows, among other things:
- a way to look up desired services
- a way to identify whether they have the desired capabilities
- a way to discover where these services may be available
- a way to select the desired services for use
By services I mean software that represents a set of capabilities, implemented as:
- a software component
- a component in conjunction with other components or services
- or a software component utilizing hardware resources such as CPU, storage, or network bandwidth
Such services are dynamically hosted wherever they may run most efficiently.
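One way to picture these three forms of service is as a descriptor that a publisher would hand to the platform. The field names below (capabilities, depends_on, resources) are illustrative assumptions, not an actual API; the sketch only shows how a plain component, a composed service, and a hardware-backed service could all be described uniformly.

```python
# A hypothetical service descriptor; every field name is an assumption
# made for illustration.
from dataclasses import dataclass, field

@dataclass
class ServiceDescriptor:
    name: str
    capabilities: list                                # what the service can do
    depends_on: list = field(default_factory=list)    # other services it composes
    resources: dict = field(default_factory=dict)     # hardware it needs

# A plain software component.
thumbnailer = ServiceDescriptor(
    name="thumbnailer",
    capabilities=["image/resize"],
)

# A component used in conjunction with other services, which also
# declares hardware needs (storage, network bandwidth).
photo_album = ServiceDescriptor(
    name="photo-album",
    capabilities=["album/create", "album/share"],
    depends_on=["thumbnailer", "some-storage-service"],
    resources={"storage_gb": 50, "bandwidth_mbps": 10},
)
```

Because a descriptor declares resource needs rather than a fixed location, the platform is free to host each service wherever it can run most efficiently, which is exactly what the last point above asks for.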
The answer lies in allowing everyone the opportunity to create and publish services for use on a platform accessible by everyone. Think of it as a layer on top of the existing internet: a network overlay within which everyone has access to services. There would be a large collection to choose from, with an always-changing selection. This is analogous to the selection of services we choose in our everyday lives for food, auto repair, home services, and so on. The answer doesn’t lie in enforcing open source and standards; it lies in creating an open execution platform that enables all to create and provide services.