A rant from a Googler comparing Amazon’s focus on a platform vs Google’s focus on products. Fundamental.
Last night, I spent the evening with a bunch of PHP developers in DC. This informal gathering in the DC-PHP community is a regular occurrence known as the DC PHP Beverage Subgroup – Virginia Chapter. There is also a DC chapter that meets once a month. These two informal gatherings exist for the sole purpose of getting together, enjoying some food and cold beverages, and generally just talking about anything and everything. They complement the official DC PHP meeting, which is generally a technical presentation directly related to PHP.
So last night, we were yukking it up about how PHP has resurrected the GOTO operator, a programming mechanism that, we thought, died with the BASIC programming language of yore. Coding in BASIC was strictly procedural and not very rich in its abilities.
10 PRINT "Hello World!"
20 GOTO 10
One of our number suggested that PHP, having regressed so badly by including the GOTO operator, should adopt line numbers in code as well. :) This conversation devolved into all the cliché buzzwords of our time and eventually, it was suggested that what we really need is a “BASIC Cloud Framework API”.
Putting aside BASIC, which is not really practical or desirable, the concept of a Cloud-based Framework API, whatever it actually is, is not all that far-fetched. If you think about it, we already have a Cloud-based API for APIs (yes, I realize this is very meta) with the super-cool Gnip, which we covered last year when they launched. Social services channel their data through Gnip, and Gnip provides a single API layer for data access. And it’s built in the cloud.
The concept of single layer APIs is not a new one. Why can’t we have an API for cloud-services as well?
Think about this. Right now, anyone wanting to build an application has three options. They can build out a server cluster or farm that scales physically and, by proxy, ends up costing a lot, since physical hardware is expensive. A second option is a virtual cluster made up of virtual machines. You still need hardware, but each server, souped up with up to 32GB of RAM, can theoretically host tons of virtual machines, each acting as a standalone server. The third, entirely virtual option is hosting applications in “the cloud”.
Cloud computing is not without its challenges. I’ve challenged the reliance on it in the past, and I still do. However, with cloud services like Amazon’s EC2 and S3 or Google’s App Engine, it becomes entirely possible not only to store data in the cloud, but also to run and maintain entire services there.
The problem is, each of them requires different things. Amazon has a suite of developer tools that are needed to build against their cloud offerings. Google App Engine supports only a handful of languages, like Python and Java.
There should be a way to abstract this development to a single layer — or API, if you will — to take advantage of this.
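To make the idea concrete, here is a minimal Python sketch of what such a single abstraction layer might look like. Everything here, the class names, the methods, and the in-memory adapter, is my own invention for illustration; it is not any real vendor SDK.

```python
from abc import ABC, abstractmethod


class CloudStorage(ABC):
    """A provider-neutral storage interface; concrete adapters hide each vendor's API."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryStorage(CloudStorage):
    """Stand-in adapter; a real one would wrap S3, EC2 storage, or App Engine's datastore."""

    def __init__(self):
        self._blobs = {}

    def put(self, key, data):
        self._blobs[key] = data

    def get(self, key):
        return self._blobs[key]


def deploy_asset(store: CloudStorage, name: str, payload: bytes) -> bytes:
    # Application code targets the abstract layer, never a vendor-specific SDK,
    # so swapping cloud providers means swapping the adapter, not the app.
    store.put(name, payload)
    return store.get(name)
```

The point of the sketch: the application calls `deploy_asset` the same way no matter which provider sits behind the adapter.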
Laugh it up, chuckles. A cloud-based framework API is not all that ridiculous of a concept. The world once thought the earth was flat as well.
Without a doubt, I am a data whore. I love raw data. I love APIs. I love finding interesting ways to mash up data. With the newfound craze in government for openness, led in no small part at the Federal level, from work endorsed by the Obama Administration to work pushed forward by Sunlight Labs, Craigslist founder Craig Newmark, and others, I’d expect the openness to trickle down to state and local levels. And it is.
On one level, you have Washington, DC (where I live), which has been making impressive strides through OCTO (the Office of the Chief Technology Officer) with the assistance of iStrategyLabs and the Apps for Democracy competition.
Washington, DC is rolling out its Open 311 API, a RESTful data API that they are careful to note is still in development. (We will be building a PHP library around this API shortly, so keep an eye out for that announcement over at Emmense.com.)
By exposing a REST API, DC is opening up the service sector of the DC city government for developers of all sorts to tap into and build applications around, all to meet the needs of city residents.
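As a sketch of what consuming such a REST API might look like, here is a tiny Python helper that builds a request URL. The base URL, resource name, and query parameters are all hypothetical; DC's actual endpoints are still in development and may look nothing like this.

```python
from urllib.parse import urlencode

# Hypothetical base URL -- the real Open 311 endpoint is still in development.
BASE_URL = "http://api.dc.gov/open311/v1"


def build_request_url(resource: str, params: dict) -> str:
    """Compose a GET URL for a 311 resource, e.g. service requests as JSON."""
    # Sort params so the same inputs always produce the same URL.
    query = urlencode(sorted(params.items()))
    return f"{BASE_URL}/{resource}.json?{query}"
```

A developer's application would then fetch, say, `build_request_url("requests", {"status": "open"})` and render the open service requests however it likes.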
San Francisco, on the other hand, just announced that they are utilizing Twitter to allow residents to submit issues directly from their favorite web application. Simply by following @sf311 (and being followed back), citizens are able to DM requests.
Personally, I am partial to DC’s approach, but I applaud both cities for pushing the boundaries to bring city government closer to the people. Frankly, I’m a little concerned about San Francisco utilizing Twitter for this purpose, for the same reason that I am hesitant about any company building its business model around Twitter. Twitter has not proved, at least in my mind, that it has the business savvy to keep its service alive. Likewise, it has not proved the technical ability to run a failure-free system. It’s a game of Russian roulette to base a business (or government service) on this application. San Francisco probably has failover plans, though, and this is just another channel, so arguably it’s not a significant risk.
However, the 311 solution becomes infinitely more scalable when built on a pure API that allows direct submission and retrieval of data. The use of an API also keeps responsibility in-house. Twitter is not paid for by taxpayer money, so there is no expectation of quality control. A government-owned and maintained API, on the other hand, provides safeguards that make sense.
All that aside, it is clear that both DC and San Francisco recognize that the accessibility of governments to their citizens is a goal of utmost importance in 2009. They are taking laudable steps to break down the barriers and solve real problems with modern technologies. For that, I can find no fault.
Microsoft is clearly getting hipper with its offerings. The company has been notoriously committed to offline products, like the Windows operating system and its productivity suite, Microsoft Office, to the detriment of its online offerings, but it now seems to be moving decisively into the internet space. It is, in fact, trying to own the online space, which is a significant departure from the company’s past.
Of course, they have jumped headfirst into the incubation industry by launching BizSpark, which seeks to provide promising young companies with technical resources, such as their server offerings, along with human and business resources to help these companies, mostly web-based startups, become viable.
Naturally, one of the odd players in the Microsoft ecosystem has been the Xbox 360 platform. It is a killer gaming platform (I am an avid Xbox gamer) and its online gameplay over Xbox Live is second to none. It has always lacked any kind of cohesion as an online service, though, especially in 2008, when Facebook and Twitter rule the day and it is rare to find someone who is not on some kind of social networking platform.
So a few months ago, when word leaked out about a complete overhaul to the Xbox Live experience, there were many of us who were excited about a modernization with significant incorporation of social networking elements. With the launch the other day, some of that has been delivered.
The Xbox Experience, as it’s called, is a significantly streamlined dashboard that makes it extremely easy to access common items, such as the Xbox Marketplace. Incorporating Netflix, the online video giant that also dabbles in the social networking space, makes the Experience worlds better. It is possible to watch items from your Netflix “Instant Play” queue directly via your Xbox Dashboard. Sweet, if only the video quality were better. That aside, the mashup is a great step in making the Xbox an entertainment hub.
However, significant issues remain. A “big bling” element of the new Xbox Experience is the new avatars. Going through the wizard the first time I logged in reminded me a bit of creating a Tiger Woods 2008 character. Though this is fine for creating a personalized environment, I find no purpose for an avatar except to snap a proverbial photo and make that photo your “avatar photo”. I would much rather designate an actual graphic or picture as my avatar, in much the same way most social networks allow you to.
The storyline falls apart further when you log in to manage your Xbox Live account from the web and discover they have not incorporated any further way of getting at your data. Microsoft would do well to develop robust APIs that allow players to get an XML or JSON feed of achievements, gamerscores, and last/currently played games, as well as other social network elements.
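To illustrate what consuming such a feed could look like, here is a short Python sketch. The JSON shape is entirely hypothetical; Microsoft exposes no such API today, and the sample data is invented.

```python
import json

# A purely hypothetical feed shape for a gamer profile -- no such
# Microsoft API exists; this is what I wish they would expose.
sample_feed = """
{
  "gamertag": "ExampleGamer",
  "gamerscore": 12430,
  "last_played": "Fallout 3",
  "achievements": [
    {"game": "Fallout 3", "name": "Example Achievement", "points": 10}
  ]
}
"""


def summarize(feed_json: str) -> str:
    """Render a one-line profile summary from the (hypothetical) JSON feed."""
    profile = json.loads(feed_json)
    return (
        f"{profile['gamertag']}: {profile['gamerscore']} G, "
        f"last played {profile['last_played']}"
    )
```

With a feed like this, third-party sites could show live gamerscores and achievements the same way they embed Flickr photos or Twitter updates.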
Why not provide a much more efficient “friends” system that would allow players to have wish lists, friend challenges, and friend groups, as well as a unique element I call “tip sharing”? Tip sharing would be a forum element where a friend could share intel about a game (say, Fallout 3) and I could “download” that tip into my Xbox Live account. When I reach the Farragut West Metro station in Fallout 3 and my friend has discovered something there, the game could feed me that intel from my friend.
Another social element would be the concept of a “lifeline” where, if I’m stuck during a game, I could get immediate assistance (in-game or otherwise) from my friends through screen sharing, instant message (kill Live! Messenger and use OpenAIM, please) or other “helper” element.
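The "tip sharing" idea above could rest on a very simple data model. Here is a rough Python sketch; every class, field, and method name is invented for illustration, since no such Xbox Live feature exists.

```python
from dataclasses import dataclass


@dataclass
class GameTip:
    """One piece of intel a friend shares about a specific spot in a game."""
    game: str
    location: str
    author: str
    text: str


class TipFeed:
    """In-game feed that surfaces a friend's tip when the player reaches its location."""

    def __init__(self):
        self._tips = []

    def share(self, tip: GameTip) -> None:
        self._tips.append(tip)

    def tips_for(self, game: str, location: str) -> list:
        # The game client would call this as the player enters a new area.
        return [t for t in self._tips if t.game == game and t.location == location]
```

When the player walks into Farragut West, the client asks the feed for tips tagged to that location and pops up the friend's note.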
Let’s make it really social and make it possible for gamers to find other gamers in their area and schedule times together (if you have to, use a modified, online, lite version of Sharepoint or Exchange Server to make this happen).
Of course, a natural tie-in, via OpenSocial, with other social networks, possible use of OAuth for data access and login, status messaging and commenting, and other “social elements” would really flesh out the Xbox Experience as useful in 2008.
What are your thoughts on the Xbox Experience?
Allow me to get nerdy.
It has been a long time since I got downright giddy about something developer-oriented. Lots of new APIs are coming out all the time, and I usually give them a once-over to determine if there is something cool there. A lot of the time there are cool things, and I promise myself to come back and explore the possibilities later. I rarely do.
However, with the announcement of GNIP today, I finally feel like my incessant mulling of API frameworks might be coming to an end.
Let me back up. A few weeks ago, I was fiddling with a bunch of APIs trying to create some mashup I was working on. I sent Keith a direct message pitching a “crazy idea”. An API for all APIs. One API to rule them all. His response, “A meta API?”
That made sense and made me laugh because I know how much he hates the word “meta”.
My idea quickly dissipated as I realized it was probably pretty futile to create an API for all these varied services with their different data formats and types, and my need for it wasn’t all that pressing at the time anyway.
The idea with GNIP, bringing this story full circle, is that it is a meta-API. It sits in front of “data producers” (Digg, Flickr, Disqus) and provides a standardized API for “data consumers” (Plaxo, MyBlogLog, even Lijit!) to exchange data.
Since this is still so early, more data producers and consumers are bound to come on board. Also notable: the only data format right now is XML, so JSON is missing, as is XMPP as a transport. That will likely change over time too.
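The meta-API idea is easy to sketch: per-producer adapters translate each service's payload into one shared schema that every consumer reads. The field names below are my own invention, not Gnip's actual schema.

```python
# Each adapter maps a producer-specific payload into one shared
# "activity" shape: actor / verb / object. Field names are invented.

def from_digg(item: dict) -> dict:
    return {"actor": item["user"], "verb": "dugg", "object": item["story_url"]}


def from_flickr(item: dict) -> dict:
    return {"actor": item["owner"], "verb": "posted", "object": item["photo_url"]}


ADAPTERS = {"digg": from_digg, "flickr": from_flickr}


def normalize(producer: str, item: dict) -> dict:
    """A consumer calls one function regardless of which producer emitted the data."""
    return ADAPTERS[producer](item)
```

A consumer like Plaxo or MyBlogLog then handles one activity shape instead of writing bespoke code against every producer's API.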
Data Producers not yet involved that should be:
And a few Data Consumers that are also missing:
The other day, I was talking to the CTO of a company that is working to build a web technology solution for a problem that exists due to the arcane infrastructure and systems already in place in its niche target industry. He was mourning the fact that, after he spent gobs of time wireframing and re-wireframing a solution, the parties who initially expressed interest in licensing the technology had decided to walk away from the table for a variety of reasons.
The big conglomerate that had decided to walk had expressed concern over the fact that they already had systems for billing and other management aspects of their company and didn’t want to invest in something unknown and untried over their long-standing, yet antiquated, solution.
Over the course of an hour or so, and even since then, we have looked over his wireframes, determining what the product should look like in order to make some sales, if not all the sales he wanted. I realized that his product was designed in such a way that dependencies were created everywhere. If a customer wanted just the one portion that does employee management, but not the other part that does billing, there was no practical way for him to make that smaller sale without making the big sale.
In the web world, we talk about mashups. Take a Google Map, mash it up with Twitter, and you have Twittervision. Mash up Basecamp with FreshBooks, tie in Salesforce, and you’ve got a complete back-office CRM-billing system to build your company on top of.
The strength of mashups is the distributed nature of the work. I no longer have to store my own video files, because YouTube will do it for me and give me a means to access them, eliminating the overhead and cost associated with hosting that video. I no longer have to worry about the development time and money needed to distribute a widget containing my content to other websites, because Clearspring does that work for me. The trick is in APIs, which allow others to innovate on top of the technologies created by others.
My advice to this entrepreneur was to create APIs between his various modules and build out-of-the-box products he could sell that utilize those APIs. In fact, the APIs can be open yet still paid-access, which provides another stream of revenue. His clients have the money to pay top dollar for those APIs when they consider the alternative cost: throwing out entire chunks of their existing infrastructure for an out-of-the-box solution that may not meet their unique needs.
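The modular idea can be sketched in a few lines of Python. Each module stands alone behind its own small API, so a customer can license employee management without also buying billing; the billing module, when purchased, depends only on the employee module's contract. All class and method names here are invented for illustration.

```python
class EmployeeAPI:
    """Standalone module: sellable on its own, no billing dependency."""

    def __init__(self):
        self._employees = {}

    def add(self, emp_id: str, name: str) -> None:
        self._employees[emp_id] = name

    def lookup(self, emp_id: str) -> str:
        return self._employees[emp_id]


class BillingAPI:
    """Optional module: talks to EmployeeAPI only through its public methods."""

    def __init__(self, employees: EmployeeAPI):
        self._employees = employees
        self._invoices = []

    def invoice(self, emp_id: str, hours: float, rate: float) -> dict:
        # Billing never reaches into employee internals -- only the API contract.
        inv = {"employee": self._employees.lookup(emp_id), "amount": hours * rate}
        self._invoices.append(inv)
        return inv
```

Because billing consumes only the employee module's API, a customer who already has a billing system can buy the employee module alone and point their own system at the same API.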