2010-07-16

Proving Legitimacy

If your firm does work for a government agency at any level (federal, state, or municipal), you are no doubt familiar with the SF-330, which replaced the old 254/255 forms previously used. It's a lengthy company-profile form that can take many man-hours to fill out accurately, with time spent data-mining your ERP and financial systems. The general purpose of this form is to provide a standardized way for agencies to look at a prospective vendor applying for a contract and determine whether the business is of appropriate size and experience (both company experience and the personal experience of project staff) to perform the particular type of work the contract calls for.

It's a very onerous pre-screening that happens before the particulars of your proposal are even considered. Once you've been through it once, however, it's fairly painless to keep updated from year to year, provided you save your reports and SQL queries from the previous year.

The problem is this: ninety percent of the entities we deal with who require an SF-330 submittal still require their own totally unique past-project-experience reporting. This can be a terrible burden for small businesses to bear, especially if they are trying to provide accurate and thorough information, which I suspect many can't. To provide the information these agencies require, you have to have a very robust ERP system, a ton of demographic fields in your master and transactional records (projects, invoices, client types, business units, staff), a rigorous data-quality-control process for all of it, and staff who are familiar enough with the system to data-jockey it for all it's worth.
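
As a concrete illustration of the kind of data-jockeying involved, here's a minimal sketch in TypeScript. Everything in it is hypothetical: the record shape, the field names, and the agency's criteria are invented for illustration, not taken from any real ERP or any real agency's requirements.

    // Hypothetical shape of a project record exported from an ERP.
    // Every field name here is invented for illustration.
    interface ProjectRecord {
      projectId: string;
      clientType: "federal" | "state" | "municipal" | "private";
      businessUnit: string;
      completedOn: Date;
      constructionCostUSD: number;
    }

    // One agency's "unique" past-experience report: say, municipal
    // work completed in the last five years, above a cost threshold,
    // sorted newest first.
    function pastExperienceReport(records: ProjectRecord[]): ProjectRecord[] {
      const cutoff = new Date();
      cutoff.setFullYear(cutoff.getFullYear() - 5);
      return records
        .filter((r) => r.clientType === "municipal")
        .filter((r) => r.completedOn >= cutoff)
        .filter((r) => r.constructionCostUSD >= 1_000_000)
        .sort((a, b) => b.completedOn.getTime() - a.completedOn.getTime());
    }

The next agency will want the same data sliced some other way (by business unit, by staff role, by fee instead of construction cost), which is exactly why all those demographic fields, and the discipline to keep them clean, matter.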

Wasn't this the problem the SF-330 was designed to solve?

2010-02-03

Cloud Computing Déjà Vu

I tweeted a while back:

I remember when my Wyse-50 terminal hooked up to a 300-baud modem dialed into a SPARCserver 4/330 in another city was "cloud computing".

This got me to thinking about Cloud Computing from a 50,000-ft view. In its current usage, the term "Cloud Computing" is so overused that it's approaching ubiquity, relegating it to the irrelevance of a marketing tagline. From a compupological perspective, however, cloud computing is the next evolutionary step in the old client-server paradigm.

The client-server paradigm is simply a model in which computing work is done by a user who directly interfaces with one device, which then, in turn, directs a second device to share some of the labor. This allows the "indirect" device to assist many remote devices that are local to the users.

Examples:

  • A user on a PC saves a file to a file server
  • Outlook retrieves a message from Exchange
  • Internet Explorer is used to run Google Maps on Google's server
  • Google Apps' JavaScript, delivered from Google's server, runs locally in Internet Explorer, which then directs Google's app server to send a calendar invitation (via AJAX)

See, cloud computing is just the next evolution of the client-server model: it simply adds new levels of abstraction whereby a server can be a client of another server at the same or a lower level in the software, hardware, or networking stack. When you think about it that way, you realize that's what virtualization is too.
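
To make "a server can be a client of another server" concrete, here's a minimal sketch in TypeScript for Node (version 18 or later, for the built-in fetch). The backend URL is made up; the point is just that the same process is a server to its callers and a client of the service behind it.

    import http from "node:http";

    // Hypothetical backend; in real life this could be another tier,
    // another box, or another cloud entirely.
    const BACKEND_URL = "http://localhost:9000/calendar";

    const server = http.createServer(async (req, res) => {
      // To the browser, this process is the server...
      const upstream = await fetch(BACKEND_URL); // ...and right here, a client
      const body = await upstream.text();
      res.writeHead(upstream.status, { "Content-Type": "application/json" });
      res.end(body);
    });

    server.listen(8080);

Stack another layer behind BACKEND_URL and nothing about the model changes; that's the abstraction doing the work.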

So, I urge everyone to think in terms of layers of abstraction and indirection, not clouds. Remember the OOP craze? It was (and is) really about providing a wonderful new level of abstraction to coding too.

Abstraction and indirection are powerful tools when the hardware and network are fast enough to support them. We are there now.