Dennis D. McDonald (ddmcd@ddmcd.com) consults from Alexandria Virginia. His services include writing & research, proposal development, and project management.

Who Ensures that "Web 2.0" Software Applications Work Together?

By Dennis D. McDonald

Recently fellow blogger Philippe Borremans, owner of Conversationblog, asked me to look at a program he had been using to manage his blog (Qumana, www.qumana.com). It had stopped working, and since I'm a fellow Squarespace user he wanted a second opinion. (Philippe's blog was later selected for the LinkedinBloggers March 22, 2006 "Blog Boost.")

Qumana is an example of a blog management program that enables the user to manage blogs across multiple blogging systems through a single user interface. As a user of only two blogging systems (Squarespace for All Kind Food and WordPress for The Podcast Roundtable) I have not been motivated to try this category of software. But I did want to try the Qumana product in response to Philippe's request.

I downloaded the Windows XP beta program and ran the installer, but it would not auto-configure to my Squarespace web site, All Kind Food. My guess is that the recent Squarespace update is the reason, and that Qumana has not yet gotten around to modifying its software to match the current version of Squarespace.

For me this is no great loss. I rather like the Squarespace interface, and I can easily access it from my three home machines (Windows XP Pro, Apple OS X, and Windows 2000) and from any web browser while I'm on the road.

So what, you ask? Well, this little experience points to one of the issues raised by the "flattening" effect that social networking and "web 2.0" software products are having on the software and networking infrastructure landscape. "Programs working with other programs" has always been a concern for corporate IT departments because of compatibility and stability issues.

This was certainly true in the old days of "screen scraping" programs, where Windows applications snatched data displayed on terminal-emulating PCs based on assumed-to-be-stable screen layouts. It is true today with the development of "mash-ups," where data from two separate programs can be combined -- almost overnight -- to create novel system functionality. A slight change on one side of the equation can cause the whole set-up to crash or corrupt data.
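A minimal sketch illustrates why screen scraping is so brittle. The layout, field names, and column positions below are invented for illustration: the scraper extracts a balance by fixed column positions, so a one-character shift in the host application's screen layout silently produces a wrong value rather than an error.

```python
# Hypothetical screen-scraping sketch: extract a balance field from a
# fixed-width terminal screen line, assuming columns 0-9 hold the account
# number, 10-30 the customer name, and 31-40 the balance.

def scrape_balance(screen_line: str) -> str:
    """Slice the balance out of a screen line by fixed column positions."""
    return screen_line[31:41].strip()

# Works as long as the host's screen layout stays stable:
original = "ACCT00123 Jane Q. Customer       1,234.56"
print(scrape_balance(original))   # "1,234.56"

# But if the host application adds a two-character status prefix,
# every slice shifts, and the scraper returns a truncated, wrong value
# with no error raised at all:
updated = "* " + original
print(scrape_balance(updated))    # not "1,234.56" -- silently corrupted
```

The failure mode matters: the scraper does not crash, it quietly hands corrupted data downstream, which is exactly the risk described above.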

Now, with the increasing standardization of network based data and applications, some control over application development and/or procurement may be shifting away from IT departments to end users and managers in business units outside IT.

Some folks are excited about this evolution since it may portend more rapid development processes that directly respond to market and competitive pressures. More rapid development of useful applications has to be a good thing, especially when they are developed using an "agile" process that directly involves both users and developers in the creation process. And there are people who point out that externally developed and maintained applications may actually hold significant quality and performance advantages over internally developed applications. (One of the interviewees in my Web 2.0 Management Survey made this last point to me, and it reminded me of some of the original justifications for ERP systems.)

Sometimes when a "traditional" IT professional looks at an application development process managed outside the "traditional" IT department, he or she has to ask questions like the following:

  • How reliable is the data feed this application is counting on?
  • How reliable are the individual software components?
  • What happens when sensitive data is being updated and the PC or network goes down?
  • Who will be responsible for maintaining the infrastructure that supports this application?
  • Who will be responsible for making sure that the different components that make up this application are updated on a regular basis?
  • Who will be responsible for making sure that the different components that make up this application continue to work correctly with each other?
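One defensive answer to several of these questions is to make an application validate its external components rather than trust them. The sketch below is illustrative only (the record fields and function name are my own, not from any product mentioned here): a "mash-up" that joins two data feeds checks each record and fails loudly, instead of silently producing corrupt output when one feed changes shape.

```python
# Hypothetical mash-up sketch: join a customer feed with a balance feed,
# validating both sides before merging. Either feed may change format or
# drop records at any time, so the merge refuses to guess.

def merge_feeds(customers, balances):
    """Combine customer records with a balance lookup table.

    Raises ValueError on any malformed or missing record rather than
    emitting partially corrupt results.
    """
    merged = []
    for cust in customers:
        if "id" not in cust or "name" not in cust:
            raise ValueError(f"malformed customer record: {cust!r}")
        if cust["id"] not in balances:
            raise ValueError(f"balance feed has no entry for id {cust['id']!r}")
        merged.append({**cust, "balance": balances[cust["id"]]})
    return merged

# Normal case: both feeds agree.
print(merge_feeds([{"id": 1, "name": "Jane"}], {1: "1,234.56"}))

# Failure case: the balance feed dropped a record -- the merge raises
# instead of quietly emitting a record with a missing balance.
try:
    merge_feeds([{"id": 2, "name": "Joe"}], {1: "1,234.56"})
except ValueError as err:
    print("rejected:", err)
```

Fail-fast validation does not answer the governance questions above, but it at least turns "components quietly drifting apart" into a visible error someone must own.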

Call me a paranoid curmudgeon if you like. In my defense, I know what it's like to be involved with projects that are responsible for correctly handling millions of records containing customer financial data. The last thing a corporation's executive management wants is to sign off on financial reports where some of the software components involved might have been unreliable or flaky (can you say "Sarbanes-Oxley"?).

So the next time someone complains about how "slow" and "resistant to change" IT departments are when considering bringing the virtues of Web 2.0 and mash-ups in-house, just remember there might be a reason for the questions. That "slow moving IT executive" might actually be thinking strategically and in the best interests of the company, its employees, and its stockholders.

Would you like to comment on this article? I'd love to hear from you! Please use the comment function below or send an email to Dennis D. McDonald at ddmcd@yahoo.com.
