Today, more than ever, enterprises face significant challenges: rising development costs, budget cutbacks, and increasing customer demands, to name a few. Decision makers must determine which projects to fund and are often confronted with an existing technology infrastructure that already includes a hodgepodge of technologies, such as object-oriented systems, enterprise application integration (EAI), enterprise resource planning (ERP), and customer relationship management (CRM).
Against this backdrop, Web Services emerge promising many wonderful benefits, such as reduced integration costs and a low learning curve. How can a technology planner reconcile these promises with conflicting real-life experiences involving incomplete standards, missing security features, and other key issues? How should these gaps be addressed? Do Web Services replace existing technologies, or do they augment them? Has any firm adopted Web Services successfully? If so, what were the benefits and how were they achieved? What makes Web Services different from previous attempts at interoperability, such as the Common Object Request Broker Architecture (CORBA)? And why is it inevitable that Web Services, or something like them, will be adopted on a grand scale?
The answer is that early adopters of Web Services have indeed used the technology successfully to achieve a variety of goals, including providing better customer service (Putnam Lovell Securities), building a digital marketplace at reduced development cost (Pantechnik International), and saving millions of dollars by building a sophisticated procurement platform (Talaris).
An enterprise that is seriously considering using Web Services needs to identify which gaps are relevant and how to address them. Many of the gaps in the standards are rapidly being addressed by emerging technologies or by third parties such as Web Services networks.