
The keys to successful integration for outsourced service providers: a Pervasive Software white paper.

Overview

Outsourced service providers succeed by effectively handling a large number of trading partner and customer data feeds. Beyond ongoing business, they must also address competitive pressure to on-ramp more customers faster, which only adds to their data requirements. Companies that can quickly and efficiently manage the complexities of heterogeneous, high-volume data feeds will be able to take on even more clients, creating opportunities to grow revenue more rapidly. This is a difficult proposition, though, as many obstacles stand in the way of seamless integration of trading partner and customer data.

Major concerns that must be addressed in building, maintaining, and leveraging these data feeds revolve around data demands, mapping needs, communication protocols, and reformatting for business partners. This white paper will discuss these concerns and provide insight into developing effective solutions for outsourced service providers, including companies engaged in business data management and processing, transaction and payment processing, billing services, human resources outsourcing, payroll or accounting services, and direct mail or marketing services.

Data Demands

Building and maintaining systems and processes to leverage customer and trading partner data feeds can be expensive and difficult. The data can arrive in a wide variety of formats using a number of delivery methods: some customers might send XML via message buses, while others send flat files via FTP or EDI documents through a value-added network. Despite this heterogeneity of data sources and feeds, outsourced service providers must be able to provide high-speed "provisioning" of customers. Businesses capable of accepting this data regardless of format will have the competitive edge, and those that can do so at a reasonable cost will have an even greater advantage.

Some companies use custom coding to handle the transformation of data flowing to and from customer and trading partner data feeds. Custom code presents challenges for firms that seek business growth. If the data coming from the customer changes, the code must change as well, making it harder to maintain over time, and new customer on-ramp time may be slowed. And if the programmer leaves the company, the remaining programmers must decipher the code in order to maintain it. Most companies would prefer to reserve programming resources for building mission-critical applications, especially considering the wide variety of good integration tools available. Packaged integration tools allow the outsourced service provider to focus on core competencies while providing more flexibility, more scalability, and updated support for new data sources and targets that would otherwise have to be coded by hand.
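
To make the contrast concrete, the Python sketch below shows a mapping-driven transformation: the field map is data rather than code, so a change in a customer's feed means editing one table instead of a program. This is an illustrative sketch with hypothetical field names, not an example drawn from any particular integration product.

import csv
import io

# Target field -> source field. The names are hypothetical.
FIELD_MAP = {
    "AccountId": "cust_ref",
    "Amount":    "txn_amount",
    "Currency":  "ccy",
}

def transform(row: dict) -> dict:
    """Rename source fields to the hub's target schema."""
    return {target: row.get(source, "") for target, source in FIELD_MAP.items()}

sample_feed = "cust_ref,txn_amount,ccy\nC-1001,250.00,USD\n"
for row in csv.DictReader(io.StringIO(sample_feed)):
    print(transform(row))  # {'AccountId': 'C-1001', 'Amount': '250.00', 'Currency': 'USD'}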

Unstructured Data

According to a leading industry analyst, some 80 percent of a company's information is unstructured data. Unstructured sources might include print reports generated from legacy applications or information stored between the tags of HTML pages. With so much of the world's data trapped in these sources, outsourced service providers are almost certainly receiving unstructured data from customers. This type of data can be very difficult to convert into a format that internal applications can use. Integration tools that can mine pertinent customer data from non-relational and semi-structured sources therefore enable a service provider to leverage customer data from virtually any type of application.
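
As an illustration, the Python sketch below mines named fields out of one line of a hypothetical fixed-width print report, the kind of legacy output described above. The column offsets and field names are assumptions for the example.

# One line of a hypothetical fixed-width print report.
REPORT_LINE = "C-1001   ACME SUPPLY CO       250.00  2008-01-15"

def parse_report_line(line: str) -> dict:
    """Slice a print-report line into named fields by column position."""
    return {
        "account": line[0:9].strip(),
        "name":    line[9:30].strip(),
        "amount":  float(line[30:38].strip()),
        "date":    line[38:].strip(),
    }

print(parse_report_line(REPORT_LINE))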

Data Mapping Needs

The ability to accept and leverage data requires tools that can reduce time expenditures while ensuring easy, rapid quality control. Outsourced service providers must have tools in place to map multiple, disparate data formats from customers while addressing complex hierarchical structures, multiple record types, and packed fields.
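
As a simple illustration of the multiple-record-type problem, the Python sketch below dispatches each line of a hypothetical delimited feed to a layout based on its record-type code; the HDR/TXN/TRL codes and layouts are invented for the example.

# A hypothetical feed mixing header (HDR), transaction (TXN), and
# trailer (TRL) record types in one file.
FEED = """\
HDR|2008-01-15|ACME
TXN|C-1001|250.00
TXN|C-1002|75.50
TRL|2
"""

def parse_feed(text: str) -> dict:
    """Dispatch each line to a layout based on its record-type code."""
    batch = {"header": None, "transactions": [], "trailer": None}
    for line in text.splitlines():
        code, *fields = line.split("|")
        if code == "HDR":
            batch["header"] = {"date": fields[0], "sender": fields[1]}
        elif code == "TXN":
            batch["transactions"].append(
                {"account": fields[0], "amount": float(fields[1])})
        elif code == "TRL":  # trailer count cross-checks the batch
            assert int(fields[0]) == len(batch["transactions"])
            batch["trailer"] = {"count": int(fields[0])}
    return batch

print(parse_feed(FEED))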

If an integration plan is in place that includes tools with graphical data-mapping interfaces and a large number of data connectors, the problems posed by mapping complex customer data feeds can be greatly reduced. Without a strong integration strategy, however, problems will persist, ranging from determining whether data has even been received from a customer to working through a third party, perhaps a hosted application, to access critical data. Lack of control over original trading partner and customer data creates greater unpredictability about what to expect from customer data feeds. Worse, changes to the data at the endpoints can be crippling. Companies that deal with vast amounts of such data will want a data integration strategy that allows them to easily modify existing routines to account for changes in incoming data.

Reusability is key. The ideal integration platform will enable a service provider (or its customer) to make only slight modifications to data conversion routines through an easy-to-understand graphical user interface (GUI). Custom code can prove highly problematic: all too often, customers send data with changes that prevent a custom-coded solution from using it.

Communication Protocols

An outsourced service provider must maintain accountability beyond a simple file transfer. There are a wide variety of communication protocols, and a company must be able to connect to customer data feeds using SOAP, Web services, HTTP, message queues, and enterprise service buses. Outsourced service providers must take this variety into account and select an integration platform that provides access to all of these protocols--especially as data flows through service-oriented architecture (SOA) environments.
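
One way to think about this requirement is a single ingestion entry point that dispatches on the protocol named in a feed's URL. The Python sketch below handles HTTP, HTTPS, FTP, and local files through the standard library; the endpoint shown is a placeholder, and a real platform would add connectors for queues and service buses in the same way.

from urllib.parse import urlparse
from urllib.request import urlopen
from pathlib import Path

def fetch(url: str) -> bytes:
    """Retrieve a customer feed over whichever protocol its URL names."""
    scheme = urlparse(url).scheme
    if scheme in ("http", "https", "ftp"):
        with urlopen(url) as resp:   # urllib handles these three natively
            return resp.read()
    if scheme in ("", "file"):       # local path or file:// URL
        return Path(urlparse(url).path or url).read_bytes()
    raise ValueError(f"no connector registered for scheme {scheme!r}")

# fetch("https://partner.example.com/feeds/daily.xml")   # hypothetical endpoint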

High-Volume Data Feeds

As their businesses grow, outsourced service providers must deal with increasing volumes of data feeds from customers. The problem of realizing a coherent view of customer data is compounded when the variety and volatility of that data are taken into account. A good platform for integrating customer data can mitigate the problems presented by high-volume data feeds.

Reformatting For Others

For many outsourced service providers, an ideal integration platform will allow them to take information from customer data feeds regardless of format, integrate the pertinent customer information into their internal applications, and then reuse and reformat this data for submission to other business trading partners such as suppliers, payment processing firms, claims clearinghouses, or banks.

Data Flow Into and From a Data Hub

Once data is received, the first step is data profiling. Data profiling should be a required pre-process to determine what the data looks like and whether it is usable in its current format--in other words, to determine whether the data is clean or dirty. This step is very helpful at design time, when personnel are analyzing and debugging customer data to determine how to efficiently migrate the pertinent information into the customer data hub. At run time, if data profiling can be used as an executable step in the data conversion process, an outsourced service provider can establish a minimum threshold of data quality that determines which data can be placed in the customer data hub. In this scenario, the profiling step serves as a validation gatekeeper that "watches" to ensure that no dirty data enters a customer's data hub.
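
The gatekeeper idea can be expressed in a few lines. In the Python sketch below, a batch is admitted to the hub only if the share of records passing validation clears a minimum quality threshold; the rules and the 95 percent threshold are illustrative assumptions.

QUALITY_THRESHOLD = 0.95  # minimum share of clean records (assumed value)

def is_clean(record: dict) -> bool:
    """True if the record satisfies every validation rule."""
    try:
        return bool(record["account"]) and float(record["amount"]) >= 0
    except (KeyError, ValueError):
        return False

def gate(batch: list) -> list:
    """Admit a batch to the hub only if its pass rate clears the threshold."""
    clean = [r for r in batch if is_clean(r)]
    pass_rate = len(clean) / len(batch) if batch else 0.0
    if pass_rate < QUALITY_THRESHOLD:
        raise ValueError(f"batch rejected: only {pass_rate:.0%} of records are clean")
    return clean

batch = [{"account": "C-1001", "amount": "250.00"},
         {"account": "C-1002", "amount": "75.50"}]
print(gate(batch))   # both records pass, so the batch is admitted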

The next step after profiling is converting or transforming the data into a useful format.

The objective is to take a massive variety of data and aggregate, cleanse, and transform it into the format needed inside the data hub. Usually, this data can be moved into internal business applications either directly or through an intermediate flat file. This data might even be put into a database for analytics.

The ideal integration solution not only accounts for incoming customer data pipelines but also allows reuse of the same tool set to build outbound data pipelines for business partners and suppliers. Moreover, the solution should be highly versatile and scalable, so that the benefits of the integration tool can be pushed out to the customer site, behind the firewall, ensuring that incoming data from the customer feed always arrives clean. An integration engine with a small footprint can enable an outsourced service provider's customers to produce the clean data themselves. In this scenario, the integration product could be sent to customers along with standard templates and validation rules that determine the format to which data is mapped.
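
The reuse argument is easiest to see in code. In the sketch below, the same map-driven transform builds both the inbound (customer to hub) and outbound (hub to partner) legs of the pipeline; only the field maps differ. All field names are hypothetical.

def remap(row: dict, field_map: dict) -> dict:
    """Apply a target-field -> source-field map to one record."""
    return {target: row.get(source, "") for target, source in field_map.items()}

INBOUND  = {"AccountId": "cust_ref", "Amount": "txn_amount"}  # customer -> hub
OUTBOUND = {"acct_no": "AccountId", "value": "Amount"}        # hub -> partner

customer_row = {"cust_ref": "C-1001", "txn_amount": "250.00"}
hub_row      = remap(customer_row, INBOUND)   # inbound leg
partner_row  = remap(hub_row, OUTBOUND)       # outbound leg reuses remap()
print(hub_row, partner_row)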

What to Look For in Data Conversion and Integration Tools

A Highly Scalable Platform

An outsourced service provider should look for a tool that is scalable at design time and at run time. When dealing with large customer data feeds, the service provider needs a fast integration engine that can process large amounts of data and on-ramp customers as soon as possible. The software industry in general--and companies in the integration tools market in particular--grossly underestimates the importance of design-time performance to integration projects.

An Open Repository

Outsourced service providers should require an integration solution with an open repository of conversion and integration metadata. While almost any integration tool on the market can function as a black box, one that offers open conversion metadata adds an extra layer of flexibility to a company's total application stack.

Many Data Connectors

While scalability at design time is of great importance, outsourced service providers should also look for integration tools that offer scalability at run time and agility for future needs. To work most effectively in this environment, an integration tool needs a large number of data connectors to account for the variety of data formats a provider must accept in order to provide better value to customers. For example, in the healthcare and insurance sectors, solutions that support the HIPAA, HL7, ACORD, and EDI industry standards are important. In addition, solutions that offer an open, standards-based Data Mediation Services (DMS) connector will further enable outsourced service providers to respond to changing demands for future connectivity.

The tools must be able to easily leverage data from multiple infrastructures, including databases, legacy files, multiple Internet protocols, enterprise service buses, SOA environments, and even unstructured data formats such as mainframe print dumps and tagged HTML. In addition to having a host of connectors, the integration solution should be able to operate its run-time functions on multiple platforms.
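
A common way to keep a growing set of connectors manageable is a registry that maps each format to a reader, so new connectors are added without touching the dispatch logic. The Python sketch below is a toy version of that pattern, not a description of any vendor's connector architecture.

import csv
import io
import json

CONNECTORS = {}

def connector(fmt):
    """Decorator that registers a reader function for a data format."""
    def register(fn):
        CONNECTORS[fmt] = fn
        return fn
    return register

@connector("csv")
def read_csv(data: str):
    return list(csv.DictReader(io.StringIO(data)))

@connector("json")
def read_json(data: str):
    return json.loads(data)

def load(fmt: str, data: str):
    if fmt not in CONNECTORS:
        raise ValueError(f"no connector for format {fmt!r}")
    return CONNECTORS[fmt](data)

print(load("csv", "account,amount\nC-1001,250.00\n"))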

A Sophisticated Management Framework

A tool set that offers a sophisticated management platform for integration projects--both at design time and at run time--is important to success. One great challenge faced by outsourced service providers is adapting to changes in incoming data. This consideration becomes important at design time, when the group tasked with creating integration routines for customer data feeds must perform impact analysis on changes to existing routines, or search across routines to see whether similar work has already been completed and can be reused. A run-time management framework is also needed by those with data integration routines running at multiple locations. Those responsible for overseeing the conversion of data from customer data feeds will want tools that let them manage, schedule, and monitor data conversion and integration routines in real time--with actionable real-time reporting on the performance of these routines.
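
At its simplest, run-time monitoring means every integration routine runs under a wrapper that records duration, record counts, and failures--the raw material for the real-time reporting described above. The Python sketch below illustrates the idea; the routine name is hypothetical.

import logging
import time

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")

def monitored(name, routine, *args):
    """Run an integration routine, logging its outcome and duration."""
    start = time.perf_counter()
    try:
        result = routine(*args)
        logging.info("%s: OK, %d records, %.3fs",
                     name, len(result), time.perf_counter() - start)
        return result
    except Exception:
        logging.exception("%s: FAILED after %.3fs",
                          name, time.perf_counter() - start)
        raise

# A trivial stand-in routine; a real one would parse and load a feed.
monitored("acme-daily", lambda: [{"account": "C-1001"}, {"account": "C-1002"}])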

Data Profiling

Data conversion projects will be much more efficient if service providers can examine data prior to loading internal applications and determine whether it meets the threshold of data quality required in the data hub. A data profiler can serve as a pre-conversion validation tool, ensuring that the customer data being loaded into systems is clean and useful. Dirty data can result in significant expense and lost work time if not detected and addressed.
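
Design-time profiling can start with very simple per-field statistics. The Python sketch below computes fill rate and distinct-value counts for a tiny hypothetical batch--the kind of quick look that shows whether a feed is clean enough to load before conversion work begins.

from collections import Counter

def profile(records: list) -> dict:
    """Compute fill rate and distinct-value count for each field."""
    fields = {f for r in records for f in r}
    stats = {}
    for f in sorted(fields):
        values = [r.get(f) for r in records]
        filled = [v for v in values if v not in (None, "")]
        stats[f] = {"fill_rate": len(filled) / len(records),
                    "distinct": len(Counter(filled))}
    return stats

print(profile([{"account": "C-1001", "amount": "250.00"},
               {"account": "",       "amount": "75.50"}]))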

Professional Services

Data conversion and integration projects that involve a variety of changing data types are inherently difficult. While outsourced service providers should look for integration tools that allow them to do the majority of the conversion work without paying for services, they will want an integration partner that provides consulting services, whether product training, analysis of existing conversion routines, or complete outsourcing of integration routine creation. An integration partner with strong service offerings provides the extra assistance that ensures the outsourced service provider becomes productive in a short timeframe.

Conclusion

Outsourced service providers must reduce the complexity, costs, and risks associated with their integration deployments if they expect to compete effectively. By selecting the right data conversion and integration solution, providers can on-ramp customers more quickly and increase the number of data feeds they handle.

Utilizing a versatile, configurable integration architecture for rapid implementation, superior scalability, low total cost of ownership, and high ROI better ensures success. By using a comprehensive set of easy-to-use visual design tools, a company can rapidly build and test integration processes--regardless of size and complexity--across hundreds of data formats and applications, within and outside the enterprise. The complete line of Pervasive data infrastructure software for database management and integration enables businesses to manage, integrate, analyze, and secure mission-critical data with the industry's best combination of performance, reliability, and total cost of ownership. Pervasive has extensive experience in transforming all types of data: message formats such as XML and EDI; SOA and message buses; flat files; legacy data sources such as COBOL, ISAM, and VSAM; industry standards such as HIPAA and ACORD; and popular relational databases.

www.pervasive.com

RELATED ARTICLE: Pervasive at Metamorphosis 8.0

May 28-29, 2008, San Jose, California

Metamorphosis 8.0, to be held at the San Jose Marriott, will be the meeting place for SaaS vendors, ISVs, data services providers, and BPOs, who will present their own integration challenges and strategies.

Sessions focus on vendors' experience with leveraging agile integration and are organized into three tracks:

* SaaS Integration

* Embedded Integration for ISVs

* Integration in the Data Center

You will hear about real-world business and technical strategies and you'll come away with an unparalleled competitive advantage. Take the opportunity to learn directly from experienced executives; there will be ample time for networking, partner development and business development.

In addition, there are pre-event product training sessions that will provide your technical directors invaluable insight into fundamental features and best practices. For more information on training, please email janelle.nunn@pervasive.com