Immersed in technology.
THE POSTWAR MORTGAGE BANKING INDUSTRY HAS SEEN MAJOR CHANGES, not the least of which has been in the area of automation. As recently as a decade ago, in the innumerable "mom and pop" mortgage operations, the industry's automation could have been defined as a printing calculator. But the rise of dominant industry players, many of them part of large, sophisticated banking entities, brought a major surge in automation to the industry.
The major focus for technology in this industry to date has been on administrative, or "back-office," systems that support day-to-day transaction processing, such as application taking, loan processing and tracking, fee collection, servicing and the like. Only recently have we come to focus on systems that support tactical or strategic decision-making, planning and analysis or sales and marketing. Now we are entering an era where technology will be applied to transform business processes.
In this regard, we are no different from most other industries that also focused on automating the more routine, back-office functions, where there were clearly defined processes and a limited number of potential outcomes. Contrast these kinds of straightforward business activities with decision-oriented or managerial tasks, and it becomes clear that the latter present much more difficult automation challenges.
To meet this challenge, we need to understand key principles of information systems management.
Principle one: We are all information managers
A good deal of what I address in this article applies to information systems management in general, not just in mortgage banking. Indeed, I would go further and assert that most of what I focus on applies to management in general. For one of the key principles we must adopt is that information systems management is the responsibility of all managers, not just information technology (I/T) managers. Just as war is "too important to be left to the generals," technology is too important to be left to the technologists. For business in the 1990s, all managers have to find ways to effectively manage their assets: human, financial, material and technological.
This recognition is nowhere better expressed than in a survey of mortgage banking CEOs done by Hamilton, Carter, Smith and Company, Los Angeles, and published in June 1992. The CEOs were asked about their most important current concern, and the three most important issues for the future. Overwhelmingly, the most important current concern was the regulatory environment. The most frequently cited future concern was the use of technology or automation. In most cases, the CEOs saw technology as providing the opportunity to reduce costs or improve service.
The survey findings demonstrate an increasing consciousness about the role of executives in identifying and understanding how technology can be used to support their business. Just how this new understanding can be exercised is nicely defined in a study by two Florida State University professors, R. Zmud and V. Sambamurthy. (Their study, "Managing I/T for Success: The Empowering Business Partnership," was published in 1992 by the Financial Executives Research Foundation.)
In their study, based on in-depth profiles of 18 firms in a variety of industries, they identified six core competencies for enterprisewide information technology management and assigned responsibility for these competencies to three sets of business managers: senior executives, business-unit managers and information technology management.
Senior executives are responsible for creating a "high-tech" organizational culture, meaning one that fosters innovation through risk-taking and experimentation, restructures work processes to leverage I/T opportunities and sponsors information technology initiatives from the top. Business-unit managers are responsible for fostering technology-based strategic alliances with customers and suppliers. Information technology managers are responsible for building appropriate infrastructure strategies, leveraging external resources and ensuring that the applications portfolio is continually analyzed to determine which systems need to be created, enhanced or redone. I/T managers are also responsible for such traditional information technology activities as planning, evaluation and control, skill development, methodology, security, standards, disaster recovery and operational soundness.
The three groups share responsibility for formulating and implementing technology strategies and for facilitating organizational alignment. This means all managers share a responsibility for educating themselves about relevant technologies, developing a method to assess the value of information technology activities, cultivating an image of credibility for I/T throughout the company, and developing formal and informal working relationships among I/T staff and line managers.
Principle two: People issues at the core
The second key principle we must adopt is that at the heart of all information systems management issues are people issues. For those of us in information systems, this comes down to managing a set of relationships that are depicted in Figure 2.
The most critical relationship is obviously with our business clients. But we must also develop and forge ties with vendors and keep abreast of trends in both the mortgage and technology industries. With our business clients, we must have an open and ongoing dialogue and shared activities that aid the successful translation of tactical needs and strategic initiatives into information systems.
With our vendors, we must clearly communicate our direction and ask them to be partners with us in making the very best and most cost-effective technology choices. Virtually no one in the mortgage lending business has successfully implemented a major information system in recent years without the active assistance of a vendor, regardless of whether we buy or build our solutions.
Identifying and understanding the implications of developments in the technology marketplace is also important. For this we typically rely on a personal network of contacts to separate the winners from the also-rans. And finally, organizations such as the Mortgage Bankers Association of America (MBA) provide I/T practitioners with the opportunity to share ideas, set standards and forge electronic links that lower costs and increase the efficiency of the industry as a whole.
Principle three: Technology to transform the business
Thus, we have to build relationships in a variety of sectors if we are to be successful. We also need to change our view of technology from something for record-keeping to something for record-setting, from "back office" to "front line." That is to say, we must move away from the easy and convenient path of building systems that track what the business is doing, to information systems that transform the way business is done.
Figure 3 represents this trend, with the vertical axis showing the relative business impact of certain automation solutions and the horizontal axis representing time, roughly 1960 to the year 2000. The bottom curve shows technology having a growing effect on those functions that directly interact with the marketplace, and thus an ever-increasing impact on the business.
Looking at the graph in another way, we see the dashed curve sloping downward. Things used to be cheap because we could have people do them; now, people are no longer cheap. Thus, the more manual the process, the more expensive it is. These are the economics that mortgage industry CEOs are seeing today.
The most significant stage in this trend so far has been the implementation of transaction-processing systems. Transaction-processing systems--still the main focus of the industry's automation efforts--allowed us to do things faster. Why? Because they used the huge power of mainframe computers to take something that would have taken days and condensed it to a matter of minutes.
Transaction-processing systems did for America's service industries what the assembly line did in manufacturing. We could break large and complicated processes down into small bits called transactions. We could train people to handle a limited number of transactions rather than an entire process, in the same way that auto workers could be taught to screw on door handles rather than build an entire automobile. This meant we could employ relatively low-skilled workers and make them very productive. The key to profitability for American business throughout most of the 20th century was rooted in this equation.
Transaction-processing systems also provided us with the ability to compile operational histories, to create management reporting systems to see where we had been and project where we were going. We could begin to show trends and analyze the meaning of those trends. That is why transaction-processing systems led to decision-support systems (DSS), whose users are not the workers but the analysts and managers who are responsible for planning the future. In their highest form, DSS become executive information systems that allow our very senior management to interact with the data collected at the lowest levels of the business.
Principle four: A commitment to change
Transformative technologies, or technologies whose implementation is either the cause or effect of a reengineering of applicable business processes, will be similarly significant. Transformative technologies bring with them fundamental changes in the way we conduct a particular business process. The corollary is that transformative technologies can be successfully implemented only if the business agrees to fundamentally change the way it does business.
This fourth key principle is truly critical. Yet it raises a number of questions, most importantly: What are the existing financial, cultural and procedural barriers to the implementation of transformative technologies? To answer that question, I will briefly discuss seven transformative technologies shown in Figure 4.
Portable computing

Even two years ago, portable computer technology would have been prohibitively expensive for any but the largest firms. Now we are riding the technology price curve that is bringing laptops and notebooks into the mainstream. Providing a loan officer with a portable computer means applying point-of-sale technology to the mortgage banking industry. This will bring essential application data to the processing and decision-making stages more quickly, more accurately (no more handwritten applications) and with less redundant data entry. Portables also enable us to reduce the physical plant (office space, facilities) needed to support a distributed work force.
Portable computing thus has the potential to make both loan officers and processors more productive. It also has the potential to change their relationship with each other, as well as their view of their roles as employees of the company. Loan officers have frequently preferred having processors with whom they can interact directly. Will shifting more of the processing to the sales force change their view of the processor? Will processors see this as a positive change? One advantage of the change is that certain processors can be trained to make the easy underwriting decisions, leaving the more difficult cases for experienced underwriters. This has the effect of performing a triage on loans, with the "no-brainers" going through an accelerated approval process without incurring substantial risk.
The interaction between the loan officer, borrower and real estate agent will be affected as this technology becomes widespread. Will the agent see the computer as a step up in the mortgage company's service, and will this result in increased referrals? Will the consumer find the technology intriguing or irritating? In today's environment, and particularly in more upscale markets, a computer-literate buying public is likely to be attracted to the technology. In rural or less advantaged areas, the reaction to the introduction of this technology is less certain. Like anything else, it's a matter of knowing your market.
Standardization of application taking will probably improve because the system on the computer can automatically check for compliance and best practices. Yet those loan officers who resist standards may well view automation as infringing on their autonomy. If they are highly successful, they may resist "changing a winning formula." How can such resistance be handled?
It is also important to understand the factors contributing to this resistance. If loan officers are asked to purchase their own machines, is the financial commitment onerous? If so, efforts can be made to subsidize the purchase. Do they doubt their ability to master the technology? A personalized training program and ongoing support, announced as part of the rollout, can address this.
Certainly the best approach is for senior management to clearly articulate the vision, as Zmud and Sambamurthy state in their study. This vision should clarify the role of portable computing (or any new technology, for that matter). Otherwise, there will be a tendency to see the technology as a fad or something that lacks full-blown commitment at the top. Even if the sales force can't yet see portable computing as a benefit, at the very least they must see it as inevitable.
Electronic imaging

Electronic/workflow imaging has largely found its niche in the mortgage industry as an alternative archival medium that allows documents to be scanned and stored electronically. Increasingly, however, imaging is being applied to work-flow management, allowing speedier, more efficient and more accurate retrieval and processing of documents.
I define work-flow management as a means of segmenting the work process into time- or event-based activities to be performed by a work group. Intelligent work-flow systems provide assistance to management by identifying work queues and available resources and matching these for optimum efficiencies. They provide assistance to staff through the use of prompts and reminders and generation of productivity statistics.
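To make the queue-matching idea concrete, here is a minimal sketch; the names, skills and queue categories are invented for illustration, and no particular vendor's product is implied:

```python
# Hypothetical sketch: pair queued loan files with available staff.
# Names, skills and queue categories are invented for illustration.
from collections import deque

queues = {
    "open": deque(["file-101", "file-102"]),
    "underwrite": deque(["file-095"]),
}

staff = [
    {"name": "Ana", "skills": {"open"}, "busy": False},
    {"name": "Ben", "skills": {"open", "underwrite"}, "busy": False},
]

def assign_work(queues, staff):
    """Greedily give each idle person a file from the deepest queue they can serve."""
    assignments = []
    for person in staff:
        if person["busy"]:
            continue
        eligible = [q for q in person["skills"] if queues[q]]
        if not eligible:
            continue
        q = max(eligible, key=lambda name: len(queues[name]))
        assignments.append((person["name"], q, queues[q].popleft()))
        person["busy"] = True
    return assignments

assignments = assign_work(queues, staff)
print(assignments)
```

A production system would also weigh due dates, productivity statistics and prompts, but the core of the matching logic is no more mysterious than this.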
Imaging technology holds the promise of radically reducing our industry's dependence on paper, substituting instead the electronic form as the document of record. Just as banks no longer ship paper checks back and forth for clearance, relying instead on electronic transmission, we in mortgage banking can capture the image of our innumerable documents at the point of entry and move the file through the various processing steps logically, rather than physically. This reduces the possibility of lost documents, dramatically reduces the physical storage needs in the actual work environment, improves management control over the process and provides better status reporting to employees and customers.
Working with the image of the document, rather than the document itself, means that multiple employees can access the information on the document at once. From an organizational perspective, this changes the work flow from sequential to parallel processing. Rather than the assembly-line model, imaging allows for a more team-based approach, with a group of openers, processors, underwriters, analysts, customer service representatives, closers and others interacting with the file at appropriate moments. In fact, imaging technology is almost a prerequisite for this type of organizational change, which the insurance industry and others have experimented with successfully.
Imaging works best when it utilizes a team-based approach grounded in work-flow technology. We have traditionally assigned work based on an individual's prior experience, personal interest or other subjective criteria. This new approach instead seeks to assemble well-qualified teams operating in accord with uniform standards. Reducing the reliance on any given individual makes it easier for management to balance work across teams, thereby reducing the need to "gear up" or "gear down" in response to volume changes.
It should be noted that imaging is best considered a "way station" technology in the move from paper to true electronic processing. Reading an on-screen image is better than chasing paper, but inquiring against a data base is still the most efficient mode of processing. Where imaging holds great promise is in capturing graphic information (e.g., property photos).
Expert systems

How can expert systems transform a company? For the purpose of this article, I define expert systems as software that incorporates facts and human experiences into a knowledge base that can be referenced in such a way as to arrive at judgments that are as accurate as those routinely achieved by human experts.
Expert systems offer the promise of greater consistency, because you can either establish rules or extract the decision logic from actual cases and use the system to enforce compliance to the rules or logic. Expert systems can also be excellent training tools because they embody the experience of multiple people.
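As a purely illustrative sketch of that rule-based approach, consider the fragment below; the thresholds, field names and rules are invented for this example and are not actual underwriting guidelines:

```python
# Hypothetical rule-based triage: auto-approve the obvious cases, refer the
# rest to a human underwriter. Thresholds and rules are illustrative only.

RULES = [
    ("debt-to-income ratio too high", lambda app: app["dti"] > 0.36),
    ("loan-to-value ratio too high", lambda app: app["ltv"] > 0.80),
    ("insufficient credit history", lambda app: app["years_credit"] < 2),
]

def triage(app):
    """Return a decision plus the reasons behind it. Like a human expert,
    an expert system should be able to justify its judgments."""
    reasons = [name for name, failed in RULES if failed(app)]
    return ("refer", reasons) if reasons else ("approve", [])

decision, reasons = triage({"dti": 0.28, "ltv": 0.75, "years_credit": 7})
print(decision)  # → approve
```

Because every decision traces back to named rules, the same rule base can double as a training tool: a new underwriter can see exactly which criterion a referred loan tripped.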
The downside to expert systems is that they engender the fear of losing control of a process to a "soulless machine." Users might well ask: What if there's a flaw in the system, some judgment parameter we forgot? What if it's using invalid data for its decisions?
Expert systems almost inevitably engender fear about job security. Yet, does anyone know of an underwriter who's actually lost a job as a result of an expert system implementation? Interestingly, whenever I ask that question in any industry, no one is ever able to supply me with a name. Still, there are pervasive fears that "my people will lose their jobs," "I will have fewer people," "I will have less-skilled people" and so forth. Such fears need to be addressed by an implementation team.
Another organizational factor to consider is that, in most lenders' operations, underwriters are viewed as an essential check and balance on a sales function that would obviously want every loan application approved. Underwriters, who look at longer term risk, provide this aspect of quality control. Can an expert system perform this same function?
Expert systems can be liberating for more experienced underwriters, freeing them up from more mundane decisions to concentrate on more difficult cases. This will test their ingenuity and creativity, which can be a positive development. On the other hand, it increases the pressure and the impact of making an error. We may want to strive for more of a balance, leaving some easy cases for the senior underwriters to handle as well.
Similarly, existing underwriting productivity measurements would need adjusting. If an underwriter could handle eight loans per day in a mixed environment where the work consisted of part easy loans and part hard loans, what should the performance standard be when the percentage of difficult cases substantially increases? If we based our cost/benefit analysis for the project on the old volumes, the payoff could be a long time in coming.
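The arithmetic behind recalibrating that standard can be sketched directly; the per-loan review times and workload mix below are hypothetical:

```python
# Hypothetical illustration: once an expert system siphons off the easy loans,
# the per-day standard for human underwriters has to be recomputed.

HOURS_PER_DAY = 8
easy_hours, hard_hours = 0.6, 2.0   # assumed review time per loan
hard_share = 0.3                    # assumed share of hard loans in the old mix

# Old standard: average hours per loan across the mixed workload.
old_avg = (1 - hard_share) * easy_hours + hard_share * hard_hours
old_standard = HOURS_PER_DAY / old_avg        # loans per day, mixed work

# New standard: the system handles the easy cases; humans see only hard ones.
new_standard = HOURS_PER_DAY / hard_hours

print(round(old_standard, 1), round(new_standard, 1))  # → 7.8 4.0
```

Under these assumed numbers, an underwriter who handled roughly eight loans a day in the mixed environment can sustain only about four once the easy cases disappear, so holding the old standard, and the old cost/benefit baseline, would quietly double the real workload.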
To initiate a switch to expert systems, we must first understand the underwriting process and whether or not it is functioning correctly in our operation. Does underwriting, for example, occur at the right time, or should it be moved earlier in the process? Are we creating difficult underwriting cases because of sloppy documentation upfront, or by not doing it right the first time?
We must define both the current and the desired process. To do so, we can't look at underwriting in isolation, but as part of an entire process flow. Only then can we decide how underwriting should be done in the future. And it is around that vision of the future that we will implement our system. The decision process must be built into the expert system. The expert system must also be able to explain (justify) its decision, just as human experts do.
It is critically important that the underwriters buy in to the vision. Otherwise, there will be endless opposition to the implementation. This can take both direct and indirect forms, the latter primarily involving underwriters not making themselves available for analysis, design and training sessions, inappropriate selection of data to build the knowledge base and other delaying tactics.
Integrating expert systems into your operation is not something you do "to" someone; it's something you do "with" someone. The same is true for imaging and the use of laptops by field originators. As we said at the outset, the introduction of these transformative technologies has to be carefully managed by the appropriate operational managers in conjunction with I/T.
Client-server architecture

The move toward client-server architecture is a much-discussed trend in information technology today. Although definitions vary widely, at its core client-server refers to the migration away from "dumb" terminals linked to mainframe or midrange computer platforms and toward intelligent devices (e.g., PCs), so that a given application can be processed on more than one platform at a time. This allows each platform to do what it does best: the PC (client) provides a user-friendly interface and seamless access across applications; the host (server) handles data storage and retrieval, inter-machine connectivity and heavy-duty processing.
There is no longer any question that users much prefer PCs to terminals. PCs can run graphical user interface (GUI) programs, such as Windows, which substantially reduce time spent on the learning curve and promote higher productivity. GUIs are more intuitive and can better mimic the actual nature of work than the old, character-based screens. Moreover, GUIs can provide a greater consistency to applications, so that a user only has to learn how to access, delete, print, etc., one time and can then do so for all programs. Even the visual presentation of information is vastly improved, making computing easier to tolerate for those who did not grow up in the information age and more fun for those who have.
The difficulty with PCs is that, despite great leaps in power and performance, they still have processing limitations. Even when PCs are linked together in local area networks (LANs), some applications will simply be too big to run efficiently for hundreds of users. Some software vendors overcome this limitation by designing their products to run on multiple LAN servers; for others, a single server remains the better choice for performance reasons. Client-server computing overcomes the hurdle by breaking up an application so that part of it runs on the intelligent workstation and part on a more powerful host (including the large mainframe computer).
Client-server computing provides a mortgage company with a great deal of flexibility in implementing technology. Because there are literally thousands of PC-based packages available, an I/T department can create a customized suite of applications supporting different functions: sales, customer service, trading, portfolio management and so forth. We are no longer tied to a single manufacturer and its proprietary solutions.
There is ongoing debate about the price of client-server versus that of traditional technology. One camp argues that mainframes and even midrange machines are too expensive, both in terms of direct costs for the hardware and the software licenses required and the indirect costs of large teams of expensive specialists to keep the architecture functioning properly. The other camp maintains that once you figure in all costs, it's a wash. However, no one can deny that PCs and LANs allow for incremental growth, so that a company can budget for several $30,000 upgrades rather than one costing $3 million.
So, why isn't everyone moving to client-server architecture? For one thing, firms have a huge investment in existing ("legacy") systems. With time, these systems have been enhanced to provide specific solutions to business needs. Reengineering the architecture of these legacy systems will not come cheaply, whether in the PCs needed to replace the dumb terminals or in the transformation of expertise within the I/T unit itself. Further, much of the technology is not yet "bullet proof." It does not have the same track record in terms of reliability and operational control as the mainframe. This is not surprising: the mainframe has been around for 30 years; PCs, for 10. It will take a while to catch up.
Moving to client-server means putting more computing power directly in the hands of end users. In the mainframe world, I/T could control everything on the machine. With PCs, users may experiment with packages or use software in ways no one anticipated. This is both the promise and the danger of this new form of computing. Companies have to have a good program of user education and support in place before moving to this architecture. And business managers have to be prepared for more-sophisticated use of automation.
Integrated information retrieval
Mortgage companies, like most financial services firms, are masters at acquiring information. Thousands of pieces of data are acquired from each borrower. We can obtain detailed profiles of application trends, approval ratios, demographic information, relocation patterns, runoff percentages and the like. In any given year, a mortgage company obtains hundreds of pieces of data about employees: production volumes, productivity, compensation, training, quality rates and performance ratings. Making this information available to management in a way that facilitates meaningful decision-making is the objective of information-retrieval systems.
As shown in Figure 5, these systems vary from commonplace items like daily production or lock reports, to batch reporting systems that provide regulators with necessary compliance information, and on up through DSS that allow for cross-tabulation of data and the discerning of trends, and finally to executive information systems (EIS). This last type of system allows for more-sophisticated analysis: product performance, profitability analysis, the relationship between training obtained and performance achieved, regional and local variations and so forth.
As elegant as this concept of the data hierarchy appears, most companies have a great deal of difficulty with its implementation. For one thing, the data usually resides in many systems, often on different hardware/software platforms. Even if on the same platform, it may be in incompatible forms, needing "translation," if it is to be merged together in a meaningful fashion. Data definitions may be inconsistent: For example, APR may be calculated one way in system A, another way in system B. Merging them makes for "apples and oranges" comparisons. A common item like "property address" may have totally different formats in different systems--which is the right one?
Thus, one of the first steps toward the implementation of a good reporting system is to clean up the data and build interfaces that allow it to be stored in one reliable data base that can be accessed by multiple users. Today, that usually involves a relational data base management system (RDBMS), which stores data in tables that can be manipulated to provide customized views of the data. Thus, a sales manager can look at the data in one way; a customer service manager can look at the same data in a different way. The data only has to be stored once, thereby making for great efficiencies in processing.
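A miniature sketch of the "store once, view many ways" idea, using the relational engine bundled with Python; the table, column and view names are invented for illustration:

```python
# Hypothetical sketch of one shared data base serving two different managers
# through customized views. All names and figures are invented.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE loans (
    officer TEXT, region TEXT, amount REAL, status TEXT)""")
con.executemany("INSERT INTO loans VALUES (?, ?, ?, ?)", [
    ("Lee",   "East", 150000.0, "approved"),
    ("Lee",   "East",  90000.0, "pending"),
    ("Patel", "West", 200000.0, "approved"),
])

# The sales manager's view of the data: production volume by officer.
con.execute("""CREATE VIEW sales_view AS
    SELECT officer, SUM(amount) AS volume FROM loans GROUP BY officer""")

# The customer-service manager's view of the same rows: open files by region.
con.execute("""CREATE VIEW service_view AS
    SELECT region, COUNT(*) AS open_files FROM loans
    WHERE status = 'pending' GROUP BY region""")

print(con.execute("SELECT * FROM sales_view").fetchall())
print(con.execute("SELECT * FROM service_view").fetchall())
```

Each view is just a stored query over the same underlying rows, so neither manager's perspective requires a second copy of the data.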
Of course, to set up such a system requires specialized data base administrators. They will work closely with end-user computing support specialists to design the on-line queries that generate customized screens of information or hard-copy reports. Business managers working with these specialists will need to clearly define their information needs and get used to working in a quicker automated world, rather than the old paper-based environment.
Some business managers find this a great opportunity, others a real challenge. After all, scores of middle managers have made a good living manipulating the data for presentation to the executives. DSS and especially EIS override this mode of operation, letting the data speak for itself. Some may find this disquieting and seek to delay implementation. Other managers may think they "own" the data (e.g., sales production by loan officer) and may not wish to share it with other areas. As before, senior management is best advised to clearly communicate the benefits associated with enterprise-wide information reporting.
Electronic data interchange (EDI)
Mortgage banking is still largely a paper-based industry. Even though enormous amounts of data are collected electronically, when we exchange it with others outside the firm, we most likely do it in paper form. The most obvious example is the loan files that are transferred when servicing portfolios are acquired or sold. However, mortgage insurance certificates, property appraisals, hazard insurance policies, credit reports and title insurance policies are all items that lend themselves to electronic transmission.
MBA's Technology, Loan Administration and Residential Loan Production Committees have spent many years examining these issues. After much hard work, a number of standards have been submitted to the American National Standards Institute (ANSI) for approval in 1994. Mortgage companies can now begin transforming their software to send and accept electronic transmissions from external sources.
On behalf of MBA, Gartner Group Consulting Services, Stamford, Connecticut, in October of last year completed a study on the cost/benefits of EDI. In that document, the consultants note that:
* Loan, borrower and property standards will shorten the funding cycle, on average, by 3.21 days, allowing the typical company to fund an additional 3,276 loans annually.
* Product data model standards will shorten the funding cycle, on average, by 2.92 days, which will result in an annual increase in productivity per loan processor of 9 percent.
* A shortening of the cycle also means a percentage of loans will be delivered a month earlier to investors. Lenders enjoy a favorable rate differential for this earlier pool settlement date. Taking into account this favorable differential and warehouse "spreads," and applying these numbers to five-year projections of industry loan volumes, Gartner Group estimated that over five years, on a net present value basis, the industry could achieve $567,810,408 in savings.
* Costs were analyzed on an individual company basis. The overall average response indicates that the "typical" lender will incur $206,681 to implement the proposed EDI standards.
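The net-present-value method underlying those projections can be shown in miniature; the implementation cost, savings stream and discount rate below are invented stand-ins, not the Gartner figures:

```python
# Hypothetical net-present-value sketch. The cost, savings stream and
# discount rate are invented stand-ins; only the method is of interest.

def npv(rate, cash_flows):
    """Net present value of end-of-year cash flows at the given annual rate."""
    return sum(cf / (1 + rate) ** year
               for year, cf in enumerate(cash_flows, start=1))

implementation_cost = 200_000        # assumed year-0 outlay
annual_savings = [60_000] * 5        # assumed savings in each of five years
value = npv(0.08, annual_savings) - implementation_cost
print(round(value))  # → 39563
```

Note how discounting matters: $300,000 of nominal savings against a $200,000 outlay looks like $100,000, but it is worth only about $39,563 once the time value of money is applied at 8 percent.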
The MBA Technology Committee is committed to raising awareness among industry participants about the standards, working with vendors to ensure their fast implementation and promoting their use throughout the industry.
Although it makes great sense from an efficiency standpoint, moving toward EDI does require specialized personnel to implement the standards correctly. Partnership is essential between sending and receiving institutions. Unfortunately, the great promise of EDI will not be realized until we reach a critical mass whereby a significant proportion of the major players are willing to transmit electronically.
The first few companies will struggle through implementation, making the way easier for everyone else. Assuming they choose their partners well, they should gain significant processing efficiencies that should show up on the bottom line. Then the smaller players will have to run to catch up. Thus, the choice for mortgage companies is not "whether" EDI but "when."
Voice communications

The final transformative technology is voice communications. A number of technologies are available to improve customer service. The best known is voice mail, whereby people both within and outside the organization are able to communicate (albeit one-way) regardless of time of day or personal availability.
Many organizations, unfortunately, do not use voice mail to their advantage. A common mistake is to allow outdated or generic greetings. Greetings should be updated weekly (daily, for heavy customer-contact jobs) to indicate the person's whereabouts, expected response time and how to reach a live person, if desired. Employees should not let voice mail become a crutch, routing calls to an automated attendant when they could take the call. Care also should be taken to avoid the endless loop of bouncing from one voice mail message to the next.
Automated attendants are commonly used to connect callers from a main number to a specific function or work group without the need for a receptionist (e.g., press 1 for payment information and so forth). Automated attendants require careful advance planning in defining work groups and call paths so that people with appropriate skills can respond. There is nothing more irritating than being transferred endlessly from one person to the next in search of someone who can actually help.
More sophisticated are call-management systems (CMS). CMS, utilized in conjunction with automatic call distribution, provides a wide range of management information, most particularly the ability to track call volumes by type of line (800 numbers, direct inward dial, outbound) to quantify the phone traffic entering or leaving a facility, as well as abandoned calls, calls placed on hold, those going to voice mail and so forth.
CMS information is critical to effective staffing of any major service facility. However, these systems are not cheap and must be fully utilized. It makes little sense to examine the statistics only occasionally. Daily reviews are essential for the first 90 days of operation, tweaking and fine-tuning the number of lines and workstations. After that, weekly reviews should be sufficient to establish patterns.
Voice-response units (VRUs) provide for the automated answering of calls through the use of programmed scripts. Callers enter touch-tone digits; the VRU queries a host computer and reads the requested information back to the caller. Callers usually have the option to transfer to a live customer service representative. Customer usage of VRUs varies by application; however, effective VRU scripts achieve a call completion ratio of 20 percent to 30 percent.
Auto-dialers are special-purpose computers that can accept a feed of customer names and phone numbers from a sales or screening system and dial each number at predetermined times (or continuously) until the phone is answered. They are smart enough to distinguish a real person on the line from an answering machine. As soon as the person answers, the call is transferred to an employee for handling. Auto-dialers are particularly effective in delinquency situations, where they spare collectors the unproductive time spent on busy signals and unanswered calls. Again, the cost of such units (in excess of $100,000) has to be weighed against the hourly costs of employee time.
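That weighing of unit cost against employee time reduces to a simple break-even calculation. In the sketch below, only the $100,000 unit cost comes from the text; the wage, time saved per call and daily call volume are hypothetical illustrations a firm would replace with its own figures.

```python
# Break-even sketch for an auto-dialer purchase. All inputs except the
# $100,000 unit cost are hypothetical illustrations.

DIALER_COST = 100_000         # from the text: "in excess of $100,000"
HOURLY_WAGE = 15.0            # hypothetical fully loaded collector cost
MINUTES_SAVED_PER_CALL = 2.0  # hypothetical: dialing, busy signals, no-answers
CALLS_PER_DAY = 2_000         # hypothetical volume across a collections desk

daily_savings = CALLS_PER_DAY * (MINUTES_SAVED_PER_CALL / 60) * HOURLY_WAGE
breakeven_days = DIALER_COST / daily_savings
print(f"Daily labor savings: ${daily_savings:,.0f}")
print(f"Break-even after about {breakeven_days:.0f} working days")
```

Even with modest assumptions, a high-volume collections operation can recover the dialer's cost within its first year; a low-volume shop may never break even, which is precisely the point of running the numbers.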
The key to effective use of voice technology is the establishment of a customer service strategy. What level of service can a firm effectively offer? Obviously, the ideal would be every call picked up on the first ring, every time, by just the right person. But that is an expensive proposition. Long-established industry norms allow for a 5 percent blockage rate (callers receive a busy signal) and a 5 percent call-abandon rate without materially affecting the customer's impression of service. Beyond that, there is a risk that each unanswered or mishandled call means lost business.
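A blockage-rate target like 5 percent translates directly into a number of phone lines. Telephone engineers size trunk groups with the Erlang B formula, which gives the probability that a call arrives to find every line busy for a given traffic load. The sketch below applies the standard Erlang B recursion; the 10-erlang busy-hour traffic load is a hypothetical example, not a figure from the text.

```python
# Erlang B sketch: how many lines hold the blockage rate (callers
# hearing a busy signal) under 5 percent? The 10-erlang busy-hour
# traffic load is a hypothetical example.

def erlang_b(traffic_erlangs, lines):
    """Probability a call is blocked, via the Erlang B recursion."""
    b = 1.0  # with zero lines, every call is blocked
    for n in range(1, lines + 1):
        b = (traffic_erlangs * b) / (n + traffic_erlangs * b)
    return b

offered_traffic = 10.0  # hypothetical: 10 erlangs of busy-hour traffic
lines = 1
while erlang_b(offered_traffic, lines) > 0.05:
    lines += 1
print(f"{lines} lines hold blockage under 5% at {offered_traffic} erlangs")
```

The same arithmetic, run in reverse, shows what a tighter target costs: pushing blockage from 5 percent toward 1 percent requires several additional lines (and the staff to answer them), which is why the "every call on the first ring" ideal is so expensive.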
Firms that do customer satisfaction surveys should routinely ask about quality of phone response. Customer service representatives should periodically (e.g., every 10th call) ask a few brief questions about the customer's satisfaction with the handling of incoming calls.
Voice technology is evolving to integrate more closely with data, allowing us to insert relevant pieces of computerized information (e.g., loan balance) into a voice message. Voice synthesis (computer-generated speech) is also becoming increasingly natural; eventually it will permit human-machine dialogue that is nearly indistinguishable from the real thing. This will enable routine inquiries to be handled by computer, with only the more complex requiring skilled personnel. Another advance in voice technology will allow the incorporation of voice messages with data transmissions, thus permitting a loan officer on a laptop to transmit a 1003 along with verbal commentary about particular customer circumstances.
It takes a real partnership between the technical staff and the business-unit manager to determine the appropriate voice technologies to use and how to configure them to support a customer service strategy. As new technologies emerge, it will be necessary to evaluate the value they provide in improving the firm's image to the customer. It will take skilled technicians to marry voice, data and eventually video and image technology. Bandwidth continues to decline in price, yet we demand ever more of it. Wireless technology has the capability to reduce our dependence on predefined networks, giving us more of a bandwidth-on-demand capability. Arguably, the human voice remains the single most important channel for reaching the customer.
Reducing costs/improving service
Now we can return to the CEOs' objectives in integrating technology more deeply into their operations. Their key objectives are to reduce cost and improve service. Although we can easily agree with the importance of those objectives, the challenge comes in clearly targeting the improvements to be made in either area and then linking them with the transformative technology in question.
For example, one way to reduce costs is to make people more productive. Productivity is conventionally defined as "doing more with less," but does that mean less manpower, less-skilled manpower or taking less time? Does more processing mean a greater number of units, handling a greater number of customers, or handling the same numbers but at a higher level of quality and with greater satisfaction on the part of customer and/or employee?
Similarly, what does "improve service" really mean? Some firms, including my own, routinely measure customer satisfaction and compare it against targeted goals. Our customers might well view productivity in terms of timeliness, accuracy, availability or clarity. To improve service, must we increase costs? We almost certainly will have to increase our investment in new systems or processes; possibly in new people as well.
With time, implementing the changes the customer asks for may well reduce operating costs too, because eliminating errors and reducing rework will have that effect. Thus, service improvement and cost reduction may simply be part of the same transformation; only the time frame shifts.
A current example is the use of fax technology. Fax provides for very timely response and presumably increased service. However, the investment in fax technology increases our costs, at least in the short run. On the other hand, improved service may increase volume and thus increase revenue. If fax can reduce the unit cost of handling each transaction, can the same number of people handle increased volume? Is this another way of saying we are "doing more with less?"
But what happens if the very convenience of fax means we begin to depend on this technology and use it even in instances where the mail would suffice? These incremental (though arguably unnecessary) costs have to be factored into the equation.
The bottom line is that virtually no firm has ever fully documented a return on its investment in fax. Most probably thought at the time that it sounded like a good idea and made a leap of faith that it would provide some benefit. Other firms hesitated, only going to fax when it became an industry norm. Then not having fax capabilities became a competitive disadvantage. This also may be the rationale that impels us to consider transformative technologies.
Businesses have rarely demanded a retrospective look at the actual costs and benefits derived from technology, and information technology staffs have rarely taken the initiative to provide one. So it is almost impossible for us to assign financial value, especially for transformative technology. This is because we cannot identify the true costs of the way we do business or of our existing automation solutions, much less project the costs of running our business with the new automation solution. Thus, I believe the implementation of transformative technologies in any given company will occur in one of three ways:
* Things will either get so bad in a given area (cost increases or declining service) that senior management will be ready for any alternative, no matter how radical.
* Projections that the consequence of a "business as usual" approach will cause the firm to miss key business objectives (profits, market share, etc.) will prompt openness to new technologies.
* Senior management and I/T will share a vision of an alternative way of conducting business that requires significantly changed business processes and an embrace of the technologies that engender this change.
An I/T organization to support transformation
I stated earlier that our principles start with people and end with people. I have said that transformative technologies fundamentally change not only the way the processing is done, but the way the business is done. And that includes the business of information technology as well.
A basic question arises as to who in I/T is going to identify transformative technologies for any given company, talk to the vendors and select the best offerings, sell the concepts to senior management, assemble and manage the I/T and client implementation teams, survey users, run pilots (tote that barge, lift that bale). This implies a project team composed of representatives from multiple areas. In a typical corporate hierarchy, are there project team members readily available?
Neither the business functions nor traditional I/T is structured to allow for a pool of senior staff or managers available to respond to these challenges. Such individuals have to have a unique combination of technical skills, business experience and the ability to articulate and share a vision of the future.
Are we recruiting or developing visionaries, or are we recruiting the tried and true? Often visionaries are outsiders, people who bring a breadth of experience to ask the "dumb" questions: "Why do you do things this way?" "What if you did them that way instead?" If we did recruit such people, where would we put them? It's my view that we need a different organizational structure, one in which we have a dedicated group of technically skilled and organizationally savvy people who can partner with their peers in the business units to implement transformative technologies. There are three tiers of involvement that can be envisioned under this new structure:
* Vision groups composed of senior department management who meet periodically and who define the desired workflow model;
* Project team members, typically senior staff, who are assigned full time for the duration of the project and who work with the information technology unit to create system prototypes;
* Users of the processing engine who will actually interact with the system and who design screens and reports (inputs and outputs).
Because of the degree of integration of the various functions, this organization requires much greater emphasis on methodology, standards and lateral communication than has traditionally taken place in I/T. New roles and relationships have to be created both within the I/T unit and between I/T and its clients. Clearly there's a lot of effort involved with such change. But our expectation is that there will be a lot of payback as well.
Selecting a technology strategy
A good way to determine an appropriate technology strategy is by means of the matrix shown in Figure 6. I have added other potential goals to the CEO's goals of reducing costs (i.e., creating operating efficiencies) and improving service. The other potential goals are increasing market presence and forming strategic alliances. I have defined the key technologies that would support each strategy. Thus, if the goal is operating efficiency, a focus might be to "reengineer work processes."
This reengineering of work processes should result in streamlining, reducing redundant steps and overcoming organizational barriers. From a technology standpoint, groupware (e-mail, schedulers, project management tools, etc.) as well as document-management systems would be a useful choice. Similarly, if building strategic alliances were the goal, the focus might be "integrating external appraisers." A strategy would be the ability to review property descriptions on-line. The technology that supports this would be handheld devices with point-and-click software and multimedia that can incorporate characters, handwriting and pictures into a stored image.
Some of these technologies have been in place for some time, while others are just emerging. Not every firm would want to introduce more than a few in a given year; no firm would ever want to use every technology. The key is to have a defined set of business goals that provide appropriate metrics to measure achievement, along with a strategy that provides a point of focus so that the technology may be selected.
One way of determining how well your I/T function is positioned to support transformative technologies is to do a quick check of capabilities. As shown in Figure 7, there are 10 critical items to look for, ranked in relative order of importance.
Larger firms are obviously more likely to score well than small companies that must devote the majority of their I/T efforts to just keeping the systems up and running. Defining a systems architecture, for example, takes a high degree of sophistication. A large firm can have a staff of technical planners; a smaller firm may choose to use consultants or vendors for this task.
A firm with a heavy commitment to developing new systems in-house will surely need a systems life cycle methodology; a firm that buys most of its software off-the-shelf can probably get by with a process for implementing new releases supplied by the vendor.
Please note that there is no right or wrong answer here. The absence of some of these items does not mean your information technology department is doing a bad job. My point is simply that if a majority of them are lacking, it means your firm is not well positioned to take advantage of technological advances.
All business managers today are in the business of change. If I/T can't rethink how it does things, it will surely never be able to help others do so. If you are not willing to change yourself, don't suggest it to anyone else. Transformation begins at home.
Leilani E. Allen, Ph.D., is a senior vice president with PNC Mortgage Corporation of America, Vernon Hills, Illinois. The author wishes to thank her colleagues at PNC Mortgage and members of the MBA Technology Committee for their review and advice. This is an updated and revised version of an article that appeared earlier in Real Estate Solutions.
Subject: mortgage banking. Author: Leilani E. Allen. Date: November 1, 1994.