Wall Street Rush Hour

Every weekday morning around 8:45, the floor of the New York Stock Exchange Inc. comes to life. Traders dart from broker booths to trading specialists, scribbling quotes on tiny pieces of pink paper. The specialists themselves are hard at work too, using specially designed keyboards to scroll through pages of data. This pace crescendos until the bell kicks off trading at 9:30 a.m., when it becomes downright frenetic. Scampers become sprints. Typing becomes flailing. For a few moments, the 3,300 people who live and die on the floor of the world's largest exchange are moving as quickly as humans possibly can.

In the middle of all this madness, Bill Bautz stands calmly, arms crossed in front of him. With a wry smile, he glances at a screaming crowd to his left, shakes his head slightly and laughs. For Bautz, the NYSE's senior vice president and CTO, this madness is just a microcosm. Action by brokers on the floor of the exchange represents only 10 percent of the 1 billion shares traded on the NYSE every day. The rest, Bautz says, is handled electronically by the expansive processing systems he helped build. By year's end, these systems, some of the fastest in the world, will be capable of transmitting as many as 2,000 messages per second. Bautz jokes that the technology makes the rat race around him seem lethargic, inefficient and, well, just plain human. (Bautz has since retired from the NYSE but continues to play a consultative role.) "You think this is fast?" he asks. "The NYSE is one of the only exchanges in the world that still has a trading floor. In many ways, what you see here is really a dinosaur."

Bautz is right; during the past 10 years, financial services organisations have employed so much technology that floors like the one at the NYSE have become veritable throwbacks to the days of yore. Everything in the industry, it seems, is wired, from brokers to markets to new trading forums such as electronic communication networks (ECNs). Today, just months after surviving a Y2K remediation that cost the industry an estimated US$5 billion, financial services technologists are gearing up for new challenges they say could transform the business even more.

Sometime next year, the industry will switch from trading equities in fractions to trading them in decimals, necessitating major code changes and a significant increase in system capacity. On the more distant horizon, there'll be a move from batch to real-time transaction processing, facilitating straight-through transactions by linking brokers directly to banks and exchanges. Finally, experts say that as more organisations offer customer-oriented Web applications and as more customers transmit financial information online, technologists must find new ways to handle issues of security and access--and new models to promote IT solutions to the business world at large.

"It took years, but we've built this industry on a foundation of technology," says John Panchery, vice president and managing director of systems and technology for the Securities Industry Association (SIA), an organization that serves more than 740 brokers and securities companies. "Now comes the hard part. We've got to take the technology we already have and make it work. That's no easy job, and it's going to take time and money. All of that responsibility will fall to the CIO."


Financial services technologists agree that perhaps the most immediate issue will be the impending industrywide move from fractions to decimals, known among technophiles as decimalisation. Since the NYSE opened its doors in the late 1700s, U.S. equities have been traded in fractions, a vestige of the Spanish system by which coins were cut into eight pieces and traded as "bits." For more than 200 years, equities and options were priced in increments of eighths of a dollar, and therefore had as many as eight price points on any given trade. In 1997, markets began trading in sixteenths, offering as many as 16 price points. Sixteenths--traders frequently refer to them as "teenies"--are the denominations in which most equities trade today.

Change, however, is right around the corner. For years, experts have decried fractions as an anachronism, saying that because most modern currency is exchanged in dollars and cents, securities should be traded in the same denominations. Finally, earlier this year, the Securities and Exchange Commission agreed and mandated that markets gradually lower their trading increments from sixteenths to nickels to pennies, and convert all resulting measurements to decimals. Under the new rules, equities will be priced in dollars and cents--instead of, say, IBM stock selling for 112 and 3/16, it would sell for 112.19. These changes will take effect April 1, 2001, and all equities companies must comply.
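The conversion itself is simple arithmetic, and can be sketched in a few lines. This is an illustrative Python sketch, not any exchange's actual code; the half-up rounding convention and function name are assumptions:

```python
from decimal import Decimal, ROUND_HALF_UP
from fractions import Fraction

def fractional_to_decimal(dollars: int, numerator: int, denominator: int) -> Decimal:
    """Convert a fractional quote (e.g. 112 3/16) to a decimal price in cents."""
    # Build the exact fractional price: 112 3/16 = 1795/16
    price = Fraction(dollars) + Fraction(numerator, denominator)
    exact = Decimal(price.numerator) / Decimal(price.denominator)
    # Round half up to the nearest cent, per the hypothesized convention
    return exact.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# 112 3/16 becomes 112.19, matching the IBM example above
```

Exact rational arithmetic is used before rounding so that no binary floating-point error creeps into the quoted cent.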

Between now and then, of course, the responsibility for converting to decimals lies with the exchanges themselves. Before brokers can trade on equities priced in decimals, the exchanges offering these equities must convert prices to decimals. According to Bautz, this process is fairly straightforward. Most computers read fractions as decimals to begin with, so the issue is simply reprogramming systems to display decimals instead of fractions. The project is already complete at the NYSE, Bautz says. At the Nasdaq, the effort is taking a bit more time. Executive Vice President and CIO Gregor Bailar says the switch should be finished by next spring.

Exchanges aren't the only ones that need to prepare for these new prices. Brokers will have to fix their systems too, and Scott Abbey, chairman of the SIA's decimalisation committee, says this may require some fairly significant changes in system code. Because many broker systems are programmed to accept data only in fractions, broker technologists might need to reprogram these systems to understand and accept prices in decimals. What's more, because any trade can theoretically have an infinite number of decimal places, programmers will need to establish limits so that their systems read just the first two or three decimal places of every price. In some cases, Abbey says, these changes will be simple. In other cases, he says, they could mean millions of tiny code modifications, a process not unlike the one necessary to prepare for Y2K.
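The limiting step Abbey describes amounts to quantising every incoming price to a fixed number of decimal places. The function below is a hypothetical illustration of that idea, not any broker's production code:

```python
from decimal import Decimal, ROUND_HALF_UP

def clamp_price(raw: str, places: int = 2) -> Decimal:
    """Round an incoming decimal price to a fixed number of decimal
    places so downstream systems see a bounded field width."""
    exponent = Decimal(1).scaleb(-places)  # e.g. Decimal('0.01') for places=2
    return Decimal(raw).quantize(exponent, rounding=ROUND_HALF_UP)

# "112.1875" clamps to 112.19 at two places, 112.188 at three
```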

"This might sound obvious, but the more information you've got, the more changes you need to make," says Abbey, who doubles as executive vice president and CIO at Paine Webber Group. "For smaller or newer organisations, this conversion isn't that big of a deal. But for some of the older ones, those that have traded millions of equities on fractions for years, the effort can be quite an undertaking."


What will this process cost? Overall, North American securities companies will spend approximately US$900 million on the conversion between October 1999 and its completion date, expected mid-2001, according to a report published by Needham, Mass.-based TowerGroup. At Paine Webber, Abbey estimates it will probably cost a couple of million dollars, adding that so far the remediation effort has changed hundreds of thousands of lines of code. Generally, Abbey isn't too worried about this process--when asked to describe it, he says it doesn't require "rocket science." He does, however, admit that the issue of reformatting report screens to display prices in decimals could present a bit of a challenge, so in preparation, his programmers began designing new report screens in February. They are now nearly done creating them and making the other code changes needed to support decimal trading.

At Fidelity Investment Systems Co., a subsidiary of Fidelity Investments, CIO and President Don Haile is more concerned about whether his systems will be able to handle the trading volume that decimalisation is expected to bring about. More price points mean more prices at which brokers can trade, which in turn means a major jump in trading volume. Industry technologists say the 1997 switch from eighths to sixteenths resulted in a 14 percent increase in overall volume, and a 1999 report by Palo Alto, Calif.-based consultancy SRI International suggests a switch from sixteenths to pennies could catapult daily volume another 139 percent. Haile says his estimates support this claim. Currently, Fidelity's systems have a capacity of several thousand messages per second, and the brokerage is anticipating it will need to support between six and seven times that number.

To prepare for this increased volume--and to keep up with current demands--Haile says Fidelity adds new Sun Microsystems and Stratus Computer CPUs to its mainframes almost monthly. At Charles Schwab & Co., Adam Richards, vice president of architecture and planning, does the same, buttressing his network with CPUs and servers as needed. Schwab's goal is to maintain network capacity at between three and four times peak usage, which Richards says is "a moving target from day to day." (So far the broker's high-water mark for simultaneous secure log-ons is about 95,000.) Perhaps the most innovative approach to increasing capacity has been at E-Trade Group, one of the industry's new Web-based brokers. There, CTO Josh Levine has installed a load-balancing application from Sunnyvale, Calif.-based Resonate to distribute traffic and dispatch customers to one of several data centers in the United States. As volume increases, the device diverts traffic evenly to one center or another, managing capacity as it goes.
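The dispatch logic behind such a load balancer can be sketched crudely: send each new session to whichever data center currently carries the least load. Resonate's actual product is far more sophisticated, and the center names and load figures below are invented:

```python
def pick_datacenter(loads: dict[str, int]) -> str:
    """Dispatch the next session to the least-loaded data center.
    `loads` maps data-center name to current active-session count."""
    return min(loads, key=loads.get)

# Illustrative snapshot of three hypothetical centers
centers = {"us-east": 4200, "us-central": 3100, "us-west": 3900}
```

In practice each dispatch would also update the load table, so that traffic spreads evenly as volume rises.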


Although these decimalisation efforts take up a lot of their energy, technologists such as Levine and Richards have glimpsed an even bigger challenge lurking on the horizon: In order to stay competitive and keep pace with the increasingly Internet-driven market, the industry is making a move from batch to real-time transaction processing. Although the switch probably won't occur until 2002 or 2003, technologists at markets and brokerage houses are already planning ahead, merging processes and doing what they can to minimize current batch windows. Unlike decimalisation, this process requires a total restructuring of the systems that drive the industry, and experts predict it could cost as much as, if not more than, what it cost to prepare for Y2K. According to the SIA's Panchery, this is reason enough for technologists to start panicking.

"The year 2000 problem was a maintenance job; this is a rewrite," he says. "Companies need to analyze what processes still occur after the close, what processes don't [occur then] and what processes can happen in real-time. That exercise will take time, and that time will cost billions."

Today, though most financial transactions are effective as soon as traders submit them, it actually takes much longer for money to change hands. Once trades have been completed on a market or ECN, local processing systems send them to the Depository Trust & Clearing Corp. (DTCC), the company that oversees clearance, settlement and custody of almost every U.S. equity trade. At the DTCC, proprietary Assembler-based software filters transactions by equity. Next, an order fulfillment system debits the buyer and credits the seller, then automatically sends confirmations to both parties. Finally, when the process is over, the system filters the completed batch by broker and sends trades back to the companies from which they came. This process can take as long as two days.

Broker batches begin where the DTCC's leave off. These companies process transactions only after they've received confirmations from the DTCC. Then, with the help of mostly homegrown systems, they update accounting systems, balance the general ledger and, of course, reconcile customer accounts. The companies generate confirmations of their own and either mail or e-mail the documents to customers at the end of the cycle. This part of the process can take as long as one day.
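In outline, one pass of the clearing cycle looks something like the sketch below. The trade-record fields are hypothetical, and the real DTCC systems are vastly more involved:

```python
from collections import defaultdict

def clear_batch(trades):
    """One simplified clearing pass: debit buyers, credit sellers,
    then regroup completed trades by originating broker so
    confirmations can be routed back home."""
    balances = defaultdict(float)
    by_broker = defaultdict(list)
    for t in trades:
        balances[t["buyer"]] -= t["amount"]   # buyer pays
        balances[t["seller"]] += t["amount"]  # seller receives
        by_broker[t["broker"]].append(t)      # confirmation goes back to broker
    return dict(balances), dict(by_broker)
```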

Sounds complicated, right? It is--which is why some financial services organisations are doing whatever they can to speed the process. The DTCC, for example, is looking into replacing its netting software with an Internet-based application that queries large exchanges for trades, then processes them right away.

Don Donahue, managing director for customer marketing and development at the DTCC, says he's also preparing his systems to support all standard message formats, including XML, Financial Information Exchange (FIX) and the International Organisation for Standardisation's 15022. By acquiring these new technologies and by advocating new standards in messaging and in message transfer, Donahue says the organisation will be able to accept credit and debit information more easily, facilitating the settlement and confirmation steps as well.
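Of the formats Donahue names, FIX is the simplest to illustrate: a message is a string of tag=value fields separated by an SOH (0x01) byte. The toy reader below is a sketch of that wire format, not a validating FIX engine:

```python
SOH = "\x01"  # the FIX field delimiter

def parse_fix(message: str) -> dict[int, str]:
    """Split a FIX tag=value message into a dict keyed by numeric tag."""
    fields = {}
    for field in message.strip(SOH).split(SOH):
        tag, _, value = field.partition("=")
        fields[int(tag)] = value
    return fields

# Tag 8 is BeginString, 35 the message type, 55 the symbol, 44 the price
msg = SOH.join(["8=FIX.4.2", "35=D", "55=IBM", "44=112.19"]) + SOH
```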

"It's essential that people migrate to newer technologies and messaging capabilities, and standardise on newer file transfer capabilities," he says. "It's the only way real-time processing will ever survive."

Brokers, heeding this advice, are preparing for this transformation as well. At Merrill Lynch & Co., Executive Vice President and CTO John McKinley recently launched an effort to break up the nightly batch workload and run processes such as account management and ledger readjustment concurrently. Although McKinley has launched this effort with the immediate goal of positioning Merrill Lynch for extended-hours trading and the increased market volumes that decimalisation will cause, it can only help in the move to real-time processing. And the results of this effort have been encouraging, says McKinley. In the past, Merrill Lynch's nightly batch run took up eight hours of a 12-hour window; today, the batch is finished in half the time.
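The pattern McKinley describes--splitting one serial batch into jobs that run side by side--can be sketched with a thread pool. The job names here are illustrative stand-ins for the real overnight processes:

```python
from concurrent.futures import ThreadPoolExecutor

def run_overnight_jobs(jobs):
    """Run formerly sequential batch jobs concurrently; results come
    back in the order the jobs were submitted."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(lambda job: job(), jobs))

# Placeholder jobs standing in for the real overnight workload
def update_accounts():  return "accounts updated"
def balance_ledger():   return "ledger balanced"
def reconcile():        return "accounts reconciled"
```

Running independent jobs concurrently rather than one after another is what halves a batch window, as the Merrill Lynch numbers above suggest.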

McKinley is also trying to implement new technologies in hopes that they too will speed things along. In particular, he's opted for IBM's Parallel Sysplex and GeoPlex technologies, which use parallel processing to make more efficient use of the company's mainframes. He adds that he has purchased a number of new hard disk products from EMC and IBM to provide more storage space and new servers from Sun to help finish the load more quickly.

"It doesn't really matter to us how we shorten our batch load, but we're assuming we've got to do it right and get it done pretty soon," he says. "Will we ever process transactions in real-time? I sure hope so. Do I think we'll get there overnight? Not by a long shot."


Technologists acknowledge that the brunt of the challenges associated with the switch to real-time processing is still a few years off. Between decimalisation and then, however, as brokers become more reliant on the Internet, experts say these companies will have to address issues of security and accuracy. While some might argue that these issues fall under the umbrella of customer service, others demur, saying that in the Web space, these are all subjects unto themselves.

However technologists choose to approach these topics, many of the best solutions in these areas are off-the-shelf programs from vendors around the world. In the area of security, financial services companies are deciding against traditional technologies such as firewalls and encrypted passwords and opting instead for cutting-edge applications such as voice recognition and biometrics (see "Body Language" below). At Fidelity, for instance, Haile says his IT teams are looking into incorporating voice recognition software with Powerstreet, the company's new online trading feature. As Haile explains it, instead of inputting a password when they log on, users would speak into their computer's microphone and obtain access that way. Although he hasn't selected a vendor, Haile says he's looked at several and expects to choose one by the end of the summer.

"The goal here is to maximize access and security at the same time," he notes. "With an application like this, you don't even need a password. You just speak your name, and you're in. We feel that's the kind of feature our customers will really support."

Other companies are going in the other direction, looking into Web-based applications that enable customers to perform a single sign-on. With software from companies like Billerica, Mass.-based BullSoft, brokers can eliminate multiple log-in screens and replace them with a single interface for a user name and password. These products also work internally; with BullSoft's AccessMaster, broker employees can log in to all of their corporate systems at once, avoiding redundant entry points and thereby saving time.
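The single sign-on idea reduces to issuing one signed token at log-in and letting every back-end system verify it rather than prompting for its own password. The sketch below uses an HMAC signature and is purely illustrative of the pattern, not BullSoft's actual protocol:

```python
import hashlib
import hmac
import secrets

SECRET = secrets.token_bytes(32)  # shared signing key, illustrative

def issue_token(username: str) -> str:
    """Sign the username once at log-in; the token stands in for
    re-entering credentials at every system."""
    sig = hmac.new(SECRET, username.encode(), hashlib.sha256).hexdigest()
    return f"{username}:{sig}"

def verify_token(token: str) -> bool:
    """Any back-end system holding the key can check the token locally."""
    username, _, sig = token.partition(":")
    expected = hmac.new(SECRET, username.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)
```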

In the area of access, many brokers are transferring portions of their Web services from HTML to XML and moving them into the wireless space. Today, at least four major brokers offer customers the option of trading on their cellular phones, and experts predict it's only a matter of time before others do too. Tapping into products from companies such as Macalla Software and WorldStreet Corp., these brokers have developed special wireless trading environments where customers can buy or sell with the push of a button. Schwab's Richards says that in the not-so-distant future, all trades could be conducted this way.

Still, some brokers are looking inward for solutions. At Paine Webber, Abbey has engineered a Web-based service called the PaineWebber Edge, which provides an online statement feature that enables clients to review a year or more of their monthly brokerage statements and chart their progress over time. Last year at venerable Wall Street brokerage J.P. Morgan & Co., CIO Peter Miller and others were charged with finding ways to unlock the value of J.P. Morgan's considerable investment in IT. The result is LabMorgan, an e-finance research and development division officially launched in March, the sole purpose of which is to develop new and exciting ways to commercialise some of the transaction capabilities the company has perfected over time.

In just six months, LabMorgan has released a Web-based service for derivatives management and settlement, a messaging tool that facilitates real-time processing and a global trade-routing program that authenticates parties on both ends of a transaction. According to Miller, now a senior manager at LabMorgan, the new company has helped redefine the role of IT within the brokerage as a whole, giving technologists new leverage among business-side executives. "It's a lesson every technologist could stand to learn," he says.

How is your organisation preparing for these changes? E-mail us.

Matt Villano is a freelance writer based in New York City.


Body Language

Biometric technologies make the Web a safe place for trading

Picture this: It's lunchtime on a Tuesday, the Nasdaq's going mad, and you're dying to see how your stocks are faring. You log on to your broker's website, but before you can access your portfolio, the site prompts you to place your thumb on a small device next to your mouse. Next, it asks you to peer into a camera atop your monitor. Finally, it asks you to state your name aloud, slowly and clearly.

Welcome to biometrics, the science of measuring physical characteristics such as the pattern of a voice, face or fingerprint. By digitally comparing selected features with those stored in a database, these applications prove a user is who he or she claims to be and not merely the holder of a password or card. In the financial services industry, where security is always a priority, this kind of authentication is a valuable commodity. And it's one that John Panchery, vice president and managing director of systems and technology for the Securities Industry Association, doesn't take lightly.

"People need security they can trust," he says. "You have a PIN. So what? What's to say someone won't steal it or use it without your consent? Biometrics are not so easy to steal. It is much more difficult to copy someone's voice, fingerprint or face. For the financial services industry, this technology is just perfect."

Biometric companies generally specialise in one or more security technologies such as voice recognition, face recognition and fingerprinting. Woburn, Mass.-based Keyware Technologies is one such company. Keyware's VoiceGuardian is a C++ application that enhances security by verifying distinctive quirks in a user's voice. When a user wants to log on, he or she speaks a predetermined phrase or password. VoiceGuardian evaluates the phrase by extracting certain feature vectors--like pitch and tone--and comparing them with vectors in a pre-existing database file. If the vectors match, the program grants the user access; if they don't, access is denied.
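Feature-vector comparison of the kind VoiceGuardian performs is commonly done by scoring the fresh sample against the enrolled one and granting access only above a threshold. The sketch below uses cosine similarity; the vectors and threshold are illustrative, not Keyware's algorithm:

```python
import math

def matches(sample, enrolled, threshold=0.95):
    """Return True if the fresh feature vector is close enough to
    the enrolled one, by cosine similarity."""
    dot = sum(a * b for a, b in zip(sample, enrolled))
    norm = math.sqrt(sum(a * a for a in sample)) * math.sqrt(sum(b * b for b in enrolled))
    return dot / norm >= threshold
```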

The specifics of this software form the basis of Keyware's most lucrative product, the Layered Biometric Verification (LBV) server. This application, written in Java and Visual Basic, seamlessly integrates multiple levels of biometric verification into one solution. Clients can select any combination of biometric characteristics to be integrated with the LBV server. One client might opt solely for voice verification, while another might require voice, face shape and fingerprint verification. Like VoiceGuardian, this software extracts biometric features and compares them with database files, granting access only if the features match. Unlike VoiceGuardian, however, the LBV server provides up to three levels of biometric-based security.
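The layering itself amounts to requiring every configured check to pass before access is granted. A minimal sketch, in which the check functions are placeholders for real biometric verifiers:

```python
def layered_verify(checks):
    """Grant access only if every configured verification layer
    (voice, face, fingerprint, ...) passes."""
    return all(check() for check in checks)

# A client might configure one, two or three layers, e.g.:
# layered_verify([voice_check, face_check, fingerprint_check])
```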

Veronique Wittebolle, Keyware's executive vice president, says this tripartite combination is what makes her company's products perfect for the financial services marketplace. "[Financial services companies] need a high level of authentication, and we provide that. It is a match made in heaven."


The NYSE can always count on Securities Industry Automation to help with its IT projects

If you think the New York Stock Exchange has handled all of its recent IT endeavors by itself, you'd better think again. Sure, a handful of NYSE technologists have worked on the project, but much of the work has been done by technologists from the Securities Industry Automation Corp. (SIAC).

Founded in 1972, SIAC was established as a joint technology subsidiary of the NYSE and the American Stock Exchange. Today, Joe Kubat, executive vice president for NYSE services, says SIAC separately handles all day-to-day responsibilities for the systems and networks supporting trading at the two exchanges.

"All circuits lead to SIAC," Kubat says. "Our computers are linked to the entire securities industry and our work supports every aspect of the trading process."

Take batch processing, for example. With its Continuous Net Settlement system, SIAC filters most of the day's individual trades and figures out exactly who owes what to whom. The result is overwhelmingly positive. The complexity of the actual transfer of stock and money is greatly reduced, according to Bill Bautz, NYSE's senior vice president and CTO.
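Netting of the kind the Continuous Net Settlement system performs collapses many offsetting trades into a single net delivery or receipt per participant. A toy illustration, with invented trade records:

```python
from collections import defaultdict

def net_obligations(trades):
    """Net the day's trades per (broker, symbol). Positive share
    counts are buys, negative are sells; fully offset positions
    drop out entirely."""
    net = defaultdict(int)
    for broker, symbol, shares in trades:
        net[(broker, symbol)] += shares
    return {k: v for k, v in net.items() if v != 0}
```

Two brokers trading the same stock back and forth all day thus settle only the remainder, which is what simplifies the actual transfer of stock and money.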

When the NYSE revamped its trading floor back in 1995, SIAC provided design and implementation services. In 1997, when the exchange equipped brokers with handheld devices to make their jobs easier, SIAC partnered with GTE Corp. in carrying out the project. Even last year, when the market launched its 3DTF virtual trading floor, a 3-D map of systems capacity and performance, SIAC was there as the project manager.

Looking ahead, Kubat says his organisation will help the NYSE incorporate reliability and availability into some of its newest systems. First, the new Institutional XPress application that SIAC is developing for the NYSE now provides institutional customers with market data from the trading floor. Eventually the application will provide order routing and fulfillment capabilities as well. Next, with NYSeDirect+, which SIAC is now developing, the exchange will have its own electronic communication network, enabling customers to specify limit orders of 1,099 shares or less to receive automatic execution. Both systems are being phased in this year. -M. Villano