How to Do Financial Trading IT Right: Behind the Scenes at Liquidnet
- 02 July, 2013 16:58
When most people think of big financial trades in Manhattan, they think of Wall Street, where the suit-and-tie culture applies to all roles, including the programmers and operations folks.
But there's more to Manhattan finance than Wall Street. Travel north about three miles and you enter the Fashion District, where 7th Avenue becomes Fashion Avenue and people believe that good design leads to success, not the other way around.
Midtown is also home to Liquidnet, a trading firm with a no-tie policy, $49 billion in average daily natural liquidity and only 300 employees. To achieve those numbers, the company needs a lot of IT; nearly half its employees work in information technology.
This May, Liquidnet opened its doors to host the Workshop on Performance and Reliability (WOPR), which brought me into town. When the company offered to let me in a day early to talk about how the company does IT, I didn't say no. But I did bring my notebook.
Explaining Financial IT to a Mere Mortal
I get off the elevator on the 15th floor and walk past reception and the lunch area with the frozen yogurt and slushy machine (more on that later). I'm ushered into an office to meet Stefan Kutko, who you might say is a manager of application development for Liquidnet.
"You might say?" I ask. "Tell me more about that." It seems as good a place as any to start.
Stefan Kutko heads capital markets at Liquidnet. That means he deals with a lot of code.
Kutko explains that Liquidnet integrates large institutional investors, allowing them to trade large blocks of shares without disrupting the market.
"The average trade on the New York Stock Exchange is 250 shares. Our average is more like 42,000. You simply can't sell that kind of volume on the street in one trade, so you have to break it up," he says.
When that happens, though, people take notice and try to profit. "Now they know daily supply will be very high," Kutko says. "We don't want to take advantage of the market (we're the good guys) so we pair up buyers and sellers. You could think of it as a wholesale marketplace. To technology people, that also looks a lot like a database of buyers and sellers."
Kutko's group creates Web applications to expand those trades into something he calls "capital markets," or off-exchange ventures where a company trades its own stock directly, like a buy-back or IPO.
A Technology Infrastructure With 'Fewer Silos of Knowledge'
Kutko explains that Liquidnet has a "no title" policy. People define themselves by their responsibilities: His is head of global markets technology. In my time at Liquidnet, no one says he or she "writes code." Instead, people talk about the business actions they enable.
Kutko started at Liquidnet five years ago as a programmer, working on internal trade support applications. He's currently working on a platform that allows late-stage private companies on the verge of going public to let employees and early founders sell shares to strategic investors. This reuses the Liquidnet model for block trading to provide an application that lets buyers purchase institution-sized blocks of private company shares.
Liquidnet's architecture makes this all possible: there's a clean break between the front end, which is usually Web-based, and the back-end Web services, Kutko says. Services are generally RESTful, passing JSON, and can therefore be repurposed for any GUI application.
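To make that decoupling concrete, here's a minimal Python sketch of the pattern Kutko describes: the service side serializes a result as JSON, and any front end, whether the C# desktop client or a Web GUI, decodes the same payload. The field names and both functions are illustrative, not Liquidnet's actual API.

```python
import json

def build_order_response(symbol, shares, status):
    """Service side: serialize an order lookup result as JSON so that
    any GUI technology can consume the same back-end service.
    (Hypothetical field names, for illustration only.)"""
    return json.dumps({"symbol": symbol, "shares": shares, "status": status})

def parse_order_response(body):
    """Client side: decode the JSON body back into native types."""
    return json.loads(body)

# The same payload works for a desktop client, a web page, or a test harness.
body = build_order_response("XYZ", 42000, "matched")
order = parse_order_response(body)
```

Because the contract is just JSON over HTTP, swapping the GUI never requires touching the service.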
In addition, Liquidnet has standardized on Google Protocol Buffers messaging technology, which lets common services communicate in both synchronous and asynchronous patterns. If the customer wants a real-time operation, the service can wait for the transaction to finish or put it in the messaging queue. The protocol supports both methods and works across the trader application (a Windows desktop app written in C#), C++ services, and Node.js and Java Web technologies.
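The real system exchanges Protocol Buffers messages; as a rough sketch of the two delivery patterns, plain Python dictionaries and a standard-library queue can stand in. The `execute_trade` function and the message shapes are hypothetical.

```python
import queue

def execute_trade(msg):
    # Stand-in for the real transaction; returns a confirmation.
    return {"id": msg["id"], "status": "done"}

def send_sync(msg):
    """Synchronous pattern: the caller blocks until the
    transaction finishes and gets the result directly."""
    return execute_trade(msg)

def send_async(msg, q):
    """Asynchronous pattern: enqueue the message and return
    immediately; a worker drains the queue later."""
    q.put(msg)

q = queue.Queue()
sync_result = send_sync({"id": 1})      # caller waits for completion
send_async({"id": 2}, q)                # caller moves on immediately

# Worker side, at some later point:
async_result = execute_trade(q.get())
```

The point of standardizing on one message format is that both patterns carry the same payload, so a service doesn't care which delivery mode the caller chose.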
After chatting with Kutko, I meet Matt Moss, the head of middle office technology. His role involves building the database that connects buyers and sellers, running the messaging system that other application teams hook onto, and collecting operations reports and sending them securely to the Order Audit Trail System (OATS) established by the Financial Industry Regulatory Authority. Moss also manages the system test area, which includes a copy of the aforementioned messaging system and database, not only to test his own changes but also as a platform for other teams to test changes in a life-like trading environment.
It's not just large, one-off trades, either. The trades often don't quite line up. One company might want to purchase 45,000 shares of one stock while the other firm wants only 40,000. Or no large institution wants to buy the stock any time soon. To solve that problem, the company has an algorithm that breaks large transactions into smaller ones, then sends them on to the stock exchange in an unpredictable pattern throughout the day to minimize market impact.
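A toy version of that slicing step might look like the following Python sketch; the slice-size bounds and the seeded random generator are illustrative assumptions, not Liquidnet's actual parameters.

```python
import random

def slice_order(total_shares, min_slice=200, max_slice=1000, seed=None):
    """Break a block order into child orders of unpredictable size,
    so the resulting stream of exchange trades is harder to spot.
    Slice bounds here are made up for illustration."""
    rng = random.Random(seed)
    slices = []
    remaining = total_shares
    while remaining > 0:
        # Cap the slice at whatever is left so we never oversell.
        size = min(remaining, rng.randint(min_slice, max_slice))
        slices.append(size)
        remaining -= size
    return slices

child_orders = slice_order(42000, seed=7)
```

In a real implementation the timing of each child order would also be randomized throughout the day, not just the sizes.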
Remote Software Pushes Help Make It All Work
As I talk to the software leaders, I keep thinking about that thick front-end client, written for Windows in C#. That means new versions need to get pushed to the desktop, in an environment that is generally highly regulated and (software) change-averse. How does that work?
It's time to meet Phil Pactor, head of Enterprise Technology Services, which covers the IT department from support to operations and server setup.
Phil Pactor sees the main advantages of cloud computing as decreased time to add capacity and new features.
Pactor agrees that traditional software pushes to clients are increasingly problematic. That's why his team is moving toward a remote model, where clients use remote desktop to access software running on a terminal server. Liquidnet controls and administers that single machine; if a patch is needed, Liquidnet can apply it without ever touching the clients' machines.
The idea is fairly new, and some clients still want the desktop app, but that kind of thinking is improving the company's ability to push new versions of the software with less risk.
Another question on my mind concerns the spikes in demand that Liquidnet must see. The company is constantly opening new markets, with new customers, and the stock market isn't exactly known for being predictable. Is Liquidnet considering cloud technologies?
"When we start to [talk] about the cloud we need to be a little more specific because people think of different things," Pactor says. "The first thing that springs to mind for many people is the public cloud, the Amazons. That might be in our future, but right now, the main project is the private cloud. How can we scale as the amount of transactions and trading partners continues to grow over time? We want the flexibility to expand or contract horizontally without the delays of new hardware."
To do that, Pactor continues, Liquidnet needs to maintain some excess capacity, up to 2.5 times expected trading volume, while quickly growing using commodity hardware. "We try to build out that headroom. The time from a project being announced to the need is about one-quarter of what it was five years ago. Today we can slot in CPU and memory very quickly with generic hardware. It's days, not weeks. Provisioning servers still requires some human intervention, but we can get a lot done with scripting tools like Python."
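As a back-of-the-envelope sketch of that headroom planning: only the 2.5x factor comes from Pactor; the per-server capacity figure and function name below are hypothetical.

```python
import math

def servers_needed(expected_volume, per_server_capacity, headroom=2.5):
    """How many commodity servers to provision so capacity covers
    up to 2.5x expected trading volume (the headroom factor from
    the article; per-server capacity is a made-up illustration)."""
    target = expected_volume * headroom
    return math.ceil(target / per_server_capacity)

# E.g., 1M expected transactions with servers rated at 300K each:
count = servers_needed(1_000_000, 300_000)
```

The scripting angle Pactor mentions is exactly this kind of calculation feeding an automated provisioning step, rather than a manual capacity-planning spreadsheet.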
Pactor doesn't rule out the idea of using public cloud services, but he points to the sheer amount of scrutiny around access control, intrusion detection, audits and physical security as reasons to look at the public cloud but not plan for it just yet. His main pushes for an internal cloud, meanwhile, are less to prepare for a public future and more to speed up the software teams while decreasing time to add extra capacity.
After talking to Pactor, I discuss leadership theory with Moss over lunch (he's for small teams and loose standards built by coalition) and call it a day. I return to Liquidnet for WOPR but expect that my time talking to folks at the company is up.
Liquidnet CIO: Taking 'Journey' to 'Shared Service Architecture'
At lunch on Friday, I get a bit of a surprise; CIO Neal Goldstein invites me for a chat. These things are hard to refuse. Besides, it means another trip past the frozen yogurt machine.
Neal Goldstein leads information technology at Liquidnet.
We discuss the company's business model: How Liquidnet enables institutional investors such as mutual funds, pension funds and companies to trade large blocks of shares. This is the conversation I expect, about the business side, but Goldstein takes a left turn, talking about how his technology group makes it possible for 700 of the world's largest asset managers to trade in 42 markets in real-time.
Like many large IT organizations, Liquidnet grew through immediate, one-off solutions. Each is individually efficient, but together they slow the entire system down; only a few years ago the company had 18 different protocols and communications platforms. After analyzing the impact of having 18 different ways to do things, the IT department began a "journey" toward a "shared services architecture." That's the architecture I've been learning about all week.
Goldstein describes the "why" of the architecture. Each market Liquidnet operates in has its own symbols, its own unique methods of market access and structures. By isolating that and creating a single reference architecture, the company can make entering new markets a standard process, not a system conversion. "We just entered Thailand," he says, "and before that, the Philippines."
Liquidnet has catered lunches every Friday. Today's theme: Barbecue.
For every trading company, Goldstein says, the biggest data problem "is capturing every movement of every stock. That creates terabytes and petabytes of data, which need to be scored and crunched for trading strategies."
A year ago, Liquidnet combined its ticker tape data repositories. Now, nearly all systems run off the same reference, which itself is scrubbed with different reference checks, Goldstein notes. One example: "Did we get the roughly 800 million distinct data elements ingested, scrubbed and processed a day?"
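A reference check like the one Goldstein quotes can be sketched as a simple end-of-day reconciliation. The tolerance threshold, counts and function name below are made up for illustration; only the roughly-800-million figure comes from the article.

```python
def reconcile_daily_counts(expected, ingested, scrubbed, tolerance=0.001):
    """Sanity-check one day's pipeline: did roughly the expected
    number of distinct data elements make it through ingestion and
    scrubbing? (Tolerance is an illustrative assumption.)"""
    return {
        # Ingested count lands within tolerance of the expected count.
        "ingested_complete": abs(ingested - expected) <= expected * tolerance,
        # Scrubbing can drop bad records, but never invent new ones.
        "no_gain_in_scrub": scrubbed <= ingested,
        # Scrubbing shouldn't silently discard a large fraction either.
        "scrub_complete": abs(ingested - scrubbed) <= ingested * tolerance,
    }

report = reconcile_daily_counts(800_000_000, 799_600_000, 799_550_000)
```

Running a battery of checks like this against a single shared reference is what lets "nearly all systems" trust the same data set.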
To master those message transports, Liquidnet brought "thought leaders in messaging" into a special interest group that, in a six-month span, determined what products it needed for the next six years. (The old messaging transport system lasted a decade.) "In a matter of a year or two, all of our core backend trading systems were converted," Goldstein says.
This method of identifying leaders, giving them resources, and working out a collaborative standard is something I hear several times from Goldstein. Some groups meet at lunch, others after work at "Beer Friday"; the company provides the drinks, especially if employees want to kick around ideas for improvement. He mentions the work of Thomas Vaniotis and Tom Puzak, managers who led a self-appointed initiative to visualize log data using ideas pioneered at Etsy.
After lunch, I spend another hour at WOPR before heading to the airport. It was a refreshing week, and one well spent, but I can't help feeling that I wasn't visiting the technology department of a financial services firm. Perhaps, after all, I was visiting a technology firm that happened to specialize in supporting financial services.
Matthew Heusser is a consultant and writer based in West Michigan. You can follow Matt on Twitter @mheusser, contact him by email or visit the website of his company, Excelon Development.