This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter's approach.
Unified communications (UC) requires many components, protocols, back-end processes and pieces of communication equipment to work together seamlessly. But users don't care about the complexities involved. If they experience poor voice quality, frozen screens or inconsistent performance, they won't use any solution, no matter how cutting-edge.
To ensure workforce acceptance, organizations need an effective UC Assurance Program that offers a methodical, end-to-end approach to performance evaluation. User experience cannot be understood by making a few test calls or sporadically checking video streams. UC solutions must be evaluated under conditions that accurately reflect how -- and how often -- they will be used.
Detecting issues and accurately diagnosing problems in the pre-deployment phase is the single most important thing an organization can do to help keep projects on time and on budget. More important, this step is essential to assure that quality, interoperability and other technical issues get corrected before they impact users.
The complexities inherent in UC solutions dictate a methodical, phased approach to testing. The checklist of best practices includes:
• Phase 1: Network assessment
- Validate foundational elements (carrier to IP network).
- Establish baseline performance metrics.
• Phase 2: Real-time (synchronous) communications
- Validate voice, chat and video quality.
- Assess the performance of contact center applications (IVR, CTI, routing, etc.).
• Phase 3: Non-real-time (asynchronous) systems
- Validate messaging, data sources, applications and more.
- Evaluate the effect that each element has on the quality of real-time communications.
• Phase 4: End-to-end solution validation
- Load test all applications, services and user behaviors running concurrently.
When designing a UC Assurance Program, it is important that it accurately reflect expected usage patterns. Only by emulating -- and then analyzing -- the correct volume and mix of traffic, can an organization fully validate the proposed solution. For example, if a company expects 40% of employees to utilize the new "notify and respond" application by mobile phone, 15% by text and 60% via the Web, the test traffic needs to be configured accordingly.
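A traffic mix like the one above can be turned into concrete test-session counts. The sketch below uses the article's example percentages (which describe channel usage that can overlap, so they need not sum to 100%); the function name and user count are illustrative assumptions.

```python
# Sketch: sizing simulated test traffic to match an expected usage mix.
# The shares come from the article's example; note that users may reach
# the application through more than one channel, so shares can overlap.

def sessions_per_channel(total_users, channel_share):
    """Return how many simulated sessions to generate for each channel."""
    return {channel: round(total_users * share)
            for channel, share in channel_share.items()}

mix = {"mobile": 0.40, "text": 0.15, "web": 0.60}
plan = sessions_per_channel(5000, mix)  # hypothetical 5,000-user rollout
```

A test tool configured from `plan` would then generate 2,000 mobile, 750 text and 3,000 web sessions, matching the expected usage pattern rather than an arbitrary load.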
When designing a test plan for a proposed UC solution, first create a map of all network elements and applications. Next, evaluate how users will interact with the applications to estimate traffic loads and required bandwidth. Then simulate the environment to clarify how traffic from one service impacts another. This will also provide an understanding of routing requirements. Lastly, it is important to assess the functionality of all user options within the application itself. Use this information to create the complete test plan.
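The "estimate traffic loads and required bandwidth" step can be sketched with a simple per-call calculation. The per-call figures below are common planning numbers for VoIP over Ethernet (payload plus RTP/UDP/IP/Ethernet overhead), not values from this article; the function is a rough sizing aid, not a capacity-planning tool.

```python
# Sketch: rough bandwidth estimate for concurrent VoIP calls.
# Per-call bandwidth is codec-dependent; ~87.2 kbps for G.711 and
# ~31.2 kbps for G.729 (each direction, including packet overhead)
# are widely used planning figures.

CODEC_KBPS = {"G.711": 87.2, "G.729": 31.2}  # per call, per direction

def voip_bandwidth_kbps(concurrent_calls, codec="G.711"):
    """Estimate one-way bandwidth needed for the given concurrent call load."""
    return concurrent_calls * CODEC_KBPS[codec]

# e.g. 150 concurrent G.711 calls -> roughly 13 Mbps in each direction
peak_kbps = voip_bandwidth_kbps(150)
```

Estimates like this, multiplied out across the traffic map, show where links need provisioning headroom before any simulated traffic is run.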
Automating the test process is important. Organizations must be able to replicate the exact conditions that created a failure to verify that it has been fully corrected.
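One way to make failure conditions replayable is to capture every parameter of a test run, including the random seed driving the traffic generator, so the same scenario can be re-run after a fix. Everything in this sketch is hypothetical; it only illustrates the principle that a fixed seed plus a saved scenario yields identical test conditions.

```python
# Sketch: a reproducible test scenario. The scenario dict (call volume,
# timing bounds, RNG seed) is everything needed to recreate the run;
# persisting it as JSON lets the exact failing conditions be replayed.
import json
import random

def run_scenario(scenario):
    """Generate a deterministic sequence of per-call delays for this scenario."""
    rng = random.Random(scenario["seed"])  # fixed seed -> identical call pattern
    return [rng.randint(0, scenario["max_delay_ms"])
            for _ in range(scenario["calls"])]

scenario = {"seed": 42, "calls": 5, "max_delay_ms": 200}
first = run_scenario(scenario)
# Reload the scenario (as if from a saved failure report) and replay it.
replay = run_scenario(json.loads(json.dumps(scenario)))
```

Because `first` and `replay` are identical, a failure observed in the first run can be reproduced on demand to verify the correction.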
Measuring user experience
Because user experience is everything, it is essential that solutions be evaluated from that perspective. A common benchmark for the quality of audio or video as perceived by the user is the mean opinion score (MOS), which rates quality on a scale from 1 (bad) to 5 (excellent). The following table shows metrics that organizations can use to understand performance from the user's viewpoint:
Results that fall short of these metrics indicate subpar quality, system delays and performance issues. Often, something as simple as an incorrect QoS setting, inadequate bandwidth provisioning or an improperly configured Session Border Controller can lead to a complete system failure. Interoperability problems can also stem from differing codec settings among individual components. Routing issues are a constant source of trouble in complex UC deployments. Programming errors buried in the application code itself can cause certain functions to behave erratically.
Isolating these issues can be impossible when test phases are skipped or incomplete. By contrast, a complete pre-deployment UC Assurance Program will identify the cause of these issues for rapid troubleshooting.
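MOS can be estimated directly from measured network metrics. The sketch below uses a widely circulated simplification of the ITU-T G.107 E-model (not a formula from this article, and not a full G.107 implementation); the constants are the commonly published approximations.

```python
# Sketch: estimating MOS from latency, jitter and packet loss using a
# common simplification of the ITU-T G.107 E-model. The R-factor is
# reduced by delay and loss impairments, then mapped to a 1-5 MOS.

def estimate_mos(latency_ms, jitter_ms, loss_pct):
    """Approximate MOS from one-way latency (ms), jitter (ms) and loss (%)."""
    # Jitter is commonly weighted ~2x latency to account for the jitter
    # buffer; +10 ms approximates codec processing delay.
    eff_latency = latency_ms + 2 * jitter_ms + 10
    if eff_latency < 160:
        r = 93.2 - eff_latency / 40
    else:
        r = 93.2 - (eff_latency - 120) / 10
    r -= 2.5 * loss_pct                    # each 1% loss costs ~2.5 R points
    r = max(0.0, min(100.0, r))
    # Standard R-to-MOS mapping from the E-model.
    return 1 + 0.035 * r + 7e-6 * r * (r - 60) * (100 - r)

good = estimate_mos(latency_ms=20, jitter_ms=5, loss_pct=0)
degraded = estimate_mos(latency_ms=150, jitter_ms=30, loss_pct=5)
```

On a healthy LAN path the estimate lands well above 4.0 (generally regarded as toll quality), while a congested path with loss drops noticeably, which is exactly the kind of gap a monitoring dashboard should surface.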
Keep it right: Monitoring UC solutions
Once a solution has been fully tested and deployed into a production environment, continuous monitoring of the user experience is an essential piece of a UC Assurance Program. Software upgrades, bandwidth expansions and unrelated projects constantly change the environment. Ongoing monitoring is not just for detecting system failures. If properly designed, these programs can predict issues and give organizations a chance to correct them before they impact users.
There are two ways to evaluate network traffic. The first is to passively monitor it as it passes, collecting metrics such as jitter, round-trip time and packet loss. These metrics enable MOS calculations for real-time voice and video communications and provide an excellent view of overall system health and availability.
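The jitter figure a passive monitor reports for an RTP stream is typically the running interarrival-jitter estimate defined in RFC 3550 (section 6.4.1): each new packet's transit-time difference is folded into the estimate with a 1/16 gain. A minimal sketch of that update:

```python
# Sketch: RFC 3550 interarrival jitter, as maintained per RTP stream by
# a passive monitor: J(i) = J(i-1) + (|D(i-1,i)| - J(i-1)) / 16,
# where D is the difference in transit time between consecutive packets.

def update_jitter(jitter, send_ts, recv_ts, prev_send_ts, prev_recv_ts):
    """Fold one packet arrival into the running jitter estimate (same time units)."""
    d = (recv_ts - prev_recv_ts) - (send_ts - prev_send_ts)
    return jitter + (abs(d) - jitter) / 16.0

# Two packets with identical transit time add no jitter...
j = update_jitter(0.0, send_ts=20, recv_ts=70, prev_send_ts=0, prev_recv_ts=50)
# ...while a packet arriving 5 ms late nudges the estimate upward.
j = update_jitter(j, send_ts=40, recv_ts=95, prev_send_ts=20, prev_recv_ts=70)
```

The 1/16 smoothing factor means a single late packet barely moves the estimate, but sustained variation accumulates, which is what makes the metric useful for MOS calculation.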
The second approach is to actively monitor specific services, applications or equipment by placing test calls or injecting test traffic into the network, then following the traffic's progress and analyzing the results. This method serves to assess voice quality for customer-agent systems, measure data response times, evaluate video packet loss rates, check routing paths or test system capacity. Organizations can specify a complete range of test paths to fully evaluate the end-to-end performance of even the most complex UC solutions.
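An active probe in its simplest form injects timestamped test packets and measures the round trip. In the sketch below a local UDP echo thread stands in for the system under test (a real deployment would probe an actual UC endpoint or SBC); all names and the stop convention are illustrative assumptions.

```python
# Sketch: an active-monitoring probe. A tiny UDP echo responder plays
# the role of the target; the probe sends sequenced packets and records
# per-packet round-trip time in milliseconds.
import socket
import threading
import time

def echo_server(sock):
    """Echo datagrams back to the sender until a 'stop' packet arrives."""
    while True:
        data, addr = sock.recvfrom(2048)
        if data == b"stop":
            return
        sock.sendto(data, addr)

server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))               # OS picks a free port
port = server.getsockname()[1]
threading.Thread(target=echo_server, args=(server,), daemon=True).start()

probe = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
probe.settimeout(2.0)
rtts = []
for seq in range(5):
    start = time.monotonic()
    probe.sendto(str(seq).encode(), ("127.0.0.1", port))
    probe.recvfrom(2048)                    # wait for the echo
    rtts.append((time.monotonic() - start) * 1000.0)  # RTT in ms
probe.sendto(b"stop", ("127.0.0.1", port))
```

Running probes like this along each configured test path, and alerting when RTTs drift from the baseline, is what lets a monitoring program flag degradation before users notice it.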
Together, these approaches help organizations monitor their UC solutions and predict issues before they impact customers. And proactive correction of problems ensures consistent use across the business.
UC solutions promise greater loyalty, enhanced productivity, improved operational efficiency and tremendous cost savings. However, this technology will never have a chance to prove itself if users dismiss it because it does not work correctly.
Investing in testing and monitoring not only ensures a great user experience, but also a great project experience. Detecting issues in the pre-deployment phase helps keep UC projects on time and on budget. Well executed test plans provide empirical evidence as to the source of problems. This eliminates the cross departmental -- and cross vendor -- finger pointing that can plague such complex undertakings. UC Assurance gives stakeholders the information they need to make smarter decisions throughout the development process. Ongoing monitoring protects against service degradation as environments evolve.
A strong UC Assurance Program speeds time to acceptance from both an organizational and user viewpoint. These best-practice methods enable companies to generate maximum ROI on their UC investments and, ultimately, make the effort worthwhile.
Empirix is a market leader in service quality assurance solutions for comprehensive customer experience management and end-to-end communication analysis of mobile broadband and IP-based communications systems. The company addresses the quality and interoperability challenges posed by the increased use of mobile broadband, LTE, converged voice and unified communications in both IT and telecom environments. For further information, please visit www.empirix.com.