Looking into the crystal ball – a peek into what 2013 will bring
- 11 December, 2012 15:35
The anticipated slowdown in the resources sector is making many Australian businesses nervous about their spending. As a result, CIOs are being asked to focus on pragmatic things such as reducing costs, improving existing business processes, and finding the right people to run their current infrastructure. By contrast, in the US, cost cutting has already run its course, and the priorities of CIOs have shifted back to increasing the top-line competitiveness and agility of the businesses they drive.
The biggest overall challenge Australian IT departments will face over the next year is choosing the right combination of technologies: ones that cut overall costs today while positioning the business to rapidly support new initiatives as conditions improve and change in the future.
To support those plans, I think we’ll see initiatives geared towards increased sharing, more use of automation and greater competition between vendors for the shrinking IT dollar.
One thing that is certain for the coming year is that the use of virtualisation will continue to rise as the cost of virtualisation software drops due to increased competition. We have already seen many IT organisations adopt a multi-hypervisor strategy to encourage competition between the key vendors in this space.
Another cost-saving trend will be an increase in the rate at which business-critical applications are migrated onto virtualised infrastructure, especially ERP systems built on Oracle and SAP. In many cases, the migration from proprietary Unix-based hardware/software stacks to higher-performing, more cost-efficient Intel-based hardware won’t simply replace one technology silo with another; it will be done with an eye to the future, moving workloads onto pre-validated private cloud infrastructure stacks. The combination of a short payback time and potentially game-changing long-term results means this trend will continue to gain popularity throughout 2013.
Another area of growth for cloud-based infrastructure will be driven by massive demand for workforce mobility and collaboration software, sometimes collectively referred to as “systems of engagement” – for example, virtual desktops, Microsoft Exchange, SharePoint, and enterprise-class Dropbox alternatives such as Citrix ShareFile. The infrastructure supporting these applications needs to scale rapidly with rock-solid reliability as they become core platforms for the long-term success of the business, and cloud-based infrastructures, with their high degrees of standardisation, automation and elasticity, are often the ideal platforms for them. For many, the big question of 2013 will be: “Is that platform going to be on a public or private cloud?”
This is an important question given the reputation cloud has for reducing costs. While core ERP systems are likely to remain firmly “on-premise” for security, probity, and compliance reasons, many IT shops will increasingly look to put the new “systems of engagement” into a public cloud. With the majority of the technical issues of public cloud having been resolved in 2012, and as IT learns to manage the governance issues, the decision criteria for private vs public cloud will come to resemble the staffing decision between permanent and contracted employees. In the end, the choice between a capex-based private cloud and an opex-based public cloud becomes one of finding the right balance at the time, and of building an infrastructure that can easily move that balance in either direction as business conditions change.
Business analytics and big data
It’s almost impossible today to talk about the future without also talking about big data, the IT industry’s newest buzzword. When we mention big data, we are talking about huge amounts of data that need to be stored and managed correctly. For example, on a global scale, the following occurs every minute:
- 48 hours of video is uploaded to YouTube
- 204,166,667 emails are sent
- More than 2,000,000 enquiries are sent to Google
- 684,478 pieces of content are loaded onto Facebook
- Over 100,000 tweets are sent.
See the infographic from Domo for the full breakdown.
Most IT departments simply don’t operate at that kind of scale, but they are still clearly affected by their very own big data challenge: the masses of unstructured data that users and machines generate daily. The good news is that, thanks to the open source community, organisations have easy access to software that lets them mine this kind of unstructured data to better understand themselves, their infrastructure, and their customers.
Today, however, outside the scientific community and a small number of very forward-thinking organisations, big data is still in its early days. Having said that, one thing I think we will see in 2013 is that all the interest in big data, along with new products from Microsoft, SAP and others, will drive a renewed interest in “small data” analytics, and uptake of this technology may be on the verge of booming. In the meantime, some companies are beginning to analyse other machine-generated big data sources, such as CCTV video, to change building layouts, improving workplace productivity and customer satisfaction. If nothing else, the results that big data/Hadoop can achieve show how IT can cost-effectively use scale-out technology to tackle workloads that were previously reserved for “Big Iron”.
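The scale-out pattern behind Hadoop is, at its core, just two steps: map raw records into key/value pairs, then reduce the pairs by key. As a toy illustration only (the sample log lines and function names below are invented for this sketch, not taken from any particular product), counting status codes across machine-generated web logs looks like this:

```python
from collections import defaultdict
from itertools import chain

# Hypothetical machine-generated log lines. On a real Hadoop cluster,
# each mapper would process one chunk of a far larger data set in parallel.
log_lines = [
    "GET /home 200",
    "GET /checkout 500",
    "GET /home 200",
]

def map_phase(line):
    """Emit a (status_code, 1) pair for each log line."""
    status = line.split()[-1]
    yield (status, 1)

def reduce_phase(pairs):
    """Sum the counts for each key, as a Hadoop reducer would."""
    totals = defaultdict(int)
    for key, count in pairs:
        totals[key] += count
    return dict(totals)

counts = reduce_phase(chain.from_iterable(map_phase(l) for l in log_lines))
print(counts)  # {'200': 2, '500': 1}
```

Because the map step has no shared state, the same logic spreads across hundreds of commodity servers, which is what lets scale-out clusters take on workloads once reserved for “Big Iron”.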
Agile data infrastructures
Throughout 2013 it will be very hard to put an exact number on how much data organisations will need to process. But one inescapable trend is that data storage will continue to grow exponentially, and that despite steady declines in the cost of disk drives, data storage and its associated management costs will consume an ever-larger proportion of IT budgets. Rather than deal with this in the piecemeal, tactical manner that has led to the current unsustainable growth in storage costs, in 2013 IT will begin to create agile data infrastructures that not only lower the overall cost of owning and managing data storage in the short term, but also create a platform for future innovation and ongoing cost savings without sacrificing reliability. Many are doing so based on criteria Gartner identified:
- Storage efficiency
- Intelligent caching
- Unified architecture
- Integrated data protection
- Continuous operations
- Secure multi-tenancy
- Service automation and analytics.
It’s important to note that not all of the trends and technologies mentioned here will suit every organisation. Each company has different needs, and CIOs and IT decision makers need to do their homework on which technology or solution is right for their unique infrastructure. Don’t implement a new solution just to keep up with the crowd; it won’t work in your favour. On the other hand, Australian IT has the opportunity to learn from previous mistakes. Reducing costs has to be balanced with building for the future, but the good news is that there are now some great technologies that will help you do both.
John Martin is Principal Technologist at NetApp Australia and New Zealand