
CLOUD COMPUTING WORLD

Issue 7 2015

Cloud Storage Solutions
Business Critical IT Systems
Trust in the Cloud
Investing in the Future
Next Generation Cloud
Disaster Recovery-as-a-Service
Launch Partners

DATA CENTRE SUMMIT 2015 NORTH

Data Centre Summit North is the first in a series of new one-day, conference-focussed events, and is set to take place at Manchester's Old Trafford Conference Centre on the 30th of September 2015. DCS will bring the industry's thought leaders together in one place, alongside its leading vendors and end users. The focus of the event will be on education, networking and debate, and it will provide an open forum for delegates to learn from the best in the business.

The event will also feature an exhibit hall, where the industry's leading companies will show their newest products and services, together with a networking lounge where you can make connections with like-minded business professionals.

To enquire about exhibiting, call Peter Herbert on 07899 981123 or Ian Titchener on 01353 771460.

30th of September 2015, Manchester's Old Trafford Conference Centre

    Platinum Headline Sponsor

    Event Sponsor

Schneider Electric

    www.datacentreworld.com


CONTENTS
3  CCW News: All the key news from the world of cloud
4  Privacy and Compliance: Moving the legal industry to a private cloud
8  Hadoop in the Cloud: Big clouds, big data and big analytics
12 Bring Your Own Cloud: When the cloud casts its shadow
16 Data Centre Design: Big data implications for financial services
20 Next Generation Cloud: SolidFire Array
22 Business Continuity: Cloud continuity for distributed applications and services
24 A Good DR Plan: Investing in disaster recovery
26 Trust in the Cloud: CipherCloud Data Security Report
32 Linux Containers: Hot spares mean low-cost instant disaster recovery
36 Investing in the Future: Business critical
38 Cloud Storage Solutions: Protecting sensitive data in the cloud
42 Business Continuity: Plan B Disaster Recovery Survey

e-space north business centre, 181 Wisbech Rd, Littleport, Ely, Cambridgeshire CB6 1RA

    Tel: +44 (0)1353 771460 [email protected] www.cloudcomputingworld.co.uk

LGN Media LTD
Publisher & Managing Director: Ian Titchener
Editor: Nick Wells
Production Manager: Rachel Titchener
Design: Andy Beavis
Financial Controller: Samantha White

The views expressed in the articles and technical papers are those of the authors and are not endorsed by the publishers. The author and publisher, and its officers and employees, do not accept any liability for any errors that may have occurred, or for any reliance on their contents.

All trademarks and brand names are respected within our publication. However, the publishers accept no responsibility for any inadvertent misuse that may occur.

    This publication is protected by copyright 2015 and accordingly must not be reproduced in any medium. All rights reserved.

    Cloud Computing World stories, news, know-how? Please submit to [email protected]




    FOREWORD

Testing, testing...

Hello everyone,

Disasters come in multiple forms, but their effect on your organisation can be mitigated through rigorous testing. In short, you prepare for the worst-case scenario. Companies have never been this dependent on data, or as vulnerable to its inaccessibility. So, what is your strategy for a natural or man-made disaster?

In the world of cloud computing, where day-to-day conversations are littered with jargon, there's often confusion over the difference between Disaster Recovery-as-a-Service (DRaaS) and backup. Many organisations end up using the wrong terminology, but the two are actually quite different. For anyone requiring business continuity, backup should be complementary to disaster recovery plans.

Thanks to the rise of cloud-based disaster recovery, a hybrid strategy has become the recommended method of data protection and an affordable option for small to mid-sized businesses. Solely investing in disk is not enough. The cloud also means that the cost of on-premises solutions has begun to look like a waste of IT resources within bigger organisations. Investing in the right plan will let you focus instead on core business tasks.

Testing can be expensive, but it is invaluable in assessing both the solution and the people who execute it. While an audit will help, the only way to be sure that your organisation is DR-ready is testing, testing and more testing. Enjoy the issue.

    Best Regards, Nick Wells, Editor, Cloud Computing World

    CLOUD COMPUTING 3

REGULARS

CCW NEWS
All the key news in the world of cloud. Please don't forget to check out our website at www.cloudcomputingworld.co.uk for a regular weekly feed of relevant news for cloud professionals.

Cubic Interactive Ltd has launched Gekko, a smart email management and collaboration tool for the architecture, engineering and construction (AEC) industry. Gekko has been designed to help users easily find, filter, track, categorise and prioritise emails, providing a smart solution to email overload and reducing the amount of time users and project teams spend sharing, managing and filing emails. Using industry-standard IMAP technology, Gekko integrates with all popular email programs, including Microsoft Outlook, Gmail, Apple Mail and Kerio. "With first-hand experience working within the industry, our team has a unique insight and understanding of how projects are managed and the problems the AEC industry faces with time, administration and compliance," said Daniel LoGiudice, Director of Cubic Interactive. "We've created Gekko to help teams manage their projects, improve team collaboration and maintain key client relationships."

    www.cubic-interactive.com
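
Gekko's own code is proprietary, but the IMAP foundation the company cites is easy to picture. Below is a minimal, hypothetical sketch, using Python's standard imaplib, of the kind of server-side search an email-management tool can build on; the server, account and project reference are invented for the example.

```python
# Illustrative sketch only -- not Gekko's implementation. Asks the IMAP
# server itself to find project-related mail, rather than downloading
# whole mailboxes to filter on the client.
import imaplib

HOST = "imap.example.com"   # hypothetical server
USER = "user@example.com"   # hypothetical account

conn = imaplib.IMAP4_SSL(HOST)
try:
    conn.login(USER, "app-password")
    conn.select("INBOX", readonly=True)

    # Server-side search: unseen mail whose subject mentions a project
    # reference -- the raw material for filtering and categorising.
    status, data = conn.search(None, '(UNSEEN SUBJECT "Project-1234")')
    if status == "OK":
        for msg_id in data[0].split():
            # Fetch headers only; BODY.PEEK leaves the message unread.
            _, msg = conn.fetch(
                msg_id, "(BODY.PEEK[HEADER.FIELDS (FROM SUBJECT DATE)])"
            )
            print(msg[0][1].decode())
finally:
    conn.logout()
```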

ANSYS customers can now deploy consistent, enterprise-specific simulation workflows and data to more engineers, regardless of geographic location or business unit, with the launch of ANSYS 16.1 and ANSYS Enterprise Cloud. The new solution, running on Amazon Web Services (AWS), simplifies and accelerates the transition to cloud-based simulation by providing a reference architecture for end-to-end simulation that can be deployed within days, minimizing risk while boosting productivity. Customers who adopt the ANSYS Enterprise Cloud can scale their simulation capacity, including infrastructure and software assets, on demand in response to changing business requirements, optimizing efficiency and cost while responding to the growing demand for wider use of the technology.

"HGST sees the use of cloud computing as an important paradigm shift, providing increased business agility and capacity when and where we need it," said Steve Phillpott, Chief Information Officer of HGST, Inc. "We are impressed that the ANSYS solution delivers the full end-to-end simulation process in the cloud, allowing us to maintain models, simulate and analyze results directly in our virtual private cloud (VPC) environment. Keeping everything in HGST's VPC mitigates compliance, connectivity, performance and latency issues that are unique challenges for complex modeling and simulation workflows such as ours."

The ANSYS Enterprise Cloud has been carefully architected to remove previous barriers to the adoption of cloud computing for engineering simulation. Delivered in a single-tenant environment that secures customer data, the solution supports 3-D interactive graphics workloads and auto-scaling high-performance computing (HPC), so results don't need to move between end users and the cloud data centre.

    www.ansys.com

Cloud robotics, the integration of cloud computing technology into robots, has been gaining prominence globally. End users have begun to recognise the benefits of this novel concept, which uses the Internet to augment a robot's capabilities, mainly by off-loading computation and providing services on demand. With this concept set to make future robots more productive and efficient, the diverse requirements of end users can be met without compromising quality of service.

New analysis from Frost & Sullivan, Innovations in Cloud Robotics, finds that cloud robotics will lead to the development of smart robots that have higher computing efficiency and consume less power. These attributes will drive down the cost of manufacturing, as there is less hardware, and will also result in lower emissions. "As cloud robotics moves beyond its nascent stage, numerous applications of these technologies will come to the fore," noted Technical Insights Research Analyst Debarun Guha Thakurta. "For the moment, healthcare, transportation, consumer robotics and manufacturing are areas that can benefit from the use of shared resources and the elimination of the need to manage or update robotics software."

The major challenge for market participants, however, is the high dependence of cloud robotics on active Internet connectivity for processing any function. In areas of limited or no connectivity, robots powered by the cloud are unable to function effectively and respond promptly in critical situations. "The convergence of cloud robotics with big data, context-aware computing and high-speed ubiquitous wireless networks, along with the use of advanced wireless sensors, could solve connectivity issues that slow response times," said Technical Insights Research Analyst Mousumi Dasgupta. "Operations that require the execution of tasks in real time will also need service-oriented robots with on-board processing capabilities."

    ww2.frost.com

With consumers and businesses doing more work than ever in the cloud, it is vital that users have intuitive and powerful ways to access and collaborate on documents, photos, files and videos, whether on a PC, Mac or mobile device. To help make the cloud a social experience, FileChat has launched a platform that lets users chat about, like and vote on any of their files, harnessing the power of cloud-based storage like never before. "Proven cloud storage solutions such as Google Drive and Dropbox need an easy and common way to make collaboration and messaging more effective for businesses and consumers alike," said Alberto Escarlate, CEO of FileChat. "The new FileChat application gives users complete control over their cloud and will usher in a new way of managing documents and files and discussing them in one single user-friendly platform." FileChat is currently free and will be adding premium services in the coming months. It has secured a $3 million round of funding from private investors to support the development and launch of the platform. By adding social elements directly to cloud storage services, FileChat improves productivity and efficiency and allows everyone to use their favourite cloud storage platforms for much more than simply storing files. FileChat also offers the first powerful search tool dedicated to discussions on documents saved in cloud storage, so users can quickly find what they are looking for.

    www.filechat.com



COMPLIANCE: Privacy and Compliance

LEGAL IN THE CLOUD

Steven Harrison deliberates moving the legal industry to a private cloud. By Steven Harrison, Lead Technologist & Head of Products, Exponential-e

Introduction
The transition to the cloud is having an impact on every part of our personal and professional lives. We once outsourced each of our utility services to large private enterprises and created professions dedicated to specialist areas of knowledge. Now the same is happening to the data and computing systems that we work with, and in no industry are the implications so profound as in the legal profession. The legal sector has hundreds of years of experience and precedent surrounding the careful control of documents, from verifying authenticity using stamps, wax seals, watermarks and other mechanisms to strict version control. However, none of this works in the digital world. Digital documents are frustratingly easy to copy, modify and forge. Add to this the connected nature of the cloud, plus the ability to access these systems over the Internet, and you have a recipe for legal disaster.

Internet Regulation
Many people mistakenly equate the cloud with the Internet, or with large-scale computing infrastructure. In truth, the cloud is neither of these things; it is simply some (clever) marketing surrounding the same concept as any public utility. The Internet is, by its very definition, an inter-network: a network created as a result of several other networks connecting together. The Internet is not a thing in and of itself, or any one specific network. Instead, it is the result of the decision by the world's network providers to allow the free movement of data. This is why issues such as net neutrality and supposed Internet regulation are so important, but also often so futile. The Internet does not exist as a single entity, so how would anyone purport to regulate it? What we are really talking about is regulating the behaviours of individuals and companies who have voluntarily agreed to create a global communications system. All of which means that the cloud is not synonymous with the Internet. In fact, the notion that you can access a computing service that is not provided locally, and must traverse several independent public networks via the informal agreement that is the Internet, becomes questionable.

Data
In addition, when considering a move to the cloud, the biggest differentiator from other utility industries is the nature of data itself. Whereas electricity, gas and water are all simple commodities, data has complexities that we are only just beginning to understand. There is a raging debate in the courts of the world over questions of data ownership, be that by its subject (the person the data regards) or its collector (the person who gathered it). I do not think anyone today can predict the date we will achieve a global consensus on the legal handling of data, and so we must look at the world as a collection of parts and consider carefully how the law applies in each situation.

Cloud
For the legal industry, the challenges surrounding cloud and privacy are many and complex. Legal organisations need to use private cloud networks that operate as an extension of their existing environment, to ensure that confidential information remains behind the firewall and is not transported over the public Internet. It's important to consider:

1. When using a cloud provider, where does the data physically reside: when at rest, when in use, when being backed up and when being restored after a disaster?

2. When using software based in the cloud, what is the access security mechanism? How many independent factors are used? Can this access security be compromised? Can communications be monitored or suffer man-in-the-middle attacks? What would be the potential impact of a denial-of-service attack if the data became unavailable at a crucial juncture of a legal proceeding?

3. How is data integrity managed? Can data be modified without trace? Can we rely on the time and date stamps showing the last modified date, time and user? Will the court system we operate in accept these stamps as evidence? (A minimal integrity check is sketched after this list.)

    4. If access is global, are there legal issues about people from outside of the jurisdiction accessing the data? Can legal borders be enforced in the online world?

5. How do you deal with the risk of the commercial provider itself running into legal, financial or regulatory difficulties that would impact the service it provides?

6. Today there is no one solution for every application in the legal sector; the first part of the answer, therefore, is to ask the right questions of your service provider. A service provider suited to the legal industry will understand the challenges and will be able to provide targeted and specific answers to each firm's challenges.
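
On point 3, one widely used building block is a digest register: a cryptographic hash recorded when a document is filed makes any later change detectable, because even a one-byte edit produces a different hash. The Python sketch below is a minimal illustration, not legal-grade evidence handling; the file paths and register format are invented, and a production system would anchor digests with an external timestamping authority (for example, RFC 3161) rather than a local file that could itself be edited.

```python
# Minimal tamper-evidence sketch: record a SHA-256 digest when a document
# is filed, re-compute it on retrieval. Any modification changes the digest.
import hashlib
import json
import time

def digest(path: str) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def record(path: str, register: str = "register.jsonl") -> None:
    """Append the document's digest and a filing timestamp to a register."""
    entry = {"path": path, "sha256": digest(path), "recorded": time.time()}
    with open(register, "a") as reg:
        reg.write(json.dumps(entry) + "\n")

def verify(path: str, expected_sha256: str) -> bool:
    """True only if the document still matches the digest recorded at filing."""
    return digest(path) == expected_sha256
```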

The second part of the answer is to avoid taking small steps and instead look for transformative moves. The danger is that, with small steps, the new privacy issues and complications will outweigh the benefits. Consider moving servers to the cloud in isolation: the servers are suddenly remote from the users and their PCs, accessed over the Internet or VPN, in separate administrative domains and with separate security controls, policies and mechanisms. To match the security of legacy LAN infrastructures, go one step further and virtualise the entire workplace (servers, applications, desktops and communications) so that the environment can be secured and controlled with a single policy, as a single environment, and managed to an even tighter standard. Again, this is where having a service provider partner with the right experience is critical. Big moves tend to carry more risk, but done properly it should be possible to stage and test the new environment without disrupting current operations, and then simply schedule a switch-over at a suitable time.

Conclusion
Moving the legal industry to a private cloud and outsourced service provider model is without question the correct course of action. Done properly, the law practice benefits from the knowledge, skill and experience of experts in the ICT field who also have a solid understanding of the legal industry. If we really push the bounds of what technology is capable of, we can achieve even better security and privacy than is provided by legacy LAN environments.


BUSINESS INTELLIGENCE: Hadoop in the Cloud

BIG CLOUDS, BIG DATA AND BIG ANALYTICS

Hadoop in the cloud makes big data live up to its greatest potential. By Nitin Bandugula, MapR Technologies

Infoburst: Companies that provide enterprise-grade Hadoop distributions have recognized the need for cloud enablement.

Introduction
At the current rate of data growth, the challenge for enterprises of scaling effectively and extracting significant value from data in a timely manner is paramount. Organizations cannot keep pace by expanding their data centres using only traditional means. A cloud option provides a flexible and cost-effective way for organizations to store and process data.

This is not to say that everything will move to the cloud, but a cloud option, either for additional processing or as a disaster recovery component, is realistic. Companies that provide enterprise-grade Hadoop distributions have recognized the need for cloud enablement and have included features such as mirroring across the WAN. Amazon and Google, for example, offer Hadoop in the cloud as a service.

    Essentially, Apache Hadoop provides a way to capture, organize, store, search, share, analyze and visualize disparate data sources (structured, semi-structured and unstructured) across a large cluster of commodity computers, and is designed to scale up from dozens to thousands of servers, each offering local computation and storage.
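
As a concrete illustration of that programming model, here is the classic word count written for Hadoop Streaming, which allows mappers and reducers to be plain scripts that read stdin and write stdout. This is a minimal sketch rather than any vendor's tooling, and the input and output paths in the usage comment are hypothetical.

```python
# mapper.py -- reads raw text on stdin, emits tab-separated (word, 1) pairs.
import sys

for line in sys.stdin:
    for word in line.split():
        print(word + "\t1")

# ---------------------------------------------------------------------------
# reducer.py -- Hadoop Streaming sorts mapper output by key, so all counts
# for a word arrive contiguously; we just sum them per word.
import sys

current_word, count = None, 0
for line in sys.stdin:
    word, n = line.rstrip("\n").rsplit("\t", 1)
    if word != current_word and current_word is not None:
        print(current_word + "\t" + str(count))
        count = 0
    current_word = word
    count += int(n)
if current_word is not None:
    print(current_word + "\t" + str(count))

# Typical invocation (paths hypothetical):
#   hadoop jar hadoop-streaming.jar \
#       -input /data/text -output /data/wordcounts \
#       -mapper mapper.py -reducer reducer.py \
#       -file mapper.py -file reducer.py
```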

The Hadoop Advantage
The open source nature of Hadoop creates an ecosystem that facilitates constant advancements in its capabilities, performance, reliability and ease of use. These enhancements can be made by any individual or organization (a true global community of contributors) and are then either contributed to the basic Apache library or made available in a separate (often free) commercial distribution. In effect, Hadoop is a complete system, or stack, for data analysis. Apply this to the cloud, and you have a cost-efficient tool for analyzing data and quickly gathering insight.

Extracting valuable information from data can be complex, especially given the growth in unstructured and semi-structured data. The sources of unstructured data, such as log files, social media and videos, are growing in both their size and importance. Traditional analytic techniques require considerable pre-processing of unstructured and semi-structured data before they can produce results, and achieving the desired insights can be a daunting process.

The ability of Hadoop to employ simple algorithms and obtain meaningful results when analyzing unstructured, semi-structured and structured data in its raw form is unprecedented and currently unparalleled.

Hadoop for BI
Hadoop today offers capabilities that make BI on big data intuitive and easy to adopt, even in a cloud environment. Tools now available within the Hadoop stack (such as Apache Drill) are ushering in an era where SQL specialists no longer have to wait weeks or months before they can access new datasets; a world where IT does not have to be a bottleneck in preparing and maintaining schemas for the BI user; a world where data scientists are free to follow the information trail wherever it may lead them. Such capabilities become all the more relevant when operating in a short-lived, ephemeral cluster environment in the cloud.

The old way of doing analytics required data or schema preparation in advance. Datasets are predefined and schemas are fixed. When an analysis uncovers something interesting, digging deeper often requires a manual process of adding more dimensions to the schema, requiring help from IT. But there are now options that give the end user instant access to data, and multiple ways to analyze it. These new forms of agile data exploration enhance the traditional schema-first approaches. Iteratively gaining deeper and deeper insights becomes much easier with an expanded set of tools for the end user.

Apache Drill
Apache Drill is a query engine for Hadoop that supports data-intensive distributed applications for interactive analysis of large-scale datasets. Drill automatically understands the structure of the data, giving analysts instant self-service access to data exploration. This approach reduces the burden on IT and increases the productivity and agility of analysts and developers. Apache Drill, an open-source version of Google's Dremel system, includes advanced capabilities such as an extensible architecture, overall data agility, support for full SQL, and the ability to efficiently handle modern data structures and nested data (such as JSON and Parquet). Most importantly, Drill is the world's first and only distributed SQL engine that generates as-it-happens schema, providing structure only when it is needed to interpret something.

Apache Drill makes it possible for you to explore data without having to ask IT counterparts to define schemas or create new ETL processes. As you query the data, Apache Drill's engine discovers the source schemas and automatically adjusts query plans. Since you can query self-describing data and process complex data types as you go, you can extract every last ounce of insight from big data. Data from sources such as Hadoop, HBase and MongoDB can be queried using ANSI SQL semantics to glean new insights at the speed of thought.

By analyzing multiple data sources such as blog posts, sensors, clickstreams, customer interaction records, videos, transaction data and more, you can gain invaluable, actionable insight. Apache Drill makes it easier for you to bring SQL queries to such data sets much more rapidly than any other SQL engine on Hadoop. The faster you can derive value from big data, the greater the potential for you to reach more customers and address their particular needs.
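
To make the schema-free claim concrete: alongside its JDBC/ODBC interfaces, Drill exposes a REST endpoint, so a SQL query against a raw JSON file can be issued with a bare HTTP POST and no table definition or ETL step. The sketch below assumes a Drill instance listening on localhost:8047 and uses an invented file path; the response is parsed on the assumption that rows come back under a "rows" key.

```python
# Query a raw JSON file through Apache Drill's REST API -- no schema,
# no table definition, no ETL. Standard library only.
import json
import urllib.request

query = """
    SELECT t.user_id, t.event, t.ts
    FROM dfs.`/data/clickstream/2015-05.json` t
    WHERE t.event = 'purchase'
    LIMIT 10
"""

payload = json.dumps({"queryType": "SQL", "query": query}).encode()
req = urllib.request.Request(
    "http://localhost:8047/query.json",
    data=payload,
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    result = json.load(resp)

# Assumes result rows arrive under a "rows" key, one dict per record.
for row in result.get("rows", []):
    print(row)
```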

Conclusion
The data analytics paradigm is changing, and this presents you with a real opportunity to take advantage of some new and powerful capabilities without sacrificing any existing ones. The combination of Hadoop in the cloud, along with powerful BI tools, can give you a significant competitive edge by taking full advantage of the insights this paradigm shift provides.


OPINION: Bring Your Own Cloud

WHEN THE CLOUD CASTS ITS SHADOW...

James Walker explains why we need a collaborative approach to ensure open, standardized cloud services. By James Walker, President, OpenCloud Connect

Infoburst: BYOC is already a major trend but, as with any shadow IT, it does raise concerns.

Introduction
The proliferation of compact espresso, filter and other gourmet hot-drinks machines through today's workplaces has not only provided relief for office coffee aficionados, it has also freed up one acronym for rather more critical usage: everyone should now know that BYOC stands purely for Bring Your Own Cloud.

In practice, BYOC means allowing employees to work with their own choice of cloud application or cloud storage services, for example Google Apps or Dropbox. For a smaller business, the use of public cloud services can offer better value than rolling out an internal shared service, especially when free personal cloud offerings are all that is needed. The public service is also more likely to be maintained as state-of-the-art in terms of reliability and an easy user interface. The biggest worry might be security, but centralized company storage provides richer rewards for hackers than a personal cloud account, which is why breaches of company networks and data centres are the ones that get publicized. So a well-managed BYOC policy could even provide better security than company-owned cloud storage and applications.

BYOC is already a major trend but, as with any shadow IT, it does raise concerns unless policies are in place and workers keep management well informed about their use of public cloud services. In particular, the use of public cloud storage can mean that business owners have less idea where their company's information is stored, who can access it and how it is being used.

The problem is compounded when a large population of employees all make their own independent choices, keeping up with the latest services and features, making it very difficult for management to keep pace with such a fast-changing range and variety of usage. Do the benefits of BYOC justify the difficulty in policing it?

Benefits of BYOC
Think of the convenience of sharing and updating large documents via Dropbox, and it is obvious that BYOC can increase productivity, reduce costs and make it faster and easier to do business. It offers outstanding benefits for developers, architects, content creators, designers and anyone needing to share and co-create large files. Add to that the convenience and consistency of access for mobile workers, and BYOC becomes a no-brainer.

    On a small scale, sharing data stored in the cloud is generally cheaper than using a customized in-house data centre, and it is automatically backed up to a secure location, with disaster protection, without impacting the users. It offers many advantages over data being stored on personal laptops or USB drives that can get lost or stolen.

    At a more subtle level, a culture where people are given greater freedom to find their own tools and solutions is a culture that fosters innovation, creativity, and independent thinking. The more employees feel free to make their own choices and define their own ways of working, the more they feel committed to the work and eager to justify their choices. New ideas, new techniques and new solutions stimulate competition and drive success.

    In short, the extra freedom and autonomy offered by BYOx in general and BYOC in particular can be a major boost for productivity and enterprise. So how can these benefits be realised without too much risk and loss of control?

Security and Policy Enforcement
The most immediate concerns for the organisation are likely to be accountability and security: if company data is being stored outside the company, where is it? And is it safe? Recent legislation on data protection and citizens' rights means that it is not enough just to say that data is safe; it can be important to know where and how it is being stored. If your company stores personal details of European citizens in the USA, where they may be accessible to US security agencies, you are not compliant with EU data protection laws. Even if the details are stored in Europe but happen to pass through the cloud via the US, they could be at risk. So there is a need for co-operation on a global scale to create common standards that define levels of data sensitivity, how data may be stored and transmitted, and how it is managed and accessed.

    Once in cloud storage, the data may well have

    James Walker explains why we need a collaborative approach to ensure open, standardized cloud services By: James Walker, President OpenCloud Connect

    CLOUD COMPUTING 11

  • 10 CLOUD COMPUTING

    OPINION Bring Your Own Cloud

    Infoburst: BYOC is already a major trend but, as with any shadow IT, it does raise concerns

    WHEN THE CLOUD CASTS ITS SHADOW...

    OPINION

    CASTS ITS SHADOW...

    IntroductionThe proliferation of compact espresso, filter and other gourmet hot drinks machines though todays workplaces has not only provided relief for office coffee aficionados, it has also freed up one acronym for rather more critical usage: everyone should now know that BYOC stands purely for Bring Your Own Cloud.

    In practice, BYOC means allowing employees to work with their own choice of cloud application or cloud storage services for example Google Apps or Dropbox. For a smaller sized business, the use of public cloud services can offer better value than rolling out an internal shared service especially when free personal cloud offerings are all that is needed. The public service is also more likely to be maintained as state-of-the-art in terms of reliability and an easy user interface. The biggest worry might be security, but centralized company storage can provide richer rewards for hackers than a personal cloud account, resulting in more publicized security breaches in company networks and data centres. So, a well-managed BYOC policy could even provide better security than company-owned cloud storage and applications.

    BYOC is already a major trend but, as with any shadow IT, it does raise concerns unless policies are in place and workers keep management well informed about their use of public cloud services. In particular, the use of public cloud storage can mean that business owners have less idea where their companys information is stored, who can access it and how it is being used.

    The problem is compounded when a large population of employees are all making their own independent choices, keeping up with the latest services and features, and making it very difficult for management to keep pace with such a fast changing range and variety of usages. Do the benefits of BYOC justify the difficulty in policing it?

    Benefits of BYOC Think of the convenience in sharing and updating large documents via Dropbox, and it is obvious that use of BYOC can increase productivity, reduce costs and make it faster and easier to do business. It offers outstanding benefits for developers,

    architects, content creators, designers and anyone needing to share and co-create large files. Add to that the convenience and consistency of access for mobile workers and BYOC becomes a no-brainer.

    On a small scale, sharing data stored in the cloud is generally cheaper than using a customized in-house data centre, and it is automatically backed up to a secure location, with disaster protection, without impacting the users. It offers many advantages over data being stored on personal laptops or USB drives that can get lost or stolen.

    At a more subtle level, a culture where people are given greater freedom to find their own tools and solutions is a culture that fosters innovation, creativity, and independent thinking. The more employees feel free to make their own choices and define their own ways of working, the more they feel committed to the work and eager to justify their choices. New ideas, new techniques and new solutions stimulate competition and drive success.

    In short, the extra freedom and autonomy offered by BYOx in general and BYOC in particular can be a major boost for productivity and enterprise. So how can these benefits be realised without too much risk and loss of control?

    Security and Policy Enforcement The most immediate concerns for the organisation are likely to be accountability and security: if company data is being stored outside the company, where is it? And is it safe? Recent legislation on data protection and citizens rights means that it is not enough just to say that data is safe, it can be important to know where and how it is being stored. If your company stores personal details from European citizens in the USA, where they may be accessible to US security, you are not compliant with EU data protection laws. Even if the details are stored in Europe, but happen to pass through the cloud via the US, they could be at risk. So, there is a need for co-operation on a global scale to create common standards that define levels of data sensitivity, how they may be stored and transmitted, and how they are manage and accessed.

James Walker explains why we need a collaborative approach to ensure open, standardized cloud services. By: James Walker, President, OpenCloud Connect

Once in cloud storage, the data may well have better protection than anything the company could afford to give it, so the question also arises whether its passage to and from storage is secure – and that depends on the level of encryption and what the cloud provider is offering. If different types of data need different levels of encryption, has the cloud provider allowed for this?

The company might need a higher level of encryption than that used by the cloud provider's security system, or want to use a preferred third-party supplier. At the opposite pole, the encryption mechanism may add latency or inconvenience, and the user would prefer lighter encryption. BYOE (Bring Your Own Encryption) is a solution already being touted in such cases.

The problem with allowing users to choose their own encryption is that the provider's security platform has to be able to support the chosen encryption system. The provider might offer a choice from a range of encryption offerings that have been tested for compatibility with the cloud offering, but that still requires the user to trust another's choice of encryption algorithms: a fully homomorphic offering might be vital for one operation, but a waste of money and effort for a whole lot of other processes.
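The underlying idea of BYOE is simple enough to sketch: data is encrypted under a customer-held key before it ever reaches the provider. A minimal example, assuming Python's cryptography package; the upload call is a hypothetical stand-in for any provider SDK:

```python
# Minimal BYOE sketch: encrypt locally with a customer-held key so the
# provider only ever stores ciphertext. Fernet (from the cryptography
# package) provides AES-based authenticated encryption; upload() is a
# hypothetical stand-in for whichever provider SDK is in use.
from cryptography.fernet import Fernet

key = Fernet.generate_key()     # held by the customer, never the provider
fernet = Fernet(key)

plaintext = b"Q3 board minutes - commercially sensitive"
ciphertext = fernet.encrypt(plaintext)

# upload("acme-cloud", "minutes.enc", ciphertext)   # hypothetical call

# After download, only the key holder can recover the original:
assert fernet.decrypt(ciphertext) == plaintext
```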

So there is a need for a global standard for integrating cloud security platforms, so that any encryption offering can be registered for support by a given platform. The customer could then choose a cloud offering both for the quality of its services and for its globally certified XYZ-standard security platform. Then the customer can go shopping for an XYZ-certified encryption system that matches their own specific security criteria, and take responsibility for that choice and how well it complies with legal obligations.
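In code terms, such a platform amounts to a fixed interface plus a registry of certified implementations. The sketch below is purely illustrative – the "XYZ" standard is a placeholder, and none of these names are real APIs:

```python
# Illustrative sketch of a pluggable encryption registry, in the spirit
# of the hypothetical XYZ-certified platform above: the platform fixes
# the interface; the customer chooses (and answers for) the implementation.
from typing import Dict, Protocol

class Cipher(Protocol):
    def encrypt(self, data: bytes) -> bytes: ...
    def decrypt(self, data: bytes) -> bytes: ...

REGISTRY: Dict[str, Cipher] = {}

def register(name: str, cipher: Cipher) -> None:
    """A provider would accept any cipher certified against the standard."""
    REGISTRY[name] = cipher

class ReversingToyCipher:
    # Stand-in only - NOT real encryption. A certified plug-in might wrap
    # AES-GCM, or a homomorphic scheme where the workload justifies it.
    def encrypt(self, data: bytes) -> bytes: return data[::-1]
    def decrypt(self, data: bytes) -> bytes: return data[::-1]

register("toy-reverse", ReversingToyCipher())
chosen = REGISTRY["toy-reverse"]            # the customer's choice
assert chosen.decrypt(chosen.encrypt(b"hi")) == b"hi"
```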

Order Out of Chaos

Compliance and security are very important considerations in a BYOC environment. If the various cloud stakeholders could get together to create and agree global standards to cover these issues, it would solve a lot of management headaches and boost the adoption of BYOC and its many benefits. But this is really just the tip of the iceberg, because there are so many more aspects of cloud service provision that must be brought in line to reduce the risk of service fragmentation.



Two employees, or departments, may be using different cloud applications for the same purpose. Each choice might have been made with full attention to security, compliance and how well it meets the business need, and at what cost. But if those two users now need to co-operate on a project, will the services be compatible? And in a company merger, with all the problems of smoothly integrating two different business cultures, what happens if both businesses depend critically on outsourced cloud services that are mutually incompatible?

    The fact is that we are seeing explosive growth in cloud service offerings, many of them so well designed and operated as to become essential business tools for an ever-growing number of organisations of every size and type. But there are few common standards that could make it easy to compare service offerings, let alone know whether they can be mixed and matched or will be compatible with each other in the future.

    A technological shift on the scale of cloud computing creates a gold rush situation where the prizes go to those who get in first and stake their claim in the new territory. The results can be impressive, but the long-term prospects can be messy. Consider what happened at the birth of the PC market in the 1980s, with a huge choice of operating systems and hardware that fast became obsolete. Even when the dust settled there were still problems among organisations using MS-DOS, MacOS and Unix.

Compare that with the relatively smooth transition from legacy WAN technologies to today's Carrier Ethernet – the difference being that the vendors put aside their differences and co-operated to form the MEF, then got the service providers and other WAN stakeholders involved in hammering out a set of standards that gave CE universal appeal.

Conclusion

Now the number and diversity of stakeholders in the cloud environment is much greater, and the complexities of developing standards in such a fast-moving market are formidable. But the same process is happening with OpenCloud Connect (OCC) and its mission to prevent the cloud from fragmenting into incompatible offerings and vendor lock-in by rival providers.

The OCC has already defined the standard cloud reference architecture. One of the first benefits of such work, as with Carrier Ethernet, is a common language and standard terminology, and this is already beginning to happen with the cloud. Even within a single organisation there will be cloud stakeholders who speak different business languages: they will need such standardisation to be able to talk to each other.

Using this reference architecture, OCC cloud standards are being developed by the stakeholders themselves, on an ongoing basis that can keep pace with the fast-evolving cloud market. There is much to do, and we need to make sure that everyone has a chance to contribute, but the important thing is that it is happening now. Bring Your Own Cloud provides a great business environment, and it is up to us all to make sure that it develops into a level playing field – one as richly satisfying as Bring Your Own Coffee.



Data Centre Design: Big Data Implications for Financial Services

Tony Kenyon explains why businesses are focussing less on IT and more on their fundamental services and users. By Tony Kenyon, Director of Product Management, Office of the CTO, A10 Networks

Introduction

Over the past decade we've seen huge changes in the way data centres are designed, with increasing emphasis on dynamically provisioned, software-programmable infrastructures. Data centres are increasingly expected to transition to revenue generation, rather than being cost centres, requiring a major shift in the ability to automate and better monetize services so that businesses can respond rapidly to market changes. There are a number of inflexion points at play here, including cloud computing, server virtualization, network functions virtualization (NFV), software-defined networking (SDN), mobile technology and, more recently, social networks and the Internet of Things (IoT). The general trend is from protocols to APIs, from transport to applications and services, from instrumentation to information. Businesses want to focus less on IT and more on their fundamental services and users.

With the emergence of Internet-focused organisations and social networks, and new multi-tenant cloud computing models, organisations may now be supporting many thousands or even millions of online users. Some of the most innovative (such as Amazon, Google, Facebook and eBay) have taken novel approaches to the issues of scale and efficiency, leading to initiatives that have revolutionized the way we think about data centre design and to new architectural models such as cloud computing, virtualization and big data.

Over this period we've also seen a remarkable shift in the way we consume and manage data, with some of the biggest challenges around how to analyse all of the data we're interested in within a reasonable time, for business benefit or some useful insight. As the volume and velocity of data have grown dramatically, we've seen an uneasy transition from classic relational databases, object databases and graph databases through to in-memory techniques and 4GLs; the aim being to find more agile and more optimal ways to handle large and potentially complex data sets, often where the schema is uncertain, where the data may be too large to fit on a single node, or where it is simply coming at you too fast to handle.

Is Big Data Over-Hyped?

The traditional way of handling large datasets has been to normalize the data, make it well structured, reduce it, and then typically move it somewhere central (or at least summarize and collate it through some form of data hierarchy). For data centres, especially those spread over a large enterprise, this has created a particular set of challenges, not least of which is how to handle and protect traffic moving between locations, how to scale operations, how to store data long term, how to archive it, and how to meet increasingly robust regulatory requirements. All this requires a clear understanding of what data is being held where, how that data is being moved between entities, and any potential confidentiality breaches revealed by any analytics.

Big data is a true paradigm shift: it exorcises many of the things we took for granted about large-scale data management – how and where we process and store data, the need to structure that data, even the complexity of the algorithms used to gain insight. With that said, it's important not to get carried away with the hype, and to recognise that big data cannot be taken in isolation when considering future IT planning. Where and how big data is deployed very much depends on the applications and motivations of the organisation. Big data does not solve all data management problems, nor should it be expected to deal with all use cases. For example, using Apache Hive to run real-time critical queries on structured data is probably not a great idea; for such applications, existing relational techniques may be much faster.

Nevertheless, the importance of unstructured data and the ability to gain insight from it mean that big data is a technology with a long lifespan ahead of it. In 2014 many organisations started to take a serious look at big data but struggled to get to grips with it in terms of delivering business benefits; however, all the indications are that things are moving fast, and of those organisations agile and brave enough to invest now, some are already seeing significant financial and operational benefits from the insights revealed.

Early Adoption

One of the things that should be obvious is that we are very much at the end of the beginning for big data, and that introduces real uncertainty – and opportunity. While relatively immature, big data is providing solutions to well-established business problems across banking and financial market organisations around the globe today. The financial services industry is leveraging big data for strategic and competitive gain, transforming processes and operations and identifying new business opportunities through the insight gained from analytics.

For those organisations not actively deploying big data solutions, roughly three quarters appear to have planning in place or pilot schemes. Banks in particular are under severe pressure to shift their focus from products to customers in order to retain customers and grow market share through new opportunities. Understanding customer behaviour, as well as social and market trends, is key. One of the main challenges facing new entrants is the lack of publicly available use cases and reference architectures; those organisations that have successfully invested in big data to optimize their workflows may keep details closely guarded for the time being to maintain competitive advantage.

Use Cases

Think about a banking network with millions of customers, each with a different activity profile, each with a different set of normal or expected actions. This brings into focus a number of complex variables that need to be weighted, classified and correlated. Think about all the historical data banks hold, and how that data might be used to model customer preferences. Analysing this huge dataset using big data techniques would make it possible to run personalized, event-based marketing campaigns for new products and services, with a much higher probability of conversion than blindly emailing the whole customer base – especially when coupled with a co-ordinated messaging approach across email, mobile, branch and ATM interactions. A key trend we are seeing across many Internet-based businesses is the desire to improve and personalise interaction with users.
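As a toy illustration of that targeting idea, here is a minimal propensity model in Python with scikit-learn; the features and data are invented, and a real bank model would draw on far richer transaction history:

```python
# Toy propensity-to-respond model for event-based marketing. Features
# and data are invented; the point is scoring customers rather than
# blindly emailing the whole base.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [age, avg_monthly_spend, products_held, uses_mobile_app]
X = np.array([
    [24,  300, 1, 1],
    [51, 1200, 3, 0],
    [33,  800, 2, 1],
    [67,  150, 1, 0],
    [29,  950, 2, 1],
    [45,  400, 4, 0],
])
y = np.array([1, 0, 1, 0, 1, 0])   # 1 = responded to a past campaign

model = LogisticRegression().fit(X, y)

scores = model.predict_proba(X)[:, 1]        # probability of conversion
targets = np.argsort(scores)[::-1][:2]       # contact the top prospects
print("Target customers:", targets, "scores:", scores[targets].round(2))
```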

With cashless transactions becoming the norm, fraud is also a big issue. Banks need to continuously monitor client behaviour for anything anomalous – for example to detect credit card fraud. Anomaly detection in this context must accommodate many variables – time, geolocation, transaction amount, transaction frequency, items purchased – mapped against a template of what normal looks like for that customer. Bear in mind that what's normal for you in December may be very different from July. Spatio-temporal problems like this are non-trivial, and solving them requires highly efficient processing at scale. With data streaming in thick and fast and potentially large financial transactions at stake, we ideally want to detect anomalies accurately within a small time window. Accuracy here means not stopping valid transactions (false positives) and not allowing fraudulent ones (false negatives).
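A crude way to capture the December-versus-July point is to score each transaction against the customer's own history for the same calendar month. The sketch below does only that; a real fraud engine would correlate geolocation, merchant, frequency and more, in near real time:

```python
# Minimal per-customer, season-aware anomaly score: z-score a transaction
# amount against the customer's history for the same calendar month.
# History data is invented for illustration.
from statistics import mean, stdev

history = [(12, 900), (12, 1100), (12, 950),   # (month, amount) pairs
           (7, 300), (7, 280), (7, 340)]

def anomaly_score(month: int, amount: float) -> float:
    amounts = [a for m, a in history if m == month]
    if len(amounts) < 2:
        return 0.0                              # too little data to judge
    mu, sigma = mean(amounts), stdev(amounts)
    return abs(amount - mu) / sigma if sigma else 0.0

print(anomaly_score(12, 1000))   # ~0.2: normal December spending
print(anomaly_score(7, 1000))    # ~23: the same amount in July gets flagged
```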

Challenges

Perhaps the most obvious change big data introduces is in the storage-compute model. Big data components such as Apache Hadoop enable distributed processing to be done in situ with the data, where each data node is also a compute node. This fundamentally changes how we view storage and raises a number of questions about what to do with existing SAN and NAS estates, and how best to archive. As mentioned earlier, big data should not be used as a panacea for all data management functions, but it may have real benefits where organisations have a lot of legacy data (for example by using predictive analytics on live data correlated with historical data). That could require moving historical data, integrating with that data, or unarchiving it from long-term storage. Again this has implications for traffic management, security, data handling and storage.
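The canonical illustration of this in-situ model is a Hadoop Streaming job: the framework ships the script to the nodes that already hold the data blocks, rather than shipping the data to a central processor. A minimal word-count sketch in Python (the submission command afterwards is indicative only):

```python
# wordcount.py - minimal Hadoop Streaming word count, illustrating the
# "compute moves to the data" model: Hadoop runs the map phase on the
# data nodes holding the input blocks, then shuffles sorted keys to reducers.
import sys

def mapper():
    for line in sys.stdin:
        for word in line.split():
            print(f"{word}\t1")

def reducer():
    current, count = None, 0
    for line in sys.stdin:                      # keys arrive sorted
        word, n = line.rsplit("\t", 1)
        if word != current:
            if current is not None:
                print(f"{current}\t{count}")
            current, count = word, 0
        count += int(n)
    if current is not None:
        print(f"{current}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1] == "map" else reducer()
```

Submission would look something like `hadoop jar hadoop-streaming.jar -files wordcount.py -mapper "python wordcount.py map" -reducer "python wordcount.py reduce" -input /data/logs -output /data/counts`, with paths depending on the installation.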

Where financial institutions have invested significant time and resource in building out legacy applications, big data clusters raise issues of integration and visibility. How do existing applications access this unstructured data? Should data be duplicated in the interim? Should big data be stored on SANs at the sacrifice of performance? These remain open questions.

We can't mention big data without also mentioning the Internet of Things (IoT). By 2020, various industry predictions estimate the number of Internet-connected devices will be between 50 and 75 billion. This is going to radically change how humans interact with technology, the visibility we have of the state of these things, and the insights gained from analytics on those things. What this means in practice is much more unstructured data – data that needs to be processed using big data techniques, and data that will ultimately be useful to many organisations. The ability to analyse and correlate IoT data will be attractive to many businesses, both to optimize existing processes and to seek new revenue opportunities. We should therefore expect to see much larger volumes of data being generated (through instrumentation, or external feeds) and then handled within enterprise data centres. This needs to be considered and factored into future IT planning.

While data management problems are exploding, financial institutions may be reluctant to move sensitive data off premise; however, the agility and cost structure of cloud computing is an enticing proposition for deploying big data clustered compute nodes, particularly on tactical projects. Many organisations cannot afford to build and tear down data centres to handle their processing and storage scale demands, nor do they have the agility needed to deal with rapidly changing, high-volume unstructured datasets. Cloud computing is fast becoming a keystone in our thinking about the way we architect data centres; one can foresee institutions deploying hybrid cloud solutions at both a strategic and a tactical level to handle big data tasks, perhaps anonymising data or covering regulatory concerns through service level and data confidentiality agreements.
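The anonymisation step mentioned above can be as simple as replacing direct identifiers with keyed hashes before records leave the premises, so off-premise analytics can still join on customer without holding the mapping key. A minimal sketch, with key management deliberately simplified:

```python
# Minimal pseudonymisation sketch: swap direct identifiers for keyed
# hashes before records are pushed to cloud compute nodes. The HMAC key
# stays on premise; key management is deliberately simplified here.
import hashlib
import hmac

SECRET_KEY = b"kept-on-premise-only"        # never shipped with the data

def pseudonymise(customer_id: str) -> str:
    return hmac.new(SECRET_KEY, customer_id.encode(),
                    hashlib.sha256).hexdigest()[:16]

record = {"customer_id": "GB-1029384", "amount": 42.50, "merchant": "grocer"}
outbound = {**record, "customer_id": pseudonymise(record["customer_id"])}
print(outbound)   # safer off premise; re-identification needs the key
```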

Big data is relatively new – it's only a decade since Google published the seminal MapReduce white paper – and, as with any new technology, the primary concern is functionality. This introduces a number of security challenges, not only in the secure handling and storage of the data, but in understanding the nature of the data itself, and how it can be manipulated to create insight (and potentially breach confidentiality policy).

Seizing the Opportunity

The financial services industry has changed beyond all recognition over the past 20 years, and firms are now dealing with a highly diverse, increasingly mobile and demanding customer base. Banking has moved rapidly online, with users communicating and transacting through a variety of channels, often around the clock. While structured customer data is growing in size and scope, unstructured data is fast emerging as the more significant source of insight. From investment bankers through to front-office staff, ready access to detailed information on products, risk metrics, market trends and customer behaviour is critical in optimizing decision making, whether human or automated.

Summary

For many organisations, and especially those in financial services, the most effective strategy for big data adoption will be to identify core business requirements, and then leverage existing infrastructure as part of a phased migration, ideally taking a specific project as a proof of concept in order to build up the necessary data science skills and analyse deployment options. Organisations will need to factor in regulatory and security concerns, as well as storage and archival models, and the relative importance of cloud computing resources. Existing data sources and analytics can be used to support the business opportunity, and by extending with big data techniques these new uns