    Data Protection in the Cloud FAQ

    January 12th, 2016

    SNIA recently hosted a multi-vendor discussion on leveraging the cloud for data protection. If you missed the Webcast, “Moving Data Protection to the Cloud: Trends, Challenges and Strategies”, it’s now available on-demand. As promised during the live event, we’ve compiled answers to some of the most frequently asked questions on this timely topic. Answers from SNIA as well as our vendor panelists are included. If you have additional questions, please comment on this blog and we’ll get back to you as soon as possible.

    Q. What is the significance of NIST FIPS 140-2 Certification?

    Acronis: FIPS 140-2 Certification can be a requirement for certain entities to use cloud-based solutions. It is important to understand the customer you are going after and whether this will be a requirement. Many small businesses do not require FIPS, but some do.

    Asigra: Organizations that are looking to move to a cloud-based data protection solution should strongly consider solutions that have been validated by the National Institute of Standards and Technology (NIST), an agency of the U.S. Department of Commerce, as this certification represents that the solution has been tested and meets the most current security requirements for cryptographic modules, or encryption. It is important to validate that the data is encrypted at rest and in flight for security and compliance purposes. NIST issues numbered certificates to solution providers as validation that their solution was tested and approved.

    SolidFire: FIPS 140-2 has four levels of security, Level 1 through Level 4, depending on what the application requires.  FIPS stands for Federal Information Processing Standard; the standard is required by some non-military federal agencies before hardware or software is allowed in their datacenters.  It describes the requirements for how sensitive but unclassified information is stored, and focuses on how cryptographic modules secure information for these systems.
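
    To make “encrypted at rest and in flight” concrete, here is a minimal sketch, in Python with the cryptography package, of client-side AES-256-GCM encryption applied before a backup ever leaves your machine (in-flight protection would additionally rely on TLS for the transfer itself). Key handling is deliberately omitted; managing keys inside a validated module is precisely what FIPS 140-2 governs.

        import os
        from cryptography.hazmat.primitives.ciphers.aead import AESGCM

        # Encrypt a backup payload client-side so the cloud provider only
        # ever sees ciphertext. In practice the key would come from a
        # KMS/HSM, not be generated inline like this.
        key = AESGCM.generate_key(bit_length=256)
        aesgcm = AESGCM(key)
        nonce = os.urandom(12)            # must be unique per encryption
        ciphertext = aesgcm.encrypt(nonce, b"backup payload", None)

        # On restore, the same key and nonce recover the plaintext and
        # authenticate that the ciphertext was not tampered with.
        assert aesgcm.decrypt(nonce, ciphertext, None) == b"backup payload"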

    Q. How do you ensure you have real time data protection as well as protection from human error?  If the data is replicated, but the state of the data is incorrect (corrupt / deleted)… then the DR plan has not succeeded.

    SNIA: The best way to guard against human error or corruption is with regular point-in-time snapshots; some snapshots can be retained for a limited length of time while others are kept for as long as the data needs to be retained.  Snapshots can be done in the cloud as well as in local storage.

    Acronis: Each business needs to think through its retention plan to mitigate such cases. For example, it might retain 7 daily backups, 4 weekly backups, 12 monthly backups and one yearly backup. In addition, it is good to have a system that allows one to test the backup with a simulated recovery to guarantee that data has not been corrupted.
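
    As a sketch of how such a retention plan collapses daily backups into a small retained set, here is a hypothetical Python implementation of the 7-daily / 4-weekly / 12-monthly / 1-yearly scheme described above (real products let you tune these counts and handle edge cases this toy ignores):

        from datetime import date, timedelta

        def plan_retention(backups):
            """Given daily backup dates, return the set to keep under a
            7-daily / 4-weekly / 12-monthly / 1-yearly scheme."""
            keep = set(sorted(backups, reverse=True)[:7])      # dailies
            by_week, by_month, by_year = {}, {}, {}
            for b in sorted(backups):                          # newest wins
                by_week[b.isocalendar()[:2]] = b
                by_month[(b.year, b.month)] = b
                by_year[b.year] = b
            keep.update(sorted(by_week.values(), reverse=True)[:4])
            keep.update(sorted(by_month.values(), reverse=True)[:12])
            keep.update(sorted(by_year.values(), reverse=True)[:1])
            return keep

        # A year of daily backups collapses to roughly 20 retained copies.
        backups = [date(2016, 1, 12) - timedelta(days=i) for i in range(365)]
        print(len(plan_retention(backups)))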

    Asigra: Organizations migrating to SaaS-based applications like Google Apps, Microsoft Office 365 and Salesforce.com can protect the data created and stored in these applications with a cloud-based data protection solution that backs it up to a third-party cloud, meeting the unique data protection requirements of the organization. You need to take responsibility for protecting your data born in the cloud, much as you protect data created in traditional on-premises applications and databases. The responsibility for data protection does not move to the SaaS application provider; it remains with you.

    For example, user error is one of the top ways that data is lost in the cloud. With Microsoft Office 365, by default, deleted emails and mailboxes are unrecoverable after 30 days; if you cancel your subscription, Microsoft deletes all your data after 90 days; and Microsoft’s maximum liability is $5,000 US or what a customer paid during the last 12 months in subscription fees – assuming you can prove it was Microsoft’s fault. All the more reason you need a data protection strategy in place for data born in the cloud.

    SolidFire: You need a technology that provides real-time asynchronous replication to achieve a low RPO without relying on snapshots.  Application-consistent snapshots must be used alongside real-time replication to achieve both real-time and point-in-time protection.  Consider the scenario of a successful failover followed by the discovery of corrupted data: with application-consistent snapshots at the DR site, you can roll back instantly to a point in time when the data and the application were in a known good state.

    Q. What’s the easiest and most effective way for companies to take advantage of cloud data protection solutions today? Where should we start?

    SNIA: The simplest way to ease into cloud storage is to either (1) use your backup software’s direct cloud interface, if it has one, to set up an offsite backup, or (2) use a cloud storage gateway that presents public or private cloud storage as another local NAS resource.
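
    As an illustration of option (1), here is a minimal sketch of what the offsite copy amounts to under the hood: pushing a local backup archive to an S3-compatible bucket, using Python and boto3. The file path and bucket name are placeholders; a real backup product layers cataloging, scheduling and retention on top of a transfer like this.

        import boto3

        s3 = boto3.client("s3")  # credentials/region come from your config
        s3.upload_file(
            "/backups/nightly-2016-01-11.tar.gz",   # local backup archive
            "example-offsite-backups",              # placeholder bucket name
            "nightly/2016-01-11.tar.gz",            # object key in the cloud
            ExtraArgs={"ServerSideEncryption": "AES256"},
        )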

    Acronis: The easiest way is to use a solution that supports both cloud and on-premises data protection. Then you can start by backing up certain workloads to the cloud and add more over time. Today, we see many workloads protected with both a cloud and an on-premises copy.

    Asigra: Organizations should start with non-production, non-critical workloads to test the cloud-based data protection solution and ensure that it meets their needs before moving to critical workloads. Identifying and understanding their corporate requirements for a public, private and/or hybrid cloud architecture is important, as is identifying the workloads that will be moved to the cloud and the timing of this transition. Organizations may also want to consult a third-party IT Solutions Provider with expertise and experience in cloud-based data protection to explore how others are leveraging cloud-based solutions, as well as to conduct a data classification exercise to understand which recent data needs to be readily available versus older data that needs to be retained for longer periods for compliance purposes. It is important that organizations identify their required Recovery Time Objectives and Recovery Point Objectives when setting up a new solution, to ensure that in the event of a disaster they are able to meet these requirements. Tip: Retain the services of a trusted IT Solution Provider and run a proof of concept or test drive the solution before moving to full production.

    SolidFire: Find a simple and automated solution that fits into your budget.  Work with your local value-added reseller of data protection services.  The best thing to do is NOT wait.  Even if it’s something like Carbonite, it’s better than nothing.  Don’t get caught off guard; no one plans for a disaster.

    Q. Is it sensible to move to a pay-as-you-go service for data that may be retained for 7, 10, 30, or even 100 years?

    SNIA: Long term retention does demand low cost storage of course, and although the major public cloud storage vendors offer low pay-as-you-go costs, those costs can add up to significant amounts over a long period of time, especially if there is any regular need to access the data.  An organization can keep control over the costs of long term storage by setting up an in-house object storage system (“private cloud”) using “white box” hardware and appropriate software such as what is offered by Cloudian, Scality, or Caringo.  Another way to control the costs of long-term storage is via the use of tape.  Note that any of these methods — public cloud, private cloud, or tape — require an IT organization, or its service provider, to regularly monitor the state of the storage and periodically refresh it; there is always the potential over time for hardware to fail, or for storage media to deteriorate, resulting in what is called bit rot.
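
    To see how pay-as-you-go charges accumulate, here is a back-of-the-envelope comparison with entirely hypothetical prices (check current vendor pricing before drawing conclusions; access and egress fees, which further favor the private option when data is read regularly, are ignored):

        capacity_tb = 500                 # data under long-term retention
        years = 10
        cloud_per_tb_month = 10.0         # $/TB-month pay-as-you-go (hypothetical)
        private_upfront = 250_000.0       # white-box object store acquisition ($)
        private_opex_per_year = 30_000.0  # power, space, admin, refresh reserve ($)

        cloud_total = capacity_tb * cloud_per_tb_month * 12 * years
        private_total = private_upfront + private_opex_per_year * years
        print(f"public cloud over {years} years:  ${cloud_total:,.0f}")
        print(f"private cloud over {years} years: ${private_total:,.0f}")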

    Acronis: The cost of storage is dropping dramatically and will continue to do so. The best strategy is to go with a pay-as-you-go model with the ability to adjust pricing (downward) at least once a year. Buying your own storage will lock you into pricing over too long a period.

    SolidFire: The risk of moving to a pay-as-you-go service for that long is that you lock yourself in for as long as you need to keep the data.  Make sure that contractually you can migrate or move the data away from them, even if it’s for a fee.  The sensible part is that you can contract that portion of your IT needs out and focus on your business and advancing it, rather than worrying about completing backups on your own.

    Q. Is it possible to set up a backup so that one copy is with one cloud provider and another with a second cloud provider (replicated between them, not just doing the backup twice) in case one cloud provider goes out of business?

    SNIA: Standards like SNIA’s CDMI (Cloud Data Management Interface) make replication between different cloud vendors pretty straightforward, since CDMI provides a vendor-neutral way of transferring data and metadata, and provides both standard and extensible metadata to control policy too.
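
    As a rough sketch of what CDMI-based replication looks like in practice, the Python fragment below (using the requests library) reads an object, data plus metadata, from one CDMI cloud and writes it to another. Endpoints and credentials are placeholders, and real transfers would also handle errors, non-text values (valuetransferencoding), and large objects:

        import requests

        CDMI_VER = {"X-CDMI-Specification-Version": "1.1"}

        def cdmi_copy(src_url, dst_url, src_auth, dst_auth):
            """Copy one object between two CDMI clouds (minimal sketch)."""
            # Read the object in CDMI's vendor-neutral JSON form
            r = requests.get(src_url, auth=src_auth,
                             headers={**CDMI_VER,
                                      "Accept": "application/cdmi-object"})
            obj = r.json()
            # Write it to the second provider; standard and extensible
            # metadata (including policy metadata) travel with the object
            requests.put(dst_url, auth=dst_auth,
                         headers={**CDMI_VER,
                                  "Content-Type": "application/cdmi-object"},
                         json={"mimetype": obj["mimetype"],
                               "metadata": obj["metadata"],
                               "value": obj["value"]})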

    Acronis: Yes, this is possible, but it is not a good strategy for mitigating a provider going out of business. If that is a concern, then pick a provider you trust and one where you control where the data is stored. Then you can easily switch providers if needed.

    SolidFire: Yes, setting up a DR site and a tertiary site is very doable.  Many data protection software products do this for you with integrations at the cloud providers.  When looking at data protection technology, make sure the policy engine is capable of being aware of multiple targets and moving data seamlessly between them.  If you’re worried about cloud service providers going out of business, make sure you bet on the big ones with proven success and revenue flow.



    Q&A – The Impact of International Data Protection Laws on the Cloud

    December 15th, 2015

    The impact of international data protection legislation on the cloud is complicated and constantly changing. In our recent SNIA Cloud Storage Webcast on this topic we did our best to cover some of the recent global data privacy and data protection regulations being enacted. If you missed the Webcast, I encourage you to watch it on-demand at your convenience. We answered questions during the live event, but as promised we’re providing more complete answers in this blog. If you have additional questions, please comment here and we’ll reply as soon as we can.

    The law is complex, and neither SNIA, the authors nor the presenters of this presentation are lawyers. Nothing here or in the presentation should be construed as legal advice. For that you need the services of a qualified professional.

    Q. What are your thoughts on Safe Harbour being considered invalid, and the potential for a Safe Harbour 2?

    A. Since 6 October 2015, when the European Court of Justice invalidated the European Commission’s Safe Harbour Decision, there’s been a lot written about Safe Harbour 2 in the press. But it was clear that a renegotiation was essential two years before that, when discussions for a replacement were started. Many think (and many hope!) that a new agreement, valid in terms of Europe’s human rights legislation, will be settled between the US and Europe sometime in March 2016.

    Q. Are EU Model Clauses still available to use instead of BCRs (Binding Corporate Rules)?

    A. EU-US data transfers facilitated by the use of model clauses probably today fail to comply with EU law. But as there appears to be no substitute available, the advice appears to be – use them for now until the problem is fixed. Full guidance can be found on the EC website.

    Q. What does imbalance mean relative to consent?

    A. An example might help. You might be an employee and agree (the “consent”) to your data being used by your employer in ways that you might not have agreed to normally – perhaps because you feel you can’t refuse because you might lose your job or a promotion for example. That’s an imbalanced relationship, and the consent needs to be seen in that light, and the employer needs to demonstrate that there has been, and will be, no coercion to give consent.


    Upcoming Webcast: The Impact of International Data Protection Legislation on the Cloud

    November 13th, 2015

    Data privacy vs. data protection has become a heated debate in businesses around the world as governments across the globe propose and enact strong data privacy and data protection regulations. Join us on November 18th for our next Cloud Storage live Webcast “Data Privacy vs. Data Protection: The Impact of International Data Protection Legislation on the Cloud.”

    Mandated frameworks include noteworthy changes: defining a data breach to include data destruction, adding the right to be forgotten, requiring breach notifications, and many other new elements that are changing the rules of data protection. The implications of this, and other proposed legislation, for how the cloud can be used to store data are significant. Join this live Webcast to hear:

    • “Directives” vs. “regulation”
    • General data protection regulation summary
    • How personal data has been redefined
    • Substantial financial penalties for non-compliance
    • Impact on data protection in the cloud
    • How to prepare now for impending changes

    Our experts, Bob Plumridge (SNIA Europe Board Member), Eric Hibbard (Chair, SNIA Security TWG), and I will all be available to answer your questions during the event. I encourage you to register today for this timely discussion. We hope to see you on November 18th!


    Moving Data Protection to the Cloud: Key Considerations

    October 13th, 2015

    Leveraging the cloud for data protection can be an advantageous and viable option for many organizations, but first you must understand the pros and cons of different approaches. Join us on Nov. 17th for our live Webcast, “Moving Data Protection to the Cloud: Trends, Challenges and Strategies” where we’ll discuss the experiences of others with advice on how to avoid the pitfalls, especially during the transition from strictly local resources to cloud resources. We’ve pulled together a unique panel of SNIA experts as well as perspectives from leading vendor experts at Acronis, Asigra and SolidFire, who’ll discuss and debate:

    • Critical cloud data protection challenges
    • How to use the cloud for data protection
    • Pros and cons of various cloud data protection strategies
    • Experiences of others to avoid common pitfalls
    • Cloud standards in use – and why you need them

    Register now for this live and interactive event. Our entire panel will be available to answer your questions. I hope you’ll join us!



    Outstanding Keynotes from Leading Storage Experts Make SDC Attendance a Must!

    September 18th, 2015

    Posted by Marty Foltyn

    Tomorrow is the last day to register online for next week’s Storage Developer Conference at the Hyatt Regency Santa Clara. What better incentive to click www.storagedeveloper.org and register than to read about the amazing keynote and featured speakers at this event – I think they’re the best since the event began in 1998! Preview sessions here, and click on the title to download the full description.

    Bev Crair, Vice President and General Manager, Storage Group, Intel will present Innovator, Disruptor or Laggard, Where Will Your Storage Applications Live? Next Generation Storage and discuss the leadership role Intel is playing in driving the open source community for software defined storage, server based storage, and upcoming technologies that will shift how storage is architected.

    Jim Handy, General Director, Objective Analysis will report on The Long-Term Future of Solid State Storage, examining research of new solid state memory and storage types, and new means of integrating them into highly-optimized computing architectures. This will lead to a discussion of the way that these will impact the market for computing equipment.

    Jim Pinkerton, Partner Architect Lead, Microsoft will present Concepts on Moving From SAS connected JBOD to an Ethernet Connected JBOD. This talk examines the advantages of moving to an Ethernet connected JBOD, what infrastructure has to be in place, what performance requirements are needed to be competitive, and examines technical issues in deploying and managing such a product.

    Andy Rudoff, SNIA NVM Programming TWG, Intel will discuss Planning for the Next Decade of NVM Programming describing how emerging NVM technologies and related research are causing a change to the software development ecosystem. Andy will describe use cases for load/store accessible NVM, some transparent to applications, others non-transparent.

    Richard McDougall, Big Data and Storage Chief Scientist, VMware will present Software Defined Storage – What Does it Look Like in 3 Years? He will survey and contrast the popular software architectural approaches and investigate the changing hardware architectures upon which these systems are built.

    Laz Vekiarides, CTO and Co-founder, ClearSky Data will discuss Why the Storage You Have is Not the Storage Your Data Needs, sharing some of the questions every storage architect should ask.

    Donnie Berkholz, Research Director, 451 Research will present Emerging Trends in Software Development drawing on his experience and research to discuss emerging trends in how software across the stack is created and deployed, with a particular focus on relevance to storage development and usage.

    Gleb Budman, CEO, Backblaze will discuss Learnings from Nearly a Decade of Building Low-cost Cloud Storage. He will cover the design of the storage hardware, the cloud storage file system software, and the operations processes that currently store over 150 petabytes of data and take in 5 petabytes more every month.

    You could wait and register onsite at the Hyatt, but why? If you need more reasons to attend, check out SNIA on Storage previous blog entries on File Systems, Cloud, Management, New Thinking, Disruptive Technologies, and Security sessions at SDC. See the full agenda and register now for SDC at http://www.storagedeveloper.org.


    Security is Strategic to Storage Developers – and a Prime Focus at SDC and SNIA Data Storage Security Summit

    September 16th, 2015

    Posted by Marty Foltyn

    Security is critical in the storage development process – and a prime focus of sessions at the SNIA Storage Developer Conference AND the co-located SNIA Data Storage Security Summit on Thursday September 24. Admission to the Summit is complimentary – register here at http://www.snia.org/dss-summit.

    The Summit agenda is packed with luminaries in the field of storage security, including keynotes from Eric Hibbard (SNIA Security Technical Work Group and Hitachi), Robert Thibadeau (Bright Plaza), Tony Cox (SNIA Storage Security Industry Forum and OASIS KMIP Technical Committee), Suzanne Widup (Verizon), Justin Corlett (Cryptsoft), and Steven Teppler (TimeCertain); and afternoon breakouts from Radia Perlman (EMC); Liz Townsend (Townsend Security); Bob Guimarin (Fornetix); and David Siles (Data Gravity). Roundtables will discuss current issues and future trends in storage security. Don’t miss this exciting event!

    SDC’s “Security” sessions highlight security issues and strategies for mobile, cloud, user identity, attack prevention, key management, and encryption. Preview sessions here, and click on the title to find more details.

    Geoff Gentry, Regional Director, Independent Security Evaluators, will present Hackers, Attack Anatomy and Security Trends.

    David Slik, Technical Director, Object Storage, NetApp will discuss Mobile and Secure: Cloud Encrypted Objects Using CDMI, introducing the Cloud Encrypted Object Extension to the CDMI standard, which permits encrypted objects to be stored, retrieved, and transferred between clouds.

    Dean Hildebrand, IBM Master Inventor and Manager, Cloud Storage Software, and Sasikanth Eda, Software Engineer, IBM will present OpenStack Swift On File: User Identity For Cross Protocol Access Demystified. This session will detail the various issues and nuances associated with having common ID management across Swift object access and file access, and present an approach to solving them without changes to core Swift code by leveraging the powerful Swift middleware framework.

    Tim Hudson, CTO and Technical Director, Cryptsoft will discuss Multi-Vendor Key Management with KMIP, offering practical experience from implementing the OASIS Key Management Interoperability Protocol (KMIP) and from deploying and interoperability testing multiple vendor implementations of KMIP.

    Nathaniel McCallum, Senior Software Engineer, Red Hat will present Network Bound Encryption for Data-at-Rest Protection, describing Petera, an open source project which implements a new technique for binding encryption keys to a network.

    Finally, check out SNIA on Storage previous blog entries on File Systems, Cloud, Management, New Thinking, and Disruptive Technologies. See the agenda and register now for SDC at http://www.storagedeveloper.org.


    SNIA’s Events Strategy Today and Tomorrow

    December 5th, 2013

    David Dale, SNIA Chairman

    Last month Computerworld/IDG and the SNIA posted a notice to the SNW website stating that they have decided to conclude the production of SNW.  The contract was expiring and both parties declined to renew.  The IT industry has changed significantly in the 15 years since SNW was first launched, and both parties felt that their individual interests would be best served by pursuing separate events strategies.

    For the SNIA, events are a strategically important vehicle for fulfilling its mission of developing standards, maintaining an active ecosystem of storage industry experts, and providing vendor-neutral educational materials to enable IT professionals to deal with and derive value from constant technology change.  To address the first two categories, SNIA has a strong track record of producing Technical Symposia throughout the year, and the successful Storage Developer Conference in September.

    To address the third category, IT professionals, SNIA has announced a new event, to be held in Santa Clara, CA, from April 22-24 – the Data Storage Innovation Conference. This event is targeted at IT decision-makers, technology implementers, and those expected to influence, implement and support data storage innovation as actual production solutions.  See the press release and call for presentations for more information.  We are excited to embark on developing this contemporary new event into an industry mainstay in the coming years.

    Outside of the USA, events are also critically important vehicles for the autonomous SNIA Regional Affiliates to fulfill their mission.  The audience there is typically more biased towards business executives and IT managers, and over the years their events have evolved to incorporate adjacent technology areas, new developments and regional requirements.

    As an example of this evolution, SNIA Europe’s events partner, Angel Business Communications, recently announced that its very successful October event, SNW Europe/Datacenter Technologies/Virtualization World, will be simply known as Powering the Cloud starting in 2014, in order to unite the conference program and to be more clearly relevant to today’s IT industry. See the press release for more details.

    Other Regional Affiliates have followed a similar path with events such as Implementing Information Infrastructure Summit and Information Infrastructure Conference – both tailored to meet regional needs.

    The bottom line on this is that the SNIA is absolutely committed to a global events strategy to enable it to carry out its mission.  We are excited about the evolution of our various events to meet the changing needs of the market and continue to deliver unique vendor-neutral content. IT professionals, partners, vendors and their customers around the globe can continue to rely on SNIA events to inform them about new technologies and developments and help them navigate the rapidly changing world of IT.


    Trends in Data Protection

    July 29th, 2011

    Data protection hasn’t changed much in a long time.  Sure, there are slews of product announcements and incessant declarations of the “next big thing”, but how much have market shares really changed over the past decade?  You’ve got to wonder if new technology is fundamentally improving how data is protected or simply turning the crank to the next model year.  Are customers locked into the incremental changes proffered by traditional backup vendors, or is there a better way?

    Not going to take it anymore

    The major force driving change in the industry has little to do with technology.  People have started to challenge the notion that they, not the computing system, should be responsible for ensuring the integrity of their data.  If they want a prior version of their data, why can’t the system simply provide it?   In essence, customers want to rely on a computing system that just works.  The Howard Beale anchorman in the movie Network personifies the anxiety that burdens customers explicitly managing backups, recoveries, and disaster recovery.  Now don’t get me wrong; it is critical to minimize risk and manage expectations.   But the focus should be on delivering data protection solutions that can simply be ignored.

    Are you just happy to see me?

    The personal computer user is prone to ask “how hard can it be to copy data?”  Ignoring the fact that many such users lose data on a regular basis because they have failed to protect their data at all, the IT professional is well aware of the intricacies of application consistency, the constraints of backup windows, the demands of service levels and scale, and the efficiencies demanded by affordability.    You can be sure that application users that have recovered lost or corrupted data are relieved.  Mae West, posing as a backup administrator, might have said “Is that a LUN in your pocket or are you just happy to see me?”

    In the beginning

    Knowing where the industry has been is a good step in knowing where the industry is going.  When the mainframe was young, application developers carried paper tape or punch cards.  Magnetic tape was used to store application data, as well as serving as a medium to copy it to.  Over time, as magnetic disk became affordable for primary data, the economics of magnetic tape remained compelling as a backup medium.  Data protection was incorporated into the operating system through backup/recovery facilities, as well as through 3rd party products.

    As microprocessors led computing mainstream, non-mainframe computing systems gained prominence and tape became relegated to secondary storage.  Native, open source, and commercial backup and recovery utilities stored backup and archive copies on tape media and leveraged its portability to implement disaster recovery plans.  Data compression increased the effective capacity of tape media and complemented its power consumption efficiency.

    All quiet on the western front

    Backup to tape became the dominant methodology for protecting application data due to its affordability and portability.  Tape was used as the backup media for application and server utilities, storage system tools, and backup applications.

    B2T: Backup Server copies data from primary disk storage to tape media

    Customers like the certainty of knowing where their backup copies are, and physical tapes are comforting in this respect.  However, the sequential access nature of the media and indirect visibility into what’s on each tape led to difficulties satisfying recovery time objectives.  Like the soldier who fights battles that seem to have little overall significance, the backup administrator slogs through a routine, hoping the company’s valuable data is really protected.

    B2D phase 1: Backup Server copies data to a Virtual Tape Library

    Uncomfortable with problematic recovery from tape, customers have been evolving their practices to a backup-to-disk model.  Backup to disk and then to tape was one model designed to offset the higher cost of disk media, though it can increase the uncertainty of what’s on tape.  Another was to use virtual tape libraries to gain the direct access benefits of disk while minimizing changes to current tape-based backup practices.  Both of these techniques helped improve recovery time but still required the backup administrator to acquire, use, and maintain a separate backup server to copy the data to the backup media.

    Snap out of it!

    Space-efficient snapshots offered an alternative data protection solution for some file servers. Rather than use separate media to store copies of data, the primary storage system itself would be used to maintain multiple versions of the data by only saving changes to the data.  As long as the storage system was intact, restoration of prior versions was rapid and easy.  Versions could also be replicated between two storage systems to protect the data should one of the file servers become inaccessible.

    Snapshot: Point-in-time copies on disk storage are replicated to other disks

    This procedure works, is fast, and is space efficient for data on these file servers but has challenges in terms of management and scale.  Snapshot based approaches manage versions of snapshots; they lack the ability to manage data protection at the file level.  This limitation arises because the customer’s data protection policies may not match the storage system policies.  Snapshot based approaches are also constrained by the scope of each storage system so scaling to protect all the data in a company (e.g., laptops) in a uniform and centralized (labor-efficient) manner is problematic at best.
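
    For intuition about why snapshots are space efficient, here is a toy copy-on-write volume in Python; each snapshot starts empty and only accumulates the prior contents of blocks overwritten after it was taken (a sketch of the general idea, not any vendor’s design):

        class SnapshotVolume:
            def __init__(self):
                self.live = {}        # block number -> current data
                self.snapshots = []   # each: preserved pre-overwrite blocks

            def snapshot(self):
                self.snapshots.append({})  # costs nothing until writes occur

            def write(self, block, data):
                # Preserve old contents the first time a block changes
                # after the most recent snapshot
                if self.snapshots and block not in self.snapshots[-1]:
                    self.snapshots[-1][block] = self.live.get(block)
                self.live[block] = data

            def read_at(self, snap_index, block):
                # The first preserved copy at or after snap_index is the
                # block's value when that snapshot was taken; otherwise the
                # block hasn't changed since, so read the live copy
                for snap in self.snapshots[snap_index:]:
                    if block in snap:
                        return snap[block]
                return self.live.get(block)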

    CDP: Writes are captured and replicated for protection

    Continuous Data Protection (both “near CDP” solutions, which take frequent snapshots, and “true CDP” solutions, which continuously capture writes) is also being used to eliminate the backup window, thereby ensuring large volumes of data can be protected.  However, the expense and maturity of CDP need to be balanced against the value of “keeping everything”.
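
    A “true CDP” system can be pictured as a timestamped write journal from which any instant can be reconstructed; the toy sketch below illustrates the idea, and hints at the cost of “keeping everything”, since the journal grows with every write:

        import time

        class CdpJournal:
            def __init__(self):
                self.log = []                    # (timestamp, block, data)

            def write(self, block, data):
                self.log.append((time.time(), block, data))

            def as_of(self, t):
                """Materialize the volume exactly as it stood at time t."""
                state = {}
                for ts, block, data in self.log:   # log is in time order
                    if ts > t:
                        break
                    state[block] = data
                return state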


    An offer he can’t refuse

    Data deduplication fundamentally changed the affordability of using disk as a backup medium.  The effective cost of storing data declined because duplicate data need only be stored once.  Coupled with the ability to rapidly access individual objects, the advantages of backing up data to deduplicated storage are overwhelmingly compelling.  Originally, the choice of whether to deduplicate data at the source or the target was a decision point, but more recent offerings support both approaches, so customers need not compromise on technology.  However, simply using deduplicated storage as a backup target does not remove the complexity of configuring and supporting a data protection solution that spans independent software and hardware products.  Is it really necessary that additional backup servers be installed to support business growth?  Is it too much to ask for a turnkey solution that can address the needs of a large enterprise?
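
    The core mechanism is simple enough to sketch: split data into chunks, address each chunk by its content hash, and store any given chunk only once. The toy Python version below shows the idea; real products add variable-size chunking, compression, and indexes that scale far beyond an in-memory dict:

        import hashlib

        class DedupStore:
            def __init__(self, chunk_size=4096):
                self.chunk_size = chunk_size
                self.chunks = {}                 # sha256 digest -> bytes

            def put(self, data):
                """Store data; return its recipe (list of chunk digests)."""
                recipe = []
                for i in range(0, len(data), self.chunk_size):
                    chunk = data[i:i + self.chunk_size]
                    digest = hashlib.sha256(chunk).hexdigest()
                    self.chunks.setdefault(digest, chunk)  # kept only once
                    recipe.append(digest)
                return recipe

            def get(self, recipe):
                return b"".join(self.chunks[d] for d in recipe)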

    The stuff that dreams are made of

    PBBA: Transformation from a Backup Appliance to a Recovery Platform

    Protection storage offers an end-to-end solution, integrating full-function data protection capabilities with deduplicated storage.  The simplicity and efficiency of application-centric data protection combined with the scale and performance of capacity-optimized storage systems stands to fundamentally alter the traditional backup market.  Changed data is copied directly between the source and the target, without intervening backup servers.  Cloud storage may also be used as a cost-effective target.  Leveraging integrated software and hardware for what each does best allows vendors to offer innovations to customers in a manner that lowers their total cost of ownership.  Innovations like automatic configuration, dynamic optimization, and using preferred management interfaces (e.g., virtualization consoles, pod managers) build on the proven practices of the past to integrate data protection into the customer’s information infrastructure.

    No one wants to be locked into products because they are too painful to switch out; it’s time that products are “sticky” because they offer compelling solutions.  IDC projects that the worldwide purpose-built backup appliance (PBBA) market will grow at a 16.6% compound annual growth rate, from $1.7 billion in 2010 to $3.6 billion by 2015.  The industry is rapidly adopting PBBAs to overcome the data protection challenges associated with data growth.  Looking forward, storage systems will be expected to incorporate a recovery platform, supporting security and compliance obligations, and data protection solutions will become information brokers for what is stored on disk.