Category: big data
Take the 2017 Archive Requirements Survey!
by Samuel A. Fineberg, Co-chair, SNIA LTR TWG
Ten years ago, a SNIA Task Force undertook the 100 Year Archive Requirements Survey, with the goal of determining requirements for long-term digital retention in the data center. The Task Force hypothesized that the practitioner respondents' experience with terabyte-sized archive systems would be adequate to define the business and operational requirements for petabyte-sized information repositories in the data center.
Cast Your Vote on November 8 for the Magic and Mystery of In-Memory Apps!
It’s an easy “Yes” vote for this great webcast from the SNIA Solid State Storage Initiative on the Magic and Mystery of In-Memory Apps! Join us on Election Day – November 8 – at 1:00 pm ET/10:00 am PT to learn about today’s market and the disruptions that happen when big data (petabytes) meets in-memory/real-time requirements. You’ll understand the interactions among Hadoop/Spark, Tachyon, SAP HANA, and NoSQL, as well as the related infrastructure of DRAM, NAND, 3D XPoint, NVDIMMs, and high-speed networking – and learn what happens to infrastructure design and operations when “tiered memory” replaces “tiered storage.”
Presenter Shaun Walsh of G2M Communications is an expert in memory technology – and a great speaker! He’ll share what you need to know about evaluating, planning, and implementing in-memory computing applications, and give you a framework to evaluate and plan for your adoption of in-memory computing.
Register at: https://www.brighttalk.com/webcast/663/230103