Read what tech experts and influencers have to say about getting the most value from your storage architecture so it can scale and adapt to evolving workloads.

Artificial intelligence (AI) technologies and hybrid/multi-cloud trends are putting pressure on organizations to optimize their storage strategies to ensure data availability while enabling scalability and efficiency. Generative AI (genAI) applications, for example, have further accelerated data creation, which in turn increases the need for storage that is efficient, available, and cost-effective. Optimizing all that data, whether in the cloud or in enterprise data centers, depends largely on tiered data storage, which uses a mix of hard disk drives (HDDs), solid state drives (SSDs), and the ever-persistent archival tape.

“Different applications and data have varying requirements around access frequency, speed, and cost-effectiveness,” says Brad Warbiany, director of planning and strategy at Western Digital. “As AI datasets, checkpoints, and results grow in size and volume, high-capacity HDDs are the only cost-efficient bulk storage solution for cold and warm data, with an essential role alongside cloud-optimized technologies for modern AI and data-centric workloads.”

IT and business decision-makers, as well as technologists and influencers from our 娇色导航Experts Network, echoed this strategy when we asked: How can organizations address the biggest challenges in scaling storage infrastructure while balancing cost efficiency, sustainability, and long-term total cost of ownership (TCO)?

The agility angle

“As data volume, driven by AI, continues to increase, organizations must leverage data life cycle policies and auto-tiering to optimize storage capacity and control costs, ensuring data is dynamically moved to lower-cost tiers as it becomes less active,” says Hasmukh Ranjan, senior vice president and 娇色导航at AMD.
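Ranjan’s point about life cycle policies and auto-tiering can be sketched as a simple age-based policy. The tier names and age thresholds below are illustrative assumptions, not any vendor’s defaults; real platforms typically combine access age with access frequency and business rules.

```python
from datetime import datetime, timedelta

# Illustrative tier thresholds (assumptions, not vendor defaults):
# hot (SSD) for recently accessed data, warm/cold (HDD) as access cools,
# archive (tape or archive-class object storage) for long-dormant data.
TIER_POLICY = [
    (timedelta(days=30), "hot-ssd"),
    (timedelta(days=180), "warm-hdd"),
    (timedelta(days=730), "cold-hdd"),
]

def assign_tier(last_accessed: datetime, now: datetime) -> str:
    """Pick the first tier whose age window still covers this object."""
    age = now - last_accessed
    for max_age, tier in TIER_POLICY:
        if age <= max_age:
            return tier
    return "archive-tape"  # fallback for long-dormant data

now = datetime(2025, 1, 1)
print(assign_tier(datetime(2024, 12, 20), now))  # recently used -> hot-ssd
print(assign_tier(datetime(2022, 1, 1), now))    # dormant -> archive-tape
```

A scheduled job running a policy like this over object metadata is the essence of auto-tiering: data migrates to cheaper HDD and archive tiers automatically as it cools, rather than by manual intervention.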
Other experts agree that with AI rapidly evolving, organizations need flexibility and adaptability to meet future needs.

“Implementing agile, high-performance storage platforms is crucial for handling the dynamic and ever-expanding nature of AI workloads,” says Chris Selland, independent consultant, analyst, and lecturer on entrepreneurship and innovation at Northeastern University’s D’Amore-McKim School of Business. Selland points out that incorporating tools such as tiered storage can optimize costs by aligning storage resources with evolving data requirements. Automated data life cycle policies can help ensure that “data is stored on the most appropriate storage tier based on its age, access requirements, and business value.”

While data center SSDs provide key advantages such as low latency, those advantages are not sufficient to justify a higher TCO for many applications, and SSDs can carry an acquisition cost up to six times greater than HDDs. Even during periods of significant SSD price drops, the TCO advantage of HDDs has blunted any major shift in data center market share.

Meeting goals for business value

According to experts, many organizations are striving to balance their storage needs with sustainability goals. “There needs to be a balance within companies to increase their storage demands that AI drives with that of staying energy efficient so as not to grow their organization’s carbon footprint,” says Scott Schober, president and CEO at Berkeley Varitronics Systems.
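The SSD-versus-HDD cost comparison discussed earlier can be turned into a rough per-terabyte TCO estimate. Every input below ($/TB acquisition prices, watts per TB, electricity rate, service life) is an illustrative assumption for the sake of the arithmetic, not benchmark or vendor data; a real TCO model would also include density, replacement rates, cooling, and rack space.

```python
# Rough per-TB TCO sketch over a multi-year service life.
# All inputs are illustrative assumptions, not vendor pricing.
def tco_per_tb(acquisition_per_tb: float, watts_per_tb: float,
               years: float = 5.0, usd_per_kwh: float = 0.10) -> float:
    """Acquisition cost plus energy cost over the service life."""
    hours = years * 365 * 24
    energy_cost = (watts_per_tb / 1000.0) * hours * usd_per_kwh
    return acquisition_per_tb + energy_cost

hdd = tco_per_tb(acquisition_per_tb=15.0, watts_per_tb=0.4)   # assumed HDD figures
ssd = tco_per_tb(acquisition_per_tb=90.0, watts_per_tb=0.25)  # ~6x acquisition cost
print(f"HDD: ${hdd:.2f}/TB   SSD: ${ssd:.2f}/TB")
```

Under these assumptions the energy savings of SSDs recover only a sliver of the acquisition-cost gap, which illustrates why large price drops alone have not closed the TCO difference for bulk capacity.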
“Balancing performance with sustainability requires a collaborative multi-generational team that can devote attention to your storage infrastructure,” says Will Kelly, a writer focused on AI and the cloud, “while also extending their focus to controlling data sprawl and optimizing your cloud storage tiers while cultivating an architecture that can scale and adapt as your AI workloads evolve.”

Then there’s the issue of assigning storage based on its value to the business, says Arsalan Khan, speaker, advisor, and blogger on business and digital transformations: “One of the biggest challenges is striking the right balance between collecting data for strategic, high-value use cases versus just accumulating data without a clear purpose. When scaling storage infrastructure, it’s critical to align these considerations with cost efficiency, sustainability, and long-term TCO.”

That reinforces the need to assign storage tiers based on the value of the data. Savvy administrators will prioritize TCO and HDDs for lower-performance cool/warm workloads, which make up the bulk of the data center environment, while strategically deploying SSDs for workloads that benefit from a performance advantage.

The rapid deployment of genAI technology can exacerbate these challenges for organizations whose storage infrastructure can’t keep up, experts say. “GenAI is extending the business value of cleaned data, including real-time transactional data, unstructured data used for training AI models, and long-term archived data required for compliance,” says Isaac Sacolick. “IT teams manage many data types in data warehouses, data lakes, cloud file systems, and SaaS — with different performance and compliance requirements.
The challenge for CIOs is defining and managing an agile storage infrastructure that scales easily, enables moving data depending on business need, meets security requirements, and has low-cost options to fulfill compliance requirements.”

Kumar Srivastava, CTO at Turing Labs, adds: “Rapid growth in data from R&D formulations demands agile, scalable storage solutions that support AI-driven analysis with data spanning multiple formats, structure, and quality. Ensuring low latency for data access while integrating modern tools with legacy systems is critical.”

Also, as with just about anything involving IT, enterprises are contending with the skills gap, which affects storage management. “Inexperience in allocating dynamic resources for complex AI models results in poor orchestration, a costly problem,” says Peter Nichol, data and analytics leader for North America at Nestlé Health Science. “This creates idle resources and encourages overprovisioned clusters, leading to waste. Cost leakage occurs more frequently than you might think.”

Consider the architecture

The intersection of AI and storage strategies necessitates a well-thought-out approach to storage architecture. It is critical to align appropriate storage types with the business outcomes organizations are seeking from AI. HDDs provide a significant and persistent TCO advantage, making them a preferred option for a dominant share of tiered storage architectures and a cost-effective path to those outcomes.

Learn even more about efficient scaling of the data center by reading “The Long-Term Case for HDD Storage.”