

The CogX blog

Thought leadership on the most pressing issues of our time

Guest contributor: AI & DeepTech

Hitachi Vantara



Hitachi Vantara's CTO for Artificial Intelligence, Jason Hardy, is a leading voice in shaping the future of AI and large data processing. In our latest Q&A, he dives deep into the cutting-edge advancements and strategic plans propelling Hitachi Vantara to the forefront of this space. Leveraging his vast experience in data-driven solutions and digital transformation, Jason offers insightful perspectives on the challenges and possibilities that lie ahead.



Solving the AI Data Challenge: How Hitachi Vantara's Technology is Enabling the Next Generation of AI


“Solving the [AI] data challenge is at the core of who Hitachi Vantara is.”


Data is essential in the age of AI — after all, it is the lifeblood of complex large language models (LLMs). However, as anyone working in the field knows, AI is demanding and insatiably hungry for information.


Unleashing AI's true power requires the seamless integration of data into AI workloads. Hitachi Vantara, a leader in AI data infrastructure, is tackling this challenge for enterprises.


To peel back the curtain on what's coming next, we spoke with Jason, the CTO for Artificial Intelligence at Hitachi Vantara. From their involvement in the awe-inspiring Las Vegas Sphere project to the future of data processing and its impact across healthcare, finance, sustainability and beyond, Jason reveals how they are solving the AI data challenge for various industries.



1. What was Hitachi Vantara’s role in the Sphere project? What did you hope to achieve?


Hitachi Vantara was inspired by Sphere’s vision for a new style of immersive entertainment at a scale and quality unlike anything else. Our goal was to help make sure that the technology behind it was ready to meet the challenge. With the world’s highest-resolution LED screen inside the venue and the largest LED screen on the exterior, Sphere leaned on Hitachi Vantara to deliver solutions to stream video content at an unprecedented scale.


Hitachi Vantara worked in tandem with Sphere’s engineering teams and long-time integrator, 7thSense, to evolve the high-speed storage capabilities provided by Hitachi Content Software for File (HCSF) to predictably and reliably meet the extremely demanding performance requirements. For Sphere’s original immersive film, Darren Aronofsky’s Postcard From Earth, the system handles over 400 gigabytes per second of throughput at sub-5-millisecond latency, with 12-bit color at 4:4:4 chroma subsampling. The HCSF system consists of 27 nodes with 4 PB of flash storage for playback within Sphere; content is streamed in real time to 7thSense media servers, each streaming 4K video at 60 frames per second, a world first in terms of technology capability at this scale.
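As a rough sanity check on those figures (our own back-of-envelope estimate, not Hitachi's or Sphere's published methodology), the bandwidth of a single uncompressed 4K stream at 60 fps with 12-bit 4:4:4 color can be worked out from first principles:

```python
# Back-of-envelope estimate of uncompressed 4K stream bandwidth.
# All figures are illustrative assumptions, not Sphere's actual pipeline specs.

WIDTH, HEIGHT = 3840, 2160    # 4K UHD resolution
CHANNELS = 3                  # 4:4:4 sampling: full resolution on every channel
BITS_PER_CHANNEL = 12         # 12-bit color
FPS = 60

bytes_per_frame = WIDTH * HEIGHT * CHANNELS * BITS_PER_CHANNEL / 8
bytes_per_second = bytes_per_frame * FPS

print(f"Per frame:  {bytes_per_frame / 1e6:.1f} MB")
print(f"Per stream: {bytes_per_second / 1e9:.2f} GB/s")

# At the quoted 400 GB/s aggregate, that would correspond to roughly:
streams = 400e9 / bytes_per_second
print(f"~{streams:.0f} uncompressed streams")
```

On these assumptions each stream needs roughly 2.2 GB/s, so the quoted 400 GB/s aggregate would feed on the order of 180 such uncompressed streams, which gives a sense of why this scale of deployment was a world first.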


This mission-critical requirement demanded nothing but success, and I am proud to say that our collaboration more than delivered. Achieving a smooth video playback experience at this scale demanded coordination across every part of the pipeline. HCSF, being integral to this experience, ensured that 7thSense’s software could deliver to the screen consistently.




2. How did Hitachi Vantara handle and process the enormous amount of data required for this project? What technologies were crucial in achieving this?


Hitachi Content Software for File is the same technology that other Hitachi Vantara customers use for massive AI and Generative AI requirements – including foundation model training – as well as large-scale data lakehouse requirements, high-speed data analytics, genome research, and other mission-critical workloads that operate at massive scale. We determined that HCSF would be capable of managing the scale and raw performance requirements of this project because of its history of success in other critical environments. The challenges Sphere brought included the density requirements for a system of this scale and the latency-sensitive environment we were working in. Serving Sphere’s video streaming requirements took some very creative engineering to ensure the delivery was consistent and the performance and latency always stayed within tolerance.



3. Beyond Sphere, where do you see this technology being applied, and how do you anticipate these technologies evolving? Will generative AI play a role in this progress?


Other customers in the media and entertainment market will very much benefit from the improvements we have been able to incorporate into our technology. However, the reach goes beyond the creative world. We are already seeing a large amount of interest in, and adoption of, this core technology in the AI and GenAI markets, including among some customers who are building foundation models.

 

This technology is already benefiting customers who are re-engineering their data lake or data lakehouse platforms for analytics, as well as customers doing research in healthcare and the life sciences. Hitachi Vantara focuses on multiple disciplines, including financial services and insurance, healthcare and life sciences, and manufacturing, among others. This technology benefits each of these markets in foundational ways, especially as customers try to innovate.

 

Every customer who is thinking about what data means to them, and how best to exploit it for the next wave of innovation, will require a solution like HCSF. Innovation requires the ability to make split-second decisions, and this is what AI and GenAI help address. Since data is the fuel for AI, the ability to consolidate large data sets and integrate them into AI and GenAI workloads is critical to success. Without the proper data foundation, customers looking to adopt AI and GenAI into their ecosystems will fail until they have solved this data challenge. Solving the data challenge is at the core of who Hitachi Vantara is. It’s not just about providing the fastest solution. It’s also about understanding the entirety of the ecosystem that wraps around the outcome, ensuring that investments can be justified and improvements can be realized.



4. Hitachi Vantara is focused on creating better technological experiences responsibly. How do you ensure environmental sustainability in your technology?


Sustainability is at the core of everything Hitachi Vantara – and our parent organization, Hitachi, Ltd. – does. Sustainability is not just a concern for large-scale customers, but also for the many customers who operate at smaller scale. We ensure that everything we do tracks back to answering the question: “Have we done this the right way?”

 

For example, we are all aware that GPUs consume a tremendous amount of power. That power is what allows them to process such substantial amounts of information and drive the AI and GenAI outcomes that are changing the way we all operate. It is our responsibility, as a creator of technology, to ensure that what we engineer delivers as much improvement to the GPU as possible while remaining as sustainable as possible. We cannot directly reduce a GPU’s power requirements, but we can support the ecosystem around the GPU, ensuring that everything we touch is as efficient as possible. For example, HCSF can help GPU-driven workloads run up to 20x faster. This mindset goes beyond the goal of being fast and efficient; it requires using technologies to improve the economics that help these solutions scale.
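The point about supporting the ecosystem around the GPU can be illustrated with a toy utilization model (ours, purely illustrative, with hypothetical timings): if a training or inference step overlaps compute with data loading, the step time is bounded by the slower of the two, so a faster storage tier directly raises effective GPU utilization:

```python
# Toy model (our illustration, not a Hitachi benchmark): with compute and
# data loading fully overlapped, step time = max(compute, io), so the GPU
# sits idle whenever the storage tier cannot keep pace.

def gpu_utilization(compute_s: float, io_s: float) -> float:
    """Fraction of each step the GPU spends computing."""
    step_time = max(compute_s, io_s)
    return compute_s / step_time

# Hypothetical numbers: a 10 ms compute step fed by slow vs. fast storage.
slow = gpu_utilization(compute_s=0.010, io_s=0.050)   # I/O-bound step
fast = gpu_utilization(compute_s=0.010, io_s=0.002)   # compute-bound step

print(f"Slow storage: {slow:.0%} GPU utilization")
print(f"Fast storage: {fast:.0%} GPU utilization")
```

In this sketch the same GPU goes from roughly 20% busy to fully busy simply because the data pipeline keeps up, which is also why feeding GPUs efficiently is a sustainability lever: the same work completes with far less idle power draw.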

 

It’s also about the other technologies we create, like our unified primary storage platform, Virtual Storage Platform One, which supports mission-critical workloads such as core banking systems, database workloads, virtualization, and many of the other core systems that make up IT organizations. By taking a holistic approach across our entire portfolio, we ensure that we are constantly putting our best foot forward and always working towards improvement. This is a core piece of our DNA as a company.



5. How do you see the intersection of large-scale data processing and high-resolution technology shaping the future of digital immersive experiences?


Not everyone can create an end-to-end, large-scale media experience. That takes a tremendous amount of effort, skill, and tenacity. However, technology is already starting to show that immersive digital experiences are becoming more readily available through new forms of consumption, combined with the imagination of creators. Creating content that continues to push the envelope will be the challenge. This is where Generative AI will start to make a monumental difference in what is available: not just from a content perspective, but by giving more people access to these abilities. What was once reserved for the largest post-production or animation houses is now making its way to independent and small-scale creators. Combined with virtual reality, augmented reality, the Metaverse, and other new ways to consume content beyond live and in-person venues, the very definition of digital immersion will evolve to be more accessible and more inclusive.

 

Nothing will beat seeing something live, especially at a venue like Sphere in Las Vegas. The roar of the crowd, the jaw-dropping immersion of the screen, and the mind-blowing audio all blend to create an experience that cannot be duplicated. However, technology is evolving to bring this type of experience closer to our homes. As new artists and creators start to dream up what’s next, the tools that help create and generate content will evolve to become part of the everyday creative pipeline.



 

Author:


As Chief Technology Officer for Artificial Intelligence, Jason Hardy is responsible for the creation and curation of Hitachi Vantara’s AI strategy and portfolio. He is defining the future and strategic direction of Hitachi iQ, the company’s AI platform, and cultivating trust and credibility across the market by fostering strong working relationships with customers and partners and leading public-facing events. Jason represents the company externally by communicating the company’s vision and value proposition for AI and by collaborating with key partners to develop comprehensive go-to-market strategies.


Prior to his current role, Jason served as the CTO for Hitachi Vantara’s Data Intelligence portfolio. He brings more than 20 years of experience tackling complex data-driven problems for the world’s largest organizations, addressing critical areas such as digital transformation, object storage technology, artificial intelligence, and high-performance data analytics.


Jason's deep expertise extends beyond technology. Drawing on his experience consulting with hundreds of global clients on data strategy, he incorporates a strong customer-centric perspective into his work. He frequently speaks at industry events, sharing valuable insights and best practices on leveraging emerging technologies to achieve optimal business outcomes.



 

Did you enjoy this post? Then you’ll love our weekly briefings on AI & Deeptech. Check out some previous editions here, or just cut straight to the chase and subscribe to our newsletters exploring AI, net zero, investing, cinema, and deeptech.

 
