
    NVIDIA Expands DRIVE Hyperion Ecosystem With 11 Global Partners for Level 4 Autonomous Vehicles


    NVIDIA announced at CES 2025 that its DRIVE Hyperion autonomous vehicle platform is expanding to include 11 new global partners, including Bosch, Magna, Sony, and ZF Group. The platform now features dual DRIVE AGX Thor systems-on-chip built on Blackwell architecture, delivering over 2,000 FP4 teraflops of real-time compute for Level 4 autonomy. Mercedes-Benz, JLR, and Volvo Cars have adopted the platform, which passed safety certifications from TÜV SÜD and TÜV Rheinland.

    What’s New

    NVIDIA’s DRIVE Hyperion ecosystem adds Tier 1 suppliers, automotive integrators, and sensor partners: Aeva, AUMOVIO, Astemo, Arbe, Bosch, Hesai, Magna, Omnivision, Quanta, Sony, and ZF Group. These companies are building DRIVE Hyperion-based electronic control units or qualifying sensor suites on the production-ready architecture.

    The platform uses two NVIDIA DRIVE AGX Thor SoCs that deliver roughly 1,000 INT8 trillion operations per second for real-time sensor fusion across 360-degree views. This compute power enables transformer-based perception, vision language action models, and generative AI workloads to process complex driving scenarios in real time.

    NVIDIA also released Alpamayo, a family of AI models optimized for the DRIVE Hyperion platform. Alpamayo 1 is a 10 billion-parameter chain-of-thought vision language action model that allows autonomous vehicles to reason through rare edge cases like traffic light outages.

    The latest DRIVE Hyperion iteration will be available in the first half of 2025 for both passenger and commercial vehicles, including long-haul freight.

    Why It Matters

    Level 4 autonomy means vehicles can handle all driving tasks without human intervention in specific conditions. NVIDIA’s unified platform reduces development time and testing costs by ensuring hardware compatibility across the ecosystem.

    The dual Thor SoC configuration marks a significant compute leap for production autonomous systems. By comparison, earlier platforms required custom integration for each sensor and compute configuration, delaying time-to-market by months or years.

    Commercial trucking adoption signals that full self-driving is moving beyond robotaxis into freight logistics, where driver shortages and operational costs make autonomy economically viable.

    Safety and Certification

    DRIVE Hyperion passed assessments from TÜV SÜD and TÜV Rheinland, making it the first platform to receive comprehensive third-party automotive safety and cybersecurity validation. The platform uses NVIDIA Halos, a safety framework spanning data center to vehicle.

    Halos provides tools for independent inspection, system validation, and certification aligned with global automotive standards. Combined with large-scale simulation and AI data factory workflows, Halos enables continuous testing across millions of virtual and real-world scenarios.
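The continuous-testing idea behind Halos can be illustrated with a toy loop: sample many synthetic driving scenarios and check the stack's response against a safety rule. Everything below is a hypothetical sketch for illustration — `perception_stack`, the stopping-distance rule, and the scenario parameters are invented for this example and are not part of NVIDIA's actual tooling.

```python
import random

# Hypothetical sketch of scenario-based validation: sample synthetic driving
# scenarios and check the stack's decision against a simple safety rule.
random.seed(0)

def perception_stack(scenario):
    # Stand-in for a real driving stack: brake whenever a pedestrian is
    # closer than the (simplified) stopping distance for the current speed.
    stopping_distance_m = scenario["speed_kph"] * 0.5  # crude rule of thumb
    if scenario["pedestrian_distance_m"] < stopping_distance_m:
        return "brake"
    return "cruise"

def sample_scenario():
    # Randomized scenario generator; a real pipeline would draw from
    # recorded logs and world-model simulations instead.
    return {"speed_kph": random.randint(20, 120),
            "pedestrian_distance_m": random.uniform(1, 100)}

results = [perception_stack(sample_scenario()) for _ in range(10_000)]
print(f"brake decisions: {results.count('brake')} / {len(results)}")
```

Scaling this pattern to millions of virtual and real-world scenarios, with certification-grade logging of every decision, is the kind of workflow the Halos framework is described as supporting.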

    NVIDIA CEO Jensen Huang stated the platform delivers “unmatched functional safety and AI” for autonomous machines that rely on physical AI world foundation models.

    Platform Specifications

    Component       Specification
    ---------       -------------
    Compute         Dual DRIVE AGX Thor SoCs
    Architecture    NVIDIA Blackwell
    Performance     2,000+ FP4 teraflops / 1,000 INT8 TOPS
    Autonomy level  Level 4-ready
    OS              NVIDIA DriveOS
    Availability    H1 2025

    What’s Next

    Automotive partners will begin integrating DRIVE Hyperion-based systems into production vehicles throughout 2025. The modular design allows automakers to customize software and service layers while using a common compute and sensor foundation.

    NVIDIA is releasing an open dataset with over 1,700 hours of driving data and AlpaSim, an open-source simulation framework on GitHub for validating autonomous systems. These tools will help developers train and test Alpamayo-based applications using real and synthetic data generated through NVIDIA Cosmos world models.

    Aurora and Continental plan to deliver driverless trucks at scale by 2027 using NVIDIA’s platform. Additional partnerships with Uber, Wayve, and Waabi were announced at CES, though deployment timelines remain unconfirmed.

    Frequently Asked Questions

    What is NVIDIA DRIVE Hyperion platform?

    DRIVE Hyperion is an end-to-end autonomous vehicle platform combining the DRIVE AGX system-on-chip, the DriveOS operating system, a sensor-suite reference architecture, and a Level 2+ to Level 4 driving stack. It provides modular, production-ready components for building self-driving passenger and commercial vehicles.

    Which automakers are using DRIVE Hyperion?

    Mercedes-Benz, Jaguar Land Rover (JLR), and Volvo Cars have adopted DRIVE Hyperion. Toyota is integrating DRIVE AGX and DriveOS into future models, while Aurora and Continental are using it for autonomous freight trucks.

    How much computing power does DRIVE Hyperion have?

    The platform delivers over 2,000 FP4 teraflops, equivalent to approximately 1,000 INT8 trillion operations per second. This is powered by dual NVIDIA DRIVE AGX Thor SoCs built on Blackwell architecture.
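The two figures are consistent: on Blackwell-class tensor cores, FP4 throughput is roughly double INT8 throughput, so the quoted numbers are two views of the same hardware. The arithmetic, assuming that 2:1 ratio:

```python
# The article quotes both figures for the dual DRIVE AGX Thor configuration;
# assuming FP4 runs at twice the INT8 rate (typical of Blackwell-class
# tensor cores), one follows from the other.
fp4_tflops = 2000            # 2,000+ FP4 teraflops (from the article)
int8_tops = fp4_tflops / 2   # INT8 ops run at half the FP4 rate
print(int8_tops)             # 1000.0
```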

    What is Level 4 autonomy?

    Level 4 autonomy enables vehicles to handle all driving tasks without human intervention in defined conditions or geographic areas. Unlike Level 2/3 systems requiring driver supervision, Level 4 vehicles operate independently within their operational design domain.
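The "operational design domain" gate can be sketched in a few lines: a Level 4 stack engages only when current conditions fall inside its validated domain. The ODD contents below (geofence name, weather set, speed cap) are invented for illustration and do not reflect any real deployment.

```python
from dataclasses import dataclass

# Hypothetical ODD gate: engage Level 4 driving only inside the
# validated operational design domain. All values are illustrative.
@dataclass
class Conditions:
    region: str
    weather: str
    speed_limit_kph: int

ODD = {
    "regions": {"downtown_geofence"},     # geofenced areas the stack is validated for
    "weather": {"clear", "rain"},         # conditions covered by validation
    "max_speed_kph": 80,                  # speed cap of the validated domain
}

def can_engage_l4(c: Conditions) -> bool:
    return (c.region in ODD["regions"]
            and c.weather in ODD["weather"]
            and c.speed_limit_kph <= ODD["max_speed_kph"])

print(can_engage_l4(Conditions("downtown_geofence", "rain", 60)))  # True
print(can_engage_l4(Conditions("rural_highway", "snow", 100)))     # False
```

Outside the ODD, a Level 4 vehicle must reach a safe state on its own rather than hand control back to a driver, which is the key distinction from Level 3.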

    Mohammad Kashif
    Senior Technology Analyst and Writer at AdwaitX, specializing in the convergence of Mobile Silicon, Generative AI, and Consumer Hardware. Moving beyond spec sheets, his reviews rigorously test "real-world" metrics analyzing sustained battery efficiency, camera sensor behavior, and long-term software support lifecycles. Kashif’s data-driven approach helps enthusiasts and professionals distinguish between genuine innovation and marketing hype, ensuring they invest in devices that offer lasting value.
