Astronauts face several risks during spaceflight, including exposure to radiation.
(Image credit: Keegan Barber)
As data centers push for higher data throughput and reduced latency, many are transitioning from hard disk drives (HDDs) in their servers to solid-state drives (SSDs). This shift aims to boost performance, increase efficiency and cut operating costs. But not all SSDs are built the same, making it key to select the right type for enterprise and other environments.
SSD classes are distinguished by the two main components: the flash storage controller and the non-volatile NAND flash memory used to store data.
In today’s market, SSD and NAND flash memory consumption are split into three main groups:
Choosing the right SSD for enterprise use involves more than simply replacing HDDs. SSDs come in various form factors (such as 2.5") and interfaces including Serial ATA (SATA), Serial Attached SCSI (SAS) and the newer NVMe PCIe which directly connects storage to the server’s CPU.
Despite their ease of deployment, not every SSD is suitable in the long term for enterprise workloads, and the cost of making the wrong – or cheapest – choice can lead to premature wear, inconsistent write performance and increased latency.
To guide the selection process, let’s explore the three key characteristics that separate enterprise-grade SSDs from their client-class counterparts: performance, reliability and endurance.
1. Performance
Enterprise SSDs are designed to provide sustained high-speed read and write operations for both sequential and random data requests from the CPU through multi-channel architecture and parallel access from the SSD’s controller to the NAND flash chips.
In environments handling complex workloads like real-time data analysis, CAD collaborations or global banking transactions, the storage devices must deliver low latency and simultaneous multi-client data access without any degradation in response time. User productivity is a direct result of low latency.
Client applications involve only single users or application access with a higher tolerable delta between the minimum and maximum response time (or latency) on any user or system actions.
SSDs used in complex storage arrays, such as Network Attached Storage, Direct Attached Storage or Storage Area Networks, can also be negatively impacted by mismatched performance, causing problems with storage array latency, the ability to sustain performance and, of course, the quality of service as perceived by users.
Unlike client SSDs, enterprise SSDs use multi-channel architectures and parallel NAND access to maintain peak and steady-state performance, ensuring consistent quality of service (QoS) even during traffic surges.
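To put that consistency claim in measurable terms, storage engineers look at tail latency rather than averages. The short Python sketch below uses synthetic latency figures, not measurements from any particular drive, to show how p99 and p99.9 percentiles expose the occasional stalls that an average would hide.

import random

random.seed(0)
# simulate 100,000 read latencies in microseconds: mostly fast, with occasional stalls
samples = [random.gauss(90, 10) for _ in range(100_000)]
samples += [random.gauss(2_000, 300) for _ in range(500)]

def percentile(data, p):
    data = sorted(data)
    k = min(len(data) - 1, round(p / 100 * (len(data) - 1)))
    return data[k]

for p in (50, 99, 99.9):
    print(f"p{p}: {percentile(samples, p):.0f} us")

An enterprise drive is judged on how tightly those tail figures track the median under sustained load, which is exactly what the multi-channel design described above is meant to protect.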
2. Reliability
NAND flash memory, though fast, comes with inherent limitations such as finite life expectancy (as NAND flash cells wear during repeated writes) and natural error rates. Enterprise SSDs combat this with advanced Error Correction Code (ECC) mechanisms to manage bit errors and maintain data integrity.
The SSD controller’s ability to correct bit errors is captured by the Uncorrectable Bit Error Ratio (UBER), “a metric for data corruption rate equal to the number of data errors per bit read after applying any specified error-correction method”, as defined by the industry standards association, JEDEC. Enterprise-class SSDs differ from client-class SSDs in their ability to support heavier write workloads, withstand more extreme environmental conditions and recover from a higher Bit Error Ratio.
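As a hedged, worked illustration of that JEDEC definition (with made-up figures rather than any vendor’s specification), UBER is simply uncorrectable errors divided by total bits read:

terabytes_read = 500                   # total data read over a test period (illustrative)
bits_read = terabytes_read * 1e12 * 8  # convert terabytes to bits
uncorrectable_errors = 3               # errors the ECC could not repair (illustrative)

uber = uncorrectable_errors / bits_read
print(f"UBER = {uber:.1e}")            # 7.5e-16 for these example numbers

JEDEC’s application classes set a tougher UBER target for enterprise drives than for client drives, which is why the figure is worth checking on a datasheet.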
To further enhance reliability, enterprise SSDs often integrate end-to-end data protection. This ensures data accuracy as it moves between the host and NAND storage, using parity data and redundancy checks to recover corrupted data blocks. In these SSDs, periodic checkpoint creation, cyclic redundancy check (CRC) and ECC error correction are also implemented in an end-to-end internal protection scheme to ensure the integrity of data from the host through the flash and back to the host.
SSDs can incorporate physical circuitry for power loss detection that manages power storage capacitors on the drive. These capacitors allow the SSD to complete pending writes during sudden outages, adding another layer of security. Power loss protection (PLP) circuitry is usually required for applications where lost data is not recoverable.
There are environments in which the use of software-defined storage or server clustering can remove the need for hardware-based power-fail support, because any data is replicated onto a separate and independent storage device on a different server or servers. Web-scale data centers often dispense with power-fail support, instead using software-defined storage across servers to hold redundant copies of the same data.
3. Endurance
Endurance reflects how long an SSD can reliably handle data writes. NAND flash cells degrade with each program/erase (P/E) cycle until they are unable to store data accurately. When this happens, the degraded block is removed from the user-addressable storage pool and the logical block address (LBA) is moved to a new physical address on the NAND flash storage array.
A new storage block is used to replace the bad one on the SSD. As cells wear, the Bit Error Ratio also rises, so the enterprise SSD controller implements a set of management techniques to maintain the cells’ ability to reliably store data over the expected life of the drive.
Enterprise SSDs are built for continuous 24/7 use, unlike client SSDs that typically operate on an 8-hour cycle, but in both cases their endurance needs to be understood. To measure this, manufacturers usually use the JEDEC committee endurance measurement metric of ‘terabytes written’ (TBW), which estimates the amount of data an SSD can handle before the NAND flash becomes unreliable.
The write amplification factor (WAF) – the ratio of actual NAND writes compared to data received from the host – also impacts endurance. Higher WAF can accelerate wear, so enterprise SSD controllers use sophisticated algorithms to manage data distribution and extend lifespan.
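A back-of-the-envelope version of that relationship, with illustrative rather than vendor numbers, looks like this: the rated P/E cycles and capacity set the raw write budget, and WAF eats into it.

capacity_tb = 3.84    # user capacity of the drive, in TB (illustrative)
pe_cycles = 5_000     # rated program/erase cycles for the NAND (illustrative)
waf = 2.5             # write amplification factor under this workload (illustrative)

tbw = capacity_tb * pe_cycles / waf
print(f"Estimated endurance: {tbw:,.0f} TBW")   # 7,680 TBW here

# the same budget expressed as drive writes per day (DWPD) over a 5-year warranty
dwpd = tbw / (capacity_tb * 5 * 365)
print(f"Equivalent DWPD: {dwpd:.1f}")           # roughly 1.1 DWPD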
When considering other measures of component reliability, the ‘mean time between failure’ (MTBF) is an important model. Enterprise SSD components are assessed on longevity and their ability to manage the voltages across all NAND flash memory over their lifespan. All enterprise SSDs should be rated at least at two million hours MTBF.
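For a sense of what a two-million-hour MTBF implies in practice, it can be converted into an annualized failure rate, the rough share of drives expected to fail in a year of continuous operation:

mtbf_hours = 2_000_000        # the minimum rating suggested above
hours_per_year = 24 * 365     # continuous 24/7 operation

afr = hours_per_year / mtbf_hours
print(f"Annualized failure rate = {afr:.2%}")   # about 0.44% per drive per year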
SMART monitoring and reporting on enterprise-class SSDs allows the device to be assessed for life expectancy prior to failure based on the current write amplification factor (WAF) and wear level. Pre-failure predictive warnings for failure events, including loss of power, bit errors occurring from the physical interface or uneven wear distribution, are often also supported.
Client-class SSDs may only feature the minimum SMART output for monitoring the SSD during standard use or post failure.
Some SSDs also allow an increased reserve of NAND flash memory to be allocated as over-provisioned (OP) spare capacity. Invisible to the user and the operating system, this spare area acts as a temporary write buffer for higher sustained performance and as replacement capacity for defective flash memory cells, enhancing the reliability and endurance of the SSD.
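Over-provisioning is usually quoted as spare capacity relative to the user-visible capacity. A quick calculation with typical example figures, not a specification:

physical_gb = 1_024   # raw NAND fitted to the drive (illustrative)
user_gb = 960         # capacity exposed to the operating system (illustrative)

op_percent = (physical_gb - user_gb) / user_gb * 100
print(f"Over-provisioning: {op_percent:.1f}%")   # about 6.7% for these numbers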
In summary
Understanding the differences between enterprise and client SSDs — from NAND endurance to performance optimization — is essential when upgrading data center storage. Enterprise SSDs offer robust solutions tailored for high-intensity workloads, ensuring reliability and minimizing downtime.
By carefully selecting SSDs suited to specific applications, organizations can future-proof their storage infrastructure and maintain seamless operations.
We've listed the best external hard drives.
This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
Doctors who mail abortion medication pills across state lines have been on alert ever since Louisiana, which bans abortion, indicted a New York doctor for mailing the pills to a woman there.
(Image credit: Elissa Nadworny)
The F-35 was meant as a one-size-fits-all fighter that could be used across NATO. But strained U.S.-Europe relations are giving some member countries second thoughts about the U.S.-built plane.
(Image credit: Thomas Lohnes)
Sudiksha Konanki remains missing after she disappeared during a spring break trip with friends.
(Image credit: Francesco Spotorno)
NPR readers of different belief systems share the poignant rituals that make them feel close to their spirituality. For some, it's poetry and gardening, for others, it's meditation and community.
Israel launched deadly strikes in Gaza to pressure Hamas to agree to a new ceasefire. Hamas isn't budging, and more than half of recently freed hostages oppose the renewed war. Why is Israel doing it?
(Image credit: Bashar Taleb)
Leadership teams across the public and private sectors face some tough decisions that could ultimately define their technology strategies for years to come. Rising cloud computing costs and the accelerated uptake of artificial intelligence (AI) now rank highly on the agenda for many. At the same time, the ongoing reliance on legacy systems continues to impede further digital transformation.
With budgetary pressures mounting and the need to demonstrate the value of IT now more urgently than ever, leaders need to get a handle on the rapidly changing IT landscape or risk falling behind. As they plan for the future, here are my top five trends set to dominate the agenda in 2025.
1. A return to on-premises infrastructure
Over the past decade, cloud computing has become foundational for IT infrastructure, promising flexibility, scalability, and cost efficiency. Today, many organizations are reconsidering their cloud-first strategies and looking to return to—at least partly—on-premises infrastructure.
This shift is primarily driven by rising overheads, which are exacerbated by unpredictable pricing models, data egress fees, and the increased costs associated with the shift to AI. As a result, enterprises are exploring hybrid models, such as repatriating some services, to reduce costs and gain more control over their infrastructure. Although this may seem surprising, the return to on-premises infrastructure is not about rejecting the cloud but optimizing IT investments for long-term sustainability.
2. AI—From buzzword to business process revolution
No one can doubt the impact of AI on the digital world. For many, 2025 is the year AI moves from the ‘hype phase’ to a mainstay of business processes. Organizations are no longer simply experimenting with AI – they’re embedding the technology into their workflows to drive efficiency gains and competitive advantage.
AI-driven automation is key. Routine tasks that once required significant human effort are all being optimized with AI tools. It’s a similar story with AI’s predictive capabilities reshaping decision-making at the executive level. However, to truly integrate AI into business processes, organizations must address concerns around bias, data integrity, and regulation to establish AI as a reliable, ethical, and scalable tool for transformation.
3. The end of legacy systems as companies accelerate digital transformation
Reliance on legacy IT systems has long been a challenge for enterprises, but 2025 may mark a tipping point where organizations can no longer afford to delay modernization. Companies will need to replace outdated systems to reap the benefits of AI, automation, and advanced analytics.
The replacement of legacy systems is also being hastened by ongoing regulatory pressures and ever-present security concerns. In today’s digital world, organizations need modern IT infrastructures to support real-time data access and seamless integrations.
4. Dismantling IT silos for collaboration and better results
It’s a similar story with IT silos, which for years have stifled innovation and prevented organizations from fully realizing the benefits of emerging technologies. With AI and automation transforming industries, organizations want to foster stronger collaboration between IT, security, operations, and business teams. Without this, the roll-out of AI risks stumbling before it can succeed.
As a result, cross-functional collaboration is becoming increasingly important as organizations embrace methodologies such as DevOps, SecOps, and DataOps. Platforms that unify data access and provide real-time insights across departments will be essential in breaking down silos.
5. Leaders need to prove value amid rising costs
No discussion about the year ahead would be complete without acknowledging chief information officers' (CIOs) growing budgetary challenges. As each quarter passes, IT leaders are being asked to stretch their resources further by streamlining operations, cutting redundant tools, and seeking maximum ROI.
As a result, there’s a stronger push for IT teams to demonstrate how their spending directly supports business goals. In some cases, this drives a renewed focus on IT financial management (ITFM), where CIOs are expected to align technology investments directly with business outcomes.
Looking ahead
Looking ahead, IT departments that can show real impact—whether through cost savings, boosting revenue, reducing risks, or driving innovation—will secure continued investment. With economic uncertainty and rising costs, there’s a growing belief that IT should not be regarded simply as an expense but as a key driver of business success.
The year ahead poses some interesting challenges for business leaders. AI is becoming a core part of operations; cloud costs are making companies rethink their strategies, and legacy systems are reaching their limit. At the same time, CIOs are under growing pressure to show the actual value of IT investments while bringing teams together to work more effectively. The balance between opportunities and threats will be key to success in the coming months.
We feature the best IT management tools.
I’d be lying if I said I hadn’t been wishing for ages that Valve would open up SteamOS to other handheld PCs and machines beyond its excellent Steam Deck. And after some waiting, that looks to be finally happening, with the recent SteamOS 3.7.0 Preview bringing the "beginnings of support for non-Steam Deck handhelds."
Now, that’s not a vast amount to go on, given it's only a preview version of SteamOS. But with a SteamOS-powered model of the Lenovo Legion Go S set to arrive in May, we could see the advent of a new range of SteamOS-based handheld PCs.
There are already ways to run SteamOS on non-Steam Deck machines, but support for them is unofficial and they lack the slick handheld integration of Steam in the same vein as the Steam Deck. The best handheld PCs tend to run Windows 11 with a form of handheld interface on top of the operating system and then tap into Steam’s Big Picture Mode to enable a console-like handheld gaming experience.
Having native SteamOS support would surely make all this slicker for new handheld PCs and those that could be retrofitted or dual-booted with Valve’s Linux-based operating system. In our hands-on time with the Lenovo Legion Go S, we were certainly sold on the idea of the SteamOS version.
Steamy dreams
While I still want Microsoft to work on refining Windows 11 to work more smoothly on handheld PCs, especially if it does indeed make a form of Xbox handheld, I feel SteamOS is more up to the task of supporting handheld PC gaming, especially given it’s had some three years to mature.
What excites me further is that, by dropping the reliance on Windows 11, handheld PCs from the likes of Asus, Lenovo and MSI could tap into an operating system that requires fewer overhead resources and can thus unlock more power from their chips.
The Steam Deck’s AMD Zen 2 and RDNA 2-based APU isn’t as powerful as other chips in other handhelds, such as the Asus ROG Ally X that sports a more powerful AMD Ryzen Z1 Extreme processor and RDNA 3-based graphics. But the Steam Deck arguably offers the slickest and smoothest gaming experience when one takes into account software and hardware in tandem.
Letting SteamOS take care of the operating system and interface could be the best of both worlds, with, say, a next-generation ROG Ally sporting a powerful APU with silicon horsepower that can be readily accessed thanks to a smaller need for overhead compute resources.
All this could open up a new avenue for handheld PCs that could finally challenge the Steam Deck, at least in my eyes.
The only caveat would be access to third-party game services. SteamOS and the Steam Deck were built around users tapping into the wide array of games supported on the Steam Store, rather than enabling easy access to Xbox Game Pass or the Epic Games Launcher.
However, there are already workarounds to get the likes of Xbox Cloud Gaming running on the Steam Deck; Valve doesn't appear to discourage this. So I’d not be surprised to see hardware makers work to build on SteamOS to integrate other launchers in a neat, easy-to-use fashion.
It's hard to say when we could see more SteamOS-based handhelds. But given we’ve just got the latest SteamOS preview, it shouldn’t be long before a full version is let out into the wild; that could come in April.
If that happens we could see a bunch of handheld PCs that eschew Windows 11 for SteamOS this side of 2025; I’m crossing my fingers.
You might also like
The antitrust lawsuit filed by the Professional Tennis Players' Association says the organizations that run the sport hold "complete control over the players' pay and working conditions."
(Image credit: Kelly Barnes)
Volvo is using a new AI technique called 'Gaussian splatting' to train its vehicles and accelerate its goal of zero collisions on the roads – and it's all thanks to its recently expanded partnership with Nvidia.
Last month we reported that the upcoming Volvo ES90 will be the most powerful car it has ever created in terms of core computing capacity, due to it packing a dual Nvidia AGX Orin configuration.
Now, the company has revealed how this sort of supercomputing is also helping it to more quickly train its Advanced Driver Assistance Systems.
Volvo claims that it can now synthesize incident data collected by the advanced sensors in its latest vehicles, such as emergency braking, sharp steering or manual intervention.
This then allows the company to reconstruct and explore them in new ways to better understand how incidents can be avoided.
The novel method is dubbed Gaussian splatting and it allows the company’s software to produce realistic, high-fidelity 3D scenes and subjects from real-world visuals.
(Image credit: Volvo)
Once these scenes have been created, Volvo’s engineers can manipulate them to generate a number of outcomes. The video clip examples the Swedish marque provides are freakishly realistic.
It's akin to a human learning how to skateboard by playing Tony Hawk's Pro Skater for hours and hours on end.
“We can select one of the rare edge cases and explode it into thousands of new variations of the scenario to train and validate our models against," Alwin Bakkenes, Head of Global Software Engineering at Volvo Cars, explains.
Bakkenes says this has the potential to unlock a scale that Volvo has never had before and even to catch edge cases before they happen in the real world.
Now the computers are training the computers (Image credit: Volvo)
Gaussian splatting is a relatively new 3D rendering technique that doesn’t rely on neural networks, unlike more complex methods such as Neural Radiance Fields (NeRFs).
This allows for incredibly complex 3D scenes to be created in real time. The technique is currently being explored in multiple industries, from gaming to interactive app development.
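To give a flavor of the idea, and only the compositing part of it, here is a deliberately simplified Python sketch: real 3D Gaussian splatting projects millions of anisotropic, depth-sorted 3D Gaussians, whereas this toy version just alpha-blends a few isotropic 2D blobs into a tiny image. Every number in it is made up for illustration.

import numpy as np

H, W = 64, 64
# each splat: centre x, centre y, radius, opacity, RGB colour
splats = [
    (20.0, 30.0, 6.0, 0.8, np.array([1.0, 0.2, 0.2])),
    (40.0, 25.0, 9.0, 0.6, np.array([0.2, 0.4, 1.0])),
    (32.0, 45.0, 5.0, 0.9, np.array([0.3, 1.0, 0.3])),
]

ys, xs = np.mgrid[0:H, 0:W]
image = np.zeros((H, W, 3))
transmittance = np.ones((H, W))   # how much light still passes through each pixel

# composite front to back; splats are assumed to be listed nearest-first
for cx, cy, r, opacity, colour in splats:
    gauss = np.exp(-((xs - cx) ** 2 + (ys - cy) ** 2) / (2 * r ** 2))
    alpha = np.clip(opacity * gauss, 0.0, 0.999)
    image += transmittance[..., None] * alpha[..., None] * colour
    transmittance *= 1.0 - alpha

print(image.shape, float(image.max()))   # (64, 64, 3) and a value below 1.0

Because rendering is essentially this kind of projection and blending, with no neural network to evaluate per pixel, the technique can run in real time.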
Volvo’s use of advanced Lidar, sensor and high-definition camera technology, as first showcased in the EX90, collects reams of data that can then be reproduced in a manipulatable 3D model, which allows its engineers to then train the vehicle’s AI to perform better in the real world.
There was some disappointment when the EX90 launched, seeing as its Lidar technology would remain offline for consumer use, effectively banished to merely collecting data until Volvo’s compute power was at a level where the company was happy to introduce ADAS systems that rely on the sensor suite and software stack.
Thankfully, its recently announced partnership with Nvidia will help the Swedish marque, which is synonymous with road safety, to realize its vision of zero collisions and driver assistance systems that actually help, rather than simply nag.
What’s more, the company has also stated that early EX90 models will be updated with the dual Nvidia AGX Orin System on a Chip set-up, so they too can make the most of the latest developments in autonomous driving and ADAS systems.
You might also like
The United Nations has long been in the spotlight over allegations of child rape and other sexual abuses by its peacekeepers, especially by those based in Congo and the Central African Republic.
(Image credit: Salvatore Di Nolfi)
The decision sparked angry protests from bullfighting supporters and matadors, some of whom tried to breach a police barricade at the local Congress.
(Image credit: Ginnette Riquelme)
Many of us have used file converters before: when you need an MP3 but your recording is an M4A, free online file converters are a first port of call.
However, the FBI has now warned that some of these “free tools” are increasingly infecting victims’ devices with malware.
It says criminals are using the enticing offer of an easy and swift file conversion, like a .doc to a .pdf file, or combining files, like multiple .jpegs into one .pdf - useful when you need to upload something or send it in a particular format.
Risk of ransomware
The conversion tools will convert your files but, in the meantime, will infect the converted file with hidden malware, which is then handed over to the victim.
The malware can then exfiltrate personal information like names, social security numbers, banking information, cryptocurrency and more - leaving the user at risk of identity theft or fraud.
As if that isn’t bad enough, some of these attacks also infect the victim’s device with ransomware, taking control of the computer.
Viruses and malware infections can have disruptive consequences for users, but there are key tools that can help.
“The best way to thwart these fraudsters is to educate people so they don’t fall victim to these fraudsters in the first place,” said FBI Denver Special Agent in Charge Mark Michalek.
“If you or someone you know has been affected by this scheme, we encourage you to make a report and take actions to protect your assets. Every day, we are working to hold these scammers accountable and provide victims with the resources they need.”
If you think you may have accidentally downloaded malware or a virus, we have advice on how to remove malware from your device, but the best defense is being careful and never downloading anything from an untrusted source.
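There is no code-level fix for a malicious converter, but a small habit that helps is sanity-checking what you actually downloaded before opening it. The hedged Python sketch below (the filename is a hypothetical placeholder) simply checks that a file's first bytes match the format its extension claims and that it isn't a disguised Windows executable; passing it does not make a file safe.

from pathlib import Path

SIGNATURES = {
    ".pdf": b"%PDF",
    ".png": b"\x89PNG",
    ".zip": b"PK\x03\x04",
}

def looks_consistent(path: str) -> bool:
    head = Path(path).read_bytes()[:8]
    if head.startswith(b"MZ"):    # Windows executable header in disguise
        return False
    expected = SIGNATURES.get(Path(path).suffix.lower())
    return expected is None or head.startswith(expected)

# hypothetical filename for illustration
print(looks_consistent("converted-document.pdf"))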
You might also like
I don’t think it is an overstatement to say that Nvidia wants to be the company that powers the artificial intelligence universe we may one day live in permanently.
The opening keynote at Nvidia GTC 2025, colloquially referred to as the Woodstock of AI, saw CEO Jensen Huang extol the virtues of an AI ecosystem powered by hardware - and software - from the second most valuable company in the world.
We were re-introduced to the concept of AI factories, which are essentially gigantic data centers where, in Nvidia’s world, the tokens generated are equivalent to revenue; the presentation put a figure on it: $1 trillion.
The cost of building these data centers, like mega-farms, increases exponentially with every generation as they get more complex and require more power to house even more compute.
A prototype of the Newton robot in Nvidia's Isaac lab (Image credit: Nvidia)
The lofty Stargate project has a budget of $500 billion and Apple has already committed to spending $500 billion over the next few years, a large chunk of which will be on data centers.
Depending on which analysts you talk to, global data center capital expenditure is expected to reach $1 trillion in either 2027 (PwC) or 2029 (Dell’Oro).
So that gives you a measure of Nvidia’s extremely ambitious targets for itself and for humanity.
A big piece of that puzzle that Nvidia introduced at the keynote is its Photonics switch systems which will allow those AI factories to “scale to millions of GPUs”, seemingly within the same physical perimeter.
A Nvidia GB300 NVL72 server rack (Image credit: Nvidia)
Big, huge numbers
The biggest number of them all in this presentation came when Rev Lebaredian, Nvidia’s VP of Omniverse and Simulation Technology, presented a slide about how Physical AI is transforming $50 trillion industries.
These span two billion cameras, 10 million factories, two billion vehicles and, Lebaredian added, ‘Future Billion humanoid robots’.
Yes, it’s not just about robots, but humanoid robots, millions of them. Robots, in general, have been around for decades in many shapes and forms.
Nvidia, however, envisions a world where billions of robots are built on what it calls its “Three Computers”: pre- and post-training (done via DGX), simulation and synthetic data generation (done via Omniverse) and runtime (done using AGX).
The world of the future may be home to “multiple types of humanoid robots” that will be able to post-train - while on the job, so to speak - with real or synthetic data, allowing them to learn new skills or enhance their knowledge base on the fly, literally at (or near) the speed of light.
And the first of them may well be based on its new GR00T N1 humanoid robot foundation model, developed in partnership with Google Deepmind and … Disney Research.
(Ed: One cannot help but see the tie-in with GR00T from Guardians of the Galaxy and Wall-E, a potentially prescient movie about the impact of robotics and AI on humanity.)
Where does this leave us? With a clear roadmap to a world where AI becomes an indispensable part of what it means to be human, perhaps the most important utility of them all. Never, in the history of humanity, will so many depend on so few.
Oh and the other acronym (AGI) was not mentioned during the presentation, not once.
You might also like
If you’ve ever wanted to see Pedro Pascal dancing in a red and yellow dreamland, well, today is your lucky day. That’s because Apple has just released a new ad featuring the actor and directed by Spike Jonze, and the aim is to promote the company’s AirPods 4 with Active Noise Cancellation.
The spot begins with Pascal leaving a café after seemingly getting his heart broken. As he walks down a snowy street, he puts in his AirPods and begins listening to El Conticinio by Spanish musician Guitarricadelafuente. As he turns on the active noise cancellation mode, his surroundings change to an icy landscape filled with dancers moving in time to the music.
He’s then snapped back to reality when a passerby asks him for directions, whereupon Pascal activates the AirPods’ transparency mode so that he can hear the person while still listening to his music.
He then spies a happier version of himself across the street. This incarnation of Pedro Pascal re-enables active noise cancellation on his AirPods and suddenly finds himself in a bright red and yellow world. As Perfect by Sam i & Tropkillaz begins playing, he joyfully dances through the flowery streets, seemingly restored to happiness.
By the end, he returns to the real world and sees his sadder self. The two exchange comforting glances with each other before the original Pascal walks away down the street.
Highlighting key features
The purpose of the ad is to highlight the active noise cancellation feature in Apple’s AirPods 4. This can cut out background noise, allowing you to focus on whatever it is you are listening to. These also feature a transparency mode that allows some external sounds to be heard so you can conduct a conversation and be more aware of your surroundings.
AirPods 4 with Active Noise Cancellation also come with an Adaptive Audio feature that can automatically change between active noise cancellation and transparency mode without you having to do anything. It doesn’t look like that’s in use in the commercial, though, as Pascal reaches up to manually change the audio mode on his AirPods throughout the short film.
The ad might feel familiar to fans of Apple’s original iPod commercials, which featured silhouetted figures dancing while listening to music on the device.
It’s also not the first time Jonze has directed an Apple ad – in 2018, he worked with the company on a short film titled Welcome Home that was made to promote the HomePod.
You might also like
The decision by U.S. District Judge Ana C. Reyes blocks the Department of Defense from carrying through with a policy directive designed to remove transgender service members from the military.
(Image credit: Chip Somodevilla)
Radio Free Europe/Radio Liberty, a government-backed overseas broadcaster, sued the Trump administration in an attempt to get it to release funds appropriated by Congress.
(Image credit: MICHAL CIZEK/AFP via Getty Images)
Better and more accurate weather forecasts could soon be a more common occurrence for all of us thanks to a new upgrade from Nvidia.
At its Nvidia GTC 2025 event, the company unveiled the next step along in its plan of using a digital twin of planet Earth to help forecasters build more accurate forecasting models.
The NVIDIA Omniverse Blueprint for Earth-2 weather analytics model provides a significant update on the original iteration, announced at GTC 2024, offering more accurate and powerful tools to make forecasts even better.
Better forecasts, faster
Nvidia says severe weather-related events have caused a $2 trillion impact on the global economy over the last decade, and better forecasts could be one way of mitigating this.
It says the NVIDIA Omniverse Blueprint for Earth-2 offers a number of useful tools, including Nvidia GPU acceleration libraries, a physics-AI framework, development tools and microservices to help speed up the process of going from prototyping to production with weather forecast models.
In effect, Nvidia says this should all help developers build solutions that deliver warnings and updated forecasts in seconds, rather than the minutes or hours needed with traditional CPU-driven modeling.
“We’re seeing more extreme weather events and natural disasters than ever, threatening lives and property,” said Nvidia CEO Jensen Huang.
“The NVIDIA Omniverse Blueprint for Earth-2 will help industries around the world prepare for — and mitigate — climate change and weather-related disasters.”
The platform has already signed up several major climate tech companies including AI company G42, JBA Risk Management, Spire and others.
You might also like