There are over one billion smart meters already deployed across the world - 38 million in the UK alone. These are embedded IoT devices, designed to have an ultra-small footprint and fitted with lightweight software that is constantly communicating with energy providers and the national grid.
The cybersecurity of embedded devices like these is weighed against their size and performance requirements - while we want smart meters to be secure, we also want them to operate smoothly, use up very little energy and space, and send the right data at the right time. Additional layers of encryption would increase the size of this data, potentially impacting the performance and cost of smart meter infrastructure.
But this encryption is something the energy sector, and the technology supply chain as a whole, now has to rethink. Last year, the US National Institute of Standards and Technology revealed its final standards for post-quantum cryptography (PQC) - a new form of encryption designed to protect data from potential quantum computer attacks.
Now, agencies across the US and Europe, including the NSA and NCSC, are recommending that governments and businesses coordinate their migration to PQC so that every device is quantum-safe by 2035.
This is the biggest cybersecurity transition in a generation, and a real challenge for the world's one billion-plus smart meters.
Why do smart meters need to be upgraded?
There are three main reasons for upgrading smart meters to PQC - risk, compliance, and market forces.
Risk is the word at the heart of every conversation around cryptography and cybersecurity. Every new iteration of an encryption algorithm or cybersecurity application is designed to stay one step ahead of the attackers and mitigate the chance of a breach. PQC is designed to mitigate the risk of an attack from future quantum computers, which experts anticipate will easily be able to crack current encryption standards.
When a cryptographically relevant quantum computer emerges, critical national infrastructure (like the energy grid) will be a prime target for disruption. Energy networks therefore need to act now to protect themselves and their data from this future risk. As vulnerable endpoints in the energy network - with the technical capability to cut off power supplies to households - smart meters need to be secured to ensure that infrastructure is protected from attack.
Providers should also have one eye on whether their smart meters comply with new regulations. Government guidelines are all recommending that hardware and software align with NIST’s PQC standards by 2035 at the latest - much sooner if your customer is the government itself. Simply put, the transition must take place, and is in fact already underway.
Finally, market forces will soon compel decision-makers still on the fence to upgrade smart meters to PQC. As migration deadlines approach, energy suppliers and hardware manufacturers who can promise PQC-enabled devices will be preferred for government and corporate contracts over those that have delayed their transition.
The challenge of upgrading smart meters
There are two parts to the smart meter PQC challenge - upgrading the millions of “brownfield” devices that are already deployed, and ensuring that the millions of “greenfield” devices currently on the production line are prepared for the upcoming PQC deadlines.
In most cases, already-deployed devices will require an over-the-air firmware update to become PQC-secure. This could be a major challenge for older, memory-constrained models that cannot accommodate the update, and replacing that legacy hardware is likely to be the most costly part of the transition.
Even where these upgrades are possible, there are physical challenges. Smart meters are small, embedded devices with minimal RAM and computing capacity. They are also limited in bandwidth, transmitting only small amounts of data with every network communication they make. PQC implementations will need to work within these constraints, but some may run into issues.
For example, post-quantum keys and signatures are larger than their RSA/ECC equivalents, meaning that a quantum-safe message is larger than those currently being sent by a smart meter.
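To make that overhead concrete, here is a minimal sketch comparing the publicly documented sizes of NIST's standardized schemes with the classical algorithms meters typically use today; the figures are ballpark, as exact lengths vary slightly with encoding.

```python
# Approximate, publicly documented sizes in bytes; exact values vary by encoding.
sizes = {
    # scheme: (public key, signature or ciphertext)
    "ECDSA P-256 (signature)":            (65, 64),     # 33 B with a compressed public key
    "RSA-2048 (signature)":               (256, 256),
    "ML-DSA-65 / Dilithium3 (signature)": (1952, 3309),
    "ECDH P-256 (key exchange)":          (65, 65),
    "ML-KEM-768 / Kyber768 (KEM)":        (1184, 1088),
}

for scheme, (pub, payload) in sizes.items():
    print(f"{scheme:38} public key {pub:>5} B   sig/ciphertext {payload:>5} B")
```

On a device that sends only a few hundred bytes per reading, an ML-DSA signature roughly fifty times the size of a raw ECDSA signature is not a rounding error.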
Many smart meters rely on fixed-function hardware cryptography that cannot be upgraded in the field - on these devices, it's not possible to update secure boot processes or maintain cryptographic agility (the ability to rapidly adapt the cryptography on a device).
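For devices that do keep their cryptography in upgradable software, cryptographic agility can be as simple as dispatching on an algorithm identifier carried in the firmware image header. The sketch below is purely illustrative - the algorithm IDs are hypothetical and HMAC stands in for real signature schemes such as ECDSA or ML-DSA - but it shows the shape of a design that can swap algorithms without replacing the bootloader.

```python
import hashlib
import hmac

# Registry of verifiers keyed by the algorithm ID carried in the image header,
# so the update client can switch schemes without changing its own logic.
VERIFIERS = {}

def register(alg_id):
    def wrap(fn):
        VERIFIERS[alg_id] = fn
        return fn
    return wrap

@register("hmac-sha256")    # stand-in for a classical scheme (e.g. ECDSA P-256)
def verify_classical(key, image, tag):
    return hmac.compare_digest(hmac.new(key, image, hashlib.sha256).digest(), tag)

@register("hmac-sha3-256")  # stand-in for a post-quantum scheme (e.g. ML-DSA-65)
def verify_pq(key, image, tag):
    return hmac.compare_digest(hmac.new(key, image, hashlib.sha3_256).digest(), tag)

def verify_firmware(alg_id, key, image, tag):
    verifier = VERIFIERS.get(alg_id)
    if verifier is None:
        raise ValueError(f"unsupported algorithm: {alg_id}")
    return verifier(key, image, tag)

# Migrating to a new scheme is a change to the image header, not the device logic.
key, image = b"device-key", b"firmware-blob"
tag = hmac.new(key, image, hashlib.sha3_256).digest()
print(verify_firmware("hmac-sha3-256", key, image, tag))  # True
```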
Manufacturers don’t need to worry about over-the-air upgrades for “greenfield” smart meters that are still being designed, as they have a chance to protect devices before production. They will still face memory and CPU constraints, and will need to factor PQC into their design process so that devices remain compliant beyond 2035.
The next steps for smart meters
The first and most important step for the energy sector is to plan ahead thoroughly. 2035 is sooner than it seems - especially for large-scale digital transformation projects - and this is a process that many companies will be hoping to finalize well ahead of that deadline.
The goal of the transition is to maintain the highest standards of security without compromising performance and without racking up unsustainable costs. Inevitably, the oldest models of smart meters that cannot receive over-the-air updates will need to be replaced - the ten-year transition timeline means that this can be factored into annual budgeting for hardware upgrades in the field, rather than through an impractical all-at-once rollout.
For all other devices - deployed and in production - manufacturers and energy providers need to identify where the most critical data on their device is transmitted from and focus on securing this as a priority. For smart meters, this means communication modules and the process by which they could trigger an energy shutoff, as these are the vectors that attackers will target first.
To navigate the challenges of migrating embedded and memory-constrained systems to PQC, smart meters will need low-footprint implementations of PQC, which are designed to apply NIST’s standards without exerting excessive demand on CPU and RAM. It’s worth bringing on PQC expertise to ensure that the right implementation for the right device is found - as robust as the PQC algorithms published by NIST are, they are also only as secure as the way they are implemented.
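As a starting point for that evaluation, a small prototype like the sketch below can exercise a NIST-standardized KEM and measure the exact byte sizes a meter would have to store and transmit. It assumes the open-source liboqs-python bindings are installed; the algorithm is exposed as "ML-KEM-768" in recent builds (older ones use "Kyber768"), and the meter/head-end naming is just for illustration.

```python
import oqs  # liboqs-python bindings

ALG = "ML-KEM-768"  # "Kyber768" on older liboqs builds

with oqs.KeyEncapsulation(ALG) as meter, oqs.KeyEncapsulation(ALG) as head_end:
    public_key = meter.generate_keypair()           # sent from the meter once
    ciphertext, secret_server = head_end.encap_secret(public_key)
    secret_device = meter.decap_secret(ciphertext)  # shared secret derived on the meter
    assert secret_device == secret_server
    print(f"public key: {len(public_key)} B, ciphertext: {len(ciphertext)} B, "
          f"shared secret: {len(secret_device)} B")
```

Numbers like these, gathered per algorithm and per device class, are what make it possible to choose an implementation that fits a meter's RAM and bandwidth budget.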
Manufacturers will need to factor PQC into their product roadmap. This sounds daunting, but as much as 80% of this transition will be handled in the supply chain - meaning that the vendors who supply the communication modules, HSMs, and microprocessors used in smart meters will themselves be responsible for upgrading vulnerable cryptography. The remaining 20% is the manufacturer’s responsibility - communications channels and metering software that needs to be upgraded in-house.
The key message for energy providers and device manufacturers is that this process needs to start as soon as possible. Smart meters are designed to have a long operational life, and the risk of deploying devices in 2030 that are obsolete by 2035 is one that should be avoided.
This article was produced as part of TechRadar Pro's Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadar Pro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
Patriot Memory has introduced the MD330 Storage Hub, a device that combines a unique set of capabilities in a single unit.
The company says it offers high-speed charging, local storage, and 4K video output through a single USB-C port - features which are rarely found together, especially in a device weighing just 21 grams.
This makes the MD330 less a refinement of existing accessories and more an attempt to redefine what a USB hub or docking station can be.
Designed for tight workflows
The MD330 connects via USB-C and supports 4K display output, with both mirrored and extended screen modes, and also delivers up to 100W of power, which theoretically allows it to charge laptops or handheld consoles while connected to an external display.
Whether this works reliably under sustained use, particularly with power-hungry devices, is something Patriot has yet to fully demonstrate.
These capabilities are usually reserved for more bulky setups, so questions remain about thermal handling and power stability.
Beyond video and charging, the MD330 integrates flash storage, available in capacities from 128GB to 1TB, which turns the hub into a portable archive for media or project files.
However, transfer speeds are not specified, and there’s no mention of encryption or file system compatibility - and without those details, it’s difficult to determine whether the storage is suitable for anything beyond casual use.
The MD330 aims to simplify mobile setups by combining a power bank, USB hub, and dock into one device, but that doesn’t guarantee consistent performance.
Running display output and power delivery at the same time could strain the device, especially if file transfers are also in progress.
How it manages bandwidth and thermal load across these tasks will ultimately determine whether it’s genuinely useful or just an overpromised accessory.
At the time of writing, Patriot has not revealed the price of the MD330, a key detail that could heavily influence its appeal.
Via Techpowerup
Google Photos is rolling out new generative AI features that can transform still images into short video clips, briefly bringing anyone in the photo to life with natural-looking motion. The Photo to Video tool employs Google’s Veo 2 AI video model, the same model deployed on YouTube, Gemini, and other parts of Google’s ecosystem. The feature doesn’t turn your snapshots into full movie trailers; it just creates six-second clips.
Once you see the option to make your images into videos, you just pick the picture you want to animate, then choose either “Subtle movements” or “I’m feeling lucky” from the buttons below. As you can imagine, the subtle movements choice has the people in the picture move around a little bit. The model is designed to guess what might have happened in that frozen second. The other choice could do anything, perhaps even throw confetti in the air.
The update is rolling out in the U.S. on Android and iOS right now, but there are other AI tools coming to Google Photos later this summer. Most notable is the Remix feature, arriving in the next few weeks. Remix takes your existing photos and restyles them to look like comic book panels, anime stills, 3D renderings, or pencil sketch art. It's an ability that Gemini and its many rivals already offer, but now it will be built directly into your photo gallery and won't need you to write a full prompt.
All of this comes together in a new section of the app called the Create tab, which will serve as a hub for these tools and any other AI features Google may release in the months ahead. In the near term, it will include the Photo to Video and Remix features alongside the existing collage and highlight video creators. But as Veo gets smarter and Google’s confidence grows, the possibilities could expand into any number of AI enhancements such as extended video clips, voiceovers, or multi-image stories.
The packaging is what is crucial here. This is the first time that photo-to-video generation has been embedded into a mainstream app like Google Photos, which the company claims has more than a billion users.
AI-powered video tools like Sora and Veo have generated headlines for their jaw-dropping realism and deepfake potential. But Google Photos isn’t pitching this update as a creative revolution. It’s presenting it as a memory enhancement. That said, Google doesn't want to accidentally trick anyone about where the new images and videos come from. That's why every AI-generated video or remix will carry a visible label showing that the content was created with AI. They will also each include an invisible SynthID watermark identifying the AI behind its production, the same as the one used by all of Gemini’s image and video generators.
AI photo inspiration
It's unlikely Google will simply drop these new features and move on. After all, the company has already deployed Veo 3, the latest iteration of the text-to-video model, to Gemini and YouTube for higher-quality short videos complete with synced dialogue and background audio. Tools that animate stills today may very well narrate them tomorrow.
This is more of a play for those not constantly trying the latest AI toy, but who do like to share photos and look at pictures taken by others. It's easy to poke fun at the idea of making your selfie move, but that's the sort of feature that attracts a lot of users who want to see just how animated AI can make them.
A new fabrication method could make photonic circuits cheaper and more practical by directly integrating quantum dot (QD) lasers onto silicon chips, a process that could influence how future smart home devices, fitness trackers, and even laptops are engineered.
The research team, led by Rosalyn Koscica at the University of California, achieved this by combining three key strategies.
They used a pocket laser configuration for direct integration, followed a two-step growth method involving metalorganic chemical vapor deposition and molecular beam epitaxy, and introduced a polymer gap-filling technique to reduce optical beam spread.
Closing the gap with careful engineering
This development addresses longstanding challenges involving material incompatibilities and coupling inefficiencies that have historically limited the performance and scalability of integrated photonic systems.
The combined efforts minimized the initial interface gap and made it possible for lasers to function reliably on silicon photonic chiplets.
As the researchers note, “Photonic integrated circuit (PIC) applications call for on-chip light sources with a small device footprint to permit denser component integration.”
The new approach enables stable single-mode lasing in the O-band, a wavelength range well-suited for data communications in data centers and cloud storage systems.
By integrating the lasers directly with ring resonators made of silicon or using distributed Bragg reflectors from silicon nitride, the team has also addressed issues related to alignment and optical feedback.
One of the more surprising findings from the research is how well the lasers perform under heat.
“Our integrated QD lasers demonstrated a high temperature lasing up to 105 °C and a life span of 6.2 years while operating at a temperature of 35 °C,” says Ms. Koscica.
These performance metrics suggest a level of thermal stability previously difficult to achieve with monolithically integrated designs.
This thermal resilience opens the door to more durable applications in real-world environments, where temperature fluctuations can limit the reliability of photonic components.
It may also reduce the need for active cooling, which has traditionally added cost and complexity to past designs.
Beyond performance, the integration method appears well suited to large-scale manufacturing.
Because the technique can be executed in standard semiconductor foundries and does not require major changes to the underlying chip architecture, it holds promise for broader adoption.
The researchers argue that the method is “cost-effective” and “can work for a range of photonic integrated chip designs without needing extensive or complex modifications.”
That said, the approach will likely face scrutiny regarding consistency across large wafers and compatibility with commercial photonic systems.
Also, success in controlled lab environments does not guarantee seamless deployment in mass manufacturing settings.
Still, the combination of a compact laser design, compatibility with conventional processes, and integration of O-band functionality makes this development notable.
From data centers to advanced sensors, this silicon-compatible laser integration could bring photonic circuits closer to mass-market viability.
Via IEEE