
Technology


The Samsung Galaxy S25 Ultra seemingly scratches more easily than its predecessor

TechRadar News - Mon, 02/03/2025 - 04:14
  • A durability test has revealed that the Samsung Galaxy S25 Ultra's Corning Gorilla Armor 2 screen might scratch more easily than first thought
  • This could, however, mean that the phone is less likely to shatter when dropped
  • These results are at odds with Corning's claims

The move to Corning Gorilla Armor 2 protection is supposedly one of the biggest upgrades for the Samsung Galaxy S25 Ultra, with the new material sounding, on paper, like a big improvement on the original Gorilla Armor used by the Samsung Galaxy S24 Ultra. However, a new durability test suggests the reality might be a bit more complicated than that.

JerryRigEverything (via Phone Arena) has released a video testing the durability of the Galaxy S25 Ultra in various ways, and one interesting thing they found was that the phone's screen started getting scratched at level six of the Mohs scale of hardness (a system used to measure scratch resistance).

That’s in line with a lot of other high-end phones, but it’s a step down from the Galaxy S24 Ultra, which in the same test didn’t start getting scratches until level seven. So, in other words, the Galaxy S25 Ultra’s screen is seemingly easier to scratch than its predecessor’s.

A shatter-resistant compromise

So, what’s going on here? While JerryRigEverything doesn’t have a definitive answer, they posit that perhaps the glass was too brittle on the Samsung Galaxy S24 Ultra. After all, increasing scratch resistance will typically mean making the glass harder, which can also make it more likely to shatter when dropped.

So, Corning and Samsung might have decided that this is a better balance – sacrificing some scratch resistance for improved drop resistance.

That said, while Corning itself states that drop resistance has been improved compared to the original Gorilla Armor, it also says that “Gorilla Armor 2 maintained its exceptional scratch resistance.”

So, it’s possible that, for whatever reason, JerryRigEverything’s results will prove to be outliers. But even if the glass on the Galaxy S25 Ultra really is more prone to scratching, we’d argue it’s probably a fair trade if that means it’s less likely to smash.

What do you value more in a smartphone: strong scratch resistance or strong drop resistance? Let us know in the comments.

Categories: Technology

The next PlayStation State of Play could come as early as Valentine's Day

TechRadar News - Mon, 02/03/2025 - 04:08
  • A fairly reliable leaker has hinted at the next State of Play date
  • The next PlayStation presentation could air around Valentine's Day
  • Updates for upcoming PS5 games like Ghost of Yōtei seem likely

A PlayStation State of Play presentation could be happening this month per a reliable leaker, which would line up with previous years.

The rumor comes from NateTheHate on X / Twitter (via VGC), a fairly reliable source who has revealed accurate information about PlayStation and Nintendo products in the past. In a reply to another user asking about a specific date for the State of Play presentation, he responds: "What does your heart tell you?" That strongly suggests a live date of on or around February 14 (Valentine's Day).

Now, it's possible that Nate could just be guesstimating here. A State of Play for February 2025 is a fairly safe bet, given Sony has run these presentations towards the start of the last two years. We also know that Sony has at least two big PS5 games in the pipeline for this year - those being Death Stranding 2: On the Beach and Ghost of Yōtei, both sequels to a pair of critically acclaimed titles.

A State of Play this month would also present a good opportunity for Insomniac Games to give a more thorough update on its upcoming Marvel's Wolverine game. The developer did share a small message about the game last week, stating it has to remain "very stoic until it’s time to pop the claws down the road." Could that time be this month? We'll need to wait and see.

It'd also be reasonable to expect an update on Tekken 8 season 2 - a massive patch for the fighting game that doesn't yet have a release date. While we currently don't know which characters are going to be added for the game's second year, Bandai Namco has not so subtly teased the return of Anna Williams in the original season 2 trailer.

In any case, keep your eyes glued to PlayStation's social channels this month, as that State of Play announcement could happen any day now.

Categories: Technology

Savings Rates Over 4% APY Could Stick Around a While Longer, but Not Forever. Today's Best Savings Rates for Feb. 3, 2025

CNET News - Mon, 02/03/2025 - 04:00
This account can help you stick to your savings goals, even if rates fall.
Categories: Technology

A storm is coming: how HPC protects us against weather-related disasters

TechRadar News - Mon, 02/03/2025 - 03:55

To most people, words like algebra, algorithms, and computational mathematics may bring back memories of educational struggles. But behind these abstract concepts lies a powerful, life-saving tool: High-Performance Computing (HPC). HPC leverages advanced mathematics and enormous processing power to handle calculations that were once unimaginable, making it indispensable across a range of disciplines, including meteorology.

Without HPC, our ability to predict natural disasters would be vastly diminished. From issuing hurricane warnings that enable mass evacuations to forecasting floods before they reach populated areas, HPC allows meteorologists to transform data into actionable insights that save lives. In a world without HPC, many more lives would be lost to extreme weather.

Mathematics that saves lives

Natural disasters impact nearly every corner of the globe. In 2023, earthquakes in Turkey and Syria led to the highest death toll of any natural disaster that year. Economically, Hurricane Katrina remains one of the world’s most costly disasters, second only to the earthquake and tsunami that struck Japan in 2011. These catastrophic events underscore the importance of accurate and timely forecasting, a feat made possible by the power of HPC.

During my tenure at Red Oak Consulting and my time working within the Met Office, I have seen firsthand the transformative power of High-Performance Computing (HPC) in forecasting impending weather catastrophes with remarkable accuracy, a capability that will only become more vital to society in the years ahead.

Natural disasters on the rise

Climate change is transforming the world’s landscape, fueling extreme weather events that threaten to reshape entire regions. Rising global temperatures drive severe droughts, increase the intensity of storms, and intensify tropical cyclones. The oceans, warmed by climate change, provide ideal conditions for storms to form and strengthen, while rising sea levels and water-saturated air supercharge these events.

With rising sea levels blurring the boundaries between land and ocean, coastal areas that were once considered safe are now at risk of flooding. Climate change redraws the map of risk, leaving previously untouched regions vulnerable to natural disasters. As the World Wildlife Fund (WWF) notes, climate change is a key driver behind the increased frequency and severity of hurricanes, which last longer and reach higher intensities than ever before.

The devastating floods in Spain, which tragically claimed 200 lives, highlight the increasing risks climate change poses to weather patterns and water systems. According to the Intergovernmental Panel on Climate Change (IPCC), extreme weather events, including floods and droughts, have intensified due to human-induced warming. Rising global temperatures accelerate and destabilize the hydrological cycle, resulting in extreme variations in water availability.

The phenomenon known as DANA (Isolated Depression at High Levels) played a major role, as residual summer heat from the Mediterranean clashed with polar air, creating convective clouds and torrential rains. Warmer sea surfaces and a moisture-laden atmosphere, both driven by climate change, amplify such events, making deadly flash floods more frequent and severe.

Why is HPC crucial to meteorology?

For decades, weather forecasting has relied on complex mathematical equations and vast amounts of data. However, until recently, the computing power needed to process these elements quickly and accurately was limited. HPC has now become a crucial tool, allowing meteorologists to model and forecast extreme weather events with unprecedented precision and speed, giving communities precious time to prepare for what lies ahead.

HPC processes vast datasets from satellites, ocean buoys, radar, aircraft, and ground stations, integrating them to create models that simulate various weather scenarios. These advanced models can predict hurricanes, cyclones, heatwaves, and flash floods by running billions of calculations that reveal how atmospheric conditions may develop. HPC enables these models to be run at an incredibly high resolution, pinpointing likely developments across specific regions and timeframes.

For example, in the case of an approaching hurricane, HPC can forecast its expected path, intensity, wind speeds, and rainfall distribution, allowing meteorologists to issue accurate, targeted warnings well in advance of impact. Without HPC, such precise predictions would be impossible, and communities would be left with much less time to prepare for incoming disasters.

HPC also enables real-time data assimilation, which means it can integrate the latest data into ongoing forecasts, updating predictions minute by minute as conditions change. This capability is particularly vital for rapidly shifting events, like thunderstorms and cyclones, which can be unpredictable and fast-moving. HPC’s speed and scale make it possible to refine forecasts down to highly specific locations, helping authorities make informed decisions on evacuations, shelter locations, and resource deployment.
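The real-time assimilation described above can be illustrated with a one-dimensional Kalman-style update. This is a deliberately simplified sketch, not operational forecasting code: real systems assimilate millions of observations at once, and the temperature figures below are invented for illustration.

```python
def assimilate(forecast, forecast_var, obs, obs_var):
    # One-dimensional Kalman update: blend the model forecast with a new
    # observation, weighting each by its (inverse) uncertainty. The gain
    # approaches 1 when the forecast is uncertain and 0 when it is trusted.
    gain = forecast_var / (forecast_var + obs_var)
    analysis = forecast + gain * (obs - forecast)
    analysis_var = (1 - gain) * forecast_var
    return analysis, analysis_var

# Model says 20.0 °C (variance 2.0); a station reports 22.0 °C (variance 0.5).
temp, var = assimilate(20.0, 2.0, 22.0, 0.5)
# The more trusted observation pulls the estimate toward 22, and the
# analysis uncertainty ends up smaller than either input's.
print(round(temp, 2), round(var, 2))
```

Repeating this update as each new batch of observations arrives is what lets a forecast be refreshed minute by minute.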

Future-proofing with climate modelling

While weather forecasting predicts short-term conditions, climate modelling aims to simulate long-term changes, giving us insight into how rising temperatures, greenhouse gases, and other factors will shape our planet’s climate over decades or centuries. HPC is essential for this purpose, allowing scientists to simulate the complex, interconnected systems driving Earth’s climate and to explore possible future scenarios.

At its core, climate modelling is about understanding the intricate dynamics between atmospheric circulation, ocean currents, and land-sea interactions. Models consider variables like greenhouse gas concentrations, solar radiation, cloud formation, and human activities, integrating them to project future climate conditions. Because Earth’s climate system has countless interdependent factors, climate models require immense computational power. HPC enables scientists to run these models and simulate climate interactions with unprecedented detail and accuracy.

HPC-powered climate models provide insights crucial for planning and policymaking. For instance, by running simulations based on different emissions levels, scientists can predict potential outcomes for global temperatures, sea levels, and weather patterns. This data helps inform government policies on climate resilience, infrastructure planning, and disaster preparedness, empowering decision-makers to build defenses against the impacts of climate change.

Furthermore, HPC enables ‘ensemble modelling’, where multiple simulations run in parallel with slight variations to account for uncertainties. This approach yields more reliable, probabilistic forecasts, offering a range of potential outcomes. Ensemble modelling is essential in climate science, as it provides a fuller picture of possible scenarios and equips policymakers with the information needed to make informed, adaptive decisions.
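As a toy illustration of why ensemble modelling works, the sketch below perturbs the initial conditions of the Lorenz-63 system, a classic chaotic stand-in for atmospheric dynamics, and shows the ensemble spread growing with forecast lead time. This is a hypothetical example with arbitrary parameters, not code from any operational center.

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    # One Euler step of the Lorenz-63 equations, a minimal chaotic model
    # of atmospheric convection often used to demonstrate forecast error growth.
    x, y, z = state
    return state + dt * np.array([sigma * (y - x),
                                  x * (rho - z) - y,
                                  x * y - beta * z])

def run_ensemble(initial, n_members=20, n_steps=2000, noise=1e-3, seed=0):
    # Perturb the initial condition slightly for each ensemble member,
    # integrate every member forward, and return all trajectories.
    rng = np.random.default_rng(seed)
    starts = initial + noise * rng.standard_normal((n_members, 3))
    traj = np.empty((n_members, n_steps, 3))
    for m in range(n_members):
        state = starts[m]
        for t in range(n_steps):
            state = lorenz_step(state)
            traj[m, t] = state
    return traj

traj = run_ensemble(np.array([1.0, 1.0, 1.0]))
# The spread across members widens with lead time: a narrow spread at a
# given horizon signals a confident forecast, a wide one genuine uncertainty.
print("spread at step 100: ", traj[:, 100, 0].std())
print("spread at step 1900:", traj[:, 1900, 0].std())
```

In an operational setting the same idea is applied to full atmospheric models, which is why ensembles are among the most computationally demanding workloads HPC centers run.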

HPC and global aviation

A less visible yet equally vital area where HPC powers forecasting is in global aviation. The skies may seem vast and unpredictable, but behind every transatlantic flight and intercontinental journey lies a finely tuned network of meteorological support. Just two centers worldwide provide real-time aviation forecasts: the World Area Forecast Centers (WAFCs), operated by the UK Met Office and NOAA in the United States.

Tasked with delivering critical weather forecasts for safe flight planning, the WAFCs rely on HPC to generate up-to-the-minute insights that help pilots navigate potential hazards. The International Civil Aviation Organization (ICAO) oversees these centers, which keep watch on everything from turbulence to icing conditions across global airspace. With HPC enabling rapid data processing, WAFCs can predict hazardous conditions well before a flight reaches them, allowing pilots to adjust routes and ensure passenger safety.

Imagine a transatlantic flight with hundreds of passengers on board, cruising at high altitude over the Atlantic. Without HPC, forecasters would struggle to track storm systems, turbulence, and potential icing hazards in real time. Thanks to HPC, WAFCs monitor vast stretches of atmosphere, predict weather events, and ensure that flight paths are optimized for safety and efficiency. This essential capability keeps global aviation moving smoothly and mitigates the risks posed by unforeseen weather conditions.

Preparing with precision

As the risk of natural disasters grows, high-performance computing stands as a frontline defense, transforming raw data into life-saving forecasts. HPC models simulate hurricanes, wildfires, and floods in astonishing detail, giving communities and emergency responders precise, real-time updates that guide preparations and minimize harm.

Beyond immediate crises, HPC powers long-term climate models that reveal how rising temperatures and sea levels will shape future risks. Cities use this information to strengthen defenses, plan resilient infrastructure, and adapt to an era of extreme weather. Enhanced by AI, HPC pushes the boundaries of early detection, identifying patterns that signal emerging threats and offering vital insights that help us act with precision.

So, the next time you check a weather forecast, remember the sophisticated systems working behind the scenes, powered by HPC, predicting natural disasters, and preparing us for whatever lies ahead.

We've compiled a list of the best cloud computing services.

This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

Categories: Technology

Here's Why You Should Get a Lymphatic Drainage Massage

CNET News - Mon, 02/03/2025 - 03:00
Wondering how to safely do lymphatic drainage massage at home and whether it actually works? Here are the benefits, risks and techniques, according to a licensed specialist.
Categories: Technology

ChatGPT vs. DeepSeek: which AI model is more sustainable?

TechRadar News - Mon, 02/03/2025 - 02:46

By now, even casual observers of the tech world are well aware of ChatGPT, OpenAI’s dazzling contribution to artificial intelligence. Its ability to generate coherent, on-point responses has upended online research and sparked endless speculation about AI’s growing role in our everyday lives.

A more recent challenger, China’s open-source AI-powered chatbot DeepSeek, has drawn its own intrigue, promising to run more efficiently and to be better suited to non-English users than its American competitor.

Yet in the rush to assess its functionality, adoption, and potential geopolitical sway, one pressing question seems to have been sidelined: how do the environmental credentials of ChatGPT and DeepSeek compare?

Where It All Began: A Look at ChatGPT and DeepSeek’s Origins

ChatGPT

ChatGPT’s meteoric rise began in late 2022, with OpenAI and Microsoft forming a high-profile alliance to scale it via Azure’s cloud services. Every iteration of the GPT architecture, however, comes at a steep environmental price. Training such a colossal model requires immense computing power, and the subsequent energy use has raised uncomfortable questions about its carbon footprint.

DeepSeek

While DeepSeek hasn’t yet become a household name to the extent ChatGPT has, it’s earning a reputation as a leaner, more multilingual competitor. It uses techniques like pruning (removing unnecessary parts of the model to reduce size and improve efficiency), model distillation (training a smaller "student" model to imitate a larger "teacher" model), and algorithmic streamlining (optimizing each step of the computation process to minimize wasted resources and improve overall performance) – all intended to cut down on resources and associated costs.
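The distillation technique mentioned above can be sketched in a few lines: the student is trained to match the teacher's temperature-softened output distribution via a KL-divergence loss. The logits below are invented for illustration and come from neither model.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-scaled softmax; a higher T softens the distribution,
    # exposing more of the teacher's "dark knowledge" about wrong classes.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL divergence between the teacher's softened outputs and the
    # student's: minimizing it trains the student to imitate the teacher.
    p_t = softmax(teacher_logits, T)
    p_s = softmax(student_logits, T)
    return float(np.sum(p_t * (np.log(p_t) - np.log(p_s))))

teacher = np.array([4.0, 1.0, 0.5])
good_student = np.array([3.9, 1.1, 0.4])  # closely imitates the teacher
bad_student = np.array([0.5, 4.0, 1.0])   # disagrees with the teacher
print(distillation_loss(good_student, teacher)
      < distillation_loss(bad_student, teacher))  # True
```

The efficiency payoff is that the smaller student model can then be served with far less compute than the teacher required.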

The theory goes that an AI needing fewer GPUs should, in principle, consume less energy overall. Yet details on its total environmental impact remain conspicuously thin, leaving observers to wonder if DeepSeek’s operational gains could truly deliver on the sustainability front.

Energy and Carbon Emissions

The most glaring environmental toll for both models lies in the power needed to train them. Early estimates suggest that rolling out ChatGPT’s latest language model, GPT-4, demanded colossal GPU capacity for weeks on end.

DeepSeek, meanwhile, claims to require fewer high-end chips, potentially reducing its total electricity draw.
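To put "colossal GPU capacity" in rough perspective, here is a back-of-envelope calculation. The fleet size, per-chip wattage, and duration are purely illustrative assumptions, not figures from OpenAI or DeepSeek.

```python
# Hypothetical training run: 10,000 accelerators at 700 W each
# (roughly the rated draw of an H100-class GPU) for 30 days.
n_gpus = 10_000
watts_per_gpu = 700
days = 30

# watts * hours / 1000 -> kilowatt-hours
kwh = n_gpus * watts_per_gpu * 24 * days / 1000
print(f"{kwh:,.0f} kWh")  # 5,040,000 kWh
```

Five gigawatt-hours is on the order of the annual electricity use of hundreds of homes, which is why a model claiming to need fewer chips, as DeepSeek does, could meaningfully shrink the footprint.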

Data Centers and Energy Sources

Powering ChatGPT on Microsoft’s Azure platform has its upsides and downsides. Microsoft is working to become carbon-negative by 2030, underpinned by investments in green energy and carbon capture. Yet many of its data centers remain tethered to non-renewable energy grids, and the manufacture of sophisticated AI chips is itself resource-intensive.

DeepSeek appears to rely on Alibaba Cloud, China’s most prominent cloud provider, which has set similar targets for carbon neutrality. But China’s national grid continues to rely heavily on coal, meaning the actual environmental impact might be more significant unless DeepSeek is sited in locations rich in renewable infrastructure. That said, DeepSeek’s focus on efficiency might still make it less carbon-intensive overall.

Water Usage and Cooling

Running giant clusters of GPUs produces heat – lots of it. Data centres typically use vast amounts of water for cooling, especially in regions with high temperatures. Microsoft has come under fire for consuming billions of liters of water, some of which goes towards cooling the hardware behind AI operations.

Information on DeepSeek’s water footprint is scant. If Alibaba Cloud’s newer facilities use advanced cooling methods – such as immersion cooling (submerging servers in a thermally conductive liquid to dissipate heat more efficiently) – DeepSeek might fare better in terms of water usage. But with so little public data on its processes, it’s difficult to measure how it stacks up against ChatGPT on this front.

The Hidden Cost of E-Waste

The relentless pace of AI hardware development means GPUs and other accelerators can quickly become obsolete. ChatGPT’s operations, involving cutting-edge equipment, likely generate a rising tide of e-waste, though precise figures are elusive.

In principle, DeepSeek’s more frugal approach implies fewer chips, which could mean slower turnover and less waste. Still, this remains an educated guess until there’s more visibility into how DeepSeek’s hardware ecosystem is managed.

Where Do They Stand?

At first glance, OpenAI’s partnership with Microsoft suggests ChatGPT might stand to benefit from a more environmentally conscious framework – provided that Microsoft’s grand sustainability promises translate into meaningful progress on the ground. DeepSeek, meanwhile, must grapple with a coal-reliant grid in China, yet its drive for efficiency could place it in a better position to curb overall energy consumption per operation.

That said, the U.S. is hardly a clean-energy haven either. While Microsoft has pledged to go carbon-negative by 2030, America remains one of the world’s largest consumers of fossil fuels, with coal still powering parts of its grid. Moreover, political shifts could slow progress: the resurgence of a "drill, baby, drill" mentality in Republican energy rhetoric suggests a renewed push for oil and gas, potentially undermining AI’s green ambitions.

Ultimately, AI is hurtling forward at breakneck speed, but the environmental ramifications lag far behind in public scrutiny. As these systems weave themselves ever deeper into our politics, economy, and daily interactions, the debate on their energy sources, water usage, and hardware footprints must become more transparent. If the world’s appetite for AI is unstoppable, then so too must be our commitment to holding its creators accountable for the planet’s long-term well-being. That responsibility extends not just to China and the U.S., but to every nation where AI is trained, deployed, and powered.

We've created a comprehensive list of the best AI tools.


Categories: Technology

Why ‘mission-critical’ mobile devices are key to business survival and security

TechRadar News - Mon, 02/03/2025 - 01:56

When we hear the word “mobile,” the automatic assumption is that we’re talking about iPhones and iPads. However, "mobile" is an umbrella term that extends far beyond just phones and tablets.

Mobility use cases are enabled by infrastructure that affords users the freedom to stay connected while on the move. This ranges from handheld portable electronics and smart wearables to point-of-sale (POS) systems and Apple Vision Pro headsets. This means that most of the critical technologies used by businesses to operate and connect with customers are increasingly provided through mobile solutions.

Many organizations have yet to reach a level of maturity in their mobile programs to reflect the critical role devices play. Such assets can no longer be viewed solely as ‘niche’ by businesses. They are, in fact, ‘mission-critical’ devices that must be treated as first-class assets when developing both security and resilience strategies.

What is meant by ‘mission-critical use’

It’s safe to say that one of the biggest revolutions over the last 30 years, alongside developments like the internet, has been the introduction of mobile devices into the workplace.

What originally started as a “nice to have,” with only certain individuals having access to smartphones and personal digital assistants (PDAs), has evolved to the point where mobiles are a necessity for any successful business.

As a result, we have seen new device form factors join the ‘mission critical’ category. These mobile devices are essential to the operation and running of an organization; if one of these devices fails, the entire business would likely grind to a halt.

Many of these devices are deployed in environments where they might be shared among multiple users or designated for specific functions rather than assigned to an individual.

Given the broad definition, there is a wide spectrum of devices that can count as ‘mission critical,’ each one serving a distinct need within a business. This includes tablets used in healthcare to monitor patient recovery or deliver clinical therapy, as well as retail systems such as mobile credit card processors used to process payments, manage inventory, or track time on the sales floor.

Even an Apple Vision Pro headset can be considered ‘mission critical’ depending on the use case, with such devices being used in power stations to train technicians and optimize site operations.

Whilst these are three very different examples, each illustrates how vital mobile assets have become for organizations. If they were to fail, it could result in lost revenue for a small business, or, in more serious cases, put patients’ or workers’ lives at risk.

Challenges organizations face with managing ‘mission-critical use’ devices

Maintaining operational uptime on ‘mission-critical use’ devices is essential, and this means making them both cyber resilient and operationally resilient.

Mobiles are now a common attack vector for cybercriminals, in part because they often exhibit the worst security standards. For example, 40% of mobile users are running a device with known vulnerabilities. Poor cybersecurity standards mean that the bar to exploit such an asset is extremely low, making it easier for cybercriminals to take them offline and halt the operations of victim organizations.

Businesses tend to focus all their efforts on meeting regulatory checkboxes for compliance, yet they often overlook specific security threats and vulnerabilities that might put the device at risk. It’s also frequently assumed that limited-access devices are safe by default, but this is rarely the case, particularly when work devices are used for personal reasons.

On the other hand, some businesses may have elements of strong cybersecurity but fail to implement practices strategically. For example, automated processes might update all devices with new patches at the same time. If the business doesn’t have backup systems in place, then it could face operational downtime while the update takes place.
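One common mitigation for the simultaneous-update risk described above is a phased, wave-based rollout: patch a small wave first, verify it, then proceed. The sketch below is a hypothetical illustration with invented device names, not a reference to any particular MDM product.

```python
import math

def rollout_waves(devices, n_waves=3):
    # Split a device fleet into sequential update waves so that
    # 'mission critical' devices are never all patched at once; a
    # malfunctioning update can be halted before it reaches later waves.
    size = math.ceil(len(devices) / n_waves)
    return [devices[i:i + size] for i in range(0, len(devices), size)]

fleet = [f"pos-terminal-{i:02d}" for i in range(10)]
waves = rollout_waves(fleet, n_waves=3)
for n, wave in enumerate(waves, 1):
    print(f"wave {n}: {wave}")
```

Devices still running the previous patch in later waves double as the backup capacity the paragraph above calls for.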

Performance is king when it comes to mobile devices, and providing the best possible service to customers means ensuring frontline workers have the right tools performing reliably.

For example, a mobile device used in a critical scenario can’t be allowed to run out of power because heavy, cumbersome software is placing undue strain on the battery.

While these are two different issues, they stem from the same problem: businesses don’t fully understand how dependent they are on mobiles and lack awareness when examining the security of such devices.

While mobile devices used by workers on the frontline, such as POS systems, are obvious additions to the ‘mission critical’ category, the mobile devices of knowledge workers are equally important. If an executive loses access to their smartphone, they can’t retrieve essential information or perform their job effectively. This is potentially as disastrous as a frontline system going down; however, it’s often overlooked by IT teams.

Organizations need to understand and assess all the mobile devices that are used for work and recognize which ones are ‘mission critical.’ Only then can they start addressing the security challenges they face and make mobiles more resilient. Tackling the problem requires a structured and layered model.

Building resilience in ‘mission critical’ devices

The first stage in assessing an organization's ‘mission critical’ footprint involves a comprehensive asset inventory. This means understanding what assets are deployed, where they are, and what they’re accessing.

The inventory should include a mapping of where there are overlaps between devices and the applications that are also ‘mission critical.’ These are crucial aspects that organizations often overlook. Once a business has an understanding of their ‘mission critical’ assets, they can implement a backup plan for when they go down.
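The device-to-application overlap mapping might be sketched as follows. All records, field names, and application names here are hypothetical, for illustration only.

```python
# Hypothetical asset-inventory records; the schema is illustrative.
inventory = [
    {"device": "ward-tablet-01", "location": "hospital ward",
     "apps": ["patient-monitor", "email"]},
    {"device": "pos-01", "location": "sales floor",
     "apps": ["payments", "inventory"]},
    {"device": "exec-phone-07", "location": "remote",
     "apps": ["email", "calendar"]},
]
critical_apps = {"patient-monitor", "payments"}

def mission_critical(inventory, critical_apps):
    # Flag a device when it runs at least one 'mission critical'
    # application: these overlaps are what the inventory should surface.
    return [d["device"] for d in inventory if critical_apps & set(d["apps"])]

print(mission_critical(inventory, critical_apps))
# ['ward-tablet-01', 'pos-01']
```

Once the flagged list exists, backup planning and stricter security baselines can be scoped to exactly those devices rather than the whole fleet.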

During this phase, it’s also crucial for organizations to know their suppliers, and to understand the control options available for the devices they’ve acquired. This will enable security teams to apply minimum security standards immediately as devices are unboxed, making good security hygiene just as important as application deployments.

Basic cyber hygiene practices, such as implementing Multi-Factor Authentication, enforcing rigorous patching processes, and requiring strong passwords, are essential for improving the security standards of ‘mission critical’ devices. The majority of breaches can be tied to failures in getting the basics right.

Following the ‘mission critical’ asset inventory, it is imperative to implement threat prevention. This includes device management to ensure that devices are monitored and security policies are enforced. By implementing such capabilities, organizations can block malicious activity before it reaches the device, helping to maintain operational availability.

Settings are equally important. Limiting non-essential notifications, restricting high-risk applications, and carefully managing access controls can enhance safety on ‘mission critical’ tools. The same approach applies to backup planning.

Finally, organizations should think about connectivity to workloads and backend applications that are operated off the device – for example, connection to a database or running an AI workload in the cloud. It’s important that all data in transit between devices and workloads is protected.

However, cybersecurity strategies and practices must also be aligned with uptime strategies. Improving cyber hygiene is wasted effort if a malfunctioning update takes down all ‘mission critical’ devices at the same time.

For ‘mission critical’ devices, broad updates or general alerts aren’t ideal. A tablet relied on by an airline pilot or surgeon should not receive disruptive updates during essential operations.

Organizations need to either establish a plan to schedule downtime aligned with business requirements or procure backup devices and implement a local protocol to enable immediate device swapping as needed.

Ultimately, mobile devices are now critical to the operations of every business, and IT teams need to treat them in the same way they would other critical assets. Businesses need a clear plan for managing ‘mission critical’ devices that keeps them both secure and operationally resilient. For many, the time has come for mobile to assume a first-class role in the enterprise, where its impact on the business is understood and managed.

We've listed the best Mobile Device Management solutions.

This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

Categories: Technology

8 Foods That Will Help You Build Muscle and Optimize Your Gains

CNET News - Mon, 02/03/2025 - 01:33
Muscle growth can be improved by more than just hitting the gym.
Categories: Technology

Stearns & Foster Lux Estate Mattress Review 2025: Experts Test Handcrafted Comfort, Available Online or In-Store

CNET News - Mon, 02/03/2025 - 00:46
If you're looking for a decent supportive hybrid mattress and aren't on a strict budget, the Stearns & Foster Lux Estate Mattress might be a great option for you.
Categories: Technology

Today's NYT Mini Crossword Answers for Monday, Feb. 3

CNET News - Sun, 02/02/2025 - 21:59
Here are the answers for The New York Times Mini Crossword for Feb. 3.
Categories: Technology

La Liga Soccer Livestream: How to Watch Barcelona vs. Alavés From Anywhere

CNET News - Sun, 02/02/2025 - 16:00
Hansi Flick's players are in big need of a win as they look to keep pace with leaders Real Madrid.
Categories: Technology

Severance season 2 episode 3 recap: The baby goats return!

TechRadar News - Sun, 02/02/2025 - 16:00

Severance season 2 has all of us on the edge of our seats, and they're once again teasing us with weekly releases. Episode 3 of one of the best Apple TV Plus shows is finally here, and it packed a lot into its hour-long runtime. Most notable were, of course, the goats, but we've also had our first look at Gwendoline Christie's new character and witnessed Mark making a shocking decision about severance. Elsewhere, Dylan's latest perk could prove to torment him.

If you need a reminder of previous episodes, check out our Severance season 2 episode 1 and episode 2 recaps to bring you up to speed.

Cobel's job offer

(Image credit: Apple TV Plus)

The episode sees Cobel still on the road after abruptly driving away from Mark, who confronted her outside her home. When she reaches an unknown place called Salt's Neck, she decides to turn around and head back, but like most things Cobel does, we don't quite know why.

Cobel heads to Lumon, where she meets up with Helena after considering the job offer. She says if she returns, she wants to keep a close eye on MDR so she can watch Mark complete Cold Harbor, one of the biggest Severance mysteries. She's also not pleased about Milchick running the show and wants him gone, but Helena pushes back on this.

Not for the first time, Cobel abruptly leaves the meeting and drives off; she's been constantly running away from her problems and fears lately. Something weird is going on, and I want to know what!

Meeting a new department

By this point, Mark is fully back to work at Lumon. We see him timing how long it takes from entering the building to the point where he's severed and switches into innie Mark, abruptly stopping the countdown.

He's on a mission, printing out flyers of a Ms. Casey drawing, and asks Helly and Irving to pass them around the Lumon departments. It's also important to note that Helly is still acting weird around Mark and not like Helly at all. Nevertheless, she helps Mark on his quest to find Ms. Casey.

They come across a small hallway. They crawl through it and find it leads to a large room full of goats on what seems to be artificial grass. We then meet a brand new team led by Gwendoline Christie, who is immediately suspicious of MDR and asks if they're going to kill her (not a normal question to ask at work, but okay...). She reveals that this department is called Mammalians Nurturable, and hey presto, we've got a new department – and it's linked to the goats, who have intrigued fans enough to inspire plenty of Severance goat theories.

Mark and Helly talk to the woman, asking her if she knows anything about the Wellness department or Ms. Casey. Mark reminds her that if they can make Ms. Casey disappear, any one of them could follow, including their goats, and it seems to do the trick. She eventually admits that Ms. Casey used to conduct wellness sessions in their “husbandry tanks.” She and her staff all seem to have liked her; they believe she retired but tell Mark they won't get in the way if he thinks otherwise. So that's something, at least.

Milchick's disturbing gift from The Board

(Image credit: Apple TV Plus)

Natalie arrives at Milchick's office to present him with a gift from The Board. But he is left visibly uncomfortable when he realizes they're a series of paintings recontextualizing Lumon’s history with Black versions of Kier and his subjects. With her earpiece connected to the mysterious Board, Natalie reels off some corporate-approved messaging, saying: “The Board austerely desires for you to feel connected to Lumon’s history. To that end, please accept from the Board these inclusively re-canonicalized paintings intended to help you see yourself in Kier, our founder.”

She also claims she received the same gift and was "moved" by it, but when the Board terminates the call, she and Milchick exchange an uncomfortable glance before she goes back to her forced smile. Natalie's facade has slipped for a brief moment, and I'm excited to see if she starts to rebel against Lumon, too. Milchick, understandably, packs up the paintings and puts them away.

Dylan’s innie and Gretchen

(Image credit: Apple TV Plus)

In a very emotional scene, Ms. Huang comes to get Dylan and leads him to the former security room, which has now been turned into the Outie Visitation Room. Turns out, Milchick was right when he teased these plans to Dylan and it has the potential to be one of the most twisted "perks" Lumon has come up with.

Dylan learns he's "earned" an 18-minute visit with his outie’s wife, Gretchen, where they discuss their three kids and get a bit closer. As Dylan’s session with his wife ends, she tells him she’s proud of him and loves him. He doesn’t know how to respond but feels great. Later, Dylan's outie watches the kids and when asked about the visit, Gretchen tells him it was "weird but good."

Mark's big decision and a surprise visit

(Image credit: Apple TV Plus)

Natalie visits Devon’s home to discuss Ricken's book, which became an unexpected hit after it was found on the severed floor. Lumon wants an "innie version" of the book, given the fact they liked it so much, but it's pretty suspicious that Lumon keeps rocking up to Devon and Ricken's home.

Devon is just as suspicious of Natalie as she is of Milchick, and she leaves to meet up with Mark. They plan to burn an image into his retinas to take into Lumon, but as we've seen before, previous attempts to sneak messages in and out have not gone well.

Asai Reghabi shows up and stops him from going through with this bold idea. She tells him that his wife Gemma is alive and there’s only one way to get information in and out of Lumon: reintegration. Now, reintegration is a controversial choice, considering it killed Petey in Severance season 1, but since Mark is desperate, he decides to go through with the procedure to hopefully stitch his memories back together.

Asai and Mark begin the reintegration process, and the episode ends with the two sides of Mark's persona blending together, insinuating that reintegration may work this time around – but now I'm pretty fearful for Mark's mental and physical health.

You might also like
Categories: Technology

Today's NYT Connections Hints, Answers and Help for Feb. 3, #603

CNET News - Sun, 02/02/2025 - 15:00
Here are some hints — and the answers — for Connections No. 603 for Feb. 3.
Categories: Technology

Today's Wordle Hints, Answer and Help for Feb. 3, #1325

CNET News - Sun, 02/02/2025 - 15:00
Here are some hints and the answer for Wordle No. 1,325 for Feb. 3.
Categories: Technology

Today's NYT Strands Hints, Answers and Help for Feb. 3, #337

CNET News - Sun, 02/02/2025 - 15:00
Here are some hints -- and the answers -- for the Feb. 3 Strands puzzle, No. 337.
Categories: Technology

Today's NYT Connections: Sports Edition Hints and Answers for Feb. 3. #133

CNET News - Sun, 02/02/2025 - 15:00
Here are some hints — and the answers — for Connections: Sports Edition No. 133 for Feb. 3
Categories: Technology

The Young, Inexperienced Engineers Aiding Elon Musk's Government Takeover

WIRED Top Stories - Sun, 02/02/2025 - 13:02
Engineers between 19 and 24, most linked to Musk’s companies, are playing a key role as he seizes control of federal infrastructure.
Categories: Technology

An Nvidia GeForce RTX 5090 with 96GB of GDDR7 memory? No, this is almost certainly the RTX 6000 Blackwell

TechRadar News - Sun, 02/02/2025 - 12:34
  • A shipping manifest has detailed what looks like a professional workstation card
  • It could possibly be the successor to Nvidia's RTX 6000 Ada, the most expensive graphics card in the world
  • Based on the RTX 5090, it is expected to have a whopping 96GB of memory, twice that of its predecessor

The GeForce RTX 5090, the latest flagship graphics card for gamers and creatives in Nvidia's GeForce 50 series, was unveiled at CES 2025 and has just gone on sale – but shortly before it did, rumors began to swirl of an RTX 5090 Ti model featuring a fully enabled GB202-200-A1 GPU and dual 12V-2×6 power connectors, theoretically allowing for up to 1,200 watts of power.

This speculation began following the appearance of a prototype image on the Chinese industry forum Chiphell - reporting on the image, ComputerBase said, “With 24,576 shaders, the GB202-200-A1 GPU is said to offer 192 active streaming multiprocessors, which were previously rumored to be the full expansion of the GB202 chip. The memory is said to continue to offer 32GB capacity, but with 32Gbps instead of 28Gbps, it will exceed the 2TB/s mark.”
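That bandwidth claim is easy to sanity-check: peak memory bandwidth is simply bus width times per-pin data rate, so a 512-bit bus at 32Gbps lands just over the 2TB/s mark, versus roughly 1.8TB/s at the stock 28Gbps:

```python
# Peak GDDR7 bandwidth = bus width (bits) * per-pin rate (Gbps) / 8 bits-per-byte
bus_width_bits = 512

bandwidth_28_gbs = bus_width_bits * 28 / 8   # stock RTX 5090 modules
bandwidth_32_gbs = bus_width_bits * 32 / 8   # rumored faster modules

print(bandwidth_28_gbs)  # 1792.0 GB/s
print(bandwidth_32_gbs)  # 2048.0 GB/s, i.e. just over 2TB/s
```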

Shortly after the engineering card surfaced online, ComputerBase also spotted shipping documents on NBD Data listing a graphics card with 96GB of GDDR7 memory, marked as "for testing." It is reasonable to assume that this unidentified model is actually a professional workstation card, potentially – let's say probably – the RTX 6000 Blackwell.

Useful for AI applications

(Image credit: NBD)

The GeForce RTX 5090 features 32GB of GDDR7, using sixteen 2GB modules connected through a 512-bit memory interface. 48GB would be possible if sixteen 3GB chips were used instead of 2GB chips.

If two of these 3GB chips were connected to each 32-bit controller, placing 16 chips on both the front and back of the graphics card in a "clamshell" configuration, the 96GB mentioned in the documents – which is twice as much as the RTX 6000 Ada, the most expensive graphics card in the world – would become a reality.
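The arithmetic behind that 96GB figure, as a quick check:

```python
# A 512-bit memory interface is sixteen 32-bit controllers; the clamshell
# scenario above hangs two 3GB GDDR7 chips off each controller.
controllers = 512 // 32                 # 16 x 32-bit memory controllers

single_sided_gb = controllers * 3       # sixteen 3GB chips -> 48GB
clamshell_gb = controllers * 2 * 3      # 32 chips, front and back -> 96GB

print(single_sided_gb, clamshell_gb)    # 48 96
```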

The shipping records indicate these GPUs use a 512-bit memory bus, reinforcing this theory. The internal PCB designation PG153, seen in the documents, aligns with known Nvidia Blackwell designs and has not yet appeared in any existing consumer graphics cards.

Nvidia is expected to introduce the RTX Blackwell series for workstations at its annual GPU Technology Conference (GTC 2025), so we should know more about them come March 2025. And yes, if you’re thinking 96GB of GDDR7 memory is overkill for gaming or creative purposes I’d agree with you. It is a good amount for AI tasks though, so we can expect to see Nvidia announce an AI version of the RTX 6000 Blackwell when it finally takes the wraps off its next-gen product.

You might also like
Categories: Technology

Forget the RTX 5090 – the RTX 5070 is the best gift Nvidia has given PC gamers in ages

TechRadar News - Sun, 02/02/2025 - 12:00

Yes, I know, we’re all very excited about how powerful Nvidia’s RTX 5090 is, and presumably many of us are also very upset that Nvidia apparently thought seven cards per retailer would be enough stock for launch day (note to Jensen Huang: that was a joke, please take your hitman off speed dial).

But even though it does look mighty impressive – and as per our RTX 5080 review, the middle child of the Blackwell generation is no slouch either – there’s a different GPU I’m really looking forward to this year, and it’s the lowest-spec desktop card Nvidia announced at CES 2025. That’s right, I’m talking about the RTX 5070.

Now, I’ve always had a soft spot for Nvidia’s xx70 GPUs; I rocked a GTX 970 back in the day, and I was a strong supporter of the RTX 3070 when it came out back in 2020. These cards typically find the right balance between performance and pricing; not too expensive, but still perfectly capable of delivering a solid gaming experience to the average consumer. And with the RTX 5070, I think we could be in for a treat – not least because Nvidia has seemingly done the unthinkable.

A pleasant surprise

See, at launch, the current-gen RTX 4070 cost $599 / £589 / AU$1,109. I thought that was a pretty fair price at the time – certainly better value for money than the higher-end Lovelace GPUs, and something we praised it for in our review. I was expecting to see the exact same price tag for the RTX 5070, but no: Nvidia has actually lowered the list price, bringing it down to $549 / £549 (around AU$880).

I’m really not exaggerating when I say that this is nuts. Obviously, we don’t have performance figures for the RTX 5070 yet, so there’s every possibility Nvidia does screw the pooch on this one, but let’s be honest: this card will likely sit somewhere between the RTX 4070 and RTX 4080, with $50 / £40 shaved off the price tag to boot. That’s great!

That’s not all, either; I’m just talking about my raw performance expectations here, but that’s without even factoring in Multi Frame Generation, which combined with DLSS 4 and Reflex 2 provides a serious performance boost for RTX 5000 GPUs without many of the drawbacks seen in previous iterations of Nvidia’s upscaling and frame-gen tech. Access to these tools – which can be retroactively improved by Nvidia – is a key winning factor for the 5070.

With that price tag, the RTX 5070 has genuine potential to be the new 1440p gaming king – or even a reasonably priced 4K card, once we see how well DLSS 4 and MFG actually perform on a more affordable Blackwell card. Personally, I have high expectations… don’t let me down, Nvidia.

You might also like...
Categories: Technology

Know Your Rights: 3 Rules for When Police Can Take Your Home Security Videos

CNET News - Sun, 02/02/2025 - 10:30
If you have a Ring doorbell or outside camera, your footage isn't always private. Here are cases when police can legally take your captured video.
Categories: Technology
