
Feed aggregator


Amazon unveils Alexa Plus, its brand new AI-infused voice assistant

TechRadar News - Wed, 02/26/2025 - 10:16
  • Amazon has announced an AI-powered version of its Alexa voice assistant
  • Alexa Plus is more conversational and can carry out complex tasks for you
  • There's no word yet on pricing or availability (this is a breaking story)...

Live from its HQ in New York City, Amazon is currently hosting its Alexa event, where we’ve all been expecting the launch of a brand new Alexa voice assistant. And lo and behold, it’s finally here – Amazon has revealed Alexa Plus, its new AI-infused voice assistant.

The announcement marks the biggest upgrade for the voice assistant since its launch in 2014. And from Vice President of Devices and Services Panos Panay’s demonstration at Amazon’s devices event, it looks rather impressive. Need a concierge? Sous-chef? Assistant? House manager? Alexa Plus seemingly has it all covered.

Live from New York City, Vice President of Devices and Services, Panos Panay, gives a first-look demonstration of Alexa Plus. (Image credit: Future)

One of the biggest improvements in Alexa Plus compared to the classic Alexa voice assistant is its impressive ability to hold conversations, which Panay seamlessly trialled live on stage at the event.

He asked, “I'm a little bit nervous about it, but we're about to do live demos. What do you think can go wrong?”. Alexa Plus responded with “With so many eyes on you, it's natural to feel a bit nervous. As for what could go wrong, let's just say Murphy's Law is probably sharpening his pencil right now”. So it’s confirmed: Alexa Plus has a great sense of humor.

So what can Alexa Plus do? Powered by AI models from Anthropic and Amazon Nova, it looks impressively versatile. Some of the demos included smart home control, making restaurant reservations and connecting to your calendar to add events or send invites to friends. The AI assistant also has vision powers, which means it can scan documents and recall information later.

Naturally, there's lots for kids too. A demo video showed Alexa Plus answering questions and creating stories, which it was able to do before – but this time it includes AI-generated images, too.

This is a breaking story; we'll update it with more information soon...

Categories: Technology

Best Internet Providers in Winter Haven, Florida

CNET News - Wed, 02/26/2025 - 10:09
The City of 100 Lakes is pretty limited regarding internet service providers, but that doesn't mean there's no quality to be found.
Categories: Technology

Trump's social media video garners pushback from Arabs and Muslims in U.S. and Gaza

NPR News Headlines - Wed, 02/26/2025 - 10:02

In a seemingly AI-generated video that the president posted on social media, images of destruction due to the war in Gaza are transformed into a glitzy resort called "TRUMP GAZA."

(Image credit: Screenshots via Instagram. Annotation by NPR)

Categories: News

A new Linux backdoor is hitting US universities and governments

TechRadar News - Wed, 02/26/2025 - 10:02
  • Unit 42 spots a new Linux malware
  • Auto-color can grant the attackers full access to compromised endpoints
  • Initial infection vector is unknown, but universities and governments hit

Universities and government offices in North America and Asia are being targeted by a brand new Linux backdoor called “Auto-color”, experts have claimed.

Cybersecurity researchers from Palo Alto Networks' Unit 42 revealed that in early November 2024 they came across a backdoor which was relatively difficult to spot, and impossible to remove without specialized software.

The backdoor was capable of opening a reverse shell to give the attackers full remote access, running arbitrary commands on the target system, tampering with local files, acting as a proxy, or dynamically modifying its configuration. The malware also comes with a kill switch, which allows the threat actors to remove all evidence of compromise and thus make analysis and forensics more difficult.

Dangerous threat

Given its advanced obfuscation features and an extensive list of dangerous capabilities, Auto-color was described as a very dangerous threat. However, Unit 42 could not attribute it to any known threat actor, nor did it want to discuss the victims in more detail. Therefore, we don’t know how many organizations were infected, nor what the end goal of the campaign is.

What’s also unknown is how the victims got infected in the first place. Unit 42 says the initial infection vector is unknown, but adds that the attack has to start with the victim executing a file on the target system. The file usually has a benign name, such as “door”, “log”, or “egg”.

Linux malware is becoming more sophisticated and widespread due to increased Linux adoption in cloud computing, enterprise servers, and IoT devices. Cybercriminals are shifting focus from traditional Windows targets to include Linux environments, exploiting misconfigurations, unpatched vulnerabilities, and weak security practices.

The rise of malware-as-a-service (MaaS) and automated attack tools makes Linux-based threats all the more effective.

Via BleepingComputer

You might also like
Categories: Technology

Nvidia retiring PhysX for its RTX 5000 GPUs has made some gamers furious - but I don't think it's a complete dealbreaker

TechRadar News - Wed, 02/26/2025 - 09:38
  • Nvidia's 32-bit PhysX support isn't present on RTX 5000 series GPUs
  • This will affect a number of older titles that utilize the physics API for enhanced visuals and particle effects
  • It adds to the multitude of issues the RTX 5000 series GPU users are currently facing

The RTX 5000 series launch has come with an abundance of issues and controversies that Nvidia is attempting to manage. One of them is the removal of a big feature from Nvidia's GameWorks suite, which has left many frustrated - but it may not be as bad as it seems.

As highlighted by Tom's Hardware, Nvidia quietly removed 32-bit support for one of its proprietary technologies, PhysX, on RTX 5000 series GPUs - a feature that was used in plenty of older titles, including The Witcher 3: Wild Hunt, Metro: Exodus, and Borderlands 2, all of which took advantage of the API for enhanced in-game physics (such as ragdoll and cloth physics seen in the evgaonthetube video below) and visual effects in-game (particularly particle effects). Using the tool saved developers a lot of time in coding, allowing complex physics to be more easily implemented.

Well, owners of Team Green's new Blackwell GPUs no longer have this luxury - it's forced some gamers to dedicate a second, older GPU to PhysX duties while using their next-gen cards (as shown in this report by XDA Developers), but this isn't very power efficient and is generally a hassle, as it requires running two GPUs simultaneously.

It seems like a frustrating move for fans, but Nvidia's recent focus on RTX and AI is likely why PhysX is being left behind. It's also worth noting that modern games are effectively no longer using PhysX, which means only older titles (those more than five years old) will see worse performance on RTX 5000 series GPUs - although I have to say that it's really not that big of a problem, as you can simply turn PhysX off.

Don't panic, this isn't going to render older games unplayable... but you should probably preserve your older GPU

Now, before I get hunted by Reddit for saying this, I am in no way saying the omission of PhysX is something to be championed or praised - as a matter of fact, I think this is yet another reason why you should stay away from the RTX 5000 series GPUs until its problems are resolved.

The ability to play older games with a certain graphics setting enabled on a flagship GPU shouldn't even be a question - it's totally mind-boggling to see worse performance on a newly-released flagship GPU when PhysX is enabled in a game like Batman: Arkham City, and I'm hoping the complaints will encourage Nvidia to consider catering to older games.

It's also a very valid argument that potentially paying over $2,000 (in the case of the RTX 5090's inflated pricing for third-party cards) and losing out on a feature that enhances the visual quality of older titles is absurd. What I can say, is that the removal of 32-bit PhysX support on the new Blackwell GPUs isn't the height of Team Green's issues as of now and doesn't mean you won't get to play classic titles - you will however need to disable PhysX in games that support it to avoid significant frame drops.

Since most AAA titles today are moving away from the use of PhysX, I daresay its omission (at least of 32-bit support, as 64-bit is still functional) is somewhat reasonable. The focus on RTX, AI, and ways to bring different technologies to enhance new games is clear - it just hurts gamers like myself who love playing classic games.

If you still own an older RTX GPU and you're a classic video game enthusiast, I'd advise you to preserve it - with more advanced technologies coming, the likelihood of Nvidia dropping support for other old features under the Nvidia Gameworks umbrella is high. I honestly believe the RTX 4000 series is still your best bet (while some cards are still available), because trust me, you don't want to deal with the litany of issues RTX 5000 series users are facing...

You may also like...
Categories: Technology

What is Digital Experience Monitoring and why is it critical for every organization?

TechRadar News - Wed, 02/26/2025 - 09:22

Have you recently heard a customer service agent say, “Sorry, my computer is slow”? Have you been frustrated by a site that is not loading quickly? Have you had poor connectivity right when you need to have an important call? Well, you are not alone.

Why is it that customers and corporate users experience so much frustration? According to Gartner Research, 47% of technology users experience high “digital friction” on a daily basis. It is true that most human activity today is either digital or supported by digital processes. We depend on technology for our daily lives.

However, while the software industry has been building and improving software for decades, and today there are very effective technologies for high availability, redundancy, and self-healing, there is another technology trend happening – the internet.

Yes, the internet is much more fragile than people realize. And it is only in the last few years that everything has become internet-based. Smartphones, SaaS, the cloud, and lately Covid have resulted in both users becoming increasingly distributed and internet-dependent, and applications becoming more modular, complex, service-oriented, hybrid, distributed, and internet-centric.

Let’s just say you want to check today’s news. A recent Web Page Test showed it takes 625 different calls to load the page. These include loading fonts from one site, ads from another, cookies, images from a CDN, tags, code snippets, APIs, etc. This does not include the back-end calls that happen to a content management system, database interactions, APIs to gather weather, and so on.

Each of these depends on a series of internet technologies and protocols to work: DNS, BGP, an ISP, etc. The collection of these technologies is called The Internet Stack. It’s a small miracle that all these 625 dependencies work in a few seconds – and it must be a few seconds, because users are increasingly impatient. Slow is the new down.
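
To make this concrete, here is a toy sketch (ours, not from any monitoring product) that times just two of those stack layers – DNS resolution and the fetch itself – for a handful of dependencies. The hosts are placeholders; a real news page fans out to hundreds of such calls.

```python
# Toy illustration: time the DNS lookup and fetch for a few page dependencies.
# The hosts below are placeholders standing in for the page document, a font
# CDN, and an ad server; a real page repeats this dance hundreds of times.
import socket
import time
import urllib.request

dependencies = [
    "https://example.com/",  # the page document itself
    "https://example.net/",  # stand-in for a font/CDN host
    "https://example.org/",  # stand-in for an ad/tag host
]

for url in dependencies:
    host = url.split("/")[2]

    t0 = time.perf_counter()
    socket.gethostbyname(host)  # DNS: just one layer of the internet stack
    dns_ms = (time.perf_counter() - t0) * 1000

    t0 = time.perf_counter()
    with urllib.request.urlopen(url, timeout=5) as resp:
        resp.read()
    fetch_ms = (time.perf_counter() - t0) * 1000

    print(f"{host}: DNS {dns_ms:.0f} ms, fetch {fetch_ms:.0f} ms")
```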

To help IT operations teams manage this complexity, get visibility into the internet stack, and deliver a great customer experience, a new category of tools has emerged - Internet Performance Monitoring or IPM. A subset of IPM is Digital Experience Monitoring, which has emerged as an essential tool for IT teams to ensure performance and maintain the reliability that users expect. Gartner introduced the first Magic Quadrant for DEM just a few months ago, signaling just how critical DEM has become. But what exactly is DEM, and why does it matter for your business?

DEM defined

According to Gartner, DEM tools assess the availability, performance and quality of the experience of applications for their users, whether those users are customers, employees or even digital agents accessing APIs.

Combined with a broader observability strategy, DEM is becoming critical for an IT team’s toolkit, helping businesses maintain the user-centric perspective they need in today’s customer- and employee-driven digital environment.

DEM tools monitor actual and simulated interactions with critical applications, allowing IT teams to anticipate problems before an impact is even noticed. While traditional observability is like your car’s dashboard, revealing the details of the system’s inner workings like RPMs, oil temperature, and gas levels, DEM is like a GPS that focuses on the route to the destination, which is the ultimate goal. Both are important, but DEM provides unique visibility into how systems are performing from an end-user perspective — not just whether an application is up and running, but whether it’s meeting the expectations of those who rely on it every day.

Requirements for effective DEM

For DEM to be effective in modern IT Operations, it must meet a few critical requirements:

· RUM and synthetic testing: Real user monitoring (RUM) captures the experience of actual users, while synthetic tests provide proactive, controlled, ongoing tests that ensure a system is working and can measure changes in performance over time very precisely (a minimal probe sketch follows this list)

· Global visibility: While some observability platforms offer monitoring from the cloud, these are insufficient for DEM, as probably no users are accessing your applications from within a cloud data center, which has very different connectivity and compute power. It is critical for a DEM to have thousands of vantage points around the world, in last mile, wireless, and backbone nodes, to effectively understand the experience of users in each relevant geographic area

· Visibility into the internet stack: For DEM to be effective, it must be able to provide intelligence on how data gets from one place to another, with a deep understanding of DNS, real-time BGP data, and flexibility to support multiple protocols such as SIP for telephony and MQTT for IoT applications, as well as modern internet technologies such as ECN and HTTP/3.

· Supporting SLAs and XLOs: A breached SLA can have significant impact and costs; certainly something to avoid. With DEM, the opportunity is to focus on eXperience Level Objectives (XLOs), which have the potential to align IT with the business around the metrics that really matter and prove the value of IT investments.
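
As promised above, here is a minimal sketch of the synthetic-testing idea: a probe that measures a target URL on a schedule and flags regressions against its own recent history. The URL, interval, and threshold are illustrative placeholders, not any vendor's defaults.

```python
# Minimal synthetic-test probe: fetch a critical URL repeatedly, record the
# latency, and alert when it doubles against a rolling baseline.
import statistics
import time
import urllib.request

TARGET = "https://example.com/"  # hypothetical application under test
history = []                     # latencies from earlier probe runs, in ms

def probe(url: str) -> float:
    t0 = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as resp:
        resp.read()              # a real probe would also validate content
    return (time.perf_counter() - t0) * 1000

for _ in range(5):               # a real monitor runs forever, from many places
    latency = probe(TARGET)
    if len(history) >= 3:
        baseline = statistics.median(history)
        if latency > 2 * baseline:
            print(f"ALERT: {latency:.0f} ms vs baseline {baseline:.0f} ms")
    history.append(latency)
    time.sleep(1)                # real synthetic tests run every few minutes
```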

The DEM advantage: Observability plus actionable insights

IT teams are expected to do more than just keep systems up and running; they must ensure those systems deliver value to end-users. This is where DEM’s focus on user experience truly stands out. Unlike observability platforms, which dive into the internal processes of applications, DEM offers visibility into how those applications impact actual user experiences.

This user-centered focus allows IT teams to spot trends in user behavior, benchmark performance, and identify areas for improvement — all while staying agile in a fast-paced environment. As DEM solutions continue to evolve, IT teams can expect an even more granular view of user experiences, from application-level performance to specific interactions with digital interfaces.

An unavoidable truth: Your digital experience is now synonymous with brand reputation. With DEM, organizations are able to create meaningful experiences for their users to connect with their brand. DEM provides IT teams with the tools they need to turn observability data into actionable improvements that keep users happy and businesses running smoothly.

DEM is no longer optional; it’s an essential component of success for any business connected to the digital world. By focusing on the experiences that truly matter to users, DEM empowers IT teams to monitor, measure and improve in ways that traditional monitoring tools simply can’t match. And with new tools and capabilities emerging, the future of DEM promises to offer IT teams even more powerful ways to ensure that digital experiences are always at their best.

We've featured the best IT infrastructure management service.

This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

Categories: Technology

Microsoft fixes concerning issue with its Entra ID authentication tool

TechRadar News - Wed, 02/26/2025 - 09:12
  • Microsoft inadvertently introduced a bug to Entra ID
  • The bug prevented users from logging into different Azure services
  • It has now been resolved, but users aren't happy

Microsoft has fixed a problem in its Entra ID authentication service which briefly prevented users from logging into different Azure cloud services. The problem affected Seamless SSO and Microsoft Entra Connect Sync, causing DNS authentication failures.

On its Azure Status page, Microsoft explained that it recently made changes that caused DNS resolution failures for the autologon.microsoftazuread-sso.com domain. The failure prevented customers from accessing Azure services between 17:18 UTC and 18:35 UTC on February 25, 2025.

"As part of a cleanup effort to remove duplicate IPv6 CNAMEs, a change was introduced which removed a domain utilized in the authentication process for Microsoft Entra ID's seamless single sign-on feature. Once removed the domain could no longer be resolved and requests for authentication would fail," the status page apparently read.

DNS change

"These issues were caused by a recent DNS change, which has now been reverted, and the service is fully recovered. At this time, customers should no longer encounter DNS resolution failures."

The status update was later removed, but not before being picked up by BleepingComputer. It was apparently taken down because the page is meant to track only “widespread incidents”, and the issue had already been resolved.

Still, Microsoft said it would share more details about the misstep in the future - however, at press time, that had yet to happen.

Entra ID (formerly Azure AD) is Microsoft's cloud-based identity and access management service. It handles authentication and authorization for users accessing Microsoft services like Microsoft 365, Azure, and other integrated applications.

Seamless SSO and Entra Connect Sync are features that enhance how Entra ID manages authentication. Seamless SSO automatically signs in users when they are on a corporate network, using their on-premises credentials without requiring a password prompt. Entra Connect Sync ensures that user identities, group memberships, and credentials remain synchronized between an organization’s on-premises Active Directory and Entra ID, enabling hybrid identity management.
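
A health check for this particular failure mode is simple to sketch. The snippet below (an illustration, not Microsoft's tooling) just asks whether the Seamless SSO domain named in the status update still resolves – the exact step that failed during the incident window.

```python
# Check that the Seamless SSO endpoint still resolves in DNS.
import socket

SSO_DOMAIN = "autologon.microsoftazuread-sso.com"

try:
    addresses = socket.getaddrinfo(SSO_DOMAIN, 443)
    print(f"OK: {SSO_DOMAIN} resolves to {len(addresses)} endpoint(s)")
except socket.gaierror as err:
    # This is what clients hit during the February 25 window: the CNAME was
    # removed, resolution failed, and sign-in requests failed with it.
    print(f"FAIL: {SSO_DOMAIN} does not resolve ({err})")
```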

You might also like
Categories: Technology

In the age of AI, everybody could lose the right to anonymity

TechRadar News - Wed, 02/26/2025 - 09:04

Generative AI is reshaping industries and redefining how we harness technology, unlocking new opportunities at a scale never seen before.

However, this transformation comes with a list of challenges. Chief among them is the erosion of data privacy. Traditional methods of anonymizing data, once considered effective in unlocking valuable insights while preserving privacy, have quickly become vulnerable to AI’s growing capabilities.

As AI lowers the barriers to identifying individuals from supposedly anonymous datasets, organizations must adopt a paradigm shift toward encryption-based methods. Solutions like confidential computing offer a clear path forward, ensuring that data remains protected even as AI’s capabilities grow.

Without these advances, the promise of privacy in the digital age could become a thing of the past.

The illusion of anonymity

For decades, enterprises have relied on anonymization techniques such as removing HIPAA identifiers, tokenizing PII fields, or “adding noise” to data to protect sensitive information. These traditional methods, while well-intentioned, are fundamentally flawed.

Consider the famous case of the Netflix Prize dataset from 2006 as a prime example. Netflix released an anonymized set of movie ratings to encourage the development of better recommendation algorithms. Yet, that same year, researchers from the University of Texas at Austin re-identified users by cross-referencing the anonymized movie ratings with publicly available datasets.

Similarly, Latanya Sweeney’s seminal study in 2000 demonstrated that combining public records—like voter registration data—with seemingly innocuous details like ZIP codes, birth dates, and gender could deanonymize individuals with startling accuracy.
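
The mechanics of Sweeney's attack need nothing more exotic than a join on those three fields. In the toy sketch below (all rows fabricated), neither table carries a name next to the sensitive data, yet linking them on ZIP code, birth date, and gender re-identifies the "anonymous" record.

```python
# Toy re-identification via quasi-identifiers, after Sweeney (2000).
# All rows are fabricated for illustration.

anonymized_health = [
    {"zip": "02138", "dob": "1945-07-31", "sex": "F", "diagnosis": "..."},
    {"zip": "60614", "dob": "1980-01-02", "sex": "M", "diagnosis": "..."},
]

public_voter_roll = [
    {"name": "J. Doe", "zip": "02138", "dob": "1945-07-31", "sex": "F"},
    {"name": "A. Roe", "zip": "73301", "dob": "1990-05-17", "sex": "M"},
]

# Join the "anonymized" table to public records on the three quasi-identifiers.
for record in anonymized_health:
    for voter in public_voter_roll:
        if (record["zip"], record["dob"], record["sex"]) == \
           (voter["zip"], voter["dob"], voter["sex"]):
            print(f"Re-identified: {voter['name']} -> {record['diagnosis']}")
```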

Today, fast-developing AI tools make these vulnerabilities even more apparent. While Large Language Models (LLMs) like ChatGPT have introduced unprecedented efficiencies and possibilities across industries, the associated risks are twofold. With their ability to process vast datasets and cross-reference information faster and more accurately than ever, these tools are not only powerful but widely accessible, making privacy challenges even more pervasive.

Experiment: deanonymizing the PGP dataset

To illustrate the power of AI in deanonymization, consider an experiment my colleagues and I conducted involving a GPT model and the Personal Genome Project (PGP) dataset. Participants in the PGP voluntarily share their genomic and health data for research purposes, with their identities anonymized through demographic noise and ID assignments.

As a proof-of-concept, we explored whether AI could match publicly available biographical data of prominent individuals to anonymized profiles within the dataset (for instance, Steven Pinker, a well-known cognitive psychologist and public figure whose participation in PGP is well-documented). We found that by leveraging auxiliary information, AI could correctly identify Pinker’s profile with high confidence, demonstrating the increasing challenge of maintaining anonymity.

While our experiment adhered to ethical research principles and was designed to highlight privacy risks rather than compromise them, it underscores how easily AI can pierce the veil of anonymized datasets.

The growing threat across industries

The implications of such experiments extend far beyond individual privacy. The stakes are higher than ever in industries like healthcare, finance, and marketing, where enterprises handle vast amounts of sensitive data.

Sensitive datasets in these industries often include transactional histories, patient health records, or insurance information—data that is anonymized to protect privacy. Deanonymization methods, when applied to such datasets, can expose individuals and organizations to serious risks.

The Steven Pinker example is not merely an academic exercise. It highlights the ease with which modern AI tools like LLMs can lead to deanonymization. Details that once seemed trivial can now be weaponized to expose sensitive data, and the urgency to adopt more robust data protection measures across industries has grown exponentially.

What once required significant effort and expertise can now be done with automated systems. The potential for harm isn’t theoretical; it is a present and escalating risk.

The role of confidential computing and PETs

The rise of AI technologies, particularly LLMs like GPT, has blurred the lines between anonymized and identifiable data, raising serious concerns about presumed privacy and security. As deanonymization becomes easier, our perception of data privacy must evolve. Traditional privacy safeguards are no longer sufficient to protect against advanced threats.

To meet this challenge, organizations need an additional layer of security that enables the sharing and processing of sensitive data without compromising confidentiality. This is where encryption-based solutions like confidential computing and other privacy-enhancing technologies (PETs) become indispensable.

These technologies ensure that data remains encrypted not only at rest and in transit but also during processing—enabling organizations to unlock the full value of data without risk of exposure, even when data is actively being analyzed or shared across systems.
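
A small sketch makes that gap concrete. With ordinary encryption (here the third-party `cryptography` package's Fernet recipe; data values fabricated), information is protected at rest and in transit – but it must be decrypted into regular memory before it can be processed, and that moment of exposure is what confidential computing removes by running the computation inside a hardware-isolated enclave.

```python
# Requires the third-party package: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()
f = Fernet(key)

# Safe at rest and in transit: only ciphertext leaves this line.
token = f.encrypt(b"patient_id=123; diagnosis=...")

# The exposure point: to process the data, it is decrypted into ordinary RAM,
# where a compromised OS or hypervisor could read it. Confidential computing
# performs this step inside an encrypted, attested enclave instead.
plaintext = f.decrypt(token)
result = plaintext.upper()
print(result)
```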

The dual benefit of privacy and utility makes PETs like confidential computing a cornerstone of modern data privacy strategies.

Safeguarding anonymity in an AI-driven world

In the new era of AI, the term “anonymous” is increasingly a misnomer. Traditional anonymization techniques are no longer sufficient to protect sensitive data against the capabilities of AI. However, this does not mean privacy is lost entirely—rather, the way we approach data protection must evolve.

Organizations need to take meaningful steps to protect their data and preserve the trust of those who depend on them. Encryption-based technologies like confidential computing offer a way to strengthen privacy safeguards and ensure anonymity remains possible in an increasingly AI-powered world.

We've featured the best online cybersecurity course.

This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing, find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

Categories: Technology

Foundation season 3 isn't out yet on Apple TV+, but a fourth season is reportedly on the way with a new showrunner

TechRadar News - Wed, 02/26/2025 - 09:02
  • Apple TV+'s Foundation has reportedly found a new showrunner
  • Fear the Walking Dead showrunner Ian Goldberg has reportedly been hired to replace David Goyer
  • The big-budget sci-fi series is also getting a fourth season, apparently

Apple has reportedly hired a new showrunner for its critically-acclaimed Foundation TV show – and secretly greenlit a fourth season.

Per Deadline, the tech giant has secured the services of Ian Goldberg, whose previous credits include helming seasons 4 through 8 of Fear The Walking Dead. A big-budget and oft-times narratively complex sci-fi series is a far cry from Goldberg's work on the dystopian horror spin-off of AMC's popular The Walking Dead TV adaptation. However, Goldberg has penned scripts for other sci-fi shows, including Terminator: The Sarah Connor Chronicles and Krypton, so the genre won't be alien to him.

Goldberg's rumored hiring comes exactly one year (i.e. February 26, 2024) after reports emerged that Foundation had lost its previous showrunner/executive producer David S. Goyer. The US filmmaker is said to have clashed with Apple TV+ executives over the budget for Foundation season 3, which is currently in its post-production phase. Goyer is believed to have stepped back from his duties following those apparently tense conversations, with Bill Bost being asked to oversee the rest of season 3's lengthy shooting schedule.

Goldberg's installation as the series' new showrunner isn't the only new, erm, news that Deadline reported on. The US outlet also claims that Foundation has been internally renewed for a fourth season and that a writers room has already been assembled by Goldberg to start work on its screenplays.

I reached out to Apple for comment on Goldberg's hiring and the series' season 4 renewal, but hadn't received a response by the time of publication. I'll update this article if I hear back.

Laying the groundwork for a brighter future

Foundation's third season has faced plenty of hardships throughout its development (Image credit: Apple TV Plus)

It's an understatement to say that one of the best Apple TV+ shows has run into development problems since its outstanding second season aired in 2023 (read my Foundation season 2 review to find out why I liked it so much).

Two months after Foundation's season 3 renewal in December 2023, reports started to emerge that its cast and crew faced an agonizing filming delay amid multiple development issues with its third season. Principal photography had already started before these problems arose, but the 2023 Hollywood strikes and aforementioned budget woes were blamed as the main culprits for the hold-up last February.

Filming restarted on Foundation season 3 on February 19, 2024, but its problems didn't end there. Goyer's departure notwithstanding, the loss of Mikael Persbrandt, who had been cast as The Mule, an incredibly important and dangerous antagonist in Isaac Asimov's Foundation book series, caused further issues. Game of Thrones' Pilou Asbaek was hired to play The Mule following a season 3 cast shake-up last March.

Thankfully, it doesn't appear that there have been any other significant issues that have hampered season 3's development. As long as its post-production phase continues to run as smoothly as possible, it's likely that Foundation's next chapter will debut on one of the world's best streaming services before 2025 ends.

For more details on its next season, read my dedicated Foundation season 3 guide or remind yourself what happened in its predecessor's last episode by way of my Foundation season 2 ending explained piece.

You might also like
Categories: Technology

Mint Mobile unlimited plan customers have just received an easy-to-miss but crucial upgrade

TechRadar News - Wed, 02/26/2025 - 08:49

This week, both new and existing Mint Mobile customers on the carrier's already great unlimited data plan have received an unexpected but welcome bonus.

The carrier has just announced that it's removing the strict 40GB data cap that was previously in place on the unlimited plan. This cap meant that customers' speeds were dramatically reduced after hitting 40GB - thankfully, that's no longer the case.

Instead, the unlimited plan will now only be slowed when you're over 35GB of data usage and the local area is particularly busy. You won't get any deprioritized speeds unless you go over that cap - and your plan will go back to full speed if your local area isn't experiencing heavy traffic.

This makes the Mint Mobile unlimited data plan much more forgiving for high-data users since you won't get any permanent slow-downs on your data speeds until the next bill cycle. Mint hasn't hiked up the prices for its unlimited plan, so you're getting even more bang for the buck than you were previously. In fact, there is a deal on the unlimited plan that I've also attached below.

Today's best deals at Mint Mobile

Mint Mobile: unlimited plan was $30/mo, now $25/mo when you buy a year upfront
Mint Mobile's just launched a great deal for its excellent unlimited plan. Instead of paying the usual $360 upfront, you can now get a full year of unlimited data for just $300 right now. That brings the average monthly price down to just $25/mo - the same that you'd usually pay for the 20GB plan. Even better still, Mint has just removed the strict 40GB cap on its unlimited plan, which makes it an even better option regardless of this deal.

Google Pixel 9: was $799 now $399, plus one year of unlimited data for $180 at Mint Mobile
Mint Mobile's current promotion on the Google Pixel 9 is one of the best prepaid phone deals you'll find anywhere right now. For a limited time only, new customers can get a massive $400 discount on this excellent flagship as well as one full year of unlimited data for just $15/mo. Overall, this is a great plan and phone combo, although note that the Pixel 9 did briefly go down to just $299 over Black Friday. Still, this is an amazing deal and one that's hard to beat.

Google Pixel 9 Pro Fold: was $1,799 now $1,449, plus one year of unlimited data for $180 at Mint Mobile
Another one of Mint's superb deals on Google Pixel phones, this time on the really, really high-end Pixel 9 Pro Fold. This particular foldable isn't for everyone, with its massive tablet-like display and high price tag, but today's deal at Mint will get you a nice $300 discount. You'll also be able to score a full year of unlimited data for just $15 per month, for an additional $180 saving.

You can see even more promotions over at our main Mint Mobile deals page. It's also worth checking out more cheap cell phone plans if you're thinking of making the switch to a more budget-friendly carrier.

Categories: Technology

My favorite laptop maker just unveiled its first desktop - and it's the cutest little PC I've ever seen

TechRadar News - Wed, 02/26/2025 - 08:47

It should come as no surprise to regular readers of TechRadar’s Computing section that I’m a big, big fan of Framework. It’s the laptop maker that does everything right: repairability, eco-friendly designs, great customization options, and a company ethos that puts employees first. I waxed lyrical about the Framework Laptop Chromebook Edition’s awesome design back in 2023, and now I’m getting excited all over again - because Framework is finally making a desktop PC.

The Framework Desktop, showcased in a blog post on the manufacturer’s website, does admittedly feel slightly counterintuitive to Framework’s mission statement. After all, desktop PCs are already more customizable, repairable, and upgradable than laptops, a set of benefits Framework was keen to bring to the laptop space with its main product line. The blog post addresses this, though, saying that the reason it’s finally decided to make a tower PC is the new AMD Ryzen AI Max processors - chips so good that Framework shifted its roadmap a year ago to incorporate them into a desktop system to “unlock every bit of its performance”.

AMD’s latest are some seriously meaty CPUs, so it makes perfect sense to see this happen. With up to 16 CPU cores at a 5.1GHz boost clock and newly powered-up Radeon 8060S integrated graphics plus an NPU for running local AI workloads, AMD isn’t messing around, potentially making the Framework Desktop a candidate for our list of the best workstation PCs. Framework claims that the top-spec Ryzen AI Max+ 395 configuration is capable of 1440p gaming in even “the heaviest titles”, something I’m keen to put to the test.

Good things come in small packages

Also… this is just the cutest little desktop system I’ve seen in my life. Seriously, look at it. It’s adorable. Fit to be one of the best mini PCs ever seen, frankly. The front panel is formed of 21 swappable colored tiles, and Framework has open-sourced the design so you can 3D-print your own too. You can choose between a solid black or glass side panel, pick an RGB fan, and even add an optional carry handle for those of us who still go to LAN parties.

(Image credit: Framework)

All that is great, and I adore how sleek and compact this thing is, but there’s one more design choice here that is far more important: Framework has included the hot-swappable ‘Expansion Cards’ used for customizing the ports on its laptops, meaning that you can choose exactly which two ports you want on your front I/O. That’s neat.

The top-spec Ryzen AI Max+ 395 configuration starts at $1,999 (£1,999 / about AU$3,160), which is a fairly high price of admission for a desktop PC, but as I noted in my review of Framework’s Chromebook, you’re getting a lot of computer for your cash and you’re making a socially and ecologically responsible purchase. For those who don’t need that peak performance, the 8-core Ryzen AI Max 385 configuration will start at $1,099 (£1,099 / about AU$1,740). Framework has also confirmed that there will be new models of its flagship 13-inch laptop, plus the new 2-in-1 touchscreen 12-inch model.

Framework describes its desktop PC as “the easiest PC you’ll ever build”, and even offers the mainboard - which is the motherboard, CPU, and RAM - as a standalone unit starting at $799 (£799 / about AU$1,265), so you can install it in your own custom-built compact PC instead if you’d prefer. Personally, I can’t wait to get my hands on the whole PC.

You might also like...
Categories: Technology

Christianity declines among U.S. adults while "religiously unaffiliated" grows, study says

NPR News Headlines - Wed, 02/26/2025 - 08:28

The percentage of Christians in the U.S. has dropped dramatically, though that loss may have leveled off in recent years.

(Image credit: Charlie Riedel/AP)

Categories: News

Hundreds of GitHub repositories hijacked to trick users into downloading malware

TechRadar News - Wed, 02/26/2025 - 08:24
  • Kaspersky research finds "hundreds" of malicious GitHub commits
  • Commits pretend to be useful software but trick victims into downloading malware
  • At least one person lost 5 BTC because of the campaign

Cybersecurity researchers at Kaspersky have discovered a longstanding, widespread criminal campaign targeting software developers with information-stealing malware.

Kaspersky said it observed hundreds of fake GitHub repositories, some posing as tools and automation mechanisms, others as hacks and cracks, that were actually delivering different sorts of malware to their victims. They dubbed the campaign ‘GitVenom’. Apparently, someone has been very thorough, carefully setting up commits, writing accompanying documentation and readme files, all in order to avoid being flagged as malware.

However, beneath the fake documents lies malicious code built in Python, JavaScript, C, C++, and C#. Kaspersky saw a Node.js stealer, AsyncRAT, the Quasar backdoor, and a clipboard hijacker. The malware has been circulating across GitHub for at least two years, Kaspersky stressed, with targets and victims located all over the world - but some countries are targeted more than others, with Russia, Brazil, and Turkey hit especially hard.

Losing bitcoin

There is no telling how many victims fell for the ruse, but Kaspersky singled out one case in which someone lost 5 BTC to the scam, equivalent to just under half a million dollars.

GitHub is one of the most popular code repositories in the world, used every day by millions of software developers. It is an important platform that helps speed up and simplify software development, while at the same time improving security by allowing countless security experts to scrutinize the code.

However, the popularity also draws in the wrong crowd. GitHub is constantly being bombarded with malware, as hackers employ typosquatting, impersonation, and outright fraud to try and trick people into downloading malware instead of legitimate code.
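
Developers aren't entirely helpless, though. A few cheap signals can be pulled from GitHub's public REST API before running unfamiliar code – repository age, stars, forks – although GitVenom itself shows that polish can be faked, so treat these strictly as heuristics. The sketch below queries GitHub's own demo repository as a stand-in.

```python
# Pull a few trust signals for a repository from GitHub's public REST API.
# Heuristics only: GitVenom shows READMEs and commit histories can be faked.
import datetime
import json
import urllib.request

def repo_signals(owner: str, repo: str) -> dict:
    url = f"https://api.github.com/repos/{owner}/{repo}"
    with urllib.request.urlopen(url, timeout=10) as resp:
        data = json.load(resp)
    created = datetime.datetime.fromisoformat(data["created_at"].rstrip("Z"))
    age_days = (datetime.datetime.utcnow() - created).days
    return {
        "stars": data["stargazers_count"],      # low stars + high polish = odd
        "forks": data["forks_count"],
        "age_days": age_days,                   # very new repos warrant caution
        "open_issues": data["open_issues_count"],
    }

# GitHub's canonical demo repository, used here purely as a placeholder.
print(repo_signals("octocat", "Hello-World"))
```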

GitHub’s maintainers work hard to keep the platform clean, and have been forced on multiple occasions to suspend new account creation and new commit submissions due to an onslaught of malware.

Via BleepingComputer

You might also like
Categories: Technology

Nvidia RTX 5060 Ti GPU with 16GB rumored for March launch, followed by 8GB flavor in April – but where does that leave the vanilla RTX 5060?

TechRadar News - Wed, 02/26/2025 - 08:16
  • Nvidia’s RTX 5060 Ti is again rumored to have 16GB and 8GB versions
  • It’s now claimed the 16GB version will arrive first, later in March, and the 8GB model in the first half of April
  • There’ll be worries about that 8GB loadout, but we mustn’t forget the other VRAM specs that are pertinent here

Nvidia’s RTX 5060 Ti is seemingly the next Blackwell graphics card to go on sale – well, following the imminent RTX 5070 which arrives early in March – and it’ll be produced in 16GB and 8GB flavors.

Wccftech brings word – grab the saltshaker and use liberally – from unnamed sources that Nvidia reportedly intends to release the RTX 5060 Ti 16GB version first, in the latter half of March. The 8GB model will supposedly then arrive the following month, in the first half of April.

Core spec details are not provided here, notably, save for a guess that these graphics cards will be built on the GB206 chip as previously rumored (the Blackwell silicon which is a tier below GB205, the engine of the RTX 5070).

What we do learn, though, is that Nvidia is said to be sticking with a 128-bit memory bus, which is the same as the RTX 4060 Ti models.

If you think it’s disappointing that the VRAM (video RAM) capacity and bus specs are unchanged, well, don’t forget that the rumor mill believes Nvidia is using GDDR7 with its RTX 5060 models, the faster video memory present in the existing Blackwell line-up.

Because it’s a lot quicker – running at 28Gbps, rather than 18Gbps with the RTX 4060 Ti – the overall bandwidth will be considerably beefier with the RTX 5060 Ti, over 50% faster in theory. So, don’t worry about a lack of performance oomph on the VRAM front, in short.
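
For what it's worth, the back-of-the-envelope math supports that figure. Peak memory bandwidth is bus width times per-pin data rate; plugging in the rumored numbers (treat them accordingly) shows where "over 50%" comes from.

```python
# Peak memory bandwidth = bus width (bits) x per-pin data rate, divided by 8
# to convert gigabits to gigabytes. Figures below are from the rumor above.
def bandwidth_gb_s(bus_bits: int, rate_gbps_per_pin: float) -> float:
    return bus_bits * rate_gbps_per_pin / 8

rtx_4060_ti = bandwidth_gb_s(128, 18)  # GDDR6 at 18 Gbps -> 288 GB/s
rtx_5060_ti = bandwidth_gb_s(128, 28)  # GDDR7 at 28 Gbps -> 448 GB/s (rumored)

print(f"RTX 4060 Ti: {rtx_4060_ti:.0f} GB/s")
print(f"RTX 5060 Ti (rumored): {rtx_5060_ti:.0f} GB/s")
print(f"Uplift: {rtx_5060_ti / rtx_4060_ti - 1:.0%}")  # ~56%, i.e. over 50%
```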

Finally, we’re told that the RTX 5060 Ti will have a power usage of 180W, which is a step up from its predecessor, but only by 20W or so. It’ll still be a GPU that can be run with a relatively modest PSU, in other words, and apparently, the RTX 5060 Ti will also have boards that use a traditional 8-pin power connector to make for a relatively hassle-free upgrade. (This is despite some previous speculation indicating that Nvidia would mandate the newer 12V-2x6 connector for all Blackwell GPUs, although it will be present on some 5060 Ti cards, we’re told).

(Image credit: Future / John Loeffler)

Analysis: Yet more VRAM controversy, potentially

This isn’t the first time we’ve heard rumors about the RTX 5060 Ti mirroring the RTX 4060 Ti with 16GB and 8GB variants. And just like the previous-gen Lovelace boards, this means the xx60 Ti graphics card can have more VRAM than the xx70 class model – 16GB versus 12GB – if you fork out for the more expensive RTX 5060 Ti.

That brings us to the topic of price, where Wccftech tells us that nothing concrete has been heard on the grapevine yet. The RTX 5060 Ti will obviously have to start at an MSRP of considerably less than the RTX 5070, which is $549 / £539, and Wccftech’s guess of $400 to $500 (the latter for top-end 16GB cards) seems about right.

I’m guessing Nvidia won’t have made that final decision yet, though, and is quite likely waiting to see how the RX 9070 models, or indeed RX 9060 – which are rumored to be getting an airing at AMD’s GPU event later this week – pan out for their MSRPs.

Pricing for the 8GB variant of the RTX 5060 Ti could be a tricky matter, as there were plenty of complaints about the RTX 4060 Ti being underpowered for video memory with that pool of VRAM. Now, we’re a whole generation on, so Nvidia sticking to the same loadout is going to go down like a lead balloon with quite a few PC gamers, I can guarantee that.

Of course, the overall increase in VRAM speed brought in by the introduction of GDDR7 (in theory) has to be considered, as I already observed – and the same goes for Nvidia’s new tricks to get more out of video memory with the Blackwell generation. But I’m guessing some gamers are going to be difficult to persuade away from the notion that simply on a raw capacity front, 8GB is just looking too thin for any real future-proofing here (even at 1080p gaming).

I’m getting ahead of myself here, though – we need to see what Nvidia reveals first, of course, even if the grapevine seems pretty consistent with its rumors here (which tends to be a sign they’re on the money, though it’s no guarantee by any means).

What’s also interesting here is that there’s no mention of the RTX 5060, and when that might arrive. If the 8GB flavor of the RTX 5060 Ti isn’t turning up until maybe mid-April, that may mean the vanilla RTX 5060 could be pushed back to May, perhaps? Those other rumors about delays for Nvidia’s RTX 5060 models appear to be backed up by this latest piece of speculation, at any rate.

You might also like
Categories: Technology

iPhones are replacing 'Trump' with 'racist' during dictation – but Apple is fixing the problem

TechRadar News - Wed, 02/26/2025 - 08:11
  • iOS is changing "Trump" to "racist" when transcribing
  • Apple says the bug is now being fixed
  • The official explanation is "phonetic overlap"

iPhone owners have noticed a peculiar bug in recent days: "Trump" autocorrects to "racist" when using speech-to-text dictation mode. According to Apple, it's a problem with "phonetic overlap", and a fix is already in the works.

After TikTok videos of the slip went viral, Apple provided a statement to The Guardian and others, blaming "phonetic overlap" between the two words: "We are aware of an issue with the speech recognition model that powers Dictation, and we are rolling out a fix as soon as possible," a spokesperson said.

While many people were able to recreate the blip, it didn't happen every time – and the text seemed to revert back to "Trump" after a short delay. The latest reports online suggest Apple's fix has already taken effect, so you might not see it happening any more.

In its explanation, Apple suggested its speech recognition engines were struggling to distinguish between words with "r" in them. Further testing suggested iOS didn't always get the word "racist" right either, though historically Apple's speech-to-text engines have been very reliable.

'Just not plausible'

Apple says the Trump-related bug is getting patched (Image credit: Getty Images)

Apple will be keen to draw a line under this as soon as possible and get the error corrected. It seems particularly unfortunate that a transcription bug like this would link two specific words sure to set off a wave of controversy and politically-charged debate.

Peter Bell, professor of speech technology at the University of Edinburgh, told the BBC that Apple's explanation was "just not plausible" given what we know about speech-to-text technology. "It probably points to somebody that's got access to the process," said Bell.

John Burkey, founder of Wonderrush.ai, gave a similar opinion to the New York Times: "This smells like a serious prank," he said. "The only question is: did someone slip this into the data or slip into the code?"

This also feeds into the wider conversation about AI and its reliability, as AI models are used to convert the spoken word into transcribed text – something that you can now do on any modern smartphone. Whether it's meeting notes or show subtitles, we need to be able to rely on the accuracy of this fast-spreading technology.

You might also like
Categories: Technology

The 9 Best Mirrorless Cameras (2025): Full-Frame, APS-C, and More

WIRED Top Stories - Wed, 02/26/2025 - 08:02
Want the image quality of a DSLR without the bulk? These WIRED picks do more with less.
Categories: Technology

Meet the 'wooly devil,' a new plant species discovered in Big Bend National Park

NPR News Headlines - Wed, 02/26/2025 - 08:00

The plant, formally known as Ovicula biradiata, is especially notable for being the simultaneous discovery of a new species and genus. It was found with help from the community science app iNaturalist.

(Image credit: D. Manley)

Categories: News

Sony unveils its first lens with a massive 800mm reach – and it could be a dream optic for wildlife photography

TechRadar News - Wed, 02/26/2025 - 08:00
  • Sony unveils FE 400-800mm F6.3-8 G OSS and FE 16mm F1.8 G lenses
  • The 400-800mm lens is Sony's first-ever with 800mm reach
  • The full-frame lenses cost $2,900 / £2,550 and $800 / £850 respectively (AU pricing to follow)

Sony has unveiled two lenses for its full-frame cameras, covering wide-angle and telephoto extremes between them. The headline-grabbing lens is the FE 400-800mm F6.3-8 G OSS, because it's Sony's first-ever telephoto lens to reach the 800mm focal length.

Costing $2,900 / £2,550 (AU pricing to follow) and available from early March 2025, the 400-800mm joins Sony's other telephoto zooms – a 100-400mm and a 200-600mm – as the one with the longest reach, making it particularly ideal for wildlife and action photography where you can't get close, such as birding and motorsports.

What's more, it works with Sony 1.4x and 2x teleconverters. The latter can extend the maximum reach up 1600mm, even if the reduced f/16 maximum aperture is impractical for most scenarios, besides sunny weather.

Alongside the 400-800mm lens, Sony is also introducing the FE 16mm F1.8 G, which costs $800 / £850 (again, AU pricing to follow) and should be available from early April 2025. The ultra-wide prime joins a crowded section of Sony's expansive lens line up, with plenty of alternatives to consider.

Where it hopes to stand out against the likes of the FE 14mm F1.8 GM, FE 20mm F1.8, FE 16-25mm F2.8 and FE 16-35mm F2.8 GM II is its competitive price, compact build, ultra-wide perspective and fast f/1.8 aperture. For users of either of those zooms who generally stick to the widest angle, the new 16mm prime could make way more sense.

We are currently carrying out an in-depth review of the 400-800mm lens, coming really soon, so do look out for that. (Image credit: Chris Rowlands)

The ultimate super-telephoto zoom?

Sony isn't the first to launch a super-telephoto zoom with maximum 800mm focal length. No, that accolade went to Canon with its RF 200-800mm F6.3-9 IS USM in 2023.

There's plenty going for the 400-800mm lens, mind. Sony says it is dust-resistant and moisture-resistant, plus its focusing and zoom are internal – the latter being a welcome surprise.

An internal zoom means the lens doesn't extend as you zoom in and out, with the center of gravity essentially unchanged. Most enthusiast zoom lenses extend as you zoom, including Canon's, and there's potential for dust and moisture to enter the lens through its extending barrel. Not so with Sony's 400-800mm.

This is an optically complex lens too, comprising 27 elements in 19 groups, 6 of which are ED elements, plus an 11-blade circular aperture for what should be pleasant bokeh. Sony says there's minimal flare, ghosting and chromatic aberration – our incoming in-depth review will reveal all.

The lens' minimum focus distance is 1.7m, delivering a maximum 0.23x magnification – this is no macro lens. It's equipped with two linear motors and supports autofocus tracking up to 120fps, meaning the autofocus performance of Sony's best cameras can realize its full potential with this lens in play.

We're also testing the 16mm F1.8 lens – here it is attached to the Sony A7C II. (Image credit: Chris Rowlands)

The lens is stabilized, which is an essential feature for a super-telephoto zoom, but unfortunately Sony couldn't tell us what the lens's OIS is rated up to. We should have that info by the time our in-depth review is published.

At 2,475g, Sony's 400-800mm lens is weightier than Canon's super-telephoto, and it's slightly pricier too, while Canon's lens has a 4x optical zoom that can zoom out wider, to 200mm.

For people with a Sony camera, however, the new FE 400-800mm offers the longest reach possible at a competitive price. It's not a pro-grade Sony lens, but I can see it being particularly popular with enthusiast wildlife and action photographers, for whom the maximum F6.3-8 aperture is acceptable.

You might also like
Categories: Technology

Nicole Kidman's new Prime Video thriller, Holland, looks quirky and unsettling in its first trailer

TechRadar News - Wed, 02/26/2025 - 07:55

Amazon Prime Video has released the first trailer for the brand new thriller Holland, and what could be one of the best new movies on Prime Video looks as darkly funny as it is deeply unsettling.

The film stars Nicole Kidman as “the meticulous Nancy Vandergroot, a teacher and homemaker whose picture-perfect life with her community pillar husband (Matthew Macfadyen) and son (Jude Hill) in tulip-filled Holland, Michigan, tumbles into a twisted tale. Nancy and her friendly colleague (Gael García Bernal) become suspicious of a secret, only to discover nothing in their lives is what it seems.”

The film has actually been in development for over a decade, with the screenplay topping the 2013 Black List. Back then, Naomi Watts and Bryan Cranston were attached to star, with Errol Morris on directing duties. The film rights were acquired by Amazon as part of its purchase of MGM Studios in 2022.

Judging by the trailer, which you can watch just below, the film looks to be a mash-up of Hitchcockian mystery and Lynchian suburban satire, with a hint of The Truman Show – all shot through an A24-style lens. And from Fresh director Mimi Cave, we’d expect nothing less, after that movie’s skewering of the horrors of modern dating.

The film sees Macfadyen continue to soar post-Succession following his villainous turn in Deadpool & Wolverine, in a role that may not be all it seems. While Fred Vandergroot is ostensibly the perfect family man, the trailer hints at a sinister side, and that, potentially, the husband and father may be the orchestrator behind the bizarre goings on that plague Kidman’s character. Meanwhile, Bernal’s teaching colleague Dave joins Nancy as she investigates the unfolding mystery.

Kidman has been carving out quite a niche for herself as the queen of the streaming thriller in recent years after turns in Nine Perfect Strangers, Big Little Lies and The Perfect Couple. While still very much in the ‘suburban wife with a secret’ mould of those characters, Vandergroot looks to be a refreshing change of pace, with a quirkier performance from the star than we’ve seen recently, as Nancy spirals out of control.

Holland isn’t the only project that sees Kidman teaming up with Amazon either after she fronted Lulu Wang’s drama series Expats. Prime’s upcoming crime adaptation Scarpetta will see Kidman star alongside Jamie Lee Curtis, while an adaptation of Andrew Bovell’s award-winning play Things I Know to Be True is still in production for the streamer, which ranks among the best streaming services.

Also starring Rachel Sennott, Lennon Parham, Isaac Krasner and Jeff Pope, Holland is set to premiere at South by Southwest Festival on March 9 before landing on Prime Video on March 27, 2025.

You might also like
Categories: Technology

Best Satellite Internet Providers for 2025

CNET News - Wed, 02/26/2025 - 07:30
Although it can't match the speeds of fiber or cable, satellite internet is an essential lifeline for rural communities.
Categories: Technology
