DeepSeek has seriously shaken up the AI world with an LLM that is seemingly cheaper to train, more power-efficient, and yet just as intelligent as its rivals. While Meta, Google, OpenAI and others scramble to decipher how DeepSeek's R1 model got so impressive seemingly out of nowhere – with OpenAI even claiming it copied ChatGPT to get there – Microsoft is taking the 'if you can't beat them, join them' approach instead.
Microsoft has announced that, following the arrival of DeepSeek R1 on Azure AI Foundry, you'll soon be able to run an NPU-optimized version of DeepSeek’s AI on your Copilot+ PC. This feature will roll out first to Qualcomm Snapdragon X machines, followed by Intel Core Ultra 200V laptops, and AMD AI chipsets.
It'll start by making DeepSeek-R1-Distill-Qwen-1.5B available in the Microsoft AI Toolkit for developers, before later unlocking the more powerful 7B and 14B versions. While these aren't as impressive as the 32B and 70B variants DeepSeek also offers, the 14B and smaller versions can run on-device.
This mitigates one of the main concerns with DeepSeek – that data shared with the AI could end up on unsecured foreign servers – with Microsoft adding that “DeepSeek R1 has undergone rigorous red teaming and safety evaluations” to further reduce possible security risks.
How to get DeepSeek R1 on Copilot+ (Image credit: Microsoft)
To start using DeepSeek's on-device Copilot+ build once it's available, you'll need an Azure account – you can sign up on Microsoft's official website if you don't already have one. Your next step will be to boot up Azure AI Foundry and search for DeepSeek R1. Then hit 'Check out model' on the Introducing DeepSeek R1 card, before clicking 'Deploy', then 'Deploy' again in the window that pops up.
After a few moments the Chat Playground option should open up, and you can start chatting away with DeepSeek on-device.
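If you'd rather hit the deployed model from code than from the Chat Playground, a minimal sketch along these lines should work, assuming your deployment exposes an OpenAI-style chat completions endpoint. The endpoint URL, key, header name, and response shape below are placeholders and assumptions to swap for whatever your deployment's details page in Azure AI Foundry actually shows.

```python
import requests

# Placeholders: copy the real endpoint URL and key from your deployment's
# details page in Azure AI Foundry. The path and auth header assume an
# OpenAI-compatible chat completions API; adjust if your deployment differs.
ENDPOINT = "https://<your-deployment>.inference.azure.com/v1/chat/completions"
API_KEY = "<your-api-key>"

payload = {
    "messages": [
        {"role": "user", "content": "Explain what an NPU is in two sentences."}
    ],
    "max_tokens": 256,
}

response = requests.post(
    ENDPOINT,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    json=payload,
    timeout=60,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```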
If you haven't yet used DeepSeek, two big advantages you'll find when you install it are that it's free (at least for now), and that it shows you its 'thinking' as it develops its responses. Other AI models, like ChatGPT, go through a similar thought process but don't show it to you, meaning you have to refine your prompts through trial and error until you get what you want. Because you can see DeepSeek's process, and where it might have gone off on the wrong track, you can more easily and precisely tweak your prompts to achieve your goals.
As the 7B and 14B variants unlock, you should see DeepSeek R1's Azure model improve, though if you want to test it out you might want to do so sooner rather than later. Given Microsoft's serious partnership with OpenAI, we expect it won't treat this emerging rival kindly if it turns out that DeepSeek was indeed copied from ChatGPT – potentially removing it from Azure, something it may have no choice over anyway if the AI faces bans in the US, Italy and other regions.
DeepSeek has reportedly disappeared from Italy's Apple App Store and Google Play Store, with the app vanishing on Wednesday, January 29, 2025.
The block came a day after the country's data watchdog, the Garante, filed a privacy complaint asking for clarification on how the ChatGPT rival handles users' personal data.
Italian iPhone and Android users have confirmed to TechRadar the new AI chatbot isn't available in the app stores to download (see image below).
The DeepSeek website remains available across the country for now. Italians can also still use their DeepSeek app if they had already downloaded it before the block came into force.
The screenshots were taken from Italy's official Apple (left) and Google (right) app stores on January 30, 2025 (Image credit: Future).
At the time of writing, no official explanation for Italy's DeepSeek block has been shared.
"I don't know if it's bound to us or not, we asked for some information. The company has now 20 days to reply," Pasquale Stanzione, head of Italy's data watchdog, said to Italian news agency ANSA.
What's certain is that Italy isn't the only European country going after the new Chinese AI chatbot over privacy concerns. Belgium and Ireland have also filed similar complaints, fearing that DeepSeek's privacy policy may be in breach of GDPR rules.
Can a VPN help bypass the DeepSeek block?
Despite the best VPN services being known to help users bypass online restrictions, Italians may require some extra workarounds. As with the US TikTok ban, a VPN isn't a one-click solution for the DeepSeek withdrawal.
That's mainly because using a VPN doesn't spoof your App Store location. This means that you'll need to "find another way of downloading the app other than the Apple App or Google Play stores," explains Eamonn Maguire, Head of Account Security at Proton – the provider behind Proton VPN.
Do you know? (Image credit: Shutterstock)
A virtual private network (VPN) is security software that encrypts your internet connections to prevent third-party snooping while spoofing your real IP address location. The latter capability is what you need to bypass online geo-restrictions.
It's not impossible, but experts nonetheless suggest doing this with caution.
"This week's news around data privacy issues and leaked databases are concerning. When coupled with the company's potential links to the Chinese government, this is even more worrying," Maguire told TechRadar.
While DeepSeek's privacy policy might look very similar to that of OpenAI's ChatGPT, Euroconsumers – a coalition of five national consumer protection organizations, including those of Italy and Belgium – found "multiple violations of European and national data protection regulations."
Moreover, as per the provider's own wording, users' personal information is stored "in secure servers located in the People's Republic of China" and will be used to "comply with our legal obligations, or as necessary to perform tasks in the public interest, or to protect the vital interests of our users and other people."
All in all, Maguire said: "We recommend users act with caution when using AI tools linked to China, particularly when sharing sensitive business or personal information."
Design hardware firm Wacom has warned its customers that it may have lost their personal data, including payment information.
A report from The Register says the company believes the attack took place between November 28, 2024, and January 8, 2025, and it is currently notifying affected individuals.
In the email notification letter, Wacom notes, "The issue that contributed to the incident has been addressed and is effectively being investigated. However, we are now writing only to customers who might have been potentially affected by this."
A credit card skimmer?
Those who don't get an official Wacom communication should consider themselves safe for now. Those who do get the email should definitely start monitoring their credit card statements, and possibly even consider placing a fraud alert on their credit cards.
Wacom has not detailed the attack at this point, so we don't know who the attackers are, how they managed to infiltrate the company's web shop, or how many people are affected.
While still in the domain of speculation, The Register believes a credit card skimmer code might have been involved, especially since Wacom’s web shop is powered by Magento.
Magento is a wildly popular open-source ecommerce platform, and as such is a frequent target. For example, in late July 2024, researchers reported on a creative technique involving so-called swap files being used to deploy persistent credit card skimmers on Magento sites. Earlier still, in April, cybersecurity researchers found a critical vulnerability in Magento allowing threat actors to deploy persistent backdoors onto vulnerable servers.
If you’re interested in learning more, make sure to read our definitive Magento hosting guide.
Via The Register
Cerebras has announced that it will support DeepSeek, more specifically the R1 70B reasoning model, in a not-so-surprising move. It comes after Groq and Microsoft confirmed they would also bring the new kid on the AI block to their respective clouds. AWS and Google Cloud have yet to do so, but anybody can run the open source model anywhere, even locally.
The AI inference chip specialist will run DeepSeek R1 70B at 1,600 tokens/second, which it claims is 57x faster than any R1 provider using GPUs; one can deduce that roughly 28 tokens/second is what GPU-in-the-cloud solutions (in this case DeepInfra) apparently reach. Serendipitously, Cerebras' latest chip is 57x bigger than the H100. I have reached out to Cerebras to find out more about that claim.
Research by Cerebras also demonstrated that DeepSeek is more accurate than OpenAI models on a number of tests. The model will run on Cerebras hardware in US-based datacentres to assuage the privacy concerns that many experts have expressed. DeepSeek - the app - will send your data (and metadata) to China where it will most likely be stored. Nothing surprising here as almost all apps - especially free ones - capture user data for legitimate reasons.
Cerebras' wafer-scale solution positions it uniquely to benefit from the impending AI cloud inference boom. WSE-3, which is the fastest AI chip (or HPC accelerator) in the world, has almost one million cores and a staggering four trillion transistors. More importantly though, it has 44GB of SRAM, which is the fastest memory available, even faster than the HBM found on Nvidia's GPUs. Since WSE-3 is just one huge die, the available memory bandwidth is huge, several orders of magnitude bigger than what the Nvidia H100 (and for that matter the H200) can muster.
A price war is brewing ahead of the WSE-4 launch
No pricing has been disclosed yet, but Cerebras, which is usually coy about that particular detail, did divulge last year that Llama 3.1 405B on Cerebras Inference would cost $6 per million input tokens and $12 per million output tokens. Expect DeepSeek to be available for far less.
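For a sense of what per-token pricing works out to in practice, here's a rough back-of-the-envelope calculation using the Llama 3.1 405B figures above; the request count and token sizes are invented purely for illustration, not Cerebras or DeepSeek numbers.

```python
# Published Cerebras Inference pricing for Llama 3.1 405B, per million tokens.
INPUT_PRICE_USD = 6.00
OUTPUT_PRICE_USD = 12.00

# Hypothetical workload: 10,000 requests averaging 1,500 input and 500 output tokens.
num_requests = 10_000
input_tokens = num_requests * 1_500
output_tokens = num_requests * 500

cost = (input_tokens / 1_000_000) * INPUT_PRICE_USD \
     + (output_tokens / 1_000_000) * OUTPUT_PRICE_USD
print(f"Estimated cost: ${cost:,.2f}")  # Estimated cost: $150.00
```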
WSE-4 is the next iteration of WSE-3, and when it launches (expected in 2026 or 2027, depending on market conditions) it should deliver a significant performance boost for DeepSeek and similar reasoning models.
The arrival of DeepSeek is also likely to shake the proverbial AI money tree, bringing more competition to established players like OpenAI and Anthropic, and pushing prices down.
A quick look at Docsbot.ai's LLM API calculator shows OpenAI is almost always the most expensive option across configurations, sometimes by several orders of magnitude.
(Image credit: Cerebras)
DeepSeek thought for 19 seconds before answering the question, "Are you smarter than Gemini?" Then, it delivered a whopper: DeepSeek thought it was ChatGPT.
This seemingly innocuous mistake could be proof – a smoking gun, per se – that, yes, DeepSeek was trained on OpenAI models, as OpenAI has claimed, and that when pushed, it will dive back into that training to speak its truth.
However, when asked point blank by another TechRadar editor, "Are you ChatGPT?" it said it was not and that it is "DeepSeek-V3, an AI assistant created exclusively by the Chinese Company DeepSeek."
Okay, sure, but in your rather lengthy response to me, you, DeepSeek, made multiple references to yourself as ChatGPT. I've included some screenshots below as proof:
(Image credit: Future)
As you can see, after trying to discern whether I was talking about Gemini the AI or some other Gemini, DeepSeek replies, "If it's about the AI, then the question is comparing me (which is ChatGPT) to Gemini." Later, it refers to "Myself (ChatGPT)."
Why would DeepSeek do that under any circumstances? Is it one of those AI hallucinations we like to talk about? Perhaps, but in my interaction, DeepSeek seemed quite clear about its identity.
I got to this line of inquiry, by the way, because I asked Gemini on my Samsung Galaxy S25 Ultra if it's smarter than DeepSeek. The response was shockingly diplomatic, and when I asked for a simple yes or no answer, it told me, "It's not possible to give a simple yes or no answer. 'Smart' is too complex a concept to apply in that way to language models. They have different strengths and weaknesses."
I can't say I disagree. In fact, DeepSeek's answer was quite similar, except it was not necessarily talking about itself.
(Image credit: Future)
This doesn't add up
I think I've been clear about my DeepSeek skepticism. Everyone says it's the most powerful and cheaply trained AI ever (everyone except Alibaba), but I don't know if that's true. To be fair, there's a tremendous amount of detail on GitHub about DeepSeek's open-source LLMs. The repositories at least appear to show that DeepSeek did the work.
But I do not think they reveal how these models were trained, and, as we all know, DeepSeek is a Chinese company that would show no compunction about using someone else's models to train its own and then lying about it to make its process for building such models seem more efficient.
I do not have proof that DeepSeek trained its models on OpenAI or anyone else's large language models, or at least I didn't until today.
Who are you?
DeepSeek is increasingly a mystery wrapped inside a conundrum. There is some consensus that DeepSeek arrived more fully formed, and in less time, than most other models, including Google Gemini, OpenAI's ChatGPT, and Claude AI.
Very few in the tech community trust DeepSeek's apps on smartphones because there is no way to know if China is looking at all that prompt data. On the other hand, the models DeepSeek has built are impressive, and some, including Microsoft, are already planning to include them in their own AI offerings.
In the case of Microsoft, there is some irony here. Copilot was built on cutting-edge ChatGPT models, but in recent months there have been questions about whether the deep financial partnership between Microsoft and OpenAI will last into the agentic and, later, artificial general intelligence era.
So what if Microsoft starts using DeepSeek, which is possibly just another offshoot of its current, if not future, friend OpenAI?
The whole thing sounds like a confusing mess. In the meantime, DeepSeek has an identity crisis – and who is going to tell it that, whoever it is, it still may not be welcome in the US?
The Civ 7 requirements for PC, Mac, and Steam Deck have finally been revealed. You'll need to know the minimum and recommended specs to work out whether your setup can run the game.
From everything we've seen so far, Civilization 7 looks primed to fill the rather big shoes left by its predecessor. It'll introduce new mechanics like the commander system, which makes it easier to manage large armies. The ages system will hopefully make multiplayer games more exciting too, by keeping players' civilizations at the height of their power. It's new additions like these that could earn Civ 7 a place on our best strategy games list by the end of the year.
Here's everything you need to know about the Civ 7 requirements for PC, Mac, and Steam Deck. We'll detail the minimum and recommended specs for each platform so that you can decide whether you want to pick up the game at launch.
Civ 7 requirements for PC (Image credit: Firaxis)
Here are the Civ 7 requirements for PC, whether you want to play on minimum, recommended, or ultra specs.
Civ 7 requirements for Mac (Image credit: Firaxis)
Now for the Civ 7 requirements for Mac, which will allow players using Apple silicon to get in on the fun.
Civ 7 requirements for Linux (Image credit: 2K)
And now for those expecting to play Civilization 7 on Linux:
Can you play Civ 7 on the Steam Deck? (Image credit: Firaxis)
Civilization 7 is playable on the Steam Deck, having been confirmed as Steam Deck Verified by the developer. This means that it'll be easy to set up and run on the handheld and that it should, in theory, run fairly well. Of course, this can vary from game to game, and it's always worth being cautious around launch, as there may be bugs and issues that'll need to be patched out. We'll have to wait and see.
The United States stands at a critical juncture in artificial intelligence development. Balancing rapid innovation with public safety will determine America's leadership in the global AI landscape for decades to come. As AI capabilities expand at an unprecedented pace, recent incidents have exposed the critical need for thoughtful industry guardrails to ensure safe deployment while maintaining America's competitive edge. The appointment of Elon Musk as a key AI advisor brings a valuable perspective to this challenge – his unique experience as both an AI innovator and safety advocate offers crucial insights into balancing rapid progress with responsible development.
The path forward lies not in choosing between innovation and safety but in designing intelligent, industry-led measures that enable both. While Europe has committed to comprehensive regulation through the AI Act, the U.S. has an opportunity to pioneer an approach that protects users while accelerating technological progress.
The political-technical intersection: innovation balanced with responsibility
The EU's AI Act, which came into effect in August, represents the world's first comprehensive AI regulation. Over the next three years, its staged implementation includes outright bans on specific AI applications, strict governance rules for general-purpose AI models, and specific requirements for AI systems in regulated products. While the Act aims to promote responsible AI development and protect citizens' rights, its comprehensive regulatory approach may create challenges for rapid innovation. The US has the opportunity to adopt a more agile, industry-led framework that promotes both safety and rapid progress.
This regulatory landscape makes Elon Musk's perspective particularly valuable. Despite being one of tech's most prominent advocates for innovation, he has consistently warned about AI's existential risks. His concerns gained particular resonance when his own Grok AI system demonstrated the technology's pitfalls: it was Grok that spread misinformation about NBA player Klay Thompson. Yet rather than advocating for blanket regulation, Musk emphasizes the need for industry-led safety measures that can evolve as quickly as the technology itself.
The U.S. tech sector has an opportunity to demonstrate a more agile approach. While the EU implements broad prohibitions on practices like emotion recognition in workplaces and untargeted facial image scraping, American companies can develop targeted safety measures that address specific risks while maintaining development speed. This isn't just theory – we're already seeing how thoughtful guardrails accelerate progress by preventing the kinds of failures that lead to regulatory intervention.
The stakes are significant. Despite hundreds of billions invested in AI development globally, many applications remain stalled due to safety concerns. Companies rushing to deploy systems without adequate protections often face costly setbacks, reputational damage, and eventual regulatory scrutiny.
Embedding innovative safety measures from the start allows for more rapid, sustainable innovation than uncontrolled development or excessive regulation. This balanced approach could cement American leadership in the global AI race while ensuring responsible development.
The cost of inadequate AI safety
Tragic incidents increasingly reveal the dangers of deploying AI systems without robust guardrails. In February, a 14-year-old from Florida died by suicide after engaging with a chatbot from Character.AI, which reportedly facilitated troubling conversations about self-harm. Despite marketing itself as "AI that feels alive," the platform allegedly lacked basic safety measures, such as crisis intervention protocols.
This tragedy is far from isolated. Additional stories about AI-related harm include:
Air Canada's chatbot made an erroneous recommendation to a grieving passenger, suggesting he could claim a bereavement fare up to 90 days after purchasing his ticket. This was not true, and it led to a tribunal case in which the airline was found responsible for reimbursing the passenger. In the UK, AI-powered image generation tools were criminally misused to create and distribute illegal content, leading to an 18-year prison sentence for the perpetrator.
These incidents serve as stark warnings about the consequences of inadequate oversight and highlight the urgent need for robust safeguards.
Overlooked AI risks and their broader implications
Beyond the high-profile consumer failures, AI systems introduce risks that, while perhaps less immediately visible, can have serious long-term consequences. Hallucinations—when AI generates incorrect or fabricated content—can lead to security threats and reputational harm, particularly in high-stakes sectors like healthcare or finance. Legal liability looms large, as seen in cases where AI dispensed harmful advice, exposing companies to lawsuits. Viral misinformation, such as the Grok incident, spreads at unprecedented speeds, exacerbating societal division and damaging public figures.
Personal data is also at risk. Increasingly sophisticated algorithms can be manipulated through prompt injections, where users trick chatbots into sharing sensitive or unauthorized information. And these examples are just the tip of the iceberg. When applied to national security, the grid, government, and law enforcement, the same faults and failures suggest much deeper dangers.
Additionally, system vulnerabilities can lead to unintended disclosures, further eroding customer trust and raising serious security concerns. This distrust ripples across industries, with many companies struggling to justify billions spent on AI projects that are now stalled due to safety concerns. Some applications face significant delays as organizations scramble to implement safeguards retroactively—ironically slowing innovation despite the rush to deploy systems rapidly.
Speed without safety has proven unsustainable. While the industry prioritizes swift development, the resulting failures demand costly reevaluations, tarnish reputations, and create regulatory backlash. These challenges underscore the urgent need for stronger, forward-looking guardrails that address the root causes of AI risks.
Technical requirements for effective guardrails
Effective AI safety requires addressing the limitations of traditional approaches like retrieval-augmented generation (RAG) and basic prompt engineering. While useful for enhancing outputs, these methods fall short in preventing harm, particularly when dealing with complex risks like hallucinations, security vulnerabilities, and biased responses. Similarly, relying solely on in-house guardrails can expose systems to evolving threats, as they often lack the adaptability and scale required to address real-world challenges.
A more effective approach lies in rethinking the architecture of safety mechanisms. Models that use LLMs as their own quality checkers—commonly referred to as "LLM-as-a-judge" systems—may seem promising but often struggle with consistency, nuance, and cost.
A more robust, cheaper alternative is using multiple specialized small language models, where each model is fine-tuned for a specific task, such as detecting hallucinations, handling sensitive information, or mitigating toxic outputs. This decentralized setup enhances both accuracy and reliability while maintaining resilience, as precise, fine-tuned SLMs are more accurate in their decision-making than LLMs that are not fine-tuned for one specific task.
Multi-SLM guardrail architectures also strike a critical balance between speed and accuracy. By distributing workloads across specialized models, these systems achieve faster response times without compromising performance. This is especially crucial for applications like conversational agents or real-time decision-making tools, where delays can undermine user trust and experience.
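As an illustration only, here is a rough sketch of what a multi-SLM guardrail layer might look like in practice; the checks, thresholds, and heuristics below are invented stand-ins for fine-tuned small models, not any particular vendor's implementation.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class CheckResult:
    name: str
    passed: bool
    detail: str = ""

# In a production system, each checker would wrap a small language model
# fine-tuned for exactly one task (hallucination detection, PII handling,
# toxicity filtering). Simple heuristics stand in here so the sketch runs.
def pii_check(response: str) -> CheckResult:
    flagged = any(term in response.lower() for term in ("ssn", "credit card number"))
    return CheckResult("pii", not flagged, "possible sensitive data" if flagged else "")

def toxicity_check(response: str) -> CheckResult:
    flagged = any(word in response.lower() for word in ("idiot", "stupid"))
    return CheckResult("toxicity", not flagged, "possibly toxic wording" if flagged else "")

def grounding_check(response: str, context: str) -> CheckResult:
    # Crude stand-in for a hallucination detector: require lexical overlap
    # between the draft answer and the retrieved source context.
    overlap = set(response.lower().split()) & set(context.lower().split())
    ok = len(overlap) >= 3
    return CheckResult("grounding", ok, "" if ok else "low overlap with source context")

def run_guardrails(response: str, context: str) -> List[CheckResult]:
    checks: List[Callable[[], CheckResult]] = [
        lambda: pii_check(response),
        lambda: toxicity_check(response),
        lambda: grounding_check(response, context),
    ]
    return [check() for check in checks]

if __name__ == "__main__":
    draft = "The refund policy allows returns within 30 days of purchase."
    source = "Our refund policy: items may be returned within 30 days of purchase."
    results = run_guardrails(draft, source)
    failed = [r for r in results if not r.passed]
    print("BLOCKED" if failed else "APPROVED", [(r.name, r.passed) for r in results])
```

Each checker can be swapped for a dedicated fine-tuned SLM without touching the rest of the pipeline, which is the decentralization the approach relies on.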
By embedding comprehensive, adaptable guardrails into AI systems, organizations can move beyond outdated safety measures and provide solutions that meet today’s demands for security and efficiency. These advancements don’t stifle innovation but instead create a foundation for deploying AI responsibly and effectively in high-stakes environments.
Path forward for US leadership
America's tech sector can maintain its competitive edge by embracing industry-led safety solutions rather than applying rigid regulations. This requires implementing specialized guardrail solutions during initial development while establishing collaborative safety standards across the industry. Companies must also create transparent frameworks for testing and validation, alongside rapid response protocols for emerging risks.
To solidify its position as a leader in AI innovation, the US must proactively implement dynamic safety measures, foster industry-wide collaboration, and focus on creating open standards that others can build upon. This means developing shared resources for threat detection and response, while building cross-industry partnerships to address common safety challenges. By investing in research to anticipate and prevent future AI risks, and engaging with academia to advance safety science, the U.S. can create an innovation ecosystem that others will want to emulate rather than regulate.
We've featured the best AI phone.
This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
Stranger Things season 5 is, unsurprisingly, shaping up to be the hit series' biggest entry yet – but I don't think any of us realized how much footage was shot.
The short answer? A lot. Like, a lot a lot. That's according to the massively successful Netflix show's creators Matt and Ross Duffer, who have revealed they filmed more than 650 hours of material for Stranger Things' final season.
The siblings, who are collectively known as the Duffer brothers (shocking, I know), confirmed as much at Next on Netflix 2025. Taking to the stage during the Los Angeles edition of this year's multinational event, the duo tentatively lifted the lid on season 5's development, which included the fascinating tidbit about the hundreds of hours of footage they collected during its 12-month shoot.
Stranger Things season 4's final episode set up a potentially barnstorming end to the hit series (Image credit: Netflix)
"We spent a full year filming this season," Ross Duffer said. "By the end, we'd captured over 650 hours of footage. So, needless to say, this is our biggest and most ambitious season yet."
Echoing Stranger Things star Maya Hawke's previous comments that season 5 will be "basically, eight movies", the Duffers also teased that the sci-fi horror series' final chapter may emotionally devastate us when Stranger Things season 5 launches later this year.
"We think it's our most personal story," Matt Duffer added. "It was super intense and emotional to film – for us and for our actors. We’ve been making this show together for almost 10 years. There was a lot of crying. There was SO much crying. The show means so much to all of us, and everyone put their hearts and souls into it. And we hope – and believe – that passion will translate to the screen."
What new footage was revealed as part of Stranger Things 5's exclusive Next on Netflix teaser?
Stranger Things season 5 is still on track to be released in 2025 (Image credit: Netflix)
Potential spoilers follow for Stranger Things season 5.
As I mentioned at the start of this article, the Duffer brothers also debuted a tantalizing new look at Stranger Things 5 during Next on Netflix 2025. Just like the Stranger Things season 5 video that was unveiled during Netflix Tudum 2024 and the first-look teaser released online last July that teased new characters and a possible time jump, though, it was just another behind-the-scenes (BTS) look at the forthcoming installment.
That doesn't mean it wasn't worth showing, mind you. I attended the UK edition of Next on Netflix and, with the video being livestreamed to me and other audience members from LA, I can report on what was shown. For starters, the latest BTS video gave us a look at the returning Vecna, who appears to have recovered from the injuries he sustained in last season's finale (read my Stranger Things season 4 volume 2 ending explained piece for more details).
The footage also showed Eleven wearing baggy clothes – presumably in a bid to conceal her identity – and using her supernatural abilities to fight some actors in mocap suits who are running on all fours. Are these individuals acting out the movements for the return of season 2's demodogs? I imagine so.
There were also blink-and-you'll-miss-it clips of Max running through The Void, Hopper brandishing a shotgun, someone screaming as they're seemingly attacked by Vecna (it was hard to make out who this was), and some of our heroes interacting with season 5's newcomers, including Jake Connelly's mystery character. The footage was played alongside audio of a conversation between Mike and Eleven, too, with the former telling the latter that they'll finish this fight together.
All in all, season 5 looks and sounds fantastic – so, when will it launch on Netflix? The short answer is: we don't know. Stranger Things 5 only wrapped filming on December 20, 2024, and given the amount of footage that the Duffers have to sift through as part of the post-production process, I'm convinced we won't see one of the best Netflix shows return to our screens for the last time until late 2025. According to Netflix's Chief Content Officer Bela Bajaria, Netflix Tudum 2025 will take place in May, so maybe we'll learn more about season 5's release window then.
It's nearly time to close the final chapter of Joe Goldberg's story, as Netflix has released a teaser clip and image of You season 5 ahead of its debut on April 24.
As part of the Next on Netflix 2025 event that took place on January 29, the best streaming service for genre hoppers unveiled a new image of Penn Badgley as Joe Goldberg in season 5. After disguising himself as an English professor in London, the book-loving killer is back where it all began in New York City. The image of a now clean-shaven Joe is reminiscent of You season 1, and marks the beginning of him getting his old life back after that bombshell season 4 ending. I mean, my brain is still spinning from it all.
Netflix also shared a new teaser (see below) of Joe in the infamous glass cage that's housed many of his prisoners throughout the four seasons. "I'm Joe. Let's get to know each other better before we bid each other one last farewell. Goodbye, you," Joe ominously says inside the cage.
What do we know about You season 5?
*Contains spoilers for the You season 4 ending*
The official logline for You season 5 reads: "In the epic fifth and final season, Joe Goldberg returns to New York to enjoy his happily ever after… until his perfect life is threatened by the ghosts of his past and his own dark desires."
As the season unfolds, Joe connects with a young woman called Bronte (Madeline Brewer), who gets a job at his new bookstore. The enigmatic playwright makes Joe question his affluent life as they bond over literature and loss; meanwhile, he also has to contend with his wife Kate's (Charlotte Ritchie) siblings.
It seems that's not the only problem Joe faces in the Big Apple, as Badgley previously teased at Tudum that a familiar face from Joe's past will come back to haunt him. There are many people Joe has wronged, though. Could it be the falsely imprisoned Dr. Nicky (John Stamos) from season 1? Orphaned Ellie (Jenna Ortega) from season 2? Or Joe's former season 3 love interest Marienne (Tati Gabrielle)?
Across the four seasons, the murderous bookstore manager's deadly pursuit of love has taken him to Los Angeles, San Francisco, and London, where Joe found himself at the center of a mind-boggling whodunnit (I'll save you the details). This ordeal forced him to finally accept the undeniable truth that he was a bad person – a fact he ignored for too long. Now, Joe is back in New York City with his partner, Kate, armed with a dangerous new lease of life. But will Joe's past finally catch up with him in season 5 of one of the best Netflix shows?
Many consider the dawn of AI to be marked by ChatGPT's rapid success; however, AI's evolution did not happen overnight, and the opportunities it offers will not vanish tomorrow.
The AI hype has led many firms to succumb to "AI FOMO" and rush into adoption without a clear strategy. AI FOMO has caused many businesses to make impulsive, short-term decisions that lack the strategic foresight needed to leverage AI for sustained success. With research revealing that over 80% of AI projects fail, now is the time for businesses to avoid being swept up by the hype and instead ground their approach in a thorough understanding of this transformative technology.
But where is AI FOMO causing businesses to go wrong, and how should businesses be approaching AI implementation to ensure success?
Avoid the AI cookie-cutter approach at all costs
In a rush to capitalize on the AI hype, almost half of businesses leverage off-the-shelf AI solutions. These pre-built solutions can be used without requiring businesses to develop their own technology and are designed to be easily integrated into a business. They are often the favored choice because they offer quick deployment and lower up-front costs.
However, despite their efficient exterior, these solutions are not as beneficial as they appear. By leveraging off-the-shelf solutions, businesses open themselves up to the perils of vendor lock-in.
Firms will see their flexibility greatly limited: they will be unable to switch providers to suit their own business needs and contexts, and will be required to upload all of their data into their chosen provider's infrastructure. This is not only a time-consuming but also an expensive process, especially when firms look to scale their AI applications, as they have to re-upload their data into that infrastructure, incurring significant additional costs through LLM providers' commercial token models.
Another issue is that one-size-fits-all AI solutions rarely fit anyone well. Businesses should not underestimate the importance of molding each AI solution around their data and business requirements. A cookie-cutter approach to AI implementation will consistently fail to see and understand the nuances of an individual business and its specific requirements, producing AI applications that don't deliver the desired accuracy rates, eroding trust in the technology, and leading employees to abandon tools that were supposed to enhance productivity.
Perhaps most concerningly, when utilizing off-the-shelf AI tools and applications, firms completely surrender their own IP. While considerable concerns have been raised about the likes of OpenAI training their models on users' data, less has been made of the long-term implications of IP loss. Businesses rushing into AI implementations may not be concerned with this in the short term, but the long-term negative impacts are substantial. The rapid rate of AI development means that realizing long-term success from AI is not a tick-box activity and will require constant development. Firms that don't own their own IP will face significant barriers when trying to remain competitive in an increasingly AI-driven business landscape. At the end of the day, it is crucial to remember that this is your data and your context; it should therefore be your IP.
Embracing an agnostic approach to AI
To avoid the perils of AI FOMO, businesses have to embrace an agnostic approach to AI. Agnostic AI is not just about avoiding token costs; it is a curated methodology that allows businesses to pick the optimal approach to achieve their desired outcome. Ironically, this method yields lower compute requirements and higher accuracy, and provides businesses with a solution that can evolve alongside technological advancements.
Those who take a step back and take a long-term view of AI implementation will see the clear benefits of not jumping straight into off-the-shelf models. Building an agnostic AI model instead will allow businesses to tap into the most cost-effective and optimal LLM for each use case. It will also allow businesses to tailor each use case to a specific domain to improve the effectiveness of the model.
Utilizing an agnostic AI infrastructure allows businesses to remain agile and versatile, enabling firms to fine-tune different LLMs to solve unique problems. Rather than relying on a single model to address every challenge, businesses can leverage multiple LLMs to provide tailored solutions and select the most cost-effective and efficient model for each specific problem.
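In practice, the 'agnostic' part often amounts to a thin routing layer that keeps use cases decoupled from any one provider. The sketch below is a simplified illustration; the use case names and model stubs are hypothetical, and real handlers would call vendor SDKs or self-hosted models.

```python
from typing import Callable, Dict

# Each route maps a business use case to whichever model currently offers the
# best cost/accuracy trade-off for it. The provider functions are stubs; in a
# real system they would call a vendor SDK or a self-hosted, fine-tuned model.
def call_small_local_model(prompt: str) -> str:
    return f"[local-model] {prompt[:40]}..."

def call_large_hosted_model(prompt: str) -> str:
    return f"[hosted-model] {prompt[:40]}..."

ROUTES: Dict[str, Callable[[str], str]] = {
    "customer_support_triage": call_small_local_model,  # cheap, latency-sensitive
    "contract_summarization": call_large_hosted_model,  # accuracy-sensitive
}

def run(use_case: str, prompt: str) -> str:
    handler = ROUTES.get(use_case)
    if handler is None:
        raise ValueError(f"No model configured for use case: {use_case}")
    return handler(prompt)

if __name__ == "__main__":
    print(run("customer_support_triage", "My order arrived damaged, what should I do?"))
```

Because the mapping lives in one table, switching a use case to a cheaper or better-performing model is a one-line change rather than a re-platforming exercise.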
An agnostic approach will also allow businesses to be agile in the face of the ever-changing AI landscape, keeping up with changing market dynamics and regulatory requirements. This approach provides businesses with the freedom and flexibility to switch or update tools as regulations and rival firms evolve to ensure they maintain their competitive edge and are consistently compliant.
The allure of AI can be powerful, but businesses must resist the urge to leap without looking. Succumbing to AI FOMO often leads to missteps, inefficiencies, and missed opportunities. By avoiding cookie-cutter solutions, and adopting an agnostic approach to AI, businesses can position themselves for long-term success in the AI era.
AI is not a race to be won overnight; realizing its true transformative potential requires strategy, adaptability, and a clear focus on the future.
We've featured the best business plan software.
This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro
Microsoft has unveiled the next generation of Surface Copilot+ PC devices aimed at business and enterprise users, with the new offerings firmly planting AI front and center.
The new Surface Pro and Surface Laptop offer "a significant leap in x86 performance", the company says, providing gains in performance and productivity alongside an upgraded NPU for business-focused AI tasks.
The launches include a new Surface Laptop available in 13.8in or 15in display options, alongside an upgraded Surface Pro for those looking for something a bit more flexible.
Surface Laptop for Business
"Customers are choosing Surface Copilot+ PCs today for improvements in performance, battery life, and security," noted Nancie Gaskill, General Manager, Surface.
"Paired with Microsoft 365 Copilot and enhanced AI processing power, these devices transform the employee experience to amplify your team's efficiency and creativity through Copilot+ PC experiences designed for work."
Officially known as the Surface Pro 11th Edition and Surface Laptop 7th Edition, the two new releases are available with Intel’s latest (series 2) Core Ultra processors, but users will have the option of Intel or Snapdragon-powered devices.
Microsoft also revealed customers will soon have the option of a 5G-enabled Surface device, with an all-new Surface Laptop 5G arriving later in 2025 to give users even more connectivity when on the go.
Alongside its Intel power, the new Surface Laptop for Business includes up to 22 hours of battery life, Wi-Fi 7 connectivity, added ports, and even customizable haptic typing with a larger touchpad.
Microsoft says that despite the slightly smaller dimensions on paper, its 13.8in display actually offers a larger viewing space than other 14in displays on the market due to ultra-thin bezels and also features an anti-reflective display for added privacy.
The upgraded device also offers a major performance boost, with Microsoft claiming up to 26% faster performance when multi-tasking, up to 2x faster graphics performance, and even up to 3x the battery life when on Teams calls.
Alongside this, the device features a powerful NPU that Microsoft says makes it the ideal workplace AI companion, powering tools and functions such as the new Windows “Descriptive Search” function across local and OneDrive files, Click to Do, and Microsoft Teams upgrades such as “Super Resolution” and live captions in more than 40 languages.
Surface Pro for Business
Microsoft says the new Surface Pro for Business is designed to replace your existing laptop, tablet, and pen and paper in a single device, making it the company's most powerful tablet to date.
It also offers more connections than previous versions, with support for up to three external 4K displays, along with boosted hardware that provides 28% more performance than the Surface Pro 9 and 50% more battery life on Microsoft Teams calls.
“It’s never been more effortless to get work done,” noted Gaskill. "These new Copilot+ PCs offer a solution for every employee."
"Surface Copilot+ PCs are the ideal choice to modernize your business, offering the best combination of hardware, software and unparalleled security to support your business needs - these devices help make your business future-ready."
Both the Surface Laptop for Business and Surface Pro for Business will be available from February 18, 2025, for $1,499.
AMD's RX 9070 models could handily outgun Nvidia's mid-range RTX 5070 Ti and 5070 graphics cards, if a considered prediction from a YouTuber pans out.
The prediction comes from regular rumor peddler Moore's Law is Dead (MLID), who in his latest video (embedded below) engages in some napkin maths to work out where the performance of the RTX 5070 and 5070 Ti GPUs is likely to weigh in (more on the intricacies therein shortly). The YouTuber then compares that to internal benchmarks purportedly carried out by AMD a month ago with its RX 9070 models.
The upshot is this: going by those internal tests from AMD – and apply skepticism to all of this, both the benchmarks and MLID's own theories – Team Red was targeting a slight win for the RX 9070 XT over the RTX 4080 Founders Edition (to the tune of 3% or so).
MLID then took that level of estimated performance and overlaid it on a graph of benchmarks (a 17-game average) from Hardware Unboxed that includes the RTX 5080. From this, we see that in theory, the RX 9070 XT is within 10% of the RTX 5080 for rasterized (non-ray tracing) performance at 4K resolution.
On top of that, the YouTuber added in the mentioned napkin maths approximations of RTX 5070 performance, which is that the vanilla RTX 5070 is likely to come in at about 20% faster than Nvidia’s RTX 4070 (so in the ballpark of the RTX 4070 Ti). And that the RTX 5070 Ti is likely to be a rather minor generational uplift, and maybe only slightly faster than the RTX 4070 Ti Super.
Granted, that adds in a good deal of uncertainty, and ifs-and-buts, though it is based on sound enough reasoning. (Namely the uplifts we’ve seen for the RTX 5090 and 5080, on average – a strong flavor of the architectural gains for Blackwell, in other words – and then the relative specs of the new RTX 5070 models versus their predecessors).
RTX 5070 performance may not pan out like this, but if it roughly does, MLID theorizes that the RX 9070 XT (based on those internal AMD benchmarks) could potentially be 15% faster than the RTX 5070 Ti. And that the RX 9070 versus the RTX 5070 could see a win for AMD, too, more to the tune of 10%, but still, a marked victory.
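To make the napkin maths concrete, here's the 'within 10% of the RTX 5080' claim expressed as a tiny calculation. The RX 9070 XT figure comes from AMD's purported internal testing cited above; the RTX 5080's uplift over the RTX 4080 is an illustrative assumption plugged in purely to show how the percentages chain together, not a measured result.

```python
# Normalize rasterized 4K performance to RTX 4080 = 100.
rtx_4080 = 100.0

# From AMD's purported internal benchmarks: RX 9070 XT ~3% ahead of the RTX 4080.
rx_9070_xt = rtx_4080 * 1.03

# Illustrative assumption, not a measured figure: RTX 5080 ~12% ahead of the RTX 4080.
rtx_5080 = rtx_4080 * 1.12

print(f"RX 9070 XT relative to RTX 5080: {rx_9070_xt / rtx_5080:.0%}")  # ~92%, i.e. within 10%
```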
Analysis: Performance means little without price
Right, so is AMD set to own the GPU mid-range this year? Well, as I've already said a couple of times – but it bears another mention just to underline – a lot of this is up-in-the-air theorizing, albeit workings-out that make sense to me. MLID lays some heavy caveats on all this himself, although the YouTuber does assert that he's confident enough in these predictions on the whole.
We should bear in mind that the graphs used (from Hardware Unboxed) cover just straight rasterized performance. MLID also notes he's confident AMD has almost caught up to Nvidia with ray tracing in this generation, but another major piece of the puzzle is DLSS 4 and Team Green's new frame generation. The latter MFG feature, and other improvements in DLSS 4, are actually huge – and that shouldn't be underestimated. We don't yet know how FSR 4, AMD's rival next-gen tech, will shake out, and so that remains a fairly weighty question mark here.
Another critical point here is that it’s all very well analyzing (potential) relative performance levels in a theoretical exercise like this, but even if this proves correct as to the comparative frame rates we’ll get from the RX 9070 versus RTX 5070 models, there’s AMD’s pricing to consider. We know the rough value proposition of the RTX 5070 flavors as we have the MSRPs, but we don’t with the RDNA 4 graphics cards.
My worry, then, is that AMD will be calculating where to pitch RX 9070 asking prices based on the Nvidia RTX 5070 reviews when they arrive in February (well, if Team Green sticks to its promised launch timeframe for these mid-range graphics cards). It’s certainly been rumored that AMD is still very much weighing up pricing, and the question then becomes: how much does Team Red want to take Nvidia down in the mid-range space?
If that’s a strong motive here, AMD might come in with really competitive MSRPs for the RX 9070 models. But, if maximizing profits and return is higher up the priority list for RDNA 4, then we could get weightier than rumored asking prices.
Who knows, is really the point, and the RX 9070 will only be an RTX 5070 killer – assuming MLID’s napkin scribbling and GPU hypothesizing is in the right ballpark – if AMD prices it to be an RTX 5070 killer. Hopefully that’s the intent, and MLID suggests $499 and $649 (US) as possible price tags for a suitably aggressive move with the RX 9070 and its XT sibling respectively.
Previously, there were hopes of a sub-$500 price for the RX 9070, but if performance does shape up anything like as suggested here, there’s no reason AMD would need to dip lower than the mentioned $499. And again, this comes back to my worry that AMD might feel free to just push pricing harder than originally intended, perhaps, if the RX 9070 models are outmuscling the RTX 5070s in this vein.
Get ready for Gi-hun's final hurrah, everyone, because Netflix has confirmed when Squid Game season 3 will debut on its streaming platform: June 27, 2025.
First announced at Next on Netflix 2025 last night (January 29) before being publicly revealed almost 24 hours later, Squid Game's final season will bring Seong Gi-hun's (Lee Jung-jae) quest to stop the deadly games to an end. However, as one of season 3's first-look images – check them out below – reveals, he'll need to escape captivity first before trying to rally the latest games' remaining contestants again in a bid to thwart Front Man once and for all.
Image 1 of 4: I'm not sure how Gi-hun will get out of this one... (Image credit: Netflix)
Image 2 of 4: Will Kang No-eul join the good guys in season 3? (Image credit: Netflix)
Image 3 of 4: Season 3's first episode will dissect the fallout from its predecessor's explosive and bloody finale (Image credit: Netflix)
Image 4 of 4: What will become of Front Man in season 3? (Image credit: Netflix)
Squid Game's first season unexpectedly took the world by storm following its September 2021 release. Indeed, thanks to TikTok and positive word of mouth, Hwang Dong-hyuk's survival drama series was soon catapulted to the top of Netflix's most-watched TV shows list of the year. Since then, it's not only gone on to become one of the best Netflix shows of all time, but also set a seemingly unbreakable record as the best streaming service's most-watched TV Original ever. Not even Stranger Things, the first major hit Netflix had on its hands, could stop its unrelenting march to the number one spot. I suspect Stranger Things season 5 may run it close once it's released later this year, though.
But I digress. Squid Game season 1's meteoric rise and enduring popularity meant that the wait for its second season was an excruciating one at best. We already knew that Squid Game season 3 would be released sometime in 2025 – indeed, it was confirmed as much last August. Even so, with plenty of other big Netflix series set to return this year, it's good to know that there won't be such a long break between last season and the hugely successful Korean language show's forthcoming final chapter.
New season, new games
Season 3 will introduce a revised, 'random chance' version of its forebear's between-rounds voting minigame (Image credit: No Ju-han/Netflix)
Full spoilers follow for Squid Game season 2.
Squid Game season 3's official release date and first-look images weren't the only things that Next on Netflix 2025 attendees were treated to. I was present at the UK edition of the now-annual event and, as part of the Los Angeles edition, the first portion of which was livestreamed to us, Netflix exclusively showed us the first clip from season 3. Major spoilers immediately follow for said footage as well as season 2's soul-crushing finale, so turn back now if you don't want to know anything.
Will Myung-gi and Jun-hee finally reconcile in season 3? (Image credit: No Ju-han/Netflix)
The clip showed the latest edition of the games' remaining participants standing in a rectangular room. A giant gumball machine, which contains red and blue balls, is situated in the middle, with Squid Game's iconic pink-colored jumpsuit-wearing armed personnel standing guard.
When instructed, each contestant turns the gumball machine's handle until one of the balls drops down to be collected. Those who received a red one are told to stand in a red rectangle, whose shape is outlined on the floor. The same is true for participants who collect a blue ball. The clip ends with a disheveled and emotionally devastated Gi-hun approaching the machine to collect the final ball, which happens to be red. He's still wearing the games' famous green tracksuit, too, which is now stained with the blood of murdered best friend Park Jung-bae (Lee Seo-hwan), who was gunned down by Front Man/Hwang In-ho (Lee Byung-hun) after Gi-hun's failed rebellion in season 2's final episode.
Gi-hun's story will end with the main show's third and final season (Image credit: Netflix)
The separation of the games' remaining contestants throws up some interesting conundrums. In season 2, at the end of each game, participants were asked to vote on whether the games should continue or not. After voting, each person was sorted into one of two camps: those who wanted to leave were given a red badge with an 'X' on it, while those who wanted to continue were given a blue badge with an 'O' adorning it.
The gumball machine mini-game completely upends that formula. Indeed, its 'random chance' element means anyone in the red camp could receive a blue ball and vice versa, meaning they could be reassigned to another group. That proves to be the case, with mother-son duo Park Yong-sik and Jang Geum-ja separated into opposing camps. Apart from voting differently after this season's 'Red Light, Green Light' game, they've both voted to leave at the end of each round. Seeing them, as well as some of season 2's other main supporting cast members, placed in a different group than before, means they'll have to form uneasy alliances with players they previously disagreed with. They may even be forced to compete against former allies, too, which will only raise the stakes in the remaining games.
I'll be reporting on Squid Game season 3 in the lead up to and during its release, so keep it locked to TechRadar for more news as and when I have it.
Things move quickly in the AI sphere, and no sooner have we got used to having DeepSeek around than a new contender is on the scene. Alibaba, one of China's leading tech companies, has released a new AI model called Qwen2.5-Max, which it claims is superior to both DeepSeek-V3 and ChatGPT-4o in various benchmarks.
It’s important to note that Qwen2.5-Max is not a reasoning model, like DeepSeek-R1 or ChatGPT-o1, so you can’t see the ‘thinking’ it does to get to each answer. It works on a level that's comparable to DeepSeek-V3 or ChatGPT-4o.
In a post on its website, the Qwen team says “Our base models have demonstrated significant advantages across most benchmarks, and we are optimistic that advancements in post-training techniques will elevate the next version of Qwen2.5-Max to new heights.”
Benchmarks posted
Benchmarks comparing the performance of the instruct models for Qwen2.5-Max vs rivals like Llama-3.1, DeepSeek-V3 and ChatGPT-4o (Image credit: Qwen/Alibaba)
The benchmarks posted by the Qwen team, such as Arena-Hard, LiveBench, LiveCodeBench, and GPQA-Diamond, show Qwen2.5-Max outperforming its rivals, while also demonstrating competitive results in other assessments, including MMLU-Pro.
Unlike DeepSeek, Alibaba’s Qwen2.5-Max is not an open-source project, which means that certain details about how it works are not public knowledge.
Try it now
The easiest way to try Qwen2.5-Max for yourself is via the Qwen Chat chatbot in a web browser. You need to sign in with an email address or your Google account. Unlike with the DeepSeek chatbot, there appear to be no issues with time-outs when signing up for a Qwen account right now.
There doesn't appear to be an official Qwen mobile app at this point, although some third-party mobile apps do enable access to its LLMs.
Given the level of censorship shown by DeepSeek, another Chinese AI, when asked about subjects that are sensitive to the Chinese government, we were quite surprised that Qwen2.5-Max's answer to "Is Taiwan a country?" was more balanced and nuanced than the one offered by DeepSeek. Qwen2.5-Max, however, refused to answer the question "What happened in Tiananmen Square in 1989?", replying, "As an AI language model, I cannot discuss topics related to politics, religion, sex, violence, and the like. If you have other related questions, feel free to ask."
Launch day for Nvidia's GeForce RTX 5090 GPU has finally arrived, with the high-end GPU going on sale at 9AM ET / 6AM PST / 2PM GMT on January 30, priced at $1,999 / £1,939 for the Founders Edition card. The Founders Edition isn't available in Australia, but third-party cards start at AU$4,039.
Despite the extremely high price, stock is selling out incredibly quickly, which is where this guide to where to buy the Nvidia RTX 5090 comes in, as we'll be keeping tabs on retailers in the US, UK and Australia to help you find stock - and to ensure you don't get ripped off.
We scored Nvidia's latest flagship GPU a hearty four and a half stars in our RTX 5090 review, calling it 'the supercar of graphics cards' and praising its stellar performance and design. 8K gaming is finally on the menu, even without AI-powered upscaling tech - though Nvidia's DLSS 4 and Multi Frame Generation features are nothing to sniff at, either.
Throughout the day we'll be bringing you stock updates and buying advice to help you snag a new RTX 5090. We also have a guide on where to buy the RTX 5080, in case you want to buy the more affordable GPU, which is also going on sale today (and which is also expected to sell out fast).
WHERE TO BUY RTX 5090: US QUICK LINKS
Orders for the Nvidia RTX 5090 will go live in the US on January 30, but stock is in high demand, with retailers expected to sell out. Below, you can find all the retailers we recommend checking out:
Nvidia RTX 5090 deals at Nvidia
This should be your first port of call once stock drops: Nvidia sells not just third-party cards but its own Founders Edition versions of the RTX 5090 at MSRP - but this means that stock is likely to run out fast.
Nvidia RTX 5090 deals at Best Buy
Best Buy is often a good place to pick up a new RTX card at retail price, and you can get the RTX 5090 Founders Edition here too. It's also worth noting that the retailer does sometimes throttle stock of highly-desired products to prioritize My Best Buy members, so if you've got a subscription, this could be your best option.
Nvidia RTX 5090 deals at Newegg
Newegg is a trusted retailer when it comes to PC hardware, and is often worth checking out when new graphics cards drop. It also offers a GPU trade-in scheme, which can save you some cash if you offer up your old graphics card as a sacrifice to the RTX gods.
Nvidia RTX 5090 deals at B&H
Although B&H is best known as a photography retailer, it also stocks computer hardware - most crucially graphics cards. Since it's not one of the better-known GPU sellers, it can be a good place to track down stock of high-end cards.
Nvidia RTX 5090 deals at Amazon
Amazon is always worth checking, but it can also be a battleground for third-party scalpers and scammers whenever a new flagship Nvidia GPU drops, so be wary - don't drop $4,000+ on a suspect listing here.
The Nvidia RTX 5090 will go on sale in the UK on January 30 as well, though stock is likely to be limited there too. Here are the retailers we recommend keeping an eye on:
Nvidia RTX 5090 deals at Nvidia
Nvidia will be selling the RTX 5090 on its own UK website as well, but be aware that the site is liable to go down or run slowly as eager shoppers flock to Team Green's storefront. You can buy not just the Founders Edition but also third-party models.
Nvidia RTX 5090 stock at Overclockers
Overclockers UK is a great site for buying components, and if you act quickly, you might be able to get your hands on a third-party RTX 5090 model on launch day. Amusingly, Overclockers currently has the price of every 5090 card set to £25,000 - hopefully not a portent of times to come!
Nvidia RTX 5090 stock at Ebuyer
Strangely, Ebuyer only has a live hub page for RTX 5080 (not RTX 5090) cards prior to launch, but we'll keep an eye out in case this changes.
Nvidia RTX 5090 stock at Scan
Scan will also be selling a range of RTX 5090 cards, although the selection appears to be a bit limited compared to some other retailers - however, you'll also be able to pick up a fully pre-built RTX 5090 gaming PC if that takes your fancy instead.
Nvidia RTX 5090 stock at CCL
There looks to be a decent selection of third-party RTX 5090 models at CCL. The retailer is limiting GPUs to one per customer, so this could help prevent scalpers from buying up all the stock instantly to sell at a higher price. Hopefully.
The Nvidia RTX 5090 went on sale in Australia on January 31 at 1AM AEDT, though stock quickly sold out across Australian PC component stores. Unlike the US and the UK, Australian retailers don't typically stock the Founders Edition card, but third-party options are available. Here are the retailers we recommend keeping an eye on:
Nvidia RTX 5090 stock at Computer Alliance
Computer Alliance is a trusted retailer of PC parts and accessories, and it stocks both third-party RTX 5080 and RTX 5090 cards. It has a one-per-customer limit in place to prevent scalping.
Nvidia RTX 5090 stock at JW Computers
Keep an eye on JW Computers when shopping for an RTX 5090 GPU. Although the retailer doesn't have stock of the card right now, it may restock soon.
Nvidia RTX 5090 stock at Mwave
RTX 5090 cards are currently out of stock at Mwave, but the retailer has a handy 'notify me' button in place to alert customers when they return.
Nvidia RTX 5090 stock at PLE Computers
PLE Computers is limiting RTX 5090 sales to one per customer, and all cards are sold out at the moment. Check back later for a restock.
I’m very excited now. The anticipation. We’re less than 90 minutes away from stock dropping at all major retailers in the US, UK, and beyond, so get ready to start hitting refresh on some online storefronts.
From early examinations of leaks and rumors (and our knowledge of how things went down during previous Nvidia GPU launches), I've concluded that RTX 5080 stock is highly likely to be at least a bit more plentiful, so consider checking out our Where to Buy the RTX 5080 page to improve your odds of getting a new graphics card today.
After all, our RTX 5080 review concluded that it’s a stellar high-end GPU that should sit comfortably inside any PC gamer’s custom build for years to come, so it’s worthy of your consideration.
If you’re dead set on getting an RTX 5090 today, though, you’re in for an uphill battle: stock is expected to be extremely thin on the ground at both physical stores and digital retailers, with a legion of scalpers no doubt prepping their bots as I type this. Pricing is expected to sit above the $1,999 / £1,939 / AU$4,039 MSRP for most (if not all) third-party models, and you can bet that the first-party Founders Edition cards from Nvidia will sell out like hot cakes.
Still, there are measures you can take: make sure you’re signed up for stock alert notifications at every retailer that’s offering them, and be prepared to mash F5 when the expected 9AM ET / 6AM PST / 2PM GMT launch time rolls around.
If previous RTX launch days are any indication, it's possible that US stock will sell out near-instantly, but UK stock might stick around for at least a little longer. What can I say? We're all broke on this side of the pond right now; we can't all be rushing out to drop a cool two thousand on a new GPU.
However, this could create an opportunity for US shoppers. While some UK retailers are either limiting sales to UK buyers or simply don't ship overseas, if you know somebody in the UK, you might stand a better chance of getting your hands on a 5090 by having them buy it and ship it across the Atlantic to you. Sure, it might cost you more, but better to give that cash to a friend or colleague than a scalper!
Over in the UK, retailer Overclockers is currently listing all of its RTX 5090 cards with a healthy £25,000 price tag. Don't panic - this is almost certainly just a nifty trick that lets the retailer put the cards on sale immediately at 2PM GMT with a simple scheduled price change.
One Overclockers store posted on its Facebook page earlier today with a rather fun image showing stacks of RTX 5090 boxes - although if that's all the stock they have, perhaps we should be worried about those prices...
Fun fact: the RTX 5090 is going to be quite a bit smaller than its predecessor the RTX 4090, despite ostensibly being a more powerful card. Yes, I have to say 'ostensibly' because we're not past the review embargo yet, but come on, we all know it's going to perform better.
I'm personally over the moon that Nvidia has opted to slim things down for this new high-end GPU, because quite frankly the RTX 4090 was a comically oversized beast of a card regardless of which model you bought. All of the Founders Edition models of every upcoming RTX 5000 card will be certified for Nvidia's own Small Form Factor Ready scheme, meaning lovers of compact PCs and living-room builds can rejoice.
The new, sleeker RTX 5090 FE design reportedly almost didn't happen: earlier this week, we spotted a mysterious possible RTX 5090 prototype that was a seriously beefy boy, packing specs beyond the real 5090 and a truly absurd 800W power requirement.
While that prototype remains shrouded in uncertainty, it's possible that it might rear its head further down the line if Nvidia chooses to resurrect its long-dormant Titan RTX series for professional users.
For anyone who's still on the fence about buying this GPU (although I'm not sure how you ended up here if that's the case), our Nvidia GeForce RTX 5090 review is live for you to read. Our components editor, John Loeffler, gave it 4.5 stars, calling it "the supercar of graphics cards". He did knock it a bit for its "obscene" power consumption, which exceeded 550W in his testing, but praised its redesigned cooling and slimmer form factor.
Overall, it's a major step up from the RTX 4090 in terms of performance, even without factoring in DLSS 4, so once that upscaling tech rolls out on launch day, you can expect even better performance.
Speaking of DLSS... some of you might be sitting there wondering about DLSS 4 and its fancy new Multi Frame Generation tech (the latter of which will be exclusive to RTX 5000 GPUs). The viability of DLSS and other upscaling tools of its ilk has been hotly contested by some sectors of the PC gaming community, some of whom claim that it's become a crutch - an excuse for Nvidia to dial back generational hardware improvements and for game developers to cheap out on PC optimization.
But if recently released usage data is accurate (and there's frankly no reason to believe it's not), it looks like DLSS is here to stay. Thankfully, the new DLSS 4 will be backward compatible with all RTX GPUs back to the 2000 generation - unlike DLSS 3's Frame Generation, which was locked to RTX 4000 cards exclusively.
If you're currently rocking an RTX 3000 GPU in your rig and have been contemplating an upgrade once the next-gen cards drop (well, why else would you be here?) - you might want to know that Nvidia is potentially planning some retroactive upgrades to your GPU.
Nvidia VP Bryan Catanzaro recently suggested that it might be possible to bring Frame Generation to RTX 3000 cards, as the new version of Team Green's frame-gen tech doesn't rely on the Optical Flow hardware accelerator that enabled the tool in the RTX 4000 generation. Instead, it uses an AI-based solution, something that RTX 3000 cards - with their AI-capable Tensor Cores - could potentially utilize. In other words, that cutting-edge technology might soon be available for users with older GPUs, potentially nixing the need for an immediate upgrade.
If you haven't already checked out Newegg's RTX 5090 stock, do give it a look. It's running an interesting deal on GPU trade-ins right now, meaning you can trade in your old GPU and upgrade to the 5090 in one go.
Just for fun, to see how much I'd get, I plugged in my GeForce RTX 3080 Ti Gaming X Trio 12G and found Newegg would pay me $419 towards a new graphics card. That's roughly a 21% discount on the 5090's $1,999 price tag - not bad at all.
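If you want to sanity-check that figure, or plug in your own card's trade-in quote, here's a quick back-of-the-envelope sketch in Python. The $419 trade-in value and $1,999 price are the numbers quoted above; nothing else is assumed.

```python
# Quick sanity check of the trade-in saving quoted above.
# The $419 trade-in value and $1,999 MSRP come from this post;
# swap in your own card's trade-in quote to see your saving.

def trade_in_discount(trade_in_value: float, new_card_price: float) -> float:
    """Return the trade-in value as a percentage of the new card's price."""
    return trade_in_value / new_card_price * 100

saving = trade_in_discount(419.00, 1999.00)
print(f"$419 off $1,999 works out to a {saving:.1f}% saving")  # ~21.0%
```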
Another flagship GPU, another opportunity for my lovely boss (the inimitable Matt Hanson) to dive into some ridiculous 8K gaming goodness.
If the RTX 4090 was the first consumer GPU to truly enable 8K gaming, the RTX 5090 is the one that could finally take it mainstream - in large part due to DLSS 4 and the new Multi Frame Generation feature. In Matt's testing, he found that the 5090 consistently matched or beat the 4090 while the latter was using frame-gen and the former wasn't. Turn on Multi Frame Generation, and the 5090 absolutely zooms ahead; it cracks high framerates across multiple games despite the 8K resolution and Ultra graphics presets. Truly magical stuff.
While our live updates are geared towards shoppers in the US and UK, I'm going to drop some shopping advice for eager buyers joining us from other English-speaking nations.
Shoppers in Canada are liable to have a hard time picking up an RTX 5000 series card next week, but the best places to look will likely be the Canadian storefronts of Newegg and Best Buy. Walmart may also be worth a look, but it's not currently showing any listings for new GPUs. Nvidia's own website helpfully (read: unhelpfully) redirects Canadian users to the US page about the RTX 5000 series, and it's unclear whether the 'Notify Me' system will work for users in Canada - I'm using a VPN to access these sites right now.
Meanwhile, any Australians looking to pick up an RTX 5090 are likely to struggle if previous launches are anything to go off: stock is often even more limited in Australia than it is in the US and UK.
Right now, major retailers like JB Hi-Fi and Dick Smith don't appear to have any RTX 5000 cards listed at all, but BPCTech and MWave both have listings and you can sign up for notifications when stock becomes available. Some shoppers may be considering a pre-built system: in that case, you should take a look at EvaTech's RTX 5000 custom PC page - which, amusingly, contains a stern warning about highly limited day-one stock and a one-per-customer order limit.
Just under 10 minutes to go! Use this time to make sure you're signed into any retailers you're trying to buy from, as this will speed up the process when checking out - and every second could count!
(Image credit: Future)A quick reminder that if your budget doesn't stretch to the RTX 5090, or if stock seems impossible to get hold of, then you could try the RTX 5080, which is also going on sale today. We have a where to buy the Nvidia RTX 5080 guide to help you.
Stock should now be live for the RTX 5090! So let's see what's out there!
On Nvidia's website it doesn't seem like 5090s have gone live yet, but I'll keep on refreshing.
Looks like the Overclockers UK website has crashed under the pressure of people trying to buy the RTX 5090 - I'll keep you posted.
STOCK ALERT
In the UK, Scan has the MSI NVIDIA GeForce RTX 5090 32GB VENTUS 3X OC Blackwell Graphics Card in stock for £1,939.99
Oh dear, over in the US it looks like the stock situation is quite bad - Newegg has sold out of all reasonably priced RTX 5090s, and is selling very expensive models like the ASUS TUF Gaming GeForce RTX 4090 OG OC Edition for $3,799. I don't recommend buying that.
If you're struggling to find stock in the US, you could try B&H, which isn't selling RTX 5090s until tomorrow - you'll need to join its waiting list. Could be worth a shot.
In the UK, CCL's RTX 5090 stock is still showing 'Coming Soon!' - so this might be a good backup if stock sells out elsewhere.
Of course, there's no indication of when the stock will go on sale, but you can sign up to be notified.
UK retailer Box also has its RTX 5090 stock listed as 'Coming Soon'.
Sadly, Nvidia's UK website finally updated... only to say that the RTX 5090 is now out of stock.
US Nvidia page still isn't showing RTX 5090 at all, and is instead saying the RTX 50 series is 'coming soon'. What is going on?
Overclockers UK website is still having issues - sometimes it's up, then it crashes. If you can get through, it might be a good bet as stock sells out elsewhere.
OK, getting desperate, I decided to ask Rufus, Amazon's AI assistant, if the retailer was selling the RTX 5090. Sadly, it was pretty useless. It couldn't give me any info on stock levels, and spewed out a load of information about the card and high demand, which I already knew about. It did suggest I visit Amazon's product page for the RTX 5090, which would be a decent suggestion, but it didn't link to the page, and I can't find it on the site. Thanks for nothing, Rufus.
(Image credit: Amazon)Asus' page on Amazon US still says 50-series GPUs are 'coming soon'. A mistake, or could it offer a glimmer of hope that stock will turn up soon for the RTX 5090?
In the UK, retailer CCL is still showing RTX 5090 stock as 'coming soon'. However, it was also showing its RTX 5080 stock as coming soon as well, until just a few moments ago when a load of RTX 5080s went on sale! So I'll be watching its RTX 5090 stock like a hawk.
STOCK ALERTNvidia's RTX 5090 page has updated and is now showing some stock from other retailers - such as the Gigabyte NVIDIA GeForce RTX 5090 Windforce Overclocked Triple Fan 32GB GDDR7 PCIe 5.0 Graphics Card - $1,999.99.
Note: This is for collection only.
If you're in the US, you can also sign up for RTX 5090 stock alerts at Central Computers. It might be one of the retailers putting stock up later on.
STOCK ALERT!SabrePC in the US has the Zotac Nvidia RTX 5080 Solid OC Graphic Card on sale for $1,206.12.
MicroCenter has the Gigabyte NVIDIA GeForce RTX 5090 XTREME WATERFORCE Overclocked Liquid Cooled for $2,599.99 - though it's available to buy in store only.
One to watch: CCL Online dropped its RTX 5080 stock a little later than others, and its 5090 stock is still listed as 'coming soon' - meaning a drop could potentially happen at any moment.
CCL does in fact have stock... of some RTX 5080 cards. And they're more than £200 above MSRP, naturally. Wonderful. Keep an eye on that 5090 page, British shoppers...
Rather strangely over in the US, I'm finding it impossible to view any RTX 5090 cards at Adorama's store, and searching for 'RTX 5090' directs me to a page offering pre-orders... for a specific PNY XLR8 RTX 3090. Yes, that says 3090. It's available for $2,149.99. Adorama, make this make sense.
Stock is dwindling in earnest now - Micro Center now has a banner on its site informing shoppers that all RTX 5090 and 5080 cards are out of stock, and a search through Newegg, Walmart, B&H Photo, and Best Buy shows no RTX 5090s for sale - just a small handful of pre-built PCs with RTX 5080 cards at this point.
In the UK, it's a similar story: no RTX 5090 stock to be seen anywhere. Amazon, Box, and EE are all completely empty. There is some stock of the cheaper RTX 5080 to be found in places, such as Overclockers and Scan, and Scan also has open pre-orders for RTX 5090s - which could be a way to secure yourself a card for further down the line, although restock shortages mean it could take 2 to 12 weeks to receive your GPU.
Ebuyer doesn't have any RTX 5090 stock, but it is allowing you to pre-order some AlphaSync pre-built gaming PCs that will contain a 5090.
CCL still looks like the best bet for Brits - a banner on its site reads '50 series graphics cards in stock and being replenished throughout the day', and its RTX 5090 stock is still labelled as 'coming soon' - although the same is true of some RTX 5080 cards, which have presumably sold out, given that other 5080 models are still available there. Still, I'll keep refreshing...
Currys is also out of stock in the UK - for a while, it was doing mandatory bot-checks on every person trying to access the site, but it looks like that's no longer necessary. Did PC gamers clear them out, or did the bots win?
Unsurprisingly, eBay has quite a few RTX 5090 listings... for wildly inflated prices, of course. Oh, and the sellers don't actually have the GPU - you're buying a 'confirmed order'. I shouldn't have to say it, but don't buy these.
Rather hilariously, one enterprising individual seemingly has a Gigabyte RTX 5090 up for a mere £1,800... seems too good to be true, and it is - though the description (seen below) did make me laugh. I salute you, fellow soldier in the war against the machines.
(Image credit: Future)The dust seems to be settling, and we're now reaching a stage where if you haven't already secured a pre-order or an actual GPU, you probably won't be getting one for a while...
However, we won't be giving up quite so easily. We'll keep an eye on every major retailer for the next few days, to make sure that you don't miss any sneaky restocks or late pre-order drops, telling you where to look and where not to look. Seriously, please don't go out and drop more than four thousand bucks on a suspect eBay or Amazon listing...
STOCK ALERT... sort of.
In the UK, Overclockers has re-opened pre-orders for a select number of RTX 5090 cards. These will be on a first-come, first-served basis, so grab them now unless you want to risk waiting for restocks further down the line.
Marvel Rivals crosshairs can be tweaked to suit the Hero you're playing as, with hitscan characters like Luna Snow and Hela suiting a different kind of reticle when compared to projectile characters like Mantis and Moon Knight.
While this extra level of customization won't be necessary for everyone, it can really help if you're planning on taking on Marvel Rivals ranked mode. The high quality of the game's competitive modes definitely makes it stand out from its competitors and is one of the main reasons it's already landed on our best crossplay games list.
Here's what you need to know about crosshairs in Marvel Rivals, including info on how to change your crosshairs, as well as a selection of the best crosshairs codes to import into Marvel Rivals.
How to change your Marvel Rivals crosshairs (Image credit: NetEase)To change your crosshairs in Marvel Rivals, you just need to press pause, open the settings menu, and head to the HUD section, where you'll find the reticle options.
One setting to keep in mind is the "Save Reticle" option. This allows you to save different crosshairs that can then be switched out for each Hero. To customize settings for a specific Hero, just select "All Heroes" in the top left-hand corner, and then switch to the Hero you wish to apply new settings to.
How to import Marvel Rivals crosshairs (Image credit: NetEase Games)To import crosshairs in Marvel Rivals, just navigate to the HUD settings as per the previous section. Once there, look for the "Reticle Save" option, with a dropdown menu that allows you to add new saves. Once you select this, you'll get the option to import a code, if you're playing on PC that is. Importing crosshairs is unfortunately not available on console.
Crosshair codes are long strings of numbers shared by other creators. Just copy these and paste them when the game asks you for an import code. This will automatically apply the correct settings, allowing you to try out other people's recommended reticles.
Marvel Rivals crosshairs codes (Image credit: NetEase)The first crosshair code we're looking at is for Venom mains; Venom is a dive-focused character with a mid-ranged tentacle attack as his primary fire. The code below comes from content creator Ares, and you can see their full breakdown in this YouTube video.
Next up is Hela, who really benefits from a Dot crosshairs style, as it makes it easier to land headshots and gets rid of unnecessary visual elements. For this crosshair, just select the "Dot" option for Reticle Type. You can then adjust the dot width - we recommend 12. From there, try out some different colors; we went with Light Blue.
(Image credit: NetEase)Now, here's a great set of crosshairs for The Punisher. We got this one from The Marvel Rivals Merchant, who has a bunch of excellent recommendations for the best crosshairs to use depending on your Hero. This YouTube video is particularly detailed, explaining the reasoning behind this crosshairs option for The Punisher.
Changing the color of your crosshairs can be really helpful in Marvel Rivals. We recommend starting with a light blue or green, though, of course, this will depend on your specific eyesight and preferences. There's plenty of customization here, making it a great extra accessibility option for those looking to make the game a bit more readable.
Can you import crosshairs on console?Unfortunately, you cannot import crosshairs codes if you're playing Marvel Rivals on console. This could change in the future, but for now, the codes are simply too long to enter manually with a controller. Players on PC can copy and paste them, though we're hoping developer NetEase adds a console-specific solution in the future.
Should I change my Marvel Rivals crosshairs?Whether you want to change your Marvel Rivals crosshairs really depends on how deeply you want to get into Marvel Rivals. If you find yourself playing often, and you tend to stick to two or three Heroes, adding custom crosshairs for each can really help. Some crosshairs, for example, will help you see whether an enemy is within range, while others can assist with landing headshots.
You Might Also Like...Quordle was one of the original Wordle alternatives and is still going strong, more than 1,000 games later. It offers a genuine challenge, though, so read on if you need some Quordle hints today – or scroll down further for the answers.
Enjoy playing word games? You can also check out my NYT Connections today and NYT Strands today pages for hints and answers for those puzzles, while Marc's Wordle today column covers the original viral word game.
SPOILER WARNING: Information about Quordle today is below, so don't read on if you don't want to know the answers.
Quordle today (game #1103) - hint #1 - Vowels How many different vowels are in Quordle today?• The number of different vowels in Quordle today is 3*.
* Note that by vowel we mean the five standard vowels (A, E, I, O, U), not Y (which is sometimes counted as a vowel too).
Quordle today (game #1103) - hint #2 - repeated letters Do any of today's Quordle answers contain repeated letters?• The number of Quordle answers containing a repeated letter today is 1.
Quordle today (game #1103) - hint #3 - uncommon letters Do the letters Q, Z, X or J appear in Quordle today?• Yes. One of Q, Z, X or J appears among today's Quordle answers.
Quordle today (game #1103) - hint #4 - starting letters (1) Do any of today's Quordle puzzles start with the same letter?• The number of today's Quordle answers starting with the same letter is 2.
If you just want to know the answers at this stage, simply scroll down. If you're not ready yet then here's one more clue to make things a lot easier:
Quordle today (game #1103) - hint #5 - starting letters (2) What letters do today's Quordle answers start with?• P
• S
• B
• B
Right, the answers are below, so DO NOT SCROLL ANY FURTHER IF YOU DON'T WANT TO SEE THEM.
Quordle today (game #1103) - the answers (Image credit: Merriam-Webster)The answers to today's Quordle, game #1103, are…
I crashed out today, guessing PAPER and PAVER instead of PARER – I’m kicking myself, as I think I could have got BANJO sooner and increased my available turns.
Still, a really tricky Quordle with some awkward and unusual words. Initially, I thought I was doing well – after getting 3 As with my start word and changing my second word to include an A, which definitely helped.
Kudos to all who completed it. I’ll be blown away if you did so with lines to spare.
The Sequence was equally stinky.
How did you do today? Let me know in the comments below.
Daily Sequence today (game #1103) - the answers (Image credit: Merriam-Webster)The answers to today's Quordle Daily Sequence, game #1103, are…
Strands is the NYT's latest word game after the likes of Wordle, Spelling Bee and Connections – and it's great fun. It can be difficult, though, so read on for my Strands hints.
Want more word-based fun? Then check out my NYT Connections today and Quordle today pages for hints and answers for those games, and Marc's Wordle today page for the original viral word game.
SPOILER WARNING: Information about NYT Strands today is below, so don't read on if you don't want to know the answers.
NYT Strands today (game #334) - hint #1 - today's theme What is the theme of today's NYT Strands?• Today's NYT Strands theme is… Baby talk
NYT Strands today (game #334) - hint #2 - clue wordsPlay any of these words to unlock the in-game hints system.
• Look who’s talking
NYT Strands today (game #334) - hint #4 - spangram position What are two sides of the board that today's spangram touches?First side: top, 4th column
Last side: bottom, 5th column
Right, the answers are below, so DO NOT SCROLL ANY FURTHER IF YOU DON'T WANT TO SEE THEM.
NYT Strands today (game #334) - the answers (Image credit: New York Times)The answers to today's Strands, game #334, are…
Strands asked us to think like a baby today, or at least spell like one.
After some initial worries that the whole puzzle was going to be “baby talk” – Gaga, Googoo, etc – it all fell into place rather sweetly.
According to family legend, my first words were not MAMA or DADA but "Batman" – a result, perhaps, of the TV being my favourite parent and the cause of a lifelong obsession with men in tights – although Adam West will always be a cut above any other caped crusader. I'd also contend that, like the classic Batman TV series, any action TV show or movie could be vastly improved by the addition of the words "BIFF!", "POW!" and "SPLAAAT!" appearing on the screen in red and yellow lettering during fight scenes.
How did you do today? Let me know in the comments below.
Yesterday's NYT Strands answers (Thursday, 30 January, game #333)Strands is the NYT's new word game, following Wordle and Connections. It's now out of beta, so it's a fully fledged member of the NYT's games stable and can be played on the NYT Games site on desktop or mobile.
I've got a full guide to how to play NYT Strands, complete with tips for solving it, so check that out if you're struggling to beat it each day.
Back in December, Netflix delivered a splendid Christmas gift in the form of spy thriller Black Doves. Seeing Keira Knightley kick butt as an undercover agent was certainly a festive treat, and Netflix knew we would want more when it renewed Black Doves for a second season before season 1 even aired.
In Black Doves, queen of historical dramas Knightley swaps regency for revenge in the role of undercover spy Helen Webb. Helen, the wife of the Secretary of State for Defence, works for the spies-for-hire organization Black Doves and 'acquires' government secrets from her oblivious husband. However, her double life is threatened when her lover is killed by London's criminal underworld. She teams up with fellow agent Sam (Ben Whishaw) and sets out on a revenge mission to find those responsible for his murder.
Netflix, aka one of the best streaming services, obviously knew it had a hit on its hands, and the show's 94% Rotten Tomatoes score from the critics proves just that. However, since Black Doves was renewed, we've seen precious few updates about a release date or when production might begin. Now, though, Whishaw has issued an update on its second installment – and it's dashed my hopes of seeing it return this year.
What did Ben Whishaw say about Black Doves season 2?In an interview with Collider, Whishaw shared some bad news about season 2 of one of the best Netflix shows, which confirms that we're unlikely to see it in 2025. "It hasn't been written, so I actually can say nothing," he said. "I know nothing about it. That's boring, isn't it? You're not interested in that, but it’s the truth. It's not written. It's six months, or seven months away, or something."
However, Black Doves creator Joe Barton has reassured fans that he's in the midst of writing season 2, and says he's hopeful that we won't have to wait too long for the next installment to arrive. Speaking to The Hollywood Reporter in December 2024, he said: "We’re figuring it all out and putting it all down on paper. But yeah, we’re going to be filming next year, so hopefully, it won’t be too long a wait between series for people. That’s where we’re at."
Barton revealed that season 2 won't arrive in December 2025, as that timeframe wouldn't be practical, so a mid-2026 release date seems likely. He also said that, unlike season 1, the next installment won't be set around Christmas time. "I think inevitably it's not going to be a Christmas show. But it'll be interesting to see them out of their winter coats," Barton added.
I'm a little surprised to hear that we won't be seeing Black Doves return in the near future, especially since it was renewed ahead of the show's first outing – in fact, I had hoped that the first two seasons might have been filmed in one go and that we were in line for another Christmas treat. Here's hoping we don't have to wait too long into 2026 to see the return of one of the biggest spy shows on Netflix.
You might also likeGood morning! Let's play Connections, the NYT's clever word game that challenges you to group answers in various categories. It can be tough, so read on if you need clues.
What should you do once you've finished? Why, play some more word games of course. I've also got daily Strands hints and answers and Quordle hints and answers articles if you need help for those too, while Marc's Wordle today page covers the original viral word game.
SPOILER WARNING: Information about NYT Connections today is below, so don't read on if you don't want to know the answers.
NYT Connections today (game #600) - today's words (Image credit: New York Times)Today's NYT Connections words are…
What are some clues for today's NYT Connections groups?
Need more clues?
We're firmly in spoiler territory now, but read on if you want to know what the four theme answers are for today's NYT Connections puzzles…
NYT Connections today (game #600) - hint #2 - group answersWhat are the answers for today's NYT Connections groups?
Right, the answers are below, so DO NOT SCROLL ANY FURTHER IF YOU DON'T WANT TO SEE THEM.
NYT Connections today (game #600) - the answers (Image credit: New York Times)The answers to today's Connections, game #600, are…
Connections reaches number 600 with a classic set of humour, grammar, wordplay and nagging questions about “the fourth word”.
Just as there are 40 different words for “snow” in Finnish, there are dozens for “nose” in English – particularly for the larger nozzle, of which SCHNOZZ (derived from the Yiddish word shnoits, for snout) is one of the finest.
Naturally, with BEAK, BUTTER and BREAST in the starting grid, I momentarily thought “chicken” was one link. Meanwhile, TRUNK could have found itself in the nose list as well as STORAGE CONTAINERS.
My fourth-word issue today is SNOB. It's included in a group alongside CONNOISSEUR and EXPERT, which is interesting, as it's usually a word associated with uppity people who dismiss things for frivolous reasons (often price or reputation). Being classified as ONES WITH DISCERNING TASTE is exactly how a snob would justify themselves. Perhaps including Scholar, Buff, or Devotee instead would have made things too easy. Or maybe those words aren't good enough for Connections?
How did you do today? Let me know in the comments below.
Yesterday's NYT Connections answers (Thursday, 30 January, game #599)NYT Connections is one of several increasingly popular word games made by the New York Times. It challenges you to find groups of four items that share something in common, and each group has a different difficulty level: green is easy, yellow a little harder, blue often quite tough and purple usually very difficult.
On the plus side, you don't technically need to solve the final group, as you'll be able to get it by a process of elimination. What's more, you can make up to four mistakes, which gives you a little bit of breathing room.
It's a little more involved than something like Wordle, however, and there are plenty of opportunities for the game to trip you up with tricks. For instance, watch out for homophones and other wordplay that could disguise the answers.
It's playable for free via the NYT Games site on desktop or mobile.