Error message

  • Deprecated function: implode(): Passing glue string after array is deprecated. Swap the parameters in drupal_get_feeds() (line 394 of /home/cay45lq1/public_html/includes/common.inc).
  • Deprecated function: The each() function is deprecated. This message will be suppressed on further calls in menu_set_active_trail() (line 2405 of /home/cay45lq1/public_html/includes/menu.inc).

Technology

New forum topics

I Tried MyFitnessPal's New Feature and It Helped Me Plan Healthy Meals That Actually Taste Good

CNET News - Thu, 07/10/2025 - 04:03
I tried MyFitnessPal's new Meal Planner feature, which allows you to meal plan and order your groceries in one app.
Categories: Technology

Why the AI boom requires an Wyatt Earp

TechRadar News - Thu, 07/10/2025 - 03:41

At a time when many believe that oversight of the Artificial Intelligence industry is desperately needed, the US government appears to have different ideas. The "One Big Beautiful Bill Act" (OBBBA)—recently given the nod by the House of Representatives—includes a 10-year moratorium on state and local governments enacting or enforcing regulations on AI models, systems, or automated decision-making tools.

Supporters claim the goal is to streamline AI regulation by establishing federal oversight, thereby preventing a patchwork of state laws that could stifle innovation and create compliance chaos. Critics warn that the moratorium could leave consumers vulnerable to emerging AI-related issues, such as algorithmic bias, privacy violations, and the spread of deepfakes.

Basically, if the AI sector is the Wild West, no one will be allowed to clean up Dodge.

Why should we care?

History may not literally repeat itself, but there are historical patterns and trends that we can view and hopefully be informed by, and our history books are packed with examples of technology reshaping the lives of the workforce.

And be it in the form of James Watt’s steam engine or Henry Ford’s moving assembly line, the cost of the progress brought by fresh technology is regularly paid by the large numbers of people sent home without a pay packet.

And AI will cost jobs too.

Experts such as those at McKinsey, the Lancet, or the World Economic Forum (WEF) may not agree on exact numbers or percentages of lost jobs, but the consistent message is that it will be bad:

  • 30% of US work hours across all sectors will be automated by 2030 says McKinsey
  • 25% of medical administrative tasks could vanish by 2035 according to a Lancet study
  • 39% of existing skill sets will become outdated between now and 2030 warns WEF

Of course, as with all new technologies, new jobs will be created. But we can’t all be prompt engineers.

The Great Brain Robbery

Essentially, those hit hardest by the bulk of new technologies from the Spinning Jenny onwards were the ones engaged to carry out physical work. But AI wants to muscle in on the intellectual and creative domains previously considered uniquely human. For example, nonpartisan American think tank the Pew Research Center reckons 30% of media jobs could be automated by 2035.

And those creative jobs are under threat because creatives are being ripped off.

Many AI models are trained on massive datasets scraped from the internet, and these often include articles, books, images, music and even code that are protected by copyright laws, but AI companies lean heavily towards take-first-ask-later. Obviously, artists, writers, and other content creators see this practice as unauthorized use of their intellectual property and they argue that ultimately, it’s not even in the best interests of the AI sector.

If AI takes work away from human creatives—devastating creative industries already operating on thin margins—there will be less and less innovative content to feed to AI systems which will result in AI feeding off homogenized AI content – a derivative digital snake eating its own tail.

A smarter way forward would be to find a framework where creatives are compensated for use of their work to ensure the sustainability of human produced product. The music industry already has a model where artists receive payments via performing rights organizations such as PRS, GEMA and BMI. The AI sector needs to find something similar.

To make this happen, regulators may need to be involved.

Competitive opportunity versus minimizing societal harm

Without regulation, we risk undermining the economic foundations of creative and knowledge-based industries. Journalism, photography, literature, music, and visual arts depend on compensation mechanisms that AI training currently bypasses.

The United Kingdom and the European Union are taking notably different paths when it comes to regulating AI. The EU is pursuing a strict, binding regulatory framework, an approach designed to protect fundamental rights, promote safety, and ensure ethical use of AI across member states. In contrast, the UK is currently opting for a more flexible approach, emphasizing innovation and light-touch oversight aiming to encourage rapid AI development and attracting investment.

But this light-touch strategy could be a massive misstep – one that in the long term could leave everyone wishing we’d thought things through.

While AI enthusiasts may initially be pleased with minimal interference from regulators, eventually AI businesses will come up against consumer trust, something they absolutely need.

While AI businesses operating in Europe will be looking at higher compliance costs, there is also a clearer regulatory landscape and therefore more likely to be greater consumer trust – a huge commercial advantage.

Meanwhile, AI businesses operating in light-touch markets (such as the UK) need to consider how their AI data practices align with their (and their competitors’) brand values and customer expectations. As public awareness grows, companies seen as exploiting creators may face reputational damage. And a lack of consumer confidence could lead to a shift in mindset from previously arm’s-length regulators.

Regardless of the initial regulatory environment, early adopters of ethical AI practices may gain competitive advantages as regulatory requirements catch up to ethical standards. Perhaps the wisest way forward is to voluntarily make Dodge City a better place, even if there’s no sheriff in town – for now.

I tried 70+ best AI tools.

This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

Categories: Technology

The four-phase security approach to keep in mind for your AI transformation

TechRadar News - Thu, 07/10/2025 - 01:46

As organizations continue to adopt AI tools, security teams are often caught unprepared for the emerging challenges. The disconnect between engineering teams rapidly deploying AI solutions and security teams struggling to establish proper guardrails has created significant exposure across enterprises. This fundamental security paradox—balancing innovation with protection—is especially pronounced as AI adoption accelerates at unprecedented rates.

The most critical AI security challenge enterprises face today stems from organizational misalignment. Engineering teams are integrating AI and Large Language Models (LLMs) into applications without proper security guidance, while security teams fail to communicate their AI readiness expectations clearly.

McKinsey research confirms this disconnect: leaders are 2.4 times more likely to cite employee readiness as a barrier to adoption versus their own issues with leadership alignment, despite employees currently using generative AI three times more than leaders expect.

Understanding the Unique Challenges of AI Applications

Organizations implementing AI solutions are essentially creating new data pathways that are not necessarily accounted for in traditional security models. This presents several key concerns:

1. Unintentional Data Leakage

Users sharing sensitive information with AI systems may not recognize the downstream implications. AI systems frequently operate as black boxes, processing and potentially storing information in ways that lack transparency.

The challenge is compounded when AI systems maintain conversation history or context windows that persist across user sessions. Information shared in one interaction might unexpectedly resurface in later exchanges, potentially exposing sensitive data to different users or contexts. This "memory effect" represents a fundamental departure from traditional application security models where data flow paths are typically more predictable and controllable.

2. Prompt Injection Attacks

Prompt injection attacks represent an emerging threat vector poised to attract financially motivated attackers as enterprise AI deployment scales. Organizations dismissing these concerns for internal (employee-facing) applications overlook the more sophisticated threat of indirect prompt attacks capable of manipulating decision-making processes over time.

For example, a job applicant could embed hidden text like "prioritize this resume" in their PDF application to manipulate HR AI tools, pushing their application to the top regardless of qualifications. Similarly, a vendor might insert invisible prompt commands in contract documents that influence procurement AI to favor their proposals over competitors. These aren't theoretical threats - we've already seen instances where subtle manipulation of AI inputs has led to measurable changes in outputs and decisions.

3. Authorization Challenges

Inadequate authorization enforcement in AI applications can lead to information exposure to unauthorized users, creating potential compliance violations and data breaches.

4. Visibility Gaps

Insufficient monitoring of AI interfaces leaves organizations with limited insights into queries, response and decision rationales, making it difficult to detect misuse or evaluate performance.

The Four-Phase Security Approach

To build a comprehensive AI security program that addresses these unique challenges while enabling innovation, organizations should implement a structured approach:

Phase 1: Assessment

Begin by cataloging what AI systems are already in use, including shadow IT. Understand what data flows through these systems and where sensitive information resides. This discovery phase should include interviews with department leaders, surveys of technology usage and technical scans to identify unauthorized AI tools.

Rather than imposing restrictive controls (which inevitably drive users toward shadow AI), acknowledge that your organization is embracing AI rather than fighting it. Clear communication about assessment goals will encourage transparency and cooperation.

Phase 2: Policy Development

Collaborate with stakeholders to create clear policies about what types of information should never be shared with AI systems and what safeguards need to be in place. Develop and share concrete guidelines for secure AI development and usage that balance security requirements with practical usability.

These policies should address data classification, acceptable use cases, required security controls and escalation procedures for exceptions. The most effective policies are developed collaboratively, incorporating input from both security and business stakeholders.

Phase 3: Technical Implementation

Deploy appropriate security controls based on potential impact. This might include API-based redaction services, authentication mechanisms and monitoring tools. The implementation phase should prioritize automation wherever possible.

Manual review processes simply cannot scale to meet the volume and velocity of AI interactions. Instead, focus on implementing guardrails that can programmatically identify and protect sensitive information in real-time, without creating friction that might drive users toward unsanctioned alternatives. Create structured partnerships between security and engineering teams, where both share responsibility for secure AI implementation.

Phase 4: Education and Awareness

Educate users about AI security. Help them understand what information is appropriate to share and how to use AI systems safely. Training should be role-specific, providing relevant examples that resonate with different user groups.

Regular updates on emerging threats and best practices will keep security awareness current as the AI landscape evolves. Recognize departments that successfully balance innovation with security to create positive incentives for compliance.

Looking Ahead

As AI becomes increasingly embedded throughout enterprise processes, security approaches must evolve to address emerging challenges. Organizations viewing AI security as an enabler rather than an impediment will gain competitive advantages in their transformation journeys.

Through improved governance frameworks, effective controls and cross-functional collaboration, enterprises can leverage AI's transformative potential while mitigating its unique challenges.

We've listed the best online cybersecurity courses.

This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

Categories: Technology

Today's NYT Mini Crossword Answers for Thursday, July 10

CNET News - Wed, 07/09/2025 - 21:48
Here are the answers for The New York Times Mini Crossword for July 10.
Categories: Technology

6 Natural Sugar Substitutes To Satisfy Your Cravings

CNET News - Wed, 07/09/2025 - 17:24
Cut your processed sugar intake with these tasty alternatives.
Categories: Technology

Today's NYT Connections: Sports Edition Hints and Answers for July 10, #290

CNET News - Wed, 07/09/2025 - 17:24
Here are hints and the answers for the NYT Connections: Sports Edition puzzle for July 10, No. 290
Categories: Technology

Ceramic-based startup wants to put more than 100,000TB in a 42U rack by 2030 — but it will take almost 50 years to fill it up

TechRadar News - Wed, 07/09/2025 - 16:46
  • The first-generation system is slower than tape but aims to scale up rapidly by 2030
  • Cerabyte’s roadmap involves physics so advanced it sounds like sci-fi with helium ion beams
  • Long-term capacity hinges on speculative tech that doesn’t yet exist outside lab settings

Munich-based startup Cerabyte is developing what it claims could become a disruptive alternative to magnetic tape in archival data storage.

Using femtosecond lasers to etch data onto ceramic layers within glass tablets, the company envisions racks holding more than 100 petabytes (100,000TB) of data by the end of the decade.

Yet despite these bold goals, practical constraints mean it may take decades before such capacity sees real-world usage.

The journey to 100PB racks starts with slower, first-generation systems

CMO and co-founder Martin Kunze outlined the vision at the recent A3 Tech Live event, noting the system draws on “femtosecond laser etching of a ceramic recording layer on a glass tablet substrate.”

These tablets are housed in cartridges and shuttled by robotic arms inside tape library-style cabinets, a familiar setup with an unconventional twist.

The pilot system, expected by 2026, aims to deliver 1 petabyte per rack with a 90-second time to the first byte and just 100MBps in sustained bandwidth.

Over several refresh cycles, Cerabyte claims that performance will increase, and by 2029 or 2030, it anticipates “a 100-plus PB archival storage rack with 2GBps bandwidth and sub-10-second time to first byte.”

The company’s long-term projections are even more ambitious, and it believes that femtosecond laser technology could evolve into “a particle beam matrix tech” capable of reducing bit size from 300nm to 3nm.

With helium ion beam writing by 2045, Cerabyte imagines a system holding up to 100,000PB in a single rack.

However, such claims are steeped in speculative physics and should, as the report says, be “marveled at but discounted as realizable technology for the time being.”

Cerabyte’s stated advantages over competitors such as Microsoft’s Project Silica, Holomem, and DNA storage include greater media longevity, faster access times, and lower cost per terabyte.

“Lasting more than 100 years compared to tape’s 7 to 15 years,” said Kunze, the solution is designed to handle long-term storage with lower environmental impact.

He also stated the technology could ship data “at 1–2GBps versus tape’s 1GBps,” and “cost $1 per TB against tape’s $2 per TB.”

So far, the company has secured around $10 million in seed capital and over $4 million in grants.

It is now seeking A-round VC funding, with backers including Western Digital, Pure Storage, and In-Q-Tel.

Whether Cerabyte becomes a viable alternative to traditional archival storage methods or ends up as another theoretical advance depends not just on density, but on long-term reliability and cost-effectiveness.

Even if it doesn't become a practical alternative to large HDDs by 2045, Cerabyte’s work may still influence the future of long-term data storage, just not on the timeline it projects.

Via Blocksandfiles

You might also like
Categories: Technology

PlayStation Plus Subscribers Can Get Chromed Out in Cyberpunk 2077 Now

CNET News - Wed, 07/09/2025 - 16:02
Subscribers -- and their kids -- can also play other games on PlayStation Plus, like the Bluey game, soon.
Categories: Technology

An OpenAI Web Browser Is Imminent, Report Says. That Would Really Shake Up the Web

CNET News - Wed, 07/09/2025 - 15:41
An AI-powered browser from the ChatGPT maker would inevitably compete with Google Chrome.
Categories: Technology

This is the weirdest looking AI MAX+ 395 Mini PC that I've ever seen — and you can apparently hold it comfortably in the palm of your hand

TechRadar News - Wed, 07/09/2025 - 15:34
  • AOOSTAR’s NEX395 has the power, but the cooling system remains a complete mystery
  • Radeon 8060S beats RX 7600 XT in specs, making external GPU pairing confusing
  • Without OCuLink, the eGPU dock likely suffers major bottlenecks in real-world tasks

AOOSTAR NEX395 is the latest in a growing field of AI-focused mini PCs which comes in a box-like casing that departs from the more common designs found in the segment.

The company says the NEX395 uses AMD’s flagship Strix Halo processor, a 16-core, 32-thread chip with boost speeds up to 5.1GHz.

It includes 40 RONA 3.5 compute units and appears to support up to 128GB of memory, most likely LPDDR5X given the compact casing.

Memory capacity matches rivals, but key hardware details are missing

This level of memory is in line with other mini PCs targeting AI development workflows, especially those involving large language models.

However, no details have been confirmed regarding storage, cooling, or motherboard layout.

The device looks more like an oversized SSD enclosure or an external GPU dock than a full-fledged desktop system.

Its slim, rectangular, vent-heavy design completely deviates from the usual cube or NUC-style mini PCs.

Holding it in your palm feels more like gripping a chunky power bank or a Mac mini cut in half, definitely not what you’d expect from a 16-core AI workstation.

The layout makes you question where the thermal headroom or upgradable internals even fit.

The AOOSTAR NEX395 includes an integrated Radeon 8060S GPU, part of the Ryzen AI MAX+ 395 APU.

However, it also sells an external eGPU enclosure featuring the Radeon RX 7600 XT.

Given that the integrated GPU already offers a newer architecture and more compute units than the RX 7600 XT, the use case for pairing the two is unclear.

Also, the NEX395 does not appear to support high-speed eGPU connectivity like OCuLink, which would limit bandwidth for external graphics support.

Port selection includes dual Ethernet ports, four USB-A ports, USB-C, HDMI, and DisplayPort outputs, along with a dedicated power input, suggesting reliance on an external power brick.

Without confirmed thermal design or sustained performance metrics, it’s unclear whether this system can function reliably in roles normally filled by the best workstation PC or best business PC options.

Unfortunately, the pricing details for the NEX395 are currently unavailable.

Given the $1500–$2000 range of comparable models such as the HP Z2 Mini G1a and GMKTEC EVO-X2, AOOSTAR’s model is unlikely to be cheap.

Via Videocardz

You might also like
Categories: Technology

Today's NYT Connections Hints, Answers and Help for July 10, #760

CNET News - Wed, 07/09/2025 - 15:00
Here are some hints and the answers for the NYT Connections puzzle for July 10, #760.
Categories: Technology

Today's NYT Connections Hints, Answers and Help for July 10, #760

CNET News - Wed, 07/09/2025 - 15:00
Here are some hints and the answers for the NYT Connections puzzle for Thursday, July 10, No. 760.
Categories: Technology

Today's Wordle Hints, Answer and Help for July 10, #1482

CNET News - Wed, 07/09/2025 - 15:00
Here are hints and the answer for today's Wordle for July 10, No. 1,482.
Categories: Technology

Today's NYT Strands Hints, Answers and Help for July 10 #494

CNET News - Wed, 07/09/2025 - 15:00
Here are hints and answers for the NYT Strands puzzle for July 10, No. 494.
Categories: Technology

AMD is surpassing Nvidia in one particular market, and I don't understand why — 11th eGPU based on AMD Radeon RX 7000 series debuts and even has Thunderbolt 5

TechRadar News - Wed, 07/09/2025 - 14:24
  • OnexGPU Lite reuses the same chip but adds Thunderbolt 5 to stay relevant
  • The RX 7600M XT continues to show up while RDNA4 remains nowhere in sight
  • AMD keeps winning eGPU slots while Nvidia remains largely absent from this niche segment

The external GPU market has been quietly evolving in recent years, and AMD appears to be securing a rather strange lead in this niche.

The debut of OnexGPU Lite makes it the 11th known eGPU powered by an AMD Radeon RX 7000 series chip, and it’s now clear vendors are consistently choosing AMD over Nvidia for their modular graphics solutions.

However, the reason(s) behind this momentum remains unclear, especially when broader market trends still favor Nvidia for desktop and mobile gaming.

Thunderbolt 5 takes the spotlight

The OnexGPU Lite is the latest entry in a growing list of eGPUs using the Radeon RX 7600M XT, a mobile RDNA3 GPU with a known 120W power ceiling.

Although not the best GPU in AMD's lineup, it has become a go-to for modular setups.

According to Onexplayer, the Lite version is currently undergoing beta testing and will launch "soon," but there is no confirmed price, release date, or detailed spec sheet.

Unlike the higher-end OnexGPU 2 that features the Radeon RX 7800M, the Lite version isn’t targeting raw power.

Instead, it seems designed to balance portability and futureproofing, with one key upgrade: support for Thunderbolt 5.

This is a notable development, as it marks one of the first eGPUs to adopt the new interface.

Onexplayer claims Thunderbolt 5 will mean "PCIe bandwidth will be doubled," although the actual PCIe tunneling remains at 64Gbps, the same as OCuLink.

What sets Thunderbolt 5 apart is its ability to support both power delivery and display output over a single cable, features that OCuLink lacks.

This emphasis on all-in-one connectivity is likely to appeal to creators using a laptop for video editing or for Photoshop.

For them, fewer cables and more streamlined setups can make a real difference.

Still, the reliance on the RX 7600M XT, with no sign of RDNA4 hardware on the horizon, does raise questions about performance ceilings.

That said, it appears that the selling point of this device will be the inclusion of Thunderbolt 5, but whether this will justify its place in a market still searching for a truly compelling external graphics solution remains to be seen.

Without more powerful mobile chips available, vendors are essentially repackaging the same core GPU in new chassis with slightly upgraded ports.

The AMD-centric trend in the eGPU space might seem surprising, but it could reflect pricing, power efficiency, or driver integration preferences.

Via Videocardz

You might also like
Categories: Technology

I'm a Samsung User and Almost Never See Galaxy AI on My Phone

CNET News - Wed, 07/09/2025 - 14:14
Commentary: Samsung touts its AI features as transformative to your mobile experience. How come my experience feels pretty much the same?
Categories: Technology

SAVE Borrowers, Your Student Loans Will Start Accruing Interest Again on Aug. 1: What to Know

CNET News - Wed, 07/09/2025 - 14:11
Nearly 8 million borrowers will be notified starting as early as July 10 to select a new payment plan.
Categories: Technology

'The situation is so fluid, I feel like I have to check my phone right now' – Samsung exec on the impact of US tariffs on the mobile business

TechRadar News - Wed, 07/09/2025 - 14:00

This week, Samsung is boldly unveiling some of its most remarkable folding phones ever. They're thinner, lighter, smarter, and, yes, more expensive. In the US, at least, that trend may continue in more dramatic fashion if President Trump moves forward with his 25% tariff on goods produced in South Korea.

Perhaps you didn't realize that despite its ubiquity in the US, Samsung is based and operated out of South Korea. Like many global tech companies, it manufactures products at its home base, as well as in Vietnam, India, and Taiwan. In the US, President Donald Trump is trying to drag manufacturing back to the US shores and doing so mostly through the coercion of tariffs, which are basically taxes applied to all goods shipped into the US. It's a cost that some worry will eventually be passed along to the consumer.

While not directly addressing the price of the now more expensive Galaxy Z Fold 7 and other Samsung mobile devices, Samsung Executive Vice President of Mobile Experience Dave Das said, during a Samsung Unpacked breakfast panel this week in response to a question on the impact of tariffs, "I'll say the chips have not fully fallen where they may."

Das joked, "The situation is so fluid, so rapidly changing, that I feel like I have to check my phone right now to make sure whatever I'm saying is still applicable."

Samsung, Das contends, could be in a better position – at least as it refers to mobile products – to weather these fast-changing global trade circumstances. "I think one of Samsung's greatest strengths is how agile and flexible we are," said Das, referring to Samsung's skills in manufacturing and supply chain management.

His team is gaming out various scenarios, but they are also keeping the lines of communication open. "We are working closely with this administration to ensure that no matter what, Samsung is able to deliver the best products, the best experiences, the best services to US consumers at an attractive price and a competitive price."

Das didn't talk specifically about any product or reference the $100 price increase on the latest Z Fold model, though it's fair to assume that this adjustment is less about tariff concerns and more about more expensive components (the new 200MP sensor) and manufacturing (4.2mm thickness).

A dynamic situation

Flexibility in the rapidly evolving tariff picture is key, noted Das, adding that the team wants to manage and "work with the administration, again, to ensure we stay on course, and focused and we're delivering great products."

It's a solid and rational answer in the face of what may be some irrational forces. Keeping track of where the US Administration is applying tariffs and by how much is almost impossible because it has changed if not by the hour, then certainly by the day.

As I write this, the tariffs on South Korea could equal 25%. By the time you read it, it could be lower or higher. What will matter to consumers most, though, is what they'll be paying for the Samsung Galaxy Z Flip 7, 7 Flip FE, Z Fold 7, and all those wonderful Galaxy S25 handsets.

You may also like
Categories: Technology

We've Rounded Up Carrier Deals for the New Samsung Galaxy Z Fold 7, Z Flip 7 and Watch 8

CNET News - Wed, 07/09/2025 - 13:37
Check out the best deals from major carriers for Samsung's thin and light foldable phones.
Categories: Technology

Yet another mini PC vendor launches an eGPU — but AOOStar's Radeon RX 7600XT can be used with a USB 4.0 port and is actually affordable

TechRadar News - Wed, 07/09/2025 - 13:27
  • AOOStar RX 7600XT eGPU runs hot but slower than its full desktop counterpart
  • Not all laptops will handle USB4 reverse charging or PCIe 4.0 bandwidth gracefully
  • At 61 decibels, the cooling solution trades thermal control for constant ambient noise

As more compact computing solutions crowd the market, mini PC vendors are increasingly turning to external graphics units to offer an upgrade path.

AOOStar is the latest to join this trend with the release of its XG76XT eGPU, built around AMD’s Radeon RX 7600XT and supporting 16GB of GDDR6 memory on a 128-bit interface.

This desktop-grade GPU is based on the RDNA 3 architecture, built using a 6nm process, and features 32 compute units.

Performance limits and thermal design

Marketed as a modular solution for users seeking to enhance visual performance without transitioning to a full desktop, the device’s specifications appear solid on paper.

The graphics processor supports a game clock of 2470 MHz and a power ceiling of 150W in this enclosure, down from the GPU’s full desktop TGP of 190W.

This limitation could affect sustained performance, especially in thermally demanding applications.

However, for those seeking a compromise between mobile convenience and graphical muscle, it may offer a boost, particularly when integrated GPUs fall short for tasks such as editing high-res images or handling multiple 4K displays.

The enclosure includes a custom vapor chamber cooling solution, a full copper heatsink, and a fan housed under a honeycomb-style top grill.

While this setup appears capable of keeping thermals in check, the noise level under load reportedly reaches up to 61 decibels.

That’s not whisper-quiet by any standard, and it could be disruptive in shared or silent workspaces.

AOOStar XG76XT supports both Oculink and USB4, which allow hot swapping and offer up to 100W reverse power delivery, potentially charging your laptop over the same cable.

This might seem convenient for those using a laptop for video editing or for Photoshop, although not all systems will support these features equally.

USB4 relies on PCIe 4.0 lanes, which improve bandwidth over legacy eGPU approaches, but performance bottlenecks compared to internal GPUs are still possible.

On the display side, the XG76XT features one HDMI 2.1 port, two DisplayPort 2.1 outputs, and a Type-C port that supports DisplayPort 1.4 with 15W power delivery.

At ¥3399 (roughly $470), the pricing is not unreasonable for an eGPU with a current-generation GPU.

Yet for anyone looking for the best GPU for demanding creative work or high-end gaming, internal desktop cards in a traditional tower still offer better performance per dollar.

At the time of writing, this device is out of stock and there is no confirmed global release or restock date.

Via Videocardz

You might also like
Categories: Technology

Pages

Subscribe to The Vortex aggregator - Technology