
Technology


Supercharge your phone with the ultimate wireless power-up

TechRadar News - Wed, 07/23/2025 - 03:46

What's better than wireless charging? Even faster wireless charging. The latest Qi2.2 wireless charging standard makes wireless power much faster, much smarter and even more useful – and while several brands have recently obtained Qi2.2 certification, Baseus is the first to publicly release visuals and detailed specifications of three certified devices. So while others make promises, Baseus is already making Qi2.2 products.

That means Baseus customers will be among the very first people to get a massive wireless power-up.

The AM52 is a super-slim power bank with speedy 25W wireless charging (Image credit: Baseus)

Why Qi2.2 is brilliant news for you

Qi2.2 is the very latest version of the world's favourite wireless charging standard. Qi charging is supported by all the big names in smartphones and accessories, delivering convenient and safe wireless charging for all kinds of devices. And the latest version is the best yet. Qi2.2 is much faster, even more efficient and even safer.

There are three key parts to Qi2.2: supercharged wireless power, smarter heat control and magnetic precision. The first means that instead of maxing out at 15W of power like existing wireless chargers do, Qi2.2 can push the limit to 25W. That means much faster charging and less time waiting: Qi2.2 can charge your phone up to 67% faster than Qi2.0.
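The "up to 67% faster" figure follows directly from the power bump, since ideal charge time scales inversely with power. A quick back-of-the-envelope sketch (ideal-case arithmetic only; real-world conversion losses and battery charge curves will reduce the gain):

```python
# Ideal-case arithmetic behind the "up to 67% faster" claim: charge time
# scales inversely with power, so moving from a 15 W cap to a 25 W cap is
# a 25/15 - 1 ≈ 67% speed increase.

def charge_time_minutes(battery_wh: float, power_w: float) -> float:
    """Minutes to deliver battery_wh of energy at a constant power_w."""
    return battery_wh / power_w * 60

battery_wh = 15.0  # roughly a 4,000 mAh phone battery at ~3.85 V (illustrative)

t_qi20 = charge_time_minutes(battery_wh, 15)  # Qi2.0 cap: 15 W
t_qi22 = charge_time_minutes(battery_wh, 25)  # Qi2.2 cap: 25 W

speedup = t_qi20 / t_qi22 - 1  # ≈ 0.667, i.e. ~67% faster
print(f"Qi2.0: {t_qi20:.0f} min, Qi2.2: {t_qi22:.0f} min, ~{speedup:.0%} faster")
```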

Wireless charging generates heat, and Qi2.2 keeps that down with next-generation thermal regulation, stricter surface temperature limits and improved coils. And the new Magnetic Power Profile (MPP) built into the standard ensures more precise alignment with your phone, reducing energy waste and improving charging efficiency by 15% whether you're charging in the car, at home or on the go.

The powerful PicoGo AM61 comes with its own USB-C cable so you can charge wired and wirelessly at the same time. (Image credit: Baseus)

Qi2.2 is made for everything everywhere

Qi2.2 is made to work across all kinds of devices from the iPhone 12 and endless Androids to future models that haven't even been made yet. And while it's focused on the future it's also fully backwards compatible: your Baseus Qi2.2 power bank or charger will happily power up a device made for older Qi standards, and Qi phone cases can add wireless charging capability to older phones that weren't built with wireless charging inside.

Baseus is the industry leader in Qi2.2 charging, and it's just launched three new products that take full advantage of Qi2.2's extra power and improved efficiency: two powerful PicoGo magnetic power banks for any device and a really useful foldable 3-in-1 PicoGo charger for your phone, earbuds and smartwatch.

The two magnetic power banks are the PicoGo AM61 Magnetic Power Bank and the PicoGo AM52 Ultra-Slim Magnetic Power Bank. Both versions deliver a massive 10,000mAh of power, both have a 45W USB-C charging port so you can charge two things at once, and both can charge your device wirelessly at up to 25W via the new Qi2.2 standard without any danger of overheating.

The AM52's ultra-slim design features a graphene and aluminium shell for heat dissipation and smart temperature control that protects all of your devices while charging, and the slightly larger AM61 includes a built-in USB-C cable for extra convenience.

If you're looking for a super-speedy compact charger, you'll love the PicoGo AF21 foldable 3-in-1 wireless charger. It delivers the same super-fast 25W wireless charging as its siblings, and with a total 35W of power across its three modules it can wirelessly power up not just your phone but your earbuds and smartwatch too.

That makes it an ideal bedside charger as well as a great travel charger: it’s extremely small at just 75.5 x 80 x 38.11mm and it’s highly adjustable for optimal viewing and charging. You can rotate the watch panel 180 degrees, adjust the phone panel through 115 degrees and adjust the base bracket too.

The PicoGo AF21 foldable 3-in-1 wireless charger is super-portable and extremely adjustable. (Image credit: Baseus)

Ride the next wireless wave with Baseus' brilliant power-ups

Baseus is setting the standard for Qi2.2 wireless charging, and whether you grab the powerful dual-charging PicoGo AM61, the super-slim PicoGo AM52 or the multi-talented PicoGo AF21 charger you're getting the latest, greatest and fastest charging for your phone. With Qi2.2 Baseus isn't just riding the next wireless wave. It's shaping it.

The Baseus PicoGo AM61 Magnetic Power Bank, PicoGo AM52 Magnetic Power Bank and PicoGo AF21 Foldable 3-in-1 Wireless Charger will all be available this August, and you'll be able to order them directly from Baseus’s website and from major retailers such as Amazon.

Categories: Technology

Secure your supply chain with these 3 strategic steps

TechRadar News - Wed, 07/23/2025 - 03:38

Third-party attacks are one of the most prominent trends within the threat landscape, showing no signs of slowing down, as demonstrated by recent high-profile cyber incidents in the retail sector.

Third-party attacks are very attractive to cybercriminals: threat actors drastically increase their chances of success and return on investment by exploiting their victims’ supplier networks or open-source technology that numerous organizations rely on.

A supply chain attack is one attack with multiple victims, with exponentially growing costs for those within the supply chain as well as significant financial, operational and reputational risk for their customers.

In a nutshell, in the era of digitization, IT automation and outsourcing, third-party risk is impossible to eliminate.

Global, multi-tiered and more complex supply chains

With supply chains becoming global, multi-tiered and more complex than they have ever been, third-party risks are increasingly hard to understand.

Supply chain attacks can be extremely sophisticated, hard to detect and hard to prevent. Sometimes the most innocuous utilities can be used to initiate a wide-scale attack. Vulnerable software components that modern IT infrastructures run on are difficult to identify and secure.

So, what can organizations do to improve their defenses against third-party risk? We have outlined three steps organizations can take to build meaningful resilience against third-party cyber risk:

1. Identify and mitigate potential vulnerabilities across the supply chain

Understanding third-party risk is a significant step towards its reduction. This involves several practical steps, such as:

i) Define ownership of supply chain cyber risk management. This role often falls between two stools: internal security teams focus primarily on protecting their own organization, while compliance and third-party risk management programs own responsibility for third-party risk and conduct but often lack the technical background to address cyber risks with confidence.

ii) Identify, inventory and categorize third parties, to determine the most critical supplier relationships. From a cyber security perspective, it is important to identify suppliers who have access to your data, access into your environment, those who manage components of your IT management, those who provide critical software, and – last but not least – those suppliers who have an operational impact on your business.

This is a challenging task, especially for large organizations with complex supply chains, and often requires security teams to work together with procurement, finance and other business teams to identify the entire universe of supplier relationships, then filter out those out of scope from a cyber security perspective.

iii) Assess risk exposure by understanding the security controls suppliers deploy within their estate or the security practices they follow during the software development process, and highlight potential gaps. It is important to follow this up with agreement on the remediation actions acceptable to both sides, and to work towards their satisfactory closure. The reality is that suppliers are not always able to implement the security controls their clients require.

Sometimes this leads to client organizations implementing additional resilience measures in-house instead – often dependent on the strength of the relationship and the nature of the security gaps.

iv) Move away from point-in-time assessments to continuous monitoring, utilizing automation and open-source intelligence to enrich the control assessment process. In practice, this may involve identifying suppliers’ attack surfaces and vulnerable externally-facing assets, monitoring for changes of ownership, identifying indicators of data leaks and incidents affecting critical third parties, and monitoring for new subcontractor relationships.

2. Prepare for supply chain compromise scenarios

Regrettably, even mature organizations with developed third-party risk management programs get compromised.

Supply chain attacks have led to some of the most striking headlines about cyber hacks in recent years and are increasingly becoming the method of choice for criminals who want to hit as many victims as possible, as well as for sophisticated actors who want to remain undetected while they access sensitive data.

Preparedness and resilience are quickly becoming essential tools in the kit bag of organizations relying on critical third parties.

In practice, the measures that organizations can introduce to prepare for third-party compromise include:

i) Including suppliers in your business continuity plans. For important business processes that rely on critical suppliers or third-party technology, understand the business impact, data recovery time and point objectives, workarounds, and recovery options available to continue operating during a disruption.

ii) Exercising cyber-attack scenarios with critical third parties in order to develop muscle memory and effective ways of working during a cyber attack that may affect both the third party and the client. Ensure both sides have access to the right points of contact – and their deputies – to report an incident and work together on recovery in a high-pressure situation.

iii) Introducing redundancies across the supply chain to eliminate single points of failure. This is a difficult task, especially in relation to legacy suppliers providing unique services or products. However, understanding your options and available substitutes will reduce dependency on suppliers and provide access to workarounds during disruptive events such as a supply chain compromise.

3. Secure your own estate (monitor third-party access, contractual obligations)

Protecting your own estate is as important as reducing exposure to third-party risk. Strengthening your internal defenses to mitigate damage if a third party is compromised involves a number of important good practice measures, including but not limited to:

i) Enhanced security monitoring of third-party user activity on your network,

ii) Regular review of access permissions granted to third-party users across your network, including timely termination of leavers,

iii) Continuous identification and monitoring of your own external attack surface, including new internet-facing assets and vulnerable remote access methods,

iv) Employee security training and social engineering awareness, including implementation of additional security verification procedures to prevent impersonation of employees and third parties.

v) Security vetting of third-party users with access to your environment or data.

As third-party threats evolve and become more prominent, organizations must have a clear view of who they’re connected to and the risks those connections pose. They need an end-to-end approach to cyber due diligence, encompassing assessment, monitoring, and response capabilities, to address threats across their supply chains before damage is done.

Third-party risk will remain a challenge for many organizations for years to come, especially as more threat actor groups begin to explore supply chain compromise as an attractive tactic, offering high rewards with relatively low resistance.

Regulators across all sectors are beginning to pay greater attention to supply chain security. Frameworks such as DORA, NIS2 and the Cyber Resilience Act reflect the growing concerns that supply chain security must be a key component of digital strategy. Those who lead on this issue will be best placed to navigate supply chain compromise.

We list the best identity management software.

This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

Categories: Technology

Wikidata’s next leap: the open database powering tomorrow’s AI and Wikipedia

TechRadar News - Wed, 07/23/2025 - 01:00

Many people have never heard of Wikidata, yet it’s a thriving knowledge graph that powers enterprise IT projects, AI assistants, civic tech, and even Wikipedia’s data backbone. As one of the world’s largest freely editable databases, it makes structured, license-free data available to developers, businesses, and communities tackling global challenges.

With a gleaming new API, an AI-ready initiative, and a long-standing vision of decentralization, Wikidata is redefining open data’s potential. This article explores its real-world impact through projects like AletheiaFact and Sangkalak, its many technical advances, and its community-driven mission to build knowledge “by the people, for the people,” while unassumingly but effectively enhancing Wikipedia’s global reach.

Wikidata’s impact: from enterprise to civic innovation

Launched in 2012 to support Wikipedia’s multilingual content, today Wikidata centralizes structured data — facts like names, dates, and relationships — and streamlines updates across Wikipedia’s language editions. A single edit (like the name of a firm’s CEO) propagates to all linking pages, ensuring consistency for global enterprises and editors alike. And beyond Wikipedia, Wikidata’s machine-readable format makes it ideal for business-tech solutions and ripe for developer innovation.

Wikidata’s database includes over 1.3 billion structured facts and even more connections that link related data together. This massive scale makes it a powerful tool for developers. They can access the data using tools like SPARQL (a query language for exploring linked data) or the EventStreams API for real-time updates. The information is available in a wide variety of tool-friendly formats like JSON-LD, XML, and Turtle. Best of all, the data is freely available under CC0, making it easy for businesses and startups to build on.
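The SPARQL access pattern described above can be sketched as follows. The endpoint URL is Wikidata's public query service and the property/item IDs are real Wikidata identifiers, but the sample response inlined here is illustrative rather than live data, so the sketch runs offline:

```python
import json
import urllib.parse

# A typical SPARQL query against Wikidata: cities with population over 5M.
# P31 = "instance of", Q515 = "city", P1082 = "population".
QUERY = """
SELECT ?city ?cityLabel ?population WHERE {
  ?city wdt:P31 wd:Q515 ;
        wdt:P1082 ?population .
  FILTER(?population > 5000000)
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
LIMIT 5
"""

# The public endpoint answers GET requests and can return JSON:
ENDPOINT = "https://query.wikidata.org/sparql"
url = ENDPOINT + "?" + urllib.parse.urlencode({"query": QUERY, "format": "json"})

# Parsing the standard SPARQL 1.1 JSON results shape (sample response is
# inlined and trimmed; the figures are illustrative):
sample = json.loads("""
{"results": {"bindings": [
  {"cityLabel": {"value": "Lagos"}, "population": {"value": "14862000"}}
]}}
""")

rows = [(b["cityLabel"]["value"], int(b["population"]["value"]))
        for b in sample["results"]["bindings"]]
print(rows)
```

In a real application you would send `url` with `urllib.request` or a similar HTTP client and apply the same parsing to the live response.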

Wikibase’s robust and open infrastructure drives transformative projects. AletheiaFact, a platform for verifying political claims based in São Paulo, harnesses Wikidata’s records to drive civic transparency, empowering communities with trusted government insights and showcasing open knowledge’s transformative impact. In India, Wikidata was used to create a map of medical facilities in Murshidabad district, color-coded by type (sub-centers, hospitals, etc.), making healthcare access easier.

In Bangladesh, Sangkalak opens up access to Bengali Wikisource texts, unlocking a trove of open knowledge for the region. These projects rely on a mix of SPARQL for fast queries, the REST API for synchronization, and Wikimedia’s Toolforge platform for free hosting, empowering even the smallest of teams to deploy impactful tools.

A lot of large tech companies also use Wikidata’s data. One example is WolframAlpha, which uses Wikidata through its WikidataData function, retrieving data such as chemical properties via SPARQL for computational tasks and analysis. This integration with free and open data streamlines data models, cuts redundancy, and boosts query accuracy for businesses, all with zero proprietary constraints.

Wikidata’s vision: scaling for a trusted, AI-driven future

Handling nearly 500,000 daily edits, Wikidata pushes the limits of MediaWiki, the software it shares with Wikipedia, and the team is working on several fronts to scale it. As part of this work, a new RESTful API has simplified data access, thereby energizing Paulina, a public domain book discovery tool, and LangChain, an AI framework with strong Wikidata support. Developers enjoy the API’s responsiveness, sparking excitement for Wikidata’s potential in everything from civic platforms like AletheiaFact to quirky experiments.

The REST API release has had immediate impact. For example, developer Daniel Erenrich has used it to integrate access to Wikidata’s data into LangChain, allowing AI agents to retrieve real-time, structured facts directly from Wikidata, which in turn supports generative AI systems in grounding their output in verifiable data. Another example is the aforementioned Paulina, which relies on the API to surface public domain literature from Wikisource, the Internet Archive and more, a fine demonstration of how easier access to open data can enrich cultural discovery.
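The REST API style described here can be sketched in a few lines. The route shown below follows the Wikibase REST API's entity endpoints, but treat the exact path and version prefix as an assumption to verify against the live API documentation; the payload is a trimmed, inlined sample so the sketch runs offline:

```python
import json

# The Wikibase REST API serves entity data as plain JSON over simple GET
# routes (path is illustrative; confirm the version prefix in the API docs).
ITEM = "Q42"  # Douglas Adams
url = f"https://www.wikidata.org/w/rest.php/wikibase/v1/entities/items/{ITEM}"

def english_label(item_json: dict) -> str:
    """Pull the English label out of a REST API item payload, where labels
    are a simple language-code-to-string map."""
    return item_json["labels"]["en"]

# Trimmed sample payload standing in for the live response:
sample = json.loads('{"id": "Q42", "labels": {"en": "Douglas Adams"}}')
print(english_label(sample))
```

Compared with SPARQL, this style of route is a natural fit for tools like LangChain agents that just need one entity's facts at a time.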

Then there is the visionary leap of the Wikibase Ecosystem project, which enables organizations to store data in their own federated knowledge graphs using MediaWiki and Wikibase, interconnected according to Linked Open Data standards. Decentralizing the data reduces strain on Wikidata and lets it go on serving core data. With its vision of thousands of interconnected Wikibase instances, this project could create a global open data network, boosting Wikidata’s value for enterprises and communities.

The potential here is enormous: local governments, enterprises, libraries, research labs, and museums could each maintain their own Wikibase instance, contributing regionally relevant data while maintaining interoperability with global systems. Such decentralization makes the platform more resilient and more inclusive, offering open data stewardship at every scale.

Community events drive this mission. WikidataCon, organized by Wikimedia Deutschland and running from 31 October to 2 November 2025, unites developers, editors, and organizations in an effort to refine tools and data quality. Wikidata Days, local meetups and editathons foster collaboration and offer support for budding projects like Paulina. These events embody Wikidata’s ethos of knowledge built by the people, for the people, and help it remain transparent and community-governed.

Wikidata and AI: the Embedding Project and beyond

The Wikidata Embedding Project is an effort to represent Wikidata’s structured knowledge as vectors, enabling generative AI systems to employ up-to-date, verifiable information. It aims to address persistent challenges in AI — such as hallucinations and outdated training data — by grounding machine outputs in curated, reliable sources. This could render applications like virtual assistants significantly more accurate, transparent, and aligned with public knowledge.
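The retrieval idea behind the Embedding Project can be illustrated with a toy sketch: facts are mapped to vectors, and a query vector is matched to the nearest fact by cosine similarity. Real systems use learned, high-dimensional embeddings; the 3-dimensional vectors here are hand-made stand-ins:

```python
import math

# Toy fact store: each statement is paired with a hand-made embedding.
facts = {
    "Paris is the capital of France": [0.9, 0.1, 0.0],
    "The Seine flows through Paris": [0.7, 0.3, 0.1],
    "Mount Everest is 8849 m tall": [0.0, 0.1, 0.95],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def nearest_fact(query_vec):
    """Return the stored fact whose embedding is closest to query_vec."""
    return max(facts, key=lambda f: cosine(facts[f], query_vec))

# A query vector close to the "capital of France" region of the toy space:
print(nearest_fact([0.95, 0.05, 0.0]))  # Paris is the capital of France
```

Grounding a generative model then means retrieving the nearest curated facts and handing them to the model as context, rather than letting it answer from training data alone.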

The next decade holds promising opportunities for Wikidata’s continued relevance. As enterprise needs become more complex and interconnected, the demand for interoperable, machine-readable, and trusted datasets will only grow. Wikidata is uniquely positioned to meet this demand — remaining free, open, community-driven, and technically adaptable.

Enterprise IT teams will find particular value in Wikidata’s real-time APIs and its nearly 10,000 external identifiers, which link entries across platforms like IMDb, Instagram, and national library systems. These links reduce duplication, streamline data integration, and bridge otherwise isolated datasets. Whether it’s mapping identities across services or enhancing AI with structured facts, Wikidata provides a scalable foundation that saves time and improves precision.

With AI chatbots and large-language models now woven into everything from enterprise search to productivity software, the need for accurate, real-time information is more urgent than ever. Wikidata’s linked data embeddings could herald a new generation of AI tools — blending the speed of automation with the reliability of human-curated, public knowledge.

As AI reshapes the digital landscape, Wikidata stands out as a beacon of trust and collaboration. By empowering developers, enterprises, and communities alike through projects like AletheiaFact and Sangkalak, it supports transparency, civic innovation, and educational equity. With the Embedding Project improving AI accuracy, the Wikibase Ecosystem enabling federated knowledge networks, and events like WikidataCon and Wikidata Days sparking global collaboration, Wikidata is building an accountable future full of open data. More than a knowledge graph, it’s a people-powered infrastructure for the trustworthy web.

We list the 70+ best AI tools.

This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

Categories: Technology

I am a privacy expert and this is why I believe user personalization is the future of privacy

TechRadar News - Wed, 07/23/2025 - 00:38

Personalized content is now a fact of life – what was once considered innovative is now standard for online marketing. As anybody who has indulged in a bit of online retail therapy can tell you, websites are now surprisingly accurate in what they recommend, with promotions appearing at just the right time and content adapting as if by magic.

Whilst that’s the superpower of personalization, it’s also a bit… disconcerting? As convenient as it might be to see exactly the right product at exactly the right time, this also raises a lot of questions: where does all of this information actually come from? Exactly how much does this company know about me? Did I really consent to sharing all of this data?

These questions are only becoming more frequent as consumers become more aware of the value of their data. A recent Deloitte study showed over two-thirds of smartphone users worry about data security and privacy on their devices, whilst in the US 86% of consumers are more worried about their data privacy than the state of the economy.

These are sobering statistics and beg the question: if consumers are crying out for better data protection, how can businesses enact a privacy-first approach to data-driven personalization?

Thinking strategically about personalization

The first important part of making personalization fit for our privacy-conscious age is ensuring that it’s done with purpose. Thinking strategically about personalization, as opposed to just considering the technical aspects of it, is crucial to building a model which is both useful to a business and respects data privacy demands from consumers.

Personalizing without a clear goal risks losing consumer trust: just because a business can collect a certain piece of data or display content to a specific target group, it doesn’t mean they should. Over-personalization or irrelevant suggestions can cause rejection – especially when it’s unclear where the information comes from, so it is always better to personalize with purpose.

This also applies to the data that businesses collect. Even with consent, users today expect to decide what information they share. The starting point shouldn’t be a tracking script, but a deliberate content strategy: Which data is truly necessary? What do we want to achieve with it? And how can we explain it clearly and understandably?

Doing this properly brings two benefits: the data is legally secure and often significantly better in quality. Transparency also builds trust – which is more important than ever in digital marketing. Instead of asking for a full set of personalization data upfront, businesses should consider asking for smaller data points like a postcode to show local offers. This approach creates value for both sides and, crucially, builds consumer trust.

Segments rather than individuals

Advances in technology now mean that personalization can be really granular – but is that always desirable? In a privacy-conscious world, definitely not.

Not every user wants to be individually addressed, and not every website needs to do so. Often, it’s more effective to tailor content for groups with similar interests, behavior, or needs. Common segments include first-time visitors vs. return users, mobile vs. desktop users, regional audiences, or browsers who never add items to their cart.

Targeting these groups allows for impactful content variation – without the complexity of individual personalization. Privacy preferences can also be respected: cautious users are addressed neutrally, while opt-in users get a more personal experience.
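Segment-based targeting like this is simple enough to express as a handful of rules. A minimal sketch, where the segment names mirror those above and the visitor fields (`consented`, `visits`, `cart_adds`, `device`) are hypothetical:

```python
def segment(visitor: dict) -> str:
    """Assign a visitor to one content segment; the privacy check wins,
    so users who haven't opted in always get the neutral experience."""
    if not visitor.get("consented", False):
        return "neutral"
    if visitor.get("visits", 0) <= 1:
        return "first-time"
    if visitor.get("cart_adds", 0) == 0:
        return "browser-never-buys"
    if visitor.get("device") == "mobile":
        return "returning-mobile"
    return "returning-desktop"

print(segment({"visits": 4}))                         # neutral
print(segment({"consented": True, "visits": 1}))      # first-time
```

Note the design choice: consent is the first rule checked, so personalization only ever applies to opted-in users.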

Flexibility is key

Many companies struggle to reconcile data protection and personalization – often because they see them as contradictory. But the opposite is true: taking data protection seriously builds trust and allows for better personalization.

Take consent banners as an example: one which clearly differentiates data types and allows easy management of preferences is more transparent and, as consistent data shows, reduces bounce rates.

The key is to recognize that flexibility on what consumers expect is king. Personalization is not a one-time project and, just as regulation is continuously evolving, so are user expectations. Successful privacy-first personalization means regularly reviewing and adapting content, processes, and technology.

The bottom line is that personalization is not an end in itself. Rather, it’s meant to help deliver the right content to the right audience at the right time – without crossing lines. Focusing on what users truly need and are willing to share often leads to better results than collecting as much data as possible.

A privacy-first approach to personalization isn’t an oxymoron, it’s a necessity in the modern world. Personalization shouldn’t just be a technical concept, but one that places consumers at the heart of what a business does and offers – not just relevant content, but a brand built on clarity, consistency and respect for consumer attitudes towards privacy.

We list the best Linux distro for privacy and security.

This article was produced as part of TechRadarPro's Expert Insights channel where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing find out more here: https://www.techradar.com/news/submit-your-story-to-techradar-pro

Categories: Technology
