Flappy Bird set the bar (or rather pipes, placed randomly) for mobile game simplicity. Between that and its bizarrely high level of difficulty, it created a devilish blend of game-playing compulsion I have rarely seen, before or since.
When indie developer Dong Nguyen launched it in 2014, it was almost an instant hit. Everyone was desperately tapping on their iPhone and iPad screens in a vain attempt to keep a tiny animated bird aloft without slamming into a series of bright green pipes. There was almost nothing to the classic side-scrolling game: just the flapping bird and the pipes racing toward it, with small gaps for the bird to fly through – assuming you could tap just enough to keep Flappy flying, but not too high or too low.
Most people failed within the first few pipes. Experts, though, could thread through dozens. I still remember watching my youngest's laser-like focus as they navigated Flappy through dozens of pipes. The most I ever did was 13, I think.
Despite the game's extremely high frustration quotient, people played it with the same devotion they now commit to Wordle or Connections. But at least those games are solvable. Flappy Bird really wasn't.
The Flappy craze
As you may recall, the fascination with the game became a phenomenon, and eventually the intense interest and nonstop attention drove Nguyen into hiding. He removed the game from the App Store and has scarcely been seen or heard from since.
Over the years there've been numerous attempts to bring Flappy Bird back. The app was so simple that probably anyone could've coded a new one, but whatever has shown up has not captured the imagination like the original.
Now, there might be a new Flappy Bird, not from Nguyen, but from a legion of fans convinced they can rebuild what was into something new and maybe better.
Naturally, they are as misguided as the game's eponymous character, and they have about as much chance of flying to similar Flappy Bird heights as, well, Flappy Bird has of navigating through those pipes. Which is to say: not much.
The new Flappy Bird will start off on the wrong..., er... wing by not recreating the original Flappy Bird but by adding levels, skins, and multiplayer features. In other words, they're going to make Flappy Bird on iOS and Android an extremely traditional mobile game. It may even resemble Angry Birds, but without the cleverness or finesse.
Flappy Bird was not successful because people craved something more, or perhaps something visually better. They played and played because Flappy Bird triggered some simian part of their brains intent on problem-solving. And Flappy Bird's quest was an almost unsolvable problem. Nguyen programmed it in such a way that there was no fuzziness to the flight control. Instead, it required a kind of tapping precision not seen in any other game before or since.
One might argue that many hate-played it in a desperate attempt to beat the Flappy Bird system. Few if any did, and yet we played and played, and often complained to Nguyen on social media (and drove him away).
The new Flappy Bird will invariably be easier. People will win and compare total flight times through the maze. The skill level will be far reduced, but at least you'll have entertaining levels.
Enough with the nostalgia
I don't know why we have to revisit every moment from our past, take out the defibrillator, and try to restart its heart. If we can't revive those memories, we go full Dr. Frankenstein and rebuild them.
Like Frankenstein's monster, these rebuilt memories bear little resemblance to the originals, but they have just enough to trigger that other simian response: nostalgia. It's why we're rewatching Beetlejuice after nearly 35 years. Sure, the new movie might be good, but for every Beetlejuice 2 there's a Land of the Lost (sorry, Will Ferrell).
The return of Flappy Bird will not be cause for celebration; it'll be a reminder that we can't leave well enough alone. I don't want a new Flappy Bird. I want the original, untouched and back in the App Store, so I can fail over and over again until I wish I'd never discovered, or rediscovered, the game in the first place.
Researchers at the Department of Instrumentation and Applied Physics (IAP), Indian Institute of Science (IISc) and collaborators have developed a new supercapacitor that can be charged by light.
This innovation could be used in streetlights and self-powered electronic devices, including sensors. Unlike standard capacitors, which store energy electrostatically, supercapacitors use electrochemical methods to hold significantly more energy, allowing them to “release charge more quickly than batteries,” explained Abha Misra, Professor at IISc’s Department of Instrumentation and Applied Physics and the study’s author.
The supercapacitor's electrodes are made from zinc oxide (ZnO) nanorods on a transparent fluorine-doped tin oxide (FTO) substrate, allowing light to pass through and charge the device. When exposed to ultraviolet (UV) light, the supercapacitor demonstrated a large increase in capacitance, its ability to store electrical charge. “The ideas were simple... but when combined together, they worked very well,” Misra adds.
Necking behavior
In addition to the impressive increase in capacitance, the researchers discovered two unusual behaviors. First, the device’s capacitance increased with voltage under light exposure, a phenomenon co-author AM Rao from Clemson University refers to as "necking behavior." Second, while energy storage typically decreases when charging faster, the team found that their supercapacitor stored more energy during rapid charging when exposed to UV light.
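To see why a light-induced capacitance boost matters, recall the textbook relation for the energy stored in a capacitor, E = ½CV². Here is a minimal sketch of that arithmetic; the capacitance and voltage figures below are made-up placeholders for illustration, not values from the study:

```python
def stored_energy_joules(capacitance_farads: float, voltage_volts: float) -> float:
    """Energy stored in a capacitor: E = 0.5 * C * V^2."""
    return 0.5 * capacitance_farads * voltage_volts ** 2

# Hypothetical numbers: the same device in the dark vs. under UV light.
dark_capacitance = 1.0e-3  # 1 mF (placeholder)
uv_capacitance = 2.5e-3    # 2.5 mF under UV (placeholder)
voltage = 1.0              # charging voltage in volts

print(stored_energy_joules(dark_capacitance, voltage))  # 0.0005 J
print(stored_energy_joules(uv_capacitance, voltage))    # 0.00125 J
```

Because energy scales linearly with C and with the square of V, a capacitance that rises both under illumination and with voltage (the "necking behavior" above) compounds into a disproportionately larger energy store.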
The research team used a liquid electrolyte to improve performance by enhancing the electric double layer effect, which plays a key role in the high energy storage capacity of supercapacitors. “We have miniaturized supercapacitors to the micron scale so that they can be integrated along with microelectronic chips,” Misra notes, pointing to potential applications in mobile phones and other small devices.
Misra believes this new technology could eventually replace solar cells in streetlights due to its faster charging time and high power density.
The research was published in the Journal of Materials Chemistry A, and the team hopes to further develop the supercapacitor so it can charge using visible and infrared light as well.
A little over two years ago, I wrote about how integrated graphics were the future of gaming. I stand by the things I said in that article - if anything, recent developments in the computer hardware industry have vindicated me, and further convinced me that we’re seeing the slow death of the graphics card.
That’s right: I think the dedicated GPU is going to go the way of the dodo. It’s an almost heretical thing to say as a long-time PC gamer and system builder; I own more than 500 games on Steam alone, and I’ve built a ridiculous number of PCs both for work and personal use over the years. I’m not afraid of being crucified by Reddit for saying that I both believe and hope that GPUs will die out - though frankly, I’d understand if they did. It’s a dramatic statement to make.
Getting the best graphics card is the number one priority when building a gaming PC - it’s almost invariably the most expensive component in your system - and it’s a common aspiration among PC players to have a fully tricked-out, liquid-cooled build with an RTX 4090 at the center of it all. So why am I so convinced that soon, we won’t need them anymore?
The great graphics shake-up
The answer to that question requires two separate parts: a look at the CPU industry, and a look at the AI industry. As anyone who knows me well will tell you, I think AI is kinda sus, so let’s start with the CPU side of the story.
Earlier this year, we saw the triumphant arrival of Qualcomm’s Snapdragon X Elite chips at Computex 2024: a new challenger in the laptop processor arena, something to finally take aim at Intel’s dominance in the market - something AMD has been trying and failing to do for years. It was a strong showing all around for the new chips, but the part that stuck in my mind the most was seeing an ultrabook without a graphics card run Baldur’s Gate 3 at 4K.
Now I can look at Gale's beautiful face on a thin-and-light laptop without a dedicated GPU. Welcome to the future! (Image credit: Larian Studios)
Yes, CPUs with integrated graphics just keep getting better and better, even if Qualcomm itself insists that it doesn’t have real plans to take over the gaming market. It’s not just Snapdragon, either; Intel plans to hit back with powerful gaming performance on its upcoming Lunar Lake chips, and AMD has been enjoying huge success with its custom-tuned chips for PC gaming handhelds like the Asus ROG Ally X, Lenovo Legion Go, and Valve’s Steam Deck. Sure, these chips aren’t going to rival the best 4K graphics cards when it comes to high-end gaming, but they’re more than capable of providing a solid gaming experience.
There’s a key reason why gaming on integrated graphics is now actually feasible: upscaling software. Tools like Nvidia DLSS, AMD FSR, and Intel XeSS are what make this performance possible; my colleague John Loeffler saw an Asus ZenBook with an Intel Lunar Lake chip at IFA 2024 average 60fps in Cyberpunk 2077 - a notoriously demanding game - at 1080p on medium settings thanks to XeSS.
All in on AI
XeSS and DLSS (though notably not AMD’s competing FSR upscaler) are powered by AI hardware, which gives me a nice segue into my next point: AI is killing the gaming GPU industry, and if it continues at its current pace, it threatens to swallow it entirely.
Nvidia has been making waves in the AI space for a while now. Although a potential slowdown in AI expansion saw Nvidia shares drop last week, the company remains committed to its AI vision: CEO Jensen Huang’s Computex keynote was chock-full of AI-related schemes that may or may not destroy the planet, and the company keeps putting out new AI-powered tools as well as supplying hardware for the training of AI models around the globe.
Jensen Huang won't stop talking about AI, and I don't really blame him - it's made Nvidia a LOT of money. (Image credit: Nvidia)
Jensen isn’t alone, either. Earlier this week, AMD Senior VP Jack Huynh revealed in an interview that AMD is seriously targeting the AI market, and a knock-on effect of this is that Team Red will be withdrawing from the high-end GPU race, so we probably won’t be getting a Radeon RX 8900 XTX, at least not anytime soon. Instead, AMD’s consumer efforts will be focused on the budget to midrange space - further closing the performance gap between their discrete graphics cards and new integrated processor graphics (iGPUs).
An ignoble end for the humble graphics card?
Simply put, the increasing demand for GPUs for AI projects is incompatible with a future where GPUs are necessary for gaming PCs. It’s been clear for a while now that the focus is no longer on consumer hardware (especially for Nvidia), but with iGPUs improving at a faster rate than traditional graphics cards, it won’t surprise me if RTX 5000 is the final generation of Nvidia GPUs aimed at gamers.
After all, nothing lasts forever. Sound cards and network adaptors were an integral part of custom PC builds for years, but those eventually got swept away as motherboards improved and started to integrate those features. When it comes to the requirements of the average gamer, we’re likely not far off from CPUs that can handle everything you need - even if that’s gaming at higher resolutions.
I won’t weep for the dedicated GPU when it dies, either. Not only are they very expensive, but being able to improve my gaming performance by simply swapping out a single chip would make future system upgrades quicker and easier, as well as allowing for more compact PC builds. Yes, I love my chunky RGB-filled tower, but it takes up too much damn space on my desk.
A group of researchers has identified a security flaw in Apple’s Vision Pro mixed reality headset that allowed them to reconstruct users’ passwords, PINs, and messages.
Dubbed ‘GAZEploit’, the attack used eye-tracking data to decode what users typed with their eyes on the headset’s virtual keyboard.
Since a user’s virtual avatar is visible to other users, the researchers did not have to hack into anything or gain access to the user’s headset; they just had to study the eye movements of the avatar. Users can type on the virtual keyboard to log into Slack, Teams, Twitter, and more, all while their avatar is on display.
All patched up
The researchers were able to predict keyboard placement with impressive accuracy, deducing the correct letters typed within a maximum of five guesses over 90% of the time for messages, 77% of the time for passwords, and 73% of the time for PINs.
The vulnerability was discovered in April, and Apple issued a patch to fix the issue in July; the avatar is no longer displayed while the virtual keyboard is in use. The attack is said to be the first of its kind, and it exposes how biometric data can be used to surveil users, the researchers confirmed:
“These technologies … can inadvertently expose critical facial biometrics, including eye-tracking data, through video calls where the user’s virtual avatar mirrors their eye movements.”
Wearable technology has ushered in a new set of privacy concerns for users, with more information captured and stored in people’s day-to-day lives. Health data, locations, and biometric information could all be used against users if they fell into the wrong hands.
Via Wired