The human brain knows no limits. The notion that we use only 10% of it remains a myth, and innovations like antennas and bendable batteries keep pushing biomedical engineering forward. Recently, researchers at Wits University in Johannesburg made a remarkable breakthrough with the “Brainternet.” (But it’s not exactly what you might think!)
The project captures EEG brainwave signals through an Emotiv headset worn by the user. The signals are transmitted to a low-cost Raspberry Pi computer, which live-streams the data to an application programming interface (API) and displays it on an open website where anyone can view the activity.
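To make the pipeline concrete, here is a minimal sketch of the Raspberry Pi side of such a system. The Emotiv SDK and the Brainternet API are not public in this article, so the channel count, the simulated signal, and the JSON shape below are all assumptions for illustration.

```python
import json
import random
import time

# Assumption: a 14-channel headset, which is typical of consumer Emotiv devices.
CHANNELS = 14

def read_eeg_sample():
    """Stand-in for a real headset read: one simulated sample per channel (uV)."""
    return [random.gauss(0.0, 40.0) for _ in range(CHANNELS)]

def package_sample(sample, timestamp):
    """Wrap a sample as the JSON a Raspberry Pi might POST to the web API."""
    return json.dumps({"t": timestamp, "uV": sample})

if __name__ == "__main__":
    # On the Pi this loop would POST each payload to the site's API
    # (e.g. with urllib.request); here we just print a few payloads.
    for _ in range(3):
        print(package_sample(read_eeg_sample(), time.time()))
```

The real system streams continuously, but the core loop is the same: read, package, transmit.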
In essence, you can download information about your brain and pretty much study the thing. So, no, you can’t update your Twitter in your sleep. However, the technology is still potentially valuable in transferring brain data.
“Brainternet can be further improved to classify recordings through a smart phone app that will provide data for a machine-learning algorithm. In [the] future, there could be information transferred in both directions – inputs and outputs to the brain,”
Sorry to disappoint you, millennials, but keep in mind that understanding brain function could eventually make a mind-controlled Facebook possible. For now, stick to a MacBook.
I wouldn’t be surprised if one day AI systems ruled the world. While there are skeptics, most people in the tech industry remain pretty optimistic, and with good reason. A system built by researchers at the Georgia Institute of Technology can recreate video games after observing them for only two minutes.
The team did this by training the AI on footage of two distinct types of players making their way through Level 1 of Super Mario Bros.: one adopted an “explorer” style of play, while the other was a “speedrunner” who headed straight for the goal.
The system managed to rebuild an accurate representation of the game with only minor deviations. It’s impressive and also far less creepy than an AI creating its own language.
“Our AI creates the predictive model without ever accessing the game’s code, and makes significantly more accurate future event predictions than those of convolutional neural networks,”
Okay, so its capabilities are still sort of spooky, but useful nonetheless. The program could prove vital in pattern recognition, among other things. In the end, the model is an effective training method that can also be easily controlled by users. Phew!
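The core idea of learning a game’s “engine” purely by watching it can be shown with a toy, which is emphatically not the Georgia Tech system: here the hidden engine is a simple cellular automaton, and the learner records which local pattern leads to which next cell, then replays those learned rules to recreate the clip without ever touching the engine’s code.

```python
def hidden_engine(row):
    """The 'game code' the learner never reads: rule 90 (each cell = left XOR right)."""
    n = len(row)
    return [row[(i - 1) % n] ^ row[(i + 1) % n] for i in range(n)]

def learn_rules(frames):
    """Watch consecutive frames; record (left, center, right) -> next cell value."""
    rules = {}
    for before, after in zip(frames, frames[1:]):
        n = len(before)
        for i in range(n):
            key = (before[(i - 1) % n], before[i], before[(i + 1) % n])
            rules[key] = after[i]
    return rules

def predict(row, rules):
    """Apply the learned rules to produce the next frame."""
    n = len(row)
    return [rules[(row[(i - 1) % n], row[i], row[(i + 1) % n])]
            for i in range(n)]

# "Watch" a short clip of the hidden engine running...
frames = [[0, 0, 0, 1, 0, 0, 0, 0]]
for _ in range(6):
    frames.append(hidden_engine(frames[-1]))

# ...learn its rules, then regenerate the whole clip from the first frame alone.
rules = learn_rules(frames)
replayed = [frames[0]]
for _ in range(len(frames) - 1):
    replayed.append(predict(replayed[-1], rules))
assert replayed == frames
```

The real system works on pixel sprites and a far richer rule space, but the observation-to-predictive-model loop is the same shape.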
In the aftermath of Hurricanes Harvey and Irma, people have been doing their best to reach out to victims. Nonprofit groups are replacing ruined cars with bikes. Millionaires are welcoming displaced children into their homes. But conceivably, the best way to avoid a disaster is to predict it, and thanks to scientists at MIT, a new algorithm may be able to foresee extreme weather events.
Themistoklis Sapsis, an associate professor at MIT, said: “We have applied this framework to turbulent fluid flows… They’re encountered in climate dynamics in the form of extreme rainfall, in engineering fluid flows such as stresses around an airfoil, and acoustic instabilities inside gas turbines.”
The system is complicated (we’ll leave it at that), but it can help us formulate avoidance strategies. In the past, engineers relied heavily on mathematical equations in the hope of identifying extreme weather patterns, but those equations alone proved lacking.
Sapsis said that the framework is generalizable enough to apply to a wide range of systems in which extreme events may occur. He plans to apply the technique to scenarios in which fluid flows against a boundary or wall, such as air flowing around jet planes and ocean currents pushing against oil risers.
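The gist of precursor-based prediction can be sketched with a toy that is much simpler than Sapsis’s actual framework: in a synthetic time series where a slow swell always builds before each extreme burst, we can score how well a simple indicator (the recent mean crossing a threshold) warns of the extremes in advance. The series, the five-step lead time, and the thresholds are all made-up illustration values.

```python
import random

random.seed(0)

LEAD = 5       # how many steps of warning we want
EXTREME = 3.0  # what counts as an extreme value

# Synthetic system: mostly small noise, but a slow swell (the precursor)
# builds for LEAD steps before each extreme burst.
series = []
while len(series) < 500:
    if random.random() < 0.03:
        for k in range(LEAD):
            series.append(1.0 + 0.3 * k + random.gauss(0, 0.1))
        series.append(4.0 + random.gauss(0, 0.3))
    else:
        series.append(random.gauss(0, 0.3))

def warned(series, t, indicator_threshold):
    """Warning at time t: was the mean of the last LEAD samples elevated?"""
    recent = series[t - LEAD:t]
    return sum(recent) / LEAD > indicator_threshold

def skill(series, indicator_threshold):
    """Fraction of extremes that the indicator warned about in advance."""
    extremes = [t for t in range(LEAD, len(series)) if series[t] > EXTREME]
    if not extremes:
        return 0.0
    hits = sum(1 for t in extremes if warned(series, t, indicator_threshold))
    return hits / len(extremes)
```

With a threshold of 1.0 the indicator catches essentially every synthetic extreme; the real difficulty, which the MIT work addresses, is finding such an indicator in genuinely turbulent data.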
In their fear and dislike of math, people tend to forget that algorithms are more than fancy abstract numbers; they have practical, visible, and useful applications. And this isn’t just me promoting the subject.
In recent years, cancer treatments have grown in both number and effectiveness. Experimental approaches such as personalized vaccines and gene editing have made for smoother recoveries. Detecting cancer in the first place, however, remains tricky. That is where a simple blood test comes in: the new method can detect eight common but evasive cancers.
“The sort of ultimate vision is that at the same time that you are getting your cholesterol checked when you are getting your annual physical, you will also get your blood screened for cancer,” said lead study author Joshua Cohen.
The test, CancerSEEK, screens blood for cancer-related compounds that allow for early detection. It can even pinpoint cancers with no current screening tests: ovarian, stomach, esophageal, liver, and pancreatic. The process is a melting pot of new technologies, such as artificial intelligence and algorithms.
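One way to picture the “combine many markers into one call” step is a logistic score over biomarker readings. To be clear, CancerSEEK’s actual model, marker names, weights, and thresholds are not given in this article; everything below is a hypothetical stand-in.

```python
import math

# Made-up inputs: a circulating-DNA mutation flag plus two protein levels.
WEIGHTS = {"mutation_detected": 3.0, "protein_a": 0.8, "protein_b": 0.5}
BIAS = -4.0

def risk_score(sample):
    """Logistic combination of markers -> a probability-like score in (0, 1)."""
    z = BIAS + sum(WEIGHTS[k] * sample[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

healthy = {"mutation_detected": 0, "protein_a": 1.0, "protein_b": 1.0}
flagged = {"mutation_detected": 1, "protein_a": 3.0, "protein_b": 2.5}
```

A score near 1 would prompt follow-up screening; the hard clinical work is choosing markers and weights so that healthy patients rarely score high.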
“The test needs to be validated in a large-scale study that would evaluate tens of thousands of healthy individuals to confirm the sensitivity and specificity,” Cohen said.
Though CancerSEEK’s accuracy for early detection currently sits at around 60%, it’s a step up from having no means of early diagnosis at all. It’s a slow and steady affair that will, hopefully, one day win the race.
At the rate technology is advancing, we can teach machines to do almost anything. From programming drones to plant trees to building a robot that can detect water pollution, gadgets have become more capable than ever. But can we train a device to rely entirely on nature? Microsoft is teaching a motorless plane to fly just like a bird.
The researchers have found that through a complex set of AI algorithms, they can get their 16 1/2-foot, 12 1/2-pound aircraft to soar much like a hawk would, by identifying things like air temperature and wind direction to locate thermals — invisible columns of air that rise due to heat.
The plane is one of the few AI systems that acts on predictions it makes. In a nutshell, it is akin to a simple thinking being.
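One small ingredient of autonomous soaring can be sketched without any of Microsoft’s actual controller: a variometer-style rule that infers how fast the surrounding air is rising and decides whether to circle in it. The glider’s still-air sink rate and the lift cutoff below are assumed numbers.

```python
STILL_AIR_SINK = 0.6  # m/s the glider loses in calm air (assumed value)

def air_vertical_speed(measured_climb_rate):
    """Lift the air itself provides = observed climb plus our natural sink."""
    return measured_climb_rate + STILL_AIR_SINK

def should_circle(recent_climb_rates, min_lift=0.3):
    """Circle to stay in a thermal when the average inferred lift is strong."""
    avg = sum(recent_climb_rates) / len(recent_climb_rates)
    return air_vertical_speed(avg) > min_lift

print(should_circle([-0.6, -0.5, -0.7]))  # calm air: no thermal, keep gliding
print(should_circle([0.4, 0.6, 0.5]))     # sustained climb: circle in the lift
```

The research system goes much further, predicting where thermals will be from temperature and wind cues rather than just reacting to lift already felt.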
“Birds do this seamlessly, and all they’re doing is harnessing nature. And they do it with a peanut-sized brain,”
If successful, the planes could be used in farming and in providing internet connections to remote areas. If cars can drive themselves, planes can follow in their tread marks.
People have very different opinions on the value of artificial intelligence. Some are skeptical, while others are optimistic. Either way, there is no denying that AI is becoming increasingly powerful. In fact, it can now create realistic worlds based on its memories.
[The] AI works from rough layouts that tell it what should be in each part of the image. The center of the image might be labeled “road” while other sections are labeled “trees” or “cars”: it’s painting by numbers for an AI artist.
The AI, called an imaginative neural network, functions on an algorithm that essentially knows what goes where.
[Creator] Chen’s system starts by processing a photo of a real street it hasn’t seen before, but one that has been labeled so the AI knows which bits are supposed to be cars, people, roads, and so on. The AI then uses this layout as a guide to generate a completely new image.
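The “painting by numbers” flow, stripped of the trained network that makes the real results photorealistic, is just: label map in, picture out. This toy renderer with a made-up palette shows that flow in miniature.

```python
# Made-up label-to-color palette; the real system synthesizes textures
# learned from street photos instead of flat colors.
PALETTE = {
    "road": (90, 90, 90),
    "tree": (30, 120, 40),
    "car": (200, 30, 30),
    "sky": (120, 180, 255),
}

def render(label_map):
    """Turn a 2-D grid of labels into a grid of RGB pixels."""
    return [[PALETTE[label] for label in row] for row in label_map]

layout = [["sky", "sky", "sky"],
          ["tree", "road", "car"]]
image = render(layout)
```

Swap the lookup table for a generative network conditioned on the same layout and you have the shape of Chen’s system.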
While developers are trying to come up with a more practical use for the technology, it’s safe to say future video games will be out of this world.
Now that drones have proven themselves vital in the technological universe, gadget firms are pushing their limits even further. From delivering blood transfusions to restoring forests, drones are now making more conventional deliveries too. Thanks to Alphabet, Australians will be receiving spontaneous burrito bags by, you guessed it, drone.
Project Wing has teamed up with Mexican food chain Guzman y Gomez and pharmacy chain Chemist Warehouse to let customers order items through a dedicated app. The drones are then sent off to collect goods from the stores’ loading sites and drop them off at the testers’ homes, traveling at up to 120 km/h.
The drones are giving Alphabet the breakthrough it has been after. The company is targeting rural parts of the Australian Capital Territory, where the nearest store can be a 40-minute round trip away. Building on the success of these rural deliveries, Alphabet is now challenging the precision of its drones.
Project Wing is training its drones to deliver items anywhere: each drone uses its sensors to identify new obstacles, and every encounter improves the onboard algorithms and the drone’s capacity to pick out a safe spot for delivery.
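The “pick a safe spot” step can be sketched as a simple geometric rule, which is an illustration only; Project Wing’s real planner is not described in this article. Given an obstacle map sensed from above, choose the free cell with the most clearance from any obstacle.

```python
def safest_cell(grid):
    """Return the free cell (row, col) farthest (Manhattan) from any obstacle.

    `grid` is a 2-D list where 1 marks an obstacle and 0 marks clear ground.
    """
    obstacles = [(r, c) for r, row in enumerate(grid)
                 for c, v in enumerate(row) if v]

    def clearance(cell):
        r, c = cell
        return min(abs(r - obr) + abs(c - obc) for obr, obc in obstacles)

    free = [(r, c) for r in range(len(grid))
            for c in range(len(grid[0])) if not grid[r][c]]
    return max(free, key=clearance)

# A toy backyard: a tree at top-right and a shed at bottom-left.
yard = [[0, 0, 0, 1],
        [0, 0, 0, 0],
        [1, 0, 0, 0]]
spot = safest_cell(yard)
```

A real planner would also weigh wind, people, and approach paths, but maximizing clearance is the intuitive core.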
While accuracy is a must for any drone-related activity, I wouldn’t mind a splattered burrito. Anyway, it’s all about taste.
Since the birth of the smartphone, companies like Google, Apple, and other phone makers have been working to make newer models suited for… well, everything. Not only are phones a source of entertainment, they are becoming equally health-centered. But smartphones cater to everyone, including amateur and professional photographers. This new algorithm edits phone photos before you even take them.
Machine learning networks were set to work on a database of 5,000 sample images improved by five professional photographers, teaching the software how to tweak a picture to get it looking its best.
To sidestep resolution constraints, the algorithm does its processing on a low-resolution copy of the image, then scales the results back up without ruining the photo. Using this mechanism, the app needs only a hundredth of the phone’s memory. Like most sterling apps, it also comes with additional features.
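The low-resolution trick can be shown in miniature, though the real pipeline is far more sophisticated: decide the *adjustment* on a downscaled copy, then apply that adjustment, upsampled, to the full-resolution pixels. The grayscale format, the brightness target, and the nearest-neighbor upsampling here are all simplifying assumptions.

```python
def downscale(img, factor):
    """Average `factor x factor` blocks of a grayscale image (list of rows)."""
    h, w = len(img) // factor, len(img[0]) // factor
    out = []
    for r in range(h):
        row = []
        for c in range(w):
            block = [img[r * factor + i][c * factor + j]
                     for i in range(factor) for j in range(factor)]
            row.append(sum(block) / len(block))
        out.append(row)
    return out

def gain_map(small, target=128.0):
    """Cheap 'edit' computed at low resolution: per-block brightness gain."""
    return [[target / max(v, 1.0) for v in row] for row in small]

def apply_gains(img, gains, factor):
    """Upsample the gain map (nearest neighbor) onto the full-res image."""
    return [[min(255.0, img[r][c] * gains[r // factor][c // factor])
             for c in range(len(img[0]))] for r in range(len(img))]

dark = [[40.0] * 4 for _ in range(4)]          # a flat, underexposed image
small = downscale(dark, 2)                      # tiny low-res copy
bright = apply_gains(dark, gain_map(small), 2)  # edit applied at full res
```

Because the expensive decision happens on the tiny copy, the per-pixel work at full resolution stays trivial, which is what makes real-time preview feasible.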
As well as brightening dark spots and balancing contrast, for example, the algorithms could even mimic the style of a particular photographer.
Does this mean my work has the potential to exhibit at the MET? While the app makes photography seem easy, let’s not forget that snapping a great picture also takes a level of skill.
I’ve seen my fair share of outlandish inventions, from automatic toothbrushes to bulletproof skateboards. While they may seem somewhat pointless, they are also incredibly fascinating. In line with these high-tech novelty items is the Hyperface, an AI mask that visually communicates human emotions.
The Hyperface is worn like a visor, but includes a transparent screen to flip in front of the eyes. A screen at the top of the visor reflects an image onto the one over the wearer’s eyes, making it look like a digital face. Someone looking at the wearer would see a pair of digital eyes staring back at them, which change based on the wearer’s facial expressions.
How does the Hyperface analyze what its wearer is feeling? Simple — a secret algorithm. Creator Eun Kyung Shin argues that the technology can even tell when you are interested in someone (or not).
Shin wanted the device to display a person’s emotions as closely as possible. She refers to faces we normally put on in public as ‘social masks,’ especially when people deal with intense situations.
Is the device pointless? That depends. While it aims to help with social anxiety, the Hyperface would likely also attract unwanted attention. It’s not every day you wear a digital face.
Over the years, patients with neurological disorders have relied on prosthetics and animal testing in the hopes of regaining the ability to walk. In the U.S. alone, nearly 5.4 million people live with some form of paralysis, and treatments are expensive and often difficult to obtain. But a new medical algorithm can help the nervous system ‘relearn’ movements.
The smart walk assist is an innovative body-weight support system: it counteracts part of the force of gravity while pushing the patient back and forth, left and right, or in several of these directions at once, recreating the natural gait and movement patients need in their day-to-day lives.
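Purely as an illustration (the actual controller is not described here), the support described above can be summarized as an upward force that cancels a fraction of the patient’s weight plus small horizontal nudges that follow the stride cycle. Every number below is an assumption.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def support_force(mass_kg, unload_fraction, gait_phase):
    """Return (up, forward, lateral) harness forces in newtons for one instant.

    gait_phase runs 0..1 over one stride; the nudges oscillate with the stride.
    The 15 N and 8 N amplitudes are made-up illustrative values.
    """
    up = unload_fraction * mass_kg * G          # cancel part of body weight
    forward = 15.0 * math.sin(2 * math.pi * gait_phase)  # back-and-forth
    lateral = 8.0 * math.sin(4 * math.pi * gait_phase)   # left-right sway
    return up, forward, lateral

# Mid-stride snapshot for a 70 kg patient with 30% of weight supported.
up, fwd, lat = support_force(70.0, 0.3, 0.25)
```

The point of such phase-dependent forces is exactly what the article describes: the harness does not just hold the patient up, it shapes the motion itself.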
After just a single hour in the harness, all 30 patients tested showed improvement. The procedure helps overcome obstacles such as loss of muscle mass and degraded neural wiring.
“This is a smart, discreet, and efficient assistance that will aid rehabilitation of many persons with neurological disorders.”
While patients are literally taking it a step at a time, this is definitely a huge leap for the medical field.