This week we learned about an ingestible origami robot that can remotely operate on a patient’s stomach, wireless ear buds that could soon translate a conversation in a foreign language and a new way to evolve supermaterials in the cloud. Take a look.
Teams at the Massachusetts Institute of Technology, the University of Sheffield and the Tokyo Institute of Technology have developed an ingestible “origami robot” that unfolds itself from a pill-like capsule and allows surgeons to operate on patients remotely. Doctors place the patient inside a magnetic field and use the field to guide the robot through the stomach and other body cavities. The robot can remove small swallowed objects like button batteries or patch wounds. “For applications inside the body, we need a small, controllable, untethered robot system,” Daniela Rus, a professor of electrical engineering and computer science, told MIT News. “It’s really difficult to control and place a robot inside the body if the robot is attached to a tether.”
Researchers in Switzerland and California are using computer modeling, machine learning and other artificial intelligence techniques to create new supermaterials by mixing and matching the known properties of materials stored inside a data cloud. “We probably know about 1 percent of the properties of existing materials,” Gerbrand Ceder, a materials scientist at the University of California, Berkeley, told Nature magazine. Nature reported that “at least three major materials databases already exist around the world, each encompassing tens or hundreds of thousands of compounds.” Neil Alford, a materials scientist at Imperial College London, told the magazine that “we are now seeing a real convergence of what experimentalists want and what theorists can deliver.”
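The winnowing step such databases automate can be pictured as a simple property filter. The sketch below uses a tiny in-memory table with made-up entries (the formulas, property names and values are illustrative, not drawn from any real database):

```python
# Toy computational-screening sketch: filter a small in-memory "materials
# database" for compounds whose known properties fall inside a target window.
# All values here are hypothetical placeholders.
materials = [
    {"formula": "LiFePO4", "band_gap_eV": 3.4, "density_g_cm3": 3.6},
    {"formula": "Si",      "band_gap_eV": 1.1, "density_g_cm3": 2.3},
    {"formula": "GaAs",    "band_gap_eV": 1.4, "density_g_cm3": 5.3},
]

def screen(db, min_gap, max_density):
    """Return formulas whose properties meet the target thresholds."""
    return [m["formula"] for m in db
            if m["band_gap_eV"] >= min_gap and m["density_g_cm3"] <= max_density]

# Candidates worth follow-up computation or synthesis
print(screen(materials, min_gap=1.0, max_density=4.0))  # → ['LiFePO4', 'Si']
```

Real screening pipelines apply the same idea to hundreds of thousands of compounds, with machine-learned models filling in properties that have never been measured.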
The New York City-based startup Waverly Labs says it is developing ear buds that can translate conversations between users speaking different languages in real time. The earpiece, called Pilot, will begin working in English, Spanish, French and Italian, with other languages reportedly coming. The buds, designed to work in pairs, will connect to a smartphone app and use speech recognition and machine translation to make sense of a conversation. The details of the technology remain unclear, and Waverly has released only a couple of videos of the future product. But that didn’t stop the company from raising more than $1 million for the project on the crowdfunding website Indiegogo in just hours. If Pilot takes off, it could be revolutionary.
Scientists at the University of Cincinnati’s James L. Winkle College of Pharmacy are using math and algorithms to model chemicals and test them virtually. The technique could reduce the need for using animals to test consumer products like shampoos and hand lotions. The equations allow the researchers to determine whether a chemical compound will penetrate the skin or induce an allergic response based on the results of previously tested compounds, according to Gerald Kasting, professor of pharmaceutical sciences at the college. “A lot of people have models, but we have predictive models,” Kasting said. “Instead of doing testing on 30,000 compounds, we are able to test a subset of, say, 200 and make predictions about the other 29,800 based on the subset.”
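The subset-to-whole prediction Kasting describes can be sketched with a toy nearest-neighbor model. Everything below is hypothetical (the descriptors, the measured values and the one-neighbor rule are stand-ins; the Cincinnati group’s actual equations are not public in this article):

```python
# Hedged sketch of subset-based prediction: measure a property for a small
# tested subset, then predict an untested compound's value from its nearest
# neighbor in descriptor space. All numbers are invented for illustration.
import math

# (molecular_weight, logP) -> measured skin-permeability score, tested subset
tested = {
    (152.0, 1.2): 0.31,
    (300.5, 3.8): 0.72,
    (94.1,  0.5): 0.18,
}

def predict(descriptor):
    """1-nearest-neighbor: reuse the closest tested compound's measurement."""
    nearest = min(tested, key=lambda d: math.dist(d, descriptor))
    return tested[nearest]

# An untested compound close to the first entry inherits its value
print(predict((150.0, 1.0)))  # → 0.31
```

A production model would use far richer molecular descriptors and a fitted statistical model rather than a single neighbor, but the workflow is the same: measure a few, predict the rest.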
Engineers at the Georgia Institute of Technology have found a new way to control a driverless vehicle “as it maneuvers at the edge of its handling limits.” They tested the technology “by racing, sliding, and jumping one-fifth-scale, fully autonomous auto-rally cars at the equivalent of 90 mph.” They say the approach could help improve the safety of self-driving cars when road conditions get rough. The team is using a method called model predictive path integral control (MPPI), developed to steer robotic vehicles near their friction limits. They wrote that “the MPPI control algorithm continuously samples data coming from global positioning system (GPS) hardware, inertial motion sensors, and other sensors. It calculates 2,000 different possibilities within 15 milliseconds. The onboard hardware-software system performs real-time analysis of a vast number of possible trajectories and relays optimal handling decisions to the vehicle moment by moment.”
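The core sample-and-weight loop of MPPI can be shown on a toy one-dimensional system. This is a minimal sketch, not Georgia Tech’s implementation: the single-step dynamics, cost function and temperature parameter are all simplifications chosen for illustration.

```python
# Minimal MPPI-style step on a toy 1-D system: sample many noisy control
# candidates, score each rollout with a cost, then take a softmax-weighted
# average as the control to apply. Real MPPI repeats this over multi-step
# trajectories thousands of times per second.
import math
import random

random.seed(0)
x, target = 0.0, 5.0          # current state and goal position
u_nominal, lam = 0.0, 1.0     # nominal control and temperature

samples = []
for _ in range(2000):         # 2,000 candidate controls, as in the article
    u = u_nominal + random.gauss(0.0, 1.0)   # perturb the nominal control
    cost = (x + u - target) ** 2             # one-step lookahead cost
    samples.append((u, math.exp(-cost / lam)))  # low cost -> high weight

total = sum(w for _, w in samples)
u_best = sum(u * w for u, w in samples) / total  # weighted-average control
x += u_best                   # apply it; the controller then re-plans
print(round(x, 2))            # state has moved toward the target
```

The weighted average favors the sampled controls that steer toward the goal, which is how MPPI turns thousands of random rollouts into one smooth handling decision per control cycle.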