The Electronics Industry Replies to COVID – Part One

Since the pandemic hit our shores earlier this year, we’ve been keeping our eye out for ways in which the electronics industry has responded. Several of our earlier posts have addressed this. In Tech Takes on the Coronavirus, I wrote about two Italian innovators who’d come up with a way to use 3D printers to create valves needed for ventilators. Oura Rings Enlisted in the Fight Against COVID talked about this technology, which the NBA planned to use to keep players in their bubble safe. Most of us have gotten used to social distancing, and in this post I took a look at some applications that enable social distancing in a retail setting.

EDN shares our interest, and they recently asked a provocative question: “Has the electronics industry done enough tackling the pandemic?”

To answer this and other questions, EDN/AspenCore pulled together a series on “real design solutions related to diagnostics, treatment, surveillance and prevention of the spread of the novel coronavirus,” which I’ve been reading on EE Times.

In this post, I’m providing a very brief summary of what’s in the first article in their series. (The remaining articles will be fodder for future posts.)

Majeed Ahmad offered a “sneak peek into the brand-new design ecosystem built around the fight against the coronavirus pandemic.” He began with automated ways to detect and maintain social distancing, and the role that photoelectric proximity sensors, combined with miniature programmable logic controllers, are playing. The sensors determine which way the traffic is flowing, and the PLCs count the number of people coming and going and provide red light-green light data.
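The counting logic is simple enough to sketch. Here’s a minimal, purely illustrative Python version of that red light-green light scheme, assuming a doorway with two beam-break sensors. The sensor names and the capacity figure are my own invented stand-ins, not taken from any particular PLC product:

```python
# Sketch of the red light-green light occupancy logic described above.
# Sensor labels "A"/"B" and the capacity threshold are illustrative
# assumptions, not from any specific sensor or PLC vendor.

class OccupancyCounter:
    def __init__(self, capacity):
        self.capacity = capacity
        self.count = 0

    def on_beam_break(self, first_sensor, second_sensor):
        """Two photoelectric sensors: A on the outside, B on the inside.
        A-then-B means someone entered; B-then-A means someone left."""
        if (first_sensor, second_sensor) == ("A", "B"):
            self.count += 1
        elif (first_sensor, second_sensor) == ("B", "A"):
            self.count = max(0, self.count - 1)

    def light(self):
        """Green while under capacity, red once the limit is reached."""
        return "green" if self.count < self.capacity else "red"


door = OccupancyCounter(capacity=2)
door.on_beam_break("A", "B")     # one person enters
print(door.count, door.light())  # 1 green
door.on_beam_break("A", "B")     # second person enters -> at capacity
print(door.count, door.light())  # 2 red
door.on_beam_break("B", "A")     # someone leaves
print(door.count, door.light())  # 1 green
```

A real deployment would debounce the sensors and handle two people crossing at once, but the trigger-order trick is the heart of it.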

The next element of the coronavirus design ecosystem he introduces is COVID testing via camera. Using thermal imaging cameras to take someone’s body temperature is a fast, efficient, and contact-free way to figure out who’s running a fever. An alarm is triggered, and the person with an elevated temperature can be taken aside for further screening.
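The screening step itself boils down to a threshold check over a grid of temperature readings. A toy sketch follows; the 38.0 °C alarm threshold is my own assumption, and real systems calibrate against ambient conditions and the offset between skin and core temperature:

```python
# Toy fever-screening pass over one thermal "frame" (a grid of skin
# temperatures in degrees C). The 38.0 C threshold is an illustrative
# assumption, not a value from any deployed system.

ALARM_THRESHOLD_C = 38.0

def screen_frame(frame):
    """Return the hottest reading and whether it should trigger the alarm."""
    hottest = max(max(row) for row in frame)
    return hottest, hottest >= ALARM_THRESHOLD_C

frame = [
    [36.4, 36.6, 36.5],
    [36.7, 38.2, 36.8],   # one elevated reading
    [36.5, 36.6, 36.4],
]
hottest, alarm = screen_frame(frame)
print(hottest, alarm)   # 38.2 True
```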

A third area focuses on pre-diagnostic screening, and I found this one particularly interesting. The one Majeed writes about is predicated on analyzing samples of coughing sounds to “identify unique cough patterns associated with COVID-19 infections.”

SensiML, a subsidiary of QuickLogic, which develops software solutions for ultra-low-power IoT end-points, is crowdsourcing the cough samples from healthcare facilities to create large datasets. Next, it’s going to apply the AI-based cough analysis algorithms to these datasets to build an efficient COVID-19 screening mechanism.
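SensiML hasn’t published its models, so purely as illustration, here’s the kind of low-level audio feature extraction (per-frame energy and zero-crossing rate) that cough-analysis pipelines typically start from before any AI model gets involved. The synthetic “recording” below is just noise standing in for a cough burst:

```python
import random

# Illustrative only: SensiML's actual cough-analysis algorithms aren't
# public. This shows typical low-level features (per-frame energy and
# zero-crossing rate) that audio classification pipelines often compute
# before feeding a model.

def frame_features(samples, frame_len=256):
    feats = []
    for i in range(0, len(samples) - frame_len + 1, frame_len):
        frame = samples[i:i + frame_len]
        energy = sum(s * s for s in frame) / frame_len
        zcr = sum(
            1 for a, b in zip(frame, frame[1:]) if (a < 0) != (b < 0)
        ) / (frame_len - 1)
        feats.append((energy, zcr))
    return feats

# Synthetic stand-in for a cough recording: silence, then a noisy burst.
random.seed(0)
samples = [0.0] * 512 + [random.uniform(-1, 1) for _ in range(512)]
feats = frame_features(samples)
# The burst frames carry far more energy than the silent ones.
print(feats[0][0], feats[2][0])
```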

Given that we all get a bit panicked when someone near us starts coughing – even if they’re wearing a mask and six feet away – it will be great if someone could figure out how to detect a possible COVID infection as opposed to a tickle in the throat.

The final part of the ecosystem Majeed sees emerging is wireless patches for contact tracing. These wearables would be equipped with temperature sensing chips. “The low-cost and no-touch temperature sensing tags can remotely sense body temperature and ensure that people with symptoms—having placed the patch on the skin—can stay at home.”

So far, all the articles I’ve read in this series have been worth the read. The electronics industry isn’t going to cure COVID-19, but it’s certainly going to help us work our way through the pandemic while we await a viable vaccine. And it will help us be better prepared to respond to anything that happens in the future.

 

On the road to autonomous vehicles

As a car enthusiast, and a devoted techie, I’ve been an avid follower of all the news about autonomous vehicles. While we’re definitely on the road to autonomous vehicles, the road is long and winding. And there are speedbumps along the way. But as everyone who’s sat behind the wheel – hands off! – while their car perfectly executed a parallel parking maneuver knows, we’ll get there.

While a lot of media attention has gone to experiments with fully autonomous driving, “getting there” is going to arrive thanks to incremental changes that have been making their way into consumer vehicles for a while now: cruise control, alerts when you’re veering out of your lane, backing up guidance. The Society of Automotive Engineers has established a six-level continuum that defines automated driving. It ranges from Level 0, where the driver is in full control (but with light assistance like automatic emergency braking), up to Level 5, where the driving is 100% left to the control system, and the human “driver” is for all intents and purposes a passenger.

The available technology currently puts us at Level 2. At this level, the driver still needs to pay attention and take care of the driving tasks. But it includes Active Driving Assistance, with adaptive, or dynamic, cruise control (with speed automatically adjusted so that the vehicle stays a safe distance from the vehicles ahead) and lane keeping assistance working together.

How’s Active Driving Assistance working out? Well, according to AAA, the state-of-the-art puts things at a considerable distance from 100% reliability.

AAA automotive researchers found that over the course of 4,000 miles of real-world driving, vehicles equipped with active driving assistance systems experienced some type of issue every 8 miles, on average. Researchers noted instances of trouble with the systems keeping the vehicles tested in their lane and coming too close to other vehicles or guardrails. AAA also found that active driving assistance systems, those that combine vehicle acceleration with braking and steering, often disengage with little notice – almost instantly handing control back to the driver. A dangerous scenario if a driver has become disengaged from the driving task or has become too dependent on the system. (Source: AAA)

For their study, AAA tested out active driving assistance systems in routine scenarios, under both real-world and closed-course settings.

On public roadways, nearly three-quarters (73%) of errors involved instances of lane departure or erratic lane position. While AAA’s closed-course testing found that the systems performed mostly as expected, they were particularly challenged when approaching a simulated disabled vehicle. When encountering this test scenario, in aggregate, a collision occurred 66% of the time and the average impact speed was 25 mph.

Their recommendation? When it comes to Active Driving Assistance, we’re still in the early development stage. So proceed with caution.

Moving technology that’s not quite there yet into the public realm is dangerous, especially given the tendency to oversell new tech. Because of the hype, drivers may not fully grasp that they still need to stay engaged and be ready to take over full control. Another problem with getting Active Driving Assistance out there is that bad experiences will turn folks off to autonomous driving – even once it’s been perfected. As it stands, a 2020 AAA survey found that only 12% of drivers “would trust riding in a self-driving car.” All things considered, this is not surprising. But the automotive industry doesn’t want to see acceptance stall once the technology is more proven.

In the meantime, Active Driving Assistance is a step along the way to the fully autonomous vehicle. Can’t wait for more!

 

The Perseverance rover is headed for Mars. Safe journey, Percy!

Somewhat lost in all the pandemic and political news is NASA’s launch of the Mars 2020 Perseverance rover (nickname: Percy), which on July 30th headed off on its seven-month trip to the Red Planet, where it will join Curiosity, which touched down in August 2012 and is still active. (The earlier rovers – Sojourner (1997) and Spirit and Opportunity (2004) – are no longer operational.)

For all you space/science/tech nerds, NASA has a great website full of all sorts of information on this mission, but this space/science/tech nerd was drawn to the section on the instruments that Percy carries.

As you can see, it’s fully loaded with “state-of-the-art tools for acquiring information about Martian geology, atmosphere, environmental conditions, and potential biosignatures.”

Mastcam-Z is the camera system mounted on the mast. It’s got panoramic and stereoscopic imaging capabilities that are deployed to take high-definition 3D pictures and video of objects. It has a zoom function that lets it take a look at objects at a distance. Mastcam-Z is on the lookout for objects of interest, rocks and soil that the earthbound scientists can choose to grab and (once back on Earth) examine for signs of life. Mastcam-Z will also be used to look for evidence of earlier water-related features (lakes, streams) that might have supported life.

MEDA (Mars Environmental Dynamics Analyzer) is “a set of sensors that will provide measurements of temperature, wind speed and direction, pressure, relative humidity and dust size and shape.” Its role is to check out the weather conditions that will let future human Mars travelers know what to expect.

MOXIE (Mars Oxygen In-Situ Resource Utilization Experiment) is what will enable Mars explorers to convert Martian carbon dioxide – which accounts for approximately 96% of the gas in Mars’ atmosphere –  to oxygen that they can use to breathe, and to propel the Martian vehicle off of Mars so that those first explorers can get back home safely. “MOXIE makes oxygen like a tree does. It inhales carbon dioxide and exhales oxygen.” Nice image!

PIXL (Planetary Instrument for X-ray Lithochemistry) is used “to measure the chemical makeup of rocks at a very fine scale,” and determine whether there are any signs of earlier microbial life. It also contains a camera capable of taking really close-up close-ups. It can really do some very fine-grained work. “PIXL detects over 20 chemical “fingerprints” – even when the amount is only a few parts per million. It finds the exact tiny spot in a rock where each chemical is.”

RIMFAX (Radar Imager for Mars’ Subsurface Experiment) is “ground-penetrating radar that will provide centimeter-scale resolution of the geologic structure of the subsurface,” which will make Percy the first of the rovers to let scientists see what lies beneath. It’s designed to “detect ice, water or salty brines more than 30 feet (10 meters) beneath the surface of Mars.”

SHERLOC (Scanning Habitable Environments with Raman & Luminescence for Organics and Chemicals) is another first for Mars: the first UV Raman spectrometer to land there. It will be used for “fine-scale detection of minerals, organic molecules and potential biosignatures.” Just like its namesake Sherlock Holmes, SHERLOC uses a magnifying glass to check out fine details. It will also carry spacesuit material to help determine how well it withstands the Martian environment. (And I thought Syracuse winters are bad…)

SuperCam “is a remote-sensing instrument…that uses remote optical measurements and laser spectroscopy to determine fine-scale mineralogy, chemistry, and atomic and molecular composition of samples encountered on Mars.” It includes a laser that can clear away dust so that the other instruments can get a better view of an object. It also measures dust hazards so that humans on Mars will know whether they contain elements that might do them some harm.

I’ll be keeping my eye on Percy, and look forward to next year when it starts reporting back from Mars. Safe journey!

Keeping Hackers Out of Industrial Control Systems

It seems that there’s always a hacking story in the news, the most recent being a hack of Twitter accounts which you’ve likely read about. Hackers managed to con a couple of Twitter employees into giving them access to their credentials. They used these credentials to make their way past security and send out some bogus tweets – from some pretty big-name accounts, including Bill Gates, Elon Musk, and Apple – that linked to a bitcoin scam. To some extent, this attack seemed pretty ludicrous, but given how much important information is conveyed via tweets these days, the ease with which the hackers breached Twitter security and made some trouble is sobering.

And, of course, gets me thinking about how hackers can do some truly terrible damage. One of the most critical areas where hackers could do major harm is industrial control systems, which pretty much oversee all elements of our most vital infrastructure and industrial production. Think power generation, water treatment facilities, transportation systems, defense…If these systems get hacked, the damage could be far greater than a few folks getting duped into buying/selling cryptocurrency.

Security experts are continuously looking for new and better ways to prevent hackers from succeeding. In (virtually) thumbing through some (virtual) back copies of IEEE Spectrum, I came across one approach that seems promising. In her post, New Approach Could Protect Control Systems from Hackers, Michelle Hampson wrote about an “algorithm [that] creates ‘background noise’ during data transmission to alert officials to hacking.”

Because the major industrial control systems transmit data so rapidly, “hackers need interfere with the transmission of real-time data only for the briefest of moments to succeed in disrupting these systems.” The example Hampson uses underscores the seriousness of these types of threats: the 2010 Stuxnet attack in which hackers damaged over 1,000 centrifuges used in Iran’s uranium enrichment facility. Researchers have now developed a new means to quickly figure out that an incursion is occurring so that a system can be shut down before too much harm comes to it.

The researchers – Zhen Song of Siemens and his colleagues – drew on “the concept of ‘watermarking’ data during transmission, a technique that can indicate when data has been tampered with” and applied it to industrial control systems.

When hackers try toying with how data are being transmitted, the watermark signal is altered and the system is alerted that it’s under attack.

The approach involves the transmission of real-time data over an unencrypted channel, as conventionally done. In the experiment, a specialized algorithm in the form of a recursive watermark (RWM) signal is transmitted at the same time. The algorithm encodes a signal that is similar to “background noise,” but with a distinct pattern. On the receiving end of the data transmission, the RWM signal is monitored for any disruptions, which, if present, indicate an attack is taking place. “If attackers change or delay the real-time channel signal a little bit, the algorithm can detect the suspicious event and raise alarms immediately,” Song says.
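To make the idea concrete, here’s a toy sketch of watermark-style tamper detection – not Song’s actual RWM algorithm, just the general shape of it. The sender superimposes a low-amplitude pseudorandom sequence generated from a seed the receiver shares; the receiver correlates what arrives against the watermark it expects, and a collapsed correlation means the channel has been altered or replayed:

```python
import random

# Toy watermark tamper detection (illustrative; not the actual RWM
# algorithm from the Siemens work). Sender adds a low-amplitude
# pseudorandom +/-1 sequence, seeded identically at both ends, on top
# of the real-time signal. Receiver correlates against the expected
# watermark; an attacker who replays or overwrites the channel destroys
# the correlation.

SEED, N, AMPLITUDE = 42, 1000, 0.1

def watermark(seed, n):
    rng = random.Random(seed)
    return [rng.choice((-1.0, 1.0)) for _ in range(n)]

def send(signal, seed=SEED):
    wm = watermark(seed, len(signal))
    return [s + AMPLITUDE * w for s, w in zip(signal, wm)]

def correlation(received, seed=SEED):
    wm = watermark(seed, len(received))
    return sum(r * w for r, w in zip(received, wm)) / len(received)

signal = [0.0] * N               # idle channel, for clarity
clean = send(signal)
tampered = [0.0] * N             # attacker injects a flat replay
print(correlation(clean))        # ~0.1: watermark present, all clear
print(correlation(tampered))     # ~0.0: watermark gone, raise the alarm
```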

Initial tests have been promising.

Whatever thwarts hackers from damaging the world, I’m all for it!

Keep your distance! Technology for maintaining social distancing.

If you’ve been in a grocery, pharmacy, or big box store recently – or lined up outside a Starbucks – you’ve seen the duct-taped X’s telling you where to stand to keep your social distance from the person in front of you. And if you’ve been taking walks – or just walking down the aisle in the grocery store – you’ve probably gotten pretty good at gauging just what six feet away looks like.

Not surprisingly, as the pandemic – and, with it, social distancing requirements – continues on, there will be technology-based solutions that can save us from having to rely on the tape measure and the eyeball.

I recently came across a post on the topic, from the perspective of retail applications, written by Andrea Berry of STMicroelectronics and published on embedded.com.

Presence-detection solutions are not one-size-fits-all. The core sensors and technologies required to accomplish the variety of features and use cases are as diverse as the applications and environments in which they are used. In each case, system architectures must take into consideration environmental factors; the shape, structure, and layout of the space; power availability; node-to-node communications requirements and constraints; and data and information security.

Here are the areas that Berry notes are likely candidates for technical solutions, and what those solutions might look like:

Occupancy-density indication

Many stores limit the number of shoppers who can be in the store at the same time. But once they’re in there, how do you keep them from jamming the aisles and clumping together? In stores that are set up using aisle-based layouts – and this covers pretty much all grocery stores, chain pharmacies, and big-box stores, but not department stores or boutique shops:

Simple motion-detection technology, similar to what is used in home security systems (typically based off of passive infrared, time-of-flight, microwave, mirror optic, or ultrasonic sensors, or a combination thereof), could be used at each end of the aisle to determine when someone is entering or exiting.

Alerts could be triggered if someone were entering the wrong way – many stores are now using tape arrows to indicate which direction you should be rolling your cart in – or if the capacity of the aisle were reached and shoppers would not be able to socially distance. This type of solution wouldn’t do anything to prevent anyone from getting in your space when you’re reaching for that last box of pasta, but it could be used to keep shoppers moving in the right direction and to limit the number of shoppers in each aisle.

“Designing a variety of proximity sensing building blocks requires a range of sensor, power management, and connectivity solutions.”

Absolute social-distancing measurement

There are a number of different technologies that can be used to solve the social distancing problem: optical sensors, radar/LIDAR, pressure sensors, microphone arrays, RFID localization and zoning, and Wi-Fi/Bluetooth.

Given the presence of security cameras, stores may already have the infrastructure needed to deploy optical sensors.

Image processing may be performed at the edge by the sensor node, with real-time alert at the sensor if there was an occupancy exception in that area. …Both visual and thermal optical sensors could be used in such an application.

Like optical sensors, “radar and LiDAR could be used to accurately pinpoint the location of every person within a given space.” The advantage here is that they are longer range than optical sensors, so you’d need fewer of them. The disadvantage is that stores with aisle formats have a lot of obstructions, so deployment could be tricky.

Pressure sensors (MEMS, strain gauge) could be embedded in the flooring to detect when shoppers are getting too close to each other for comfort, but this would be a costly solution that might be overkill when it comes to social distance monitoring.

Microphone arrays, with ultrasonic microphones picking up the sound a shopper’s making, could work for social distancing but the work (signal conditioning, intelligence processing) involved in implementing a solution may not be worth it.

Carts and baskets could be given an RFID tag, which could be used to determine where those carts and baskets are. But, as anyone who’s ever done any shopping knows, people do from time to time step away from their carts, and it doesn’t do much good, health-wise, if the carts are socially distanced but the living, breathing shoppers aren’t.

While there are privacy issues involved, Wi-Fi and Bluetooth location determination could be easily implemented, given that nearly everyone has a cellphone with them and given that many stores already track patrons entering when their mobile starts looking for a Wi-Fi signal. Not only could a Wi-Fi approach be useful for social distancing, but it could also be used for contact tracing if needed.  
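As a rough illustration of how signal strength could stand in for distance, here’s the standard log-distance path-loss model in a few lines of Python. The reference power and path-loss exponent below are my own assumptions; any real deployment would calibrate both per site:

```python
# Rough sketch: turning a phone's Wi-Fi/Bluetooth signal strength
# (RSSI, in dBm) into a distance estimate via the log-distance
# path-loss model. Both constants below are illustrative assumptions;
# real systems calibrate them for each environment.

RSSI_AT_1M_DBM = -40.0   # received power at 1 m (assumed)
PATH_LOSS_EXP = 2.5      # indoor path-loss exponent (assumed)

def estimate_distance_m(rssi_dbm):
    return 10 ** ((RSSI_AT_1M_DBM - rssi_dbm) / (10 * PATH_LOSS_EXP))

def too_close(rssi_between_phones, limit_m=1.8):   # ~six feet
    return estimate_distance_m(rssi_between_phones) < limit_m

print(round(estimate_distance_m(-40.0), 2))   # 1.0 (meters)
print(too_close(-45.0))    # True: strong signal, under two meters apart
print(too_close(-60.0))    # False: weak signal, comfortably distanced
```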

Social distancing requirements will likely be with us for a long time to come. So far, we’ve been relying on old-fashioned methods to maintain it. It won’t be long before the technology to enable social distancing comes along.

Oura rings enlisted in the fight against COVID

Remember mood rings? They were a ’70s fad that gets recycled every once in a while. The rings contained liquid crystal, which changed color when the temperature of your finger changed. Red meant you were excited. Gray, nervous. Blue, calm. The science was a bit sketchy, but there was something to it. Anxiety sends blood towards your body’s core, decreasing the temperature in your extremities. And now there are rings that are health wearables, with sensors that measure temperature and heart rate with fairly good accuracy.

There are a number of such rings on the market, one of which is the Oura. The Oura ring is pretty lightweight – it weighs less than .25 oz – but it packs a lot in there:

Infrared LEDs, NTC temperature sensors, an accelerometer, and a gyroscope all wrapped around your finger — the most precise and convenient place to capture body measurements like heart rate, HRV, temperature, steps, and more.

The Oura has good battery life, lasting up to 7 days, and recharges wirelessly.

All in all, it sounds like a reasonably good device.

But it’s not getting attention simply for being a reasonably good device.

Research is underway at a number of universities to see how it can be enlisted in the fight against coronavirus by early detection of COVID-19 symptoms.

And the NBA recently announced that when their season resumes in late July in the bubble at the Disney World Resort in Orlando, players will have the option of wearing an Oura ring.

One completed study – from the West Virginia University Rockefeller Neuroscience Institute (RNI) – tapped more than 600 healthcare professionals and first responders to participate in its first phase, and shows promising results:

According to RNI’s results from its first phase of the study, released in late May, their technology – which uses a combination of an app, the Oura Ring and artificial intelligence models – was able to predict the onset of COVID-19 symptoms with over 90% accuracy.  (Source: CBS News)

The second phase of the study has been opened up to 10,000 participants.

Meanwhile, the NBA will also be deploying other technology.

…players will be receiving several different pieces of tech and safety equipment to utilize while in the Disney bubble, including a Disney Magic Band to be worn at all times and used to get through checkpoints and into rooms, social distancing alarms that will go off if players are too close to each other for too long, and these “smart” rings that they will have the choice to wear or not. (Source: Complex.com)

Basketball may not be a full contact sport, but the players get pretty up close to each other when they’re on the floor, so I’m wondering how this will work.

In any case, for those who’ve been missing sports during the lockdown, the NBA’s season making a comeback is welcome news. And it’s good to hear that technology is getting credited with an assist.

Underwater energy harvesting

I’m always bookmarking articles of interest, and in going through my backlog, I just came across one on underwater energy harvesting that appeared in EE Times this past February. (And speaking of February, doesn’t that seem a million years ago?)

In the article, Bill Schweber describes a new approach that some MIT researchers are using.

The team combined two very different phenomena — the piezoelectric effect and backscattering — to provide a modest data-rate, battery-free underwater sensor and data link, which they call a Piezo-Acoustic Backscatter (PAB) system. Backscatter itself is a well-known technique often used with passive RFID and other systems; it uses directed, impinging energy to stimulate, power, and provide a response, usually in the electromagnetic RF world. (Source: EE Times)

As Bill points out, the idea of energy harvesting – tapping sources of energy like wind, solar, underwater currents, and sea water – is inherently appealing. The sources are free for the asking, and, unlike oil and other extractive energy sources, are renewable. But the cost of developing free sources is high, which is why the work being done by the MIT team and others to make energy harvesting more feasible is so crucial – even if the MIT work is not aimed at providing energy at scale to fuel the grid.

The applications for this project are largely around oceanography: sea temperature monitoring to track climate change and marine life movement. The solution being developed by MIT researchers will allow scientists to deploy large networks of sensors that can operate for prolonged periods without human intervention. These networks of sensors, something of an IoT that runs underwater, gather data and transmit it to devices on the surface.

Here’s how it works:

In the MIT team’s PAB system, a transmitter sends directed acoustic (pressure) waves through water toward a submerged piezoelectric sensor and circuit that has stored the sensed data — which could be water temperature, flow, salinity, or other parameter of interest. This submerged node has a circuit board that houses a piezoelectric resonator, an energy-harvesting unit, and a microcontroller. When this energy wave hits the sensor, the piezo material vibrates and stores the resulting electrical charge — that’s the beginning of the energy-harvest cycle. Next, the sensor uses that stored energy to reflect or to not reflect a wave back to a receiver. The receiver sees a reflection as a 1 and a non-reflection as a 0 — and can so decode the serial data stream.

Those 1’s and 0’s, as is so often the case, are key.

“Once you have a way to transmit 1s and 0s, you can send any information,” said co-author Fadel Adib, an assistant professor in the MIT media lab and the department of electrical engineering and computer science and founding director of the Signal Kinetics Research Group. “Basically, we can communicate with underwater sensors based solely on the incoming sound signals whose energy we are harvesting.”
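The receiver side of that reflect/don’t-reflect scheme can be sketched in a few lines. The echo-strength threshold here is my own illustrative assumption; a real receiver would do proper signal detection:

```python
# Sketch of the receiver side of the PAB link described above: each
# time slot either contains a reflected acoustic pulse (bit 1) or
# doesn't (bit 0). The threshold is an illustrative assumption.

ECHO_THRESHOLD = 0.5

def decode_slots(echo_strengths):
    """Map per-slot echo amplitudes to bits: reflection -> 1, none -> 0."""
    return [1 if e >= ECHO_THRESHOLD else 0 for e in echo_strengths]

def bits_to_bytes(bits):
    return bytes(
        sum(b << (7 - i) for i, b in enumerate(bits[k:k + 8]))
        for k in range(0, len(bits), 8)
    )

# Echo strengths for one transmitted byte (0b01001101 = ASCII 'M').
echoes = [0.1, 0.9, 0.2, 0.1, 0.8, 0.9, 0.1, 0.9]
bits = decode_slots(echoes)
print(bits)                 # [0, 1, 0, 0, 1, 1, 0, 1]
print(bits_to_bytes(bits))  # b'M'
```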

So far the MIT research has been done in the lab, not in the ocean. The next step is to test in real-world conditions, where water salinity and wave movements will vary.

Overall, an interesting story that I’m glad I took off the virtual shelf of my backlogged, bookmarked library.

Your smartphone’s getting even smarter

When the pandemic hit, I’m sure I wasn’t alone in making sure I knew where the thermometer was, replacing the battery on the pulse oximeter, and checking the expiration dates on the cold medicine in the cabinet. Your home “doctor’s bag” may well have a blood pressure monitor in it: a device with an inflatable arm cuff attached to it. Once the cuff’s inflated and deflation begins, your systolic and diastolic pressure are measured on the way down.

Since the cuff technique for measuring BP was devised more than 120 years ago, this technology has been pretty much the only non-invasive way to measure blood pressure.

Now a Swiss-based company, Leman Micro Devices, has come up with a cuff-less solution. Rather than measure an upper-arm artery, Leman’s e-Checkup with V-Sensor gauges BP – with clinical accuracy – through the fingertip. It also measures other vital signs: temperature, blood oxygen, heart rate and respiration rate. And it works on your smartphone. Talk about a “doctor’s bag”.

There are three parts to LMD’s health sensing solution: the V-Sensor hardware module, eCheckup software for gathering and analyzing data, and the server where data are stored and where diagnostic information is offered.

The V-Sensor has a pressure sensor embedded in a soft, flat epoxy resin that is pressed against the fingertip. LEDs and a photodiode form a conventional pulse oximeter that illuminates the tip of the finger. A thermopile measures heat radiation to estimate body temperature. The application-specific integrated circuit designed by LMD controls the LEDs; captures and digitizes data from the photodiode, pressure sensor, and thermopile; and communicates with the mobile phone processor. (Source: article by Maurizio di Paolo Emilio on embedded.com)
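The pulse-oximeter portion of that description rests on a well-known calculation: compare the pulsatile (AC) to steady (DC) light absorption at the red and infrared LEDs. Here’s a sketch using the common textbook linear calibration – emphatically not LMD’s actual calibration:

```python
# Illustrative "ratio of ratios" calculation behind a pulse oximeter
# like the one the V-Sensor's LEDs and photodiode form. The linear
# calibration (SpO2 ~ 110 - 25*R) is a commonly quoted textbook
# approximation, not LMD's actual calibration curve.

def spo2_estimate(ac_red, dc_red, ac_ir, dc_ir):
    """Estimate oxygen saturation (%) from the pulsatile (AC) and
    steady (DC) photodiode readings at the red and infrared LEDs."""
    r = (ac_red / dc_red) / (ac_ir / dc_ir)
    return 110.0 - 25.0 * r

# Typical healthy-finger readings give R near 0.5, i.e. SpO2 ~97-98%.
print(spo2_estimate(ac_red=0.01, dc_red=1.0, ac_ir=0.02, dc_ir=1.0))
```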

To take your blood pressure, you just hold your phone, placing your index finger on the V-Sensor – which is pretty tiny, only 15 mm long, and sits on the back of the phone – and your thumb on the front screen. The e-Checkup app lets you know whether you’re applying too much pressure or not enough. In less than a minute, the reading is taken. No worrying about whether you’ve got the cuff in the right place.

And there’s even more:

LMD’s research and development have shown that V-Sensor data can be combined with accelerometer data in the phone when held against the chest to find the timing of the heart function: left ventricular expulsion time (LVET), aortic and mitral valve timing, and many others. Further measurements may combine images from the phone’s camera.

All this is well beyond what’s possible with other wearables and health apps, and promises to be a real breakthrough. When people can monitor their vitals so easily, without having to acquire and learn to use all sorts of separate devices, they’ll be able to keep much better track of their health. As will their physicians. Right now, we’re seeing more and more use of telehealth, through which many routine appointments are handled over the phone or via conferencing software. The ability to remotely provide such a wealth of health information through a simple smartphone application is going to greatly improve the quality of virtual medicine.

Bottom line: another great technological approach to making real improvements to life.

Now if only they can come up with a way to determine whether someone has COVID-19, or has survived a bout with it and built-up antibodies.

 

Send in the drones!

Like those belonging to pretty much everyone else, the electronics devices and gadgets I rely on occasionally run out of juice. I forget to charge my phone. Or my smart watch. I misjudge how long my laptop has been running untethered, and now I’m finding that I’m running on empty. Sure, it’s not as terrible as realizing that the gas gauge is telling you that you have 32 miles left in the tank – and passing a sign on the highway informing you that the next rest stop is 34 miles away. Which means pulling off the highway near some pokey town – when all you want to do is get home! No, charging those devices tends to be a pretty quick and painless fix. Annoying when the battery runs out, but generally a problem that’s easy enough to remedy.

This is true for personal devices, and to a slightly lesser extent true for devices used in commercial or industrial settings. But there are some situations in which the devices are deployed in an environment where it’s not so simple to recharge or replace their sensors when the batteries start to fade. Think of, say, the security perimeter around a large energy, agricultural or military installation that’s remote and isolated.

Here, drones may be coming to the rescue.

In a recent IEEE Spectrum article, writer Michelle Hampson discusses a system through which drones in flight can recharge sensors:

A new system is designed to make the charging process easier by using unmanned aerial vehicles (UAVs) to deliver power using radio waves during a flyby. A specialized antenna on the sensor harvests the signals and converts them into electricity.

The researchers behind this, including Joseph Costantine and others at the American University of Beirut, were looking at using radio frequency waves to recharge sensors, but they faced a significant challenge: getting close enough to the sensor to give it a good charge. The answer they came up with was the UAV, which can get up close and personal with a sensor, hovering right above it and following “an optimized trajectory that maximizes energy transfer.”

In the proposed approach, a UAV transmits radio frequencies to each sensor, which has an antenna for detecting the signals. The signals are then conveyed to a rectifier, which converts the signals into electricity. This power can be used to charge the sensor and/or activate it.

The UAV can also selectively target sensors.

The UAV supports both charging and wakeup (which can be done at a greater distance: 27.5 meters, as opposed to 1.2 meters when charging is involved).
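That gap between wake-up range and charging range makes sense from a basic link-budget standpoint: received RF power falls off with the square of distance. Here’s a back-of-the-envelope Friis free-space calculation; all transmit and antenna parameters are my own assumptions, not the Beirut team’s figures:

```python
import math

# Back-of-the-envelope Friis free-space calculation showing why RF
# wake-up works at tens of meters while useful charging needs close
# range: received power falls with the square of distance. All
# parameters below are illustrative assumptions.

def received_power_w(p_tx_w, gain_tx, gain_rx, freq_hz, distance_m):
    wavelength = 3e8 / freq_hz
    return (p_tx_w * gain_tx * gain_rx
            * (wavelength / (4 * math.pi * distance_m)) ** 2)

P_TX, G_TX, G_RX, FREQ = 1.0, 10.0, 2.0, 915e6   # assumed 915 MHz link

near = received_power_w(P_TX, G_TX, G_RX, FREQ, 1.2)    # charging range
far = received_power_w(P_TX, G_TX, G_RX, FREQ, 27.5)    # wake-up range
print(near / far)   # ~525x more power up close
```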

Now, the team is working to develop a load-independent rectenna (antenna and rectifier) that maintains high efficiency across a wide range of loads and frequencies. 

This solution would be overkill for home use, but I have to admit I’m getting a kick out of imagining a drone flying around my house, stopping to charge the batteries in all the devices I somehow forgot to plug in to the charger!

Mind over matter. In this case, a prosthetic hand.

Mind-controlled prosthetics for amputees have been around for years. And they’re amazing. But the functionality they provide is limited. They can be awkward to use and don’t provide fine motor control. And their use has generally been confined to those who hadn’t just lost a limb, but who were also paralyzed. That’s because it required a direct interface to the brain, an invasive and risky procedure.

Now engineering researchers at the University of Michigan have come up with a way to gain better control by using the nerves present in the residual part of an amputee’s limb. This technique takes advantage of:

… faint, latent signals from arm nerves and amplified them to enable real-time, intuitive, finger-level control of a robotic hand.

To achieve this, the researchers developed a way to tame temperamental nerve endings, separate thick nerve bundles into smaller fibers that enable more precise control, and amplify the signals coming through those nerves. The approach involves tiny muscle grafts and machine learning algorithms borrowed from the brain-machine interface field. (Source: University of Michigan)

This is a major breakthrough. Fingers will now work more like fingers, which will enable users to better manipulate objects. Study participants, working in a University research lab:

…were able to pick up blocks with a pincer grasp; move their thumb in a continuous motion, rather than have to choose from two positions; lift spherically shaped objects; and even play in a version of Rock, Paper, Scissors called Rock, Paper, Pliers.

Beyond being able to play Rock, Paper, Pliers, there are some more practical uses, like the ability to manipulate a zipper. This new approach enables full rotation of the thumb, as opposed to the limited directional possibilities of earlier approaches.

Here’s what the Michigan researchers did to make all this possible:

They wrapped tiny muscle grafts around the nerve endings in the participants’ arms, which gives the nerves new tissue to latch on to. This prevents the growth of nerve masses called neuromas that lead to phantom limb pain. And it gives the nerves a megaphone. The muscle grafts amplify the nerve signals. Two patients had electrodes implanted in their muscle grafts, and the electrodes were able to record these nerve signals and pass them on to a prosthetic hand in real time.

For the user, the manipulation of the prosthetic is done intuitively. They don’t have to think twice, or concentrate hard on the motion they’re trying to achieve. There’s no learning curve. The learning is all done thanks to machine learning algorithms.
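As a minimal illustration of the decoding idea – a learned mapping from an amplified nerve/muscle signal to a finger position – here’s a one-parameter linear regression sketch. The Michigan team’s real-time decoders are far more sophisticated; this just shows the calibrate-then-decode shape, with every signal and angle below invented for illustration:

```python
# Minimal sketch of the decoding idea: an amplified nerve/muscle signal
# is reduced to a feature (rectified mean amplitude per window), and a
# learned mapping turns that feature into a finger position. This is a
# one-parameter linear regression for illustration only; the actual
# brain-machine-interface algorithms are far more sophisticated.

def feature(window):
    """Rectified mean amplitude of one window of electrode samples."""
    return sum(abs(s) for s in window) / len(window)

def fit_gain(features, angles):
    """Least-squares gain g minimizing sum((g*f - angle)^2)."""
    return (sum(f * a for f, a in zip(features, angles))
            / sum(f * f for f in features))

# Calibration: windows of known effort paired with measured finger angles.
calib_windows = [[0.1, -0.1, 0.1, -0.1],
                 [0.5, -0.5, 0.5, -0.5],
                 [1.0, -1.0, 1.0, -1.0]]
calib_angles = [9.0, 45.0, 90.0]   # degrees of finger flexion
feats = [feature(w) for w in calib_windows]
g = fit_gain(feats, calib_angles)

# Decode a new window in "real time".
print(round(g * feature([0.8, -0.8, 0.8, -0.8]), 1))   # 72.0 degrees
```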

This is quite an advancement, and opens up all sorts of new possibilities when it comes to the use of prosthetics. As I’ve said more than once, one of the proudest aspects of being an engineer is the engineering work dedicated to improving lives.

If you’re interested in learning more, you should definitely check out this presentation from the University of Michigan, Thought Into Action, which explores this incredible research.

In the meantime, stay well!