These 4 new uses for sensors are changing everything

Richard van Hooijdonk
  • New LiDAR systems provide sight to autonomous vehicles
  • Everyone’s football to benefit from sensor tech
  • Synthetic sensors and the future of smart devices
  • Smart cities made more intelligent by sensor tech

Sensor tech is at the heart of many of the latest innovations, enabling everything from 70 mile-per-hour self-driving cars to smart homes that don’t need expensive gadgets. Futurists are watching these trends with interest, and it’s clear that the future of sensors is nothing short of revolutionary. Tomorrow’s sensors will shield athletes from injury and help us deal with chronic pain without the need for drugs. They’re already helping ease congestion and enabling us to monitor our infrastructure, saving time, money, and lives. They keep these promises by leveraging some pretty cool tech from a broad range of fields, requiring scientists, engineers, and designers to work together. As Timothy Swager, the John D. MacArthur Professor in the Department of Chemistry at MIT, explains, “If you look at what’s happening with sensors, you’ll see that many different disciplines have to come together. Ubiquitous sensing has so many aspects — chemical, biological, physical, radiological.”

New LiDAR systems provide sight to autonomous vehicles

In a recent study of traffic in the UK, INRIX revealed that congestion cost the country a shocking £30.8 billion in 2016 alone. Numbers like these can be hard to grasp, so keep in mind that, spread across the nation’s drivers, traffic steals roughly £1,000 from each British motorist every year. No one has a hard time coming up with better ways to spend that kind of money! But driverless cars promise more than an end to the harrowing commute; they save lives, too. Rebecca Lynn, an investor in this new tech, believes “LiDAR will be able to save more lives than any technology being developed today, from genomics to AI. There are many things that will make our lives easier or better. But looking at tech that can truly save lives, it is autonomous driving. And you cannot have fully safe autonomous driving without LiDAR.”

LiDAR, or light detection and ranging, works by emitting pulsed lasers that bounce off objects and return to sensors, much like the sonar systems submarines use to navigate. By timing how long each pulse takes to come back, the system can calculate exactly how far away an object is. Without these systems, even the most connected cars would be blind to pedestrians, dogs, bicycles, and other hazards. The problem to date has been the poor range and reliability of car-based LiDAR. For instance, conventional LiDAR systems have a hard time seeing dark objects in low light or functioning in inclement weather like rain or fog.
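
To make the idea concrete, here’s a minimal sketch of the time-of-flight calculation at the core of any LiDAR unit. It’s purely illustrative, not any vendor’s actual signal-processing code: the distance to an object is simply the pulse’s round-trip travel time multiplied by the speed of light, then halved.

```python
# Illustrative time-of-flight calculation for a single LiDAR return.
# A simplified sketch, not any vendor's actual signal-processing pipeline.

SPEED_OF_LIGHT = 299_792_458  # metres per second

def distance_from_return(round_trip_seconds: float) -> float:
    """Distance to an object, given the round-trip time of a laser pulse."""
    # The pulse travels to the object and back, so halve the total path length.
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A pulse returning after roughly 1.33 microseconds implies an object ~200 m away.
print(f"{distance_from_return(1.334e-6):.1f} m")
```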

But Luminar Technologies Inc. has a solution, and with $36 million in venture capital, the company has built its own system from the ground up. The result, tested by fitting the Luminar system to a variety of off-the-lot cars, outperforms its competitors by a wide margin. Conventional systems see only 35 to 50 metres ahead, limiting self-driving vehicles to a paltry 40 mph so they have enough time to react. Luminar’s LiDAR, by contrast, has no problem spotting bicyclists, mannequins dressed in dark colours, and even a black tarp in dim light at 200 metres, allowing driverless cars to travel at normal highway speeds while remaining safe. This is a huge breakthrough, changing the calculus that drives the adoption of autonomous vehicles.
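
The connection between sensing range and safe speed is easy to see: a car can only drive as fast as its ability to spot an obstacle, react, and brake within the distance its sensors cover. The sketch below illustrates that relationship; the reaction time and braking deceleration are assumed, ballpark figures for illustration, not Luminar’s or anyone else’s published numbers.

```python
import math

# Rough estimate of the fastest speed a vehicle can safely travel, given how far
# ahead its LiDAR can see. Reaction time and deceleration are assumed values.

REACTION_TIME_S = 1.5     # seconds before the vehicle begins braking
DECELERATION_MS2 = 6.0    # hard braking on a dry road, in m/s^2

def max_safe_speed(sensing_range_m: float) -> float:
    """Largest speed (m/s) whose reaction + braking distance fits in the range.

    Solves v * t + v^2 / (2 * a) = range for v, keeping the positive root.
    """
    a, t = DECELERATION_MS2, REACTION_TIME_S
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * sensing_range_m)

for sensing_range in (50, 200):
    mph = max_safe_speed(sensing_range) * 2.23694  # convert m/s to mph
    print(f"{sensing_range} m of range supports roughly {mph:.0f} mph")
# ~50 m of range caps safe speeds near 40 mph; ~200 m allows highway speeds.
```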

Everyone’s football to benefit from sensor tech

Scientists and physicians have recently uncovered a hidden danger in America’s favourite sport: traumatic brain injuries in football are surprisingly common. When two giant men crash into each other again and again, every Sunday for a season, even the best helmets aren’t enough to prevent damage. Careful brain scans of former players reveal that as many as 40% suffer from brain injuries sustained on the field, often including chronic traumatic encephalopathy, which can leave its victims demented and suicidal.

But the National Football League is experimenting with sensor-embedded helmets that would alert doctors on the sidelines to concussions even before players know they’ve been hurt. The goal is to offer prompt care, but also to recognise injuries that might go unnoticed by these tough men in the heat of competition. In the end, the hope is that lives can be saved and the game can go on.

FIFA, too, has adopted sensors to augment the role of line judges, approving chip-embedded soccer balls that tell referees when the ball completely crosses the goal line. But the latest push for sensors in soccer is strikingly similar to the concerns of the NFL. Star soccer players are amazingly expensive, with Cristiano Ronaldo of Real Madrid netting £19 million, and Shenhua FC’s Carlos Tevez earning an incredible £32 million per year. At those prices, investing in expensive sensor tech to monitor the health and fitness of players becomes economically viable, and coaches, trainers, and physicians want to know when Tevez is sweating more than he should or whether Ronaldo’s hamstrings show signs of injury.

In the future, sports analytics will be driven by nano-sensors attached to the body, creating an ‘active skin’ of printed micro-sensors that sends big data to AI systems to predict, diagnose, and treat injuries during play, in real time. That kind of tech isn’t as far off as you might think: in 2015, the Quell pain-relief band hit the market; worn around the top of the calf, it uses neurostimulation to quiet knee pain. Drug-free, 24/7 therapy like this is the future of sensors in healthcare.

Synthetic sensors and the future of smart devices

You’re probably already familiar with the idea of smart thermostats like Nest that can adjust the temperature of your home to your preferences. For some, the vision of the future of sensors is populated by legions of smart devices scattered around your house or apartment. Cool as this sounds, it invites some unexpected problems: smart devices can be expensive, especially as they add up; each one collects only the information relevant to its own function; and they don’t always talk to one another as well as we’d like.

But the Future Interfaces Group, an interdisciplinary laboratory at the Human-Computer Interaction Institute at Carnegie Mellon University, thinks it has an answer. Rather than rely on multiple smart sensors embedded in connected devices, an approach called direct or distributed sensing, the group believes it’s easier, cheaper, and more effective to explore the possibilities of a single monitoring point. The goal is to develop a ‘plug and play’ sensor board capable of gathering information on anything you need to run your home, from whether the refrigerator door is open to how much soap you’ve got left. It’s a ‘synthetic’ sensor because it synthesises data, taking raw information and converting it into real knowledge.

Their breakthrough is in ‘virtualisation’: the higher-order organisation of data into something the user understands and cares about. As with big data, raw information on its own is useless; what you want is insight. You don’t care, for instance, about the precise temperature of the air near your kitchen floor, but you do want to know if you’ve left the oven on and open. Their brainchild is a sensor board that makes a ‘dumb’ room smart: a single device that gathers and analyses a wide range of data and turns it into actionable insight.
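
As a rough illustration of that virtualisation step (a hypothetical sketch, not the Future Interfaces Group’s actual software), the idea is to fuse a board’s raw, low-level readings into a named event a person actually cares about, such as ‘the oven is on and open’:

```python
from dataclasses import dataclass

@dataclass
class RawReadings:
    """Raw, low-level signals from a single general-purpose sensor board."""
    ambient_temp_c: float          # air temperature near the board
    temp_rise_c_per_min: float     # how quickly that temperature is climbing
    appliance_hum_detected: bool   # e.g. an acoustic signature of the oven fan

def virtualise(readings: RawReadings) -> list[str]:
    """Turn raw numbers into the higher-level events a person cares about.

    The thresholds below are invented for illustration; a real system would
    calibrate or learn them from data.
    """
    events = []
    if (readings.appliance_hum_detected
            and readings.ambient_temp_c > 30
            and readings.temp_rise_c_per_min > 0.5):
        events.append("oven appears to be on and open")
    return events

print(virtualise(RawReadings(ambient_temp_c=33.0,
                             temp_rise_c_per_min=0.8,
                             appliance_hum_detected=True)))
```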

Smart cities made more intelligent by sensor tech

Montreal, the largest city in the French-Canadian province of Quebec, is among the world’s smartest cities. In 2016, it was awarded the title of “Intelligent Community of the Year”, and since then, it hasn’t rested on its laurels. Montreal faces the problems of all major urban centres: crippling traffic congestion, aging infrastructure, and inclement weather. Its solution is to manage these issues intelligently by pairing the best sensor tech with artificial intelligence (AI).

Collaborating with a private startup, Infra.AI, the city hopes to use real-time information to serve its citizens better. By mounting LiDAR on city vehicles, and by gathering complementary information from drones and satellites, Montreal believes it can better manage its resources and provide the quality of life its residents crave. For example, these sensor systems can spot potholes only 30 cm across and automatically summon a repair crew. Similarly, intelligent traffic information can reroute emergency vehicles to avoid potentially deadly delays. And constant monitoring of critical infrastructure will forewarn Montreal’s government of impending problems. For instance, in partnership with Waze, Google’s crowdsourced traffic app, the city uses more than 500 traffic cameras and 700 smart signals to prioritise public transport, shortening commute times by 15 to 20%.

Even if your city hasn’t embraced sensors and AI, it soon will. No urban centre can afford to ignore the power of these tools to mitigate problems like congestion. And with synthetic sensors set to surge in popularity, this is a trend certainly worth watching. We’re pretty sure that you can look forward to a day in the near future when one simple device can make your home smart. And if that wasn’t enough, advances over the next few years will bring us driverless cars, guilt-free football, and better public services.

Sensors, then, are at the heart of innovations improving our quality of life. From fighting congestion to saving lives, from transforming our homes to keeping score in the World Cup, sensor tech is increasingly essential.
