Inside The Near-Future World Where All Our Data And Machines Are In Constant Communication

Intelligent, predictive, and adaptive, a network of robotic minds will soon start changing every aspect of how we live.

[Illustrations: Kasezo via Shutterstock]

The insurance drone skims above the roadway, humming softly as it tracks your car. Multiple failed login attempts caused by an out-of-date token have prevented the vehicle from downloading a firmware update to its stabilization system. The vehicle broadcast this to all connected stakeholders, so now the drone is watching for unexpected guidance problems. It mainly serves to add a bit more to the evidence stack in case there’s an accident. The car is sending detailed analytics that show problems immediately, and if it posed a risk, the highway’s control system would know and quickly pull the car from the road.


And now the other cars know too. They’re all connected by a wireless mesh net, exchanging data and using an array of sensors to continuously map their surroundings. They move a bit further away in their lanes, their guidance systems heightened to run the probabilities of your vehicle maintaining trajectory, slowing down, or veering off suddenly. In machine time, even an 80 mph event looks slow and manageable.

A fleet of 18-wheelers pushes through the line, snakelike and determined, 10 rigs long and linked together in an efficient swarm formation. The cars change lanes to let the fleet have priority without slowing down the stream. The rigs roar ahead. Suddenly, they break formation, skidding to a stop across the seven-lane freeway, blocking it entirely. The road comes to a halt with all vehicles on alert trying to understand the situation, their radios crackling with static.

“This is a system intervention. We are the People’s Freedom Consortium of New Delhi. We have taken control of this roadway.”

An EMP radiates out from the truck wall, knocking all the vehicles offline, into silence. A loud bang shocks you as the dead drone bounces off the hood of your car and clatters onto the pavement.

* * *

This near-future is already unfolding–and it’s enabled by the convergence of a suite of technologies that have become cheap enough and powerful enough to work their way into the hardware of our lives. High-speed LTE wireless networks are nearly ubiquitous in developed regions, connecting smart objects to each other and to remote services. These networks, combined with GPS and beacons, enable precise telemetry–the sharing of location, trajectories, and waypoints across transportation networks. Sensors have become much more sophisticated, miniaturized, and affordable, enabling devices at the edges of networks to scan and capture reality with tremendous fidelity. They pair with powerful computation riding the seemingly endless arc of Moore’s Law to crunch volumes of real-time data and turn it into analytics, predictive models, and algorithmic corrections.


This is how the brains of the Industrial Internet are forming, leveraging data from networks and sensors to model the world, evaluate contexts, predict outcomes, and respond and adapt to feedback. Now, these young capabilities are beginning to animate vehicles and ships, aircraft and robots. And, as we’ll see, they’re starting to socialize and collaborate.


Factory assembly lines have been managed by industrial robotics for years, but increasingly these systems are becoming mobile. Robots from Kiva Systems are automating order fulfillment in Amazon’s warehouses, using sensors to follow barcode markers on the floor that lead them to the exact shelving location for each order. Some robots, like the FastRunner, mimic the way an ostrich runs at high speed. Others, like those being tested at Carnegie Mellon University, move like snakes, sidewinding up and over dunes. With camera eyes and computational algorithms, they’re able to move into piles of rubble and detect fingerprints, faces, handwriting, and bodies.
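Kiva doesn’t publish its navigation software, but the core idea–a robot planning a route across a grid of floor markers–can be sketched as a simple breadth-first search. Everything here (the grid size, the coordinates, the `plan_route` helper) is illustrative, not Kiva’s actual API:

```python
from collections import deque

def plan_route(grid_size, start, shelf, blocked=frozenset()):
    """Breadth-first search over a warehouse floor grid.

    Each cell stands in for a barcode marker the robot can scan;
    the returned route is the ordered list of markers to drive over.
    """
    rows, cols = grid_size
    frontier = deque([[start]])
    seen = {start}
    while frontier:
        path = frontier.popleft()
        r, c = path[-1]
        if (r, c) == shelf:
            return path
        for cell in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = cell
            if 0 <= nr < rows and 0 <= nc < cols \
                    and cell not in seen and cell not in blocked:
                seen.add(cell)
                frontier.append(path + [cell])
    return None  # shelf unreachable

# A 4x4 marker grid: drive from the charging dock to a far shelf.
route = plan_route((4, 4), start=(0, 0), shelf=(3, 3))
```

Because breadth-first search explores markers in order of distance, the first route found is also a shortest one, which matters when hundreds of robots share the same floor.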

More sophisticated humanoid robots are learning how to overcome obstacles and move across more varied terrain. The Big Dog, from Google’s Boston Dynamics, has shown how it can walk and run like a pack animal, using a spider-like array of sensors to map the landscape. The Atlas is their latest effort, standing erect on two legs and designed to negotiate outdoor terrain, carry objects, and even climb when needed. It has a mechanoid body, a digital nervous system, and the algorithmic strength to reason about the world. The team hopes their efforts will win the 2015 DARPA Robotics Challenge.

Another Google company, the Japanese group SCHAFT, has a similar humanoid born from years of developing musculoskeletal systems, powerful actuators, and high-torque motors. It’s noisy and slow and a bit scary, but it has caught the eye of DARPA. Recently, the U.S. military introduced a mandate to lower its troop count and bring in more unmanned robotic power. Some will look like Big Dog, some will look like regular vehicles without drivers, and others may cross the battlefield on legs like Atlas.

Computation has become so cheap and powerful that it’s relatively simple to bring cognitive capacities to bear on advances in sensing, mapping, mechanical articulation, and behaviors. These capabilities are steadily transforming dumb, human-powered mechanics into smart self-directed systems.

Autonomous Systems

Google’s driverless car has been the prototype for the development of automated, sensing, and responsive vehicles. Its groundwork has drawn car manufacturers into the race. Tesla’s Elon Musk has claimed that its cars will be 90% autonomous by the end of 2015. Honda has added a suite of Honda Sensing assistive technologies to its flagship CR-V. Audi recently became the first automaker to receive a permit from the California Department of Motor Vehicles to test its self-driving car on California roads.


California is the first state to offer such permits, but the program shows the critical role that regulators will play in the evolution of autonomous vehicles. The U.S. Department of Transportation has published its policy on automated vehicle development, listing five classes from “no automation” to “full self-driving automation.” They hope automation will dramatically reduce the 30,000 annual auto fatalities on US roads, but they worry about the pace of technology and how these systems will scale in numbers and vehicle size. For example, Mercedes has demonstrated a self-driving semi-truck they hope to see in use by 2025, but so much tonnage in the hands of an algorithm is a hard pill for regulators to swallow. It may be that roadways will develop special lanes for autonomous shipping vehicles. Ultimately, regulation and human behavior will have the largest roles in the future of our roadways.

Sea lanes are also feeling the impact of algorithmic transformation. Rolls-Royce is bringing the sensing and intelligence capacity of a self-driving car into cargo ships. In parallel, the European Union has funded the Maritime Unmanned Navigation Through Intelligence in Networks program to build a prototype vessel. They hope to make the $350 billion shipping industry less polluting and more efficient, conjuring a near-future of oceanic shipping fleets plying the seas under algorithmic guidance.

Autonomous systems are also heading below the surface. NATO has sent self-directed underwater vehicles to make detailed maps of the seafloor in an effort to better detect mines. The National Oceanography Center has deployed eight autonomous craft to explore the entire water column, from seafloor to surface, of the rich ocean front between the Atlantic and the English Channel.

Naturally, weapons systems are also becoming increasingly self-directed. Arms manufacturer Raytheon has developed the Phalanx Close-In anti-ship defense system. Their website claims that “Phalanx automatically carries out functions including search, detection, threat evaluation, tracking, engagement, and kill assessment.” Algorithmic kill assessment is worth a pause to consider the implications. Plextek Consulting sells a suite of tools to defense organizations, including support for a “swarm of unmanned platforms” to share information and better coordinate. This swarm capability is going to increasingly shape the behavior of autonomous systems and potentially take them even further beyond our control.

The Swarm

Ants and bees are often studied to understand swarming. Individually, they’re rather simple, following a small set of rules that determine their behavior. From this simple set, the colony and the hive coordinate to produce very sophisticated, large-scale behaviors that are far greater than the ability of any individual.
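A minimal way to see this principle at work is the classic boids-style update: each agent applies a few local rules, and coordinated flocking emerges with no central controller. The rule weights and the `step` function below are hypothetical, chosen only to illustrate the idea, not drawn from any real swarm platform:

```python
def step(boids, cohesion=0.01, separation=0.05, alignment=0.05):
    """One update of a minimal boids-style swarm in 2D.

    boids: list of dicts with 'pos' and 'vel' as (x, y) tuples.
    Three local rules per agent: steer toward the group's center
    (cohesion), away from very close neighbors (separation), and
    toward the group's average heading (alignment).
    """
    n = len(boids)
    cx = sum(b['pos'][0] for b in boids) / n
    cy = sum(b['pos'][1] for b in boids) / n
    avx = sum(b['vel'][0] for b in boids) / n
    avy = sum(b['vel'][1] for b in boids) / n
    updated = []
    for b in boids:
        x, y = b['pos']
        vx, vy = b['vel']
        vx += (cx - x) * cohesion + (avx - vx) * alignment
        vy += (cy - y) * cohesion + (avy - vy) * alignment
        for other in boids:
            ox, oy = other['pos']
            if other is not b and abs(ox - x) + abs(oy - y) < 1.0:
                vx += (x - ox) * separation  # push apart when crowded
                vy += (y - oy) * separation
        updated.append({'pos': (x + vx, y + vy), 'vel': (vx, vy)})
    return updated

# Four agents at the corners of a square drift toward one another.
swarm = [{'pos': (0.0, 0.0), 'vel': (0.0, 0.0)},
         {'pos': (10.0, 0.0), 'vel': (0.0, 0.0)},
         {'pos': (0.0, 10.0), 'vel': (0.0, 0.0)},
         {'pos': (10.0, 10.0), 'vel': (0.0, 0.0)}]
swarm = step(swarm)
```

No agent knows the shape of the flock; the large-scale behavior falls out of the local rules, which is exactly what makes swarms both robust and hard to predict.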

Alex Kushleyev and Daniel Mellinger of KMel Robotics demonstrated quadcopter UAV swarming in a popular YouTube video they posted in 2012. In a stunning follow-up, they recently partnered with Lexus to produce the carmaker’s latest video, Amazing In Motion. It shows how much these little swarming quads have evolved to have precisely coordinated behaviors.


Radhika Nagpal is a faculty member of Harvard’s Wyss Institute for Biologically Inspired Engineering. Her lab recently created 1,024 Kilobots–small, motorized robots–that can self-organize as a swarm to assemble into visible patterns like a star or the letter K. They observe each other with infrared cameras and then use very simple rules to manage their relationships.

In August of 2014, the Office of Naval Research demonstrated how quickly this technology is advancing, and how well it can now scale. They deployed 13 autonomous lightweight littoral combat boats to escort a Navy vessel. The boats held formation around the vessel and then broke free to swarm around a simulated attacker. The Navy talks about deploying these units around vessels, tankers, and oil rigs beginning in 2015. They reduce costs, can operate longer than humans, process information much more quickly, and will ultimately remove more people from the line of fire.

Machine-to-machine communications allow autonomous systems to share their maps and sense data, modify their trajectories in response to dynamic conditions, and communicate goals and problems across the swarm. This will increasingly impact the built environment as infrastructure takes a larger role in coordination and governance. It will also impact how networks attach to high-data robotics and how we secure those networks against malicious intrusions. How we interact with and command autonomous swarms will be an exercise in understanding complexity, while simultaneously ceding control to self-directed systems that we may not really understand.
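One way to picture such a machine-to-machine exchange is as a small structured payload that each vehicle broadcasts and its peers fold into their own world model. The message fields and helper functions here are entirely hypothetical, a sketch of the pattern rather than any real vehicle protocol:

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class TelemetryMessage:
    """Hypothetical broadcast from one vehicle to its mesh peers."""
    vehicle_id: str
    position: tuple       # (latitude, longitude)
    heading_deg: float    # compass heading
    speed_mps: float      # meters per second
    hazards: list         # hazard reports sensed locally

def broadcast(msg: TelemetryMessage) -> str:
    # Serialize for transmission over the mesh network.
    return json.dumps(asdict(msg))

def merge(world_model: dict, raw: str) -> dict:
    # A receiving vehicle folds the peer's report into its own
    # picture of the road, keyed by sender.
    msg = json.loads(raw)
    world_model[msg['vehicle_id']] = msg
    return world_model

# One car reports debris; a second car merges the report.
report = TelemetryMessage('car-42', (37.77, -122.41), 90.0, 31.3,
                          ['debris in lane 2'])
model = merge({}, broadcast(report))
```

Each vehicle ends up with a composite map built from many such reports, which is what lets the swarm react as a unit to conditions no single sensor can see.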

Ultimately, these efforts arise from our deepening understanding of nature and our ability to manipulate matter with greater precision. We’re training networks to pay attention and make decisions. We’re giving senses to our machines, and the ability to reason and act quickly to adapt to changing conditions. By helping them communicate and coordinate we’re starting to toy with complexity, and enabling the potential for unexpected emergent behaviors.

Autonomous systems are, after all, autonomous. This path will help us understand larger emergent phenomena like cities and economics and climate, but it may also unleash outcomes that we’re currently unable to adequately plan for or contain.

* * *


“Stay in your vehicles. We have identified all users on this network. You are now part of the People’s Freedom Consortium of New Delhi.”

Your car blinks back to life showing the People’s seal on the console. Hacked and rooted, the doors are locked and all communications dead. Your car and the others begin to roll towards the convoy wall, stalling against it one by one, adding to its mass across the freeway.

Just then the air is sheared by the whirring props of hundreds of quadcopters flying down the roadway. They break into multiple flanks and formations, swarming around the People’s barricade, releasing a crackling of small pops behind a hail of projectiles. The People’s trucks become pocked with mesh nodes, wrapping the vehicles in a wireless attack.

Several small wheeled vehicles streak under the cars into the roadblock, jumping off the roadway and attaching to the trucks. They join the mesh, offering massive GPU cores to decrypt the People’s network. Once in, they crawl the People’s wearables, painting each freedom fighter with a bright network ID. Larger drones spray them with non-lethals and foaming agents.

Your car shudders and reboots, reversing from the barricade, the console glowing with the face of the regional chief of the Department of Homeland Security.

“This political intervention has been contained. Your car will be directed to the DHS holding grounds for debriefing. Please cooperate with the receiving mechanoids. Thank you for your patience.”


About the author

Chris Arkenberg is a Research & Strategy Lead at Orange Silicon Valley. Prior to joining OSV, he was a Research Fellow at the Deloitte Center for the Edge and a Visiting Researcher at the Institute for the Future.