The global design and strategy firm Frog asked experts from its offices around the world to identify key technology trends that’ll shape how we live, work, and play in 2017. Below, we’ve selected 12 that are most relevant to the design community. –Eds.
Smart cities will not only leverage sensors to use energy efficiently; their buildings and highways will also be constructed from materials that work more intelligently. Taking a cue from natural patterns, material scientists and architects have grown bricks from bacteria, made cement that captures carbon dioxide, and created building cooling systems that use nothing but wind and sun.
This trend could be a massive opportunity for cities, as well as industries like hospitality that depend on large, energy-intensive buildings. MGM Resorts, Wynn, and Las Vegas Sands have all recently been outfitted for solar, for example, showing that large companies are taking the first steps toward sustainable infrastructure, both to cut costs and to appeal to environmentally conscious customers. –Agnes Pyrchla, Frog San Francisco
With a growing need for global initiatives to reduce greenhouse gas emissions, fight climate change, and develop cruelty-free foods, the race is on to define how we consume protein, potentially without involving any animals. We see two distinct product categories taking shape: one in which plant-based proteins are extracted, reengineered, and repurposed into products that simulate a meat-like experience, and another in which breakthroughs in tissue engineering and synthetic biology are used to grow food—like meat, eggs, and dairy—in laboratory environments.
In 2017, we’ll see a broad range of new plant-based meat replacements at your local grocery store. They will extend well beyond the vegan aisle, to which most are currently relegated, and they will taste better than ever. For consumers, a likely question will arise: Was that Thanksgiving turkey sprouted from a seed? –Andreas Markdalen, Frog Austin
Robots today are hard, made of metal, and tend to operate in a deliberate, sharp manner. As we start encountering robots in our everyday lives, we will need them to interact with us with a human touch.
Over the next few years, we will begin to see robots soften, using materials that closely resemble the human body. A movement is already underway to leverage soft robotics in products such as the Soft Robotics gripper. Taking this a step further, some R&D departments are experimenting with electroactive polymers, such as dielectric elastomers, which change shape when a current is applied. We will see applications of this across multiple industries. In transportation, for example, we will want vehicles equipped with soft robotics to support us and help us perform tasks; in medicine, soft robotics may not only help us treat patients but may also find a home inside the body.
The soft robotics revolution will be gradual but vast. As robots and robotics become increasingly pliable, they will fold into our everyday lives in vital ways. –Mark Freudenberg, Frog San Francisco
As VR becomes mainstream, live entertainment venues and performers will be increasingly displaced by low-cost, high-engagement entertainment that people can access from the comfort of home. To offset the cost of empty seats, or even to profit on top of an already sold-out show, the entertainment industry will find ways in 2017 to sell VR tickets to the best seats at its live events—from watching Beyoncé at Madison Square Garden to seeing the UEFA Champions League—as well as opening up an immersive streaming VR catalog of past performances.
One particularly intriguing concept is the idea of VR micro-experiences, which allow users to transport themselves in space and time to experience wonderful little moments that refresh their senses. Think of it as a fast and inexpensive vacation for the mind. Content creators will be able to deliver low-cost, high-quality experiences that are traded on an open, social market. For those consumers who lack VR hardware, the community can provide “VR Stations” in malls, transportation terminals, and open spaces. –Piet Aukeman and Sonny King, Frog New York
Spaces will no longer simply house and support your activities–they will participate. Enabled by the proliferation of low-cost sensors that can be easily embedded in an environment, machine learning will be used to identify how people use spaces. These systems can then suggest reconfiguring a space to encourage new behaviors.
Such technology has applications in health care, retail, research, manufacturing, work, and residential spaces. In health care, for instance, hospitals could shift room layouts, update signage, and adapt lighting and sound to optimize patient experiences. These could be tailored to patients’ stress levels, the severity and type of their conditions, their schedules, and their personal lifestyle and fitness data. As these spaces learn and evolve, they could lead to better health outcomes for patients and help hospitals lower costs. –Chad Lundberg and Jud Holliday, Frog Seattle
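The sense-and-suggest loop described above can be sketched in a few lines of Python. This is a minimal sketch with hypothetical zone names and occupancy data, and a simple threshold rule stands in for the machine-learning step that a real system would use:

```python
# Minimal sketch: flag spaces for reconfiguration from occupancy sensor readings.
# Zone names, data, and thresholds are hypothetical illustrations.
from collections import defaultdict

def average_occupancy(readings):
    """Average the per-zone occupancy fractions from raw sensor readings."""
    totals, counts = defaultdict(float), defaultdict(int)
    for zone, occupancy in readings:
        totals[zone] += occupancy
        counts[zone] += 1
    return {zone: totals[zone] / counts[zone] for zone in totals}

def suggest_reconfiguration(readings, low=0.2, high=0.8):
    """Flag zones whose average utilization suggests a layout change."""
    suggestions = {}
    for zone, avg in average_occupancy(readings).items():
        if avg < low:
            suggestions[zone] = "underused: consider repurposing"
        elif avg > high:
            suggestions[zone] = "crowded: consider expanding"
    return suggestions

# Hypothetical hourly readings: (zone, fraction of capacity occupied).
readings = [
    ("waiting_room", 0.90), ("waiting_room", 0.85),
    ("consult_room_b", 0.05), ("consult_room_b", 0.10),
    ("lobby", 0.50), ("lobby", 0.55),
]
print(suggest_reconfiguration(readings))
```

In a production system, the threshold rule would be replaced by a model trained on how occupancy correlates with outcomes such as patient stress or wait times, but the input (streams of sensor readings per zone) and output (suggested reconfigurations) keep this shape.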
By now we’ve all heard the news: 1.2 million people die in car accidents each year worldwide, but autonomous vehicles (AVs) are coming soon to save us from ourselves. The optimism is probably warranted, given that 90% or more of accidents are caused by human error, but even the most ardent AV supporters know that self-driving cars will still sometimes get into accidents. What might an AV do to avoid trouble?
In the instant before an accident, an AV should maneuver in dramatic and utterly non-human ways to preserve life. Imagine this: An AV stopped at a red light might suddenly roar into an intersection to avoid being struck by a human-driven car approaching too quickly from behind. This same car in Emergency Evasive Mode might even be able to stop the cross traffic and flip the lights to red as it enters the intersection. This isn’t about smart cars–it’s about smart swarms acting in concert to save lives.
That’s the headline to watch for in the coming months: Autonomous Car Performs Dramatic Action-Movie Stunt to Save Family of Five. –Matt Conway, Frog Seattle
Sony CSL Research Laboratory recently trained an AI on thousands of pages of sheet music to produce “Daddy’s Car,” a song generated by the algorithm and refined by a human composer. We’ve felt the effects of procedural generation in video games, too, from the randomized dungeon corridors of Diablo to the nearly infinite worlds of No Man’s Sky. Sunspring, a short sci-fi film written entirely by AI, premiered at a London film festival earlier this year.
Sony’s Beatles-esque track is surely a pale imitation–impressive enough to draw YouTube views, but not burning up the charts. And Sunspring works as a surreal, bite-size experiment, but would be difficult to stretch to feature length. So what happens when the song is a hit? When the film nets an Oscar? When your favorite artist is artificial? These imitative algorithms, now writing pop songs and short films and generating first-person shooter levels, will evolve to process broad and diverse inputs–cross-pollinating rhythms, language, and imagery from deep and unlikely corners of our physical and virtual worlds. This is our new creative frontier. –Zach Marley and Graeme Asher, Frog Seattle
Virtual reality therapies (VRT) will extend beyond simply distracting patients and will create multi-sensory environments that can work as treatment. Initially, we will see VRT treating psychological problems such as phobias, addictions, and other conditions. But soon it could enable physiological outcomes and help with practices such as neurorehabilitation.
MindMaze, a pioneer in this space, is already creating virtual environments for stroke patients, helping their brains rewire themselves and reestablish mobility in forgotten limbs. As the creation of immersive environments becomes more common, we will see more experimentation in this space and continue to discover and unlock what the brain can do. Future patients of cognitive behavioral therapies and systematic desensitization can expect virtual reality to become a critical component of their treatment. –Kyle Wolf, Frog San Francisco
Medicine has long taken a “one-size-fits-all” approach, but what if we could tailor treatments based on the abundant data points captured by caregivers and health care professionals? This is precision medicine, a new form of health care built on data, algorithms, and other tools.
The University of California, San Francisco is a leader in precision medicine. It is training doctors to have different conversations with their patients to better understand their social, environmental, and economic contexts when diagnosing and treating illness. Precision medicine shifts the focus of health and medical efforts from identifying symptoms to understanding and treating the mechanisms of disease. It also attends to environmental and social determinants of health, like your postal code. Those who understand both the technology and the goals of medicine will be able to create value by offering platforms that interpret and connect data points. –Allison Green-Schoop, Frog San Francisco
Since the early 1980s, human-computer interaction has relied primarily on the Graphical User Interface. However, the combination of screen fatigue and technology embedded in everything from cars to homes is exposing a need for new types of interfaces that extend beyond the visual. Auto companies, for example, have tried to keep drivers’ eyes on the road using audio interaction (though it may be more distracting than many of us think). Devices like Here One are exploring the potential of augmenting reality through sound. Recently, Apple unveiled AirPods, which will likely lead to more ubiquitous audio experiences for one of the largest consumer demographics in the world. 2017 will be the year of the AUI—the Audio User Interface. –Christine Todorovich, Frog San Francisco
Rwanda is building the world’s first drone airport so that medicine can be flown quickly to those who need it. Rather than wait months for roads to be built, communities can rely on drones to deliver critical support to people living outside urban areas. This is an example of a wider movement happening globally, in developing and developed countries alike: drones for good.
Many of this movement’s leading innovators aren’t governments or large foundations; they are individual citizens. With a few thousand dollars, citizens can experiment with how this powerful but affordable technology can be used for good, like helping to identify poachers or find children trapped under rubble after an earthquake.
Drones have made it affordable to do humanitarian work that was once accessible only to large institutions with the resources to invest in satellite and helicopter technology. The definition of a drone is “unmanned aircraft,” but behind the unmanned aircraft is a person driving the intent and potential of what the aircraft can do for people in need. And this year we’ll see more folks begin to push this potential. –Lillian Tse, Frog London
Machine learning has been a constant on tech trend lists for years. But this year we’ll embrace what we can learn from interacting with machine learning.
AlphaGo’s victory over Lee Sedol, one of the world’s greatest Go players, marked a milestone for machine learning. But in training and playing against AlphaGo, the human Go players also became better players. We are already learning from algorithms in other, indirect ways, whether by refining our music tastes while helping Spotify refine its algorithm, or by learning about the brain by watching neural networks learn.
In the future, as we draw on people’s behaviors and choices to help machines learn, we will embrace the way humans learn from machines as well. Could watching a computer forge new connections between words make us more creative writers? What if we could teach a child and a computer to translate at the same time?
Learning from machine learning could have an immediate impact on the way we think about education and training, fostering a symbiotic approach to human-machine learning. –Rebecca Blum, Frog San Francisco
Read Frog’s full report here.