You probably know that what it will take to be successful at work in the future will be different from what it takes right now. There will be new skills to learn, new tools and procedures to master, and new responsibilities to take on. It’s difficult to predict exactly what your job will look like in 10 years. When social media came along, few people predicted that it would lead to a new career path.
But uncertainty doesn’t stop predictions. Some are made on good grounds, but others harm your career and mind-set rather than help it. Here are some of those “future of work” myths that might be holding you back from success.
1. You need to be a quick learner to succeed
Yes, the workforce is always changing and, yes, you’ll constantly need to learn new things. But the idea that you should be able to pick up a new skill and master it at the drop of a hat in order to thrive is . . . well, unrealistic and not entirely true.
No doubt about it, there are some advantages to picking things up quickly. But if this isn’t something that comes naturally to you, you’re not doomed to fail. First of all, practice can help. Second of all, there is no correlation between how long it takes someone to learn something and how intelligent and capable they are. Todd Rose, author of The End of Average: How We Succeed in a World That Values Sameness and director of the Mind, Brain, and Education program at the Harvard Graduate School of Education, referenced a TED Talk in which Khan Academy founder Sal Khan presented a chart showing students who needed to spend extra time to understand and master a concept. The chart showed that once they got over that hurdle, they were able to “race ahead.”
The thing is, it takes a long time to master a difficult skill. And mastery is about continuous commitment to improve and fine-tune. Don’t get discouraged if you feel like it’s taking you longer than your peers to learn a new skill at work–what matters is your openness to adapt to change. As one former Y Combinator alum and ex-IBM strategist previously wrote, “Some of us are born with more potential to adapt, but each of us can get better at it over time. We all have that friend who loathes change and another who thrives on new experiences.” The key is to train yourself to be the latter.
2. If you want to do something innovative, you have to be an entrepreneur or work at a startup
These days, innovation is often treated as synonymous with entrepreneurship. Startups and new businesses “disrupt” industries, and those who are passionate about solving large-scale problems are told to either join a startup or establish their own ventures. People don’t often consider legacy businesses when they think about innovation.
But this isn’t always true. Yes, the Kodaks of the world have been slow to adapt to change, and there are many startups tackling issues that few businesses have chosen to focus on, or that their behemoth competitors can’t crack. But many legacy brands know they have no choice but to innovate if they want to survive. And that responsibility isn’t confined to the CEO or the owner. Companies will provide opportunities for employees to be innovative because they have to, and that will remain true as technology forces businesses to transform their practices.
And as George E. L. Barbee, author of 63 Innovation Nuggets for Aspiring Innovators, told Stephanie Vozza in a previous Fast Company article, you don’t always need to be in a leadership position or be the “visionary” to be innovative. He said, “Realize that top executives have to have internal innovation; they can no longer depend on acquisition to grow a company . . . Find a way to link to that by looking for two or three people in the middle of the organization who want to break through and do something customer facing.”
3. You need to learn to code
Yes, there is supposedly a tech talent shortage (although many have questioned the validity of that claim). Learning to code isn’t going to hurt, but you don’t need to aspire to be a software engineer, data analyst, or UX designer if your interests (and frankly, talent) are elsewhere.
As neuroscientist Tara Swart put it in a previous article for Fast Company, the key to thriving in the future of work isn’t practicing computer science, but training your brain to think computationally. Swart wrote, “If you want to set yourself apart from the pack, you need to break down problems and become familiar with the way that machines come up with solutions and sequences.”
Columbia astronomer Moiya McTier also suggested that workers can benefit from learning to think like a scientist to thrive in the future of work. In a previous article for Fast Company, she suggested approaching your work as a series of questions, not tasks. That way, you’re not operating with tunnel vision, and you’re less likely to miss potential solutions. She wrote, “Asking the right questions can help you identify what you’re really looking for while leaving room to explore all possible solutions.”
4. Automation is bad for workers
Speaking of understanding how machines work, perhaps one of the most pervasive (and fear-inducing) myths is the idea that robots are here to take over your job, and there’s not much you can do about it.
This is a dangerous belief to have, according to Swart. Fearing robots puts our brain in a “loss-avoidant” and fight-or-flight mode, which hampers the very skills you’ll need to work with machines. After all, as Swart pointed out, machines can actually help you do your job better. But in order to do that, you need to be willing (and open) to learning about how you can do your job in conjunction with machines in the first place.
Joe Greenwood, executive lead of data and program director at MaRS, a Toronto-based innovation hub, previously wrote for Fast Company that the introduction of machines will create more jobs than it kills. He gave the example of the ATM. When it was introduced, many believed it would be a job-killer for bank tellers. The reality? ATMs lowered the cost of running a branch, and banks ended up hiring more tellers.
5. A humanities education is pointless
The debate over the merits of a liberal arts degree isn’t new. But in a tech-centric world (and amid rising student debt and the high price of a college education), many are questioning its value more than ever. The thing is, understanding the humanities is even more important in a machine-oriented world–whether or not that education comes from a university.
Avi Goldfarb, coauthor of Prediction Machines: The Simple Economics of Artificial Intelligence, told Fast Company’s Ruth Reader that while technical skills are important, it’s understanding subjects like art, philosophy, sociology, and psychology that will allow people to understand how to put artificial intelligence to use. Doing so, according to Goldfarb, requires a broad range of knowledge and a multidisciplinary mind-set.
Entrepreneur and author Faisal Hoque put it this way: “The constant cascade of new technologies will continue to create a more empowered population. We will be increasingly connected and isolated at the same time. We will demand more self-expressions. Within this topsy-turvy context, we all will have to learn how to leverage humanities to connect, inspire, and influence others and ourselves.”