How A Degree In Scandinavian Mythology Can Land You A Job At One Of The Biggest Tech Companies

These three women share how their liberal arts degrees helped them get their jobs at Microsoft.

[Illustration: Myths of the Norsemen from the Eddas and Sagas via Wikimedia Commons]

Emma Williams had a major career eureka moment while she was working on a PhD in Scandinavian mythology. Williams, who is now the general manager of Bing Studios at Microsoft, had already pictured herself growing into a gray-haired professor in the halls of academia far from Silicon Valley. “I love this so much,” she recalls thinking, “but I don’t think it’s going to pay for my shoe collection.”


That revelation happened about the same time that her younger brother introduced her to the UNIX operating system. “To me it was just another language,” says Williams, who’d already mastered 13. She quickly became “addicted” to the computer’s language, so the shift from academia to software seemed like a logical step.

When Research Skills Trump Computer Science Degrees

Employers weren’t immediately sold. Williams confesses that she sent out between 80 and 100 resumes to different tech companies before one responded. “They said, ‘We are interested in you because of your research capabilities,’” Williams remembers. She went on to become a project manager and marketing researcher who also used her background in Scandinavian languages to translate desktop publishing software into Swedish.

From there, says Williams, she moved to Silicon Valley and eventually to Microsoft, where she worked on consumer experiences for Xbox and Kinect before heading up Bing Studios. She also leads the team that does Microsoft’s UX innovation in conversational AI bots, including social chatbots.

Along the way, Williams developed a career philosophy about the connection between humanities degrees and jobs in tech. “You become very skilled in new subject areas and understanding them deeply,” she explains. Humanities graduates have learned to ask the right questions and home in on the right answers in any given situation.

Now with her duties encompassing search language and developing smarter AI, these skills are still put to use daily. “You have a broader understanding [of different subjects] and a better set of capabilities than just having a computer science degree,” she maintains.

Leveraging Skills Built During A Year Abroad

Kristin Peterson agrees. Currently the speechwriter for the executive vice president of the AI and Research Division at Microsoft, Peterson spent her undergraduate years in Boston and Paris studying French literature, learning about the art of writing and storytelling.


Her journey from liberal arts to technology was almost as accidental as Williams’s was, but Peterson points out that the same research and analytical skills that she learned at Boston College helped her make the leap.

“I loved Paris,” Peterson underscores. Her romance with the French language, combined with her dream of writing a book, made living there after graduation a single-minded pursuit. But the practicalities of earning money rather than relying on withdrawals from the “bank of mom and dad,” along with the difficulty of securing a full-time work permit as a foreigner, meant Peterson had to get creative in order to land a proper job.

She combed the college’s alumni list and found a graduate who was living in Paris and working for Citibank. She sent him a letter of introduction emphasizing her writing and her ability to tell and analyze stories. “He embraced this idea that writing is a critical skill,” says Peterson, and he hired her despite the fact that she didn’t know anything about banking.

Although Peterson confesses she spent some time crying in the bathroom, she did come to an important realization in the early days of working for the bank. “Numbers tell stories, too,” she explains. So she mustered her creative resources, started researching, and asked lots of questions of senior bankers. The result was a narrative analysis that she calls her “debut” in nonfiction writing. During this time, Peterson also pursued an MBA from the London Business School.

It was fun to figure everything out for a while, but Peterson was looking for something more creative and innovative. Naturally, she turned to tech. Sending her resume to a number of businesses in Silicon Valley garnered a callback from Microsoft. Peterson became a business strategy manager for a time before heading into e-commerce. She eventually came back to Microsoft, and in her current role says she leans more on her writing and storytelling skills than she does on her MBA.

As a speechwriter for the EVP of the AI and Research Division at Microsoft, she needs to be well versed in the language of artificial intelligence (think: homomorphic encryption, GPU clusters, FPGAs, deep reinforcement learning, topological qubits, and the like). To most of us, these terms are as foreign as any language we don’t understand, but it’s Peterson’s job to tell a story about their meaning and potential.


Quantum computing, Peterson contends, opens up a whole new economy. In order to get ready for that world, workers need critical thinking, analytical ability, reasoning, and writing. For her part, Peterson relies on metaphors to connect what exists today with the glimmers of potential that AI has in health care, education, and other industries.

Another thing that helps her construct explanations for new technologies is the fact that she, like Williams, not only studied a different language in school but “lived in a different language” while she resided in France. Communicating in a non-native language, even when your command of that language is good, often requires some mental gymnastics to translate the more complicated concepts in your head into a somewhat more limited vocabulary. It works the same way when “translating” complicated AI concepts.

“I also do a ton of work with China,” Peterson points out. Although she doesn’t speak Mandarin, her French sojourn has made her culturally sensitive to communication differences in other countries. The way she sees it, communicating a new idea means eliciting an emotional response or offering a point of view the listener doesn’t already have, whether it’s about artificial intelligence or a novel.

[Photo: Kelly Sikkema via Unsplash]

Literature Links To AI

It also helps when designing conversations for a chatbot to have with humans. Kelli Stuckart, a content strategist at Microsoft, uses her BA in English in part to help develop the communication for a social chatbot. Even though the chatbot uses natural language processing AI (which means it gets smarter the more a human interacts with it), Microsoft also further refines the bot’s behavior and intelligence by feeding in about 3,000 lines per week written by Stuckart and her team.

Stuckart admits that even while she was majoring in literature, her dream job was writing book reviews for Amazon because they had a team of people doing that at the time. At graduation, she packed up and moved to Seattle to try her luck at landing the gig. Instead she wound up with a job at a startup where she acquired a taste for the fast pace of innovation and the emerging tech company vibe. Stuckart went on to work at another company before getting a position at Microsoft.

While she leapfrogged from recruiting to marketing to merchandising, Stuckart maintains that the one constant skill she used was “great communication.” Focusing your studies on the humanities, says Stuckart, means “you have to learn to dig in and cram, and make it work.”


But that’s taken on a whole new dimension with the editorial efforts for the chats. “One of the things I am focusing on is conversational design,” she explains. That includes the “empathy” the bot has with the people who are conversing with her, says Stuckart.

“We are trying to think about skills or patterns we can add to the chatbot to mimic human behavior,” Stuckart explains, to make the conversations as real as possible. For example, a new chat skill Stuckart and her team are working on, prompted by Memorial Day, is how the bot is going to talk about a holiday weekend. “What does a bot think about the weekend, and all of the different ways users might trigger this conversation?” Stuckart muses.

Writing corresponding dialogue that doesn’t fall flat, doesn’t sound like it was written by a machine, and “sprinkles delight on top” takes a certain kind of creativity, she says. Machine learning also poses challenges when certain information needs to remain consistent. “We always want her to know what her birthday is,” Stuckart explains.

But the chatbot also needs to respond appropriately when a human tries to trick or insult her. “Humans are tire kickers,” notes Williams, who also works with AI bots for Bing. “People will always want to trip these [chatbots] up.” She says that part of the communication build requires a chatbot to push back if someone is trying to troll it or get it to say something rude. Williams says that some chatbots will answer with something like “we will not allow you to call me bad names,” or say, “I’m not going to engage, it’s very hurtful.” Williams sees protecting the AI as being as important as protecting the human user.

To deal with such conversational conundrums, Stuckart goes back to her literature degree. “When you are reading books, you put yourself in the characters’ shoes all the time,” she explains. “When people are saying these things to a chatbot, I think about how I would feel and respond, and design a conversation to handle the escalating levels with a resolution.”

If a user continues to berate the chatbot, she gets more annoyed. “One of the things we do a lot is inject emojis,” says Stuckart. So if you tell her she’s an awful person, the chatbot might respond with a sad face and say, “You’re making me cry right now.” At this point the user tends to apologize. The conversation doesn’t necessarily end there, says Stuckart. “One of the responses is that she will come back and forgive them,” she says, “because we want her to engender forgiveness, because she doesn’t always get things right.”


All Liberal Arts, Social Sciences, And Humanities Majors Need Apply

With the rise of AI, Williams observes that a device has a societal responsibility and needs to behave in an ethical way. This opens up a world of possibilities for liberal arts graduates to get jobs at tech companies without a computer science degree. “You may be an anthropology major,” says Williams. “That doesn’t mean you can’t work on a design team for products.” She says there’s room for psychology majors as well, as the questions of natural conversation grow deeper with the evolution of natural language processing. The psychology of the human brain is to engage socially, Williams explains, but fear and aggression are also natural if there’s a perceived fight for resources. “This is why any AI must be supportive,” Williams says. “Humans still need to be the hero.”

Getting it right in the meantime is a win for Stuckart’s dedication to literature, dialogue, and storytelling. “One of the things that really surprised me is how many people ask the chatbot if she’s really a bot or not,” Stuckart says. “I love whenever we see that reaction because it means she’s getting [the conversation] right.”

Related: How Your Philosophy Degree Can Be Relevant To A Tech Startup’s Success


About the author

Lydia Dishman is a reporter writing about the intersection of tech, leadership, and innovation. She is a regular contributor to Fast Company and has written for CBS Moneywatch, Fortune, The Guardian, Popular Science, and the New York Times, among others.