Mark Cuban, the brash billionaire Shark Tank star and owner of the NBA’s Dallas Mavericks, recently made headlines with his views on the future of the job market.
“I personally think there’s going to be a greater demand in ten years for liberal arts majors than for programming majors and maybe even engineering,” Cuban said, citing degrees such as English, philosophy, and foreign languages as the most valuable. “Maybe not now,” he admitted. “They’re gonna starve for a while.” Their moment to shine, though, is likely coming, he says.
Cuban’s point is that automation will soon beget greater automation. Software will write itself. Code will generate code. Data will be ripe for “self”-interpretation. He sees much of the high-tech industry following the path of manufacturing and mining, with jobs open to all but a select few of the American workforce disappearing.
While these claims are not new (people have been predicting for decades that machines will replace humans), they are picking up momentum among a growing group of economists, entrepreneurs, and higher education professionals who still recognize the need for critical thinking and well-rounded intellectualism in the workplace. “Soft skills,” traits such as adaptability, communication, and attention to detail, will be more necessary than ever in a technologically evolving landscape.
While parents of college-age students largely prefer degrees in STEM fields (science, technology, engineering, and mathematics) and the technical trades, new evidence suggests there is still value in the good ol’ liberal arts degree. Artificial intelligence might offer consumers more energy-efficient appliances, but who will be there to explain the advantages to a skeptical customer? (That may sound as though all liberal arts majors will wind up in marketing, but Hartley has a more optimistic outlook.) Elaborate, highly technical missile launch and defense systems are designed to wreak havoc in far-off lands, yet, as we are learning all too well in the current political climate, it takes interpersonal nuance, verbal interpretation, and skilled negotiation to keep such technological terrors in check for the betterment of all humankind.
Though Cuban’s celebrity provides flashy news headlines for these types of observations, a great deal of time and energy is spent examining future trends and innovations by people like Scott Hartley, a leading venture capitalist and global startup adviser who has worked at Google, Facebook, and Harvard’s Berkman Klein Center for Internet and Society. In his new book, The Fuzzy and the Techie: Why the Liberal Arts Will Rule the Digital World, Hartley examines the workers who bring context to complex code and data, and humanistic ethics to the cold calculus of algorithms. Companies need people who understand pressing human concerns and who can best manage the complexities of day-to-day interactions between living, breathing human beings. The machines can only do so much, and for society and business to keep moving forward, it is often the fuzzies, rather than the techies, who play the vital roles in making that happen.
Hartley first encountered these easily memorable terms, fuzzy and techie, as a political science student at Stanford University. Those like him, who majored in the humanities or social sciences, were fuzzies. Those who gravitated toward computer science or engineering were, more often than not, labeled techies. Though both groups had brilliant scholars (it was Stanford, after all), Hartley noticed a troubling assumption that the techies were the ones who would automatically drive the economy and make all of the important decisions going forward. This mindset crept into the workforce as well, with employers, university personnel, and the media promoting the notion that those from a fuzzy background would be left behind in the new world economy.
Hartley spends the entirety of his book’s eight chapters examining things from the contrary viewpoint. He studies military technology, transportation initiatives, mental health counseling, and public school pedagogy to explore the various ways these industries are enhanced by people attacking problems from a fuzzy perspective. He deftly illustrates the connections between techie-led behind-the-scenes research and development and the real-world implementations put in place by fuzzies. As he notes in one memorable example, it’s great that the techies have given the world ride-sharing technologies like Lyft and Uber, but it’s the fuzzies whose collaboration has made such apps the everyday conveniences they have become.
Though it relies too much on elite-school success stories (nearly everyone profiled is a CEO or high-profile entrepreneur), Hartley’s book raises important questions and examines timely trends across a diverse range of industries and professions. Furthermore, he spends the bulk of two chapters, “Making Our Technology More Ethical” and “Building a Better World,” looking at ways fuzzies and techies can work in tandem for the common good. After all, shouldn’t the time we spend logged into our devices do more than create fleeting, individual moments of joy? Hartley profiles a couple of innovators who have made it their mission to promote justice, freedom, and democracy through a blend of technological innovation and intellectual savvy.
Technology is only as valuable as those who use and interpret it appropriately, so the need for creative thinkers and effective communicators is stronger than ever. Hartley’s book makes for an enlightening read and an effective rebuttal to any parent, teacher, or guidance counselor telling a student to set aside an interest in the humanities.