Late last month at the Workplace Stack Exchange, a question-and-answer site for professionals about workplace ethics and etiquette, an anonymous programmer posed an intriguing question: “Is it unethical for me to not tell my employer I’ve automated my job?”
The employee, writing under the pseudonym Etherable, had been hired as a programmer for a job that turned out to amount to little more than “glorified data entry”; within a year, Etherable had come to understand the organization’s legacy software system well enough to write a program that has been doing most of their work for them for the past six months:
Now the problem is, do I tell them? If I tell them, they will probably just take the program and get rid of me. This isn’t like a company with tons of IT work – they have a legacy system where they keep all their customer data since forever, and they just need someone to maintain it. At the same time, it doesn’t feel like I’m doing the right thing. I mean, right now, once I get the specs, I run it through my program – then every week or so, I tell them I’ve completed some part of it and get them to test it. I even insert a few bugs here and there to make it look like it’s been generated by a human.
The question took off on the Workplace thread and was shared on Hacker News and Reddit. At Quartz, Keith Collins rounded up some of the most interesting responses:
At MIT Technology Review, George Anders flags a recent study from the Brookings Institution’s Hamilton Project, which analyzed US Census Bureau data from 2010 to 2013 to trace the job choices of 1.2 million college graduates and to answer the question: What do people who major in (X) typically end up doing for a living?
Overall, the study offers a fascinating look at how college educations in the sciences, arts, and humanities translate into careers, but Anders highlights one finding that may be of particular interest to employers: that “many people working as computer scientists, software developers, and programmers used their college years not to major in computer programming or software development, but instead to major in traditional sciences or other types of engineering”:
Among graduates with degrees in physics, math, statistics, or electrical engineering, as many as 20 percent now work in computing-based fields. At least 10 percent of people who majored in aerospace engineering, astronomy, biomedical engineering, or general engineering have made the same migration. Even geography, nuclear engineering, and chemistry departments send 3 to 5 percent of their undergraduate majors into software development or similar fields, the Hamilton Project reports.
Quartz’s data editor Christopher Groskopf analyzed data from the US census and the American Community Survey to find out how many Americans are doing their work entirely from home these days. Full-time home workers, the Quartz analysis found, now make up a record 2.6 percent of American employees, which Groskopf notes is more than the number who walk and bike to work combined:
The data show that telecommuting has grown faster than any other way of getting to work—up 159% since 2000. By comparison, the number of Americans who bike to work has grown by 86% over the same period, while the number who drive or carpool has grown by only 12%. We’ve excluded both part-time and self-employed workers from these and all results.
Intriguingly, with an average annual income of nearly $80,000, people who work from home earn the highest wages of any major category of commuters tracked by the US census. (Broken down further, remote workers are edged out by those who commute by non-subway trains, taxis, or ferryboats.) This is mostly due to the nearly 550,000 remote workers who are managers—the largest group of home workers in any single job category.
In no profession is work from home more popular, however, than computer programming. In fact, Groskopf adds, “among the most experienced, some are even beginning to demand it”:
Unconscious bias remains a major barrier to diversity and fairness in recruiting, but it has proven difficult to remove from the hiring process: by definition, we don’t recognize our implicit biases for what they are. The ability to remove or mitigate the effects of bias on hiring is part of the promise of new recruiting technologies that either enable blind interviews, concealing candidates’ race and gender, or make hiring decisions based on skills assessments or even games.
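The core mechanism behind blind screening is simple: strip identifying fields from a candidate record before a reviewer ever sees it. A minimal sketch, with entirely hypothetical field names not drawn from any particular recruiting product:

```python
# A minimal sketch of blind screening: remove biographical fields from a
# candidate record so reviewers see only skills-related data.
# All field names here are hypothetical illustrations.

BIOGRAPHICAL_FIELDS = {"name", "age", "gender", "race", "university", "past_employers"}

def redact(candidate: dict) -> dict:
    """Return a copy of the record with identifying fields removed."""
    return {k: v for k, v in candidate.items() if k not in BIOGRAPHICAL_FIELDS}

applicant = {
    "name": "Jane Doe",
    "university": "Example U",
    "code_sample_score": 92,
    "skills_assessment": 88,
}
print(redact(applicant))  # {'code_sample_score': 92, 'skills_assessment': 88}
```

Real systems face the harder problem of indirect signals (writing style, project names, gaps in history) that can re-identify candidates even after obvious fields are removed.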
The tech sector has had a particularly hard time diversifying its workforce and becoming a more welcoming environment for women and minorities. In this context, the BBC’s Regan Morris profiles Iba Masood, CEO and co-founder of Tara.ai. Tara, which stands for Talent Acquisition and Recruiting Automation, is artificial-intelligence-based recruiting software aimed at combating bias by evaluating programmers solely on the quality of their work:
Tara analyses and ranks programmers’ code, removing biographical information such as age, race, gender or where you have worked in the past or where you went to university. The algorithm means that people are judged on the work they have produced rather than who they are or who they know.
To cope with shortages of talent with digital skills, many organizations have turned to hiring graduates of coding bootcamps, where students learn programming languages in an intense, accelerated course of study. While these bootcamps don’t provide the same level of in-depth training as a college degree in computer science, they are widely seen as a useful mechanism for rapidly filling gaps in the tech talent market, as well as a way for underrepresented minorities to break into the tech field.
“But the great promise of these schools training a new generation of skilled engineers has largely fallen flat,” Sarah McBride warned recently at Bloomberg Technology, pointing to Coding House, a Silicon Valley bootcamp ordered by regulators to shut down last month, as an example of how coding schools are often oversold, both to students as a ticket to a good job and to employers as a source of qualified workers:
Coding House’s spectacular fall is an extreme case, but interviews with more than a dozen coding school graduates reveal that when they do land a job, often their engineering education doesn’t cut it. Many admit they lack the big-picture skills that employers say they want. Training them often requires hours of hand-holding by more experienced staff, employers say. The same holds true for graduates holding computer science degrees, but those employees generally have a better grasp of broader concepts and algorithms, recruiters said.
Facing a talent crunch in critical IT and cybersecurity roles, many organizations are looking to entice women back into a heavily male-dominated tech workforce. The tech sector is known for being a less than hospitable work environment for women, and one reason the sector has been leading on family-friendly benefits like parental leave is that it has to in order to convince women to work there.
So it may come as a surprise to learn that in its early years, computer programming was a predominantly female occupation. At the Atlantic, Rhaina Cohen explains why that was so, and how it changed:
In the early years of computing, the area that garnered respect was hardware development, which was thought of as manly work. Meanwhile, the work most women performed, programming, lacked prestige. The gender makeup of programmers and the status of the job were mutually reinforcing. Women were hired because programming was considered clerical work, a bit of plug-and-chug labor that merely required women to set into motion preset plans.
Programming was later recognized to involve complex processes of analysis, planning, testing, and debugging. Initially, though, the job was poorly understood. Janet Abbate, a professor of science and technology in society at Virginia Tech, explains in her book Recoding Gender that, in the absence of a concrete grasp on the job, “gender stereotypes partially filled this vacuum, leading many people to downplay the skill level of women’s work and its importance to the computing enterprise.” Notably, where more egalitarian gender roles prevailed, so did the job options available to women in computing. While American and British women were effectively barred from building hardware during the mid-20th century, women in the relatively more equitable Soviet Union helped construct the first digital computer in 1951.
That coding skills are the only steady meal ticket left in the modern job market is an assertion so often repeated that it has become something of an article of faith. Indeed, the market for software engineers and other prized STEM talent is currently very friendly to job seekers, with short supply driving up the value of those who have the right skills. Some observers, however, have questioned how much longer this cohort can expect to enjoy comfortable salaries and relative job security, if automation progresses to the point at which even their skills become devalued, programs replacing programmers.
At Wired, Jason Tanz argues that this shift is already happening. “Our machines,” he writes, “are starting to speak a different language now, one that even the best coders can’t fully understand”:
Over the past several years, the biggest tech companies in Silicon Valley have aggressively pursued an approach to computing called machine learning. In traditional programming, an engineer writes explicit, step-by-step instructions for the computer to follow. With machine learning, programmers don’t encode computers with instructions. They train them. If you want to teach a neural network to recognize a cat, for instance, you don’t tell it to look for whiskers, ears, fur, and eyes. You simply show it thousands and thousands of photos of cats, and eventually it works things out. If it keeps misclassifying foxes as cats, you don’t rewrite the code. You just keep coaching it.
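The contrast Tanz describes can be made concrete with a toy example. Instead of writing rules for what makes a cat, we show a tiny linear model some labeled examples and nudge its weights toward the right answers when it errs. This is a deliberately simplified sketch of the training idea, not a real image classifier; the feature vectors are made up:

```python
# "Train, don't instruct": no explicit rules for cat-ness are written.
# A tiny perceptron adjusts its weights from labeled examples instead.
# Features are hypothetical two-number summaries, e.g. [whisker_score, ear_shape].

def train(examples, epochs=50, lr=0.1):
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for features, label in examples:  # label: 1 = cat, 0 = not-cat
            score = w[0] * features[0] + w[1] * features[1] + b
            pred = 1 if score > 0 else 0
            err = label - pred
            # On a mistake we don't rewrite any code -- we just
            # nudge the weights a little and keep "coaching".
            w[0] += lr * err * features[0]
            w[1] += lr * err * features[1]
            b += lr * err
    return w, b

def predict(w, b, features):
    return 1 if w[0] * features[0] + w[1] * features[1] + b > 0 else 0

# Made-up training data: cat-like examples score high on both features.
data = [([0.9, 0.8], 1), ([0.8, 0.9], 1), ([0.2, 0.3], 0), ([0.1, 0.2], 0)]
w, b = train(data)
print(predict(w, b, [0.85, 0.9]))  # a new cat-like example → 1
```

The point of the sketch is the shape of the workflow: the behavior lives in learned weights, not in human-readable instructions, which is why Tanz can say even the best coders can’t fully read what such systems have learned.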