Computer science programs need to teach the law
Source: Jason Tashea
During the last few years, the pendulum has swung against internet-enabled technology companies. Once heralded as great equalizers and modernizers, the consequences—intended or otherwise—of these companies are under full review. In response to criticism, academic, corporate and non-profit ethics projects and programs have cropped up in an attempt to give computer science some humanity.
These projects are all a little different, but the thrust is that people across subject matter areas can create ethical structures for the development and application of technologies like artificial intelligence that will hopefully save us from reliving the moment we find ourselves in now. (Disclosure: the author is a member of the Law Committee for the IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems.) This surge of ethics programming is notable in computer science education, where the trend has caught fire.
Of course, these efforts are not without their detractors. In an era where many companies have failed to follow basic laws or even their own terms of service, what good are ethics? Ryan Calo, a law professor at the University of Washington School of Law, made the point on Twitter that “it is important to avoid the pull of ethical guidelines or principles of governance, over official policy and the authority of law.”
With this criticism in mind, and tracking the investment computer science programs are making in the ethics space, I wondered what it would look like when these departments realize it’s also worth teaching the law.
From: the dean
To: trustees; administration; cs faculty
Cc: corporate sponsors
Bcc: dark money trustees; shadow faculty
Date: Nov. 19, 2018 at 2:29 PM
Subject: New Computer Science Law Curriculum
Our university’s computer science department strives to provide a world-class education.
Naturally, our students learn about design, object-oriented programming and surveillance-capitalism architecture. But these courses are offered at any accredited program, and the field is changing.
The inability of major tech companies to protect their users’ data and control fake news and online abuse has made identity theft more widespread, fueled domestic hate crimes and even helped incite genocide in Myanmar. As the New York Times recently noted, these failings have created new pressures, and now “universities that helped produce some of Silicon Valley’s top technologists are hustling to bring a more medicine-like morality to computer science.”
It turns out being technically proficient at software creation and data manipulation can still—inexplicably—be evil.
The Times article went on to note that our rivals at Harvard University and the Massachusetts Institute of Technology are jointly offering a new course on the ethics of artificial intelligence. They are not alone: The University of Texas at Austin added a course titled “Ethical Foundations of Computer Science,” which it anticipates requiring for all majors.
We will not be left in the herd’s dust.
Constantly looking to give our students an edge in the job market, I’ve sat down with many of you to discuss the next phase of our educational offerings. Like these other computer science programs, we considered teaching ethics to our majors.
It wasn’t until a later interview that someone piped up and said, “Why teach ethics when there are laws?”
It was a true GE C-Life smart-lightbulb (with Bluetooth) moment™.
After many meetings with rigorous murmuring, doctoral throat clearing and looking over our glasses at each other, it was decided: The department would teach law, the only field that disrupts disruption.
Our students’ brains are swimming with startup ideas that will quickly pay off their student loans and make them generous alumni. To that end, our core curriculum remains unchanged, but starting with our next incoming class, they will also learn about laws. Upon graduation, our students will now have the insight to know if their startup is illegal, a major liability or a trundling human rights abuse.
Adding to their good fortune, Law for Computer Scientists will be taught by an excellent new addition to our adjunct faculty, an accomplished contracts attorney who can now pay off his student loans one month earlier.
The department is still trying to iron out some of the bugs—or are they features?—of the course. But classes may include:
The Internet: A Lawless Hellscape of Opportunity
“Privacy Policies are a Contract” and other phrases to say with a straight face
Why Cryptocurrency Won’t Keep You Out of Federal Court (Intro to Civil Procedure)
Your Social Media Platform/Search Engine/Online Marketplace Helped Spread Genocide/A Pandemic/Human Trafficking–Now What?
If-Then Jail (Intro to Criminal Law)
Until Robot Utopia, Labor Has Rights
I Don’t Care if You Love “Her,” Your Robot Can’t Vote
Even Online, Your Company Still Has to Pay Taxes
Your Crowdfunding Campaign is a Security and Telling Me About it is a Crime
How Class Actions Can Be a Check on Unbridled Corporate Power (Legal History)
3D Printed Guns and Originalism (Jurisprudence)
Congratulations, You’re a Unicorn: An Introduction to the Sherman Antitrust Act
Who Enforces Corporate Liability During the AI Apocalypse?
While these classes seem disparate, it is worth noting these diverse topics have a lot in common. For example, it turns out that most of the issues covered are “settled law” and apply to all businesses. Our students are bound to learn a lot. I did.
From Clippy to Pets.com to the Zune—all alumni contributions—our curriculum has provided cutting-edge know-how, so our students could land any job, anywhere, in any era.
Now, with this course, we remain the vanguard in computer science education. Already producing top technical talent, our students will continue to have a competitive edge in an ever-changing job market. This new curriculum will help them pipe up and save their employers headaches and money. No longer will startups have to blindly bring projects to market that may violate the law, individual privacy rights or a user’s due process.
Unless, of course, their VC tells them otherwise.