I wrote an email to an acquaintance who is a lawyer, advising them on the most pressing issues that a lawyer should deal with in a world being rapidly changed by AI, cryptocurrency, and other technologies. Here is an edited version of it.
Have you studied more about AI? Kai-Fu Lee wrote a new book, AI Superpowers, which is worth a read. You can read and watch a presentation here, and an interview with Wharton here.
There will be a very strong market for lawyers proficient in the legal issues around AI and other emerging technologies, and I think you could benefit from learning more. Here are some problems to think about:
- Income inequality is increasing worldwide due to AI. Read a piece by Kai-Fu Lee to see why. How should laws change so that those who lose their jobs don't sink into poverty? Universal basic income? Free education? Unemployment benefits?
- If a self-driving car kills someone, who is legally responsible?
- If a self-driving car senses that its brakes have failed and a group of people is crossing in front of it, should it crash into the crowd, killing three, or swerve into the side of the road, killing the one person in the car?
- How should the law treat Bitcoin and other cryptocurrencies?
- What legal tasks can AI perform? What would lawyers do instead? Watch this TED talk. There are already companies like Kira that do legal work with AI, and their capabilities will only grow.
- Algorithmic discrimination: what should be done when algorithms produce discriminatory outcomes? See this and this.
My own answers would be roughly:
- I haven't read much economics, so I can't speak to the details, but I believe in a combination of UBI, education, and a cultural shift that places less merit on hard work itself (which encourages people to seek work merely for cultural approval, even when it is otherwise meaningless) and more on personal improvement and hobbies.
- Practically, the manufacturer would be. This would encourage manufacturers to produce safer cars. Ethically, I honestly don't know; perhaps nobody would be responsible. Humans find it hard to understand that the concept of ethical responsibility can't be applied to everything, and feel an ethical need to blame someone. But as time goes on, they may have to accept that ethical responsibility isn't that useful a concept here.
- I am undecided on this issue. All else being equal, people want to buy cars that protect the people inside them, but want utilitarian cars to be popular so that everyone is safer on average. This is a classic tragedy of the commons, and it already has a precedent in cars: in a collision between an SUV and a sedan, the SUV protects its occupants at the expense of the sedan's, creating another tragedy of the commons.
- Perhaps a more libertarian approach would best suit the potential of cryptocurrency, but I foresee that governments, which cannot assert power over the currency itself (a 51% attack on Bitcoin is beyond the means of even the Chinese government), will simply assert power over its users. This would constrain cryptocurrency into roles less innovative than what it could achieve, which annoys me, and I see no real way around it.
- As in many other jobs, the top lawyers get better productivity. Routine lawyers either get automated out of their jobs or improve into better lawyers.
- The problem seems to have two parts. One is technical: constraining algorithms so they are not allowed to learn from certain features, and finding a way to formalize equality as a goal in the face of factual inequality. The other is social: people actually figuring out what kind of equality they want.
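On the 51% attack point above: the Bitcoin whitepaper (section 11) gives the probability that an attacker controlling a fraction q of the total hash power ever catches up from z blocks behind. A short sketch of that calculation, directly following the whitepaper's formula:

```python
import math

def attacker_success_probability(q: float, z: int) -> float:
    """Probability that an attacker with fraction q of total hash power
    ever catches up from z blocks behind (Bitcoin whitepaper, sec. 11)."""
    p = 1.0 - q
    if q >= p:
        return 1.0  # with a majority of hash power, catching up is certain
    lam = z * (q / p)
    prob = 1.0
    for k in range(z + 1):
        poisson = math.exp(-lam) * lam**k / math.factorial(k)
        prob -= poisson * (1.0 - (q / p) ** (z - k))
    return prob

# A 10%-hash-power attacker, 5 blocks behind, almost certainly fails:
print(attacker_success_probability(0.1, 5))  # ≈ 0.0009
# A 51% attacker always succeeds eventually:
print(attacker_success_probability(0.51, 5))  # 1.0
```

The sharp cutoff at 50% is why the cost of a 51% attack is measured in total network hash power, which is what puts it beyond the reach of even state actors.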
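One illustration of what "formalizing equality as a goal" can mean: demographic parity, which asks whether an algorithm approves different groups at the same rate. A minimal sketch (the decision data below is made up for illustration):

```python
def positive_rate(decisions: list[int]) -> float:
    """Fraction of decisions that were positive (1 = approved)."""
    return sum(decisions) / len(decisions)

def demographic_parity_gap(group_a: list[int], group_b: list[int]) -> float:
    """Absolute difference in approval rates between two groups.
    Zero means the algorithm approves both groups at the same rate."""
    return abs(positive_rate(group_a) - positive_rate(group_b))

# Hypothetical loan decisions for two demographic groups:
group_a = [1, 1, 0, 1, 1, 0, 1, 1]   # 75% approved
group_b = [1, 0, 0, 1, 0, 0, 1, 0]   # 37.5% approved
print(demographic_parity_gap(group_a, group_b))  # 0.375
```

Note that this is only one of several competing fairness definitions, and they can conflict with each other; choosing among them is exactly the social part of the problem.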