Law blogs usually take notice when the New York Times, the Wall Street Journal, or Bloomberg News writes about the state of legal education. It is less common to take notice when undergraduate journalists do. Here is a story from the Pitt News, the University of Pittsburgh’s student-produced newspaper, that combines interest in “application numbers are way up” with “what about AI and legal education?” Consider it a blended take on how future law students are looking at the world of law schools and law practice.
There is a bit more, below the jump.
The student reporter did a nice job of contacting multiple faculty members and administrators for comment. I show up in the story, more on the “what about application numbers?” side and, somewhat to my chagrin, not on the “what about AI and legal education?” side. Like almost all law professors, I like to see my name in print (metaphorically as well as literally), so I am reproducing here the text of my comments on that question, emailed to the reporter but until now lost to posterity.
Coincidentally, like Paul Caron, I took note of Dan Rodriguez’s recent comments about law deaning in the age of AI, which strike me as complementary to my views.
Q (reporter):
What is your perspective on generative artificial intelligence with regard to law – do you believe it may enhance and aid student learning, or that it could be a cause for concern? Have you noticed students expressing concerns about how AI might redefine or impact tasks typically assigned to first-year associates?
A (me):
Generative AI has already made enormous inroads into the day-to-day practice of law, across essentially every field of law and every type of legal organization or law office, from solo practices in small towns and neighborhoods to global law firms, from small companies to enormous corporations, from prosecutors’ offices to state and federal administrative agencies, and nonprofit organizations of all sizes. There are few lawyers anywhere in the world whose professional lives have not been changed significantly by Generative AI.
Whether those changes are for the better or for the worse varies a lot by context: by field of law, by type of law practice, by expertise level of the lawyers involved. Senior lawyers are seeing their worlds change, too. Generative AI does not simply automate tasks that junior lawyers might have done on their own.
All of the changes are very early stage. Generative AI has only been in widespread public use since late 2022. Data analytics and automation technologies of other sorts were already becoming common by that time. Observers have been writing for at least a decade about the end of “classic” law practice. For a long time, that “futurology” seemed pretty radical and speculative. Now many of the predictions are coming true in real time. Most large law firms have either built or bought significant technology departments that blend Generative AI models and other IT platforms, and those firms look at those areas as major sources of profit separate from profit derived from traditional legal services.
Law students seem to approach Generative AI with the same range of belief and understanding that we see across lots of educational areas and fields of practice. Some students are horrified by the sense that Generative AI is being imposed on them as a practical matter, not by teachers but by the expectations of the profession that they are training for. Other students are enthusiastic about blending Generative AI with their other educational experiences, including internships and summer jobs, where Generative AI use is now common. Many students are somewhere in the middle, aware that Generative AI is changing both school and career paths but unsure as to what exactly to do about that.
Q (reporter):
Is ‘AI literacy’ becoming fundamental (or even beneficial) to a law degree? Is the Pitt Law curriculum evolving at all to reflect this? Have you noticed students asking for advice on how to build a “future-proof” resume that highlights their ability to work alongside legal tech and/or their ability to effectively use AI?
A (me):
Knowing something about Generative AI and its strengths and weaknesses is certainly something that all legal employers now expect from their new hires. I steer away from “AI literacy,” because no one truly knows what that phrase means. It is probably true already that a new law graduate who knows nothing about Generative AI is at a disadvantage in the job market relative to their peers. That means Generative AI knowledge is something closer to “fundamental,” as you write.
I have not heard from students asking about how to “future proof” a resume. Students today are quick enough, clever enough, and attentive enough to the job market to know, on their own, that Generative AI today is a form of “table stakes” in job searches. The additional thing to know here is that there is no such thing as “future proofing” a resume, for a law graduate (or an undergraduate, or a graduate of any other program). The technical side of the world is changing much too quickly. The future-oriented skills that new graduates need to focus on are not technical skills, or law-and-technology skills. They are social skills: emotional intelligence, collaboration, communication, project management, curiosity, and resilience. I was a “law and technology” graduate in 1987, when I got my law degree. At Stanford, in Silicon Valley. “Law and technology” then meant MS-DOS and computer software on 5 1/4″ floppy disks, and no public computer networks. My law firm in San Francisco installed its first computer network around 1990; I sent my first email on the Internet around 1993. I had to figure it out. New graduates always need to figure it out. New graduates should prioritize learning how to figure things out while they are still in school.
At Pitt Law, no, so far we have not made any changes to the curriculum to reflect the transformative effects of Generative AI on law practice. Some professors have made changes to their courses to embrace Generative AI as part of the student experience; other professors have taken steps to try to keep Generative AI out of students’ hands. I am not aware of any efforts to change our curriculum going forward. A couple of faculty members (including me) are trying to track how Generative AI is being used or addressed in our courses, but there is no plan to translate that information into new courses or new faculty or new areas of programming.