A law student talks about how she integrates AI programs, such as ChatGPT, into her studies and what risks she sees in doing so.
Editorial: Tell us a bit about yourself – how old are you, what are you studying, and how far along are you in your degree?
Student: I’m 26, studying law, and currently slogging through my 9th semester – right in the thick of the final stretch before my first state exam. My day-to-day right now? Flashcards, law books, and desperately trying to cram as much as I can into my brain.
Editorial: Are you already using artificial intelligence in your studies?
Student: I’ve tried out AI tools like ChatGPT, but I don’t use them on a regular basis. In law school, you have to be super precise, and AI is often too vague or gives answers that just aren’t legally correct. Especially when it comes to German law, outdated info or odd wording from AI can be a real problem. Still, I find it interesting to use AI every now and then – but strictly as a supplement, not a replacement.
Editorial: What kind of situations have you used AI for so far?
Student: Mostly for getting quick overviews or breaking down complex cases into simpler terms that anyone – even without a legal background – could understand. That sometimes helps me organize my own thoughts more clearly. I’ve also used AI for a bit of inspiration when it comes to structuring my outlines or arguments. But for actual casework or prepping for exams, it’s just too unreliable for me. I stick to commentaries, prep course materials, and traditional casebooks.
Editorial: Has AI changed the way you study?
Student: Honestly, not really. Law is a subject where you have to develop a really specific way of thinking and working – no AI can do that for you. To write a good law exam, you need to understand how the statutes are structured, know your definitions, be aware of academic debates, and apply all that cleanly to cases. Those are skills you only develop through hardcore practice and doing your own work. AI can sometimes give me a new perspective or help me keep track, but the core techniques – the craft – I have to learn myself. So my study routine is still all about classic methods: working through cases, making flashcards, and constantly wrestling with statutory texts.
Editorial: Which AI tools are you actually familiar with?
Student: Mostly just ChatGPT, to be honest – it’s the best known and easiest to access right now. I’ve heard of other AI platforms, but haven’t really tried them out in depth. For me, it’s enough to know one tool really well and know how reliable it is – and so far, ChatGPT is basically my only go-to.
Editorial: Overall, how do you feel about using AI in your studies?
Student: I’m kind of on the fence. On the one hand, AI can definitely be helpful – especially for getting a quick overview or ideas on how to structure something. That can save time, and for super-theoretical topics, it can even be motivating. But on the other hand, I worry that students might rely too much on AI and not train the skills that really matter. Law school isn’t just about knowing stuff, it’s about applying that knowledge in the right way. If you let AI take over that part out of convenience, you’re going to be lost when it comes to actually sitting the exam. So: great for supplementing, but definitely not a replacement.
Editorial: Do you think integrating AI into your field of study makes sense?
Student: In principle, yes – but with a lot of caveats. Law demands a level of precision that AI just can’t fully deliver yet. For organizing material, structuring your learning, or maybe drafting phrasing, AI can totally help. But when it comes to deep legal content – like proper statutory analysis, case law, and literature – AI quickly hits its limits. It gets really risky if AI gives wrong or outdated info and you don’t catch it. So I see AI as a helpful extra tool in small areas, but nowhere near a core study method for law school.
Editorial: What concerns or challenges do you see in using AI during your studies?
Student: My biggest worry is that a lot of students just accept whatever AI spits out without checking it. In law, that can be disastrous – one wrong element or sloppy phrasing and your entire argument falls apart. There’s also the issue of staying up to date – laws change, new cases come out all the time, and AI isn’t always current. Another thing is dependency: if you lean on AI too much, you lose the ability to think independently and develop arguments yourself. And then there’s the ethical aspect: where does legitimate support end and cheating start?
Editorial: How could AI actually improve the quality or effectiveness of your university work?
Student: AI would be most helpful for me if it could deliver really reliable legal information – ideally with up-to-date statutes, references, and clear sources. That would save me the time I spend double-checking every tiny detail. I’d also find it useful if AI could help structure my arguments – not giving the answers, but showing me possible approaches, so I can keep working on my own. AI could also help with revision – for example, by generating custom practice questions based on my previous mistakes. Features like that wouldn’t replace studying, but they’d definitely make it more efficient.
Editorial: Do you think it’s morally or ethically acceptable to use AI extensively for academic work?
Student: I think it really depends on how you define “extensively.” If AI is just used for support – brainstorming, organizing, simplifying explanations – I don’t see a problem. But it’s a different story if students get AI to write their entire essays and pass them off as their own. That’s not just against academic integrity, it also means you’re missing out on developing real skills. In law, so much depends on your own argumentation – outsource that and you’re not learning. So: AI is fine, but it needs to be used transparently and in moderation.
Editorial: How do you think AI use should be regulated or limited at university?
Student: I think we need clear, uniform guidelines – ideally set by the university. There should be strict rules about when AI is allowed and when it isn’t, like for essays, term papers, or of course, exams. I also think students should have to declare if they’ve used AI, similar to citing sources. That keeps things transparent about what’s your own work. Plus, it would be great if there were mandatory training so students can learn to question AI’s output and not just trust it blindly. That way, we can get the benefits without risking academic standards.
Editorial: Do you think AI increases or decreases the risk of plagiarism?
Student: Honestly, it can do both. On one hand, AI makes it super easy to spit out whole texts that students might copy without thinking – and some people think AI texts are always “plagiarism-proof” because they’re not copied directly from anywhere, but that’s not true. If all your ideas or structure come from the AI, it’s still not your own work. On the flip side, AI could actually help reduce plagiarism – for example, by helping phrase things differently or offering alternative ways to express something. At the end of the day, it all depends on how you use it: as a tool or as a shortcut.
Editorial: What would be the most important features of the ideal AI software for your studies?
Student: Top of my list is reliability – it needs to have up-to-date legal info, with proper sources. I’d really love a feature where I can narrow things down to German law and get direct references to relevant cases and commentaries. It shouldn’t just spit out answers, but also show different possible solutions and lines of argument. Also, having a mode like a private tutor that quizzes you, spots gaps in your knowledge, and creates personalized practice questions would be amazing. So basically: a mix of legal database, tutor, and smart learning tool.
Conclusion
After our conversation, one thing is clear: this law student isn’t an AI skeptic, but she’s definitely not putting blind faith in chatbots either. For her, AI is like a helpful classmate in the library – you might ask them how they’d approach a problem, but you’d still want to do the real work yourself. She knows law is all about practice – and no AI in the world can do that for you.
Her mantra: AI can have a seat at the desk, as long as it doesn’t try to take the place of the law book. And when things get tough? She’s going to trust her own head (and a good coffee) every time.
Would you like to save even more time and study more productively? Then our all-in-one study app Learnboost is perfect for you (start for free). It lets you create well-structured summaries and flashcards with AI at the push of a button, and Study Mode helps you memorize and review seamlessly. You can also ask questions and clarify complex topics directly with Learnboost's Tutor AI. Good luck with your exam preparation, memorization, and revision!
Learnboost is the only AI study app you'll ever need. Your all-in-one solution for more productive learning in no time.