
This is the way it should be. AI to speed up the understanding process, and one final evaluation without any help to cement the understanding.



I don't think the final evaluation is to "cement the understanding" so much as _verify_ that students have taken accountability for their own learning process.

^ This

This is what a student who truly wants to learn, rather than simply complete a course or certification, would do: use AI tools to explain and learn, but not outsource the learning process itself to the tools.


> AI to speed up the understanding process

What’s your hypothesis of how AI can accelerate how your brain understands something?


Quick, easy access to explanations and examples on complex topics.

In my case, learning enough trig and linear algebra to be useful in game engine programming and rendering has become a lot easier and more efficient.

The same way Google or Wikipedia enables learning.
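For what it's worth, the kind of game-math question an LLM explains well is small and mechanically checkable. A minimal sketch (my own illustrative example, not something from this thread) of the standard 2D rotation matrix in Python:

```python
import math

def rotate_2d(x, y, angle_rad):
    """Rotate the point (x, y) about the origin by angle_rad radians,
    using the 2D rotation matrix [[cos, -sin], [sin, cos]]."""
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (c * x - s * y, s * x + c * y)

# Rotating (1, 0) by 90 degrees lands on (0, 1), up to floating-point error.
rx, ry = rotate_2d(1.0, 0.0, math.pi / 2)
print(rx, ry)
```

The nice thing about this domain is you can always verify the explanation by plugging in a point whose answer you already know.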


> What’s your hypothesis of how AI can accelerate how your brain understands something?

What are your beliefs/hypotheses of how having a human teacher can help you understand something?

AI explanations are no longer terrible garbage. The LLM might not be doing original research, but it has definitely read the textbook. :/ And 1000 related works.

You shouldn't believe the LLM when it tells you how to micro-optimize your code, but you can take suggestions as a starting point and verify them.


> What are your beliefs/hypotheses of how having a human teacher can help you understand something?

One precondition is that a human teacher can challenge you independently of your own self-control or will.


This is a bad case of whataboutism (I hate this word, but it describes the answer you gave). What do you mean by accelerating understanding? Maybe they are good as suggestion engines, but it is far too early to make the claim you did.

I use them every day to learn things.

They're a lot more than "suggestion engines". They can reason with you, show you examples, and tell you how to dig deeper and verify what they're saying.


I have some success with this method: I try to write an explanation of something, then ask the LLM to find problems with the explanation. Sometimes its response leads me to shore up my understanding. Other times its answer doesn’t make sense to me and we dig into why. Whether or not the LLM is correct, it helps me clarify my own learning. It’s basically rubber duck debugging for my brain.

An AI is never going to make me feel bad for asking a "stupid" question, nor will it ever behave as though I have to earn the right to ask a sophisticated one. The answer is, in some cases, almost immaterial. Building in students the willingness to ask questions accelerates learning.

Does one still need to read, take notes, write, act, etc.? Yes. The feeling of newfound knowledge that comes from reading or watching a virtuoso demonstration is always a mirage; I suspect most of us remember watching someone do algebra on the board, then having that feeling of understanding melt away later when faced with the page. That's not changing, so far as I can tell.

But we should acknowledge that allowing students to ask questions at will, even ill-posed questions, might have some value.


I disagree. I think we should treat AI tools like calculators for the exam.


