Idk, I think we're back to "it depends on how you use it". Once upon a time, the same was said of the internet in general, because people could just go online and copy and paste shit and share answers, but the internet can also be a really great educational resource. I think that using LLMs in non-load-bearing, "trust but verify" type roles (study buddies, brainstorming, very high-level information searching) is actually really useful. One of my favorite uses of ChatGPT is when I have a concept so loose that I don't even know the right question to Google; I can just kind of chat with the LLM and refine it into a narrower, more googleable question.
Something I think you neglect in this comment is that yes, you're using LLMs in a responsible way. However, this doesn't translate well to school. The objective of homework isn't just to reproduce the correct answer. It isn't even to reproduce the steps to the correct answer. It's for you to learn the steps to the correct answer (and possibly the correct answer itself), and the reproduction of those steps is a "proof" to your teacher/professor that you put in the effort to do so. This way you have the foundation to learn other things as they come up in life.
For instance, if I'm in a class learning to read latitude and longitude, the teacher can give me an assignment to find 64° 8′ 55.03″ N, 21° 56′ 8.99″ W on the map and write down where it is. If I want, I can just copy-paste that into OpenStreetMap right now and see what horrors await, but to actually learn, I need to manually track down where that is on the map. Because I learned to use latitude and longitude as a kid, I can verify what the computer is telling me, and I can imagine in my head roughly where that coordinate is without a map in front of me.
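(Side note: if you ever want to sanity-check your manual reading, converting degrees/minutes/seconds into the decimal degrees that map sites expect is just arithmetic. A minimal Python sketch, with the coordinate above plugged in as an example; the function name is just for illustration:)

```python
# Minimal sketch: convert a degrees/minutes/seconds (DMS) reading into
# the decimal degrees that OpenStreetMap's search box understands.

def dms_to_decimal(degrees: int, minutes: int, seconds: float, hemisphere: str) -> float:
    """Southern and western hemispheres are negative by convention."""
    sign = -1 if hemisphere in ("S", "W") else 1
    return sign * (degrees + minutes / 60 + seconds / 3600)

# The coordinate from the assignment above:
lat = dms_to_decimal(64, 8, 55.03, "N")
lon = dms_to_decimal(21, 56, 8.99, "W")
print(f"{lat:.6f}, {lon:.6f}")  # 64.148619, -21.935831 -- paste into OpenStreetMap
```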
Learning without cheating lets you develop a good understanding of what you: 1) need to memorize, 2) don't need to memorize because you can reproduce it from other things you know, and 3) should just rely on an outside reference work for whenever you need it.
There's nuance to this, of course. Say, for example, that you cheat to find an answer because you just don't understand the problem, but afterward, you set aside the time to figure out how that answer came about so you can reproduce it yourself. That's still, in my opinion, a robust way to learn. But that kind of learning also requires very strict discipline.
Your example at the end is pretty much the only way I use it to learn. Even then, it's not the best at getting the right answer. The best thing you can do is ask it how to handle a problem you already know the answer to, then learn the process of getting to that answer. Finally, you can try a different problem and see if your answer matches the LLM's. Ideally, you can verify the LLM's answer.
So, I'd point back to my comment and say that the problem really lies with how it's being used. For example, everyone's been in a position where the professor or textbook doesn't seem to do a good job explaining a concept. Sometimes an LLM can be helpful in rephrasing or breaking down concepts; a good example is that I've used ChatGPT to explain the low-level mechanics of how greenhouse gases trap heat and raise global mean temperatures to climate skeptics I know, without just dumping academic studies in their lap.
And just as it was back then, the problem is not with people using the tool to actually learn and deepen their understanding. It's with people blatantly cheating and knowing nothing because they don't even read the thing they're copying down.
To add to this, how you evaluate the students matters as well. If the evaluation can be too easily bypassed by making ChatGPT do it, I would suggest changing the evaluation method.
Imo a good method, although demanding for the tutor, is oral examination (maybe in combination with a written part). It allows you to verify that the student knows the stuff and understood the material. This worked well in my studies (a science degree), not so sure if it works for all degrees?
I might add that a lot of the college experience (particularly pre-med and early med school) is less about education than a kind of academic hazing. Students are saddled with enormous amounts of debt and crushing volumes of work, and are put into classes where only X% of the group can move forward on any terms (because the higher-tier classes don't have the academic staff/resources to train a full freshman class of aspiring doctors).
When you put a large group of people in a high stakes, high work, high competition environment, some number of people are going to be inclined to cut corners. Weeding out people who "cheat" seems premature if you haven't addressed the large incentives to cheat, first.
Medical school has to hold a higher standard, and any amount of cheating will get you expelled from most medical schools. Some of my classmates tried to use ChatGPT to summarize things so they could study faster, and it just meant they got things wrong because they firmly believed the hallucinations and bullshit. There's a reason you have to take the MCAT to be eligible to apply to medical school, two board exams to graduate medical school, and a third board exam after your first year of residency. And there are also board exams at the end of residency for your specialty.
The exams will weed out the cheaters eventually, and usually before they get to the point of seeing patients unsupervised, but if they cheat in the classes graded on a curve, they're stealing a seat from someone who might have earned it fairly. In the weed-out class example you gave, if there were 3 cheaters in the top half, that means students 51, 52, and 53 are wrongly denied the chance to progress.
No. There will always be incentives to cheat, but that doesn't excuse academic dishonesty. There is no justification.
Except I find that the value of college isn't just the formal education; it's also an ordeal to overcome, which causes growth in more than just knowledge.
> an ordeal to overcome which causes growth
That's the traditional argument for hazing rituals, sure. You'll get an earful of this from drill sergeants and another earful from pray-the-gay-away conversion therapy camps.
But stack-ranking isn't an ordeal to overcome. It is a bureaucratic sorting mechanism with a meritocratic veneer. If you put 100 people in a room and tell them "50 of you will fail", there's no ordeal involved. No matter how well the 51st candidate performs, they're out. There's no growth included in that math.
Similarly, larding people up with student debt before pushing them into the deep end of the career pool isn't about improving one's moral fiber. It is about extracting one's future surplus income.
> That's the traditional argument for hazing rituals, sure.
That's a strawman argument. There are benefits to college that go beyond passing a test. Part of it is gaining leadership skills by practicing being a leader.
> But stack-ranking isn't an ordeal to overcome.
No, but the threat of failure is. I agree that there should be more medical school slots, but there's still value in having failure be an option. Those who remain gain skills in the process of staying in college, and schools can take a risk on more marginal candidates.
> Similarly, larding people up with student debt before pushing them into the deep end of the career pool isn't about improving one's moral fiber.
Yeah, student debt is absurd.
As a college instructor, I can say there is some amount of content (facts, knowledge, skills) that is important for each field, and how much of that content will be useful in the future varies wildly from field to field (edit: and by whether you actually enter a career related to your degree).
However, the overall degree you obtain is supposed to say something about your ability to learn. A bachelor's degree says you can learn and apply some amount of critical thought when provided a framework. A master's says you can find and critically evaluate sources in order to educate yourself. A PhD says you can find sources, educate yourself, and apply that information to a research situation to learn something no one has ever known before. An MD/engineering degree says you're essentially a mechanic or a troubleshooter for a specific piece of equipment.
edit 2: I'm not saying there's anything wrong with MDs and engineers, but they are definitely not taught to use critical thought and source evaluation outside of their very narrow area of expertise, and their opinions should definitely not be given any undue weight. The percentage of doctors and engineers who fall for pseudoscientific bullshit is too fucking high. And don't get me started on pre-meds and engineering students.
Well, this just looks like the criteria for a financially successful person.
Even more concerning, their dependence on AI will carry over into their professional lives, effectively training our software replacements.
While eroding the body of actual practitioners that are necessary to train the thing properly in the first place.
It's not simply that the bots will take your job. If that were all, I wouldn't really see it as a problem with AI so much as a problem with using employment to allocate life-sustaining resources.
But if we’re willingly training ourselves to remix old solutions to old problems instead of learning the reasoning behind those solutions, we’ll have a hard time making big, non-incremental changes to form new solutions for new problems.
It’s a really bad strategy for a generation that absolutely must solve climate change or perish.
How people think I use AI:
"Please write my essay and cite your sources."

How I use it:
"Please make the autistic word slop I already wrote into something readable for the neurotypical folk. Use simple words, make it tonally neutral, stop using em-dashes, headers, and lists, and don't mess with the quotes."
God, I am sick of seeing em-dashes, but I'm so glad it helps me filter out AI slop on certain subreddits.
I'm a human I swear - I've been writing like this all along!
See, if you just use a hyphen like that, I assume you're human.
No Child Left Behind already stripped it from public education...
Because there were zero incentives for a school performing well, and serious repercussions if a school failed multiple years, the worst schools had to focus only on what was on the annual test. The only thing that mattered was that year's scores, so that was the only thing that got taught.
If a kid got it early, they could be largely ignored so the school could focus on the worst performers.
It was teaching to the lowest common denominator, and now people are shocked the kids who spent 12 years in that system don't know the things we stopped teaching 20+ years ago.
Quick edit:
Standardized testing is valuable. For lots of rural kids, getting 99*'s was how they learned they were actually smart, and not just smart for their tiny schools.
The issue with No Child Left Behind was the implementation and the demand for swift responses to institutional problems that had been developing for decades. It's the only time moderates and Republicans agreed to do something fast, and it was obviously something that shouldn't have been rushed.
One of the worst parts of that policy was that some states had both "meets standards" and "exceeds standards" ratings, and the high school graduation test was offered five times, starting in sophomore year.
So, you would have students getting "meets standards" in sophomore year and blowing off the test in later attempts because they had already passed. You would then have school administrators punishing students for doing this, since their metrics included the number of students who got "exceeds standards".
I literally just can't wrap my AuDHD brain around professional formatting. I'll probably use AI to take the paper I wrote while ignoring archaic and pointless formatting rules and force it into APA or whatever. Feels fine to me, but I'm not going to have it write the actual paper or anything.
AFAIK those only help the instructor with grading, since they put all the essays they need to review on a (more or less) even playing field. I've never really seen any real use for them in the professional world outside of scholarly/scientific journals.
My opinion is that they tend to stifle creativity of expression and the evolution of our respective languages.
don't worry, you can become president instead
Cries in "The Doctor" from Voyager.
The Doctor would absolutely agree. He was intended to be a short-term assistant when a doctor wasn't available, and he was personally affronted when he discovered that he wouldn't be replaced by a human in any reasonable amount of time.
Using AI doesn't remove the ability to fact check though.
It is a tool like any other. I would also be wary of doctors using a random medical book from the 1700s to write their thesis and taking it at face value.
If we are talking about critical thinking, then I would argue that students using AI to counter the very obvious shift most instructors have made (that being the use of AI as much as possible to plan out lessons, grade, and verify sources... you know, the job they are being paid to do, which, by the way, was already being outsourced to whatever tools they had at their disposal; no offense, TAs) is a natural progression.
I feel it still shows the ability to adapt to an ever-changing landscape.
Isn't that what the hundred-thousand-dollar piece of paper tells potential employers?