
What is your opinion on kids using AI to write papers, letters, etc.?

Posted on 8/8/24 at 10:43 am
Posted by GetCocky11
Calgary, AB
Member since Oct 2012
53131 posts
Posted on 8/8/24 at 10:43 am
During the Olympics, this commercial kept playing. It's a Google AI commercial in which the dad asks the AI to write a fan letter for his daughter to send to her favorite athlete. Instead of teaching his kid to write, he lets AI do the entire job. I didn't like this. It feels like yet another overreliance on technology that will leave children dumber in the long run.
Posted by geauxtigers87
Louisiana
Member since Mar 2011
26030 posts
Posted on 8/8/24 at 10:46 am to
Big push in education to harness how it's used. We've spent a lot of time trying to find the line between letting students use it as a tool and preventing them from just cheating with it.
Posted by Raging Tiger
Teedy Town
Member since Jun 2023
935 posts
Posted on 8/8/24 at 10:46 am to
I just used it to make some emails grammatically correct and sound more professional/appealing. It has a place in the world. Saved me time.
This post was edited on 8/8/24 at 10:47 am
Posted by el Gaucho
He/They
Member since Dec 2010
56699 posts
Posted on 8/8/24 at 10:47 am to
It’s sad that they’re teaching kids to computer type when they should be learning to stack dimes

The whole American "computer" experiment has failed, and now all the people with fake email jobs are laid off; welders, plumbers, electricians, etc. are the only ones making money
Posted by el Gaucho
He/They
Member since Dec 2010
56699 posts
Posted on 8/8/24 at 10:49 am to
quote:

Big push in education to harness how it's used.

I’ve heard AI can write a children’s book about changing your gender in less than 30 minutes
Posted by StringedInstruments
Member since Oct 2013
19751 posts
Posted on 8/8/24 at 10:52 am to
There are various ways to approach AI, but it’s outsourcing intellectual thought to computer programmers from Silicon Valley, some of the most dehumanized NPCs imaginable.

I like AI as a way to think through things and bounce ideas off of. But I prefer to refine my writing through practice and revision, and I try to teach my students the same philosophy.

But the reality is that humans are mostly stupid and want ease and convenience to be their main goals in life. I think eventually LLMs will satisfy what most people want out of life, and (true) higher education will be reserved for elitists (intellectually and financially) to experience.
Posted by Odysseus32
Member since Dec 2009
8441 posts
Posted on 8/8/24 at 10:55 am to
I think it should be removed from schools until the 10th grade level.

I just used AI to summarize an analyst report for documentation yesterday. But I had to write correct prompts and understand what I was looking for, what was relevant, what was incorrect, what looked funky, etc.

I’m not discounting how much AI will be used in a professional setting in 10-15 years, but it means nothing when 60% of students can’t read or write.
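
For anyone wondering what that kind of workflow looks like, here is a minimal sketch in Python, assuming the openai package and an API key in the environment; the file name, model, and prompt wording are illustrative placeholders, not anything from the post.

# Minimal sketch: summarizing a report with a prompted LLM call.
# Assumes the `openai` Python package and OPENAI_API_KEY set in the environment;
# "analyst_report.txt", the model name, and the prompt text are hypothetical.
from openai import OpenAI

client = OpenAI()

with open("analyst_report.txt") as f:
    report = f.read()

response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[
        {"role": "system",
         "content": "Summarize analyst reports for internal documentation. "
                    "List key findings as bullets and flag claims that lack supporting data."},
        {"role": "user", "content": report},
    ],
)

# The summary still has to be checked by someone who knows the material.
print(response.choices[0].message.content)

As the poster notes, the output is only as good as the prompt, and a person who understands the source material still has to judge what comes back.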
Posted by Cheese Grits
Wherever I lay my hat is my home
Member since Apr 2012
58681 posts
Posted on 8/8/24 at 10:56 am to
No
Posted by Gifman
Member since Jan 2021
14756 posts
Posted on 8/8/24 at 10:57 am to
AI is great for thank you letters.
Posted by Jcorye1
Tom Brady = GoAT
Member since Dec 2007
74877 posts
Posted on 8/8/24 at 10:57 am to
I don't like it for a lot of things; it seems to take the humanity out of things like letters.
Posted by WaterLink
Baton Rouge
Member since Sep 2015
19563 posts
Posted on 8/8/24 at 11:02 am to
Maybe it's me getting into my old man "new thing scary" phase. But the whole AI thing makes me uncomfortable. Feels like humans are gradually bringing forth their own obsolescence.
Posted by BluegrassBelle
RIP Hefty Lefty - 1981-2019
Member since Nov 2010
103843 posts
Posted on 8/8/24 at 11:04 am to
We're seeing it on the professional end as well. I've heard of therapists using AI to write their session notes. I don't feel anywhere near comfortable enough inputting potential client identifiers to do something like that.

For general treatment planning it can be helpful, for generating potential interventions or something like that. I don't like the idea of going beyond that.

I also did a CEU last year that talked about using it to identify whether a client was suicidal. Basically you would open up the AI program to "listen" during the session, and it would potentially pick up cues that could suggest the client was suicidal. Again, I have bigger concerns about the security of doing something like that. And I think the program that was suggested was something like $30,000 a year, which is nowhere near worth it.
Posted by oVo
Member since Dec 2013
11983 posts
Posted on 8/8/24 at 11:06 am to
AI will be our rapture and the end of humanity.
Posted by SUB
Silver Tier TD Premium
Member since Jan 2009
23009 posts
Posted on 8/8/24 at 11:11 am to
It's an interesting question. On one hand, AI will be used in tandem with humans in the workforce, so it is important to learn how to use it. BUT, over-reliance on it, like you said, will be our downfall. I don't think the use of AI as an aid should be part of the curriculum until college. Kids should understand what AI is and how to use it, but when it comes to learning, that should all be 100% human driven. Kids need to train their brains so that they will understand how to responsibly use AI when they are adults. They shouldn't let AI do the heavy lifting in their education instead of their brains, because when they become adults, AI will be their brain. They won't have the ability to think for themselves.
Posted by RedHawk
Baton Rouge
Member since Aug 2007
9213 posts
Posted on 8/8/24 at 11:12 am to
I remember when my math teacher told me I wouldn’t have a calculator everywhere I go, so I’d better learn how to do math in my head or on a piece of paper. Oh wait…
Posted by SUB
Silver Tier TD Premium
Member since Jan 2009
23009 posts
Posted on 8/8/24 at 11:12 am to
quote:

I've heard of therapists using AI to write their session notes.


If the AI they are using is public-facing (like OpenAI), that could be a HIPAA violation.
Posted by RogerTheShrubber
Juneau, AK
Member since Jan 2009
281843 posts
Posted on 8/8/24 at 11:14 am to
Terrible idea. What's the point?
Posted by Funky Tide 8
Bayou Chico
Member since Feb 2009
54798 posts
Posted on 8/8/24 at 11:14 am to
It's just the next step in stunting the creativity and critical thinking of young people, making the future world increasingly boring and uninteresting.
This post was edited on 8/8/24 at 11:15 am
Posted by BluegrassBelle
RIP Hefty Lefty - 1981-2019
Member since Nov 2010
103843 posts
Posted on 8/8/24 at 11:15 am to
quote:

If the AI they are using is public facing (like OpenAI), that could be a HIPAA violation.


It's a really, really thin line, depending on what you input.

If you're writing in a general sense without client identifiers you can maybe get away with it.

That's not a line I'm willing to flirt with. Sadly, some are.
Posted by 1MileTiger
Denver, Colorado
Member since Jun 2011
1809 posts
Posted on 8/8/24 at 11:15 am to
I'm using ChatGPT as an integral tool for software engineering. Like the calculator, it's a tool that improves the workflow and efficiency of those who know how to use it. ChatGPT takes a lot of the repetitive work out of my job and speeds things up. That being said, you still have to be a skilled programmer to verify the work and tweak it to your specific needs.

It has replaced Stack Overflow and Google for me almost entirely. It's not going away, so might as well learn how to use it to improve your life and skill sets.
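
As a rough illustration of that "verify the work" step, the hypothetical Python snippet below shows AI-generated helper code being pinned down with a few assertions before it gets trusted; the function and the test cases are made up for the example, not code from the thread.

# Hypothetical example: verifying AI-generated code before relying on it.
# Suppose ChatGPT produced this helper to parse "key=value" config lines:
def parse_config(text: str) -> dict:
    pairs = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        key, _, value = line.partition("=")
        pairs[key.strip()] = value.strip()
    return pairs

# The programmer still writes the checks that pin down the intended behavior:
assert parse_config("a=1\nb = 2") == {"a": "1", "b": "2"}
assert parse_config("# comment\n\nname=LSU") == {"name": "LSU"}
assert parse_config("broken line") == {"broken line": ""}  # decide whether this edge case is acceptable

The tool speeds up the typing; deciding whether the behavior is actually right remains the human's job.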