[Photo of the ChatGPT home page]

ChatGPT: Danger to Learning or Opportunity for Efficiency?

Abigail Flynn
February 27, 2023

Universities around the world are facing fresh concerns brought on by the use of artificial intelligence, as newly launched programs like ChatGPT and Chatsonic allow users to enter a question and receive a structured essay in response. Ethically, is using these programs cheating the system or does it merely optimize efficiency? Logistically, if a professor wanted to ban the use of ChatGPT, is there a way to check for AI usage in their students’ work? 

Keep an eye out for AI-generated content in this article and see if you can spot where it was used.

What is AI-generated content?

[Cartoon: A robot says "Artificial Intelligence-generated content is created using artificial intelligence" and a person responds "That's incredibly unhelpful. Thanks!"]

AI generated content is created using Artificial Intelligence technology. This technology uses algorithms to analyze large sets of data and create content that is tailored to the specific requirements of a given project. AI technology is also capable of learning from past experiences, allowing it to continually improve its accuracy.

Programs like ChatGPT and Chatsonic use this AI technology to create a bot that can respond to questions a user asks it. To generate the paragraph above, I asked Chatsonic: “How does AI-generated content work?”

Take note: the paragraph uses incorrect capitalization, is quite vague, and its first sentence is repetitive (“Artificial Intelligence-generated content is created using Artificial Intelligence technology” is not helpful). Both the effectiveness of the program and the ethics of students using it are major concerns for instructors.

Are these programs allowed at AUC?

At the moment, AUC has not instituted any University-wide policy regarding the use of AI, instead allowing faculty to dictate its use on a case-by-case basis. While many of these programs are not accessible in Egypt, tech-savvy students can use virtual private networks (VPNs) to work around this restriction.

The University’s Center for Learning and Teaching (CLT) has been hosting community circle conversations to introduce this technology to faculty members.

“The community circle conversations aim to empower faculty with enough knowledge of what is possible and all the options they have available to them, whether they eventually choose to ban it, use it with caution or attribution, or embrace it and encourage transparency,” explains Maha Bali ‘01, professor of practice in CLT.

Will AI-generated content hurt learning?

There are some major risks of using AI-generated content. It could lead to plagiarism if students do not take the time to understand the content generated by AI and rewrite it in their own words. Furthermore, if students rely too heavily on AI generated content, this could lead to a lack of originality in their work. AI-generated writing could make students become too reliant on the technology and make them less likely to think critically and creatively. 

Unfortunately, there are no programs currently on the market that can reliably detect AI-generated content. According to Bali, AI text detectors are inaccurate and produce both false negatives and false positives. This means that students could use AI assistance without their professors knowing, making it difficult to prevent.

“I don’t think going after detection is the way to go, to be honest,” says Bali. “I’d rather encourage students to be transparent about their process of how they may have used AI so they can reflect on the value of using AI and see where it helped or hindered them.”

It could be useful, with proper training…

AI generated content could also have some benefits. For instance, AI can help students to get ideas on how to structure their essays, as well as providing them with an understanding of the structure of a well written essay. Additionally, it can provide students with a better understanding of the topic and even provide them with helpful resources to further their knowledge. 

"Similar to Wikipedia, AI tools can also provide students with a general overview about many different topics that may be unfamiliar to them, but then condense these topics into a distilled version for the common reader in a real-time response format," explains Meredith Saba, instructor in the Department of Libraries and Learning Technologies. "AI tools can cut writing and project time, generate notes faster and it can also help students improve their English reading, writing and communication abilities by modeling sentences and structures well."

However, reaping the benefits of AI-generated content requires proper training, according to Saba and other instructors. “For something like ChatGPT to be useful, someone needs to already have a lot of good knowledge about the subject because ChatGPT often makes up inaccurate information,” says Bali. “They also must already be a good enough writer, or else the writing will be generic and disjointed. I think students can learn how to harness ChatGPT by refining their writing prompts so that it produces better quality content.”

In the future, bosses may expect their employees to know how to use AI-generated content. “As a university we may want to consider where in the curriculum we consider this program to be especially useful as a marketable skill,” says Hoda Mostafa, professor of practice and director of CLT. “We must also aim to incorporate it without jeopardizing fundamental learning and intellectual skills.”

But will efficiency sacrifice learning? 

If students are trained to use AI correctly, it could substantially improve their efficiency. However, some argue that students in university should be learning how to write well without assistance. In the same way that students must learn the basics of mathematics by hand before using a calculator, critics of AI-generated content say college is the time to learn the basics through trial and error instead of using a crutch. 

“Students do not write an essay because the professor has a hobby of collecting essays; students write an essay because they need to become better writers and engage with the content of the lessons,” argues Mario Hubert, assistant professor in the Department of Philosophy, in an op-ed he wrote for CLT. “A university is not a place to find the easiest route to submitting an assignment; rather, a university is a place for mindbuilders.”

From Hubert’s perspective, students are meant to learn how to become good writers themselves during university. Perhaps AI assistance can be used after they’ve built this skill on their own.

Maybe the answer is simply “time and place”

Other instructors have expressed that AI-generated content has an appropriate time and place for use. A student in an introductory writing and rhetoric class should not be using the bot, for instance, since the point of the course is to build the skill set. However, it may be permissible in other courses. 

“I think it might be appropriate to use AI in advanced courses where ‘writing’ is not the main learning outcome, and the AI can help students write faster — when they’ve already done the hard work itself of doing an experiment in the lab or researching a topic, and they’re just using the AI to help them put it together,” says Bali.

But wait… is AI-generated content problematic?

In addition to inaccurate information, disjointed writing and the loss of a student’s authorial voice, AI-generated content may also end up regurgitating problematic perspectives from the data it draws on.

“Much of this data is skewed toward Anglo/Western culture and ways of thinking and can therefore reproduce hegemonic knowledge structures in the world,” states Bali. “It is important to remember that ChatGPT is only building on data it has already seen before and synthesizing it into new content based on the prompt. So it will not produce anything extremely creative — just a synthesis of the creativity of other humans over time.”

If AI-generated content is utilizing skewed data, users will have to be careful to ensure they are conveying thoughtful and nuanced perspectives when being assisted by these bots. 

Did you spot the AI? 

Personally, I found the AI-generated content to be bland, disjointed and inconsistent in voice. I often felt like I was editing a freelance writer’s first draft, a writer who I probably would not hire again. You’ll find that I only used the bot’s writing for three paragraphs; I did try to use it more, but I could not get the bot to produce interesting content. 

To be fair, as Bali and Mostafa point out, I have never been trained to use these programs. Perhaps a more experienced individual would be able to coax more impressive content from the bots. The following paragraphs were lifted from Chatsonic with no editing or revision. Did you spot them?


AI generated content is created using Artificial Intelligence technology. This technology uses algorithms to analyze large sets of data and create content that is tailored to the specific requirements of a given project. AI technology is also capable of learning from past experiences, allowing it to continually improve its accuracy. 

There are some major risks of using AI-generated content. It could lead to plagiarism if students do not take the time to understand the content generated by AI and rewrite it in their own words. Furthermore, if students rely too heavily on AI generated content, this could lead to a lack of originality in their work. AI-generated writing could make students become too reliant on the technology and make them less likely to think critically and creatively. 

AI generated content could also have some benefits. For instance, AI can help students to get ideas on how to structure their essays, as well as providing them with an understanding of the structure of a well written essay. Additionally, it can provide students with a better understanding of the topic and even provide them with helpful resources to further their knowledge.
