Editor’s Note: This article is the second in a series; the first explored the origins of generative AI and natural language processing.
As ChatGPT entered the scene, those working in education at all levels expressed much fear and uncertainty. Educators worry about cheating, and rightly so: ChatGPT can do everything from writing an essay in iambic pentameter to solving algebraic equations and explaining the solution. Some schools have rushed to ban it. Some educators take the threat so seriously that they have abandoned online assignments and reverted to paper and pencil. What should a teacher do? First, let’s look past the hype and hysteria and understand what ChatGPT does well and where it falls short.
In the evenings, I teach a focused Natural Language Processing (NLP) class called Computational Approaches to Human Language to graduate students at Duke University. I dedicated the second class of the semester to the elephant in the room: ChatGPT. With this huge disruptor hitting the scene, I have never been more grateful to be working in the NLP space. I typically don’t delve into large language models like GPT until closer to the midpoint of the semester, after students have a better grounding in how the landscape evolved. But this is not a normal semester, so jumping ahead felt important. Most of my students had heard of ChatGPT, and many had already played with it.
First, I walked through the evolution of NLP that culminated in ChatGPT, drawing on content from the first article in this series. What ChatGPT does best is communicate in human language: it retains memory of earlier turns within the same conversation, references the physical, emotional, and cultural experiences captured in its training data, and draws on the scientific and technical knowledge in that data to answer questions. Then we got into the things it is not so good at, like basic arithmetic. It can write a program for you that does arithmetic, but it is not great at math itself beyond two- or three-digit numbers.
I asked for the solution to a longish (but not too crazy) multiplication problem: What is 123456789 times 1234? ChatGPT returned 1,522,756,166. That’s not even close – the actual answer is 152,345,677,626. The takeaway is clear: this is not a calculator. It is a language model, not a tool built to solve equations. Sometimes someone who is good with words is not so good with numbers; we all have our strengths. But if your fifth grader asks ChatGPT to do their long division homework, they might want to double-check the answers.
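The arithmetic is easy to verify in a few lines of Python, which handles arbitrarily large integers natively:

```python
# Check the multiplication ChatGPT got wrong; Python ints have arbitrary precision.
a = 123456789
b = 1234
product = a * b
print(f"{product:,}")  # 152,345,677,626
```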
ChatGPT won’t ask follow-up questions or request clarification, and it certainly won’t admit when it doesn’t know something (see the multiplication above). It just answers confidently, and you may get into trouble if you don’t check the response.
ChatGPT is well trained in writing code, especially Python, so I gave my students an assignment.
- Create an account with OpenAI for ChatGPT.
- Ask ChatGPT to generate Python code to perform the action of your choice.
- Run the code and note any challenges or issues.
- Ask ChatGPT to refine or modify it (at least 3 iterations).
- Submit your code and a short note (no more than one page) about your experience, challenges and problems.
Games and basic operations
The 60 students in the class had a mix of positive and not-so-positive experiences. Students who asked ChatGPT to do simple things – create a Hangman or tic-tac-toe game, generate sorting algorithms, or make art from ASCII characters – had a pretty good experience. ChatGPT produced near-flawless code, or at least code that required very little intervention. It handled games like 2048, FizzBuzz, and Sudoku pretty well too, though they needed some fine-tuning.
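To give a sense of scale, the simple requests were on the order of this FizzBuzz sketch (my own minimal version, not a student’s ChatGPT transcript):

```python
# Minimal FizzBuzz: the scale of task ChatGPT coded with little or no intervention.
def fizzbuzz(n):
    words = []
    for i in range(1, n + 1):
        if i % 15 == 0:
            words.append("FizzBuzz")
        elif i % 3 == 0:
            words.append("Fizz")
        elif i % 5 == 0:
            words.append("Buzz")
        else:
            words.append(str(i))
    return words

print(" ".join(fizzbuzz(15)))
```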
Requests for more complex games, such as recreating Google’s Chrome dinosaur game, led to more frustration – mostly because of output limitations. ChatGPT seems to have a buffer of around 75 lines when generating code. Once it hits that limit, the generated code is cut off – sometimes mid-line. This was frustrating because there was no reliable way to get ChatGPT to produce the rest of the needed code.
Graphics
Students who asked ChatGPT to draw graphics were less fortunate, and on reflection, that’s not surprising: they were essentially asking the bot to render something graphically that humans could recognize. The renderings (except for the dog below) were still recognizable, but not quite right. ChatGPT isn’t human, so it stands to reason that it won’t capture some of the nuances in things we recognize instantly by sight.
For one student, ChatGPT created this clock using the turtle package. The student added a layer of difficulty by phrasing every request in Italian. ChatGPT handled the Italian prompts well, but the clock rendering needed help.
One student started with code that prompted the user for a basic shape to draw (circle, triangle, square). This worked decently. When they moved on to more complex shapes (tree, house, car), even the best code ChatGPT produced for drawing a house fell short.
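The students’ actual turtle code isn’t reproduced here; as a rough, display-free stand-in of my own, the basic-shape prompt amounts to something like this ASCII version:

```python
# Text-based stand-in for the basic-shape request: render a chosen shape with characters.
def draw(shape, size):
    if shape == "square":
        return "\n".join("*" * size for _ in range(size))
    if shape == "triangle":
        return "\n".join("*" * i for i in range(1, size + 1))
    raise ValueError(f"unsupported shape: {shape}")

print(draw("triangle", 3))
```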
Going back and forth with ChatGPT to generate a card for a fantasy trading-card game produced this result. It does look like a card, though it could be more sophisticated.
Asking ChatGPT for code to draw a turtle produced a surprisingly good turtle using the turtle package, and it did another decent job when asked to draw a turtle made of squares.
It didn’t do so well when asked to draw a dog (left) or a more realistic dog (right).
This output came from asking ChatGPT to generate code to draw a flower, then a more realistic flower – a clover. The result was not spectacular even after ChatGPT was asked to join the clover leaves to the stem.
Audio, app creation and more
Code generated to process audio files had mixed results. ChatGPT produced decent starter code, but it took a bit of debugging and reading documentation to meet the requirements.
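As one illustration of the kind of audio-file task involved (my own sketch, not a student’s actual assignment), Python’s standard-library wave module can write a short tone and read its parameters back:

```python
import math
import struct
import wave

# Write one second of a 440 Hz tone as a 16-bit mono WAV, then read its parameters.
RATE = 8000  # samples per second

with wave.open("tone.wav", "wb") as w:
    w.setnchannels(1)      # mono
    w.setsampwidth(2)      # 16-bit samples
    w.setframerate(RATE)
    frames = b"".join(
        struct.pack("<h", int(3000 * math.sin(2 * math.pi * 440 * t / RATE)))
        for t in range(RATE)
    )
    w.writeframes(frames)

with wave.open("tone.wav", "rb") as w:
    print(w.getnchannels(), w.getframerate(), w.getnframes())  # 1 8000 8000
```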
One student tasked ChatGPT with creating a mobile app. ChatGPT provided good guidance for brainstorming the app’s content, then went further to help lay out the steps involved in building the app and implementing the code.
Still other students tried to get ChatGPT to generate code to extract text, and found that it would embellish, adding text that was never requested or present in the source (in machine learning parlance, this is called hallucination).
Trying to get ChatGPT to generate a chatbot wasn’t terribly successful.
Several students had ChatGPT generate code to solve problems from LeetCode, a website where people practice coding problems and prepare for technical interviews. Interestingly, ChatGPT handled most of these requests well, with few interventions required.
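For context, a typical easy LeetCode-style problem that models handle cleanly is “two sum” – here is a standard dictionary-based solution (mine, for illustration, not a student’s submission):

```python
# Two sum: return the indices of two numbers in nums that add up to target.
def two_sum(nums, target):
    seen = {}  # value -> index of values scanned so far
    for i, x in enumerate(nums):
        if target - x in seen:
            return [seen[target - x], i]
        seen[x] = i
    return []  # no pair found

print(two_sum([2, 7, 11, 15], 9))  # [0, 1]
```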
Two students discovered that ChatGPT can generate haikus, but they are not as clever as haikus written by real people.
Ethical data acquisition
Several students’ attempts to get ChatGPT to generate code for scraping websites (both government and commercial) were successful, but the students received content-policy warnings. I explained to my class that just because you can scrape a site doesn’t mean it is legal or acceptable, and that is what the content-policy warning was trying to convey. Data is valuable, and scraping is explicitly a breach of the terms of service on many websites.
Student takeaways
Beyond the generated code, the students came away from this exercise knowing that using ChatGPT to do their work without putting in their own effort is not appropriate. Many reported that ChatGPT would return an incorrect answer, then apologize and return the same incorrect answer when challenged on it. Most of the generated code required some level of human intervention to run – sometimes it was as simple as missing package import statements, but sometimes there were significant syntax issues. Asked to take ChatGPT through at least three revisions, many students reported that the more revisions they made, the worse the code got. Sometimes ChatGPT would invent code or appear to simply make things up.
ChatGPT can be a real help with coding, especially for basic operations – it is always easier to start with something than with nothing. However, some students reported that for more complex tasks, it took longer to get ChatGPT’s code up and running than it would have taken to build it themselves from scratch.
The future of AI in education
AI is here to stay. As educators, we have a responsibility to help our students understand the pros and cons of all available technologies. We’re not yet at the point where any of us, students or otherwise, should trust AI to do all the work for us, but it can be an incredibly useful tool given the right set of circumstances. Rather than being quick to ban the technology, we should see it as a unique opportunity to influence the future of AI and the next generation of consumers.