University Students Fooled By Robot TA

Washington: Imagine discovering someone you thought was human is, in fact, a robot.

It sounds like the stuff of science fiction. But that’s what happened to a class full of Georgia Tech students recently, when they learned that “Jill,” their teaching assistant, was actually a piece of software.

CBC Radio technology columnist Dan Misener explains what happened.

The story starts with a computer science professor named Ashok Goel, who teaches at the Georgia Institute of Technology.

For the past few years, he’s been teaching a graduate-level online course on artificial intelligence (AI). It’s a popular class. About 300 students enroll in the course every semester, and it’s run by Goel and eight teaching assistants.

About a year ago, he realized that his students were asking a lot of questions in the class’s online forum — questions about assignments, due dates, what was going to be covered in any given week, things like that.

“These 300 students generated something like 10,000 postings on the discussion forum,” he said. “That’s like receiving 100 emails every day.”

Not only did the students ask a huge number of questions, but different students would ask the same question over and over. It was repetitive.

So Goel decided to create a robot teaching assistant that would answer these questions. He named her “Jill Watson,” trained her on questions that human teaching assistants had already answered, and then listed her as a teaching assistant on the syllabus.

But Goel’s students only found out that Jill was an AI chatbot after the end of the term.

How did they keep students from finding out Jill was a robot?

Goel said the goal wasn’t to keep the AI a secret. They always intended to reveal that Jill wasn’t human. The question was when to make that reveal.

That said, Jill had a few things going for her that helped her avoid detection as a robot. First, this was an online course, so most of the interactions between students and teaching assistants happened through text, on an online discussion board. It’s much easier to pass yourself off as human if you never have to meet face-to-face with another human.

Also, Jill wasn’t overly formal. She used conversational language.

And finally, Goel and his assistants made sure Jill was good enough at answering questions before they unleashed her on the class’s online forum. Jill was programmed to only respond to a small subset of questions. And she was programmed to only respond when she was at least 97 per cent certain she could answer correctly.
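To make that last rule concrete, here is a minimal sketch of a confidence gate of the kind described above. The 97 per cent cutoff is the figure from the article; the function name, the idea of scored candidate answers and the example questions are illustrative assumptions, not details of Goel’s actual system.

```python
# Illustrative sketch only: respond when the best candidate answer clears a
# 97 per cent confidence threshold (the cutoff mentioned in the article).
# The upstream model that produces scored candidates is assumed, not specified.

CONFIDENCE_THRESHOLD = 0.97

def maybe_answer(scored_candidates):
    """Return the best answer only if its confidence clears the threshold.

    scored_candidates: list of (answer_text, confidence) pairs from some
    hypothetical question-answering model.
    """
    if not scored_candidates:
        return None  # nothing to say; a human TA handles the question
    best_answer, confidence = max(scored_candidates, key=lambda pair: pair[1])
    if confidence >= CONFIDENCE_THRESHOLD:
        return best_answer
    return None  # below 97 per cent certainty: stay silent

# The bot answers the first question but leaves the second to a human.
print(maybe_answer([("Assignment 2 is due March 3.", 0.99)]))
print(maybe_answer([("See the grading rubric.", 0.60)]))
```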

How did Jill learn to answer questions?

Jill Watson is a machine, so Goel’s team used an approach called “machine learning.”

First, they had to train Jill. Goel and his assistants took all the questions that students had asked in previous semesters. These had already been answered by human TAs, so it was a pretty good set of training data.

Initially, Jill’s performance wasn’t very good. According to one of Goel’s assistants, she’d get stuck on keywords in questions and give irrelevant answers. So they continued to train her on new student questions, and her performance gradually improved.

“By the end of the term, her performance had reached a level where we could let her loose into the discussion forum on her own,” Goel said.
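For readers who want a feel for how training on past question-and-answer pairs can work, here is a deliberately simple sketch: match a new question to the most similar previously answered one and reuse that answer. It uses off-the-shelf TF-IDF similarity from scikit-learn as a stand-in, with made-up course questions; the real Jill was built on IBM’s Watson platform, described next, not on this code.

```python
# Minimal stand-in for "learn from past semesters' Q&A": retrieve the most
# similar previously answered question and reuse its answer. Not Goel's system;
# the training pairs below are invented purely for illustration.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

past_questions = [
    "When is assignment 1 due?",
    "What topics are covered in week 3?",
    "Where do I submit the project proposal?",
]
past_answers = [
    "Assignment 1 is due Sunday at 11:59 p.m.",
    "Week 3 covers means-ends analysis and production systems.",
    "Submit the proposal through the course dropbox.",
]

vectorizer = TfidfVectorizer().fit(past_questions)
past_vectors = vectorizer.transform(past_questions)

def answer(new_question, min_similarity=0.5):
    """Reuse the answer to the closest past question, if it is close enough."""
    scores = cosine_similarity(vectorizer.transform([new_question]), past_vectors)[0]
    best = scores.argmax()
    if scores[best] < min_similarity:
        return None  # too dissimilar to anything seen before; leave it for a human TA
    return past_answers[best]

print(answer("when is the first assignment due"))
```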

Jill’s full name is Jill Watson, and that’s not by accident. The machine learning system Goel used was Watson, from IBM — the technology best known for beating humans at Jeopardy! It’s well-suited to these types of question-and-answer tasks.

How else could this technology be used?

Beyond building robot TAs (and competing on Jeopardy! several years ago), Watson has been used in lots of different ways. We’ve seen it used in financial planning, for clinical research trials in medicine, and in computer security.

But I think what’s most compelling about Jill Watson, the robot TA, is that it uses a combination of machine learning that gets better over time, alongside natural language processing — the ability to pull meaning from the way humans speak or write naturally.

We’ve seen a huge amount of buzz recently about conversational chatbots, and the underlying technologies of Jill Watson are very similar to the kinds of customer service bots and sales bots that companies hope we’ll start using on a regular basis.

The question, I think, when we interact with these types of robots is — will we know we’re interacting with robots?

Could robots put teachers out of work?

Goel said a lot of the media coverage of Jill Watson has played up that angle — robot teaching assistants taking jobs from human TAs.

But he also pointed out a long list of things that human teachers do — acting as mentors, coaching, tutoring, motivating. These are things Jill Watson can’t do, and may never be able to do.

“We do not think of Jill, ever, in our lab as replacing any human,” he said.

“Instead, we think of Jill as complementing humans, in such a way that where humans cannot go, Jill can go. And what humans do not want to do, Jill can automate.”

For example, answering the same question about when an assignment’s due — over and over and over.

Goel sees software like Jill as a way to free up humans to focus on what humans are best at. He’s so confident in this approach, he’s planning to run the program again in the next semester — although he plans to change Jill’s name.

So if you’re taking an online course anytime soon, and the instructor seems a bit “stiff” — there may be a reason for that.

CBC News
