New UT program puts focus on ethical AI, building robots to do ‘helpful tasks’
AUSTIN (KXAN) – How should society handle the ethical use of artificial intelligence and the design of robots? That is a question students at the University of Texas at Austin will soon address more closely.
A new program called CREATE (Convergent, Responsible, and Ethical AI Training Experience for Roboticists) offers graduate courses and professional development in responsible design and implementation.
“It aims to train the next generation of roboticists who can design ethical robots for different environments, whether at home, in the office or in a factory,” said Junfeng Jiao, associate professor at the UT School of Architecture and the program’s director. “To my knowledge, this will be one of the first programs in ethical AI and ethical robotics training in the U.S.”
The program will help students understand the positive and potentially negative effects of their creations.
CREATE is a collaboration between Texas Robotics, industry partners and Good Systems, a UT grand challenge research initiative that aims to develop AI technologies that benefit society. The program received a $3 million grant from the National Science Foundation through its Research Traineeship program, which will support 32 graduate students with coursework, mentoring, professional development, internships, and research and public service opportunities.
“If I’m on this side, I’m sure it will move around me,” said Professor Peter Stone, director of Texas Robotics. “The goal is for these robots to be able to do a lot more than just navigate the hallways, to be able to do helpful tasks.”
Do we have to fear a robot takeover?
“I think people shouldn’t be scared,” Stone said. “I think people should be excited, but we should also keep our eyes open and be aware of both the exciting opportunities and the risks. We want to open students’ eyes to who the technology benefits and who it could harm.”