
Job summary

Job ID: 45540940
Posting date: 28/03/2018
Company: University of Leeds
Location: UK-Yorkshire-Leeds
Industries: Education
Job type: Full time
Education level: Doctorate or specialist award
Salary: £32,548.00 - £38,833.00 per year
Contract type: Temporary, fixed-term contract
Job reference code: ENGCP1062

University of Leeds

Research Fellow in Robotics and Machine Learning, Faculty of Engineering

Job description

School of Computing, Faculty of Engineering

Are you interested in making robots better able to perform complex manipulations, particularly in cluttered environments? Would you like to work on an EPSRC-funded project under its Human-Like Computing initiative? Do you have the knowledge and experience to take data from humans manipulating objects in virtual environments and apply machine learning methods to extract rules that embody the strategies humans use to reach their goals? Would you like the challenge of then implementing these rules on a robot to test their efficacy?

This is one of two roles on an 18-month feasibility study entitled “Human-like physics understanding for autonomous robots”, investigating whether data garnered from how humans manipulate objects in cluttered environments can be used to improve robots’ abilities to do the same. State-of-the-art robot motion and manipulation planners use low-level probabilistic methods, often based on random sampling. This approach has two drawbacks: (1) it restricts robots to planning their motion at the bottom-most geometric level and, without any top-down guidance, this results in the limited object manipulation ability displayed by today’s intelligent robots; (2) it produces randomized motion that is not legible to humans, which limits robots’ ability to collaborate with them. By incorporating human-like decision making into robot planning, we aim to overcome these limitations and produce a fundamental step-change in the sophistication of these robots.

An example task we are considering is how to reach something at the back of a crowded fridge shelf; similar challenges arise in commercial settings, e.g. the Amazon Picking Challenge. We will start by exploring how humans perform such tasks in a VR setting, which will allow us to vary the task parametrically and extract data easily; this work will primarily be carried out by the holder of the other post we are currently recruiting to. We then plan to use symbolic machine learning techniques to extract rules, expressed using qualitative spatial representations, that capture the tacit knowledge humans have gained ontogenetically and phylogenetically. Finally, we plan to test the learned model in a robotic setting. We will be guided in our research by an advisory team from our three industrial partners: Ocado Technology, Dubit and Shadow Robot.

You will also contribute to a second EPSRC project entitled “Multi-Robot Manipulation Planning for Forceful Manufacturing Tasks”. Imagine coming home from a hardware store with planks of wood and working with a friend to build yourself a table. You will need to collaborate to perform operations such as cutting parts off, inserting nails, drilling holes, and screwing in fasteners. The goal of this second project is to get robots to perform similar manufacturing tasks. To do this, a robot team will need to decide how to grasp the workpieces (e.g. wood planks) and how to move to perform these operations. For a planning algorithm to make such decisions, it will need to solve geometric collision constraints, forceful stability constraints, and sequential temporal constraints simultaneously.

This second project is closely related to a new Marie Curie-funded Fellowship, and the post holder will collaborate with the Marie Curie Fellow on this work.

To explore the post further, or for any queries you may have, please contact:

Prof. Tony Cohn, School of Computing

Tel: +44 (0)113 343 5482
