ACM ASSETS 2021 publication

Our paper, “Disability-first Dataset Creation: Lessons from Constructing a Dataset for Teachable Object Recognition with Blind and Low Vision Data Collectors”, has been accepted for presentation at ACM ASSETS 2021, held virtually on 18–22 October. Read a pre-print at https://openaccess.city.ac.uk/id/eprint/26424/

ORBIT on BBC Click

ORBIT was featured on BBC Click on the International Day of Persons with Disabilities: https://www.bbc.co.uk/iplayer/episode/m000q5rw/click-a-vision-of-the-future

This episode explores the latest developments around accessibility and inclusion in tech. Blind reporter Lucy Edwards investigates how AI helps visually-impaired people identify people and objects with their phones, while Niamh Hughes looks at the strides made in gaming accessibility, through the prism of her own experiences.

Phase 1 data collection

Help us with research to build better AI!

Data collection for Phase 1 is now closed. Thank you for your interest.

Smartphones are helping blind users explore their world. For instance, Seeing AI uses Artificial Intelligence to analyse a picture you take with the camera. While these kinds of systems can recognise common items, for example “this is a mug”, they can’t tell whether it is your mug. They also don’t recognise items that are important to blind or low vision people, such as a white cane. We are trying to change this, but to do so we need more examples of things that are important to you. Help us!

We are looking for people who are blind or have low vision to use an iPhone app to take short videos of things that are important to them. This app will allow thousands of visually impaired people to contribute imagery to help train the kinds of AI behind these smartphone apps.

We are looking for adults (18 years or older) who:

  • Live in the United Kingdom
  • Are blind or have low vision
  • Speak English fluently
  • Have an iPhone
  • Do not have any cognitive impairment

Your participation would involve –

  • Filling in a background survey
  • Using an iPhone app to take videos of things you might want your phone to recognise

– and you’ll get a £50 Amazon voucher in appreciation of your time. You’ll also get an additional £10 for each person you refer to us who completes the study.

We will finish collecting data by 12 July 2020.

If you are interested in participating in the study please contact Lida Theodorou at Lida.Theodorou.2@city.ac.uk.

This study has been reviewed by and received ethics clearance through the Computer Science Research Ethics Committee, City, University of London.

If you would like to complain about any aspect of the study, please contact the Secretary to the Senate Research Ethics Committee on 020 7040 3040 or via email: Anna.Ramberg.1@city.ac.uk

City, University of London is the data controller for the personal data collected for this research project. If you have any data protection concerns about this research project, please contact City’s Information Compliance Team at dataprotection@city.ac.uk

Participants needed for research on meta-learning for personalised object recognition aimed at visually impaired people

Data collection for this pilot study has now closed. Thank you for your interest.

Smartphones are really useful in making visual information accessible to visually impaired people. For instance, the Seeing AI app lets you take a picture of your surroundings and then reads aloud the common things it recognises in the picture, for example “a person sitting on a sofa”. Seeing AI uses AI to recognise items. The problem is that these kinds of AI currently can’t tell you which of the things they recognise is yours, and they don’t recognise things that are important to blind users. We are trying to change this. Help us!

We are looking for people who are registered blind who we can visit at home, so you can show us what’s important to you and how you might use your phone to find it. This will inform the design of an iPhone app that will allow thousands of visually impaired people to contribute imagery to help train the kinds of AI behind these smartphone apps.

We are looking for adults (18 years or older) who:

  • Live in the United Kingdom
  • Are registered blind
  • Speak English fluently
  • Have an iPhone
  • Do not have any mobility or cognitive impairment

Your participation would involve –

  • Filling in a background survey
  • Spending up to two hours with us at home, where
    • We’ll talk to you about things you might want your phone to recognise
    • You can show us these things and how you use them
    • You will try to capture them using your iPhone’s camera app

– and you’ll get an incentive to the value of £50 in appreciation of your time.

We’re a group of researchers based at City, University of London. You can find out more about the project at our website: http://orbit.city.ac.uk. For more information about the study, please contact Lida Theodorou at Lida.Theodorou.2@city.ac.uk.

This study has been reviewed by, and received ethics clearance through, the Computer Science Research Ethics Committee (CSREC), City, University of London.

If you would like to complain about any aspect of the study, please contact the Secretary to the Senate Research Ethics Committee on 020 7040 3040 or via email: Anna.Ramberg.1@city.ac.uk

City, University of London is the data controller for the personal data collected for this research project. If you have any data protection concerns about this research project, please contact City’s Information Compliance Team at dataprotection@city.ac.uk

Centre for Human-Computer Interaction Design, City, University of London.