Project Title: Transport Navigator

Help people navigate public transport options more effectively, particularly people with disabilities, for example through voice assistance.

Voice-assisted example:

Most people and households today have a connected device (Alexa, Google Assistant, Siri) on their phone or in their house. These devices aren't as smart as they could be, and they don't really understand public transport. This project is to create a voice-activated smart transport application and determine whether we can make these assistants understand our network, so that users can ask them questions and get real-world answers. For example:

"How long is the train ride from Porirua to Wellington?" (Answer: 26 minutes normally, or 15 minutes on an express train.)
"Where is the nearest bus stop to (my house, my office, an address, or a point of interest)?"
"What time do I need to leave my home to get to work using public transport?"

There are hundreds of questions, if not more, that could be asked, and it may be necessary to understand the intent of a question before answering it, since the same question can be phrased in multiple ways. The application could either be linked with the Metlink app (and understand your preferences and favourites) or be generic for anyone to use.
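As a rough illustration of the question-understanding step, the Python sketch below maps a spoken question to an intent name plus its slots using a few regular-expression patterns. The intent names, patterns, and slot names here are hypothetical; a production system would use a proper natural-language understanding service and answer from Metlink's open data.

    import re

    # Hypothetical intent patterns: each maps one family of phrasings to an
    # intent name plus named slots (origin, destination, place, ...).
    INTENT_PATTERNS = [
        ("journey_duration",
         re.compile(r"how long is the (?P<mode>train|bus) ride from "
                    r"(?P<origin>[\w' ]+) to (?P<dest>[\w' ]+)", re.I)),
        ("nearest_stop",
         re.compile(r"where is the nearest bus stop to (?P<place>[\w' ]+)", re.I)),
        ("departure_time",
         re.compile(r"what time do i need to leave (?P<origin>[\w' ]+) "
                    r"to get to (?P<dest>[\w' ]+)", re.I)),
    ]

    def classify(utterance):
        """Return (intent, slots) for the first matching pattern, else (None, {})."""
        for intent, pattern in INTENT_PATTERNS:
            match = pattern.search(utterance)
            if match:
                return intent, match.groupdict()
        return None, {}

    print(classify("How long is the train ride from Porirua to Wellington?"))
    # -> ('journey_duration', {'mode': 'train', 'origin': 'Porirua', 'dest': 'Wellington'})

In practice many different phrasings resolve to the same intent, which is exactly why the question has to be understood before it can be answered.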

Visually impaired people example:

Metlink contracted a company many years ago to install a “board” in all their RTI (real-time information) signs. This board uses an RF link along with a “fob” (like a garage remote control). When a blind person presses the button, the sign is supposed to read out the first line, or the first three lines if pressed again, so they know when their bus or train is going to arrive. The system was never installed and configured correctly, so to date we have a partially functioning system that we haven't let the public use. We really want to make it easy for blind people to travel using public transport, so something like this is essential.

The Options:

1. Attempt to fix what we have in place. Currently in play, but I'm not holding my breath.
2. Replace the current boards with a Bluetooth board option. Well over $1M in hardware costs, plus installation on top of that.
3. Install “buttons” on the poles with “speakers” that do the same. Again, a huge cost to install and configure.
4. Build an app that does this and give legally blind people a mobile phone with the app installed.

Potential Solution: Using GPS, determine where the person is, and using our open data platform, work out which stop (geofence) is closest to them. Extract the prediction information from the Open Data portal and read it aloud to the user. That is the basic essence, but there are some things that make it trickier:

- Some stops are in close proximity to others, so we need heuristics to determine where we think the person is going (e.g. they came into the city on the number 3 bus, so they are likely to go back out on the number 3). A sketch of the geofence lookup follows this list.
- The names of some stops, signs, and messages need to be pronounced correctly (Māori inflections, Wellington place names, and service messages).
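As a rough sketch of that geofence lookup (Python): compute the distance from the user's GPS fix to each stop, pick the closest stop inside a small radius, and build the announcement text for a text-to-speech engine. The stop coordinates, the 50 m radius, and the announcement wording are placeholder assumptions; the real stop list and predictions would come from the open data platform.

    import math

    def haversine_m(lat1, lon1, lat2, lon2):
        """Great-circle distance in metres between two WGS84 points."""
        r = 6371000.0  # mean Earth radius in metres
        phi1, phi2 = math.radians(lat1), math.radians(lat2)
        dphi = math.radians(lat2 - lat1)
        dlam = math.radians(lon2 - lon1)
        a = (math.sin(dphi / 2) ** 2
             + math.cos(phi1) * math.cos(phi2) * math.sin(dlam / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    def nearest_stop(lat, lon, stops, radius_m=50.0):
        """Return the closest stop within radius_m of the user, else None.

        stops: iterable of (stop_id, name, lat, lon). When several stops fall
        inside the radius, a real implementation would apply the heuristics
        above (e.g. prefer the route the user arrived on) to break the tie.
        """
        best, best_d = None, radius_m
        for stop_id, name, s_lat, s_lon in stops:
            d = haversine_m(lat, lon, s_lat, s_lon)
            if d <= best_d:
                best, best_d = (stop_id, name), d
        return best

    # Placeholder data: two hypothetical stops and one hypothetical prediction.
    STOPS = [("5000", "Wellington Station", -41.2790, 174.7806),
             ("5516", "Courtenay Place", -41.2937, 174.7832)]

    stop = nearest_stop(-41.2791, 174.7805, STOPS)
    if stop:
        # The announcement would be passed to a text-to-speech engine, with a
        # pronunciation lexicon for Māori and Wellington place names.
        print(f"You are at {stop[1]}. Number 3 bus arrives in 4 minutes.")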

Users

The primary end-users of this system are university staff and students, and the general public.