Building a driverless future: The next step in autonomous vehicles
Once purely the preserve of science fiction, the driverless vehicle is increasingly a reality and there can be little doubt that it is coming to a street near you – and sooner than you might imagine.
The most high-profile example of an autonomous vehicle is the Google car, which combines sensors and software with highly accurate digital maps to locate itself in the real world. GPS is used, just like the satellite navigation systems in most cars, to get a rough location of the car, at which point radar, lasers and cameras take over to monitor the world around it. Google has also announced a project to create driverless transportation 'pods' designed to operate safely and autonomously without human intervention. "They won't have a steering wheel, accelerator pedal, or brake pedal because they don't need them," says Google's self-driving car project director, Chris Urmson, on the firm's blog.

But, while Google may be attracting the lion's share of the headlines, it is by no means the only kid on the autonomous block. There are any number of projects around the world exploring future transport technology. One of these is the UK-based Low Carbon Urban Transport Zone (LUTZ) Pathfinder project, which will see electric-powered pods trialled in Milton Keynes next year. Coventry-based RDM Group will make the futuristic pods, having been appointed by the Transport Systems Catapult (TSC), which is project-managing the enterprise. RDM and the TSC are working with Oxford University's Mobile Robotics Group (MRG) to create three electric-powered pods for the LUTZ Pathfinder programme, which is studying the feasibility of autonomous, on-demand vehicles. The self-driving pods will carry up to two passengers and, with a top speed of around 12 kph (7 mph), the pavement-based pods are intended to increase the number of mobility options available to the public, while also reducing congestion and carbon emissions.
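The two-stage approach described above – a coarse satellite fix, refined by on-board sensors measuring the world directly – can be sketched in miniature. This is a toy illustration, not Google's actual software: the landmark coordinates, tile size and brute-force search are all invented for the example.

```python
import math

def coarse_fix(gps_reading, tile=10.0):
    """Snap a noisy GPS reading to the nearest map tile (coarse position)."""
    x, y = gps_reading
    return (round(x / tile) * tile, round(y / tile) * tile)

def refine(coarse, landmarks, ranges, radius=15.0, res=0.1):
    """Brute-force search around the coarse fix for the point whose
    distances to surveyed landmarks best match the measured ranges
    (standing in for the radar/laser/camera refinement step)."""
    cx, cy = coarse
    best, best_err = coarse, float("inf")
    steps = int(2 * radius / res)
    for i in range(steps + 1):
        for j in range(steps + 1):
            x, y = cx - radius + i * res, cy - radius + j * res
            err = sum((math.hypot(x - lx, y - ly) - r) ** 2
                      for (lx, ly), r in zip(landmarks, ranges))
            if err < best_err:
                best, best_err = (x, y), err
    return best

# Three surveyed landmarks; the vehicle's true position is (3, 4).
landmarks = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
ranges = [5.0, math.sqrt(65), math.sqrt(45)]  # exact ranges from (3, 4)

# A GPS fix several metres off still lands the search in the right tile,
# and the sensor ranges then pin the position down precisely.
est = refine(coarse_fix((4.8, 6.1)), landmarks, ranges)
print(est)  # close to (3.0, 4.0)
```

The point of the sketch is the division of labour: GPS only has to be good enough to pick the right neighbourhood, after which local measurements do the precise work.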
Once these trials are completed, the pods will be ready for testing in public on the pavements of Milton Keynes. With safety of paramount importance throughout the assessment programme, the three pods will continue to be manned by trained human operators.

Transport Systems Catapult programme director Neil Fulton said: "The LUTZ Pathfinder project will redefine how people think of 'driving', and therefore fits in perfectly with our mission to promote UK business growth in the field of intelligent mobility." By mid-2017, the plan is for 100 fully autonomous pods to be running on pathways alongside people, using sensors to avoid obstacles.

The Milton Keynes-based programme actually pre-dates the Transport Systems Catapult, with the initial impetus having come from the Automotive Council and Cambridge University's John Miles, who began working with Milton Keynes Council to develop a project that could see up to 200 of these vehicles moving people around the city. Before making such a massive investment, however, the council was understandably keen to examine the economic case for the pods, the viability of the technology and how the pods would interact with people – and vice versa.

The TSC was therefore approached to oversee the LUTZ Pathfinder programme's test phase, which will see three autonomous pods trialled on the pavements of Milton Keynes. During the test period, each vehicle will be manned by a trained operative who can take immediate control of the pod if necessary. The pods will be equipped with technology provided by Oxford University's Mobile Robotics Group, and Fulton recently showed a video of the technology in action during a "test drive" of the navigation system around a university quad.
Relying on left-side and right-side cameras for its primary data, the technology works by mapping out the environment in which it will operate, so that subsequent journeys can be compared against a known 'norm'.

Of course, there is a vast gulf between theory and practice, and it doesn't require much imagination to visualise the huge range of problems potentially posed by the appearance of driverless vehicles on our roads. As part of a demonstration of the real-world scenario that awaits the pods once they have been deemed ready for public testing, the TSC has shown a film shot from a bicycle riding along part of the route where the pods will operate. Challenges identified on just that short bike ride included: safely avoiding and overtaking pedestrians; correctly distinguishing between genuine obstacles and irrelevant distractions such as empty carrier bags and other litter; navigating road crossings and car parks; and dealing with the potential confusion caused by heavy shadows on sunny days, or the rapid change in lighting conditions when driving through an underpass.

Says Fulton: "There's a distinct difference between our programme and some of the other autonomous vehicle projects, which are mainly road-based. I would say that taking the vehicles off the road and onto pavements actually intensifies the challenge, because of the increased interaction you have with people and obstacles. So, on the one hand, you can see the technological challenges that we're facing, but there are many other challenges to consider on top of that, such as the regulation and law changes that will be required to get autonomous vehicles onto the market. There is also the question of liability insurance, in terms of who takes responsibility for these vehicles once they are out on the roads, or the pavements in the case of the pods."
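The "known norm" idea – survey the route once, then localise later journeys by matching what the cameras see now against what was recorded then – can be illustrated with a toy sketch. The four-number "fingerprints" below stand in for real image descriptors, and the class and positions are invented for the example; MRG's actual system is far more sophisticated.

```python
def distance(a, b):
    """Squared difference between two descriptor vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

class RouteMap:
    """Stores (position, descriptor) pairs from a survey run, then
    localises a later journey by nearest-neighbour matching."""

    def __init__(self):
        self.entries = []  # (position in metres along route, descriptor)

    def record(self, position, descriptor):
        self.entries.append((position, descriptor))

    def localise(self, live_descriptor):
        # The best-matching stored view tells us where we are on the route.
        return min(self.entries,
                   key=lambda e: distance(e[1], live_descriptor))[0]

# Survey run: a fingerprint captured every 5 m along the pavement.
survey = RouteMap()
survey.record(0.0,  [0.9, 0.1, 0.3, 0.5])
survey.record(5.0,  [0.2, 0.8, 0.4, 0.1])
survey.record(10.0, [0.6, 0.6, 0.9, 0.2])

# Later journey: a slightly noisy view of the 5 m mark still matches it.
print(survey.localise([0.25, 0.75, 0.35, 0.15]))  # 5.0
```

The challenges listed above (shadows, underpasses, litter) are precisely the cases where the live view drifts away from the stored norm, which is why the real system needs far richer descriptors than this.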
The business of getting the vehicles to meet these technological challenges falls to the Oxford Mobile Robotics Group, which specialises in mobile autonomy. In addition to its work on the pavement-based LUTZ Pathfinder pods, MRG is applying its technology to the Oxford RobotCar project, which is working towards the first public demonstration of a low-cost, auto-drive navigation system on UK public roads. The MRG has long been involved in the area of autonomous vehicles, having worked jointly with BAE Systems to retrofit two Bowler Wildcats with fly-by-wire control systems, high-performance computing payloads and sensors for estimating the local terrain, including lidars and cameras. The first of these is used by the BAE Systems Advanced Technology Centre as part of its autonomous systems research; the second is still used by the Mobile Robotics Group in its ongoing research into lifelong, infrastructure-free navigation for autonomous vehicles.

Professor Ingmar Posner of the Mobile Robotics Group believes the project is not only useful in and of itself, but also represents a potentially crucial intermediate stage in the development of driverless vehicles. He says: "There needs to be an interim period between an autonomous vehicle being on a test track and being on an actual road, so having it as a pod in real complex environments is actually a very valuable thing. That's why Google are doing it and it's a crucial step towards autonomous cars."

"It's really important to see this in the context of our ultimate goal, which is an autonomous car," he says. "So if you think of an autonomous car going along at 100mph and trying to interpret its environment, that's difficult, because perception is a really tricky thing to achieve and is also really expensive from a processing point of view. So to have an intermediate, real-world test-bed is ideal.
"So, rather than controlling where people walk, it's nice to be able to concentrate on a domain where you drive slowly (7-10mph), where there are real obstacles around you and you interact with those obstacles at those speeds."

Of course, the pods will run on pavements rather than roads, which throws up its own range of challenges. Says Professor Posner: "It is a less predictable environment than a road in some ways... for instance, it is well understood how other cars behave, whereas mapping human movement is much more difficult."

And, according to Professor Posner, public acceptance is a bigger issue than some may realise. He says: "The bottleneck here is not necessarily technology. The technology is getting better – it's not quite there yet – but it is advancing at such a pace that legislation and the societal factors are having trouble keeping up with it... The whole thing is that it's a pilot project in a number of ways – not just in terms of the technology itself, but in how people will interact with that technology."

One of the key details of the project is that it will be 'infrastructure-free'. In other words, it will run without relying on any external aids such as GPS. Says Posner: "GPS is not really that accurate and is too easily denied – in street canyons, for instance. That isn't to say that the technology we have can't use GPS – it would be foolish not to use information if we have it. But the key point is that, while it may use it, it doesn't depend on it."

So how will it work? Clearly, it remains a work in progress at this stage, but much of the technology will be based on the MRG's existing RobotCar platform. Posner explains: "The concept is very similar to that of the RobotCar, whereby you have some sort of representation of the environment – like a map – which will have annotations in it in terms of where pavements are, etc.
"You then use features in the environment, such as local landmarks, to localise yourself within that environment."

The question of whether the robot 'learns' is a vexed one, but Posner says this much: "In principle, we have solutions that allow us to do this – to let it learn. But we're now at the stage where we are specifying the sensors for the first prototype pod. In our portfolio we have algorithms that allow us to do that and, as and when it is required, we will do that."

"One key part of the RobotCar project vision is that the robot learns about its environment as you drive," he adds. "You buy a car and, rather than it knowing where you're going when you push a button and it takes you there, it will actually just observe where you drive and, after a while, it will know enough to drive you."
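Posner's learn-by-observation vision can be caricatured in a few lines: count the journeys the human actually makes, and once one route from a given starting point has been seen often enough, the car can propose it. Everything here – the class, the trip representation, the three-trip threshold – is invented for illustration, not MRG's algorithm.

```python
from collections import Counter

class RouteLearner:
    """Toy sketch: learn habitual routes purely by watching the driver."""

    def __init__(self, min_trips=3):
        self.trips = Counter()        # (start, destination) -> times driven
        self.min_trips = min_trips    # how many sightings before we trust it

    def observe(self, start, destination):
        """Record one human-driven journey."""
        self.trips[(start, destination)] += 1

    def suggest(self, start):
        """Most frequent destination from `start`, once seen often enough."""
        seen = {dest: n for (s, dest), n in self.trips.items() if s == start}
        if not seen:
            return None
        dest, n = max(seen.items(), key=lambda kv: kv[1])
        return dest if n >= self.min_trips else None

car = RouteLearner()
for _ in range(4):
    car.observe("home", "office")
car.observe("home", "gym")

print(car.suggest("home"))    # "office" - driven 4 times, above threshold
print(car.suggest("office"))  # None - nothing observed from here yet
```

The design choice worth noting is that the car never needs to be told anything: the map of habits is a side effect of ordinary driving, which is exactly the "it will actually just observe where you drive" idea in the quote above.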