Posted on

Technology demonstrations

If you are planning or running a focused industry, technology or academic event to discuss autonomous vehicle technologies, we’d love to hear from you.

We can run a static indoor display in a small space, show a vehicle with working sensors, or even run an outdoor vehicle demonstration with passengers in a controlled area to showcase some autonomous vehicle functionality. These demos can enhance your audience's understanding of the building blocks behind the next generation of vehicle technology.

Members of our team have been organising demos of autonomous vehicle technology since 2016, so we know that first-hand contact is a great way to complete the learning experience, whether the audience is management, engineers, or university students.

Use the contact page to get in touch, specifying event details including location, dates, size and type of audience, event website with further information (if public), and type of display you can provide space for.

Due to the number of requests we receive, we may not always be able to respond or accept invitations.

Posted on

Free Technical Resources For Autonomous Driving Addicts

technical resources programmers

It’s amazing the volume of free resources you can find online to fuel your self-driving car project.

The internet is bringing progressive developers together, encouraging shared use of tools and information to accelerate technological progress. When information is shared, it benefits society as a whole, not just individual corporations. WordPress founder and self-described ‘open-source hippie’ Matt Mullenweg believes that “everything, not just software, should be open-source”.

In that spirit, here’s a comprehensive list of free technical resources aimed specifically towards autonomous vehicle developers. We hope you find it helpful!

GitHub

Online hosting service GitHub is mostly used for computer code. It’s a meeting place for more than 31 million developers collaborating to manage projects and build software.

GitHub provides a long list of resources, links, papers and recommendations to inform your next autonomous driving project, including the MIT Self-Driving Cars course, End to End Learning for Self-Driving Cars, and Deep Multi-Scale Video Prediction. You will also find a curated list of awesome autonomous vehicle resources, starting with the foundations of AI, robotics and computer vision. All this plus papers, courses, datasets, and links to open source software like Comma.ai.

GitHub’s list also provides a hefty chunk of useful information on law and media around self-driving cars.

Mighty AI

Computer vision company Mighty AI trains AI programs to better see and understand the world, using high-quality data to validate computer vision models. Here you can find its Sample Training Dataset for Autonomous Driving, which looks to be updated regularly.

It takes a huge amount of data to train vehicles to understand their environments, which is why it’s helpful to get your hands on as many sample datasets as possible. Mighty AI has created a sample set of 200 fully segmented images of a specific Seattle-area route, which you can download and use for free.
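
To give a feel for what a “fully segmented” image looks like in practice, here is a minimal Python sketch that loads a photo alongside its per-pixel label mask. The file names and the class-ID mapping are hypothetical placeholders – check the documentation of whichever dataset you download.

    # Minimal sketch: pairing a camera frame with its per-pixel segmentation mask.
    # File names and class IDs are hypothetical -- consult your dataset's docs.
    import numpy as np
    from PIL import Image

    image = np.array(Image.open("frame_0001.jpg"))        # H x W x 3 RGB photo
    mask = np.array(Image.open("frame_0001_labels.png"))  # H x W class IDs (single-channel mask assumed)

    # Every pixel in the mask holds an integer class ID, e.g. 0 = road, 1 = car, 2 = pedestrian...
    assert mask.shape == image.shape[:2]

    road_fraction = np.mean(mask == 0)   # share of pixels labelled as drivable road
    print(f"{road_fraction:.1%} of this frame is labelled as road")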

ApolloScape

The Chinese multinational technology company Baidu recently released ApolloScape, a huge and freely available dataset for autonomous vehicle simulation – far bigger than other similar datasets. While still a work in progress, it offers 144k image frames, roughly 10 times more high-resolution images than comparable datasets, 26 different recognizable objects, and pixel-by-pixel annotations. The dataset forms part of Apollo version 2: Baidu’s open autonomous driving platform.

UC Berkeley

Like Baidu, UC Berkeley also has a huge self-driving dataset available for free public use by engineers and developers, containing 100,000 different video sequences from across the US, plus GPS information.

These videos may be especially useful in testing as they cover a variety of different weather and light conditions. Download the BDD100K dataset.

DIY Robocars

An example of a naturalistic driving study from towardsdatascience.com

Our good friends at DIY Robocars host events for people who want to make and race DIY autonomous cars. Even small, inexpensive models can run autonomous software. Plus, DIY Robocars provides a list of naturalistic driving studies programmers can use: some taken in real traffic, some from self-racing car events.

The data was collected across various European countries and in varying weather and lighting conditions – ideal for testing out semantic segmentation. This type of data has long been the favoured study source for driving researchers, even before the days of on-road autonomous research.

Autonomous Driving

This open knowledge platform for self-driving car development has a particularly comprehensive resources section packed with articles, courses, papers, tools, datasets and videos. Not for the first time, the Udacity and MIT courses come highly recommended. You’ll also find links to a visual algorithm comparison tool, an open motion planning library, and an open source tool for point cloud visualization and editing.

Tiny YOLO

Also from GitHub comes Model Depot: an open platform for discovering and sharing machine learning models. Model Depot has been exploring in-browser object detection using Tiny YOLO – and now you can too. Tiny YOLO (You Only Look Once) is a real-time object detection system that uses deep learning to find objects from your webcam. The system makes predictions with a single network evaluation, which makes it extremely fast.
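
If you’d rather try it locally than in the browser, here is a rough Python sketch using OpenCV’s DNN module. It assumes you have downloaded a Tiny YOLO config and weights file from the YOLO project; the file names below and the 0.5 confidence threshold are our own choices, not anything Model Depot prescribes.

    # Rough sketch: real-time Tiny YOLO detection on a webcam feed with OpenCV.
    # Assumes yolov3-tiny.cfg and yolov3-tiny.weights have been downloaded locally.
    import cv2
    import numpy as np

    net = cv2.dnn.readNetFromDarknet("yolov3-tiny.cfg", "yolov3-tiny.weights")
    layer_names = net.getLayerNames()
    output_layers = [layer_names[i - 1] for i in net.getUnconnectedOutLayers().flatten()]

    cap = cv2.VideoCapture(0)                          # default webcam
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # A single network evaluation per frame -- this is what makes YOLO fast.
        blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
        net.setInput(blob)
        detections = np.vstack(net.forward(output_layers))

        h, w = frame.shape[:2]
        for det in detections:                         # [cx, cy, bw, bh, objectness, class scores...]
            scores = det[5:]
            class_id = int(np.argmax(scores))
            if float(scores[class_id]) > 0.5:
                cx, cy, bw, bh = det[0] * w, det[1] * h, det[2] * w, det[3] * h
                x, y = int(cx - bw / 2), int(cy - bh / 2)
                cv2.rectangle(frame, (x, y), (x + int(bw), y + int(bh)), (0, 255, 0), 2)

        cv2.imshow("Tiny YOLO", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break

    cap.release()
    cv2.destroyAllWindows()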

There’s plenty more software to check out at Model Depot: some even have live demos.

We hope these free technical resources will prove helpful throughout your R&D. We will endeavour to keep this page updated as we go. Got suggestions? Let us know!

Posted on

Ultrasonic Sensors: More Than Just Parking

Driverless cars are the sum of many moving parts (literally), all working seamlessly together. To really ‘see’ the road ahead, unmanned vehicles need a series of different sensors with complementary skills. It’s a team effort – due to their physical properties, no single sensor can do the job on its own.

Autonomous vehicles rely on a range of sensors to perceive their environment: not just cameras but radar, LiDAR, sonar, computer vision and HD maps. Ultrasonic sensors, traditionally associated with parking, will come to have a much wider range of uses as autonomous driving systems develop.

What is an ultrasonic sensor?

schneider telemechanique ultrasonic sensors

Ultrasonic sensors are active sensors. Much like radar and sonar, ultrasonic transducers evaluate their targets by measuring reflected signals. Using ultrasonic waves, these sensors gauge distance between objects by measuring the time between emission and reception. This makes them particularly good at detecting approaching obstacles within close range – it’s why they make such good parking sensors.

The transducer generates a high-frequency sound wave (above 18 kHz), converting electrical energy into sound and then back again. Different materials reflect ultrasonic waves in different ways: solid materials like metal, plastic and glass reflect sound well, while soft materials like fabric tend to absorb the waves. Accuracy therefore varies with the target; ultrasonic sensors excel at detecting solid hazards like traffic cones and barriers.

How ultrasonic sensors work

The best example of how ultrasonic sensors work comes from nature, in a process called echolocation. Bats and dolphins emit sound waves that bounce back off the environment; the time each echo takes to return tells them how close the surrounding objects are.

dolphin echolocation

In ultrasonic sensors, the transducer vibrates, emitting ultrasonic pulses that travel in a cone-shaped beam. An ultrasonic sensor’s range is determined by the transducer’s vibration frequency: the higher the frequency, the shorter the range. Long-range ultrasonic sensors, therefore, work best at lower frequencies.
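
As a rough illustration, the time-of-flight arithmetic behind every ultrasonic reading looks something like the Python sketch below. The temperature formula is a standard approximation, and the example numbers are only indicative.

    # Sketch of the time-of-flight calculation behind an ultrasonic distance reading.
    # The pulse travels out and back, so the measured time is halved; the speed of
    # sound depends on temperature, which is why rapid temperature shifts hurt accuracy.

    def speed_of_sound(temp_c: float) -> float:
        """Approximate speed of sound in air (m/s) at a given air temperature (deg C)."""
        return 331.3 + 0.606 * temp_c

    def distance_m(echo_time_s: float, temp_c: float = 20.0) -> float:
        """Distance to the reflecting object, from the emission-to-reception time."""
        return speed_of_sound(temp_c) * echo_time_s / 2.0

    # An obstacle at the 8 m long-range limit mentioned below returns an echo in ~47 ms:
    print(distance_m(0.0466))   # ~8.0 m at 20 deg C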

The angle of an object, as well as its composition, can affect the accuracy of the reading. Flat surfaces placed at a right angle to the sensor give the longest sensing range; as the angle varies, so does accuracy. For this reason, some sensors come with different modes to overcome these shortfalls.

Ultrasonic sensor modes

The Schneider Telemechanique XX range of ultrasonic sensors has three distinct modes to enable more accurate detection. These modes are:

  • Diffuse mode – where the object reflects the ultrasonic wave back to the sensor. Best suited to flat objects perpendicular to the ultrasonic beam.
  • Reflex mode – where the sensor remains in a permanently detecting state, seeking objects that break the ultrasonic beam. Best suited to finding soft objects that would normally absorb the ultrasonic wave.
  • Thru-beam mode – where the transmitter permanently transmits ultrasonic waves to the receiver, seeking objects that break the ultrasonic beam. Best suited to the detection of small objects.

Advantages and limitations

Each type of autonomous vehicle sensor has its own ideal function. Ultrasonic sensors are a great fit for short-range applications, but not for others. Here are the pros and cons:

Pros

  • Compact and reliable – no moving parts
  • Unaffected by object colour or transparency
  • Unaffected by atmospheric conditions like dust, rain and snow
  • Unaffected by light levels – works equally well in darkness
  • Excellent at measuring distance to a parallel surface

Cons

  • Accuracy affected by angle and soft materials
  • Accuracy affected by rapid shifts in temperature
  • Limited detection range – our long-range sensors have a maximum range of 8m

Beyond parking

how ultrasonic sensors work

Ultrasonic sensors are extremely common in passenger cars for parking systems. Usually found on the bumpers, they alert the driver to obstacles behind or in front of the car, activated while the car is in reverse gear or moving slowly to help avoid parking scrapes.

As the industry shifts its focus to autonomous driving systems, ultrasonic technology can take us further, with longer detection distances providing a 360° view of obstacles in the immediate vicinity – provided the car isn’t travelling at high speed. To give an example, Tesla’s Model S and X feature 12 long-range ultrasonic sensors that provide close-range obstacle detection for the Autopilot feature (although this is a long way from fully self-driving – there will be no fully autonomous vehicles on the road in 2019).

Why buy from us?

We aim to provide a superior customer experience and the best value for our customers, working with reputable sensor suppliers like Navtech Radar, Neobotix and Schneider Telemechanique. Our goal is to promote innovation in all its forms, making sophisticated technology available in the smaller volumes needed for R&D – at prices that don’t take the biscuit.

Level Five Supplies is taking a fresh approach to the CAV market, backed by a brainy pod of certified technical experts who will guide you each step of the way. Like helpful land dolphins.

Posted on

Technology In Autonomous Cars: What, Why, When & How?

traffic on a busy highway

A world of shared, autonomous vehicles may seem like a distant dream. But the industry has been making huge strides and generating a lot of attention, thanks to companies like Google and Tesla. Driverless cars are now a trending topic but behind the scenes, technology has been steadily moving this way for decades.

A future in which autonomous vehicles become widespread is inevitable. But it won’t happen overnight – technology develops gradually. Already, OEMs are creating fully or partially autonomous vehicles that promise to save time, reduce congestion and minimise fatalities on our roads – much to the dismay of classic car enthusiasts. While there are significant challenges to overcome, the outlook is optimistic.

Here’s the lowdown on technology used in autonomous cars and what we can expect.

A history of self-driving cars

Self-driving cars might seem like a recent thing, but experiments of this nature have been taking place for nearly 100 years. In fact, centuries before that, Leonardo da Vinci designed a self-propelling cart hailed by some as the world’s first robot. At the World’s Fair in 1939, a theatrical and industrial designer named Norman Bel Geddes put forth a ride-on exhibit called Futurama, which depicted a city of the future featuring automated highways.

However, the first self-driving cars didn’t arrive until a series of projects in the 80s undertaken by Carnegie Mellon University, Bundeswehr University, and Mercedes-Benz. The Eureka Prometheus Project proposed an automated road system in which cars moved independently, with cities linked by vast expressways. At the time, it was the largest R&D project ever in the field of driverless cars.

Since 2011, the US states of Nevada, Florida, California, and Michigan have all passed laws permitting autonomous cars; more will surely follow.

Autonomous vehicle components

How does an autonomous vehicle operate and make sense of what it sees? It comes down to a powerful combination of technologies, which can be roughly divided into hardware and software. Hardware allows the car to see, move and communicate through a series of cameras, sensors and V2V/V2I technology, while software processes information and informs moment-by-moment decisions, like whether to slow down. If hardware is the human body, software is the brain.

At Level Five Supplies, the technology we supply falls broadly into those same hardware and software categories.

Autonomous vehicles rely on sophisticated algorithms running on powerful processors. These processors make second-by-second decisions based on real-time data coming from an assortment of sensors. Millions of test miles have refined the technology and driven considerable progress – but there is still a way to go.

Driverless car technology companies

The race to put the first self-driving car on the road is heating up. Level 5 autonomy is still some time away, but there is plenty else happening in the autonomous space, with aspects of driverless tech already making an appearance in today’s mass-produced cars. Early adopters will enjoy benefits such as automatic parking and driving in steady, single-lane traffic.

Here are some of the companies working towards an autonomous future:

  • Google – presently leading the charge via Waymo, its self-driving subsidiary
  • Tesla – models are now being fitted with hardware designed to improve Tesla Autopilot
  • Baidu – in the process of developing Level 4 automated vehicles
  • General Motors – developed the first production-ready autonomous vehicle
  • Toyota – working with ride-hailing service Uber to bring about autonomous ridesharing
  • Nvidia – created the world’s first commercially available Level 2+ system
  • Ford – planning to have their fully autonomous vehicle in operation by 2021
  • nuTonomy – first company to launch a fleet of self-driving taxis in Singapore
  • BMW – has teamed up with Intel and Mobileye to release a fully driverless car by 2021
  • Oxbotica – will begin trialling autonomous cars in London in December 2019

Challenges in autonomous vehicle testing and validation

As we can see, there are many different autonomous vehicle projects in various stages of development. But there’s a big difference between creating a test vehicle that’ll run in reasonably tame conditions and building a multi-million-strong fleet of cars that can handle the volatility and randomness of the real world.

One of the biggest challenges is putting the computer in charge of everything, including exception handling. By exceptions, we mean variable circumstances like bad weather, traffic violations and environmental hazards. Fog, snow, a deer leaping into the road – how does a fully autonomous vehicle interpret and react to these situations?

When we take the driver out of the picture completely, automation complexity soars compared to lower level systems. The software must handle everything. Rigorous testing is essential, but it won’t be enough on its own. Alternative methods such as simulation, bootstrapping, field experience and human reviews will also be necessary to ensure the safety of the vehicle.

For the time being, it means implementing each new capability carefully and gradually, which is why it will be a good few years before we see Level 5 vehicles on the road.

Global perceptions of autonomous car technology

How does the general public feel about the onset of autonomous cars? Generally, perception is pretty positive, but there’s still a way to go. Not everyone is convinced.

The main bone of contention is safety. Autonomous car manufacturers must prove beyond doubt the safety of their vehicles before they can hope for widespread adoption. One survey conducted across China, Germany and the US found that drivers want to decide for themselves when to let a car drive autonomously and when to take over. The survey also found that trust in autonomous cars is nearly twice as high in China as in the other countries surveyed.

A second survey discovered that the higher the level of automation, the higher the doubt. Respondents still have reservations about giving the vehicle total control, despite holding a positive view of autonomous vehicles generally.

Another factor playing on people’s minds is cybercrime: fear of personal data falling into the wrong hands. People are generally happy to share data for safety reasons, but less so when it ends up being sold to related service providers.

When will autonomous cars be available?

If we’re talking about self-driving features, then the future has arrived. There are already cars on the road with Advanced Driver Assistance Systems in play. But a fully autonomous vehicle that can encounter and navigate any driving scenario without human intervention won’t be mainstream for a little while yet, mostly due to cost and regulations.

Level 5 systems will have the ability to drive anywhere a human driver could. What we’ll likely come across first are Level 4 vehicles where autonomy is confined to mapped zones. They may also be limited by certain weather conditions.

Business Insider estimates there will be 10 million self-driving cars on the road by 2020. Chances are you’ll be able to ride in one long before you can buy one, possibly even within the year. As for Level 5, estimations range from 2021 onwards.

The past decade has been a pivotal one for automobile technology. In the face of changing political and legal circumstances, the future of autonomous vehicles is uncertain, and adoption will be gradual as society adjusts to the changes – not to mention upgrading transport infrastructures and systems. But of one thing we can be sure: the automotive industry will never be the same again.

Level Five Supplies is a one-stop shop for autonomous vehicle technology. Subscribe to our newsletter for regular updates.

Posted on

Autonomous Cars 101: What Sensors Are Used in Autonomous Vehicles?

As autonomous vehicles come closer to reality, a greater number of startups, universities, car manufacturers and technology companies are investing in the development of automated driving – with no sign of slowing down.

If you’re heading down the road (pardon the pun) to autonomy, it can be hard to know where to start. If you’re just beginning to learn about autonomous technology, this introductory roundup is for you.

Why do autonomous cars need so many sensors?

Imagine trying to drive down the road with a completely frosted-over windscreen. It would be a matter of seconds before you hit something or ran off the road.

Autonomous vehicles are no different. They must be able to ‘see’ their environment in order to know where they can and cannot drive, detect other vehicles on the road, stop for pedestrians, and handle any unexpected circumstances they may encounter.

Each type of sensor has its own strengths and weaknesses in terms of range, detection capabilities, and reliability. A host of technologies is required to provide the redundancy needed to sense the environment safely. When you bring together data from two heterogeneous sensors, such as a camera and a radar, this is called sensor fusion.
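
As a toy illustration of what fusion means in practice, here is a minimal Python sketch that combines a camera-based range estimate with a radar return by weighting each according to its uncertainty. Real fusion stacks use tracking filters over many sensors; the variance numbers below are invented purely for the example.

    # Toy sensor fusion: inverse-variance weighting of two independent range estimates.
    # A camera is rich in semantics but noisy on range; radar gives a tight range
    # measurement but poor classification. Fusing the two tightens the estimate.

    def fuse(camera_range_m, camera_var, radar_range_m, radar_var):
        """Inverse-variance weighted average of two independent range estimates."""
        w_cam = 1.0 / camera_var
        w_rad = 1.0 / radar_var
        fused = (w_cam * camera_range_m + w_rad * radar_range_m) / (w_cam + w_rad)
        fused_var = 1.0 / (w_cam + w_rad)
        return fused, fused_var

    # Camera thinks a pedestrian is ~23 m away (loose); radar says 21.4 m (tight).
    estimate, variance = fuse(23.0, 4.0, 21.4, 0.25)
    print(f"fused range: {estimate:.1f} m (variance {variance:.2f})")   # ~21.5 m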

Autonomous vehicle sensor categories

Automotive sensors fall into two categories: active and passive sensors.

Active sensors send out energy in the form of a wave and look for objects based upon the information that comes back. One example is radar, which emits radio waves that are returned by reflective objects in the path of the beam.

Passive sensors simply take in information from the environment without emitting a wave, such as a camera.

Cameras

trifocal camera

Cameras are already commonplace on modern cars. Since 2018, all new vehicles in the US have been required to fit reversing cameras as standard. Any car with a lane departure warning system (LDW) will use a front-facing camera to detect painted markings on the road.

Autonomous vehicles are no different. Almost all development vehicles today feature some sort of visible light camera for detecting road markings – many feature multiple cameras for building a 360-degree view of the vehicle’s environment. Cameras are very good at detecting and recognizing objects, so the image data they produce can be fed to AI-based algorithms for object classification.

Some companies, such as Mobileye, rely on cameras for almost all of their sensing. However, cameras are not without their drawbacks. Just like your own eyes, visible light cameras have limited capabilities in conditions of low visibility. Additionally, using multiple cameras generates a lot of video data to process, which requires substantial computing hardware.
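
To put a rough number on that data volume, here is a back-of-envelope calculation in Python. The camera count, resolution and frame rate are assumptions chosen purely for illustration.

    # Back-of-envelope estimate of raw video bandwidth from a multi-camera rig.
    cameras = 8
    width, height = 1920, 1080      # pixels per frame
    bytes_per_pixel = 3             # uncompressed 8-bit RGB
    fps = 30

    bytes_per_second = cameras * width * height * bytes_per_pixel * fps
    print(f"{bytes_per_second / 1e9:.2f} GB/s of raw video")   # ~1.49 GB/s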

Beyond visible light cameras, there are also infrared cameras, which offer superior performance in darkness and additional sensing capabilities.

Radar

autonomous vehicle radar

As with cameras, many ordinary cars already have radar sensors as part of their driver assistance systems – adaptive cruise control, for example.

Automotive radar is typically found in two varieties: 24GHz, used for short-range applications, and 77GHz, used for long-range sensing. 79GHz radar will soon be offered on passenger cars.

Radar works best at detecting objects made of metal. It has a limited ability to classify objects, but it can accurately tell you the distance to a detected object. However, incidental metal objects at the side of the road, such as a dented guard rail, can produce unexpected returns for development engineers to deal with.

LiDAR

autonomous car lidar sensor

LiDAR (Light Detection and Ranging) is one of the most hyped sensor technologies in autonomous vehicles and has been used since the early days of self-driving car development.

LiDAR systems emit laser beams at eye-safe levels. The beams hit objects in the environment and bounce back to a photodetector. The returned beams are combined into a point cloud, creating a three-dimensional image of the environment.

lidar point cloud
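
As a minimal sketch of how individual returns become that three-dimensional image, the Python snippet below converts each return’s range and beam angles into x/y/z coordinates in the sensor frame. The sample values are invented for illustration.

    # Sketch: turning raw LiDAR returns (range + beam angles) into a 3D point cloud.
    import numpy as np

    ranges = np.array([12.3, 12.4, 35.0])        # metres to each reflection
    azimuth = np.radians([10.0, 10.2, 95.0])     # horizontal beam angle
    elevation = np.radians([-1.0, -1.0, 2.0])    # vertical beam angle

    x = ranges * np.cos(elevation) * np.cos(azimuth)
    y = ranges * np.cos(elevation) * np.sin(azimuth)
    z = ranges * np.sin(elevation)

    point_cloud = np.column_stack([x, y, z])     # N x 3 array of 3D points
    print(point_cloud)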

This is highly valuable information, as it allows the vehicle to sense everything in its environment, be it vehicles, buildings, pedestrians or animals. This is why so many development vehicles feature a large 360-degree rotating LiDAR sensor on the roof, providing a complete view of their surroundings.

While LiDAR is a powerful sensor, it’s also the most expensive sensor in use. Some of the high-end sensors run into thousands of dollars per unit. However, there are many researchers and startups working on new LiDAR technologies, including solid-state sensors, which are considerably less expensive.

Ultrasonic sensors

ultrasonic sensor autonomous vehicle

Ultrasonic sensors have been commonplace in cars since the 1990s for use as parking sensors, and are very inexpensive. Their range can be limited to just a few metres in most applications, but they are ideal for providing additional sensing capabilities to support low-speed use cases.

Other sensing and localization

The sensors discussed above aren’t the only sources of information a self-driving car uses to work out where it is and where to go. Other inputs include Inertial Measurement Units (IMUs), GPS, Vehicle-to-Everything (V2X) communication, and high definition maps.

Bringing it all together

All of these sensors output different types of data – and lots of it. This requires a considerable computing platform to fuse the data together and create a consolidated view of the vehicle’s environment. Launched in 2019, Level Five Supplies is a comprehensive supplier of autonomous vehicle hardware and technology.

Speak to one of our sales reps about your R&D requirements today.

Posted on

New International Autonomous Vehicle Technology Distributor to Launch in 2019

waymo self-driving car phoenix arizona
  • Level Five Supplies will launch in February 2019, competing with recently acquired US firm AutonomouStuff
  • The company will represent more than 25 suppliers at its launch, including many home-grown innovators
  • Industry and government collaborations through the Centre for Connected and Autonomous Vehicles, Transport Systems Catapult, and Knowledge Transfer Network underpin the decision

Level Five Supplies Ltd, a new autonomous vehicle technology distributor, will launch in the UK in February 2019. The company will initially work with suppliers of essential technologies and AI auto parts for connected and autonomous vehicle research, including lidar, radar, processing, sensors, connectivity and drive-by-wire systems, and expects to have more than 25 suppliers in place at launch. In a sector that’s expanding at considerable speed, enabled by new software technologies and advanced processing approaches, Level Five Supplies expects to experience rapid growth while developing its own portfolio of products and services.

Delivering more product choice internationally

behind the scenes at westfield autonomous vehicles

“Following extensive discussions with several suppliers and potential customers, it was clear that there were numerous pain points in the purchase of hardware that are universal among highly automated driving researchers in the autonomous vehicle market,” said Alex Lawrence-Berkeley, company founder.

“With the acquisition of AutonomouStuff by one of its own suppliers earlier this year, the general feeling around the industry is a desire for more choice in this area, coupled with concern that the new owners could restrict product choice even further,” he continued.

“There are many technology companies that need a clearer route to market, particularly in the area of high-value manufacturing and specialist vehicles – expertise that is prevalent in the UK. We have world-leading technology expertise and business development experts, a supportive business environment and an exciting research sector, as well as a number of companies whose products we’ll be representing to our international customers.”

Built on collaborative research projects

gateway project automated shuttle pod

Initially targeting sales in the US, the company will be working closely with suppliers around the world, including many in the UK who have benefitted directly from developing products as a result of collaborative R&D projects funded through UK research and innovation.

Alex Lawrence-Berkeley cited the UK’s recent R&D activity in connected and autonomous transportation as a “significant catalyst” for building the operation in the UK. These activities are led through the government’s funding of collaborative research projects via the Centre for Connected and Autonomous Vehicles, the work of the Transport Systems Catapult, and the Knowledge Transfer Network.

Global sales will be managed in the automotive heartland of the Midlands, while the marketing office and HQ will be based on the outskirts of Bath, in the market town of Frome. Field sales engineers will support the company’s outreach in North America, Europe and Asia, demonstrating the latest technology used in autonomous cars to SMEs, universities and startups.

More than 50 years’ industry experience

google lexus Rx 450h self-driving car

“Our leadership and advisory team is strong, with a combined 50+ years’ experience in some of the sector’s best-known autonomous vehicle companies, including Google, Baidu, Velodyne, AutonomouStuff, PolySync and AB Dynamics. They’ll be giving us the high-level support we need to build those relationships with customers and suppliers alike, as well as the benefit of their extensive international business management experience.”

Level Five Supplies, which is raising funds from angel investors through the government-backed SEIS scheme prior to launch, is a subsidiary of Level Five Holdings, which runs the sector’s first dedicated autonomous vehicle jobs board, Level Five Jobs, in addition to other specialist business services in the future cars sector.

Our leadership and advisory team

  • Alex Lawrence-Berkeley (Founder and CEO) is a widely experienced B2B marketer in this sector, having previously worked on autonomous vehicle technology events,  including hosting the UK’s first autonomous vehicle networking events and running the world’s first test track and training events in the sector.
  • Taj Kalsi (Co-Founder and Sales Director) is a highly experienced business development leader with more than 25 years’ experience in automotive technology sales, including time with Gentherm, Lear Corporation and Johnson Controls.
  • Tim Rogers (Chairman) is former MD of AB Dynamics, the UK’s most commercially successful unmanned vehicle developer.
  • Josh Hartung (Technical Advisor) is the former CTO of AutonomouStuff, and former founder and CEO of PolySync, the drive-by-wire technology company. He is based in the US. He said: “I am really excited to work with Alex and his team. Level Five is addressing a massive gap in the AV industry. As companies increasingly target public trials, access to the best technology is an important part of making these systems reliable and safe. Frankly, I was blown away by their plans… these guys are the real deal!”
  • Heather Hannan (Advisor) is a former Chief of Staff at Velodyne, one of the leading lidar manufacturers, and is also ex-Baidu and ex-Google. US-based, Heather has opened doors to manufacturers and great commercial opportunities in the US.

Posted on

Three months to go

Exciting times here at Level Five as we are now only 3 months away from launching our brand new autonomous vehicle technology distribution arm: Level Five Supplies.

It’s been a few months since we decided to extend our business towards technology distribution, but it makes sense. While we’ve always been big fans of the team at AutonomouStuff, the established market leader, we want to do things a little differently.

We want to offer alternatives – new ideas and approaches – to maintain the pace of change, innovation and creativity demanded in our sector. With AutonomouStuff’s acquisition earlier this year by one of its suppliers, Hexagon PI, it seemed more important than ever to ensure that choice was maintained and even widened further. Ultimately, any novel method of accelerating the development of safer roads should be encouraged.

The autonomous vehicle development ecosystem has grown at a remarkable pace over the past decade, and 2019 won’t be any different but for the addition of a new distributor with a fresh perspective.

We’ve been busy working to secure a good number of new suppliers, including many that have not been widely seen before, whose products are as innovative as they are exciting – as well as several that are bringing brand new products into the marketplace, launching them with us before anyone else!

We will also be working with some very established technology and automotive companies that have developed new products with fantastic potential, proven on autonomous vehicle R&D programmes.

Customers want choice, new suppliers want to be seen, competition is good – so here we are.

It’s not just about selling hardware: we’re also committing to supporting education and outreach programmes, non-profit groups and meetups, so that the next generation of engineers and technologists – which this industry is so in need of – can start improving their skills and even start getting their hands on equipment. More on that next year.

Our initial focus is the US market, but if you’re not there, don’t worry – we’ll be widening our footprint to Europe and elsewhere as well.  While our offices are based in the UK, we have a great team that spans both sides of the Atlantic and will grow with the business.

If you want to keep up to date on who we’re working with, sign up to our newsletter for a monthly update.

Posted on

Why Is 5G Important to Autonomous Cars?

Marc Jadoul leads Nokia’s IoT Market Development and is a regular event speaker and technology evangelist.

Part of our mission in helping the autonomous vehicle ecosystem fulfil its goals is to bring information from aligned parts of industry and technology to a wider audience through interviews, articles and features – either on our website and newsletter, or on our YouTube channel.

In our first written interview, we talk to Marc Jadoul, a technology evangelist and leader of Nokia’s IoT Market Development, to grill him on how the company is changing and what new demands more intelligent vehicles are creating.

Many people see 5G as critical to autonomous vehicles, but others (notably from areas which are used to creating standalone robotics) disagree. State your case…

5G will be beneficial for many industries. You can look at it as a Swiss Army knife that provides a superlative of all the wireless network technologies we’ve seen so far.

First, 5G technology will greatly improve wireless networks’ capacity and data speeds, allowing network providers and road operators to offer much more robust internet connections and meet growing demand for data-intensive services, like video streaming and Internet of Things applications.

But ubiquitous coverage (LiDAR scanners have an operational radius of only a few tens of meters), millisecond latency, and better geolocation services (centimeters, compared to GPS’ 10-15 meter accuracy) will play a central role in the proliferation of self-driving cars, the support of safety-critical V2X communications, and the situational awareness that enables collaborative behaviour between cars, cyclists and pedestrians.

Therefore, I’m convinced that 5G connectivity will play an important role in collaboration with on-board intelligence and other short range communication technologies.

Nokia has been many things: a pulp mill, a cable manufacturer and, of course, a mobile phone company. But in the last five years that has changed a lot, and the company is now a more general technology developer. Is this about selling chips, licensing software, contract design, selling consultancy or just staying relevant?

The past is the past, but tomorrow is right around the corner. So, it’s not about “staying relevant”. We create the technology to connect today’s world, enabling the infrastructure to transform tomorrow’s human experience. We serve communications service providers, governments, large enterprises and consumers, with the industry’s most complete, end-to-end portfolio of products, services and licensing. The historic examples you mention illustrate Nokia’s capacity for transforming, developing new technologies, and adapting to shifts in market conditions.

Our 5G portfolio, which is a comprehensive combination of infrastructure, software, and services is a good illustration of what Nokia stands for.

Will the advent of 5G help developing countries with poor communications infrastructure catch up with, or even leapfrog, industrialised countries relying on upgraded 20th-century hardware?

I think your vision on “upgrading 20th century hardware” understates how big this change is. As 5G will revolutionize lives, economies, and societies in many exciting ways, it’s a crucial enabling technology of the future for all markets. Many industrialized countries have already started planning their 5G journey and are trialling the new technology. Look, for example, at what’s happening in the US, South Korea, and France.

Recruitment is a growing problem across the automotive sector, which has to compete with other, more established digital industries – what else can the auto industry do?

We’re in telecoms, not the automotive sector. This said, I understand that the market is competitive, and many companies are looking for technically skilled people today. But, personally, I would assume that with all the exciting stuff happening in the automotive and transportation sectors, it would not be too difficult to attract people, as long as companies keep investing in innovation and in delivering a great customer experience.

Thirty-five years ago, during your academic career, you were looking at Turing machines and automation – is that still relevant and useful to what you do now?

I remember that my old physics teacher had a sign on his classroom door that said, “If you don’t know mathematics, stay out!” It is generally accepted that mathematics is kind of a basis for all science, and I believe theoretical computer science plays kind of a similar role in informatics. Automata theory helps you understand how algorithms work, and computational theory helps you grasp their complexity and the resources it might take to solve a problem.

By the way, I see a similarity with data science today. Anyone can take a dataset and run a Machine Learning program on it. But you need some good knowledge about statistics and logistic regression to interpret the outcome of your predictions and evaluate the performance of your model.
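
To make that point concrete, here is a small scikit-learn sketch on a synthetic dataset: fitting a logistic regression is a one-liner, but judging the model means looking past headline accuracy at something like the confusion matrix. The dataset size and split are arbitrary choices for the example.

    # Fitting a logistic regression is easy; interpreting it takes some statistics.
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import accuracy_score, confusion_matrix

    X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    predictions = model.predict(X_test)

    print("accuracy:", accuracy_score(y_test, predictions))
    print("confusion matrix:\n", confusion_matrix(y_test, predictions))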

What advice would you give to someone looking to specialise in your expert area – who should they talk to, emulate, study, read and work for?

Always keep a close eye on what’s happening in the industry, go beyond the buzzwords, and try to understand how you can leverage new technologies like 5G, cloud, the Internet of Things, Artificial Intelligence, etc. for creating a new human experience.

Tell us what Nokia offers to help people on their journey into the autonomous vehicle sector.

I’d recommend starting by visiting the Nokia blog. You’ll find a lot of interesting posts about 5G and the future of transportation.


Thanks to Marc and Nokia, as well as the Taas.Technology conference, for supporting the interview. Subscribe to our newsletter for occasional updates from us, or explore upcoming events and jobs!

Posted on

All About AUTOSAR

AUTOSAR is a well-established suite of standards developed and adopted by a pan-industry group of manufacturers. Many of them deem it important enough to dedicate full-time engineers to its development and integration, since it is a key route for products to communicate with each other once they enter the vehicle R&D lifecycle.

Dr. Thomas Scharnhorst, consultant at WiTech and spokesperson for AUTOSAR.

To find out more about how this relates to autonomous vehicles, we caught up with Dr Thomas Scharnhorst. Thomas is an engineer and consultant for WiTech-Engineering and an active spokesperson for AUTOSAR; as such, he travels regularly to international events to present and to bring industry players together so that new technologies can work more readily with other parts of the vehicle.

You’re heavily involved in AUTOSAR, which we’ll come to in a minute, but tell us about WiTech first…

WiTech-Engineering GmbH is a small management and technology consulting company located in Braunschweig, Lower Saxony, Germany. The management partner and CEO of WiTech is Professor Dr Ulrich Seiffert, a former member of the Volkswagen board responsible for group research and development. The aim is to promote cooperation between industry and research and to improve technology transfer between different institutions.

Is it a failure that the industry needs so much support from the academic sector, and likewise that universities rarely commercialise their research successfully?

This is a good question and not so easy to answer. Thinking from the end result, I would not speak of failure, since both pursue different goals and can support each other. In my experience, successful projects for both sides include joint university and industry projects sponsored by national or European funds, where industry’s strategic needs set the agenda. One example was Prometheus in the 1990s.

AUTOSAR is a widely adopted and mature architecture – what are the main challenges in maintaining an evolving architecture with so many partners?

Since the future architectures of automotive systems are increasingly seen not just as embedded software systems but as a “mobile phone on wheels”, the partnership needs to integrate new partners from among IT hardware and software vendors. A second important challenge is to increase the penetration of the standard in various heterogeneous markets like China, India… For that, we are adapting our organization to get closer to such areas, with the aim of taking their specific needs into account earlier.

What’s the biggest new complexity that autonomous driving brings to the existing software eco-system in vehicles?

Decision-making for the engine, steering and braking on the basis of sensor object detection, data fusion, and environmental models that estimate the behaviour of other traffic participants.

How will AUTOSAR interact with autonomous vehicle systems and technologies?

AUTOSAR (Classic and Adaptive Platforms) offers a platform for safe and secure communication and diagnosis within a single ECU and across a network of ECUs and HPCs (high-performance computers), on which applications for autonomous driving and connectivity can be easily integrated, scalable across different automotive product lines. AUTOSAR supports the download of software over the air into the car and the integration of new applications into the vehicle communication network while the vehicle is in operation.

What standards are still missing from these technology areas?

It’s a bit too early to discuss. A new standard needs a mature, applied technology and a strong desire from many partners to cooperate on standardization. In autonomous driving technology, many items are still subject to strong competition and not yet proven.

Recruitment is a growing problem, as the automotive sector has to compete with other, more established digital industries – what else can the industry do?

Industry needs to recognise, and organise itself around the fact, that 90% of innovations are driven by software. We are willing to spread more and more AUTOSAR knowledge across the university world through courses and projects, in order to increase the availability of skilled engineers in our specific area.

Level Five is a job board first and foremost, so we must dip into your own training – how did your first degree evolve into your career?

I believe I had a very solid education at the Technical University of Berlin in mathematics, mechanics, electrotechnics and control theory, ending up with an engineering diploma. Thereafter I got a scholarship to study computer science and aeronautics at MIT in Cambridge, USA, ending up with an MSc degree. This led to some years in Volkswagen’s research department before I became responsible for electronic development for series car development.

What advice would you give to someone looking to specialise in your expert area – who should they talk to, emulate, study, read and work for?

Since the future architectures of automotive systems are increasingly seen not just as embedded software systems but as a “mobile phone on wheels”, the partnership needs to integrate new partners from IT hardware and software vendors.

Lastly, plug time! Tell us what AUTOSAR offers to help people on their journey into the autonomous vehicle sector.

First of all, the AUTOSAR platforms are enablers for autonomous driving. Start-up companies building applications and functionality for autonomous driving can use AUTOSAR platforms on which their applications can be integrated. In this way, they find an entry point for discussing their solutions with the automotive industry in a solid way. More information is available on www.autosar.org – particularly on becoming a member.

I highly recommend that students and young professionals gain as much practical experience as possible to find out what they are interested in. Be curious and never stop learning! Visiting specialised conferences and talks is good for this, as is keeping up to date with new developments in research and industry.

Do you want to know more about AUTOSAR? We are happy to answer your questions, either at [email protected] or at [email protected].