Robotaxis: What They Mean to the Rideshare Industry

Let me let you in on a secret. Cars that can drive themselves, autonomous vehicles (AVs), have existed for about a decade.

An autonomous vehicle used commercially becomes a robotaxi. It’s an automated Uber, if you will. AV robotaxis are on the road right now, in multiple cities. This is a fact, but most are still in testing with human safety drivers along for the ride.

If you live in certain cities in California, Washington, Arizona, Michigan, Texas, Nevada, Georgia or Pennsylvania, I guarantee that you have seen one already without knowing it.


The real question is, when will you see or use one in your neighborhood? The answer to that is different for everybody, but the answer for most is, “SOON!” The future is already here, it’s just not evenly distributed.

This blog explores what’s already invented, not something that has to be invented. We’ll talk about when it’s going to happen and what that means for the transportation industry overall, and for the rideshare business in particular.

AV: What Is It?

  • An AV is any vehicle capable of driving from point A to point B without a human driver. That means it is a robot that has wheels.
  • It must also have a brain (AI) to make decisions about how to move from point A to point B without causing an accident. That means it is an artificially intelligent robot that has wheels.
  • To make the decisions on how to move the robot properly, it must have sensors (like our eyes and ears) to provide information to its brain (AI).

Create an AV using those big ‘pieces’, use it to move people from point A to point B for money, and you have a robotaxi. Robotaxis produce the same end result as Uber except the human is replaced with a Robot + Brain + Sensors. That seems like a big task, and it is, but the economics and efficiencies gained by eliminating the human component are huge. That’s why it’s inevitable that solutions will be found for all the challenges.
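
To make those three pieces concrete, here’s a minimal, purely illustrative sketch in Python of how they compose. Every class and method name is made up for this post; it’s not any vendor’s real API.

```python
# Illustrative only: hypothetical names, not any real AV software stack.

class Sensors:
    """The 'eyes and ears': cameras, radar, ultrasonics, GPS."""
    def read(self):
        # Return one snapshot of everything the car can currently perceive.
        return {"camera_frames": [], "radar_tracks": [], "gps_fix": (0.0, 0.0)}

class Brain:
    """The AI: turns sensor data into a driving decision."""
    def decide(self, perception):
        # In a real AV this is a learned model; here it's just a stand-in.
        return {"steer": 0.0, "throttle": 0.1, "brake": 0.0}

class Robot:
    """The mechanics: forward, backward, stop, left, right."""
    def apply(self, command):
        print(f"steer={command['steer']} throttle={command['throttle']} brake={command['brake']}")

class Robotaxi:
    """Sensors + Brain + Robot, moving a paying rider from A to B."""
    def __init__(self):
        self.sensors, self.brain, self.robot = Sensors(), Brain(), Robot()

    def drive_one_step(self):
        self.robot.apply(self.brain.decide(self.sensors.read()))

if __name__ == "__main__":
    Robotaxi().drive_one_step()
```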

Sounds like The Jetsons or Blade Runner type of future stuff, doesn’t it? Well, it’s here now and happening somewhere, every day. Let’s look at each AV piece.

The Robot

The robot piece is the simplest piece of all. The basic engineering was tackled when cars were invented; it’s that simple: control ‘forward’, ‘backward’, ‘stop’, ‘left’ and ‘right’. Five actions, physically controlled by mechanics. Any high school shop class can build the mechanics.

Then, connect the mechanics to computer controls. Now you can drive the car as you would drive a remote control car. Radio Shack has what you need.

Done.

Cruise control is an example of this technology. It simply controls ‘forward’ (via the accelerator) to maintain a set speed. It needs no external sensors or brain (AI); a simple feedback loop on the car’s own speed reading is enough, a ‘set and forget’ engineering solution. So, controlling a car at this level of technology is super, super simple.
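
As a rough illustration of how simple that layer is, here’s a toy ‘speed hold’ step in Python. All names and numbers are hypothetical; real cruise control runs as properly tuned feedback control in embedded firmware.

```python
def cruise_control_step(current_speed_mph, set_speed_mph, throttle, gain=0.02):
    """One tick of a toy 'set and forget' speed hold.

    Nudges the throttle up when the car is below the set speed and
    down when it's above. Real systems use tuned PI/PID control.
    """
    error = set_speed_mph - current_speed_mph
    throttle += gain * error                # proportional correction
    return max(0.0, min(1.0, throttle))     # clamp to 0..100% pedal

# Example: car doing 58 mph, set speed 65 mph, current throttle at 30%.
print(cruise_control_step(58.0, 65.0, 0.30))   # throttle rises toward the set speed
```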

The Five Sensors

Different types of sensors collect the information an AV robotaxi needs to work. It needs to know what and where the driving surface is. It has to recognize and react to lane markings, obstacles, lights, pedestrians, and other cars. Basically, anything and everything that might crash into it or that it could crash into.

Our human world is visual. We navigate visually, and AVs need to be able to do the same. If we can’t ‘see’ what is outside the car, we can’t react to what is outside the car. Robotaxi sensors build an information picture from which the AI responds. The sensors collect geo-location data, measure speed, see in 360 degrees, and judge distance to objects near and far away.

1. GPS

Everyone knows what GPS is. It shows you how to find your way from point A to point B. Although it’s very good, it is not nearly accurate enough to drive with (consumer GPS is typically only accurate to within a few meters). But it is good enough for basic, high-level navigation. The AV robot/brain pieces use it just as you do, not to turn the steering wheel, but to know where to turn to get to point B.

2. Cameras

Digital cameras are now cheap. Dirt cheap and small. Think of the cameras on the front and back of any phone. They’re spectacular and tiny! Put them all over a car and the car has 360-degree sight. We can’t do that without turning our heads!

Why would that be important? Because our physical world and its transportation system are built around us seeing and interpreting VISUAL cues: streetlights, traffic lights, lane markers, road signs, warning signs, barricades, a random cat running across the road, or debris flying into our path. The list is endless, but the items on the list are ALWAYS visual.

Backup cameras are a very simple example of this tech in action.

3. Ultrasonic Sensors

The little circles you find on most cars nowadays are ultrasonic sensors that detect a physical object and its distance from the sensor. They are cheap, need very little power, and require almost no maintenance. However, they’re near-sighted; they can only ‘see’ about 10 feet or so away. So you need a sensor that can see farther away, too.
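
For a feel of how they work: the sensor pings, listens for the echo, and converts the round-trip time into a distance. A tiny sketch (speed of sound is roughly 343 m/s in room-temperature air; the numbers are illustrative):

```python
SPEED_OF_SOUND_M_S = 343.0   # ~343 m/s in air at about 20 °C

def echo_time_to_distance_m(round_trip_seconds):
    """The pulse travels out and back, so divide the round trip by two."""
    return SPEED_OF_SOUND_M_S * round_trip_seconds / 2.0

# An echo returning after 18 milliseconds means an object roughly 3 m away,
# right at the short range these parking sensors are built for.
print(round(echo_time_to_distance_m(0.018), 2))   # -> 3.09
```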

4. Radar

Radar in AVs works just like it did when it was invented in World War II, except now the hardware is tiny and the signal processing is digital. It detects the position, shape, movement, and speed of remote objects. It ‘sees’ up to about 300 ft. away and can work around corners, under cars and such. Very cheap.

5. LiDAR

LiDAR (Light Detection and Ranging) is sort of like radar, but it bounces laser pulses off objects instead of radio waves. A laser is like a tiny dot of light, so to gather the same information that radar gets with one ping, LiDAR needs to spin around and sweep up and down very fast to build its picture of information.
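
In other words, each laser pulse returns one distance at one known angle, and the spinning sweep stitches those readings into a 3-D ‘point cloud.’ A simplified 2-D sketch of that conversion, purely illustrative:

```python
import math

def lidar_sweep_to_points(ranges_m, start_angle_deg=0.0, step_deg=1.0):
    """Convert one horizontal sweep of range readings into (x, y) points.

    Each reading is a distance along the beam at a known rotation angle;
    spinning quickly and stacking sweeps is how LiDAR 'builds its picture.'
    """
    points = []
    for i, r in enumerate(ranges_m):
        theta = math.radians(start_angle_deg + i * step_deg)
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points

# Four readings, one degree apart, all roughly 10 m away:
print(lidar_sweep_to_points([10.0, 10.1, 10.2, 10.3]))
```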

LiDAR must spin constantly. Those spinning things you’ve seen on the tops of self-driving cars in videos, that’s LiDAR. Very expensive and very hard to work with. Junk. Why? Because LiDAR doesn’t provide better information than the other ‘eyes’ above AND it has the unfortunate feature of only seeing well in the most perfect of conditions. It’s blinded by anything that scatters light, such as snow, fog, even driving rain. That’s a problem because perfect conditions rarely happen in real life. Like I said, expensive junk that’s a bad engineering solution.

The AI Brain

The robot cannot be controlled, and the environment’s information can’t be used for navigation, unless there is a ‘brain’ that interprets all the sensor information relative to the car and tells the robot how to react. The AI, like the human brain, makes sense of what it sees, where it is, and what it needs to do next to get where it’s going. Then it tells the robot what to do to make that happen. Obviously, this is the hard piece. The VERY hard piece.

In fact, AI is the only really hard piece. It cannot be created using conventional computer software, the kind where a programmer spells out in code, ‘if this happens, do that.’

To handle this level of complexity, the system must be an artificial intelligence, a newer branch of computer science that is capable of learning. Yes, AI actively learns, and it can teach itself.

Unlike problems solved by standard software programming, the AI cannot be given a list of all possible driving questions and answers in advance so it can sort through the list and pick the right one. The real world’s driving Q&A list approaches infinite possibilities.

Instead, a car’s AI must be able to examine and understand everything, in every frame, hundreds of frames per second, from every camera, from every sensor, from the radar, and from the LiDAR. From its understanding, the AI formulates a plan of action to fit those surroundings. It makes decisions.
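
Put differently, the AI is on a hard real-time budget: at, say, 30 or more frames per second it has only tens of milliseconds to understand a frame and update its plan. Here’s a schematic of that loop in Python; the structure is real, but the understand and plan functions are stand-ins for where all the actual difficulty lives.

```python
import time

FRAME_BUDGET_S = 1.0 / 30.0   # illustrative: ~33 ms per frame at 30 fps

def understand(sensor_frame):
    # Stand-in for perception: detect lanes, cars, pedestrians, signs...
    return {"obstacles": [], "lanes": []}

def plan(world_model, destination):
    # Stand-in for planning: pick a trajectory that makes safe progress.
    return {"steer": 0.0, "throttle": 0.2, "brake": 0.0}

def drive(get_sensor_frame, send_to_robot, destination):
    """Perceive, decide, act, over and over, within each frame's time budget."""
    while True:
        start = time.monotonic()
        command = plan(understand(get_sensor_frame()), destination)
        send_to_robot(command)
        # Sleep out whatever is left of this frame's budget.
        time.sleep(max(0.0, FRAME_BUDGET_S - (time.monotonic() - start)))
```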

Also, the AI must be able to make a judgment about what to do next based on its past experience. It remembers its past decisions. Something it’s learned, just like us. If a ‘system’ cannot do this, it is not ‘autonomous.’ It may be automated, it may be computer-controlled. But without the ability to learn, grow, and work itself out of previously unforeseen circumstances, it is not autonomous.

There are many types of Artificial Intelligence. Let’s just get these two categories of AI out of the way quickly.

Machine Learning AI

These systems are nothing more than statistics and big data. The more data you have, the better your statistics and decision-making, but these systems do not learn. They do not update themselves. They are simply a big hammer to solve a specific problem.

These systems do work for very specific problem domains. Playing chess is a good example. Or Google DeepMind’s AlphaGo project, built using statistics, search, and pre-defined knowledge of the game Go. AlphaGo beat the world’s top human players, including world champion Lee Sedol in 2016. It was thought that this would never happen, and impressive as it was, this particular tech could never be used to drive a car.

This kind of machine learning is cool, but it is not artificially intelligent, nor will it ever be. However, the next version, AlphaGo Zero, is another story. Read on.

Neural Networks AI

A neural network (NN), called an artificial neural network (ANN) or simulated neural network (SNN) when the neurons are artificial, is an interconnected group of natural or artificial neurons that uses a mathematical or computational model for information processing. It is based on a connectionist approach to computation. In most cases, an ANN is an adaptive system that changes its structure based on external or internal information that flows through the network.

In more practical terms, neural networks are non-linear statistical data modeling or decision making tools. They can be used to model complex relationships between inputs and outputs or to find patterns in data.
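
For the curious, here is what ‘non-linear statistical data modeling’ boils down to mechanically: the forward pass of a tiny two-layer network in Python/NumPy. Training (adjusting the weights from data) is what makes it ‘learn’; this sketch only shows the shape of the computation, with made-up numbers.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny 2-layer network: 4 inputs -> 8 hidden units -> 2 outputs.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

def forward(x):
    """Non-linear mapping from inputs to outputs.

    Each layer is 'weights times inputs plus bias' pushed through a
    non-linearity; learning means nudging W1, b1, W2, b2 to reduce error.
    """
    hidden = np.tanh(x @ W1 + b1)
    return hidden @ W2 + b2

# Example: 4 made-up sensor-derived numbers in, 2 output numbers back.
print(forward(np.array([0.2, -1.0, 0.5, 0.0])))
```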

Google used a neural network when they built AlphaGo Zero. This program beat the original AlphaGo program 100 to 0 after 8 hours of learning to play. How did it learn to play Go? It taught itself, by playing itself.

AlphaGo Zero literally learned the game and created its own game logic and strategy. It didn’t learn by analyzing how the game’s played by humans, or how it’s been played for the past 1000 years. It just figured out how to beat itself.

How? Nobody knows. Nobody programmed its strategy, and the AI can’t explain itself. Developers can’t look at the code to see how it did it because the strategy isn’t in code; there’s just a huge set of artificial neurons that form connections (by themselves, mind you!) over time to ‘learn’ patterns.
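
The self-play recipe itself is simple to state, even if the strategy it produces is opaque. Here’s a toy sketch of the general pattern in Python; it’s a cartoon of the idea, not DeepMind’s actual algorithm, which combines self-play with tree search and a deep network.

```python
def self_play_training(policy, play_game, improve, generations=100):
    """Cartoon of self-play: the current policy plays itself, and the
    results are used to produce a slightly better policy, repeatedly."""
    for _ in range(generations):
        games = [play_game(policy, policy) for _ in range(25)]  # self-play batch
        policy = improve(policy, games)                         # learn from outcomes
    return policy

# Toy stand-ins so the skeleton runs: a 'policy' is just a number, a 'game'
# rewards the bigger number, and 'improve' nudges the policy upward.
play = lambda a, b: ("win" if a >= b else "loss", a)
improve = lambda p, games: p + 0.1 * sum(1 for outcome, _ in games if outcome == "win")
print(self_play_training(1.0, play, improve, generations=10))
```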

AlphaGo Zero’s computer costs about $25 million and is backed by Google. Too much money to run a single car, but there will be other uses.

Another example of a neural network AI is Elon Musk’s OpenAI entry into professional gaming tournaments. This computer AI played against a team of professional human gamers in a completely unscripted, open-ended video game arena. The computer AI won.

Musk’s OpenAI non-profit works on open-source AI projects. Musk has since left the board because its purpose (open source) is not compatible with the for-profit AI being built into Tesla cars. Tesla builds its own AI and the chips on which it runs, which has implications for the future of rideshare. We’ll talk about that later.

OpenAI also created another AI (GPT-2) that it initially deemed too dangerous to release into the wild:

OpenAI said its new natural language model, GPT-2, was trained to predict the next word in a sample of 40 gigabytes of internet text. The end result was the system generating text that “adapts to the style and content of the conditioning text,” allowing the user to “generate realistic and coherent continuations about a topic of their choosing.” The model is a vast improvement on the first version by producing longer text with greater coherence.

The takeaway is that neural network AI can be trained to work specific problem domains: play video games, read X-Rays, impersonate the writing of an author, interact with humans as a chat bot. It can also be trained to drive a car. These AI networks are not ‘generally intelligent’, they are ‘specifically intelligent’: the AI built to play a video game cannot drive a car. We don’t have generally intelligent neural networks today although someday it’s likely we will. Still, today’s neural networks can do a single task (like drive a car) very well.

Robotaxis: The Levels

As humans, we like our stuff in piles. It is easier to understand that way. The world doesn’t actually work like that, but we do, and we need common language so we can talk to one another.

Language, particularly for emerging tech, is first created dynamically by its inventors and users. Governments then incorporate common language and definitions into legislation. For sure that will include legislation for autonomous vehicles because their operation impacts public safety.

Automotive scientists and corporations have the expertise to weigh in on this process—but keep in mind they also have a huge vested interest to be sure any legislative language applied to legal circumstances will favor them and their industries.

SAE Levels

Several years ago, the Society of Automotive Engineers (SAE) recommended grouping self-driving cars into ‘Levels.’ These are the labels you often see used and mis-used by the public, officials, and the media when they talk about ‘self-driving’ vehicles.

Autonomy tech is complex and the language describing it is muddy. The SAE dealt with the human/AI/driving scenario interactions by focusing only on the definitions of the ‘driving automation system’ and not on the vehicle itself. Key terms include:

  • Dynamic driving task (DDT), “all the real-time operational and tactical functions required to operate a vehicle in on-road traffic.”
  • Operational design domain (ODD), “the specific conditions under which a given driving automation system or feature thereof is designed to function.”
  • Automated driving system (ADS), all hardware and software that works to perform the entire DDT on a sustained basis.

The Society of Automotive Engineers (SAE) defines autonomous driving levels like this:

Level 1: This driver-assistance level means that most functions are still controlled by the driver, but a specific function (like steering or accelerating) can be done automatically by the car.

Level 2: In Level 2, at least one driver assistance system of “both steering and acceleration/deceleration using information about the driving environment” is automated, like cruise control and lane-centering. It means that the “driver is disengaged from physically operating the vehicle by having his or her hands off the steering wheel AND foot off pedal at the same time,” according to the SAE. However, the driver must still always be ready to take control of the vehicle immediately.

Level 3: Drivers are still necessary in level 3 cars but they are able to completely shift “safety-critical functions” to the vehicle, under certain traffic or environmental conditions. The driver is still present and will intervene if necessary but is not required to monitor the situation in the same way as necessary for the previous levels.

Level 4: This is what is meant by “fully autonomous.” Level 4 vehicles are “designed to perform all safety-critical driving functions and monitor roadway conditions for an entire trip.” However, it’s important to note that this is limited to the “operational design domain (ODD)” of the vehicle—meaning it does not cover every driving scenario.

Level 5: This refers to a fully-autonomous system that expects the vehicle’s autonomous performance to equal that of a human driver, in every driving scenario. This includes all roadway and environmental conditions that can be managed by a human driver, which would include extreme environments.
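
One way to see the progression at a glance is as a small ‘who does what’ lookup table. This is my paraphrase of the summaries above, not official SAE text:

```python
# Rough paraphrase of the level summaries above; not official SAE wording.
SAE_LEVELS = {
    1: {"driving":   "human (car assists with one function)",
        "monitoring": "human", "works_everywhere": False},
    2: {"driving":   "car steers and accelerates; human supervises",
        "monitoring": "human", "works_everywhere": False},
    3: {"driving":   "car, under certain conditions",
        "monitoring": "car (human must take over on request)", "works_everywhere": False},
    4: {"driving":   "car",
        "monitoring": "car", "works_everywhere": False},   # limited to its ODD
    5: {"driving":   "car",
        "monitoring": "car", "works_everywhere": True},    # every human-manageable scenario
}

for level, roles in SAE_LEVELS.items():
    print(f"Level {level}: {roles}")
```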

These labels have little practical use, but they’ve been a place from which conversations start and work begins. Eventually, we’ll end up with what legislators, marketing departments, media, and the public want—standard definitions that accurately describe what our AVs do in real life.

Limitations of the SAE Levels

Driving context is everything. Technology is what it is and it is defined by and within a given context. In this case, it’s the complete set of all REAL LIFE driving situations, not only the ODD’s “specific conditions under which a given driving automation system or feature thereof is designed to function.”

A Level 2 car may be Level 2 on the freeway but it’s not Level 2 when you’re driving around in a parking lot. A Level 5 car operating in a tunnel may only operate as a Level 3 on a city street. So when is a Level 2 car Level 2, or a Level 5 car Level 5?

You get where I’m going with this. Because a car cannot correctly be labeled as operating at a specific, single ‘level,’ the Level labels shouldn’t be used. They rarely add to a discussion and often take comments off into the weeds.

Vehicles will always have an operating limit for a given driving context. The SAE Level definitions are misused by so many because there’s no practical way to fold an operating limit, and every real-world driving scenario as context, into the definitions.

Without context, what’s the point of using any level label to describe a car’s capability? Anticipating regulation, how do these labels remove complexity? Address problems and solutions? Describe responsibilities and liabilities? At this point, for the situations in which the words are used, the Levels are useless and meaningless.

There’s another, different problem with the Level 5 definition. The SAE standard for all Levels is that the vehicle’s performance must equal a human driver’s performance in the driving scenarios defined for that level. In Level 5 alone, ‘performance’ means handling every driving scenario.

In Level 5, the human driver as the operational interface is replaced by the AI-driven ‘driving automation system.’ But ‘every’ driving situation (“all roadway and environmental conditions”) is never actually described and incorporated into the Level definition, and neither are human ‘performance’ standards established for every driving situation.

Reality is, SAE Level 5 as defined above will never be reached. It’s not reached by humans today (there’s no valid definition of human driver performance under “all roadway and environmental conditions”). But that’s okay; we don’t need to reach sci-fi Level 5. What we need is Level 5-type cars for specific driving contexts. Can that be done? Sure. It’s been done already.

Back to robotaxis. Forget about Levels. The only rating a robotaxi should have is the relative strength rating of its Artificial Intelligence. Period. Nothing else matters.

Robotaxis: Where Are They?

I am sure you are thinking this stuff is all in the future. Am I right?

Well, here are all the states that, at some level, have laws that allow a driverless car. You will notice that the majority are actively looking at the issues, and this map is ‘old’ (May 2018). All are struggling with language, terms, and definitions.

Check out the article, The state of self-driving cars across the US. California leads the way but by no means are they alone. Do you think anyone involved with this tech is waiting for this to be legal? Where do you live?

On April 2, 2018, California expanded its testing rules to allow for remote monitoring instead of a safety driver inside the vehicle. Waymo and another company have since applied to begin testing vehicles without drivers in the state. Yes, it’s happening.

Robotaxis: Who Are They?

Now this is where things get real. Companies are pouring BILLIONS into this technology. They expect to get their money back. Some will, some won’t. The key is, ‘some will.’ Companies do not invest BILLIONS unless they expect BILLIONS PLUS back.

The above graphic illustrates companies that make parts for robotaxis. Some even build robotaxis.

But, the only vehicles that will ever become actual robotaxis will be those that have enough AI and sensors to make them run safely and economically. The AV must be a self-sufficient automated car (robot) that navigates on its own (using AI), in a prescribed environment (however that’s defined), at a cost that works for a profit-making enterprise.

That list of competitors is much, much smaller.

The Real Robotaxi Players

Google (aka Waymo). Everybody has heard of them. They have the AI part down. They have sensors and they are proponents of LiDAR. They actually sell LiDAR systems. Not many, but some. But, they do not make or sell cars. They have cars they have modified, but there is no such thing as a ‘Waymo car’ other than what Waymo owns. In other words, they can’t make money with their technology as it is other than by selling rides.

Waymo currently runs only small systems. Can they make a profit or grow? Probably not, or at least not very soon, but they are valued at $100 billion today. A Waymo-equipped car costs about $250,000 or more to put together. They may sell their technology to someone else on this list, but as of today, nope, selling rides is how they’re planning on making money. (Then again, it’s Google…)

General Motors. They bought a company called Cruise Automation for $1 billion in 2016. In 2019, Cruise is valued at $19 billion (about half of GM’s entire market cap). They have never sold a single AV, but they do sell rides and have working cars in 2019. The cars they use are estimated to cost about $200,000 or more today. That’s basically their Chevy Bolt with $170,000 worth of crap glued onto it.

“Today, we’re announcing the first production design of a self-driving car that can be built at massive scale,” Vogt said. “And more importantly, these vehicles can operate without a driver.”

That means they have all the components in place so that, “when the software’s ready,” Vogt added, they can remove the drivers and operate safely on roads.

That was two years ago. They are not any closer to going live. Reason? “When the software’s ready.” That is the AI part that they do not have. Hardware = Easy. Software = Hard.

Ford. Missed the boat entirely. So, they bought a little company called Argo AI for $1 billion to build a system for them.

Argo was literally yanked out of obscurity by the Ford deal, but there are dozens (if not more) of AI/auto startups still wallowing in obscurity. “Artificial intelligence will be an essential player in autonomous vehicles of the future,” said Michelle Krebs, executive analyst with Cox Automotive’s Autotrader. “That’s why automakers like General Motors, Toyota, and Ford are snapping up companies with AI expertise.”

The company was six months old at the time (2017). Nothing material has been heard or seen from them since. They don’t have any cars, or even prototypes, yet. I think they are still a long, long way off from having anything. But VW has now thrown more money into that pot: $2.6 billion. In 2019, Argo AI is valued at $7 billion.

Intel. The world’s largest chipmaker missed the boat, too. They bought an Israeli company called Mobileye for $15 billion to get into the game. They plan on having a Level 5 system ready by 2020. Not a car, a system. One that car manufacturers will buy. That might work; we will see. At least both Intel and Mobileye are tech companies working on tech.

Baidu/Nvidia. Baidu, China’s search giant, and Nvidia, the American chipmaker, have joined forces and are now forming an alliance with 50 other partners to build the Apollo system.

Speaking at Baidu Create in Beijing this week, company group president and COO Qi Lu likened Apollo to the Android operating system, noting Baidu wants to build a “collaborative ecosystem” to fast-track the technology’s progress.

“Apollo is an important milestone for the automotive industry,” Lu said. “It is in essence the Android of the autonomous driving industry, but more open and more powerful.”

“Apollo is an open platform that allows access to the technology behind its autonomous vehicles. It is expected to support all major features and functions of an autonomous vehicle, including cloud services and an open software stack, reference hardware and vehicle platforms, and tools to support various functions such as obstacle perception, trajectory planning, and vehicle control.”

Almost everyone on the graphic above not specifically mentioned here is joining this group. None of them have AI experience. They plan on using high precision maps to deliver autonomy. Technically, that has failure written all over it. Target date is 2030. That will be about 10 years too late, if they even come up with something. These guys are throwing money down a hole but it’s the only way they can say they’re in the game.

VW, Hyundai, & Others. They, too, missed the boat, but rather than buying their way in, they are collaborating with a company called Aurora. Well, VW has since left the partnership and Fiat Chrysler has jumped in.

The 17-month-old Silicon Valley-based startup first tested its self-driving vehicles on public roads in September 2017. By year’s end, two of the world’s largest automakers—VW and Hyundai—had seen enough to select Aurora to provide their self-driving car software.

Aurora, a startup developing software for self-driving cars, had barely launched when it found its place among the biggest names in the car industry.

Aurora expects that their sensors and software will be in vehicles and launch by 2021. The company was founded by ex-Tesla and ex-Waymo engineers. One might ask why they are ex-anything.

Lyft. The rideshare companies want the AV technology, too. Badly. It is existential for them, even more so than for the car companies.

Lyft took a $500 million investment from GM last year, and the two companies will reportedly add thousands of self-driving Bolts (probably with safety drivers at first) to the Lyft network in 2018. It has signed a deal with Waymo to test self-driving cars on its network, too.

They do not have AV expertise, so they are partnering with anybody and everybody. Specifically, Drive.ai (since bankrupt), nuTonomy (an MIT spin-out), and others. The longest of long shots to be standing when the dust settles.

Of course, as we’ve seen, GM and Waymo may try to launch their own ride-sharing networks. That’s why deals with smaller companies are important to Lyft’s strategy. By working with smaller companies that don’t have strong brands of their own, Lyft ensures it will have a strong offering even if the big guys pull out of Lyft’s network. And it’s possible that one of those startups will beat Waymo, GM, and other major players to market.

Uber. They’re working on AV robotaxis and related tech (Uber Elevate) like their life depends on it, because it does.

But they are Uber, and they have screwed this up before and will again. They rolled out cars before they were ready, killed a pedestrian, got sued by Waymo (Uber settled, paying Waymo in equity), and are now banned in many states from even testing their AVs. Nobody knows how much they are spending on their projects, but they lost $5 billion in 2019 Q2 alone. You figure it out.

Uber previously made one of the first large commercial purchases of cars intended for an automated fleet. In November 2017, Uber committed to buying 24,000 Volvo XC90 SUVs to be delivered between 2019 and 2021. (Notably, Volvo itself plans to have an automated fleet of its own on the ground in 2021.)

As impressive as that number sounds, though, you shouldn’t expect automated cars to start completely replacing human Uber drivers within the next few years. At the time of writing, Uber still employs around 2.5 million human drivers worldwide.

Even so, current Uber CEO Dara Khosrowshahi expects the company will move beyond testing quickly, as he said in early 2018 that he expects Uber self-driving cars will be available to the public in as little as 18 months.

If Uber has its way, theirs are the only self-driving cars you’ll see. In a recent interview, Uber seemed to suggest that it wants to prevent individuals from owning self-driving cars in urban environments to better keep track of emissions and adhere to regulations.

Yeah, that’s called a monopoly. You know, like the taxi industry.

Tesla. Last, but certainly not least. Let’s bullet this because there’s a lot to unpack.

  • Tesla builds cars, so they have that covered.
  • All cars built are ready to become robotaxis, today. Each car has:
    • A full set of sensors (cameras, radar, and Tesla-built ultrasonic sensors; no LiDAR).
    • A Tesla-built super-computer, built specifically to run their AI.
    • A Tesla-built chip for processing sensor data, designed specifically for their AI. (Tesla’s driving-AI chip is better than any other chip technology available on the market.)
  • Tesla has AI experience second to none.
  • The Tesla Network platform software (like Uber’s but better) is ready to roll out when they are ready to sell rides.
  • Every car made is already Internet-connected, at all times.
  • Tesla gathers data from all Teslas everywhere, in real-time.
    • Every car’s AI is learning to drive either by driving or by watching other Teslas drive.
    • Tesla is building Dojo, an AI-training system that teaches the car’s AI how to drive.
    • Tesla’s created a database with 100x more real driving data than collected by any other source.
  • Every car’s AI can be updated at any time, over the air (WiFi).
    • Allows rapid development and then distribution of updates.

This means every car’s Tesla AI is learning at an exponential rate, from real world driving events, experienced not only by that car’s driver but also every other Tesla on the road. A mind-boggling competitive advantage.
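
As a mental model only (this is the generic fleet-learning pattern, not a description of Tesla’s actual pipeline), the cycle looks something like this, with toy stand-in classes so the sketch runs:

```python
class Car:
    """Toy stand-in for one connected vehicle."""
    def __init__(self):
        self.model_version = 0
    def collect_interesting_events(self):
        return [{"type": "cut-in", "clip": "placeholder"}]   # edge cases worth learning from
    def install_model(self, version):
        self.model_version = version

class Trainer:
    """Toy stand-in for the central training system."""
    def __init__(self):
        self.version = 0
    def train(self, events):
        self.version += 1          # pretend more data -> a better model
        return self.version

def fleet_learning_cycle(fleet, trainer):
    # 1. Cars report driving events worth learning from (edge cases, near misses).
    events = [e for car in fleet for e in car.collect_interesting_events()]
    # 2. The central system retrains the driving model on that pooled data.
    new_model = trainer.train(events)
    # 3. The improved model goes back out to every car over the air.
    for car in fleet:
        car.install_model(new_model)

fleet, trainer = [Car() for _ in range(3)], Trainer()
fleet_learning_cycle(fleet, trainer)
print([car.model_version for car in fleet])   # -> [1, 1, 1]
```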

You can buy one of these cars, fully equipped and ready for self-driving to be turned on, for about $50,000 USD. Today. That is simply a miracle. But wait, there’s more.

Tesla is the only company that controls the hardware and software stack from top to bottom. They do not need to partner with anyone, for anything.

Then there’s insurance, which everyone will need. Tesla plans to handle their own, and with the driving data they own, they should hit the actuarial marks better than any existing insurance company.

Musk says they’re going to roll out robotaxis as a service in 2020. I believe they will try for that target but may not make it. However, 2021 seems pretty certain. Drive a Tesla Model 3 today and you’ll see why that seems likely. Even if you don’t believe robotaxis will be here by then, drive one if you’re driving rideshare. I explain why here.

Robotaxis: When Are They Coming?

In this blog so far, we’ve established that robotaxis are coming or in some sense, are already here. We’ve established that they will be allowed to be on our roads and towns. What we don’t know for sure is when someone will be making money running them: Automakers Are Rethinking the Timetable for Fully Autonomous Cars.

The claims range from 2019 to 2030. Ten years seems too long to me. But, every company has announced that their plans are slipping. Except for one.

Two paths to Autonomy. Who gets there first?

Questions about the technology’s future reached full public view in April 2019, when Ford Motor Co. CEO Jim Hackett acknowledged what had already become painfully obvious to much of the engineering community. “We overestimated the arrival of autonomous vehicles,” Hackett was quoted as saying by numerous news outlets. “Its applications will be narrow, what we call geo-fenced, because the problem is so complex.”

Yes, this should have been a NO DUH realization for Ford at the beginning. Only now do they recognize the challenge is not the hardware!

Perhaps the biggest technical obstacle, however, is converting human understanding into robotic intelligence. The intelligence that enables human beings to drive a car is largely taken for granted, and replicating it is proving to be a bigger chore than engineers foresaw.

Big Business—once they woke up and recognized the disruptions Musk created carry real impact to their bottom line—thought they could just bully or buy their way to a solution. They just may, someday, just not today.

“The auto industry is getting scared, like when you’re practicing for something and then you have to get on stage and actually do it,” noted Mike Ramsey, senior director and automotive analyst for Gartner, Inc. “And then you suddenly realize, ‘Maybe I’m not as prepared as I thought I was.'”

This angst is not shared by all. But Tesla is the LONE outlier here.

Tesla Inc. CEO Elon Musk has maintained his belief that his company will have full autonomy in 2020. “My guess as to when we think it’s safe for somebody to essentially fall asleep and wake up at the destination – probably towards the end of next year,” he said in a February podcast. More recently, he doubled down on that statement, saying he plans to have more than a million robo-taxis on the road in 2020. The key, he said, is the fact that Tesla can more effectively test its autonomous driving technology because it accumulates “100 times more miles per day than everyone else.”

And finally, in spite of the challenges…

Still, virtually every automaker and supplier is forging ahead at full throttle. “It’s inevitable,” Sellars told us. “It’s going to happen. The only question is how long it will be before we can walk into a dealership and buy a Level 5 car.”

The Future of Rideshare

I think we will have the first robotaxis (AVs making money) running around somewhere in the world in 2020. That’s next year. They won’t be everywhere, but they will be somewhere. Maybe just in a tunnel or a loop. Maybe on one street. Does not matter. Once robotaxis are out in the wild they’ll multiply and the market will grow, just like Uber did at the beginning. It won’t take long for the rideshare paradigm to shift.

Loop is a high-speed underground public transportation system in which passengers are transported via compatible autonomous electric vehicles (AEVs) at up to 155 miles per hour.

The transformation will happen faster if there’s more than one player in the game. The determinant remains, who has the AI and the driving data to realistically go to market?

The 3 Real Players

Only three companies have real prospects: GM, Waymo, and Tesla.

The others are too far behind and literally have no chance at all to make it to market. They just don’t have the cars, tools, software, and platforms ready to go and there’s no time left to catch up. ‘The others’ is every other car maker on the planet. Market Disruption with a capital ‘D.’

I would bet that GM and Waymo merge. Waymo is worth more than GM, so eventually, GM could become a Google subsidiary. GM owns a chunk of Lyft, so that gets them a rideshare platform thrown in, too. Wow.

Lyft by itself is too small. They’ll end up being a buyer of cars with which they’ll run their fleets. Or, they’ll be absorbed.

Uber is larger than GM, Ford, Chrysler and all taxi companies put together. They could buy GM as well to get in the game.

Or, Apple could buy GM. They could buy five GM-sized companies with cash! GM is a much easier buy than Tesla, so it seems GM gets bought one way or the other.

Someone may try to buy Tesla, but I don’t see that ever happening. Elon would never sell it, or if he did, it would be at an astronomical price, like $400-500 billion, maybe more.

So, who has the AI and the driving data to realistically go to market?

Tesla. They will be first in this endeavor, but they are small. They may have trouble rolling out enough cars to secure the market. We’ll see.

They do get a million-car head start—exactly why Musk created the Tesla Network and is giving Tesla owners the opportunity to join in the game. He’s counting on the revenue incentive to motivate Tesla owners to sign up and put their cars to work when they don’t need them.

‘Partnering’ with his existing Tesla owners in a win-win scenario not only helps him quickly scale to serve the rideshare market, it also brings in new buyers. Once it’s clear the revenue numbers are there, a Tesla’s value (and purchase price) goes up. When the system’s ready, it all goes live with a flip of a switch.

The Future of Rideshare Driving

Impact on Rideshare Companies

Uber and Lyft are betting they can stop losing money and turn a profit once they get to keep the whole fare pie, not just 30% of it. The cost of each robotaxi ($200,000 or more) and all the operating costs now paid by drivers have to come out of the extra 70% they get to keep. They’ll make money. Unlike a driver, the robotaxi doesn’t need to make minimum wage. Just a profit.
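
Back-of-the-envelope, with entirely made-up numbers just to show the shape of that bet:

```python
# All numbers below are illustrative guesses, not anyone's real economics.
avg_fare        = 15.00      # dollars per ride
rides_per_day   = 40         # a robotaxi can run around the clock
vehicle_cost    = 200_000    # up-front robotaxi cost
vehicle_life_d  = 4 * 365    # amortize over ~4 years of service
op_cost_per_day = 60.00      # charging, cleaning, insurance, maintenance...

# Today: the platform keeps roughly 30% of each fare; driver costs aren't theirs.
platform_take_today = 0.30 * avg_fare * rides_per_day

# Robotaxi: the platform keeps 100% of the fare but now owns all the costs.
robotaxi_revenue = avg_fare * rides_per_day
robotaxi_costs   = vehicle_cost / vehicle_life_d + op_cost_per_day
robotaxi_margin  = robotaxi_revenue - robotaxi_costs

print(f"per car per day, today's 30% cut:  ${platform_take_today:,.0f}")
print(f"per car per day, robotaxi margin:  ${robotaxi_margin:,.0f}")
```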

But, Uber and Lyft may not have a play unless they buy a car company and an AI company. They have a rideshare app. They have a lot of money.

They can’t buy Teslas because Tesla doesn’t allow their cars to be used by competing platforms and Elon won’t sell them cars because he plans to run his own fleet. But trust that Uber and Lyft will buy as many of whatever cars they can, because each car they buy will be one less driver to manage and split revenue with. They have stated this publicly.

Uber partnered with Volvo in 2016 and Volvos aren’t cheap. Neither is the LiDAR-based tech solution the companies are pursuing together.

“Volvo and Uber said in 2017 that the rideshare company planned to buy up to 24,000 self-driving cars from Volvo from 2019 to 2021 using the self-driving system developed by Uber’s Advanced Technologies Group. An Uber spokeswoman said Tuesday that the company plans “to work with Volvo on tens of thousands of vehicles in the future.”

To the heavy vehicle cost, add all vehicle operating costs that are now paid for, and actively managed by, the rideshare drivers. Fuel (electricity should be cheaper than gas), insurance (will be higher than current driver rates), licensing (will be higher than current driver averages), maintenance (will be higher running 24 x 7 x 365), and repairs (will be more costly because of the embedded tech—in fender benders or worse—plus a car out of service is lost revenue).

Somebody has to take on the driver’s role and oversee the daily management of the vehicle, but it won’t be a 1:1 driver-to-car relationship. There have to be staff and places for the AV to go to, where it’s charged, cleaned, and scheduled for maintenance and repairs. This is an operations reality for car manufacturers and everyone diving into this market.

Tesla’s situation and business approach are strategically different. It’s taking a percentage of fares, probably around 30%, essentially creating guaranteed revenue for itself. For that cut, it provides the Tesla Network platform connecting cars and customers (as Uber does now) and will likely include insurance (which Uber does not).

The major difference with this business model is that, unlike all other major players, Tesla has no initial vehicle costs to service. The Tesla owners create the fleet inventory FOR FREE. Okay, for a piece of the earnings action, but that’s determined by after-the-fact revenue earned and then paid; it’s not an up-front expense. Perhaps the fleet includes some Tesla-owned units, but as a group, the owners are effectively subsidizing and scaling the entire business model, ready on Day 1, at go-live. They are committed, silent partners.

Tesla still maintains sole control of the business model and sets prices (as Uber does). But rather than cutting fares, they may be able to maintain current fare levels or command premium prices because the ride is in a Tesla. Customers may be happy to pay for the Tesla difference. Tesla’s AI has the most driving experience, and its cars are rated among the safest on the road.

If Tesla doesn’t take advantage of the owners, delivers a working system, and keeps the numbers favorable for all parties (let’s throw in fair fares to customers), it should work. Brilliantly.

Tesla owners will give up exclusive use of their car and learn to view their relationship with their car differently, but only if their net revenue is worth their time, effort, and sacrifice. Unlike Uber drivers who often have few other income-producing options, Tesla owners are unlikely to play if their asset contribution and maintenance efforts are effectively valued at minimum wage.

There is a history lesson here. This is a very similar approach to what Uber did with its rideshare drivers to supply and launch its version of a taxi service. Drivers provided ‘free’ labor (valued and paid for as a percent of fares collected, independent of driver time) and absorbed all vehicle-related costs and management tasks. Uber provided the platform and absorbed all related operating costs. Unfortunately, Uber sees drivers more as resources to be used (and used up) than silent partners essential to building the business for mutual gain. I doubt Tesla will make that mistake.

Impact on Rideshare Drivers

So, what becomes of rideshare as a business (with you as the driver) in, say, the 2021-2025 time frame as these systems are rolled out?

It will continue as it does today, in virtually every place it is now, but rideshare using a driver will be on its way out. Just like taxis are now.

Uber and other rideshare platforms will still exist but drivers are going to have an even tougher time making their numbers work: increased competition and lower fares are headed your way.

The increased competition won’t be coming from new rideshare companies showing up and diluting your local market or from rideshare partners offering huge sign-up incentives to bring on new drivers. It will be from robotaxis hitting your neighborhood streets, 24 x 7, 365 days a year. It will hit the drivers working the tough shifts the hardest, because that’s where you lose your edge over most humans: robotaxis won’t care that it’s 2 AM.

Expect fare price wars to intensify. Initially, the AV robotaxi fleet owners will lure customers to this newfangled rideshare option with lower fares. Standard introductory marketing practice, right? At least until the monopolies are established.

There are other parallels to Uber as a start-up.

Are you thinking there will be a lot of rider resistance and nobody will want to get into an AV robotaxi? Consider this. Maybe not having a rideshare driver is a ‘plus,’ not some big, scary ‘minus.’ The new AV rideshare option carries a cool factor with its new tech in a new high-end car, and assures improved passenger safety—from fewer on-the-road accidents to never having to deal with the random, dodgy rideshare driver.

Sound familiar? Remember how Uber disrupted the taxi monopolies in every city they went into? Remember how nobody was going to get into a car with a stranger 10 years ago? And yet they did. It’s ten years later and people of all ages, everywhere, jump in, by the hundreds of millions!

Robotaxis: So When Can I Buy One?

The answer to that is both Today and Never.

Today you can buy a Tesla that promises full autonomy in the near future.

Tesla will let you jump into the rideshare business with your Tesla AV, but only on their rideshare network, the Tesla Network. How long will they let you play in the game? Well, that’s another question.

It seems likely Ford will never sell you an AV. GM won’t either. Neither will any other AV car manufacturer. Ever.

The companies that can build the AVs themselves are thinking they’ll make more money running their own robotaxi fleets than from selling a car to you. Uber’s goal alone is to replace its 2.5 million drivers, so that is 2.5 million cars.

For the next few years, AVs will be expensive to build and in limited supply. We’re talking about cars that have not been built yet, from AV factories that have not been built yet (except one). High buyer competition means AVs will be very expensive based on simple supply/demand… for a long time.

As a private buyer, you’ll be competing with all kinds of corporate car buyers who have the means to create partnerships and snatch up every robotaxi manufactured. How many AVs will be available for private purchase? At what price?

Car prices will adjust for the new factor that impacts vehicle value: these cars are assets that can produce revenue. That income-producing opportunity will show up in the car’s price. Musk estimated the net present value of a Tesla robotaxi at $200,000 even though today it’s priced at about $50,000. That $200,000 valuation is less than what Waymo, Uber, and others have actually paid for their AVs to date. Think the price of a Model 3 will stay there?
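
That $200,000 figure is a net-present-value style claim: estimate the net cash a robotaxi could earn each year, discount each year back to today’s dollars, and add it up. A sketch with hypothetical numbers, not Musk’s actual assumptions:

```python
def npv(annual_net_profit, years, discount_rate):
    """Discount each future year's profit back to today's dollars and sum."""
    return sum(annual_net_profit / (1 + discount_rate) ** t
               for t in range(1, years + 1))

# Hypothetical: a robotaxi nets $30k/year for 11 years, discounted at 10%.
print(round(npv(30_000, 11, 0.10)))   # roughly $195k of value on a ~$50k car
```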

Also consider this. If a manufacturer sells you their AV, it will be for your personal use only. You won’t be allowed to use it as an AV robotaxi on an existing rideshare platform. Uber won’t let you, neither will Lyft, or any other existing rideshare provider. They want to run their own AV robotaxi fleets and they don’t want to share the revenue. You’re competition.

And here’s the worst of it. For those of you driving rideshare now, as soon as the market supports AV-only rideshare options, they won’t want you driving your car on their platform. One way or another, drivers will be eliminated from the existing driver/rideshare business model. It was and always has been about the money.

You, my friend, will be left out of the rideshare future as AV robotaxis disrupt the industry.

Uber/Lyft will be replacing you. Rideshare drivers are not part of any equation going forward unless, perhaps, you’re servicing a niche market that requires a human interface—not to drive, but possibly to assist a passenger or delivery.

What’s the advice from Lyft COO, Jon McNeill, to you? Become a mechanic that fixes AVs. They’ll need a lot of maintenance from running 24 x 7 x 365. He likens your plight to that of the telephone switchboard operators in the 1970s who were necessary to make long distance calls, until digital switches replaced them. Software and hardware systems are taking your job away.

To date, you’ve been the majority revenue sharer. You will continue to be until AV robotaxis are viable and they take over the rideshare market. Then, you’ll be eliminated from the revenue side of the equation because the big companies can make more money using robots instead of paying you. You’ll be allowed to be part of the cost side of the equation if you want to be, as a mechanic or some other service support or vendor. But that’s it.

Your Gig Is Going Away

You know ex-taxi drivers lived this scenario firsthand as Uber’s disruptive tech demolished the taxi industry monopoly. Now you know the robotaxis are real and coming to your town, just as Uber did. It is inevitable.

Only one car manufacturer has any intention to sell an autonomous vehicle to the public to be used as a robotaxi: Tesla. That is true today and while I hope it will be true forever, once Tesla hits scale, they may decide it’s more profitable for them to simply run every AV car they make, in their own fleet.

So, slow to roll out? Sure, but regardless of your definition of ‘slow’, AV robotaxis are coming. If you live in the first cities where they arrive, the change will seem instantaneous. If you live in the Outback, it will seem like an eternity.

Regulations will allow it. They already do.

Insurance will cover it. At least one already does.

Car manufacturers will probably always sell you a car, just not a fully autonomous one that competes with them in the rideshare market. It will take you where you want to go, you just can’t make any money with it. Its cost for ‘private use only’ would be exorbitant.

Then, there will come the day when you won’t be allowed to drive your car manually, even if you want to. For your own (and others) safety, of course, and that part will be true. The SAE standard won’t be based on what humans can do, it will be based on what AI can do. And ultimately AI will drive better than humans can, statistically, under all conditions.

Isn’t it interesting how, if these scenarios play out this way, they reinforce the current trend in urban areas away from private car ownership and toward more use of rideshare services? Because of greater convenience and lower cost. But…

NO AV Robotaxis for YOU!

Ride in one, sure, own one, at some point it’s not gonna happen. Maybe sooner than we think. Not because of government, or regulations, or insurance. Nope. Because of the monopolies that build and own the robots.

Plan accordingly.
